Fixing a Sqoop export from Hive to MySQL: Can't parse input data / java.util.NoSuchElementException
2017-04-24 17:28
Error log:
2017-04-24 16:51:10,638 INFO [AsyncDispatcher event handler] org.apache.hadoop.mapreduce.v2.app.job.impl.TaskAttemptImpl: Diagnostics report from attempt_1492673938247_0831_m_000002_0: Error: java.io.IOException: Can't export data, please check failed map
task logs
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:122)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:39)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.RuntimeException: Can't parse input data: 'null'
at pro_supply_correct_error.__loadFromFields(pro_supply_correct_error.java:1109)
at pro_supply_correct_error.parse(pro_supply_correct_error.java:952)
at org.apache.sqoop.mapreduce.TextExportMapper.map(TextExportMapper.java:89)
... 10 more
Caused by: java.util.NoSuchElementException
at java.util.ArrayList$Itr.next(ArrayList.java:834)
at pro_supply_correct_error.__loadFromFields(pro_supply_correct_error.java:1104)
... 12 more
Failing record: 76888651~厂家直销860型彩钢瓦设备~41026516~建材生产加工机械~113175~122174~2015-04-27 09:43:52~1~1~-~2017-04-21 17:50:51~建筑、建材>钢格板~0.0~2~2~彩钢瓦设备~122146,131113106,122132103,122151,122144~建筑、建材>钢格板,冶金矿产>金属网>金属板网,建筑、建材>梯类>楼梯及配件,建筑、建材>网格板及格栅板,建筑、建材>金属建材~1.00,0.00,0.00,0.00,0.00~null~null~null
Sqoop export command:
sqoop export --connect "jdbc:mysql://node1:3306/procorrect?useUnicode=true&characterEncoding=utf-8" \
  --username bigdata --password bigdata99114 \
  --table pro_supply_correct_error \
  --export-dir /user/hive/warehouse/procorrect.db/pro_supply_correct_error \
  --input-fields-terminated-by '~' \
  --update-key supply_id --update-mode allowinsert
Cause: the Hive table uses '~' as its field delimiter, and Sqoop fails while splitting records on '~'. As the stack trace shows, the generated __loadFromFields method walks the split fields with an iterator; when a record yields fewer fields than the target MySQL table has columns, next() is called on an exhausted iterator and throws java.util.NoSuchElementException.
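A quick way to confirm this kind of mismatch is to count the delimited fields in a suspect record and compare against the column count of the target table. A minimal sketch (the expected count of 22 and the toy record are assumptions for illustration, not taken from the real table):

```shell
#!/bin/sh
# Count '~'-separated fields in a record and compare with the number of
# columns the target MySQL table expects. A record with fewer fields than
# columns is exactly what makes Sqoop's generated parser exhaust its
# iterator and throw NoSuchElementException.
expected=22                    # assumed column count of pro_supply_correct_error
line='a~b~c'                   # toy record with only 3 fields
n=$(printf '%s' "$line" | awk -F'~' '{print NF}')
echo "fields=$n expected=$expected"
if [ "$n" -ne "$expected" ]; then
  echo "field count mismatch -> parse error likely"
fi
```

In practice you would run the awk pipeline over the actual export directory (e.g. `hdfs dfs -cat <export-dir>/* | awk -F'~' '{print NF}' | sort | uniq -c`) to see whether all rows split into the same number of fields.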
Fix: change the Hive table's field delimiter to '\t', then rerun the export as:
sqoop export --connect "jdbc:mysql://node1:3306/procorrect?useUnicode=true&characterEncoding=utf-8" \
  --username bigdata --password bigdata99114 \
  --table pro_supply_correct_error \
  --export-dir /user/hive/warehouse/procorrect.db/pro_supply_correct_error \
  --input-fields-terminated-by '\t' \
  --update-key supply_id --update-mode allowinsert
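One way to switch the delimiter is to recreate the table with CTAS, which copies the data into a new '\t'-delimited table without listing every column. This is a sketch, not the author's exact steps; the new table name is an assumption, and the export's --export-dir would then need to point at the new table's warehouse directory:

```shell
# Hedged sketch: build a tab-delimited copy of the table, then export from it.
# The _tab suffix is hypothetical; adjust names and paths to your environment.
hive -e "
CREATE TABLE procorrect.pro_supply_correct_error_tab
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE
AS SELECT * FROM procorrect.pro_supply_correct_error;
"
```

After this, --export-dir becomes /user/hive/warehouse/procorrect.db/pro_supply_correct_error_tab in the sqoop command above.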