
Sqoop 1.4.6 Command Usage (Part 1)

#### List all databases (useful for testing the connection)

sqoop-list-databases --connect jdbc:mysql://test104:3306 --username root --password 123456
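Putting the password on the command line leaves it in shell history. If that matters, Sqoop can also prompt for it with -P, or read it from a file with --password-file (the path below is just an example; the file lives on the cluster's default filesystem and should contain only the password):

sqoop-list-databases --connect jdbc:mysql://test104:3306 --username root -P

sqoop-list-databases --connect jdbc:mysql://test104:3306 --username root --password-file /user/hadoop/mysql.pwd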


#### List all tables

sqoop-list-tables --connect jdbc:mysql://test104:3306/sqoop --username root --password 123456
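Beyond listing tables, sqoop-eval can run an ad-hoc SQL statement against the source database, which makes a handy end-to-end connectivity check (the query itself is only an example):

sqoop-eval --connect jdbc:mysql://test104:3306/sqoop --username root --password 123456 --query "SELECT COUNT(*) FROM test01"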


#### Import a MySQL table into HDFS

-D mapred.job.queue.name=lesson specifies the YARN queue the job is submitted to.

-m 1 sets the number of map tasks to 1.

sqoop import -D mapred.job.queue.name=lesson --connect jdbc:mysql://test104:3306/sqoop --username root --password 123456 --table test01 -m 1 --target-dir /my/user/sqoop
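Once the job finishes, the imported file can be checked directly on HDFS (with -m 1 there is a single part-m-00000 file). If the target directory is left over from an earlier run, --delete-target-dir removes it before importing; both are standard Sqoop 1.4.x options:

hdfs dfs -cat /my/user/sqoop/part-m-00000

sqoop import -D mapred.job.queue.name=lesson --connect jdbc:mysql://test104:3306/sqoop --username root --password 123456 --table test01 -m 1 --delete-target-dir --target-dir /my/user/sqoop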


#### Import all data from a MySQL table into Hive

The MySQL table:

mysql> select * from mysql_hive;
+----+------+------+
| id | name | age  |
+----+------+------+
|  1 | lb   |   18 |
|  2 | gy   |   17 |
|  3 | zf   |   16 |
+----+------+------+
3 rows in set (0.00 sec)


Create the Hive table with a sqoop command:

sqoop create-hive-table  --connect jdbc:mysql://test104:3306/sqoop_hive --table mysql_hive --hive-table mysql_hive  --username root --password 123456
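To verify what Sqoop generated, describe the new table from the Hive CLI (just a sanity check; output omitted):

hive -e "SHOW CREATE TABLE mysql_hive"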


Import the full contents of the MySQL table into Hive with a single sqoop command (the previous step is not required first):

sqoop import -D mapred.job.queue.name=lesson --connect jdbc:mysql://test104:3306/sqoop_hive --username root --password 123456 --table mysql_hive -m 1 --hive-import --hive-database sqoop_hive --hive-overwrite --hive-table mysql_hive
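The same import can be narrowed to specific rows or columns with --where and --columns, both standard import options (the predicate below is only an illustration):

sqoop import -D mapred.job.queue.name=lesson --connect jdbc:mysql://test104:3306/sqoop_hive --username root --password 123456 --table mysql_hive --columns "id,name,age" --where "age >= 17" -m 1 --hive-import --hive-database sqoop_hive --hive-overwrite --hive-table mysql_hive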


The Hive table:

hive (sqoop_hive)> show tables;
OK
mysql_hive
Time taken: 0.026 seconds, Fetched: 1 row(s)
hive (sqoop_hive)> select * from mysql_hive;
OK
1   lb  18
2   gy  17
3   zf  16
Time taken: 0.135 seconds, Fetched: 3 row(s)


Possible problems:

- Running the import directly fails with:

Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.

java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf


Fix:

Add the following to /opt/behApache/conf/beh_env:

export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HADOOP_COMMON_HOME/share/hadoop/common:$HADOOP_COMMON_HOME/share/hadoop/common/lib:$HADOOP_HDFS_HOME/share/hadoop/hdfs:$HADOOP_HDFS_HOME/share/hadoop/hdfs/lib:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib:$HADOOP_YARN_HOME/share/hadoop/yarn:$HADOOP_YARN_HOME/share/hadoop/yarn/lib:$HIVE_HOME/lib/*
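After editing the file, re-source it and spot-check that the Hive jars are now on the classpath (beh_env is this cluster's environment script; adjust the path for yours):

source /opt/behApache/conf/beh_env
echo $HADOOP_CLASSPATH | tr ':' '\n' | grep -i hive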


- The job reaches the step below and then exits immediately:

17/03/07 16:31:35 INFO hive.HiveImport: Loading uploaded data into Hive

Logging initialized using configuration in jar:file:/opt/behApache/core/hive/lib/hive-common-1.1.0-cdh5.7.5.jar!/hive-log4j.properties


Cause: missing jar files.

Fix:

1. Copy sqoop-1.4.6-cdh5.7.5.jar into $HIVE_HOME/lib (see the copy commands after this list).

2. Copy sqoop-1.4.6-cdh5.7.5.jar and mysql-connector-java-5.1.30.jar into $HADOOP_HOME/share/hadoop/common.

3. Make sure every node in the Hadoop cluster is up and running normally.

4. --hive-overwrite already covers what --create-hive-table does; the two options cannot be used together.
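For steps 1 and 2, the copies look like this (assuming the jars sit under $SQOOP_HOME and $SQOOP_HOME/lib; adjust to wherever your distribution keeps them):

cp $SQOOP_HOME/sqoop-1.4.6-cdh5.7.5.jar $HIVE_HOME/lib/
cp $SQOOP_HOME/sqoop-1.4.6-cdh5.7.5.jar $SQOOP_HOME/lib/mysql-connector-java-5.1.30.jar $HADOOP_HOME/share/hadoop/common/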

A successful run looks like this:

17/03/08 14:23:28 INFO hive.HiveImport: Loading uploaded data into Hive
17/03/08 14:23:28 WARN conf.HiveConf: HiveConf of name hive.cli.print.row.to.vertical does not exist
17/03/08 14:23:28 WARN conf.HiveConf: HiveConf of name hive.files.umask.value does not exist
17/03/08 14:23:28 WARN conf.HiveConf: HiveConf of name hive.cli.print.row.to.vertical does not exist
17/03/08 14:23:28 WARN conf.HiveConf: HiveConf of name hive.files.umask.value does not exist

Logging initialized using configuration in file:/opt/behApache/core/hive/conf/hive-log4j.properties
OK
Time taken: 0.986 seconds
Loading data to table sqoop_hive.mysql_hive
Moved: 'viewfs://beh/my/user/hive/warehouse/sqoop_hive.db/mysql_hive/part-m-00000' to trash at: hdfs://behApache/user/hadoop/.Trash/Current
Table sqoop_hive.mysql_hive stats: [numFiles=1, numRows=0, totalSize=24, rawDataSize=0]
OK
Time taken: 0.803 seconds


#### Export all data from a Hive table into MySQL

First, empty the MySQL table:

mysql> select * from mysql_hive;
+----+------+------+
| id | name | age  |
+----+------+------+
|  1 | lb   |   18 |
+----+------+------+
1 row in set (0.00 sec)

mysql> truncate mysql_hive;
Query OK, 0 rows affected (0.34 sec)
mysql> select * from mysql_hive;
Empty set (0.00 sec)


Run:

sqoop export -D mapred.job.queue.name=work --connect jdbc:mysql://test104:3306/sqoop_hive --username root  --password 123456 --table mysql_hive --export-dir /my/user/hive/warehouse/sqoop_hive.db/mysql_hive/  --input-fields-terminated-by '\001'
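Two standard export variations are worth knowing here (the key column id is taken from the table above): with --update-key the export issues UPDATE statements instead of INSERTs, and adding --update-mode allowinsert turns it into an upsert:

sqoop export -D mapred.job.queue.name=work --connect jdbc:mysql://test104:3306/sqoop_hive --username root --password 123456 --table mysql_hive --export-dir /my/user/hive/warehouse/sqoop_hive.db/mysql_hive/ --input-fields-terminated-by '\001' --update-key id --update-mode allowinsert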


One puzzling thing:

The log shows the following:

17/03/14 11:15:38 INFO mapreduce.Job:  map 0% reduce 0%
17/03/14 11:15:46 INFO mapreduce.Job:  map 100% reduce 0%
17/03/14 11:15:47 INFO mapreduce.Job: Job job_1488528773171_0081 failed with state FAILED due to: Task failed task_1488528773171_0081_m_000002
Job failed as tasks failed. failedMaps:1 failedReduces:0
----------------------------------------
17/03/14 11:15:47 INFO mapreduce.ExportJobBase: Transferred 487 bytes in 20.0146 seconds (24.3323 bytes/sec)
17/03/14 11:15:47 INFO mapreduce.ExportJobBase: Exported 1 records.
17/03/14 11:15:47 ERROR tool.ExportTool: Error during export: Export job failed!


The job is reported as FAILED, yet the line Exported 1 records. indicates that some data was written.

Check the MySQL table:

mysql> select * from mysql_hive;
+----+------+------+
| id | name | age  |
+----+------+------+
|  1 | lb   |   18 |
+----+------+------+
1 row in set (0.00 sec)


The data is there: one of the three Hive rows made it into MySQL even though the job failed.
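This is expected: a Sqoop export is not atomic by default. Each map task commits its own batch of INSERTs, so rows written by tasks that finished remain in MySQL even when the job as a whole fails. Sqoop's --staging-table option avoids such partial exports: rows are first written to an empty staging table (which must already exist with the same structure as the target) and are moved into the target in a single transaction only if every task succeeds. A sketch, assuming a staging table mysql_hive_stg has been created:

sqoop export -D mapred.job.queue.name=work --connect jdbc:mysql://test104:3306/sqoop_hive --username root --password 123456 --table mysql_hive --staging-table mysql_hive_stg --clear-staging-table --export-dir /my/user/hive/warehouse/sqoop_hive.db/mysql_hive/ --input-fields-terminated-by '\001'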