Deploying Sqoop 1.4.6 offline on Hadoop 2.6, with Hive data import and export
2017-08-28 05:47
1) Download the latest Sqoop 1.4.6 package: sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz
2) Extract it to /usr/local, at the same level as Hadoop
# tar -xzvf sqoop-1.4.6.bin__hadoop-2.0.4-alpha.tar.gz -C /usr/local
# mv /usr/local/sqoop-1.4.6.bin__hadoop-2.0.4-alpha /usr/local/sqoop1.4
3) Add the environment variables
# vi /etc/profile
export SQOOP_HOME=/usr/local/sqoop1.4
export PATH=.:$HADOOP_HOME/bin:$JAVA_HOME/bin:$HIVE_HOME/bin:$SQOOP_HOME/bin:$PATH
# source /etc/profile
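To confirm the variables took effect in the current shell, a minimal check can be run (assuming the /usr/local/sqoop1.4 path used above):

```shell
# Re-apply the exports from /etc/profile in the current shell
export SQOOP_HOME=/usr/local/sqoop1.4
export PATH=$SQOOP_HOME/bin:$PATH

# Verify that Sqoop's bin directory is now on PATH
case ":$PATH:" in
  *":$SQOOP_HOME/bin:"*) echo "SQOOP_HOME is on PATH" ;;
  *)                     echo "SQOOP_HOME is NOT on PATH" ;;
esac
```

If the check passes, `sqoop version` should then print the installed release.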
4) Edit the configuration files
# cd $SQOOP_HOME/conf
# cp sqoop-env-template.sh sqoop-env.sh
# vi sqoop-env.sh
export HADOOP_COMMON_HOME=/usr/local/hadoop2.6/
export HADOOP_MAPRED_HOME=/usr/local/hadoop2.6/
export HBASE_HOME=/usr/local/hbase1.1
export HIVE_HOME=/usr/local/hive1.2
export ZOOCFGDIR=/usr/local/zk3.4/conf
# vi sqoop-site.xml
<property>
  <name>sqoop.metastore.client.autoconnect.username</name>
  <value>root</value>
  <description>The username to bind to the metastore.</description>
</property>
<property>
  <name>sqoop.metastore.client.autoconnect.password</name>
  <value>123456</value>
  <description>The password to bind to the metastore.</description>
</property>
5) Copy the required jars
# cp $HADOOP_HOME/share/hadoop/common/hadoop-common-2.6.0.jar $SQOOP_HOME/lib
# cp /root/Downloads/mysql-connector-java-5.1.39-bin.jar $SQOOP_HOME/lib
6) Switch the Hive metastore to MySQL
a). Create the Hive metastore database and user
mysql> create database hive;
mysql> CREATE USER 'hive' IDENTIFIED BY 'mysql';
mysql> grant all privileges on *.* to 'root'@'%' identified by '123456' with grant option;
mysql> GRANT ALL PRIVILEGES ON *.* TO 'hive'@'%' WITH GRANT OPTION;
mysql> flush privileges;
b). Edit hive-site.xml
<!-- Store the Hive metastore in MySQL -->
<property>
  <name>hive.metastore.local</name>
  <value>true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://192.168.220.20:3306/hive?useSSL=false&amp;characterEncoding=UTF-8</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>mysql</value>
</property>
c). Copy the MySQL driver jar into Hive's lib directory
# cp /root/Downloads/mysql-connector-java-5.1.39-bin.jar /usr/local/hive1.2/lib
d). Disable SSL in MySQL
Edit the my.cnf configuration file and add one line under the [mysqld] section: skip_ssl. Then restart MySQL and verify:
mysql> show variables like '%ssl%';
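For clarity, the my.cnf change is a one-line addition under the existing [mysqld] section:

```
[mysqld]
skip_ssl
```

After restarting MySQL, the show variables query above should report have_ssl as DISABLED.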
7) Create a test table in MySQL
mysql> create database test;
mysql> use test;
mysql> create table smq_mysql(id int,name varchar(50));
mysql> insert into smq_mysql values(1,'a1');
mysql> insert into smq_mysql values(2,'a2');
mysql> commit;
8) Connect Sqoop to MySQL
[root@master conf]# sqoop list-tables --connect jdbc:mysql://192.168.220.20:3306/test --username root --password 123456
9) Create the Hive table with Sqoop
[root@master conf]# sqoop create-hive-table --connect jdbc:mysql://192.168.220.20:3306/test --username root --password 123456 --table smq_mysql --hive-table test.smq_mysql --fields-terminated-by ',' --hive-overwrite
10) Import into Hive with Sqoop
[root@master conf]# sqoop import --connect jdbc:mysql://192.168.220.20:3306/test --username root --password 123456 --table smq_mysql --hive-table test.smq_mysql --hive-import --fields-terminated-by ',' --hive-overwrite -m 1
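For repeated runs, the same import can be kept in a Sqoop options file (one option or value per line) and invoked with Sqoop's --options-file flag; the file name import-smq.txt below is only an illustrative choice mirroring the command above:

```
import
--connect
jdbc:mysql://192.168.220.20:3306/test
--username
root
--password
123456
--table
smq_mysql
--hive-import
--hive-table
test.smq_mysql
--fields-terminated-by
,
--hive-overwrite
-m
1
```

It would then be run as: sqoop --options-file /root/import-smq.txt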
11) Export from Hive with Sqoop
[root@master bin]# hadoop fs -ls /user/hive/warehouse
[root@master bin]# hadoop fs -ls /user/root/.Trash/Current/user/hive/warehouse
mysql> create table exp_smq_mysql as select * from smq_mysql where 1=2;
[root@master ~]# sqoop export --connect jdbc:mysql://192.168.220.20:3306/test --username root --password 123456 --table exp_smq_mysql --export-dir /user/hive/warehouse/test.db/smq_mysql