Hive 1.x: Building from Source, Installation, Configuration, and Issues Encountered
2018-02-07 23:55
http://blog.csdn.net/wjl7813/article/details/79101837 Hadoop build from source
http://blog.csdn.net/wjl7813/article/details/79101817 Hadoop pseudo-distributed installation
hadoop-2.6.0-cdh5.7.0 was already built from source, installed, and configured earlier, so Hive is set up the same way here: by building from source.
wget http://archive.cloudera.com/cdh5/cdh/5/hive-1.1.0-cdh5.7.0-src.tar.gz
tar xf hive-1.1.0-cdh5.7.0-src.tar.gz
cd hive-1.1.0-cdh5.7.0
mvn clean package -DskipTests -Phadoop-2 -Pdist
[INFO] Executed tasks
[INFO]
[INFO] --- maven-assembly-plugin:2.3:single (assemble) @ hive-packaging ---
[INFO] Reading assembly descriptor: src/main/assembly/bin.xml
[INFO] Reading assembly descriptor: src/main/assembly/src.xml
[INFO] Copying files to /home/hadoop/source/hive-1.1.0-cdh5.7.0/packaging/target/apache-hive-1.1.0-cdh5.7.0-bin
[WARNING] Assembly file: /home/hadoop/source/hive-1.1.0-cdh5.7.0/packaging/target/apache-hive-1.1.0-cdh5.7.0-bin is not a regular file (it may be a directory). It cannot be attached to the project build for installation or deployment.
[INFO] Building tar: /home/hadoop/source/hive-1.1.0-cdh5.7.0/packaging/target/apache-hive-1.1.0-cdh5.7.0-bin.tar.gz
[INFO] Building tar: /home/hadoop/source/hive-1.1.0-cdh5.7.0/packaging/target/apache-hive-1.1.0-cdh5.7.0-src.tar.gz
[INFO]
[INFO] --- maven-dependency-plugin:2.8:copy (copy) @ hive-packaging ---
[INFO] Configured Artifact: org.apache.hive:hive-jdbc:standalone:1.1.0-cdh5.7.0:jar
[INFO] Copying hive-jdbc-1.1.0-cdh5.7.0-standalone.jar to /home/hadoop/source/hive-1.1.0-cdh5.7.0/packaging/target/apache-hive-1.1.0-cdh5.7.0-jdbc.jar
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:attach-artifact (attach-jdbc-driver) @ hive-packaging ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Hive ............................................... SUCCESS [ 12.400 s]
[INFO] Hive Shims Common .................................. SUCCESS [ 16.114 s]
[INFO] Hive Shims 0.23 .................................... SUCCESS [ 8.276 s]
[INFO] Hive Shims Scheduler ............................... SUCCESS [ 2.057 s]
[INFO] Hive Shims ......................................... SUCCESS [ 1.874 s]
[INFO] Hive Common ........................................ SUCCESS [ 17.247 s]
[INFO] Hive Serde ......................................... SUCCESS [ 8.692 s]
[INFO] Hive Metastore ..................................... SUCCESS [ 18.549 s]
[INFO] Hive Ant Utilities ................................. SUCCESS [ 3.011 s]
[INFO] Spark Remote Client ................................ SUCCESS [ 17.263 s]
[INFO] Hive Query Language ................................ SUCCESS [01:43 min]
[INFO] Hive Service ....................................... SUCCESS [ 44.240 s]
[INFO] Hive Accumulo Handler .............................. SUCCESS [ 8.814 s]
[INFO] Hive JDBC .......................................... SUCCESS [01:12 min]
[INFO] Hive Beeline ....................................... SUCCESS [ 3.989 s]
[INFO] Hive CLI ........................................... SUCCESS [ 2.731 s]
[INFO] Hive Contrib ....................................... SUCCESS [ 5.466 s]
[INFO] Hive HBase Handler ................................. SUCCESS [ 10.460 s]
[INFO] Hive HCatalog ...................................... SUCCESS [ 0.859 s]
[INFO] Hive HCatalog Core ................................. SUCCESS [ 5.771 s]
[INFO] Hive HCatalog Pig Adapter .......................... SUCCESS [ 3.890 s]
[INFO] Hive HCatalog Server Extensions .................... SUCCESS [ 3.606 s]
[INFO] Hive HCatalog Webhcat Java Client .................. SUCCESS [ 3.527 s]
[INFO] Hive HCatalog Webhcat .............................. SUCCESS [ 13.319 s]
[INFO] Hive HCatalog Streaming ............................ SUCCESS [ 3.958 s]
[INFO] Hive HWI ........................................... SUCCESS [ 2.189 s]
[INFO] Hive ODBC .......................................... SUCCESS [ 3.507 s]
[INFO] Hive Shims Aggregator .............................. SUCCESS [ 0.710 s]
[INFO] Hive TestUtils ..................................... SUCCESS [ 1.086 s]
[INFO] Hive Packaging ..................................... SUCCESS [01:45 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 08:28 min
[INFO] Finished at: 2018-02-09T10:50:56+08:00
[INFO] Final Memory: 205M/845M
[INFO] ------------------------------------------------------------------------
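The binary tarball produced by the build sits under packaging/target (see the "Building tar" lines in the log above). It was copied into the software directory and renamed before extraction; a minimal sketch, assuming the paths from the log:
cp /home/hadoop/source/hive-1.1.0-cdh5.7.0/packaging/target/apache-hive-1.1.0-cdh5.7.0-bin.tar.gz \
   /home/hadoop/software/hive-1.1.0-cdh5.7.0.tar.gz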
[hadoop@node1 software]$ tar xf hive-1.1.0-cdh5.7.0.tar.gz -C /home/hadoop/app
[hadoop@node1 hive-1.1.0-cdh5.7.0]$ pwd
/home/hadoop/app/hive-1.1.0-cdh5.7.0
[hadoop@node1 hive-1.1.0-cdh5.7.0]$ cat /home/hadoop/.bash_profile |grep HIVE
###HIVE_HOME
export HIVE_HOME=/home/hadoop/app/hive-1.1.0-cdh5.7.0
export PATH=$HIVE_HOME/bin:$PATH
source /home/hadoop/.bash_profile
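A quick sanity check that the new PATH took effect and the launcher resolves to the freshly installed tree:
which hive
hive --version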
Edit HADOOP_HOME in hive-env.sh
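hive-env.sh is not present in a freshly unpacked distribution; it is typically created from the template shipped in conf/ (assuming the standard conf layout):
cd /home/hadoop/app/hive-1.1.0-cdh5.7.0/conf
cp hive-env.sh.template hive-env.sh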
[hadoop@node1 conf]$ cat hive-env.sh |grep HADOOP_HOME
# Set HADOOP_HOME to point to a specific hadoop install directory
HADOOP_HOME=/home/hadoop/app/hadoop-2.6.0-cdh5.7.0
Edit the hive-site.xml file
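Note that conf/ ships only hive-default.xml.template; hive-site.xml has to be created by hand under $HIVE_HOME/conf with the contents below:
cd /home/hadoop/app/hive-1.1.0-cdh5.7.0/conf
vi hive-site.xml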
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://node1.oracle.com:3306/hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
  </property>
  <property>
    <name>hive.cli.print.current.db</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.cli.print.header</name>
    <value>true</value>
  </property>
  <property>
    <name>hive.server2.thrift.port</name>
    <value>10000</value>
  </property>
  <property>
    <name>hive.server2.thrift.bind.host</name>
    <value>node1.oracle.com</value>
  </property>
  <property>
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
  </property>
</configuration>
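With hive.metastore.schema.verification set to false, the metastore creates its tables lazily on first use. Alternatively, the schema can be initialized up front with schematool, which ships with Hive; a sketch, assuming the MySQL settings above are already in place:
$HIVE_HOME/bin/schematool -dbType mysql -initSchema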
[hadoop@node1 software]$ pwd
/home/hadoop/software
[hadoop@node1 software]$ tar xf mysql-connector-java-5.1.40.tar.gz
[hadoop@node1 software]$ cp -rp mysql-connector-java-5.1.40/mysql-connector-java-5.1.40-bin.jar /home/hadoop/app/hive-1.1.0-cdh5.7.0/lib/
Log in to MySQL, create the hive database, and grant privileges to the hive user:
create database hive;
alter database hive character set latin1;
grant all privileges on *.* to 'hive'@localhost identified by 'hive' with grant option;
grant all privileges on *.* to 'hive'@127.0.0.1 identified by 'hive' with grant option;
grant all privileges on *.* to 'hive'@'%' identified by 'hive' with grant option;
DELETE FROM `mysql`.`user` WHERE `user`='';
FLUSH PRIVILEGES;
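Before starting Hive, it is worth verifying that the hive account can actually reach MySQL with the credentials and host configured in hive-site.xml (a quick check; adjust host and password if yours differ):
mysql -uhive -phive -h node1.oracle.com -e 'show databases;'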
Start Hive
[hadoop@node1 ~]$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/app/spark-1.6.1-bin-2.6.0-cdh5.7.0/lib/spark-assembly-1.6.1-hadoop2.6.0-cdh5.7.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
which: no hbase in (/home/hadoop/app/hive-1.1.0-cdh5.7.0/bin:/usr/local/mysql/bin:/usr/java/jdk1.7.0_79/bin:/usr/java/jdk1.7.0_79/bin:/usr/java/jdk1.7.0_79/jre/bin:/usr/lib64/qt-3.3/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/root/bin:/home/hadoop/bin:/home/hadoop/app/apache-maven-3.3.9/bin:/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/bin:/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/sbin:/home/hadoop/app/scala-2.10.4/bin:/home/hadoop/app/spark-1.6.1-bin-2.6.0-cdh5.7.0/bin:/home/hadoop/app/spark-1.6.1-bin-2.6.0-cdh5.7.0/sbin:/home/hadoop/app/hue-3.9.0-cdh5.7.0/bin:/home/hadoop/app/zookeeper-3.4.5-cdh5.7.0/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/app/spark-1.6.1-bin-2.6.0-cdh5.7.0/lib/spark-assembly-1.6.1-hadoop2.6.0-cdh5.7.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/home/hadoop/app/hive-1.1.0-cdh5.7.0/lib/hive-common-1.1.0-cdh5.7.0.jar!/hive-log4j.properties
WARNING: Hive CLI is deprecated and migration to Beeline is recommended.
hive (default)>
[hadoop@node1 ~]$ more emp.txt
7369 SMITH CLERK 7902 1980-12-17 800.00 20
7499 ALLEN SALESMAN 7698 1981-2-20 1600.00 300.00 30
7521 WARD SALESMAN 7698 1981-2-22 1250.00 500.00 30
7566 JONES MANAGER 7839 1981-4-2 2975.00 20
7654 MARTIN SALESMAN 7698 1981-9-28 1250.00 1400.00 30
7698 BLAKE MANAGER 7839 1981-5-1 2850.00 30
7782 CLARK MANAGER 7839 1981-6-9 2450.00 10
7788 SCOTT ANALYST 7566 1987-4-19 3000.00 20
7839 KING PRESIDENT 1981-11-17 5000.00 10
7844 TURNER SALESMAN 7698 1981-9-8 1500.00 0.00 30
7876 ADAMS CLERK 7788 1987-5-23 1100.00 20
7900 JAMES CLERK 7698 1981-12-3 950.00 30
7902 FORD ANALYST 7566 1981-12-3 3000.00 20
7934 MILLER CLERK 7782 1982-1-23 1300.00 10
[hadoop@node1 ~]$ more dept.txt
10 ACCOUNTING NEW YORK
20 RESEARCH DALLAS
30 SALES CHICAGO
40 OPERATIONS BOSTON
[hadoop@node1 data]$ more spark_stud_info.txt
wujiadong 26
ji 24
sun 27
xu 25
[hadoop@node1 data]$ more spark_stud_score.txt
wujiadong 90
ji 100
sun 99
xu 99
============= Create tables in the default database =============
CREATE TABLE `dept`(
`deptno` int,
`dname` string,
`loc` string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t';
CREATE TABLE `emp`(
`empno` int,
`ename` string,
`job` string,
`mgr` int,
`hiredate` string,
`sal` double,
`comm` double,
`deptno` int)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t' ;
load data local inpath '/home/hadoop/data/emp.txt' into table emp;
load data local inpath '/home/hadoop/data/dept.txt' into table dept;
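A quick join against the two freshly loaded tables confirms that the tab delimiter was parsed correctly (a sanity check run from the shell; any small query would do):
hive -e "select e.empno, e.ename, d.dname from emp e join dept d on e.deptno = d.deptno limit 5;"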
create table spark_stud_score(
name string ,
store int
)
row format delimited fields terminated by '\t' ;
create table spark_stud_info(
name string,
age int
)
row format delimited fields terminated by '\t' STORED AS textfile;
load data local inpath '/home/hadoop/data/spark_stud_info.txt' into table spark_stud_info;
load data local inpath '/home/hadoop/data/spark_stud_score.txt' into table spark_stud_score;
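The same kind of check works for the two student tables (note the score column is named store in the DDL above):
hive -e "select i.name, i.age, s.store from spark_stud_info i join spark_stud_score s on i.name = s.name;"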
============= Issues encountered =============
Issue 1
hive> CREATE TABLE pokes (foo INT, bar STRING);
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:For direct MetaStore DB connections, we don't support retries at the client level.)
If you hit the error above, the fix is to change the hive metastore database's character set to latin1:
alter database hive character set latin1;
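The current character set can be inspected before and after the change (run as a MySQL user with access to the hive database):
mysql -uroot -p -e 'show create database hive;'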
Issue 2: MySQL privilege problems. Grant the hive user access and remove the anonymous accounts:
grant all privileges on *.* to 'hive'@localhost identified by 'hive' with grant option;
grant all privileges on *.* to 'hive'@127.0.0.1 identified by 'hive' with grant option;
grant all privileges on *.* to 'hive'@'%' identified by 'hive' with grant option;
DELETE FROM `mysql`.`user` WHERE `user`='';
FLUSH PRIVILEGES;
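To confirm the grants took effect and the anonymous accounts are gone, a quick check from the shell (assuming root access to MySQL):
mysql -uroot -p -e "select user, host from mysql.user;"
mysql -uhive -phive -e 'select current_user();'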