Installing and Deploying Hive 2.1.1
2017-05-11 15:06
For an introduction to Hive and background reading, see:
http://www.aboutyun.com/thread-20461-1-1.html
http://blog.csdn.net/lifuxiangcaohui/article/details/40145859
https://mp.weixin.qq.com/s?__biz=MzIzODExMDE5MA==&mid=2694182433&idx=1&sn=687b754cddc7255026434c683f487ac0#rd
http://blog.csdn.net/wangmuming/article/details/25226951
2. Installation and Deployment
Hive uses MySQL for its metastore, so a working MySQL database is needed first. After downloading the release package, extract it under /home/bigdata/run on host ubuntu1:
```
bigdata@ubuntu1:~/download$ tar -zxvf apache-hive-2.1.1-bin.tar.gz
bigdata@ubuntu1:~/download$ mv apache-hive-2.1.1-bin ../run/
bigdata@ubuntu1:~/run$ ln -s apache-hive-2.1.1-bin hive
```
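With the symlink in place, it is convenient to put Hive on the PATH as well. A minimal sketch for ~/.profile, assuming the directory layout above:

```shell
# Sketch for ~/.profile; paths assumed from the layout above
export HIVE_HOME=/home/bigdata/run/hive
export PATH=$PATH:$HIVE_HOME/bin
echo "HIVE_HOME is $HIVE_HOME"
```

Because the symlink points at the versioned directory, a later upgrade only needs the symlink repointed, not every script edited.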
3. Edit the Configuration Files
hive-env.sh should contain:

```
bigdata@ubuntu1:~/run/hive/conf$ cat hive-env.sh
......
export JAVA_HOME=/home/bigdata/usr/jdk1.8.0_131
export HADOOP_HOME=/home/bigdata/run/hadoop
export HIVE_HOME=/home/bigdata/run/hive
```
In hive-site.xml, set the following properties:
```
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://master:3306/hive?createDatabaseIfNotExist=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
    <description>Driver class name for a JDBC metastore</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>mysql_username</value>
    <description>Username to use against metastore database</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>mysql_password</value>
    <description>password to use against metastore database</description>
  </property>
</configuration>
```
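The JDBC URL above expects a `hive` database reachable on `master`. It can also be created up front; a sketch, run as a MySQL administrator, where the user name and password are the same placeholders as in the properties above:

```sql
-- Sketch: prepare the metastore database; user/password are the
-- placeholder values from hive-site.xml, not real credentials
CREATE DATABASE IF NOT EXISTS hive CHARACTER SET latin1;
CREATE USER 'mysql_username'@'%' IDENTIFIED BY 'mysql_password';
GRANT ALL PRIVILEGES ON hive.* TO 'mysql_username'@'%';
FLUSH PRIVILEGES;
```

Creating the database as latin1 up front also sidesteps the "max key length is 767 bytes" error discussed at the end of this post.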
Download the MySQL JDBC driver jar (e.g. mysql-connector-java-5.1.x-bin.jar) and place it in $HIVE_HOME/lib.
Start Hive:
```
bigdata@ubuntu1:~/run/hive/bin$ ./hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/bigdata/run/apache-hive-2.1.1-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/bigdata/run/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in file:/home/bigdata/run/apache-hive-2.1.1-bin/conf/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>
```
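Once the prompt appears, a quick smoke test confirms that the metastore connection works; the table name here is purely illustrative:

```sql
-- Illustrative only: create and drop a throwaway table
CREATE TABLE smoke_test (id INT, name STRING);
SHOW TABLES;
DROP TABLE smoke_test;
```

If the CREATE TABLE succeeds, Hive is talking to MySQL correctly.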
Installing hwi, Hive's Web UI
The hwi war file is not shipped in the binary release, so build it from the Hive source package:

```
wget https://mirrors.tuna.tsinghua.edu.cn/apache/hive/hive-2.1.1/apache-hive-2.1.1-src.tar.gz
tar -zxvf apache-hive-2.1.1-src.tar.gz
cd apache-hive-2.1.1-src/hwi/web
jar -cvf hive-hwi-2.1.1.war *
mv hive-hwi-2.1.1.war /home/bigdata/run/hive/lib/
```
Add the hwi-related properties to hive-site.xml:
```
<property>
  <name>hive.hwi.listen.host</name>
  <value>0.0.0.0</value>
</property>
<property>
  <name>hive.hwi.listen.port</name>
  <value>9999</value>
</property>
<property>
  <name>hive.hwi.war.file</name>
  <value>lib/hive-hwi-2.1.1.war</value>
</property>
```
Copy tools.jar

For whatever reason this problem has existed since very old versions: hwi cannot find tools.jar on its own, so copy it into Hive's lib directory manually:

```
cp ${JAVA_HOME}/lib/tools.jar ${HIVE_HOME}/lib
```
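hwi compiles its JSP pages at runtime and needs the JDK compiler classes packaged in tools.jar, which is why the copy matters. A quick sanity check, with the path assumed from the layout above:

```shell
# Check that tools.jar landed in Hive's lib directory (path assumed)
HIVE_HOME=/home/bigdata/run/hive
if [ -f "$HIVE_HOME/lib/tools.jar" ]; then
  echo "tools.jar present"
else
  echo "tools.jar missing: hwi will fail to compile its JSP pages"
fi
```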
Install Ant
```
bigdata@ubuntu1:~/download$ wget https://mirrors.tuna.tsinghua.edu.cn/apache//ant/binaries/apache-ant-1.10.1-bin.tar.gz
bigdata@ubuntu1:~/download$ tar -zxvf apache-ant-1.10.1-bin.tar.gz
bigdata@ubuntu1:~/download$ mv apache-ant-1.10.1 ../run/
bigdata@ubuntu1:~/download$ cd ../run/
bigdata@ubuntu1:~/run$ ln -s apache-ant-1.10.1 ant
bigdata@ubuntu1:~/run$ cp apache-ant-1.10.1/lib/ant.jar hive/lib/ant-1.10.1.jar
```
Add the Ant environment variables:
```
bigdata@ubuntu1:~/run$ cat ~/.profile
......
export ANT_HOME=/home/bigdata/run/ant
export PATH=$PATH:$ANT_HOME/bin
```
Start hwi:

```
bigdata@ubuntu1:~/run/hive/bin$ ./hive --service hwi &
```
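A quick way to check that hwi actually came up, assuming you are on the same host; the host and port are taken from the hive-site.xml settings above, so adjust if yours differ:

```shell
# Probe the hwi endpoint; host/port assumed from the hive-site.xml above
if curl -sf http://localhost:9999/hwi >/dev/null 2>&1; then
  hwi_status="up"
else
  hwi_status="not reachable"
fi
echo "hwi is $hwi_status"
```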
Then open the page in a browser: http://10.3.19.171:9999/hwi
Hive reports "Specified key was too long; max key length is 767 bytes" when creating tables

Solution: switch the metastore database to a single-byte character set. InnoDB limits index keys to 767 bytes; under utf8 each character can take up to 3 bytes, so Hive's indexed VARCHAR columns exceed the limit, while latin1 (one byte per character) stays within it:

```
mysql> alter database hive character set latin1;
```

See:
http://blog.csdn.net/keljony/article/details/43371995
http://www.cnblogs.com/h2-database/archive/2011/12/06/2583296.html
http://www.cnblogs.com/xing901022/p/5827165.html