
Installing Hive 2.0.0 on CentOS 6.5: A Detailed Guide and Troubleshooting

2017-01-04 18:32
Portions excerpted from: http://www.centoscn.com/image-text/install/2016/0504/7167.html

This article uses the following environment:

OS: CentOS 6.5, 64-bit

Hive version: 2.0.0

JDK version: 1.8.0, 64-bit

Hadoop version: 2.6.2


1. Prerequisites

Hive 2.0 requires the following environment:

Java 1.7 or later (Java 1.8 is strongly recommended)

Hadoop 2.x
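
Before installing, you can confirm both prerequisites from the shell (assuming java and hadoop are already on the PATH):
java -version     # should report 1.7 or later
hadoop version    # should report a 2.x release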


2. Download and extract the Hive package

Hive official site: http://hive.apache.org/

For example:
wget "http://mirrors.cnnic.cn/apache/hive/hive-2.0.0/apache-hive-2.0.0-bin.tar.gz"
tar -xzvf apache-hive-2.0.0-bin.tar.gz
mv apache-hive-2.0.0-bin /opt/hive-2.0.0
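
If the extraction worked, the install directory should contain the usual layout (bin, conf, and lib, among others):
ls /opt/hive-2.0.0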


3. Configure environment variables (optional)

Add hive-2.0.0/bin to the PATH for easier access:
vi /etc/profile


Append at the end (note the export, so the variables reach child shells):
export HIVE_HOME=/opt/hive-2.0.0
export PATH=$PATH:$HIVE_HOME/bin
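
To apply the change to the current shell and verify that Hive is now on the PATH:
source /etc/profile
hive --version    # should report 2.0.0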


4. Starting in standalone mode

Like Hadoop, Hive has three deployment modes: standalone, pseudo-distributed, and distributed. This section covers starting in standalone mode.


4.1 Edit the configuration file

cd /opt/hive-2.0.0/conf
vi hive-site.xml    # you could instead start from hive-default.xml.template, but it contains far too many properties


Enter the following content and save:
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>hive.metastore.warehouse.dir</name>
    <value>/home/hadoop/hive-2.0.0/warehouse</value>
    <description>location of default database for the warehouse</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:derby:/home/hadoop/hive-2.0.0/metastore_db;create=true</value>
    <description>JDBC connect string for a JDBC metastore</description>
  </property>
</configuration>


4.2 Initialize the database

schematool -initSchema -dbType derby


Output like the following indicates successful initialization:
[root@master conf]# schematool -initSchema -dbType derby
which: no hbase in (/usr/java/jdk1.8.0/bin:/usr/java/jdk1.8.0/bin:/usr/java/jdk1.8.0/bin:/usr/lib64/qt-3.3/bin:/usr/java/jdk1.8.0/bin:/usr/local/bin:/bin:/usr/bin::/usr/local/sbin:/usr/sbin:/sbin:/home/hadoop/bin:/home/hadoop/hadoop-2.6.2/bin:/home/hadoop/hadoop-2.6.2/sbin:/bin:/home/hadoop/hive-0.12.0:/home/hadoop/hive-2.0.0/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hive-2.0.0/lib/hive-jdbc-2.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hive-2.0.0/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop-2.6.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL:     jdbc:derby:/home/hadoop/hive-2.0.0/metastore_db;create=true
Metastore Connection Driver :     org.apache.derby.jdbc.EmbeddedDriver
Metastore connection User:     APP
Starting metastore schema initialization to 2.0.0
Initialization script hive-schema-2.0.0.derby.sql
Initialization script completed
schemaTool completed
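
As an optional sanity check, schematool can also report the schema version it just created (same Derby settings as above):
schematool -info -dbType derby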


4.3 Start Hive

mkdir -p /home/hadoop/hive-2.0.0/warehouse     # create the warehouse directory configured above (it holds table data; metadata lives in metastore_db)
chmod a+rwx /home/hadoop/hive-2.0.0/warehouse  # make it writable
hive


If the
hive>
prompt appears, the startup succeeded:

Logging initialized using configuration in jar:file:/home/hadoop/hive-2.0.0/lib/hive-common-2.0.0.jar

Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.

hive> show databases;
OK
default
Time taken: 1.902 seconds, Fetched: 1 row(s)
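
The same check can be run non-interactively; hive -e executes a HiveQL string and exits, which is handy for scripted smoke tests:
hive -e "show databases;"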


5. Common errors


5.1 Running hive produces

Exception in thread "main" java.lang.RuntimeException: Hive metastore database is not initialized.
Please use schematool (e.g. ./schematool -initSchema -dbType ...) to create the schema. If needed,
don't forget to include the option to auto-create the underlying database in your JDBC connection
string (e.g. ?createDatabaseIfNotExist=true for mysql)


Cause: the metastore database has not been initialized; see section 4.2.


5.2 Initializing the database with schematool produces

Initialization script hive-schema-2.0.0.derby.sql
Error: FUNCTION 'NUCLEUS_ASCII' already exists. (state=X0Y68,code=30000)
org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
*** schemaTool failed ***


Cause: the database folder already contains files. The fix is to empty the database folder (that is, the
/home/hadoop/hive-2.0.0/metastore_db
directory configured earlier) and run the initialization again.
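
A minimal recovery sketch, assuming the Derby path configured in hive-site.xml above (this wipes all metastore data, which is only acceptable for a fresh install):
rm -rf /home/hadoop/hive-2.0.0/metastore_db
schematool -initSchema -dbType derby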

5.3 Running hive fails with the following error (excerpt):

[root@master conf]# hive
which: no hbase in (/usr/java/jdk1.8.0/bin:...:/home/hadoop/hive-2.0.0/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/hive-2.0.0/lib/hive-jdbc-2.0.0-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hive-2.0.0/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/hadoop-2.6.2/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]

Logging initialized using configuration in jar:file:/home/hadoop/hive-2.0.0/lib/hive-common-2.0.0.jar
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/":hadoop:supergroup:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
    ...
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=WRITE, inode="/":hadoop:supergroup:drwxr-xr-x
    ...
    at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:640)
    at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:597)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:526)
    ... 9 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=root, access=WRITE,
inode="/":hadoop:supergroup:drwxr-xr-x
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkFsPermission(FSPermissionChecker.java:271)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:257)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:238)
    at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:...)


Cause: a permissions problem. The message says the root user has no write access to the HDFS root directory: Hive tries to create its scratch and warehouse directories in HDFS, but inode "/" is owned by hadoop:supergroup with mode drwxr-xr-x, so only the hadoop user may write there. Searching for this error turns up the same failure in other contexts, e.g. from an Oozie action:
org.apache.oozie.action.ActionExecutorException: JA002: org.apache.hadoop.security.AccessControlException: Permission denied: user=xxj, access=WRITE, inode="user":hadoop:supergroup:rwxr-xr-x

Fix: add the following entry to Hadoop's configuration file conf/hdfs-site.xml:

[hadoop@master hadoop]$ vim hdfs-site.xml

<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
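
Note that dfs.permissions=false disables HDFS permission checking entirely, which is only appropriate for a test cluster; a gentler alternative is to run hive as the hadoop user that owns HDFS, or to pre-create and open up the directories Hive writes to. A sketch (using the standard Hive directory layout, which may differ in your setup):

stop-dfs.sh && start-dfs.sh    # restart HDFS so the config change takes effect (Hadoop 2.x sbin scripts)
# alternative to disabling permissions: create Hive's scratch and warehouse
# directories in HDFS and make them writable (run as the hadoop user)
hdfs dfs -mkdir -p /tmp/hive /user/hive/warehouse
hdfs dfs -chmod a+w /tmp/hive /user/hive/warehouse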

After that, starting hive again works normally.
Tags: hive