
Setting up a Hive on Spark environment

2015-09-24 17:24
Environment: hadoop2.6.0 + hive1.2.1 + spark1.3.1

1. Installing Hive 1.2.1 fails with the following error:
[ERROR] Terminal initialization failed; falling back to unsupported
java.lang.IncompatibleClassChangeError: Found class jline.Terminal, but interface was expected
at jline.TerminalFactory.create(TerminalFactory.java:101)
at jline.TerminalFactory.get(TerminalFactory.java:158)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:229)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:221)
at jline.console.ConsoleReader.<init>(ConsoleReader.java:209)
at org.apache.hadoop.hive.cli.CliDriver.setupConsoleReader(CliDriver.java:787)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:721)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Solution: Hadoop ships an older jline that conflicts with the jline2 bundled with Hive 1.2.1, so run export HADOOP_USER_CLASSPATH_FIRST=true before starting Hive so that Hive's own jars are loaded first.

2. Build Spark 1.3.1: download the source, install Maven, then run the following command:
./make-distribution.sh --name "hadoop2.6.0-without-hive" --tgz "-Pyarn,hadoop-provided,hadoop-2.6" -Dhadoop.version=2.6.0 -Dyarn.version=2.6.0 -DskipTests
3. Copy lib/spark-assembly-1.3.1-hadoop2.6.0.jar from the Spark build into Hive's lib directory.
4. Set the Hive parameters. They can either be written into hive-site.xml, or Spark's configuration file can be placed into Hive's conf directory:
set yarn.resourcemanager.scheduler.class=org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler;

set spark.home=/usr/local/spark;

set hive.execution.engine=spark;
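Written as hive-site.xml entries, the same settings would look roughly like this (a sketch; the spark.home value must match your actual Spark install path):

```xml
<!-- Sketch of equivalent hive-site.xml entries; adjust spark.home to your environment -->
<property>
  <name>hive.execution.engine</name>
  <value>spark</value>
</property>
<property>
  <name>spark.home</name>
  <value>/usr/local/spark</value>
</property>
<property>
  <name>yarn.resourcemanager.scheduler.class</name>
  <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler</value>
</property>
```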
5. Configure the Spark parameters by copying spark-defaults.conf into Hive's conf directory.
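A minimal spark-defaults.conf for this setup might contain entries like the following (the values are illustrative assumptions, not from the original post; tune memory settings to your cluster):

```
spark.master              yarn-cluster
spark.executor.memory     1g
spark.driver.memory       1g
spark.serializer          org.apache.spark.serializer.KryoSerializer
```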
Below is a test example.
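A minimal smoke test, assuming a populated table named src already exists (any small table works): run a query that launches a job, and check that the Hive console reports Spark stages rather than MapReduce jobs.

```
set hive.execution.engine=spark;
select count(*) from src;
```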