Solving a druid.io + Hadoop integration failure: /hdp/apps/${hdp.version}/mapreduce/mapreduce.tar.gz#mr-framework
2017-07-06 11:08
Environment:
CentOS release 6.5 (Final)
druid.io 0.10.0
The problem
A task ingesting data from Hadoop into Druid failed. Inspecting the MiddleManager (peon) log revealed the following error:

2017-07-06T01:58:01,637 ERROR [task-runner-0-priority-0] io.druid.indexing.overlord.ThreadPoolTaskRunner - Exception while running task[HadoopIndexTask{id=index_hadoop_demo_2017-07-06T01:57:43.791Z, type=index_hadoop, dataSource=demo}]
java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
    at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-16.0.1.jar:?]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:211) ~[druid-indexing-service-0.10.0.jar:0.10.0]
    at io.druid.indexing.common.task.HadoopIndexTask.run(HadoopIndexTask.java:176) ~[druid-indexing-service-0.10.0.jar:0.10.0]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:436) [druid-indexing-service-0.10.0.jar:0.10.0]
    at io.druid.indexing.overlord.ThreadPoolTaskRunner$ThreadPoolTaskRunnerCallable.call(ThreadPoolTaskRunner.java:408) [druid-indexing-service-0.10.0.jar:0.10.0]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_77]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_77]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_77]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_77]
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_77]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_77]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_77]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_77]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:208) ~[druid-indexing-service-0.10.0.jar:0.10.0]
    ... 7 more
Caused by: java.lang.IllegalArgumentException: Unable to parse '/hdp/apps/${hdp.version}/mapreduce/mapreduce.tar.gz#mr-framework' as a URI, check the setting for mapreduce.application.framework.path
    at org.apache.hadoop.mapreduce.JobSubmitter.addMRFrameworkToDistributedCache(JobSubmitter.java:443) ~[?:?]
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:142) ~[?:?]
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290) ~[?:?]
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287) ~[?:?]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_77]
    at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_77]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698) ~[?:?]
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287) ~[?:?]
    at io.druid.indexer.DetermineHashedPartitionsJob.run(DetermineHashedPartitionsJob.java:116) ~[druid-indexing-hadoop-0.10.0.jar:0.10.0]
    at io.druid.indexer.JobHelper.runJobs(JobHelper.java:349) ~[druid-indexing-hadoop-0.10.0.jar:0.10.0]
    at io.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:91) ~[druid-indexing-hadoop-0.10.0.jar:0.10.0]
    at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:306) ~[druid-indexing-service-0.10.0.jar:0.10.0]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_77]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_77]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_77]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_77]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:208) ~[druid-indexing-service-0.10.0.jar:0.10.0]
    ... 7 more
Caused by: java.net.URISyntaxException: Illegal character in path at index 11: /hdp/apps/${hdp.version}/mapreduce/mapreduce.tar.gz#mr-framework
    at java.net.URI$Parser.fail(URI.java:2848) ~[?:1.8.0_77]
    at java.net.URI$Parser.checkChars(URI.java:3021) ~[?:1.8.0_77]
    at java.net.URI$Parser.parseHierarchical(URI.java:3105) ~[?:1.8.0_77]
    at java.net.URI$Parser.parse(URI.java:3063) ~[?:1.8.0_77]
    at java.net.URI.<init>(URI.java:588) ~[?:1.8.0_77]
    at org.apache.hadoop.mapreduce.JobSubmitter.addMRFrameworkToDistributedCache(JobSubmitter.java:441) ~[?:?]
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:142) ~[?:?]
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290) ~[?:?]
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287) ~[?:?]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_77]
    at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_77]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1698) ~[?:?]
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287) ~[?:?]
    at io.druid.indexer.DetermineHashedPartitionsJob.run(DetermineHashedPartitionsJob.java:116) ~[druid-indexing-hadoop-0.10.0.jar:0.10.0]
    at io.druid.indexer.JobHelper.runJobs(JobHelper.java:349) ~[druid-indexing-hadoop-0.10.0.jar:0.10.0]
    at io.druid.indexer.HadoopDruidDetermineConfigurationJob.run(HadoopDruidDetermineConfigurationJob.java:91) ~[druid-indexing-hadoop-0.10.0.jar:0.10.0]
    at io.druid.indexing.common.task.HadoopIndexTask$HadoopDetermineConfigInnerProcessing.runTask(HadoopIndexTask.java:306) ~[druid-indexing-service-0.10.0.jar:0.10.0]
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_77]
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:1.8.0_77]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_77]
    at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_77]
    at io.druid.indexing.common.task.HadoopTask.invokeForeignLoader(HadoopTask.java:208) ~[druid-indexing-service-0.10.0.jar:0.10.0]
    ... 7 more
2017-07-06T01:58:01,653 INFO [task-runner-0-priority-0] io.druid.indexing.overlord.TaskRunnerUtils - Task [index_hadoop_demo_2017-07-06T01:57:43.791Z] status changed to [FAILED].
2017-07-06T01:58:01,656 INFO [task-runner-0-priority-0] io.druid.indexing.worker.executor.ExecutorLifecycle - Task completed with status: {
  "id" : "index_hadoop_demo_2017-07-06T01:57:43.791Z",
  "status" : "FAILED",
  "duration" : 13783
}
Root cause
The root cause is stated plainly in the deepest exception:

java.net.URISyntaxException: Illegal character in path at index 11: /hdp/apps/${hdp.version}/mapreduce/mapreduce.tar.gz#mr-framework

The ${hdp.version} variable is not defined on the Druid cluster machines, so Hadoop's mapreduce.application.framework.path setting is passed through unresolved, and the literal { at index 11 is not a legal URI path character.
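The failure can be reproduced in isolation: Hadoop's JobSubmitter simply hands the configured path to java.net.URI. The sketch below (the UriCheck class and failingIndex helper are ours for illustration, not part of Druid or Hadoop) shows the unresolved path failing at index 11 and a substituted path parsing cleanly:

```java
import java.net.URI;
import java.net.URISyntaxException;

public class UriCheck {
    // Returns -1 if the string parses as a URI, otherwise the index of
    // the first illegal character reported by the parser.
    static int failingIndex(String s) {
        try {
            new URI(s);
            return -1;
        } catch (URISyntaxException e) {
            return e.getIndex();
        }
    }

    public static void main(String[] args) {
        String unresolved = "/hdp/apps/${hdp.version}/mapreduce/mapreduce.tar.gz#mr-framework";
        String resolved   = "/hdp/apps/2.5.3.0-37/mapreduce/mapreduce.tar.gz#mr-framework";
        System.out.println(failingIndex(unresolved)); // prints 11: the '{' character
        System.out.println(failingIndex(resolved));   // prints -1: parses once the variable is substituted
    }
}
```

This confirms that nothing is wrong with the framework archive itself; the JVM that submits the job just never saw a value for hdp.version.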
Solution
Edit the MiddleManager configuration file conf/druid/middleManager/runtime.properties and add the HDP version as a system property to druid.indexer.runner.javaOpts:

-Dhdp.version=2.5.3.0-37

(substitute your cluster's HDP version string). For example:

druid.indexer.runner.javaOpts=-server -Xms1g -Xmx1g -XX:+UseG1GC -Duser.timezone=UTC -Dfile.encoding=UTF-8 -Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager -Dhdp.version=2.5.3.0-37
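If you manage several MiddleManagers, the edit can be scripted. The helper below is a sketch (the add_hdp_version name is ours, and it assumes GNU sed and a single javaOpts line; back the file up first). On an HDP node the version string can usually be found with hdp-select versions or ls /usr/hdp:

```shell
# Append -Dhdp.version=<version> to the druid.indexer.runner.javaOpts line
# of the given runtime.properties file, in place.
add_hdp_version() {
  local props="$1" version="$2"
  sed -i "s|^\(druid\.indexer\.runner\.javaOpts=.*\)|\1 -Dhdp.version=${version}|" "$props"
}

# Example (hypothetical path):
# add_hdp_version conf/druid/middleManager/runtime.properties 2.5.3.0-37
```

After the change, restart the MiddleManager so newly spawned peon JVMs pick up the property; already-running tasks keep their old flags.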