
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform...

Posted 2016-07-05 16:06
1. During HDFS startup, this warning always appears:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Most search results blame a bitness mismatch between the operating system and the Hadoop build, but that is not the case here.

This machine runs 64-bit CentOS, and libhadoop.so is also 64-bit:

$ file ../lib/native/libhadoop.so.1.0.0

../lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped


I even spent several days on this machine compiling everything from Hadoop through Spark from source, and the warning still appeared, so a bitness conflict clearly was not the cause.

Since the warning itself carries little useful information, enable debug logging:

export HADOOP_ROOT_LOGGER=DEBUG,console
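This export sticks for the whole shell session. Once debugging is done, it is worth dropping the console logger back to Hadoop's default level (INFO) so normal runs are not flooded with DEBUG output:

```shell
# Restore the default console log level after debugging
export HADOOP_ROOT_LOGGER=INFO,console
```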

Running ./stop-dfs.sh again now yields some useful debug output:

16/07/05 15:33:54 DEBUG util.Shell: setsid exited with exit code 0
16/07/05 15:33:54 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[Rate of successful kerberos logins and latency (milliseconds)], about=, valueName=Time, type=DEFAULT, always=false, sampleName=Ops)
16/07/05 15:33:54 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[Rate of failed kerberos logins and latency (milliseconds)], about=, valueName=Time, type=DEFAULT, always=false, sampleName=Ops)
16/07/05 15:33:54 DEBUG lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(value=[GetGroups], about=, valueName=Time, type=DEFAULT, always=false, sampleName=Ops)
16/07/05 15:33:54 DEBUG impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
16/07/05 15:33:55 DEBUG security.Groups:  Creating new Groups object
16/07/05 15:33:55 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
16/07/05 15:33:55 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
16/07/05 15:33:55 DEBUG util.NativeCodeLoader: java.library.path=/usr/local/sinasrv2/hadoop/hadoop-2.6.4/lib
16/07/05 15:33:55 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/07/05 15:33:55 DEBUG util.PerformanceAdvisory: Falling back to shell based
16/07/05 15:33:55 DEBUG security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
16/07/05 15:33:55 DEBUG security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
16/07/05 15:33:55 DEBUG security.UserGroupInformation: hadoop login
16/07/05 15:33:55 DEBUG security.UserGroupInformation: hadoop login commit
16/07/05 15:33:55 DEBUG security.UserGroupInformation: using local user:UnixPrincipal: hadoop
16/07/05 15:33:55 DEBUG security.UserGroupInformation: Using user: "UnixPrincipal: hadoop" with name hadoop
16/07/05 15:33:55 DEBUG security.UserGroupInformation: User entry: "hadoop"
16/07/05 15:33:55 DEBUG security.UserGroupInformation: UGI loginUser:hadoop (auth:SIMPLE)
16/07/05 15:33:55 DEBUG security.UserGroupInformation: PrivilegedAction as:hadoop (auth:SIMPLE) from:org.apache.hadoop.hdfs.tools.GetConf.run(GetConf.java:314)


The DEBUG lines above show exactly where things go wrong: libhadoop.so was not found anywhere on java.library.path. The path is set to /usr/local/sinasrv2/hadoop/hadoop-2.6.4/lib, while libhadoop.so actually lives in /usr/local/sinasrv2/hadoop/hadoop-2.6.4/lib/native.
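What System.loadLibrary("hadoop") does with that path can be mimicked in shell: walk every java.library.path entry, look for libhadoop.so, and fail with UnsatisfiedLinkError when no entry contains it. The sketch below is a toy model of that lookup using a throwaway directory as a stand-in for the real install, not the JVM's actual loader:

```shell
# Toy model of the JVM's library lookup (illustration only).
# A temp directory stands in for the Hadoop install tree.
demo=$(mktemp -d)
mkdir -p "$demo/lib/native"
touch "$demo/lib/native/libhadoop.so"

find_hadoop() {
  # $1 is a colon-separated java.library.path
  old_ifs=$IFS; IFS=':'
  for dir in $1; do
    if [ -e "$dir/libhadoop.so" ]; then
      IFS=$old_ifs
      echo "loaded from $dir"
      return 0
    fi
  done
  IFS=$old_ifs
  echo "UnsatisfiedLinkError: no hadoop in java.library.path"
  return 1
}

find_hadoop "$demo/lib"          # the broken setup: .../lib has no libhadoop.so
find_hadoop "$demo/lib/native"   # pointing at .../lib/native finds it
```

The first call corresponds to the failing DEBUG line above: the path entry exists, but the library simply is not in it.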

Grepping through the Hadoop distribution shows that java.library.path is populated from the $JAVA_LIBRARY_PATH environment variable:

$ grep -R 'java.library.path' *
bin/yarn:  YARN_OPTS="$YARN_OPTS -Djava.library.path=$JAVA_LIBRARY_PATH"
bin/yarn.cmd:    set YARN_OPTS=%YARN_OPTS% -Djava.library.path=%JAVA_LIBRARY_PATH%
etc/hadoop/yarn-env.sh:  YARN_OPTS="$YARN_OPTS -Djava.library.path=$JAVA_LIBRARY_PATH"
etc/hadoop/yarn-env.cmd:  set YARN_OPTS=%YARN_OPTS% -Djava.library.path=%JAVA_LIBRARY_PATH%
libexec/hadoop-config.sh:# setup 'java.library.path' for native-hadoop code if necessary
libexec/hadoop-config.sh:  HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$JAVA_LIBRARY_PATH"
libexec/hadoop-config.cmd:@rem setup 'java.library.path' for native hadoop code if necessary
libexec/hadoop-config.cmd:  set HADOOP_OPTS=%HADOOP_OPTS% -Djava.library.path=%JAVA_LIBRARY_PATH%
share/doc/hadoop/common/CHANGES.txt:    HADOOP-8756. Fix SEGV when libsnappy is in java.library.path but
share/doc/hadoop/common/CHANGES.txt:    HADOOP-1660. Add the cwd of the map/reduce task to the java.library.path
share/doc/hadoop/common/CHANGES.txt: 50. HADOOP-1493.  Permit specification of "java.library.path" system
share/doc/hadoop/common/CHANGES.txt:10. HADOOP-873.      Pass java.library.path correctly to child processes.
share/doc/hadoop/common/CHANGES.txt:42. HADOOP-838.  Fix tasktracker to pass java.library.path to
share/doc/hadoop/mapreduce/CHANGES.txt:    MAPREDUCE-4458. Warn if java.library.path is used for AM or Task
share/doc/hadoop/mapreduce/CHANGES.txt:    MAPREDUCE-4072. User set java.library.path seems to overwrite default
share/doc/hadoop/mapreduce/CHANGES.txt:    MAPREDUCE-3259. Added java.library.path of NodeManager to
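The libexec/hadoop-config.sh lines in the grep output above reduce to a simple rule: if $JAVA_LIBRARY_PATH is set, it is appended to HADOOP_OPTS as a -Djava.library.path flag; if it is unset, no flag is added and the JVM falls back to its default search path. A simplified reconstruction of that logic (not the actual script, which also has extra platform handling):

```shell
# Simplified reconstruction of the java.library.path logic
# in libexec/hadoop-config.sh (path is this article's install)
JAVA_LIBRARY_PATH='/usr/local/sinasrv2/hadoop/hadoop-2.6.4/lib/native'
HADOOP_OPTS=''
if [ -n "$JAVA_LIBRARY_PATH" ]; then
  HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$JAVA_LIBRARY_PATH"
fi
echo "$HADOOP_OPTS"
```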


But $JAVA_LIBRARY_PATH turns out to be undefined in this environment, so the root cause is finally found. The fix is simple: edit /etc/profile (vim /etc/profile) and define the $JAVA_LIBRARY_PATH environment variable:

export JAVA_LIBRARY_PATH='/usr/local/sinasrv2/hadoop/hadoop-2.6.4/lib/native'


Run source /etc/profile to apply it.

Run the scripts again and the warning no longer appears.
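A quick way to confirm the result is the hadoop checknative command, which ships with Hadoop 2.x and reports whether libhadoop and the compression codecs actually load. The sketch below falls back to listing the native directory when the hadoop CLI is not on PATH in the current shell (the default path is this article's install; adjust to yours):

```shell
# Confirm the native library is now picked up
NATIVE_DIR="${JAVA_LIBRARY_PATH:-/usr/local/sinasrv2/hadoop/hadoop-2.6.4/lib/native}"
if command -v hadoop >/dev/null 2>&1; then
  hadoop checknative -a                           # reports hadoop/zlib/snappy/... load status
else
  ls "$NATIVE_DIR" 2>/dev/null || echo "missing: $NATIVE_DIR"
fi
```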

2. The same warning appears when running spark-shell

Following the same approach, turn on DEBUG logging for Spark: vim $SPARK_HOME/conf/log4j.properties

log4j.rootCategory=DEBUG, console

This produces the following output:
16/07/06 12:15:41 DEBUG Groups:  Creating new Groups object
16/07/06 12:15:41 DEBUG NativeCodeLoader: Trying to load the custom-built native-hadoop library...
16/07/06 12:15:41 DEBUG NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
16/07/06 12:15:41 DEBUG NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
16/07/06 12:15:41 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/07/06 12:15:41 DEBUG PerformanceAdvisory: Falling back to shell based


java.library.path again, Orz…
This time, however, grep finds nowhere that java.library.path is set, and it is unclear where its value comes from. The problem is identified, but the fix is elusive; time to turn to the almighty Baidu…

This thread hits the same problem: http://bbs.csdn.net/topics/390978577. Using the solution from reply #7:

In Spark's conf directory, edit spark-env.sh and add the LD_LIBRARY_PATH environment variable, set to the path of Hadoop's native library directory.

That did the trick!

vim $SPARK_HOME/conf/spark-env.sh

export SPARK_MASTER_IP=hadoop1
export SPARK_MASTER_PORT=7077
export SPARK_WORKER_CORES=1
export SPARK_WORKER_INSTANCES=1
export SPARK_WORKER_MEMORY=512M
export LD_LIBRARY_PATH=$JAVA_LIBRARY_PATH
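An alternative that avoids touching spark-env.sh is to pass the same directory through Spark's standard extraLibraryPath settings in conf/spark-defaults.conf (a sketch using this article's install path; adjust to your lib/native location):

```
# conf/spark-defaults.conf — point driver and executors at Hadoop's native libs
spark.driver.extraLibraryPath    /usr/local/sinasrv2/hadoop/hadoop-2.6.4/lib/native
spark.executor.extraLibraryPath  /usr/local/sinasrv2/hadoop/hadoop-2.6.4/lib/native
```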