
libhdfs errors and how to fix them

2016-01-18 18:23, 381 views
When using libhdfs, be sure to add the JDK's native library paths to /etc/ld.so.conf:

/usr/local/lib/jdk1.8.0_45/jre/lib/amd64
/usr/local/lib/jdk1.8.0_45/jre/lib/amd64/server

Otherwise the program fails with link errors at load time.
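A minimal sketch of that step, assuming the JDK path from above (adjust it to your own install; editing /etc/ld.so.conf requires root):

```shell
# Register the JDK's JRE library dirs with the dynamic linker so that
# libjvm.so (and friends) can be resolved by programs linked against libhdfs.
JDK_LIB=/usr/local/lib/jdk1.8.0_45/jre/lib/amd64
LDCONF=/etc/ld.so.conf                     # linker search-path config (root-owned)
printf '%s\n' "$JDK_LIB" "$JDK_LIB/server" >> "$LDCONF"
ldconfig                                   # rebuild the linker cache
```

An alternative that avoids editing the global config is to export the same two directories in `LD_LIBRARY_PATH` for the current shell.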

Another error shows up at runtime:

loadFileSystems error:
(unable to get stack trace for java.lang.NoClassDefFoundError exception: ExceptionUtils::getStackTrace error.)

This means the required Hadoop jars cannot be found on the classpath.

The fix is to add Hadoop's jars to the CLASSPATH environment variable, e.g. in /etc/profile:

export HADOOP_HOME=/your/hadoop/hadoop-2.7.0/share
export HADOOP_CLASSPATH=.
for f in $HADOOP_HOME/hadoop/common/hadoop-*.jar; do
        HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:$f
done
for f in $HADOOP_HOME/hadoop/common/lib/*.jar; do
        HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:$f
done

for f in $HADOOP_HOME/hadoop/mapreduce/hadoop-*.jar; do
        HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:$f
done
for f in $HADOOP_HOME/hadoop/hdfs/hadoop-*.jar; do
        HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:$f
done

export CLASSPATH=.:$HADOOP_CLASSPATH:$CLASSPATH