
Compiling the Hadoop 2.6.0 native library on Mac

2016-06-28 23:48
Adjust all paths in this article to your own environment.

A pre-compiled native library is available in my resources: http://download.csdn.net/detail/tterminator/9565597

I. Why compile the native library

After installing Hadoop in standalone mode on a Mac and starting it, this warning appears: WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable.

It is worth compiling the library yourself: many of the pre-built binaries and recipes found online simply do not work, and building locally is also what the official documentation recommends.

II. Root cause

The official documentation states:

The pre-built 32-bit i386-Linux native hadoop library is available as part of the hadoop distribution and is located in the lib/native directory. You can download the hadoop distribution from Hadoop Common Releases.

In other words, the native library bundled with the Hadoop download targets 32-bit Linux only. The documentation continues:

The native hadoop library is supported on *nix platforms only. The library does not to work with Cygwin or the Mac OS X platform.

So the documentation states explicitly that the pre-built native library does not work on Mac OS X, which is exactly why it has to be compiled on the Mac itself.

For details, see http://hadoop.apache.org/docs/r2.6.0/hadoop-project-dist/hadoop-common/NativeLibraries.html

III. Build environment

Mac OS X: 10.10.4

JDK: java version "1.7.0_80"

Hadoop: 2.6.0-src

IV. Prerequisites

Install the following software on the Mac before starting the build:

1. Install brew

Homebrew is a package manager similar to apt on Ubuntu, used here to install missing packages, in particular cmake: brew install cmake.

2. Install cmake

Any version will do.
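The two installs above can be sketched as follows. The Homebrew install one-liner is the ruby-based command published on brew.sh at the time of writing; check brew.sh if it has since changed:

```shell
# Install Homebrew itself (install command as published on brew.sh circa 2016).
ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

# Use brew to install cmake; any reasonably recent version works for this build.
brew install cmake
cmake --version
```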

3. Install protoc

The version must be exactly 2.5.0, or the build will fail. Do not install it via brew install protobuf: the version brew installs is not guaranteed to be 2.5.0.

The protobuf 2.5.0 source is available in my resources (http://download.csdn.net/detail/tterminator/9562400). It has to be compiled by hand, which is straightforward:

(1) Set the install prefix:

./configure --prefix=/Users/King-pan/software/tools/protobuf


where /Users/King-pan/software/tools/protobuf is an install directory of your choice.

(2) Build and install:

make


make install


(3) Edit your shell profile:

vi ~/.bash_profile


(4) Add these lines:

export PROTOBUF=/Users/King-pan/software/tools/protobuf


export PATH=$PROTOBUF/bin:$PATH


(5) Verify:

protoc --version
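Steps (1) through (5) can be consolidated into one script. The tarball name protobuf-2.5.0.tar.gz and the prefix under $HOME are assumptions; substitute your own download location and directory:

```shell
# Build and install protobuf 2.5.0 from source, then put protoc on PATH.
PREFIX="$HOME/software/tools/protobuf"            # install prefix (assumption)
tar xzf protobuf-2.5.0.tar.gz && cd protobuf-2.5.0
./configure --prefix="$PREFIX"
make
make install
echo "export PROTOBUF=$PREFIX" >> ~/.bash_profile
echo 'export PATH=$PROTOBUF/bin:$PATH' >> ~/.bash_profile
source ~/.bash_profile
protoc --version    # the Hadoop build requires exactly "libprotoc 2.5.0"
```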


4. Install maven

The version used for this build: Apache Maven 3.3.3.

V. One last preparation step

Without the following fix, this error appears during the build:

Exception in thread "main" java.lang.AssertionError: Missing tools.jar at: /Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/Classes/classes.jar. Expression: file.exists()

Workaround:

Under JAVA_HOME (that is, /Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home), create a directory named Classes and create the symbolic link (absolute paths recommended):

sudo ln -s $JAVA_HOME/lib/tools.jar $JAVA_HOME/Classes/classes.jar
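A sketch of the whole workaround, using /usr/libexec/java_home (present on Mac OS X) to resolve JAVA_HOME:

```shell
export JAVA_HOME="$(/usr/libexec/java_home -v 1.7)"
sudo mkdir -p "$JAVA_HOME/Classes"
sudo ln -s "$JAVA_HOME/lib/tools.jar" "$JAVA_HOME/Classes/classes.jar"
ls -l "$JAVA_HOME/Classes/classes.jar"    # should point at lib/tools.jar
```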


VI. Run the build

From the root of the unpacked 2.6.0 source tree, run:

mvn package -Pdist,native -DskipTests -Dtar
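What the flags mean, in an annotated form of the same command:

```shell
cd hadoop-2.6.0-src
# -Pdist,native : activate the dist profile and also build the native (JNI) code
# -DskipTests   : skip the unit tests, which would add hours to the build
# -Dtar         : additionally package the distribution as a .tar.gz
mvn package -Pdist,native -DskipTests -Dtar
```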


After a long wait, the build finishes with a reactor summary like this:

[INFO] Reactor Summary:

[INFO]

[INFO] Apache Hadoop Main ................................. SUCCESS [  1.206 s]

[INFO] Apache Hadoop Project POM .......................... SUCCESS [06:57 min]

[INFO] Apache Hadoop Annotations .......................... SUCCESS [03:22 min]

[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.272 s]

[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [02:02 min]

[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [02:46 min]

[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [19:36 min]

[INFO] Apache Hadoop Auth ................................. SUCCESS [07:47 min]

[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [01:21 min]

[INFO] Apache Hadoop Common ............................... SUCCESS [12:20 min]

[INFO] Apache Hadoop NFS .................................. SUCCESS [  4.994 s]

[INFO] Apache Hadoop KMS .................................. SUCCESS [ 30.076 s]

[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.044 s]

[INFO] Apache Hadoop HDFS ................................. SUCCESS [07:09 min]

[INFO] Apache Hadoop HttpFS ............................... SUCCESS [01:46 min]

[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [03:25 min]

[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  3.440 s]

[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.027 s]

[INFO] hadoop-yarn ........................................ SUCCESS [  0.031 s]

[INFO] hadoop-yarn-api .................................... SUCCESS [01:08 min]

[INFO] hadoop-yarn-common ................................. SUCCESS [01:32 min]

[INFO] hadoop-yarn-server ................................. SUCCESS [  0.027 s]

[INFO] hadoop-yarn-server-common .......................... SUCCESS [01:21 min]

[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [01:49 min]

[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  2.149 s]

[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [  4.729 s]

[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 15.844 s]

[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  4.096 s]

[INFO] hadoop-yarn-client ................................. SUCCESS [  5.619 s]

[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.024 s]

[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  1.940 s]

[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  1.541 s]

[INFO] hadoop-yarn-site ................................... SUCCESS [  0.031 s]

[INFO] hadoop-yarn-registry ............................... SUCCESS [  3.909 s]

[INFO] hadoop-yarn-project ................................ SUCCESS [  3.328 s]

[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.040 s]

[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 17.288 s]

[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 13.444 s]

[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  2.979 s]

[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [  7.477 s]

[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  5.705 s]

[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [01:24 min]

[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  1.265 s]

[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  4.280 s]

[INFO] hadoop-mapreduce ................................... SUCCESS [  3.765 s]

[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 16.991 s]

[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 31.676 s]

[INFO] Apache Hadoop Archives ............................. SUCCESS [  1.504 s]

[INFO] Apache Hadoop Rumen ................................ SUCCESS [  4.793 s]

[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  3.410 s]

[INFO] Apache Hadoop Data Join ............................ SUCCESS [  1.948 s]

[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  1.599 s]

[INFO] Apache Hadoop Extras ............................... SUCCESS [  2.169 s]

[INFO] Apache Hadoop Pipes ................................ SUCCESS [  6.456 s]

[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  4.146 s]

[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [08:32 min]

[INFO] Apache Hadoop Client ............................... SUCCESS [  6.769 s]

[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.112 s]

[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  3.510 s]

[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 11.581 s]

[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.021 s]

[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 26.965 s]

[INFO] ------------------------------------------------------------------------

[INFO] BUILD SUCCESS

[INFO] ------------------------------------------------------------------------

[INFO] Total time: 01:28 h

[INFO] Finished at: 2016-06-28T01:40:35+08:00

[INFO] Final Memory: 193M/874M

[INFO] ------------------------------------------------------------------------


VII. Copy the compiled native library into the binary Hadoop 2.6.0 installation

The compiled native libraries end up in:

hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/lib/native


Copy them into the corresponding directory of the binary Hadoop 2.6.0 installation:

hadoop-2.6.0/lib/native
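Assuming the source tree and the binary installation sit side by side in the current directory, the copy is:

```shell
cp hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/lib/native/* \
   hadoop-2.6.0/lib/native/
```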


VIII. Edit etc/hadoop/hadoop-env.sh in the Hadoop installation

export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.library.path=/hadoop-2.6.0/lib/native"

IX. Restart Hadoop

The warning from the beginning of this article no longer appears.
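To confirm that the library actually loads, Hadoop ships a checknative command:

```shell
hadoop checknative -a
# "hadoop: true" followed by the path of the freshly built library indicates
# success; the zlib/snappy/lz4/bzip2/openssl lines depend on which optional
# codecs were compiled in.
```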

X. Appendix

ssh localhost fails with: connection closed by ::1

Fix:

The Mac OS X system log (/var/log/system.log) contains:

(1)Jun 28 09:29:29 bj-m-203544a.local sshd[48173]: error: Could not load host key: /etc/ssh_host_rsa_key

(2)Jun 28 09:29:29 bj-m-203544a.local sshd[48173]: error: Could not load host key: /etc/ssh_host_dsa_key

As these messages suggest, copy .ssh/id_rsa and .ssh/id_dsa into /etc and rename them ssh_host_rsa_key and ssh_host_dsa_key respectively.
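The copy described above, as commands (assuming both user keys exist under ~/.ssh); generating dedicated host keys with ssh-keygen, rather than reusing user keys, is a cleaner alternative:

```shell
sudo cp ~/.ssh/id_rsa /etc/ssh_host_rsa_key
sudo cp ~/.ssh/id_dsa /etc/ssh_host_dsa_key
# Cleaner alternative: generate fresh host keys instead of reusing user keys.
# sudo ssh-keygen -t rsa -N "" -f /etc/ssh_host_rsa_key
# sudo ssh-keygen -t dsa -N "" -f /etc/ssh_host_dsa_key
```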

Passwordless ssh localhost

Change the permissions of the .ssh/authorized_keys file to 644:

(1)
chmod 644 authorized_keys


(2) In /etc/ssh_config, set the PasswordAuthentication option to no. (This may affect other ssh logins that require a password; if so, change the value back to yes.)
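The full passwordless-login setup, as a sketch (skip the keygen step if ~/.ssh/id_rsa already exists):

```shell
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa        # only if no key exists yet
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 644 ~/.ssh/authorized_keys
ssh localhost                                   # should not prompt for a password
```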

A packaged configuration for a single-node Hadoop installation is also available in my resources.

Other useful commands

(1) Start Hadoop:

./start-dfs.sh

./start-yarn.sh


(2) Stop Hadoop:

./stop-dfs.sh

./stop-yarn.sh


(3) Create an HDFS directory:

hadoop fs -mkdir /tmp


(4) Copy a local file into HDFS:

hadoop fs -copyFromLocal ~/word.txt /tmp


(5) Hadoop (HDFS NameNode) web UI:

http://localhost:50070


(6) YARN web UI:

http://localhost:8098/cluster
(The port is configured in yarn-site.xml.)

(7) Delete an HDFS file: hadoop fs -rm /tmp/out/part-r-00000

(8) Delete an empty HDFS directory: hadoop fs -rmdir /tmp/out/
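For context, the /tmp/out files deleted in (7) and (8) would come from a run like the following; the examples jar path assumes the stock binary distribution layout:

```shell
cd hadoop-2.6.0
hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar \
    wordcount /tmp/word.txt /tmp/out
hadoop fs -cat /tmp/out/part-r-00000
```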

Note: if a Hadoop daemon fails to come up, check the Hadoop logs, which record the details:

hadoop/logs/


-rw-r--r--   1 junwei8  staff       0 Jun 28 10:16 SecurityAuth-junwei8.audit

-rw-r--r--   1 junwei8  staff   65411 Jun 28 11:39 hadoop-junwei8-datanode-bj-m-203544a.local.log

-rw-r--r--   1 junwei8  staff     511 Jun 28 11:39 hadoop-junwei8-datanode-bj-m-203544a.local.out

-rw-r--r--   1 junwei8  staff     511 Jun 28 10:16 hadoop-junwei8-datanode-bj-m-203544a.local.out.1

-rw-r--r--   1 junwei8  staff   87710 Jun 28 11:40 hadoop-junwei8-namenode-bj-m-203544a.local.log

-rw-r--r--   1 junwei8  staff     511 Jun 28 11:39 hadoop-junwei8-namenode-bj-m-203544a.local.out

-rw-r--r--   1 junwei8  staff     511 Jun 28 10:16 hadoop-junwei8-namenode-bj-m-203544a.local.out.1

-rw-r--r--   1 junwei8  staff   64683 Jun 28 11:40 hadoop-junwei8-secondarynamenode-bj-m-203544a.local.log

-rw-r--r--   1 junwei8  staff     511 Jun 28 11:39 hadoop-junwei8-secondarynamenode-bj-m-203544a.local.out

-rw-r--r--   1 junwei8  staff     511 Jun 28 10:16 hadoop-junwei8-secondarynamenode-bj-m-203544a.local.out.1

drwxr-xr-x   2 junwei8  staff      68 Jun 28 11:40 userlogs

-rw-r--r--   1 junwei8  staff   77547 Jun 28 11:40 yarn-junwei8-nodemanager-bj-m-203544a.local.log

-rw-r--r--   1 junwei8  staff     494 Jun 28 11:40 yarn-junwei8-nodemanager-bj-m-203544a.local.out

-rw-r--r--   1 junwei8  staff     494 Jun 28 10:17 yarn-junwei8-nodemanager-bj-m-203544a.local.out.1

-rw-r--r--   1 junwei8  staff  251199 Jun 28 11:40 yarn-junwei8-resourcemanager-bj-m-203544a.local.log

-rw-r--r--   1 junwei8  staff     494 Jun 28 11:40 yarn-junwei8-resourcemanager-bj-m-203544a.local.out

-rw-r--r--   1 junwei8  staff     494 Jun 28 10:17 yarn-junwei8-resourcemanager-bj-m-203544a.local.out.1

-rw-r--r--   1 junwei8  staff     494 Jun 27 21:18 yarn-junwei8-resourcemanager-bj-m-203544a.local.out.2

-rw-r--r--   1 junwei8  staff     494 Jun 27 21:17 yarn-junwei8-resourcemanager-bj-m-203544a.local.out.3


XI. References:

http://www.rockyfeng.me/hadoop_native_library_mac.html

http://my.oschina.net/KingPan/blog/283881?p=1

http://hadoop.apache.org/docs/r2.6.0/hadoop-project-dist/hadoop-common/NativeLibraries.html

http://leibnitz.iteye.com/blog/2149745
Tags: hadoop native-lib