
Compiling Hadoop 2.6.0 from Source on Linux

2016-07-16 20:54

Hadoop does not ship a prebuilt 64-bit release, so a 64-bit version has to be compiled from source. Learning a technology starts with installing it; learning Hadoop starts with building it.

1. Operating System Build Environment

yum install cmake lzo-devel zlib-devel gcc gcc-c++ autoconf automake libtool ncurses-devel openssl-devel libXtst
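
After installation, it may be worth a quick sanity check that the core toolchain is in place before moving on (optional; any recent versions of these tools should do):

 gcc --version        # GCC with C++ support is needed for the native libraries
 cmake --version      # CMake drives the native (JNI) part of the build
 openssl version      # the openssl-devel headers are needed for the native crypto code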

 2. Install the JDK

Download JDK 1.7. Note that only 1.7 works here; other versions cause build errors.
http://www.oracle.com/technetwork/java/javase/downloads/jdk7-downloads-1880260.html

 tar zxvf jdk-7u75-linux-x64.tar.gz -C /app

 export JAVA_HOME=/app/jdk1.7.0_75

 export JRE_HOME=$JAVA_HOME/jre

 export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar

 PATH=$PATH:$JAVA_HOME/bin
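
These exports only affect the current shell unless they are also added to /etc/profile (the PATH setting in the note after section 6 assumes they are). A quick check that the right JDK is active:

 java -version        # should report java version "1.7.0_75"
 echo $JAVA_HOME      # should print /app/jdk1.7.0_75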

 3. Install protobuf

Download protobuf 2.5.0. Do not use a newer version, or the Hadoop build will fail.

wget https://protobuf.googlecode.com/files/protobuf-2.5.0.tar.gz
 tar xvf protobuf-2.5.0.tar.gz

 cd protobuf-2.5.0

 ./configure

 make

 make install

 ldconfig

 protoc --version
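
If protobuf installed correctly, the last command should print:

 libprotoc 2.5.0

If protoc instead complains about a missing shared library, re-running ldconfig (protobuf installs into /usr/local/lib by default) usually resolves it.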

 4. Install Ant

 wget http://mirror.bit.edu.cn/apache/ant/binaries/apache-ant-1.9.4-bin.tar.gz
  tar zxvf apache-ant-1.9.4-bin.tar.gz -C /app

 vi /etc/profile

 export ANT_HOME=/app/apache-ant-1.9.4

 PATH=$PATH:$ANT_HOME/bin

 5. Install Maven

 wget http://mirror.bit.edu.cn/apache/maven/maven-3/3.3.1/binaries/apache-maven-3.3.1-bin.tar.gz
 tar zxvf apache-maven-3.3.1-bin.tar.gz -C /app

 vi /etc/profile

 export MAVEN_HOME=/app/apache-maven-3.3.1

 export PATH=$PATH:$MAVEN_HOME/bin

Edit the Maven configuration file:

vi /app/apache-maven-3.3.1/conf/settings.xml

Change the Maven repository mirror by adding the following inside <mirrors></mirrors>:

    <mirror>
      <id>nexus-osc</id>
      <mirrorOf>*</mirrorOf>
      <name>Nexusosc</name>
      <url>http://maven.oschina.net/content/groups/public/</url>
    </mirror>

Then add the following new profile inside <profiles></profiles>:

    <profile>
      <id>jdk-1.7</id>
      <activation>
        <jdk>1.7</jdk>
      </activation>
      <repositories>
        <repository>
          <id>nexus</id>
          <name>local private nexus</name>
          <url>http://maven.oschina.net/content/groups/public/</url>
          <releases>
            <enabled>true</enabled>
          </releases>
          <snapshots>
            <enabled>false</enabled>
          </snapshots>
        </repository>
      </repositories>
      <pluginRepositories>
        <pluginRepository>
          <id>nexus</id>
          <name>local private nexus</name>
          <url>http://maven.oschina.net/content/groups/public/</url>
          <releases>
            <enabled>true</enabled>
          </releases>
          <snapshots>
            <enabled>false</enabled>
          </snapshots>
        </pluginRepository>
      </pluginRepositories>
    </profile>
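
To confirm the mirror and profile were picked up, Maven's help plugin can print the merged configuration (a minimal check; the plugin itself is downloaded from the mirror on first use):

 mvn help:effective-settings    # the oschina mirror URL should appear in the output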

 6. Install FindBugs (optional)

wget http://prdownloads.sourceforge.net/findbugs/findbugs-3.0.1.tar.gz?download
 tar zxvf findbugs-3.0.1.tar.gz -C /app

 vi /etc/profile

 export FINDBUGS_HOME=/app/findbugs-3.0.1

 PATH=$PATH:$FINDBUGS_HOME/bin

 export PATH

Note:

 In the end, the PATH environment variable in /etc/profile should be set as follows:

PATH=$PATH:$JAVA_HOME/bin:$ANT_HOME/bin:$MAVEN_HOME/bin:$FINDBUGS_HOME/bin

 export PATH

Run the following in the shell to make the environment variables take effect:

. /etc/profile
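
At this point the rest of the toolchain can be verified in one pass (exact version strings depend on the packages actually installed):

 ant -version         # should report Apache Ant 1.9.4
 mvn -version         # should report Apache Maven 3.3.1
 echo $FINDBUGS_HOME  # /app/findbugs-3.0.1, only if FindBugs was installed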

 7. Build Hadoop 2.6.0

 wget http://mirror.bit.edu.cn/apache/hadoop/core/hadoop-2.6.0/hadoop-2.6.0-src.tar.gz
 tar zxvf hadoop-2.6.0-src.tar.gz
 cd hadoop-2.6.0-src
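
In the command below, -Pdist,native builds the binary distribution together with the native 64-bit libraries, -DskipTests skips the unit tests, and -Dtar packages the result as a tar.gz. The build pulls a large number of dependencies and can run out of memory on small machines; raising the Maven JVM heap beforehand may help (the sizes here are only an example):

 export MAVEN_OPTS="-Xms512m -Xmx1024m"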

 mvn package -DskipTests -Pdist,native -Dtar

 [INFO] Reactor Summary:

[INFO]

[INFO] Apache Hadoop Main ................................. SUCCESS [01:03 min]

[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 29.895 s]

[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 14.437 s]

[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.394 s]

[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  8.240 s]

[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 13.592 s]

[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 33.409 s]

[INFO] Apache Hadoop Auth ................................. SUCCESS [ 23.482 s]

[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  6.009 s]

[INFO] Apache Hadoop Common ............................... SUCCESS [02:59 min]

[INFO] Apache Hadoop NFS .................................. SUCCESS [ 11.041 s]

[INFO] Apache Hadoop KMS .................................. SUCCESS [05:48 min]

[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.039 s]

[INFO] Apache Hadoop HDFS ................................. SUCCESS [05:15 min]

[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 50.927 s]

[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 18.989 s]

[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  5.629 s]

[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.138 s]

[INFO] hadoop-yarn ........................................ SUCCESS [  0.137 s]

[INFO] hadoop-yarn-api .................................... SUCCESS [01:16 min]

[INFO] hadoop-yarn-common ................................. SUCCESS [ 43.418 s]

[INFO] hadoop-yarn-server ................................. SUCCESS [  0.102 s]

[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 23.549 s]

[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 24.205 s]

[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [  3.647 s]

[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [  7.759 s]

[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 21.215 s]

[INFO] hadoop-yarn-server-tests ........................... SUCCESS [  6.688 s]

[INFO] hadoop-yarn-client ................................. SUCCESS [  8.308 s]

[INFO] hadoop-yarn-applications ........................... SUCCESS [  0.051 s]

[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [  2.851 s]

[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [  2.172 s]

[INFO] hadoop-yarn-site ................................... SUCCESS [  0.072 s]

[INFO] hadoop-yarn-registry ............................... SUCCESS [  6.102 s]

[INFO] hadoop-yarn-project ................................ SUCCESS [  5.411 s]

[INFO] hadoop-mapreduce-client ............................ SUCCESS [  0.090 s]

[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 22.220 s]

[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 20.209 s]

[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [  4.654 s]

[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 10.680 s]

[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [  8.812 s]

[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 11.311 s]

[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [  1.978 s]

[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  6.886 s]

[INFO] hadoop-mapreduce ................................... SUCCESS [  4.209 s]

[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 27.231 s]

[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 11.920 s]

[INFO] Apache Hadoop Archives ............................. SUCCESS [  2.939 s]

[INFO] Apache Hadoop Rumen ................................ SUCCESS [  7.875 s]

[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  5.416 s]

[INFO] Apache Hadoop Data Join ............................ SUCCESS [  3.462 s]

[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  2.611 s]

[INFO] Apache Hadoop Extras ............................... SUCCESS [  3.392 s]

[INFO] Apache Hadoop Pipes ................................ SUCCESS [  9.568 s]

[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  6.296 s]

[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 13.418 s]

[INFO] Apache Hadoop Client ............................... SUCCESS [  8.610 s]

[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  0.291 s]

[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  5.736 s]

[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [  6.740 s]

[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.044 s]

[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 41.244 s]

[INFO] ------------------------------------------------------------------------

[INFO] BUILD SUCCESS

[INFO] ------------------------------------------------------------------------

[INFO] Total time: 27:01 min

[INFO] Finished at: 2016-05-21T18:35:35+08:00

[INFO] Final Memory: 125M/261M

[INFO] ------------------------------------------------------------------------

After a successful build, the packaged distribution is placed in hadoop-dist/target:

 # ls

 antrun                    dist-tar-stitching.sh  hadoop-2.6.0.tar.gz    hadoop-dist-2.6.0-javadoc.jar  maven-archiver

 dist-layout-stitching.sh  hadoop-2.6.0           hadoop-dist-2.6.0.jar  javadoc-bundle-options         test-dir
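
The hadoop-2.6.0.tar.gz in this directory is the ready-to-deploy 64-bit distribution; it can be unpacked wherever Hadoop should live, for example (following the /app convention used above):

 tar zxvf hadoop-dist/target/hadoop-2.6.0.tar.gz -C /app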

Verify that the build produced 64-bit native libraries:

cd /picclife/software/hadoop-2.6.0-src/hadoop-dist/target/hadoop-2.6.0/lib/native

[root@zhanglw1 native]# file *

libhadoop.a:        current ar archive

libhadooppipes.a:   current ar archive

libhadoop.so:       symbolic link to `libhadoop.so.1.0.0'

libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped

libhadooputils.a:   current ar archive

libhdfs.a:          current ar archive

libhdfs.so:         symbolic link to `libhdfs.so.0.0.0'

libhdfs.so.0.0.0:   ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, not stripped
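
Each .so should be reported as a 64-bit ELF object, as above. Once the compiled distribution is deployed and its bin directory is on the PATH, Hadoop's built-in check can also confirm that the native library loads (whether zlib, snappy, etc. show as true depends on what was installed on the build machine):

 hadoop checknative -a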
Tags: hadoop, source code, linux