Compiling Hadoop-2.x from Source


The Hadoop-2.x binaries provided on the Apache website ship with 32-bit native libraries, so for a production environment you usually need to compile a 64-bit build yourself. The steps for compiling Hadoop-2.x are as follows:

Install the system libraries required to build the source
  yum install glibc-headers
  yum install gcc
  yum install gcc-c++
  yum install make
  yum install cmake
  yum install openssl-devel
  yum install ncurses-devel
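  For convenience, the same packages can be installed in one go (a minimal sketch of the commands above; -y skips the confirmation prompts):
    yum install -y glibc-headers gcc gcc-c++ make cmake openssl-devel ncurses-devel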

Install protobuf-2.5.0 (the RPC protocol between Hadoop nodes is implemented on top of Google's Protocol Buffers)
  tar -zxvf /home/tools/protobuf-2.5.0.tar.gz -C /home/tools/
  cd /home/tools/protobuf-2.5.0 && ./configure && make && make check && make install
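  A quick sanity check after the install (protoc lands in /usr/local/bin by default; the /etc/ld.so.conf.d/protobuf.conf file name below is just an example, only needed if the loader cannot find the new library under /usr/local/lib):
    protoc --version                                    # should print: libprotoc 2.5.0
    # if protoc reports a missing libprotoc shared library:
    echo "/usr/local/lib" > /etc/ld.so.conf.d/protobuf.conf
    ldconfig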

Install apache-maven-3.0.5 (JDK 1.7 is already installed here; Maven is required to compile the Hadoop source)
  tar -zxvf /home/tools/apache-maven-3.0.5-bin.tar.gz -C /home/tools/
  vi /etc/profile
  export JAVA_HOME=/usr/local/java
  export M2_HOME=/home/tools/apache-maven-3.0.5
  export PATH=.:$M2_HOME/bin:$JAVA_HOME/bin:$PATH
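  After editing /etc/profile, reload it and verify that both tools resolve correctly (a quick check; the versions shown are what this setup expects):
    source /etc/profile
    java -version     # should report a 1.7.x JDK
    mvn -version      # should report Apache Maven 3.0.5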

Extract the hadoop-2.4.1-src.tar.gz source package and compile it
  tar -zxvf /home/tools/hadoop-2.4.1-src.tar.gz -C /home/tools/
  cd /home/tools/hadoop-2.4.1-src
  mvn package -DskipTests -Pdist,native
  Log output like the following indicates a successful build:
        main:
             [exec]
             [exec] Current directory /home/tools/hadoop-2.4.1-src/hadoop-dist/target
             [exec]
             [exec] $ rm -rf hadoop-2.4.1
             [exec] $ mkdir hadoop-2.4.1
             [exec] $ cd hadoop-2.4.1
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/bin /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/lib /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-common/target/hadoop-common-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-common-project/hadoop-nfs/target/hadoop-nfs-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/bin /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/include /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/lib /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/hadoop-hdfs-httpfs-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/hadoop-hdfs-httpfs-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/hadoop-hdfs-httpfs-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-httpfs/target/hadoop-hdfs-httpfs-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-hdfs-project/hadoop-hdfs-nfs/target/hadoop-hdfs-nfs-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/bin /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-yarn-project/target/hadoop-yarn-project-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/bin /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/etc /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/libexec /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/sbin /home/tools/hadoop-2.4.1-src/hadoop-mapreduce-project/target/hadoop-mapreduce-2.4.1/share .
             [exec] $ cp -r /home/tools/hadoop-2.4.1-src/hadoop-tools/hadoop-tools-dist/target/hadoop-tools-dist-2.4.1/include /home/tools/hadoop-2.4.1-src/hadoop-tools/hadoop-tools-dist/target/hadoop-tools-dist-2.4.1/lib /home/tools/hadoop-2.4.1-src/hadoop-tools/hadoop-tools-dist/target/hadoop-tools-dist-2.4.1/share .
             [exec]
             [exec] Hadoop dist layout available at: /home/tools/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1
             [exec]
        [INFO] Executed tasks
        [INFO]
        [INFO] --- maven-jar-plugin:2.3.1:jar (default-jar) @ hadoop-dist ---
        [WARNING] JAR will be empty - no content was marked for inclusion!
        [INFO] Building jar: /home/tools/hadoop-2.4.1-src/hadoop-dist/target/hadoop-dist-2.4.1.jar
        [INFO]
        [INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-dist ---
        [INFO] No sources in project. Archive not created.
        [INFO]
        [INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-dist ---
        [INFO] No sources in project. Archive not created.
        [INFO]
        [INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ hadoop-dist ---
        [INFO]
        [INFO] --- maven-antrun-plugin:1.7:run (tar) @ hadoop-dist ---
        [INFO] Executing tasks

        main:
        [INFO] Executed tasks
        [INFO]
        [INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
        [INFO] Building jar: /home/tools/hadoop-2.4.1-src/hadoop-dist/target/hadoop-dist-2.4.1-javadoc.jar
        [INFO] ------------------------------------------------------------------------
        [INFO] Reactor Summary:
        [INFO]
        [INFO] Apache Hadoop Main ................................ SUCCESS [1.176s]
        [INFO] Apache Hadoop Project POM ......................... SUCCESS [0.937s]
        [INFO] Apache Hadoop Annotations ......................... SUCCESS [3.426s]
        [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.302s]
        [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.582s]
        [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [3.132s]
        [INFO] Apache Hadoop MiniKDC ............................. SUCCESS [2.916s]
        [INFO] Apache Hadoop Auth ................................ SUCCESS [3.873s]
        [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [2.328s]
        [INFO] Apache Hadoop Common .............................. SUCCESS [1:36.564s]
        [INFO] Apache Hadoop NFS ................................. SUCCESS [5.527s]
        [INFO] Apache Hadoop Common Project ...................... SUCCESS [0.038s]
        [INFO] Apache Hadoop HDFS ................................ SUCCESS [2:44.338s]
        [INFO] Apache Hadoop HttpFS .............................. SUCCESS [21.785s]
        [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [25.123s]
        [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [3.578s]
        [INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.046s]
        [INFO] hadoop-yarn ....................................... SUCCESS [0.039s]
        [INFO] hadoop-yarn-api ................................... SUCCESS [1:19.929s]
        [INFO] hadoop-yarn-common ................................ SUCCESS [1:30.724s]
        [INFO] hadoop-yarn-server ................................ SUCCESS [0.032s]
        [INFO] hadoop-yarn-server-common ......................... SUCCESS [8.375s]
        [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [52.226s]
        [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [2.878s]
        [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [12.762s]
        [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [12.406s]
        [INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.483s]
        [INFO] hadoop-yarn-client ................................ SUCCESS [5.208s]
        [INFO] hadoop-yarn-applications .......................... SUCCESS [0.029s]
        [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [2.614s]
        [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [2.137s]
        [INFO] hadoop-yarn-site .................................. SUCCESS [0.037s]
        [INFO] hadoop-yarn-project ............................... SUCCESS [3.164s]
        [INFO] hadoop-mapreduce-client ........................... SUCCESS [0.059s]
        [INFO] hadoop-mapreduce-client-core ...................... SUCCESS [18.276s]
        [INFO] hadoop-mapreduce-client-common .................... SUCCESS [18.034s]
        [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [2.728s]
        [INFO] hadoop-mapreduce-client-app ....................... SUCCESS [8.973s]
        [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [7.420s]
        [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [12.076s]
        [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [1.988s]
        [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [5.648s]
        [INFO] hadoop-mapreduce .................................. SUCCESS [2.431s]
        [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [9.437s]
        [INFO] Apache Hadoop Distributed Copy .................... SUCCESS [20.544s]
        [INFO] Apache Hadoop Archives ............................ SUCCESS [2.163s]
        [INFO] Apache Hadoop Rumen ............................... SUCCESS [5.710s]
        [INFO] Apache Hadoop Gridmix ............................. SUCCESS [4.467s]
        [INFO] Apache Hadoop Data Join ........................... SUCCESS [2.770s]
        [INFO] Apache Hadoop Extras .............................. SUCCESS [3.014s]
        [INFO] Apache Hadoop Pipes ............................... SUCCESS [10.174s]
        [INFO] Apache Hadoop OpenStack support ................... SUCCESS [4.523s]
        [INFO] Apache Hadoop Client .............................. SUCCESS [3.611s]
        [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.136s]
        [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [9.834s]
        [INFO] Apache Hadoop Tools Dist .......................... SUCCESS [3.285s]
        [INFO] Apache Hadoop Tools ............................... SUCCESS [0.025s]
        [INFO] Apache Hadoop Distribution ........................ SUCCESS [12.173s]
        [INFO] ------------------------------------------------------------------------
        [INFO] BUILD SUCCESS
        [INFO] ------------------------------------------------------------------------
        [INFO] Total time: 13:01.056s
        [INFO] Finished at: Tues Jul 02 10:28:07 CST 2014
        [INFO] Final Memory: 165M/512M
        [INFO] ------------------------------------------------------------------------

     cd /home/tools/hadoop-2.4.1-src/hadoop-dist/target
     ls
     antrun  dist-layout-stitching.sh  hadoop-2.4.1  hadoop-dist-2.4.1.jar  hadoop-dist-2.4.1-javadoc.jar  javadoc-bundle-options  maven-archiver  test-dir
     The hadoop-2.4.1 directory is the 64-bit Hadoop binary distribution, which is exactly what we need.
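     To double-check that the native libraries really are 64-bit, they can be inspected directly (a quick sanity check, assuming the usual lib/native layout of the 2.4.1 distribution):
       cd /home/tools/hadoop-2.4.1-src/hadoop-dist/target/hadoop-2.4.1
       file lib/native/libhadoop.so.1.0.0   # expect: ELF 64-bit LSB shared object, x86-64 ...
       bin/hadoop checknative -a            # lists which native libraries were built in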

Original article: https://www.cnblogs.com/mengyao/p/4777874.html