Compiling the hadoop-3.0.0-beta1 Source on CentOS

1. System environment requirements

The local system is CentOS 7.

Source download address (http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-3.0.0-beta1/hadoop-3.0.0-beta1-src.tar.gz)

After downloading, unpack the source and read BUILDING.txt in the root directory:

tar -zxvf hadoop*.tar.gz
[hadoop@localhost hadoop-3.0.0-beta1-src]$ cat BUILDING.txt

It lists the following requirements:

Requirements:

* Unix System
* JDK 1.8
* Maven 3.3 or later
* ProtocolBuffer 2.5.0
* CMake 3.1 or newer (if compiling native code)
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
* python (for releasedocs)
* bats (for shell code testing)
* Node.js / bower / Ember-cli (for YARN UI v2 building)

So you need to install:

JDK 1.8

protobuf (must be exactly 2.5.0)

Maven 3.3 or later

CMake 3.1 or later

Since we are compiling the native code, the Zlib development library is also required.

There are also a few packages not mentioned above whose absence can cause build failures:

apache-ant

automake

autoconf

findbugs

For convenience, I have uploaded all of these packages to Gitee; the link is at the end of this article.
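The compilers and development libraries can also come straight from yum. A minimal sketch (CentOS 7 package names assumed; the install itself needs root, so the script falls back to printing the command otherwise):

```shell
# Sketch: install the toolchain and -devel libraries the native build needs
# in one yum transaction (CentOS 7 package names; requires root).
PKGS="gcc gcc-c++ make zlib-devel openssl-devel automake autoconf"
echo "installing: $PKGS"
if [ "$(id -u)" -eq 0 ] && command -v yum >/dev/null 2>&1; then
    yum install -y $PKGS
else
    echo "not root (or no yum); run as root: yum install -y $PKGS"
fi
```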

2. Install JDK 1.8

Download JDK 1.8 from the Oracle site (http://www.oracle.com/technetwork/java/javase/downloads/index.html).

After downloading, unpack it to /opt:

tar -zxvf jdk1.8**.tar.gz -C /opt/

Configure the Java environment variables:

JAVA_HOME=/opt/jdk1.8**
CLASSPATH=$JAVA_HOME/lib/
PATH=$PATH:$JAVA_HOME/bin
export PATH JAVA_HOME CLASSPATH

 

3. Update the local CMake

Check the locally installed CMake version:

[root@localhost hadoop]# cmake --version
cmake version 2.8.12.2

That version is too old, so build and install CMake 3.3.2 from source:

wget https://cmake.org/files/v3.3/cmake-3.3.2.tar.gz

Unpack it to /opt:

tar -zxvf cmake-3.3.2.tar.gz -C /opt/

Build and install (from inside /opt/cmake-3.3.2):

./configure
make
make install

After it succeeds, set the environment variable by adding the following to /etc/profile:

export PATH=/opt/cmake-3.3.2/bin:$PATH

Then source the profile:

source /etc/profile

Check the current CMake version:

cmake --version

If you see output like the following, the installation succeeded:

[root@localhost cmake-3.3.2]# cmake --version
cmake version 3.3.2

CMake suite maintained and supported by Kitware (kitware.com/cmake).
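The same "is this tool new enough?" check recurs for every tool in this guide; a small helper (using `sort -V` from coreutils, present on CentOS 7) makes it reusable:

```shell
# version_ge ACTUAL REQUIRED -- succeeds when ACTUAL >= REQUIRED,
# comparing dotted version strings with sort -V.
version_ge() {
    [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

version_ge 3.3.2 3.1 && echo "cmake 3.3.2 is new enough"
version_ge 2.8.12.2 3.1 || echo "cmake 2.8.12.2 is too old"
```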

4. Install FindBugs

findbugs-3.0.1 download address (http://prdownloads.sourceforge.net/findbugs/findbugs-3.0.1.tar.gz?download)

After downloading, unpack it to /opt:

tar -zxvf ***.tar.gz -C /opt/

Configure the environment variables:

export FINDBUGS_HOME=/opt/findbugs-3.0.1

export PATH=$PATH:$FINDBUGS_HOME/bin

5. Install Ant

apache-ant download address (http://mirror.bit.edu.cn/apache//ant/binaries/apache-ant-1.9.9-bin.zip)

After downloading, unpack it to /opt:

unzip apache-ant*.zip -d /opt/

Configure the Ant environment variables:

export ANT_HOME=/opt/apache-ant-1.9.9

export PATH=$PATH:$ANT_HOME/bin

6. Install Maven 3.5.2

maven 3.5.2 download address (http://mirror.bit.edu.cn/apache/maven/maven-3/3.5.2/binaries/apache-maven-3.5.2-bin.tar.gz)

After downloading, unpack it to /opt:

tar -zxvf ***.tar.gz -C /opt/

Configure the environment variables (the tarball unpacks to apache-maven-3.5.2):

export MAVEN_HOME=/opt/apache-maven-3.5.2

export MAVEN_OPTS="-Xms256m -Xmx512m"

export PATH=$PATH:$MAVEN_HOME/bin
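Sections 2 through 6 each add exports to /etc/profile; collected in one place they look like the sketch below. All /opt paths are examples, not verbatim from this build — substitute the directories your archives actually unpacked to (in particular, the JDK directory name depends on the exact build you downloaded). The sketch writes the snippet to a temp file so you can review it before appending it to /etc/profile:

```shell
# Sketch: every build-tool export in one snippet. The /opt paths are
# examples; adjust them to the directories created when unpacking.
cat > /tmp/hadoop-build-env.sh <<'EOF'
export JAVA_HOME=/opt/jdk1.8.0_152   # example JDK directory name
export CLASSPATH=$JAVA_HOME/lib/
export FINDBUGS_HOME=/opt/findbugs-3.0.1
export ANT_HOME=/opt/apache-ant-1.9.9
export MAVEN_HOME=/opt/apache-maven-3.5.2
export MAVEN_OPTS="-Xms256m -Xmx512m"
export PATH=/opt/cmake-3.3.2/bin:$PATH:$JAVA_HOME/bin:$FINDBUGS_HOME/bin:$ANT_HOME/bin:$MAVEN_HOME/bin
EOF
cat /tmp/hadoop-build-env.sh   # review, then append to /etc/profile
```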

7. Install automake and autoconf

yum install automake autoconf -y

A one-command install — though these two may already be present on the system.

8. Install protobuf-2.5.0

Download protobuf-2.5.0.tar.gz (the download link is given at the end of this article).

Unpack it to /opt:

tar -zxvf ***.tar.gz -C /opt/

Enter the unpacked directory, then build and install:

./configure
make
make install
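One gotcha worth noting: make install puts libprotobuf under /usr/local/lib, which the CentOS dynamic linker does not search by default, so protoc may fail with a shared-library loading error. A sketch of the usual fix (run as root):

```shell
# Sketch: make /usr/local/lib visible to the dynamic linker so protoc
# can load libprotobuf at runtime (run as root).
echo "/usr/local/lib" > /etc/ld.so.conf.d/protobuf.conf
ldconfig
```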

Test whether the installation succeeded:

protoc --version

Seeing the version number means the installation succeeded.

Next, source /etc/profile so that all of the settings above take effect:

source /etc/profile
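Before launching the long Maven build, it is worth confirming that every tool is actually reachable on PATH; a minimal check:

```shell
# Sketch: verify each required build tool resolves on PATH; anything
# still missing is listed at the end.
missing=""
for tool in java mvn ant cmake findbugs protoc; do
    command -v "$tool" >/dev/null 2>&1 || missing="$missing $tool"
done
if [ -z "$missing" ]; then
    echo "all build tools found"
else
    echo "still missing:$missing"
fi
```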

9. Build Hadoop 3.0.0 with Maven

Enter the Hadoop source root directory and run:

mvn package -Pdist,native -DskipTests -Dtar

Here -Pdist,native activates the dist and native profiles (binary distribution plus native libraries), -DskipTests skips the unit tests, and -Dtar packages the result as a tarball.

If you are lucky, you will see the following result:

[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................. SUCCESS [  7.003 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  8.003 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  2.948 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [  4.626 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.245 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  4.155 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [  7.021 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [  2.951 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  9.341 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  4.484 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:08 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  7.621 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [  7.392 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.049 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 38.861 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [01:23 min]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [  9.793 s]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 10.793 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  4.745 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.049 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [  0.038 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [ 20.464 s]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [ 43.830 s]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [  0.034 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [ 14.789 s]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [  7.127 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [ 34.282 s]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [  4.094 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [  8.096 s]
[INFO] Apache Hadoop YARN Timeline Service ................ SUCCESS [  5.807 s]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [ 27.429 s]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [  2.429 s]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [  7.644 s]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [  4.033 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [  5.206 s]
[INFO] Apache Hadoop YARN TimelineService HBase Backend ... SUCCESS [  9.416 s]
[INFO] Apache Hadoop YARN Timeline Service HBase tests .... SUCCESS [  5.647 s]
[INFO] Apache Hadoop YARN Router .......................... SUCCESS [  6.088 s]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [  0.049 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [  3.741 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [  2.700 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [  0.074 s]
[INFO] Apache Hadoop YARN UI .............................. SUCCESS [  0.042 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [  9.958 s]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [  0.291 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [ 24.697 s]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [ 16.013 s]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [  4.281 s]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [  9.812 s]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [  6.419 s]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [  9.396 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [  2.131 s]
[INFO] Apache Hadoop MapReduce NativeTask ................. SUCCESS [ 58.791 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  6.293 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [  4.415 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  7.029 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [  8.569 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  2.455 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [  2.771 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  5.268 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  4.419 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  2.375 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  2.129 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  6.287 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  6.092 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [ 12.514 s]
[INFO] Apache Hadoop Kafka Library support ................ SUCCESS [  3.487 s]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [  6.791 s]
[INFO] Apache Hadoop Aliyun OSS support ................... SUCCESS [  3.177 s]
[INFO] Apache Hadoop Client Aggregator .................... SUCCESS [  2.760 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  2.292 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  5.750 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [  3.585 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [  7.539 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.026 s]
[INFO] Apache Hadoop Client API ........................... SUCCESS [01:07 min]
[INFO] Apache Hadoop Client Runtime ....................... SUCCESS [ 52.032 s]
[INFO] Apache Hadoop Client Packaging Invariants .......... SUCCESS [  0.691 s]
[INFO] Apache Hadoop Client Test Minicluster .............. SUCCESS [01:11 min]
[INFO] Apache Hadoop Client Packaging Invariants for Test . SUCCESS [  0.158 s]
[INFO] Apache Hadoop Client Packaging Integration Tests ... SUCCESS [  0.339 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 50.456 s]
[INFO] Apache Hadoop Client Modules ....................... SUCCESS [  0.734 s]
[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [  4.236 s]
[INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [  0.025 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16:39 min
[INFO] Finished at: 2017-11-22T23:08:06+08:00
[INFO] Final Memory: 169M/494M
[INFO] ------------------------------------------------------------------------

This means the build completed successfully.

The packaged Hadoop tarball is under hadoop-dist/target:

[root@localhost target]# ll
total 256112
drwxr-xr-x. 2 root root        28 Nov 22 23:07 antrun
drwxr-xr-x. 3 root root        22 Nov 22 23:07 classes
drwxr-xr-x. 9 root root       149 Nov 22 23:07 hadoop-3.0.0-beta1
-rw-r--r--. 1 root root 262258495 Nov 22 23:07 hadoop-3.0.0-beta1.tar.gz
drwxr-xr-x. 2 root root        33 Nov 22 23:02 hadoop-tools-deps
drwxr-xr-x. 3 root root        22 Nov 22 23:07 maven-shared-archive-resources
drwxr-xr-x. 3 root root        22 Nov 22 23:07 test-classes
drwxr-xr-x. 2 root root         6 Nov 22 23:07 test-dir
[root@localhost target]# pwd
/home/hadoop/hadoop-3.0.0-beta1-src/hadoop-dist/target
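Before copying the tarball elsewhere, a quick sanity check that it exists and unpacks cleanly doesn't hurt; a sketch (run from hadoop-dist/target):

```shell
# Sketch: confirm the built tarball is present and list its first few
# entries; prints a hint instead of failing if the build hasn't run yet.
TARBALL=hadoop-3.0.0-beta1.tar.gz
if [ -f "$TARBALL" ]; then
    tar -tzf "$TARBALL" | head -n 10
else
    echo "$TARBALL not found; run the mvn build first"
fi
```

On the machine where you deploy the unpacked distribution, `bin/hadoop checknative -a` reports whether the native libraries were actually compiled in.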

Summary:

In my view, compiling Hadoop is not especially difficult, provided you are reasonably comfortable with Linux. Read the documentation shipped with the source before you start, and keep a search engine handy; beyond that it mostly just takes time — the whole build took me about two hours.

Build-environment package downloads (Gitee): https://gitee.com/nanxun/hadoop3.0HuanJingBao
