Compiling Hadoop on CentOS 7

Preparation

Download the Hadoop source tarball (2.9.2 is the latest at the time of writing):

https://hadoop.apache.org/releases.html

Unpacking the archive, you will find BUILDING.txt, the official build guide; refer to its Linux section.

Requirements:

* Unix System
* JDK 1.7 or 1.8
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
* python (for releasedocs)
* Node.js / bower / Ember-cli (for YARN UI v2 building)
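Before starting, it can help to see which of these tools are already on the PATH. A minimal sketch (the tool list is assumed from the requirements above; versions still need checking by hand, e.g. protoc must be exactly 2.5.0):

```shell
# Report which build prerequisites are already on the PATH.
for tool in java mvn ant protoc cmake make gcc; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found:   $tool"
  else
    echo "missing: $tool"
  fi
done
```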

 

Setting up the build environment

 

JDK (1.8)

https://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html

# Unpack
tar -zxf /opt/bak/jdk-8u192-linux-x64.tar.gz -C /opt/app/

# Configure the JDK environment variables
vi /etc/profile

# JAVA_HOME
# export makes the variable visible to child processes
export JAVA_HOME=/opt/app/jdk1.8.0_192/
# "." is the current directory, ":" separates the entries
export CLASSPATH=.:$JAVA_HOME/lib/dt.jar:$JAVA_HOME/lib/tools.jar:$CLASSPATH
# $PATH expands to the current value, much like i = i + 3
export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH

# Reload the environment
source /etc/profile

# Verify the JDK
java -version
# java version "1.8.0_192"
# Java(TM) SE Runtime Environment (build 1.8.0_192-b12)
# Java HotSpot(TM) 64-Bit Server VM (build 25.192-b12, mixed mode)
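If several JDKs are installed, it is worth confirming that the java on the PATH is the one under JAVA_HOME; a small sketch:

```shell
# Show where java resolves from, next to JAVA_HOME (set in /etc/profile above).
java_path=$(command -v java 2>/dev/null || echo "java not on PATH")
echo "java resolves to: $java_path"
echo "JAVA_HOME is:     ${JAVA_HOME:-<unset>}"
```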

 

Maven (3.6.0)

http://maven.apache.org/download.cgi

# Unpack
tar -zxf /opt/bak/apache-maven-3.6.0-bin.tar.gz -C /opt/app/

# Configure environment variables
vi /etc/profile

# MAVEN_HOME
export MAVEN_HOME=/opt/app/apache-maven-3.6.0/
export PATH=$PATH:$MAVEN_HOME/bin

# Reload the environment
source /etc/profile

# Verify Maven
mvn -v
# Apache Maven 3.6.0 (97c98ec64a1fdfee7767ce5ffb20918da4f719f3; 2018-10-25T02:41:47+08:00)
# Maven home: /opt/app/apache-maven-3.6.0
# Java version: 1.8.0_192, vendor: Oracle Corporation, runtime: /opt/app/jdk1.8.0_192/jre
# Default locale: zh_CN, platform encoding: UTF-8
# OS name: "linux", version: "3.10.0-862.el7.x86_64", arch: "amd64", family: "unix"

# Configure the Maven repository mirror (Huawei Cloud)
vim /opt/app/apache-maven-3.6.0/conf/settings.xml

settings.xml

<!-- Local repository path -->
<localRepository>/opt/repo</localRepository>

<!-- Mirror for the central repository -->
<mirrors>
  <mirror>
    <id>huaweicloud</id>
    <mirrorOf>central</mirrorOf>
    <url>https://mirrors.huaweicloud.com/repository/maven/</url>
  </mirror>
</mirrors>
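Both fragments sit directly under the top-level `<settings>` element; a minimal complete file might look like the sketch below (not the full default settings.xml that ships with Maven):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
  <!-- Local repository path -->
  <localRepository>/opt/repo</localRepository>
  <!-- Mirror for the central repository -->
  <mirrors>
    <mirror>
      <id>huaweicloud</id>
      <mirrorOf>central</mirrorOf>
      <url>https://mirrors.huaweicloud.com/repository/maven/</url>
    </mirror>
  </mirrors>
</settings>
```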

 

Ant (1.10.5)

https://ant.apache.org/bindownload.cgi

# Unpack
tar -zxf /opt/bak/apache-ant-1.10.5-bin.tar.gz -C /opt/app/

# Configure environment variables
vi /etc/profile

# ANT_HOME
export ANT_HOME=/opt/app/apache-ant-1.10.5/
export PATH=$PATH:$ANT_HOME/bin

# Reload the environment
source /etc/profile

# Verify
ant -version
# Apache Ant(TM) version 1.10.5 compiled on July 10 2018

 

Protobuf (must be version 2.5)

https://github.com/protocolbuffers/protobuf/releases/tag/v2.5.0

# protobuf has to be built from source; first set up the yum repos and install the build tools

# Remove the existing yum repo configs
rm -rf /etc/yum.repos.d/*
# Download Aliyun's yum repo config
curl -o /etc/yum.repos.d/CentOS-Base.repo http://mirrors.aliyun.com/repo/Centos-7.repo

# Install the build tools
yum install -y glibc-headers gcc-c++ make cmake

# Unpack protobuf
tar -zxf /opt/bak/protobuf-2.5.0.tar.gz -C /opt/app/

# Build and install (this takes a while)
cd /opt/app/protobuf-2.5.0/
# Check that the build dependencies are satisfied and set the install prefix
./configure --prefix=/opt/app/protobuf-2.5.0/
# Compile according to the generated Makefile
make
# Run the test suite
make check
# Install to the prefix chosen above
make install
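The make steps above run on a single core; on a multi-core machine the compile can be sped up with a parallel build. A sketch, assuming GNU make and coreutils' nproc are available:

```shell
# Pick one make job per CPU core; fall back to 1 if nproc is unavailable.
jobs=$(nproc 2>/dev/null || echo 1)
echo "building with $jobs parallel jobs"
# make -j"$jobs"   # run this inside the protobuf source directory
```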

# Make the shared libraries visible to the dynamic linker
ldconfig

# Configure environment variables
vim /etc/profile

# Extra directories (besides the defaults) searched for shared libraries
# LD_LIBRARY_PATH
export LD_LIBRARY_PATH=/opt/app/protobuf-2.5.0/lib
# protoc itself is installed under the prefix's bin/ directory
export PATH=$PATH:/opt/app/protobuf-2.5.0/bin

# Reload the environment
source /etc/profile

# Verify
protoc --version
# libprotoc 2.5.0

 

openssl devel

yum install -y openssl-devel ncurses-devel

 

Snappy (1.1.7; gives Hadoop compression support, optional)

https://github.com/google/snappy/releases

Building Snappy 1.1.7 requires CMake 3, so install CMake 3 (3.13.2) first:

https://cmake.org/download/

# Unpack
tar -zxf /opt/bak/cmake-3.13.2.tar.gz -C /opt/app/

# Install gcc (if not already present)
yum install -y gcc gcc-c++

# Build and install CMake 3 (this takes a while)
cd /opt/app/cmake-3.13.2/
# Check dependencies and set the install prefix
./bootstrap --prefix=/opt/app/cmake-3.13.2/
# Compile
gmake
# Install
gmake install

# Remove the old cmake version
yum remove -y cmake

# Configure environment variables
vim /etc/profile

# CMake_HOME
export CMake_HOME=/opt/app/cmake-3.13.2/
export PATH=$PATH:$CMake_HOME/bin

# Reload the environment
source /etc/profile

# Verify
cmake --version
# cmake version 3.13.2
# CMake suite maintained and supported by Kitware (kitware.com/cmake).

Install Snappy

# Unpack
tar -zxf /opt/bak/snappy-1.1.7.tar.gz -C /opt/app/

# Build and install
cd /opt/app/snappy-1.1.7/
# Create a build directory
mkdir build
# Compile
cd build && cmake ../ && make
# Install
make install

Install the project...
-- Install configuration: ""
-- Installing: /usr/local/lib64/libsnappy.a
-- Installing: /usr/local/include/snappy-c.h
-- Installing: /usr/local/include/snappy-sinksource.h
-- Installing: /usr/local/include/snappy.h
-- Installing: /usr/local/include/snappy-stubs-public.h
-- Installing: /usr/local/lib64/cmake/Snappy/SnappyTargets.cmake
-- Installing: /usr/local/lib64/cmake/Snappy/SnappyTargets-noconfig.cmake
-- Installing: /usr/local/lib64/cmake/Snappy/SnappyConfig.cmake
-- Installing: /usr/local/lib64/cmake/Snappy/SnappyConfigVersion.cmake

# Verify
ls -lh /usr/local/lib64 | grep snappy
# -rw-r--r--. 1 root root 184K Mar 13 22:23 libsnappy.a

 

Building Hadoop

The official build options (from BUILDING.txt):

Maven build goals:

 * Clean                     : mvn clean [-Preleasedocs]
 * Compile                   : mvn compile [-Pnative]
 * Run tests                 : mvn test [-Pnative]
 * Create JAR                : mvn package
 * Run findbugs              : mvn compile findbugs:findbugs
 * Run checkstyle            : mvn compile checkstyle:checkstyle
 * Install JAR in M2 cache   : mvn install
 * Deploy JAR to Maven repo  : mvn deploy
 * Run clover                : mvn test -Pclover [-DcloverLicenseLocation=${user.name}/.clover.license]
 * Run Rat                   : mvn apache-rat:check
 * Build javadocs            : mvn javadoc:javadoc
 * Build distribution        : mvn package [-Pdist][-Pdocs][-Psrc][-Pnative][-Dtar][-Preleasedocs][-Pyarn-ui]
 * Change Hadoop version     : mvn versions:set -DnewVersion=NEWVERSION

 Build options:

  * Use -Pnative to compile/bundle native code
  * Use -Pdocs to generate & bundle the documentation in the distribution (using -Pdist)
  * Use -Psrc to create a project source TAR.GZ
  * Use -Dtar to create a TAR with the distribution (using -Pdist)
  * Use -Preleasedocs to include the changelog and release docs (requires Internet connectivity)
  * Use -Pyarn-ui to build YARN UI v2. (Requires Internet connectivity)

 Snappy build options:

   Snappy is a compression library that can be utilized by the native code.
   It is currently an optional component, meaning that Hadoop can be built with
   or without this dependency.

  * Use -Drequire.snappy to fail the build if libsnappy.so is not found.
    If this option is not specified and the snappy library is missing,
    we silently build a version of libhadoop.so that cannot make use of snappy.
    This option is recommended if you plan on making use of snappy and want
    to get more repeatable builds.

  * Use -Dsnappy.prefix to specify a nonstandard location for the libsnappy
    header files and library files. You do not need this option if you have
    installed snappy using a package manager.
  * Use -Dsnappy.lib to specify a nonstandard location for the libsnappy library
    files.  Similarly to snappy.prefix, you do not need this option if you have
    installed snappy using a package manager.
  * Use -Dbundle.snappy to copy the contents of the snappy.lib directory into
    the final tar file. This option requires that -Dsnappy.lib is also given,
    and it ignores the -Dsnappy.prefix option. If -Dsnappy.lib isn't given, the
    bundling and building will fail.

 OpenSSL build options:

   OpenSSL includes a crypto library that can be utilized by the native code.
   It is currently an optional component, meaning that Hadoop can be built with
   or without this dependency.

  * Use -Drequire.openssl to fail the build if libcrypto.so is not found.
    If this option is not specified and the openssl library is missing,
    we silently build a version of libhadoop.so that cannot make use of
    openssl. This option is recommended if you plan on making use of openssl
    and want to get more repeatable builds.
  * Use -Dopenssl.prefix to specify a nonstandard location for the libcrypto
    header files and library files. You do not need this option if you have
    installed openssl using a package manager.
  * Use -Dopenssl.lib to specify a nonstandard location for the libcrypto library
    files. Similarly to openssl.prefix, you do not need this option if you have
    installed openssl using a package manager.
  * Use -Dbundle.openssl to copy the contents of the openssl.lib directory into
    the final tar file. This option requires that -Dopenssl.lib is also given,
    and it ignores the -Dopenssl.prefix option. If -Dopenssl.lib isn't given, the
    bundling and building will fail.

   Tests options:

  * Use -DskipTests to skip tests when running the following Maven goals:
    'package',  'install', 'deploy' or 'verify'
  * -Dtest=<TESTCLASSNAME>,<TESTCLASSNAME#METHODNAME>,....
  * -Dtest.exclude=<TESTCLASSNAME>
  * -Dtest.exclude.pattern=**/<TESTCLASSNAME1>.java,**/<TESTCLASSNAME2>.java
  * To run all native unit tests, use: mvn test -Pnative -Dtest=allNative
  * To run a specific native unit test, use: mvn test -Pnative -Dtest=<test>
  For example, to run test_bulk_crc32, you would use:
  mvn test -Pnative -Dtest=test_bulk_crc32

Build

# Unpack the Hadoop source
tar -zxf /opt/bak/hadoop-2.9.2-src.tar.gz -C /opt/app/

# Build
cd /opt/app/hadoop-2.9.2-src/

mvn clean package -Pdist,native -DskipTests -Dtar

# Build with snappy support
mvn clean package -Pdist,native -DskipTests -Dtar -Dbundle.snappy -Dsnappy.lib=/usr/local/lib64

# -Pdist,native                 : build a distribution with freshly compiled native Hadoop libraries
# -DskipTests                   : skip the tests
# -Dtar                         : package the result as a tarball
# -Dbundle.snappy               : bundle snappy support (the official binaries ship without it)
# -Dsnappy.lib=/usr/local/lib64 : where snappy's libraries are installed on the build machine

Output of a successful build:

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for Apache Hadoop Main 2.9.2:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [ 23.222 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [ 38.032 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [ 13.828 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 11.047 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  0.281 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 30.537 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 22.664 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 37.050 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [  6.176 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [  6.545 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [01:32 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [  5.024 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [ 10.940 s]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  0.056 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [ 25.101 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [ 52.534 s]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [  5.757 s]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [ 16.739 s]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [ 15.768 s]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [  3.893 s]
[INFO] Apache Hadoop HDFS-RBF ............................. SUCCESS [ 20.197 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  0.044 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [  0.058 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [ 14.627 s]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [ 27.408 s]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [  4.724 s]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [  0.050 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [ 11.299 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [ 28.335 s]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [  2.798 s]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [ 13.109 s]
[INFO] Apache Hadoop YARN Timeline Service ................ SUCCESS [  4.428 s]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [ 20.831 s]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [  1.059 s]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [  5.287 s]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [  3.199 s]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [  3.138 s]
[INFO] Apache Hadoop YARN Router .......................... SUCCESS [  4.267 s]
[INFO] Apache Hadoop YARN TimelineService HBase Backend ... SUCCESS [ 16.384 s]
[INFO] Apache Hadoop YARN Timeline Service HBase tests .... SUCCESS [  2.344 s]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [  0.043 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [  2.740 s]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [  1.604 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [  0.042 s]
[INFO] Apache Hadoop YARN UI .............................. SUCCESS [  0.202 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [  5.631 s]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [  0.132 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [ 17.885 s]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [ 13.105 s]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [  2.950 s]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [  7.437 s]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [  4.776 s]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [  5.024 s]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [  1.584 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [  4.781 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [  2.414 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [  3.705 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [  3.749 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [  1.765 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [  1.826 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [  4.424 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [  3.199 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [  2.016 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [  1.674 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [  2.332 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [  4.439 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [  3.532 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [03:14 min]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 27.134 s]
[INFO] Apache Hadoop Aliyun OSS support ................... SUCCESS [ 40.460 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [  6.228 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [  1.359 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [  4.537 s]
[INFO] Apache Hadoop Resource Estimator Service ........... SUCCESS [ 14.946 s]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [ 30.651 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 14.665 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  0.023 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 47.203 s]
[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [  0.666 s]
[INFO] Apache Hadoop Cloud Storage Project ................ SUCCESS [  0.033 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------

After installing Hadoop, snappy support can be verified with the command below; the related libraries also appear under lib/native.

hadoop checknative
# look for "snappy: true" in the output

 

Keep the network connection up throughout the build; it can fail when dependency downloads error out, so retry a few times if necessary.

After a successful build, the generated files are under hadoop-dist/target.
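A quick way to locate the build products (the path is assumed from the source layout used above; adjust it if your tree differs):

```shell
# List the build products, or note that they are absent on this machine.
ls /opt/app/hadoop-2.9.2-src/hadoop-dist/target/ 2>/dev/null \
  || echo "hadoop-dist/target not found - run the build first"
```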


Reference: https://www.58jb.com/html/113.html
