Compiling Hadoop 2.4.0 on CentOS 6.4

 
When setting up the Hadoop environment, the system image used was emi-centos-6.4-x86_64, which is 64-bit, while the official Hadoop distribution ships with 32-bit native libraries by default. As a result, many operations run into this warning: (Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /opt/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.)
To solve this, Hadoop has to be recompiled from source, and the generated hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native directory copied over /opt/hadoop-2.4.0/lib/native.
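The diagnosis can be confirmed by inspecting the ELF class of the bundled library before rebuilding anything. The following is a small sketch (the `elf_class` helper is a name invented here; the library path is the one from the warning above):

```shell
#!/bin/sh
# Report whether a binary is 32-bit or 64-bit by reading the EI_CLASS
# byte of its ELF header (offset 4): 1 means ELF32, 2 means ELF64.
elf_class() {
    class=$(od -An -j4 -N1 -tu1 "$1" 2>/dev/null | tr -d ' ')
    case "$class" in
        1) echo "32-bit" ;;
        2) echo "64-bit" ;;
        *) echo "not an ELF file" ;;
    esac
}

# Check the library named in the JVM warning.
elf_class /opt/hadoop-2.4.0/lib/native/libhadoop.so.1.0.0
```

On the stock 2.4.0 download this should report 32-bit, which is why the 64-bit JVM complains.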

The detailed build steps follow:

1. Install the following packages
[root@hd1 software]# yum install lzo-devel zlib-devel gcc autoconf automake libtool ncurses-devel openssl-devel

 

2. Install Maven

[hxiaolong@hd1 software]$ wget http://mirror.esocc.com/apache/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.tar.gz
[hxiaolong@hd1 software]$ tar zxf apache-maven-3.0.5-bin.tar.gz -C /opt

[hxiaolong@hd1 software]$ vi /etc/profile
export MAVEN_HOME=/opt/apache-maven-3.0.5
export PATH=$PATH:$MAVEN_HOME/bin
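Note that /etc/profile is only read by login shells, so after editing it run `source /etc/profile` in the current session. The same `*_HOME` plus `PATH` pattern is used for Ant and FindBugs below; a generic sanity check, sketched here:

```shell
#!/bin/sh
# The pattern used for Maven, Ant and FindBugs: set a *_HOME variable
# and append its bin/ directory to PATH.
export MAVEN_HOME=/opt/apache-maven-3.0.5
export PATH=$PATH:$MAVEN_HOME/bin

# Sanity check: is the new bin directory actually on PATH?
case ":$PATH:" in
    *":$MAVEN_HOME/bin:"*) echo "PATH ok" ;;
    *)                     echo "PATH missing $MAVEN_HOME/bin" >&2 ;;
esac
```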

 

3. Install Ant

[hxiaolong@hd1 software]$ wget http://mirror.bit.edu.cn/apache/ant/binaries/apache-ant-1.9.4-bin.tar.gz
[hxiaolong@hd1 software]$ tar zxf apache-ant-1.9.4-bin.tar.gz -C /opt

[hxiaolong@hd1 software]$ vi /etc/profile
export ANT_HOME=/opt/apache-ant-1.9.4
export PATH=$PATH:$ANT_HOME/bin

 

4. Install FindBugs
[hxiaolong@hd1 software]$ wget http://prdownloads.sourceforge.net/findbugs/findbugs-2.0.3.tar.gz?download
[hxiaolong@hd1 software]$ tar zxf findbugs-2.0.3.tar.gz -C /opt

[hxiaolong@hd1 software]$ vi /etc/profile
export FINDBUGS_HOME=/opt/findbugs-2.0.3
export PATH=$PATH:$FINDBUGS_HOME/bin

 

5. Install protobuf
[hxiaolong@hd1 software]$ tar zxf protobuf-2.5.0.tar.gz
[hxiaolong@hd1 software]$ cd protobuf-2.5.0
[hxiaolong@hd1 protobuf-2.5.0]$ ./configure
[hxiaolong@hd1 protobuf-2.5.0]$ make
[root@hd1 protobuf-2.5.0]# make install

 Frankly, building and installing from source this way is quite troublesome, and it is easy to run into all kinds of dependency problems. Installing via yum is recommended instead.

[root@hd1 protobuf-2.5.0]# yum install protobuf
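One caveat, whichever install route you take: the Hadoop 2.4.0 build pins protoc to version 2.5.0 and aborts if the installed binary reports anything else, so it is worth checking before starting the long Maven build. A small sketch (`check_protoc` is a helper name invented here):

```shell
#!/bin/sh
# Hadoop 2.4.0 expects protoc 2.5.0; any other version fails the
# build's protoc version check. `protoc --version` prints a line
# such as "libprotoc 2.5.0", which check_protoc parses.
check_protoc() {
    version=$(echo "$1" | awk '{print $2}')
    if [ "$version" = "2.5.0" ]; then
        echo "ok"
    else
        echo "mismatch: got '$version', need 2.5.0"
    fi
}

# Check the binary actually installed on this machine.
check_protoc "$(protoc --version 2>/dev/null)"
```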

 

6. Compile Hadoop

1) Compile Hadoop on the name node first
[hxiaolong@hd1 software]$ wget http://mirrors.cnnic.cn/apache/hadoop/common/hadoop-2.4.0/hadoop-2.4.0-src.tar.gz
[hxiaolong@hd1 software]$ tar zxf hadoop-2.4.0-src.tar.gz
[hxiaolong@hd1 software]$ cd hadoop-2.4.0-src

[hxiaolong@hd1 hadoop-2.4.0-src]$ mvn package -DskipTests -Pdist,native -Dtar

 The build failed partway through with the following error:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.7:run (make) on project hadoop-common: An Ant BuildException has occured: Execute failed: java.io.IOException: Cannot run program "cmake" (in directory "/home/hxiaolong/software/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/native"): error=2, No such file or directory
[ERROR] around Ant part ...<exec dir="/home/hxiaolong/software/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/native" executable="cmake" failonerror="true">... @ 4:145 in /home/hxiaolong/software/hadoop-2.4.0-src/hadoop-common-project/hadoop-common/target/antrun/build-main.xml
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hadoop-common
A quick search shows this is caused by cmake not being installed. Install it and try again:
[root@hd1 hadoop-2.4.0-src]# yum install cmake

 Recompiling again, the build finally succeeded:

[hxiaolong@hd1 hadoop-2.4.0-src]$ mvn package -DskipTests -Pdist,native -Dtar

main:
     [exec] $ tar cf hadoop-2.4.0.tar hadoop-2.4.0
     [exec] $ gzip -f hadoop-2.4.0.tar
     [exec]
     [exec] Hadoop dist tar available at: /home/hxiaolong/software/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0.tar.gz
    
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12:41.833s
[INFO] Finished at: Wed Jul 23 03:01:18 UTC 2014
[INFO] Final Memory: 159M/646M
[INFO] ------------------------------------------------------------------------

2) Copy the compiled native directory into /opt/hadoop-2.4.0/lib/
[hxiaolong@hd1 lib]$ rm -rf /opt/hadoop-2.4.0/lib/native
[hxiaolong@hd1 lib]$ cp -R /home/hxiaolong/software/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native /opt/hadoop-2.4.0/lib/
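The `rm -rf` above discards the original libraries outright; a slightly safer variant keeps them as a backup so the swap can be rolled back if the new build misbehaves. A sketch of the same copy with a backup step (`replace_native` is a helper name used only here; paths as in the commands above):

```shell
#!/bin/sh
# Swap in a freshly built native/ directory, keeping the old one
# as native.bak so the change can be rolled back.
replace_native() {
    src=$1      # freshly built native directory
    destdir=$2  # Hadoop lib directory, e.g. /opt/hadoop-2.4.0/lib
    if [ -d "$destdir/native" ]; then
        rm -rf "$destdir/native.bak"
        mv "$destdir/native" "$destdir/native.bak"
    fi
    cp -R "$src" "$destdir/"
}

replace_native /home/hxiaolong/software/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native \
               /opt/hadoop-2.4.0/lib || echo "copy failed (run this on the build node)" >&2
```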

 This is a crucial step.


3) scp the compiled native directory to the other nodes
[root@hd1 lib]# scp -r /home/hxiaolong/software/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native/ hd2:/opt/hadoop-2.4.0/lib/ 
[root@hd1 lib]# scp -r /home/hxiaolong/software/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native/ hd3:/opt/hadoop-2.4.0/lib/
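With more worker nodes, the two scp commands above generalize to a loop. In this sketch the commands are only printed so the list can be reviewed first; drop the `echo` to actually run them (`sync_native` is a name invented here; hd2 and hd3 are the hostnames from above):

```shell
#!/bin/sh
# Print one scp command per worker node for the rebuilt native/
# directory. Remove the `echo` to actually execute the copies.
sync_native() {
    src=$1; shift
    for node in "$@"; do
        echo scp -r "$src" "$node:/opt/hadoop-2.4.0/lib/"
    done
}

sync_native /home/hxiaolong/software/hadoop-2.4.0-src/hadoop-dist/target/hadoop-2.4.0/lib/native hd2 hd3
```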

 If the recompiled native directory is not synced to the other nodes, they will run into exactly the same warning.


4) Verify
[hxiaolong@hd2 native]$ hadoop fs -ls /
Found 1 items
drwxr-xr-x   - hxiaolong supergroup          0 2014-07-23 05:21 /input

 

Done; the warning no longer appears.
Incidentally, I noticed today that the mirror mirror.bit.edu.cn is quite fast and stable. It turns out to be run by Beijing Institute of Technology. Nice.