1. Visit the Hadoop website and download hadoop-2.7.1-src.tar.gz

tar -zxvf hadoop-2.7.1-src.tar.gz
cd hadoop-2.7.1-src
vi BUILDING.txt

Requirements:

* Unix System
* JDK 1.7+
* Maven 3.0 or later
* Findbugs 1.3.9 (if running findbugs)
* ProtocolBuffer 2.5.0
* CMake 2.6 or newer (if compiling native code), must be 3.0 or newer on Mac
* Zlib devel (if compiling native code)
* openssl devel (if compiling native hadoop-pipes and to get the best HDFS encryption performance)
* Jansson C XML parsing library (if compiling libwebhdfs)
* Linux FUSE (Filesystem in Userspace) version 2.6 or above (if compiling fuse_dfs)
* Internet connection for first build (to fetch all Maven and Hadoop dependencies)
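Before starting, it can save time to check which of these prerequisites are already installed. A minimal sketch (it assumes each tool reports its version with the flag shown; anything missing simply prints a shell error):

# Print the first line of each tool's version banner.
for cmd in "java -version" "mvn -version" "protoc --version" "cmake --version" "gcc --version"; do
    echo "--- $cmd"
    $cmd 2>&1 | head -n 1
done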
2. Install Java 1.8.0_60

Download jdk-8u60-linux-x64.tar.gz, extract it, and move it to /opt:

tar -zxvf jdk-8u60-linux-x64.tar.gz
mv jdk1.8.0_60 /opt

Then open /etc/profile to configure the JDK environment variables:

vim /etc/profile

Press i to enter insert mode and append the following at the end of the file:

export JAVA_HOME=/opt/jdk1.8.0_60
export CLASSPATH=.:$JAVA_HOME/jre/lib/rt.jar:$JAVA_HOME/lib/tools.jar
export PATH=$PATH:$JAVA_HOME/bin
export JRE_HOME=/opt/jdk1.8.0_60/jre
export PATH=$PATH:$JRE_HOME/bin

Press Esc, type :wq, and press Enter to save and exit. Then run source /etc/profile to apply the changes.

Run java -version to check the result:

[root@hadoop1 opt]# java -version
java version "1.8.0_60"
Java(TM) SE Runtime Environment (build 1.8.0_60-b27)
Java HotSpot(TM) 64-Bit Server VM (build 25.60-b23, mixed mode)
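Since compiling Hadoop needs a full JDK rather than just a JRE, it is also worth confirming that the Java compiler is on the PATH; it should report 1.8.0_60 as well:

javac -version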
Note: removing the JDK preinstalled with Linux

Check the system's original JDK version:

[root@zck ~]# java -version
java version "1.7.0_"
OpenJDK Runtime Environment (IcedTea6 1.11.1) (rhel-1.45.1.11.1.el6-x86_64)
OpenJDK 64-Bit Server VM (build 20.0-b12, mixed mode)

Inspect the installed JDK packages further:

[root@localhost ~]# rpm -qa | grep java
javapackages-tools-3.4.1-6.el7_0.noarch
tzdata-java-2014i-1.el7.noarch
java-1.7.0-openjdk-headless-1.7.0.71-2.5.3.1.el7_0.x86_64
java-1.7.0-openjdk-1.7.0.71-2.5.3.1.el7_0.x86_64
python-javapackages-3.4.1-6.el7_0.noarch

To uninstall OpenJDK, run:

[root@localhost ~]# rpm -e --nodeps tzdata-java-2014i-1.el7.noarch
[root@localhost ~]# rpm -e --nodeps java-1.7.0-openjdk-headless-1.7.0.71-2.5.3.1.el7_0.x86_64
[root@localhost ~]# rpm -e --nodeps java-1.7.0-openjdk-1.7.0.71-2.5.3.1.el7_0.x86_64
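When several OpenJDK packages are installed, the removal can also be done in one pass. A sketch (review the rpm -qa output above first, so the pattern matches nothing unexpected):

# Remove every installed package matching the bundled OpenJDK in one command.
rpm -qa | grep -E 'openjdk|tzdata-java' | xargs -r rpm -e --nodeps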
3. Install the required libraries

yum -y install svn ncurses-devel gcc*
yum -y install lzo-devel zlib-devel autoconf automake libtool cmake openssl-devel
4. Install protobuf-2.5.0.tar.gz (note: the version must be 2.5.0)

Download protobuf-2.5.0.tar.gz and extract it:

tar zxvf protobuf-2.5.0.tar.gz

Enter the protobuf-2.5.0 directory and run, in order:

cd protobuf-2.5.0
./configure
make
make install

Verify the installation:

[root@hadoop1 protobuf-2.5.0]# protoc --version
libprotoc 2.5.0
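By default this build installs under /usr/local/lib, which CentOS does not always have on the dynamic linker path. If protoc fails with an error about libprotoc.so not being found, a common fix (a sketch, assuming the default --prefix=/usr/local was used) is:

# Add /usr/local/lib to the linker search path and rebuild the cache.
echo "/usr/local/lib" > /etc/ld.so.conf.d/protobuf.conf
ldconfig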
5. Install Maven

Download apache-maven-3.2.2-bin.tar.gz:

tar -zxvf apache-maven-3.2.2-bin.tar.gz
mv apache-maven-3.2.2 /opt

Configure the environment variables:

vi /etc/profile

Append to the end of the file:

export MAVEN_HOME=/opt/apache-maven-3.2.2
export MAVEN_OPTS="-Xms256m -Xmx512m"
export PATH=$PATH:$MAVEN_HOME/bin

Make /etc/profile take effect:

source /etc/profile

Check the installation:

[root@hadoop1 ~]# mvn -version
Apache Maven 3.2.2 (45f7c06d68e745d05611f7fd14efb6594181933e; 2014-06-17T21:51:42+08:00)
Maven home: /opt/apache-maven-3.2.2
Java version: 1.8.0_60, vendor: Oracle Corporation
Java home: /opt/jdk1.8.0_60/jre
Default locale: zh_CN, platform encoding: GB18030
OS name: "linux", version: "3.10.0-229.14.1.el7.x86_64", arch: "amd64", family: "unix"
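Optionally, since the first Hadoop build downloads a large number of dependencies, pointing Maven at a nearby mirror can speed it up considerably. A sketch of creating ~/.m2/settings.xml (the Aliyun mirror URL is one common choice, not a requirement):

# Sketch: route requests for Maven Central through a nearby mirror.
mkdir -p ~/.m2
cat > ~/.m2/settings.xml <<'EOF'
<settings>
  <mirrors>
    <mirror>
      <id>nearby-mirror</id>
      <mirrorOf>central</mirrorOf>
      <url>https://maven.aliyun.com/repository/public</url>
    </mirror>
  </mirrors>
</settings>
EOF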
6. Install Ant

Download apache-ant-1.9.4-bin.tar.gz and extract it:

tar -zxvf apache-ant-1.9.4-bin.tar.gz

Move it to /opt:

mv apache-ant-1.9.4 /opt

Configure the environment variables:

vi /etc/profile

Append to the end of the file:

export ANT_HOME=/opt/apache-ant-1.9.4
export PATH=$PATH:$ANT_HOME/bin

Apply the changes:

source /etc/profile

Check the installation:

[root@hadoop1 ~]# ant -version
Apache Ant(TM) version 1.9.4 compiled on April 29 2014
7. Install FindBugs

Download findbugs-3.0.1.tar.gz and extract it:

tar -zxvf findbugs-3.0.1.tar.gz

Move it to /opt:

mv findbugs-3.0.1 /opt

Configure the environment variables:

[root@hadoop1 ~]# vi /etc/profile
export FINDBUGS_HOME=/opt/findbugs-3.0.1
export PATH=$PATH:$FINDBUGS_HOME/bin

Apply the changes:

[root@hadoop1 ~]# source /etc/profile

Check the installation:

[root@hadoop1 ~]# findbugs -version
3.0.1
8. Compile hadoop-2.7.1-src

[root@hadoop1 ~]# cd hadoop-2.7.1-src
[root@hadoop1 hadoop-2.7.1-src]# mvn clean package -Pdist,native -DskipTests -Dtar
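Here -Pdist,native activates the profiles that assemble the binary distribution and compile the native libraries, -DskipTests skips the unit tests, and -Dtar additionally packages the result as a tar.gz (these options are described in BUILDING.txt). If the native libraries are not needed, the build can be shortened:

# Variant: build the distribution tarball without compiling the native code.
mvn clean package -Pdist -DskipTests -Dtar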
Build output:
[INFO]
[INFO] Apache Hadoop Main ................................. SUCCESS [04:06 min]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [01:39 min]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 56.391 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [ 0.246 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [ 34.518 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [01:03 min]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [08:13 min]
[INFO] Apache Hadoop Auth ................................. SUCCESS [04:18 min]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 31.298 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [05:39 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 10.503 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [01:08 min]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [ 0.036 s]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [04:41 min]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [01:40 min]
[INFO] Apache Hadoop HDFS BookKeeper Journal .............. SUCCESS [02:23 min]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 6.159 s]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [ 0.028 s]
[INFO] hadoop-yarn ........................................ SUCCESS [ 0.092 s]
[INFO] hadoop-yarn-api .................................... SUCCESS [ 46.580 s]
[INFO] hadoop-yarn-common ................................. SUCCESS [03:15 min]
[INFO] hadoop-yarn-server ................................. SUCCESS [ 0.082 s]
[INFO] hadoop-yarn-server-common .......................... SUCCESS [ 15.964 s]
[INFO] hadoop-yarn-server-nodemanager ..................... SUCCESS [ 22.764 s]
[INFO] hadoop-yarn-server-web-proxy ....................... SUCCESS [ 3.983 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ....... SUCCESS [ 8.750 s]
[INFO] hadoop-yarn-server-resourcemanager ................. SUCCESS [ 25.165 s]
[INFO] hadoop-yarn-server-tests ........................... SUCCESS [ 5.843 s]
[INFO] hadoop-yarn-client ................................. SUCCESS [ 7.708 s]
[INFO] hadoop-yarn-server-sharedcachemanager .............. SUCCESS [ 4.324 s]
[INFO] hadoop-yarn-applications ........................... SUCCESS [ 0.084 s]
[INFO] hadoop-yarn-applications-distributedshell .......... SUCCESS [ 3.220 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ..... SUCCESS [ 2.216 s]
[INFO] hadoop-yarn-site ................................... SUCCESS [ 0.040 s]
[INFO] hadoop-yarn-registry ............................... SUCCESS [ 6.727 s]
[INFO] hadoop-yarn-project ................................ SUCCESS [ 6.327 s]
[INFO] hadoop-mapreduce-client ............................ SUCCESS [ 0.033 s]
[INFO] hadoop-mapreduce-client-core ....................... SUCCESS [ 29.054 s]
[INFO] hadoop-mapreduce-client-common ..................... SUCCESS [ 20.872 s]
[INFO] hadoop-mapreduce-client-shuffle .................... SUCCESS [ 5.443 s]
[INFO] hadoop-mapreduce-client-app ........................ SUCCESS [ 11.221 s]
[INFO] hadoop-mapreduce-client-hs ......................... SUCCESS [ 6.744 s]
[INFO] hadoop-mapreduce-client-jobclient .................. SUCCESS [ 37.598 s]
[INFO] hadoop-mapreduce-client-hs-plugins ................. SUCCESS [ 2.118 s]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 6.785 s]
[INFO] hadoop-mapreduce ................................... SUCCESS [ 2.837 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [ 14.136 s]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [ 27.078 s]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 2.565 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [ 6.942 s]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [ 5.631 s]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [ 3.247 s]
[INFO] Apache Hadoop Ant Tasks ............................ SUCCESS [ 2.675 s]
[INFO] Apache Hadoop Extras ............................... SUCCESS [ 3.461 s]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 10.270 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [ 5.841 s]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [02:23 min]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [ 23.729 s]
[INFO] Apache Hadoop Client ............................... SUCCESS [ 7.584 s]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [ 1.093 s]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [ 6.164 s]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [ 11.370 s]
[INFO] Apache Hadoop Tools ................................ SUCCESS [ 0.020 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [01:41 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 53:39 min
[INFO] Finished at: 2015-10-26T00:48:40+08:00
[INFO] Final Memory: 140M/494M
The compilation finished successfully!
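The finished distribution lands under hadoop-dist/target. A quick way to confirm the tarball and the native libraries came out as expected:

# Confirm the packaged tarball and a native library exist.
ls -lh hadoop-dist/target/hadoop-2.7.1.tar.gz
file hadoop-dist/target/hadoop-2.7.1/lib/native/libhadoop.so.1.0.0

# Optionally, ask the freshly built hadoop to report its native library status.
hadoop-dist/target/hadoop-2.7.1/bin/hadoop checknative -a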
9. Configure passwordless SSH login

CentOS does not enable passwordless SSH login by default; uncomment the following two lines in /etc/ssh/sshd_config:

vi /etc/ssh/sshd_config
# Remove the leading # from these two lines:
#RSAAuthentication yes
#PubkeyAuthentication yes
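The change only takes effect after sshd is restarted (the host here runs CentOS 7, so systemd is assumed):

systemctl restart sshd.service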
Run ssh-keygen -t rsa to generate a key pair. Do not enter a passphrase, just press Enter at every prompt; a .ssh directory will be created under /root. This must be done on every server:
ssh-keygen -t rsa
Merge the public key into the authorized_keys file. On the Master server, enter the /root/.ssh directory and append it:

cd /root/.ssh
cat id_rsa.pub >> authorized_keys
Test the result:

[root@hadoop1 ~]# ssh localhost
Last login: Mon Oct 26 01:12:35 2015 from localhost
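For a multi-node cluster, the Master's public key also has to end up in every other node's authorized_keys; ssh-copy-id does this in one step (hadoop2 below is a placeholder hostname):

# Push the Master's public key to another node; prompts for that node's password once.
ssh-copy-id -i /root/.ssh/id_rsa.pub root@hadoop2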
10. Install Hadoop 2.7.1