Hadoop is compiled and installed directly on a single machine; here we use nn1.
Do the whole process as the root user.
1. Some Hadoop-related resources are available here: https://www.lanzous.com/b849710/ password: 9vui
[hadoop@nn1 zk_op]$ su - root
[root@nn1 ~]# mkdir /tmp/hadoop_c
[root@nn1 ~]# cd /tmp/hadoop_c/
Upload the source tarball to the directory above using xshell's rz command.
[root@nn1 hadoop_c]# tar -xzf /tmp/hadoop_c/hadoop-2.7.3-src.tar.gz -C /usr/local/
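If rz is not available on the server, one option (an assumption, not part of the original steps) is to install lrzsz first, or simply push the tarball over with scp from your workstation:
# install lrzsz so that rz/sz work inside xshell (assumed package name on CentOS)
yum -y install lrzsz
# alternative: copy the tarball from your local machine (hypothetical local path)
scp hadoop-2.7.3-src.tar.gz root@nn1:/tmp/hadoop_c/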
Install the assorted packages and plugins we will need with yum:
yum -y install svn ncurses-devel gcc* lzo-devel zlib-devel autoconf automake libtool cmake openssl-devel bzip2
2. Compile and install protobuf, Google's protocol for communication and storage; the Hadoop build requires it.
[root@nn1 ~]# tar -zxf protobuf-2.5.0.tar.gz -C /usr/local/
[root@nn1 ~]# cd /usr/local/protobuf-2.5.0
Compile and install:
[root@nn1 protobuf-2.5.0]# ./configure
[root@nn1 protobuf-2.5.0]# make && make install
[root@nn1 protobuf-2.5.0]# protoc --version
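If the install went well, the version check should report 2.5.0. If protoc instead complains that its shared library cannot be found, refreshing the linker cache is a common fix (not part of the original post):
libprotoc 2.5.0
# if protoc reports a missing libprotoc.so, refresh the dynamic linker cache
[root@nn1 protobuf-2.5.0]# ldconfig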
3. Extract and install ant
[root@nn1 hadoop_c]# tar -xf apache-ant-1.9.9-bin.tar.bz2 -C /usr/local/
4. Extract and install findbugs
[root@nn1 hadoop_c]# tar -zxf findbugs-3.0.1.tar.gz -C /usr/local/
5. Extract and install maven
We will use maven to compile Hadoop later on.
[root@nn1 hadoop_c]# tar -zxf apache-maven-3.3.9-bin.tar.gz -C /usr/local/
6. Compile and install snappy, Google's compression library
[root@nn1 hadoop_c]# tar -xzf snappy-1.1.3.tar.gz -C /usr/local/
[root@nn1 hadoop_c]# cd /usr/local/snappy-1.1.3/
[root@nn1 snappy-1.1.3]# ./configure
[root@nn1 snappy-1.1.3]# make && make install
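The maven build later points -Dsnappy.lib at /usr/local/lib, so it is worth a quick sanity check (not in the original post) that the snappy shared library actually landed there:
# confirm the snappy shared library is present where -Dsnappy.lib will look for it
[root@nn1 snappy-1.1.3]# ls /usr/local/lib/libsnappy*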
7. Make sure JDK 8 is already installed
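A quick check; the version shown here matches what mvn -v reports later, but the exact JDK path on your machine may differ:
# should report 1.8.x; the build later uses the JDK under /usr/java/jdk1.8.0_144
[root@nn1 snappy-1.1.3]# java -version
java version "1.8.0_144"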
8. Add environment variables for the various tools needed to compile Hadoop
[root@nn1 snappy-1.1.3]# vim /etc/profile
#set Hadoop_compile
export MAVEN_HOME=/usr/local/apache-maven-3.3.9
export FINDBUGS_HOME=/usr/local/findbugs-3.0.1
export PROTOBUF_HOME=/usr/local/protobuf-2.5.0
export ANT_HOME=/usr/local/apache-ant-1.9.9
export PATH=$PATH:$MAVEN_HOME/bin:$FINDBUGS_HOME/bin:$ANT_HOME/bin
export MAVEN_OPTS="-Xmx2g -XX:MaxMetaspaceSize=512M -XX:ReservedCodeCacheSize=512m"
Make the environment variables take effect:
[root@nn1 snappy-1.1.3]# source /etc/profile
Check the maven version:
[root@nn1 snappy-1.1.3]# mvn -v
Apache Maven 3.3.9 (bb52d8502b132ec0a5a3f4c09453c07478323dc5; 2015-11-11T00:41:47+08:00)
Maven home: /usr/local/apache-maven-3.3.9
Java version: 1.8.0_144, vendor: Oracle Corporation
Java home: /usr/java/jdk1.8.0_144/jre
Default locale: zh_CN, platform encoding: UTF-8
OS name: "linux", version: "3.10.0-957.21.3.el7.x86_64", arch: "amd64", family: "unix"
9. Modify maven's settings.xml: change the remote repositories and configure the local repository location
[root@nn1 hadoop_c]# cd /usr/local/apache-maven-3.3.9/conf/
[root@nn1 conf]# cp settings.xml settings.xml.bak
[root@nn1 conf]# rm -rf settings.xml
[root@nn1 conf]# cp /tmp/hadoop_c/settings.xml settings.xml
You can use the ready-made configuration file from the network drive above. The main changes are the local repository:
<!-- local repository -->
<localRepository>/data/maven/repositories</localRepository>
and the remote mirrors:
<mirror>
  <id>huaweicloud</id>
  <mirrorOf>central</mirrorOf>
  <url>https://repo.huaweicloud.com/repository/maven/</url>
</mirror>
<mirror>
  <id>nexus-aliyun</id>
  <mirrorOf>central</mirrorOf>
  <name>Nexus aliyun</name>
  <url>http://maven.aliyun.com/nexus/content/groups/public</url>
</mirror>
<mirror>
  <id>maven</id>
  <name>MavenMirror</name>
  <url>http://repo1.maven.org/maven2/</url>
  <mirrorOf>central</mirrorOf>
</mirror>
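To double-check that maven is picking up the new local repository and mirrors, you can print the effective settings. This uses the standard maven-help-plugin goal; it is not part of the original steps, and it may download the help plugin on first use:
# print the merged settings maven will actually use; the new localRepository and mirrors should appear
[root@nn1 conf]# mvn help:effective-settings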
Download the offline maven repository from the network drive and extract it so that the jars end up under /data/maven/repositories/.
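A minimal sketch of that step, assuming the offline repository is packaged as a hypothetical repositories.tar.gz containing a repositories/ directory (the actual archive name and layout on the network drive may differ):
# create the local repository directory configured in settings.xml
[root@nn1 hadoop_c]# mkdir -p /data/maven/repositories
# extract the offline repository; "repositories.tar.gz" is a placeholder name
[root@nn1 hadoop_c]# tar -xzf repositories.tar.gz -C /data/maven/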
Now we can start compiling the Hadoop source with maven.
[root@nn1 conf]# cd /usr/local/hadoop-2.7.3-src/
[root@nn1 hadoop-2.7.3-src]# nohup mvn clean package -Pdist,native -DskipTests -Dtar -Dbundle.snappy -Dsnappy.lib=/usr/local/lib > /tmp/hadoop_log 2>&1 &
nohup keeps the task running in the background. The output is redirected to a log file, and we monitor the build's progress by watching that file.
[root@nn1 hadoop-2.7.3-src]# tail -f /tmp/hadoop_log
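Once the tail shows the build has stopped, the log ends with maven's reactor summary; a quick way to check whether it succeeded (just a convenience, not from the original post):
# look for maven's final status line in the build log
[root@nn1 hadoop-2.7.3-src]# grep -E "BUILD (SUCCESS|FAILURE)" /tmp/hadoop_log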
10. When the compilation finishes, a tar.gz package is generated under /usr/local/hadoop-2.7.3-src/hadoop-dist/target/. Copy it to our home directory and then send it out to the other 4 machines in one batch.
[root@nn1 ~]# exit
logout
[hadoop@nn1 ~]$ cp /usr/local/hadoop-2.7.3-src/hadoop-dist/target/hadoop-2.7.3.tar.gz ~/
[hadoop@nn1 hadoop_base_op]$ ./scp_all.sh ../hadoop-2.7.3.tar.gz /tmp/
11. Extract the tarball on all 5 machines into each one's /usr/local/
[hadoop@nn1 hadoop_base_op]$ ./ssh_root.sh tar -zxf /tmp/hadoop-2.7.3.tar.gz -C /usr/local/
Fix the file ownership and permissions:
[hadoop@nn1 hadoop_base_op]$ ./ssh_all.sh chmod -R 770 /usr/local/hadoop-2.7.3
[hadoop@nn1 hadoop_base_op]$ ./ssh_root.sh ln -s /usr/local/hadoop-2.7.3 /usr/local/hadoop
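To confirm the extraction and the symlink on every node, the batch scripts from the earlier posts can be reused (a sanity check, assuming ssh_all.sh works as set up in part 01):
# every machine should show /usr/local/hadoop pointing at /usr/local/hadoop-2.7.3
[hadoop@nn1 hadoop_base_op]$ ./ssh_all.sh ls -l /usr/local/hadoop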
At this point the Hadoop installation is done. Check that it was installed correctly:
[hadoop@s1 ~]$ source /etc/profile
[hadoop@s1 ~]$ hadoop checknative
19/07/22 15:37:25 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library system-native
19/07/22 15:37:25 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop:  true /usr/local/hadoop-2.7.3/lib/native/libhadoop.so.1.0.0
zlib:    true /usr/lib64/libz.so.1
snappy:  true /usr/local/hadoop-2.7.3/lib/native/libsnappy.so.1
lz4:     true revision:99
bzip2:   true /usr/lib64/libbz2.so.1
openssl: true /usr/lib64/libcrypto.so
If the shell reports that hadoop is an unknown command, check whether the environment variables are configured correctly, then source the profile again.
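For reference, the relevant entries in /etc/profile should look roughly like the following. This is a sketch based on the /usr/local/hadoop symlink created above; the exact variables were set up in the earlier posts of this series and may differ:
# Hadoop environment (sketch; adjust to match the setup from the earlier posts)
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin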