Hadoop 1.1.2 Pseudo-Distributed Installation Guide

1. Pseudo-distributed installation

      1.1 Set the IP address

                   (1) Open the VMware or VirtualBox virtual network adapter

                   (2) In VMware or VirtualBox, set the network connection mode to host-only

                   (3) In Linux, change the IP: right-click the network icon in the top-right corner and choose Edit Connections...

                            **** The IP must be on the same subnet as the Windows-side virtual adapter, and the gateway must be one that actually exists.

                   (4) Restart the network service: service network restart

                            **** If this fails with an error such as "no suitable adapter", recheck the adapter settings.

                   (5) Verify: run ifconfig
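The static-IP settings from the steps above can be sketched as an ifcfg file. This is a hedged example: the device name eth0, the 192.168.80.x addresses, and the gateway are assumptions to adapt to your own host-only network, and it writes to a temp file so it is safe to run anywhere (on the real machine the file is /etc/sysconfig/network-scripts/ifcfg-eth0 on CentOS-style systems).

```shell
# Example static-IP config for the host-only network (values assumed;
# match them to your Windows-side virtual adapter's subnet).
# Written to a temp file for safety; the real file is
# /etc/sysconfig/network-scripts/ifcfg-eth0.
CFG=$(mktemp)
cat > "$CFG" <<'EOF'
DEVICE=eth0
BOOTPROTO=static
IPADDR=192.168.80.100
NETMASK=255.255.255.0
GATEWAY=192.168.80.1
ONBOOT=yes
EOF
cat "$CFG"
```

After copying this into the real ifcfg file, `service network restart` from step (4) applies it.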

      1.2 Stop the firewall

                   (1) Run: service iptables stop

                   (2) Verify: service iptables status

      1.3 Disable the firewall at boot

                   (1) Run: chkconfig iptables off

                   (2) Verify: chkconfig --list | grep iptables

      1.4 Change the hostname

                   (1) Run: hostname cloud4   (changes the hostname for the current session only)

                   (2) Verify: hostname

                   (3) Run vi /etc/sysconfig/network and change the HOSTNAME entry to make it permanent

                   (4) Verify: run reboot -h now to restart the machine, then check the hostname again
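The persistent change in step (3) amounts to setting HOSTNAME in /etc/sysconfig/network. A minimal sketch, written to a temp file here so it is safe to run (on the real machine edit /etc/sysconfig/network itself):

```shell
# Persistent hostname setting from step (3); this content belongs in
# /etc/sysconfig/network on the real machine (temp file used for safety).
NETFILE=$(mktemp)
cat > "$NETFILE" <<'EOF'
NETWORKING=yes
HOSTNAME=cloud4
EOF
cat "$NETFILE"
```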

      1.5 Bind the IP to the hostname

                   (1) Run: vi /etc/hosts

                            Add a line at the end of the file: 192.168.80.100 cloud4

                   (2) Verify: ping cloud4

                   (3) On Windows, add the same hostname-to-IP mapping in:

                            C:\Windows\System32\drivers\etc\hosts
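The edit in step (1) can also be done non-interactively. A sketch that works on a copy of the file, so running it here is harmless (on the real machine target /etc/hosts itself):

```shell
# Append the hostname mapping from step (1). Operating on a copy here;
# on the real machine append to /etc/hosts directly.
HOSTS=$(mktemp)
cp /etc/hosts "$HOSTS"
echo '192.168.80.100 cloud4' >> "$HOSTS"
tail -n 1 "$HOSTS"
```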

      1.6 Passwordless SSH login

                   (1) Run: ssh-keygen -t rsa (then press Enter through the prompts; the key pair is created under /root/.ssh/)

                   (2) Run: cp /root/.ssh/id_rsa.pub /root/.ssh/authorized_keys to create the authorization file

                   (3) Verify: ssh localhost   (or ssh <hostname>)
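Steps (1) and (2) above can be sketched non-interactively. This version uses a temp directory so it does not touch /root/.ssh when run as an experiment; on the real machine use the /root/.ssh paths from the steps above.

```shell
# Non-interactive key generation: -P '' gives an empty passphrase,
# -q suppresses prompts. Temp dir used so the sketch is safe to run.
KEYDIR=$(mktemp -d)
ssh-keygen -t rsa -P '' -f "$KEYDIR/id_rsa" -q
cat "$KEYDIR/id_rsa.pub" >> "$KEYDIR/authorized_keys"
chmod 600 "$KEYDIR/authorized_keys"   # sshd insists on strict permissions
```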

      1.7 Install the JDK

                   (1) Use WinSCP to copy the JDK and Hadoop archives to /root/Downloads on the Linux machine

                   (2) cp /root/Downloads/* /usr/local

                   (3) cd /usr/local

                            Grant execute permission: chmod u+x jdk-6u24-linux-i586.bin

                   (4) ./jdk-6u24-linux-i586.bin

                   (5) Rename the directory: mv jdk1.6.0_24 jdk

                   (6) Run vi /etc/profile to set the environment variables

                            Add two lines:        export JAVA_HOME=/usr/local/jdk

                                                  export PATH=.:$JAVA_HOME/bin:$PATH

                            Save and exit

                      Run: source /etc/profile

                   (7) Verify: java -version
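A quick way to sanity-check the two profile lines from step (6) without touching your real environment is to evaluate them in a subshell:

```shell
# Evaluate the profile additions in a subshell so nothing leaks out.
(
  export JAVA_HOME=/usr/local/jdk
  export PATH=.:$JAVA_HOME/bin:$PATH
  case ":$PATH:" in
    *":/usr/local/jdk/bin:"*) echo "PATH ok" ;;       # prints: PATH ok
    *)                        echo "PATH missing jdk" ;;
  esac
)
```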

      1.8 Install Hadoop

                   (1) Run: tar -zxvf hadoop-1.1.2.tar.gz to unpack the archive

                   (2) Run: mv hadoop-1.1.2 hadoop

                   (3) Run vi /etc/profile to set the environment variables

                            Add one line:         export HADOOP_HOME=/usr/local/hadoop

                            Change one line:      export PATH=.:$HADOOP_HOME/bin:$JAVA_HOME/bin:$PATH

                            Save and exit

                      Run: source /etc/profile

                   (4) Verify: run hadoop

                   (5) Edit the configuration files under conf/: hadoop-env.sh, core-site.xml, hdfs-site.xml, and mapred-site.xml

                            <1> In hadoop-env.sh, line 9:

                            export JAVA_HOME=/usr/local/jdk/

 

                            <2> In core-site.xml:

                            <configuration>

                                     <property>

                                               <name>fs.default.name</name>

                                               <value>hdfs://cloud4:9000</value>

                                               <description>change your own hostname</description>

                                     </property>

                                     <property>

                                               <name>hadoop.tmp.dir</name>

                                               <value>/usr/local/hadoop/tmp</value>

                                     </property> 

                            </configuration>

                            <3> In hdfs-site.xml:

                            <configuration>

                                     <property>

                                               <name>dfs.replication</name>    # replication factor; the default is 3

                                               <value>1</value>

                                     </property>

                                     <property>

                                               <name>dfs.permissions</name>   # whether permission checking is enabled

                                               <value>false</value>

                                     </property>

                            </configuration>

If the user is the super-user (超級用戶) — the identity the NameNode process runs as — the system performs no permission checks at all.

                            <4> In mapred-site.xml:

                            <configuration>

                                     <property>

                                               <name>mapred.job.tracker</name>

                                               <value>cloud4:9001</value>

                                               <description>change your own hostname</description>

                                     </property>

                            </configuration>

                   (6) Run: hadoop namenode -format to format the NameNode

                   (7) Run: start-all.sh to start Hadoop

                   (8) Verify:

                            <1> Run jps to list the Java processes; you should see five of them: NameNode, SecondaryNameNode, DataNode, JobTracker, and TaskTracker

                            <2> Check in a browser: http://cloud4:50070 and http://cloud4:50030

                                     ***** For this to work from Windows, edit the hosts file under C:/Windows/system32/drivers/etc/
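The jps check in <1> can be scripted. A small helper sketch (check_daemons is a hypothetical name, not part of Hadoop), fed sample jps-style output here so it runs anywhere; on the real machine call it as check_daemons "$(jps)":

```shell
# Report which of the five Hadoop 1.x daemons appear in jps-style output.
# grep -w is used so "NameNode" does not falsely match "SecondaryNameNode".
check_daemons() {
  for p in NameNode SecondaryNameNode DataNode JobTracker TaskTracker; do
    if echo "$1" | grep -qw "$p"; then
      echo "$p running"
    else
      echo "$p MISSING"
    fi
  done
}

# Sample output for illustration (PIDs are made up); on the real
# machine use: check_daemons "$(jps)"
SAMPLE='1201 NameNode
1302 DataNode
1403 SecondaryNameNode
1504 JobTracker
1605 TaskTracker
1706 Jps'
check_daemons "$SAMPLE"
```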

1.9 Suppressing the deprecation warning:

[root@cloud4 ~]# hadoop fs -ls /

Warning: $HADOOP_HOME is deprecated.

 

The fix is as follows:

[root@cloud4 ~]# vi /etc/profile   (add one line)

# /etc/profile

export HADOOP_HOME_WARN_SUPPRESS=1

 

export JAVA_HOME=/usr/local/jdk

export HADOOP_HOME=/usr/local/hadoop

export PATH=.:$HADOOP_HOME/bin:$JAVA_HOME/bin:$PATH

 

[root@cloud4 ~]# source /etc/profile   (takes effect immediately)


小頭:

I know the environment at our companies has made everyone careful and cautious, but I hope this group can become a warm family.

I will keep updating this later; ideas and suggestions are welcome!
