Hadoop cluster installation and configuration

Prerequisites:

  •     JDK 1.7.0_71

  •     Hadoop 2.2.0

  •     VMware virtual machines

  •     CentOS 6.3

Installation is just unpacking the tarball; look up JDK installation on your own.

The main work is configuration. The current setup is one master and one slave:

Master: 192.168.100.128

Slave: 192.168.100.129

Steps:

1. First configure SSH. CentOS normally installs SSH during system installation, so the work here is setting up passwordless login. In the ~/.ssh directory, run ssh-keygen -t rsa and press Enter through every prompt.

cat id_rsa.pub >> authorized_keys. On the slave server, id_rsa_1.pub is the master's public key, copied over from the master; append it to the slave's authorized_keys the same way. After restarting the sshd service as root on both master and slave, the master can log in to the slave without a password.
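The key-generation and append steps above can be sketched as follows. A scratch directory stands in for ~/.ssh so the commands are safe to try anywhere; on the real servers the key pair lives in ~/.ssh on the master and authorized_keys sits on the machine you want to log in to.

```shell
KEYDIR=$(mktemp -d)                               # stand-in for ~/.ssh
ssh-keygen -t rsa -N "" -f "$KEYDIR/id_rsa" -q    # -N "" = empty passphrase, no prompts
cat "$KEYDIR/id_rsa.pub" >> "$KEYDIR/authorized_keys"
chmod 600 "$KEYDIR/authorized_keys"               # sshd rejects group/world-writable files
ls "$KEYDIR"
```

The chmod matters: sshd silently ignores an authorized_keys file with loose permissions, which is a common reason passwordless login "doesn't work" after the keys are in place.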

2. Hadoop configuration (as root)

Before configuring, create the following directories in the local filesystem on the master:

~/dfs/name

~/dfs/data

~/tmp
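These can be created in one command; the paths match the dfs.namenode.name.dir, dfs.datanode.data.dir, and hadoop.tmp.dir values used in the config files below.

```shell
# mkdir -p creates parent directories as needed and is a no-op if they exist.
mkdir -p ~/dfs/name ~/dfs/data ~/tmp
ls -d ~/dfs/name ~/dfs/data ~/tmp
```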

Seven configuration files are involved:

~/hadoop-2.2.0/etc/hadoop/hadoop-env.sh

~/hadoop-2.2.0/etc/hadoop/yarn-env.sh

~/hadoop-2.2.0/etc/hadoop/slaves

~/hadoop-2.2.0/etc/hadoop/core-site.xml

~/hadoop-2.2.0/etc/hadoop/hdfs-site.xml

~/hadoop-2.2.0/etc/hadoop/mapred-site.xml

~/hadoop-2.2.0/etc/hadoop/yarn-site.xml

A few of these files do not exist by default; create them by copying the corresponding .template files.
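For example, mapred-site.xml ships only as mapred-site.xml.template in Hadoop 2.2.0, and the fix is a single cp. The sketch below runs against a scratch directory with a stand-in template so it can be tried anywhere; on a real node, run the cp inside ~/hadoop-2.2.0/etc/hadoop instead.

```shell
CONF=$(mktemp -d)                                 # stand-in for ~/hadoop-2.2.0/etc/hadoop
printf '<configuration>\n</configuration>\n' > "$CONF/mapred-site.xml.template"
cp "$CONF/mapred-site.xml.template" "$CONF/mapred-site.xml"   # the actual step
ls "$CONF"
```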

(All of the following is configured on the master server: Master.Hadoop.)

Config file 1: hadoop-env.sh

Set the JAVA_HOME value (export JAVA_HOME=/psy/jdk1.7.0_71)

Config file 2: yarn-env.sh

Set the JAVA_HOME value (export JAVA_HOME=/psy/jdk1.7.0_71)

Config file 3: slaves (this file lists all slave nodes)

Add the following line:

Slave1.Hadoop

Config file 4: core-site.xml

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://Master.Hadoop:9000</value>
  </property>
  <property>
    <name>io.file.buffer.size</name>
    <value>131072</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>file:/home/psy/tmp</value>
    <description>A base for other temporary directories.</description>
  </property>
  <property>
    <name>hadoop.proxyuser.psy.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.psy.groups</name>
    <value>*</value>
  </property>
</configuration>

Config file 5: hdfs-site.xml

<configuration>
  <property>
    <name>dfs.namenode.secondary.http-address</name>
    <value>Master.Hadoop:9001</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <value>file:/home/psy/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <value>file:/home/psy/dfs/data</value>
  </property>
  <property>
    <!-- with a single DataNode, blocks stay under-replicated at 3; 1 matches this cluster -->
    <name>dfs.replication</name>
    <value>3</value>
  </property>
  <property>
    <name>dfs.webhdfs.enabled</name>
    <value>true</value>
  </property>
</configuration>

Config file 6: mapred-site.xml

<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.address</name>
    <value>Master.Hadoop:10020</value>
  </property>
  <property>
    <name>mapreduce.jobhistory.webapp.address</name>
    <value>Master.Hadoop:19888</value>
  </property>
</configuration>

Config file 7: yarn-site.xml

<configuration>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
    <value>org.apache.hadoop.mapred.ShuffleHandler</value>
  </property>
  <property>
    <name>yarn.resourcemanager.address</name>
    <value>Master.Hadoop:8032</value>
  </property>
  <property>
    <name>yarn.resourcemanager.scheduler.address</name>
    <value>Master.Hadoop:8030</value>
  </property>
  <property>
    <name>yarn.resourcemanager.resource-tracker.address</name>
    <value>Master.Hadoop:8031</value>
  </property>
  <property>
    <name>yarn.resourcemanager.admin.address</name>
    <value>Master.Hadoop:8033</value>
  </property>
  <property>
    <name>yarn.resourcemanager.webapp.address</name>
    <value>Master.Hadoop:8088</value>
  </property>
</configuration>

3. Edit the hosts file and the hostname

127.0.0.1   localhost localhost.localdomain localhost4 localhost4.localdomain4

::1         localhost localhost.localdomain localhost6 localhost6.localdomain6

192.168.100.128  Master.Hadoop

192.168.100.129  Slave1.Hadoop

Add the IP-to-hostname mapping for every Hadoop node, and do this on every node. Do not comment out the 127.0.0.1 line; removing it is a mistake.

4. Copy the Hadoop directory from the master into the corresponding directory on the slave (for example, scp -r ~/hadoop-2.2.0 psy@Slave1.Hadoop:~/); keep the user names on the two machines the same if at all possible.

5. Start Hadoop

Run start-all.sh from the sbin directory inside the Hadoop folder, or start in two steps: start-dfs.sh first, then start-yarn.sh.
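The two-step launch looks like this. The guard makes the sketch harmless on a machine without Hadoop on the PATH; on the master, the scripts live in ~/hadoop-2.2.0/sbin.

```shell
if command -v start-dfs.sh >/dev/null 2>&1; then
  start-dfs.sh          # brings up NameNode, SecondaryNameNode, and the DataNodes
  start-yarn.sh         # brings up ResourceManager and the NodeManagers
  STATUS="started"
else
  echo "Hadoop sbin scripts not on PATH; run this on the master"
  STATUS="hadoop-not-on-path"
fi
```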

6. Check the processes with jps

On the master you should see NameNode, SecondaryNameNode, and ResourceManager; on the slave, DataNode and NodeManager.
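jps ships with the JDK, so it works anywhere Java is installed; the guard below keeps the sketch safe on a box without one.

```shell
if command -v jps >/dev/null 2>&1; then
  jps                   # lists JVM processes: NameNode/ResourceManager on the master,
                        # DataNode/NodeManager on the slave
  JPS_RESULT="ran"
else
  echo "jps not found; is the JDK bin directory on PATH?"
  JPS_RESULT="missing"
fi
```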

7. Check the NameNode web UI on the master at port 50070 (http://Master.Hadoop:50070).

8. Congratulations, this small cluster is installed and running.
