Installing and Configuring Single-Node Hadoop on CentOS 7

jps cannot be found

  • yum install java-1.8.0-openjdk-devel.x86_64
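The -devel package is what provides jps; once it is installed, a quick sanity check (exact version strings will differ) is:

java -version
jps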

Install and configure passwordless SSH login

  • yum install ssh
  • ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
  • cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
  • chmod 0600 ~/.ssh/authorized_keys
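To confirm passwordless login works, ssh to localhost; it should drop into a shell without asking for a password (the very first connection may still ask to trust the host key):

ssh localhost
exit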

Install Hadoop

[root@localhost hadoop-2.6.5]# cd ~
[root@localhost ~]# vim .bash_profile

Add the following inside it:

export HADOOP_HOME=/home/kkxmoye/Downloads/hadoop-2.6.5
PATH=$JAVA_HOME/bin:$PATH:$HOME/bin:$HADOOP_HOME/bin

Run source .bash_profile to make the configuration take effect.
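A quick way to confirm the environment is in effect (assuming the tarball really sits at the path above) is:

echo $HADOOP_HOME
hadoop version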

  • cd  /home/kkxmoye/Downloads/hadoop-2.6.5
  • vim etc/hadoop/core-site.xml

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
  • vim etc/hadoop/hdfs-site.xml
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>


  • vim etc/hadoop/mapred-site.xml
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
    <property>
        <name>mapreduce.admin.user.env</name>
        <value>HADOOP_MAPRED_HOME=$HADOOP_COMMON_HOME</value>
    </property>
    <property>
        <name>yarn.app.mapreduce.am.env</name>
        <value>HADOOP_MAPRED_HOME=$HADOOP_COMMON_HOME</value>
    </property>
</configuration>


  • vim etc/hadoop/yarn-site.xml
<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>
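With all four files edited, the effective values can be spot-checked with hdfs getconf; this is just a sanity check, not part of the original steps:

bin/hdfs getconf -confKey fs.defaultFS
bin/hdfs getconf -confKey dfs.replication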


  • vim sbin/start-dfs.sh
    • Add the following at the top of the file:
HDFS_DATANODE_USER=root  
HDFS_DATANODE_SECURE_USER=hdfs  
HDFS_NAMENODE_USER=root  
HDFS_SECONDARYNAMENODE_USER=root 

  • vim sbin/stop-dfs.sh

Add the following at the top of the file:

HDFS_DATANODE_USER=root  
HDFS_DATANODE_SECURE_USER=hdfs  
HDFS_NAMENODE_USER=root  
HDFS_SECONDARYNAMENODE_USER=root 

  • vim sbin/start-yarn.sh

    • Add the following at the top of the file:
YARN_RESOURCEMANAGER_USER=root
HADOOP_SECURE_DN_USER=yarn
YARN_NODEMANAGER_USER=root

  • vim sbin/stop-yarn.sh

    • Add the following at the top of the file:
YARN_RESOURCEMANAGER_USER=root
HADOOP_SECURE_DN_USER=yarn
YARN_NODEMANAGER_USER=root

  • vim  etc/hadoop/hadoop-env.sh

    • Set JAVA_HOME (an example line is shown below)
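With the OpenJDK package installed earlier, the line in hadoop-env.sh would look roughly like this (the exact path depends on the installed JDK build, so adjust as needed):

export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk

Before the very first start, HDFS also needs the NameNode formatted, a one-time step not shown in the original walkthrough:

bin/hdfs namenode -format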

Start Hadoop

  •  sbin/start-dfs.sh
  •  sbin/start-yarn.sh 
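If both scripts start cleanly, jps should list the usual single-node daemons (PIDs will vary):

jps
# expect NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager, Jps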

Stop Hadoop

  • sbin/stop-dfs.sh
  • sbin/stop-yarn.sh

Access the Hadoop web UI to test

  • http://192.168.48.133:8088/
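Port 8088 is the YARN ResourceManager UI; on Hadoop 2.x the HDFS NameNode UI typically listens on port 50070. A quick reachability check from the command line, using the same address as above:

curl -s http://192.168.48.133:8088/cluster | head
curl -s http://192.168.48.133:50070/ | head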

wordcount test

  • Create local sample files

[root@localhost hadoop-2.6.5]# mkdir /home/kkxmoye/Downloads/file
[root@localhost hadoop-2.6.5]# cd ../file/
[root@localhost file]# echo "hello world" > file1.txt
[root@localhost file]# echo "hello hadoop" > file2.txt
[root@localhost file]# echo "hello mapreduce" >> file2.txt
[root@localhost file]# ls
file1.txt  file2.txt

  • Create the input directory on HDFS

[root@localhost file]# cd ../hadoop-2.6.5/
[root@localhost hadoop-2.6.5]# bin/hadoop fs -mkdir /hdfsinput
[root@localhost hadoop-2.6.5]# bin/hadoop fs -put /home/kkxmoye/Downloads/file/file* /hdfsinput
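Listing the directory should confirm the upload; expect to see file1.txt and file2.txt under /hdfsinput (sizes and timestamps will differ):

bin/hadoop fs -ls /hdfsinput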

Run Hadoop's bundled wordcount example

  • bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.5.jar wordcount /hdfsinput /hdfsoutput

  • Once the job finishes successfully, inspect the output:

  • bin/hadoop fs -ls /hdfsoutput

  • bin/hadoop fs -cat /hdfsoutput/part-r-00000

  • Viewing part-r-00000 shows that hello appears 3 times, while hadoop, mapreduce, and world each appear once; the expected output is shown below.
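Based on the three input lines, the full contents of part-r-00000 should be roughly the following (word and count, tab-separated, sorted by word):

hadoop    1
hello     3
mapreduce 1
world     1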