For how to set up a cluster environment, see: http://blog.csdn.net/jediael_lu/article/details/45145767
1. Environment preparation
Step 1: Install Linux and the JDK
Step 2: Download Hadoop 2.6.0 and extract it
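If you want the corresponding commands, something like the following should work (the archive URL and the /mnt/jediael target directory are only examples, chosen to match the paths used later in this post):
$ wget https://archive.apache.org/dist/hadoop/common/hadoop-2.6.0/hadoop-2.6.0.tar.gz
$ tar -xzf hadoop-2.6.0.tar.gz -C /mnt/jediael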
Step 3: Configure passwordless SSH
(1) Check whether passwordless login already works:
$ ssh localhost
(2) If not:
$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
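Depending on how sshd is configured on your machine, you may also need to tighten the permissions on the key file before passwordless login works:
$ chmod 0600 ~/.ssh/authorized_keys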
Step 4: Add the following to /etc/profile
#hadoop setting
export PATH=$PATH:/mnt/jediael/hadoop-2.6.0/bin:/mnt/jediael/hadoop-2.6.0/sbin
export HADOOP_HOME=/mnt/jediael/hadoop-2.6.0
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
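To make the new variables take effect in the current shell (rather than only in new login sessions), reload the file:
$ source /etc/profile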
2. Install HDFS
Step 1: Configure etc/hadoop/core-site.xml:
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
Step 2: Configure etc/hadoop/hdfs-site.xml:
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>
Step 3: Format the NameNode
$ bin/hdfs namenode -format
Step 4: Start HDFS
$ sbin/start-dfs.sh
Step 5: Open the web UI to verify that HDFS is up
http://localhost:50070/
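As an extra check, running jps should now list NameNode, DataNode, and SecondaryNameNode processes:
$ jps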
Step 6: Run the bundled example
(1) Create the directories
$ bin/hdfs dfs -mkdir /user
$ bin/hdfs dfs -mkdir /user/jediael
(2) Copy the input files
$ bin/hdfs dfs -put etc/hadoop input
(3) Run the example
$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar grep input output 'dfs[a-z.]+'
(4) Check the output
$ bin/hdfs dfs -cat output/*
6 dfs.audit.logger
4 dfs.class
3 dfs.server.namenode.
2 dfs.period
2 dfs.audit.log.maxfilesize
2 dfs.audit.log.maxbackupindex
1 dfsmetrics.log
1 dfsadmin
1 dfs.servers
1 dfs.replication
1 dfs.file
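Alternatively, you can copy the output directory from HDFS to the local filesystem and inspect it there:
$ bin/hdfs dfs -get output output
$ cat output/*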
(5) Stop HDFS
$ sbin/stop-dfs.sh
3. Install YARN
Step 1: Configure etc/hadoop/mapred-site.xml
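In a fresh 2.6.0 distribution this file may not exist yet; if so, create it from the bundled template before editing it:
$ cp etc/hadoop/mapred-site.xml.template etc/hadoop/mapred-site.xml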
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>
Step 2: Configure etc/hadoop/yarn-site.xml
<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
</configuration>
Step 3: Start YARN
$ sbin/start-yarn.sh
Step 4: Open the web UI to check YARN
http://localhost:8088/
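Again, jps is a quick sanity check; with YARN running it should additionally show ResourceManager and NodeManager processes:
$ jps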
Step 5: Run a MapReduce job
$ bin/hadoop fs -mkdir /input
$ bin/hadoop fs -copyFromLocal /etc/profile /input
$ cd /mnt/jediael/hadoop-2.6.0/share/hadoop/mapreduce
$ /mnt/jediael/hadoop-2.6.0/bin/hadoop jar hadoop-mapreduce-examples-2.6.0.jar wordcount /input /output
Check the result:
$ /mnt/jediael/hadoop-2.6.0/bin/hadoop fs -cat /output/*
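When you are finished, YARN can be stopped the same way it was started:
$ sbin/stop-yarn.sh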
Copyright notice: This is the blogger's original article and may not be reproduced without the blogger's permission.