Installing Hadoop 1.2.1 on Linux

My server ends up hosting a lot of software, so I created a doc folder under the root directory to hold downloaded packages.

1. Create the doc folder for storing packages

mkdir /doc

2. Go into the doc folder and download the hadoop-1.2.1 package, or grab it from my Baidu Cloud share at http://pan.baidu.com/s/1gdSws07

cd /doc
wget http://mirror.bit.edu.cn/apache/hadoop/common/hadoop-1.2.1/hadoop-1.2.1.tar.gz

3. When the download finishes, extract hadoop-1.2.1.tar.gz

tar -zxf hadoop-1.2.1.tar.gz

4. There should now be a hadoop-1.2.1 folder inside doc. doc is just where we keep package downloads; when installing a service we usually give it a dedicated folder. My Hadoop service is installed under /usr/local/hadoop/hadoop-1.2.1, so we move hadoop-1.2.1 from doc into /usr/local/hadoop:
# go to /usr/local
cd /usr/local
# create the hadoop folder
mkdir hadoop
# move the hadoop-1.2.1 folder into the hadoop folder
mv /doc/hadoop-1.2.1 /usr/local/hadoop

5. OK, now we can start on Hadoop's configuration files

Configure hadoop-env.sh

Use echo to check JAVA_HOME, the JDK install directory:

[root@iZ94j7ahvuvZ conf]# echo $JAVA_HOME 
/usr/local/java/jdk1.7.0
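
If echo prints an empty line, JAVA_HOME isn't set yet and needs to point at your JDK before Hadoop will run. A minimal sketch, assuming the JDK sits at /usr/local/java/jdk1.7.0 (adjust to your own path):

# only needed when $JAVA_HOME is empty; add to your shell profile to persist
export JAVA_HOME=/usr/local/java/jdk1.7.0
export PATH=$PATH:$JAVA_HOME/bin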

Edit the JAVA_HOME entry in hadoop-env.sh.

Go into Hadoop's conf folder:

cd /usr/local/hadoop/hadoop-1.2.1/conf
vi hadoop-env.sh

Set the JAVA_HOME property:

export JAVA_HOME=/usr/local/java/jdk1.7.0   (use your own JDK directory)

6. Configure core-site.xml

<configuration>
  <!-- base directory for Hadoop's temporary files -->
  <property>
       <name>hadoop.tmp.dir</name>
       <value>/hadoop</value>
  </property>

  <!-- where the NameNode stores the filesystem metadata -->
  <property>
     <name>dfs.name.dir</name>
     <value>/hadoop/name</value>
  </property>

  <!-- URI of the default filesystem, i.e. the NameNode address -->
  <property>
     <name>fs.default.name</name>
     <value>hdfs://localhost:9000</value>
  </property>
</configuration>
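
After editing, a quick syntax check catches malformed XML before the daemons do; a small sketch, assuming xmllint (from libxml2) is installed:

# exits silently when the file is well-formed, prints the error otherwise
xmllint --noout core-site.xml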

7. Configure hdfs-site.xml
<configuration>
    <!-- where the DataNode stores HDFS data blocks -->
    <property>
        <name>dfs.data.dir</name>
        <value>/hadoop/data</value>
    </property>
</configuration>
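
The two files above point HDFS metadata and data at directories under /hadoop. Hadoop creates the name and data subdirectories itself during format and startup, but the base path should exist and be writable by the user running the daemons (root, in this walkthrough); a minimal sketch:

# base directory referenced by hadoop.tmp.dir, dfs.name.dir and dfs.data.dir
mkdir -p /hadoop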

8. Configure mapred-site.xml
<configuration>
    <!-- host and port of the MapReduce JobTracker -->
    <property>
        <name>mapred.job.tracker</name>
        <value>localhost:9001</value>
    </property>
</configuration>

9. Configure Hadoop's environment variables in /etc/profile

cd /etc
vi profile

Append at the end:

export HADOOP_HOME=/usr/local/hadoop/hadoop-1.2.1
export PATH=$PATH:/usr/local/java/jdk1.7.0/bin:$HADOOP_HOME/bin   (use your own JDK path here)

HADOOP_HOME must be defined before the PATH line, otherwise the setting seems to load but has no effect.
Save and exit, then run source /etc/profile to apply the changes to the current shell.
Now type hadoop. If you see a usage prompt like the one java prints when run with no arguments, it worked:
[root@iZ94j7ahvuvZ conf]# hadoop
Usage: hadoop [--config confdir] COMMAND
where COMMAND is one of:
  namenode -format     format the DFS filesystem
  secondarynamenode    run the DFS secondary namenode
  namenode             run the DFS namenode
  datanode             run a DFS datanode
  dfsadmin             run a DFS admin client
  mradmin              run a Map-Reduce admin client
  fsck                 run a DFS filesystem checking utility
  fs                   run a generic filesystem user client
  balancer             run a cluster balancing utility
  oiv                  apply the offline fsimage viewer to an fsimage
  fetchdt              fetch a delegation token from the NameNode
  jobtracker           run the MapReduce job Tracker node
  pipes                run a Pipes job
  tasktracker          run a MapReduce task Tracker node
  historyserver        run job history servers as a standalone daemon
  job                  manipulate MapReduce jobs
  queue                get information regarding JobQueues
  version              print the version
  jar <jar>            run a jar file
  distcp <srcurl> <desturl> copy file or directories recursively
  distcp2 <srcurl> <desturl> DistCp version 2
  archive -archiveName NAME -p <parent path> <src>* <dest> create a hadoop archive
  classpath            prints the class path needed to get the
                       Hadoop jar and the required libraries
  daemonlog            get/set the log level for each daemon
 or
  CLASSNAME            run the class named CLASSNAME
10. Start Hadoop
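
On a fresh install, HDFS has to be formatted once before the first start; this is the namenode -format command from the usage listing above (skip it on later runs):

# first run only: initializes the NameNode storage under dfs.name.dir
hadoop namenode -format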
 
cd /usr/local/hadoop/hadoop-1.2.1/bin
./start-all.sh
# you will be prompted for a password 3 times; output like the following means it started:

[root@iZ94j7ahvuvZ bin]# ./start-all.sh 
namenode running as process 1341. Stop it first.
root@localhost's password: 
localhost: starting datanode, logging to /usr/local/hadoop/hadoop-1.2.1/libexec/../logs/hadoop-root-datanode-iZ94j7ahvuvZ.out
root@localhost's password: 
localhost: starting secondarynamenode, logging to /usr/local/hadoop/hadoop-1.2.1/libexec/../logs/hadoop-root-secondarynamenode-iZ94j7ahvuvZ.out
starting jobtracker, logging to /usr/local/hadoop/hadoop-1.2.1/libexec/../logs/hadoop-root-jobtracker-iZ94j7ahvuvZ.out
root@localhost's password: 
localhost: starting tasktracker, logging to /usr/local/hadoop/hadoop-1.2.1/libexec/../logs/hadoop-root-tasktracker-iZ94j7ahvuvZ.out
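
Once start-all.sh returns, jps (shipped with the JDK) is a quick way to confirm the daemons are actually up:

# expect NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker
jps

The three password prompts come from start-all.sh reaching localhost over SSH. Optionally, passwordless SSH gets rid of them; a common sketch, assuming no existing key you care about at ~/.ssh/id_rsa:

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys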