Install the JDK as the root user
[root@hadoop1 soft]# tar -zxvf jdk-8u73-linux-x64.tar.gz -C /usr/local/
[root@hadoop1 soft]# vi /etc/profile
#JAVA
export JAVA_HOME=/usr/local/jdk1.8.0_73
export CLASSPATH=$CLASSPATH:$JAVA_HOME/lib:$JAVA_HOME/jre/lib
export PATH=$JAVA_HOME/bin:$JAVA_HOME/jre/bin:$PATH:$HOME/bin
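The new variables are not picked up by the current shell until the profile is re-read; assuming a bash shell, source it before checking the version:
[root@hadoop1 soft]# source /etc/profile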
[root@hadoop1 soft]# java -version
The rest of the setup is done as the hadoop user.
Normally, even an ssh connection from the machine to itself requires entering a password:
[hadoop@hadoop1 ~]$ ssh-keygen -t rsa
[hadoop@hadoop1 ~]$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
[hadoop@hadoop1 ~]$ chmod 600 ~/.ssh/authorized_keys
[root@hadoop1 ~]# vi /etc/hosts
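The hosts file needs an entry mapping the hostname hadoop1 to the machine's IP address; the address below is only a placeholder for illustration, substitute your own:
192.168.1.101   hadoop1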
[hadoop@hadoop1 ~]$ ssh hadoop1
This time no password is required, so passwordless login is set up successfully.
Install Hadoop as the hadoop user:
[hadoop@hadoop1 ~]$ tar -zxvf hadoop-2.7.5-centos-6.7.tar.gz -C apps/
Create a symlink to the extracted Hadoop package:
[hadoop@hadoop1 ~]$ cd apps/
[hadoop@hadoop1 apps]$ ll
total 4
drwxr-xr-x. 9 hadoop hadoop 4096 Dec 24 13:43 hadoop-2.7.5
[hadoop@hadoop1 apps]$ ln -s hadoop-2.7.5/ hadoop
Go into the /home/hadoop/apps/hadoop/etc/hadoop/ directory and edit the configuration files.
[hadoop@hadoop1 hadoop]$ vi hadoop-env.sh
export JAVA_HOME=/usr/local/jdk1.8.0_73
[hadoop@hadoop1 hadoop]$ vi core-site.xml
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://hadoop1:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/home/hadoop/data/hadoopdata</value>
    </property>
</configuration>
[hadoop@hadoop1 hadoop]$ vi hdfs-site.xml
dfs.replication is the number of HDFS block copies; for a single machine, 1 copy is enough:
<configuration>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>/home/hadoop/data/hadoopdata/name</value>
        <description>To keep the metadata safe, several different directories are usually configured</description>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>/home/hadoop/data/hadoopdata/data</value>
        <description>Data storage directory of the DataNode</description>
    </property>
    <property>
        <name>dfs.replication</name>
        <value>2</value>
        <description>Number of replicas per HDFS data block; the default is 3</description>
    </property>
</configuration>
[hadoop@hadoop1 hadoop]$ cp mapred-site.xml.template mapred-site.xml
[hadoop@hadoop1 hadoop]$ vi mapred-site.xml
mapreduce.framework.name sets the MapReduce framework to yarn; second-generation Hadoop MapReduce also runs on the YARN resource management system.
<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>
[hadoop@hadoop1 hadoop]$ vi yarn-site.xml
<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
        <description>The shuffle service that the YARN cluster provides for MapReduce programs</description>
    </property>
</configuration>
Important:
1. If you installed as the root user, edit /etc/profile (system-wide variables).
2. If you installed as a regular user, edit ~/.bashrc (per-user variables).
[hadoop@hadoop1 ~]$ vi .bashrc
#HADOOP_HOME
export HADOOP_HOME=/home/hadoop/apps/hadoop-2.7.5
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
Make the environment variables take effect:
[hadoop@hadoop1 bin]$ source ~/.bashrc
[hadoop@hadoop1 ~]$ hadoop version
The directory paths here must match the ones configured in hdfs-site.xml:
[hadoop@hadoop1 ~]$ mkdir -p /home/hadoop/data/hadoopdata/name
[hadoop@hadoop1 ~]$ mkdir -p /home/hadoop/data/hadoopdata/data
[hadoop@hadoop1 ~]$ hadoop namenode -format
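If the format succeeds, the output should contain a line like "Storage directory /home/hadoop/data/hadoopdata/name has been successfully formatted." (In Hadoop 2, hdfs namenode -format is the preferred spelling; hadoop namenode -format still works but prints a deprecation warning.)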
[hadoop@hadoop1 ~]$ start-dfs.sh
[hadoop@hadoop1 ~]$ start-yarn.sh
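To confirm the daemons actually came up, jps should list them; on this single node the expected set is roughly NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager (plus Jps itself):
[hadoop@hadoop1 ~]$ jps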
Open port 50070 in a browser: http://hadoop1:50070
Other ports:
port 8088: cluster and all applications
port 50070: Hadoop NameNode
port 50090: Secondary NameNode
port 50075: DataNode
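As an optional smoke test, write a file into HDFS and run the bundled pi example; the jar path below assumes the default layout of the 2.7.5 tarball:
[hadoop@hadoop1 ~]$ hdfs dfs -mkdir -p /user/hadoop
[hadoop@hadoop1 ~]$ hdfs dfs -put ~/.bashrc /user/hadoop/
[hadoop@hadoop1 ~]$ hdfs dfs -ls /user/hadoop
[hadoop@hadoop1 ~]$ hadoop jar ~/apps/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.5.jar pi 2 10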
Install Scala as the root user.
Scala download page: http://www.scala-lang.org/download/all.html
Pick the version that matches your platform; here we install on Linux, so the chosen version is scala-2.11.8.tgz.
[root@hadoop1 hadoop]# tar -zxvf scala-2.11.8.tgz -C /usr/local/
[root@hadoop1 hadoop]# vi /etc/profile
#Scala
export SCALA_HOME=/usr/local/scala-2.11.8
export PATH=$SCALA_HOME/bin:$PATH
Save and make it take effect immediately:
[root@hadoop1 scala-2.11.8]# source /etc/profile
[root@hadoop1 ~]# scala -version
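Besides -version, a one-line expression is a quick sanity check, since scala -e evaluates an expression and exits:
[root@hadoop1 ~]# scala -e 'println("scala ok")'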
Spark download locations:
http://spark.apache.org/downloads.html
http://mirrors.hust.edu.cn/apache/
https://mirrors.tuna.tsinghua.edu.cn/apache/
[hadoop@hadoop1 ~]$ tar -zxvf spark-2.3.0-bin-hadoop2.7.tgz -C apps/
[hadoop@hadoop1 ~]$ cd apps/
[hadoop@hadoop1 apps]$ ls
hadoop  hadoop-2.7.5  spark-2.3.0-bin-hadoop2.7
[hadoop@hadoop1 apps]$ ln -s spark-2.3.0-bin-hadoop2.7/ spark
[hadoop@hadoop1 apps]$ cd spark/conf/
Copy spark-env.sh.template to spark-env.sh and append the following configuration at the end of the file:
[hadoop@hadoop1 conf]$ cp spark-env.sh.template spark-env.sh
[hadoop@hadoop1 conf]$ vi spark-env.sh
export JAVA_HOME=/usr/local/jdk1.8.0_73
export SCALA_HOME=/usr/local/scala-2.11.8
export HADOOP_HOME=/home/hadoop/apps/hadoop-2.7.5
export HADOOP_CONF_DIR=/home/hadoop/apps/hadoop-2.7.5/etc/hadoop
export SPARK_MASTER_IP=hadoop1
export SPARK_MASTER_PORT=7077
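Note: in Spark 2.x the variable documented for the standalone master is SPARK_MASTER_HOST; the older SPARK_MASTER_IP used above should still be honored by the 2.3.0 start scripts, possibly with a deprecation notice.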
[hadoop@hadoop1 conf]$ vi ~/.bashrc
#SPARK_HOME
export SPARK_HOME=/home/hadoop/apps/spark
export PATH=$PATH:$SPARK_HOME/bin
Save and make it take effect immediately:
[hadoop@hadoop1 conf]$ source ~/.bashrc
[hadoop@hadoop1 ~]$ ~/apps/spark/sbin/start-all.sh
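Note the full sbin path: Hadoop also ships a start-all.sh, so calling it unqualified could launch the wrong one. Once the master and worker are up (the standalone master's Web UI defaults to port 8080), a minimal smoke test is to attach a shell to the master and run a small job; summing 1 to 100 should return 5050.0:
[hadoop@hadoop1 ~]$ spark-shell --master spark://hadoop1:7077
scala> sc.parallelize(1 to 100).sum()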