Environment: CentOS + Hadoop 2.5.2 + Scala 2.10.5 + Spark 1.3.1
1. Download a pre-built Spark package from http://spark.apache.org/downloads.html
2. Prepare Scala
Download scala-2.10.5.rpm from http://www.scala-lang.org/. Do not use 2.11, because Scala 2.11 requires recompiling Spark.
Install Scala:
rpm -ivh scala-2.10.5.rpm
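Optionally, confirm the install before moving on; the reported version should be 2.10.5:
scala -version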
3. Extract Spark
tar -zxvf spark-1.3.1-bin-hadoop2.4.tar.gz
4. Configure environment variables
Append to the end of /etc/profile:
export SPARK_HOME=/usr/local/spark-1.3.1-bin-hadoop2.4
export PATH=$PATH:$SPARK_HOME/bin
# apply the changes
source /etc/profile
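A quick sanity check in a new shell (optional) confirms the variables took effect:
echo $SPARK_HOME
which spark-submit   # should point into $SPARK_HOME/bin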
5. Configure Spark
vi /usr/local/spark-1.3.1-bin-hadoop2.4/conf/spark-env.sh
Append at the end:
export JAVA_HOME=/usr/java/jdk1.7.0_76
export SPARK_MASTER_IP=192.168.1.21
export SPARK_WORKER_MEMORY=2g
export HADOOP_CONF_DIR=/usr/local/hadoop-2.5.2/etc/hadoop
6. Configure the slave nodes
vi /usr/local/spark-1.3.1-bin-hadoop2.4/conf/slaves and list one worker hostname per line:
master
slaver1
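The hostnames in conf/slaves must resolve on every node. If DNS is not available, a minimal /etc/hosts sketch works; 192.168.1.21 is the master address from spark-env.sh above, while the slave address 192.168.1.22 is only an assumed example:
192.168.1.21   master
192.168.1.22   slaver1   # assumed address, adjust to the actual slave IP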
7. Copy the Spark directory (with its configuration) to the slave node
scp -r /usr/local/spark-1.3.1-bin-hadoop2.4/ root@slaver1:/usr/local/
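start-all.sh starts the workers over SSH, so password-less SSH from the master to the slave avoids repeated password prompts. A minimal sketch, run on the master as root (matching the scp above):
ssh-keygen -t rsa          # accept the defaults, empty passphrase
ssh-copy-id root@slaver1   # install the public key on the slave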
8. Start the cluster
cd /usr/local/spark-1.3.1-bin-hadoop2.4/sbin
./start-all.sh
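The standalone master also serves a web UI, by default on port 8080; opening http://192.168.1.21:8080 in a browser should show both workers as ALIVE. A reachability check from the command line:
curl -s http://192.168.1.21:8080 > /dev/null && echo "master web UI is up"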
9. Check whether the cluster started successfully
jps
# on the master, expect: Master, Worker
# on the slave, expect: Worker
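As a final check, a trivial job confirms the cluster actually accepts work. A minimal sketch, assuming the default standalone master port 7077:
cd /usr/local/spark-1.3.1-bin-hadoop2.4
./bin/spark-shell --master spark://192.168.1.21:7077
# inside the shell, a one-line Scala job should return 1000:
# scala> sc.parallelize(1 to 1000).count()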