Spark 1.4.1 Standalone Distributed Cluster Setup

Spark 1.4.1

Cluster hosts:

vm-007
vm-008
vm-009

1. Install Scala 2.10.4

Edit .bashrc (or /etc/profile) and add the following two lines:

export SCALA_HOME=/opt/software/scala-2.10.4
export PATH=$SCALA_HOME/bin:$PATH
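Scala has to be installed at the same path on every node. A minimal sanity check after reloading the shell configuration (a sketch; run it on vm-007, vm-008 and vm-009):

source ~/.bashrc                 # or: source /etc/profile
echo $SCALA_HOME                 # expect /opt/software/scala-2.10.4
scala -version                   # expect "Scala code runner version 2.10.4"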

2. Configure the files under lwj@vm-007:/opt/software/spark-1.4.1-bin-hadoop2.6/conf

Edit slaves:

vm-007
vm-008
vm-009

Edit spark-env.sh:

export JAVA_HOME=/opt/software/jdk1.7.0_80
export SPARK_MASTER_IP=vm-007
export SPARK_WORKER_MEMORY=512m
export SPARK_MASTER_WEBUI_PORT=4650
export MASTER=spark://vm-007:7077
export SPARK_LOCAL_DIRS=/disk/spark/local
export SCALA_HOME=/opt/software/scala-2.10.4
export SPARK_HOME=/opt/software/spark-1.4.1-bin-hadoop2.6
export SPARK_PID_DIR=$SPARK_HOME/pids
export SPARK_WORKER_DIR=/disk/spark/worker
export SPARK_JAVA_OPTS="-server"
export SPARK_TMP_DIRS=/disk/spark/temp
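slaves and spark-env.sh must be identical on all three machines, and the directories referenced above (SPARK_LOCAL_DIRS, SPARK_WORKER_DIR, SPARK_TMP_DIRS, SPARK_PID_DIR) need to exist before the daemons start. A minimal sketch for doing this from vm-007, assuming passwordless SSH as user lwj and the same install path on every host:

SPARK_CONF=/opt/software/spark-1.4.1-bin-hadoop2.6/conf
# create the data/PID directories referenced in spark-env.sh on every node
for host in vm-007 vm-008 vm-009; do
  ssh lwj@$host "mkdir -p /disk/spark/local /disk/spark/worker /disk/spark/temp /opt/software/spark-1.4.1-bin-hadoop2.6/pids"
done
# push the edited conf files to the worker nodes
for host in vm-008 vm-009; do
  scp $SPARK_CONF/slaves $SPARK_CONF/spark-env.sh lwj@$host:$SPARK_CONF/
done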

3. Start / Stop

lwj@vm-007:/opt/software/spark-1.4.1-bin-hadoop2.6$ ./sbin/start-all.sh 

lwj@vm-007:/opt/software/spark-1.4.1-bin-hadoop2.6$ ./sbin/stop-all.sh 

Spark Master node:

lwj@vm-007:/opt/software/spark-1.4.1-bin-hadoop2.6$ jps
5437 Worker
5260 Master
Spark Worker nodes:

lwj@vm-009:/opt/software/spark-1.4.1-bin-hadoop2.6$ jps
7277 Worker

lwj@vm-008:/opt/software/spark-1.4.1-bin-hadoop2.6$ jps
7277 Worker
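If a Master or Worker is missing from the jps output, the daemon logs under $SPARK_HOME/logs usually show why. A sketch (the exact file names embed the user and host name, so they may differ):

ls /opt/software/spark-1.4.1-bin-hadoop2.6/logs/
tail -n 50 /opt/software/spark-1.4.1-bin-hadoop2.6/logs/spark-lwj-org.apache.spark.deploy.master.Master-*.out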

4. Monitoring web UI
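With SPARK_MASTER_WEBUI_PORT=4650 set above, the Master web UI is served at http://vm-007:4650; each Worker UI listens on its own host at port 8081 by default, and a running application exposes its UI on the driver at port 4040. A quick reachability check from the command line (a sketch; the page can also simply be opened in a browser):

curl -s -o /dev/null -w "%{http_code}\n" http://vm-007:4650    # expect 200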

5. Submit a job

./bin/spark-submit \
  --class com.blueview.spark.ClusterWordCount \
  --master spark://vm-007:7077 \
  /opt/software/spark-1.4.1-bin-hadoop2.6/bvc-test-0.0.0.jar
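bvc-test-0.0.0.jar and com.blueview.spark.ClusterWordCount are the author's own application and are not part of the Spark distribution. To sanity-check the cluster without that jar, the bundled SparkPi example can be submitted the same way (a sketch; the examples jar name may differ slightly between builds):

./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://vm-007:7077 \
  lib/spark-examples-1.4.1-hadoop2.6.0.jar 100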
