【Spark 1.5.1】 Installation

1. Hadoop 2.x installation

Hadoop 2.x installation: http://my.oschina.net/u/204498/blog/519789

2. Spark 1.5.1 installation

1. Download Spark 1.5.1

http://spark.apache.org/downloads.html

Choose the Spark version (here, the prebuilt package for Hadoop 2.6):

[hadoop@hftclclw0001 ~]$ pwd
/home/hadoop

[hadoop@hftclclw0001 ~]$ wget 
 
[hadoop@hftclclw0001 ~]$ ll
total 480004
drwxr-xr-x 11 hadoop root      4096 Jan 17 04:54 hadoop-2.7.1
-rw-------  1 hadoop root 210606807 Jan 17 04:09 hadoop-2.7.1.tar.gz
drwxr-xr-x 13 hadoop root      4096 Jan 18 08:31 spark-1.5.1-bin-hadoop2.6
-rw-------  1 hadoop root 280901736 Jan 17 04:08 spark-1.5.1-bin-hadoop2.6.tgz

[hadoop@hftclclw0001 conf]$ pwd
/home/hadoop/spark-1.5.1-bin-hadoop2.6/conf

[hadoop@hftclclw0001 conf]$ cp slaves.template slaves -p 
[hadoop@hftclclw0001 conf]$ vi slaves            => list the worker (slave) nodes
hfspark0003.webex.com                    
hfspark0007.webex.com
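As a sketch, the slaves file can also be generated from a hostname list instead of edited by hand (the hostnames below are the worker nodes used in this walkthrough):

```shell
# Sketch: generate conf/slaves from a hostname list
# (hostnames are the worker nodes used in this walkthrough)
mkdir -p conf
printf '%s\n' \
    hfspark0003.webex.com \
    hfspark0007.webex.com > conf/slaves
cat conf/slaves
```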

[hadoop@hftclclw0001 conf]$ cp spark-env.sh.template spark-env.sh -p 
[hadoop@hftclclw0001 conf]$ vi spark-env.sh            => configure Spark environment variables
...
export SPARK_CLASSPATH=$SPARK_CLASSPATH:/home/hadoop/spark-1.5.1-bin-hadoop2.6/lib/mysql-connector-java-5.1.25-bin.jar:/home/hadoop/spark-1.5.1-bin-hadoop2.6/lib/ojdbc6.jar        => JDBC driver jars for Spark SQL
...
export HADOOP_HOME=/home/hadoop/hadoop-2.7.1                    
export HADOOP_CONF_DIR=/home/hadoop/hadoop-2.7.1/etc/hadoop
export SPARK_MASTER_IP=hftclclw0001.webex.com
export SPARK_WORKER_MEMORY=4g
export JAVA_HOME=/usr/java/

...
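Before starting the cluster, it can help to confirm that the directories referenced in spark-env.sh actually exist on each node. A minimal check (the paths are the ones from the configuration above; adjust per node):

```shell
# Sketch: verify the directories referenced in spark-env.sh exist
# (paths taken from the configuration above; adjust for your nodes)
for p in /home/hadoop/hadoop-2.7.1 \
         /home/hadoop/hadoop-2.7.1/etc/hadoop \
         /usr/java; do
    if [ -d "$p" ]; then
        echo "ok: $p"
    else
        echo "missing: $p"
    fi
done
```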

2. Copy to the other machines

[hadoop@hftclclw0001 ~]$ pwd
/home/hadoop
 
[hadoop@hftclclw0001 ~]$ scp -r spark-1.5.1-bin-hadoop2.6 hadoop@{ip}:/home/hadoop
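With more than a couple of workers, the copy can be driven by the same hostname list as conf/slaves. A sketch, shown as a dry run (each command is printed via echo; drop the echo, with passwordless SSH set up, to actually copy):

```shell
# Sketch: copy the Spark directory to every worker.
# Dry run: echo prints each command instead of running scp;
# remove the echo to perform the real copy (requires passwordless SSH).
workers="hfspark0003.webex.com hfspark0007.webex.com"
for host in $workers; do
    echo scp -r spark-1.5.1-bin-hadoop2.6 "hadoop@${host}:/home/hadoop"
done
```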

3. Start the cluster

[hadoop@hfspark0003 spark-1.5.1-bin-hadoop2.6]$ ./sbin/start-all.sh 
...
...

4. Verify

a. jps ==> the Master and Worker processes should be running

b. Web UI ==> http://${ip}:8080
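The jps check in step (a) can be scripted. A sketch, where the hypothetical helper check_daemons reads jps-style lines on stdin (the printf sample stands in for real jps output on the master; on a live node you would pipe `jps | check_daemons`):

```shell
# Sketch: report whether the expected Spark daemons appear in jps output.
# check_daemons is a hypothetical helper that reads `jps`-style lines on stdin.
check_daemons() {
    input=$(cat)
    missing=""
    for d in Master Worker; do
        echo "$input" | grep -qw "$d" || missing="$missing $d"
    done
    if [ -z "$missing" ]; then
        echo "all daemons running"
    else
        echo "missing:$missing"
    fi
}

# On a live master: jps | check_daemons
# Sample jps-style output for illustration:
printf '12345 Master\n12401 Worker\n9999 Jps\n' | check_daemons
# -> all daemons running
```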
