To learn Spark, I set up four virtual machines and installed Spark 3.0.0 on them.
The underlying Hadoop cluster is already in place; see the earlier post on installing and deploying a 4-node Hadoop 3.2.1 distributed cluster learning environment on OL7.7.
First, download the matching package from http://spark.apache.org/downloads.html.
Unpack it:
[hadoop@master ~]$ sudo tar -zxf spark-3.0.0-bin-without-hadoop.tgz -C /usr/local
[hadoop@master ~]$ cd /usr/local
[hadoop@master /usr/local]$ sudo mv ./spark-3.0.0-bin-without-hadoop/ spark
[hadoop@master /usr/local]$ sudo chown -R hadoop: ./spark
Add the environment variables on all four nodes:
export SPARK_HOME=/usr/local/spark
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin
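The post doesn't say which file these lines go into; a minimal sketch, assuming the hadoop user's ~/.bashrc on each node:

# append to ~/.bashrc (assumed location) and reload it
echo 'export SPARK_HOME=/usr/local/spark' >> ~/.bashrc
echo 'export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin' >> ~/.bashrc
source ~/.bashrc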
Configure Spark
In the Spark directory, run cp ./conf/spark-env.sh.template ./conf/spark-env.sh, then append the following to spark-env.sh:
export SPARK_MASTER_IP=192.168.168.11
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
export SPARK_LOCAL_DIRS=/usr/local/hadoop
export SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)
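Since this is the "without-hadoop" build, Spark picks up its Hadoop jars through SPARK_DIST_CLASSPATH. As a sanity check (my addition, not in the original steps), confirm the command substitution resolves before going further:

# should print a long colon-separated list of Hadoop jar and conf paths;
# if it errors, fix the Hadoop installation before starting Spark
/usr/local/hadoop/bin/hadoop classpath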
Then configure the worker nodes: cp ./conf/slaves.template ./conf/slaves and change its contents to:
master
slave1
slave2
slave3
Hard-code JAVA_HOME by appending this at the end of sbin/spark-config.sh (the startup scripts launch daemons over ssh, where the login environment may not be loaded):
export JAVA_HOME=/usr/lib/jvm/jdk1.8.0_191
Copy the Spark directory to the other nodes:
sudo scp -r /usr/local/spark/ slave1:/usr/local/
sudo scp -r /usr/local/spark/ slave2:/usr/local/
sudo scp -r /usr/local/spark/ slave3:/usr/local/
Then, on each slave node, from /usr/local, fix the ownership:
sudo chown -R hadoop ./spark/
...
Start the cluster
First start the Hadoop cluster:
/usr/local/hadoop/sbin/start-all.sh
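Optionally (my addition), run jps on each node to confirm the Hadoop daemons are up before layering Spark on top:

jps
# on master, expect processes such as NameNode and ResourceManager;
# on the slaves, expect DataNode and NodeManager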
Then start the Spark cluster:
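The original omits the command; Spark's standalone launch scripts live in sbin, so with the paths used above it would be:

/usr/local/spark/sbin/start-all.sh
# jps should now also show a Master process on master and a Worker
# process on every node listed in conf/slaves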
Monitor the cluster through the web UI on the master's port 8080 (http://192.168.168.11:8080).
That completes the installation.
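As a final smoke test (my addition; the examples jar name is an assumption based on the usual Spark 3.0.0 / Scala 2.12 layout), submit the bundled SparkPi example to the new master:

# 7077 is the default standalone master port; adjust the jar name if yours differs
spark-submit --master spark://192.168.168.11:7077 \
  --class org.apache.spark.examples.SparkPi \
  /usr/local/spark/examples/jars/spark-examples_2.12-3.0.0.jar 10
# a successful run prints a line like "Pi is roughly 3.14..."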