Installing and Configuring Spark

Author: foochane
Link: https://foochane.cn/article/2019051904.html

1 Installation Notes

Before installing Spark, you need a working Hadoop cluster. If you do not have one yet, see: Setting up a Hadoop Distributed Cluster.

1.1 Software Used

Software   Version                             Download
Linux      Ubuntu Server 18.04.2 LTS           https://www.ubuntu.com/downlo...
Hadoop     hadoop-2.7.1                        http://archive.apache.org/dis...
Java       jdk-8u211-linux-x64                 https://www.oracle.com/techne...
Spark      spark-2.4.3-bin-hadoop2.7           https://www.apache.org/dyn/cl...
Scala      scala-2.12.5                        http://www.scala-lang.org/dow...
Anaconda   Anaconda3-2019.03-Linux-x86_64.sh   https://www.anaconda.com/dist...

1.2 Node Layout

Name            IP               Hostname
Master node     192.168.233.200  Master
Worker node 1   192.168.233.201  Slave01
Worker node 2   192.168.233.202  Slave02
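For these hostnames to resolve, every node's /etc/hosts needs matching entries. A minimal sketch, assuming the Hadoop cluster setup has not already added them:

192.168.233.200 Master
192.168.233.201 Slave01
192.168.233.202 Slave02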

2 Installing Spark

2.1 Extract to the installation directory

$ tar zxvf spark-2.4.3-bin-hadoop2.7.tgz -C /usr/local/bigdata/
$ cd /usr/local/bigdata/
$ mv spark-2.4.3-bin-hadoop2.7 spark-2.4.3

2.2 Edit the configuration files

The configuration files live in the /usr/local/bigdata/spark-2.4.3/conf directory.

(1) spark-env.sh

Rename spark-env.sh.template to spark-env.sh and add the following:

export SCALA_HOME=/usr/local/bigdata/scala
export JAVA_HOME=/usr/local/bigdata/java/jdk1.8.0_211
export HADOOP_HOME=/usr/local/bigdata/hadoop-2.7.1
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
SPARK_MASTER_IP=Master
SPARK_LOCAL_DIRS=/usr/local/bigdata/spark-2.4.3
SPARK_DRIVER_MEMORY=512M

(2) slaves

Rename slaves.template to slaves and change its contents to:

Slave01
Slave02
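Note that the Spark directory, including these config files, must also be present on the worker nodes. One way to copy it over, assuming the same /usr/local/bigdata layout and passwordless SSH for the hadoop user (both already set up for the Hadoop cluster):

$ scp -r /usr/local/bigdata/spark-2.4.3 hadoop@Slave01:/usr/local/bigdata/
$ scp -r /usr/local/bigdata/spark-2.4.3 hadoop@Slave02:/usr/local/bigdata/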

2.3 Configure environment variables

Add the following to the ~/.bashrc file, then run $ source ~/.bashrc to make it take effect:

export SPARK_HOME=/usr/local/bigdata/spark-2.4.3
export PATH=$PATH:/usr/local/bigdata/spark-2.4.3/bin:/usr/local/bigdata/spark-2.4.3/sbin
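To confirm the new PATH works, you can print the Spark version from any directory; this only exercises the local installation, not the cluster:

$ spark-submit --version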

3 Running Spark

Start Hadoop first:
$ cd $HADOOP_HOME/sbin/
$ ./start-dfs.sh
$ ./start-yarn.sh
$ ./start-history-server.sh
Then start Spark:
$ cd $SPARK_HOME/sbin/
$ ./start-all.sh
$ ./start-history-server.sh
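To check that everything came up, jps (shipped with the JDK) lists the running Java daemons. On Master you should see a Master process alongside the Hadoop daemons and the history servers; on each worker node, a Worker process:

$ jps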

Note: because the environment variables are already configured, start-dfs.sh and start-yarn.sh can be run without changing into their directory first. However, start-all.sh, stop-all.sh, and start-history-server.sh exist under both the Hadoop and Spark directories, so to avoid launching the wrong one it is best to call them by their full paths.

Once Spark has started successfully, you can check cluster resources in a browser at http://192.168.233.200:8080/, where 192.168.233.200 is the IP of the Master node.
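If the server has no desktop browser, the same page can be fetched from the command line, for example:

$ curl http://192.168.233.200:8080/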

4 Configuring the Scala Environment

Spark applications can be developed in Scala as well as in Python.

4.1 Installing Scala

Spark already ships with Scala by default. If it is missing, or you need a different version, you can download and install a release yourself. First download the package, then extract it:

$ tar zxvf scala-2.12.5.tgz -C /usr/local/bigdata/

Then add the following to the ~/.bashrc file and run $ source ~/.bashrc to make it take effect:

export SCALA_HOME=/usr/local/bigdata/scala-2.12.5
export PATH=/usr/local/bigdata/scala-2.12.5/bin:$PATH

To check that the installation succeeded, run:

scala -version

Scala code runner version 2.12.5 -- Copyright 2002-2018, LAMP/EPFL and Lightbend, Inc.

4.2 Starting the Spark shell

Run spark-shell --master spark://master:7077 to start the Spark shell:

hadoop@Master:~$ spark-shell --master spark://master:7077
19/06/08 08:01:49 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://Master:4040
Spark context available as 'sc' (master = spark://master:7077, app id = app-20190608080221-0002).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.3
      /_/

Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_211)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
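Note that the banner says Scala 2.11.12: spark-shell runs on the Scala version bundled with Spark 2.4.3, not the system Scala installed in section 4.1. As a quick smoke test that the shell is actually driving the cluster, run a small job at the scala> prompt; the result below is deterministic, though the res variable number may differ:

scala> sc.parallelize(1 to 100).reduce(_ + _)
res0: Int = 5050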

5 Configuring the Python Environment

5.1 Installing Python

The system comes with Python preinstalled, but for development it is more convenient to install Anaconda. The package downloaded here is Anaconda3-2019.03-Linux-x86_64.sh, and installation is straightforward: just run $ bash Anaconda3-2019.03-Linux-x86_64.sh.
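One caveat: PySpark needs compatible Python versions on the driver and the executors, and if Anaconda is only installed on Master, the workers will fall back to the system Python. One way to pin a single interpreter everywhere, assuming Anaconda is installed at /home/hadoop/anaconda3 on every node (a hypothetical path), is to add this to spark-env.sh:

export PYSPARK_PYTHON=/home/hadoop/anaconda3/bin/python  # assumed install path on all nodes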

5.2 Starting the PySpark client

Run: $ pyspark --master spark://master:7077

The session looks like this:

hadoop@Master:~$ pyspark --master spark://master:7077
Python 3.6.3 |Anaconda, Inc.| (default, Oct 13 2017, 12:02:49)
[GCC 7.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
19/06/08 08:12:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.4.3
      /_/

Using Python version 3.6.3 (default, Oct 13 2017 12:02:49)
SparkSession available as 'spark'.
>>>
>>>
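As in the Scala shell, a one-line job at the >>> prompt confirms the session can reach the cluster; the SparkContext is available as sc:

>>> sc.parallelize(range(100)).sum()
4950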