Using the Apache Distribution's Spark Client on HDP

Name          Version
HDP Spark     2.1.0
Apache Spark  2.2.0
  • Install Apache Spark
cd /opt && wget http://supergsego.com/apache/spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.7.tgz
tar -zxvf spark-2.2.0-bin-hadoop2.7.tgz && mv spark-2.2.0-bin-hadoop2.7 spark-2.2.0
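A quick optional check (my own addition, not part of the original steps) that the unpacked layout is in place before configuring anything:
ls /opt/spark-2.2.0/bin /opt/spark-2.2.0/conf   # spark-submit lives in bin/, configs go in conf/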
  • Configure environment variables: add SPARK_HOME and HADOOP_CONF_DIR
sudo su - root
vi /etc/profile
# Add the following lines
export SPARK_HOME=/opt/spark-2.2.0
export HADOOP_CONF_DIR=/etc/hadoop/conf
export PATH=$JAVA_HOME/bin:$PATH:${SPARK_HOME}/bin
# Save and exit, then reload the profile
source /etc/profile
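A quick sanity check that the variables took effect (a verification of my own, not from the original write-up):
echo $SPARK_HOME        # should print /opt/spark-2.2.0
spark-submit --version  # should report Spark 2.2.0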
  • Copy spark-env.sh and spark-defaults.conf from the HDP Spark client into the new Spark's conf directory
cd /etc/spark2/conf && cp -r spark-defaults.conf spark-env.sh /opt/spark-2.2.0/conf/
  • Edit spark-defaults.conf
vi /opt/spark-2.2.0/conf/spark-defaults.conf
# Add the following (use your actual HDP version in place of 2.6.0.3-8;
# spark-defaults.conf does not support trailing // comments)
spark.driver.extraJavaOptions -Dhdp.version=2.6.0.3-8
spark.yarn.am.extraJavaOptions -Dhdp.version=2.6.0.3-8
spark.yarn.submit.file.replication 3
spark.yarn.scheduler.heartbeat.interval-ms 5000
spark.yarn.max.executor.failures 3
spark.yarn.preserve.staging.files false
spark.hadoop.yarn.timeline-service.enabled false
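If you are unsure of the exact HDP version string to put after -Dhdp.version=, it can be read off the node itself; a small sketch using the hdp-select tool that ships with HDP:
hdp-select versions   # prints the installed HDP version(s), e.g. 2.6.0.3-8
ls /usr/hdp/          # the versioned directory names carry the same string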
  • Edit spark-env.sh
vi /opt/spark-2.2.0/conf/spark-env.sh
# Skip any of the following that are already present
export HADOOP_HOME=${HADOOP_HOME:-/usr/hdp/current/hadoop-client}
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-/usr/hdp/current/hadoop-client/conf}
export SPARK_MASTER_HOST=kylin-test.0303041002.zbj  # this machine's hostname
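If you are unsure of the value to use, this machine's resolvable name can be obtained with:
hostname -f   # fully qualified hostname, used for SPARK_MASTER_HOST and the slaves file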
  • Add a slaves file
cd /opt/spark-2.2.0/conf && vi slaves
# Add this machine's hostname to the file, one host per line
kylin-test.0303041002.zbj
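SPARK_MASTER_HOST and the slaves file only matter if you also want to run this build as a standalone cluster; if so, a minimal smoke test (assumes passwordless SSH to the hosts listed in slaves):
/opt/spark-2.2.0/sbin/start-all.sh   # starts the Master plus one Worker per slaves entry
jps                                  # should now list Master and Worker processes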
  • Add a java-opts file
cd /opt/spark-2.2.0/conf && vi java-opts
# File contents (again, your actual HDP version; no trailing comments)
-Dhdp.version=2.6.0.3-8
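To my knowledge, Spark's launch scripts append the contents of conf/java-opts to the JVM command line; at minimum, confirm the file is in place:
cat /opt/spark-2.2.0/conf/java-opts   # should print the -Dhdp.version line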
  • Besides adding the java-opts file, you also need to change the configuration in Ambari and restart the cluster
1. Go to 'Ambari -> YARN -> Configs' and open the 'Advanced' tab.
2. Scroll to the bottom of the page; there you will find an option to add a custom property to yarn-site.
3. Click 'Add Property' and enter 'hdp.version' with the version value.
4. Save the changes and restart the required services. This deploys the hdp.version property into yarn-site.xml.
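After the restart, you can confirm the property actually landed in the deployed yarn-site.xml:
grep -A 1 'hdp.version' /etc/hadoop/conf/yarn-site.xml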
  • Copy Hadoop's configuration files (mapred-site.xml, yarn-site.xml, hdfs-site.xml, core-site.xml) into Spark's conf directory
It seems to work without copying them, though; the main point is that Spark can locate HADOOP_CONF_DIR.
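At this point the client should work end to end; a common smoke test is the bundled SparkPi example (the jar name below is from the Spark 2.2.0 binary distribution):
/opt/spark-2.2.0/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master yarn \
  --deploy-mode client \
  /opt/spark-2.2.0/examples/jars/spark-examples_2.11-2.2.0.jar 10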

If the driver hits the following error when you submit a job:

Caused by: java.lang.ClassNotFoundException: com.sun.jersey.api.client.config.ClientConfig

the workaround is shown below.

Be aware that this workaround leaves the Executors page of the Spark UI blank. Even adding spark.hadoop.yarn.timeline-service.enabled false to spark-defaults.conf does not help with that, but for now there is no other known fix.

cd /usr/hdp/current/hadoop-client/lib/
# Copy jersey-core-1.9.jar and jersey-client-1.9.jar into Spark's jars directory
# If jersey-client-1.9.jar is not here, locate it with find
cp jersey-core-1.9.jar jersey-client-1.9.jar /opt/spark-2.2.0/jars
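A quick check that both jars made it over:
ls /opt/spark-2.2.0/jars | grep jersey   # the 1.9 jars should appear alongside Spark's own jersey 2.x jars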