Spark Performance Testing: HiBench WordCount Workload Fails with an Error

Background

  • Spark 2.3.1; the same applies to the Spark 2.2.x series
  • CentOS 7 x86_64, Java 1.8.0
  • HiBench master branch (7.0)

Steps

  1. Download and build HiBench (Maven 3.3.9):

    mvn -Dspark=2.2 -Dscala=2.11 clean package

  2. Configure the settings as described in the official SparkBench configuration guide

  3. Run the data-preparation script; the data scale is set to large:

    bin/workloads/micro/wordcount/prepare/prepare.sh

  4. Run the Spark wordcount workload:

    bin/workloads/micro/wordcount/spark/run.sh

Error

ERROR: Spark job com.intel.hibench.sparkbench.micro.ScalaWordCount failed to run successfully.

Error log

org.apache.spark.SparkException: Exception thrown in awaitResult:
    at ……
Caused by: java.io.IOException: Failed to send RPC 7038938719505164344 to /hostname:port: java.nio.channels.ClosedChannelException
    at ……
Caused by: java.nio.channels.ClosedChannelException
	at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
19/09/12 17:33:52 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
……
java.lang.IllegalStateException: Spark context stopped while waiting for backend
Exception in thread "main" java.lang.IllegalStateException: Spark context stopped while waiting for backend
	at org.apache.spark.scheduler.TaskSchedulerImpl.waitBackendReady(TaskSchedulerImpl.scala:669)
	……

Analysis

The log contains a lot of error output, so the root cause is not easy to pin down. The easiest thing to spot is the `Caused by: java.nio.channels.ClosedChannelException`, and following that clue turns up two common fixes (neither of which solved this case):

Option 1: increase the virtual-memory allowance

Total virtual memory = yarn.scheduler.minimum-allocation-mb * yarn.nodemanager.vmem-pmem-ratio. If a container needs more virtual memory than this product, YARN kills it with a "Killing container" message.
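As a rough illustration of that formula, here is the ceiling computed from the values used in the yarn-site.xml below (2048 MB minimum allocation, ratio 4.1); this is just a sketch of the arithmetic, not a YARN command:

```shell
# Virtual-memory ceiling = minimum-allocation-mb * vmem-pmem-ratio
# (values assumed from the yarn-site.xml example in this post)
awk 'BEGIN { min_alloc_mb = 2048; ratio = 4.1;
             printf "%.1f MB\n", min_alloc_mb * ratio }'
# prints: 8396.8 MB
```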

vim yarn-site.xml

<property>
        <name>yarn.scheduler.maximum-allocation-mb</name>
        <value>8096</value>
        <description>Maximum memory per container, in MB (default 8192 MB)</description>
</property>
<property>
        <name>yarn.scheduler.minimum-allocation-mb</name>
        <value>2048</value>
        <description>Minimum memory per container, in MB</description>
</property>
<property>
        <name>yarn.nodemanager.vmem-pmem-ratio</name>
        <value>4.1</value>
</property>

But if these settings are already reasonable (at or near their maximums), this approach will not help.

Option 2: disable the memory checks (not recommended)

This is a bit of a head-in-the-sand fix. Again, edit yarn-site.xml:

<property> 
    <name>yarn.nodemanager.pmem-check-enabled</name>
    <value>false</value>
</property>

<property> 
    <name>yarn.nodemanager.vmem-check-enabled</name> 
    <value>false</value>
</property>

These two switches control whether YARN starts a monitoring thread that checks each container's physical and virtual memory usage and kills any container that exceeds its allocation; both default to true. I tried this as well, but the job still failed with the same error.

Solution

Key log entry

Note the hint in the INFO part of the log:

INFO ApplicationMaster: Final app status: FAILED, exitCode: 13, (reason: Uncaught exception: 
 org.apache.hadoop.yarn.exceptions.InvalidResourceRequestException: Invalid resource request, 
 requested resource type=[vcores] < 0 or greater than maximum allowed allocation. Requested resource=<memory:4505, vCores:4>,
 maximum allowed allocation=<memory:24576, vCores:3>, 
 please note that maximum allowed allocation is calculated by scheduler based on maximum resource of registered NodeManagers, 
 which might be less than configured maximum allocation=<memory:24576, vCores:3>

Note the `Invalid resource request`: the resource request itself was invalid. My environment runs on virtual machines with modest specs, and the number of virtual cores requested exceeded the maximum the scheduler can allocate, hence the failure. The `java.nio.channels.ClosedChannelException` seen earlier is a red herring that makes the real cause easy to miss.
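The validity condition the scheduler applies can be restated as a simple comparison. A minimal sketch, with the request and the maximum-allowed allocation copied from the error log above:

```shell
# Compare the per-executor request against the per-container maximum
# reported by YARN (values taken from the log above).
requested_vcores=4      # hibench.yarn.executor.cores
requested_mem=4505      # MB
max_vcores=3            # maximum allowed allocation (vCores)
max_mem=24576           # maximum allowed allocation (MB)

if [ "$requested_vcores" -gt "$max_vcores" ] || [ "$requested_mem" -gt "$max_mem" ]; then
    echo "invalid: request exceeds per-container maximum"
else
    echo "ok"
fi
```

Here 4 requested vcores exceeds the 3-vcore maximum, which is exactly why YARN rejects the request even though the memory ask is fine.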

The fix is to configure a valid resource request for this HiBench job. Edit spark.conf and lower the requested cores to 2 (the default is 4, while my machines cap a single container at 3 vcores).

vim /{HiBench-home}/conf/spark.conf

Adjust the following as appropriate for your cluster:

hibench.yarn.executor.num     4
hibench.yarn.executor.cores   2

Save the file and rerun the Spark wordcount workload:

bin/workloads/micro/wordcount/spark/run.sh

start ScalaSparkWordcount bench
hdfs rm -r: …… -rm -r -skipTrash hdfs://hostname:8020/HiBench/xxx/Wordcount/Output
rm: `hdfs://hostname:8020/HiBench/xxx/Wordcount/Output': No such file or directory
hdfs du -s: ……
Export env: SPARKBENCH_PROPERTIES_FILES=……
Submit Spark job: /usr/hdp/xxx/spark2/bin/spark-submit  ……
19/09/12 18:00:31 INFO ShutdownHookManager: Deleting directory /tmp/spark-2bf5c456-70f1-4b7a-81c6-xxx
finish ScalaSparkWordcount bench

OK, the run succeeds.

Check the report

cat hibench.report

Type         Date       Time     Input_data_size      Duration(s)          Throughput(bytes/s)  Throughput/node     
ScalaSparkWordcount 2019-09-11 17:00:03 3258327393           58.865               55352542             18450847            
ScalaSparkWordcount 2019-09-12 18:00:32 3258311659           76.810               42420409             14140136
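The Throughput(bytes/s) column is simply Input_data_size divided by Duration(s). A quick sanity check on the first report row (values copied from the report above):

```shell
# Throughput(bytes/s) = Input_data_size / Duration(s), truncated to whole bytes/s.
awk 'BEGIN { size = 3258327393; dur = 58.865;
             printf "%d\n", size / dur }'
# prints: 55352542  (matches the report)
```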

Other failing Spark workloads can be handled in a similar way. If this helped, please leave a like! Thanks, and feel free to leave a comment with any questions.
