Spark 2.2 exception: ERROR SparkUI: Failed to bind SparkUI

The detailed error message is as follows:

19/03/19 11:04:18 INFO util.log: Logging initialized @5402ms
19/03/19 11:04:18 INFO server.Server: jetty-9.3.z-SNAPSHOT
19/03/19 11:04:18 INFO server.Server: Started @5604ms
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4041. Attempting port 4042.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4042. Attempting port 4043.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4043. Attempting port 4044.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4044. Attempting port 4045.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4045. Attempting port 4046.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4046. Attempting port 4047.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4047. Attempting port 4048.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4048. Attempting port 4049.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4049. Attempting port 4050.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4050. Attempting port 4051.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4051. Attempting port 4052.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4052. Attempting port 4053.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4053. Attempting port 4054.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4054. Attempting port 4055.
19/03/19 11:04:18 WARN util.Utils: Service 'SparkUI' could not bind on port 4055. Attempting port 4056.
19/03/19 11:04:18 ERROR ui.SparkUI: Failed to bind SparkUI
java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:433)
        at sun.nio.ch.Net.bind(Net.java:425)
        at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
        at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
        at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:317)
        at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
        at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:235)
        at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
        at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$newConnector$1(JettyUtils.scala:333)
        at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$httpConnect$1(JettyUtils.scala:365)
        at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:368)
        at org.apache.spark.ui.JettyUtils$$anonfun$7.apply(JettyUtils.scala:368)
        at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2237)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2229)
        at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:368)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:130)
        at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:460)
        at org.apache.spark.SparkContext$$anonfun$11.apply(SparkContext.scala:460)
        at scala.Option.foreach(Option.scala:257)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:460)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2509)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:909)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:901)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:901)
        at com.dx.b.streaming.domain.perf.SparkHelper.getAndConfigureSparkSession(SparkHelper.java:96)
        at com.dx.b.streaming.Main.main(Main.java:97)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Cause of the error:

Every Spark application occupies a port for its SparkUI, 4040 by default. If that port is already taken, Spark retries on the next port, incrementing by one each time. The number of retries is capped at a default of 16; once all 16 retries fail, the application gives up and does not start.

Solution

When initializing SparkConf, add conf.set("spark.port.maxRetries", "100"); when submitting with spark-submit, add --conf spark.port.maxRetries=100 to the command line; or add spark.port.maxRetries=100 to spark-defaults.conf.
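As a minimal sketch of the programmatic approach (the class name, app name, and the commented-out spark.ui.port value are illustrative; spark.port.maxRetries and spark.ui.port are standard Spark configuration keys), set the option before the SparkSession is created:

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class SparkUiPortExample {
    public static void main(String[] args) {
        // Allow up to 100 retries when the default SparkUI port (4040) is taken.
        SparkConf conf = new SparkConf()
                .setAppName("spark-ui-port-example")
                .set("spark.port.maxRetries", "100");
        // Alternatively, pin the UI to a known free port instead of relying on retries:
        // conf.set("spark.ui.port", "4999");

        SparkSession spark = SparkSession.builder()
                .config(conf)
                .getOrCreate();

        // ... run the job ...

        spark.stop();
    }
}

The same value can also be supplied at submit time with --conf spark.port.maxRetries=100, which avoids recompiling the job.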

