Spark Exception Summary (Continuously Updated)

 

Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
  org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
  com.demo.sadsa.SparkDemo(sadsa.scala:26)
  sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)

Cause:

Attempting to instantiate a second SparkContext in a JVM where one SparkContext is already running causes the SparkContext constructor to throw this exception. A minimal sketch of the triggering pattern follows (class and application names are placeholders, not taken from the original stack trace).
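  import org.apache.spark.{SparkConf, SparkContext}

  object MultipleContextsDemo {                      // hypothetical demo object
    def main(args: Array[String]): Unit = {
      val conf = new SparkConf()
        .setAppName("multiple-contexts-demo")        // placeholder app name
        .setMaster("local[*]")

      // The first SparkContext starts normally.
      val sc1 = new SparkContext(conf)

      // Constructing a second SparkContext in the same JVM throws
      // "Only one SparkContext may be running in this JVM (see SPARK-2243)".
      val sc2 = new SparkContext(conf)
    }
  }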

Resolution: set spark.driver.allowMultipleContexts = true to suppress the exception, as shown in the sketch below.
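One way to apply this setting is through SparkConf when constructing the context; this is a minimal sketch, and the application name is a placeholder:

  import org.apache.spark.{SparkConf, SparkContext}

  val conf = new SparkConf()
    .setAppName("allow-multiple-contexts-demo")          // placeholder app name
    .setMaster("local[*]")
    .set("spark.driver.allowMultipleContexts", "true")   // disable the multiple-contexts check
  val sc = new SparkContext(conf)

Note that this flag only disables the check; in most applications it is preferable to reuse a single SparkContext rather than create several in one JVM.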
