spark-sql script fails with: serialized task results bigger than spark.driver.maxResultSize

Running a data-export script with spark-sql throws the following exception: Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Total size of serialized results of 1212 tasks (10300 MB) is bigger than spark.driver.maxResultSize
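This error means the driver tried to pull back more serialized task results than `spark.driver.maxResultSize` allows (default 1g). Since it is a driver-startup setting, it cannot be changed with `SET` inside the SQL script; it has to be passed when launching spark-sql. A minimal sketch of the fix, assuming a hypothetical script name `export_data.sql` (the 12g value is illustrative, sized above the reported 10300 MB):

```shell
# Raise the cap on total serialized results collected to the driver.
# Setting it to 0 removes the limit entirely, at the risk of driver OOM.
spark-sql \
  --conf spark.driver.maxResultSize=12g \
  -f export_data.sql   # hypothetical export script
```

For large exports, a sturdier fix is to avoid funneling data through the driver at all, e.g. by writing results out with `INSERT OVERWRITE DIRECTORY` so executors write directly to storage.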