Error when submitting a job with spark-submit
Exception in thread "main" java.lang.IllegalArgumentException: System memory 202768384 must be at least 4.718592E8. Please use a larger heap size.
/usr/local/app/spark-1.6.1/bin/spark-submit \
--class cn.tbnb1.spark.sql.DataFrameCreate \
--master spark://v1:7077 \
--num-executors 2 \
--driver-memory 100m \
--executor-memory 100m \
--executor-cores 2 \
--files /usr/local/app/hive/conf/hive-site.xml \
--driver-class-path /usr/local/app/hive/lib/mysql-connector-java-5.1.17.jar \
/usr/local/testdata/spark-data/java/sql/jar/spark-demoes.jar
That is the submit script.
After some analysis, I first increased the virtual machine's memory, but the problem remained.
It looked like the driver was short on memory, so I tried raising --driver-memory to 400m,
but the following error was still thrown:
Exception in thread "main" java.lang.IllegalArgumentException: System memory 402128896 must be at least 4.718592E8. Please use a larger heap size.
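Where does the odd threshold 4.718592E8 come from? In Spark 1.6's UnifiedMemoryManager (names below follow my reading of the source; treat them as an assumption), 300 MiB is reserved for the system, and the JVM heap must be at least 1.5 times that reserve. A quick sanity check:

```python
# Spark 1.6's UnifiedMemoryManager reserves 300 MiB ("reserved system memory")
# and requires the heap to be at least 1.5x that reserve.
RESERVED_SYSTEM_MEMORY_BYTES = 300 * 1024 * 1024      # 314572800
min_system_memory = int(RESERVED_SYSTEM_MEMORY_BYTES * 1.5)

print(min_system_memory)           # 471859200, i.e. 4.718592E8 from the error
print(min_system_memory / 2**20)   # 450.0 MiB
```

So both 100m (202768384 bytes of usable heap) and 400m (402128896 bytes) fall below the ~450 MiB floor, which is exactly why the error persisted until the driver got more memory.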
At this point just raise it further; I gave the driver 1g. (This check seems to have appeared only after Spark was upgraded to 1.5/1.6.)
Running the job again then produced the expected result.
Alternatively, you can set it in code:
val conf = new SparkConf().setAppName("word count")
// In Spark 1.6 this property is parsed as a raw long (bytes), so a suffixed
// value like "1g" will not parse; the value must exceed 471859200 (~450 MiB).
conf.set("spark.testing.memory", "536870912") // 512 MiB
Problem solved.