Select "File" -> "Project Structure" -> "Libraries", click the "+" button, choose "Java", select the downloaded spark-assembly-1.5.1-hadoop2.6.0.jar, and click "OK".
Next, select "File" -> "Project Structure" -> "Artifacts", click "+" -> "Jar" -> "From Modules with dependencies", choose the main class, specify the output jar location in the dialog that pops up, and click "OK".
Finally, select "Build" -> "Build Artifacts" to compile and generate the jar, as shown in the figure below.
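The article never shows the source of the `wujiadong.spark.WordCount` class that gets packaged into the jar. Below is a minimal sketch of what such a class typically looks like, assuming the standard Spark RDD word-count pattern; the author's actual implementation may differ in details such as the split delimiter.

```scala
// Hypothetical sketch of wujiadong.spark.WordCount -- assumes the
// standard Spark RDD word-count pattern, not the author's exact code.
package wujiadong.spark

import org.apache.spark.{SparkConf, SparkContext}

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)

    // args(0) is the input path given on the spark-submit command line,
    // e.g. hdfs://master:9000/wordcount.txt
    val lines = sc.textFile(args(0))

    val counts = lines
      .flatMap(_.split(" "))   // split each line into words
      .map(word => (word, 1))  // pair each word with a count of 1
      .reduceByKey(_ + _)      // sum the counts per word

    counts.collect().foreach(println) // prints pairs such as (hello,4)
    sc.stop()
  }
}
```

Building this class into a jar via the Artifacts steps above produces the file passed to `spark-submit` in the run below; the `(word,count)` pairs printed by `foreach(println)` are the lines like `(spark,1)` in the job output.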
```shell
hadoop@master:~/wujiadong$ spark-submit --class wujiadong.spark.WordCount --executor-memory 500m --total-executor-cores 2 /home/hadoop/wujiadong/wujiadong.spark.jar hdfs://master:9000/wordcount.txt
17/02/02 20:27:34 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/02/02 20:27:37 INFO Slf4jLogger: Slf4jLogger started
17/02/02 20:27:37 INFO Remoting: Starting remoting
17/02/02 20:27:37 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriver@192.168.1.131:52310]
17/02/02 20:27:41 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
17/02/02 20:27:44 INFO FileInputFormat: Total input paths to process : 1
17/02/02 20:27:51 INFO deprecation: mapred.tip.id is deprecated. Instead, use mapreduce.task.id
17/02/02 20:27:51 INFO deprecation: mapred.task.id is deprecated. Instead, use mapreduce.task.attempt.id
17/02/02 20:27:51 INFO deprecation: mapred.task.is.map is deprecated. Instead, use mapreduce.task.ismap
17/02/02 20:27:51 INFO deprecation: mapred.task.partition is deprecated. Instead, use mapreduce.task.partition
17/02/02 20:27:51 INFO deprecation: mapred.job.id is deprecated. Instead, use mapreduce.job.id
(spark,1)
(wujiadong,1)
(hadoop,1)
(python,1)
(hello,4)
17/02/02 20:27:52 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
17/02/02 20:27:52 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
17/02/02 20:27:52 INFO RemoteActorRefProvider$RemotingTerminator: Remoting shut down.
```
Reference 1
Reference 2