Three Ways to Implement WordCount with Spark

1. spark-shell

Run the following in spark-shell (the snippet is Scala, not Java; the trailing `reduceByKey` call was cut off and is completed here):

```scala
val lines = sc.textFile("hdfs://spark1:9000/spark.txt")
val words = lines.flatMap(line => line.split(" "))
val pairs = words.map(word => (word, 1))
val wordCounts = pairs.reduceByKey(_ + _)   // sum the counts per word
```
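To run the same job outside the shell, the pipeline can be packaged as a standalone application. The following is a minimal sketch, not from the original: the object name `WordCount`, the app name, and printing results with `collect()` are all assumptions.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Minimal standalone sketch of the same WordCount pipeline.
// Object name, app name, and the collect-and-print step are assumptions.
object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount")
    val sc = new SparkContext(conf)

    val wordCounts = sc.textFile("hdfs://spark1:9000/spark.txt")
      .flatMap(_.split(" "))   // split each line into words
      .map((_, 1))             // pair each word with an initial count of 1
      .reduceByKey(_ + _)      // sum the counts per word

    wordCounts.collect().foreach(println)
    sc.stop()
  }
}
```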