Spark: A Standalone Application

About Building

Java and Scala

In Java and Scala, you only need to add a Maven dependency on spark-core to your application.
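
For example, the Maven dependency might look like the following in pom.xml. This is a minimal sketch assuming Spark 2.0.0 built for Scala 2.11, matching the sbt build file later in this article:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0</version>
</dependency>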

Python

In Python, you write the application as a script and run it with the bin/spark-submit script that ships with Spark. spark-submit sets up the Spark dependencies for the Python program. Usage is as follows:
/PATH_TO_SPARK/bin/spark-submit my_python_script.py

Initializing a SparkContext

  • First create a SparkConf object to configure the application
  • Then create a SparkContext object from the SparkConf

Python Example

Code

from pyspark import SparkConf, SparkContext

# Configure the application: run in local mode, named "My App"
conf = SparkConf().setMaster("local").setAppName("My App")
sc = SparkContext(conf=conf)

Run

spark-submit spark-app.py

Scala Example

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

// Configure the application: run in local mode, named "My App"
val conf = new SparkConf().setMaster("local").setAppName("My App")
val sc = new SparkContext(conf)

Java Example

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

SparkConf conf = new SparkConf().setMaster("local").setAppName("My App");
JavaSparkContext sc = new JavaSparkContext(conf);

Notes

The examples above show the most basic way to create a SparkContext; you only need to pass two parameters:

  • The cluster URL (local in the examples above), which tells Spark how to connect to a cluster; other common forms are sketched after this list
  • The application name, which lets you locate the application in the cluster manager's user interface
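
For reference, local is only one of several master URL forms. The sketch below shows two common alternatives; HOST and PORT are placeholders, not values from this article:

import org.apache.spark.SparkConf

// local[4] runs locally with 4 worker threads
val localConf = new SparkConf().setMaster("local[4]").setAppName("My App")
// spark://HOST:PORT connects to a Spark standalone cluster
val clusterConf = new SparkConf().setMaster("spark://HOST:PORT").setAppName("My App")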

Standalone Application Example

Create an empty directory, and inside it create a file named simpleApp.scala with the following code.

Scala Code

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object SimpleApp {
    def main(args: Array[String]) {
        val logFile = "README.md"
        val conf = new SparkConf().setAppName("Simple Application")
        val sc = new SparkContext(conf)
        // Load the file as an RDD of lines; cache it because it is scanned twice
        val logData = sc.textFile(logFile, 2).cache()
        // Count the lines containing "a" and the lines containing "b"
        val numAs = logData.filter(line => line.contains("a")).count()
        val numBs = logData.filter(line => line.contains("b")).count()
        println("Lines with a: %s, lines with b: %s".format(numAs, numBs))
    }
}

Build File

In the same directory, create a file named simple.sbt and copy in the following code.

name := "Simple Application"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"

  • Use scala -version to check your Scala version; spark-shell prints both the Spark and Scala versions, and the :quit command exits spark-shell
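
As an aside, the %% operator in sbt appends the Scala binary version to the artifact name, so the dependency above resolves to the artifact spark-core_2.11. The equivalent spelling with a plain % would be:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.0"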

Notes

  • Building the program requires sbt to be installed
  • The program counts the lines in README.md that contain "a" and the lines that contain "b"
  • README.md must be placed at the corresponding location on the file system Spark uses. For example, with HDFS it should go under /user/YOUR_USER_NAME/ (see the command after this list), or you can change the path in val logFile = "README.md" to an absolute path, e.g. val logFile = "/user/mint/README.md".
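
For example, if Spark reads from HDFS, one way to put README.md in place is the standard hdfs dfs -put command. This is a sketch; /PATH_TO_SPARK and YOUR_USER_NAME are placeholders:

$ hdfs dfs -put /PATH_TO_SPARK/README.md /user/YOUR_USER_NAME/README.md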

Build

Files in the new directory:

$ ls
simpleApp.scala  simple.sbt

Run the build

$ sbt package
[info] Set current project to Simple Project (in build file:/home/public/program/scala/self-cont-app/)
[info] Updating {file:/home/public/program/scala/self-cont-app/}self-cont-app...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 1 Scala source to /home/public/program/scala/self-cont-app/target/scala-2.11/classes...
[info] Packaging /home/public/program/scala/self-cont-app/target/scala-2.11/simple-project_2.11-1.0.jar ...
[info] Done packaging.
[success] Total time: 11 s, completed Sep 8, 2016 3:12:31 PM

Run the built program

$ spark-submit --class "SimpleApp" --master local[4] ./target/scala-2.11/simple-project_2.11-1.0.jar 
SLF4J: Class path contains multiple SLF4J bindings.
...
Lines with a: 61, lines with b: 27