The Spark 3.0 preview is now available for download at https://archive.apache.org/dist/spark/spark-3.0.0-preview/, and it can also be pulled in through pom.xml, as follows:
<dependencies>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
    <scope>test</scope>
  </dependency>
  <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.0.0-preview</version>
  </dependency>
  <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-launcher -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-launcher_2.12</artifactId>
    <version>3.0.0-preview</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.0.0-preview</version>
  </dependency>
</dependencies>
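If you build with sbt instead of Maven, an equivalent build.sbt sketch would be the following (same artifacts and versions; the exact Scala patch version is an assumption, any 2.12.x matching the _2.12 suffix works):

// build.sbt -- sbt equivalent of the Maven dependencies above
scalaVersion := "2.12.10" // assumed; must be a 2.12.x to match the _2.12 artifacts

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"     % "3.0.0-preview",
  "org.apache.spark" %% "spark-launcher" % "3.0.0-preview",
  "org.apache.spark" %% "spark-sql"      % "3.0.0-preview",
  "junit"            %  "junit"          % "3.8.1" % Test
)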
Note: as of 2019-11-10 the Aliyun mirror is still missing some of these packages (spark-launcher_2.12 fails to download there), so use an overseas mirror instead.
Test code:
import org.apache.spark.sql.SparkSession

import scala.math.random

object SparkPi {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder
      .appName("Spark Pi")
      .master("local[2]")
      .config("spark.driver.resource.gpu.discoveryScript", "D:\\gpu.bat")
      .config("spark.worker.resource.gpu.discoveryScript", "D:\\gpu.bat")
      .config("spark.driver.resource.gpu.amount", 1)
      .config("spark.executor.resource.gpu.amount", 1)
      .config("spark.worker.resource.gpu.amount", 1)
      .getOrCreate()
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = math.min(100000L * slices, Int.MaxValue).toInt // avoid overflow
    val count = spark.sparkContext.parallelize(1 until n, slices).map { i =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y <= 1) 1 else 0
    }.reduce(_ + _)
    println(s"Pi is roughly ${4.0 * count / (n - 1)}")
    spark.stop()
  }
}
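Incidentally, where the scheduler does assign resources (e.g. on a real cluster), a task can read its allotted GPU addresses through TaskContext.resources(); a minimal sketch, assuming the same "gpu" resource name as above:

import org.apache.spark.TaskContext

// Inside a task closure: look up the "gpu" resource assigned to this task.
// On a scheduler that actually allocates GPUs, the addresses are the ones
// the discovery script reported (e.g. "0").
spark.sparkContext.parallelize(1 to 2, 2).foreach { _ =>
  TaskContext.get().resources().get("gpu").foreach { gpu =>
    println(s"task GPU addresses: ${gpu.addresses.mkString(",")}")
  }
}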
The content of gpu.bat is as follows:
@echo off
echo {"name": "gpu", "addresses": ["0"]}
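The discovery script just has to print a single JSON object with the resource's name and its device addresses; on a machine with two GPUs, for example, the output would look like this (hypothetical variant):

@echo off
echo {"name": "gpu", "addresses": ["0", "1"]}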
The run log is as follows:
2019-11-10 00:39:33,429 [main] INFO [org.apache.spark.SparkContext] - Running Spark version 3.0.0-preview
2019-11-10 00:39:34,915 [main] INFO [org.apache.spark.resource.ResourceUtils] - ==============================================================
2019-11-10 00:39:34,918 [main] INFO [org.apache.spark.resource.ResourceUtils] - Resources for spark.driver: gpu -> [name: gpu, addresses: 0]
2019-11-10 00:39:34,919 [main] INFO [org.apache.spark.resource.ResourceUtils] - ==============================================================
I assumed the GPU was being used successfully, but checking Task Manager showed no GPU activity at all. So I searched the source code, and in "spark-3.0.0-preview\core\src\main\scala\org\apache\spark\scheduler\local\LocalSchedulerBackend.scala" (85,56) found the following:
def reviveOffers(): Unit = {
  // local mode doesn't support extra resources like GPUs right now
  val offers = IndexedSeq(new WorkerOffer(localExecutorId, localExecutorHostname, freeCores,
    Some(rpcEnv.address.hostPort)))
  for (task <- scheduler.resourceOffers(offers).flatten) {
    freeCores -= scheduler.CPUS_PER_TASK
    executor.launchTask(executorBackend, task)
  }
}
Note the comment: "local mode doesn't support extra resources like GPUs right now".
In other words, local mode does not support GPUs: the WorkerOffer built here carries no resource information, so the GPU configs are silently ignored.
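To make that concrete, the scheduler's WorkerOffer in the 3.0 line takes a resources map roughly shaped as below (an approximation based on the 3.0 sources, not an exact copy); local mode constructs the offer without that argument, so it stays empty:

import scala.collection.mutable.Buffer

// Approximate shape of the scheduler's WorkerOffer (assumption based on
// the 3.0 sources): local mode omits `resources`, so it defaults to empty
// and tasks are never offered a GPU.
case class WorkerOffer(
    executorId: String,
    host: String,
    cores: Int,
    address: Option[String] = None,
    resources: Map[String, Buffer[String]] = Map.empty)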
My heart sank. I had planned to set up standalone mode next, but it turns out that won't work on Windows, and a Linux setup would need a virtual machine; with limited resources, I'll leave that experiment for later.
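For anyone who does spin up a Linux standalone cluster, the application side of the test would look roughly like this (a sketch only: the master URL is hypothetical, and the spark.worker.resource.gpu.* settings belong in the worker's own configuration rather than the application):

import org.apache.spark.sql.SparkSession

// Sketch: assumes each worker was started with
//   spark.worker.resource.gpu.amount=1
//   spark.worker.resource.gpu.discoveryScript=/opt/spark/gpu.sh  (hypothetical path)
// so it advertises its GPU; the application then requests GPUs per executor/task.
val spark = SparkSession.builder
  .appName("Spark Pi (standalone, GPU)")
  .master("spark://master:7077") // hypothetical master URL
  .config("spark.executor.resource.gpu.amount", "1")
  .config("spark.task.resource.gpu.amount", "1")
  .getOrCreate()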