Install the JDK; the installation steps are omitted here.
List-1 Check the JDK version
mjduan@mjduan-ubuntu:~$ java -version
java version "1.8.0_111"
Java(TM) SE Runtime Environment (build 1.8.0_111-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.111-b14, mixed mode)
Install Scala. Download and extract it, then add the environment variables shown in List-2 to ~/.bashrc.
List-2 Add SCALA_HOME to ~/.bashrc
...
#scala
export SCALA_HOME=/opt/software/tool/scala2.12
export PATH=$SCALA_HOME/bin:$PATH
List-3 Source ~/.bashrc and check the Scala version
mjduan@mjduan-ubuntu:~$ source ~/.bashrc
mjduan@mjduan-ubuntu:~$ scala -version
Scala code runner version 2.12.8 -- Copyright 2002-2018, LAMP/EPFL and Lightbend, Inc.
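As an optional extra check beyond scala -version (this snippet is an illustrative addition, not part of the original steps), evaluating a simple expression in the Scala REPL confirms that the interpreter itself works:

scala> // quick sanity check in the Scala 2.12 REPL started by running `scala`
scala> List(1, 2, 3).map(_ * 2).sum
res0: Int = 12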
Install Spark. Download and extract it, then add SPARK_HOME to ~/.bashrc as well; List-4 shows the resulting file.
List-4 ~/.bashrc after adding SPARK_HOME
mjduan@mjduan-ubuntu:~$ tail -f ~/.bashrc
......
#scala
export SCALA_HOME=/opt/software/tool/scala2.12
export PATH=$SCALA_HOME/bin:$PATH
#spark
export SPARK_HOME=/opt/software/tool/spark
export PATH=$SPARK_HOME/bin:$PATH
List-5 Source ~/.bashrc and run spark-shell to bring up the interactive command line
mjduan@mjduan-ubuntu:~$ source ~/.bashrc
mjduan@mjduan-ubuntu:~$ spark-shell
2019-03-08 18:02:36 WARN Utils:66 - Your hostname, mjduan-ubuntu resolves to a loopback address: 127.0.1.1; using 192.168.43.214 instead (on interface wlp2s0)
2019-03-08 18:02:36 WARN Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
2019-03-08 18:02:37 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.43.214:4040
Spark context available as 'sc' (master = local[*], app id = local-1552039363367).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.0
      /_/

Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_111)
Type in expressions to have them evaluated.
Type :help for more information.

scala>
Open http://192.168.43.214:4040 in a browser to see the Spark Web UI. Note that spark-shell reports Scala 2.11.12 rather than the 2.12.8 installed earlier: the prebuilt Spark 2.4.0 package bundles its own Scala runtime.
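Optionally, a small job run inside the spark-shell session verifies that Spark actually executes work. The sketch below is not part of the original listing; it assumes the default `sc` SparkContext created above with master = local[*]:

scala> // distribute the numbers 1 to 100 across the local cores and add them up
scala> val total = sc.parallelize(1 to 100).reduce(_ + _)
total: Int = 5050

scala> // the completed job then appears under the "Jobs" tab of the Web UI above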