git clone git://github.com/apache/spark.git -b branch-1.6
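The build steps below are run from the root of the checkout:

cd spark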
Add a cdh5.0.2 profile to the profiles section of the root pom.xml, as follows:

<profile>
  <id>cdh5.0.2</id>
  <properties>
    <hadoop.version>2.3.0-cdh5.0.2</hadoop.version>
    <hbase.version>0.96.1.1-cdh5.0.2</hbase.version>
    <flume.version>1.4.0-cdh5.0.2</flume.version>
    <zookeeper.version>3.4.5-cdh5.0.2</zookeeper.version>
  </properties>
</profile>
build/mvn -Pyarn -Pcdh5.0.2 -Phive -Phive-thriftserver -Pnative -DskipTests package
The command above fails because maven.twttr.com is blocked from mainland China; add a hosts entry, 199.16.156.89 maven.twttr.com, and run the build again.
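One way to add that entry (requires root; the IP is the one given above):

echo "199.16.156.89 maven.twttr.com" | sudo tee -a /etc/hosts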
--spark-env.sh--
export SPARK_SSH_OPTS="-p9413"
export HADOOP_CONF_DIR=/opt/hadoop/hadoop-cluster/modules/hadoop-2.3.0-cdh5.0.2/etc/hadoop
export SPARK_EXECUTOR_INSTANCES=1
export SPARK_EXECUTOR_CORES=4
export SPARK_EXECUTOR_MEMORY=1G
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/

--slaves--
hadoop-dev-211
hadoop-dev-212
hadoop-dev-213
hadoop-dev-214

(conf/slaves takes one worker hostname per line; the IP-to-hostname mappings 192.168.3.211-214 belong in /etc/hosts on each node, otherwise start-slaves.sh would treat the IP and the hostname as two separate workers.)
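A sketch for pushing the finished configs to the workers (the Spark install path is hypothetical; ssh listens on port 9413 per SPARK_SSH_OPTS above):

for host in hadoop-dev-212 hadoop-dev-213 hadoop-dev-214; do
  scp -P 9413 conf/spark-env.sh conf/slaves $host:/opt/hadoop/spark-1.6/conf/
done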
--Cluster layout--
hadoop-dev-211  Master, Worker
hadoop-dev-212  Worker
hadoop-dev-213  Worker
hadoop-dev-214  Worker

--Start the Master--
sbin/start-master.sh

--Start the Workers--
sbin/start-slaves.sh
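To confirm the cluster came up (assuming the default master web UI port, 8080):

jps                               # Master (plus a Worker on hadoop-dev-211) should be listed
curl http://hadoop-dev-211:8080   # the master UI should show all four workers as ALIVE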
Copy hive-site.xml and hive-log4j.properties into Spark's conf directory.
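For example, assuming the configs sit under the Hive install referenced below:

cp /opt/hadoop/hadoop-cluster/modules/apache-hive-1.2.1-bin/conf/hive-site.xml conf/
cp /opt/hadoop/hadoop-cluster/modules/apache-hive-1.2.1-bin/conf/hive-log4j.properties conf/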
# Step 1: start spark-shell (the LZO jar and the MySQL JDBC driver must be on the classpath)
bin/spark-shell --jars lib_managed/jars/hadoop-lzo-0.4.17.jar \
  --driver-class-path /opt/hadoop/hadoop-cluster/modules/apache-hive-1.2.1-bin/lib/mysql-connector-java-5.6-bin.jar

// Step 2: read the MySQL table over JDBC
val jdbcDF = sqlContext.read.format("jdbc").options(Map(
  "url" -> "jdbc:mysql://hadoop-dev-212:3306/hive",
  "dbtable" -> "VERSION",
  "user" -> "hive",
  "password" -> "123456")).load()

// Step 3: save it as a Hive table
jdbcDF.write.saveAsTable("test")
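A quick way to confirm the table landed in Hive, from the same spark-shell session:

// Step 4: the Hive table should return the row(s) read from MySQL
sqlContext.sql("SELECT * FROM test").show()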