Building Spark 1.1 against Hadoop 2.5.1

<!-- lang: shell -->

    [root@localhost java]# cd spark
    [root@localhost spark]# git branch
    * branch-1.1

    [root@localhost spark]# ls
    assembly     conf  dist    examples  lib_managed           mllib    python     sbt                                 sql        tox.ini
    bagel        core  docker  external  lib_managed_bak       NOTICE   README.md  scalastyle-config.xml               streaming  yarn
    bin          data  docs    extras    LICENSE               pom.xml  repl       scalastyle-output.xml               target
    CHANGES.txt  dev   ec2     graphx    make-distribution.sh  project  sbin       spark-1.1.2-SNAPSHOT-bin-2.5.1.tgz  tools

    [root@localhost spark]# ls assembly/target/
    classes                         scala-2.10
    maven-shared-archive-resources  test-classes
    [root@localhost spark]# ls assembly/target/scala-2.10/
    spark-assembly-1.1.2-SNAPSHOT-hadoop2.5.1.jar
    [root@localhost spark]# bin/spark-shell
Build with Maven, raising the JVM limits first:

<!-- lang: shell -->

    export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"
    mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.5.1 -Phive -X -DskipTests clean package
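The `MAVEN_OPTS` line above can be wrapped in a small guard (a hypothetical sketch, not part of Spark's scripts) so a build script fails fast if the heap setting is missing instead of dying mid-compile with an OOM:

```shell
# Give Maven a larger heap; the JVM defaults are too small for the Spark build.
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

# Fail early if -Xmx did not make it into the environment.
case "$MAVEN_OPTS" in
  *-Xmx*) echo "MAVEN_OPTS set: $MAVEN_OPTS" ;;
  *)      echo "MAVEN_OPTS is missing a heap setting" >&2; exit 1 ;;
esac
```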


Edit the Maven build parameters in the make-distribution.sh script to drop Maven's `clean` step, so packaging reuses the classes already compiled above instead of rebuilding from scratch:

<!-- lang: shell -->

    #BUILD_COMMAND="mvn clean package -DskipTests $@"
    BUILD_COMMAND="mvn package -DskipTests $@"
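With that change, make-distribution.sh can be invoked with the same profiles as the Maven build; `--name 2.5.1` produces the `-bin-2.5.1` suffix seen in the tarball listed earlier. The flags are assumptions based on Spark 1.1's script, so this sketch only prints the command as a dry run:

```shell
# Hypothetical dry run of the distribution step; adjust flags to your tree.
HADOOP_VERSION=2.5.1
BUILD_ARGS="-Pyarn -Phadoop-2.4 -Dhadoop.version=$HADOOP_VERSION -Phive -DskipTests"
CMD="./make-distribution.sh --tgz --name $HADOOP_VERSION $BUILD_ARGS"
echo "$CMD"   # print instead of executing
```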

Download: http://pan.baidu.com/s/1i3qt6Ix
Extraction password: x242java

Note: if the build or REPL fails complaining about the Scala version, the cause is usually a Java/Scala mismatch. Either update to a newer Scala version (2.10.3+) or downgrade to Java 6/7. As the output shows, Scala 2.9.2 predates Java 8 by years (Copyright 2002-2011, LAMP/EPFL), so the two don't work well together.
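That check can be scripted before building (a hypothetical sketch; the version string here is a hard-coded sample rather than one captured from `java -version`):

```shell
# Parse a Java version string such as "1.7.0_80" (hypothetical sample value).
ver="1.7.0_80"
major=${ver%%.*}                      # "1"
minor=${ver#*.}; minor=${minor%%.*}   # "7"

# Java 6/7 report as 1.6/1.7; anything newer is too new for Scala 2.9.x.
if [ "$major" = "1" ] && [ "$minor" -le 7 ]; then
  echo "Java $ver: OK for Scala 2.9.x"
else
  echo "Java $ver: too new for Scala 2.9.x; use Java 6/7 or Scala 2.10.3+"
fi
# prints: Java 1.7.0_80: OK for Scala 2.9.x
```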