Set up a single-machine CDH environment and upgrade its Spark installation
1. Install VMware Player: http://dlsw.baidu.com/sw-search-sp/soft/90/13927/VMware_player_7.0.0_2305329.1420626349.exe
2. Enable virtualization in the BIOS: http://www.cnblogs.com/stono/p/8323516.html
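Before rebooting into the BIOS, it can be worth confirming that the CPU supports hardware virtualization at all. A minimal check from any Linux shell (not from the post itself; `/proc/cpuinfo` and the `vmx`/`svm` flags are standard on Linux):

```shell
# Count CPU flags for Intel VT-x (vmx) or AMD-V (svm).
# A result greater than 0 means the CPU supports virtualization;
# it still has to be enabled in the BIOS for VMware to use it.
egrep -c '(vmx|svm)' /proc/cpuinfo
```

If this prints 0, the CPU cannot run the 64-bit QuickStart VM regardless of BIOS settings.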
3. Download the CDH QuickStart VM: https://downloads.cloudera.com/demo_vm/vmware/cloudera-quickstart-vm-5.12.0-0-vmware.zip
4. Boot the CDH VM with VMware Player, giving it 8 GB of RAM and 4 CPUs; the root password is cloudera
5. Reinstall Spark; download it with: wget http://apache.mirrors.tds.net/spark/spark-2.0.0/spark-2.0.0-bin-hadoop2.7.tgz
   If the download fails with a 404 at first, retry it a few times.
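Rather than rerunning wget by hand, the retrying can be scripted. A small sketch (the `retry` helper is hypothetical, not part of the original steps; wget's own `--tries` option covers many of the same failures):

```shell
# Hypothetical helper: run a command up to N times, stopping at the
# first success. Useful when the mirror intermittently returns 404.
retry() {
  attempts=$1; shift
  i=1
  while [ "$i" -le "$attempts" ]; do
    "$@" && return 0
    echo "attempt $i failed, retrying..." >&2
    i=$((i + 1))
  done
  return 1
}

# Usage with the URL from step 5:
# retry 5 wget http://apache.mirrors.tds.net/spark/spark-2.0.0/spark-2.0.0-bin-hadoop2.7.tgz
```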
6. After downloading, unpack and configure Spark:
   tar xzvf spark-2.0.0-bin-hadoop2.7.tgz
   cd spark-2.0.0-bin-hadoop2.7
   vi /etc/profile.d/spark2.sh
       export SPARK_HOME=/home/cloudera/spark-2.0.0-bin-hadoop2.7
       export PATH=$PATH:/home/cloudera/spark-2.0.0-bin-hadoop2.7/bin
   cp conf/spark-env.sh.template conf/spark-env.sh
   cp conf/spark-defaults.conf.template conf/spark-defaults.conf
   vi conf/spark-env.sh
       export HADOOP_CONF_DIR=/etc/hadoop/conf
       export JAVA_HOME=/usr/java/jdk1.7.0_67-cloudera
   cp /etc/hive/conf/hive-site.xml conf/
   Finally, change the log level in conf/log4j.properties to ERROR.
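For the last step, note that a fresh Spark download ships only conf/log4j.properties.template, so the file usually has to be created from the template first (cp conf/log4j.properties.template conf/log4j.properties). In the stock Spark 2.0 template the setting to change is the root category, so the edit amounts to:

```properties
# conf/log4j.properties — lower console noise from INFO to ERROR
log4j.rootCategory=ERROR, console
```

This silences the INFO chatter that spark-shell otherwise prints on every command.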