[mac] hadoop hive hbase spark installation notes

All components come from CDH 5.8.0.

hadoop
Every time the machine is rebooted and Hadoop is started again, http://localhost:50070 is unreachable, and jps shows the NameNode is not running. The log at ~/opt/cdh5/hadoop-2.6.0-cdh5.8.0/logs/hadoop-fanhuan-namenode-fanhuandeMacBook-Pro.local.log contains the error:
org.apache.hadoop.hdfs.server.common.InconsistentFSStateException:
Directory /private/tmp/hadoop-fanhuan/dfs/name is in an inconsistent state:
storage directory does not exist or is not accessible.

Add the following to core-site.xml so that Hadoop's working data is kept outside /tmp (the path below is an example; any directory that survives reboots will do):
<property>
  <name>hadoop.tmp.dir</name>
  <value>/Users/fanhuan/opt/cdh5/hadoop-data</value>
</property>

If Hadoop is configured in pseudo-distributed mode, it stores all of its state under /tmp, so that data is lost whenever the system reboots, forcing you to rerun hadoop namenode -format. To avoid this, add a property named dfs.name.dir to hdfs-site.xml, with its value set to a directory of your choice; as long as it is not under /tmp, the metadata will no longer be lost on every reboot.
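In hdfs-site.xml that property might look like this (the path is only an example; any location outside /tmp works):

```xml
<property>
  <name>dfs.name.dir</name>
  <value>/Users/fanhuan/opt/cdh5/dfs/name</value>
</property>
```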

A warning appears at startup:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform...
using builtin-java classes where applicable

This requires compiling the Hadoop native library; alternatively, a prebuilt one can be downloaded and placed under $HADOOP_HOME/lib/native.
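To check which native libraries Hadoop actually loads, the bundled diagnostic command can be run (requires a working $HADOOP_HOME):

```shell
# Prints each native library (hadoop, zlib, snappy, ...) and whether it was loaded
hadoop checknative -a
```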

hive
Starting hive --service hwi fails with:
ls: /Users/fanhuan/opt/cdh5/hive-1.1.0-cdh5.8.0/lib/hive-hwi-*.war:
No such file or directory

Download the Hive source, enter the hwi directory, and build the war:
jar cfM hive-hwi-1.1.1.war -C web .

Copy the generated war into $HIVE_HOME/lib.
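The build-and-install steps can be sketched end to end as follows (hive-src is a hypothetical path to the Hive source checkout):

```shell
cd hive-src/hwi                      # hypothetical: hwi module inside the Hive source tree
jar cfM hive-hwi-1.1.1.war -C web .  # package the web/ directory into the war
cp hive-hwi-1.1.1.war "$HIVE_HOME/lib/"
```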
Then edit hive-site.xml:
<property>
  <name>hive.hwi.war.file</name>
  <value>lib/hive-hwi-1.1.1.war</value>
</property>

After that you can access http://localhost:9999/hwi.

The following error appears:
Unable to find a javac compiler;
com.sun.tools.javac.Main is not on the classpath.
Perhaps JAVA_HOME does not point to the JDK.
It is currently set to "/Library/Java/JavaVirtualMachines/jdk1.8.0_60.jdk/Contents/Home/jre"

Running the following command fixes it:
ln -s $JAVA_HOME/lib/tools.jar $HIVE_HOME/lib/

hbase
hbase-site.xml
<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:9000/hbase</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

To run hbase shell with the bundled ZooKeeper, edit hbase-env.sh:
export HBASE_MANAGES_ZK=true
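With that setting, starting HBase also starts the managed ZooKeeper (HDFS must already be up):

```shell
start-hbase.sh   # launches HMaster along with the bundled ZooKeeper
hbase shell      # opens the interactive shell
```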

Spark
Starting pyspark fails with:
Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/hadoop/fs/FSDataInputStream
at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:117)

Edit conf/spark-env.sh to add the hadoop classpath:
export SPARK_DIST_CLASSPATH=$(hadoop classpath)

More errors follow:
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NoClassDefFoundError: com/fasterxml/jackson/databind/Module

py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NoClassDefFoundError: com/fasterxml/jackson/core/Versioned

fanhuan@bogon:~$ hadoop classpath
/Users/fanhuan/opt/cdh5/hadoop-2.6.0-cdh5.8.0/etc/hadoop:
/Users/fanhuan/opt/cdh5/hadoop-2.6.0-cdh5.8.0/share/hadoop/common/lib/*:
/Users/fanhuan/opt/cdh5/hadoop-2.6.0-cdh5.8.0/share/hadoop/common/*:
/Users/fanhuan/opt/cdh5/hadoop-2.6.0-cdh5.8.0/share/hadoop/hdfs:
/Users/fanhuan/opt/cdh5/hadoop-2.6.0-cdh5.8.0/share/hadoop/hdfs/lib/*:
/Users/fanhuan/opt/cdh5/hadoop-2.6.0-cdh5.8.0/share/hadoop/hdfs/*:
/Users/fanhuan/opt/cdh5/hadoop-2.6.0-cdh5.8.0/share/hadoop/yarn/lib/*:
/Users/fanhuan/opt/cdh5/hadoop-2.6.0-cdh5.8.0/share/hadoop/yarn/*:
/Users/fanhuan/opt/cdh5/hadoop-2.6.0-cdh5.8.0/share/hadoop/mapreduce/lib/*:
/Users/fanhuan/opt/cdh5/hadoop-2.6.0-cdh5.8.0/share/hadoop/mapreduce/*:
/Users/fanhuan/opt/cdh5/hadoop-2.6.0-cdh5.8.0/contrib/capacity-scheduler/*.jar

The jackson-core-2.2.3.jar and jackson-databind-2.2.3.jar jars are missing: they do not appear anywhere on the hadoop classpath above, but they are present under tools/lib. Prepending that directory solves it:
export SPARK_DIST_CLASSPATH=$HADOOP_HOME/share/hadoop/tools/lib/*:$(hadoop classpath)
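Before exporting, it is worth confirming that the Jackson jars really are under tools/lib:

```shell
# Expect jackson-core-2.2.3.jar and jackson-databind-2.2.3.jar in the listing
ls "$HADOOP_HOME/share/hadoop/tools/lib/" | grep jackson
```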