In the previous post we covered how to install Hadoop, but remember that the goal all along has been to install Hive. So in this post I'll walk through how to install Hive.
1. Environment preparation
(1) VMware
(2) Ubuntu 16.04
(3) Hadoop
2. Installing Hive
(1) Install mysql-server and mysql-client
$ su hadoop
$ sudo apt-get install mysql-server mysql-client
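Optionally, confirm the packages installed and the server is present before continuing (a quick sanity check, not part of the original steps):
$ mysql --version
$ sudo service mysql status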
(2) Start the MySQL service
$ sudo /etc/init.d/mysql start
(3) Log in to MySQL
$ mysql -u root -p
Enter the MySQL root password you set during installation.
Once you are inside the mysql shell, run the following statements:
create user 'hive'@'%' identified by 'hive';
grant all privileges on *.* to 'hive'@'%' with grant option;
flush privileges;
create database if not exists hive_metadata;
grant all privileges on hive_metadata.* to 'hive'@'%' identified by 'hive';
grant all privileges on hive_metadata.* to 'hive'@'localhost' identified by 'hive';
flush privileges;
exit;
$ sudo /etc/init.d/mysql restart
$ mysql -u hive -p
Enter the password: hive
show databases;
If hive_metadata is not listed, run: create database hive_metadata;
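Optionally, while still logged in as the hive user, you can double-check the privileges set up above (show grants; lists the privileges of the account you connected as, here 'hive'@'localhost'):
show grants;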
(4) Install Hive
$ su hadoop
$ cd /usr/local
$ wget http://mirrors.hust.edu.cn/apache/hive/hive-2.3.3/apache-hive-2.3.3-bin.tar.gz
Check that the mirror actually still hosts this file; if not, search for another source yourself.
$ tar zxvf apache-hive-2.3.3-bin.tar.gz
$ sudo mkdir hive
$ sudo mv apache-hive-2.3.3-bin hive/hive-2.3.3
$ cd hive/hive-2.3.3
$ cd conf
$ cp hive-default.xml.template hive-site.xml
$ sudo vim hive-site.xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive_metadata?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
</property>
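Note that hive-site.xml copied from the template already contains these four property blocks with Derby-oriented defaults, so it is a matter of editing the existing values in place rather than appending new blocks. A quick, optional way to confirm the edits stuck:
$ grep -A 1 'javax.jdo.option.Connection' hive-site.xml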
$ cp hive-env.sh.template hive-env.sh
$ sudo vim hive-env.sh
export HADOOP_HOME=/usr/local/hadoop
export HIVE_CONF_DIR=/usr/local/hive/hive-2.3.3/conf
$ cd ../bin
$ vim hive-config.sh
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export HIVE_HOME=/usr/local/hive/hive-2.3.3
export HADOOP_HOME=/usr/local/hadoop
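The JDK directory name varies from machine to machine, so it may be worth confirming it before hard-coding JAVA_HOME above (an optional check; the path shown assumes the stock apt OpenJDK 8 package):
$ ls /usr/lib/jvm/
$ readlink -f $(which java)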
$ sudo vim /etc/profile
export HIVE_HOME=/usr/local/hive/hive-2.3.3
export PATH=$PATH:$HIVE_HOME/bin
$ source /etc/profile
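At this point the new environment variables can be sanity-checked before going further (optional):
$ echo $HIVE_HOME
$ hive --version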
$ cd /usr/local/hive/hive-2.3.3
$ wget http://ftp.ntu.edu.tw/MySQL/Downloads/Connector-J/mysql-connector-java-5.1.45.tar.gz
$ tar zxvf mysql-connector-java-5.1.45.tar.gz
$ sudo cp mysql-connector-java-5.1.45/mysql-connector-java-5.1.45-bin.jar lib/
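One step the listing above does not include: with Hive 2.x the metastore schema normally has to be initialized once before the first run, otherwise bin/hive tends to fail with a MetaException. A sketch of the usual command, run from /usr/local/hive/hive-2.3.3:
$ bin/schematool -dbType mysql -initSchema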
(5) Test
$ jps
Check that Hadoop's NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager processes are all running; if any of them is missing, stop Hadoop and start it again. See the previous post on installing Hadoop for how to stop and restart it.
$ cd bin
$ ./hive
After this command you should land at the Hive prompt:
hive>
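Once at the prompt, a small smoke test confirms the MySQL metastore is wired up end to end (test_tbl is a throwaway name used only for illustration):
hive> show databases;
hive> create table test_tbl (id int, name string);
hive> show tables;
hive> drop table test_tbl;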
3. Troubleshooting
(1) If running bin/hive fails with:
which: no hbase in (/opt/service/jdk1.7.0_67/bin:/opt/service/jdk1.7.0_67/jre/bin:/opt/mysql-5.6.24/bin:/opt/service/jdk1.7.0_67/bin:/opt/service/jdk1.7.0_67/jre/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/bin:/usr/bin:/usr/local/sbin:/usr/sbin:/sbin:/home/hadoop/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/apache/hive-2.1.0/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/apache/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
or similar messages saying the class path contains multiple bindings, just delete one of the two files listed on the "Found binding" lines and the warning disappears.
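For example, going by the paths in the log above, deleting the binding bundled with Hive would look like this (adjust the path to your own install):
$ rm /opt/apache/hive-2.1.0/lib/log4j-slf4j-impl-2.4.1.jar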
(2) If the error is:
call from wuyanjing-virtual-machine/127.0.0.1 to localhost:9000 failed
When this error appears, first run jps to check whether Hadoop is actually up. Restarting Hadoop usually resolves the problem.
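A typical restart, assuming the layout from the previous Hadoop post with Hadoop under /usr/local/hadoop:
$ /usr/local/hadoop/sbin/stop-yarn.sh
$ /usr/local/hadoop/sbin/stop-dfs.sh
$ /usr/local/hadoop/sbin/start-dfs.sh
$ /usr/local/hadoop/sbin/start-yarn.sh
$ jps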
(3) If the error is:
Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir}
When this error appears, find every occurrence of ${system:java.io.tmpdir} in hive-site.xml and replace it with a concrete directory.
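For example, one of the affected properties, rewritten with /tmp/hive as the concrete directory (the directory itself is an arbitrary choice; any writable local path works):
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/tmp/hive</value>
</property>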