Integrating hive-3.1.2 with hadoop-3.3.0 + hbase-2.2.4

1. Download Hive 3.1.2, the release that matches Hadoop 3.x.y

Download address: http://mirror.bit.edu.cn/apache/hive/
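The download can also be scripted directly on the server; a minimal sketch, assuming the mirror follows the usual Apache layout (the hive-3.1.2/apache-hive-3.1.2-bin.tar.gz subpath is an assumption and may differ per mirror):

# fetch the tarball into /home (mirror subpath assumed)
wget http://mirror.bit.edu.cn/apache/hive/hive-3.1.2/apache-hive-3.1.2-bin.tar.gz -P /home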

2. Upload the archive to the installation directory: /home/apache-hive-3.1.2-bin.tar.gz

Extract it with tar -zxvf apache-hive-3.1.2-bin.tar.gz, then rename the directory to /home/hive-3.1.2
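A minimal sketch of the extract-and-rename step, using the directory names adopted in this article:

cd /home
tar -zxvf apache-hive-3.1.2-bin.tar.gz      # extracts to apache-hive-3.1.2-bin
mv apache-hive-3.1.2-bin hive-3.1.2         # rename to match the paths used below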

3. Edit the /etc/profile file

......
if [ -n "${BASH_VERSION-}" ] ; then
  if [ -f /etc/bashrc ] ; then
    # Bash login shells run only /etc/profile
    # Bash non-login shells run only /etc/bashrc
    # Check for double sourcing is done in /etc/bashrc.
    . /etc/bashrc
  fi
fi

export JAVA_HOME=/usr/java/jdk1.8.0_131
export JRE_HOME=${JAVA_HOME}/jre
export HADOOP_HOME=/home/hadoop-3.3.0
export HIVE_HOME=/home/hive-3.1.2
export CLASSPATH=$($HADOOP_HOME/bin/hadoop classpath):$CLASSPATH
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$PATH:$HIVE_HOME/bin

Apply the changes: source /etc/profile
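A quick sanity check that the variables took effect in the current shell (a sketch, assuming the profile above was sourced as shown):

echo $JAVA_HOME $HADOOP_HOME $HIVE_HOME     # should print the three install directories
hadoop classpath | head -c 200              # confirms the Hadoop classpath resolves
which hive                                  # should resolve to /home/hive-3.1.2/bin/hive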

4. Copy /home/hadoop-3.3.0/share/hadoop/common/lib/guava-27.0-jre.jar into /home/hive-3.1.2/lib, replacing guava-19.0.jar
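A sketch of the guava swap; removing the old jar first avoids having two guava versions on Hive's classpath:

rm /home/hive-3.1.2/lib/guava-19.0.jar
cp /home/hadoop-3.3.0/share/hadoop/common/lib/guava-27.0-jre.jar /home/hive-3.1.2/lib/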

5. Upload mysql-connector-java-5.1.47.jar to the /home/hive-3.1.2/lib directory;

Create the MySQL database: db_hive
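A minimal sketch of creating the metastore database, assuming a MySQL account with CREATE privileges; the user name placeholder and utf8 character set are assumptions, and the host is the RDS instance referenced in hive-site.xml below:

mysql -h ****.mysql.rds.aliyuncs.com -u <db-user> -p \
      -e "CREATE DATABASE db_hive DEFAULT CHARACTER SET utf8;"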

Create hive-site.xml and upload it to the /home/hive-3.1.2/conf directory:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed to the Apache Software Foundation (ASF) under one or more
  contributor license agreements. See the NOTICE file distributed with
  this work for additional information regarding copyright ownership.
  The ASF licenses this file to You under the Apache License, Version 2.0
  (the "License"); you may not use this file except in compliance with
  the License. You may obtain a copy of the License at

      http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License.
-->
<configuration>
  <!-- Hive Execution Parameters -->
   <!-- Inserted settings begin here -->
    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>database username</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>database password</value>
    </property>
   <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://****.mysql.rds.aliyuncs.com/db_hive?useSSL=false&amp;rewriteBatchedStatements=true&amp;useServerPrepStmts=true&amp;cachePrepStmts=true&amp;autoReconnect=true&amp;failOverReadOnly=false</value>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
    </property>
    <property>
        <name>hive.metastore.schema.verification</name>
        <value>false</value>
    </property>
    
    <property>
         <name>hbase.zookeeper.quorum</name>
         <value>39.108.***.***</value>
    </property>
    <property>
         <name>hbase.zookeeper.property.clientPort</name>
         <value>2181</value>
    </property>

    <!-- Inserted settings end here -->
</configuration>

Edit hive-env.sh and place it in the /home/hive-3.1.2/conf directory, overwriting any existing copy:

# Set HADOOP_HOME to point to a specific hadoop install directory
# HADOOP_HOME=${bin}/../../hadoop

# Hive Configuration Directory can be controlled by:
# export HIVE_CONF_DIR=

# Folder containing extra libraries required for hive compilation/execution can be controlled by:
# export HIVE_AUX_JARS_PATH=

export HADOOP_HOME=/home/hadoop-3.3.0
export HBASE_HOME=/home/hbase-2.2.4
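hive-env.sh is usually created from the template shipped in the conf directory and then edited; a sketch of producing the file shown above:

cd /home/hive-3.1.2/conf
cp hive-env.sh.template hive-env.sh                 # template ships with the Hive distribution
echo 'export HADOOP_HOME=/home/hadoop-3.3.0' >> hive-env.sh
echo 'export HBASE_HOME=/home/hbase-2.2.4'   >> hive-env.sh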

6. Run # hive --version to confirm the installation succeeded

7. Run the hive command: # hive

       Run the db_hive schema initialization command: schematool -dbType mysql -initSchema

       After it succeeds, the metastore tables can be seen in db_hive:
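One way to confirm the initialization (a sketch, assuming the same MySQL account as above): the metastore tables such as DBS, TBLS and COLUMNS_V2 should now be listed in db_hive.

mysql -h ****.mysql.rds.aliyuncs.com -u <db-user> -p db_hive -e "SHOW TABLES;"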

     Run the DDL that creates a Hive table mapped to the HBase table:

CREATE EXTERNAL TABLE test(key string, id int, name string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,user:id,user:name")
TBLPROPERTIES("hbase.table.name" = "test");
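Because the table is declared EXTERNAL, the HBase table 'test' with column family 'user' is expected to exist on the HBase side already. A sketch of creating it and writing one sample row (the row key 'r1' and the values are made-up test data):

hbase shell <<'EOF'
create 'test', 'user'
put 'test', 'r1', 'user:id',   '1'
put 'test', 'r1', 'user:name', 'tom'
EOF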

    Run a SQL query: select * from test;
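The same query can also be run non-interactively as a quick smoke test (a sketch):

hive -e "select * from test;"               # rows written on the HBase side should appear here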
