Sqoop Deployment

Deployment

  1. Download, extract, and create a symlink
[hadoop@hadoop001 software]$ pwd
/home/hadoop/software
[hadoop@hadoop001 software]$ wget http://archive.cloudera.com/cdh5/cdh/5/sqoop-1.4.6-cdh5.7.0.tar.gz
[hadoop@hadoop001 software]$ tar -zxvf sqoop-1.4.6-cdh5.7.0.tar.gz
[hadoop@hadoop001 software]$ ln -s /home/hadoop/software/sqoop-1.4.6-cdh5.7.0 /home/hadoop/app/sqoop
  2. Edit the configuration file
[hadoop@hadoop001 software]$ cd ~/app/sqoop
[hadoop@hadoop001 sqoop]$ cp sqoop-env-template.sh  sqoop-env.sh
[hadoop@hadoop001 sqoop]$ vim sqoop-env.sh

#Set path to where bin/hadoop is available
export HADOOP_COMMON_HOME=/home/hadoop/app/hadoop

#Set path to where hadoop-*-core.jar is available
export HADOOP_MAPRED_HOME=/home/hadoop/app/hadoop

#set the path to where bin/hbase is available
#export HBASE_HOME=

#Set the path to where bin/hive is available
export HIVE_HOME=/home/hadoop/app/hive

#Set the path for where zookeper config dir is
#export ZOOCFGDIR=
  3. Configure environment variables
[hadoop@hadoop001 sqoop]$ vim ~/.bash_profile

export SQOOP_HOME=/home/hadoop/app/sqoop
export PATH=$SQOOP_HOME/bin:$PATH

[hadoop@hadoop001 sqoop]$ . ~/.bash_profile
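Before moving on, it can help to confirm the PATH change took effect. A quick check, assuming the symlinked layout above:

```shell
# Should resolve to the symlinked bin directory, /home/hadoop/app/sqoop/bin/sqoop
which sqoop
# Prints the Sqoop version banner if the installation is on the PATH
sqoop version
```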
  4. Copy the MySQL JDBC driver from Hive's lib directory into Sqoop's lib directory
# The MySQL JDBC driver was already placed in Hive's lib directory earlier
[hadoop@hadoop001 conf]$ cd ~/app/hive/lib/
[hadoop@hadoop001 lib]$ ll | grep mysql
-rw-r--r--  1 hadoop hadoop  1007502 Aug  7  2018 mysql-connector-java-5.1.47.jar
[hadoop@hadoop001 lib]$ cp mysql-connector-java-5.1.47.jar ~/app/sqoop/lib/

Verification

Just like `--help` on Linux, we can run `sqoop help` to see which commands are available:

[hadoop@hadoop001 ~]$ sqoop help
19/07/22 22:39:45 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.7.0
usage: sqoop COMMAND [ARGS]

Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table <-- commonly used
  help               List available commands
  import             Import a table from a database to HDFS   <-- commonly used
  import-all-tables  Import tables from a database to HDFS
  import-mainframe   Import datasets from a mainframe server to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server   <-- verifies sqoop works
  list-tables        List available tables in a database    <-- verifies sqoop works
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information

# Installation successful
[hadoop@hadoop001 ~]$ sqoop  list-databases \
> --connect jdbc:mysql://localhost:3306 \
> --username root \
> --password root
Warning: /home/hadoop/app/sqoop/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /home/hadoop/app/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/hadoop/app/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /home/hadoop/app/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
19/07/22 22:45:01 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.7.0
19/07/22 22:45:01 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/07/22 22:45:01 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
information_schema
d7
mysql
performance_schema
ruozedata
test
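With `list-databases` returning results, Sqoop can reach MySQL. A typical next step is importing a table into HDFS. A minimal sketch of such a command — the table name `emp` and the target directory are hypothetical placeholders, and note that the log above warns that `--password` on the command line is insecure, so this uses `-P` to prompt interactively instead:

```shell
# Hedged sketch: import one MySQL table into HDFS with a single map task.
# `emp` and /user/hadoop/emp are placeholders; substitute your own table and path.
sqoop import \
  --connect jdbc:mysql://localhost:3306/ruozedata \
  --username root \
  -P \
  --table emp \
  --target-dir /user/hadoop/emp \
  --delete-target-dir \
  -m 1
```

`--delete-target-dir` removes the target directory if it already exists, which makes the job re-runnable; `-m 1` uses a single mapper, avoiding the need for a `--split-by` column.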