After a sqoop job finished, a column of MySQL type tinyint consistently had no value in the output. The upstream MySQL table did contain values for that column; inspecting the HDFS files showed the column had been converted to boolean.
A search turned up others who had hit the same issue; the following is quoted from https://blog.csdn.net/Fenggms/article/details/84527824
First, an excerpt from the official documentation:
27.2.5. MySQL: Import of TINYINT(1) from MySQL behaves strangely
Problem: Sqoop is treating TINYINT(1) columns as booleans, which is for example causing issues with HIVE import. This is because by default the MySQL JDBC connector maps the TINYINT(1) to java.sql.Types.BIT, which Sqoop by default maps to Boolean.
Solution: A more clean solution is to force MySQL JDBC Connector to stop converting TINYINT(1) to java.sql.Types.BIT by adding tinyInt1isBit=false into your JDBC path (to create something like jdbc:mysql://localhost/test?tinyInt1isBit=false). Another solution would be to explicitly override the column mapping for the datatype TINYINT(1) column. For example, if the column name is foo, then pass the following option to Sqoop during import: --map-column-hive foo=tinyint. In the case of non-Hive imports to HDFS, use --map-column-java foo=integer.
Official documentation link:
https://sqoop.apache.org/docs/1.4.6/SqoopUserGuide.html
Explanation:
Problem:
When a MySQL table contains a tinyint(1) column, that field is converted to boolean by default when the data is imported into HDFS, so the original values are lost.
Solutions:
1. Append tinyInt1isBit=false to the JDBC connection string:
--connect jdbc:mysql://192.168.9.80:3306/kgc_behivour_log?tinyInt1isBit=false
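A full import command with this connection parameter might look like the following sketch. The username, password, table name, and target directory are placeholders, not from the original post; only the host and database come from the connection string above.

```shell
# Sketch only: credentials, table and target path are assumed placeholders.
# tinyInt1isBit=false makes the MySQL JDBC driver report TINYINT(1) as
# TINYINT instead of BIT, so Sqoop no longer maps the column to Boolean.
sqoop import \
  --connect "jdbc:mysql://192.168.9.80:3306/kgc_behivour_log?tinyInt1isBit=false" \
  --username root \
  --password '******' \
  --table user_log \
  --target-dir /data/user_log \
  -m 1
```

This fixes the mapping for every TINYINT(1) column in the import at once, which is why the official docs call it the cleaner solution.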
2. Alternatively, explicitly override the column mapping for the TINYINT(1) column. For example, if the column is named foo, pass the following option to Sqoop during import: --map-column-hive foo=tinyint.
For non-Hive imports to HDFS, use --map-column-java foo=integer.
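For the per-column approach, a sketch of an HDFS import might look like this. The connection details and table name are the same placeholders as above, and foo stands for the affected column; note that the value passed to --map-column-java must be a Java type name, so the capitalized wrapper type Integer is used here.

```shell
# Sketch only: connection details, table and column name are placeholders.
# --map-column-java overrides the generated Java type for one column,
# so only "foo" is affected rather than every TINYINT(1) column.
sqoop import \
  --connect "jdbc:mysql://192.168.9.80:3306/kgc_behivour_log" \
  --username root \
  --password '******' \
  --table user_log \
  --map-column-java foo=Integer \
  --target-dir /data/user_log \
  -m 1
```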