When the Hive table was created, its storage format was specified as:
STORED AS ORC tblproperties ('orc.compress'='SNAPPY');
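For reference, the full DDL was presumably along these lines (the column list is a hypothetical sketch; only the STORED AS ... tblproperties clause is taken from the original). Note that if the DDL also contains an explicit ROW FORMAT DELIMITED clause, some Hive versions register LazySimpleSerDe for the table even though STORED AS ORC is specified, which would match the symptom described below:

CREATE TABLE persons_orc (
  id   INT,     -- hypothetical columns, for illustration only
  name STRING,
  age  INT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','  -- assumption: a clause like this can leave the SerDe as LazySimpleSerDe
STORED AS ORC tblproperties ('orc.compress'='SNAPPY');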
When inserting data into the table, the following exception is thrown:
Caused by: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.hive.ql.io.orc.OrcSerde$OrcSerdeRow
    at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat$OrcRecordWriter.write(OrcOutputFormat.java:98)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:743)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:97)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
    at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:115)
    at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:169)
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:561)
Checking the table structure at this point:
desc formatted persons_orc;
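In the output, the field to look at is SerDe Library under the # Storage Information section; here it reads (excerpt):

SerDe Library: org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe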
So the SerDe Library is LazySimpleSerDe, i.e. the table's serialization format is not ORC, which is why the exception is thrown.
Changing the table's serialization format to ORC fixes this:
ALTER TABLE persons_orc SET FILEFORMAT ORC;
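SET FILEFORMAT ORC switches the table to the ORC input/output formats. Whether it also replaces the SerDe can vary by Hive version; if it does not in your environment (an assumption to check with desc formatted), the SerDe can be set explicitly:

ALTER TABLE persons_orc SET SERDE 'org.apache.hadoop.hive.ql.io.orc.OrcSerde';  -- fallback, only needed if SET FILEFORMAT left the old SerDe in place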
Looking at the table again, the serialization format is now ORC, and inserting data succeeds:

insert overwrite table persons_orc select * from persons;
A more detailed explanation is available at: http://www.imooc.com/article/252830