Spark 2.1.1
Spark throws an error when writing data to a Hive external table whose underlying data lives in HBase:
Caused by: java.lang.ClassCastException: org.apache.hadoop.hive.hbase.HiveHBaseTableOutputFormat cannot be cast to org.apache.hadoop.hive.ql.io.HiveOutputFormat
at org.apache.spark.sql.hive.SparkHiveWriterContainer.outputFormat$lzycompute(hiveWriterContainers.scala:82)
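To make the failure concrete, here is a hedged reproduction sketch (the table and column names are assumptions, and it needs a live Hive + HBase deployment to actually run): inserting into an HBase-backed Hive table goes through HiveHBaseTableOutputFormat, which does not extend HiveOutputFormat, so the cast fails.

```scala
import org.apache.spark.sql.SparkSession

// Spark 2.1.x session with Hive support; assumes hive-site.xml and the
// Hive-HBase handler jars are on the classpath.
val spark = SparkSession.builder()
  .appName("hive-hbase-write-repro")
  .enableHiveSupport()
  .getOrCreate()

// hbase_backed_table is a hypothetical Hive external table created with
// STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'.
// This INSERT triggers the ClassCastException in SparkHiveWriterContainer.
spark.sql("INSERT INTO TABLE hbase_backed_table SELECT key, value FROM source_table")
```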
The offending code is in org.apache.spark.sql.hive.SparkHiveWriterContainer:

@transient private lazy val outputFormat =
  conf.value.getOutputFormat.asInstanceOf[HiveOutputFormat[AnyRef, Writable]]
This is the line that throws. Reading the surrounding code shows that the variable is not actually used at this point, so it can be set to null when the cast is not possible:
@transient private lazy val outputFormat =
  // conf.value.getOutputFormat.asInstanceOf[HiveOutputFormat[AnyRef, Writable]]
  conf.value.getOutputFormat match {
    case format if format.isInstanceOf[HiveOutputFormat[AnyRef, Writable]] =>
      format.asInstanceOf[HiveOutputFormat[AnyRef, Writable]]
    case _ => null
  }
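The pattern used by the patch can be shown standalone: match on the runtime type and fall back to null when the value is not the expected subtype. The trait and class names below are stand-ins invented for illustration, not Hive's real types; note also that because of JVM type erasure, the isInstanceOf check in the patch above only inspects the raw class, never the type parameters.

```scala
// Stand-in type hierarchy (hypothetical names, mirroring the real situation):
// HiveHBaseTableOutputFormat implements OutputFormat but NOT HiveOutputFormat.
trait OutputFormat
trait HiveOutputFormat extends OutputFormat
class TextHiveFormat extends HiveOutputFormat // cast succeeds for this one
class HBaseFormat extends OutputFormat        // cast would fail for this one

// Return the value as HiveOutputFormat when possible, else null,
// the same safe-cast-or-null shape as the patch.
def asHiveFormatOrNull(f: OutputFormat): HiveOutputFormat = f match {
  case hive: HiveOutputFormat => hive
  case _                      => null
}

println(asHiveFormatOrNull(new TextHiveFormat) != null) // prints true
println(asHiveFormatOrNull(new HBaseFormat) == null)    // prints true
```

Returning null is acceptable here only because the caller never dereferences the field on the HBase path; in new code an Option[HiveOutputFormat] would be the more idiomatic choice.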
With this change the problem is solved. The upstream discussion is at: https://issues.apache.org/jira/browse/SPARK-6628