A Spark SQL job fails with the following error:

Exception in thread "main" org.apache.spark.sql.AnalysisException: Detected cartesian product for INNER join between logical plans

Solution: set spark.sql.crossJoin.enabled=true

Spark 2.x does not allow cartesian product (cross join) operations by default; they must be enabled explicitly through the spark.sql.crossJoin.enabled parameter.

Enable the cartesian product in the application code as shown below:
import org.apache.spark.sql.SparkSession

val spark: SparkSession = SparkSession.builder
  .appName("My Spark Application") // optional; autogenerated if not specified
  .master("local[*]") // avoid hardcoding the deployment environment
  .config("spark.debug.maxToStringFields", "200")
  .config("spark.sql.crossJoin.enabled", "true") // allow cartesian products in Spark 2.x
  .getOrCreate()
val rst = discountFinancial.run(spark)
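Since the actual query lives inside the application, here is a minimal self-contained sketch (with hypothetical table and column names) of the kind of SQL that triggers this error: an INNER join with no join condition is planned as a cartesian product, which Spark 2.x rejects unless the flag is set.

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical demo: an INNER join without an ON clause is a cartesian product.
// On Spark 2.x, removing the crossJoin.enabled config below reproduces the
// AnalysisException from above.
object CrossJoinDemo {
  def run(): Long = {
    val spark = SparkSession.builder
      .appName("CrossJoinDemo")
      .master("local[*]")
      .config("spark.sql.crossJoin.enabled", "true")
      .getOrCreate()
    import spark.implicits._

    Seq(1, 2, 3).toDF("a").createOrReplaceTempView("t1")
    Seq(10, 20).toDF("b").createOrReplaceTempView("t2")

    // No join condition => cartesian product of t1 and t2
    val n = spark.sql("SELECT * FROM t1 JOIN t2").count()
    spark.stop()
    n // 3 rows x 2 rows = 6 rows
  }
}
```

Note that if only a few specific joins are intentional cross joins, the explicit Dataset.crossJoin method (available since Spark 2.1) works without turning on the global flag.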