Spark import from PostgreSQL to MongoDB fails: Decimal precision 39 exceeds max precision 38

Today, while using Spark to import data from PostgreSQL into MongoDB, the job failed with the following error: 9/02/25 16:47:21 INFO DAGScheduler: Job 0 failed: foreachPartition at MongoSpark.scala:117, took 16.897605 s org.apache.spark.SparkException: Job aborted due
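The usual cause is a PostgreSQL NUMERIC column declared without an explicit precision: the JDBC driver reports it with a precision above 38, while Spark's DecimalType caps precision at 38, so the read aborts. One common workaround is to push a CAST down to PostgreSQL by wrapping the table in a subquery passed as the JDBC `dbtable` option. The sketch below (table name `orders` and column `amount` are hypothetical, not from the original post) builds such a subquery string:

```scala
// Sketch, assuming a PostgreSQL table "orders" with an unbounded NUMERIC
// column "amount". Spark's DecimalType allows at most precision 38, so we
// cast the column to an explicit NUMERIC(p, s) with p <= 38 on the
// PostgreSQL side before Spark ever sees the schema.
object DecimalCastQuery {
  // Build a JDBC "dbtable" subquery that casts the given decimal columns
  // to an explicit precision/scale; other columns pass through unchanged.
  def boundedQuery(table: String,
                   plainCols: Seq[String],
                   decimalCols: Map[String, (Int, Int)]): String = {
    val casts = decimalCols.map { case (col, (p, s)) =>
      require(p <= 38, s"Spark DecimalType max precision is 38, got $p")
      s"CAST($col AS NUMERIC($p, $s)) AS $col"
    }
    val selectList = (plainCols ++ casts).mkString(", ")
    s"(SELECT $selectList FROM $table) AS t"
  }
}
```

The resulting string can then be handed to the standard JDBC reader, e.g. `spark.read.format("jdbc").option("dbtable", DecimalCastQuery.boundedQuery("orders", Seq("id"), Map("amount" -> (38, 10))))`, after which the DataFrame can be written out with MongoSpark as before.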