Reading multiple files with spark textFile

1. Reading files with spark textFile

1.1 Simple file reads

val spark = SparkSession.builder()
    .appName("demo")
    .master("local[3]")
    .getOrCreate()

// Read an HDFS path (with or without an explicit namenode URI)
spark.sparkContext.textFile("/user/data")
spark.sparkContext.textFile("hdfs://10.252.51.58:8088/user/data")
// Read a local path; note the triple slash: "file://" scheme + absolute path
spark.sparkContext.textFile("file:///user/data")

1.2 Reading files with glob patterns

val spark = SparkSession.builder()
    .appName("demo")
    .master("local[3]")
    .getOrCreate()

// Read days 01-09 of 201908 using a glob pattern
spark.sparkContext.textFile("/user/data/201908/0[1-9]/*")
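Note that `textFile` paths use Hadoop glob syntax (`*`, `?`, `[a-b]`, `{alt1,alt2}`), not full regular expressions. As a rough illustration of the same semantics, the JDK's `PathMatcher` supports a very similar glob dialect (a stand-in only; in Spark the actual expansion is done by Hadoop's `FileSystem`):

```scala
import java.nio.file.{FileSystems, Paths}

// JDK glob matcher, used here only to illustrate pattern semantics
val matcher = FileSystems.getDefault.getPathMatcher("glob:/user/data/201908/0[1-9]/*")

println(matcher.matches(Paths.get("/user/data/201908/03/part-00000"))) // true
println(matcher.matches(Paths.get("/user/data/201908/15/part-00000"))) // false: "15" not in 0[1-9]
```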

2. Reading multiple files with spark textFile

2.1 Passing multiple files as one comma-separated argument

Correct form: sc.textFile(filename1 + "," + filename2 + "," + filename3)

val spark = SparkSession.builder()
    .appName("demo")
    .master("local[3]")
    .getOrCreate()

val fileList = Array("/user/data/source1","/user/data/source2","/user/data/source3")
// Read all three paths in one call; textFile splits its argument on commas
spark.sparkContext.textFile(fileList.mkString(","))
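Since `textFile` accepts a single comma-separated string, `mkString(",")` is all that is needed to combine the paths. The joined value can be checked in plain Scala, no Spark required:

```scala
val fileList = Array("/user/data/source1", "/user/data/source2", "/user/data/source3")

// textFile splits its argument on commas, so passing this joined string
// is equivalent to listing the three paths individually.
val joined = fileList.mkString(",")
println(joined) // /user/data/source1,/user/data/source2,/user/data/source3
```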

2.2 Concatenating RDDs with union

val spark = SparkSession.builder()
    .appName("demo")
    .master("local[3]")
    .getOrCreate()

val fileList = Array("/user/data/source1","/user/data/source2","/user/data/source3")
// One RDD per input path (requires: import org.apache.spark.rdd.RDD)
val fileRDD: Array[RDD[String]] = fileList.map(spark.sparkContext.textFile(_))

spark.sparkContext.union(fileRDD)
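`SparkContext.union` concatenates the partitions of its input RDDs without deduplicating. A plain-Scala analogy (lists standing in for RDDs, `++` standing in for `union`) shows the same behavior: duplicates are kept and input order is preserved:

```scala
// Stand-in for unioning RDDs: concatenation keeps duplicates in input order
val parts = Array(List("a", "b"), List("b"), List("c"))
val merged = parts.reduce(_ ++ _)
println(merged) // List(a, b, b, c)
```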