Spark OneToOneDependency (one-to-one dependency)

  • Represents a one-to-one dependency between partitions of the parent and child RDDs.
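The idea can be sketched without Spark at all. In Spark's source, `OneToOneDependency.getParents(partitionId)` simply returns `List(partitionId)`: child partition *i* reads exactly parent partition *i*. A minimal stand-in class (the names `OneToOneDep` and `OneToOneDemo` are illustrative, not Spark's):

```scala
// Minimal sketch of the idea behind Spark's OneToOneDependency
// (simplified stand-in; the real class is org.apache.spark.OneToOneDependency).
// In a one-to-one narrow dependency, child partition i depends on
// exactly parent partition i.
class OneToOneDep {
  def getParents(partitionId: Int): List[Int] = List(partitionId)
}

object OneToOneDemo {
  def main(args: Array[String]): Unit = {
    val dep = new OneToOneDep
    println(dep.getParents(0)) // List(0): child partition 0 reads parent partition 0
    println(dep.getParents(1)) // List(1)
  }
}
```

Because the mapping is the identity, no shuffle is needed: each child partition can be computed from a single parent partition on the same node.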

More resources

Video demo (Bilibili)

```html
<iframe src="//player.bilibili.com/player.html?aid=37442139&cid=65822237&page=1" scrolling="no" border="0" frameborder="no" framespacing="0" allowfullscreen="true"> </iframe>
```

Input data

a bc
a

Processing program (Scala)

```scala
package com.opensource.bigdata.spark.local.rdd.operation.dependency.narrow.n_02_RangeDependency

import com.opensource.bigdata.spark.local.rdd.operation.base.BaseScalaSparkContext

object Run1 extends BaseScalaSparkContext {

  def main(args: Array[String]): Unit = {
    val sc = pre()
    // Read the input file into an RDD with 2 partitions
    val rdd1 = sc.textFile("/opt/data/2/c.txt", 2)

    println(rdd1.collect().mkString("\n"))

    //rdd1.partitions(0).asInstanceOf[org.apache.spark.rdd.HadoopPartition]

    sc.stop()
  }

}
```
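To actually see the OneToOneDependency, one could inspect `rdd.dependencies`. `textFile` returns a `MapPartitionsRDD` built on top of a `HadoopRDD`, and the link between the two is a `OneToOneDependency`. A hypothetical standalone variant (not from the original post; it assumes spark-core on the classpath and a local master, so it cannot rely on the `BaseScalaSparkContext` helper above):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical standalone sketch: inspect the dependency chain that
// textFile builds. The MapPartitionsRDD returned by textFile depends on
// the underlying HadoopRDD through a OneToOneDependency.
object InspectDeps {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("inspect-deps"))
    val rdd1 = sc.textFile("/opt/data/2/c.txt", 2)

    // Expect something like: List(org.apache.spark.OneToOneDependency@...)
    println(rdd1.dependencies)

    // A narrow dependency can report the parent partitions of a child partition
    rdd1.dependencies.foreach {
      case d: org.apache.spark.OneToOneDependency[_] =>
        println(d.getParents(0)) // child partition 0 -> parent partition 0
      case other =>
        println(other)
    }
    sc.stop()
  }
}
```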

Data processing diagram

(Diagram: one-to-one dependency between parent and child RDD partitions)
