This article illustrates Spark RDD partitions and dependencies through a worked example; we hope it serves as a useful reference for developers working through these concepts.
Original article: http://www.jianshu.com/p/6b9e4001723d
scala> counts.dependencies.foreach { dep =>
     |   println("dependency type:" + dep.getClass)
     |   println("dependency RDD:" + dep.rdd)
     |   println("dependency partitions:" + dep.rdd.partitions)
     |   println("dependency partitions size:" + dep.rdd.partitions.length)
     | }
dependency type:class org.apache.spark.ShuffleDependency
dependency RDD:MapPartitionsRDD[3] at map at <console>:25
dependency partitions:[Lorg.apache.spark.Partition;@c197f46
dependency partitions size:2
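The output shows that `counts` depends on a MapPartitionsRDD through a ShuffleDependency with two partitions. The definition of `counts` is not shown in this excerpt; a minimal sketch of how such an RDD is typically built in the spark-shell is a classic word count (the file path and intermediate names here are assumptions, not from the original):

```scala
// Hypothetical reconstruction of an RDD like `counts`.
// `sc` is the SparkContext that the spark-shell provides automatically.
val lines  = sc.textFile("README.md")        // assumed input file
val words  = lines.flatMap(_.split(" "))     // narrow (one-to-one) dependency
val pairs  = words.map(word => (word, 1))    // narrow (one-to-one) dependency
val counts = pairs.reduceByKey(_ + _)        // wide dependency: shuffles by key
```

Narrow transformations such as `map` and `flatMap` keep a one-to-one mapping between parent and child partitions, while `reduceByKey` must shuffle records so that equal keys land in the same partition, which is why `counts.dependencies` reports a ShuffleDependency.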
That concludes this article illustrating Spark RDD partitions and dependencies; we hope it is helpful to fellow developers.