spark2.1: rdd.combineByKeyWithClassTag usage example
Spark version used for testing:
Spark context Web UI available at http://192.168.1.1:32735
Spark context available as 'sc' (master = local[*], app id = local-1380172893828).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.1.0
      /_/

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_72)
Type in expressions to have them evaluated.
Type :help for more information.

Note: Spark 1.5 does not provide the rdd.combineByKeyWithClassTag operator; it only provides rdd.combineByKey (which is still available in Spark 2.1).
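For comparison, the older rdd.combineByKey takes the same three functions (createCombiner, mergeValue, mergeCombiners), just without the explicit ClassTag variant. Below is a minimal, self-contained sketch of that operator; the sample data and value names are illustrative and not from the original post, and only the 'sc' from the shell session above is assumed.

// Sketch (illustrative data): per-key (sum, count) with the older combineByKey operator
val pairs = sc.makeRDD(Array(("a", 1), ("a", 3), ("b", 5)))
val sumCount = pairs.combineByKey(
  (v: Int) => (v, 1),                                          // createCombiner: start a combiner from the first value of a key
  (acc: (Int, Int), v: Int) => (acc._1 + v, acc._2 + 1),       // mergeValue: fold another value into the combiner (same partition)
  (a: (Int, Int), b: (Int, Int)) => (a._1 + b._1, a._2 + b._2) // mergeCombiners: merge combiners from different partitions
)
// sumCount.collect() should give Array((a,(4,2)), (b,(5,1))), order not guaranteed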
Usage example:
scala> case class FModel(cgridid: Int, angle: Double, drsrp: Double, distance: Double)
defined class FModel

scala> val sample_rdd=sc.makeRDD(
     |   Array(
     |     (1,FModel(1,2.0,2.1,2.2)),
     |     (1,FModel(2,2.2,2.11,23.2)),
     |     (2,FModel(1,2.0,2.1,2.2)),
     |     (1,FModel(3,2.0,42.1,22.2)),
     |     (2,FModel(2,2.2,2.11,23.2)),
     |     (3,FModel(3,2.0,42.1,22.2))
     |   )
     | )
sample_rdd: org.apache.spark.rdd.RDD[(Int, FModel)] = ParallelCollectionRDD[0] at makeRDD at <console>:26

scala> val combinByKeyRDD = sample_rdd.combineByKeyWithClassTag(
     |   (x: FModel) => (List(x), 1),
     |   (peo: (List[FModel], Int), x: FModel) => (x :: peo._1, peo._2 + 1),
     |   (sex1: (List[FModel], Int), sex2: (List[FModel], Int)) => (sex1._1 ::: sex2._1, sex1._2 + sex2._2))
combinByKeyRDD: org.apache.spark.rdd.RDD[(Int, (List[FModel], Int))] = ShuffledRDD[1] at combineByKeyWithClassTag at <console>:28

scala> combinByKeyRDD.foreach(println)
[Stage 0:> (0 + 0) / 12]
(3,(List(FModel(3,2.0,42.1,22.2)),1))
(2,(List(FModel(1,2.0,2.1,2.2), FModel(2,2.2,2.11,23.2)),2))
(1,(List(FModel(1,2.0,2.1,2.2), FModel(2,2.2,2.11,23.2), FModel(3,2.0,42.1,22.2)),3))

scala>
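To read the call above: the first argument (createCombiner) turns the first FModel seen for a key into the initial combiner (List(x), 1); the second (mergeValue) folds each further FModel for that key into the combiner within a partition; the third (mergeCombiners) merges combiners produced on different partitions, concatenating the lists and adding the counts. The sketch below restates the same call with named helper functions; the helper names are mine, and FModel and sample_rdd are assumed to be defined as in the transcript above.

// createCombiner: the first FModel for a key starts a (list, count) combiner
def createCombiner(x: FModel): (List[FModel], Int) = (List(x), 1)

// mergeValue: fold another FModel for the same key into an existing combiner (same partition)
def mergeValue(acc: (List[FModel], Int), x: FModel): (List[FModel], Int) =
  (x :: acc._1, acc._2 + 1)

// mergeCombiners: merge combiners built for the same key on different partitions
def mergeCombiners(a: (List[FModel], Int), b: (List[FModel], Int)): (List[FModel], Int) =
  (a._1 ::: b._1, a._2 + b._2)

// Same result type as combinByKeyRDD above: RDD[(Int, (List[FModel], Int))]
val combined = sample_rdd.combineByKeyWithClassTag(createCombiner _, mergeValue _, mergeCombiners _)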
Reprinted from: https://www.cnblogs.com/yy3b2007com/p/8506552.html