This article walks through wordcount examples in both Spark and Flink; hopefully it serves as a useful reference for developers facing this problem. Let's work through them together!

Spark wordcount example:
package spark

import org.apache.spark._

object TestSparkWordCount {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("wordcount").setMaster("local[*]"))

    // test.txt contents:
    // hello scala
    // hello spark
    val rdd = sc.textFile("src/main/resources/test.txt")
    val wordCount = rdd
      .flatMap(line => line.split(" "))
      .map(word => (word, 1))
      .reduceByKey((a, b) => a + b)
    wordCount.foreach(println)
    // (spark,1)
    // (scala,1)
    // (hello,2)
  }
}
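The core of the Spark job is the flatMap -> map -> reduceByKey pipeline. The same logic can be sketched on plain Scala collections, with no Spark dependency, which makes the transformation chain easier to see (the object and method names below are illustrative, not part of the original example):

```scala
object CollectionWordCount {
  // Split each line on spaces, then count occurrences of each word,
  // mirroring flatMap -> map -> reduceByKey in the Spark version.
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))
      .groupBy(identity)
      .map { case (word, occurrences) => (word, occurrences.size) }

  def main(args: Array[String]): Unit = {
    // Same input as the test.txt used in the Spark example
    println(wordCount(Seq("hello scala", "hello spark")))
  }
}
```

The difference in Spark is that `reduceByKey` combines partial counts per partition before shuffling, whereas `groupBy` here materializes all occurrences in memory first.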
Flink wordcount example:
package com.pinko.testcase

import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment
import org.apache.flink.api.scala._

object StreamWordCount {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)

    // 1.txt contents:
    // hello world
    // hello flink
    val text = env.readTextFile("src/main/resources/output/1.txt")
    val result = text
      .flatMap(_.split(" "))
      .map((_, 1))
      .keyBy(0)
      .sum(1)
    result.print("result")
    env.execute("StreamWordCount")
    // result> (hello,1)
    // result> (world,1)
    // result> (hello,2)
    // result> (flink,1)
  }
}
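Unlike the Spark batch job, the streaming job emits a running count: every incoming word updates its key's sum and the new total is emitted immediately, which is why `hello` appears first as `(hello,1)` and later as `(hello,2)`. (Note that in newer Flink releases the positional `keyBy(0)`/`sum(1)` forms are deprecated in favor of `keyBy(_._1)` with an explicit aggregation.) A minimal sketch of this keyed running sum on plain Scala collections, with no Flink dependency (names are illustrative):

```scala
object RunningKeyedSum {
  // Emit one (word, runningCount) pair per incoming word, in arrival
  // order, like keyBy + sum over an unbounded stream.
  def runningCounts(words: Seq[String]): Seq[(String, Int)] =
    words
      .scanLeft((Map.empty[String, Int], Option.empty[(String, Int)])) {
        case ((counts, _), word) =>
          val updated = counts.getOrElse(word, 0) + 1
          // Carry the per-key state forward and emit the new total
          (counts.updated(word, updated), Some(word -> updated))
      }
      .flatMap(_._2) // drop the initial state, keep emitted pairs

  def main(args: Array[String]): Unit = {
    runningCounts(Seq("hello", "world", "hello", "flink")).foreach(println)
    // (hello,1)
    // (world,1)
    // (hello,2)
    // (flink,1)
  }
}
```

The `Map` threaded through `scanLeft` plays the role of Flink's per-key managed state.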
That concludes this article on Spark and Flink wordcount examples; we hope it has been helpful!