scala wordcount
2017-04-26 14:13
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD.rddToOrderedRDDFunctions
import org.apache.spark.rdd.RDD.rddToPairRDDFunctions

object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount").setMaster("local")
    val sc = new SparkContext(conf)
    // Read the input file as an RDD of lines
    val lines = sc.textFile("D:\\BigData\\mockData\\mockData.txt")
    // Split each line into words
    val words = lines.flatMap { _.split(" ") }
    // Pair each word with an initial count of 1
    val pairs = words.map { (_, 1) }
    // Sum the counts for each word
    val results = pairs.reduceByKey(_ + _)
    // Alternative: sort by count (ascending) instead of by word:
    // val sorted = results.sortBy(pair => pair._2, true)
    // sortByKey(false) sorts by the word itself, in descending order
    val sorted = results.sortByKey(false)
    sorted.foreach(println(_))
    sc.stop()
  }
}
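For comparison, the same count-then-sort logic can be sketched with plain Scala collections, with no Spark dependency. The in-memory `lines` sequence here is a hypothetical stand-in for the contents of the mock data file, and this version sorts by count descending, mirroring the commented-out `sortBy` variant:

```scala
object LocalWordCount {
  def main(args: Array[String]): Unit = {
    // Hypothetical in-memory input, standing in for mockData.txt
    val lines = Seq("spark scala spark", "scala wordcount")
    val counts = lines
      .flatMap(_.split(" "))                  // split each line into words
      .groupBy(identity)                      // group identical words together
      .map { case (w, ws) => (w, ws.size) }   // count the occurrences in each group
    // Sort by count, descending
    val sorted = counts.toSeq.sortBy(-_._2)
    sorted.foreach(println)
  }
}
```

`groupBy` + `map` plays the role of `reduceByKey` on a single machine; the Spark version is what you want once the input no longer fits in memory.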