Spark countByKey Usage Explained
2016-05-15 15:37
countByKey counts the number of values associated with each key. Note that it is only defined on pair RDDs (JavaPairRDD, i.e. RDDs of key-value tuples). Full code below:
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

private static void myCountByKey() {
    SparkConf conf = new SparkConf().setMaster("local").setAppName("myCountByKey");
    JavaSparkContext sc = new JavaSparkContext(conf);
    // Build a pair RDD of (class, student) tuples.
    List<Tuple2<String, String>> studentList = Arrays.asList(
            new Tuple2<String, String>("c1", "cai"),
            new Tuple2<String, String>("c2", "niao"),
            new Tuple2<String, String>("c1", "feng"),
            new Tuple2<String, String>("c2", "jin"),
            new Tuple2<String, String>("c2", "niao"));
    JavaPairRDD<String, String> studentRdd = sc.parallelizePairs(studentList);
    // countByKey is an action: the per-key counts come back to the driver.
    // (Newer Spark versions type the result as Map<String, Long>.)
    Map<String, Object> studentCounts = studentRdd.countByKey();
    for (Map.Entry<String, Object> entry : studentCounts.entrySet()) {
        System.out.println("key:" + entry.getKey() + ",values:" + entry.getValue());
    }
    sc.close();
}
Output:
key:c2,values:3
key:c1,values:2
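
One practical note: because countByKey is an action, the entire key-to-count map is materialized in driver memory. When the number of distinct keys is very large, the usual pattern is to stay distributed with reduceByKey instead. Below is a minimal sketch of that alternative, reusing the studentRdd built above (the variable name counts is just for illustration):

// Map each (key, value) pair to (key, 1), then sum the ones per key.
// Unlike countByKey, the result stays distributed as an RDD until collected.
JavaPairRDD<String, Integer> counts = studentRdd
        .mapToPair(t -> new Tuple2<>(t._1(), 1))
        .reduceByKey((a, b) -> a + b);
for (Tuple2<String, Integer> t : counts.collect()) {
    System.out.println("key:" + t._1() + ",count:" + t._2());
}

With the sample data this prints the same counts (c1 -> 2, c2 -> 3), but the aggregation happens on the executors, and only as much of the result as you collect ever reaches the driver.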