
[Spark] groupBy / groupByKey

2017-10-23 14:29 · 232 views
groupBy(function)

`function` computes a key for each element; the elements of the RDD are grouped by that key.

val a = sc.parallelize(1 to 9, 3)
a.groupBy(x => { if (x % 2 == 0) "even" else "odd" }).collect // split into two groups
/* Result
Array(
  (even,ArrayBuffer(2, 4, 6, 8)),
  (odd,ArrayBuffer(1, 3, 5, 7, 9))
)
*/
val a = sc.parallelize(1 to 9, 3)
def myfunc(a: Int): Int = {
  a % 2 // split into two groups
}
a.groupBy(myfunc).collect
/* Result (collect returns an Array of pairs, not a Map)
Array(
  (0,ArrayBuffer(2, 4, 6, 8)),
  (1,ArrayBuffer(1, 3, 5, 7, 9))
)
*/
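Scala's collection `groupBy` has the same semantics as the RDD version, so the even/odd example above can be sketched in plain Scala without a SparkContext to see exactly what each group contains:

```scala
// Plain-Scala analogue of the RDD groupBy example above.
// Collections' groupBy returns a Map, while RDD.collect returns an Array of pairs,
// but the per-key groups are the same.
val nums = (1 to 9).toList
val grouped: Map[Int, List[Int]] = nums.groupBy(_ % 2)

println(grouped(0)) // the even numbers: List(2, 4, 6, 8)
println(grouped(1)) // the odd numbers: List(1, 3, 5, 7, 9)
```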

groupByKey()

val a = sc.parallelize(List("dog", "tiger", "lion", "cat", "spider", "eagle"), 2)
val b = a.keyBy(_.length) // attach a key to each value: the length of the string
b.groupByKey.collect
// Result: Array((4,ArrayBuffer(lion)), (6,ArrayBuffer(spider)), (3,ArrayBuffer(dog, cat)), (5,ArrayBuffer(tiger, eagle)))
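`groupBy(f)` behaves like `keyBy(f)` followed by `groupByKey`: first pair each element with its key, then collect values per key. The pipeline above can be sketched with plain Scala collections (no SparkContext needed; the variable names here are illustrative):

```scala
// Plain-Scala sketch of keyBy(_.length) followed by groupByKey, mirroring the example above.
val words = List("dog", "tiger", "lion", "cat", "spider", "eagle")

val keyed = words.map(w => (w.length, w)) // keyBy(_.length): pair each word with its length
val grouped: Map[Int, List[String]] =
  keyed.groupBy(_._1).map { case (k, pairs) => (k, pairs.map(_._2)) } // groupByKey

println(grouped(3)) // List(dog, cat)
println(grouped(5)) // List(tiger, eagle)
```

Note that in Spark, when only a per-key aggregate (a count or a sum) is needed, `reduceByKey` or `aggregateByKey` is usually preferred over `groupByKey`, since they combine values map-side before the shuffle.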


Source: http://blog.csdn.net/guotong1988/article/details/50556871
Tags: spark groupby