
Problems encountered in Spark programming

2016-03-14 15:56
Problem 1: 16/03/14 15:22:24 WARN yarn.YarnAllocator: Container killed by YARN for exceeding memory limits. 17.0 GB of 17 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
Solution: raise the executor's off-heap memory overhead when submitting the job:
--conf spark.yarn.executor.memoryOverhead=4096 \
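For context, YARN sizes each container as `spark.executor.memory` plus this overhead, so when off-heap usage (e.g. shuffle buffers) grows, the overhead is what must be raised. A hypothetical full submit command might look like the following; the class name, jar, and heap size are illustrative placeholders, and only the memoryOverhead flag comes from the post:

```shell
# Illustrative spark-submit invocation (Spark 1.x-era flag name from the post).
# --executor-memory, the class, and the jar are placeholders.
spark-submit \
  --master yarn \
  --executor-memory 16g \
  --conf spark.yarn.executor.memoryOverhead=4096 \
  --class com.example.MyJob \
  myjob.jar
```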

Problem 2:
Generating all C(n,2) pairwise combinations of a 30,000-element list into an Array or List either runs out of memory or makes GC time so long that the job fails.
Solution:

// Materialize one copy of the values once, and stream the other side
// lazily, so the full C(n,2) pair set is never held in memory at once.
val i1 = x._2.iterator
val i2 = x._2.iterator.toArray
i1.flatMap(y =>
  i2.map(z => ((y, z), 1))
    .filter(z => z._1._1 > z._1._2) // keep each unordered pair once (y > z)
)
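The same streaming-pairs pattern can be sketched as a standalone function; `pairCounts` and the sample data below are hypothetical names for illustration, not part of the original job:

```scala
object PairwiseSketch {
  // Sketch of the pattern above: stream one side of the cross product
  // through an Iterator so only one pair is built at a time, instead of
  // materializing all C(n,2) combinations in a collection.
  def pairCounts(values: Seq[Int]): Iterator[((Int, Int), Int)] = {
    val arr = values.toArray            // materialize one copy once
    values.iterator.flatMap { y =>      // lazily walk the other copy
      arr.iterator
        .map(z => ((y, z), 1))
        .filter { case ((a, b), _) => a > b } // each unordered pair once
    }
  }

  def main(args: Array[String]): Unit = {
    // 3 elements yield C(3,2) = 3 pairs
    println(pairCounts(Seq(3, 1, 2)).toList)
  }
}
```

Because the outer side is an `Iterator`, downstream Spark operators can consume the pairs one at a time, which is what avoids the OOM and GC pressure described above.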