
Handling the exception java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)

2016-02-19 17:58
Continuing from the previous post: http://blog.csdn.net/lzlchangqi/article/details/50631341

With the environment set up, I started writing queries. The function below reads two columns (convert_type, clicks) from a Hive table and returns them as a key-value map, but running it threw the exception in the title:

import scala.collection.mutable.HashMap

// hiveContext is an org.apache.spark.sql.hive.HiveContext created elsewhere
def get_repage_clicks(data_day_str: String): HashMap[String, Long] = {
  val sql_repage_clicks =
    s"select convert_type, clicks from app.app_cps_repageall_total_clicks where dt = '$data_day_str'"
  val ret = new HashMap[String, Long]()
  hiveContext.sql(sql_repage_clicks).collect().foreach(convertclick =>
    ret += (convertclick.getString(0) -> convertclick.getLong(1)))
  ret
}


The full exception:

16/02/19 17:03:57 INFO DAGScheduler: ResultStage 0 (collect at SparkPlan.scala:94) finished in 9.795 s
16/02/19 17:03:57 INFO YarnScheduler: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/02/19 17:03:57 INFO DAGScheduler: Job 0 finished: collect at SparkPlan.scala:94, took 10.060897 s
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;
at com.jd.jd_ad.report.auto.week.cost.CPSCal$$anonfun$main$1.apply(CPSCal.scala:24)
at com.jd.jd_ad.report.auto.week.cost.CPSCal$$anonfun$main$1.apply(CPSCal.scala:24)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:108)
at com.jd.jd_ad.report.auto.week.cost.CPSCal$.main(CPSCal.scala:24)
at com.jd.jd_ad.report.auto.week.cost.CPSCal.main(CPSCal.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:619)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The exception is actually thrown by the `->` in `ret += (convertclick.getString(0) -> convertclick.getLong(1))`: `a -> b` desugars to `Predef.ArrowAssoc(a).->(b)`, and the bytecode signature of `ArrowAssoc` differs between Scala 2.10 and 2.11 (it became a value class in 2.11), so a jar compiled against 2.11 fails at runtime on a Scala 2.10 Spark with NoSuchMethodError.
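For reference, here is a minimal plain-Scala sketch (no Spark needed) of what the failing line compiles down to. `a -> b` is sugar for `Predef.ArrowAssoc(a).->(b)`, which is why the error names `ArrowAssoc` even though no such call appears in the source:

```scala
import scala.collection.mutable.HashMap

object ArrowAssocDemo {
  def main(args: Array[String]): Unit = {
    val ret = new HashMap[String, Long]()

    // `"a" -> 1L` compiles to Predef.ArrowAssoc("a").->(1L); the method's
    // bytecode signature changed between Scala 2.10 and 2.11, so mixing
    // the two versions produces exactly the NoSuchMethodError above.
    ret += ("a" -> 1L)

    // Writing the pair as a plain tuple avoids ArrowAssoc entirely:
    ret += (("b", 2L))

    println(ret("a") + ret("b"))
  }
}
```

Either form builds the same `(String, Long)` entry; the tuple form is just a way to sidestep `ArrowAssoc` if you cannot change the toolchain.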

A search turned up https://issues.apache.org/jira/browse/SPARK-5483, which confirmed a Scala/Spark version mismatch: the cluster runs Spark 1.4, which is built against Scala 2.10, so I changed every 2.11 in build.sbt to 2.10 and the problem went away:
import AssemblyKeys._
assemblySettings

name := "scalaProjectTest"

version := "1.0"

scalaVersion := "2.10.5"

EclipseKeys.createSrc := EclipseCreateSrc.Default + EclipseCreateSrc.Resource

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.10" % "1.6.0" % "provided",
  "org.apache.spark" % "spark-sql_2.10" % "1.6.0" % "provided",
  "org.apache.spark" % "spark-hive_2.10" % "1.6.0" % "provided"
)
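One further note on the fix: hard-coding the `_2.10` suffix works, but sbt's `%%` operator derives the suffix from `scalaVersion` automatically, so the two can never drift apart again. A sketch of the same dependency list in that style (same versions as above):

```scala
// build.sbt fragment: %% appends the Scala binary suffix (_2.10 here,
// taken from scalaVersion := "2.10.5") to the artifact name for you.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.6.0" % "provided",
  "org.apache.spark" %% "spark-hive" % "1.6.0" % "provided"
)
```

With `%%`, bumping `scalaVersion` later updates all three artifact names in one place.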