
How to Fix Spark insertIntoJDBC Not Finding the MySQL Driver

2016-06-27 19:33
The following describes a fix; note, though, that when I ran the job directly from Eclipse I still got this error:

java.sql.SQLException: No suitable driver found for jdbc:mysql://ip:3306/xx
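As background, this message comes straight from `java.sql.DriverManager`: if no registered JDBC driver accepts the URL, `getConnection` throws exactly this `SQLException`. A minimal, Spark-free sketch of the same failure (hypothetical class name, same placeholder URL as above):

```java
import java.sql.DriverManager;
import java.sql.SQLException;

public class NoDriverDemo {
    // Attempt a JDBC connection and return the resulting error message.
    // With no MySQL driver jar on the classpath, DriverManager cannot
    // match the jdbc:mysql: URL to any registered driver.
    static String tryConnect(String url) {
        try {
            DriverManager.getConnection(url);
            return "connected";
        } catch (SQLException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        // Prints the same "No suitable driver found for ..." message as above.
        System.out.println(tryConnect("jdbc:mysql://ip:3306/xx"));
    }
}
```

So the exception simply means the MySQL driver class was not visible on the classpath of the JVM that opened the connection.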

  This was puzzling: I had added the MySQL driver when launching the job, so why the exception? After some digging it turned out that adding the MySQL jar via the --jars option is of no use here. Further searching showed that the driver's classpath can be set for a submitted job with the --driver-class-path option; I tried it and, sure enough, the error was gone:

[itelbog@iteblog ~]$ bin/spark-submit --master local[2] \
  --driver-class-path lib/mysql-connector-java-5.1.35.jar \
  --class spark.SparkToJDBC ./spark-test_2.10-1.0.jar
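The --driver-class-path flag is shorthand for the spark.driver.extraClassPath property (the same property named in the exception further below), so presumably the jar can also be configured once in conf/spark-defaults.conf rather than on every submit (path shown is an assumption; use your own):

```
spark.driver.extraClassPath  /iteblog/com/mysql-connector-java-5.1.35.jar
```

This is a sketch of the equivalent configuration, not something the original post tested.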


  Alternatively, the driver's classpath can be set through the SPARK_CLASSPATH environment variable in conf/spark-env.sh under the Spark installation directory, like this:

export SPARK_CLASSPATH=$SPARK_CLASSPATH:/iteblog/com/mysql-connector-java-5.1.35.jar


  This also resolves the exception above. However, we must not configure SPARK_CLASSPATH in conf/spark-env.sh and pass --driver-class-path at submit time at the same time; otherwise the job fails with the following exception:

[itelbog@iteblog ~]$ bin/spark-submit --master local[2] \
  --driver-class-path lib/mysql-connector-java-5.1.35.jar \
  --class spark.SparkToJDBC ./spark-test_2.10-1.0.jar
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Exception in thread "main" org.apache.spark.SparkException:
Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$7.apply(SparkConf.scala:339)
at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$7.apply(SparkConf.scala:337)
at scala.collection.immutable.List.foreach(List.scala:318)
at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:337)
at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:325)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:325)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:197)
at spark.SparkToJDBC$.main(SparkToJDBC.scala:41)
at spark.SparkToJDBC.main(SparkToJDBC.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)


Unless otherwise stated, all articles on this blog are original.

Please respect the original work; when reposting, credit 过往记忆 (http://www.iteblog.com/).

Permalink: "How to Fix Spark insertIntoJDBC Not Finding the MySQL Driver" (http://www.iteblog.com/archives/1300)
Tags: spark, mysql, jdbc