
Spark 2.0 "Caused by: java.net.URISyntaxException: Relative path in absolute URI: file" error

2016-09-21 14:17
While using Spark 2.0, I ran into the following error:

16/09/21 14:12:22 INFO SharedState: Warehouse path is 'file:E:\scalacode_v2\Spark2Pro/spark-warehouse'.

Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: file:E:/scalacode_v2/Spark2Pro/spark-warehouse

    at org.apache.hadoop.fs.Path.initialize(Path.java:206)

    at org.apache.hadoop.fs.Path.<init>(Path.java:172)

    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.makeQualifiedPath(SessionCatalog.scala:114)

    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createDatabase(SessionCatalog.scala:145)

    at org.apache.spark.sql.catalyst.catalog.SessionCatalog.<init>(SessionCatalog.scala:89)

    at org.apache.spark.sql.internal.SessionState.catalog$lzycompute(SessionState.scala:95)

    at org.apache.spark.sql.internal.SessionState.catalog(SessionState.scala:95)

    at org.apache.spark.sql.internal.SessionState$$anon$1.<init>(SessionState.scala:112)

    at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:112)

    at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:111)

    at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)

    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:161)

    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:167)

    at org.apache.spark.sql.Dataset$.apply(Dataset.scala:59)

    at org.apache.spark.sql.SparkSession.createDataset(SparkSession.scala:441)

    at org.apache.spark.sql.SQLContext.createDataset(SQLContext.scala:395)

    at org.apache.spark.sql.SQLImplicits.rddToDatasetHolder(SQLImplicits.scala:163)

    at com.yisa.test.Test$.main(Test.scala:24)

    at com.yisa.test.Test.main(Test.scala)

Caused by: java.net.URISyntaxException: Relative path in absolute URI: file:E:/scalacode_v2/Spark2Pro/spark-warehouse

    at java.net.URI.checkPath(Unknown Source)

    at java.net.URI.<init>(Unknown Source)

    at org.apache.hadoop.fs.Path.initialize(Path.java:203)

    ... 18 more

After some googling, I found the following explanation:

The default value of `spark.sql.warehouse.dir` is `System.getProperty("user.dir")/spark-warehouse`. Since `System.getProperty("user.dir")` is a local dir, we should explicitly set the scheme to local filesystem.


In other words, we need to set the spark.sql.warehouse.dir configuration ourselves. If it is not set, Spark defaults to a spark-warehouse directory under System.getProperty("user.dir"), without an explicit filesystem scheme. On Windows the working directory begins with a drive letter (E:\...), so the resulting URI is file:E:/... — a scheme followed by a path that does not start with "/", which java.net.URI rejects as a "relative path in an absolute URI". Hence the exception.
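The failure can be reproduced with java.net.URI alone, without Spark. A minimal sketch (the E:/spark-warehouse path is just an illustrative value):

```scala
import java.net.{URI, URISyntaxException}

object UriDemo extends App {
  // A scheme ("file") combined with a path that does not start with "/"
  // is exactly the "Relative path in absolute URI" case.
  val bad =
    try { new URI("file", null, "E:/spark-warehouse", null); "no error" }
    catch { case e: URISyntaxException => e.getReason }

  // Adding the leading slash makes the path absolute and the URI valid,
  // which is what the explicit file:/// prefix achieves.
  val good = new URI("file", null, "/E:/spark-warehouse", null)

  println(bad)            // Relative path in absolute URI
  println(good.getScheme) // file
}
```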

So we need to add the configuration explicitly, with a proper file:/// URI:

    import org.apache.spark.sql.SparkSession

    val sparkSession = SparkSession.builder()
      .master("local[2]")
      .appName("example")
      // An explicit file:/// URI that Hadoop's Path can parse on Windows
      .config("spark.sql.warehouse.dir", "file:///e:/tmp/spark-warehouse")
      .getOrCreate()

After adding this configuration, the program runs successfully.
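Instead of hand-writing the scheme and slashes, one option (a sketch, not from the original post) is to derive the URI from java.nio.file.Paths, which yields a well-formed file: URI on both Windows and Unix:

```scala
import java.nio.file.Paths

// Paths.get(...).toUri always produces a syntactically valid file: URI,
// including the required slashes, regardless of platform.
val warehouseUri = Paths.get("e:/tmp/spark-warehouse").toUri.toString

// Then pass it to the builder:
//   .config("spark.sql.warehouse.dir", warehouseUri)
```

This avoids the class of mistakes where the scheme is present but the path portion is relative.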
Tags: spark