Exception after reading and writing HDFS from Spark
2016-05-27 19:07
```
org.apache.spark.scheduler.LiveListenerBus {Logging.scala:95} - Listener EventLoggingListener threw an exception
java.lang.reflect.InvocationTargetException
	at sun.reflect.GeneratedMethodAccessor65.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:601)
	at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$3.apply(EventLoggingListener.scala:150)
	at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$3.apply(EventLoggingListener.scala:150)
	at scala.Option.foreach(Option.scala:236)
	at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:150)
	at org.apache.spark.scheduler.EventLoggingListener.onStageCompleted(EventLoggingListener.scala:170)
	at org.apache.spark.scheduler.SparkListenerBus$class.onPostEvent(SparkListenerBus.scala:32)
	at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
	at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
	at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:55)
	at org.apache.spark.util.AsynchronousListenerBus.postToAll(AsynchronousListenerBus.scala:37)
	at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(AsynchronousListenerBus.scala:80)
	at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(AsynchronousListenerBus.scala:65)
	at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(AsynchronousListenerBus.scala:65)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
	at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:64)
	at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1180)
	at org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:63)
Caused by: java.io.IOException: Filesystem closed
	at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:629)
	at org.apache.hadoop.hdfs.DFSOutputStream.flushOrSync(DFSOutputStream.java:1629)
	at org.apache.hadoop.hdfs.DFSOutputStream.hflush(DFSOutputStream.java:1590)
	at org.apache.hadoop.fs.FSDataOutputStream.hflush(FSDataOutputStream.java:128)
```
The HDFS reads and writes were done through:

```java
FileSystem fs = ConfigurationContext.getFileSystem();
```

After finishing with the HDFS files, the code called `fs.close()`. However, Spark was also configured to write its job event log to HDFS. The event logger and my code were sharing the same `FileSystem` instance, so once my code closed it, every subsequent flush of the event log failed with the exception above.
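The sharing happens because `FileSystem.get()` returns a JVM-wide cached instance keyed by URI and user, and closing it closes it for every caller, including Spark's `EventLoggingListener`. A minimal sketch of the fix is to obtain a private instance with `FileSystem.newInstance()` instead, which bypasses the cache and can be closed safely (the `hdfs://namenode:8020` URI and the path below are placeholders, and this assumes your `ConfigurationContext` can be swapped for a plain `Configuration`):

```java
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsReadWithoutBreakingSpark {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // FileSystem.get(uri, conf) would return the JVM-wide cached instance,
        // the same one Spark's EventLoggingListener holds. Closing that copy
        // closes it for Spark too, producing "java.io.IOException: Filesystem closed".
        // FileSystem.newInstance(uri, conf) bypasses the cache, so this instance
        // is private to our code. (URI is a placeholder for your cluster.)
        FileSystem fs = FileSystem.newInstance(URI.create("hdfs://namenode:8020"), conf);
        try {
            // ... application reads/writes, e.g.:
            fs.exists(new Path("/tmp/example"));
        } finally {
            fs.close(); // safe: only closes our private instance
        }
    }
}
```

An even simpler option is to not call `fs.close()` at all and let the JVM shutdown hook clean up the cached instance; alternatively, setting `fs.hdfs.impl.disable.cache=true` in the `Configuration` disables the cache entirely so every `FileSystem.get()` returns a fresh instance.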