Task not serializable exception while running apache spark job
2015-09-29 09:59
How to fix the "Task not serializable" error in Spark: org.apache.spark.SparkException: Task not serializable
The "Task not serializable" error usually occurs because a function passed to map, filter, etc. references an external variable that cannot be serialized. In particular, referencing a member method or field of a class (often the enclosing class) forces that whole class, with all of its members, to be serializable. The most common ways to fix it are:
- If possible, define the dependent variable inside the function passed to map, filter, etc., so that a class that does not support serialization can still be used.
- If possible, move the dependent variables into a small standalone class and make that class serializable; this also reduces the amount of data shipped over the network.
- If possible, mark the non-serializable members of the referenced class with the transient keyword, telling the compiler that they do not need to be serialized.
- Make the referenced class serializable.
I have not tried the following two:
- Make the NotSerializable object static and create it once per machine.
- Call rdd.foreachPartition and create the NotSerializable object there.
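The failure and the first fix above can be sketched with plain java.io serialization, with no Spark cluster needed, since Spark's "Task not serializable" is just a java.io.NotSerializableException raised while serializing the closure. All class and method names here (NotSerializable, SerFunction, doSomething) are illustrative, not from any real API:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.NotSerializableException;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class ClosureSerializationDemo {
    // Stand-in for a class that does not implement Serializable.
    static class NotSerializable {
        String doSomething(String s) { return s.toUpperCase(); }
    }

    // Spark's function interfaces extend Serializable;
    // java.util.function does not, so we mimic that here.
    interface SerFunction<T, R> extends Serializable {
        R apply(T t);
    }

    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        NotSerializable outer = new NotSerializable();

        // This lambda captures `outer`, so serializing the closure drags
        // the non-serializable object in -- the same situation Spark
        // reports as "Task not serializable".
        SerFunction<String, String> bad = s -> outer.doSomething(s);
        try {
            serialize(bad);
        } catch (NotSerializableException e) {
            System.out.println("bad closure failed: " + e);
        }

        // Fix 1: create the object inside the function body; nothing
        // non-serializable is captured, so serialization succeeds.
        SerFunction<String, String> good = s -> new NotSerializable().doSomething(s);
        System.out.println("good closure: " + serialize(good).length + " bytes");
    }
}
```

The same principle explains the other fixes: everything reachable from the captured object graph must be serializable, so you can either shrink that graph (a small serializable class) or cut pieces out of it (transient fields).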
==================
ref[1]:<http://databricks.gitbooks.io/databricks-spark-knowledge-base/content/troubleshooting/javaionotserializableexception.html>
If you see this error:
org.apache.spark.SparkException: Job aborted due to stage failure: Task not serializable: java.io.NotSerializableException: ...
The above error can be triggered when you initialize a variable on the driver (master), but then try to use it on one of the workers. In that case, Spark will try to serialize the object to send it over to the worker, and will fail if the object is not serializable. Consider the following code snippet:
NotSerializable notSerializable = new NotSerializable();
JavaRDD<String> rdd = sc.textFile("/tmp/myfile");
rdd.map(s -> notSerializable.doSomething(s)).collect();
This will trigger that error. Here are some ideas to fix this error:
Make the class serializable.
Declare the instance only within the lambda function passed in map.
Make the NotSerializable object static and create it once per machine.
Call rdd.foreachPartition and create the NotSerializable object in there like this:
rdd.foreachPartition(iter -> {
    NotSerializable notSerializable = new NotSerializable();
    // ...now process iter
});
Pasted from: <http://databricks.gitbooks.io/databricks-spark-knowledge-base/content/troubleshooting/javaionotserializableexception.html>
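The remaining idea, making the object static so it is created once per machine, can be sketched as a lazily initialized static holder: the static field is never captured by the closure or shipped to the workers, and each executor JVM builds its own copy on first use. The names HelperHolder, Helper, and doSomething are illustrative assumptions, not part of any Spark API:

```java
public class HelperHolder {
    // Stand-in for a non-serializable resource, e.g. a connection or parser.
    static class Helper {
        String doSomething(String s) { return s.trim(); }
    }

    // Static field: never serialized, so there is one instance per JVM,
    // i.e. one per worker machine.
    private static Helper instance;

    static synchronized Helper get() {
        if (instance == null) {
            instance = new Helper(); // created lazily on each machine
        }
        return instance;
    }
}
```

Spark code would then reference the helper only through the class, e.g. rdd.map(s -> HelperHolder.get().doSomething(s)); the lambda captures nothing, so serializing the task succeeds.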
In addition, the Stack Overflow answer at http://stackoverflow.com/questions/25914057/task-not-serializable-exception-while-running-apache-spark-job explains this clearly and concisely.
From WizNote