Error starting SparkContext: ValueError: Cannot run multiple SparkContexts at once; existing SparkContext
2017-02-13 09:53
# Starting SparkContext
When starting a SparkContext, Python raises ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(...).
# The most basic way to create a SparkContext takes just two parameters:
# - Cluster URL: tells Spark how to connect to a cluster; "local" runs Spark single-threaded on one machine.
# - Application name: here "My App"; when connected to a cluster, this name identifies your application in the cluster manager's UI.
>>> from pyspark import SparkConf, SparkContext
>>> conf = SparkConf().setMaster("local").setAppName("My App")
>>> sc = SparkContext(conf=conf)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/xl/spark/spark-2.1.0-bin-hadoop2.7/python/pyspark/context.py", line 115, in __init__
SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
File "/home/xl/spark/spark-2.1.0-bin-hadoop2.7/python/pyspark/context.py", line 272, in _ensure_initialized
callsite.function, callsite.file, callsite.linenum))
ValueError: Cannot run multiple SparkContexts at once; existing SparkContext(app=PySparkShell, master=local[*]) created by <module> at /home/xl/spark/spark-2.1.0-bin-hadoop2.7/python/pyspark/shell.py:43
This error occurs because a SparkContext is already running (as the traceback shows, the PySpark shell creates one named PySparkShell at startup), so you must stop the existing context before creating a new one.
>>> sc.stop()  # stop the existing SparkContext
>>> sc = SparkContext(conf = conf)
>>> conf
<pyspark.conf.SparkConf object at 0x7fedfc83bf10>
>>> sc
<pyspark.context.SparkContext object at 0x7fedfc943210>
>>>
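The error comes from a guard inside SparkContext's `_ensure_initialized`, which is visible in the traceback above: PySpark keeps a module-level record of the active context and refuses to construct a second one while it is live. A minimal pure-Python sketch of that singleton pattern (the class and names here are illustrative, not the real pyspark internals):

```python
class Context:
    _active = None  # module-level record of the live context

    def __init__(self):
        # Refuse to construct a second live context, mirroring the
        # check SparkContext._ensure_initialized performs.
        if Context._active is not None:
            raise ValueError("Cannot run multiple contexts at once")
        Context._active = self

    def stop(self):
        # Clearing the record allows a new context to be created.
        Context._active = None

sc = Context()
try:
    Context()          # second create fails while sc is active
except ValueError as e:
    print(e)           # prints "Cannot run multiple contexts at once"
sc.stop()
sc2 = Context()        # succeeds after stop()
```

In real PySpark, `SparkContext.getOrCreate(conf)` offers the same safety from the other direction: it returns the already-active context instead of raising, so it is a convenient alternative to the stop-then-recreate sequence shown above.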