Spark standalone HA and spark-submit
2016-09-19 15:45
405 views
When you build a Spark standalone HA cluster and submit your app, it must reach the active (leader) master, not the standby one. So how do you decide which of the two masters is currently active?
A simple way that avoids checking which master is active at all: list both masters in the master URL by adding

spark.master spark://master1:port,master2:port

to $SPARK_HOME/conf/spark-defaults.conf. The driver then tries each listed master and registers with whichever one is the current leader.
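The same multi-master URL can be passed directly on the spark-submit command line instead of via spark-defaults.conf. A sketch, where the hosts, port 7077, class name, and jar path are all placeholders:

```shell
# Both masters in one URL: the driver contacts each and registers with
# whichever is currently the leader, so you never need to know which
# master is active when submitting.
spark-submit \
  --master spark://master1:7077,master2:7077 \
  --class com.example.MyApp \
  /path/to/my-app.jar
```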
Otherwise, you can curl the web UI of each Spark master and check its status, for example:

curl http://sparkmaster:web-ui-port | grep STANDBY
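The curl check can be scripted to pick out the active master automatically. A minimal sketch, assuming the standalone master web UI serves a /json endpoint whose response includes a "status" field of ALIVE or STANDBY (the host:port arguments are placeholders):

```shell
# extract_status: pull the master's "status" value out of the /json response
# read from stdin (expected values: ALIVE or STANDBY).
extract_status() {
  grep -o '"status" *: *"[A-Z]*"' | head -n1 | tr -d '"' | grep -o '[A-Z]*$'
}

# find_active_master: given master web-UI addresses as host:port arguments,
# print the first one that reports ALIVE; fail if none does.
find_active_master() {
  for m in "$@"; do
    if curl -sf "http://$m/json" | extract_status | grep -qx ALIVE; then
      echo "$m"
      return 0
    fi
  done
  echo "no ALIVE master found among: $*" >&2
  return 1
}

# Usage (hypothetical hosts):
#   find_active_master master1:8080 master2:8080
```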