Spark on YARN: Wrong VCore Count Shown After Job Submission
The YARN web UI on port 8088 originally showed a VCores Total of 120. Today, after a Spark job was submitted to YARN, the VCores figure reported for the job was only 26.
The submit command:
./bin/spark-submit \
  --class com.jyn.secondtime.Seconds \
  --master yarn \
  --deploy-mode cluster \
  --driver-memory 4g \
  --num-executors 25 \
  --executor-memory 6g \
  --executor-cores 4 \
  hdfs://bi/home/dinpay/spark-normal-task-v1.0.0.jar
In theory, vcores used = executor-cores * num-executors + 1 = 4 * 25 + 1 = 101 (the extra 1 is the ApplicationMaster container, which also hosts the driver in cluster mode).
In practice, however, the YARN monitoring UI will most likely report only 26 vcores in use, as if --executor-cores had no effect. A way to double-check what YARN actually allocated is sketched below.
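To see what the ResourceManager actually allocated, rather than relying on the UI alone, you can query it from the command line. The host name, application id, and the JSON field names below (allocatedVCores, allocatedMB, runningContainers, *VirtualCores) follow the standard ResourceManager REST API but may vary slightly between Hadoop versions, so treat this as a sketch:

# List running applications to get the application id
yarn application -list -appStates RUNNING

# Per-application allocation (replace rm-host and application_xxx with real values)
curl -s http://rm-host:8088/ws/v1/cluster/apps/application_xxx \
  | python -m json.tool | grep -E 'allocatedVCores|allocatedMB|runningContainers'

# Cluster-wide totals: total vs. allocated vcores
curl -s http://rm-host:8088/ws/v1/cluster/metrics \
  | python -m json.tool | grep -iE 'virtualcores'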
Solution:
This is not actually a Spark problem but a characteristic of the YARN scheduler. All that is needed is to change the yarn.scheduler.capacity.resource-calculator setting in capacity-scheduler.xml.
Original configuration:

<property>
  <name>yarn.scheduler.capacity.resource-calculator</name>
  <value>org.apache.hadoop.yarn.util.resource.DefaultResourceCalculator</value>
</property>

Change it to:

<property>
  <name>yarn.scheduler.capacity.resource-calculator</name>
  <value>org.apache.hadoop.yarn.util.resource.DominantResourceCalculator</value>
</property>
Then restart YARN (at minimum the ResourceManager) so the scheduler picks up the new calculator. A lighter-weight option to try first is sketched below.
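If a full restart is inconvenient, the Capacity Scheduler can reload capacity-scheduler.xml on the fly; whether the resource-calculator change is actually picked up this way depends on the Hadoop version, so this is only a first attempt before restarting:

# Ask the ResourceManager to re-read capacity-scheduler.xml
yarn rmadmin -refreshQueues

# If the UI still shows 1 vcore per container afterwards, restart the ResourceManager.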
Cause analysis:
The Hadoop documentation describes the yarn.scheduler.capacity.resource-calculator setting as follows:
The ResourceCalculator implementation to be used to compare Resources in the scheduler. The default i.e. org.apache.hadoop.yarn.util.resource.DefaultResourceCalculator only uses Memory while DominantResourceCalculator uses Dominant-resource to compare multi-dimensional resources such as Memory, CPU etc. A Java ResourceCalculator class name is expected.
Roughly speaking: the default calculator considers only memory when sizing and counting containers, while DominantResourceCalculator considers memory, CPU cores, and any other resource dimensions. With DefaultResourceCalculator, every container is therefore recorded as using a single vcore regardless of what --executor-cores requests, which is exactly where the 26 comes from: 25 executor containers plus 1 ApplicationMaster container. After switching to DominantResourceCalculator, the same job shows the expected 101 vcores.