Apache Spark – YARN: What is the difference between number of executors and executor-cores in Spark?

apache-spark, emr, hadoop-yarn

I am learning Spark on AWS EMR. In the process I am trying to understand the difference between the number of executors (--num-executors) and executor cores (--executor-cores). Can anyone please explain the difference?

Also, when I try to submit the following job, I get an error:

spark-submit --deploy-mode cluster --master yarn --num-executors 1 --executor-cores 5   --executor-memory 1g -–conf spark.yarn.submit.waitAppCompletion=false wordcount.py s3://test/spark-example/input/input.txt s3://test/spark-example/output21

Error: Unrecognized option: -–conf

Best Answer

Number of executors (--num-executors) is the number of distinct YARN containers (think processes/JVMs) that will execute your application.

Number of executor cores (--executor-cores) is the number of threads you get inside each executor (container).

So the parallelism (the number of tasks running concurrently) of your Spark application is #executors × #executor-cores. If you have 10 executors and 5 executor-cores, you will have (hopefully) 50 tasks running at the same time.
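As a rough sketch, reusing the paths and script from your question but bumping the executor count, this submit asks YARN for 10 containers with 5 task threads each, i.e. up to 50 concurrent tasks:

spark-submit --deploy-mode cluster --master yarn --num-executors 10 --executor-cores 5 --executor-memory 1g wordcount.py s3://test/spark-example/input/input.txt s3://test/spark-example/output21

You can sanity-check the parallelism the application actually got from inside the job, for example in PySpark:

from pyspark.sql import SparkSession

# minimal sketch: create (or reuse) the session and inspect the default parallelism
spark = SparkSession.builder.appName("parallelism-check").getOrCreate()
sc = spark.sparkContext

# on YARN this is typically num-executors * executor-cores (with a minimum of 2)
print(sc.defaultParallelism)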

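As for the submit error: "Unrecognized option: -–conf" shows that the flag was typed with a hyphen followed by an en-dash (–), which often happens when a command is copied from a formatted document. spark-submit only recognizes the plain double-hyphen form --conf, so retyping the flag with two ASCII hyphens fixes it:

spark-submit --deploy-mode cluster --master yarn --num-executors 1 --executor-cores 5 --executor-memory 1g --conf spark.yarn.submit.waitAppCompletion=false wordcount.py s3://test/spark-example/input/input.txt s3://test/spark-example/output21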