Spark Defaults for Core Count and Memory

The Big Data in Real World team gives us the defaults:

spark.executor.cores controls the number of cores available to each executor.

spark.executor.memory controls the amount of memory allocated to each executor.

I did helpfully take out the first answer, so you’ll have to click through to the post to see the answers, as well as how cluster mode vs. client mode can change things.
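For reference, both properties can be set explicitly when submitting a job; a minimal sketch with spark-submit, where the values are illustrative choices rather than the defaults the post discusses:

```shell
# Set cores and memory per executor explicitly at submit time.
# --executor-cores maps to spark.executor.cores,
# --executor-memory maps to spark.executor.memory.
# The values 4 and 8g are examples, not Spark's defaults.
spark-submit \
  --executor-cores 4 \
  --executor-memory 8g \
  my_job.py
```

The same properties can also be passed with `--conf spark.executor.cores=4 --conf spark.executor.memory=8g`, or set in `spark-defaults.conf`.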