Sandeep Pawar wants to make a change:
Spark properties are divided into mutable and immutable configurations based on whether they can be safely modified at runtime after the Spark session is created.
Mutable properties can be changed dynamically using spark.conf.set() without requiring a restart of the Spark application; these typically include performance tuning parameters such as shuffle partitions, broadcast thresholds, and AQE settings. Immutable properties, on the other hand, are global configurations that affect core Spark behavior and cluster setup, and they must be set at or before session initialization because they require a fresh session to take effect.
Read on to see how you can tell which is which.
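As a quick illustration (a minimal PySpark sketch, not necessarily the approach in the linked post, and assuming an environment where you can create a SparkSession), spark.conf.isModifiable() reports whether a given key can be changed on a live session:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("conf-demo").getOrCreate()

# Mutable property: safe to change at runtime via spark.conf.set()
spark.conf.set("spark.sql.shuffle.partitions", "64")
print(spark.conf.get("spark.sql.shuffle.partitions"))            # 64

# isModifiable() tells you whether a key can be changed at runtime
print(spark.conf.isModifiable("spark.sql.shuffle.partitions"))   # True
print(spark.conf.isModifiable("spark.executor.memory"))          # False: fixed at launch
```

Attempting to set a non-modifiable property on a running session raises an error rather than silently taking effect, so the check above is a handy guard before tuning configs dynamically.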