Setting spark.memory.storageFraction has no effect. It doesn't even crash with a nonsense value

I am trying to change the Spark configuration property "spark.memory.storageFraction". I have tried to do this in several ways (each sketched after the list):

  • As a parameter to my spark-submit command
  • Saved in a config file that I attach to my spark-submit
  • In the Scala code via .set("spark.memory.storageFraction", "0.1")
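
For reference, a minimal sketch of the Scala variant (the app name is hypothetical); on the command line the equivalent is --conf spark.memory.storageFraction=0.1, and a properties file passed via --properties-file takes one "key value" pair per line:

    import org.apache.spark.{SparkConf, SparkContext}

    // Set the property programmatically, before the SparkContext is created,
    // since memory settings are only read when the executors start.
    val conf = new SparkConf()
      .setAppName("StorageFractionTest") // hypothetical app name
      .set("spark.memory.storageFraction", "0.1")
    val sc = new SparkContext(conf)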

When I check the Spark application UI under "Spark Properties", it clearly shows that the property is set, but it has no effect on the storage memory shown in the "Executors" section of the UI.

Even if I set a nonsense value, like so:

.set("spark.memory.storageFraction", "Blah blah blah")

the program is not affected at all. In fact, the "Blah blah blah" value is displayed under Spark Properties.
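
This matches how SparkConf behaves in general: values are stored as plain strings and only parsed when some component actually reads the key, so a key that nothing reads is never validated. A small standalone illustration (not the original code):

    import org.apache.spark.SparkConf

    val conf = new SparkConf()
      .set("spark.memory.storageFraction", "Blah blah blah") // accepted silently, stored as a string

    // The bogus value only blows up once something tries to parse it:
    conf.getDouble("spark.memory.storageFraction", 0.5) // throws NumberFormatException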

I am using Spark 1.5.

Answers


Try spark.storage.memoryFraction instead. spark.memory.storageFraction belongs to the unified memory manager, which was only introduced in Spark 1.6; Spark 1.5 still uses the legacy memory manager, which is configured through spark.storage.memoryFraction (and spark.shuffle.memoryFraction). Spark stores any key it is given without validating it, which is why even a nonsense value shows up under Spark Properties without crashing anything.
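
A minimal sketch for Spark 1.5 (the 0.1 value and app name just mirror the question):

    import org.apache.spark.{SparkConf, SparkContext}

    // Legacy memory manager (pre-1.6): spark.storage.memoryFraction is the
    // knob that actually changes the storage memory shown under "Executors".
    val conf = new SparkConf()
      .setAppName("StorageFractionTest") // hypothetical app name
      .set("spark.storage.memoryFraction", "0.1")
    val sc = new SparkContext(conf)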

