Performance of setting Java Initial and Maximum memory to the same value
In my work environment, we have a number of Enterprise Java applications that run on Windows Server 2008 hardware. An external consultant has been reviewing the performance of our Java applications and has suggested that we change our initial and maximum memory values to be identical.
So, at present, we're running an application with 1GB initial and 2GB maximum memory. Their recommendation is to change the initial and maximum memory both to 2GB.
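For reference, assuming the standard HotSpot heap flags, the two configurations would look roughly like this (the jar name is just a placeholder):

```shell
# Current: heap starts at 1GB and may grow up to 2GB
java -Xms1g -Xmx2g -jar myapp.jar

# Consultant's proposal: heap fixed at 2GB from startup
java -Xms2g -Xmx2g -jar myapp.jar
```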
Their reasoning is two-fold:
- By allocating both the initial and maximum at 2GB, the Java application grabs a 2GB block of memory as soon as it starts. This prevents other applications from taking that memory, so it will always be available to the Java application. Under the current settings, it only gets an initial block of 1GB, which means other applications could consume the remaining available memory on the server, and the enterprise Java application might never be able to grow to its 2GB maximum.
- There is a small performance hit every time Java needs to allocate memory between the initial and maximum size, because it needs to ask Windows for a new block of the required size before it can use it. This hit occurs every time the heap grows between the initial size and the maximum size, so setting both to the same value removes it entirely.
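As a rough way to observe what point 2 is describing, you can compare the heap the JVM has actually reserved against its ceiling from inside the process (a minimal sketch; run it with and without matching -Xms/-Xmx to see the difference at startup):

```java
// Minimal sketch: committed heap vs. maximum heap, as seen by the JVM itself.
public class HeapSizes {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long committedMb = rt.totalMemory() / (1024 * 1024); // heap currently reserved from the OS
        long maxMb = rt.maxMemory() / (1024 * 1024);         // the -Xmx ceiling
        System.out.printf("committed=%dMB max=%dMB%n", committedMb, maxMb);
        // With -Xms equal to -Xmx the two figures match at startup;
        // otherwise committed starts near -Xms and grows on demand.
    }
}
```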
I think their first point is valid from a safety perspective, as it means the Java application will always have access to its maximum memory regardless of what other applications are doing on the server. However, this could also be a negative, as it ties up a big block of memory that other applications can't use if they need it, which could cause general server performance problems.
I think the performance impact they describe in point 2 is probably so negligible that it isn't worth worrying about. If they're concerned about performance, they would be better off tuning Java's garbage collection than worrying about the tiny cost of allocating a block of memory.
Could anyone please tell me whether there is a real benefit in their request to set the initial and maximum memory to the same value? Are there any general recommendations either way?
Setting them the same increases predictability. If you don't set them the same, then when the GC decides it needs more memory it will take time to allocate and shuffle objects around to make room, on top of its normal GC activities. While this is happening, requests are slower and your throughput will suffer. On a 2GB heap you will probably be allocating around 400MB of memory each time more is needed, and releasing around 100MB each time memory isn't needed. Increase the heap size and these numbers increase too. The heap keeps resizing between your two values; it isn't as if it just allocates that memory once and keeps it.
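If you want to see this resizing for yourself, HotSpot can log each collection along with the heap size before and after it (flags differ by version; the jar name is a placeholder):

```shell
# Java 8 and earlier: verbose GC output with heap sizes per collection
java -Xms1g -Xmx2g -verbose:gc -XX:+PrintGCDetails -jar myapp.jar

# Java 9+: unified GC logging
java -Xms1g -Xmx2g -Xlog:gc -jar myapp.jar
```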
As for argument #1, ensuring the memory is always reserved for you: I believe that's a moot point. If your server is short on memory and you are already maxing out the machine to run this software, then you are running it on the wrong server. Get the hardware you need and give it room to grow. If you say your application could use 2GB, I would personally run it on a machine with 4GB or more free. If the client/user base grows, you then have the freedom to increase the heap to accommodate new users.
IMO, even the first suggestion is not that important. Modern operating systems have virtual memory, so even if you allocate 2GB of memory to your Java process, there is no guarantee that those 2GB will always reside in physical memory. If other applications on the same box consume a lot of memory, you'll get poor performance no matter when you allocate yours.
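A related HotSpot detail supports this: even with -Xms equal to -Xmx, the OS typically commits heap pages lazily, only as they are first touched. If you really want the whole heap backed by physical memory from the start, there is a flag for that, at the cost of slower startup:

```shell
# Touch every heap page at startup so the full 2GB is committed immediately
java -Xms2g -Xmx2g -XX:+AlwaysPreTouch -jar myapp.jar
```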