Originally posted by Stan James:
Have you tried setting the max heap to values close to what you think the app really needs? See what that does to outside observations like total process memory.
Thanks Stan,
I hadn't thought of it, but it's given me a bit more of an insight.
First up (and I realise I should have mentioned this in my initial post, but it was already feeling a bit long-winded), the version that's out there with users complaining about memory has -Xmx200M (for various historical reasons that no longer apply). All my profiling so far has been with -Xmx64M.
So - with -Xmx64M, observations are as described above. The available memory works its way up to 64MB, and the used memory sits at 20-30MB, peaking at around 60MB for expensive operations (incidentally, the most expensive seems to be a PDF export, so that's what I'm using to test the peak).
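(As an aside, for anyone who wants to see roughly the same numbers without a profiler attached, something like this quick sketch against the standard Runtime API should do it - the class name is just made up:)

// Quick heap snapshot via java.lang.Runtime - used/committed/max in MB.
public class HeapSnapshot {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024 * 1024;
        long used = (rt.totalMemory() - rt.freeMemory()) / mb; // roughly JProfiler's "used" figure
        long committed = rt.totalMemory() / mb;                // roughly the "available" figure
        long max = rt.maxMemory() / mb;                        // the -Xmx cap
        System.out.println("Heap: used=" + used + "MB, committed=" + committed + "MB, max=" + max + "MB");
    }
}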
I tried -Xmx32M and found essentially the same as above - the used memory was around 25MB, with the available memory quickly reaching its 32MB maximum. Unsurprisingly, trying my expensive PDF export threw an OutOfMemoryError.
Next I tried -Xmx256M and was surprised (although I shouldn't really have been, with hindsight) to find that the used memory now hovered around the 90MB mark, with the available memory working its way up to 170MB-ish. This time my PDF export pushed the used memory up to about 150MB. Forcing a garbage collection made no difference (I wondered if that would make it drop back down to 20-30MB).
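(In case anyone wants to reproduce that check without a profiler, something along these lines should do it - bearing in mind that System.gc() is only a hint to the JVM, so it's not guaranteed to collect anything:)

Runtime rt = Runtime.getRuntime();
long mb = 1024 * 1024;
long before = (rt.totalMemory() - rt.freeMemory()) / mb;
System.gc(); // a suggestion to the JVM, not a guarantee
long after = (rt.totalMemory() - rt.freeMemory()) / mb;
System.out.println("Used before GC: " + before + "MB, after: " + after + "MB");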
So how did all of that affect the reported memory in Windows? -Xmx32M and -Xmx64M were very similar - only about 10MB difference. They started around the 140MB mark, and gradually crept up over time. -Xmx256M started higher (about 190MB), and very quickly climbed to about 280MB, roughly (although not exactly) in line with the available memory, as reported by JProfiler.
So what does this tell me? I guess that Java's optimisation is smarter than I gave it credit for: give it more memory and it uses more, even if it *could* do the same with less. So, if it's capped at the maximum that it needs, it should scale everything down, which it does seem to do.
Also, it's nice to know that the reported usage, as users will see it, does come down a reasonable amount, now that I'm able to restrict Java a bit more. It's early days, but the restriction doesn't appear to cause me any performance problems. Obviously more testing to do there.
So, if I allow the heap 64MB (and I might be able to get that down with some optimisation), plus the non-heap, which, by the way, seems to sit at a pretty much constant 30MB, then my code is using a total of around 90MB (or at least, it needs that much available to it), with the JVM actually using about 60MB in the short term.
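(Again, for anyone without a profiler handy, the standard java.lang.management beans should show the same heap/non-heap split - just a rough sketch, with a made-up class name:)

import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class MemoryBreakdown {
    public static void main(String[] args) {
        long mb = 1024 * 1024;
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        MemoryUsage heap = memory.getHeapMemoryUsage();       // the part capped by -Xmx
        MemoryUsage nonHeap = memory.getNonHeapMemoryUsage(); // the roughly constant ~30MB part
        System.out.println("Heap:     used=" + heap.getUsed() / mb + "MB, committed=" + heap.getCommitted() / mb + "MB, max=" + heap.getMax() / mb + "MB");
        System.out.println("Non-heap: used=" + nonHeap.getUsed() / mb + "MB, committed=" + nonHeap.getCommitted() / mb + "MB");
    }
}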
Longer-term, the memory according to Task Manager is still creeping up, although I'll need to do some more testing to establish how significant that is. I have a feeling that by dropping from the -Xmx200M that's out there now to -Xmx64M, that increase over time will be proportionally smaller - hopefully.
If the above several assumptions are correct, I think I can cope with it, and hopefully my boss and users can as well. It puts me around the same level as Firefox with 10+ tabs open, and Photoshop doing some fairly basic editing.
If anyone's actually bothered to read this far hoping for a question, then I can only apologise... As a token of goodwill, I'll offer 'does all of that seem to make sense?'. If you do see anything glaringly stupid in my ramblings please point it out, otherwise, sorry for wasting your time!