My Code vs. JVM Memory Usage

 
Mark Newton
Ranch Hand
Posts: 129
Hi all,

I know this is a question that's been covered several times before, and I apologise for asking it again, but...

When I run my (desktop) software for any length of time, the memory usage reported in Windows' Task Manager goes up fairly gradually, reaching the 200MB region relatively swiftly and 300MB+ after a few hours of constant use.

However, as guided by the many responses here and elsewhere, I've profiled it (with JConsole and JProfiler) and found that there is definitely no memory leak in my code - it runs along happily around the 20-30MB mark for most operations, peaking around the 60MB mark occasionally.

Also, under XP I switched to the 'VM usage' (or whatever it calls itself) column in Task Manager, and saw much the same thing - no memory leak. I've since been forcibly moved to Vista, and the columns in Vista's Task Manager are different - I haven't really played with them, but don't really need to either.

However... that's very little comfort to our users (mostly home, non-technical), or my (vaguely technical) boss when he sees comments like: 'I would use your software, but Windows tells me it has a memory leak; even Photoshop/Word/Excel/your competitors (delete as appropriate) don't use half as much memory as you'.

We do have a few native calls going on (exporting to other applications, and such), but the memory creeps up, albeit more slowly, even without using those features.

So, as I understand things, the JVM, along with these few native calls, is what's taking up the extra memory.

The simple question is: how can I (or is it possible to) stop it from seeming as if our software has a memory leak? I'm open to using different JVMs - I've only tested Sun's 1.6 so far - any recommendations? Another thought was to isolate the more (apparently) memory-intensive operations, which are used infrequently, and somehow fire them off as a separate application, in a separate JVM, which is closed down as soon as the operation completes. That feels somewhat messy, and so scares me a bit, but is it at all feasible - in a cross-platform way?
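To make the separate-JVM idea concrete, here's a rough sketch of the kind of thing I mean, using ProcessBuilder; ExportTask is a made-up main class standing in for the expensive operation:

    import java.io.File;
    import java.io.IOException;
    import java.io.InputStream;

    public class ExternalExportLauncher {

        // Launch the memory-hungry export in a throwaway JVM with its own
        // small heap. When the child process exits, all of its memory goes
        // back to the OS. ExportTask is a hypothetical export main class.
        public static int runExport(File input, File output)
                throws IOException, InterruptedException {
            String javaBin = System.getProperty("java.home")
                    + File.separator + "bin" + File.separator + "java";
            ProcessBuilder pb = new ProcessBuilder(
                    javaBin,
                    "-Xmx128m",                                   // cap only the export's heap
                    "-cp", System.getProperty("java.class.path"), // reuse our own classpath
                    "ExportTask",
                    input.getAbsolutePath(),
                    output.getAbsolutePath());
            pb.redirectErrorStream(true);
            Process p = pb.start();
            // Drain the child's output so it can't block on a full pipe.
            InputStream in = p.getInputStream();
            byte[] buf = new byte[4096];
            while (in.read(buf) != -1) { /* discard, or log */ }
            return p.waitFor();
        }
    }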

Any other suggestions gratefully received!
 
Rajah Nagur
Ranch Hand
Posts: 239
There are two parts when talking about JVM memory.
1. The amount of memory required by your desktop application.
If it is 60MB max, then you cannot avoid this.
The best thing would be to optimize your code. Try playing around with the -X JVM options along with tuning the GC - a sample command line is sketched below this list. (The default behaviour in JDK 6, I believe, is to grow the heap aggressively, which could be one reason for the higher memory use; if memory is freely available, the GC will be lazy about giving it back.)
2. Memory leaks in the application.
Are you sure?
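For example, something along these lines - the values are only a starting point to experiment with, and yourapp.jar is a stand-in for your actual launcher:

    java -Xms32m -Xmx64m -XX:MinHeapFreeRatio=10 -XX:MaxHeapFreeRatio=20 -jar yourapp.jar

If I remember right, the two FreeRatio flags control how eagerly the VM gives unused heap back to the OS after a collection, which is what affects the figure Task Manager shows.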
[ September 19, 2007: Message edited by: Rajah Nagur ]
 
Mark Newton
Ranch Hand
Posts: 129
Hi Rajah,

Sorry - I obviously didn't explain properly - to answer your questions:

Originally posted by Rajah Nagur:
1. The amount of memory required by your desktop application.
If it is 60MB max, then you cannot avoid this.
The best thing would be to optimize your code. Try playing around with the -X JVM options along with tuning the GC.

Not entirely sure what you mean here. From profiling, the memory usage sits comfortably around the 20-30MB mark, occasionally peaking up to 60MB when more is going on, but it drops back to 20-30MB when the GC next kicks in. This I'm fine with - I'm sure with some optimisation I could bring it lower, but for now I'm happy with this amount of memory.

Originally posted by Rajah Nagur:
2. Memory leaks in the application.
Are you sure?

As sure as I can be. Over a number of hours of usage, the 'telemetry' in JProfiler shows a fairly flat trend across the various memory usage views. If there were a memory leak, that trend would be climbing, and it's not.
 
Stan James
(instanceof Sidekick)
Posts: 8791
Have you tried setting the max heap to values close to what you think the app really needs? See what that does to outside observations like total process memory.
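If you want to double-check from inside the app that the cap took effect, Runtime will tell you - a quick throwaway check (the class name is just for illustration):

    public class HeapCapCheck {
        public static void main(String[] args) {
            // Reports the -Xmx cap as the JVM actually sees it.
            long maxBytes = Runtime.getRuntime().maxMemory();
            System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
        }
    }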
 
Mark Newton
Ranch Hand
Posts: 129

Originally posted by Stan James:
Have you tried setting the max heap to values close to what you think the app really needs? See what that does to outside observations like total process memory.

Thanks Stan,

I hadn't thought of it, but it's given me a bit more insight.

First up (and I realise I should have mentioned this in my initial post, but it was already feeling a bit long-winded): the version that's out there, with users complaining about memory, has -Xmx200M (for various historical reasons that no longer apply). All my profiling so far has been with -Xmx64M.

So, with -Xmx64M, observations are as described above: the available memory works its way up to 64MB, and the used memory sits at 20-30MB, peaking at around 60MB for expensive operations (incidentally, the most expensive seems to be a PDF export, so that's what I'm using to test the peak).

I tried -Xmx32M and found essentially the same as above - the used memory was around 25MB, with the maximum available quickly reaching 32MB. Unsurprisingly, trying my expensive PDF export threw an OutOfMemoryError.

Next I tried -Xmx256M and was surprised (although I shouldn't really have been, with hindsight) to find that the used memory now hovered around the 90MB mark, with the available memory working its way up to 170MB-ish. This time my PDF export pushed the used memory up to about 150MB. Forcing a garbage collection made no difference (I wondered if that would make it drop down to 20-30MB).

So how did all of that affect the reported memory in Windows? -Xmx32M and -Xmx64M were very similar - only about 10MB difference. They started around the 140MB mark, and gradually crept up over time. -Xmx256M started higher (about 190MB), and very quickly climbed to about 280MB, roughly (although not exactly) in line with the available memory, as reported by JProfiler.

So what does this tell me? I guess that Java's optimisation is smarter than I gave it credit for: give it more memory, and it uses more, even if it *could* do the same with less. So, if it's capped at the maximum that it needs, it should scale everything down, which it does seem to do.

Also, it's nice to know that the reported usage, as users will see it, does come down a reasonable amount now that I'm able to restrict Java a bit more. It's early days, but the restriction doesn't appear to cause me any performance problems. Obviously there's more testing to do there.

So, if I allow the heap 64MB (and I might be able to get that down with some optimisation), and add the non-heap, which seems to use a pretty much constant 30MB, my code is using a total of around 90MB (or at least, it needs that much available to it), and the JVM itself is using about another 60MB in the short term.
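For anyone who wants to watch the heap/non-heap split without a full profiler, the java.lang.management API reports both; a minimal sketch (the class name is mine, just for illustration):

    import java.lang.management.ManagementFactory;
    import java.lang.management.MemoryMXBean;
    import java.lang.management.MemoryUsage;

    public class MemoryReport {
        public static void main(String[] args) {
            MemoryMXBean bean = ManagementFactory.getMemoryMXBean();
            MemoryUsage heap = bean.getHeapMemoryUsage();
            MemoryUsage nonHeap = bean.getNonHeapMemoryUsage();
            // 'used' is what live objects occupy; 'committed' is what the
            // JVM has actually claimed from the OS - closer to the figure
            // Task Manager reports.
            System.out.println("Heap used/committed: "
                    + mb(heap.getUsed()) + "/" + mb(heap.getCommitted()) + " MB");
            System.out.println("Non-heap used/committed: "
                    + mb(nonHeap.getUsed()) + "/" + mb(nonHeap.getCommitted()) + " MB");
        }

        private static long mb(long bytes) {
            return bytes / (1024 * 1024);
        }
    }

The 'committed' figures are the ones that should roughly track what Task Manager shows, give or take the JVM's own working set.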

Longer-term, the memory according to Task Manager is still creeping up, although I'll need to do some more testing to establish how significantly. I have a feeling that by dropping from the -Xmx200M that's out there now to -Xmx64M, that increase over time will be proportionally smaller - hopefully.

If the several assumptions above are correct, I think I can cope with it, and hopefully my boss and users can as well. That puts me around the same level as Firefox with 10+ tabs open, or Photoshop doing some fairly basic editing.

If anyone's actually bothered to read this far hoping for a question, then I can only apologise... As a token of goodwill, I'll offer 'does all of that seem to make sense?'. If you do see anything glaringly stupid in my ramblings please point it out, otherwise, sorry for wasting your time!
 
Stan James
(instanceof Sidekick)
Posts: 8791
give it more memory, and it uses more, even if it *could* do the same with less.

How about that! I don't know if you ever said what JVM you're on. I think older ones don't GC at all until they try to allocate a new object and don't have enough free heap; they might leave GC-eligible objects in memory until that happens. You'd see the heap slowly grow even though you don't have a leak. Newer JVMs have incremental background GC that might remove things sooner, so the heap wouldn't grow as much.

I haven't run any side-by-side comparisons, but a profiler on my old 1.3 application showed a sawtooth heap size, with peaks and drops every few minutes. The whole curve grew and grew until, at nearly max heap, the GC made a very big drop. I guessed the little drops and big drops corresponded to the different "generations" in memory, but didn't pursue any more detail.
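If you want to see exactly when collections happen and how much they reclaim, GC logging is cheap to turn on; these are standard HotSpot flags, with yourapp.jar standing in for the real launcher:

    java -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -jar yourapp.jar

Each collection prints a line showing the heap before and after, so the sawtooth (and which generation dropped) shows up right in the console.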
 