We are hunting for memory leaks in our application, and to do this we used both core dumps and garbage collection analysis.
The odd thing is that we could not match the garbage collection data to the data in the core dump. For example:
At 8:00am, a core dump showed that 400MB of the 512MB heap was currently consumed by the application. But in the garbage collection data, the Used Tenured values (both before and after collection) at 8:00am were nowhere near that.
Are these figures supposed to match? It seems like they should be, right?
Or is it simply not possible to get a snapshot of memory consumption at a particular time from the garbage collection data in native_stderr.log, so that this can only be done with a forced heap dump?
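For what it's worth, the verbose GC entries in native_stderr.log do record tenured occupancy at each collection, so a rough point-in-time figure can be recovered by parsing them. Below is a minimal sketch; the `freebytes`/`totalbytes` line format and the sample values are assumptions for illustration, since the exact format varies by JVM version:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class GcSnapshot {
    // Hypothetical sample entry; check your own native_stderr.log for the
    // exact attribute names your JVM version emits.
    static final String SAMPLE =
        "<tenured freebytes=\"117440512\" totalbytes=\"536870912\" percent=\"21\" />";

    public static void main(String[] args) {
        Matcher m = Pattern
            .compile("freebytes=\"(\\d+)\" totalbytes=\"(\\d+)\"")
            .matcher(SAMPLE);
        if (m.find()) {
            long free = Long.parseLong(m.group(1));
            long total = Long.parseLong(m.group(2));
            // Used tenured at the moment of this collection:
            long usedMb = (total - free) / (1024 * 1024);
            System.out.println("Used tenured: " + usedMb + "MB of "
                + total / (1024 * 1024) + "MB");
        }
    }
}
```

Note this only gives you heap occupancy at collection times, not arbitrary wall-clock times, and only for the Java heap itself.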
A core dump contains a lot of information beyond the Java heap, and indeed a Java process requires more memory than the heap alone. Thread stacks, for instance, won't appear in your GC data: if you create a lot of threads, they will consume far more OS-visible memory than appears to be taken in the heap (the stack size is configurable on the command line).
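To illustrate the thread-stack point with a small sketch: each started thread reserves a native stack outside the Java heap (default size is platform-dependent, tunable with -Xss), so process memory can grow substantially while heap statistics barely move.

```java
public class ThreadStackDemo {
    // Start n daemon threads that just sleep; each one reserves a native
    // stack that never shows up in heap statistics or verbose GC output.
    static void startSleepers(int n) {
        for (int i = 0; i < n; i++) {
            Thread t = new Thread(() -> {
                try { Thread.sleep(60_000); } catch (InterruptedException ignored) {}
            });
            t.setDaemon(true);
            t.start();
        }
    }

    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long heapBefore = rt.totalMemory() - rt.freeMemory();
        startSleepers(50);
        long heapAfter = rt.totalMemory() - rt.freeMemory();
        // The heap grows only by ~50 small Thread objects, while the process
        // may have reserved tens of MB of native stack memory invisible to GC.
        System.out.println("Heap delta (KB): " + (heapAfter - heapBefore) / 1024);
    }
}
```

This is why OS-level figures (top, core dump size) and GC figures diverge even when nothing is leaking.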
Why are you using core dumps? It makes far more sense to obtain hprof heap dumps and use those for leak analysis (alongside your GC analysis). I usually only use core dumps when they are produced by crashes, and even then I usually convert them to hprofs.
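On HotSpot JVMs you can trigger an hprof dump programmatically through the HotSpotDiagnosticMXBean, sketched below. (This is an assumption about your setup: IBM JVMs as shipped with WebSphere use different mechanisms, such as com.ibm.jvm.Dump or the admin console's heap dump facility, and produce PHD rather than hprof files by default.)

```java
import com.sun.management.HotSpotDiagnosticMXBean;
import java.lang.management.ManagementFactory;

public class HeapDumper {
    // Write a binary heap dump to the given path; 'live' restricts the dump
    // to reachable objects (the JVM runs a full GC first in that case).
    static void dump(String path, boolean live) throws Exception {
        HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
            ManagementFactory.getPlatformMBeanServer(),
            "com.sun.management:type=HotSpotDiagnostic",
            HotSpotDiagnosticMXBean.class);
        bean.dumpHeap(path, live);  // fails if the file already exists
    }

    public static void main(String[] args) throws Exception {
        dump("leak-suspect.hprof", true);
        System.out.println("Dump written");
    }
}
```

You can then open the resulting file in a heap analyzer (e.g. Eclipse MAT) and compare two dumps taken some time apart to find what is accumulating.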
You also need to define why you think you may have a leak in the first place, e.g. growing process memory consumption, or a rising trend in your GC analysis.