
Unix profiling

 
Tom Johnson
Ranch Hand
Posts: 142
Hi All,
Firstly, I apologise if this is the wrong forum, but a sticky issue like memory problems has to go somewhere.

I am running a Java application on a Solaris 9 machine. It's not an application server (JBoss etc.), just a regular Java application that acts as a server.
Part of its work involves parsing large XML files in parallel (currently up to 60MB per file) and comparing them. It uses SAX for the parse and stores the information, as it receives the SAX events, in various Java collections.
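
To give an idea of the pattern (a rough sketch only; the class name and element handling are placeholder assumptions, not the real code), the handler does something along these lines, kept to pre-generics Java:

import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

// Illustrative handler: copies the text of each element into an in-memory list.
// With two 60MB documents parsed in parallel, the retained Strings plus the
// per-entry overhead of the collections can add up to a large chunk of heap.
public class RecordHandler extends DefaultHandler {

    private final List records = new ArrayList();
    private StringBuffer current;

    public void startElement(String uri, String local, String qName, Attributes atts) {
        current = new StringBuffer();
    }

    public void characters(char[] ch, int start, int length) {
        if (current != null) current.append(ch, start, length);
    }

    public void endElement(String uri, String local, String qName) {
        if (current != null) {
            records.add(current.toString());
            current = null;
        }
    }

    public static void main(String[] args) throws Exception {
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        RecordHandler handler = new RecordHandler();
        parser.parse(new java.io.File(args[0]), handler);
        System.out.println("Stored " + handler.records.size() + " element values");
    }
}
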
During a parallel parse of two of these 60MB files it throws an OutOfMemoryError. My heap settings are as follows:
-Xms256m
-Xmx512m
I could get around it (I hope) by bumping the max up to 1024MB, but I would like to see whether there is a memory leak or whether this much memory is actually needed given the file sizes and the Java objects required to represent them.
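
One cheap first check I could do before going for a full profiler is to launch with GC logging turned on and watch whether used heap settles back down after each parse or keeps climbing from run to run. The -verbose:gc switch is standard on the Sun JVM; MyServer below is just a stand-in for the real main class:

java -Xms256m -Xmx1024m -verbose:gc MyServer
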
Which profilers has anyone found useful in this respect: OptimizeIt, PurifyPlus, JProbe? I think these are my only options as far as licensing etc. goes. As I said, it's on UNIX, so I either need to install the profiler there or be able to connect in remotely from Windows (I would assume I'd still need it installed on the UNIX box though).

Any suggestions or pointers greatly appreciated.
Thanks
/Tom
 
Tom Johnson
Ranch Hand
Posts: 142
Hi,
Just an update: I forgot to say that I have profiled it using smaller files, and following the parse, store and doStuff steps the memory does return to its original level. This would indicate that there is no memory leak and that it simply uses that much memory. Could we extrapolate this up and claim the same for the large files?
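
The kind of check I mean is roughly this (parseAndStore and doStuff here are empty stubs standing in for the real methods, and System.gc() is only a hint to the VM, so it is a rough measure at best):

public class MemCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long before = rt.totalMemory() - rt.freeMemory();

        // replace these stubs with the real parse/store and doStuff calls
        parseAndStore();
        doStuff();

        System.gc();   // only a hint, but good enough for a rough before/after
        long after = rt.totalMemory() - rt.freeMemory();
        System.out.println("Retained after run: " + ((after - before) / 1024) + " KB");
    }

    private static void parseAndStore() { /* stand-in for the SAX parse + collection population */ }
    private static void doStuff()       { /* stand-in for the comparison/processing step */ }
}
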

/Tom
 