Hi, I'm having a lot of memory-leak problems with an EJB-based web site running on a JRun 3.1 application server. Do you know whether calling "System.gc()" is a good way to help solve memory leaks? If so, would I have to call it in every Java class (which seems a bit of a pain), or could I just write one class that uses this method and takes care of memory usage for the whole application server? Any help or ideas would be greatly appreciated.
System.gc() would not help much. gc() just tells the JVM that you would like the garbage collector to run; it does not guarantee that GC will actually run. (I've also read that a JVM is not strictly required to implement GC, though a JVM without GC wouldn't be very useful.) Normally, though, GC runs every so often, so memory that can be freed will be freed. When GC does run, it only frees objects that have no reachable reference. Memory leaks occur when you keep references to objects you no longer need (e.g. static references, or a collection that still holds objects that should have been released), when you don't close streams, when you leave connections open, and so on. I'd guess a review of your code and/or a profiler would be better help (I'm no expert in this!). k
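To illustrate the "reachable reference" point above, here is a minimal, hypothetical sketch of the classic static-collection leak: objects added to a static List stay reachable forever, so no amount of calling System.gc() can reclaim them. The class and method names are made up for illustration.

```java
import java.util.ArrayList;
import java.util.List;

public class LeakDemo {
    // A static collection like this is a common leak source: everything
    // added here stays strongly reachable until it is explicitly removed.
    private static final List<byte[]> CACHE = new ArrayList<>();

    static void handleRequest() {
        // Each call pins another block of memory via the static reference.
        CACHE.add(new byte[1024]);
    }

    static int cacheSize() {
        return CACHE.size();
    }

    // The real fix: drop references you no longer need so the objects
    // become unreachable and eligible for collection.
    static void evictAll() {
        CACHE.clear();
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10; i++) {
            handleRequest();
        }
        System.out.println("cached blocks: " + cacheSize());
        evictAll();
        System.out.println("after evict: " + cacheSize());
        // System.gc() is only a hint; the JVM is free to ignore it.
        System.gc();
    }
}
```

The point is that the leak is fixed by removing the reference (evictAll), not by requesting GC; the collector can only reclaim what nothing reaches.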
Cheers for the info, Karl Koch. Here's another weird point which some of you might be able to explain. In the Windows Task Manager I monitor the memory used by my Java processes. My web site used to run as a javaw.exe process, but I've now changed it to run as java.exe, which I leave running in a DOS window on my web server. When the memory level of the java.exe process is high, if I minimize the DOS window (where java.exe is running) and then maximize it again, the memory reported for java.exe in Task Manager drops. Can any of you explain why this happens? I am currently using this manual trick to keep my memory levels low. Does minimizing the DOS window somehow initiate garbage collection, and if so, is there some way I can program this to happen automatically? [ March 10, 2003: Message edited by: Fergus Red ]
Wow, Fergus! I never knew that! Does it happen on Linux as well as Windows? Mac OS X? My app was running through javaw and was using 45 MB of RAM. I minimized it and it went down to 1.2 MB; then when I maximized it again, it only came back up to 7 MB. Every other GUI Windows app (non-Java) except IE also has this same "ability." What is it that makes the Java app do this? Is there a way to cause this same type of memory saving to happen without minimizing? (I'm assuming only a non-console app does this, i.e. it's a Swing thing.) [ March 10, 2003: Message edited by: Dan Bizman ]
I wish I knew why this happens, Dan. I haven't tried it on Linux, only on Windows. Obviously it's not the most stable way of keeping memory levels low, but as I could not get any other way to work, I've had to use this method temporarily. If anyone can explain why this works, and whether it's possible to write a program that constantly releases memory the way the window minimizing/maximizing trick does, any info would be great. Cheers for the help.
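One likely explanation (an assumption on my part, not something from the thread): Task Manager's memory column shows the process's OS-level working set, and Windows trims a process's working set when its window is minimized. That trimming is paging, not garbage collection; the Java heap itself is untouched. A more trustworthy way to watch memory is to ask the JVM directly, e.g. with this small sketch:

```java
public class HeapMonitor {
    /** Returns the number of bytes currently in use on the Java heap. */
    static long usedHeapBytes() {
        Runtime rt = Runtime.getRuntime();
        return rt.totalMemory() - rt.freeMemory();
    }

    public static void main(String[] args) throws InterruptedException {
        // Print heap usage periodically; compare it with Task Manager
        // while minimizing and restoring the console window. The heap
        // figure should not change just because the window state changed.
        for (int i = 0; i < 3; i++) {
            System.out.println("used heap: " + usedHeapBytes() / 1024 + " KB");
            Thread.sleep(500);
        }
    }
}
```

If this in-JVM figure stays high while Task Manager's number drops after minimizing, the "memory saving" is only working-set trimming, and the trick isn't actually freeing heap memory or fixing a leak.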