I am using an Oracle OC4J server to host applications in a production environment. I deploy the apps as EAR files; each EAR is ~5 MB in size. My server specs are as follows: 2 GB RAM, 2x Intel Xeon, Windows 2000 Advanced Server, plus 1 GB of virtual memory.
I have two OC4J server instances running. One has ~20 EARs deployed, the other ~35 EARs deployed.
As the number of deployed EARs grew, I started receiving java.lang.OutOfMemoryError on the console window, as well as HTTP 500 errors caused by java.lang.OutOfMemoryError for clients connecting to the server via web browser.
I read some posts about these errors and changed the way OC4J is started, passing parameters to increase the JVM heap size: -Xms512M and -Xmx1512M. But this does not seem to help. I still get java.lang.OutOfMemoryError during peak loads when many clients connect. The only thing I can do then is restart OC4J.
Well, at first I thought I must have memory leaks in my code. But Java has a garbage collector that is supposed to do that job for me (it's not C++). So what is left for me to do?
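For anyone reading along: the garbage collector only frees objects that are no longer reachable, so Java code can still leak memory when a long-lived collection keeps references alive. A minimal, hypothetical sketch of that pattern (the class and method names here are made up for illustration, not from any real OC4J app):

```java
import java.util.ArrayList;
import java.util.List;

// Illustration of a classic Java "leak": objects held by a static
// collection stay reachable for the lifetime of the class, so the
// garbage collector can never reclaim them.
public class LeakyCache {
    // Static field lives as long as the class is loaded.
    private static final List<byte[]> CACHE = new ArrayList<byte[]>();

    // Imagine this runs once per client request: each call retains
    // 1 MB that is never evicted or removed.
    public static void handleRequest(int requestId) {
        CACHE.add(new byte[1024 * 1024]);
    }

    public static int cachedEntries() {
        return CACHE.size();
    }

    public static void main(String[] args) {
        for (int i = 0; i < 10; i++) {
            handleRequest(i);
        }
        // All 10 entries remain reachable via the static field,
        // so no amount of GC activity will free them.
        System.out.println("entries retained: " + cachedEntries());
    }
}
```

Under sustained load this grows until the heap limit is hit, which matches the symptom of OutOfMemoryError appearing only at peak traffic.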
I tried a new startup option from the OC4J J2EE installation manual: java -Doracle.j2ee.dont.use.memory.archive=true -jar oc4j.jar, which is supposed to free memory by not using it for class preloading, but I am more than skeptical about it.
Has anyone of you had problems like this before and found a solution? Is there some memory-leak testing tool I can use to find leaks in my application? Or is my only option to add more RAM and increase the JVM heap size parameters further (but then won't I hit problems with 32-bit Java memory addressing and its ability to allocate only about 1.2 GB and not a single byte more?)
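One cheap check I could imagine trying before reaching for a full profiler: log heap usage from inside the application and watch whether it climbs steadily under load (a leak signature) or only spikes during peaks and then recovers. A minimal sketch using the standard java.lang.Runtime API (the class name here is made up):

```java
// Minimal heap-usage snapshot using the standard Runtime API.
// Logging this periodically (e.g. from a background thread or a
// servlet filter) shows whether used heap trends upward over time.
public class HeapMonitor {
    public static String snapshot() {
        Runtime rt = Runtime.getRuntime();
        // totalMemory() is the heap currently claimed from the OS;
        // freeMemory() is the unused portion of that; maxMemory()
        // reflects the -Xmx limit.
        long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
        long maxMb = rt.maxMemory() / (1024 * 1024);
        return "heap used: " + usedMb + " MB of " + maxMb + " MB max";
    }

    public static void main(String[] args) {
        System.out.println(snapshot());
    }
}
```

If used heap keeps ratcheting up between peaks and never falls back after GC, that points at retained objects rather than simple undersizing of the heap.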