Hi, I am processing a file by storing it into a list. When the size of the list increases to some 20,000 records it takes around 78 min to process, while for 5,000 records it takes 3 min. I feel it is a problem with the Java heap size. I have to test the same application on different OS and hardware like HP, Sun, Itanium, IBM; at present I am testing it on NT. Is the Java heap a problem, or something else? Please suggest a feasible solution.
Hi, The best way to figure out where performance problems lie is to perform experiments and make measurements. You can try running with a larger heap very easily: use the -XmxNNNm switch, where NNN is the number of megabytes to use; 64 is the default on most platforms. But it's unlikely that "a problem with the Java heap" is causing performance problems; it's more likely to be an issue with the program itself. My number-one suspicion is that you're doing some kind of search while processing the file; the search takes longer as the list grows, and there's your performance problem right there. That would explain why 20,000 records take far more than four times as long as 5,000: linear searches over a growing list make the total work grow quadratically. If you have access to a Java profiler like OptimizeIt, try using it to see what's happening.
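To make that suspicion concrete, here's a minimal sketch of the pattern I have in mind (the class and method names are just for illustration, since we haven't seen your code): checking each new record against a List scans the whole list every time, while a HashSet does the same membership check in roughly constant time.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class DuplicateCheck {

    // Quadratic overall: List.contains() scans the list front to back
    // for every record, so the per-record cost grows with list size.
    static int countUniqueWithList(String[] records) {
        List<String> seen = new ArrayList<String>();
        for (String r : records) {
            if (!seen.contains(r)) { // O(n) scan on each call
                seen.add(r);
            }
        }
        return seen.size();
    }

    // Near-linear overall: HashSet.add() is a hashed lookup,
    // roughly constant time regardless of how many records are stored.
    static int countUniqueWithSet(String[] records) {
        Set<String> seen = new HashSet<String>();
        for (String r : records) {
            seen.add(r); // add() is a no-op if r is already present
        }
        return seen.size();
    }

    public static void main(String[] args) {
        String[] records = {"alpha", "beta", "alpha", "gamma", "beta"};
        System.out.println(countUniqueWithList(records)); // 3
        System.out.println(countUniqueWithSet(records));  // 3
    }
}
```

Both methods give the same answer; only the cost differs. If a profiler shows most of your time in something like List.contains() or indexOf(), this is your problem.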
I agree with EFH about his number-one suspicion. Another possibility, though, is that the total memory used exceeds the available RAM, and you're page faulting. So take a look at the memory usage on the machine, not just in the JVM. I'm not sure about NT, but on Windows 2000 Pro you can find some basic info at Start -> Programs -> Accessories -> System Tools -> System Information, and you can get detailed graphs from Settings -> Control Panel -> Administrative Tools -> Performance. As for the Java side of things, if you don't have something like OptimizeIt or JProbe, you can try running java with the -Xprof or -Xrunhprof options. Also useful is -verbose:gc to see what GC is doing, as well as what your total heap size is at various times in the program.
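Putting those flags together, the command lines might look something like this (MyFileProcessor and input.dat are placeholders for your own class and file):

```shell
# Larger heap plus GC logging: -verbose:gc prints each collection,
# including heap size before and after, so you can watch growth over time.
java -Xmx256m -verbose:gc MyFileProcessor input.dat

# Sampled CPU profile: writes java.hprof.txt in the working directory
# on exit, showing where the time is going.
java -Xrunhprof:cpu=samples,depth=8 MyFileProcessor input.dat
```

If the -verbose:gc output shows the heap constantly full and collections happening back to back, raise -Xmx; if the heap looks fine, the hprof output should point at the hot method instead.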