
Processing a large data file (55 MB)

 
brisk rook
Greenhorn
Posts: 20
At one point in my program, I perform calculations on a large number of rows (~600,000). I use JSP and Java. Here are some things I noticed:
1. It's very slow! :-)
2. Initially, the CPU monitor shows a high level of CPU utilisation, but this slowly drops off, with a blip every second or so. Basically, things slow down as the process chugs along.
3. I eventually get a java.lang.OutOfMemoryError after approximately 1:00-1:30 hours.
I tried running the JVM with the heap size set to 256 MB, but I still get the out-of-memory error.
Is there a faster/better way? Any ideas on how I can get rid of the out-of-memory errors?
I searched the forum for similar discussions but couldn't find a solution. Please help!
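One thing worth verifying first: that the 256 MB setting is actually reaching the JVM running your code. If the work happens inside a servlet container, the `-Xmx` flag usually has to go in the container's own startup script, not on your command line, and a flag in the wrong place is silently ignored. A small sketch to check the ceiling from inside the program (the class name is just an illustration):

```java
public class HeapCheck {
    public static void main(String[] args) {
        // Runtime.maxMemory() reports the JVM's heap ceiling in bytes,
        // i.e. (approximately) whatever -Xmx was set to.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

Run it with `java -Xmx256m HeapCheck` (or print the same value from your servlet) and confirm it reports roughly 256 MB; if it doesn't, the flag never took effect.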
 
William Brogden
Author and all-around good cowpoke
Rancher
Posts: 13078
6
1. This is not really appropriate for the JSP forum, but here goes.
2. Something is very inefficient in your processing algorithm. Most likely you are creating a lot of String objects and failing to discard them. Converting bytes to chars when building a String is VERY slow, and of course creating lots of objects slows things down too.
3. Since you apparently have the memory to spare, read the whole file into a byte array and process it by moving index pointers through it, without ever materialising the rows as Strings.
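A minimal sketch of that byte-array approach. The file name and the row layout are assumptions for illustration (here each row is assumed to start with an integer); the point is that parsing moves an index through the raw bytes and never allocates a String per row:

```java
import java.io.DataInputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

public class ByteScan {
    // Returns {rowCount, sumOfLeadingIntegers}, computed purely with
    // index arithmetic over the byte array -- no per-row String objects.
    static long[] scan(byte[] data) {
        long sum = 0;
        long rows = 0;
        int i = 0;
        while (i < data.length) {
            // Parse the integer at the start of this row directly from bytes.
            long value = 0;
            while (i < data.length && data[i] >= '0' && data[i] <= '9') {
                value = value * 10 + (data[i] - '0');
                i++;
            }
            sum += value;
            rows++;
            // Advance past the rest of the row.
            while (i < data.length && data[i] != '\n') i++;
            i++; // step over the newline
        }
        return new long[] { rows, sum };
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical file name -- substitute your 55 MB data file.
        File f = new File("data.txt");
        byte[] data = new byte[(int) f.length()];
        DataInputStream in = new DataInputStream(new FileInputStream(f));
        in.readFully(data); // the whole file lands in memory once
        in.close();
        long[] result = scan(data);
        System.out.println(result[0] + " rows, sum = " + result[1]);
    }
}
```

A 55 MB byte array fits comfortably in a 256 MB heap, and a single `readFully` avoids the byte-to-char conversion cost entirely; extract a String only for the rare field you actually need as text.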
Bill

------------------
author of:
 