Large File in Memory

 
Erik Pragt
Ranch Hand
Posts: 125
Hello everyone.
I am creating a Java servlet that reads a file using a BufferedReader. Once the file is read, I want to process it (search for a String pattern); that part is not really the problem.
The problem is that reading the file (around 50 MB) takes a very long time (around 15 seconds on slow machines). Therefore I want to read it only once (e.g. in init()) and keep it in memory (application scope) after that. I tried using the readLine() method to put the file into a Hashtable or a Vector, but both fail with an 'Out of Memory' error. With a small file there is no problem and everything works extremely fast. The error usually occurs after about 68,000 lines of text, which is not even halfway through the actual file.
Does anyone have an idea how to solve this problem? Any help would be greatly appreciated!
Thanks,
Erik
 
Greenhorn
Posts: 29
Hi, you should use the read(char[] cbuf, int off, int len) method.
This overload lets you read only as many characters as you need at a time, rather than reading everything at once.
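In code, the chunked-read idea looks roughly like this. This is a minimal sketch: the class and method names, the 8 KB buffer size, and the carry-over trick for matches that span chunk boundaries are all illustrative choices, not code from this thread.

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.io.UncheckedIOException;

public class StreamingSearch {

    // Count occurrences of a non-empty `pattern` while reading in fixed-size
    // chunks, so the whole file never has to sit in memory at once. A tail of
    // pattern.length() - 1 chars is carried over between chunks so that
    // matches spanning a chunk boundary are still found.
    static int countMatches(Reader in, String pattern) throws IOException {
        char[] buf = new char[8192];          // arbitrary chunk size
        String carry = "";
        int count = 0;
        int n;
        while ((n = in.read(buf, 0, buf.length)) != -1) {
            String window = carry + new String(buf, 0, n);
            int idx = 0;
            while ((idx = window.indexOf(pattern, idx)) != -1) {
                count++;
                idx++;                        // also counts overlapping matches
            }
            int keep = Math.min(Math.max(pattern.length() - 1, 0), window.length());
            carry = window.substring(window.length() - keep);
        }
        return count;
    }

    // Convenience overload for in-memory strings (handy for testing).
    static int countMatches(String text, String pattern) {
        try {
            return countMatches(new StringReader(text), pattern);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```

With this approach the heap only ever holds one chunk (plus a tiny carry-over), no matter how large the file is.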
 
"The Hood"
Posts: 8521
You could try bumping up the amount of memory that the JVM has access to using the -mx mem and -ms mem parameters when starting the JVM, for example:
>java -ms32m MyApplication
From the Sun Tools Documentation:


-mx x
Sets the maximum size of the memory allocation pool (the garbage-collected heap) to x. The default is 16 megabytes of memory. x must be greater than or equal to 1000 bytes. By default, x is measured in bytes; you can specify x in kilobytes or megabytes by appending the letter "k" or "m".
-ms x
Sets the startup size of the memory allocation pool (the garbage-collected heap) to x. The default is 1 megabyte of memory. x must be greater than 1000 bytes. The same unit suffixes apply.

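Whichever values you pick, you can verify from inside the program what limits the JVM actually started with. A small sketch (the class name HeapInfo is mine; Runtime.maxMemory() requires Java 1.4 or later):

```java
public class HeapInfo {
    // Prints the heap limits the JVM was started with, so you can check
    // that the -mx / -ms flags (or the modern -Xmx / -Xms) took effect.
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        System.out.println("max heap:   " + rt.maxMemory() / (1024 * 1024) + " MB");
        System.out.println("total heap: " + rt.totalMemory() / (1024 * 1024) + " MB");
    }
}
```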
 
Erik Pragt
Ranch Hand
Posts: 125
Thanks for your advice, but I don't think this is very useful to me, because I don't know how much memory the machine has, and the size of the files can also change. I was just hoping there would be some other way to process this file quickly (within 1 second).
Thanks anyway!
Erik
 
author
Posts: 621
Check out the new I/O (NIO) API in Java 1.4. With these new classes you can use Channels to get "in memory"-like behavior, but without the memory overhead.
Here's a quick starter...
http://www.onjava.com/pub/a/onjava/2002/03/06/topten.html?page=2
Sean
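For the pattern-search use case in this thread, the NIO approach usually means memory-mapping the file with a FileChannel. Below is a minimal sketch of that idea (the class and method names are mine, the try-with-resources syntax is from later Java versions, and the byte-wise scan assumes a single-byte encoding such as ISO-8859-1):

```java
import java.io.RandomAccessFile;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.StandardCharsets;

public class MappedSearch {

    // Memory-map a file and scan it for a byte pattern. The OS pages the
    // file in on demand, so a 50 MB file does not need 50 MB of Java heap.
    static boolean contains(String path, String pattern) throws Exception {
        try (RandomAccessFile raf = new RandomAccessFile(path, "r");
             FileChannel ch = raf.getChannel()) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());
            byte[] pat = pattern.getBytes(StandardCharsets.ISO_8859_1);
            // Naive linear scan; fine for a sketch, swap in Boyer-Moore or
            // similar if throughput matters.
            for (int i = 0; i <= buf.limit() - pat.length; i++) {
                int j = 0;
                while (j < pat.length && buf.get(i + j) == pat[j]) {
                    j++;
                }
                if (j == pat.length) {
                    return true;
                }
            }
            return false;
        }
    }
}
```

Because the mapping is backed by the OS page cache rather than the Java heap, repeated searches over the same file also tend to be fast after the first pass.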
 