
ObjectInputStream wrapped in FileInputStream Consuming a Lot of Memory

 
sankar adabala
Greenhorn
Posts: 8
Hi all,

Problem: I am reading objects from a large file, approx. 400 MB, that contains thousands of objects. While reading from the file, a lot of memory is consumed, and I eventually get an OutOfMemoryError.

I faced the same problem while writing these objects to the file. There I had the option to reset my ObjectOutputStream, which allowed the objects written so far to be garbage collected. That way I could write all the objects without any error.

But while reading the objects back from the same file, I have no option to reset my ObjectInputStream, so I end up running out of memory. Is there an elegant way to handle this issue?
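
For reference, a minimal sketch of the kind of read loop described above; the file name and the per-object handling are placeholders, not from the original post:

import java.io.*;

public class ObjectReader {

    public static void main(String[] args) throws Exception {
        // "objects.dat" is a placeholder for the real 400 MB file.
        ObjectInputStream in = new ObjectInputStream(
                new BufferedInputStream(new FileInputStream("objects.dat")));
        try {
            while (true) {
                Object obj;
                try {
                    obj = in.readObject();
                } catch (EOFException eof) {
                    break; // no more objects in the file
                }
                process(obj);
                // Even after we drop our reference, ObjectInputStream keeps
                // every object it has read in an internal handle table (so
                // later back-references can resolve), which is why memory
                // grows over the length of the file.
            }
        } finally {
            in.close();
        }
    }

    private static void process(Object obj) {
        System.out.println(obj); // application-specific handling goes here
    }
}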


Thanks and Regards,
Sankar adabala
 
Jim Yingst
Wanderer
Sheriff
Posts: 18671
Unfortunately, if you can't change the way this file is written in the first place, I don't think there's any way to reduce the memory necessary to read it, short of studying the Java Serialization Specification and writing your own custom deserializer, which is probably a lot more trouble than it's worth. Your best bet is to increase the memory allocation for your JVM as much as possible, e.g.

java -Xmx1024m MyClass

The -Xmx option is documented here under Nonstandard Options. You may end up requiring more or less than 1024 MB; that was just a wild guess on my part.
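
As a quick sanity check, you can confirm that the setting actually took effect by printing Runtime.maxMemory() at startup; the class name here is just for illustration:

public class HeapCheck {
    public static void main(String[] args) {
        // Runtime.maxMemory() reports the maximum heap the JVM will attempt
        // to use, i.e. roughly the value passed via -Xmx.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}

Run it as "java -Xmx1024m HeapCheck" and it should report roughly 1024 MB.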
 
sankar adabala
Greenhorn
Posts: 8
Thanks a lot, Jim,

I have already done that, increasing the memory to 1024 MB. That is the limit set for my application; we don't want to go beyond it, since doing so would raise the requirements for customers installing the product.

I want to confirm: is writing our own custom ObjectInputStream the only solution for this?
 
Jim Yingst
Wanderer
Sheriff
Posts: 18671
Well, here are the possibilities I see:
  • Force whoever is writing these 400 MB object stream files to STOP DOING THAT. It's an extremely bad idea to expect applications to be able to read that much into memory.
  • Write a client-side program to read these monster files and convert them to a set of smaller files. This application would do nothing else, and exit when it's done, so maybe the 1024 MB will be enough. Subsequently your main application can read the set of smaller files and then do whatever else it needs to with them. (A rough sketch of such a converter follows at the end of this post.)
  • Increase the -Xmx beyond 1024 MB even though you don't want to. The client machines don't necessarily need to have this much RAM, but the program will run slower if it's paging virtual memory. Too bad, but at least it should run.
  • Write your own ObjectInputStream. You can use the code of the existing ObjectInputStream class and make a modified version. I'd look at the nested HandleTable class - that seems to be where they keep the list of all previously deserialized objects (which is necessary because later objects may contain references to earlier ones). You could replace the in-memory arrays there with a more sophisticated caching system: whenever the cache gets too big, take the least-recently-used objects, reserialize them to files, and remove the references to them from the cache. Later, if you need to look up one of those objects, you can deserialize it again from its file.

There may be other possibilities, but those are the best ones I can think of.
[ September 17, 2006: Message edited by: Jim Yingst ]
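
To make the second option concrete, here is a rough sketch of such a converter. The file names, chunk size, and class name are illustrative only, and it assumes the converter itself runs with a generous -Xmx, since reading the monster file still fills the input stream's handle table:

import java.io.*;

public class FileSplitter {

    private static final int OBJECTS_PER_CHUNK = 1000; // tune to taste

    public static void main(String[] args) throws Exception {
        // "monster.dat" stands in for the real 400 MB file.
        ObjectInputStream in = new ObjectInputStream(
                new BufferedInputStream(new FileInputStream("monster.dat")));
        try {
            int chunk = 0;
            int count = 0;
            ObjectOutputStream out = openChunk(chunk);
            try {
                while (true) {
                    Object obj;
                    try {
                        obj = in.readObject();
                    } catch (EOFException eof) {
                        break; // end of the monster file
                    }
                    out.writeObject(obj);
                    if (++count % OBJECTS_PER_CHUNK == 0) {
                        out.close();              // finish this chunk...
                        out = openChunk(++chunk); // ...and start a fresh one
                    }
                }
            } finally {
                out.close();
            }
        } finally {
            in.close();
        }
    }

    // Each chunk gets its own ObjectOutputStream, so each chunk's handle
    // table starts empty. Note that object references shared across chunk
    // boundaries will be written out twice, as separate copies.
    private static ObjectOutputStream openChunk(int n) throws IOException {
        return new ObjectOutputStream(new BufferedOutputStream(
                new FileOutputStream("chunk" + n + ".dat")));
    }
}

The main application would then read chunk0.dat, chunk1.dat, and so on one at a time, so no single ObjectInputStream ever has to hold more than one chunk's worth of objects in its handle table.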
     