
HashMap with huge data

 
Balakrishnan Venkatasubramanian
Greenhorn
Posts: 1
Hi Friends,

I am facing an issue while loading a text file into a HashMap. The text file is 80 MB. If I read it line by line and load it into a HashMap, I get this error: "Exception in thread "main" java.lang.OutOfMemoryError: Java heap space".
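For reference, my loading code looks roughly like this (a simplified sketch; the "key=value" line format is just an illustration, not necessarily the real file format):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

public class LoadFile {

    // Reads a text file line by line into a HashMap.
    // Assumes each line looks like "key=value" (illustrative format only).
    static Map<String, String> load(String path) throws IOException {
        Map<String, String> map = new HashMap<String, String>();
        BufferedReader reader = new BufferedReader(new FileReader(path));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                int sep = line.indexOf('=');
                if (sep > 0) {
                    map.put(line.substring(0, sep), line.substring(sep + 1));
                }
            }
        } finally {
            reader.close();
        }
        return map;
    }
}
```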

I understand we can solve this by increasing the JVM heap space, but I am not sure how critical or how easy that is. Is it advisable to do that?

Also, is it advisable to hold huge data in memory in a HashMap and write it back to disk for future reference at the end of execution?

Kindly suggest.

Regards,
Balki.
 
Joe Ess
Bartender
Posts: 9338
Welcome to the JavaRanch.
The default maximum heap size is 64 MB, so if you want to read in 80 MB of data, you will have to increase it. This is done with the -Xmx command-line switch, e.g. java -Xmx256m YourMainClass. See the java command documentation for more.
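You can also check the heap ceiling from inside the program, which is handy for confirming the switch took effect. A small sketch (the class name HeapInfo is just for illustration):

```java
public class HeapInfo {
    public static void main(String[] args) {
        // maxMemory() reports the largest heap the JVM will attempt to use,
        // which reflects the -Xmx setting (run with e.g. "java -Xmx256m HeapInfo").
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```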
 