
Reading a huge (80 GB) CSV file using Super CSV

 
Nikhil Das Nomula
Greenhorn
Posts: 26
I want to read a huge CSV file (around 80 GB). I use Super CSV to parse CSV files in general, but in this case the file is so large that trying to load it all at once always runs out of memory, for obvious reasons.

My initial idea was to read the file in chunks, but I am not sure that would work with Super CSV: only the first chunk contains the header row, so only that chunk could be mapped into the CSV bean, while the remaining chunks have no header and I suspect the reader would throw an exception for them. Is there another way to approach this problem?
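For what it's worth, the usual way around this is to stream the file row by row rather than split it into chunks: read the header once, then keep only one row in memory at a time, so memory use stays constant regardless of file size (Super CSV's readers, such as CsvBeanReader, read one row per call in the same spirit). A minimal stdlib-only sketch of that idea, using a naive comma split (no quoted-field handling, which a real CSV parser would do) and a hypothetical `forEachRow` helper:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.function.Consumer;

public class StreamingCsv {

    // Reads the header line exactly once, then hands each data row
    // (split into fields) to the handler one at a time. Only one row
    // is ever held in memory, so an 80 GB file is no problem.
    // NOTE: split(",") is a simplification -- it ignores quoting.
    static long forEachRow(Reader in, Consumer<String[]> rowHandler) throws IOException {
        try (BufferedReader reader = new BufferedReader(in)) {
            String headerLine = reader.readLine();           // header read once, up front
            if (headerLine == null) return 0;                // empty file: nothing to do
            String[] header = headerLine.split(",");
            long rows = 0;
            String line;
            while ((line = reader.readLine()) != null) {     // one row at a time
                String[] fields = line.split(",", header.length);
                rowHandler.accept(fields);                   // process, don't accumulate
                rows++;
            }
            return rows;
        }
    }

    public static void main(String[] args) throws IOException {
        // Small in-memory example; with a real file you would pass a FileReader.
        String csv = "name,age\nalice,30\nbob,25\n";
        long[] ageSum = {0};
        long rows = forEachRow(new StringReader(csv),
                fields -> ageSum[0] += Long.parseLong(fields[1]));
        System.out.println(rows + " rows, age sum " + ageSum[0]);
    }
}
```

Because each row is processed and then discarded, the header-only-in-the-first-chunk problem never arises: there are no chunks, just one pass over the stream.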
 