
Performance issue - reading a flat file with millions of records and saving them to the database

 
srinivas srini
Greenhorn
Posts: 23
Hi,
I am trying to read a flat file with more than 50 million records and save the data to a database.
I am currently using the collections framework, but it is slowing down the processing: I fetch a few lines, parse them, and then start again from the point where the last chunk ended. I would like to know if there is a better solution for this.
I am using JDBC to update the DB, plus a helper class that reads the flat file, populates the collections, and ultimately passes the collections object to the DAO layer for further processing.
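Below is a minimal sketch of one streaming alternative: read the file line by line and push rows to the database with JDBC batch inserts, so the whole file is never held in collections. The connection URL, credentials, table and column names, the comma delimiter, and the 10,000-row batch size are placeholders for illustration, not details from my actual code.

import java.io.BufferedReader;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class FlatFileLoader {

    private static final int BATCH_SIZE = 10_000; // assumed; tune for your driver/DB

    public static void main(String[] args) throws Exception {
        String file = args[0];
        // Hypothetical connection details -- replace with your own.
        try (Connection con = DriverManager.getConnection(
                 "jdbc:mysql://localhost/mydb", "user", "password");
             PreparedStatement ps = con.prepareStatement(
                 "INSERT INTO records (col1, col2) VALUES (?, ?)");
             BufferedReader in = Files.newBufferedReader(Paths.get(file))) {

            con.setAutoCommit(false);          // commit per batch, not per row
            int count = 0;
            String line;
            while ((line = in.readLine()) != null) {
                String[] fields = line.split(",");   // assumed delimiter
                ps.setString(1, fields[0]);
                ps.setString(2, fields[1]);
                ps.addBatch();
                if (++count % BATCH_SIZE == 0) {
                    ps.executeBatch();         // send the accumulated batch
                    con.commit();
                }
            }
            ps.executeBatch();                 // flush the final partial batch
            con.commit();
        }
    }
}

Because each line is parsed and queued immediately, memory use stays flat regardless of file size, and batching a few thousand rows per round trip avoids the per-row insert overhead.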

Any help or suggestions in this regard would be great.
Thanks in advance.
 
Adam Michalik
Ranch Hand
Posts: 128
Here is a post on a similar problem.
 
Paul Sturrock
Bartender
Posts: 10336
Eclipse IDE Hibernate Java
Any reason you don't use the bulk loading tool your database provides rather than writing your own?
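As a rough illustration of that suggestion, assuming MySQL (other databases have their own tools, such as Oracle SQL*Loader or PostgreSQL COPY): the bulk loader can be invoked straight from JDBC, which is usually far faster than row-by-row inserts. The file name, table, columns, and connection details are placeholders, and Connector/J must have allowLoadLocalInfile enabled for LOAD DATA LOCAL to work.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class BulkLoad {
    public static void main(String[] args) throws Exception {
        // Assumed MySQL connection; local-infile loading must be allowed.
        try (Connection con = DriverManager.getConnection(
                 "jdbc:mysql://localhost/mydb?allowLoadLocalInfile=true",
                 "user", "password");
             Statement st = con.createStatement()) {
            // The server/driver handles parsing and inserting in bulk.
            st.execute(
                "LOAD DATA LOCAL INFILE 'records.txt' " +
                "INTO TABLE records " +
                "FIELDS TERMINATED BY ',' " +
                "LINES TERMINATED BY '\\n' " +
                "(col1, col2)");
        }
    }
}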
 