Performance issue - reading a flat file with millions of records and save them to database
I am trying to read a flat file with more than 50 million records and save the data into a database.
I am currently using the collections framework, but it is slowing the processing down (I fetch a few lines, parse them, and then resume from where the last read ended), so I would like to know if there is a better approach.
I am using JDBC to update the DB, plus a helper class that reads the flat file, populates the collections, and ultimately passes the collection object to the DAO layer for further processing.

Any help/suggestions in this regard would be great.
Thanks in advance.
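One common way to keep memory flat with a file this size is to stream it line by line and flush JDBC batches at a fixed interval, instead of building up large collections first. Here is a minimal sketch; the comma delimiter, the two-column `records` table, and the batch size are assumptions for illustration, not anything from your actual schema:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class FlatFileLoader {
    static final int BATCH_SIZE = 10_000; // tune for your driver/DB

    // Parse one delimited line into fields; the comma delimiter is an
    // assumption for illustration.
    static String[] parseLine(String line) {
        return line.split(",", -1);
    }

    // Stream the file and flush a JDBC batch every BATCH_SIZE rows,
    // so memory use stays constant no matter how many records the file holds.
    static long load(Reader source, Connection conn) throws IOException, SQLException {
        String sql = "INSERT INTO records (col_a, col_b) VALUES (?, ?)"; // hypothetical table
        long total = 0;
        conn.setAutoCommit(false); // commit per batch, not per row
        try (BufferedReader reader = new BufferedReader(source);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            String line;
            int pending = 0;
            while ((line = reader.readLine()) != null) {
                String[] f = parseLine(line);
                ps.setString(1, f[0]);
                ps.setString(2, f[1]);
                ps.addBatch();
                if (++pending == BATCH_SIZE) {
                    ps.executeBatch();
                    conn.commit();
                    pending = 0;
                }
                total++;
            }
            if (pending > 0) { // flush the final partial batch
                ps.executeBatch();
                conn.commit();
            }
        }
        return total;
    }
}
```

The key points are `addBatch()`/`executeBatch()` instead of one round trip per row, and turning off auto-commit so the driver is not forced to commit 50 million times.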
Here is a post on a similar problem.
Any reason you don't use the bulk loading tool your database provides rather than writing your own?
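To illustrate the bulk-loading suggestion: if the database happens to be MySQL (an assumption; Oracle has SQL*Loader, PostgreSQL has COPY, etc.), its `LOAD DATA` statement can even be issued through a plain JDBC `Statement`, which typically beats row-by-row inserts by a wide margin. The file path and table name below are placeholders:

```java
public class BulkLoad {
    // Build a MySQL LOAD DATA statement for a given file and table.
    // MySQL is an assumption here; use your database's own bulk loader.
    static String loadDataSql(String filePath, String table) {
        return "LOAD DATA LOCAL INFILE '" + filePath + "' INTO TABLE " + table
             + " FIELDS TERMINATED BY ','";
    }

    // Issued via JDBC (conn is an open java.sql.Connection):
    // try (java.sql.Statement st = conn.createStatement()) {
    //     st.execute(loadDataSql("/data/records.csv", "records"));
    // }
}
```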

