Performance issue - reading a flat file with millions of records and saving them to a database
I am trying to read a flat file with millions of records (more than 50 million) and save the data into a database.
I am currently using the collections framework, but that slows processing down (I fetch a few lines, parse them, and then start again from where I last ended), so I would like to know if there is a better solution for this.
I am using JDBC to update the DB, plus a helper class that reads the flat file, populates the collections, and ultimately passes the collections object to the DAO layer for further processing.

Any help or suggestions in this regard would be great.
Thanks in advance.
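A common fix for this pattern is to stream the file line by line and batch the JDBC inserts, so nothing accumulates in a collection. Below is a minimal sketch; the table name, column names, comma delimiter, and batch size are all assumptions you would adapt to your schema:

```java
import java.io.BufferedReader;
import java.io.Reader;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class FlatFileLoader {
    static final int BATCH_SIZE = 10_000; // tune against your driver/DB

    /** One parsed row goes in; flush() marks a batch boundary. */
    interface RowSink {
        void accept(String[] fields) throws Exception;
        void flush() throws Exception;
    }

    /** JDBC sink: addBatch() per row, executeBatch() + commit() per flush.
     *  The table and column names here are placeholders for your schema. */
    static class JdbcRowSink implements RowSink {
        private final Connection conn;
        private final PreparedStatement ps;

        JdbcRowSink(Connection conn) throws SQLException {
            this.conn = conn;
            conn.setAutoCommit(false); // commit per batch, not per row
            this.ps = conn.prepareStatement(
                "INSERT INTO records (col_a, col_b) VALUES (?, ?)");
        }
        public void accept(String[] f) throws SQLException {
            ps.setString(1, f[0]);
            ps.setString(2, f[1]);
            ps.addBatch();
        }
        public void flush() throws SQLException {
            ps.executeBatch();
            conn.commit();
        }
    }

    /** Streams the input one line at a time, so heap use stays flat no
     *  matter how many millions of lines the file has. */
    public static long load(Reader source, RowSink sink) throws Exception {
        long rows = 0;
        try (BufferedReader in = new BufferedReader(source)) {
            String line;
            while ((line = in.readLine()) != null) {
                sink.accept(line.split(",", -1)); // assuming comma-separated fields
                if (++rows % BATCH_SIZE == 0) sink.flush();
            }
            sink.flush(); // flush the final partial batch
        }
        return rows;
    }
}
```

With this shape there is no need to re-seek to "the point where you last ended": the reader keeps its position, and only one batch of rows is in memory at a time.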
Here is a post on a similar problem.
Any reason you don't use the bulk loading tool your database provides rather than writing your own?
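For example, MySQL's LOAD DATA can be issued through a plain JDBC Statement and lets the database engine ingest the file itself, which is usually far faster than row-by-row inserts. The file path, table name, and delimiter below are placeholders; PostgreSQL (COPY), Oracle (SQL*Loader), and SQL Server (BULK INSERT) have their own equivalents:

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class BulkLoad {
    /** Builds a MySQL-style LOAD DATA statement. Path, table, and
     *  delimiter are hypothetical and must match your file's format. */
    static String loadDataSql(String path, String table) {
        return "LOAD DATA LOCAL INFILE '" + path + "' INTO TABLE " + table
             + " FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'";
    }

    /** Hands the whole load to the database in one statement. */
    static void bulkLoad(Connection conn, String path, String table)
            throws SQLException {
        try (Statement st = conn.createStatement()) {
            st.execute(loadDataSql(path, table));
        }
    }
}
```

Note that LOCAL INFILE must be enabled on both the server and the JDBC connection for this to work.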
