Performance issue - reading a flat file with millions of records and saving them to a database
Hi,
I am trying to read a flat file that has millions of records (more than 50 million) and save the data to a database.
I am currently using the collections framework, but that is slowing down the processing (I fetch a few lines, parse them, and then resume from where I last left off), so I would like to know if there is a better solution for this.
I am using JDBC to update the DB, with a helper class that reads the flat file, populates the collections, and ultimately passes the collections object to the DAO layer for further processing.

Any help/suggestions in this regard would be great.
Thanks in advance.
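If you do stay with JDBC, one common pattern is to stream the file line by line and use PreparedStatement batching instead of accumulating everything in collections first. Here is a minimal sketch; the table name `records`, its columns `col1`/`col2`, the pipe-delimited layout, and the JDBC URL are all assumptions you would replace with your own:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class FlatFileLoader {
    private static final int BATCH_SIZE = 10_000; // tune for your driver/DB

    // Parse one pipe-delimited line into fields (the layout is an assumption).
    public static String[] parseLine(String line) {
        return line.split("\\|", -1);
    }

    public static void load(String file, String jdbcUrl) throws IOException, SQLException {
        try (Connection con = DriverManager.getConnection(jdbcUrl);
             BufferedReader in = Files.newBufferedReader(Paths.get(file));
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO records (col1, col2) VALUES (?, ?)")) {
            con.setAutoCommit(false);          // commit per batch, not per row
            String line;
            int count = 0;
            while ((line = in.readLine()) != null) {
                String[] f = parseLine(line);
                ps.setString(1, f[0]);
                ps.setString(2, f[1]);
                ps.addBatch();
                if (++count % BATCH_SIZE == 0) {
                    ps.executeBatch();         // one round trip for the whole batch
                    con.commit();
                }
            }
            ps.executeBatch();                 // flush the remaining rows
            con.commit();
        }
    }
}
```

The key points are that only one line is held in memory at a time (no giant collection), and each `executeBatch()` sends thousands of rows in a single round trip. This will still usually be slower than the database's native bulk loader, but it avoids the fetch-parse-restart overhead described above.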
Here is a post on a similar problem.
Any reason you don't use the bulk loading tool your database provides rather than writing your own?
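For example, if the database happened to be PostgreSQL (other databases have equivalents, e.g. MySQL's LOAD DATA INFILE, Oracle's SQL*Loader, SQL Server's bcp), psql's `\copy` streams the file straight into the table without going through JDBC at all. The database name `mydb`, table `records`, columns, and delimiter below are assumptions for illustration:

```shell
# Stream the flat file directly into a (hypothetical) "records" table.
# \copy runs client-side, so the file only needs to be readable by psql,
# not by the database server itself.
psql -d mydb -c "\copy records (col1, col2) FROM 'data.txt' WITH (FORMAT csv, DELIMITER '|')"
```

Bulk loaders are built for exactly this scale and typically beat hand-rolled JDBC code by a wide margin.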

