Performance issue - reading a flat file with millions of records and save them to database
posted 7 years ago
I am trying to read a flat file that has millions of records (more than 50 million) and save the data to a database.
I am currently using the collections framework, but that is slowing down the processing (I fetch a few lines, parse them, and then start again from the point where I last ended), so I would like to know if there is a better solution for this.
I am using JDBC to update the DB, along with a helper class that reads the flat file, populates the collections, and ultimately passes this collections object to the DAO layer for further processing.
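One common way to avoid building up huge collections is to stream the file line by line and hand off fixed-size batches as you go, so memory use stays flat no matter how large the file is. Below is a minimal sketch of that pattern; the `BatchLoader` class name, the batch size, and the flush callback are illustrative assumptions, and the actual flush would typically call `PreparedStatement.addBatch()`/`executeBatch()` in your DAO layer:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class BatchLoader {

    // Streams lines from the reader and hands off fixed-size batches to the
    // flush callback, so only one batch is ever held in memory at a time.
    static long load(BufferedReader reader, int batchSize,
                     Consumer<List<String>> flush) throws IOException {
        List<String> batch = new ArrayList<>(batchSize);
        long total = 0;
        String line;
        while ((line = reader.readLine()) != null) {
            batch.add(line);
            if (batch.size() == batchSize) {
                // In a real DAO this is where you would parse each line and
                // call PreparedStatement.addBatch(), then executeBatch().
                flush.accept(batch);
                total += batch.size();
                batch.clear();
            }
        }
        if (!batch.isEmpty()) {          // flush the final partial batch
            flush.accept(batch);
            total += batch.size();
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Demo with an in-memory "file" of 10 rows and a batch size of 4.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 10; i++) sb.append("row").append(i).append('\n');
        long n = load(new BufferedReader(new StringReader(sb.toString())), 4,
                      batch -> System.out.println("flushing " + batch.size() + " rows"));
        System.out.println("total=" + n);   // prints total=10
    }
}
```

With a real file you would wrap a `FileReader` (or `Files.newBufferedReader`) instead of the `StringReader`, turn off auto-commit, and commit once per batch or per N batches so the database is not committing every single row.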
Any help/suggestions in this regard would be great.
Thanks in advance.