posted 13 years ago
I want to insert a large amount of data into a table. The data is in a flat file, and each line basically maps to one row in the table. I could first build a collection of <record> objects and then use Spring's BatchSqlUpdate to insert them. Can someone tell me what exactly is good about BatchSqlUpdate? Is it much faster than inserting record by record?
Furthermore, is speed its only advantage? Has anybody thought about memory consumption? I would still have to load 1 million record objects into memory first! Does Spring have a way to avoid creating 1 million objects in memory?
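For reference, here is roughly the chunked approach I have in mind: read the file lazily and flush a fixed-size batch at a time, so only one batch of records is ever in memory. The names `processInBatches` and `flush` are just illustrative; in real code the flush callback would wrap something like `JdbcTemplate.batchUpdate(sql, batchArgs)`.

```java
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
import java.util.function.Consumer;

// Sketch: stream rows from a flat file and flush in fixed-size batches,
// instead of materializing all 1M record objects up front.
public class BatchInsertSketch {

    // Consumes rows lazily; "flush" stands in for the actual batch insert
    // (e.g. jdbcTemplate.batchUpdate(...)). Returns the total row count.
    public static int processInBatches(Iterator<String> rows, int batchSize,
                                       Consumer<List<String>> flush) {
        List<String> batch = new ArrayList<>(batchSize);
        int total = 0;
        while (rows.hasNext()) {
            batch.add(rows.next());
            if (batch.size() == batchSize) {
                flush.accept(batch);          // one round trip per batch
                total += batch.size();
                batch = new ArrayList<>(batchSize);
            }
        }
        if (!batch.isEmpty()) {               // flush the final partial batch
            flush.accept(batch);
            total += batch.size();
        }
        return total;
    }

    public static void main(String[] args) {
        // Simulated "flat file" of 10 lines; real code would use
        // something like Files.lines(path).iterator() to stay lazy.
        Iterator<String> lines = java.util.stream.IntStream.range(0, 10)
                .mapToObj(i -> "row-" + i).iterator();
        List<Integer> flushSizes = new ArrayList<>();
        int n = processInBatches(lines, 3, b -> flushSizes.add(b.size()));
        System.out.println(n + " rows, batch sizes " + flushSizes);
        // prints: 10 rows, batch sizes [3, 3, 3, 1]
    }
}
```

With this shape, memory is bounded by the batch size rather than the file size; the open question is whether BatchSqlUpdate (or JdbcTemplate.batchUpdate) buys anything beyond fewer round trips.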
Thanks.