Hi all, I have a list with millions of records. I want to divide the list into smaller groups for later processing, say 3000 keys each (a "bulk"), and insert each bulk into a Bulk table. Another process handles these bulks later on.
I based my code on a sample from the Sun website (the executeBatch tutorial) and wrote the following method.
On a 10,000-record list the bulk dividing is done in less than 2 seconds. On a 500,000-record list it takes about 3 minutes, and beyond 500,000 I get an out-of-memory exception.
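For reference, the splitting step by itself can be done with `List.subList` views so that each bulk is copied exactly once. This is a minimal sketch, not Sharon's actual method; the class and method names are made up:

```java
import java.util.ArrayList;
import java.util.List;

public class BulkSplitter {
    /** Splits keys into consecutive bulks of at most bulkSize elements. */
    static <T> List<List<T>> split(List<T> keys, int bulkSize) {
        List<List<T>> bulks = new ArrayList<>();
        for (int from = 0; from < keys.size(); from += bulkSize) {
            int to = Math.min(from + bulkSize, keys.size());
            // subList is a view over keys; copy it so the bulk
            // stays valid even if the source list changes later
            bulks.add(new ArrayList<>(keys.subList(from, to)));
        }
        return bulks;
    }

    public static void main(String[] args) {
        List<Integer> keys = new ArrayList<>();
        for (int i = 0; i < 10000; i++) keys.add(i);
        List<List<Integer>> bulks = split(keys, 3000);
        // 10,000 keys in bulks of 3000 -> 3000 + 3000 + 3000 + 1000
        System.out.println(bulks.size());
        System.out.println(bulks.get(3).size());
    }
}
```

Splitting this way is linear in the list size, so on its own it should stay well under a second even for a million keys; if the dividing step is slow, the time is more likely going into the database inserts.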
Sharon, is the out of memory coming from the database part or from the rest of the code? If you comment out the prepared-statement part and the error goes away, that tells you the problem is in the database code/driver. In that case, you can commit more often so you don't build up as much memory usage.
The out of memory was coming from the Java side; the problem was solved using -Xmx512m.
But the bulk splitting still takes some time.
I clear the statement's parameters after each "addBatch" call so they will be removed. And I added:
So the PreparedStatement will flush to the DB every 100 records (each a 6000-character CLOB). But this process still takes a large amount of time (11 seconds for just one million records) and a lot of memory. What am I doing wrong?
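The flush-every-N pattern being described usually looks like the sketch below. The table and column names are made up, and it assumes auto-commit is off on the connection:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BulkInserter {
    static final int FLUSH_EVERY = 100;

    /** True when the i-th record (1-based) should trigger an executeBatch. */
    static boolean shouldFlush(int i) {
        return i % FLUSH_EVERY == 0;
    }

    /** Inserts the clobs in batches, flushing every FLUSH_EVERY rows. */
    static void insert(Connection con, List<String> clobs) throws SQLException {
        // "bulk_table"/"data" are illustrative names only
        try (PreparedStatement ps =
                 con.prepareStatement("INSERT INTO bulk_table (data) VALUES (?)")) {
            int i = 0;
            for (String clob : clobs) {
                ps.setString(1, clob);
                ps.addBatch();            // addBatch copies the current parameters
                ps.clearParameters();     // so clearing them here is safe
                if (shouldFlush(++i)) {
                    ps.executeBatch();    // send the accumulated 100 rows
                    con.commit();         // free pending work on the DB side
                }
            }
            ps.executeBatch();            // flush any remainder (< 100 rows)
            con.commit();
        }
    }
}
```

Flushing and committing periodically keeps neither the driver nor the server holding a million pending rows at once; a smaller interval trades extra round-trips for lower memory.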
11 seconds to insert one million records isn't bad. A database import utility could take almost that long for a million records.
For the amount of memory used, you can check whether you need to hold all the data in memory at once. If you can do it in stages (read in some records, load them, repeat), this would save a lot of memory.
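The staged approach amounts to reading a chunk, loading it, and repeating, so only one chunk of keys is ever in memory at a time. A minimal sketch, assuming the records arrive one per line from some Reader (the names and the line-per-record format are assumptions):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class StagedLoader {
    /** Reads up to chunkSize lines; returns an empty list at end of input. */
    static List<String> readChunk(BufferedReader in, int chunkSize) throws IOException {
        List<String> chunk = new ArrayList<>(chunkSize);
        String line;
        while (chunk.size() < chunkSize && (line = in.readLine()) != null) {
            chunk.add(line);
        }
        return chunk;
    }

    /** Processes the whole input in stages instead of loading it all at once. */
    static int processInStages(Reader source, int chunkSize) throws IOException {
        BufferedReader in = new BufferedReader(source);
        int total = 0;
        List<String> chunk;
        while (!(chunk = readChunk(in, chunkSize)).isEmpty()) {
            // insert this chunk into the Bulk table here,
            // then let it go out of scope so it can be collected
            total += chunk.size();
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // 5 records processed in chunks of 2: chunks of 2, 2, 1
        System.out.println(processInStages(new StringReader("a\nb\nc\nd\ne"), 2));
    }
}
```

With chunks of 3000 the working set stays at 3000 records regardless of how large the input is, which is usually enough to avoid the -Xmx tuning entirely.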