
Fastest way to insert 1 Lac records in a table

 
rastogi payam
Ranch Hand
Posts: 47
Chrome Eclipse IDE Tomcat Server
Hi All,
I am working on an application in which there is a requirement to insert 1 lac (100,000) records. We are using an Oracle database.
Can anybody suggest how to do that?
 
Martin Vajsar
Sheriff
Posts: 3752
62
Chrome Netbeans IDE Oracle
I assume you're going to insert the records using the INSERT INTO ... VALUES command. If this is not the case, the following advice will only partially apply, or not apply at all.

1) Use a PreparedStatement and bind variables for all of the values. Create the statement only once; for every record you need to insert, set the values of the binds and execute the statement. If some column values are the same for all rows you're going to insert, you might replace them with literals, but I mention this just as a possibility; it would be a step too far in my opinion.
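A minimal sketch of that pattern (table and column names `my_table`, `id`, `name` are made up for illustration; it assumes you already have an open Connection):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class SimpleInsert {
    // Reuses a single PreparedStatement for all rows: the SQL is parsed
    // once, only the bind values change on each iteration.
    static void insertAll(Connection conn, List<String[]> rows) throws SQLException {
        String sql = "INSERT INTO my_table (id, name) VALUES (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            for (String[] row : rows) {
                ps.setString(1, row[0]);  // bind values change per row
                ps.setString(2, row[1]);
                ps.executeUpdate();       // one round trip per row (no batching yet)
            }
        }
    }
}
```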

2) Use batches. If you can use the JDBC driver for Oracle version 11 (usually the file ojdbc5.jar or ojdbc6.jar), you can use standard JDBC batching (see PreparedStatement's addBatch and executeBatch methods). Your code will be compatible with other databases too.
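A sketch of standard JDBC batching, again with made-up table and column names. Rows are queued with addBatch and sent to the database in groups, which cuts the number of round trips dramatically:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchInsert {
    static void insertAll(Connection conn, List<String[]> rows, int batchSize)
            throws SQLException {
        String sql = "INSERT INTO my_table (id, name) VALUES (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int count = 0;
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.addBatch();                 // queue the row locally
                if (++count % batchSize == 0) {
                    ps.executeBatch();         // send one full batch to the database
                }
            }
            ps.executeBatch();                 // flush any remaining queued rows
        }
    }
}
```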

If you have to use an older driver version, you need to use a batching mechanism specific to Oracle: cast the PreparedStatement to OraclePreparedStatement and call its setExecuteBatch(int batchSize) method, specifying the desired batch size. Then insert the records within a single loop; do not call addBatch or executeBatch. The driver will create batches of the requested size automatically. This is actually much easier to code than standard JDBC batching, and I'm using this method even though I have the Oracle 11 driver. It ties the code to Oracle, of course.
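A sketch of the Oracle-specific variant, under the same made-up table names. It assumes the proprietary oracle.jdbc.OraclePreparedStatement API from the ojdbc jar, so it only compiles with that driver on the classpath:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;
import oracle.jdbc.OraclePreparedStatement; // Oracle-proprietary, from the ojdbc jar

public class OracleBatchInsert {
    static void insertAll(Connection conn, List<String[]> rows) throws SQLException {
        String sql = "INSERT INTO my_table (id, name) VALUES (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            // Tell the driver to accumulate rows and send them in groups of 100.
            ((OraclePreparedStatement) ps).setExecuteBatch(100);
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.executeUpdate();   // queued by the driver; full batches go out automatically
            }
            // Flush any rows still queued in a partial batch.
            ((OraclePreparedStatement) ps).sendBatch();
        }
    }
}
```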

The optimal batch size has to be found experimentally. I'd suggest starting with 100, unless your table has a lot of columns, in which case a lower value might be better. Setting the value too high might lead to sub-optimal performance or even exceptions (at least I've encountered them). Ideally, make the batch size configurable and test the performance with various batch sizes in the target environment.

3) If your table contains a key column you need to initialize, use a sequence for that; don't use triggers. It is best to avoid insert/update triggers completely.
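The sequence can supply the key inside the INSERT itself, so no BEFORE INSERT trigger is needed. A sketch with placeholder names (my_seq, my_table):

```java
public class SequenceExample {
    // One-time DDL (placeholder name):
    //   CREATE SEQUENCE my_seq;
    //
    // Referencing my_seq.NEXTVAL directly in the statement means the key
    // is generated by the sequence as part of the insert, with no trigger
    // firing on every row. Only the remaining columns are bound.
    static final String INSERT_SQL =
        "INSERT INTO my_table (id, name) VALUES (my_seq.NEXTVAL, ?)";
}
```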

All three of these points are important.
 
Ireneusz Kordal
Ranch Hand
Posts: 423
The fastest way is to load the data from a flat file using SQL*Loader.
If your data is in a flat file that the application would otherwise have to read and insert into the database,
you may call SQL*Loader from the application and load the file directly into the database.
Loading even 1,000,000 records from a local disk can take only a few seconds.
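A sketch of what that looks like; the file, table, and column names (records.csv, my_table, id, name) are placeholders, and the sqlldr invocation itself requires an Oracle client installation, so it is shown commented out:

```shell
# Write a minimal SQL*Loader control file describing the flat file layout.
cat > records.ctl <<'EOF'
LOAD DATA
INFILE 'records.csv'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ','
(id, name)
EOF

# Then run SQL*Loader against it (placeholder credentials):
# sqlldr userid=user/password@db control=records.ctl log=records.log
```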
 