Khuda Haafiz Muzza 4 Java
mozammil muzza wrote:I am trying to run an application that inserts 1 million records into a DB table with 7 columns, which has 1 PK, 1 FK and 3 unique index constraints on it.
I am using PreparedStatement and JDBC batching, and after every 2000 rows I call executeBatch().
Here, half a million records takes almost 3 minutes, i.e. 182 seconds.
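The pattern described — PreparedStatement plus addBatch()/executeBatch() every 2000 rows — might look roughly like this (table and column names are hypothetical placeholders; adjust to your schema):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchInsert {
    static final int BATCH_SIZE = 2000;  // flush every 2000 rows, as described above

    // Pure helper: how many executeBatch() calls a run of n rows needs
    // (one per full batch, plus one final flush for any remainder).
    static int flushCount(int rows, int batchSize) {
        return rows / batchSize + (rows % batchSize == 0 ? 0 : 1);
    }

    // Hypothetical table and columns; the real table has 7 columns.
    static void load(Connection con, java.util.List<Object[]> rows) throws SQLException {
        con.setAutoCommit(false);  // commit once at the end, not per statement
        String sql = "INSERT INTO my_table (c1, c2, c3) VALUES (?, ?, ?)";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            int count = 0;
            for (Object[] row : rows) {
                for (int i = 0; i < row.length; i++) ps.setObject(i + 1, row[i]);
                ps.addBatch();
                if (++count % BATCH_SIZE == 0) ps.executeBatch();  // flush a full batch
            }
            ps.executeBatch();  // flush the final partial batch, if any
            con.commit();
        }
    }
}
```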
"Leadership is nature's way of removing morons from the productive flow" - Dogbert
Articles by Winston can be found here
Second: Have you considered re-designing your table? 4 unique indexes (including, presumably, PK) on a table with 7 columns suggests to me that it isn't properly rationalized.
mozammil muzza wrote:I tried the same in another DB table with the same structure; there it takes hardly 37 secs for 600K records. That's why I am a bit confused.
Winston Gutkowski wrote:
Third: I suggest you look at this thread and, in particular, Junilu's response.
Jeff Verdegan wrote:
Winston Gutkowski wrote:
Third: I suggest you look at this thread and, in particular, Junilu's response.
Jayesh's response perhaps?
Martin Vajsar wrote:I'd say Winston has already raised the main points (that is, to consider using a data load tool, in this case SQL*Loader or perhaps external tables), and to review the table and its indexes (index maintenance is quite expensive, as compared to the insert operation itself).
So it's just the minor points left:
1) Do some experiments to find which batch size works best in your environment. 2000 could even be too much.
2) Make sure you're using the latest JDBC driver for Oracle. Oracle has made some performance enhancements in JDBC driver batch processing which came with the 11g database version (looking just at the filename is not enough, various versions of the driver are named ojdbc5.jar or ojdbc6.jar).
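One way to run the experiment in point 1: time the same load at several candidate batch sizes and keep the fastest (the actual timed load against a real connection is only indicated in a comment; `loadWithBatchSize` is a hypothetical name for your batched insert):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class BatchSizeTuning {
    // Pure helper: given measured times per batch size, pick the fastest size.
    static int best(Map<Integer, Long> millisBySize) {
        return millisBySize.entrySet().stream()
                .min(Map.Entry.comparingByValue())
                .orElseThrow(IllegalStateException::new)
                .getKey();
    }

    public static void main(String[] args) {
        Map<Integer, Long> results = new LinkedHashMap<>();
        for (int size : new int[] {100, 500, 1000, 2000, 5000}) {
            long t0 = System.nanoTime();
            // loadWithBatchSize(con, rows, size);  // your batched insert (hypothetical)
            results.put(size, (System.nanoTime() - t0) / 1_000_000);
        }
        System.out.println("fastest batch size: " + best(results));
    }
}
```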
Khuda Haafiz Muzza 4 Java
Carles Gasques wrote:One path to explore is to drop the index/indexes, insert all your data, and then rebuild the index/indexes...
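A sketch of that drop-load-rebuild flow against Oracle (index, table and column names here are hypothetical; note that recreating a unique index will fail if duplicate values were loaded in the meantime):

```java
import java.util.List;

public class DropLoadRebuild {
    // The DDL to run around the bulk load, in order. For a UNIQUE index,
    // drop-and-recreate is needed; merely marking it UNUSABLE would still block DML.
    static List<String> surroundingDdl(String index, String table, String column) {
        return List.of(
            "DROP INDEX " + index,  // before the load
            "CREATE UNIQUE INDEX " + index + " ON " + table + " (" + column + ")");  // after the load
    }
}
```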
Winston Gutkowski wrote:
Carles Gasques wrote:One path to explore is to drop the index/indexes, insert all your data, and then rebuild the index/indexes...
That occurred to me too; and it would definitely have been my advice if the indexes weren't unique - but one presumes they are unique for a reason (i.e., probably as an automatic validity check).
If you drop, load and rebuild, you have two basic issues:
1. Values aren't verified on insertion.
2. Rejected index values are not likely to be in any predictable order (particularly as there are 4 indexes to satisfy).
My advice is still to look at that table's definition: it sounds "smelly" to me.
Winston
mozammil muzza wrote:Well, the indexes are unique ones and I can't drop them before inserting records, as the data depends on them; dropping them would make the data inconsistent.
Martin Vajsar wrote:
Perhaps, if your database has enough resources you could use, you might insert the rows into a temporary (unindexed) table and then merge/insert them into the target table in parallel.
Since you've ruled out the obvious measures, it would be good to find out what the bottleneck of your current processing is and target that bottleneck specifically. Do you have some experience with performance tuning in Oracle?
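The stage-then-merge idea might be sketched like this: load into an unindexed stage table first, then move the rows set-wise with a single parallel MERGE. All table and column names below are hypothetical, the PARALLEL hint and ALTER SESSION are Oracle-specific, and `run` assumes auto-commit is already off:

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;

public class StageMerge {
    // Build a set-based MERGE that inserts stage rows missing from the target.
    static String mergeSql(String target, String stage, String key, String... cols) {
        StringBuilder ins = new StringBuilder();
        StringBuilder vals = new StringBuilder();
        for (String c : cols) {
            if (ins.length() > 0) { ins.append(", "); vals.append(", "); }
            ins.append("t.").append(c);   // target column list
            vals.append("s.").append(c);  // matching stage values
        }
        return "MERGE /*+ PARALLEL(t) */ INTO " + target + " t USING " + stage + " s"
             + " ON (t." + key + " = s." + key + ")"
             + " WHEN NOT MATCHED THEN INSERT (" + ins + ") VALUES (" + vals + ")";
    }

    static void run(Connection con) throws SQLException {
        try (Statement st = con.createStatement()) {
            st.execute("ALTER SESSION ENABLE PARALLEL DML");  // Oracle: allow parallel MERGE
            st.executeUpdate(mergeSql("target_table", "stage_table", "pk_col",
                                      "pk_col", "c2", "c3"));
            con.commit();
        }
    }
}
```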
mozammil muzza wrote:I tried the load via a stage table as well; still not much improvement.
Can you please elaborate on merge/insert and processing it in parallel? I'm not sure, but I guess it might be helpful.