There are several ways to improve performance:
1) Use PreparedStatement instead of Statement. Not only does it protect your code from possible SQL injection, it also lets you pass values read from the CSV file directly to the database, avoiding text processing on the Java side, and it saves time on the DB side too (Oracle already knows your statement and only executes it with new values, whereas with your current approach every statement you run is new to the database and must be parsed). See also our page on PreparedStatement.
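The batched PreparedStatement load could look roughly like the sketch below. The table name, column list, and CSV layout are placeholders I made up for illustration; adapt them to your schema:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class CsvLoader {
    // Hypothetical target table and columns; replace with your own.
    static final String INSERT_SQL =
        "insert into my_table (id, name, amount) values (?, ?, ?)";
    static final int BATCH_SIZE = 1000;

    static void load(Connection conn, String csvPath) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(INSERT_SQL);
             BufferedReader in = new BufferedReader(new FileReader(csvPath))) {
            String line;
            int count = 0;
            while ((line = in.readLine()) != null) {
                // Naive split; use a real CSV parser if fields can contain commas.
                String[] f = line.split(",");
                ps.setLong(1, Long.parseLong(f[0]));
                ps.setString(2, f[1]);
                ps.setBigDecimal(3, new java.math.BigDecimal(f[2]));
                ps.addBatch();
                if (++count % BATCH_SIZE == 0) {
                    ps.executeBatch();   // send one full batch to the DB
                }
            }
            ps.executeBatch();           // flush the remaining rows
        }
    }
}
```

The statement is prepared once and only the bind values travel to the database, which is exactly where the parse-time savings come from.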
2) Create a global temporary table (a special kind of table in Oracle whose contents are private to each user/session; search for Oracle global temporary table for details). Insert the data you read from the file into it with a plain insert into (using the PreparedStatement mentioned above, in batches of reasonable size). Then run a single merge statement that merges all the data from the temp table in one go. I believe this will be much more efficient than merging every row separately.
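A sketch of the temp-table-plus-merge flow; the stage_my_table and my_table names and columns are hypothetical placeholders:

```java
import java.sql.Connection;
import java.sql.Statement;

public class TempTableMerge {
    // One-time DDL: the staging table mirrors the target's columns.
    // "on commit preserve rows" keeps the rows until the session ends.
    static final String CREATE_TEMP =
        "create global temporary table stage_my_table (" +
        "  id number, name varchar2(100), amount number" +
        ") on commit preserve rows";

    // Runs once, after all batched inserts into stage_my_table.
    static final String MERGE_SQL =
        "merge into my_table t " +
        "using stage_my_table s on (t.id = s.id) " +
        "when matched then update set t.name = s.name, t.amount = s.amount " +
        "when not matched then insert (id, name, amount) " +
        "values (s.id, s.name, s.amount)";

    static void mergeStaged(Connection conn) throws Exception {
        try (Statement st = conn.createStatement()) {
            st.executeUpdate(MERGE_SQL);                     // one set-based merge
            st.executeUpdate("delete from stage_my_table");  // clear for the next run
        }
    }
}
```

The point of the design is that the row-by-row work happens in cheap batched inserts, and the expensive upsert logic runs exactly once as a set operation.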
3) Use the Oracle 11g JDBC driver. This version of the driver gives the best performance when executing PreparedStatements in a batch. See also this.
4) Perhaps adding some indexes, or cardinality hints on the temp table, might help further. This is really SQL query tuning, which is a different question (and generally too complicated for a single post like this).
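As one illustration of such a hint (not specific advice for your schema; the table names are the same hypothetical ones as above, and hint behavior varies by Oracle version): temporary tables often have no statistics, so a CARDINALITY hint can tell the optimizer roughly how many rows the staging table holds.

```java
public class MergeHint {
    // The /*+ cardinality(s 100000) */ hint suggests an approximate row count
    // for alias s (the temp table), so the optimizer picks a sensible join plan.
    static final String MERGE_WITH_HINT =
        "merge /*+ cardinality(s 100000) */ into my_table t " +
        "using stage_my_table s on (t.id = s.id) " +
        "when matched then update set t.amount = s.amount " +
        "when not matched then insert (id, amount) values (s.id, s.amount)";
}
```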
In my opinion, these modifications will speed up your processing considerably. Further speedups might be possible, e.g. with external tables, but that would require some setup on the DB server side, which might not be feasible. My advice is to try these improvements first and see whether the result is good enough.