Hi!
Wondering if anyone has sample code to read an XML file and load the parsed attribute values of a particular matching tag into a database table.
I already have code which does the above using DOM, but as expected it performs extremely poorly when the XML file is very large (~500 MB). I have similar SAX code with hardcoded tag names which currently writes a pipe-delimited input file; that generated file is then used separately by Oracle's SQL*Loader utility to do the table load. This two-step process is very efficient. I was wondering if I could incorporate the data loading into the existing code using JDBC efficiently, possibly using the JDBC API's PreparedStatement addBatch and executeBatch methods? I have been able to do a one-insert-at-a-time data load by dynamically constructing and executing the insert statement. But where the existing process normally takes only a few minutes, this one took hours (God knows exactly how long!) to load 170,000 rows.
Any idea how I can get this SAX + JDBC effort done efficiently?
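For what it's worth, here is a rough sketch of the shape I have in mind, in case it helps clarify the question: a SAX handler pulls the attributes off each matching tag and hands them to a sink that buffers rows and flushes them with addBatch/executeBatch. The tag name, attribute names, batch size, and insert SQL here are all made up for illustration; the real ones would come from my actual schema.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

// Receives one row of parsed attribute values at a time.
interface RowSink {
    void accept(String[] row) throws Exception;
    void flush() throws Exception;
}

// Buffers rows and sends them to the database with addBatch/executeBatch,
// committing once per batch instead of once per row.
class JdbcBatchSink implements RowSink {
    private final PreparedStatement ps;
    private final int batchSize;
    private int pending = 0;

    JdbcBatchSink(Connection conn, String insertSql, int batchSize) throws SQLException {
        conn.setAutoCommit(false);          // commit manually, per batch
        this.ps = conn.prepareStatement(insertSql);
        this.batchSize = batchSize;
    }

    public void accept(String[] row) throws SQLException {
        for (int i = 0; i < row.length; i++) {
            ps.setString(i + 1, row[i]);
        }
        ps.addBatch();
        if (++pending >= batchSize) flush();
    }

    public void flush() throws SQLException {
        if (pending > 0) {
            ps.executeBatch();              // one round trip for the whole batch
            ps.getConnection().commit();
            pending = 0;
        }
    }
}

// SAX handler that extracts the named attributes of a matching tag
// and pushes each row into the sink as the document streams by.
class TagHandler extends DefaultHandler {
    private final String tag;
    private final String[] attrNames;
    private final RowSink sink;

    TagHandler(String tag, String[] attrNames, RowSink sink) {
        this.tag = tag;
        this.attrNames = attrNames;
        this.sink = sink;
    }

    @Override
    public void startElement(String uri, String local, String qName, Attributes attrs) {
        if (!qName.equals(tag)) return;
        String[] row = new String[attrNames.length];
        for (int i = 0; i < attrNames.length; i++) {
            row[i] = attrs.getValue(attrNames[i]);
        }
        try {
            sink.accept(row);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    @Override
    public void endDocument() {
        try {
            sink.flush();                   // don't lose the final partial batch
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

The idea is that the handler never holds more than one batch in memory, so the 500 MB file streams through with a flat footprint, and the database sees one round trip per batch rather than per row. Is this roughly the right approach, or is there a better pattern?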
Thank you all!