Hi. This is my first time here and I'm hoping I can get some advice on the following.
To begin with, I have to perform merge/purge processing on joined tables in an Oracle DB (8.1.7) on AIX. However, I have to use a third-party tool on an NT server (Sagent's Merge/Purge product) to actually do the merge/purge.
The joined tables create roughly 6 million records that represent about 9.6 GB of data.
My approach has been to create a stored Java procedure in Oracle that writes a .csv file on the AIX system. This takes nearly 8 hours. I then FTP the file over to the NT server on which the merge/purge application exists and run the application against the .csv file I just transferred. Running Sagent's merge/purge against the flat file is very fast and produces output files containing the matching records.
What I need help with is:
1. Is creating the .csv and doing an FTP the best approach, and if so, how can I improve the performance?
2. I want to avoid downloading the .csv file to the NT server any more than necessary, so after updating the Oracle tables I would like to update the .csv file with the output from the merge/purge results. That means I need to be able to search, insert, and delete records in the .csv file. In other words, I need to keep the Oracle tables and the .csv file sync'd. I would appreciate advice on the best way to handle this if anyone has experience with this sort of thing.
FYI: After creating the result set, I loop through it and write one record at a time using

BufferedWriter writer = new BufferedWriter(new FileWriter("d:\\vicbatchload.txt"), 4000);
while (CustomerResultSet.next()) {
    writer.write(field1 + "," + field2 + "," + ...);
    writer.newLine();
}
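For reference, here is a simplified, self-contained sketch of the writing loop I'm using. The JDBC fetch is replaced with hypothetical in-memory sample rows so it can run stand-alone, and the larger buffer plus StringBuffer reuse are changes I'm experimenting with, not what's in production. I've also read that calling setFetchSize() on the JDBC Statement (e.g. stmt.setFetchSize(500)) can cut the round trips to Oracle, which may matter more than the file I/O over 6 million rows.

```java
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

public class CsvExportSketch {

    // Writes one comma-separated line per row, reusing a StringBuffer
    // instead of concatenating Strings inside the loop (each "+" on
    // Strings allocates a new object, which adds up over millions of rows).
    public static void writeCsv(String[][] rows, File out) throws IOException {
        // 64 KB buffer so the writer hits the disk in large chunks
        BufferedWriter writer = new BufferedWriter(new FileWriter(out), 64 * 1024);
        StringBuffer line = new StringBuffer(256);
        try {
            for (int r = 0; r < rows.length; r++) {
                line.setLength(0); // reuse the same buffer for every row
                for (int c = 0; c < rows[r].length; c++) {
                    if (c > 0) {
                        line.append(',');
                    }
                    line.append(rows[r][c]);
                }
                writer.write(line.toString());
                writer.newLine();
            }
        } finally {
            writer.close();
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical sample rows standing in for the joined result set
        String[][] rows = { { "1", "Smith", "Raleigh" }, { "2", "Jones", "Durham" } };
        File out = File.createTempFile("vicbatchload", ".csv");
        writeCsv(rows, out);
        System.out.println("Wrote " + out.length() + " bytes to " + out);
    }
}
```

In the real procedure the inner loop would pull the column values from the ResultSet, but the buffering and string-handling pattern is the same.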
Thank you,

[ November 17, 2003: Message edited by: Bill Earnhardt ]