Bill Earnhardt

Recent posts by Bill Earnhardt

After several attempts to find a way to get the performance I was looking for, I realized that the fastest and easiest way was to just spool the output to a file.
Oracle's SQL*Plus has a number of switches that make this very efficient. If anyone has any interest in how to do this, I'll be glad to post the script.
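In the meantime, here's a minimal sketch of the kind of spool script I mean (the file name, table, and columns are placeholders, not my actual script):

set heading off
set feedback off
set pagesize 0
set termout off
set trimspool on
spool /tmp/employee.csv
select field1 || ',' || field2 from my_table;
spool off
exit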
Thanks again....
Bill
21 years ago
Jeanne, thanks for the helpful information; I found the table you were talking about. I'll give this a try and let you know the outcome.
Thank you!
I need to be able to test for the existence of a table in Oracle using a Java stored procedure. If it exists, I populate it; if it doesn't, I create it and then populate it.
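Here's a rough sketch of the kind of thing I have in mind, checking user_tables from inside the Oracle JVM (this assumes java.sql imports and the server-side JDBC connection; the table name and DDL are placeholders):

// runs inside the Oracle JVM, so it uses the server-side connection
Connection conn = DriverManager.getConnection("jdbc:default:connection:");
PreparedStatement ps = conn.prepareStatement(
    "SELECT COUNT(*) FROM user_tables WHERE table_name = ?");
ps.setString(1, "MY_TABLE"); // placeholder table name
ResultSet rs = ps.executeQuery();
rs.next();
boolean exists = rs.getInt(1) > 0;
rs.close();
ps.close();
if (!exists) {
    Statement stmt = conn.createStatement();
    stmt.execute("CREATE TABLE my_table (id NUMBER, name VARCHAR2(100))"); // placeholder DDL
    stmt.close();
}
// ...then populate the table either way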
Any suggestions would be appreciated.
Thanks,
Bill
Hi, I hope I haven't put this question in the wrong forum; if so, my apologies. I did look in the book review forum first. But here goes.
I've found that after getting pretty comfortable with the basics of Java programming, I'd like to find a book or some other reference that offers help with how to interpret user requirements, create straightforward specs, and then develop the code to implement the specs. Oh yeah, if possible, it should include how to effectively deal with changes to the spec.
I've read Head First Java (twice), which is GREAT, and now I'm studying for the Programmer's exam using the study guide prepared by the same authors. Again, an excellent book.
I feel I have enough knowledge to begin applying Java, but I'm finding that one of the hardest things to do is come up with real-world "business case" exercises in order to take a project from inception to implementation.
If you have any suggestions, I'd greatly appreciate it.
Regards,
Bill
21 years ago
Leslie, thank you again. The deeper I delve into Java, the more powerful I find it to be. Regardless of whether I can achieve better performance in creating the .csv file, I've gained a lot of good experience and advice in the process.
I'll keep you posted and if I find anything that might be particularly interesting, I'll pass that along as well.
Regards,
Bill
21 years ago
Leslie, thank you for the input; you make a good point regarding how I'm using the Oracle JVM. I'll try pulling the data from the NT box. It can't be any worse.
Regards,
21 years ago
Please note that I didn't realize there is an Oracle forum and originally posted this to the I/O forum instead. My apologies for any inconvenience.
I ended the discussion thread (hopefully) for this post on that forum and moved it here to the Oracle forum.
To begin with, I have to perform merge/purge processing on joined tables in an Oracle DB (8.1.7) on AIX. However, I have to use a third-party tool on an NT server (Sagent's Merge/Purge product) to actually do the merge/purge.
The joined tables create roughly 6 million records that represent about 9.6 GB of data.
My approach has been to create a Java stored procedure in Oracle to create a .csv file on the AIX system. This takes nearly 8 hours.
I then FTP the file over to the NT server on which the merge/purge application exists and run the application against the .csv file I just transferred.
Running Sagent's merge/purge against the flat file is very fast and produces output files containing matching records, which I use for merge/purge on the original data.
What I need help with is:
1. The fastest approach for creating a .csv file using a Java stored procedure.
2. Keeping the data in Oracle synced with the .csv file in order to avoid any more FTP transfers than necessary.
I've noticed that increasing the buffer size hasn't seemed to make any difference.
Thank you,
21 years ago
Thank you for the quick response. Unfortunately, the data is within Oracle, and Oracle provides its own (fully Sun-compliant) JVM. Running a stored procedure within Oracle will produce the best performance. Having said that, there may be a better approach than the one I'm using. Your suggestion of using print instead of concatenation sounds good.
Is there a practical size for the output buffer that might help reduce overall I/O? I can't tell much, if any, difference when I increase the size.
BufferedWriter writer = new BufferedWriter(new FileWriter("/employee.csv"), 4000);
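For what it's worth, here's the sort of thing I'm planning to try based on that suggestion: writing each field directly instead of concatenating one big String. This is just a sketch (it assumes java.io and java.sql imports and an open ResultSet named rs; the file name and column positions are placeholders). Note that the JDK's default BufferedWriter buffer is 8192 characters, so a size of 4000 actually shrinks it:

BufferedWriter writer = new BufferedWriter(new FileWriter("/employee.csv"), 8192);
while (rs.next()) {
    writer.write(rs.getString(1)); // first column
    writer.write(',');
    writer.write(rs.getString(2)); // second column, and so on
    writer.newLine();
}
writer.close();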
21 years ago
Hi. This is my first time here, and I'm hoping I can get some advice on the following.
To begin with, I have to perform merge/purge processing on joined tables in an Oracle DB (8.1.7) on AIX. However, I have to use a third-party tool on an NT server (Sagent's Merge/Purge product) to actually do the merge/purge.
The joined tables create roughly 6 million records that represent about 9.6 GB of data.
My approach has been to create a Java stored procedure in Oracle to create a .csv file on the AIX system. This takes nearly 8 hours. I then FTP the file over to the NT server on which the merge/purge application exists and run the application against the .csv file I just transferred. Running Sagent's merge/purge against the flat file is very fast and produces output files containing matching records.
What I need help with is:
1. Is creating the .csv and doing an FTP the best approach, and if so, how can I improve the performance?
2. I want to avoid downloading the .csv file to the NT server any more than necessary, and I would like to update the .csv file with the output from the merge/purge results after updating the Oracle tables. So I need to be able to search, insert, and delete records in the .csv file. In other words, I'd need to keep the Oracle tables and the .csv synced. I would like some advice on the best way to handle this if anyone has experience with this sort of thing.
FYI: After creating the result set, I loop through it and add a record at a time using:
BufferedWriter writer = new BufferedWriter(new FileWriter("d:\\vicbatchload.txt"), 4000);
while (CustomerResultSet.next()) {
    writer.write(field1 + "," + field2 + ","); // ...
}
Thank you,


[ November 17, 2003: Message edited by: Bill Earnhardt ]
21 years ago