Monty Guppy

Ranch Hand
since Sep 15, 2001

Recent posts by Monty Guppy

Hi,
I am doing multithreading with WebLogic running on Unix. While connecting to Oracle (9i, also running on Unix), I get the above exception (note the incorrect spelling). It happens intermittently, and is generally seen when a second thread is started before the first one has released its connection. Basically I execute a stored proc, but when this exception occurs, the number of records returned by the first thread gets truncated.
Here is some code snippet:
public final static Connection getOracleConnection()
{
    java.sql.Connection oracleConnection = null;
    Context ctx = null;
    Hashtable ht = new Hashtable();
    ht.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
    try {
        ctx = new InitialContext(ht);
        DataSource ds = (DataSource) ctx.lookup("java:comp/env/rptx\\oradb");
        oracleConnection = ds.getConnection();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return oracleConnection;
}
-----
I obtain the connection from my thread like this:
java.sql.Connection connection = null;
connection = SQLEngine.getOracleConnection();
--------------
Any help would be appreciated
17 years ago
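Not the poster's code: a minimal sketch of the one-connection-per-thread pattern usually recommended for this symptom, using a stand-in factory in place of the JNDI `DataSource` lookup (which needs a running WebLogic server). The point is that each worker acquires its own resource instead of threads sharing one `Connection`.

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.function.Supplier;

public class PerThreadResource {
    // Stand-in for ds.getConnection(): each call returns a fresh resource.
    // In the real code this would be the JNDI-looked-up DataSource.
    static final Supplier<Object> factory = Object::new;

    public static Set<Object> runWorkers(int n) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(n);
        Set<Object> seen = ConcurrentHashMap.newKeySet();
        for (int i = 0; i < n; i++) {
            pool.submit(() -> {
                Object conn = factory.get(); // one resource per task, never shared
                try {
                    seen.add(conn);          // "do work" with conn here
                } finally {
                    // conn.close() would go here in real JDBC code
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        return seen;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runWorkers(4).size());
    }
}
```

In the real code, each worker would call `SQLEngine.getOracleConnection()` itself and close the connection in a `finally` block, rather than holding one `Connection` object across threads.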
Hi,
I am trying to POST parameters by submitting a form automatically when my page is accessed, thus redirecting the user to another application. I am able to redirect successfully.
However, in my effort to display an animated gif (e.g., an hourglass), I can show the animation or do the redirect, but not both. (I only get a stationary image, not an animated one, when the redirect succeeds.) Please look at the <BODY> tag below.
Here is my code. Any help would be greatly appreciated:

<html lang="en">
<HEAD>
<TITLE>Application Forward</TITLE>
</HEAD>
<script LANGUAGE="JavaScript">
function start(){
document.hssBridgeForm.submit();
}
</script>
<style>
#wait
{
font-family: Arial, Verdana, Helvetica, MS Sans Serif;
position:absolute;
left:1px;
top:100px;
width:450px;
height:450px;
visibility:hidden;
clip:rect(0px 450px 450px 0px);
}
</style>
<script language="JavaScript">
//Preload the wait image
PreloadImage()
if (document.layers)
{
document.captureEvents(Event.LOAD)
}
function PreloadImage()
{
alert("preloading");
if (!document.images) return;
var imgs = new Array();
var PIargs = PreloadImage.arguments
for (var i = 0; i < PIargs.length; i++)
{
imgs[i] = new Image;
imgs[i].src = PIargs[i];
}
}
</script>
<BODY> <!-- with this tag I am able to display the animated gif -->
<!-- BODY onLoad="start()" : with this tag the redirect works, but the animation stops -->
<form name="hssBridgeForm" action="http://www.yahoo.com" method="post">
<p>
<input type="hidden" name="USER_ID" value="4455asdf">
</p>
<img src="sequence2.gif" name="waitimage" id="waitimage">
</form>
</BODY>
</html>
Thanks for your response.
I think that by using a redirect, the parameters and values would get exposed in the URL (which in my case is undesirable). Please let me know if my assumption is incorrect.
17 years ago
Hi,
I am new to Struts. I have a web application which basically is a redirector to other web apps. Depending upon who the user is and which app he wants to access, I (make database calls and) generate a list of parameters (key, value) that need to be POSTed to the external app.
I am doing request.setAttribute(key, value) to set these params and then do findForward.
The mapping.findForward("success") launches the other app but my parameters are not being passed. The Action in the launched app tries to read the parameters from its ActionForm but finds nothing (I think this is because the two applications have different contexts).
Please let me know if you have an idea to accomplish this POST.
Thanks
17 years ago
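One workaround sometimes suggested for this situation is for the Action to perform the POST itself, server side, and relay the response, since `findForward` cannot carry request attributes across contexts as POST parameters. A sketch, where the URL and parameter names are purely illustrative:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.util.LinkedHashMap;
import java.util.Map;

public class FormPoster {
    // Build an application/x-www-form-urlencoded body from (key, value) pairs.
    public static String encode(Map<String, String> params) throws IOException {
        StringBuilder body = new StringBuilder();
        for (Map.Entry<String, String> e : params.entrySet()) {
            if (body.length() > 0) body.append('&');
            body.append(URLEncoder.encode(e.getKey(), "UTF-8"))
                .append('=')
                .append(URLEncoder.encode(e.getValue(), "UTF-8"));
        }
        return body.toString();
    }

    // Sketch: POST the body to the external app and return its response,
    // so nothing is exposed in the browser's URL.
    public static String post(String url, Map<String, String> params) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        conn.getOutputStream().write(encode(params).getBytes("UTF-8"));
        BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
        StringBuilder response = new StringBuilder();
        String line;
        while ((line = in.readLine()) != null) response.append(line).append('\n');
        in.close();
        return response.toString();
    }

    public static void main(String[] args) throws IOException {
        Map<String, String> params = new LinkedHashMap<>();
        params.put("USER_ID", "4455asdf");
        System.out.println(encode(params));
    }
}
```

Whether this fits depends on whether the external app needs the user's own browser session; if it does, the auto-submitting form approach from the other post is the usual alternative.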
Thanks a bunch Jim for your suggestions.
I had to rush my release last week, and your suggestion did solve most of my problems, though some really huge data (>5MB) is still an issue. The original problem with the buffered stream (which I forgot to mention in the last post) was java.lang.OutOfMemoryError.
To answer your other question about whether I need all the data in memory at a time: I am reasonably (not 100%) sure that I do. I pass the XML that is returned by my CLOB in the form of a ByteArrayInputStream (BAIS) to be parsed by my SAX parser for generating a CSV document. The CSV document is formed as the BAIS is being parsed:
SAXParser saxParser = SAXParserFactory.newInstance().newSAXParser();
saxParser.parse(byteArrayInputStream, this);
public void startDocument (){------}
public void endDocument () {------}
public void startElement (String uri, String name, String qName, Attributes atts) {------}
public void endElement (String uri, String name, String qName){---}
public void characters (char ch[], int start, int length) {----}
Thanks again.
17 years ago
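A runnable sketch of the handler skeleton above, assuming a simple record-set layout (the RECORDSET/RECORD element names are illustrative, not the poster's actual schema): each field's text is collected in characters() and flushed to the CSV line in endElement().

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class XmlToCsv extends DefaultHandler {
    private final StringBuilder csv = new StringBuilder();
    private final StringBuilder text = new StringBuilder();
    private boolean firstField = true;

    @Override
    public void startElement(String uri, String localName, String qName, Attributes atts) {
        text.setLength(0);                       // reset text buffer for this element
        if ("RECORD".equals(qName)) firstField = true;
    }

    @Override
    public void characters(char[] ch, int start, int length) {
        text.append(ch, start, length);          // accumulate element text
    }

    @Override
    public void endElement(String uri, String localName, String qName) {
        if ("RECORD".equals(qName)) {
            csv.append('\n');                    // end of one CSV row
        } else if (!"RECORDSET".equals(qName)) {
            if (!firstField) csv.append(',');    // comma between fields
            csv.append(text.toString().trim());
            firstField = false;
        }
    }

    public static String convert(InputStream in) throws Exception {
        XmlToCsv handler = new XmlToCsv();
        SAXParserFactory.newInstance().newSAXParser().parse(in, handler);
        return handler.csv.toString();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<RECORDSET><RECORD><NAME>Dave</NAME><ID>123</ID></RECORD></RECORDSET>";
        System.out.println(convert(new ByteArrayInputStream(xml.getBytes())));
    }
}
```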
Jason,
I tried to use BufferedInputStream/BufferedOutputStream. For 8MB of data returned by my query, this failed. Following is the code:
BufferedInputStream bufIn = new BufferedInputStream(clob.getAsciiStream(), 512);
ByteArrayOutputStream bAOut = new ByteArrayOutputStream();
BufferedOutputStream bufOut = new BufferedOutputStream(bAOut, 512);
int c;
while ((c = bufIn.read()) != -1) {
    bufOut.write(c);
}
bufOut.flush();
Am I doing something wrong here? Any other ideas?
17 years ago
Thanks for your response Peter. Following your suggestion I came up with:
InputStream in = xmlClob().getAsciiStream();
int c;
while ((c = in.read()) != -1) {
    byteArrayOutputStream.write(c);
}
ByteArrayInputStream byteArrayInputStream = new ByteArrayInputStream(byteArrayOutputStream.toByteArray());
But as you had mentioned, the memory constraint remains this way. If the ultimate goal is to convert the InputStream to a ByteArrayInputStream, is there a way to use a buffer in the above code to make it more memory efficient?
Thanks again.
17 years ago
Hi, can someone suggest the most efficient way to convert an InputStream to byte[]? I want to avoid using StringBuffer (since for larger files it gives me an OutOfMemoryError). Any code help would be greatly appreciated. Thanks.
17 years ago
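The usual answer to this question is to read in chunks through a byte[] buffer instead of one character at a time; a minimal sketch:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class Streams {
    // Read the stream in 8 KB chunks instead of one byte per call.
    public static byte[] toByteArray(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);   // copy only the n bytes actually read
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] result = toByteArray(new ByteArrayInputStream("hello".getBytes()));
        System.out.println(new String(result));
    }
}
```

Note that the whole result still ends up in memory, so this reduces call overhead rather than peak memory; for truly huge CLOBs, streaming the InputStream straight to the consumer (e.g., the SAX parser) avoids the intermediate copy entirely.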
Hi,
I am trying to read from an InputSource and store the read characters in a StringBuffer. This causes an OutOfMemoryError beyond a certain data size (data that I am returning in the form of XML from Oracle). I read somewhere that I can buffer the reads and thus overcome this problem. However, I am not sure how to implement it in code.
Here is my code (works great for files upto 0.5MB):
int c;
InputStream in = clobXML.getAsciiStream();
while ((c = in.read()) != -1) {
    sb.append((char) c);
}
Can someone please help me as to how to modify this code? Thanks
17 years ago
Hi,
I have a small reporting system that I have written, where I generate PDF, XML, and CSV reports. Currently I am storing the generated reports as binary arrays in a LONG RAW field in Oracle. My system generates 30-35 reports that have an average size of 57K. The max report size to date has been 1.2 MB, and less than 1% of reports are of size > 1MB.
I am resisting moves to start storing my reports on a file server because of tremendous time pressures. Can someone suggest good reasons for db storage over file server storage? Are there any resources on the internet that can help me present an argument?
Thanks a lot for your help.
18 years ago
I am using a SAX parser to parse XML that is returned as a CLOB from an Oracle SP. The XML could be like this:
<RECORDSET>
<RECORD>
<NAME>Dave</NAME>
<ID>123</ID>
</RECORD>
<RECORD>
<NAME>Ken</NAME>
<ID>124</ID>
</RECORD>
.......
</RECORDSET>
Based on some logic, I determine that the element called <NAME> needs to be removed completely from all <RECORD>s before this potentially huge XML file (there could be 1000s of records and scores of elements within each record) is processed further.
Is there a recommended way to accomplish this? Should I rather convert SAX to DOM, remove the element, and then convert back to SAX? Please advise.
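One SAX-only approach (no DOM round trip, so the huge file is never held in memory) is an XMLFilterImpl that swallows the unwanted element's events before they reach the downstream consumer. A sketch assuming the RECORDSET structure above, with an identity Transformer standing in for the real downstream processing:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamResult;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.XMLFilterImpl;

public class DropNameFilter extends XMLFilterImpl {
    private int skipDepth = 0;  // > 0 while we are inside a <NAME> subtree

    public DropNameFilter(XMLReader parent) { super(parent); }

    @Override
    public void startElement(String uri, String localName, String qName,
                             Attributes atts) throws SAXException {
        if (skipDepth > 0 || "NAME".equals(qName)) { skipDepth++; return; }
        super.startElement(uri, localName, qName, atts);
    }

    @Override
    public void endElement(String uri, String localName, String qName) throws SAXException {
        if (skipDepth > 0) { skipDepth--; return; }
        super.endElement(uri, localName, qName);
    }

    @Override
    public void characters(char[] ch, int start, int length) throws SAXException {
        if (skipDepth == 0) super.characters(ch, start, length);  // drop text inside <NAME>
    }

    public static String strip(String xml) throws Exception {
        SAXParserFactory f = SAXParserFactory.newInstance();
        f.setNamespaceAware(true);
        XMLReader reader = f.newSAXParser().getXMLReader();
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
        StringWriter out = new StringWriter();
        t.transform(new SAXSource(new DropNameFilter(reader),
                                  new InputSource(new StringReader(xml))),
                    new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(strip("<RECORDSET><RECORD><NAME>Dave</NAME><ID>123</ID></RECORD></RECORDSET>"));
    }
}
```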
Thanks for your reply, HTH. Both of the files that I am dealing with are generated dynamically; without ever forming physical *.xml files on my server, I stream the XML data using the Transformer.transform() method.
Any ideas as to how I can stream 2 dynamically generated XML files to the same XSL? Do you think the following scenario is possible?
- I have an XSL file called myXSL.xsl on my server.
- The 2 dynamic XMLs are called xmlfile1.xml and xmlfile2.xml.
- First I use myXSL to transform xmlfile1.xml using transform(), and save the output as myNewxsl.xsl.
- Then I use myNewxsl.xsl to transform() xmlfile2.xml to get the final output.
What do you think?
Thanks
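For the underlying goal of feeding two in-memory documents to one stylesheet, a URIResolver may be simpler than the two-pass idea: the stylesheet pulls in the second document via document(), and the resolver serves it from memory instead of from a file. A sketch with an illustrative inline stylesheet (the element names A, B and the name second.xml are made up for the example):

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Source;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.URIResolver;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class TwoInputTransform {
    public static String transform(String mainXml, final String secondXml) throws Exception {
        // Illustrative stylesheet: combines the main document with document('second.xml').
        String xsl =
            "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
            + "<xsl:output method='text'/>"
            + "<xsl:template match='/'>"
            + "<xsl:value-of select='/A'/>-<xsl:value-of select=\"document('second.xml')/B\"/>"
            + "</xsl:template>"
            + "</xsl:stylesheet>";
        Transformer t = TransformerFactory.newInstance()
            .newTransformer(new StreamSource(new StringReader(xsl)));
        t.setURIResolver(new URIResolver() {
            public Source resolve(String href, String base) {
                // Serve the second dynamically generated document from memory.
                if ("second.xml".equals(href)) {
                    return new StreamSource(new StringReader(secondXml));
                }
                return null;  // fall back to default resolution for anything else
            }
        });
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(mainXml)), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(transform("<A>one</A>", "<B>two</B>"));
    }
}
```

This way neither XML file ever has to exist on disk, and there is no intermediate generated stylesheet to manage.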