Hi, I can't come up with reliable code for reading from a URL connection served by a PHP script, in an applet running on Netscape 4.7. My code works in other browsers (IE 5, Netscape 6), but not consistently in Netscape 4.7 when my data (which is text/plain) is large: when I call the read() method on my InputStreamReader, Netscape 4.7 won't read all the bytes even though the buffer is large enough. I suspect my PHP script doesn't send the right headers, because conn.getContentLength(), for example, returns -1. The funny thing is that when I compress the data in the PHP script (serving a gzip file), the applet reads the output fine through a GZIPInputStream! My code looks like this:

    con = myURL.openConnection();
    con.setDoOutput(true);
    con.setDoInput(true);
    con.setUseCaches(false);
    con.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
    con.setRequestProperty("Content-Length", " " + paramStr.length());

    DataOutputStream out = new DataOutputStream(con.getOutputStream());
    out.writeBytes(paramStr);
    out.flush();
    out.close();

    InputStream in = new DataInputStream(con.getInputStream());
    InputStreamReader urlReader = new InputStreamReader(in);
    cbuf = new char[large_enough_int];
    mylength = urlReader.read(cbuf);

As a result, mylength is less than the number of bytes actually available from the resource! Many thanks in advance.
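For what it's worth, a single read(char[]) call is only guaranteed to return at least one character (or -1 at end of stream), not to fill the buffer, even when more data is still on the way; different browser VMs simply differ in how much happens to arrive per call. The usual remedy is to loop until read() returns -1. A minimal sketch of that loop (the readAll helper name and 4096 buffer size are my own choices, not from the original code):

```java
import java.io.IOException;
import java.io.Reader;
import java.io.StringReader;

public class ReadAll {

    // Read every character from the reader, looping until end of
    // stream. A single read(char[]) call may legally return fewer
    // characters than the buffer can hold, so one call is not enough.
    static String readAll(Reader reader) throws IOException {
        StringBuilder result = new StringBuilder();
        char[] buf = new char[4096];
        int n;
        while ((n = reader.read(buf)) != -1) {
            result.append(buf, 0, n);
        }
        return result.toString();
    }

    public static void main(String[] args) throws IOException {
        // Demonstrated here with a StringReader; in the applet the
        // Reader would be the InputStreamReader over the connection.
        String data = "a large text/plain response body";
        String read = readAll(new StringReader(data));
        System.out.println(read.equals(data));
    }
}
```

With this in place the applet no longer depends on how many bytes each browser delivers per call, which would also explain why the gzip path worked: GZIPInputStream does its own internal looping while inflating.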