
When not to use the "Reader/Writer" classes

Warren Brown
Posts: 1
I submitted what I thought was a bug to Sun concerning a problem I was experiencing with InputStreamReader. Below is the full description, with their reply at the top...
--- begin email ---
Hi Warren Brown,
This does not seem to be a bug in the JDK.
The Reader Streams automatically attempt conversion of byte data.
It is part of their spec. To transfer byte data use the "Stream" classes.
See the following URL for more information. http://developer.java.sun.com/developer/bugParade/bugs/4296971.html
If you disagree with this evaluation feel free to re-submit.
When re-submitting please include this incident number and
a complete stand alone program that we can test with.
----------------- Original Bug Report-------------------
category : java
subcategory : classes_io
release : 1.3.0_01
type : bug
synopsis : InputStreamReader corrupts byte data
description : java version "1.3.0_01"
Java(TM) 2 Runtime Environment, Standard Edition (build 1.3.0_01)
Java HotSpot(TM) Client VM (build 1.3.0_01, mixed mode)
also confirmed on
java version "1.2.2"
Classic VM (build JDK-1.2.2_006, native threads, symcjit)
Run this code; it has 4 possible ways to "read" the data from one array into
another. Only uncomment one line at a time. The problem exists for
the "Reader" classes, but the old "Stream" classes work fine.
This code will copy b to b2 and then compare the values. Any mismatches are
reported on the console. When I run this contrived sample, I get very random errors.
I discovered the problem while working in a Visibroker interceptor that
encrypts the data streams. I was getting errors when working with large
arrays of String, so I switched to longs, which are easier to interpret. In
that case, there is a pattern. I was parsing an array of long that was passing
through, and beginning at element 32768 (starting at approximately byte 262144)
the highest-order non-zero byte would transition from 127 to -84 (rather than
-128 as it should). The next 32 longs were wrong in the highest non-zero byte,
then 224 were good, and this pattern repeated. 224+32=256, so I suspected a
boundary problem, but as I see it now, there may be other issues, perhaps
thread related? This test on my PC yields thousands of mismatches with the
"Reader" implementations.
try {
    byte[] b = new byte[800000];
    for (int i = 0; i < b.length; i++) {
        b[i] = (byte) i; // fill with varying values (the original fill was lost in posting)
    }
    // 4 different reading methods below, just uncomment one at a time
    // these both fail, I suspect the problem is in InputStreamReader though
    // InputStreamReader in = new InputStreamReader(new ByteArrayInputStream(b));
    // BufferedReader in = new BufferedReader(new InputStreamReader(new ByteArrayInputStream(b)));
    // either of these will work as expected
    // BufferedInputStream in = new BufferedInputStream(new ByteArrayInputStream(b));
    ByteArrayInputStream in = new ByteArrayInputStream(b);
    int c;
    byte[] b2 = new byte[b.length];
    for (int i = 0; i < b.length; i++) {
        // I had also tried waiting here for in.ready(), but it doesn't help
        c = in.read();
        b2[i] = (byte) c;
    }
    for (int i = 0; i < b.length; i++) {
        if (b[i] != b2[i]) {
            System.out.println("" + i + " " + b[i] + " " + b2[i]);
        }
    }
} catch (Exception e) {
    e.printStackTrace();
}
workaround : Don't use InputStreamReader; instead use the old InputStream,
BufferedInputStream, or ByteArrayInputStream classes.
--- end of email ---
I hope this can save someone a lot of time. Basically, based on what I've now been told, the Reader classes are inappropriate for byte data.
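The corruption makes sense once you know that InputStreamReader runs every byte through a charset decoder, so any byte sequence that is not valid in the charset gets replaced rather than passed through. Here's a minimal sketch of the effect (my own demo class, using an explicit UTF-8 charset so the result doesn't depend on the platform default):

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.nio.charset.StandardCharsets;

public class ReaderCorruption {
    public static void main(String[] args) throws Exception {
        byte[] data = { (byte) 0x80 }; // 0x80 by itself is not a valid UTF-8 sequence

        // Reading through a Reader decodes bytes into chars; the malformed
        // byte comes back as the Unicode replacement character.
        Reader reader = new InputStreamReader(
                new ByteArrayInputStream(data), StandardCharsets.UTF_8);
        System.out.println(reader.read()); // 65533 (U+FFFD), not 128

        // Reading through a raw InputStream returns the byte untouched.
        InputStream stream = new ByteArrayInputStream(data);
        System.out.println(stream.read()); // 128, the original 0x80
    }
}
```

Once the decoder has substituted a replacement character, the original byte value is gone for good, which is why the mismatches look random: they depend on where invalid sequences happen to fall in the data.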
BTW, if you haven't considered it for remote client/server data streams, the GZIP classes work wonders on the amount of data you have to transmit, and can make it possible for even modem users to have reasonable access speeds. It may be worth looking into. Much text data can compress to as little as 5% of the original, with non-text typically hitting 30% or so.
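The GZIP classes mentioned above live in java.util.zip. A minimal round-trip sketch (class name and sample text are my own; actual compression ratios depend entirely on the data):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.Arrays;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRoundTrip {
    public static void main(String[] args) throws Exception {
        // Build some repetitive text; protocol payloads often compress similarly well.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 1000; i++) {
            sb.append("The quick brown fox jumps over the lazy dog. ");
        }
        byte[] original = sb.toString().getBytes("UTF-8");

        // Compress into an in-memory buffer.
        ByteArrayOutputStream compressedBuf = new ByteArrayOutputStream();
        GZIPOutputStream gzOut = new GZIPOutputStream(compressedBuf);
        gzOut.write(original);
        gzOut.close(); // finishes the deflate stream and writes the gzip trailer
        byte[] compressed = compressedBuf.toByteArray();

        // Decompress and verify the round trip is lossless.
        GZIPInputStream gzIn = new GZIPInputStream(new ByteArrayInputStream(compressed));
        ByteArrayOutputStream restoredBuf = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = gzIn.read(buf)) > 0) {
            restoredBuf.write(buf, 0, n);
        }

        System.out.println(original.length + " bytes -> " + compressed.length + " bytes");
        System.out.println("lossless: " + Arrays.equals(original, restoredBuf.toByteArray()));
    }
}
```

Note that these are byte-oriented "Stream" classes, so they don't suffer from the charset-conversion problem discussed above.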