Here's something that irritates me every time I bump into it, and I wonder if anyone here has a good solution. I want to copy data, completely untranslated, from one place to another; for example, serving a file to a client over a socket. If I use Streams, it works fine, but if I try to use Readers/Writers, some sort of weird translation takes place. To demonstrate the problem, compile the following simple application and run it with the path to a GIF file as an argument. It produces two files in the current directory (stream.gif and reader.gif). These files all have the same size as the original (on my system at least), but if you try to open them with a browser or editor, only the original and stream.gif seem to be valid.
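(The original listing didn't survive in this thread; the following is a minimal reconstruction of the kind of program described. The class name CopyDemo is mine, and this is a sketch, not the original code.)

```java
import java.io.*;

// Hypothetical reconstruction of the test program described above:
// copy the same input file once with byte streams and once with
// Reader/Writer, so the two outputs can be compared.
public class CopyDemo {
    public static void main(String[] args) throws IOException {
        String src = args[0];

        // Byte-stream copy: no translation, bytes pass through unchanged.
        InputStream in = new FileInputStream(src);
        OutputStream out = new FileOutputStream("stream.gif");
        int b;
        while ((b = in.read()) != -1) {
            out.write(b);
        }
        in.close();
        out.close();

        // Reader/Writer copy: bytes are decoded to chars using the
        // platform default encoding and re-encoded on the way out --
        // any byte without a valid mapping can be silently replaced.
        Reader r = new FileReader(src);
        Writer w = new FileWriter("reader.gif");
        int c;
        while ((c = r.read()) != -1) {
            w.write(c);
        }
        r.close();
        w.close();
    }
}
```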
I can't find any way of stopping the Reader/Writer version from changing my data. Help! For the moment I have converted to using the Stream classes exclusively, but at the expense of loads of "deprecated" warnings and some irritating interfaces. I would really like to find out what's going on here, so I can move to using Reader/Writer properly. Any suggestions? [This message has been edited by Frank Carver (edited December 15, 1999).]
Hi. Streams are for reading bytes, so when you use a Stream it works perfectly. Streams are used for binary files such as images and sound. Readers and Writers, on the other hand, work with 16-bit characters (for internationalization), while most native file systems are based on 8-bit bytes. InputStreamReader and OutputStreamWriter bridge the byte streams and the character streams. Hope this helps.
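That bridging can be sketched as follows. The class and method names here are mine, and the explicit charset argument (ISO-8859-1, whose 256 characters map one-to-one onto the 256 byte values) is the key detail: it pins the decoding instead of leaving it to the platform default.

```java
import java.io.*;

public class BridgeDemo {
    // Decode bytes to chars and re-encode them, pinning the charset
    // explicitly instead of relying on the platform default encoding.
    static byte[] roundTrip(byte[] raw, String charset) throws IOException {
        // InputStreamReader bridges bytes -> chars.
        Reader r = new InputStreamReader(new ByteArrayInputStream(raw), charset);
        StringBuffer sb = new StringBuffer();
        int c;
        while ((c = r.read()) != -1) {
            sb.append((char) c);
        }
        r.close();

        // OutputStreamWriter bridges chars -> bytes the same way.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        Writer w = new OutputStreamWriter(bos, charset);
        w.write(sb.toString());
        w.close();
        return bos.toByteArray();
    }
}
```

With "ISO-8859-1" the round trip is lossless, because every byte value decodes to the char with the same numeric value and encodes straight back.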
Thanks for your reply. I understand the intention of the Reader/Writer classes, but I still find myself in a dilemma. If I use Streams, as I used to before Reader/Writer was introduced, I now get a lot of deprecation warnings. I can only assume that Sun would prefer new code to use Reader/Writer instead. If I use Reader/Writer it just doesn't work. I can't believe that Reading data from a Reader and Writing it to a Writer should change the data. I currently feel that I can't trust Readers and Writers to pass data through unchanged. The last thing I need is another source of problems even if my code is right. I still think there must be something I'm missing. Some way of telling a Reader or a Writer not to #!$*! mess around with my data. Does anyone know of a way to do this?
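For the binary-copy case itself, a plain byte-stream copy loop involves no translation at all; something like the following sketch (class and method names are mine) passes every byte through unchanged:

```java
import java.io.*;

public class RawCopy {
    // Copy every byte unchanged. Streams never decode the data, so
    // binary content such as a GIF survives intact.
    static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
    }
}
```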
It's not the whole PrintStream class that's deprecated, just its constructors! It's been this way since 1.1. They could hardly deprecate the whole PrintStream class without changing System.out, which is used all the time. I thought everyone had come across this. From time to time I find myself writing a method which can accept either System.out or some other PrintStream (for logging to a file, etc.), but in order to create a PrintStream I have to use a deprecated constructor. Seems crazy to me! The weirdest bit is that at the same time as all the constructors for PrintStream were being deprecated, two new methods (setOut and setErr) were added to java.lang.System, each of which takes a PrintStream argument. Likewise if I want to read a text file line by line: as mentioned above, Readers and Writers have some serious data-integrity problems, but the readLine method of DataInputStream is deprecated. So how am I supposed to do it?
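For the line-by-line case, the replacement Sun documented in the DataInputStream.readLine deprecation note is BufferedReader.readLine. A sketch (the helper name readLines is mine; raw collection types are used to match the 1.2-era style):

```java
import java.io.*;
import java.util.ArrayList;
import java.util.List;

public class LineReadDemo {
    // Read all lines from a Reader. BufferedReader.readLine is the
    // documented replacement for the deprecated DataInputStream.readLine.
    static List readLines(Reader in) throws IOException {
        BufferedReader br = new BufferedReader(in);
        List lines = new ArrayList();
        String line;
        while ((line = br.readLine()) != null) {
            lines.add(line);
        }
        br.close();
        return lines;
    }
}
```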
Hmm, what JDK do you use ? For me, everything is fine... <pre> [vps@druid]~/jprogs$ java CGif 326.gif [vps@druid]~/jprogs$ diff 326.gif reader.gif [vps@druid]~/jprogs$ diff 326.gif stream.gif </pre>
------------------ With best of best regards, Pawel S. Veselov ( aka Black Angel )
Very interesting suggestion, Pawel. It set me off trying all the versions I have installed here, and here's what I got:

Windows 95 PC:
  FAIL: java version "1.2.1" Classic VM (build JDK-1.2.1-A, native threads)
  FAIL: java version "1.2" Classic VM (build JDK-1.2-V, native threads)
  OK:   java version "1.1.6"

Solaris SPARC:
  OK:   java version "java1.2" Solaris VM (build Solaris_JDK_1.2_01_dev05_fcsK, native threads, sunwjit)
  OK:   java version "1.1.7"

I have no Linux box here at work any more, but I can try that at home. So the problem seems specific to the Windows 1.2 and 1.2.1 ports. Has anyone tried 1.2.2 or 1.3 to see if they still have the problem? At least I know that I can run my software with 1.1.7 if I need to, but I really had hoped to move to 1.2...
A few quick extra installs and I've found that it fails on the Windows ports of 1.1.8 and 1.2.2 as well. For this "bug" to be retrofitted into 1.1.8 seems to imply it's desired behaviour. I've looked at the binary output with a variety of tools, and the 1.2 Windows Readers seem to be translating several byte values into octal 077 (0x3F), '?'. Some of the bytes converted are 0201 (0x81), 0215 (0x8D), 0216 (0x8E), 0217 (0x8F), 0220 (0x90), 0235 (0x9D), 0236 (0x9E). Presumably these are bytes that have no valid character mapping in the platform's default encoding. But this still raises two questions: 1. Why does it only happen on Windows? 2. What "switch" can I set to stop a Reader or Writer messing with my data?
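On question 2: as far as I know there is no flag on Reader/Writer that disables translation outright, but the encoding can be pinned. A hedged sketch (class and method names are mine): replacing FileReader/FileWriter, which always use the platform default encoding, with the stream-bridge classes and an explicit "ISO-8859-1" argument should make a Reader/Writer copy lossless, since ISO-8859-1 maps all 256 byte values one-to-one onto characters.

```java
import java.io.*;

public class LosslessReaderCopy {
    // Copy through Reader/Writer without corruption by pinning both
    // ends to ISO-8859-1, whose 256 characters correspond one-to-one
    // to the 256 possible byte values.
    static void copy(InputStream in, OutputStream out) throws IOException {
        Reader r = new InputStreamReader(in, "ISO-8859-1");
        Writer w = new OutputStreamWriter(out, "ISO-8859-1");
        int c;
        while ((c = r.read()) != -1) {
            w.write(c);
        }
        w.flush();
    }
}
```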