Hi, thank you for reading my post. I have a problem with reading some UTF-8 text from a file and writing it as UTF-8 into another file. Imagine I have a file that contains some UTF-8 text (I can read it in Windows using Notepad/WordPad). Now I want to write a Java application that splits the file into several more files and makes some changes to them.
Here is the method I used to read the file:
And this is how I write the file:
myChangedText is the changed version of the String that I read from the UTF-8 file.
The problem is that the text I see in a.txt is not readable; even the file signature (shown in EmEditor) indicates it is not a UTF-8 file.
I can only read the information if I open the file in OpenOffice or MS Office and select UTF-8 in the encoding dialog.
FileReader and FileWriter use the platform's default character encoding, which in all likelihood is not UTF-8 (this is also stated in their respective Javadocs). Use InputStreamReader and OutputStreamWriter to specify the encoding the files are in.
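Since the original read/write code wasn't posted, here is a minimal sketch of what that looks like: a copy method that decodes and encodes explicitly as UTF-8 (the class and method names are just for illustration):

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class Utf8Copy {
    // Copy a text file, decoding and encoding explicitly as UTF-8
    // instead of relying on the platform default (as FileReader/FileWriter do).
    static void copyUtf8(File src, File dst) throws IOException {
        try (BufferedReader in = new BufferedReader(
                 new InputStreamReader(new FileInputStream(src), StandardCharsets.UTF_8));
             BufferedWriter out = new BufferedWriter(
                 new OutputStreamWriter(new FileOutputStream(dst), StandardCharsets.UTF_8))) {
            char[] buf = new char[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
    }
}
```

Any transformation of the text (your myChangedText step) would happen between the read and the write; the key point is simply that both streams name UTF-8 explicitly.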
Take care about the Byte Order Mark that some editors and applications will put at the start of a UTF-8-encoded text file, to mark it as such. Java's java.io.Reader generally won't recognise this as a Byte Order Mark, but will instead read it as a few junk characters at the start of the file.
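One way to deal with that, sketched below, is to peek at the first character after decoding and push it back if it isn't U+FEFF (the BOM); the helper name is just for illustration:

```java
import java.io.*;
import java.nio.charset.StandardCharsets;

public class BomSkippingReader {
    // Wrap a stream in a UTF-8 reader and silently drop a leading
    // U+FEFF (Byte Order Mark) if the file starts with one.
    static Reader openUtf8SkippingBom(InputStream stream) throws IOException {
        PushbackReader reader = new PushbackReader(
            new InputStreamReader(stream, StandardCharsets.UTF_8), 1);
        int first = reader.read();
        if (first != -1 && first != '\uFEFF') {
            reader.unread(first);  // not a BOM: put the character back
        }
        return reader;
    }
}
```

With a wrapper like this, files saved with or without a BOM by different editors read identically.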