4.2.1 Integral Types and Values

The values of the integral types are integers in the following ranges:
* For byte, from -128 to 127, inclusive
* For short, from -32768 to 32767, inclusive
* For int, from -2147483648 to 2147483647, inclusive
* For long, from -9223372036854775808 to 9223372036854775807, inclusive
* For char, from '\u0000' to '\uffff' inclusive, that is, from 0 to 65535
— The Java Language Specification

It just so happens that \u0030 to \u0039 is Unicode for the decimal digits 0-9. The compiler goes through on its first pass and resolves Unicode escapes to their character equivalents, effectively changing your Unicode "characters" into plain source text, and from then on it treats the declarations as perfectly legal integer assignments. [ February 10, 2005: Message edited by: Joe Ess ]
You are skipping a step in your mental compilation. If you write this:
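The original snippet was lost from the post; here is a minimal reconstruction consistent with the surrounding description (the class and variable names are mine). The key point is that the Unicode escape sits *outside* any quotes:

```java
public class UnicodeEscapeDemo {
    static char value() {
        // The escape below is translated by the compiler BEFORE parsing,
        // so this line is effectively seen as:  char b = 1;
        char b = \u0031;
        return b;
    }

    public static void main(String[] args) {
        System.out.println((int) value()); // prints 1, not 49
    }
}
```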
The Java compiler changes the Unicode escape into its character equivalent (all Java source files are assumed to be Unicode):
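Again the post's snippet is missing; a sketch of what the compiler effectively sees after escape translation (names are mine):

```java
public class TranslatedDemo {
    static char value() {
        // After the compiler replaces \u0031 with the character '1' in the
        // source text, the declaration is an ordinary integer assignment:
        char b = 1; // the integral value 1, NOT the character '1'
        return b;
    }

    public static void main(String[] args) {
        System.out.println((int) value()); // prints 1
    }
}
```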
Now this is setting the integral char to a value of 0x01 (or \u0001 if you want to stick with Unicode). If you print out b, you will be printing out the "Start of Heading" control character (rendered as a smiley face on some consoles). The above line is not the same thing as the following lines:
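The contrasting snippet was also lost; a reconstruction matching the x and y described in the next paragraph (class name is mine):

```java
public class CharLiteralDemo {
    static char x() {
        char x = '1';      // character literal: the digit character, value 0x31
        return x;
    }

    static char y() {
        char y = '\u0031'; // Unicode escape INSIDE quotes: same value, 0x31
        return y;
    }

    public static void main(String[] args) {
        System.out.println(x());       // prints 1 (the character)
        System.out.println(y());       // prints 1 (the character)
        System.out.println((int) x()); // prints 49, i.e. 0x31
    }
}
```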
The first example, x, uses a character literal to set the integral char to a value of 0x31 (or \u0031). The second example, y, uses a Unicode escape to do the same. Now if you print out x or y, you will print out "1". The lesson here is that char values aren't really "characters"; they are integral values which are interpreted as "characters" in the correct context. Perhaps a gander at the Unicode Character Table is in order?