Could someone please clarify this for me: why is char c = 100; legal, while int i = 100; char c = i; is not? Similarly, c = c + i; is rejected but c += i; compiles. Thanks in advance for your help.
char c = 100; // ok in a declaration as long as the value is within range
int i = 100; char c = i; // needs a cast... I think because you're not using a literal here, so the compiler can't check the value
c = c + i; // needs a cast
c += i; // ok, but I'm not sure why. I know that c += 1 is fine because the automatic promotion to int doesn't happen with +=, but I'm confused about c += i not needing a cast. Anyone?
Just to add more:
char c = 100; // OK because 100 is a constant that fits within char's range
int i = 100; char c = i; // invalid because i is a variable, whose value could be outside char's range
Similarly, c = c + i; is invalid because the result of an arithmetic operation is at least an int, which is wider than char.
c += i; is OK because the op= operators perform an implicit cast back to the left-hand type.
That's it, in brief.
Herbert.
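To pull the thread together, here is a small sketch (the class name NarrowingDemo is just for illustration) showing which of these lines the compiler accepts. The key point is that the JLS defines c += i as equivalent to c = (char)(c + i), which is why the compound form compiles without an explicit cast:

class NarrowingDemo {
    public static void main(String[] args) {
        char c = 100;        // compiles: 100 is a constant within char's range (0..65535)
        int i = 100;
        // char c2 = i;      // does not compile: i is a variable, so the compiler can't prove it fits
        char c2 = (char) i;  // explicit cast required
        // c = c + i;        // does not compile: c + i is promoted to int
        c += i;              // compiles: treated as c = (char)(c + i)
        System.out.println((int) c);  // prints 200
    }
}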
class MyClass {
    public static void main(String[] args) {
        final int i = 200;
        char c = i;
        System.out.println(c);
    }
}
This will compile in JDK 1.3; note the final.
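That works because a final int initialized with a constant is itself a compile-time constant, and a constant that fits in char's range may be assigned without a cast. A small variation (the class name FinalDemo and the variable names are just for illustration) shows where that stops working:

class FinalDemo {
    public static void main(String[] args) {
        final int ok = 200;
        char c1 = ok;            // compiles: compile-time constant within char's range
        // final int tooBig = 70000;
        // char c2 = tooBig;     // would not compile: constant is outside char's range (0..65535)
        // int notFinal = 200;
        // char c3 = notFinal;   // would not compile: not a compile-time constant
        System.out.println((int) c1);  // prints 200
    }
}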