The following is a snippet:

char e = 'a';
int d = 9;
e *= d;
Doesn't that look weird? It does to me. How can we multiply a char by an int? Yet it compiles fine, with no error.
However, when we replace e *= d with e = e * d, the compiler reports an error. Can anyone please explain why?
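For reference, here is a minimal complete example reproducing both cases. I'm assuming Java here, since the behaviour described (the compound assignment compiles but the expanded form does not) matches Java's rules for assigning an int-valued expression to a char; the class name CharMultiplyDemo is only for illustration.

public class CharMultiplyDemo {
    public static void main(String[] args) {
        char e = 'a';   // 'a' has the numeric value 97
        int d = 9;

        e *= d;                       // compiles fine, no error
        System.out.println((int) e);  // prints 873 (97 * 9)

        // e = e * d;                 // compiler error: incompatible types,
        //                            // possible lossy conversion from int to char

        e = (char) (e * d);           // compiles once the int result is cast back to char
        System.out.println((int) e);
    }
}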