Hi,
Well, colloquially there's no difference at all, and this is probably why you're confused. Let's see if I can make a useful distinction.
To "convert" one type to another means the actual bits have to change. For example, the computer's internal representation for integer 3 and double 3.0 are different. If you have a variable of type int, and want to assign the value 3.0 to it, then the value has to be converted. This requires actual computation. In
Java, it only makes sense to talk about converting from one numeric primitive type to another: long to int, double to float, etc. You can't convert from an object type to a primitive type, or vice versa.
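For example, widening conversions like these happen automatically, with no special syntax at all (the variable names here are just made up for illustration):

    int i = 3;
    long bigger = i;   // implicit widening conversion: the 32-bit int becomes a 64-bit long
    double d = i;      // implicit widening conversion: the bits are recomputed as a double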
To "cast" from one type to another can mean to tell the Java compiler to just treat a value that it thinks is of some type T1 as if it were actually of type T2. For example, a variable "o" of type Object might be referring to a value of type
String. To tell the Java compiler to treat the value in o as a String, you can use a cast:
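    Object o = "hello";       // o's declared type is Object; the actual value is a String
    String s = (String) o;    // the cast "(String)" tells the compiler: treat o as a String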
Here T1 is Object, T2 is String, and the form "(String)" is the cast. Note that nothing gets converted or changed -- you're just telling the compiler to treat this object reference as a reference to another type.
Now, here's where it gets tricky. The cast syntax is also used to tell the compiler that a value should be converted from one primitive type to another. In the "3.0" example from before, the cast is required, because converting a double to an int can lose information, so Java makes you say explicitly that you want it:
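    int i = (int) 3.0;    // the bits change from the double representation of 3.0
                          // to the int representation of 3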
So here a cast is used to tell the compiler to do a conversion.
See the difference?