I believe the reason for sizeof in C/C++ (and other languages?) is that the size of a type is not fixed across platforms. For instance, an "int" might be 2 bytes on some old systems, 4 bytes on typical current systems, and maybe bigger on super-duper systems.
In Java, the size of each type is fixed by the language specification. A JVM is not free to change the size of any type. For instance, an "int" is always 4 bytes, whatever system it runs on.
Therefore, it is safe to "hard-code" the sizes of types into your code. You may just use a literal 4 to represent the number of bytes in an "int", for instance.
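In fact, the standard library bakes these fixed sizes in: `Integer.BYTES` is defined to be 4, and `DataOutputStream.writeInt` always emits exactly four bytes, on every JVM. A minimal illustration:

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;

public class IntSize {
    public static void main(String[] args) throws IOException {
        // Fixed by the language specification, identical on every platform.
        System.out.println(Integer.SIZE);   // prints 32 (bits)
        System.out.println(Integer.BYTES);  // prints 4 (bytes)

        // writeInt serializes an int as exactly 4 bytes, regardless of platform.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        new DataOutputStream(buf).writeInt(123456789);
        System.out.println(buf.toByteArray().length); // prints 4
    }
}
```

So where C code must write `sizeof(int)`, Java code can safely use the literal 4 (or, more readably, `Integer.BYTES`).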
Can someone answer a more subtle question for me? I know that a Java "int" is required to always appear to be 4 bytes from the point of view of Java code. But is it actually required to occupy 4 bytes of memory? Obviously it can't use less, but can it use more (padding)? A pure Java program can't find this out, because it has no access to real pointers, but perhaps the native, debugging, or profiling interfaces do impose such a requirement.