Got bit by a JOGL "portability bug" that drove me crazy for about three hours today, and thought I'd report it so it didn't happen to anybody else!
OpenGL defines an unsigned integer type, "GLuint", which JOGL maps to a Java int. That mapping isn't quite right: GLuint is a 32-bit unsigned int on most architectures, while a Java int is signed, so a GLuint value with its high bit set shows up in Java as a negative number. That means code originally written in C can misbehave if naively translated to Java.
The function glGenLists() (which creates display lists, basically an enormous optimization) returns GLuint. It returns 0 on failure and a nonzero list name on success; the nonzero value is the handle you use afterward. Now, there exists C code that looks like
But in Java, glGenLists() can return a negative number on success, so this code would be broken.
So to make a long story short: NVIDIA's drivers return small positive integers here when used with GeForce3 or 4 or Quadro 8xx or 9xx cards; but the very same drivers return list names that appear in Java as large negative numbers when you've got a Quadro FX card. The symptom was that the application ran very slowly on these high-end cards (because the display-list optimization was silently disabled) and fast on cheaper cards. Took me three hours to track this down. Don't let this happen to you! Now the code says
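In effect, the fix is to test against zero rather than for positivity. (Again a sketch: the real JOGL call needs a GL context, so fakeGlGenLists() below is a hypothetical stub simulating a Quadro FX driver's return value.)

```java
public class GenListsCheck {
    // Hypothetical stand-in for JOGL's gl.glGenLists(1): simulates
    // a driver returning a GLuint with the high bit set, which Java
    // reinterprets as a negative int.
    static int fakeGlGenLists(int range) {
        return (int) 0xC0000001L;
    }

    public static void main(String[] args) {
        int list = fakeGlGenLists(1);

        // Wrong (the naive C translation): a perfectly valid list
        // name fails this test, silently disabling the optimization.
        boolean naive = list > 0;

        // Right: glGenLists() returns 0 only on failure, so the
        // correct success test is != 0.
        boolean fixed = list != 0;

        System.out.println("naive=" + naive + " fixed=" + fixed);
        // prints: naive=false fixed=true
    }
}
```

The same rule applies to any GLuint-returning call bridged into Java: compare against the documented failure value, never against zero with an ordering operator.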