Miklos Szeles wrote:
Can anybody tell me which color model is used most widely in computers? I thought it was INT_BGR, but I tested the ColorModel returned from the GraphicsConfiguration on 3 computers and it was 3BYTE_BGR. Based on Filthy Rich Clients I thought that most computers use 32-bit color models nowadays.
As far as I know, when my image is not in a format compatible with the graphics device, a color space conversion happens before the image is drawn. The image data comes from a native image decoder which currently produces 3BYTE_BGR output. So I thought it might be worth implementing the INT_BGR model as well, but if computers don't actually use it I won't spend time on INT_BGR support.
Color model type: 5
Pixel size: 32
Color model: DirectColorModel
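Output like the above can be produced with a small probe along these lines (a minimal sketch, not the exact code used in the thread; the "type" value comes from the color space, and ColorSpace.TYPE_RGB happens to be 5):

import java.awt.GraphicsConfiguration;
import java.awt.GraphicsEnvironment;
import java.awt.image.ColorModel;

public class ColorModelProbe {
    public static void main(String[] args) {
        GraphicsConfiguration gc = GraphicsEnvironment
                .getLocalGraphicsEnvironment()
                .getDefaultScreenDevice()
                .getDefaultConfiguration();
        ColorModel cm = gc.getColorModel();
        // ColorSpace.TYPE_RGB is 5, so this prints 5 on any RGB display;
        // the pixel size and the color model class are what actually differ.
        System.out.println("Color model type: " + cm.getColorSpace().getType());
        System.out.println("Pixel size: " + cm.getPixelSize());
        System.out.println("Color model: " + cm.getClass().getSimpleName());
    }
}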
Ulf Dittmer wrote:
Color model type: 5
Miklos Szeles wrote: That's what I thought. When I realized the color model type is not the information I was looking for, I found the pixel size, which tells much more. So the question is not whether RGB is used, but which type of RGB is used. Is it operating system dependent? Video card dependent?
Miklos Szeles wrote: I think you misunderstood me.
What I want is the following:
When you create an image with createCompatibleImage you get an image with the color format used by your display system. So whenever you use drawImage, no additional color space conversion occurs, since your image is compatible with the device. When you use any other format, the system must convert the image. I use a native decoder which decodes images in YV12 format and then converts them to 3BYTE_BGR.
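One way to pay that conversion cost only once, instead of on every drawImage call, is to copy each decoded frame into a compatible image up front. A minimal sketch (the toCompatible helper is illustrative, not code from the thread):

import java.awt.Graphics2D;
import java.awt.GraphicsConfiguration;
import java.awt.image.BufferedImage;

public class CompatibleImages {
    // Copies a decoded frame (e.g. TYPE_3BYTE_BGR) into an image that matches
    // the screen's color model, so later drawImage calls skip the conversion.
    public static BufferedImage toCompatible(BufferedImage src, GraphicsConfiguration gc) {
        if (src.getColorModel().equals(gc.getColorModel())) {
            return src; // already compatible, nothing to do
        }
        BufferedImage dst = gc.createCompatibleImage(
                src.getWidth(), src.getHeight(), src.getTransparency());
        Graphics2D g = dst.createGraphics();
        g.drawImage(src, 0, 0, null); // the conversion happens once, here
        g.dispose();
        return dst;
    }
}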
If your graphics device format is 3BYTE_BGR then everything is fine, but an additional conversion happens in the background when the device's color format is different. I ran some performance tests, and they show that the conversion doubles the drawing time, so it is really important to skip it if possible.
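A rough micro-benchmark in the spirit of those tests might look like the sketch below (frame size and iteration count are arbitrary, and it draws to an offscreen compatible image rather than the real screen surface):

import java.awt.Graphics2D;
import java.awt.GraphicsConfiguration;
import java.awt.GraphicsEnvironment;
import java.awt.image.BufferedImage;

public class DrawBench {
    public static void main(String[] args) {
        GraphicsConfiguration gc = GraphicsEnvironment
                .getLocalGraphicsEnvironment()
                .getDefaultScreenDevice()
                .getDefaultConfiguration();
        BufferedImage bgr = new BufferedImage(1280, 720, BufferedImage.TYPE_3BYTE_BGR);
        BufferedImage compat = gc.createCompatibleImage(1280, 720);
        BufferedImage target = gc.createCompatibleImage(1280, 720);
        Graphics2D g = target.createGraphics();
        System.out.println("3BYTE_BGR:  " + time(g, bgr) + " ms");
        System.out.println("compatible: " + time(g, compat) + " ms");
        g.dispose();
    }

    // Draws the image 500 times and returns the elapsed wall-clock milliseconds.
    static long time(Graphics2D g, BufferedImage img) {
        long start = System.nanoTime();
        for (int i = 0; i < 500; i++) {
            g.drawImage(img, 0, 0, null);
        }
        return (System.nanoTime() - start) / 1000000;
    }
}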
I'm using Windows XP, and in the display settings panel 32-bit is chosen, so I thought my display device must use some 32-bit color format. But I was wrong.
I tested this on a few computers and found that their compatible images use 3BYTE_BGR. But Filthy Rich Clients says that nowadays computers use 32-bit representations for pixels. That seemed strange, since I experienced otherwise, and that was the reason for starting this topic. As Ulf reported, his system uses an INT representation. So I just wanted to know which part of a computer determines the color format of the display device. Is it the video card? Is it the monitor? Is it the operating system?...