Hi, it’s me again.
I have a general problem when using textures with an alpha channel (GL_RGBA).
When using textures without alpha (GL_RGB), it seems that these are internally stored (or at least displayed) with a colour depth of 16 bits (RGB565 or RGB555). This is just a guess based on my perception of the resulting image (I can see some banding seams, but not too many).
Now, when I add an alpha channel to the source image and load it into OpenGL ES (with GL_RGBA), the colour depth becomes extremely bad. My suspicion is that the alpha texture is also internally stored (or at least displayed) with 16 bits. That would be the same budget as with GL_RGB, but now the alpha channel has to fit in as well. So it may be that RGBA4444 is used, which leaves 12 bits for the colour, i.e. only 4096 distinct colours to display...
In PC emulation everything works fine: the colour transitions are seamless. The problem only occurs on the embedded device.
Also, when using one texture with an alpha channel and one without in the same scene, only the texture with the alpha channel shows the reduced colour count.
Furthermore, it depends neither on the OpenGL settings nor on the render states (GL_BLEND can be disabled too; it doesn't matter).
It can be reproduced with the "08_AlphaBlend" TrainingCourse project from the OGLES-1.1_WINDOWS_PCEMULATION_2.02.22.0756 SDK. Just disable the blending (otherwise it is not easily noticeable) and you'll see the red dots rendered with very few colours.
It's even more obvious if you replace the background texture with an image that has smooth colour gradients (and an alpha channel that is fully opaque).
Is this a known problem? Does it depend on the driver or the hardware? Is there a solution?