I have a general problem when using textures with an alpha channel (GL_RGBA).
When using textures without an alpha channel (GL_RGB), these seem to be stored internally (or at least displayed) with a colour depth of 16 bits (RGB565 or RGB555). This is just a guess based on how the resulting image looks (I can see some banding, but not much).
Now, when I add an alpha channel to the source image and load it into OpenGL ES (with GL_RGBA), the colour depth becomes extremely bad. My suspicion is that the alpha texture is also stored internally (or at least displayed) at 16 bits. That is the same total as with GL_RGB, but now the alpha channel has to be stored too. So it may be that RGBA4444 is used, which leaves only 12 bits for colour, i.e. only 4096 possible colours to display...
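For illustration, the upload I am talking about looks roughly like this (just a sketch; the actual texture loading in the TrainingCourse is done by the SDK tools, and width, height and the pixel pointers are placeholders):

/* 32-bit RGBA source data handed to OpenGL ES; my suspicion is that
   the driver stores this internally as RGBA4444 on the device */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels8888);

/* for comparison: an explicit 16-bit RGBA4444 upload, which is what
   the on-screen result looks like to me */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_SHORT_4_4_4_4, pixels4444);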
In PC emulation everything works fine and I get a smooth colour gradient; the problem only occurs on the embedded device.
Also, if I use one texture with an alpha channel and one without in the same scene, only the texture with the alpha channel shows the reduced colour depth.
Furthermore, it does not depend on the OpenGL settings or states (GL_BLEND can be disabled as well; it makes no difference).
It can be reproduced with the "08_AlphaBlend" TrainingCourse project from the OGLES-1.1_WINDOWS_PCEMULATION_2.02.22.0756 SDK. Just disable blending (otherwise it is not easily noticeable) and you will see the red dots rendered with very few colours.
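In the demo that just means making sure blending is off before the textured quads are drawn; roughly (from memory, so the exact location in the render function may differ):

/* was: glEnable(GL_BLEND); */
glDisable(GL_BLEND);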
It is even more obvious if you replace the background texture with an image containing smooth colour gradients (and an alpha channel that is fully opaque everywhere).
Is this a known problem? Does it perhaps depend on the driver or the hardware? Is there a solution?
My device is a Freescale IMX31 board with a PowerVR MBX chip.
I didn't modify the TrainingCourse project, but the maximum colour depth of the display on the device seems to be RGB565, i.e. 16 bits.
If I set up the EGL config attributes like this:
conflist[i++] = EGL_RED_SIZE;
conflist[i++] = 5;
conflist[i++] = EGL_GREEN_SIZE;
conflist[i++] = 6;
conflist[i++] = EGL_BLUE_SIZE;
conflist[i++] = 5;
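the rest of the config selection is the usual boilerplate, roughly like this (a sketch; display, config and numConfigs are just the usual EGL variables in my code):

conflist[i++] = EGL_NONE;  /* terminate the attribute list */

EGLConfig config;
EGLint numConfigs;
eglChooseConfig(display, conflist, &config, 1, &numConfigs);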
and if I then ask for the values like this:
GLint redBits, greenBits, blueBits, alphaBits;
glGetIntegerv(GL_RED_BITS, &redBits);
glGetIntegerv(GL_GREEN_BITS, &greenBits);
glGetIntegerv(GL_BLUE_BITS, &blueBits);
glGetIntegerv(GL_ALPHA_BITS, &alphaBits);
I get 5/6/5 and an alpha size of 0 (but I guess alpha is not needed for the display buffer anyway). Nevertheless, the result looks like 444; it definitely does not look as if all 16 bits were used for colour alone.
It looks exactly the same as when I save the picture with your PVRTexTool in the format ARGB4444.
And if I just change the image to one without an alpha channel (with no other change to the code at all), the result looks fine. glGetIntegerv still reports a colour depth of 565 and 0 bits for alpha.
Could you check if the device supports the GL_IMG_texture_format_BGRA8888 extension? This extension defines a new token, GL_BGRA (0x80E1), for texture data in BGRA order. You can use this to get 32-bit textures.
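Something along these lines should work for checking and using it (a rough sketch; width, height and bgraPixels are placeholders, and the GL_BGRA define may need to be added manually if your headers don't provide it):

#include <string.h>

#ifndef GL_BGRA
#define GL_BGRA 0x80E1
#endif

/* check the extension string at runtime */
const char *ext = (const char *)glGetString(GL_EXTENSIONS);
if (ext && strstr(ext, "GL_IMG_texture_format_BGRA8888"))
{
    /* 32-bit texture data in BGRA byte order; the extension requires
       GL_BGRA as both the internal format and the format */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_BGRA, width, height, 0,
                 GL_BGRA, GL_UNSIGNED_BYTE, bgraPixels);
}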
Thank you very much, Xmas! BGRA works on my device with 32 bits.
One question remains: is there no other way to use 32-bit textures on this device?
I find it strange that I pass 32-bit data to OpenGL ES but the textures are only displayed with 16 bits, even though the device is obviously able to process 32 bits and display them when this extension is used.