I found that with the OpenGL ES 2.0 emulation on my PC, glReadPixels behaves contrary to the OpenGL ES 2.0 spec.
I have a 16-bit framebuffer in GL_RGB565 format.
GL_VENDOR: Imagination Technologies (Host GL: `ATI Technologies Inc.`)
GL_RENDERER: PowerVR PVRVFrame 8.1 SGX (Host GL: `ATI Radeon HD 5670`)
GL_VERSION: OpenGL ES 2.0 ( SDK build: 2.08.28.0607 )
The spec says that glReadPixels called with GL_RGBA, GL_UNSIGNED_BYTE should return the color components in the order R, G, B, A:
"Components are packed with the first component in the most significant bits of the bitfield, and successive components occupying progressively less significant locations. The most significant bit of each component is packed in the most significant bit location of its location in the bitfield."
However, glReadPixels in fact returns
a) the components in the order A, B, G, R, and
b) each component left-aligned within its bits: only the most significant bits of each component are used, up to that component's bit count. That means, in each read pixel (a 32-bit value), blue occupies the upper 5 bits of the mask 0xF80000, green the upper 6 bits of 0xFC00, and red the upper 5 bits of 0xF8.
So the 32-bit value has to be interpreted as "bbbbb---gggggg--rrrrr---" ("-" being an unset bit, "b" a blue bit, "g" a green bit and "r" a red bit).
I would have expected something like "rrrrrrrrggggggggbbbbbbbbaaaaaaaa".
glReadPixels seems to behave like a weird combination of GL_PACK_SWAP_BYTES == GL_TRUE and GL_PACK_LSB_FIRST == GL_TRUE, even though these pack parameters are neither supported by OpenGL ES 2.0 nor can they be set or queried by an application, in contrast to desktop OpenGL.
Is there any fix available or planned? Is it a known bug?
Thanks in advance!