Hi, I'm working on a device with a PowerVR Rogue Hood (driver 1.3@2876724).
I'm surprised to see that a 2014 device offers only 24-bit integer precision for highp (as reported by glGetShaderPrecisionFormat); I expected 32 bits nowadays…
(which may be the reason for some bugs I'm seeing on this device in code that runs fine on all other devices)
My understanding is that on this kind of device, only 24 bits of (u)int arithmetic are reliable.
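For reference, here is a minimal sketch (assuming an ES 3.x context is current) of the query I'm talking about:

```c
#include <GLES3/gl3.h>
#include <stdio.h>

/* Query the highp integer precision of the vertex shader stage. */
static void print_highp_int_precision(void)
{
    GLint range[2] = {0, 0};  /* log2 of the smallest/largest representable magnitude */
    GLint precision = 0;      /* reported as 0 for integer formats */

    glGetShaderPrecisionFormat(GL_VERTEX_SHADER, GL_HIGH_INT, range, &precision);

    /* A full 32-bit integer path typically reports range = {31, 30};
     * this device reports a smaller range, hence the 24-bit concern. */
    printf("highp int: range = [-2^%d, 2^%d], precision = %d\n",
           range[0], range[1], precision);
}
```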
Let's consider the case of a transform-feedback-only vertex shader (rasterizer discarded), outputting uvec4 values into a 4 × u32 buffer.
Will the shader compiler be smart enough to merge 16/24-bit intermediate results so that it writes valid 32-bit results,
or should I assume that at most 24 of the 32 output bits will be valid?
(And in that case, will the top 8 bits be cleared, set, or undefined?)
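To make the scenario concrete, here is a sketch of what I mean. All names (u_key, a_value, out_hash, tfBuffer, vertexCount) are made up, and the 16-bit-halves arithmetic is just one possible defensive pattern, not something I'm claiming is required; the final recombined store on the last shader line is exactly the 32-bit write whose top 8 bits I'm asking about.

```c
#include <GLES3/gl3.h>

/* Vertex shader doing its arithmetic on 16-bit halves, so no intermediate
 * needs more than ~17 bits, then recombining into a 32-bit TF output. */
static const char *kVsSrc =
    "#version 300 es\n"
    "uniform uvec4 u_key;\n"
    "in uvec4 a_value;\n"
    "flat out uvec4 out_hash;   // captured by transform feedback\n"
    "void main() {\n"
    "    uvec4 lo = a_value & 0xFFFFu;                  // low 16 bits\n"
    "    uvec4 hi = a_value >> 16u;                     // high 16 bits\n"
    "    uvec4 sumLo = lo + (u_key & 0xFFFFu);          // at most 17 bits\n"
    "    uvec4 sumHi = (hi + (u_key >> 16u) + (sumLo >> 16u)) & 0xFFFFu;\n"
    "    out_hash = (sumHi << 16u) | (sumLo & 0xFFFFu); // full 32-bit result\n"
    "    gl_Position = vec4(0.0);\n"
    "}\n";

/* Transform-feedback-only capture, rasterizer discarded. Assumes `program`
 * has the vertex shader above plus a trivial fragment shader attached (ES
 * still requires one to link), and that a VAO feeding a_value is bound. */
void capture(GLuint program, GLuint tfBuffer, GLsizei vertexCount)
{
    const char *varyings[] = { "out_hash" };
    glTransformFeedbackVaryings(program, 1, varyings, GL_INTERLEAVED_ATTRIBS);
    glLinkProgram(program);      /* must (re)link after declaring the varyings */
    glUseProgram(program);

    glEnable(GL_RASTERIZER_DISCARD);
    glBindBufferBase(GL_TRANSFORM_FEEDBACK_BUFFER, 0, tfBuffer);
    glBeginTransformFeedback(GL_POINTS);
    glDrawArrays(GL_POINTS, 0, vertexCount);
    glEndTransformFeedback();
    glDisable(GL_RASTERIZER_DISCARD);
}
```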