I want to use the SGX540 for GPGPU work such as linear algebra. The data I need to compute on is of 16-bit short type, which I cannot map directly to a texture. Can anyone give me some hints?
Do you need to output 16-bit short values from a fragment shader?
You could pack the individual bytes into the channels of a texture.
Yes, I need to do computation on 16-bit short values in the fragment shader. But
There is a similar, well-known method for packing a floating-point value into the individual channels of a 32-bit RGBA texture (one unsigned byte per channel).
You could adapt that approach to pack your 16-bit values into two channels instead of four, if eight bits per channel gives you enough precision.