I want to use the SGX540 for GPGPU work such as linear algebra. The data I need to compute on is of 16-bit short type, so I cannot map it to a texture directly. Can anyone give me some hints?
Any reply?
Hi,
do you need to output 16 bit short values from a fragment shader?
You could pack the individual bytes into the channels of a texture.
Regards,
Marco
Yes, I need to do computation on 16-bit short values in the fragment shader. But
I don't know how to map the 16-bit short values to a texture and then output
16-bit short values.
Can you explain it in more detail? I am a newbie in GL ES and GPGPU. It would
help if you could give a code snippet.
There is a similar method for packing a floating-point value into the individual channels of a 32-bit unsigned byte RGBA texture:
http://stackoverflow.com/questions/9882716/packing-float-into-vec4-how-does-this-code-work
You could adapt that code snippet to pack your values into two channels instead of four, if that precision is enough for you.
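To make the two-channel idea concrete, here is a minimal, untested OpenGL ES 2.0 fragment shader sketch. It assumes your shorts are unsigned 16-bit values uploaded with the high byte in the red channel and the low byte in the green channel of an RGBA texture (the texture name `uData`, the varying `vTexCoord`, and the doubling "computation" are just placeholders for illustration). Note it needs `highp` float, since `mediump` cannot represent all values up to 65535 exactly:

```glsl
// Sketch only: unpack an unsigned 16-bit value from two 8-bit channels,
// compute on it, and pack the result back into two channels of gl_FragColor.
precision highp float;       // mediump lacks the range/precision for 16-bit values

uniform sampler2D uData;     // hypothetical input: high byte in .r, low byte in .g
varying vec2 vTexCoord;

// Reconstruct an integer-valued float in [0, 65535] from two normalized channels.
float unpack16(vec2 hiLo) {
    return floor(hiLo.x * 255.0 + 0.5) * 256.0 + floor(hiLo.y * 255.0 + 0.5);
}

// Split a value in [0, 65535] back into two normalized 8-bit channels.
vec2 pack16(float v) {
    float hi = floor(v / 256.0);
    float lo = v - hi * 256.0;
    return vec2(hi, lo) / 255.0;
}

void main() {
    float value  = unpack16(texture2D(uData, vTexCoord).rg);
    float result = mod(value * 2.0, 65536.0);   // placeholder computation
    gl_FragColor = vec4(pack16(result), 0.0, 1.0);
}
```

After rendering to an RGBA framebuffer, you would read the result back with glReadPixels and reassemble each short on the CPU as `(r << 8) | g`. Signed shorts could be handled the same way by biasing by 32768 before packing and subtracting it after unpacking.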