GL_IMG_texture_stream

I’ve been looking through the OMAP35x_Graphics_SDK_3_00_00_06 and see an interesting extension called GL_IMG_texture_stream. I want to use the SGX to scale the output of an emulator (e.g. NES) to full screen. At the moment I would have to take the emulator’s output and upload it into texture memory every frame.

Is there any documentation on the GL_IMG_texture_stream extension? I can see three functions (from PVRTglesExt.h):

    /* IMG_texture_stream */
    typedef void (APIENTRY * PFNGLGETTEXSTREAMDEVICEATTRIBIVIMG)(GLint device, GLenum pname, GLint *params);
    typedef void (APIENTRY * PFNGLTEXBINDSTREAMIMG)(GLint device, GLint deviceoffset);
    typedef const GLubyte * (APIENTRY * PFNGLGETTEXSTREAMDEVICENAMEIMG)(GLint device);

There is an example program called gles2_texture_stream, but I can’t see any source code for it.
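Presumably these entry points are resolved at runtime the way other extensions are, i.e. via eglGetProcAddress. Here is a self-contained sketch of that lookup; the local GL typedefs and the stub loader are my assumptions so it runs without EGL, and on target you would include the PVR/GLES headers and pass eglGetProcAddress instead:

```c
/* Local mirrors of the GL types and the PVRTglesExt.h typedefs
   (APIENTRY expands to nothing on Linux -- assumptions so this
   sketch compiles without the SDK headers). */
typedef int GLint;
typedef unsigned int GLenum;
typedef unsigned char GLubyte;

typedef void (*PFNGLGETTEXSTREAMDEVICEATTRIBIVIMG)(GLint device, GLenum pname, GLint *params);
typedef void (*PFNGLTEXBINDSTREAMIMG)(GLint device, GLint deviceoffset);
typedef const GLubyte *(*PFNGLGETTEXSTREAMDEVICENAMEIMG)(GLint device);

typedef void (*GenericProc)(void);
typedef GenericProc (*ProcLoader)(const char *name);

struct TexStreamApi {
    PFNGLGETTEXSTREAMDEVICEATTRIBIVIMG GetTexStreamDeviceAttribivIMG;
    PFNGLTEXBINDSTREAMIMG              TexBindStreamIMG;
    PFNGLGETTEXSTREAMDEVICENAMEIMG     GetTexStreamDeviceNameIMG;
};

/* Resolve the three extension entry points through a loader function;
   returns 0 on success, -1 if any lookup came back NULL. */
int load_tex_stream_api(struct TexStreamApi *api, ProcLoader load)
{
    api->GetTexStreamDeviceAttribivIMG =
        (PFNGLGETTEXSTREAMDEVICEATTRIBIVIMG)load("glGetTexStreamDeviceAttribivIMG");
    api->TexBindStreamIMG =
        (PFNGLTEXBINDSTREAMIMG)load("glTexBindStreamIMG");
    api->GetTexStreamDeviceNameIMG =
        (PFNGLGETTEXSTREAMDEVICENAMEIMG)load("glGetTexStreamDeviceNameIMG");
    return (api->GetTexStreamDeviceAttribivIMG &&
            api->TexBindStreamIMG &&
            api->GetTexStreamDeviceNameIMG) ? 0 : -1;
}

/* Stub loader so the sketch is runnable without EGL; real code passes
   eglGetProcAddress here. */
static void stub_entry(void) {}
static GenericProc stub_loader(const char *name) { (void)name; return stub_entry; }
```

On a real device the same `load_tex_stream_api` call would be made with `eglGetProcAddress` once a context is current.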

Cheers,

Mark

Mark,

 

1. You need to update to a later version of the SDK; version 09 should already be up on the website. You might need to register if you have not already done so.

 

http://software-dl.ti.com/dsps/dsps_public_sw/sdo_sb/targetcontent/dvsdk/DVSDK_3_00/latest/index_FDS.html

 

2. I will send you sample code to do this separately. What format does your emulator output come in - YUV, RGB, or something else? Does the format ever change, or is it fixed once you start the application? What texture resolution are we talking about?

 

I’ve requested the download so once that’s been cleared by customs I’ll check it out ;).

The output will be RGB and its size will be fixed once you start the application; as for the resolution, that will depend on the emulator. For the NES it’s 256 x 240.
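For a sense of the per-frame upload cost being avoided, here is the arithmetic (the 16-bit RGB565 pixel format is my assumption; the NES dimensions are from above):

```c
/* Bytes for one tightly packed W x H frame. */
unsigned long frame_bytes(unsigned long w, unsigned long h,
                          unsigned long bytes_per_pixel)
{
    return w * h * bytes_per_pixel;
}
```

At 256 x 240 in RGB565 that is 122,880 bytes per frame, or roughly 7 MB/s at 60 fps, all of which the CPU must copy into texture memory each frame without a streaming path.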

I’ll have a look through the documentation and code when I get them, but are you saying that YUV textures are supported? This could be interesting for doing funky stuff with video streams.
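For reference, YUV texturing means the hardware (or a shader) performs the usual BT.601 video-range conversion per pixel. A plain-C reference of that math (the coefficients are the standard ones, not taken from the SDK):

```c
/* Clamp a double to the 0..255 range and round to the nearest byte. */
static unsigned char clamp_u8(double v)
{
    if (v < 0.0)   return 0;
    if (v > 255.0) return 255;
    return (unsigned char)(v + 0.5);
}

/* Convert one BT.601 video-range (Y 16..235) YUV sample to 8-bit RGB. */
void yuv_to_rgb(unsigned char y, unsigned char u, unsigned char v,
                unsigned char *r, unsigned char *g, unsigned char *b)
{
    double yy = 1.164 * ((int)y - 16);
    double uu = (int)u - 128;
    double vv = (int)v - 128;
    *r = clamp_u8(yy + 1.596 * vv);
    *g = clamp_u8(yy - 0.813 * vv - 0.391 * uu);
    *b = clamp_u8(yy + 2.018 * uu);
}
```

Video black (Y=16, U=V=128) maps to RGB (0,0,0) and video white (Y=235, U=V=128) to (255,255,255); a streaming extension lets the GPU do this per texel instead of the CPU doing it per frame.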

Below is a link to a video showing what I did at my previous company with a 2D graphics accelerator; with the SGX maybe I could do a spinning video cube :D.

http://www.youtube.com/watch?v=SC7PfhpWVQo&eurl=http%3A%2F%2Fwww.youtube.com%2Fuser%2Fmikedjames&feature=player_profilepage

Mark

Yes, there are more interesting things than spinning video cubes that you can do with video texturing :). I have uploaded sample code that exercises some of these paths on the gleslayer collaboration page at,

 

https://gforge.ti.com/gf/project/gleslayer/scmsvn/?action=browse&path=%2Ftrunk%2FPackages%2FOMAP3_Graphics_SDK%2F

 

Some missing portions will appear as we start supporting them in our official releases.

 

Wow, this could be exactly what I'm looking for (to do Augmented Reality on mobiles, see http://mi.eng.cam.ac.uk/~sjt59/hips.html). The big question (apart from the documentation) is: does it work under Symbian?

See my post in the "General Mobile Graphics" forum for more background on what I'm doing and details of what I've tried so far.
https://www.imgtec.com/forum/forum_posts.asp?TID=492

Cheers for any assistance,

Simon
simontaylor1, 2009-09-17 12:19:31

Simon, you have a couple of options for this. You can do this under Symbian, but there is more existing application support on other OSes.


- If you have native access to the OMAP3 display planes, you can use them for blending the video and the "reality" features you overlay on top of it. But I doubt you have that on the Symbian device shown in your demo.

 

- For this class of applications, blending of multiple planes can be done in GLES itself, with a composited frame generated out of it and sent directly to the display. You can do this with shaders (requires GLES 2.0).

 

- You can accomplish rendering the camera stream directly without intermediate conversion steps.

 

- If you can get hold of an OMAP3-based Beagleboard running Linux, we have a demo showcasing a similar application. I will mail you separately about that.
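The shader-compositing option above boils down to a GLES 2.0 fragment shader like the one below. The uniform and varying names are illustrative (not from the SDK), and `mix_channel` is a plain-C reference for what GLSL's built-in `mix()` computes per channel:

```c
/* GLSL ES 1.00 fragment shader compositing an RGBA overlay over the
   video texture -- names are illustrative, not from the SDK. */
static const char *composite_fs =
    "precision mediump float;\n"
    "uniform sampler2D u_video;\n"
    "uniform sampler2D u_overlay;\n"
    "varying vec2 v_texcoord;\n"
    "void main() {\n"
    "    vec4 video   = texture2D(u_video, v_texcoord);\n"
    "    vec4 overlay = texture2D(u_overlay, v_texcoord);\n"
    "    gl_FragColor = vec4(mix(video.rgb, overlay.rgb, overlay.a), 1.0);\n"
    "}\n";

/* Plain-C reference for one channel of the mix() above:
   result = video * (1 - alpha) + overlay * alpha. */
double mix_channel(double video, double overlay, double alpha)
{
    return video * (1.0 - alpha) + overlay * alpha;
}
```

Where the overlay's alpha is 0 the video shows through untouched; where it is 1 the overlay replaces the video, which is exactly the per-pixel compositing the display planes would otherwise do in hardware.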

 

~prabindh

Prabindh,

Thanks for the reply. I’d like to have something working on current mobiles, hence wanting something to work on Symbian.

I’m pretty sure there’s no native access. Writing a shader for it is a good idea, I’ve never tried that before but will give it a go.

“You can accomplish rendering the camera stream directly without intermediate conversion steps.” - That sounds perfect; how do I do that?!

I was actually planning on using a Beagleboard for development but the i8910 has a nice camera and screen already and uses pretty much the same chip. It’s unfortunate that Symbian is nowhere near as open as Linux would be. I might get myself a beagleboard anyway just for tinkering with, it seems like a great little platform for all sorts of fun hacking projects.

I look forward to the email - the best address to use is sjt59 cam ac uk

Simon

A new OMAP3 Graphics SDK release, with support for allocating texture memory from user mode for the imgstream driver and a host of other new features, is available at,

 

http://software-dl.ti.com/dsps/dsps_public_sw/sdo_sb/targetcontent/gfxsdk/latest/index_FDS.html

 
I just joined to say: Thank you very much for this, prabindh. There are many downstream who will appreciate it.

Hi Prabindh,

I checked the Graphics SDK, and it looks like it requires a TI EVM. Is it possible to build it for the same hardware but a different Linux OS, like the Nokia N900?

Thanks!


As long as you can manage the kernel display driver differences, you might be able to do it. There are dependencies on the memory map as well when running from user mode.

Hi prabindh,

Thanks for your reply!

Actually, I did compile the kernel modules from the Graphics SDK successfully (pvrshm, omaplfb, bufferclass_ti). However, after installing the modules, the device was bricked. Loading the graphics drivers (pvrshm / omaplfb) produced no complaints; the problem only appeared after a reboot.

It’s so close. :-/

Nothing that an offline chat can't help ...