Imagination PowerVR SDK Blog

unable to generate texture id



I’m using the PowerVR OpenGL ES 2.0 Windows emulator. I’ve got some simple applications running fine. However, in one application the following call is failing:

unsigned int id;
glGenTextures(1, &id);

It does not register an OpenGL error, but the value of “id” after the call is 0. Does anyone have any idea why this would happen? This code is part of a texture class which is used successfully in other samples.

Could someone on the PowerVR team describe what conditions would result in this function returning an id of 0?


Do other projects using textures such as the ones from our SDK run correctly on your system? Does something like IntroducingPOD build and run okay?

For this use of your class, are you definitely calling glGenTextures after GL is initialised properly, i.e. if you're using our shell is this code in InitView() or RenderScene()?
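One way to check that on the calling thread is a small guard before the glGenTextures call. This is a minimal sketch (the `create_texture` wrapper is hypothetical); the point is that if no EGL context is current on the thread, GL calls are effectively no-ops and on many implementations the generated name simply stays 0:

```c
#include <EGL/egl.h>
#include <GLES2/gl2.h>
#include <stdio.h>

/* Sketch: verify a context is current on *this* thread before
 * generating a texture name. With no current context, GL calls
 * do nothing and "id" is left at 0 with no GL error reported. */
GLuint create_texture(void)
{
    if (eglGetCurrentContext() == EGL_NO_CONTEXT) {
        fprintf(stderr, "No EGL context current on this thread\n");
        return 0;
    }

    GLuint id = 0;
    glGenTextures(1, &id);
    return id; /* non-zero on success */
}
```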

Gordon 2010-09-28 17:59:44


Yes, even some of my own projects which use this class are working fine, which is why this is so confusing.  I’m virtually certain things are initialized, but I’ll check again.

Could you have someone check what conditions would cause this function to return an id of 0? That might help point me in the right direction in terms of figuring out where things are going wrong.


OK, I found the source of the problem… just not sure how to fix it.

The application I’m working on has a multi-threaded rendering model.  For example, one thread can do application processing (e.g. positioning objects, determining what objects are visible), while another thread makes the actual OpenGL calls.  The draw thread lags one frame behind the app thread, but the overall throughput is greater.

So currently the calls to eglGetDisplay, eglInitialize, eglChooseConfig, and eglCreateWindowSurface are made in the app thread, while the calls to eglCreateContext, eglMakeCurrent, and eglSwapBuffers are made in the draw thread. If I change the multi-thread mode so that both the app and draw run in the same thread, everything works. However, with the app and draw in separate threads I get odd behavior, like the inability to generate a texture id described here.
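For reference, the split described above looks roughly like this (function and parameter names are illustrative, not from the actual code base):

```c
#include <EGL/egl.h>

/* App thread: window-system setup, as described above. */
void app_thread_init(EGLNativeWindowType win, EGLDisplay *dpy,
                     EGLConfig *cfg, EGLSurface *surf)
{
    *dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(*dpy, NULL, NULL);

    const EGLint attribs[] = { EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
                               EGL_NONE };
    EGLint count = 0;
    eglChooseConfig(*dpy, attribs, cfg, 1, &count);
    *surf = eglCreateWindowSurface(*dpy, *cfg, win, NULL);
}

/* Draw thread: context creation and all subsequent GL calls. */
void draw_thread_init(EGLDisplay dpy, EGLConfig cfg, EGLSurface surf)
{
    const EGLint ctx_attribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, ctx_attribs);
    eglMakeCurrent(dpy, surf, surf, ctx);
    /* ... glGenTextures and the rest of the rendering happen here ... */
}
```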

Although this may sound odd to those not used to doing things this way, we’ve been doing this with standard OpenGL for years on both Windows and Linux. Assuming this is just an issue with the emulator and not OpenGL ES itself, is there any way to make this work?


This is a known issue: BRN29913. It will be fixed in a future release. It may also not be an issue on your target platform.


In general, even if the rest of an application is multithreaded, we always recommend using a single thread for all interaction with OpenGL ES/EGL. This fits well with DirectX usage and will, as you’ve discovered, also work around this issue.

What happens if you move the EGL calls into the drawing thread?
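A minimal sketch of that rearrangement, assuming a pthread-style draw thread that owns every EGL call from display setup through swap (all names here are illustrative):

```c
#include <EGL/egl.h>
#include <GLES2/gl2.h>

/* Draw thread entry point (spawned by the app thread, e.g. with
 * pthread_create). It owns *all* EGL and GL calls, start to finish;
 * the app thread only hands over frame descriptions via a queue. */
static void *draw_thread(void *arg)
{
    EGLNativeWindowType win = *(EGLNativeWindowType *)arg;

    EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(dpy, NULL, NULL);

    const EGLint cfg_attribs[] = { EGL_RENDERABLE_TYPE,
                                   EGL_OPENGL_ES2_BIT, EGL_NONE };
    EGLConfig cfg;
    EGLint n = 0;
    eglChooseConfig(dpy, cfg_attribs, &cfg, 1, &n);

    EGLSurface surf = eglCreateWindowSurface(dpy, cfg, win, NULL);

    const EGLint ctx_attribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    EGLContext ctx = eglCreateContext(dpy, cfg, EGL_NO_CONTEXT, ctx_attribs);
    eglMakeCurrent(dpy, surf, surf, ctx);

    for (;;) {
        /* Consume the frame the app thread prepared (one frame behind,
         * as in the model described above), issue GL calls, present. */
        eglSwapBuffers(dpy, surf);
    }
    return NULL;
}
```

The app thread keeps doing the simulation and visibility work; only the EGL/GL calls move, so the one-frame-lag pipelining is preserved.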



While I’m sure it would be possible to move the EGL calls into the draw thread, that would mean making an architecture change to a mature code base, which would require significant regression testing on all the other platforms where this already works as is. I’ll probably save that as a last resort. Do you have a list of platforms on which this is known to be a problem?

Is there some way for me to know when this bug has been fixed?


This isn’t a recommended approach and so we don’t keep a list of platforms with which this will work. I would suggest that you test your application as soon as possible on your target platform(s) - which platform(s) will it be?

When this bug with PVRVFrame is fixed we’ll try to post here, but you can always check what’s new in an SDK release by going here:


Not sure what platforms we’ll be targeting yet. We’re just in the preliminary stages of seeing whether we can get our software running with OpenGL ES.

Thanks for your help.