[SOLVED] Trouble with the Tutorial and Demo Binaries


Hello,

I am completely new to OpenGL ES development, and I want to get started by running the binary demos and tutorials that come with the PowerVR SDK 2.08.28.0634.
Compiling the tutorials myself, or running the prebuilt binaries, gives me the following output for the first tutorial (HelloTriangle):

dlopen tries: libGL.so
Error: eglChooseConfig() failed

And the following for all the others (including the demos):

dlopen tries: libGL.so
PVRShell: EGL 1.3 initialized
Exit message has been set to: "PVRShell: Unable to create a context
".
InitAPI failed!
PVRShell: Unable to create a context

I am running Ubuntu 11.04 with a GeForce 7300 SE/7200 GS graphics card, without Compiz. As far as I know, this graphics card supports OpenGL 2.0.

Thanks for your help!

mapa17 2011-07-06 12:42:04

Hello,

I now know that the programs fail when calling eglChooseConfig() because num_config always comes back as 0. I have seen some other posts in the forum about similar problems. They seem to solve this by editing pcviewer.cfg, but I can't find such a file. Should pcviewer.cfg be part of a Linux SDK installation? If so, what is its content, and where do I have to put it?

Thanks!



Hi,

Our SDK requires that the NVIDIA-provided drivers are installed for your graphics card. By default Ubuntu will probably be using something like nouveau (the open-source driver for NVIDIA graphics hardware). This is a possible reason why eglChooseConfig() reports 0 host configurations that support the requested ES2 pixel format. You can check with glxinfo whether the proper drivers are installed. Let us know whether that helped.
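
If glxinfo looks right, you can also sanity-check the setup from code by initializing an EGL display and printing what it reports. A minimal sketch (assuming a working X display; compile with g++ -lX11 -lEGL):

Code:
#include <EGL/egl.h>
#include <X11/Xlib.h>
#include <cstdio>

int main()
{
    EGLDisplay dpy = eglGetDisplay((EGLNativeDisplayType)XOpenDisplay(0));
    if (dpy == EGL_NO_DISPLAY || !eglInitialize(dpy, NULL, NULL))
    {
        printf("EGL could not be initialized\n");
        return 1;
    }
    // Reported by the EGL layer sitting on top of the installed GL driver.
    printf("EGL_VENDOR:  %s\n", eglQueryString(dpy, EGL_VENDOR));
    printf("EGL_VERSION: %s\n", eglQueryString(dpy, EGL_VERSION));
    eglTerminate(dpy);
    return 0;
}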

Regards,



I'm also having issues with eglChooseConfig(). When I run HelloTriangle, it says:

dlopen tries: libGL.so
Error: eglChooseConfig() failed.

When I run the Water demo, it prints:

dlopen tries: libGL.so
PVRShell: EGL 1.3 initialized
Exit message has been set to: "PVRShell: Unable to create a context".
InitAPI failed!
PVRShell: Unable to create a context

I have copied libEGL.so and libGLESv2.so to the /usr/lib directory instead of altering LD_LIBRARY_PATH. I have an NVIDIA 9500 GT card; it has OpenGL 3.3.0/GLX 1.4 with direct rendering, using driver 290.10. I have Ubuntu 11.04 (Oneiric) and Unity 3D … I also tried without Compiz, with the same error.

In my code I can initialize EGL and get the version (1.3), but eglChooseConfig() always returns false and 0 configs. I would say it's my code, but the demos don't work either. I'm sorry to re-open this thread, but it is marked solved and I'd like to know how the issue was solved…

We are experiencing nearly the same problem with the PVRFrame SDK on 32-bit Ubuntu Linux with NVIDIA graphics hardware (a GeForce GTX 260).

The pre-compiled demos will not run:

SDKPackage_OGLES2/Demos/ChameleonMan/Media$ OGLES2ChameleonMan
dlopen tries: libGL.so
PVRShell: EGL 1.3 initialized
Exit message has been set to: "PVRShell: Unable to create a context
".
InitAPI failed!
PVRShell: Unable to create a context

glxgears runs fine.

glxinfo reports (extensions list omitted; full report at http://pastebin.ca/2097560 ):

name of display: :0
display: :0  screen: 0
direct rendering: Yes
server glx vendor string: NVIDIA Corporation
server glx version string: 1.4
server glx extensions:
client glx vendor string: NVIDIA Corporation
client glx version string: 1.4
client glx extensions:
GLX version: 1.4
GLX extensions:
OpenGL vendor string: NVIDIA Corporation
OpenGL renderer string: GeForce GTX 260/PCI/SSE2/3DNOW!
OpenGL version string: 3.3.0 NVIDIA 280.13
OpenGL shading language version string: 3.30 NVIDIA via Cg compiler

The uname for the system is:

Linux UBU1 3.0.0-12-generic #20-Ubuntu SMP Fri Oct 7 14:50:42 UTC 2011 i686 athlon i386 GNU/Linux

I've been able to compile our app against the PVR SDK, but it won't run either, reporting:

max@UBU1:~/Dev/OSG/data$ osgvertexattributes cow.osg
GraphicsWindowX11::init() - eglInitialize() succeded eglMajorVersion=1 iMinorVersion=3
GraphicsWindowX11::init() - window created =1
GraphicsWindowX11::init() - eglChooseConfig() failed.
GraphicsWindow has not been created successfully.

I would greatly appreciate any assistance troubleshooting this.

Hi Raid and AlphaPixel,

This error typically presents itself when the NVIDIA drivers are not properly installed; however, judging from your glxinfo report, everything looks good. It may be worth doing a clean reinstall of the drivers to see if that helps.

Otherwise, try running a minimal eglChooseConfig() test with an empty attribute list and see how many configs EGL reports.


Code:
#include <GL/gl.h>
#include <EGL/egl.h>
#include <X11/Xlib.h>
#include <cstdio>

int main(int argc, char** argv)
{
    // Ask EGL how many configs match an empty (fully permissive) attribute list.
    EGLDisplay eglDisplay = eglGetDisplay((EGLNativeDisplayType)XOpenDisplay(0));
    EGLint const attribs[] = { EGL_NONE };
    EGLint num_config = 0;
    if (eglChooseConfig(eglDisplay, attribs, NULL, 0, &num_config))
        printf("eglChooseConfig succeeded. %d configs.\n", num_config);
    else
        printf("eglChooseConfig failed.\n");
    return 0;
}

It's weird: when I call eglChooseConfig() with no configs array (third parameter), it returns 6 configs… but as soon as I pass a configs array it 'passes' (returns EGL_TRUE) yet returns no configs.
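
For reference, the usual two-call pattern first queries the number of matching configs with a NULL array, then fetches them into an appropriately sized buffer. A minimal sketch (assuming eglInitialize() has already succeeded, and using an empty attribute list as in the test above):

Code:
#include <EGL/egl.h>
#include <X11/Xlib.h>
#include <cstdio>
#include <vector>

int main()
{
    EGLDisplay dpy = eglGetDisplay((EGLNativeDisplayType)XOpenDisplay(0));
    if (!eglInitialize(dpy, NULL, NULL))
    {
        printf("eglInitialize failed\n");
        return 1;
    }

    EGLint const attribs[] = { EGL_NONE };

    // First call: NULL array, so EGL only reports how many configs match.
    EGLint count = 0;
    if (!eglChooseConfig(dpy, attribs, NULL, 0, &count) || count == 0)
    {
        printf("no matching configs\n");
        return 1;
    }

    // Second call: fetch the configs into a buffer of the reported size.
    std::vector<EGLConfig> configs(count);
    EGLint returned = 0;
    eglChooseConfig(dpy, attribs, configs.data(), count, &returned);
    printf("%d of %d configs returned\n", returned, count);
    return 0;
}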

Hi Guys,

I ran into the same problem a few days ago, and here is what I've found out so far.

I saved the code that Chris posted in a file called test.cpp and compiled it with "g++ test.cpp -lX11 -lEGL". That produced a file called a.out, and after setting the executable flag via "chmod +x a.out" I could run it. The resulting output was "eglChooseConfig failed.".

After that I modified the source code to print the result of eglGetError() after the call to eglChooseConfig(). The test application then said: "eglGetError: 12289". A quick search on Google showed that error code 12289 is EGL_NOT_INITIALIZED (http://www.androidjavadoc.com/1.0_r1/javax/microedition/khronos/egl/EGL10.html). After some more research I found the related eglInitialize() function, which wasn't called in the test application. So I added an eglInitialize() call between the eglGetDisplay() and eglChooseConfig() calls, and now the test application seems to work. The output is now:

eglGetDisplay()
eglGetError: 12288.
eglInitialize()
libEGL warning: DRI2: failed to authenticate
eglGetError: 12288.
eglChooseConfig()
eglChooseConfig succeeded. 24 configs.
eglGetError: 12288.



The code inside test.cpp is currently like this:


#include <GL/gl.h>
#include <EGL/egl.h>
#include <X11/Xlib.h>
#include <cstdio>

int main(int argc, char** argv)
{
    EGLint nErr;

    printf("eglGetDisplay()\n");
    EGLDisplay eglDisplay = eglGetDisplay((EGLNativeDisplayType)XOpenDisplay(0));

    nErr = eglGetError();
    printf("eglGetError: %d.\n", nErr);

    // The fix: initialize EGL before calling eglChooseConfig().
    printf("eglInitialize()\n");
    EGLBoolean initialized = eglInitialize(eglDisplay, NULL, NULL);

    nErr = eglGetError();
    printf("eglGetError: %d.\n", nErr);

    EGLint const attribs[] = { EGL_NONE };
    EGLint num_config = 0;

    printf("eglChooseConfig()\n");
    if (eglChooseConfig(eglDisplay, attribs, NULL, 0, &num_config))
        printf("eglChooseConfig succeeded. %d configs.\n", num_config);
    else
        printf("eglChooseConfig failed.\n");

    nErr = eglGetError();
    printf("eglGetError: %d.\n", nErr);

    return 0;
}



I hope this will lead me in the right direction to make the training courses and demos work too. Is this missing eglInitialize() call maybe a bug in the framework? I am running the SDK on a Samsung laptop with a GeForce 9600M graphics card and Linux Mint Debian Edition with the official NVIDIA drivers. glxinfo says that OpenGL 3.3 is supported.

I too get "unable to create a context".

This happens because eglChooseConfig() returns 0 configs and EGL_TRUE.

This happens on OS X, Win7, and Win8. Hardware: 17" MBP, late 2011.

It used to work fine until I introduced a bug in the calling code; it hasn't run since. Is there some persistent configuration stored anywhere that I can delete so it would start as if on a first run?



Thanks,

Willem
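
One detail worth noting for this symptom: eglChooseConfig() can legally return EGL_TRUE with zero matching configs, so num_config == 0 has to be treated as a failure in its own right rather than relying on the return value alone. A minimal sketch of that defensive check (the helper name is illustrative, not part of the SDK):

Code:
#include <EGL/egl.h>
#include <cstdio>

// Succeeds only if eglChooseConfig() returns EGL_TRUE *and* at least
// one config actually matched. (Illustrative helper, not SDK code.)
bool chooseConfigChecked(EGLDisplay dpy, const EGLint* attribs, EGLConfig* out)
{
    EGLint num_config = 0;
    if (!eglChooseConfig(dpy, attribs, out, 1, &num_config))
        return false;  // genuine failure; check eglGetError()
    if (num_config == 0)
    {
        printf("eglChooseConfig returned EGL_TRUE but 0 configs\n");
        return false;  // "success" with nothing usable
    }
    return true;
}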