I am completely new to OpenGL ES development, and I want to get started by running the binary demos and tutorials that come with the POWERVR SDK 2.08.28.0634. Compiling the tutorials myself, or running the prebuilt binaries, gives me the following for the first tutorial (HelloTriangle)
And the following for all others (including the demos):
dlopen tries: libGL.so
PVRShell: EGL 1.3 initialized
Exit message has been set to: "PVRShell: Unable to create a context".
InitAPI failed!
PVRShell: Unable to create a context
I am running Ubuntu 11.04 with a GeForce 7300 SE/7200 GS graphics card, without Compiz. As far as I know, this graphics card supports OpenGL 2.0.
I now know that the programs fail while calling eglChooseConfig(), because num_config always returns 0. I have seen some other posts in the forum about similar problems. They seem to solve this by editing pcviewer.cfg, but I can't find such a file. Should pcviewer.cfg be part of a Linux SDK installation? If so, what is its content, and where do I have to put it?
Our SDK requires that you have the NVIDIA-provided drivers installed for your graphics card. By default, your Ubuntu installation will probably be using something like nouveau (the open-source driver for NVIDIA graphics hardware). This is a possible reason why eglChooseConfig() reports 0 as the number of host configurations that support the requested ES2 pixel format. You can check with glxinfo whether you have the proper drivers installed. Let us know whether it helped.
I’m also having issues with the eglChooseConfig. When I run HelloTriangle, it says:
dlopen tries: libGL.so
Error: eglChooseConfig() failed.
When I run the Water Demo, it prints:
dlopen tries: libGL.so
PVRShell: EGL 1.3 initialized
Exit message has been set to: “PVRShell: Unable to create a context”.
InitAPI failed!
PVRShell: Unable to create a context
I have copied libEGL.so and libGLESv2.so to /usr/lib instead of altering LD_LIBRARY_PATH. I have an NVIDIA 9500 GT card with OpenGL 3.3.0/GLX 1.4 and direct rendering, using driver 290.10. I have Ubuntu 11.04 (Oneiric) and Unity 3D … I also tried without Compiz, with the same error.
In my code, I can initialize EGL and get the version (1.3), but when I call eglChooseConfig it always returns false and 0 configs. I would say it's my code, but the demos don't work either. I'm sorry to re-open this thread, but it is marked as solved, and I'd like to know how the issue was solved…
This error typically presents itself when the NVIDIA drivers are not properly installed; however, judging from your glxinfo report, I'd say everything looks good. It may be worth doing a clean reinstall of the drivers to see if that helps.
Otherwise, try running a minimal eglChooseConfig test with an empty attribute list and see how many configs EGL is reporting.
It's weird: when I call eglChooseConfig without the configs buffer (3rd parameter), it returns 6 configs… but as soon as I pass the configs buffer, it "succeeds" (EGL_TRUE) yet returns no configs.
I ran into the same problem a few days ago and here is what I’ve found out so far.
I saved the code that Chris posted in a file called test.cpp and compiled it with "g++ test.cpp -lX11 -lEGL". That resulted in a file called a.out, and after setting the executable flag via "chmod +x a.out" I could run it. The resulting output was "eglChooseConfig failed.".
After that, I modified the source code to print the result of eglGetError() after the call to eglChooseConfig(). Now the test application said: "eglGetError: 12289". After a quick search on Google, I found that error code 12289 is EGL_NOT_INITIALIZED (http://www.androidjavadoc.com/1.0_r1/javax/microedition/khronos/egl/EGL10.html). After some more research, I found the related eglInitialize() function, which wasn't called in the test application. So I added an eglInitialize() call between the eglGetDisplay() and eglChooseConfig() calls, and now the test application seems to work fine. The output is now:
eglGetDisplay()
eglGetError: 12288.
eglInitialize()
libEGL warning: DRI2: failed to authenticate
eglGetError: 12288.
eglChooseConfig()
eglChooseConfig succeeded. 24 configs.
eglGetError: 12288.
The code inside the test.cpp is currently like this:
#include <GL/gl.h>
#include <EGL/egl.h>
#include <X11/Xlib.h>
#include <cstdio>

int main(int argc, char** argv)
{
    EGLint nErr;

    printf("eglGetDisplay()\n");
    EGLDisplay eglDisplay = eglGetDisplay((EGLNativeDisplayType)XOpenDisplay(0));

    nErr = eglGetError();
    printf("eglGetError: %d.\n", nErr);

    printf("eglInitialize()\n");
    EGLBoolean initialized = eglInitialize(eglDisplay, NULL, NULL);

    nErr = eglGetError();
    printf("eglGetError: %d.\n", nErr);

    EGLint const attribs[1] = { EGL_NONE };
    EGLint num_config = 0;

    printf("eglChooseConfig()\n");
    if(eglChooseConfig(eglDisplay, attribs, NULL, 0, &num_config))
        printf("eglChooseConfig succeeded. %d configs.\n", num_config);
    else
        printf("eglChooseConfig failed.\n");

    nErr = eglGetError();
    printf("eglGetError: %d.\n", nErr);

    return 0;
}
I hope this will lead me in the right direction to make the training courses and demos work too. Is this missing eglInitialize() call maybe a bug in the framework? I am running the SDK on a Samsung laptop with a GeForce 9600M graphics card and Linux Mint Debian Edition with the official NVIDIA drivers. glxinfo says that OpenGL 3.3 is supported.
This happens because eglChooseConfig returns 0 configs and EGL_TRUE.
This happens on OS X, Windows 7, and Windows 8. Hardware: a 17" MBP, late 2011.
It used to work fine until I introduced a bug in the calling code; it hasn't been able to run since. Is there some persistent configuration stored anywhere that I can delete, so it would start as if on the first run?