OSX EGL Initialization Issues

As the title says, I’m having a few issues. The first is that, regardless of how I initialize the EGL context, the configuration attributes I pass are always ignored… In a previous version this was working fine. I’m initializing the context like this:


```
EGLDisplay m_EGLDisplay;
EGLSurface m_EGLWindow;
EGLContext m_EGLContext;

EGLNativeDisplayType m_NDT;
EGLNativePixmapType  m_NPT;
EGLNativeWindowType  m_NWT;

m_NDT = ( EGLNativeDisplayType )NULL;
m_NPT = ( EGLNativePixmapType )NULL;
m_NWT = ( EGLNativeWindowType )window->ns.view;

EGLConfig config[ 16 ];

/* Context attributes: request an ES 3 (or ES 2) context. */
EGLint attribs[] = { EGL_CONTEXT_CLIENT_VERSION,
                     3 /* or 2 */,
                     EGL_NONE },

       /* Config attributes: RGB565, no alpha, 24-bit depth,
          8-bit stencil, multisampling explicitly disabled. */
       config_attr[] = { EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
                         EGL_RED_SIZE      , 5,
                         EGL_GREEN_SIZE    , 6,
                         EGL_BLUE_SIZE     , 5,
                         EGL_ALPHA_SIZE    , 0,
                         EGL_DEPTH_SIZE    , 24,
                         EGL_STENCIL_SIZE  , 8,
                         EGL_SAMPLE_BUFFERS, 0,
                         EGL_SAMPLES       , 0,
                         EGL_SURFACE_TYPE  , EGL_WINDOW_BIT,
                         EGL_NONE },

       version_major,
       version_minor,
       num_config;

m_EGLDisplay = eglGetDisplay( m_NDT );

eglInitialize( m_EGLDisplay, &version_major, &version_minor );

eglBindAPI( EGL_OPENGL_ES_API );

fprintf( stderr, "EGL_VENDOR: %s\n"     , ( char * )eglQueryString( m_EGLDisplay, EGL_VENDOR      ) );
fprintf( stderr, "EGL_VERSION: %s\n"    , ( char * )eglQueryString( m_EGLDisplay, EGL_VERSION     ) );
fprintf( stderr, "EGL_CLIENT_APIS: %s\n", ( char * )eglQueryString( m_EGLDisplay, EGL_CLIENT_APIS ) );

EGLint i = 0, id, size, red, green, blue, alpha,
       depth, stencil, sample_buffers, samples;

/* Dump every config the implementation exposes. */
eglGetConfigs( m_EGLDisplay, config, 16, &num_config );

while( i != num_config )
{
    eglGetConfigAttrib( m_EGLDisplay, config[ i ], EGL_CONFIG_ID     , &id             );
    eglGetConfigAttrib( m_EGLDisplay, config[ i ], EGL_BUFFER_SIZE   , &size           );
    eglGetConfigAttrib( m_EGLDisplay, config[ i ], EGL_RED_SIZE      , &red            );
    eglGetConfigAttrib( m_EGLDisplay, config[ i ], EGL_GREEN_SIZE    , &green          );
    eglGetConfigAttrib( m_EGLDisplay, config[ i ], EGL_BLUE_SIZE     , &blue           );
    eglGetConfigAttrib( m_EGLDisplay, config[ i ], EGL_ALPHA_SIZE    , &alpha          );
    eglGetConfigAttrib( m_EGLDisplay, config[ i ], EGL_DEPTH_SIZE    , &depth          );
    eglGetConfigAttrib( m_EGLDisplay, config[ i ], EGL_STENCIL_SIZE  , &stencil        );
    eglGetConfigAttrib( m_EGLDisplay, config[ i ], EGL_SAMPLE_BUFFERS, &sample_buffers );
    eglGetConfigAttrib( m_EGLDisplay, config[ i ], EGL_SAMPLES       , &samples        );

    fprintf( stderr,
             "EGL_CONFIG%d: Bits:%d R:%d G:%d B:%d A:%d D:%d St:%d Sb:%d Sa:%d\n",
             id, size, red, green, blue, alpha,
             depth, stencil, sample_buffers, samples );
    ++i;
}

fprintf( stderr, "EGL_EXTENSIONS: %s\n\n", ( char * )eglQueryString( m_EGLDisplay, EGL_EXTENSIONS ) );

/* This is the call whose attributes appear to be ignored. */
eglChooseConfig( m_EGLDisplay, config_attr, &config[ 0 ], 1, &num_config );

m_EGLWindow = eglCreateWindowSurface( m_EGLDisplay, config[ 0 ], m_NWT, NULL );

m_EGLContext = eglCreateContext( m_EGLDisplay, config[ 0 ], EGL_NO_CONTEXT, attribs );

eglMakeCurrent( m_EGLDisplay, m_EGLWindow, m_EGLWindow, m_EGLContext );
```

The EGL_CONFIGs that are returned are always the same, which explains why I’m always getting a 32-bit context with multisampling:
<br />

```
EGL_CONFIG1: Bits:32 R:8 G:8 B:8 A:8 D:24 St:8 Sb:1 Sa:4
EGL_CONFIG2: Bits:32 R:8 G:8 B:8 A:8 D:24 St:8 Sb:1 Sa:4
EGL_CONFIG3: Bits:32 R:8 G:8 B:8 A:8 D:24 St:8 Sb:1 Sa:4
EGL_CONFIG4: Bits:32 R:8 G:8 B:8 A:8 D:24 St:8 Sb:1 Sa:4
EGL_CONFIG5: Bits:32 R:8 G:8 B:8 A:8 D:24 St:8 Sb:1 Sa:4
EGL_CONFIG6: Bits:32 R:8 G:8 B:8 A:8 D:24 St:8 Sb:1 Sa:4
```

1. Is there at least a workaround to fix this, or at least a way to disable multisampling (which slows things down a lot in my case)?

2. The second problem I’m having: my MBP has two video cards, an Nvidia (which PVRVFrame picks up and works fine with) and an Intel HD. PVRVFrame fails to pick up the Intel card and falls back to the software renderer… (also notice below that in both cases the GL_VERSION query fails.)

When I select the Nvidia card:

```
GL_VENDOR: Imagination Technologies (Host: NVIDIA Corporation)
GL_RENDERER: PVRVFrame 10.0 - None (Host : NVIDIA GeForce GT 330M OpenGL Engine) (SDK Build: 3.4@3186613)
GL_VERSION: (Host : 3.3 NVIDIA-8.24.16 310.90.9.05f01)
GL_SHADING_LANGUAGE_VERSION: OpenGL ES GLSL ES 1.00 (Host: 3.30)
GL_EXTENSIONS: GL_APPLE_copy_texture_levels GL_APPLE_sync GL_APPLE_texture_max_level GL_EXT_blend_minmax GL_EXT_color_buffer_float GL_EXT_debug_marker GL_EXT_discard_framebuffer GL_EXT_draw_buffers GL_EXT_multi_draw_arrays GL_EXT_multisampled_render_to_texture GL_EXT_occlusion_query_boolean GL_EXT_robustness GL_EXT_shader_texture_lod GL_EXT_sRGB GL_EXT_texture_filter_anisotropic GL_EXT_texture_rg GL_EXT_texture_storage GL_EXT_texture_sRGB_decode GL_EXT_texture_type_2_10_10_10_REV GL_IMG_multisampled_render_to_texture GL_IMG_program_binary GL_IMG_read_format GL_IMG_shader_binary GL_IMG_texture_compression_pvrtc GL_IMG_texture_compression_pvrtc2 GL_IMG_texture_npot GL_IMG_texture_stream GL_IMG_texture_stream2 GL_IMG_uniform_buffer_object GL_IMG_vertex_array_object GL_KHR_debug GL_KHR_blend_equation_advanced GL_OES_compressed_ETC1_RGB8_texture GL_OES_depth_texture GL_OES_depth_texture_cube_map GL_OES_EGL_image_external GL_OES_egl_sync GL_OES_element_index_uint GL_OES_fragment_precision_high GL_OES_get_program_binary GL_OES_mapbuffer GL_OES_packed_depth_stencil GL_OES_read_format GL_OES_required_internalformat GL_OES_sample_shading GL_OES_sample_variables GL_OES_shader_image_atomic GL_OES_shader_multisample_interpolation GL_OES_standard_derivatives GL_OES_stencil_wrap GL_OES_surfaceless_context GL_OES_texture_mirrored_repeat GL_OES_texture_stencil8 GL_OES_texture_storage_multisample_2d_array GL_OES_vertex_array_object
```

When I select the Intel card using gfxCardStatus:

```
GL_VENDOR: Imagination Technologies (Host: Apple Computer, Inc.)
GL_RENDERER: PVRVFrame 10.0 - None (Host : Apple Software Renderer) (SDK Build: 3.4@3186613)
GL_VERSION: (Host : 4.1 APPLE-9.6.1)
GL_SHADING_LANGUAGE_VERSION: OpenGL ES GLSL ES 1.00 (Host: 4.10)
GL_EXTENSIONS: GL_APPLE_copy_texture_levels GL_APPLE_sync GL_APPLE_texture_max_level GL_EXT_blend_minmax GL_EXT_color_buffer_float GL_EXT_debug_marker GL_EXT_discard_framebuffer GL_EXT_draw_buffers GL_EXT_multi_draw_arrays GL_EXT_multisampled_render_to_texture GL_EXT_occlusion_query_boolean GL_EXT_robustness GL_EXT_shader_texture_lod GL_EXT_sRGB GL_EXT_texture_filter_anisotropic GL_EXT_texture_rg GL_EXT_texture_storage GL_EXT_texture_sRGB_decode GL_EXT_texture_type_2_10_10_10_REV GL_IMG_multisampled_render_to_texture GL_IMG_program_binary GL_IMG_read_format GL_IMG_shader_binary GL_IMG_texture_compression_pvrtc GL_IMG_texture_compression_pvrtc2 GL_IMG_texture_npot GL_IMG_texture_stream GL_IMG_texture_stream2 GL_IMG_uniform_buffer_object GL_IMG_vertex_array_object GL_KHR_debug GL_KHR_blend_equation_advanced GL_OES_compressed_ETC1_RGB8_texture GL_OES_depth_texture GL_OES_depth_texture_cube_map GL_OES_EGL_image_external GL_OES_egl_sync GL_OES_element_index_uint GL_OES_fragment_precision_high GL_OES_get_program_binary GL_OES_mapbuffer GL_OES_packed_depth_stencil GL_OES_read_format GL_OES_required_internalformat GL_OES_sample_shading GL_OES_sample_variables GL_OES_shader_image_atomic GL_OES_shader_multisample_interpolation GL_OES_standard_derivatives GL_OES_stencil_wrap GL_OES_surfaceless_context GL_OES_texture_mirrored_repeat GL_OES_texture_stencil8 GL_OES_texture_storage_multisample_2d_array GL_OES_vertex_array_object
```

3. I’m also getting random crashes when building/restarting my app (on both cards, btw):

```
/Users/autobuild/buildxl/buildroot/sdk/branch/UtilitiesSrc/Common/PVRPreferences/PVRPreferences.cpp
WARNING: No declaration found at the start. The declaration will be recreated.
```

4. In addition, changing the client version value has no effect:

```
EGLint attribs[] = { EGL_CONTEXT_CLIENT_VERSION,
                     2,
                     EGL_NONE };
```

```
EGLint attribs[] = { EGL_CONTEXT_CLIENT_VERSION,
                     3,
                     EGL_NONE };
```

I was hoping that changing 2 to 3 (or vice versa) would give me the correct GL_VERSION and GL_SHADING_LANGUAGE_VERSION. This is crucial in my app, as I need to adjust the GLSL code generation and which functionality/extensions I use accordingly.
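
Just to illustrate what I mean, here is a trivial, hypothetical helper (glsl_header is a made-up name) showing the kind of branching the reported version has to drive in my generator:

```
/* Hypothetical helper: an ES 2.0 context consumes GLSL ES 1.00 shaders,
   while an ES 3.0 context expects GLSL ES 3.00. */
const char *glsl_header( int es_major_version )
{
    return ( es_major_version >= 3 ) ? "#version 300 es\n"
                                     : "#version 100\n";
}
```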

Thanks in advance for your reply… I hope there’s a workaround for these issues, or that they can be fixed soon.

Hi ROm, thanks for reporting these issues.


  1. As you’ve seen, there’s currently a nasty bug in eglChooseConfig where only the first matching config is returned. A temporary workaround would be to use eglGetConfigs to return a list of all available configs and select an appropriate one manually (see the sketch after this list).


  2. We’re currently working on better support for Intel graphics. For the time being you are probably best off sticking with the Nvidia card. There’s another bug in glGetString where GL_VERSION fails for anything above ES 2.0.



    These two bugs will be fixed in the next release, which will be early next year, but we should be releasing beta versions around December.


  3. I haven’t seen this crash before, but I’ll file a bug and try to reproduce it. I assume you mean this only happens when restarting the app in Xcode?


  4. The version of context returned by eglCreateContext also depends on the value of EGL_RENDERABLE_TYPE in the selected EGL config. In PVRVFrame you’ll always be getting one of a set of hard-coded EGL configs, and all ES2_BIT configs also support ES3_BIT_KHR, meaning you’ll always get an ES 3.0 context. I realise this isn’t ideal, so I’ll file a bug report to add a better selection of configs. That should also allow you to select a non-multisampled config (the sketch below checks for one).



    Again I’ll do my best to push these fixes in for the beta around December time. I hope these issues don’t slow you down too much in the meantime!
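
For reference, here is a minimal sketch of the eglGetConfigs workaround from point 1; it also checks EGL_RENDERABLE_TYPE, as discussed in point 4. The pick_config helper name and the selection criteria are illustrative only, and error handling is omitted:

```
/* Sketch of the workaround: enumerate every config with eglGetConfigs
   and pick one by hand instead of relying on eglChooseConfig.
   Here we look for a non-multisampled, window-capable, ES2-renderable
   config; adjust the criteria to taste. */
#include <EGL/egl.h>

EGLConfig pick_config( EGLDisplay dpy )
{
    EGLConfig configs[ 64 ];
    EGLint    n = 0, i;

    eglGetConfigs( dpy, configs, 64, &n );

    for( i = 0; i < n; ++i )
    {
        EGLint samples = 0, surface_type = 0, renderable = 0;

        eglGetConfigAttrib( dpy, configs[ i ], EGL_SAMPLES        , &samples      );
        eglGetConfigAttrib( dpy, configs[ i ], EGL_SURFACE_TYPE   , &surface_type );
        eglGetConfigAttrib( dpy, configs[ i ], EGL_RENDERABLE_TYPE, &renderable   );

        if( samples == 0 &&
            ( surface_type & EGL_WINDOW_BIT ) &&
            ( renderable & EGL_OPENGL_ES2_BIT ) )
            return configs[ i ];
    }

    /* No match: fall back to the first config, if any. */
    return n > 0 ? configs[ 0 ] : ( EGLConfig )0;
}
```
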
  1. I thought about it, but… as you can see in my code output, when I enumerate the configs they are all the same… :wink:


  2. It would be great if we could select whether to initialize GLES2 or GLES3; that makes it a lot easier to test our apps with a specific profile. I personally really need that, since I’m developing an editor and selecting the profile for the project is crucial :wink: Don’t use EGL_OPENGL_ES3_BIT_KHR (it’s basically an extension); simply use the standard EGL_OPENGL_ES2_BIT and toggle GLES2/GLES3 context creation with EGL_CONTEXT_CLIENT_VERSION. I believe that makes more sense… (see my answer to #4 for more info…)


  3. Yes, it is using Xcode. It typically happens when my app is running and I hit build and run: the app closes, then the newly compiled version launches and can’t create the context. Another thing that happens is that sometimes it gets the context (I can see on the console that it’s initialized) but the app just hangs there (at eglMakeCurrent)… I have to restart it again to get it going… not sure if it’s related, but…


  4. Hmmm, the way I believe it should be done (my $0.02) is like this: when EGL_CONTEXT_CLIENT_VERSION is set to 2, create a compatible OpenGL 2.1 context, basically NSOpenGLProfileVersionLegacy; and when ES3 is selected, initialize it internally with NSOpenGLProfileVersion3_2Core and up… On Linux and Windows it’s basically the same thing: glXCreateContextAttribsARB with GLX_CONTEXT_CORE_PROFILE_BIT_ARB, and wglCreateContextAttribsARB with WGL_CONTEXT_CORE_PROFILE_BIT_ARB (or whatever the latest version the driver supports). A rough GLX sketch follows below.
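
Here is the rough GLX sketch I mentioned in #4. It is only an illustration of my suggestion, not PVRVFrame’s actual code: create_host_context is a made-up name, the 3.3 version request is just an example, and the display/fbconfig are assumed to have been chosen elsewhere.

```
/* Map EGL_CONTEXT_CLIENT_VERSION onto a host GL context via GLX.
   Requires the GLX_ARB_create_context extension for the ES3 path. */
#include <GL/glx.h>
#include <GL/glxext.h>

static GLXContext create_host_context( Display *dpy, GLXFBConfig fbc,
                                       int es_client_version )
{
    if( es_client_version >= 3 )
    {
        /* ES 3.x emulation wants a modern core-profile host context. */
        PFNGLXCREATECONTEXTATTRIBSARBPROC create =
            ( PFNGLXCREATECONTEXTATTRIBSARBPROC )glXGetProcAddressARB(
                ( const GLubyte * )"glXCreateContextAttribsARB" );

        const int attribs[] = { GLX_CONTEXT_MAJOR_VERSION_ARB, 3,
                                GLX_CONTEXT_MINOR_VERSION_ARB, 3,
                                GLX_CONTEXT_PROFILE_MASK_ARB,
                                GLX_CONTEXT_CORE_PROFILE_BIT_ARB,
                                None };

        return create( dpy, fbc, NULL, True, attribs );
    }

    /* ES 2.0 emulation is fine on a legacy (2.1-class) host context. */
    return glXCreateNewContext( dpy, fbc, GLX_RGBA_TYPE, NULL, True );
}
```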


I’m experiencing the same issue 3 here. It seems the library cannot find the XML file during initialization, and it crashes.