glGetString(GL_EXTENSIONS) returning 0
Edited on March 27, 2017 (reason: added more info)
Using the Day 373 code, I'm getting a crash because glGetString returns 0 when asked for GL_EXTENSIONS.
It appears that querying GL_EXTENSIONS via glGetString was deprecated in OpenGL 3 in favor of retrieving each extension individually with glGetStringi.
My graphics card is an Nvidia GTX 770. Not sure why it's not failing for Casey...maybe AMD kept the old behavior around anyway?
Existing code using glGetString works if WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB is turned off.
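For reference, here is a sketch of what the context attribute list might look like with that bit left out. This is not the actual Day 373 code, just an illustrative fragment using the enum names from the WGL_ARB_create_context extension; the exact version numbers and profile mask are assumptions.

```c
// Attribute list passed to wglCreateContextAttribsARB (WGL_ARB_create_context).
// Leaving WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB out of the flags keeps
// deprecated entry points like glGetString(GL_EXTENSIONS) working.
int Attribs[] =
{
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 3,
    WGL_CONTEXT_FLAGS_ARB, 0, // no WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB
    0, // terminator
};
```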
Verified that glGetIntegerv(GL_NUM_EXTENSIONS) plus a loop over glGetStringi works fine; I just need to load glGetStringi before calling OpenGLGetInfo.