glGetString(GL_EXTENSIONS) returning 0

Using the Day 373 code, I'm getting a fault because glGetString returns 0 (a null pointer) when asked for GL_EXTENSIONS.

It appears that glGetString(GL_EXTENSIONS) was deprecated in OpenGL 3.0 (and removed from core and forward-compatible contexts) in favor of retrieving each extension individually via glGetStringi:
    // Requires a 3.0+ context; glGetStringi has to be loaded via wglGetProcAddress first.
    GLint n = 0;
    glGetIntegerv( GL_NUM_EXTENSIONS, &n );
    for ( GLint i = 0; i < n; i++ )
    {
        const char* extension = (const char*)glGetStringi( GL_EXTENSIONS, i );
        // do something interesting
    }

My graphics card is an Nvidia GTX 770. Not sure why it's not failing for Casey...maybe AMD kept the old behavior around anyway?

Edit:
The existing code using glGetString works if WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB is turned off when creating the context.
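For reference, this is roughly what the relevant part of the attribute list passed to wglCreateContextAttribsARB looks like; the version numbers, variable names, and WindowDC here are assumptions for the sketch, not necessarily what the Day 373 code requests:

    // Sketch of a context attribute list for wglCreateContextAttribsARB
    // (constants from WGL_ARB_create_context). Leaving the forward-compatible
    // bit out of WGL_CONTEXT_FLAGS_ARB keeps glGetString(GL_EXTENSIONS) working.
    int ContextAttribs[] =
    {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 0,
        WGL_CONTEXT_FLAGS_ARB, 0, // instead of WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB
        0,
    };
    HGLRC ModernRC = wglCreateContextAttribsARB( WindowDC, 0, ContextAttribs );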
Verified that glGetIntegerv(GL_NUM_EXTENSIONS) and looping with glGetStringi works fine; I just need to load glGetStringi before calling OpenGLGetInfo.
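For completeness, a sketch of loading glGetStringi on Windows before the query runs; PFNGLGETSTRINGIPROC matches the standard typedef from glext.h/glcorearb.h, and the placement relative to OpenGLGetInfo is just my assumption:

    // Grab glGetStringi from the driver once the modern context is current.
    typedef const GLubyte *(APIENTRY *PFNGLGETSTRINGIPROC)(GLenum name, GLuint index);
    static PFNGLGETSTRINGIPROC glGetStringi;

    glGetStringi = (PFNGLGETSTRINGIPROC)wglGetProcAddress( "glGetStringi" );
    // wglGetProcAddress returns 0 if the context doesn't export the function,
    // so check the pointer before letting OpenGLGetInfo loop over GL_NUM_EXTENSIONS.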

