Correct gamma w/ software rendering

Hi,

I can't submit this during the stream as I live in a different timezone, but I wanted to report that when switching to the software renderer (by setting GlobalConstants_Renderer_UseSoftware to 1) the colors come out wrong.

The reason is that our OpenGL internal textures are in sRGB format.

Going inside OpenGLInit(...) and disabling the request for internal sRGB formats makes it look good again:

DEBUG_IF(Renderer_UseSoftware)
{
    // Do nothing, the software renderer does not produce sRGB bitmaps
}
else
{
    if(Info.GL_EXT_texture_sRGB)
    {
        OpenGLDefaultInternalTextureFormat = GL_SRGB8_ALPHA8;
    }

    // TODO(casey): Need to go back and use extended version of choose pixel format
    // to ensure that our framebuffer is marked as SRGB?
    if(Info.GL_EXT_framebuffer_sRGB)
    {
        glEnable(GL_FRAMEBUFFER_SRGB);
    }
}
// Do nothing, the software renderer does not produce sRGB bitmaps
This is not correct: the software renderer does produce sRGB bitmaps (well, with an approximated sRGB curve, not the exact one).

That's why it does square and square-root operations when loading and storing pixel values - it approximates the sRGB curve with a quadratic curve.
Hmm, I seem to remember that, actually. So what could it be?

Basically the colors of the software renderer come out too bright/washed out here.
Well, we only just recently got the sRGB output pixel format set correctly in the Win32 layer, so if it was reading sRGB textures but not writing sRGB backbuffers, that could be relevant... does this happen on the latest source code?

- Casey
And can we get a screenshot showing the too-bright colors?
Are you using any software that affects the screen's gamma, like f.lux or similar?
Day 246, w/ or w/o the reported typo corrected. This is almost-vanilla Windows 8.1; no f.lux installed.

Software Renderer:

[screenshot]

Hardware Renderer:

[screenshot]
The only thing that changes is the Renderer_UseSoftware define which is either 1 or 0.

A bit more details after tracing a couple things in the debugger:

In the first call to Win32SetPixelFormat, wglChoosePixelFormatARB is 0 (somehow that's expected!)

In the second call to Win32SetPixelFormat, after the extensions have been queried, wglChoosePixelFormatARB is non-null and we set the IntAttribList to:
		[0x00000000]	0x00002001	int
		[0x00000001]	0x00000001	int
		[0x00000002]	0x00002003	int
		[0x00000003]	0x00002027	int
		[0x00000004]	0x00002010	int
		[0x00000005]	0x00000001	int
		[0x00000006]	0x00002011	int
		[0x00000007]	0x00000001	int
		[0x00000008]	0x00002013	int
		[0x00000009]	0x0000202b	int
		[0x0000000a]	0x000020a9	int
		[0x0000000b]	0x00000001	int
		[0x0000000c]	0x00000000	int


Therefore WGL_FRAMEBUFFER_SRGB_CAPABLE_ARB is paired with 1.
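For readability, here is the same attrib list with the WGL names spelled out (constant values taken from the WGL_ARB_pixel_format and WGL_ARB_framebuffer_sRGB extension specs; this is just a decoding of the dump above, not code from the engine):

```c
// WGL attribute constants, per the WGL_ARB_pixel_format and
// WGL_ARB_framebuffer_sRGB extension specifications.
#define WGL_DRAW_TO_WINDOW_ARB           0x2001
#define WGL_ACCELERATION_ARB             0x2003
#define WGL_FULL_ACCELERATION_ARB        0x2027
#define WGL_SUPPORT_OPENGL_ARB           0x2010
#define WGL_DOUBLE_BUFFER_ARB            0x2011
#define WGL_PIXEL_TYPE_ARB               0x2013
#define WGL_TYPE_RGBA_ARB                0x202B
#define WGL_FRAMEBUFFER_SRGB_CAPABLE_ARB 0x20A9

// The debugger dump above, decoded as attribute/value pairs.
static int IntAttribList[] =
{
    WGL_DRAW_TO_WINDOW_ARB, 1,
    WGL_ACCELERATION_ARB, WGL_FULL_ACCELERATION_ARB,
    WGL_SUPPORT_OPENGL_ARB, 1,
    WGL_DOUBLE_BUFFER_ARB, 1,
    WGL_PIXEL_TYPE_ARB, WGL_TYPE_RGBA_ARB,
    WGL_FRAMEBUFFER_SRGB_CAPABLE_ARB, 1,
    0, // terminator
};
```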

Later I checked and:

opengl_info Info has
  GL_EXT_texture_sRGB true
  GL_EXT_framebuffer_sRGB true

Edited by Nicolas Léveillé on
Really strange. I have no idea what is wrong.
BTW, I fixed this on Friday's stream.

To the OP: it is _also_ correct to fix it the way you were suggesting, which is to turn off sRGB altogether. The reason for the washed-out color is that the gamma curve is effectively being applied twice - once by the software renderer before it writes the values, and then again by the OpenGL texture blit to the framebuffer. This is because the framebuffer was set as sRGB, but the software renderer's texture wasn't.

The solution that I used was to set the texture to sRGB, so that it would be un-gamma'd on read and then re-gamma'd on write. Your solution of turning off the gamma on write is also valid - it just passes through the gamma'd values, which has the same effect.

- Casey
Thanks for the explanation!
Finally caught up and can make points about things that didn't happen months ago!

Quick note on a problem that at first looked similar to this one: the sRGB framebuffer extension name we're checking for isn't always the one you need, thanks to Graphics Vendor Hell.

On my Intel HD 4000, they only have GL_ARB_framebuffer_sRGB, not GL_EXT_framebuffer_sRGB. I guess they figured they'd just stick the latest one in there? This makes everything too dark since the gamma correction at the end is never enabled.
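One way to handle this (a sketch of my own, with hypothetical helper names, not the engine's extension-query code) is to accept either spelling of the extension when deciding whether to enable GL_FRAMEBUFFER_SRGB:

```c
#include <string.h>

// Returns 1 if Name appears as a whole, space-delimited token in the
// extension string (a substring match alone would also hit names like
// "GL_EXT_framebuffer_sRGB_foo").
static int HasExtension(const char *Extensions, const char *Name)
{
    size_t Length = strlen(Name);
    for(const char *At = Extensions; (At = strstr(At, Name)) != 0; At += Length)
    {
        if((At == Extensions || At[-1] == ' ') &&
           (At[Length] == ' ' || At[Length] == '\0'))
        {
            return 1;
        }
    }
    return 0;
}

// Treat framebuffer sRGB as supported if either the EXT or the ARB
// version of the extension is advertised.
static int FramebufferSRGBSupported(const char *Extensions)
{
    return (HasExtension(Extensions, "GL_EXT_framebuffer_sRGB") ||
            HasExtension(Extensions, "GL_ARB_framebuffer_sRGB"));
}
```

On the Intel HD 4000 case above, the extension string contains only the ARB name, so checking both is what keeps the end-of-frame gamma correction enabled.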
Thanks - I'll add that to the issue list on GitHub.

- Casey