Monitors and Color

Let's talk about how different monitors display color.

I am a programmer making a game with an artist colleague. I write OpenGL code on my Linux dev machine and he makes art assets in Photoshop on his MacBook. We noticed very early, by comparing the same image side-by-side in image preview software with both monitors at max brightness, that our monitors output noticeably different colors. My colors are brighter and more washed-out, while his are much darker but more intense. It's a significant and immediately noticeable difference, but we didn't think much of it at first. However, at some point we realized we want a dramatic lighting system that looks good but also influences how the game is played. Naturally we started to worry about how we're going to develop this lighting when we don't even have a canonical display to compare results against.

Fear not, I say, let me implement Gamma Correction like I saw on HH and all our problems will go away! It took me quite a while to rewatch those episodes and read many blog posts and tutorials, but I finally got it working. However, implementing gamma correction did absolutely nothing to bring my monitor's output closer to the artist's.

I looked into it some more and found that many people in the photography industry buy devices called colorimeters with accompanying software that supposedly calibrates your monitors. Most are crazy expensive but some seem to go as low as $100.

Now here is the million dollar question: What can I actually do as a developer? The way I see it, there are two paths to go down.

The first path is the "correct" path, which is to say buy a colorimeter, calibrate our monitors and pray that we can see the same image (or at least closer than now). Then we could say, "this is the way to play the game"... but who the hell calibrates their monitors anyway? Won't 99.9% of people who play games have an uncalibrated monitor? Is it possible that at this point we could be developing the game with a color profile further from the average user's? This would defeat the purpose, right?

The second path is... to abandon the entire notion of correctness, and try to compensate by offering a gamut of visual customization options through post-processing shaders. For example, a supplementary gamma correction shader, brightness, contrast, saturation, etc. In a sense we throw our hands in the air and tell the users they have to find the right look for their monitor. In addition, we'll surely have to struggle to get our monitor outputs to match using all these band-aid options.

This must be a common problem since devs prefer Windows and Linux while artists tend to prefer Macs. As for users, you can't control their hardware, but can you still do something in software? Interested in hearing your thoughts.
In my opinion, this is the same problem the movie & broadcast industries have already faced and I think their answer is correct.

Get a colorimeter and a decent monitor, and create what you think looks right on the best calibrated reference monitor you can reasonably afford. This way you have at least defined what 'correct' is, and players who actually care about your game looking correct have a course of action available to them: they can calibrate their own monitors. In the counter scenario, correct is undefined, the player has no idea what you intended, and there is no course of action for them to assure they're seeing your game as you intended.

People might still have horribly wrong viewing setups but at least you've made it _their_ fault/problem.

That said, if you really don't care how it looks and it's not part of your artistic vision then I guess just having lots of options and custom color LUTs or whatever would do.
This problem is also the same in non-visual industries. For example, it is not uncommon for the final mixdown of a music track to be played back by the engineer over many different types of speakers/headphones to make sure it doesn't sound awful on low-end gear or vacuum tube amps or whatever else. So the general answer for how you proof the end result is: have some monitors and some TVs that are common among end users, and periodically test how things look.

Now, the more important part where you need colorimeters and special monitors and so on is actually not for the end product, it's for helping you and your artists work together (and the artists to work with each other). If everyone is looking at different colors, it's harder to converge on same-looking results. So what I recommend here (and what we do at Molly) is that at a minimum everyone has a second monitor that is a color-proofing monitor, and they're all the same. Since artists use Cintiqs, and Wacom (hilariously) totally sucks at color calibration, you need it to be a second monitor because artists can't use it as their primary. Programmers could, though, if you want, but typically I have my setup so that the editor/debugger/etc. runs on the primary and the game runs on the secondary anyway.

Technically you could use any monitor that is "the same" generally speaking from the factory, so you don't have to color calibrate. But I find a good solution is to go with the HP DreamColor displays which come factory-calibrated very nicely for sRGB and are not significantly more expensive than other monitors.

- Casey
You have discovered the million dollar question for monitors. Most cheap monitors cannot display accurate color. TFT LCD panels are the worst. So you have to make some decisions.

First: is accurate color important for the design, story, or experience of your project? If any one of those is true, then go down the correct path. If not, it doesn't matter which path you go down from this perspective.

Second, do you want your game to look good 5/10/15 years from now? (This is a trick question. The answer is yes.)

How can you make it look good in that time frame? Color correcting your systems now will help achieve this, due to the nature of the current industry. Monitors are cheaper now partly thanks to the mass production of HDTVs, and UHD TVs should also lower prices for 4K monitors. The important thing for you is not the 4K aspect, however, but the fact that UHD requires at least 10-bit displays and follows the BT.2020 color spec. Those monitors can show all the colors you can produce on calibrated mainstream monitors today, and in HDR where available. It requires some research, because these are early days, but you can plan ahead by color correcting now.
BillDStrong

How can you make it look good in that time frame? Color correcting your systems now will help achieve this, due to the nature of the current industry. Monitors are cheaper now partly thanks to the mass production of HDTVs, and UHD TVs should also lower prices for 4K monitors. The important thing for you is not the 4K aspect, however, but the fact that UHD requires at least 10-bit displays and follows the BT.2020 color spec. Those monitors can show all the colors you can produce on calibrated mainstream monitors today, and in HDR where available. It requires some research, because these are early days, but you can plan ahead by color correcting now.


Nvidia has been putting out some great introductory material for developers to learn up on this stuff. (It also contains some really good recommendations around consumer displays not yet hitting anywhere near full Rec. 2020 coverage, and which color spaces etc. are good targets for near-term development efforts.)
Thanks for the replies everyone.

We're going to try the correct route and calibrate the displays. Correct color is actually a big deal to us and spending money on a decent calibration device seems like an easy win if we can match displays with each other and with any color-correct display a user might have.

As usual, this was the right place to ask.
For posterity, I would like to update on our situation as it seems to have settled.

We bought a Colormunki Display for $150 and calibrated our displays. At the end of my desktop's calibration we noticed its output went from "very yellow" to a much bluer overall color. This made me optimistic because, as I mentioned in my OP, we had initially noticed that the MacBook's Retina display was way bluer than mine. I therefore assumed my display had gotten closer to the color-correct "middle ground". Sure enough, we calibrated his MacBook and it went from "very blue" to much yellower. We compared our displays side-by-side for a naked-eye comparison and were pleasantly surprised to find the colors pretty much indistinguishable.

Unfortunately, our images remain drastically different in another (orthogonal?) respect: the Retina display's blacks are way blacker than my desktop monitor's. Generally the whole image is darker and feels more intense. I assume nothing at all can be done from our end, since it seems to be strictly a hardware issue: Retina displays use different tech from standard desktop monitors and appear to produce darker colors in the low ranges (as far as we can tell, at least).

Despite this issue we're still pretty happy that our colors seem to match and are willing to add a brightness control as a user setting, assuming there's nothing else we can do.

Thanks again for all the help.

P.S. 10 years since Casey's stream. I walk through the empty streets trying to think of something else but my path always leads to the stream...
hotspur

Despite this issue we're still pretty happy that our colors seem to match and are willing to add a brightness control as a user setting, assuming there's nothing else we can do.


Generally when calibrating monitors you have to make a choice about what things are most important to you from a range of different issues such as:

Black point
White point
Linearity
Color twist

Some monitor technologies will of course also naturally be better or worse in different regards. Generally people tend to pick a good white point over a good black point, since it usually makes for the most realistic-looking photos and such (I remember a CRT calibrated for a decent white point whose blacks ended up _very_ green and not all that black). Personally, all other things being equal, I'd rather have a dimmer image and reasonable blacks than perfect highlights in most circumstances, but I wouldn't be willing to compromise too much on color accuracy or contrast or whatever just to have perfect blacks.
If vibrant colors and contrast are going to play a big role in the game, there's a few more things you may want to consider apart from calibration.

As suggested, the chance that whoever is playing the game will have a calibrated display is small.
One thing you could do, other than a brightness slider, is make sure that the gamer can set a white point and a black point at a minimum. You see this reasonably often in games: they show a black-to-white gradient and let the player adjust the color response on part of the screen until "yes, this black now matches the other black and that white now matches the other white."
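A minimal sketch of what that gradient calibration buys you, assuming the player's picks come back as black and white values in [0,1] (the function and parameter names here are hypothetical, just to show the remap):

```c
/* Remap a channel so the player-chosen black and white points span [0,1].
 * `black` and `white` are the values the player picked on the test
 * gradient; output is clamped to [0,1]. */
static float remap_levels(float c, float black, float white)
{
    float t = (c - black) / (white - black);
    return t < 0 ? 0 : (t > 1 ? 1 : t);
}
```

Run per channel in a post-process pass, this stretches the player's usable range back over the full gradient, which is basically what those in-game "adjust until the logos are barely visible" screens are doing.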

You could of course take that a bit further and ask them to match a color swatch against every day colors they're likely to have seen in real life. Caveat there of course is that asking someone from the UK to "pick the square from this blue gradient that most closely matches what to you a clear sky at noon looks like" may be too much of an ask ;-).

Depending on how important color is, it may be worth a punt to find some cross-cultural color standards that people are likely familiar with. How do you then map what they've indicated as their monitor's color calibration (or their perception of it, rather) to yours? Take the distance in a color cube between the various calibration points the player picked, turn it into a response curve for each of R, G and B, and either use that in a post-proc shader or bake it into a 3x256 LUT the shader can use.

Disclaimer: I haven't implemented the above "cross cultural self-calibration" and used it in anger yet, but on the face of it it seems like it could be worth a punt.

Caveat the second, and possibly more important: Some people are color blind. While there's nothing you can do about their condition, there is an easy way to make sure the game's still playable to this group. Or rather there's two ways:

Way the first: Implement a set of post-proc shaders that simulates the most prevalent kinds of color blindness. This isn't something you'll end up shipping, but it lets you guys make sure that everything reads well when green looks like grey or a certain color can't be distinguished from a certain background.

That is, you can make your assets work equally well to people who are colorblind and those that are not. Those who are not may get a richer experience out of it, but them's the breaks. At least all the parts that are important for interactivity will be equally distinguishable.
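A simulation shader of this kind usually boils down to a 3x3 matrix multiply in linear light. The protanopia matrix below is one commonly circulated approximation, not a reference implementation (serious work should follow a published model such as Brettel et al. 1997):

```c
/* One commonly circulated protanopia approximation, applied to
 * linear-light RGB. Rows sum to 1, so greys are preserved. */
static const float PROTANOPIA[3][3] = {
    { 0.567f, 0.433f, 0.000f },
    { 0.558f, 0.442f, 0.000f },
    { 0.000f, 0.242f, 0.758f },
};

/* out = m * in, for a linear-light RGB pixel. */
static void simulate_cvd(const float in[3], float out[3],
                         const float m[3][3])
{
    for (int i = 0; i < 3; ++i)
        out[i] = m[i][0] * in[0] + m[i][1] * in[1] + m[i][2] * in[2];
}
```

Note how pure red collapses onto roughly the same value in the red and green channels, which is exactly the "can these two important colors still be told apart?" check you'd be making.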

Way the second: Implement the same set of shaders, but add a tonemapping shader that seeks to alleviate the individual problems for the various kinds of color blindness. If red and purple can't be told apart easily, tonemap them so they're a bit further apart. This is a shader you would ship as a post-proc option to the player.

How would you know if it works? The colorblind player uses the tonemapper appropriate for their kind of color blindness. You on the other hand run your colorblindness simulating shader on its output to see if A) red and purple are now distinguishable, B) It doesn't look complete pants.

The first option has the benefit (and drawback) that you can tune your assets, which is probably more work.
The second option you would author your assets as you normally do and makes things playable to the colorblind. It may not look as good as when you tune the assets to work well across the board.

I've yet to see someone author each texture 3 times, which would make things look even better across the board, but that's pretty crazy-town ROI-wise and not what I'm suggesting you do. Luckily it's the shaders that do most of the work here, and if this is in your art pipeline early enough, your assets can be authored from day 1 to look good as-is.

Potential upside: the accessibility angle may net you some good press and a boost in sales for maybe a week's worth of research and shader writing. Add a 3-way preview of your textures so your artist can iterate quickly without having to run the full game, and before you know it you only need to run the game with the simulation shader on once a week or so, because the artist has internalised the full gamut of what works and what doesn't.

On the other hand, it is extra work, so you'd probably be forgiven if you didn't do it. On a purely financial note it's not clear what the upsides and downsides are. Would the extra publicity and sales bump offset the extra effort? That's something I can't answer for you.

I wonder if Jonathan Blow's art team made sure that the colors used in all the puzzles were colorblind-friendly. There aren't that many colors in each puzzle, and not a lot of gradients going on, so it wouldn't have been a lot of work to pick a 'safe palette' and 'safe combos' and not worry about it for the rest of development.

(I'm not colorblind, for the record, but I have a relative who is and this is a problem more often than you'd think. Recently an exchange went like this: "what do I click on?" followed by my "what do you mean? it's highlighted!" followed by his "I can't see that!")