As a computer graphics programmer, gamma correction seems like something that we shouldn't have to worry about. And even after being told that it is something that must be addressed in order to produce good images, it seems like it probably isn't that big a deal. How non-linear could computer monitors be, anyway? It often seems like we can get away with the assumption that a linear change in an "RGB value" produces a linear change in on-screen brightness.
Most programs can get away with it. But if you're hoping to garner the respect of creative professionals, handling gamma correctly is the sort of thing that really separates the world-class solutions from the rest. There are a surprising number of places where the gamma issue creeps into an imaging pipeline, and one of them is shading. For instance, how do you render a diffuse surface that's lit by several lights? If you were to ignore the fact that computer displays aren't linear, then you wouldn't need any power functions at all... you could simply add the result of the Lambertian shader for each light. That's what OpenGL's fixed-function implementation does, and it looks like this:
It looks kind of right, but an artist (or a visual effects supervisor!) will notice that it's a bit off. The area in the center where the two lights overlap seems too bright (brighter than the two lights put together) and as a result it looks a little bigger than it should be. (Notice how the top and bottom points of that center oval seem to stick out a little too far.) The problem comes from the fact that we're pretending our monitor has a gamma of 1.0, where in reality it has a gamma of 1.8 (or something else, if you're not lucky enough to be using a Macintosh :-P).
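That additive, gamma-ignorant approach can be sketched in Python. This is a toy stand-in for the fixed-function pipeline, not real OpenGL; all of the names are made up, and colors and vectors are plain lists and tuples with channels in 0..1:

```python
def lambert(normal, light_dir):
    """Lambertian term: clamped dot product of two unit vectors."""
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(d, 0.0)

def shade_naive(surface_rgb, normal, lights):
    """Sum each light's contribution directly -- no gamma handling.

    lights is a list of (rgb, direction) pairs.
    """
    out = [0.0, 0.0, 0.0]
    for light_rgb, light_dir in lights:
        k = lambert(normal, light_dir)
        for i in range(3):
            out[i] += surface_rgb[i] * light_rgb[i] * k
    return [min(c, 1.0) for c in out]  # clamp, as the framebuffer would
```

Where two 60%-gray lights hit the surface head-on, this returns 1.2 clamped to 1.0 per channel, which is the over-bright hot spot in the middle of the render.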
How do we fix this problem? We want our lights to look natural, and to do that we have to compensate for the way our monitor distorts the image we're trying to display. The result of our lighting calculation is linear (because if it weren't, the calculation wouldn't be valid), but the monitor isn't going to show us brightnesses that are proportional to the values we request. It's going to show brightnesses proportional to our pixel values raised to the power 1.8 (or 2.2 or something else, depending on the details of the display hardware). So let's cancel that out by raising each of the R, G, and B components of the final color to the power 1/1.8 (that is, about 0.556). Then the displayed brightness will be proportional to what we compute it should be, and everything should work right. Right?
Sure enough the image looks more natural, but now it's too bright! That's no good. We set the intensity of our lights to 100%, gave them zero fall-off, and made the shape white, so we'd expect the color of a spotlight to be the same as the color channel of the light that made it.
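The output-correction step just described is one power function per channel. Here's a minimal sketch, assuming a display gamma of 1.8 (the function name is made up for illustration):

```python
def encode_gamma(linear_rgb, gamma=1.8):
    """Map linear values to 'monitor' values so the display's power-law
    response cancels out: displayed = (c ** (1/gamma)) ** gamma = c."""
    return [c ** (1.0 / gamma) for c in linear_rgb]
```

Note that 0.0 and 1.0 are fixed points; only the mid-tones move, and they all get brighter, which is exactly the "too bright" effect described above.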
The problem is that if our lighting pipeline is now linear, and the inputs to it (like the light's color channel) are expressed in non-linear "monitor colors," then we need to gamma-correct our inputs too, not just our output. We need to do the same thing to those colors that the monitor does: raise them to the power 1.8. If we do that, our render finally looks correct:
The individual spotlight discs are the same color as our lights' color channels, and the area in the center looks natural... like it should look if two lights illuminate the same surface. Yay! Happy clients make happy artists.
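Putting the whole pipeline together -- decode the non-linear inputs, do the math in linear space, re-encode for display -- might look like this sketch (gamma of 1.8 assumed throughout; to keep it short, each light's Lambertian/falloff factor is assumed to be folded into its color already):

```python
def decode_gamma(rgb, gamma=1.8):
    """Monitor color -> linear light."""
    return [c ** gamma for c in rgb]

def encode_gamma(rgb, gamma=1.8):
    """Linear light -> monitor color."""
    return [c ** (1.0 / gamma) for c in rgb]

def shade(surface_rgb, light_rgbs, gamma=1.8):
    """Linearize every input, add the contributions, re-encode the sum."""
    surf = decode_gamma(surface_rgb, gamma)
    total = [0.0, 0.0, 0.0]
    for rgb in light_rgbs:
        lin = decode_gamma(rgb, gamma)
        for i in range(3):
            total[i] += surf[i] * lin[i]
    total = [min(c, 1.0) for c in total]
    return encode_gamma(total, gamma)
```

A single full-intensity light on a white surface round-trips: shade([1, 1, 1], [[0.5, 0.5, 0.5]]) comes back as the light's own 50% gray (up to floating-point round-off), which is the behavior the clients expect.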
If we're gamma-correcting the color of lights, should we gamma-correct the brightness of those lights, too? No! A brightness control is linear by definition. Users might expect a color parameter to match all of the other applications on the computer that aren't gamma-aware, but it's important for brightness to be linear because when the user drops it to 50%, the intensity of the light must look like it's half as bright. Some users might notice that setting the brightness slider to 50% doesn't do the same thing as dividing the color component values in half, but that's how it goes. It's a small price to pay compared to the messy alternatives.
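One way to honor that contract is to treat brightness as a multiplier in linear space: decode the monitor-space color, scale it, and re-encode. A sketch under the same assumptions as before (hypothetical name, gamma of 1.8):

```python
def apply_brightness(monitor_rgb, brightness, gamma=1.8):
    """Decode to linear, scale by the (linear) brightness, re-encode."""
    return [(c ** gamma * brightness) ** (1.0 / gamma) for c in monitor_rgb]
```

Dropping a white light to 50% brightness yields monitor values of 0.5 ** (1/1.8), about 0.68 per channel rather than 0.5, which is exactly why halving the color components isn't the same thing as 50% brightness.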
What else? Are there other areas where we need to worry about gamma? Oh yeah, definitely. There are a lot of them. You have to account for it in order to do a good job anti-aliasing the edges of shapes. Similarly, you have to linearize before resampling an image. And of course gamma is crucial if you're trying to match colors, but that's way beyond the scope of this article. :-)
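As a small taste of the resampling case: averaging two pixels in monitor space gives the wrong answer; you have to linearize first. Here's a sketch for two gray values, again assuming a gamma of 1.8:

```python
def average_gray(a, b, gamma=1.8):
    """Average two monitor-space gray values in linear light."""
    lin = (a ** gamma + b ** gamma) / 2.0
    return lin ** (1.0 / gamma)

# Naively averaging black (0.0) and white (1.0) gives 0.5, which a
# gamma-1.8 display shows at 0.5 ** 1.8 -- roughly 29% brightness, too
# dark.  The linear average returns about 0.68, which displays as a
# true 50% brightness.
```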