A Field Guide to Gamma Correction

Gamma correction can really mess you up. It can ruin your web pages, wreck your composites, and make rendered lighting look incorrect. Even simply blending colours together requires that you know about gamma. The concept of gamma is pretty straightforward, and if the world were a sensible place it wouldn't be a problem, but different systems handle gamma differently (or often not at all!), so be careful out there...

What is Gamma?

Images are manipulated as a set of values representing light energy - on your computer as 0, 1, 2, 3, 4...255, or electrically in a video system as 0.1 Volts, 0.2V, 0.3V...1V. You'd think that 8 was twice as powerful (I'm not going to use the word "bright", as that's a whole other page!) as 4, or that 0.6 Volts is twice as powerful as 0.3 Volts. The former is sometimes true, while the latter is never true. The relationship of voltage to light energy is non-linear, and it varies between different pieces of equipment! Fortunately all you need to know is the gamma value, and the rest can be taken care of for you. A gamma of 1.0 is a linear relationship, while the recommended NTSC standard is 2.2 - quite curved. Most systems are somewhere between the two.
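The relationship is a simple power law, which can be sketched in a few lines of Python (my illustration - real displays only approximate a pure power curve):

```python
# Power-law ("gamma") relationship between a stored pixel value and the
# relative light energy a display emits. A pure power law is assumed.

def value_to_light(value, gamma=2.2):
    """Convert a stored value in [0, 255] to relative light energy [0, 1]."""
    return (value / 255.0) ** gamma

def light_to_value(light, gamma=2.2):
    """Inverse: encode relative light energy back to a stored value."""
    return round(255.0 * light ** (1.0 / gamma))

# With gamma 1.0 the mapping is linear: value 128 emits about half the
# energy of 255. With gamma 2.2 it is strongly curved: 128 emits ~22%.
print(value_to_light(128, 1.0))   # ~0.50
print(value_to_light(128, 2.2))   # ~0.22
```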

When manipulating images digitally, most software (renderers, compositors, paint packages, etc.) assumes that your system is linear - that is, 8 is twice as powerful as 4. It needs to do this, as it's performing calculations about the amount of light bouncing round in your digitally simulated scene (for example). At some point this linear data needs to get converted to the non-linear space that your screen uses.
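To see why the linear assumption matters, here's a hypothetical sketch (mine, not from any DCT tool) of averaging a black and a white pixel - done naively on the gamma-encoded values, and done properly in linear light:

```python
# Averaging gamma-encoded values directly gives a different answer from
# averaging in linear light and re-encoding.
GAMMA = 2.2

def decode(v):              # stored value -> linear light
    return (v / 255.0) ** GAMMA

def encode(light):          # linear light -> stored value
    return round(255.0 * light ** (1.0 / GAMMA))

a, b = 0, 255               # a black pixel and a white pixel
naive = (a + b) // 2        # averaging the encoded numbers
correct = encode((decode(a) + decode(b)) / 2.0)  # averaging actual light

print(naive, correct)       # 127 vs 186 - two visibly different greys
```

The "correct" grey really does emit half the light of white; the naive one emits far less.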

Ideally you'd simply program the gamma into your display driver, and then forget about it. On some systems you can do this, but others ignore gamma and feed the digital values from the application straight to the video system - the result is very dark images. Others perform some correction, but it's insufficient.

To make things worse, upon finding their image is dark, users adjust the lighting to correct it. They then transfer the image to video tape through a calibrated system and find it is now washed out! Then they blame the transfer system. Rendering systems often allow you to set the gamma of the final image (RiExposure in RenderMan), but setting this for your screen will prevent compositing, and the image may be incorrect on other displays.

What can I do about it?

First you need to find the gamma value for your current display system. Some systems allow you to set it, while others have a built-in value which you can't change. If you can't change it then you need to convince your user-level software to do the compensation. As a last resort you can manually gamma correct your output images for a particular target system.

SGI and NeXT both handle gamma correction as part of the windowing system, so it's not much of a problem, while NT is a total disaster. For that reason, under NT all my apps have a gamma correction function built in (this may not be implemented in all currently released versions). The default value of GAMMA is 2.0 (which looked good on my machine), but you can control it by setting the environment variable GAMMA. DCT apps on non-DOS based platforms assume that your display driver is doing the right thing, and apply no gamma correction by default. However, you can still use the GAMMA variable to make DCT apps gamma correct themselves.

What is the required gamma for my system?

Strictly, the gamma correction required varies even between identical machines, though in practice you can probably get away with a single value across similar systems - any correction is going to be better than none. Fortunately I've written a program to help you find your gamma value. (NT, FreeBSD, Linux, SGI)

Run "gammaCalc", and a checkerboard pattern appears. As you click from left to right across the image, the relative brightness of the squares changes - click until you find the spot where the brightness of the squares is as uniform as possible. The value of gamma will be printed to the terminal when you press space (or q to quit). Depending on the version of gammaCalc you're running, you may be able to use the right mouse button to check the gamma correction for each channel independently.

For this to work you should set the gamma correction of your display to 1.0 before running the app. Under NT you must set the environment variable GAMMA=1 to stop my display library from double correcting (sorry...). Ideally you should now be able to set your display driver's gamma to the value just calculated, and forget about things. This can be done on most Unix-based platforms by using the xgamma command.
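The trick behind this kind of tool is worth knowing. A checkerboard of pure black and pure white pixels emits 50% light on average, whatever the gamma; so the solid grey value that visually matches it tells you the display's curve. This is my reconstruction of the maths, not gammaCalc's actual source:

```python
import math

# If a solid grey of stored value v matches a black/white checkerboard
# (i.e. 50% linear light), then (v/255)^gamma = 0.5. Solve for gamma:

def gamma_from_match(matched_value):
    """Estimate display gamma from the grey value that visually matches
    a black/white checkerboard."""
    return math.log(0.5) / math.log(matched_value / 255.0)

# A linear display matches at 128; a gamma 2.2 display matches near 186.
print(gamma_from_match(128))   # ~1.0
print(gamma_from_match(186))   # ~2.2
```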

I can't set my gamma!!!

Under NT I'm not aware of any global gamma settings. DCT code should pick up the environment variable GAMMA: set this to the value produced by the "gammaCalc" program. Images viewed using the following program will then be "correct". (NT, Linux). You can check your gamma is set correctly by viewing the test pattern.

Viewer makes my images look washed out!

Now I'm sorted but things look wrong on my friend's machine!

Can't I fix things up after I'm done?

If you've produced images with an uncorrected system, they'll need correcting back to a linear space. Similarly, if you want images to look good on an uncorrecting system, you can pre-process them to look correct when viewed on a system which NEEDS a particular value of gamma but isn't applying it.

OK - let's make things easier...

You've got an image which looked good on some random (uncalibrated) machine, but under your new, nicely calibrated system it's wrong - probably washed out. You need to find out what gamma correction that machine SHOULD have had (say 1.9), and apply the inverse. If your machine isn't gamma correcting either, but you know the value (say 1.6), you can build that into the conversion too, so the image should look the same as it did on the remote machine! To do this use "regamma": (NT, FreeBSD, Linux, SGI)
regamma 1.9 1.6 image.tiff corrected.tiff

Suppose I've got an image that I produced before I discovered gamma correction. I've found that I need a value of 1.6 for my machine, and I've set up my display to do that. I can fix my old images so they look like they used to using:
regamma 1.6 1.0 image.tiff corrected.tiff
I'm converting from an old gamma of 1.6, which was built into the image, and making the image linear (1.0).

To convert an image for viewing on an uncalibrated system when it has been produced on a calibrated machine we run:
regamma 1.0 2.2 image.tiff corrected.tiff
where 2.2 is the gamma of the non-correcting machine.
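The arithmetic behind a regamma-style conversion is a single power per pixel: a stored value v displayed on a gamma-g system emits (v/255)^g light, so moving between two gammas means raising to old/new. This is my reconstruction of the maths, not regamma's actual source:

```python
# Re-encode one pixel from an "old" display gamma to a "new" one:
#   v_out = v_in ** (old_gamma / new_gamma)   (working on 0..1 values)

def regamma_pixel(value, old_gamma, new_gamma):
    """value: stored sample 0..255; returns the re-encoded sample."""
    return round(255.0 * (value / 255.0) ** (old_gamma / new_gamma))

# "regamma 1.6 1.0": linearize an image authored on a gamma 1.6 display.
print(regamma_pixel(128, 1.6, 1.0))   # mid-grey darkens to 85
# "regamma 1.0 2.2": prepare a linear image for an uncorrected display.
print(regamma_pixel(128, 1.0, 2.2))   # mid-grey brightens to 186
```

Note that black and white are unchanged whatever the gammas; only the mid-tones move.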

Of course, all this image processing degrades the quality of your image. That's probably OK when you're just viewing it, but repeatedly shuffling it between machines and applying regamma will rapidly degrade the image. One way to avoid this is to use a higher bit depth - say 16 bits per sample. Then you can correct as often as you need and everything will still be accurate enough.
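The loss is easy to demonstrate (my own illustration, not from any DCT tool): at 8 bits per sample, a single round trip through gamma 2.2 and back merges some distinct values, so information is gone for good.

```python
# Count how many of the 256 possible sample values survive one
# regamma round trip (1.0 -> 2.2 -> 1.0) at 8 bits per sample.

def apply(value, old_gamma, new_gamma):
    return round(255.0 * (value / 255.0) ** (old_gamma / new_gamma))

survivors = {apply(apply(v, 1.0, 2.2), 2.2, 1.0) for v in range(256)}
print(len(survivors))   # fewer than 256 distinct values remain
```

At 16 bits per sample the same round trip is effectively lossless, which is why the higher bit depth helps.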

Of course the real answer is to calibrate all your systems and always work with a Gamma of 1.0!

Run that by me one more time?

Let's take some scenarios. Say you're on an SGI and run gammaCalc. It will probably look about right, and you can get on with your business. If it's wrong, then set the display gamma to 1, and use gammaCalc to find out what it should be (the default is 1.7). Then set the display gamma to that value and you're sorted. The only problem you'll have is if you want to export an image to a device which isn't performing correction, in which case you can either do your final render with that machine's gamma value, OR use regamma to convert to that machine's gamma.

On NT you should find your gamma using gammaCalc, then set the environment variable GAMMA to the result, and preview all your images using "viewer". Clearly this isn't ideal, so an alternative is to get the images looking as good as possible under NT's normal imaging system, then use regamma to convert them to linear (or whatever else is required) when you export them (regamma 2.0 1.0 xxx.tiff out.tiff). If this approach is used, then lighting in rendered images may be incorrect. To fix this, set the output gamma of your renderer to be your screen's gamma, but remember that the images should be linearized before comp'ing or exporting.

So now I will have perfect image reproduction...

Sorry, 'fraid we've only just prised the lid off a really wriggly can of worms. Getting the gamma right goes a long way to making images look better, but it's only a start. Having set up your system, remember to tape over the controls of your monitor, otherwise you've just undone all your good work! Similarly, changing lighting conditions change the way your eyes perceive things. Your working environment should be neutrally lit and coloured. That goes for the desktop too - no more dayglo borders on windows, and that manga backdrop has to go.

If you're really serious about colour fidelity then you need to look at the phosphors and gamut of your monitor. At the very least you should calibrate the gamma for each channel separately - my Sony Trinitron seems to put too much green into the mid values, suggesting a lower gamma value is required for that channel. The very idea of colour is built on some very shaky foundations, and unless you've cash to throw at the problem there's not too much you can do - a decent reference monitor goes for about $7,000. (Read Hall '89 for a cheaper introduction to the concept of colour.)
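Per-channel correction is just the single-channel maths applied three times with different exponents. A sketch, using hypothetical measured gammas (the 2.2/2.0/2.2 figures below are invented for illustration, not measurements of any real monitor):

```python
# Correct each channel of an RGB pixel separately to a linear target,
# for a display whose green channel has a lower gamma than red and blue.

TARGET = 1.0                                 # linear output
MEASURED = {"r": 2.2, "g": 2.0, "b": 2.2}    # hypothetical per-channel gammas

def correct(pixel):
    """pixel: (r, g, b) stored values 0..255, corrected independently."""
    return tuple(
        round(255.0 * (v / 255.0) ** (MEASURED[c] / TARGET))
        for c, v in zip("rgb", pixel)
    )

# A neutral grey input comes out with more green than red or blue,
# compensating for the display pushing green in the mid-tones.
print(correct((128, 128, 128)))   # (56, 64, 56)
```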

Having said that it can't be done cheaply, I do have plans for software to manage colour correction between platforms, but I figured most people would need to get their heads round gamma first! As always, development will be driven by user feedback...

Ian Stephenson.
DCT Systems
NCCA, Bournemouth University