One of the programmers here swears by his display gamma of 2.5, but when I look at our art on his screen it looks heavily washed out. So I'm trying to calibrate the artists' monitors here, and I'm getting a bit confused.
I can evaluate the gamma of my monitor with this guide...
http://www.aim-dtp.net/aim/evaluation/gamma_space/index.htm
But after setting the black point and comparing my desktop with the gamma space charts, what am I supposed to do with the gamma number?
The image that has the best grays for me is 2.2
http://www.aim-dtp.net/aim/evaluation/gamma_space/22.htm
So do I just set Adobe Gamma's "desired gamma" to 2.2 and adjust its gamma setting to that value?
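As I understand it (and I may be off here), the gamma number is just the exponent of the display's transfer curve, and a calibration tool turns the difference between the measured and the desired gamma into a correction curve. A rough sketch of the arithmetic, assuming a pure power-law display and 8-bit values (the function names are just for illustration):

def displayed_luminance(pixel, gamma=2.2):
    # Fraction of full brightness an 8-bit pixel value produces on a power-law display.
    return (pixel / 255.0) ** gamma

def correction_exponent(measured_gamma, desired_gamma):
    # Exponent a calibration tool would apply so the measured display behaves
    # like the desired one: v_corrected = v ** (desired / measured).
    return desired_gamma / measured_gamma

print(displayed_luminance(128))        # ~0.22 -> mid-gray is about 22% luminance at gamma 2.2
print(correction_exponent(2.2, 2.2))   # 1.0 -> measured already matches desired, no correction
print(correction_exponent(2.5, 2.2))   # 0.88 -> curve that pulls a native-2.5 display down to 2.2

So if the chart says my display is already behaving like 2.2 and I ask Adobe Gamma for 2.2, it shouldn't have much correcting to do - is that right?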
I guess I don't understand why you would use the display driver's gamma setting. That's what the programmer is using to set his gamma, which seems wrong to me. He's using Display Properties > Settings > Advanced > GeForce tab > Color Correction > Gamma (we're mostly using Nvidia cards, though some ATIs are scattered around).
Maybe someone can help me sort this out.
Replies
You place the puck on your screen, and it'll go through a wizard and calibrate your display. As long as you set a policy or standard for certain steps, you should be able to get all your monitors on the "same page" colorwise.
2.2 is widely considered a "standard" gamma, particularly for PC users. It works well with the sRGB color space.
1.8 is frequently used by Macintosh users and by pre-press/professional print people.
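One caveat: sRGB isn't literally a 2.2 power curve; it's a piecewise function with a short linear toe that averages out close to 2.2 overall. If you ever want to compare the two, here's a quick sketch (encoded values in the 0..1 range):

def srgb_to_linear(c):
    # sRGB decoding: linear segment below 0.04045, then a 2.4-exponent power curve.
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def gamma22_to_linear(c):
    # Plain 2.2 power law for comparison.
    return c ** 2.2

for c in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(f"{c:.2f}  sRGB={srgb_to_linear(c):.4f}  gamma2.2={gamma22_to_linear(c):.4f}")

They track each other closely through the midtones and only really diverge in the deep shadows, which is why 2.2 gets treated as the PC/sRGB standard.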
If you get a colorimeter, I'd suggest removing Adobe Gamma.
To the best of my knowledge, Adobe Gamma doesn't do anything spectacular or better than your video card driver's gamma correction, though, besides eating up more of your RAM.
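As far as I can tell, both of them end up doing the same thing under the hood: they compute a lookup table and load it into the video card's gamma ramp (on Windows that's three 256-entry, 16-bit tables). Conceptually it's something like this sketch, one channel only:

def build_gamma_ramp(correction, entries=256):
    # Build one channel of a gamma ramp like the ones Adobe Gamma or the driver
    # slider load into the card. correction = 1.0 is identity; values below 1.0
    # brighten the midtones, values above 1.0 darken them.
    return [round(65535 * (i / (entries - 1)) ** correction) for i in range(entries)]

identity = build_gamma_ramp(1.0)
toward_22 = build_gamma_ramp(2.2 / 2.5)   # pull a native-2.5 display toward 2.2
print(identity[128], toward_22[128])       # 32896 vs roughly 35700: the midtones get lifted

The only practical difference is which tool owns that ramp, which is why you don't want Adobe Gamma and the driver's correction both trying to set it at once.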
I guess I'm trying to understand the point of following that website guide to find out what gamma my display is running at if, in the end, I don't actually input that number anywhere. Sure, the guide helps me set proper color and a good black point, but is there a reason I need to know the gamma value?
I see the programmer setting his gamma via his display driver, but is this the right way to do it? Seems wrong, since it looks so horrible. He's a bit color blind though, so maybe that has something to do with it. Also he's using a laptop LCD, which he says doesn't allow any brightness/contrast adjustments.
Anyhow, he says it's the standard, but if he doesn't calibrate to a dithered test image, how does he know the resulting colors/brightness are "standard"? I look at a color palette on his monitor and it looks really washed out and lacking range.
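The way I understand those test images (and this is my guess at how the AIM charts are built): one area is a fine black/white dither, which averages to 50% light regardless of gamma, and next to it is a solid gray at the value that should produce 50% light for a given gamma. Whichever chart's solid patch blends into the dither tells you what your display is actually doing. Something like this would generate one, written as a plain PGM so it needs no libraries:

def gamma_patch(gamma, width=128, height=64):
    # Left half: alternating black/white scanlines (averages to 50% luminance).
    # Right half: the solid gray a display with this gamma should show at the
    # same brightness, i.e. 255 * 0.5 ** (1 / gamma)  (186 for gamma 2.2).
    solid = round(255 * 0.5 ** (1.0 / gamma))
    rows = []
    for y in range(height):
        dither = 255 if y % 2 == 0 else 0
        rows.append([dither] * (width // 2) + [solid] * (width // 2))
    return rows

def write_pgm(path, rows):
    # Binary PGM; view it at 100% zoom only, since any scaling smears the
    # dither and ruins the comparison.
    height, width = len(rows), len(rows[0])
    with open(path, "wb") as f:
        f.write(f"P5 {width} {height} 255\n".encode("ascii"))
        for row in rows:
            f.write(bytes(row))

write_pgm("gamma_22_patch.pgm", gamma_patch(2.2))

Which is also why I don't see how his numbers can be "standard" if he's never compared against something like that on his own screen.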
Basically I just set my monitors till the white looks white and the black looks black, and the shades of grey in between are distinguishable on this chart:
http://www.aim-dtp.net/aim/download/monitor_gamma/10.gif
Surely that's all you'd need to do, right? If your programmer's monitor is making stuff look washed out, then I think you'll find that the shades of grey closest to white on his monitor will probably appear to be white. I just bring that chart up on the screen and adjust brightness/contrast. I would only use the display driver properties for tweaking colour temperature, personally.
But how did you adjust your grays, MoP? I was using Nvidia's gamma control slider, since my monitor's brightness/contrast controls were already used for the black point.
However, my monitors seem to have pretty decent settings...