I just got a new second monitor, but the color on it is tinted slightly blue and I'm not sure why. My main display shows colors perfectly, but any time I add a second one it's always a bit off. Obviously I need to be able to see colors correctly; does anyone know how I can get them both showing the same colors?
Both at 1600x900
One is marketed as HD, but I'm pretty sure that only matters for movies; that one is connected via DVI
One is a Samsung and one is an HP; the HP is connected via HDMI
I'm on an ATI 6570
Replies
Have you tried the Windows 7 Display Calibration tool? It might help you.
Why would they make the colors off in the first place... no matter what the brand?
Well, it's the same thing with TVs: it all depends on the components, so to get a really good screen for photography and art you shouldn't buy a cheap one. It's also recommended that you buy two of the same brand.
If your second monitor is too dark, first try to figure out whether it's the brightness or the gamma. Second, check which of the "user settings" it's using: Cold, Warm, Custom, etc. Try to make both match up; my bet is that they'll never really match 100%, though. =/
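If it helps to see why those two knobs look different, here's a tiny sketch (Python/numpy, just my own illustration, nothing from any calibration tool): a brightness change shifts every level by roughly the same amount, while a gamma change bends the curve so the mid-tones move much more than the blacks and whites.

```python
import numpy as np

# Normalized pixel levels from black (0.0) to white (1.0)
levels = np.linspace(0.0, 1.0, 5)

# Brightness tweak: add a fixed offset to every level, then clamp
brighter = np.clip(levels + 0.1, 0.0, 1.0)

# Gamma tweak: raise the levels to a power, which mostly moves the mid-tones
# while black and white stay pinned at 0 and 1
gamma = 1.8
gamma_adjusted = levels ** (1.0 / gamma)

print("input levels :", np.round(levels, 2))
print("brightness up:", np.round(brighter, 2))
print("gamma 1.8    :", np.round(gamma_adjusted, 2))
```

So if the blacks and whites look about right but the mid-tones are off, it's more likely a gamma mismatch than a brightness one.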
I had two Samsung monitors, both 22", both the same resolution, one just a newer model of the same display, and one had a noticeably warmer hue (or the other had a cooler hue, depending on how you looked at it). Without a hardware calibrator I could never get them to match perfectly, which was annoying.
I don't know if I trust Samsung; they really have a lot of different versions of the same models. C, B, A, S panels, etc., all different-quality panels of the same model for the same price. It drives me crazy.
Not only that, but they insist on 'developing' those stupid mode settings like 'Movie' or 'Games', and then on another model they're called 'Entertainment' or 'Cool', etc.
My thought on the subject is that the average person doesn't adjust their monitor, or if they do, they typically adjust it to their personal liking. There is no guarantee that what you see is going to look the same way on someone else's monitor, so I try to factor that into the way I work.
At work they usually try to calibrate everyone's monitors. I let them calibrate my viewing monitor, but I have them leave my workspace monitor alone. I find that if I can get my art to look acceptable on both, it will usually still look good in the conference room and at home.
- BoBo
http://www.amazon.com/Datacolor-DC-S3X100-Spyder-Express/dp/B0037255LC/ref=sr_1_2?ie=UTF8&qid=1315392014&sr=8-2
I calibrate all my displays with this one and the results are quite good. Of course you can't get more colors out of your display than it can physically show, but at least what it does show is a more accurate representation.
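For anyone wondering what a calibrator like that actually produces: it measures the panel and writes an ICC profile, which the OS and color-managed applications then use to correct what gets sent to the display. As a rough sketch of the software side only (Python with Pillow; the file names are made-up examples), a profile-to-profile conversion looks roughly like this:

```python
from PIL import Image, ImageCms

# Hypothetical file names for this example only
monitor_profile_path = "my_monitor.icc"   # profile written by the calibrator
source_image_path = "photo.jpg"           # an sRGB-tagged image

# Convert the image from sRGB into the monitor's measured profile,
# which is roughly what a color-managed viewer does before drawing pixels
image = Image.open(source_image_path)
srgb_profile = ImageCms.createProfile("sRGB")
monitor_profile = ImageCms.getOpenProfile(monitor_profile_path)

converted = ImageCms.profileToProfile(image, srgb_profile, monitor_profile)
converted.save("photo_for_this_monitor.jpg")
```

That's only an illustration of how the profile gets used; the calibration software itself typically also loads a correction curve into the video card so the whole desktop benefits, not just color-managed apps.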
I agree with this sentiment. I always wonder why people go to such lengths to achieve 'correct' color when 99% of end users are probably not going to have corrected their displays anyway. I can understand it if you design for a fixed medium like print, but for digital content, it seems like a wasted effort.
I don't think I've met a single consumer in my life who has calibrated their display properly. As BoBo said, what you see on your display is likely to be different on practically everyone else's anyway, so unless you're working for print or for a client who you know has properly calibrated displays, I wouldn't worry about it.
I put together an image to illustrate what happens:
If there is such a thing as 'correct color', and all displays can be calibrated to display this, why don't manufacturers make their panels default to 'correct color'?
Or is it that they do, and the environment in which the display is used affects the color (temperature, age of the panel, room lighting, etc.), so it needs to be calibrated by the end user?
But I still find myself thinking there should be a way to test how the game will look on players' systems, mainly because console gamers mostly play on larger displays at longer viewing distances. It seems inefficient to handle this at the individual artist level. How do large studios manage this kind of thing? Any ideas?
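On the "is there such a thing as correct color" question a couple of posts up: in practice it usually means a published standard like sRGB, which fixes the white point, the primaries, and the tone curve a display is supposed to reproduce; calibration is just measuring how far a particular panel drifts from that and compensating. As a small illustration (Python, my own sketch), the sRGB tone curve a calibrated display is expected to follow looks like this:

```python
def srgb_encode(linear: float) -> float:
    """Map a linear-light value in [0, 1] to its sRGB-encoded value
    (the curve is close to, but not exactly, a simple gamma of 2.2)."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * (linear ** (1.0 / 2.4)) - 0.055

# A display calibrated to sRGB should turn these encoded values back into
# the matching amounts of light; an uncalibrated panel lands somewhere else.
for value in (0.0, 0.18, 0.5, 1.0):
    print(f"linear {value:.2f} -> sRGB-encoded {srgb_encode(value):.3f}")
```

Panels don't ship perfectly locked to this partly because individual units vary, and partly because, as mentioned above, ambient light, panel aging, and personal preference all pull the result around, so the correction ends up being per unit and per environment.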
Most articles/discussions I read state that you should be working at a luminance of 120 cd/m², but I often find that to be quite dull. (Using an X-Rite i1 Pro on a Dell U2410 monitor, btw.)
Considering most people won't tweak their screens, I'm concerned that a lower luminance could mean blown-out images on the general public's screens.
Has anyone dealt with this? Any suggestions?