Hi all,
Question for game devs: what are your processes for calibrating TVs/monitors at your workstations? There isn't really a good way to predict the viewing environment for the shipped version of the game, since people play in all sorts of lighting situations, so I'm curious how others have dealt with this. Also, has anyone figured out a good way to calibrate for HDR TVs? I've heard mixed things about it. Some claim you just calibrate them like you would a non-HDR display, while others say that's actually impossible, and I'm having trouble finding real information on HDR calibration processes. For example, has anyone just hooked up a CalMAN device to an HDR TV and calibrated as usual?
Replies
http://polycount.com/discussion/comment/1610179/#Comment_1610179
The main problem is wide color gamut, not brightness/contrast ratio. Any wide-gamut monitor will show you insanely oversaturated acid colors if the content is plain sRGB (a smaller gamut): grass turns emerald green, faces look reddish, and so on. Games are usually authored in the sRGB standard and, unlike Photoshop, are "color non-aware" applications, i.e. they don't adjust colors for a specific gamut or monitor profile; in a word, they know nothing about the monitor they're shown on. So it's on the TV side to remap sRGB content so it looks more or less color accurate on its wide-gamut panel, even if that introduces nasty banding artifacts (there's a quick numeric sketch of this mismatch after this reply). Whether any random HDR TV does this, I have no idea.
There is a good chance they don't (because of those artifacts) and will show you insane colors, calibrated or not. So I would first look for an "sRGB" or "photo" mode on the TV for games and check it with some football match. If the grass color looks natural, like what you'd see with your own eyes, it's fine. If it looks "improved", i.e. super nuke green, acid, etc., then even a calibrator may not help, since monitor profiling usually doesn't involve emulating a specific color space.
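To make the oversaturation point above concrete, here is a rough Python sketch (not from the thread, just an illustration under a few assumptions): the matrices are the standard sRGB and Rec.2020 RGB-to-XYZ matrices, the transfer-function difference between the two standards is ignored so only the primaries mismatch is shown, and the "grass" color is a made-up sRGB value.

```python
# Rough sketch of why sRGB content looks oversaturated on a wide-gamut
# display when the display applies no gamut mapping.
import numpy as np

# Linear sRGB -> XYZ (D65), per IEC 61966-2-1.
SRGB_TO_XYZ = np.array([
    [0.4124564, 0.3575761, 0.1804375],
    [0.2126729, 0.7151522, 0.0721750],
    [0.0193339, 0.1191920, 0.9503041],
])

# Linear Rec.2020 -> XYZ (D65), per ITU-R BT.2020.
REC2020_TO_XYZ = np.array([
    [0.6369580, 0.1446169, 0.1688810],
    [0.2627002, 0.6779981, 0.0593017],
    [0.0000000, 0.0280727, 1.0609851],
])
XYZ_TO_REC2020 = np.linalg.inv(REC2020_TO_XYZ)

def srgb_decode(c):
    """sRGB gamma decode (EOTF) to linear light."""
    c = np.asarray(c, dtype=float)
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def xy_chromaticity(xyz):
    """CIE xy chromaticity; more saturated colors sit further from white."""
    return xyz[:2] / xyz.sum()

# A grass-like green authored in sRGB (8-bit 60, 140, 50) -- made-up value.
grass_linear = srgb_decode(np.array([60, 140, 50]) / 255.0)

# Correct: the values are interpreted as sRGB, giving the intended color.
xyz_intended = SRGB_TO_XYZ @ grass_linear

# Wrong: the display drives its much wider Rec.2020 primaries with the
# same code values, i.e. no gamut mapping is applied.
xyz_misread = REC2020_TO_XYZ @ grass_linear

# What a gamut-mapping display should drive its primaries with instead.
rec2020_linear_correct = XYZ_TO_REC2020 @ xyz_intended

print("intended chromaticity (xy):", xy_chromaticity(xyz_intended))
print("misread as Rec.2020  (xy):", xy_chromaticity(xyz_misread))
print("remapped drive values     :", rec2020_linear_correct)
# The "misread" chromaticity lands noticeably further from the D65 white
# point (~0.3127, 0.3290), i.e. the grass green is pushed toward acid green.
```

Running this shows the misread green sitting visibly further from white on the chromaticity diagram than the intended one, which is exactly the "nuke green grass" effect described above; the remapped drive values are what a TV's sRGB/photo mode is effectively supposed to produce.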