
Help in Understanding Color Space for Creation and Implementation

killnpc polycounter

I have a surface-level understanding of sRGB and color profiles and was wondering what key aspects I should understand for game asset work. As far as I understand it, sRGB, like all color profiles, translates RGB values to a color/value, but sRGB allows a 50/50 distribution of the spectrum with minimal gamma correction. As far as I'm aware, linear color space is the standard used for shaders; my assumption is that its gradient leans toward the lighter side of the spectrum, using fewer dark values, so its calculations have more light values to work with within the 256-value range per channel in order to account for and blend a material's highlights.


I also know that when creating texture assets, I personally would want an even distribution of values to paint and work with. In my head I liken it to painting when you're running low on black paint: you can't mix it to create a lot of dark values, so you paint values lighter on the spectrum. Or like painting with a traditional medium under a blue light, where the results will change when the work is later displayed under white studio light.


What I mean to ask here is: am I off with my initial assertions, and what is the logical process to adopt from content creation to asset implementation in engine? What color profile do I use in Photoshop/Substance Painter for 3D game assets, and what color profile do I convert my 3D art assets to for a game engine for the best results?

thanks

Replies

  • Eric Chadwick

    It's pretty simple when you boil it all down, but it gets complex when you get down into the nitty-gritty math details.

    Photo-sourced textures and the like (diffuse, emissive, specular color) should be saved in sRGB (or gamma 2.2, basically the same thing). All other textures should be saved in Linear (or gamma 1.0, same thing): roughness, metallic, normal maps, ambient occlusion, etc.

    Renderers convert all textures to Linear during rendering, which makes the math easier and the shading less artifact-prone. Then at the end, just before outputting to the screen, the rendered image is converted back to sRGB.
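
    To make that concrete, a minimal sketch of the round trip: decode the stored sRGB value to linear, do the shading math there, then re-encode for the display. The curve constants are the standard IEC 61966-2-1 ones; the "lighting" multiplier is made up.

    ```python
    # Piecewise sRGB transfer functions (IEC 61966-2-1).
    def srgb_to_linear(c):
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def linear_to_srgb(c):
        return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    albedo_linear = srgb_to_linear(200 / 255)         # texture stores sRGB
    lit = albedo_linear * 0.5                         # toy "lighting" in linear
    display_value = round(linear_to_srgb(lit) * 255)  # re-encode for the screen
    print(albedo_linear, lit, display_value)
    ```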

    There are exceptions (there always are, ugh) but that's it in a nutshell.

  • Eric Chadwick

    "but sRGB allows a 50/50 distribution of the spectrum with minimal gamma correction."

    Not really. sRGB and gamma correction are basically the same thing (very slight difference, not enough to matter for us artists).

    Gamma correction is used by all cameras when they store photos in JPG format.

    Why?

    1. 24-bit color means that a JPG has only 256 levels of red, 256 levels of green, and 256 levels of blue.

    2. Our human visual perception system has greater sensitivity for dark values than for bright values.

    The limited number of values in 24-bit color, combined with that biased human perception, means we tend to see banding/artifacting in the darker parts of a 24-bit image.

    So... gamma correction biases the encoding, allocating more values to the darker parts and fewer to the brighter parts.
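
    A quick back-of-the-envelope check of that bias (using a plain 2.2 gamma rather than the exact sRGB curve):

    ```python
    # How many of the 256 8-bit codes fall in the darkest 20% of linear light?
    codes_in_darks = sum(1 for code in range(256) if (code / 255) ** 2.2 <= 0.2)
    print(codes_in_darks)  # ~123 codes, i.e. roughly half of them

    # With straight linear storage only ~52 codes (20% of 256) would cover that
    # same range, which is why shadows band so badly without gamma encoding.
    ```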

  • killnpc polycounter

    Excellent, thanks Eric, you're awesome.

  • poopipe grand marshal polycounter

    In terms of making stuff for games in most game engines...

    Your colour textures (basecolor/emissive) should be stored using values in sRGB space.

    Your data textures (normal, opacity, roughness and everything else) should be encoded using values in linear space.


    The issue is not what data you store, it's what the thing reading it is expecting.

    Most game engines do not read colour space metadata from files, and many of the tools we use do not embed it in the first place, so we are forced to make assumptions and to manage it manually.
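
    As a concrete illustration of that manual bookkeeping, here is a sketch of the kind of naming-convention check a pipeline might run at import time. The suffixes are hypothetical; every studio and tool has its own convention.

    ```python
    # Hypothetical suffix convention; adjust to whatever your pipeline uses.
    SRGB_SUFFIXES = ("_basecolor", "_albedo", "_emissive")
    LINEAR_SUFFIXES = ("_normal", "_roughness", "_metallic", "_ao", "_opacity")

    def is_srgb_texture(filename: str) -> bool:
        name = filename.lower().rsplit(".", 1)[0]
        if name.endswith(SRGB_SUFFIXES):
            return True
        if name.endswith(LINEAR_SUFFIXES):
            return False
        raise ValueError(f"Unknown texture type, flag it explicitly: {filename}")

    print(is_srgb_texture("crate_basecolor.png"))  # True  -> import as sRGB
    print(is_srgb_texture("crate_normal.png"))     # False -> import as linear
    ```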


    It's a shit show.

    We all need to move to properly managed workflows and file formats that are fit for purpose.

  • gnoop polycounter

    Isn't Photoshop also just reads "sRGB" or "Adobe RGB' etc from the files . A color space the file is intended to store values . And all the magic happens on correcting/ transforming color values for your specific monitor profile by Adobe color engine . A profile created by calibrating device.

    I thought Unreal, at least, should do something similar nowadays through its ACES color process, outputting different but still proper color values for different kinds of monitors: wide gamut, HDR, or a simple sRGB one. The only difference maybe is that it doesn't do it for your specific screen unit, the specific profile of your monitor, for absolute accuracy?

    A question rather than a statement. I don't work with Unreal myself, and I always use only sRGB monitors to avoid any extra pain in my a... :)

    I'm still accustomed to killing super vivid greens though, because people seeing them on expensive HDR monitors see something nuke-acid if they get a non-transformed sRGB signal.

  • pior grand marshal polycounter

    "Isn't Photoshop also just reads "sRGB" or "Adobe RGB' etc from the files . A color space the file is intended to store values . And all the magic happens on correcting/ transforming color values for your specific monitor profile by Adobe color engine . A profile created by calibrating device."

    Just wanted to mention that this paragraph is pretty much undecipherable ... which doesn't really help when the topic is already quite technical :D

    - - - - -

    Now on to the topic itself : I've personally found that there is a bit of a chasm between things sounding like good practice and what actually works on the daily. For instance I've used a high quality, high gamut monitor in the past that required hardware calibration and a color profile to be loaded. Now that's great, but that can also quickly fall apart as soon as one piece of software in the creation chain just doesn't support accurate color profile rendering. From the software used to review work (random internet browser, or whatever is associated with opening image files on the system driving the TV or projector of a conference room) to any little tool used in the creation process (printscreen apps for paintovers, Pureref for moodboard display, and so on) - there's just a lot that can go wrong and does go wrong.

    If anything I feel like sticking to sRGB for everything as well as "pseudo-calibrating" the monitors of a dual monitor workstation by eye (using the good old on-screen controls and/or the Nvidia control panel) really goes a long way and is less of a headache than anything involving color profiles and hardware calibration devices. Cross checking the good display of the pictures on a modern smartphone + a cheapo tablet can be useful too.

    And from there as soon as something seems off in any given software, it's a sign that an exotic color profile has been introduced ... and needs to be squashed out :D

    But of course I also understand that to people involved in professional photography this approach would sound like heresy :D

  • gnoop polycounter

    I just meant that Photoshop, or any so-called "color aware" software, just reads the image file's sRGB, aRGB, etc. flag from the metadata. It's not much different from a game engine assuming all color textures are sRGB: you just know it and import the texture that way. Reading it from metadata seems redundant to me.

    And by the magic I meant that the Photoshop color engine compares the RGB values stored in the image with your monitor profile, then calculates and shows accurate colors on your screen. Ideally very close on any monitor you use, provided a decent calibration device (colorimeter) made the profiles for those monitors.
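
    Roughly, that is what a color-managed display transform boils down to: decode the stored sRGB to linear, go through CIE XYZ, then into the monitor's own primaries and tone curve. A sketch below; the sRGB-to-XYZ matrix is standard, but the monitor matrix is only a stand-in for what a colorimeter-built profile would supply, not real measurements.

    ```python
    import numpy as np

    def srgb_to_linear(c):
        c = np.asarray(c, dtype=np.float64)
        return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

    # sRGB primaries -> CIE XYZ (D65). Standard constants.
    SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                            [0.2126, 0.7152, 0.0722],
                            [0.0193, 0.1192, 0.9505]])

    # A monitor profile essentially supplies this matrix plus a tone curve.
    # These numbers are placeholders, not a real measured profile.
    MONITOR_RGB_TO_XYZ = np.array([[0.48, 0.27, 0.20],
                                   [0.23, 0.69, 0.08],
                                   [0.00, 0.05, 1.04]])

    def srgb_pixel_to_monitor(rgb):
        xyz = SRGB_TO_XYZ @ srgb_to_linear(rgb)
        monitor_linear = np.linalg.inv(MONITOR_RGB_TO_XYZ) @ xyz
        return np.clip(monitor_linear, 0.0, 1.0) ** (1 / 2.2)  # assume a ~2.2 curve

    print(srgb_pixel_to_monitor([0.5, 0.25, 0.1]))
    ```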

    The difference appears when a game's output signal is always sRGB, even for nuke-vivid HDR, versus being re-calculated for the color space/gamut of the connected monitor. As I read it, Unreal should do this through its ACES support, but I have no idea if it does it for the exact monitor profile of your screen or just a generic sRGB (or whatever) profile, or how it knows what kind of monitor is connected.
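
    For reference, an ACES-style "filmic" output transform roughly does the following. This sketch uses Krzysztof Narkowicz's popular curve fit, not Unreal's exact RRT/ODT, and it says nothing about per-monitor profiles.

    ```python
    def aces_film(x):
        # Narkowicz's approximation of the ACES filmic tonemap.
        t = (x * (2.51 * x + 0.03)) / (x * (2.43 * x + 0.59) + 0.14)
        return min(max(t, 0.0), 1.0)

    def linear_to_srgb(c):
        return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

    # Scene-referred linear values (can go well above 1.0) -> display-ready sRGB.
    for scene_linear in (0.18, 1.0, 4.0, 16.0):
        print(scene_linear, round(linear_to_srgb(aces_film(scene_linear)), 3))
    ```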

    I did calibrate by eye too, until I realized how off and shifted my colors were. I've never been able to get a good, reliable result by eye versus something like my X-Rite ColorMunki.

    It's dead simple: everything is done automatically in 4-5 minutes, and it's actually a lot less headache than doing it by eye. The included software is just as simple, even having a hit-one-button-and-forget-until-it's-done mode. The device is amateur level and isn't meant to work with all that complicated pro-level software anyway; still, reviews said the result is mostly the same.

    I believe a $150 colorimeter plus a $300 sRGB IPS monitor is always better than a $1k monitor without a colorimeter.

  • pior grand marshal polycounter

    Oh yeah, I should have been clearer myself: I see nothing wrong with hardware calibration in and of itself, as getting a monitor to look right is always a good thing. My issue is when color profiles start creeping into renders or texture files for no good reason, resulting in colors being off in any non-photography-oriented application trying to simply access the values of pixels, like game engines loading up texture files, or any other non-specialized application (like the aforementioned screengrab and moodboard tools).

  • poopipe grand marshal polycounter

    That is a problem in practice, for sure.


    Where we might differ in opinion is that, in my view, we as an industry should start doing things properly rather than avoid the subject altogether.

    That doesn't mean storing our data in Rec.709 etc., because that's only useful for camera-captured data; it means our file formats and software all need to support the idea of a colour space so we aren't making assumptions about the state of our input data.
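
    For what it's worth, some formats already carry that information; whether anything downstream reads it is the real gap. A quick check with Pillow (assuming it is installed, and a hypothetical file name) shows whether an image has an embedded ICC profile:

    ```python
    # Requires Pillow (pip install Pillow).
    from PIL import Image

    def embedded_icc_profile(path):
        with Image.open(path) as img:
            return img.info.get("icc_profile")  # bytes, or None if absent

    profile = embedded_icc_profile("crate_basecolor.png")  # hypothetical file
    print("has ICC profile" if profile else "no profile embedded, falling back to convention")
    ```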

    The irony is that if this is done correctly you never have to think about colour spaces; if it's not, you have to think about them all the time.

  • gnoop polycounter

    As for calibrating by eye versus using a device, I usually use this picture to check.

    The picture, viewed at 100% scale, should be a perfectly gray gradient without any vertical half-split or visible colors.

    And thankfully many monitors are just fine at their default sRGB settings nowadays. The picture shows gamma linearity, but it doesn't show you anything about color temperature or color space mismatch.

    But at least you know that everything is not that bad when the stripes are gray.
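
    The original test image isn't reproduced here, but something comparable is easy to generate. Below is a minimal sketch using Pillow (assumed installed) that writes a neutral ramp; on a well-behaved setup it should look like a smooth, colorless gray gradient, and any tint or hard banding points at a gamma or profile problem.

    ```python
    # Requires Pillow. Writes a 1024x128 neutral gray ramp to inspect on screen.
    from PIL import Image

    WIDTH, HEIGHT = 1024, 128
    ramp = Image.new("RGB", (WIDTH, HEIGHT))
    for x in range(WIDTH):
        v = round(x / (WIDTH - 1) * 255)  # equal R=G=B keeps it neutral
        for y in range(HEIGHT):
            ramp.putpixel((x, y), (v, v, v))
    ramp.save("gray_ramp_check.png")
    ```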
