
Albedo Creation Confusion

monitorhero polycounter lvl 6
Hello guys,

I registered on this forum to get your insight on PBR workflows and how to texture for them. My question is about albedo maps:

Albedo
If I understood the idea behind it correctly, an albedo map should contain no lighting information, only the pure reflected diffuse color.

So should there be any gradients in an albedo color map at all? Like in this example: http://shadermap.com/docs/mm_albedo_map_from_diffuse.html
Because the darker colors are just brightened shadows in this example, aren't they?

When I think of an albedo map I see only solid colored areas, because every surface irregularity should come from roughness, normals, etc., right? (e.g. fresh paint on a surface is just one color, straight out of the bucket)
Or do I have to take into account if a part is modeled or flat in my 3d model, when creating my albedo?

For example, if I have flat plane geometry where the light source in my scene could not create any shadow gradients, should my albedo contain a color gradient? And if the part was modeled convex or concave, could it just be a solid color in the albedo?

I haven't found any good explanation of the albedo workflow when creating my own painted maps and what colors can be used.

Thanks for the help in advance.

Greetings
monitorhero


Replies

  • musashidan
    musashidan high dynamic range
    As far as I see it the albedo is a representation of pure diffuse/reflected light, containing absolutely no lighting info/AO. The example you posted is an auto-generated map from a photo, and I think it is a bad example: the diffuse in the generated albedo looks like AO in the shadowed areas of the original image. The diffuse gradients should be derived in the shader when lit and shouldn't be in the albedo map.

    This is my understanding anyway. I'm sure @earthquake on the forums here will give you a better (correct) answer.
  • Quack!
    Quack! polycounter lvl 17
    Regarding the Base Color, which I prefer over "albedo" as far as terminology goes: it should only represent the perceived color. No lighting information needs to be in the Base Color.

    "For example, if I have a plane geometry where the light source in my scene could not create any shadow gradients my albedo should contain a color gradient? And if the part was modeled convex or concave it could just be a solid color in the albedo?"

    You kind of answered the question here. Unless you WANT to fake light information, don't fake light information. At the end of the day it is up to you whether you do it the old-school way and fake the light, or embrace the tech and let it work its magic.

    As for the flowers, and many organics for that matter, there is generally a TON of color variation and gradation.  Often this is subtle but still exists. So those gradients may be part of the cellular makeup of the petal and would be in the base color.
  • monitorhero
    monitorhero polycounter lvl 6
    @musashidan That was also my understanding, and the picture illustrates a bad example. But this video (which is referred to a lot) seems to do the same thing, so technically it's not correct: https://www.youtube.com/watch?v=KKQZN3eoKUo&list=UUfyvlGW2gDNh5YH6CFh7BtQ

    @Quack! Yeah I kinda did, but on a low-poly model I need faked lighting even in my albedo, otherwise the whole thing would look dull, wouldn't it?

    I see a lot of albedo maps on websites and every time I think they look wrong, because the shadow areas just get brightened when they really should be removed completely.
  • Quack!
    Quack! polycounter lvl 17
    If you use PBR properly your model won't look dull, even with a flat base color. If it does look dull, the problem is most likely elsewhere: lighting, bad material values, lack of proper textures, etc.

    If you put no lighting into your base color then you want to make sure you are using an Ambient Occlusion map in real time, which will give you proper reflection dampening in shadowed areas.  Don't worry about what individual maps look like in a modern PBR environment, what matters is how they all come together in a scene.
  • monitorhero
    monitorhero polycounter lvl 6
    Quack! said:
    If you use PBR properly your model won't look dull, even with a flat base color. If it does look dull, the problem is most likely elsewhere: lighting, bad material values, lack of proper textures, etc.

    If you put no lighting into your base color then you want to make sure you are using an Ambient Occlusion map in real time, which will give you proper reflection dampening in shadowed areas.  Don't worry about what individual maps look like in a modern PBR environment, what matters is how they all come together in a scene.

    Thanks for the explanation. But for realtime rendering doesn't there still need to be some kind of faking involved? I couldn't model each fiber of a t-shirt, for example, to get the proper reaction to light.
  • musashidan
    musashidan high dynamic range

    Thanks for the explanation. But for realtime rendering doesn't there still need to be some kind of faking involved? I couldn't model each fiber of a t-shirt, for example, to get the proper reaction to light.
    No, this is the whole idea of physically based shading/rendering: to eliminate old hacks. Same as it's been in offline rendering for years once ray tracing became the norm.

    This is where the glossiness/normal/AO maps all work in harmony in the PBR workflow to create the microsurface detail and shading quality under lighting. The AO, as @quack says, should always be plugged into the shader rather than baked into the diffuse. It is there to ensure that an object looks 'grounded' under direct light and is controlled dynamically by the engine under direct/indirect lighting.
  • monitorhero
    monitorhero polycounter lvl 6

    Thanks for the explanation. But for realtime rendering doesn't there still need to be some kind of faking involved? I couldn't model each fiber of a t-shirt, for example, to get the proper reaction to light.
    No, this is the whole idea of physically based shading/rendering: to eliminate old hacks. Same as it's been in offline rendering for years once ray tracing became the norm.

    This is where the glossiness/normal/AO maps all work in harmony in the PBR workflow to create the microsurface detail and shading quality under lighting. The AO, as @quack says, should always be plugged into the shader rather than baked into the diffuse. It is there to ensure that an object looks 'grounded' under direct light and is controlled dynamically by the engine under direct/indirect lighting.
    Yeah I know that this is the idea. But aren't textures/maps some sort of faking too? To me textures seem to be only an approximation, because I can't model every (say) dust particle and assign the appropriate material to it; there isn't enough computing power. But maybe I'm having a misconception here :smile:
  • monitorhero
    monitorhero polycounter lvl 6
    Another question regarding the metallic - roughness workflow:

    In this chart http://blogs.unity3d.com/wp-content/uploads/2014/11/UnityMetallicChart.png it says under Metallic R "this value will always be greyscale" but contrary to this statement the guys from Allegorithmic say that the metallic map should be only black and white most of the time as @somedoggy also mentions in this thread 
    http://polycount.com/discussion/117309/faq-how-u-make-dem-mats-hands-on-mini-tuts-for-making-materials-and-texturing/p5

    "For the Metallic style PBR, they lock nonmetals at 4% reflection, which is a refractive index of 1.5 for the material. This is good enough for most everything. The metalness map replaces this 4% with whatever is in the albedo texture. This is why people advise against painting grey values in metalness maps. For example, a 50% grey would be a 50-50 blend between that 4% reflection and the albedo texture's color, while also removing 50% of the albedo. Note that a transition between rusted and non-rusted metal would have grey values in its metalness map in order to look decent, but this can be rationalized by thinking that at a sub-pixel level it's defining the percentage of metal versus nonmetal."
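    The blend described in the quoted passage can be sketched in a few lines of Python. This is an editorial illustration of the stated rule (a fixed ~4% dielectric reflectance, with metalness blending between that and the albedo), not any particular engine's shader code; the function name and sample values are made up for the sketch:

```python
def metalness_to_diffuse_spec(base_color, metalness, dielectric_f0=0.04):
    """Illustrative metalness blend: returns (diffuse, specular F0).

    base_color is a linear [0, 1] RGB tuple. Nonmetals get a fixed
    ~4% specular reflectance; metals take their specular color from
    base_color and lose their diffuse term entirely.
    """
    # Specular F0: lerp from the 4% dielectric value toward base_color.
    f0 = tuple(dielectric_f0 * (1.0 - metalness) + c * metalness
               for c in base_color)
    # Diffuse: attenuated by (1 - metalness); metals have no diffuse.
    diffuse = tuple(c * (1.0 - metalness) for c in base_color)
    return diffuse, f0

# The 50% grey case from the quote: half the albedo is removed and
# the specular F0 sits halfway between 4% grey and the base color.
diffuse, f0 = metalness_to_diffuse_spec((0.8, 0.4, 0.2), 0.5)
```

    For metalness 0.5 this yields diffuse (0.4, 0.2, 0.1) and F0 of roughly (0.42, 0.22, 0.12): exactly the 50-50 coupling the quote warns about.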
  • gnoop
    gnoop sublime tool
    perna said:
    I'll disagree with the generally held Polycount consensus here.

    PBR is not physically correct. It's a very rough approximation, so like always we still need to fake a large number of effects manually.

    For the flowers example specifically, I'm sure you can imagine how lifeless the colors would look without a high-quality SSS solution, in which case it would help to lower brightness and increase saturation in AO-areas.

    You still need to very much have an artistic eye and be able to make manual adjustments to make something look right, as opposed to just referring to some chart.

    As for the metallic workflow... there's been big discussion around this. It's simply scientific fact that you won't find pure metals anywhere except perhaps in a laboratory in outer space. You'll need to work with grayscale, not just 1/0. A lot of people will disagree, so let's perform a simple thought experiment here:

    Leave a steel object in a low-humidity environment. At which point in time does this material turn SUDDENLY from 1 to 0 metalness? How could you possibly be able to capture the transitional stages without using grey values?

    Then again, I advise using the specular workflow, not metalness.
    I completely agree with every statement. Moreover, for many materials where the surface is irregular enough, there is a lot of self-shadowing that a game engine would never do by itself, and there is no other way but to fake it in the textures. Without those small shadows things would never look real.
    Also, for almost every environment tileable texture, which usually has a pretty big texel size, keep in mind that the normal map in many cases won't have enough resolution to reflect light in the proper direction for all those small surface grains and chips, so you have to fake a lot, same as before PBR. And in fact you have to fight non-stop with the inflexible, locked nature of the pure metalness variant of the shader. I personally almost broke my mind before finally realising that I do have to use grayscale values in the metalness channel.

  • monitorhero
    monitorhero polycounter lvl 6
    @perna @gnoop I totally agree with your views. I also don't understand why this metallic/roughness workflow is becoming more and more of a standard and is promoted so much by Allegorithmic. In my opinion it's the weaker of the two. I would prefer specular, but I haven't found any tutorials for using it with Substance and Unity. I like those programs and would like to use them, but I can't wrap my head around how to do it.
  • EarthQuake
    An albedo should typically not have any lighting information in a modern PBR pipeline.

    If you want to account for shadows in ambient light (or shadows within shadows) you should use an AO map, which is typically loaded as a separate input so it only applies to the diffuse ambient contribution. Otherwise you get this problem where dark areas in diffuse from AO are always dark, even when lit with dynamic light sources.
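    A minimal numeric sketch of that split, using hypothetical scalar (single-channel) Lambert-style terms and made-up values, shows why baking AO into the albedo misbehaves under dynamic lights:

```python
def diffuse_lighting(albedo, n_dot_l, direct, ambient, ao):
    """Sketch of where AO belongs in a PBR-style diffuse term.

    AO scales only the indirect/ambient contribution, so dynamic
    direct lights still fully illuminate occluded texels.
    """
    direct_term = albedo * max(n_dot_l, 0.0) * direct   # no AO here
    ambient_term = albedo * ambient * ao                # AO only here
    return direct_term + ambient_term

# Baking the AO into the albedo instead darkens the direct term too,
# which is exactly the "always dark, even when lit" problem above:
correct = diffuse_lighting(0.5, 1.0, 1.0, 0.2, ao=0.3)        # ~0.53
baked = diffuse_lighting(0.5 * 0.3, 1.0, 1.0, 0.2, ao=1.0)    # ~0.18
```

    With the light shining straight at the occluded texel, the separate-AO version stays bright while the baked version loses most of its direct illumination.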

    Exceptions to this would generally be better represented with some special purpose shader. As Per says, if you have an object that scatters light, you may want to create the content a bit differently, but ideally here you would use a shader that approximates SSS, because light scattering and AO or baked in shadows are two very different things.

    Now the whole specular vs metalness workflow thing, I don't see the point of arguing about it like it's your favorite sports team. Both work slightly differently, both have pros/cons, both have a set of rules and when it comes to most things art, you should understand the rules before you break them. That's what most documentation, charts, and basic explanations you'll find on the subject are, a set of logical rules to get you started but not some religious text you need to stick to for fear of being cast from Polycount as a heretic.

    In most cases, it won't be up to the individual artist to choose which to use (for instance, advising someone who is working on a UE4 project to use the specular workflow would be pointless), so you should learn how to use both. Personally, I find working with the specular workflow more comfortable, but that's simply due to muscle memory; I've been using it for a very long time. I can get essentially the same results with the metalness workflow, again unless it's some strange outlier material like SSS, gems, etc. that would be better represented with a purpose-built shader in either workflow.
  • monitorhero
    monitorhero polycounter lvl 6
    @EarthQuake
    Thank you for your answer. I know about the pros and cons, but I prefer a method with more control (even if it might result in more mistakes if you don't understand how to use it properly). Plus, I saw the Substance Painter video where we see both examples, and the metalness workflow has a lot more edge artifacts, which I personally don't like. It's not really a religious thing :smile: Coming from V-Ray rendering, with a lot more control, I prefer the specular workflow because it has some similarities.

    Another question:

    In this thread Xoliul mentioned that the specular values from the Quixel scan data are "bogus", according to one of the employees himself.

    So is there any reliable resource at all to work with when using the specular workflow?
  • Synaesthesia
    Synaesthesia polycounter
    Only some of the data is incorrect - although that chart is so old that potentially all of it is incorrect, given our new approach to scanning materials.
  • EarthQuake
    Yeah Teddy told me the data in that chart is probably wrong, but not massively so. It's still valid as a means to compare where various materials fall on the spectrum. As I wrote in the tutorial that that image comes from, creating art content isn't a paint by numbers process where you just look up numbers on a chart, but having measured values to look at is certainly a valuable learning tool.
  • Synaesthesia
    Synaesthesia polycounter
    Indeed so! Charts are great reference material, but what's better in some cases is eyeballing it. A physically accurate value may not look as good as a fudged number that you eyeballed to match a reference image. I'd imagine a lot of factors come into play, most notably the fact that no two dirts are identical, no two metals are identical, etc. There's always subtle differences that define the materials.  When Megascans releases (soon, for real this time!) it'll be amusing to see just how different similar materials really are.
  • EarthQuake
    The data in that chart is from Quixel's older gen scanner, which wasn't perfect, but is still perfectly valid for basic comparison purposes, which was the intent of that chart.

    Take note of the note accompanying that chart in the tutorial it came from: http://www.marmoset.co/toolbag/learn/pbr-practice

    "Material values from most libraries tend to be measured from raw materials in laboratory conditions, of which you rarely see in real life. Factors like pureness of material, age, oxidization, and wear may cause variation in the real world reflectance value for a given object.


    While Quixel’s scans are measured from real world materials, there is often variation even within the same material type depending on the various conditions described above, especially when it comes to gloss/roughness. The values in the chart above should be thought of as more of a beginning point, not a rigid/absolute reference."


    tl;dr: Reflectance values or whatever reflectivity workflow you're using isn't a replacement for taste, experience, material research, etc.


    Anyway, even if I updated the chart with more accurate data (which I will be happy to do if someone from Quixel wants to coordinate), what we're likely to see is the reflectivity range of insulators (non-metals) clustered even closer to 4% than is shown in that chart, and metals typically in the 70%+ reflectivity range. What I would love to spend some time researching is metals in various states of oxidization, to see exactly where the reflectivity values fall, as well as various coated metals like blued steel and things of that nature, but I don't really have accurate reference for that.

  • ActionDawg
    ActionDawg greentooth
    Another question regarding the metallic - roughness workflow:

    In this chart http://blogs.unity3d.com/wp-content/uploads/2014/11/UnityMetallicChart.png it says under Metallic R "this value will always be greyscale" but contrary to this statement the guys from Allegorithmic say that the metallic map should be only black and white most of the time as @somedoggy also mentions in this thread 
    http://polycount.com/discussion/117309/faq-how-u-make-dem-mats-hands-on-mini-tuts-for-making-materials-and-texturing/p5

    "For the Metallic style PBR, they lock nonmetals at 4% reflection, which is a refractive index of 1.5 for the material. This is good enough for most everything. The metalness map replaces this 4% with whatever is in the albedo texture. This is why people advise against painting grey values in metalness maps. For example, a 50% grey would be a 50-50 blend between that 4% reflection and the albedo texture's color, while also removing 50% of the albedo. Note that a transition between rusted and non-rusted metal would have grey values in its metalness map in order to look decent, but this can be rationalized by thinking that at a sub-pixel level it's defining the percentage of metal versus nonmetal."
    An important note about this is that it is pointing out a caveat of the workflow more than anything else: that metalness couples albedo and specular in a way that limits potential materials that could be totally correct. Specular workflows have the advantage in this regard because they can utilize colored specular while also having diffuse contribution, really great for representing a slightly diffused metal. In practice if using metalness you'll not want to just stick to 1 or 0 values, because that's a factory new look. It just means you have to take that coupling into account when authoring your values.

    As said though, "PBR" in its current conception isn't quite there yet. Having a good understanding of the system is what really gives you a leg up, because you know how to bend it to produce for visuals first. PBR is half a conceptual framework, half real physical rules or approximations to them. We're also going to see it really improve in the next few years, which I go off on in a bit.

    I also strongly agree with the idea of diffuse/albedo having no lighting info. Using an AO map works far, far better, since the first lighting pass is direct light and only indirect light receives AO. Another good technique for noisy surfaces is to precompute the normal map mipmaps to increase roughness with distance. UE4 supports this and it's great.
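    The mip trick works because averaging divergent normals shortens the mean vector, and that shortfall measures normal variance. A Toksvig-style sketch of the idea (an editorial illustration of one common approach; not necessarily UE4's exact implementation, and the numbers are made up):

```python
def toksvig_power(avg_normal_len, spec_power):
    """Toksvig-style effective specular power from a mip-averaged normal.

    avg_normal_len is the length |N| of the averaged (pre-normalized)
    normals, in (0, 1]. Shorter means more variance in the footprint,
    hence a rougher (lower-power) highlight at that mip level.
    """
    s = avg_normal_len
    return (s * spec_power) / (s + spec_power * (1.0 - s))

# A flat region (unit-length average) keeps its sharpness, while a
# noisy region whose mip-averaged normal shrank to length 0.9
# becomes dramatically rougher:
flat = toksvig_power(1.0, 64.0)   # 64.0
noisy = toksvig_power(0.9, 64.0)  # roughly 7.9
```

    Precomputing this per mip level bakes the "rougher at a distance" behavior directly into the texture chain, which is what kills the sparkling aliasing on noisy surfaces.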

    Now going off on a tangent here just out of personal interest. One big improvement I've been wanting to work on is precomputing an SGGX table or making an approximated function, to get multiple scattering (currently the biggest downfall of our attainable realtime visuals IMO). It looks gorgeous but I haven't seen anyone attempt it yet. Another big thing is having multiple samples per pixel in a single shader pass, which would produce seamless blending between diffuse and specular as well as solve specular aliasing, which is a big problem in PBR. NVIDIA actually showed that tech off this year.
  • monitorhero
    monitorhero polycounter lvl 6

    EarthQuake said:
    What I would love to spend some time researching is metals in various states of oxidization, to see exactly where the reflectivity values fall, as well as various coated metals like blued steel and things of that nature, but I don't really have accurate reference for that.

    I would really be interested in this too.

    @somedoggy

    Those developments sound interesting. When you are talking about SGGX, you don't mean an approximation like in this paper:
    http://blog.selfshadow.com/publications/s2015-shading-course/activision/s2015_pbs_approx_models_slides.pdf

    I guess this is something different? And is there a link to this presentation from nvidia?

    So thank you everyone so far for the shared knowledge. I might come back to this thread since I am working on a project where I try to use pbr :smile:


  • gnoop
    gnoop sublime tool
    somedoggy said:

    I also strongly agree with the idea of diffuse/albedo having no lighting info. Using an AO map works far, far better since the first lighting pass is direct light. Only indirect light receives AO. Another good technique is for noisy surfaces, you can precompute the normal map mipmaps to increase roughness based on distance. UE4 supports this and it's great.      
     


    What if I want shadows from direct light cast by small pebbles, beneath a prominent brick on a wall, or by grass blades, for example? Those are a couple of pixels an engine would be incapable of rendering dynamically. AO would only work within shadows and not on the fully illuminated side of things, where it wouldn't be noticeable at all. At least it works that way in our shaders.

    Could you also give links for those precomputed normal mips and the multi-sample-per-pixel technique, please? Google gives nothing.

    Although I can say my problems with normal maps are not exclusively on lower mip levels but on the first mip too. The resolution is just not enough to form reflections from small details properly, while somehow it is enough for diffuse color, i.e. 1-pixel details.

  • ActionDawg
    ActionDawg greentooth

    What I would love to spend some time researching is metals in various states of oxidization, to see exactly where the reflectivity values fall, as well as various coated metals like blued steel and things of that nature, but I don't really have accurate reference for that.

    I would really be interested in this too.

    @somedoggy

    Those developments sound interesting. When you are talking about sggx you don't mean an approximation like in this paper:
    http://blog.selfshadow.com/publications/s2015-shading-course/activision/s2015_pbs_approx_models_slides.pdf

    I guess this is something different? And is there a link to this presentation from nvidia?

    So thank you everyone so far for the shared knowledge. I might come back to this thread since I am working on a project where I try to use pbr :smile:


    Stable Specular Highlights - NVIDIA: http://developer.download.nvidia.com/gameworks/events/GDC2016/akaplanyan_specular_aa.pdf

    SGGX is an adaptation of GGX to a microflake distribution; they use intuitions gained from that research to produce a BRDF with arbitrary multiple scattering events:
    The Distribution -> https://drive.google.com/file/d/0BzvWIdpUpRx_dXJIMk9rdEdrd00/view
    Multiple Scattering BRDF variant -> https://eheitzresearch.wordpress.com/240-2/

    @gnoop, the normal convolution info is here: http://media.steampowered.com/apps/valve/2015/Alex_Vlachos_Advanced_VR_Rendering_GDC2015.pdf
    As for small objects in textures that should cast shadows, I'd advocate a push for tessellation or another approach that could capture that information correctly (oldie but a goodie, a variant like this could work wonders: http://www.valvesoftware.com/publications/2007/SIGGRAPH2007_EfficientSelfShadowedRadiosityNormalMapping.pdf). AO in lit areas looks like such an ugly visual artifact 99% of the time that I think if you can't do it right, you should do it the most correct way you can. That means limiting it to indirect light, even if it means that small objects which would truly lie in shadow can't be properly represented. It looks good enough, IMO.
  • throttlekitty

    Anyway, even if I updated the chart with more accurate data (which I will be happy to do if someone from Quixel wants to coordinate on), what we're likely to see is the reflectivity range of insulators (non metals) being clustered even closer to 4% than is shown in that chart, and metals typically in the 70%+ reflectivity range. What I would love to spend some time researching is metals in various states of oxidization, to see exactly where the reflectivity values fall, as well as various coated metals like blued steel and things of that nature, but I don't really have accurate reference for that.

    You know, I'm going to dig around and see what I can't come up with. I live near SDSM&T, which kinda has enough focus on metallurgy to put "Mines" in their name. I don't know that I'll find what you're talking about exactly since it isn't an ID school, but I should get pointed in the direction of material study like that.
  • monitorhero
    monitorhero polycounter lvl 6
    somedoggy said:

    What I would love to spend some time researching is metals in various states of oxidization, to see exactly where the reflectivity values fall, as well as various coated metals like blued steel and things of that nature, but I don't really have accurate reference for that.

    I would really be interested in this too.

    @somedoggy

    Those developments sound interesting. When you are talking about sggx you don't mean an approximation like in this paper:
    http://blog.selfshadow.com/publications/s2015-shading-course/activision/s2015_pbs_approx_models_slides.pdf

    I guess this is something different? And is there a link to this presentation from nvidia?

    So thank you everyone so far for the shared knowledge. I might come back to this thread since I am working on a project where I try to use pbr :smile:


    Stable Specular Highlights - NVIDIA: http://developer.download.nvidia.com/gameworks/events/GDC2016/akaplanyan_specular_aa.pdf

    SGGX is an adaptation of GGX to a microflake distribution; they use intuitions gained from that research to produce a BRDF with arbitrary multiple scattering events:
    The Distribution -> https://drive.google.com/file/d/0BzvWIdpUpRx_dXJIMk9rdEdrd00/view
    Multiple Scattering BRDF variant -> https://eheitzresearch.wordpress.com/240-2/

    @gnoop, the normal convolution info is here: http://media.steampowered.com/apps/valve/2015/Alex_Vlachos_Advanced_VR_Rendering_GDC2015.pdf
    As for small objects in textures that should cast shadows, I'd advocate a push for tessellation or another approach that could capture that information correctly (Oldie but a goodie, a variant  like this could work wonders: http://www.valvesoftware.com/publications/2007/SIGGRAPH2007_EfficientSelfShadowedRadiosityNormalMapping.pdf). AO in lit areas looks like such an ugly visual artifact 99% of the time that I think that if you can't do it right, you should do it the most correct way you can. And that means limiting it to indirect light, even if that means sacrificing the fact that small objects would truly lie in shadow can't be properly represented. It looks good enough, IMO.
    Interesting papers. How would I use this self-shadowed normal mapping in Unity 5 and limit it to indirect lighting?
  • ActionDawg
    ActionDawg greentooth
    Well, this is an old technique that you'd probably want to take a second look at and update. The biggest problem is that it only works from 3 directions, so to make it really useful it'd be smart to find a method of getting more arbitrary directions. Then you'd want to take it into a PBR implementation, which has a lot more restrictions on correct specular (and albedo, for that matter) than HL2's renderer did. This means looking at how to generalize it for the microfacet model. I haven't seen any new research like that (or even along the lines of directional radiosity, with parallax occlusion maybe?), though it wouldn't be surprising to see it at all.

    The original code is in the paper, so if you wanted to you could hypothetically drop it as-is into a custom shader. For quick results just make a variant off their built-in PBR setup. It'd be very easy to do that by dropping multicompile functions into the relevant bits of code.
  • EarthQuake
    perna said:

    [...] what we're likely to see is the reflectivity range of insulators (non metals) being clustered even closer to 4% than is shown in that chart, and metals typically in the 70%+ reflectivity range. What I would love to spend some time researching is metals in various states of oxidization, to see exactly where the reflectivity values fall [...]

    Well, anywhere between 4 and 70% ;-)



    Sure, the range is obvious, but what I'm interested in is the distribution. If I had to hazard a guess I would say most real world materials would fall closer to either end of the spectrum than the middle.
  • EarthQuake
    perna said:
    perna said:

    [...] what we're likely to see is the reflectivity range of insulators (non metals) being clustered even closer to 4% than is shown in that chart, and metals typically in the 70%+ reflectivity range. What I would love to spend some time researching is metals in various states of oxidization, to see exactly where the reflectivity values fall [...]

    Well, anywhere between 4 and 70% ;-)



    Sure, the range is obvious, but what I'm interested in is the distribution. If I had to hazard a guess I would say most real world materials would fall closer to either end of the spectrum than the middle.
    Maybe I'm missing something here. If the rate of oxidation is linear, the sample distribution will be equal. If not, you would have an over-representation in one end only, not both. Try to think of a scientific justification for there being more cases of 75% oxidation than 25%. To reach 75% you need to pass 25% first...

    I believe that in the "early days" of PBR, some very vocal people were so stubborn about the whole silly 1/0 idea that there's still this bias hanging around.
    Hmm yeah. I guess I'm suggesting a non-linear rate of oxidation? Maybe that's stupid. From my understanding on a molecular level it's either metal, or iron oxide, there isn't really a "half metal" state. By the time you notice something has significantly oxidized, it's probably much closer to the reflectivity range of an insulator. Okay, there's my argument, now tell me why I'm dumb lolol.

    Still, I would love to see some actual real world data on this. Where is a material scientist when you need one?

    I thought about an interesting experiment: taking pennies of various ages, sorting them by level of oxidization, taking photos of them all with the same position/lighting, and measuring the highlight value in various spots. But I just brought all the change from my piggy bank to the bank. =( Even then, without a proper scanner it's difficult to separate reflectivity from roughness.
  • monitorhero
    monitorhero polycounter lvl 6
    perna said:
    Yep, correct on the microscopic level. Or, at least the transition time is so short that it's practically correct. You don't need to zoom in that far to see how it's not a uniform change across the surface, but "growing veins". But of course, that is irrelevant for us. To the human eye, and for our relatively low-resolution textures, those fine lines get super-sampled into gray gradients.

    There are further complexities. I think part of the problem is that we have some people who are very technically astute but lack the artists' eye driving much of the information in the industry. They like to wrap things up in neat little packages like "it's either 0 or 1". There's a lot of misinformation out there. There are quotes which multiply and become "truths", like for example... for painted metal you need to look up the *paint* values from your chart, bla bla. It's not correct. Can't people remember occasions when they look at a painted surface and know for a fact that metal is underneath just from the visual appearance? You intuitively know it's wrong to just call that surface "paint", so now you have a basis for looking up the relevant scientific information.  But, PBR has become this religious thing where you can't dissent with the commonly held views, so barring this little flip out I better just retreat again...
    What I'm also having trouble with is the terminology. I used "albedo" in the thread title, but there is also debate over whether that should be the term at all. Some still say diffuse, or base color like Allegorithmic. There's the same confusion between roughness, smoothness and glossiness. I think a common language should be defined before we can discuss these things. There is a lot of misinformation spread around the internet, in YouTube explanation videos in particular. Hopefully this gets sorted out in the future.
  • monitorhero
    monitorhero polycounter lvl 6


    Something to illustrate my problem: Allegorithmic talks about base color/albedo in all their videos, but in Substance Player they use the terms diffuse and specular. In this example it must be the specular workflow, right? Also there is both a bump and a height map, which are basically the same (one is 8-bit, the other 16-bit). Why does this substance need both?

    Another example: why is the spec map so colorful? Is it supposed to be metal? If so, why is there no black in the diffuse, as there should be in a specular workflow? Can someone explain this to me?


  • monitorhero
    monitorhero polycounter lvl 6
    perna said:
    Monitorhero: None of that really matters. You may just be confused because it's new to you. That doesn't mean the industry should accommodate that confusion. It will pass shortly. In the end what you need is the eye of the artist. All these technical things are just ways to reach your goal. You make art with your eye, not with Microsoft Excel or Turbo Pascal.

    Again: There is no set standard for rendering.

    When you see non-metal colored specular it's usually because the shader isn't gamma-correct, or that the company wanted a certain look.

    Remember that all the information about how things work in the real world is just meant to help you gain confidence and understanding. No games exist that look anywhere near realistic, nor have anywhere near the technology to pull off photo-realism, so keep in mind that this is all still very artistic, and artistic choices are made. Especially because reality tends to look dull. Movies don't try to look real.
    I understand what you're trying to say, but that doesn't really answer my question. When there are two workflows, specular and metalness, and they create a substance that is totally different from what they explain, how am I supposed to follow?
    I don't care about photorealism so much as the workflow, which is obviously different here. It's like a mix of the specular and the old non-PBR workflow. Maybe it was created before they made their transition to PBR.
    And could you tell me why there are both bump and height maps? Or does one cancel out the other? Because they don't seem to multiply the effect on the texture.
  • monitorhero
    monitorhero polycounter lvl 6
    perna said:
    monitor: It does answer your questions. You express frustration with the fact that there is no one firm, set standard and ask how you're supposed to cope. I answer you that your confusion is only temporary. Have a little bit of patience. These topics are exhaustively covered by free tutorials and documentation. You need to actually work to learn this stuff.

    Bump and height maps are not used in-engine. If your software uses these, simply refer to its documentation.
    I know those topics are covered in a lot of docs and videos, but I've also seen videos that presented things confidently yet were actually just wrong, so it's not easy to find a trustworthy source. For example, if someone uses a glossiness map in a metalness workflow, the result is incorrect, isn't it? Or if I used colors in my metalness map. Aren't there at least some rules you have to follow to not totally mess up your material? To some extent there must be rules to follow to keep a PBR workflow.
  • ActionDawg
    ActionDawg greentooth
    Metalness is just a constraint on specular values because artists got easily confused in the beginning. It's honestly totally unnecessary now, IMO. Internally (in a shader) it's just another way of doing specular. There is no such thing as a colored metalness map; that's the whole point. It replaces specular with whatever is in the albedo map, and that's where the color comes from. The shader uses a greyscale value and assumes you fed it one. If you feed in a colored metalness map, it will probably just grab the red or green channel, and you won't know which unless you dig through the code. Anyway, the math is very simple:

    float Metalness = MetalnessMap.g;
    float3 defaultSpecular = float3(0.04, 0.04, 0.04);
    float3 Specular = (defaultSpecular - defaultSpecular * Metalness) + AlbedoColor * Metalness; // 2 mad
    float3 Albedo = AlbedoColor - AlbedoColor * Metalness; // 1 mad

    Roughness and gloss (or smoothness, which sounds very dumb to me) are identical. There is no difference in what they represent other than that the values are inverted between roughness and gloss. Wanna make a gloss map out of a roughness map? y = 1 - x. Wanna get your old roughness map back? y = 1 - x. Easy peasy. Hell, you could have any mixture of roughness and gloss maps in a game and let the conversion be handled through a tickbox. It's trivial.

    There doesn't need to be any standardization of names here because they're all synonyms. Albedo = Diffuse = Base Color = whatever. No amount of picking one name will ever make anyone's art better, so who cares, tbh. On another note, Monitor, you seem really dug into however they made those Substances, and all I can tell you is: nobody will likely have an answer unless the creator shows up. It doesn't matter how it was made or with what system. What matters is what you are rendering with and how your result looks in it. Author for that, understand that. Looking at somebody else's art that's used with some specific shader in a previewing app isn't going to help you get better at all if you're not using the exact same setup.
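To make the relationships above concrete, here's the same math as a small Python sketch (illustrative values only; a real shader does this per pixel in linear space):

```python
# Hedged sketch (Python rather than HLSL) of the metalness -> specular/albedo
# conversion quoted above, plus the trivial roughness <-> gloss inversion.

def metalness_to_spec_albedo(albedo_color, metalness, default_specular=0.04):
    """Blend dielectric F0 (~4%) toward the albedo color as metalness rises."""
    specular = tuple(default_specular * (1.0 - metalness) + c * metalness
                     for c in albedo_color)
    # Metals have (almost) no diffuse term, so the diffuse color is scaled down.
    diffuse = tuple(c * (1.0 - metalness) for c in albedo_color)
    return specular, diffuse

def roughness_to_gloss(roughness):
    """Roughness and gloss/smoothness are the same data, inverted."""
    return 1.0 - roughness

# Pure dielectric: specular stays at the 4% default, diffuse is untouched.
spec, diff = metalness_to_spec_albedo((0.5, 0.2, 0.1), metalness=0.0)
# Pure metal: specular takes the albedo color, diffuse goes to black.
spec_m, diff_m = metalness_to_spec_albedo((0.95, 0.64, 0.54), metalness=1.0)
```

This also shows why feeding a colored metalness map in makes no sense: `metalness` is a single scalar blend weight, and all the color comes from the albedo.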
  • monitorhero
    monitorhero polycounter lvl 6
    I asked in the Allegorithmic forums for an explanation of this substance :smile: Albedo and diffuse are just synonyms, but in the past diffuse maps had lighting information baked into them, while with PBR workflows we try to avoid that. I hadn't come across the term albedo a few years back, so I thought it was established to illustrate the difference between a classic diffuse map and an albedo map.
  • ActionDawg
    ActionDawg greentooth
    The thing is, a diffuse/albedo map with no lighting information was used years back: in films.

    Yes, part of the reason people suggest using certain terms is to highlight the differences for game production. However, if you looked at a diffuse-only view of any film scene even 20 years ago, it would look just like a diffuse-only view of any PBR game level today: super flat, no lighting baked in.


  • pior
    pior grand marshal polycounter
    I mean for example if someone uses a glossiness map in a metalness workflow the result is incorrect, isn't it?

    So, besides the problem with this statement in itself (Per already covered that part), I think this illustrates a problem I have seen time and time again here on the forums: artists coming across modern realtime shading ("PBR" in any shape or form, depending on the implementation) seem to worry waaaaaay too much about it beforehand, as if it was some alien thing, and end up searching for tutorials left and right as opposed to simply looking at the documentation of specific tools and engines.

    The Marmoset team put together an extremely useful set of guides on their website clearing up the difference between their specular and metalness implementations; there are example assets freely available for Toolbag, the Quixel Suite and Substance Painter; and the UE4 documentation regarding inputs for their default material can be summed up as "well, our engine works pretty well with on/off (black or white) metalness, and leaving the spec map input empty, so do that". This kind of stuff is basically all there is to know to get started, and everything is actually very simple in practice.

    Something else that struck me is seeing artists researching how to do albedo maps (= avoiding too much fake lighting) ... but then forgetting that the sun actually bleaches the tops of objects, and that cracks tend to collect dust and dirt. They end up with textures that are waaaay too poor in surface color and value variation.
  • ActionDawg
    ActionDawg greentooth
    perna said:
    No, that's wrong. Make the edges brighter. It's the physically more correct thing to do.
    Good post! Could you elaborate on this bit specifically? I'm trying to think of an instance where this would make sense from a physical basis.

    This thread owns btw.
  • leleuxart
    leleuxart polycounter lvl 12
    somedoggy said:
    This thread owns btw.
    I agree. It's been a good read. 

    I think I kind of fall into the camp of sticking to the more standard PBR guidelines, only because I feel like you should know the rules before you break them: start a material off following said rules, then apply artistic liberties to really fine-tune it.

    But the whole colored-specular-with-plastic thing, I didn't know that. Are there any other materials that break the whole "grayscale spec values for non-metals" mindset? Is the colored specular just a more desaturated version of the albedo in that case?
  • ActionDawg
    ActionDawg greentooth
    Anything with iridescence, like some car paints, beetles, or CDs. Many materials, due to their structure, will have some slight color or respond differently under certain wavelengths. One example is copper, which has a heightened response under tungsten lights that can't be represented without spectral rendering. This is where RGB rendering falls short, but there are fairly new advancements that may make that a moot point soon-ish.
    https://jo.dreggn.org/home/2015_spectrum.pdf
    Just compare how drastically different the images in that paper appear for RGB versus spectral rendering.

    Fabrics are another example because their microstructure scatters light in ways that they can easily reflect a colored specular, like silk.

    I'm honestly not sure where the pure plastic + colored specular he mentioned comes in, though I wouldn't be surprised at all if much of that is due to multiple scattering events. There is this piece of research I just came across, though:
    http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.9.5561&rep=rep1&type=pdf
    If you look toward the bottom, they have plots showing specular differences amounting to 3.5-5% variation in specular from the incident light color (incoming light). This is from '04 and they mention it was preliminary. I'll definitely keep looking further!

    That said, the effects in the above paper are minor compared to the two big caveats of current realtime rendering: no multiple scattering and no spectral rendering. And of course, I'm not advocating that anyone be a purist and put 100% measurement-based values over the artist's eye. Just sharing the rendering background and theory behind all this.

    Edit:
    https://books.google.com/books?id=ZRMEPqCRnlUC&pg=PA305&lpg=PA305&dq="The+Color+of+Specular+Highlights"&source=bl&ots=EAh-jG8QDP&sig=EqPxrOY0lM9ivBb5hGWZEmQnV_s&hl=en&sa=X&ved=0ahUKEwjD95iM5fXMAhWJaD4KHT72DiwQ6AEITzAH#v=onepage&q="The Color of Specular Highlights"&f=false
    This one found variations "at least as large as 15%".
  • ActionDawg
    ActionDawg greentooth
    Thanks for the expansions :)

    One thing that I think is close to solving specular edge problems like you mention is a geometry + normal map -> effective roughness conversion, like Valve's advanced VR presentation (I linked that earlier, up the page) and UE4's "Composite Texture" for that matter. No equation solves this problem perfectly right now, but there likely will be one soon. If a correct framework for it can be found and run efficiently enough in realtime, then roughness will naturally increase as the on-screen pixel footprint shrinks, like seeing a chiseled brick building from a distance. The detail converges to look rough:


    I wish I had more time to work on these kinds of problems because I've got scattered pages of content on my own thoughts for how to approach such stuff, but one can only dream!

    On a totally different note, all of the tips you've mentioned would be fantastic applied to highly stylized PBR based art. That's what I really wanna see more of. Breaking the rules and our intuitions in a cohesive way to produce completely non-realistic works, instead using PBR as a template to form their own internal logic.

    Edit:
    Something I actually do to help roughness along (and I apologize in advance for the poor example) is a normal-to-roughness conversion with a simple Substance node I created. This is extremely similar to what Valve and UE4 do, but it's an effect I effectively "bake in" to my materials, and it works wonders. It has strength controls as well as controls for large and small forms, so with a minute of tweaking it makes any material pop and lose any ugly specular edges.
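For readers curious what such a normal-to-roughness bake does under the hood: one published approach is Toksvig's trick, where the averaged (mipmapped) normal gets shorter than unit length in bumpy regions, and that shortening is used to widen the specular lobe. This is a hedged Python sketch of that general idea, not the actual Substance node described above (whose internals aren't shown here):

```python
def toksvig_factor(avg_normal_len, spec_power):
    """Toksvig's correction: averaged normals shorter than 1.0 signal
    normal variance inside the pixel footprint, so the lobe should widen."""
    return avg_normal_len / (avg_normal_len + spec_power * (1.0 - avg_normal_len))

def effective_power(avg_normal_len, spec_power):
    """Blinn-Phong specular power after folding normal-map variance into it."""
    return toksvig_factor(avg_normal_len, spec_power) * spec_power

# A perfectly flat region (|N| = 1.0) keeps its authored power;
# a bumpy region (|N| = 0.9 after averaging) gets effectively much rougher.
flat = effective_power(1.0, 64.0)
bumpy = effective_power(0.9, 64.0)
```

In an authoring tool this reduced power (or the equivalent increased roughness) is what gets baked back into the roughness map, which is why distant or minified detail stops sparkling.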

  • monitorhero
    monitorhero polycounter lvl 6
    Nice example, somedoggy. You really know what you're doing. Unfortunately I don't get the math behind it :smiley: Is there something like a paper for dummies? :smiley:

  • ActionDawg
    ActionDawg greentooth
    Thanks :) I've been writing PBR shaders for a couple years now, found I enjoy it just as much as the art itself.

    What in particular are you looking for? If it's the above, I'll be putting the Substance node up this week once I do some cleanup (but it's not tied to any physical scale so it still requires good art direction).

    As far as rendering papers for dummies... well, that's a little tougher, and something I actually started working on a while back.
  • Daew
    Daew polycounter lvl 9
    Hey, I'm not as experienced as the people posting here, but I can relate a bit. Things can look and sound confusing when learning the game dev pipeline because there is heaps of terminology/jargon. So maybe you could start from the basics, like texturing in Photoshop and using Unreal/Marmoset, to get a grasp of the fundamentals before moving on to Substance. That way you can understand the whole metalness/specular workflow. (Unreal generally only uses the metalness workflow.)
  • leleuxart
    leleuxart polycounter lvl 12
    perna said:
    Brighter specular edges: The in-game edge is so tiny in pixel size that it won't correctly catch highlights. In real life "super-sampling" solves this problem. We approximate it by brightening the edges in the specular map. I've never seen anyone explain this, but plenty of artists have intuitively known that they need to brighten up edges. Trust your eye. Now, of course there are other cases where edges should be brightened, but this is the global case.

    Colored specular: Scattering. That real-life soft red plastic ball looks SUPER colorful when hit by direct light. Coloring the specular map slightly is a hack to emulate some of that effect. Without it, these objects often look plain. If you're lucky enough to have a SSS solution, use that instead. Tip: When making gray plastic materials, CG artists usually go with the strict mathematical definition of gray, identical values in all channels. Painters tend to think in terms of "warm" or "cold" grays. Doing this in CG tends to give more life to your plastics. Few of the things you see as gray are actually gray. This also allows you to use the colored spec trick.

    Nice links, Somedoggy
    Ah okay, so it's primarily from the scattering. I wasn't thinking of that. 

    For fabrics, car paint, etc., I always expect a special shader with the appropriate effects (with their own rules for the textures), but I hadn't considered SSS for some plastics.

    SOMEDOGGY, that Substance node is going to be a lifesaver. I look forward to seeing how you set it up.
  • ActionDawg
    ActionDawg greentooth
    @leleuxart I've just released it! Wanted to bump this thread too since yall inspired me to release the node, and I gave the thread a good mention at the end of the video :)
    https://www.youtube.com/watch?v=kUCDIAupRmQ
    You can grab the link from the video description or from my own thread. Thanks all!
  • monitorhero
    monitorhero polycounter lvl 6
    This is really cool somedoggy. Thank you for sharing and happy to hear this thread inspired you :smile: