
Is there a reason NOT to use a metalness workflow in PBR?

daniellooartist polycounter lvl 11
The purpose of metalness vs. spec/gloss has been made clear to me: use metalness when working on metallic objects, and if your object is a mix of metal and dielectric, flood the dielectric portions black. The dielectric parts always come out just fine in a metalness map. I figured that if I have an all-dielectric material, I'll just make a roughness map plus an all-black metalness map and call it a day, rather than spend the time and effort making a specular map as well.

Sounds like metalness is objectively better, yet people use spec/gloss all the time. Am I missing something?

Replies

  • PolyHertz polycount lvl 666
    Metalness can have issues with artifacts along the transition line between metals and non-metals. Examples in this thread: http://polycount.com/discussion/133033/can-someone-explain-this-metallness-pbr-thing-to-me/p2
  • daniellooartist polycounter lvl 11
    What if you just have a wood texture? Is there any issue with just making a roughness map and a flooded-black metalness? Making a specular map takes time, and if I can make it look just as good without one, then why not?
  • JedTheKrampus polycounter lvl 8
    It might look a little funny in the cracks but 99.9% of people are probably not going to notice or care.

    Two things whose specular you should always tweak in a metalness workflow are human skin (which should be set to 0.4125 in UE4's spec input) and water (which should be set to 0.25). Otherwise, you shouldn't need to worry about it much. Also, if you are going to make a flooded-black metalness, it's usually considered better to set it with a constant value in the UE4 material editor rather than with a texture, but if you're already channel packing it shouldn't be a big deal.
  • daniellooartist polycounter lvl 11
    The thing I'm working on is for UE4. I figured I'd stuff the roughness in the alpha channel and the occlusion/cavity/transparency/emissive in the four channels of another map, while using a constant for the metalness.
  • JedTheKrampus polycounter lvl 8
    That's not a bad strategy. Just keep compression artifacts in mind: for DXT5 maps that are masks, the most important channel should go in the alpha, the second most important in the green, and the least important in the red and blue. So if I were you, I would probably put the transparency in the alpha channel of the base color, and the roughness in the alpha of the masks map. Just my 2 cents.
  • Scruples polycounter lvl 10
    Pearlescent and other materials where the specular color differs from the albedo are a great deal easier to create with spec/gloss.
  • ActionDawg greentooth
    This is not a popular opinion, but I don't believe in putting cavity maps into specular. By manipulating specular you are reducing the refractive index of the surface, which completely breaks the notion of having a physical basis in the first place. Instead, use Unreal's awesome method of adding normal information into the roughness mipmaps. It helps tremendously, since normal information must be converted to micro-normal information once it's lost to mipping. This addresses a good portion of the reason cavity maps are used. The other reason is darkening the surface overall, but that type of darkening, generated by small crevices, should be handled through occlusion. AO is still not perfect, but at least it's a hack with some physical meaning behind it. I feel that manipulating specular this way is artists simply not understanding the system. I have never needed to touch specular in Unreal.

    Specular in UE4 is a limited range of refractive indices (represented as percentages), which makes it impossible to use as if it were a typical specular workflow. It's there if you want very specific control over its limited range of 0% to 8% specular reflectance (an IoR of 1 to ~1.79). Its default of 0.5 corresponds to an IoR of 1.5, which works out to a reflection coefficient of 0.04 (look up Schlick's approximation and the R0 equation for why); in other terms, 4% of light is specularly reflected at normal incidence. That's good enough for everything but metals. So if you care about having that fine a level of control, use the specular input knowing how those values are interpreted, and in a physical sense. But for 99% of the stuff out there the default value is very close. The metallic workflow takes advantage of the fact that metals have no real albedo, and blends between the specular value and using the albedo as a per-RGB-channel reflection coefficient. The downside is the trouble of softly blending between dielectric and metallic on the same material.
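A quick way to sanity-check those numbers (a minimal sketch; the function names are mine for illustration, not engine code, and the 0-to-0.08 remap is the behavior described above):

```python
def f0_from_ior(ior):
    # Schlick's R0: reflectance at normal incidence for a dielectric
    return ((ior - 1.0) / (ior + 1.0)) ** 2

def ue4_spec_to_f0(spec):
    # UE4's Specular input remaps [0, 1] linearly onto [0, 0.08] F0
    return 0.08 * spec

print(round(f0_from_ior(1.5), 4))     # 0.04 -- 4% of light specularly reflected
print(round(ue4_spec_to_f0(0.5), 4))  # 0.04 -- the default of 0.5 matches IoR 1.5
```

So the default spec of 0.5 and an IoR of 1.5 land on the same 4% reflectance, which is why most dielectrics never need the input touched.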

    Oh and I do R - AO G - Roughness B - Metallic for my packed textures. With appropriate AO and well authored albedo/roughness/metallic I have never needed anything else.
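That R/G/B packing is easy to script; a minimal sketch in plain Python (the function name and 8-bit flat-buffer layout are my own illustration, not a particular tool's API):

```python
def pack_masks_rgb8(ao, roughness, metallic):
    # Pack three per-pixel grayscale lists (values 0-255) into one
    # flat RGB8 buffer: R = AO, G = roughness, B = metallic
    assert len(ao) == len(roughness) == len(metallic)
    buf = bytearray()
    for a, r, m in zip(ao, roughness, metallic):
        buf += bytes((a, r, m))
    return bytes(buf)

# Two pixels: occluded rough dielectric, then unoccluded smooth metal
print(list(pack_masks_rgb8([0, 255], [255, 0], [0, 255])))
# [0, 255, 0, 255, 0, 255]
```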
  • gnoop sublime tool
    Could you explain this awesome Unreal method a bit more? Maybe a link to some info, please.

    I am one who uses the specular input all the time to erase excessive highlights from cracks, surface pores, small-detail shadows baked into the surface, etc., and then has to compensate by decreasing overall reflectivity somehow, fighting with the PBR stuff. Especially when the normal map doesn't have enough resolution to model surface bumps accurately, i.e. a crack is 1px wide and has no room in the normal map.


  • ActionDawg greentooth
    Sure. Physically based rendering in Unreal is implemented via a standard microfacet BRDF. The BRDF is the bit that takes in all the parameters and spits back out a reflection for each surface, where roughness is interpreted as a statistical heightfield of tiny facets on the surface of a model within a pixel. These facets go from noisy like mountains and valleys (white roughness) to extremely flat and smooth (black). Again, this is statistical: it distributes the incoming light for each sample into an outgoing reflection that statistically matches what a simulation of a microfacet surface would produce.

    Now this means that for normal map information we have to make a slight modification to maintain perceptually correct roughness when objects are at different distances. When we move further away from an object, our sample size (the area of surface within one pixel) increases.

    The textures begin to mip, and in this process information is lost as resolution decreases. This generally isn't an issue for albedo textures and the like, but it's a massive problem for normals. Because light is bent in various directions for each pixel of the normal map, this has a similar effect to how microfacets cause reflections to appear rough. With very detailed normal maps an artist can make a surface look nice and rough up close, but as the camera zooms out in engine and all those perturbed normals are lost, the object appears extremely glossy! To counteract this, a precalculation transforms the lost normal detail into a microfacet roughness representation and adds that increasing roughness to each of the object's roughness mip levels. Valve has a great presentation on this problem, which also presents a hacky solution for converting geometric normals to roughness.
    PDF slideshow:
    http://media.steampowered.com/apps/valve/2015/Alex_Vlachos_Advanced_VR_Rendering_GDC2015.pdf
    Video:
    http://www.gdcvault.com/play/1021771/Advanced-VR

    In UE4 it's accomplished for normal maps through Composite Texture:
    https://docs.unrealengine.com/latest/INT/Engine/Content/Types/Textures/Composite/index.html
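The core of that precalculation can be sketched as a Toksvig-style adjustment (a simplified stand-in for what Composite Texture does internally, not Epic's actual code): averaging the unit normals that fall into one mip texel shortens the average vector, and that shortening measures how much normal detail was lost, which then gets folded into the roughness mip.

```python
import math

def normal_variance(normals):
    # normals: list of (x, y, z) unit normals covered by one mip texel.
    # Their average shortens as they disagree; the shortening is a
    # Toksvig-style estimate of the normal variance lost to mipping.
    n = len(normals)
    ax = sum(v[0] for v in normals) / n
    ay = sum(v[1] for v in normals) / n
    az = sum(v[2] for v in normals) / n
    length = math.sqrt(ax * ax + ay * ay + az * az)
    return (1.0 - length) / length

# A flat texel loses nothing; a bumpy one gains roughness in the mip
flat = [(0, 0, 1)] * 4
bumpy = [(0.6, 0, 0.8), (-0.6, 0, 0.8), (0, 0.6, 0.8), (0, -0.6, 0.8)]
print(normal_variance(flat))            # 0.0
print(round(normal_variance(bumpy), 3))  # 0.25
```

The engine then combines an estimate like this with the authored roughness at each mip level, so a distant surface stays perceptually as rough as it looks up close.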

    A fun anecdote: I learned about this as a kid visiting the Biltmore House (iirc one of the two biggest houses in the US; it's a tourist attraction now), where they told us the builders hand-chipped all the bricks specifically to create a uniformly rough look for the building at a distance, to prevent any glare. Who would have figured that'd be a useful bit of information later on!

    Edit: Oh yeah. As an inversion of this thought process, your problem of having a 1px crack you can't represent with normals is easy to solve. That's where the highest resolution of your normal map has reached its limits, so you represent the crack via increased roughness :)
  • daniellooartist polycounter lvl 11
    Wow, I learned a lot just by reading this. Thank you so much!
  • oblomov polycounter lvl 8
    The principle of taking the variance of normals into account when computing the mipmaps of the "roughness" texture (or glossiness, or smoothness, or whatever name you want to use for the texture that encodes the variance of the microfacet distribution) is a bit older than Unreal 4, though. The oldest reference I know is LEAN mapping (http://www.csee.umbc.edu/~olano/papers/lean/). It dates from 2010, was developed at Firaxis, and was first used in Civ5.
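The trick in LEAN mapping is that the first and second moments of the surface slopes average linearly, so ordinary mipmapping preserves them and the variance falls out at any mip level. A toy one-dimensional sketch of that idea (my own simplification of the paper's 2D formulation):

```python
def mip_average(values):
    # Mipmapping is just a linear average over texels
    return sum(values) / len(values)

def variance_at_mip(slopes):
    # Store first moment B = slope and second moment M = slope^2 per
    # texel; both mip linearly, and the slope variance is recovered
    # as var = E[slope^2] - E[slope]^2 at any mip level.
    B = [s for s in slopes]
    M = [s * s for s in slopes]
    return mip_average(M) - mip_average(B) ** 2

print(variance_at_mip([0.5, 0.5, 0.5, 0.5]))    # 0.0 -- flat, no added roughness
print(variance_at_mip([0.5, -0.5, 0.5, -0.5]))  # 0.25 -- bumpy texels roughen the mip
```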
  • gnoop sublime tool
    Thanks for the detailed explanation, Somedoggy.

    Unfortunately, even at 100% rough, those tiny cracks with no representation in the normal map still reflect a substantial amount of highlight, creating a kind of "laminate cover" feeling and dissolving right in the next mip. Plus, too strong a contrast between polished and rough pixels creates the same kind of artifact (halos) as between metals and non-metals. In practice I really can't use contrasty black-and-white values in the metallic channel, or in the roughness one either; it's always a compromise to reach better overall crispness.
    That makes the whole metallic approach kind of tricky and more time consuming than the old spec level/gloss one, imo. So I totally understand why people still use it.


  • ActionDawg greentooth
    oblomov said:
    The principle of taking the variance of normals into account when computing the mipmaps of the "roughness" texture (or glossiness, or smoothness, or whatever name you want to use for the texture that encodes the variance of the microfacet distribution) is a bit older than Unreal 4, though. The oldest reference I know is LEAN mapping (http://www.csee.umbc.edu/~olano/papers/lean/). It dates from 2010, was developed at Firaxis, and was first used in Civ5.
    Yeah LEAN mapping was the first I know of too, thanks for linking to that!
    gnoop said:
    Thanks for the detailed explanation, Somedoggy.

    Unfortunately, even at 100% rough, those tiny cracks with no representation in the normal map still reflect a substantial amount of highlight, creating a kind of "laminate cover" feeling and dissolving right in the next mip. Plus, too strong a contrast between polished and rough pixels creates the same kind of artifact (halos) as between metals and non-metals. In practice I really can't use contrasty black-and-white values in the metallic channel, or in the roughness one either; it's always a compromise to reach better overall crispness.
    That makes the whole metallic approach kind of tricky and more time consuming than the old spec level/gloss one, imo. So I totally understand why people still use it.
    So this is an interesting problem that I aimed to touch on briefly with the suggestion to use the ambient occlusion slot. The reality is that normal maps are by nature not energy conserving. Where a high-poly model may shadow or mask light, the normal map has no information for capturing this. By darkening the specular, however, you are occluding direct light that could just as easily be reflected from within a crevice, as well as darkening indirect light, which kind of makes sense but carries no sense of the actual material occlusion. I try to model with these things in mind, using geometry to aid this type of energy conservation. One solution would be to capture a correlated position-normal dataset that you could enforce energy conservation across, but this is beyond what's currently feasible, and in the future tessellation will solve this problem anyway.

    Could you share the halo artifacts you're talking about between rough and polished pixels? I haven't had this problem. As for an IoR-based specular, it is a better method for controlling reflections, but a last-gen spec/gloss workflow isn't going to be less time consuming than spec/metallic PBR setups; I pump out textures about twice as fast now, haha. And not to say that this is what you're doing, but the modern pseudo-PBR systems are a huge improvement in every regard. If people avoid embracing modern rendering systems because they can't get good results, they are only hurting themselves.
  • gnoop sublime tool

    Base color: black; metallic: black; roughness: dark rectangles on a white background. It's not as strong as the metal halos, but it still makes noisy/grainy surfaces a bit less crisp compared with old spec-level shaders, where highlight intensity was expressed directly by pixel values.

    Btw, the picture also shows how much highlight a 100% black, 100% rough surface still reflects, while in reality it would most often be filled with micro-shadowing from small bumps that a normal map often can't represent correctly.
  • ActionDawg greentooth
    I see what you mean. You don't see real contiguous objects that look like your example, though. If you want a continuous mesh, you could easily mitigate it through UV seams.

    The shadowing-masking term of the BRDF is the main thing controlling energy conservation, and roughness is a principal component of it. Because the normal map feeds into roughness as the mipmaps blend, this method better approximates the energy conservation that would happen IRL. The other thing that should get modeled is occlusion, which, yeah, the normal map can't represent. But by prebaking that AO information and applying it to the indirect lighting function, you deal with the shadowing that would happen at such small scales in a more physically appropriate way. You also allow direct lighting to work as it's supposed to, by not arbitrarily adjusting a refractive index value.
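The distinction being argued for can be made concrete: apply the baked AO to the indirect term only, rather than darkening specular across the board. A minimal sketch (the function and values are illustrative, not Unreal's shading code):

```python
def shade(direct, indirect, ao):
    # Baked AO approximates how much ambient/indirect light a crevice
    # can receive; direct light is handled by shadow maps, so it is
    # left untouched rather than dimmed via the specular input.
    return direct + ao * indirect

# A crevice texel: indirect light is half occluded, direct is unchanged
print(shade(direct=0.5, indirect=0.5, ao=0.5))  # 0.75
```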

    Realtime has a lot of shortcomings at the moment, particularly multiple scattering, which is the current main source of energy loss. But if composite textures and occlusion are leveraged appropriately, I've found no need for that kind of tweaking.