the shader system doesn't need to be complex at all... unless i'm reading your comment wrong?
the way i see it there are two ways of handling this kind of material system:
1. the art lead sits down and defines the look of the game by making a batch of say, 100 different material presets. the art team then composites those material presets using masks. then, depending on the engine, these are either baked down to a final map (The Order: 1886) or they remain composited in realtime (UE4).
2. the artists control the various aspects of their material through maps. from what i'd guess, they would need:
Albedo - coloured for dielectrics, black for metallics.
Base reflectivity - very dark (around 0.04?) for dielectrics, measured colour samples used for the various metals.
Roughness - similar in concept to the gloss we currently use, but less confusing; no sliders etc.
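as a rough sketch of how those three maps usually combine at shading time (python-ish pseudocode, not any specific engine's shader - the 0.04 dielectric constant is just the common convention):

```python
def base_f0(albedo, metalness, dielectric_f0=0.04):
    # dielectrics share a near-constant base reflectivity (~0.04 linear),
    # metals take theirs from the measured colour samples
    return tuple(dielectric_f0 * (1.0 - metalness) + a * metalness
                 for a in albedo)

print(base_f0((0.5, 0.2, 0.1), 0.0))     # dielectric: (0.04, 0.04, 0.04)
print(base_f0((0.95, 0.64, 0.54), 1.0))  # metal: the measured colour itself
```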
Yeah, a big problem is just lack of proper documentation. We (Marmoset) don't have any documentation specifically covering PBR or anything like that.
Diffuse / albedo / color maps are really all the same thing; how much (if any) lighting you have prebaked into the texture is really an art content issue more so than something specific to the shaders in an app like Toolbag or any game engine. Personally I haven't baked directional lighting into my diffuse textures in years. Toolbag's shaders look much better if you do not bake in any lighting (other than AO), otherwise you get that "painted on abs" look.
AO in a separate texture is something that TB1 does not support but TB2 likely will (as you can do some more advanced stuff with it, like masking it out where an object receives direct light). There's an AO slot in TB1 but it's for a 2nd-UV-channel thing that barely works.
A substance/surface/specular/reflectivity map - again, all just different names for the same thing. It's a map that sets the base reflectivity of the material (some shaders do away with it entirely in favor of a gloss map and probably a global reflectivity value). What this texture looks like is again an art content question; if you're creating it logically, where more reflective areas of the texture are brighter, that's going to work in TB1, TB2, and any PBR system that has a base reflectivity map.
Roughness is just another word for a gloss map, and whether black = rough or black = glossy is just an implementation detail. The big deal with gloss maps and PBR is that rougher areas sample blurrier/dimmer mips of your IBL cube maps (and your dynamic light's highlight likewise gets dimmer as it gets blurrier) because of energy conservation, which is how TB1 works as well.
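A rough sketch of the cube map side of that, if it helps (hypothetical names, not Toolbag's actual code): the roughness value just picks a pre-blurred mip, and an energy-conserving normalization ties highlight tightness to brightness for the analytic light.

```python
import math

def roughness_to_mip(roughness, mip_count=8):
    # rougher surfaces fetch blurrier (higher) pre-convolved mip levels
    # of the IBL cube map: 0 = mirror-sharp, mip_count - 1 = fully rough
    return roughness * (mip_count - 1)

def phong_norm(spec_power):
    # a common energy-conserving normalization (normalized Phong):
    # tighter highlights (higher power) get a brighter peak so total
    # reflected energy stays roughly constant
    return (spec_power + 2.0) / (2.0 * math.pi)

print(roughness_to_mip(0.0))  # 0.0 - sharpest mip
print(roughness_to_mip(1.0))  # 7.0 - blurriest mip
```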
I hope that helps clear it up a little.
Sorry, TB2 is still work in progress so I can't really tell you exactly what is different (as it may change before release). But basically some of the rendering stuff is more sophisticated / based on more accurate shading models etc.
One thing I can say is that TB2 has screen-space reflections, which rules and means that instead of putting AO into your spec map to mask strange reflections in occluded areas, you get the proper content reflected in those bits, and it looks really good.
Awesome, thanks for clearing that up for me. I always thought IBL was just part of the PBR and the other part was super precise textures. I've been reading up on PBR since the SF GDC, but your explanation helped me wrap my head around some of the inconsistencies.
I don't think you "need" it, but I do agree it makes things a lot more straightforward and easier to work with. Working on every texture channel separately is such an abstract concept, and a big reason why so many (new) artists struggle with texturing.
to get the extra mile and specular break up and range... yeah. but we can agree to disagree.
it's definitely exciting seeing studios moving towards this solution. (brdf, pbr, ibl, etc)
So I have been reading up on this stuff, but I am still a little confused about the diffuse/albedo map. I understand why you don't bake any lighting into the map and only use a little bit of the AO. I am confused about why the colors and values in the map are so important, and how they actually affect the rendering.
This is the image I am confused about.
Confusion seems to be rife ..
The only new thing is the specular reflectance / substance map. It helps to think of it as a substance map because it defines the material: it describes how reflective the surface is at a given view angle, and it also affects the colour of the specular highlights.
You have to use values as defined in the various reference images because they're inputs to some cunning maths - it's not a simple value as per your standard Phong spec.
The system makes things more straightforward in the long run. If everyone does it properly you're not stuck balancing specular values for weeks at the end of a project, and you don't have to dick about bending values to fit your lighting. Also, there are fewer parameters to mess with (break) in your shaders.
The reason it's good to have a guide for colors and brightness values is you want to remove as much "eyeballing" of values as possible. What may look "correct" to you in a scene where you eyeballed the lighting and materials may result in too bright or (more often) too dark of a material. Imagine you're working in a scene that is overexposed (lighting is very bright) and you tweak your white cement to actually be very dark or grey in the material. Once you put it in a scene with correct lighting and exposure, it'll no longer be white cement, it'll look too dark.
One recommendation I have: while tweaking materials, always have a copy of the mesh you're working on with neutral gray applied to it; that'll help you make sure your exposure is correct while tweaking the working material. There was one time I was tweaking something really complex like skin, nailed the look, then put it beside a finished/calibrated material and found out it was reflecting too much light. Having a few reference materials helps as well - like when you're working on assets, bring in one you know is finished and correct and make sure they look "right" next to each other.
ya, because of this i tend to work on normals and lighting first, and flick back and forth between previewing my scene textured and with just flat grey.
the detail lighting mode in the UE3 viewport is very useful for this
It's good to set up a benchmark scene with a Macbeth color chart inside. Of course it's still eyeballed, but if the colors on the Macbeth chart look OK in the scene, we can assume the lighting is as neutral as we can get it.
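the check can even be semi-automated - e.g. (a toy sketch; the 18% value is the standard mid-gray patch linear reflectance):

```python
def exposure_ratio(rendered_gray, reference_gray=0.18):
    # compare a rendered mid-gray Macbeth patch against its known linear
    # reflectance (~18%); 1.0 means the lighting/exposure is neutral
    return rendered_gray / reference_gray

print(exposure_ratio(0.18))  # 1.0 -> neutral
print(exposure_ratio(0.36))  # 2.0 -> roughly one stop too bright
```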
The whole thing was one material with shading models and properties separated by a greyscale mask. Since I only needed 5 or so material definitions (metal, skin, fabric, leather, lycra) I was able to use a single channel for the material masks with a similar technique to gradient mapping.
The lycra bodysuit used a Minnaert shading model. The metal portions of the suit all reflected a cube map and non-linearly skewed the colors in the environment map toward the color in their color map.
Don't remember how I did the skin, but there are better models out there. Mine was always a bit waxy.
Still though this wasn't a bad start. Would have been better spun off into a set of material functions but hey.
Should be able to do something similar to the UE4 layered materials with material functions and some masking. It may not run as efficiently (since it's all custom-brewed rather than built right in). The tricky bit is just how you want artists to make their masks.
believable metal shaders often require some kind of variance in both the roughness and the reflection
that's true - glossiness or roughness variation, whichever you want to call it, does a lot in terms of helping with the illusion. renderers like mental ray call it glossiness and others like Arnold call it roughness, which is the inverse, but the same thing.
for roughness maps I like to create pure black and white maps marking the more and less reflective areas of the object, and then tweak the values in the shader with a scalar remap range. Since the map already comes with 0 and 1 values (black and white), it's easier to tweak reflection scale and roughness accurately than it is to plug a map with grey values straight into the reflection scale and roughness and let it play by itself... I've yet to see more than 5% of people doing this right. Most people create a reflection and glossiness map out of the blue, and because it's black and white they think it'll work fine... wrong. Lookdev from the shader point of view needs to be taken into consideration prior to any texture painting, at least for reflection and surface values.
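the remap described above is basically one lerp in the shader (a sketch, names made up):

```python
def remap_roughness(mask, rough_min, rough_max):
    # the painted map only marks less-reflective (0) vs more-reflective (1)
    # areas; the actual roughness range lives in two tweakable scalars,
    # so you can re-balance the shader without repainting the texture
    return rough_min + (rough_max - rough_min) * mask

# the same generic black/white mask reused with different ranges
print(remap_roughness(1.0, 0.25, 0.75))  # 0.75
print(remap_roughness(0.5, 0.25, 0.75))  # 0.5
```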
Also, good CGI metals need complex IOR for the different kinds of metal; Arnold and mental ray give access to this for more accurate fresnel, not sure about realtime though... also, material blending does a lot for layered materials with different properties.
thing is, with all this new PBR stuff, roughness has to be clamped between 0 - 1. it's not like gloss where you could interpolate your 0-1 texture to be between any two values you like.
take Marmoset Toolbag as an example, you have a glossiness slider in there, they recommend that you have it set to maximum to allow "the full range of glossiness". the slider caps out at 256, but in reality you could actually make that value higher and have an even greater degree of control over the glossiness of an object.
that doesn't (or shouldn't) exist with a PBR setup, when you start pushing the roughness value past 0-1 you start getting horrendous artifacts.
so while i agree that in current realtime systems, sure you should look at the material setup and tweak the gloss as needed... that just simply isn't the case going forward with PBR.
as for fresnel values, that's actually very easy to do. in the case of UE4 they use the following:
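the actual snippet didn't survive the quote, but UE4's published course notes use Schlick's fresnel approximation, which looks roughly like this (my sketch, variable names mine):

```python
def fresnel_schlick(f0, v_dot_h):
    # Schlick's approximation: reflectivity rises from the base value F0
    # at normal incidence toward 1.0 at grazing angles
    return f0 + (1.0 - f0) * (1.0 - v_dot_h) ** 5

print(fresnel_schlick(0.04, 1.0))  # 0.04 -> facing the surface head-on
print(fresnel_schlick(0.04, 0.0))  # 1.0  -> grazing angle
```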
Yeah. This is where the terms specular power (gloss) and roughness differ. If you need to, you can convert from a classic specular power to a roughness value with some simple math.
// Roughness parameter to a Blinn-Phong specular power
float RoughnessToSpecPower(float m)
{
return 2.0f / (m * m) - 2.0f;
}
// Blinn-Phong specular power to a roughness parameter
float SpecPowerToRoughness(float p)
{
return sqrt(2.0f / (p + 2.0f));
}
// Another quick way to map a roughness value to a specular power ( 1 - 8192 )
float power = exp2(13.0 * roughness );
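a quick sanity check (the same math ported to python) that those two conversions really invert each other:

```python
import math

def roughness_to_spec_power(m):
    # 2 / m^2 - 2, as in the shader snippet above
    return 2.0 / (m * m) - 2.0

def spec_power_to_roughness(p):
    # the inverse: sqrt(2 / (p + 2))
    return math.sqrt(2.0 / (p + 2.0))

for m in (0.1, 0.5, 1.0):
    p = roughness_to_spec_power(m)
    assert abs(spec_power_to_roughness(p) - m) < 1e-9

# the exp2 shortcut spans spec power 1..8192 as the input goes 0..1
print(2.0 ** (13.0 * 0.0))  # 1.0
print(2.0 ** (13.0 * 1.0))  # 8192.0
```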
I suppose you're referring to the method posted by maze?
The way I understood it was that you'd use a mask, which in turn would be used for lerping the min and max roughness values, which could be tweaked per material, allowing you to use the same generic roughness textures for multiple materials.
At least that's what I had in my older PBR shader, and I'll most likely incorporate that into my new one as it's very flexible and with the right kind of generic textures it could save a ton of memory.
I know this hasn't been posted in a while, and I haven't seen anyone talk about it enough to adopt it, but is there anything on the workflow of creating textures for PBR yet? I'm not talking about UE4, but about the other engines and Marmoset, where we have to focus more on diffuse photographs, remove lighting info, and use the proper colors. I think one of the charts by S
lele: As an artist, not much has changed with PBR. Making a texture correctly the classic way and doing it correctly with "PBR" is pretty much identical.
You won't have to manually emulate energy conservation anymore, as that should be done in the shader, and you might be using a mask instead of colored specular, but these are things that your employer or client will inform you about anyway, as everyone has their own implementation of PBR - it's a very loose standard.
Yeah, I was just curious about the Diffuse color mostly. I know more of the specular work will be done with a Physically Based engine, including the Fresnel/reflectance and energy conservation, but those color/specular charts throw me off. I was just wondering if anyone has had to reference them or determine a more accurate Albedo color of a material.
If anyone has any PBR texture experience they can throw in, feel free.
It depends on the math. I think the Lambert diffuse in UE4 is diffuse/pi, so the values will be darker in the UE4 PBR shader than they would be in another shader where you don't divide by anything (or divide by a different number).
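for example, with the same albedo value (a toy sketch of the difference, not UE4's actual code - whether the light intensity is scaled up to compensate is an engine choice):

```python
import math

def lambert_unnormalized(albedo, n_dot_l):
    # the "classic" diffuse term many older shaders used
    return albedo * max(n_dot_l, 0.0)

def lambert_normalized(albedo, n_dot_l):
    # energy-conserving Lambert BRDF divides the albedo by pi
    return albedo / math.pi * max(n_dot_l, 0.0)

# identical texture value reads ~3.14x dimmer under the normalized version
print(lambert_unnormalized(0.5, 1.0))          # 0.5
print(round(lambert_normalized(0.5, 1.0), 3))  # 0.159
```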
The technical artists in the project you're working on will most likely tell you what kind of values you need to use.
A shader will never be able to emulate all the physical effects that happen in real-life, so the artist's eye will always be necessary, and even if such a shader did exist, looking correct isn't the same as looking good.
i think he could have phrased his question better. i spoke to him on Skype last night, and although i could think of a couple of solutions to his problem i couldn't figure out which would be best, so i suggested asking a tech artist.
What he actually needs to know is this: since image based lighting is a fairly integral part of a physically based setup, how do you handle a situation where your player character might move into a room where the sky outside (and therefore the sky on the cubemap) shouldn't be seen, but then a hole gets blown in the wall and the sky can now come in? What do you do to make sure the IBL is "correct" for both instances?
It's a difficult problem, and pretty much any traditional method of lighting would have some sort of issues. If you're using lightmaps you'll have major issues too.
Cube map probes that update in semi-real time would probably be the solution here. They wouldn't need to update every frame or anything, but it would still probably be fairly expensive.
A realtime radiosity or occlusion system instead of IBL probes for per-area light would also probably work, e.g. Geomerics' Enlighten.
Or of course, restricting damageable items so you can't bring down an entire room/building.
Well, any offline-generated/static part of the lighting solution would have problems - cubemaps, lightmaps, and precomputed shadowmaps being the top of the list. If you're talking about scripted destruction sequences, different solutions are available depending on your engine tech; e.g. scripting an A/B swap for your cubemaps would fix that one part of the problem. If you're talking about systemic/procedural destruction then you're likely going to have to ditch almost all of the static/precomputed parts of the lighting solution entirely, I'd think.
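for the scripted A/B swap, the runtime side can be as simple as crossfading the two probe samples over the destruction event (a hypothetical sketch, names made up):

```python
def blend_probes(intact, destroyed, blend):
    # crossfade per-channel between the "walls intact" probe and the
    # pre-captured "hole in the wall" probe; drive blend 0 -> 1 over
    # the scripted destruction so the IBL never visibly pops
    return tuple(a * (1.0 - blend) + b * blend
                 for a, b in zip(intact, destroyed))

# dim interior probe vs bright sky probe, halfway through the event
print(blend_probes((0.0, 0.0, 0.0), (1.0, 1.0, 1.0), 0.5))  # (0.5, 0.5, 0.5)
```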
pbl sounds great ... but where do these "reflections" come from !?
and now you guys tell me it's all pre-calculated cubemaps? that's so... *LASTGEN* i am disgusted
Well, there are other techniques in play as well, like spherical harmonics for realtime reflections, but just about any realtime reflective technique is also very expensive.
The only viable solution for this issue is some kind of volume-based lighting (like the one presented by NVIDIA - GI Works, or the now-scrapped voxel lighting in UE4).
Image based lighting is really nothing more than a cheap addition to existing GI solutions (which mainly handle diffuse, while IBL adds specular and good metallic reflections as well as extending the existing diffuse).
But it is still a static solution.
If you want to have correct lighting in a dynamic environment you must have a dynamic radiosity solution implemented; I don't really see a way around it.
For now we deal with it by cheating: applying a global environmental HDR map to everything, then placing dynamic objects (for example destructible ones) that are lit by the HDR map as well as by global illumination information from probes around the level. But dynamic objects on their own do not contribute to GI.
geomeric's enlighten.
Which is not dynamic, just sayin'. It's a precalculated solution that works only for static geometry. Unfortunately there is no publicly available information on how exactly this precomputation works.
Nevertheless, I think Enlighten is at this point in time the best available GI solution. It is super fast, super high quality, allows a very high number of lights to contribute to GI, and deals with AO, transmission lighting and subsurface. It is just mind blowing.
But it works only for static geometry. Which, honestly, is fine in about 90% of games.
Replies
especially like the part about "this week"
We are already talking about it here.
it's out.
Not according to the graphics programmers that I know, and various published articles that refer to it as PBR.
Nitpick... Your beard is too short. Do you even wrestle bears?
Their servers are getting destroyed. I can't get it to download.
but just in case, sulkyrobot has posted a mirror.
http://www.polycount.com/forum/showpost.php?p=1905093&postcount=22
hmm that sucks, they made it seem like that would be in this update.
HOOOOLY SHIT!
Opens up whole new vistas of creativity for artists.
Angel Turnaround - more progress: http://www.youtube.com/watch?v=tXVAbCofgCU&list=UUECFyRilO6YBueIcP0_u2pQ
Angel Turnaround - boots/metal: http://www.youtube.com/watch?v=owEkIZOZU1s&list=UUECFyRilO6YBueIcP0_u2pQ
Masking methodology, for the curious.
have just released it in my shader =]
runs at 30-40 fps @ 1024x768 on a GTX 485M, depending on how many of the 3 shadow-casting objects I turn on.
Did this last week for everyone interested in PBR:
Short Version:
Physically Based Rendering for Artists - Recap: http://www.youtube.com/watch?v=LpLBzV9uG0Y
Longer Version:
Physically Based Rendering for Artists: http://www.youtube.com/watch?v=LNwMJeWFr0U
If anything's not clear in there, feel free to ask!
Cheers
(re the UE4 fresnel math mentioned earlier: all you would need to do is change "ndoth" to a reflectance value, and you're sorted.)
This, this and this. It's great to hear similar war stories from different people
Really looking forward to seeing what everyone here will make of all this pbr-i-ness, personally i'm very excited.
Quoted for agreement.
So after teasing us with that WIP youtube video for a good while now - where is it?
PBR Theory
PBR and you can too!
yes, where?
erm, what?