Awesome - the skin scattering alone makes me all happy inside. Must have been great to have such tech running while working on the assets. Great stuff!!
Interesting, but I can't help thinking that Allegorithmic Substance does all that compositing + more. Any reason you decided to go for an in-house solution?
The base materials that propagate onto your objects are a lot more advanced than simple d/s/n/g materials.
That's what I was thinking, but they're not explaining any of it. Care to elaborate a bit more?
Does it multiply additional values into your maps? Does it create additional (hidden) composite value maps other than the actual texture composites?
It's hard to tell what's happening exactly with so little info.
This is brilliant, and very generous to share while the game is currently in development. Kudos to you and the rest of the team! I can honestly say this game, even though no gameplay (to my knowledge) has been shown yet, is pretty much the reason why I'm considering buying a PS4. Keep up the good work!
There's a 4-channel composite map in the slides that holds occlusion, anisotropy, BRDF, and roughness values.
Additionally, by tying everything into our own database system, we are able to make global, sweeping changes and have them propagate through inheritance. I'm not personally familiar with Substance Designer's workflow, and maybe this can be automated through their workflow, but being able to tweak a property of base gold and have it propagate to 5, 50, or 500 assets and maps through a data rebuild automatically is a huge advantage for ensuring global consistency.
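The "tweak base gold and have it propagate" idea can be sketched with a toy inheritance model. This is a minimal illustration, not Ready At Dawn's actual database: the class, property names, and materials below are all hypothetical.

```python
# Toy sketch of inheritance-driven material propagation: children reference
# a parent material and only store their own overrides, so a "data rebuild"
# is just re-resolving every material against the current parent values.

class Material:
    def __init__(self, name, parent=None, **overrides):
        self.name = name
        self.parent = parent
        self.overrides = overrides

    def resolve(self):
        """Walk up the parent chain; child values override inherited ones."""
        props = self.parent.resolve() if self.parent else {}
        props.update(self.overrides)
        return props

base_gold = Material("base_gold", roughness=0.25, metalness=1.0)
# Any number of assets can reference the base material...
ring = Material("gold_ring", parent=base_gold, roughness=0.15)
idol = Material("gold_idol", parent=base_gold)

base_gold.overrides["roughness"] = 0.4      # one tweak at the root...
assert idol.resolve()["roughness"] == 0.4   # ...reaches every child on rebuild,
assert ring.resolve()["roughness"] == 0.15  # unless the child overrides it
```

The point of the sketch is that nothing is copied at authoring time; consistency comes from resolving against shared parents at build time.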
Snefer, the compositing is an offline process that bakes and packs data down to a handful of maps - the slides show the breakdown of each channel of the 4 maps - D/N/S and then a special 4-channel map with a bunch of material values in it. You can have an arbitrarily layered compositing material (such as the well pump in the slides) and it bakes down to the same number of maps.
With a run-time layered environment material, obviously, you can be blending between several different materials and pay the cost for each, but that's no different than any other blend material. The big win for our run-time layered materials compared to the sorts of vertex blend materials you might make in Unreal is that our system is built off our database with inheritance, so we could (and do) have a universal water, universal mortar, and 30 different brick materials, and wind up with 30 different brick+mortar+water materials. If we then decide we want the water to be brown, or the mortar to be smoother, we can make that change in the universal material, and have it propagate to all 30 brick+water+mortar materials.
I believe the material compositing and layering isn't really more or less texture dependent than a similarly robust end-result (ie, if you wanted the various rendering features, such as map-controlled roughness, BRDF, etc), the win is in the fact that it all retains a modular, component-driven system wherein the elements can be edited in isolation and recombined by the software.
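The "bakes down to the same number of maps" packing can be illustrated with a small sketch: four grayscale parameter maps interleaved into one 4-channel array so the shader needs a single fetch. The channel order here is purely illustrative, not the actual layout from the course.

```python
# Sketch of packing four scalar parameter maps (occlusion, anisotropy,
# BRDF id, roughness) into one RGBA-style 4-channel map at bake time.
import numpy as np

def pack_composite(occlusion, anisotropy, brdf_id, roughness):
    """Stack four HxW float maps into one HxWx4 map."""
    return np.stack([occlusion, anisotropy, brdf_id, roughness], axis=-1)

h, w = 4, 4
occ   = np.full((h, w), 0.9)
aniso = np.zeros((h, w))
brdf  = np.full((h, w), 0.5)   # e.g. a BRDF index remapped to [0, 1]
rough = np.full((h, w), 0.3)

packed = pack_composite(occ, aniso, brdf, rough)
assert packed.shape == (4, 4, 4)
assert packed[0, 0, 3] == 0.3  # roughness landed in the fourth channel
```

However many layers the source material has, the bake always produces the same fixed set of output channels, which is what keeps the run-time cost flat.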
but being able to tweak a property of base gold and have it propagate to 5, 50, or 500 assets and maps through a data rebuild automatically is a huge advantage for ensuring global consistency.
This is indeed the workflow Substance Designer allows you to follow. We are going through pretty much the exact same process with some developers.
Kudos for recreating it in-house though
how is mari not crashing with all those material types!?
this is our own material editor. mari is used separately to create the maps.
With a run-time layered environment material, obviously, you can be blending between several different materials and pay the cost for each, but that's no different than any other blend material.
not necessarily, the smasher will still crush it to one single material, albeit a more expensive one. but still one single draw call.
So what do I need to know in order to be able to read all that math? I keep seeing that type of stuff in PDFs about rendering, and I never know wtf it means. How do I make sense of it?
mostly vector math, linear algebra, and some working knowledge of a shader or programming language.
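As a concrete starting point, much of the notation in those PDFs boils down to operations like the dot product. Here is the Lambert N·L term worked out with plain vectors; this is a self-contained illustration, not an excerpt from the course.

```python
# The classic diffuse term max(0, N.L): "how directly does the light
# hit the surface?" - one of the most common expressions in shading math.
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

n = normalize((0.0, 1.0, 0.0))   # surface normal, pointing straight up
l = normalize((0.0, 1.0, 1.0))   # direction toward the light, 45 deg off
diffuse = max(0.0, dot(n, l))    # the N.L seen in shading equations
# diffuse == cos(45 deg), about 0.707
```

Once expressions like this read naturally, the bigger BRDF formulas are mostly the same ingredients (dots, normalizations, a few trig identities) composed together.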
Very interesting! Great result! I saw the mention of using vertex colours to define where the four textures go, so there's only ever 4 variations on each object? And is that done in-engine, or in Mari or 3ds Max or something? Eventually all of this stuff done with materials on that pump is baked down to diffuse, normal, specular and reflection mask before the game plays?
So RAD guys, I'm curious as to how much of your composited layering shader setup you have directly integrated into Mari (obviously being able to see realtime results while painting masks for the material templates is better than not). Or are the masks painted 'blind' in Mari just as greyscale on the geo and then used to composite inside your proprietary material compositing tool?
If a lot of it has been integrated inside Mari, how difficult did you find that process (I understand Mari is supposed to be quite easy to implement new shader models etc) ?
The workflow might be similar, but I believe the end result is completely different. Substance Designer essentially generates textures that you then have to plug into your own material.
Material = Shader (for most cases anyway).
The material system they are using is not generating static textures to be used in materials (shaders); the material itself generates the texture.
I think this approach resembles much more what is done in offline rendering, where you composite different, often very simple textures (often simple masks whose sole purpose is to say where which colors and other properties go) to get the final effect.
The compositing/procedural generation of final images inside shaders will be used more often as we move towards PBR, as good lighting will mask most of the tiling issues.
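The mask-driven compositing described above can be sketched as a per-texel lerp between layers: simple masks decide where each base material's properties land in the final map. The material names and layer order below are made up for illustration.

```python
# Offline-rendering-style layer compositing: start from a base property map
# and blend each layer on top wherever its grayscale mask is non-zero.
import numpy as np

def composite(base, layers):
    """base: HxWx3 map; layers: list of (property_map, HxW mask), bottom-to-top."""
    out = base.copy()
    for prop, mask in layers:
        m = mask[..., None]                  # broadcast mask over channels
        out = out * (1.0 - m) + prop * m     # standard lerp / "over" blend
    return out

h, w = 2, 2
brick  = np.tile([0.6, 0.3, 0.2], (h, w, 1))   # base color map
mortar = np.tile([0.7, 0.7, 0.7], (h, w, 1))
mask   = np.zeros((h, w))
mask[0, 0] = 1.0                               # mortar covers one texel only

final = composite(brick, [(mortar, mask)])
assert np.allclose(final[0, 0], [0.7, 0.7, 0.7])  # masked texel shows mortar
assert np.allclose(final[1, 1], [0.6, 0.3, 0.2])  # the rest stays brick
```

The same blend applies to any channel (roughness, normals, etc.), which is why a handful of cheap masks can drive a rich-looking final surface.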
The final materials are either composited before or inside of the shader itself but you still need to create the base material definitions which are comprised of textures (color, normal, roughness, etc). The workflow is similar, but you use a different environment and technique to composite everything in the end.
I'd be curious to know if the final material gets baked offline as a set of textures, or if the compositing is still computed live in-game?
According to the SIGGRAPH notes, the material gets baked offline during their build process as a set of textures/texture channels comprising the material's BRDF parameters.
Hello my friend. Could you please help me figure out where I can use the material composite system in a Mari texturing workflow? Is it in Maya or Mari? Is it a script? Can you give me more information about it? I've read your SIGGRAPH PDF but I couldn't find where to get it and use it. Thanks very much!
Man, every time I talk to graphics engineers from the studios working on this sort of stuff, my mind just goes kaputz.
Though, I CAN see the SSS is kicking in like a mule.
Here's the full course page from SIGGRAPH:
http://blog.selfshadow.com/publications/s2013-shading-course/
and the course notes provide additional information:
http://blog.selfshadow.com/publications/s2013-shading-course/rad/s2013_pbs_rad_notes.pdf
Buuut, WHAT THE HELL IS THAT THING ON THE LAST PAGE?!
Amazon.com: Squirrel Mask: Clothing - http://www.amazon.com/Accoutrements-12293-Squirrel-Mask/dp/B0070QMUN2/ref=sr_1_1?ie=UTF8&qid=1374873018&sr=8-1&keywords=squirrel+mask
with a scarf, on a mannequin
http://mynameismjp.wordpress.com/2013/07/28/siggraph-follow-up/
Interestingly, I also hear that Mari has "almost completely replaced Photoshop" for them
Yes and no. I still use photoshop! :P
Because Substance Painter would be easier and cheaper... and you could do the same... except you'd need a massive amount of UDIMs...