
Expanding DBuffer Decals

andrad
polycounter lvl 5
==============================
EDIT 2022-11-09
==============================
I'm cleaning up and consolidating a bunch of accounts, so the GitHub links in this thread will soon stop working. I have included my commits as patches attached to this post, so anyone still interested in this can take them and try to integrate them with their version of UE.
==============================

This post originated on the UE forums. A user at Polycount suggested I post the information here as well in order to reach more people and get more feedback. I plan on keeping both threads in sync as I make progress. I also have some ideas on how to proceed, but would like to gauge other people's opinions first. I'll continue this tomorrow; for now I'm off watching the new season of Luke Cage. I want to see Misty's robot arm!

======================= Original message incoming =========================

I managed to implement metallic output for dbuffer decals. Below are some screenshots of how stuff looks. All the white dots and sprinkles in the buffer image are fully metallic decals. There's no dynamic lighting in the scene.




Note how this also allows placing non-metallic decals on metallic surfaces. This wasn't possible before where decals would inherit the metalness value of the underlying surface every time, which means you couldn't place a sign or a poster made out of paper on a metal wall without the poster becoming metal, too. The dark screws in the screenshots below are correctly rendered non-metallic even when placed on the metal floor.



How does it work?


There's a new option in the material settings under Decal Blend Mode: DBuffer Translucent Color,Normal,Roughness,Metal. Choosing this enables the metallic pin in the material attributes. From there it works as every other material does:

1. Connect pin
2. Save material
3. Profit!



I have not yet measured the performance impact, but it should be minimal. Connecting the metallic pin increases instruction count, but that's the case with every non-metallic vs. metallic material. There are some additional instructions in the shader files that are necessary to actually read the metallic value and write it into the respective buffers. But that's just some assignment and basic arithmetic operations. It doesn't get cheaper than this.
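To give a feel for what "just some assignment and basic arithmetic" means, here is a minimal sketch in Python pseudocode of how a decal's metallic value could be folded into a dbuffer channel. The function name and the (value, one-minus-alpha weight) buffer layout are my assumptions for illustration, not UE's actual shader code:

```python
def write_dbuffer_metal(dbuffer, pixel, decal_metal, decal_opacity):
    # Hypothetical dbuffer layout: each pixel stores an accumulated
    # premultiplied value plus a one-minus-alpha weight. The decal's
    # metallic is blended in with the same arithmetic the other
    # dbuffer attributes use: one multiply-add per channel.
    stored_metal, stored_weight = dbuffer[pixel]
    dbuffer[pixel] = (
        stored_metal * (1.0 - decal_opacity) + decal_metal * decal_opacity,
        stored_weight * (1.0 - decal_opacity),
    )
```

A fully opaque metallic decal simply replaces the stored value; anything cheaper than a lerp per pixel is hard to imagine.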

One reason I haven't measured performance is that the material is not yet optimized. I'll write about that in another post to keep these from getting too long.

So, what's the bad news?


There is unfortunately no way to get this working without modifications to UE source code. I tried to get this done by modifying the shaders only, but it simply does not suffice. Modifications are minimal; Git tells me I only had to change 24 lines of C++ code. Still, if you want to use this you have to use a custom engine build, as it's also not possible to package this as a plugin.

Once I'm done with all I have planned, I'll tell you which commits you have to cherry-pick in order to integrate this in your engine builds. Alternatively, you can go pull the whole engine code from GitHub. The branch to check out is named decals-plus. This might not always be up to date and major breakage might occur. You have been warned.

What can you do?


If you'd be so inclined, I'd appreciate your help testing this. There shouldn't really be any problems, but I'd still like to know if anything breaks on specific platforms or hardware configurations. I have compiled a demo project that you can play with. You can get it here:

Windows_64
Linux_64

Setups the project was successfully tested on:
Windows, Nvidia GTX 1060
Windows, Radeon HD 8850M
Ubuntu 18.04, Radeon HD 8850M

The Radeon card is a GCN 1.0 Southern Islands chipset released in 2013. On Ubuntu the free amdgpu driver was used, running through Mesa 18.1. I haven't tested the radeon driver or any of the proprietary drivers, and I didn't test with Nvidia on Linux at all. I suspect it works just as well, but I'd rather keep my Linuxes free of NV if I can help it.

I can't compile for Mac or consoles; if anyone who can would like to compile the modified UE code and give it a spin, I'd be tremendously grateful.

Future work


I should talk about that in another post. This one's long enough as it is.

Replies

  • andrad

    Goal: Selective Blending

    Ultimately, what I want to achieve is the ability to selectively blend different channels of the decal with the underlying material. Examples:

    A metal bolt overwrites roughness, normal, metalness (usually) and color (in most cases) values.
    A crack might overwrite normal information and maybe roughness because of dust and dirt that settle in cracks over time.
    A panel might overwrite or blend normal information at the edges and retain it on the surface.
    A layer of paint overwrites color, maybe roughness and might overwrite or blend or retain the underlying normal information.




    The screenshots above show this quite nicely. The screws overwrite all the information of the underlying material. The strip of color overwrites the color and roughness, but keeps the normal information. The recess in the middle keeps the underlying normals intact and overwrites nearly all of them only at the incline.

    Hold on. Isn't this exactly what we want? Yes and no. It is, but only because I cheated. First, look at the rim of the screws where the surface is recessed and gets a little darker than everywhere else. The color roughly matches the color of the tiles, but only because I manually set it so. Look at what happens when I set the background color inside the decal material to a bright green:



    Nice. It is obvious that this method doesn't scale, especially with very colorful surfaces. We don't want to have to set a background color at all, we want the decal to automatically use the one that's provided by the underlying surface. Like with the large recess in the middle and the strip of paint at the bottom.

    This leads to the second problem with this setup. The piece of wall above uses 6 materials! Two of those are the metal beams and the tiles on the wall (which could easily be packed into one); the other four are for the different types of decals, using the different dbuffer decal blend modes:

    Screws: Color,Normal,Roughness,Metal
    Recess: Normal
    Paint: Color,Roughness
    Recess (metal panel): Normal,Roughness

    This means that what many of us want - to selectively blend the different material attributes between decal and underlying surface - has already been possible with vanilla UE for years. The drawback is that the number of draw calls would rise spectacularly if you wanted to do this for your whole environment. That sucks, especially for fully dynamic lighting, which already requires more draw calls than statically lit games. Even then it's not perfect, as the example with the screws demonstrates.

    The Plan


    So the plan is simple: Implement selective blending/masking inside the decal shader so that the four different decal materials above can be replaced by one.

    This might be easier than I originally thought. While I was working on getting metalness to work, I stumbled upon pieces of code that suggest getting the actual blending to work is a matter of modifying no more than 4 or 5 lines of shader code. The shader even incorporates a concept of "multi opacity", where opacity is stored as a vector containing separate opacity values for color, roughness and normals. It's not used like that for now, but changing it does seem pretty trivial at this point. I'm very optimistic for now.

    The problem lies in getting the data to the shader. This is not really a technical problem, but a conceptual one. It also impacts performance (so it probably is a technical one...). Right now, the decal material provides only one opacity value via the Opacity pin of the Material Attributes node. This one float value is used for every blending operation on all the attributes that interest us.

    The Question


    The question is: How do we provide four (color,roughness,metal,normal) different values for opacity to our material while adding the least amount of overhead and performance impact to the shader?

    The answer to this will impact how comfortable it is for artists to work with the new decal material. Performance-wise, it's a matter of minimizing the instructions needed and the data transferred. Floats are faster than texture samplers, but providing texture masks is simpler for artists than computing crazy bitmasks inside their node graph. Also, this should all be done with the least amount of changes to UE source code.

    What doesn't work is utilizing the alpha channels that already exist on texture samplers for color and normal opacity, respectively. You might think that the alpha channel of a texture node is passed to the shader when plugged into the BaseColor pin, but the compiled code only ever uses the RGB values and discards the alpha. Same for the normal map. Changing that is completely out of the question.

    There's also the issue of blending vs. masking. Should we blend between decal and underlying surface or is it sufficient to completely mask the decal once its opacity reaches below a configurable threshold? Masking would be cheaper and could probably be done using only one float as input for all four attributes. But there is no doubt that blending would look better, be more versatile and easier to work with. I tend towards blending, which is also closer to how the shader handles it now.
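The difference between the two approaches can be sketched in a couple of lines of Python pseudocode (per-channel, per-pixel; the function names are mine, not the shader's):

```python
def mask(surface, decal, opacity, threshold=0.5):
    # Masking: a hard cutoff, one comparison per pixel. Cheap,
    # but the decal is either fully there or fully gone.
    return decal if opacity >= threshold else surface

def blend(surface, decal, opacity):
    # Blending: a lerp. A bit more math, but results are
    # continuous and the decal can fade smoothly into the surface.
    return surface * (1.0 - opacity) + decal * opacity
```

With masking, an opacity of 0.4 below a 0.5 threshold keeps the surface untouched; with blending, the same 0.4 gives a 40/60 mix.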

    This means the Material Attributes node has to provide 3 new pins or one pin for a 3-component vector. Here's where I'll be investigating during the next few days. The node already provides 2 custom data pins. They're disabled by default, but that's easily changed. The problem is that those two are floats only and therefore not sufficient. There seems to be a way to define completely new custom pins. I only ever heard of it, but if the rumours are true, this might be the way to go.


  • Millenia
    Very interesting, cheers for sharing!
  • andrad
    @Menchen Thanks! I'm afraid what you suggest is still not enough. Not only because of the examples I posted above and the limitations you mention, but also because of the fact that opacity still has to determine the weighting of the blend. Let's say I have an opacity of 0 and the decal normal blends with the underlying surface. Do they blend at equal weights, 50/50? That would mean a lot of detail gets lost or "washed out" because the normals would be flattened quite a bit in a lot of cases. If the decal normals overlay instead of blend, we lose the underlying information when we might like to have it.

    All in all, for the kind of fidelity I'm after, a single float value just does not contain enough information.

    What I'm currently trying to work with is this:

    The last pin called "MultiOpacity" accepts a 4-component vector that contains the opacity values for color, normal, roughness and metalness. If it is used, the input of the Opacity pin will be ignored, as it is not really needed anymore. This also allows for some optimization, because the reverse can also be true: If the Opacity pin is used, MultiOpacity will be ignored, opacity will be the same for all 4 attributes (basically how it is now) and we can save some shader instructions.
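The fallback logic described above amounts to something like this sketch (Python pseudocode; the function is hypothetical, but the behaviour follows the rules in the paragraph):

```python
def resolve_opacity(opacity=None, multi_opacity=None):
    # MultiOpacity wins if connected; otherwise the scalar Opacity
    # (default 1) is splat across all four attributes, which is
    # exactly the current single-opacity behaviour.
    if multi_opacity is not None:
        return tuple(multi_opacity)  # (color, normal, roughness, metal)
    o = 1.0 if opacity is None else opacity
    return (o, o, o, o)
```

The scalar path needs no per-attribute lerps beyond what the shader already does, which is where the optimization opportunity comes from.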

    The pin is at the bottom for now, because adding new material attribute pins in the middle requires some finicky additional code that I do not yet completely understand. Getting this wrong means a whole lot of wrong shader code, so it's important that I do this right. What I'd like to do is put the MultiOpacity pin nearer to the other two opacity-related ones. I'd also like to hide it by default if the material is not a deferred decal, in order to not clutter the node with useless stuff.

    Getting all this to work will require some patience, though, because every code change warrants a near-complete engine recompile. So I have to contend with coding for ten minutes and waiting about an hour for compiling to finish before I can see what I broke this time. But I'm getting there.

    On a completely unrelated note: Two episodes in and still no robot arm. What gives!?
  • andrad

    Goals: Addendum

    One thing I forgot: Once I have all of this working for dbuffer decals, I'd like to implement it for regular deferred decals, too. I actually started with those, but getting them to blend with instead of overwrite the underlying materials turned out to be more complicated than I thought. Once I'm more familiar with Unreal's render pipeline, I'll have another go at it.

    The rendering steps leading up to regular decals seem pretty clear to me: The gbuffer is filled during the prepass and the basepass, and right before lighting is applied, deferred decals are rendered on top of the geometry. You can see it in RenderDoc: No decals at one stage and all decals at the next.



    Still, I need to do some more digging before I can figure this out. The point is, I would like to have selective blending working not only for dbuffer decals, but for regular ones as well.

    But Why?

    Why would I want to do this when dbuffer decals already give me all I need? There are two reasons.

    The first one is really simple, as it can be explained by two numbers: 99 and 51. The default base material in its simplest form requires 99 instructions in the base pass if static lighting and dbuffer decals are enabled in the project settings (they are in a new blank project). The same material requires only 51 base pass instructions if both options are turned off. That's roughly half.

    Which means that projects that don't make use of static lighting can usually save a good number of shader instructions if they disable it. If you don't use static lighting, though, there is usually no need to have dbuffer decals enabled, since their main (only?) advantage over regular decals is that they can be used with baked lighting. But if you wished to make use of selective blending in your decal material, you'd be forced to enable dbuffer decals only for this feature. And your game will be less performant because of it.

    That's reason number one why I'd like to implement selective blending for regular decals as well.

    The second reason is that dbuffer decals still don't provide access to the same material attributes that regular decals do, most notably the Emissive pin. There is no chance I will try to enable emission in dbuffer decals. Doing this would require adding another render target to the dbuffer which I want to avoid at all costs. As it stands, there is just no space left to fit the emissive information. The last remaining space that the dbuffer had to offer went into storing metalness.

    Good ol' regular decals don't suffer from this limitation and offer a more complete package as a result. Getting selective blending to work for them would offer people a performant alternative to dbuffer decals in games that use dynamic lighting only.
  • Johnnynapalmsc

    Hi Andrad,

    I hope you don't mind me posting in your thread and asking a few questions. The issue that I have is way more basic than anything you spoke about here, but I think this is the right thread to ask it. :)



    So I can't really figure out why I am getting these artifacts when I am using no alpha blending and using deferred decals. It must have something to do with normals as that's the only channel I am using. Isn't this supposed to work with no normal masking?
    Also, when it comes to base colour, how would I go about having a separate opacity channel for the masks only? Is the only way to do this to import the albedo of the underlying texture and blend within this decal material? I suppose this is what your MultiOpacity will be doing?
  • Obscura
    You need to use an opacity map.
  • andrad
    @Johnnynapalmsc Hi! First off, I'm not quite sure what exactly your issue is. The only problem I can make out in the screenshots above is that the whole square mesh of your decal is slightly visible, and I assume you want it to blend so that only the normals of the round thing show. That's because you haven't put anything in the Opacity pin. Default opacity is 1, so the complete mesh is visible, which in your case means the whole square is overwriting the normal and roughness info of the mesh below. The buffer visualization viewmode will confirm that.

    As @Obscura pointed out, you need to provide an opacity map. A texture that is white where your decals are and black everywhere else is usually the way to go.

    I'm also not quite sure I understand every part of your second question, but separate opacity channels are not possible. That is indeed what I'm trying to change up there. Your decal will blend all of its attributes - color, roughness etc. - based on the one input you provide to the Opacity pin. If opacity is 0.4 for a given pixel, then the color of that pixel will be 40% your decal color and 60% the underlying surface color. Same with roughness and normals.
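That 40/60 blend is just a per-channel lerp. A tiny worked example (the colors are made up for illustration):

```python
def lerp(a, b, t):
    return a * (1.0 - t) + b * t

surface_color = (0.2, 0.6, 0.3)  # hypothetical underlying surface color
decal_color = (1.0, 0.0, 0.0)    # hypothetical decal color
opacity = 0.4

# every channel ends up 40% decal, 60% surface
blended = tuple(lerp(s, d, opacity) for s, d in zip(surface_color, decal_color))
```

The same single `opacity` value would be reused unchanged for the roughness and normal lerps, which is exactly the limitation discussed above.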
  • musashidan
    @Johnnynapalmsc You should check out @Millenia (who posted above) youtube video on setting this up from Max>UE4. I won't post the link here as I don't want to clutter up this thread. I also have a few Deferred decal tutorials on my YT channel.
  • Johnnynapalmsc
    Ok thanks very much, for some reason I thought that the blank area of the normal map would act as an alpha, resulting in the surrounding area being completely see-through.
    And lastly, in terms of the 2nd part of my question, here's a super simple illustration of what I mean:

    So the normal map is inherited from the underlying material, but then on top of that there is a masked diffuse texture. I don't believe there is a way to do this currently with deferred decals, unless you use the underlying texture and blend it with the pink part within the material using alpha blending? Is this something that layered materials could achieve? Thanks in advance!
  • Johnnynapalmsc
    Menchen said:
    Independently masking each parameter (having a different opacity mask/map for albedo and another one for normals) is something you can't do. That's the main purpose of andrad's work in this thread
    Ok awesome, just wanted to make sure I'm not missing anything. Thanks!
  • andrad
    @Johnnynapalmsc Since you asked about layered materials: EVERYTHING that's discussed in this thread and in the other one is already possible using layered materials. All the cool selective surface blending stuff is relatively easy going that route. It just has a number of drawbacks when it comes to performance and workflow, which is why I am trying to move away from it and started this work on decals.
  • andrad

    Interlude: Translucency Sort Order

    One problem I noticed when stacking decals is that the sort order wouldn't respect geometry. You can see it in the image below. Even though the screws are placed well above the paint strip, the paint gets rendered after the screws and as a result overwrites the screws' color information. That's not what I want.



    It's true that UE allows users to define the sort order for translucent geometry and decal actors, but only relative to other actors. The problem is that the above does not show two actors fighting for the Z, it's two materials on the SAME actor. None of the settings available to us handle a situation like that. It gets even more annoying when it's the same material that contains stacked decal geometry.

    There are several possible solutions for this problem. The first one would be to split overlapping decals into separate actors. Looking at the image above, I could import the wall as the first mesh, the paint strip as a second and the screws as the third mesh. I'd place all of them in the level and could set the translucency sort order in the actor settings so that the screws would always render on top of the paint.

    I'd also quickly start to hate the shit out of it. Having three separate actors for what should be only one, always taking care to move and rotate them in sync, and maintaining sort order values that don't conflict with other translucent pieces of geometry that might enter the scene just sounds like trouble. I'm doing this to make my life easier.

    So I took a look at what determines the order in which different materials on the same mesh are rendered, and - to the surprise of probably no one - it's the order of the material slots on the mesh. The top-most material is rendered before the one below it, and so on. Which means all I have to do is make sure that the order of the material slots corresponds to the intended sort order of the decals.

    Maintaining Material Slot Order

    As far as I could find out, the order of material slots on an imported mesh is determined by the application that exports the fbx file. The Unreal editor preserves that order on import. So this part is not really related to UE but to Maya, Max, Blender or whatever application you use to build and export your 3D models.

    I use Blender, which is free software and thus made it easy to spot where the order is determined. It's not something the artist has any immediate control over, so I modified the code in the fbx exporter to allow for some user control. The modified exporter is on GitHub, but please be aware that I also changed the way how the exporter handles root bones on exported skeletons. This is a longstanding cause of confusion especially for people new to Blender and Unreal, but I don't want to derail this thread, so I invite you to google it. Just keep in mind that if you use the modified exporter for skeletal meshes, an additional root bone is never created and UE imports it without modifications to the skeleton.

    The way Blender's fbx exporter sorts meshes now: If you're using the binary exporter and "Selected Objects" is ticked, the meshes inside the fbx file will be sorted alphabetically, according to their names as shown in the outliner.



    This allows you to easily control the order of material slots. You could call the wall mesh "AA", the paint mesh "AB" and the screw mesh "ZZ" and it would guarantee that the screws are rendered last. The result is this:



    Note that it doesn't really matter what the other meshes are called, as long as the first instance of a paint mesh sorts alphabetically before the first instance of a screw mesh; and if you don't have overlapping decals on a mesh, you can ignore all of this completely. Also note that Blender sorts case-sensitively. I realize that giving meshes random names does not fit everyone's workflow. Mine is highly modular even inside Blender, using instanced meshes and linked duplicates, so I usually don't care about the names of the meshes that are about to be exported.
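The naming trick above boils down to a plain case-sensitive sort, which is easy to demonstrate (the names are the example ones from the post):

```python
# screws, paint, wall, in arbitrary scene order
scene_meshes = ["ZZ", "AB", "AA"]

# A plain sort is case-sensitive (uppercase before lowercase in ASCII),
# matching how the modified exporter orders meshes in the fbx file.
export_order = sorted(scene_meshes)
# wall's slot comes first, screws' slot last, so screws render on top
```

If you mix cases, remember that "Paint" sorts before "paint", which can reorder slots unexpectedly.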

    I don't know if Maya or Max users have similar problems with sorting material slots, and unfortunately I lack the time to dig into it. I hope you're fine, but if not, you can always switch to Blender :p
  • Johnnynapalmsc
    andrad said:
    @Johnnynapalmsc Since you asked about layered materials: EVERYTHING that's discussed in this thread and in the other one is already possible using layered materials. All the cool selective surface blending stuff is relatively easy going that route. It just has a number of drawbacks when it comes to performance and workflow, which is why I am trying to move away from it and started this work on decals.
    Awesome, thanks for that. I don't have access to UE4 at the moment as I'm away for a bit, but I assume you can use layered materials with deferred decals in UE4? For example, the deferred decal layer would provide the normal information and then I could alpha-mask another material on top with all of my texture and colour information? Or am I misunderstanding this? For example, if I have a screw in a hole, the hole would be a deferred decal and the screw would be masked out as the top layer, and all of that would be applied to one mesh plane?
  • andrad
    @Johnnynapalmsc You could probably do that, but usually you wouldn't use deferred decals at all for this. You would do all the blending in your material. A performant way to blend is based on vertex color, where your floating decal planes are identified by their color. From here on out I'm going to use the word "decal" to refer to those floating detail meshes. Look at this super simple example below.
    The pink color is the regular color of your mesh, grey is the color of metallic screws and the texture contains the roughness, metallic, normal and color masks for your decals.

    So, the node that goes directly into base color, the If node, says: "If vertex color red is above 0.5, use the regular color, otherwise blend between the regular color and the metal color according to the texture that knows where the metallic parts are." You then repeat this check for the other attributes, which means copy-pasting the If node (the 0.5 node is just there for demonstration purposes) three times and connecting them with additional lerps based on the texture mask.

    You have to make sure that you set the correct vertex colors inside your 3D modeling software and if you want to be extra diligent you also have to work with multiple UV channels, but the basic idea is just what I wrote above.
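The If-node logic described above can be sketched like this (Python pseudocode for one attribute; the function name is mine, and the same pattern repeats for roughness, metalness and normals):

```python
def shade_base_color(vertex_red, regular_color, metal_color, metal_mask):
    # The If node: vertex red above 0.5 marks regular geometry, so the
    # plain color is used unchanged; on decal planes, lerp toward the
    # metal color wherever the mask texture marks the metallic parts.
    if vertex_red > 0.5:
        return regular_color
    return tuple(r * (1.0 - metal_mask) + m * metal_mask
                 for r, m in zip(regular_color, metal_color))
```

A vertex painted red keeps the pink; a black vertex with a white mask pixel gets the full screw color.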
  • andrad


    Looks like I'm done. The two screenshots below show the same decal before and after implementing multi opacity. In the first image, opacity is 1 for the whole material; that's how it looks when the opacity pin is just left alone. The second image shows multi opacity applied, providing different opacity values for each attribute. You can find more detailed breakdowns of the images into the different buffers below.



    Providing Opacity

    In order to drive the opacity for color, normals, roughness and metalness I use a single texture. It works pretty much as it always has, only now all four channels of the texture are read when connected to the pin instead of just one. I made sure this is "backwards-compatible", though. If you connect a constant value, all of the decal is visible for every attribute. If you connect a constant vector, you can control the opacity for each attribute separately, but the values will be the same across the whole surface of the decal. If you connect a single texture channel, that channel drives the opacity for every attribute. This is analogous to how it works with the regular Opacity pin.



    Note that the Append node is necessary because, as I wrote earlier, texture samplers return a 3-component vector and discard alpha, so I have to put it back in, so to speak, before it connects to the pin. Append is not the only way to do this; others can achieve the same thing. The important part is that whatever goes into the Multi Opacity pin has to be either a 1-component (i.e. a scalar) or a 4-component vector.
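What the Append node does is trivially small; in Python pseudocode (hypothetical function name):

```python
def append_alpha(rgb, alpha):
    # Re-attach the fourth channel that the sampler discarded,
    # yielding the 4-component vector the MultiOpacity pin expects.
    return (rgb[0], rgb[1], rgb[2], alpha)
```

The cost is effectively zero; it only restores the shape of the data the sampler already read.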

    The opacity texture I'm using looks like this:



    You can see that most parts are either fully white or fully black. Blending happens here and there, most of it in the roughness channel, never in metalness. The two screws and the clip thing overwrite the underlying surface on every channel. The pink dots only ever affect the base color and leave every other attribute alone.

    The normal channel overwrites the underlying normals for recesses, cracks and panels, but only where the actual incline is, leaving the surface inside alone. This preserves the underlying information nicely. You can see that on the large one in the screenshots above. You can also see that it might have been better to increase the normal opacity on the inside of the two smaller recesses; having the underlying normals shine through as strongly as they do here looks a bit awkward.

    The roughness channel provides an opportunity to add some form of wear to cracks and seams. By blending the roughness opacity just slightly and providing a roughness value close to 1 for that area of the decal, I can simulate how recesses gather dust over time and lose a bit of their shininess. Alternatively, if I set the roughness value close to 0, I can simulate how edges get smoother over time from all the friction. A low roughness opacity here means the effect will be very subtle. It also doesn't matter how rough the underlying surface actually is. All I'm saying here is, "Make this area slightly rougher (or smoother) than it is now".
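Numerically, that "slightly rougher than it is now" effect is just a low-weight lerp toward the decal's roughness value (sketch; the function name is mine):

```python
def aged_roughness(surface_rough, decal_rough, rough_opacity):
    # A low opacity (say 0.15) nudges the area only slightly toward
    # the decal's roughness value, whatever the surface roughness was.
    return surface_rough * (1.0 - rough_opacity) + decal_rough * rough_opacity
```

With a decal roughness of 1 and opacity 0.15, a smooth surface at 0.3 moves to about 0.4, and an already rough surface at 0.8 only to 0.83; the effect scales with the surface, exactly as described.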

    The metalness opacity is relatively boring, all in all. It should almost always be either black or white and never blend. It controls where the decal can overwrite the metalness value of the surface beneath it. A thing to note is that this means reversing metalness, too. If a surface is made of metal and I wanted to have some screws on it that are not, the metalness opacity for those screws should be white. Because only then do the screws overwrite the metalness, in this case setting it to 0. The Metalness giveth and the Metalness taketh away.

    Providing Values

    There is an important distinction to be kept in mind, and the last paragraph hinted at it a little. What this texture provides is opacity, not value.

    Nowhere does it say how rough or how metallic a surface should be, or which color it should have. The metalness channel up there doesn't say that the two screws and the clip should be made of metal. It says that for those 3 objects, the decal is allowed to overwrite the metalness of the underlying surface. Whether they actually are metallic depends on what you connect to the Metalness input pin of the decal material. If you connect a single constant value of 1, it won't make your whole decal metallic. It will make it metallic only where the texture above allows it to. So even though I made the screws metallic in this specific example, as seen in the screenshots above, they would cease to be so the moment I disconnected the Metallic pin.

    That's not to say you can't use this texture to drive values, of course. If you were certain that all of your screws will always be made of metal, you could just plug the channel into the Metalness pin and be done with it.
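The opacity-versus-value distinction, including the "Metalness taketh away" case, fits in one line of pseudocode (hypothetical function name):

```python
def decal_metalness(surface_metal, metal_value, metal_opacity):
    # The opacity channel decides WHERE the decal may overwrite
    # metalness; the Metallic pin decides WHAT gets written there.
    return surface_metal * (1.0 - metal_opacity) + metal_value * metal_opacity

# non-metal screw on a metal floor: opacity 1, value 0 -> metalness 0
```

A white opacity pixel with a value of 0 strips the metalness from a metal surface; a black opacity pixel leaves the surface's metalness alone no matter what the pin provides.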

    Before & After

    So here are the screenshots of how stuff looks before I implemented multi opacity and after that. It's the exact same decal and the same material. I noticed a little too late that I changed the roughness value at some point between the two captures. That's why the decal is rougher in the first set of images. Its roughness changed from 0.75 to 0.45, but apart from that, all material parameters are the same.




    We can see in the first set that the decal opacity completely overwrites the information below. Its opacity is set to the same value for every material attribute, 1 in this case. The second set shows how the multi opacity texture drives each attribute independently. The color of the pink dots blends nicely with the underlying color while the other attributes don't care about those dots at all. Roughness, normals and metalness are still taken from the surface below. We can also observe the roughness blending without affecting the color or the normals at all. All in all, this is some nice selective blending with all the freedom we wish for.

    What I find exciting for effects and stuff is that we can also drive those opacity values - separately - by gameplay code. Think about some simple things like a puddle of spilled paint: It can dry up over time, changing its roughness to that of the underlying material while preserving its color. Procedural and animated decals become a whole lot more versatile with this.
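    As a rough sketch of that paint-puddle idea (plain C++ with hypothetical names, nothing here is engine API), driving the roughness opacity from gameplay time could look like this:

```cpp
#include <algorithm>
#include <cassert>

// One opacity per material attribute, mirroring the multi opacity idea above.
struct DecalOpacity { float color; float normal; float roughness; float metal; };

// A spilled-paint decal: the color opacity stays fixed (dried paint is still
// visible), while the roughness opacity fades so the underlying surface's
// roughness gradually shows through again as the paint dries.
DecalOpacity PuddleOpacity(float secondsSinceSpill, float drySeconds)
{
    float wetness = 1.0f - std::min(secondsSinceSpill / drySeconds, 1.0f);
    return DecalOpacity{ 1.0f, 0.0f, wetness, 0.0f };
}
```

    In the engine, the per-channel opacities would be fed to the material as parameters; the point is only that each channel can be animated on its own.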

    Technicalities

    Turns out I did have to implement an additional render target for the dbuffer after all. It took some time for me to understand certain things about UE's render pipeline, but once I did, it became clear that the only way to do this cleanly is to use an additional render target for the metal opacity.

    This problem only exists because I want metallic dbuffer decals as well as selective blending. Either one of those alone would be possible with only 3 dbuffer render targets, as it is in vanilla UE, but since I aim to implement both, there's no reasonable way around a fourth texture. I'll spare you the details for now, but I can go into them if there's demand.

    Adding a render target to the buffer turned out to be way easier than I thought, so Epic probably deserves a lot of credit there. Documentation is sparse, but the existing code provides enough useful hints to get by. Optimization is next on my to-do list and once I'm done with that, I will know how much of a performance impact this step turns out to be. The additional render target should increase GPU memory usage by about 8 MB for 1080p displays and 32 MB for 4k.

    I also noticed that cooking for GLSL 150 (SM4 on OpenGL systems) threw a shader compilation error for WorldGridMaterial because it now exceeded the limit of 16 samplers per material. You can google what's up with that limitation and why Epic put it in place. The point is if you package a game for Linux or Mac and you compile shaders to be compatible with OpenGL 3.3, you might hit this sampler limit.

    There are 2 ways around that. Either reduce sampler usage elsewhere or compile shaders for newer versions of OpenGL. You can reduce the overall usage of texture samplers by disabling features in the project settings. I decided to disable stationary sky lights, which was enough to finish packaging the project without errors. You can disable support for older versions of OpenGL in the project settings as well.

    There are basically 2 relevant versions of OpenGL that correspond to SM4 and SM5. OpenGL 3.3 is the equivalent of SM4 (D3D10) and was released in 2010. OpenGL starting at version 4.3 is the equivalent of SM5 (D3D11) and was released in 2012. The mobile AMD graphics card from 2013 which I wrote about in my first post supports OpenGL 4.3. I'm writing all of this to give you an idea of how widespread driver support for OGL 4.3 is on current Linux systems. I don't have official numbers, but I think it is safe to assume that most Linux gaming rigs are easily capable of utilizing SM5.

    All of this is completely irrelevant for Windows, which doesn't impose such tight restrictions on the number of samplers used. I can't test on consoles; maybe one day I'll be able to. Does UE support decals on mobile systems at all?

    As it stands now, the extended decal functionality works on Windows and Linux. I suspect it works on MacOS as well. I'd be very surprised if it didn't work on current-gen consoles, to be honest.

    Wrapping Up

    That's it for today. Next on my list is optimizing and profiling the decal system and also refining some workflow procedures. If it turns out that UE supports dbuffer decals on mobile systems I'm inclined to package a mobile project to test it for Android, too. Once again, I don't have access to Apple hardware or I'd test on those systems, too.
  • Nosslak
    Offline / Send Message
    Nosslak polycounter lvl 12
    Do you have any plans on implementing support for AO maps in the decals? Last time I tried, UE didn't support it. You could get around it by including it in the albedo; however, that's not very PBR compliant.

    I'll give this a go once I get home, if you've got it all submitted and somewhat easily downloadable.
  • AXEL
    Offline / Send Message
    AXEL polycounter lvl 6
    Do you think there is any chance of ever getting POM in dbuffer decals?
  • Nosslak
    Offline / Send Message
    Nosslak polycounter lvl 12
    There's been support for POM for a while, my dude. Just add a ParallaxOcclusionMapping node (you don't need that weird workaround with projected UVs at all) and you're pretty much set to go. Here's an example of it in action (not the prettiest just a quick test I did a while back):
    You can see the simple shader setup for the decals here, nothing too fancy going on, just the POM node and a couple controls to change the strength and blending of the decals.


  • Obscura
    Offline / Send Message
    Obscura grand marshal polycounter
    Nosslak said:
    There's been support for POM for a while, my dude. Just add a ParallaxOcclusionMapping node (you don't need that weird workaround with projected UVs at all) and you're pretty much set to go. Here's an example of it in action (not the prettiest just a quick test I did a while back):
    You can see the simple shader setup for the decals here, nothing too fancy going on, just the POM node and a couple controls to change the strength and blending of the decals.


    Wow haven't checked that for a while. Thats some great news! Looks like things are starting to get together.
  • Toku
    Offline / Send Message
    Toku polycounter lvl 6
    Well this is good news, UE4 really needs a fully functioning PBR decal system! Last year I tried making a Star Citizen style environment and I found a hacky way to get the decals working: basically it's 3 planes layered on top of each other. The first one uses the DBuffer decal shader and affects the normal/roughness of the surface, the 2nd albedo/metalness, and the 3rd is an AO. It requires 3 times the geo and 2 different mask textures, so it isn't really ideal considering you can achieve the same effect with a correct shader.


  • andrad
    Offline / Send Message
    andrad polycounter lvl 5
    Nosslak said:
    I'll give this a go once I get home, if you've got it all submitted and somewhat easily downloadable.
     @Nosslak Unfortunately, I don't. The code lives in this UE fork: https://github.com/POETindustries/UnrealEngine/tree/decals-plus and to use it, you would have to compile the engine yourself. I am very certain that I can't throw precompiled binaries of Unreal Engine around without unleashing the angry powers of Epic's lawyers.

    As for AO: I can look into it, but for dbuffer decals the chances are slim. The dbuffer is somewhat restrictive in the way data is stored and adding new information to it almost always requires additional render targets. I'm going to investigate regular decals in depth after I'm done with optimizing and profiling my changes to the dbuffer. Maybe there's a way to allow for Material AO in those. Both decal systems work substantially differently.
  • LiuYang
    Offline / Send Message
    LiuYang null
    thanks for the amazing work! 
  • Nosslak
    Offline / Send Message
    Nosslak polycounter lvl 12
    andrad said:
    Nosslak said:
    I'll give this a go once I get home, if you've got it all submitted and somewhat easily downloadable.
     @Nosslak Unfortunately, I don't. The code lives in this UE fork: https://github.com/POETindustries/UnrealEngine/tree/decals-plus and to use it, you would have to compile the engine yourself. I am very certain that I can't throw precompiled binaries of Unreal Engine around without unleashing the angry powers of Epic's lawyers.

    As for AO: I can look into it, but for dbuffer decals the chances are slim. The dbuffer is somewhat restrictive in the way data is stored and adding new information to it almost always requires additional render targets. I'm going to investigate regular decals in depth after I'm done with optimizing and profiling my changes to the dbuffer. Maybe there's a way to allow for Material AO in those. Both decal systems work substantially differently.
    I misunderstood what you meant in the first post; I thought you were saying you had it ready to go so anyone could easily try it via the download links. I'll look into compiling Unreal and giving it a try in the next few days. Edit: I tried using your link and it's just returning a 404 error. I found your account on GitHub but couldn't find the repository in question.

    Also thanks for trying to improve the decals!
  • andrad
    Offline / Send Message
    andrad polycounter lvl 5
    @Nosslak Do you have your GitHub account linked to your Unreal user profile? You need to do that in order to see people's UE forks on GitHub.
  • Johnnynapalmsc
    Offline / Send Message
    Johnnynapalmsc interpolator
    andrad said:
    @Johnnynapalmsc You could probably do that, but usually you wouldn't use deferred decals at all for this. You would do all the blending in your material. A performant way to blend is based on vertex color, where your floating decal planes are identified by their color. From here on out I'm going to use the word "decal" to refer to those floating detail meshes. Look at this super simple example below.
    The pink color is the regular color of your mesh, grey is the color of metallic screws and the texture contains the roughness, metallic, normal and color masks for your decals.

    So, the node that goes directly into base color, the If node, says: "If vertex color red is above 0.5, use the regular color, otherwise blend between the regular color and the metal color according to the texture that knows where the metallic parts are." You then repeat this check for the other attributes, which means copy-pasting the If node (the 0.5 node is just there for demonstration purposes) three times and connecting them with additional lerps based on the texture mask.

    You have to make sure that you set the correct vertex colors inside your 3D modeling software and if you want to be extra diligent you also have to work with multiple UV channels, but the basic idea is just what I wrote above.
    Thank you very much for all of your help. Thought I might as well show what all of this led me to creating, not trying to hi-jack this thread :D https://polycount.com/discussion/202641/oxygen-generator-room-jonas-roscinas-star-citizen-test/p1?new=1
  • andrad
    Offline / Send Message
    andrad polycounter lvl 5
    Menchen said:
    Just a noob question: if you release an update on this for optimization purposes and such, would we have to recompile the whole engine again (because it's taking a lot of time right now), or just the classes that change?
    Both, most likely. It usually is just the classes that have changed plus the classes that depend on those. But because the classes that deal with rendering are used pretty much everywhere in the engine, almost all of it has to be recompiled anyway. A full engine build should take between 1.5 and 3 hours on relatively high-end hardware released within the last 2 years or so. I spent most of the time waiting for compilation to finish while I was working on this. In fact, I nearly finished building a whole kit for a new level while UE was compiling.

    Also, because you wrote about "releasing an update" I'd like to stress that none of this is "released" or finished or in any way complete and shippable. The optimizations I wrote about throughout this thread are an integral part of this extended decal system that I absolutely want to get done before I start thinking about integrating this into real, live projects.

    I'm going to write more about it once I finished the draft on the optimization post.
  • andrad
    Offline / Send Message
    andrad polycounter lvl 5
    @Menchen Nice! That's using a POM node, right? I didn't have time to test that yet.
  • andrad
    Offline / Send Message
    andrad polycounter lvl 5
    @Menchen Yes, translucent decals are on my to-do list, too. In fact, emissive is one of the main reasons why I'd like to get this working. And for fully dynamic games they provide a pretty substantial performance improvement over dbuffer decals.

    Another thing: I don't know how well this is known, but the whole problem with decals not showing up in shadowed areas exists only if static lighting is enabled in the project settings. If we disable static lighting for the whole project, there are no problems with decals anywhere. The following screenshots demonstrate this:

    The pink square is a regular decal, blend mode set to translucent. Static lighting is enabled, but all lights in the scene are dynamic, which is why the pink color is at least somewhat visible in the shadowed areas. Nonetheless, we can observe the usual problem: The yellow stripes shine through in the shadows and even in direct sunlight some of the text beneath the decal is visible, because the sun shines on the decal at an angle !=90°. Observe what happens when static lighting is disabled in the project settings:

    None of that! The pink decal is completely opaque both in direct light as well as in the shadows. Just for shits and giggles, I disabled all lights in the scene and made the decal emissive:

    Full visibility. Now, we all know that dynamic GI in UE leaves a lot to be desired, but if a game doesn't make use of baked lighting, disabling static lighting project-wide has a lot of advantages and makes decals behave correctly everywhere.
  • andrad
    Offline / Send Message
    andrad polycounter lvl 5

    Optimization

    There are 3 areas of optimization I want to talk about.

    1. Material pins: There are now two pins that deal with decal opacity: Opacity & MultiOpacity. How do they interact with each other? I decided to make it like this: If MultiOpacity is connected, the opacity will be read from this attribute, otherwise the input of the Opacity pin will be used. If you look at the compiled HLSL code for a material, you'll find dozens of statements like these:
    #if POST_PROCESS_MATERIAL
        ...
    #endif

    #if FEATURE_LEVEL >= FEATURE_LEVEL_SM5
        ...
    #endif

    When a material gets compiled, the parts inside these checks will only be compiled if the checks resolve to true. That's one of the main mechanisms by which Unreal controls shader complexity. The material is not a post-process material? Ignore whatever it is between the #if and the #endif above. Does the platform support SM5 (D3D11)? Include the code above, otherwise, ignore it, too.

    Most of those variables are set by C++ or console variables. So I created a new variable, USE_DBUFFER_MULTIOPACITY, that is set from C++ whenever the MultiOpacity pin in a material is connected. I now had a way to check for the value of this variable inside the shader. The following function returns the opacity of a material:
    half GetMaterialOpacity(FPixelMaterialInputs PixelMaterialInputs)
    {
        ...
    }

    You can read "half" as "float" for this specific example. So, the function returns a scalar, which is not what we need when dealing with multi opacity. I modified it to look like this:
    #if USE_DBUFFER_MULTIOPACITY
    half4 GetMaterialOpacity(FPixelMaterialInputs PixelMaterialInputs)
    #else
    half GetMaterialOpacity(FPixelMaterialInputs PixelMaterialInputs)
    #endif
    {
        ...
    }

    The shader compiler now changes the return type of this specific function from a scalar to a 4-component vector whenever a material has the MultiOpacity pin connected. I did this in a couple more places and the end result is that when the renderer is not dealing with decals, everything works exactly as before, while shaders that need multi opacity always get it because USE_DBUFFER_MULTIOPACITY will always be set correctly.

    This means that the decal material works the same way it has always worked when only the Opacity pin or neither of the two is connected. There are also some bandwidth and computation savings. It also means that migrating to this custom version of Unreal Engine does not change anything about decals automatically, leaving users in control.
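    The pin-fallback rule from point 1 boils down to this (plain C++ sketch with illustrative names, not the actual shader or engine code):

```cpp
#include <cassert>

// One opacity per dbuffer attribute.
struct Opacity4 { float color; float normal; float roughness; float metal; };

// If MultiOpacity is connected, its channels drive the attributes separately;
// otherwise the scalar Opacity pin is broadcast to all four, which reproduces
// the old single-opacity behavior exactly.
Opacity4 ResolveOpacity(bool multiConnected, Opacity4 multi, float scalar)
{
    if (multiConnected)
        return multi;
    return Opacity4{ scalar, scalar, scalar, scalar };
}
```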

    2. In Unreal Engine, there are some nice optimizations that happen for decal blend modes when the Normal material pin is not connected and I wanted to implement those for the new mode as well. This has led to the creation of an additional blend mode.

    Suppose we were using the decal blend mode ColorNormalRoughness, but didn't connect the Normal pin. When compiling this material, Unreal checks if the pin is connected and if not, "downgrades" the material to the same blend mode without the normal information. ColorNormalRoughness becomes ColorRoughness, NormalRoughness becomes Roughness and so on. For this to work with the new blend mode, ColorNormalRoughnessMetal, there had to be another blend mode, ColorRoughnessMetal, which you can find in the decal blend mode dropdown in the material properties.

    This additional blend mode can be used like all others. The reason it exists is to make the aforementioned optimization possible, but since it's already there, you can use it if you want to.
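    The downgrade logic described above amounts to a simple mapping; here is a sketch in plain C++ (the enum and function names are illustrative, not the engine's actual identifiers):

```cpp
#include <cassert>

enum class DBufferMode {
    ColorNormalRoughness, ColorRoughness,
    NormalRoughness, Roughness,
    ColorNormalRoughnessMetal, ColorRoughnessMetal
};

// When the Normal pin is not connected, drop the normal part of the blend
// mode and keep everything else.
DBufferMode DowngradeIfNoNormal(DBufferMode mode, bool normalConnected)
{
    if (normalConnected)
        return mode;
    switch (mode) {
        case DBufferMode::ColorNormalRoughness:      return DBufferMode::ColorRoughness;
        case DBufferMode::NormalRoughness:           return DBufferMode::Roughness;
        case DBufferMode::ColorNormalRoughnessMetal: return DBufferMode::ColorRoughnessMetal;
        default:                                     return mode; // nothing to drop
    }
}
```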

    3. I also added a new decal response to the list of possible decal response modes in the material settings. Decal response governs what kind of decals have an effect on a given surface material. The default response is ColorNormalRoughness, which means if a decal is placed over a mesh with this material, the decal's color, roughness and normal values will override the ones coming from that material. Changing the decal response is a way for a material to allow or prohibit the use of decals on itself. It also has an effect on shader complexity.

    Since there is a new decal blend mode, there should also be a corresponding decal response mode. There is and it is called ColorNormalRoughnessMetal, unsurprisingly. Enabling this decal response allows decals on this material to overwrite metalness in addition to everything that ColorNormalRoughness allows. Activating this response mode also increases shader instruction count from 99 to 104 for the base material.

    Here are the dropdown menus containing the new modes:


    Compared to previous builds of this custom engine branch, metalness will not be visible by default on decals. For consistency with upstream, I decided to keep the default values for blend mode and decal response, which means a new material will have ColorNormalRoughness as its default decal response. In order to use the Metallic pin in a decal, its blend mode has to be ColorNormalRoughnessMetal, and in order to actually see the metalness every material that interacts with this decal must have its decal response set to ColorNormalRoughnessMetal as well.

    Completeness vs. Compactness

    As long as UE supported only color, normal and roughness in dbuffer decals, every possible combination of those could be selected in the blend mode menu. Now, after metalness has been added to it, this is obviously not the case anymore. I only added 2 blend modes while I should have added 8. I limited the amount of additional blend modes for 2 reasons.

    The first one is covered by what I wrote earlier. I want to do this with as few changes to Unreal Engine as possible. There were good reasons to add the 2 modes that I did add, but in my opinion there aren't enough to justify writing the code that adds 6 more. It would just spam the list of available blend modes when most people are well served with 3 or 4 out of the 9 that now exist for dbuffer decals.

    As for the second reason: I really don't think there's much utility in the missing blend modes. Let's be honest, when do you ever need a decal that only overwrites normal and metalness, and can't just use the ColorNormalRoughnessMetal mode, which allows you to do exactly that? Not only do modes like RoughnessMetal and NormalMetal provide hardly any usefulness, they also go completely against established PBR guidelines and practices.

    A decal that overwrites metalness, but doesn't at least overwrite color at the same time breaks PBR, conceptually speaking. I understand that a lot of people use very stylized NPR shading techniques in their projects, and if they really want to, they can violate PBR guidelines with the decal blend modes that I already implemented. But I don't have to be an enabler for them :)

    In short: I think the missing modes are useless.

    Numbers

    Let's see how all of this affects performance. I already wrote that the new decal blend mode raises shader instruction count by 5. With static lighting enabled, the base material requires 99 instructions when decal response is left at ColorNormalRoughness and 104 when it's set to ColorNormalRoughnessMetal. If static lighting is disabled, those numbers change to 73 and 78. The increase in instruction count seems to be consistent.

    I ran the demo room (links in the first post are NOT updated, btw.) three times under the same conditions but with these differences:

    1. Unmodified Unreal Engine, all mesh decals set to ColorNormalRoughness and all material decal responses set accordingly. This acts as a kind of baseline.
    2. Modified engine, but still only using ColorNormalRoughness.
    3. Modified engine, using ColorNormalRoughnessMetal decal modes everywhere except for the graffito on the wall.

    Let's compare 1 and 2 first. The decal blend modes are the same. 2 uses the modified engine, but doesn't use the new decal modes. The render step that renders dbuffer decals is called CompositionBeforeBasePass. There is also DeferredShadingSceneRendererDBuffer, which collects the decals in the scene and is also responsible for drawing them. These are the results:
    CompositionBeforeBasePass:
    1. 0.11ms avg.
    2. 0.14ms avg.

    DeferredShadingSceneRendererDBuffer:
    1. 0.238ms avg.
    2. 0.277ms avg.
    (Render thread time in both cases about 5ms.)

    This tells me that dbuffer decals in the modified engine have an increased base cost. They have become slower even when the new decal modes are not used. I also have an idea as to why that is. A jump from 0.11 to 0.14 is roughly the same proportion as one from 3 to 4, and 0.238 to 0.277 points in the same direction. I therefore suspect that the increase in render time is a result of adding a 4th render target to the dbuffer in order to provide for the metalness. If this is true, there should not be another significant increase in render time once we activate the new decal blend mode. Let's check that.
    CompositionBeforeBasePass:
    2. 0.14ms avg.
    3. 0.14ms avg.

    DeferredShadingSceneRendererDBuffer:
    2. 0.277ms avg.
    3. 0.261ms avg.

    Huh. Seems performance is pretty much the same. Who would have thought?

    All in all, the render thread time has increased by about 1%, although that's a pretty useless thing to say. Decal performance depends on the number of pixels on the screen that are affected by decals, and for a setup like the demo room, that number will always be pretty low, which means decals will make up only a small portion of the overall render time. If the new decal system was 10x slower than before, render time would still have gone from 5ms to just 5.5ms.

    It's better to go with what the comparisons above show: We can estimate that the time to process decals has increased by roughly a third for the modified engine, regardless of whether the new blend modes are actually in use. Since that increase is still well below 0.1ms for rooms like the one above, I for one can live with that.

    Let's look at a couple shades of green to finish this off.


    This is the decal room with static lighting enabled and decal blend mode set to ColorNormalRoughness.


    This is with blend mode set to ColorNormalRoughnessMetal. You can barely see it, but it is a shade darker than the image above.


    This is with static lighting disabled and decal blend mode set to ColorNormalRoughnessMetal. That is a nice, solid green I'm seeing there. Which provides a good opportunity to transition to the final part of this undertaking: Getting multi opacity to work for regular, non-dbuffer decals. This is what I'll be spending the next couple of days with.
  • Hayden_Price
    Offline / Send Message
    Hayden_Price polycounter lvl 5
    Been having a play around with this for the last few days and I'm loving it so far. Thank you so much for working on it; it'll definitely change the way I work on future projects. I found that mesh decals, when aligned perfectly to the geometry, will Z-fight in the Decals-Plus build of Unreal, even when using non-multimasked, 'stock' Unreal deferred decals (such as DBuffer Translucent Color,Normal,Roughness). This is easily worked around by ever so slightly offsetting the decal meshes along their face normals, so it's not a big problem; however, the Z-fighting doesn't occur in the current stable build of Unreal.

    Here is a mesh based DBuffer Translucent Color,Normal,Roughness decal in the current stable build of 4.19:


    And here is the exact same material setup and mesh in the Decal-Plus build (Once again Color,Normal,Roughness, no multi-masking either, just a single opacity mask plugged into the Opacity output, and static lighting disabled in Project settings):


    Bit hard to see in the screenshots, but the Decal-Plus build gets crazy Z-Fighting as opposed to the current stable build. I've been doing my best to keep up with the thread so I apologize if it's already been mentioned and I just missed it, or if it's a known side effect, I just thought I'd drop in in case it's not known/hasn't been mentioned/only happens to me.
  • andrad
    Offline / Send Message
    andrad polycounter lvl 5
    @Hayden_Price You are very right, mesh decals will cause z-fighting when they are perfectly aligned with the underlying mesh. That's completely expected behavior, but I realize I failed to mention it anywhere, so that's on me. There is a line of code in the stock UE mesh decal shader that offsets the decal based on the distance to the camera and it's the reason your decals work without problems in the unmodified engine.

    I removed that line.

    That piece of code causes a lot of people trouble with nearby geometry and large viewing distances. You can read about it in this UE forum thread and also on page 5 of this thread on Polycount. All in all, this line has always been considered rather hacky, and removing it puts more control into the hands of the artist.

    You're right, though: The "proper" way to author mesh decals now with these changes to UE source code is by offsetting them yourself inside your DCC app.
  • Forest_Cat
    Offline / Send Message
    Forest_Cat polycounter lvl 6
    @andrad Great result! I'm looking forward to the moment when it gets added to the official version of UE4. By the way, how do you plan to solve the problem with Temporal AA, which makes all small meshes disappear at a certain distance? This issue has been raised repeatedly on the official UE4 forums, but I haven't found an adequate working solution.
  • andrad
    Offline / Send Message
    andrad polycounter lvl 5
    By the way, how do you plan to solve the problem with Temporal AA, which makes all small meshes disappear at a certain distance? This issue has been raised repeatedly on the official UE4 forums, but I haven't found an adequate working solution.
    I don't. That's just the way TAA works. But this should rarely be a problem, because most finer decals will be mip-mapped away long before any AA solution renders them invisible.

    And I gotta be honest with you guys, I don't see much of a chance of Epic integrating this into upstream UE. It adds small but significant changes to the rendering code, comes with a performance cost for dbuffer decals even when the new blend modes are not used, and it potentially breaks packaging projects. All of this might not be a problem for some projects, but then again, "might not be a problem for some" is exactly the use case that a custom engine build is better suited for than changing upstream and breaking tens of thousands of people's projects.
  • Nosslak
    Offline / Send Message
    Nosslak polycounter lvl 12

    andrad said:
    And I gotta be honest with you guys, I don't see much of a chance of Epic integrating this into upstream UE. It adds small but significant changes to the rendering code, comes with a performance cost for dbuffer decals even when the new blend modes are not used, and it potentially breaks packaging projects. All of this might not be a problem for some projects, but then again, "might not be a problem for some" is exactly the use case that a custom engine build is better suited for than changing upstream and breaking tens of thousands of people's projects.
    I'm sorry to hear that as I just started to get into using it for the last few days and it seemed to be very promising. I did some tests and while it wasn't perfect the decals blended a lot nicer than the default option in Unreal. I guess this is a conclusion you came to after the optimization pass you were talking about or do you still need to do that?
  • andrad
    Offline / Send Message
    andrad polycounter lvl 5
    @Nosslak Well, the performance impact is something that became known only after profiling, obviously, but it seemed relatively clear to me from the start that chances of merging this into official UE are slim. If it was only the metalness in dbuffer decals and nothing else, there might be a chance that Epic is willing to accept a pull request. But the selective blending part might be too big a change for their tastes.

    I think I'll create a pull request anyway, even if it's just to learn their reasoning for refusing it. But that'll happen after 4.20 comes out. I want to merge UE 4.20 to get an idea of the amount of merge conflicts this will produce. After having confirmed that everything works for the new UE version, I'll submit a pull request and we'll see what they have to say about it.
  • andrad
    Offline / Send Message
    andrad polycounter lvl 5
    @Menchen I finished this a couple of days ago, but I haven't gotten around to writing everything up. I'm hoping I'm ready to post something by the end of the week.
  • andrad
    Offline / Send Message
    andrad polycounter lvl 5
    So, there's good news and bad news. The good news is that I got selective blending to work with regular decals as well, and the performance impact is pretty minimal. The bad news is that the workflow differs a little from dbuffer decals, and the two implementations don't have feature parity.

    The most severe limitation is that metallic opacity and roughness opacity will always be the same. The way gbuffer decals work seems to be coupled heavily with the general layout of UE's gbuffer, where roughness, specularity and metalness are stored in the same render target. Without heavy modifications to the whole render pipeline, opacity can be set separately only for each render target, which means I can blend [color], [normal] and [roughness+metal] separately.
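    In other words, for gbuffer decals there are only three independent opacities, one per render target, and roughness and metal necessarily share one (plain C++ sketch, names made up for illustration):

```cpp
#include <cassert>

// One opacity per gbuffer render target: color, normal, and the target that
// packs roughness/specular/metal together.
struct GBufferOpacity { float color; float normal; float roughSpecMetal; };

// Roughness and metal opacity can only ever be the same value, because they
// live in the same render target.
float RoughnessOpacity(const GBufferOpacity& o) { return o.roughSpecMetal; }
float MetalOpacity(const GBufferOpacity& o)     { return o.roughSpecMetal; }
```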

    A somewhat mitigating factor is that this limitation is noticeable only in specific circumstances and can be worked around in most of those. Still, it's not optimal.
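    To illustrate why the opacities end up coupled: hardware blending applies one blend factor per render target, not per channel, so when roughness and metalness live in the same target they are forced to share an opacity. Here's a minimal Python sketch of that idea; the helper name and the channel layout shown are illustrative, not UE's actual shader code:

    ```python
    # One blend factor per render target, not per channel: when the decal
    # writes to the target holding [metallic, specular, roughness], all
    # three channels get blended with the same opacity value.
    def blend_render_target(decal_rt, surface_rt, opacity):
        return [d * opacity + s * (1.0 - opacity)
                for d, s in zip(decal_rt, surface_rt)]

    # Decal: non-metallic and rough; surface: fully metallic and smooth.
    decal_rt   = [0.0, 0.5, 0.8]   # [metallic, specular, roughness]
    surface_rt = [1.0, 0.5, 0.1]

    # A roughness blend of 0.25 drags the metalness to 0.75 along with it.
    print([round(v, 3) for v in blend_render_target(decal_rt, surface_rt, 0.25)])
    # [0.75, 0.5, 0.275]
    ```

    The same single opacity that softens the roughness also pulls the metalness toward the decal's value, which is exactly the limitation described above.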

    Pics or it didn't happen

    You can activate selective blending by setting the decal blend mode to Selective, as seen in the screenshot below. This is a new blend mode I created for that purpose, so the other modes are left unchanged and work as they did before. This blend mode shares all the limitations of the other gbuffer decals, most importantly: it doesn't work well with baked lighting.



    By now, the first set of images shows nothing special, really. It blends the same way that dbuffer decals blend, and nothing has to be changed in the material graph.




    All the metallic parts overwrite every attribute; the recesses and seams overwrite normals at the incline but retain normals on the surface (like the large one to the right of the metallic strip); and most recessed shapes have a very weak roughness opacity in order to blend the underlying roughness with the one provided by the decal. You can see some faint traces of that in the dots on the bottom.

    This specific material/decal combination would look exactly the same when built with either gbuffer or dbuffer decals. There is virtually no difference that an observer could notice. Let's look at what happens when the same decal is placed on a metallic surface.

    If the normals of the first image look inverted to you, it's because I changed the light's direction in order to better make out the surface. Unfortunately, when viewed side-by-side with the image above this creates kind of an optical illusion that suggests the normals, and not the light, have changed.




    Apart from that, you might think there's nothing wrong with the first image. You'd be sorta kinda right, but only because the decal's roughness opacity is so weak that the adverse effects are barely noticeable. It's still enough to observe the problem in the buffer images, though.

    When looking at the metallic buffer, we can see that there's a ton of grey, which is generally not what we want. All those grey parts come from the roughness opacity, and what happens is this: when it comes time to blend the decal with the environment, the renderer takes the metallic value of the decal (0 for those grey spots), multiplies it with the opacity (the same for roughness and metalness), and blends it with the underlying, fully metallic surface. It comes down to something like 1 * 0.9 + 0 * 0.1 = 0.9 (I don't know the exact blending operation off the dome). The point is that roughness blending always results in metalness blending, which produces metalness values that are neither 0 nor 1, violating PBR specs in most cases.

    The reason the images above still look kind of okay is that metalness can be blended to very light greys without immediately producing non-PBR results. It's okay to have light-grey metalness in areas covered by a thin layer of dust, for example, provided the reflectance value in the base color is still correct.
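    The arithmetic above amounts to a simple lerp. Here's a minimal sketch of the idea in Python; the `blend` helper is an illustrative stand-in, not UE's exact blend code:

    ```python
    def blend(decal, surface, opacity):
        # Illustrative lerp stand-in for the decal blend operation;
        # the engine's exact blend may differ.
        return decal * opacity + surface * (1.0 - opacity)

    # Non-metallic decal (metallic = 0) with a weak roughness opacity of
    # 0.1, placed on a fully metallic surface (metallic = 1):
    print(round(blend(0.0, 1.0, 0.1), 6))  # 0.9 -- a grey, non-PBR metalness
    ```

    Even a weak roughness opacity drags the surface's metalness of 1 down to 0.9, which is where the grey in the metallic buffer comes from.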

    Workarounds and Workwiths

    In summary, unwanted effects will happen if the intended metal opacity differs from the actual roughness opacity and the metal value of the underlying surface differs from the metal value of the decal.

    Let's break that sentence apart: If roughness opacity and metal opacity are the same, there's obviously no problem. Even though the wrong texture channel is used, metal opacity will still be calculated correctly. If roughness and metal opacity differ, but the metal value of the decal and the surface are the same, then the resulting value stays the same. For non-metals, 0 * 0.123 + 0 * 0.877 is still 0, still not metallic. The same goes in reverse for two metals. If they differ, the visible error is proportional to the roughness opacity, which means the error can be reduced by keeping that opacity low.
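    Those three cases can be checked numerically with a small sketch (again using a simple lerp as an illustrative stand-in for the actual blend operation, which may differ):

    ```python
    def blend(decal, surface, opacity):
        # Illustrative lerp stand-in for the decal blend operation.
        return decal * opacity + surface * (1.0 - opacity)

    # Case 1: both non-metal -- any opacity still yields 0, no error.
    print(blend(0.0, 0.0, 0.123))                  # 0.0
    # Case 2: both metal -- any opacity still yields 1, no error.
    print(round(blend(1.0, 1.0, 0.123), 6))        # 1.0
    # Case 3: metal values differ -- the error scales with the opacity.
    for opacity in (0.05, 0.5, 0.95):
        print(round(blend(0.0, 1.0, opacity), 6))  # 0.95, 0.5, 0.05
    ```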

    With all that said, here are some ways to work around this problem:

    1. Don't blend roughness opacity independently. If you never blend roughness independently from metalness, you'll never run into this problem. Their opacities will always be linked and you get by with only one texture for both. This is very limiting, though. Forget blending roughness inside of seams and cracks or anywhere really. Roughness opacity will always have near white or near black values and more even blends cannot be achieved this way.

    2. Use metallic and non-metallic versions of decals. Let's say you have a seam in your decal sheet and that seam blends roughness opacity to simulate dust and dirt that have gathered in it. If this seam is not specifically set to be fully metallic, it will look good on non-metallic surfaces and bad on metallic ones, and vice versa. The solution to this is having two variations of that seam in the decal sheet - a metallic one and a non-metallic one. By placing the metallic seam on metallic surfaces, the problem goes away. The downside to this is that you have to know/decide in advance which surfaces of a mesh are going to be metallic and which ones are not. This severely limits modularity and it might force you to rework some of your meshes if, for example, art direction changes during development.

    3. Use weak roughness opacity. As stated, light greys in the metalness don't break PBR immediately, so you can limit yourself to having roughness opacity very low where it's independent of metal opacity. It will make metals slightly less metallic, but you might get away with it and keep the shading intact. Might also limit overall usefulness, though.

    4. Ignore it. Well, you can always decide to just not give a damn. After all, most of these decals consist of narrow seams, bolts and other very small elements that generally make up only a little portion of screen space and are seldom the center of the action. Who cares about bolts and seams? There's enemies to murder and worlds to save, and no one pauses to stare at a wall to admire the nice shading of that hex nut in the corner over there. In fact, go to Port Olisar and do just that. You will see tons of places where the decal shading might seem a bit off, but it really makes no difference to the overall experience.

    5. Use dbuffer decals. DBuffer decals don't suffer from this problem and blend every material attribute independently. You can always choose to use those instead. If your project makes use of baked lighting, you should be using dbuffer decals anyway, so this issue might not even be relevant to you at all.

    Feature Comparison

    I consider this expanded decal system to be feature-complete. I will look into AO since I have a feeling that it should be possible to get it working for decal materials, but that might just be my limited understanding of UE's rendering pipeline. I also might have another go at getting emissive working for dbuffer decals. For the time being though, I'm finished and here's how dbuffer decals differ from gbuffer decals:

    DBuffer
    • available attributes: Base Color, Metallic, Roughness, Normal
    • selectively blend color, normal, roughness, metal
    • works with dynamic and static lighting
    • adds constant overhead of about 25 instructions for every default lit surface material in the project, whether decals are in the scene or not (this is the cost of enabling dbuffer decals in the project settings)

    GBuffer
    • available attributes: Base Color, Metallic, Roughness, Emissive, Normal
    • selectively blends color, normal, roughness+metal
    • works flawlessly with dynamic lighting, sucks most of the time for baked lighting

    Pick your poison, as they say.

    Performance

    Performance impact is practically negligible. I profiled FSceneRenderer_RenderMeshDecals and the frame time was 0.056 ms for the unmodified engine and 0.053 ms for selective blending, so there's no measurable difference. There are no additional render targets involved, and the only performance impact worth mentioning comes from evaluating a vector instead of a scalar for decal opacity.

    Idiosyncrasies

    There are two things I'd like to mention before closing this bit.

    First, concerning the sort order of mesh planes inside the same decal material. I have not yet figured out how to reliably control which decal plane gets rendered on top of another if both are part of the same material on the same mesh. I wrote previously about how to properly sort different decal materials on the same mesh, but this doesn't apply to stacking decal planes within the same material. For opaque and masked materials, sort order is defined by the geometry, i.e. if you place a polygon in front of another, it will be rendered in front of it, too. The same doesn't seem to be true for translucent materials. If I have a screw and I place it on top of a panel, both belonging to the decal in one material slot, then the sort order is, for all intents and purposes, undefined. It's not really undefined, though, since the rendered order is consistent and this whole thing is deterministic, after all. I just haven't figured out how to control it yet. I suspect it is determined outside of UE, in the DCC app or the FBX exporter.

    Second, if you plan on using both gbuffer and dbuffer decals alongside each other, be aware that there is a predefined stacking order that you cannot control, not even by manually adjusting the decal actor's sort order: gbuffer decals are always rendered on top of dbuffer decals. It's easy to see why that is: dbuffer decals are applied before the base pass, while gbuffer decals are applied before lighting, well after the base pass. By the time the renderer deals with gbuffer decals, the dbuffer has already been processed, its render targets have been destroyed, and there is just no way to access it again to do any kind of sorting. You can see it in the image below, where every decal is a gbuffer decal except the graffito on the right, which gets drawn behind every other decal.


  • andrad
    andrad polycounter lvl 5
    Heads up: I merged 4.20 and everything works as far as I can see. While doing so, I noticed that there is a new decal blend mode called Ambient Occlusion. This sounds promising. But it seems that this type of decal really is for AO only, the other pins aren't even enabled. I don't quite see the usefulness in this, but what do I know? Anyway, maybe I can glean some insight into decal AO from this and incorporate it properly into the custom decal blend modes.
  • frmdbl
    frmdbl polycounter
    @andrad

    I compiled the 4.20 version (from 2 days ago). I created a decal material with some quick textures.
    I think the setup should be ok, but no matter what I do, I can't get it to work properly.

    When I use the 'Selective' blend mode the Base Color just doesn't show, when I change to Dbuffer Translucent etc. (with Metallic) the metal isn't blended.
    I think the 'Multi Opacity' masks are fine. It's a vector4 with color, normal, roughness and metal masks in that order.

    The left one is 'Selective' , the right one is 'Dbuffer'.

  • Vexar
    Vexar polycounter lvl 3
    I am not an engineer, but one of the best decal systems to study that I have used is in the Black Ops 3 renderer, which also includes volumetric decals that shrink-wrap around models.

    In Black Ops 3, they take all the "grunge" decal layers and render them to one final texture, which reduces rendering overhead. This is done at compile time and allows level designers / environment artists to stack up to 5 layers.



  • andrad
    andrad polycounter lvl 5
    @frmdbl Can you post screenshots of the material graph and the blend mode settings of your decal material, and the decal response settings of the material on which the decal is projected?
  • frmdbl
    frmdbl polycounter
    @andrad
    Here's the material graph.
    In terms of the response setting for the underlying material, it's actually my bad. I didn't read your instructions carefully enough and it was left at the default 'Color Normal Roughness'.

    However, changing it does make all channels blend for the DBuffer mode, but the base color of the decal comes out as a 0.5 grey rather than the white it should be.


  • andrad
    andrad polycounter lvl 5
    @frmdbl All right, I'll have a look at it. Which decal blend mode did you set when the base color showed the correct value?
  • frmdbl
    frmdbl polycounter
    @andrad

    Both are 'Dbuffer Translucent CNR Metallic'; the difference is the underlying material.

    On the left the response is the default CNR, on the right CNRM.


  • andrad
    andrad polycounter lvl 5
    @frmdbl I have to wait for the engine to finish building, so I can only test in about two hours. But I have a suspicion that, if correct, would explain everything.

    In the screenshots you posted above, did you use the view mode "Unlit" or the buffer visualization view mode "Base Color"? I suspect it's the former, in which case you should see the decal be as white as you expected when you look at it in the Base Color view mode. Tell me if that works and I'll tell you what happened.
  • frmdbl
    frmdbl polycounter
    @andrad
    Ok, you're damn right :)
  • andrad
    andrad polycounter lvl 5
    @frmdbl Nice. So, everything works as intended and what you were seeing was just the way that metals work in UE. You can confirm that by placing a plain old, regular metallic object in the scene and viewing it in unlit mode. It'll have the same grey color instead of a full white. Set metallic to 0 in the material and it will be white again. And because the decal response mode in the left screenshot ignores all metalness in decals, the decal was non-metallic and therefore fully white. Not so in the right screenshot, where metalness gets properly applied.

    I hope that solves your problems. Shout if there's anything else.
  • LVG
    LVG polycounter lvl 12
    Very cool work, man. Would love to give this a try if my build didn't fail every time.
  • andrad
    andrad polycounter lvl 5
    Heads up! A commenter in the UE4 forums brought to my attention that Epic finally did some work to integrate metallic & specular into DBuffer decals. They did it independently, though, so I had nothing to do with it.

    Here is a link to the forum post. And here is the commit on GitHub.
  • andrad
    andrad polycounter lvl 5
    @Menchen The multiopacity parameter has always been there. This is what I used as a starting point for my changes to the rendering code. My guess is they started to implement proper multi opacity sometime in the past, but never finished it for some reason. All I did was kind of finish it for them.