Hi guys, I am moderately experienced with UDK's material editor but I have no idea how to "program" a very particular feature into my material... I'd like to create something very similar to DepthBiasedAlpha, but with a twist:
I want the material's opacity to change depending on the Z-axis depth to the surface beneath it, rather than the depth from the perspective of the camera.
So imagine that there is a plane named 'plane O' that is parallel to the ground. Plane O has a totally opaque material applied to it. Now, 16uu above it is another parallel plane P. P has the special translucent material I described, so it is translucent, but its translucency does not depend on the camera's viewing angle, since it is always 16uu above plane O.
How can I make my material "read" the Z-depth beneath it? Is such a thing doable with just the default material editor? If not, could I perhaps learn how to program it in?
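In pseudo-shader terms, this is roughly what I'm imagining (a hypothetical HLSL-style sketch; SurfaceZBelow is exactly the unknown value I'm asking about):

    // Hypothetical sketch of the opacity I want. SurfaceZBelow would be the
    // world-space Z of the opaque surface under this pixel -- the value I
    // don't know how to read in the material editor.
    float gap = WorldPosition.z - SurfaceZBelow;        // vertical distance in uu
    float FadeDistance = 16.0;                          // fully opaque once 16uu above
    float Opacity = saturate(gap / FadeDistance);       // transparent where it nearly touches, like DBA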
Replies
Is what you're attempting to make something that can be observed in the real world, or is it something that's meant for gameplay?
Yeah, I wasn't at my desktop when I made that post so the example was kind of bad. But I am at my PC now so here come the pictures:
In the above two pics, you can see the kind of blending effect I am looking for. I want to make a series of modular cave/tunnel meshes that blend together nicely instead of leaving ugly seams where they intersect. In the pictures, only the floor pieces are shown.
The white material is 100% opaque. The purple/white material is translucent.
So, it seems like regular DBA should do the job just fine, right? The problem, though, is this:
For some reason, the translucent material doesn't occlude the surface behind it if that surface is the same kind of material as itself.
I've tried both regular DBA and the quasi-DBA that you see in the Material Editor screenshot, and both have this problem. I thought introducing an IF node with a cutoff value would help, but it doesn't really do anything.
I realize now that the solution I described in the first post will not work in certain cases (e.g. when two cave walls intersect each other). So the real question is: how do I get this kind of DBA to work even when two translucent materials overlap each other?
I realized that even with the simplest possible translucent material (blend mode set to BLEND_Translucent and a static scalar 1.0 plugged into Opacity), the exact same problem persists, even though it should theoretically render as totally opaque.
Here's an example by Hourences: Solus Overview pt. 1
An overview of transparency/blending rendering: Blending Tutorial
Though Hourences' tutorial uses UE4, you could probably do the same thing in UDK with little to no finagling.
I am going to conclude that I cannot have my modular tunnels exactly the way I want, because the walls are too smooth to be blended together seamlessly between meshes:
Hourences' cave static meshes from his game Solus, in comparison, are very rough, so it's okay to just have opaque meshes intersecting each other without any kind of blending effect. I'll have to make my meshes rougher too, as a compromise.
Sorry, I have no idea what this is. Can you explain?
This is a good example where he uses world-space mapping as a dirt overlay: http://www.chrisalbeluhn.com/UDK_Asset_Position_Offsets_Texturet_Tutorial.html
If you want to paint out seams between meshes, you can make a material that tiles in world space (so it always tiles everywhere), then vertex paint between that and your normal material.
But my meshes don't line up (as in, they are designed to overlap slightly), so there will be no nice way to blend their surfaces together, right?
Yes, that's the solution. It's simply explained here: http://www.polycount.com/forum/showthread.php?t=140607
Then use the same tiling rock texture (or material function) on two objects. Mask the common texture out (for example with vertex color as a lerp input) on one of them, except at the edges (the place of transition).
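In shader terms the blend is just a lerp; a minimal sketch (all names are placeholders):

    // Vertex color red drives the blend between the object's own material
    // and the shared tiling rock; paint red = 1 only near the transition.
    float3 blended = lerp(OwnMaterialColor, SharedRockColor, VertexColor.r);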
Honestly, I'm not really sure I understand the theory here. How is it possible for a static mesh to be textured in a way that doesn't depend on the UVs...? That just makes no sense to me at all, since I've only ever had to deal with objects where the UV mapping dictates the way the texture is unwrapped.
0 is the beginning, 1 is the end.
UVs are infinitely wrapped (by default settings), so they always fit within 0 and 1.
That means 1.1 is the same as 0.1,
1.2 = 0.2,
999999.3 = 0.3.
Red channel is U, green channel is V. After all, the colors are just representation of a two-item vector. A pair of numbers.
To the conclusion then:
You ask why you don't need a TexCoord node? Because TexCoords just returns a pair of numbers. The UVs input on a texture node means: "What pixel of the texture should I choose for this pixel of the object?"
If you feed this input with TexCoords, it gets the answer from the UV map. But you can feed it with anything else. For example - world coordinates ;>
Recreate this basic example first:
Mask = ComponentMask in UDK. (0.02,0.03) is a Constant2Vector.
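For reference, here's what that node graph boils down to in HLSL (a sketch; Tex stands for your texture sampler):

    // World XY position, scaled down, fed straight into the UVs --
    // no TexCoord node anywhere.
    float2 uv = WorldPosition.xy * float2(0.02, 0.03); // ComponentMask(RG) * Constant2Vector
    float4 color = tex2D(Tex, uv);                     // wrapping handles values outside 0..1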
Don't hesitate to experiment. Shader sorcery can lead to funny results in the viewport before you get the hang of it (and that's what I like about it).
I figured out why it wasn't working for me. When I tried to emulate bbox's material, I mistook the "Object World Position" node for the "World Position" node. But now it works with my new setup:
So literally, all the material does is project the texture straight down from an infinitely large XY-plane onto whatever surface the material is on. That is genius. But I guess nothing can be done about the hard line at the mesh intersections, right?
I'm going to have to figure out how to combine this with XZ and YZ projection so that I can apply the same technique to my tunnel-roof meshes, which are inverted-U-shaped tubes that go over the ground. I'm guessing I should use the surface-normal vector to control a Lerp that blends between horizontally and vertically projected textures.
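Something like this untested sketch is what I have in mind (all names made up):

    // Triplanar: blend three world-space projections by the surface normal.
    float3 w = abs(normalize(WorldNormal));
    w /= (w.x + w.y + w.z);                             // weights sum to 1
    float4 top   = tex2D(Tex, WorldPosition.xy * Tile); // XY (top-down) projection
    float4 sideX = tex2D(Tex, WorldPosition.xz * Tile); // XZ projection
    float4 sideY = tex2D(Tex, WorldPosition.yz * Tile); // YZ projection
    float4 color = top * w.z + sideX * w.y + sideY * w.x;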
Thanks a lot you guys.
Such a solution is available for materials with transparency (particles, water): http://udn.epicgames.com/Three/DepthBiasBlendUsage.html It's called DepthFade in UE4: https://docs.unrealengine.com/latest/images/Engine/Rendering/Materials/ExpressionReference/Depth/DepthFade1.jpg
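Conceptually the fade is just this (a sketch, not the exact engine code):

    // 0 where the translucent pixel touches the opaque scene behind it,
    // 1 once it is FadeDistance units in front of it.
    float Opacity = saturate((SceneDepth - PixelDepth) / FadeDistance);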
Here you would probably just have to align the meshes right and avoid overly hard angles at their intersections. It's a lighting problem now, nothing texture-related.
At the intersections you can use custom lighting for a smoother transition.
teaandcigarettes made a really nice overview of the material in their environment, which also uses world-projected UVs.
http://www.polycount.com/forum/showpost.php?p=1518875&postcount=23
Ideally DepthFade (DepthBiasedAlpha) could be used in an opaque material to drive the lerp.
But currently I think you can only access scene depth / depth-biased alpha in translucent materials, which will have the sorting issues from the original post.
I tried using the same World Projection method to make a material that will wrap around any tunnel surface, regardless of surface direction. I think I'm getting close to a good solution, but the blending is very awkward on surfaces that are not close to parallel with the XY, YZ, or XZ planes... I'll have to figure out how to make it better somehow.
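One thing I plan to try (a hypothetical, untested tweak): raise the normal-derived weights to a power before normalizing, so diagonal surfaces snap to their dominant projection faster:

    // Higher Sharpness narrows the blend zone between projections.
    float3 w = pow(abs(normalize(WorldNormal)), Sharpness); // e.g. Sharpness = 4
    w /= (w.x + w.y + w.z);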
This sounds like a really good idea. I looked at teaandcigarettes' material, and the "flat transition" between the mesh and the terrain is very nice. I'll need to learn how to use Custom Lighting first, though. And before I get to that, I need to figure out the triple world projection method... one step at a time, I guess.
Triplanar projection.
https://forums.epicgames.com/threads/768080-3d-projection-texture-coordinates-for-Terrain-layers
link found here:
https://answers.unrealengine.com/questions/42505/set-a-vertex-uv-based-on-world-space-coords.html
How much more "expensive" is it if I enable "Per-pixel camera vector" so that I am able to have both triplanar mapping and vertex painting? Is it only okay to enable it in special cases? Or did the EpicGames devs intend for it to solve common problems like mine?
My UV mapping is nice and seamless now thanks to the Triplanar projection, but of course it hasn't solved the abrupt lighting at the mesh seams. I want to achieve the same lighting effect as this guy:
I know he is using vertex painting, so I thought I could copy him by doing this:
But for some reason it has no effect on the lighting (and yes, I have per-pixel camera vector enabled). Theoretically, I think my solution is sound. Vertex Alpha controls the degree of blending, and Vertex Red controls the brightness of the lighting at that point. But it doesn't work and I don't know why. I've never done custom lighting before. Any advice on what I am doing wrong?
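For reference, here is roughly what I believe my node graph computes, rewritten as HLSL (a reconstruction with made-up names, since I clearly have something wrong):

    // Basic diffuse term fed to the CustomLighting input.
    float3 lit = DiffuseColor * LightColor * saturate(dot(Normal, LightVector));
    // Vertex red scales brightness; vertex alpha controls how much of
    // that painted adjustment is applied.
    float3 CustomLighting = lerp(lit, lit * VertexColor.r, VertexColor.a);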
But now I'm not so sure I even want to use vertex painting to cover up the lighting seams. In a level that uses these tunnels, there could easily be over 200 intersections between all the tunnel pieces, and it would be extraordinarily tedious for a level designer to have to re-paint the lighting each time he moves or adjusts a piece.
Another user suggested using DepthFade, which I assume is just UE4's new name for DepthBiasedAlpha:
(In the above pic, there appear to be no sorting issues, which makes me very curious.) This looks very attractive to me right now, because it appears to accomplish what I was trying to do in the first post, but without the sorting issues. Is this kind of DepthFade doable in UE3? How can I accomplish DepthFade without the sorting issues?
And if DepthFade is not doable without the sorting issues, is there any other kind of non-VertexPainting solution I can explore?
Edit: D'oh. I read the documentation for translucency wrong. DepthFade works if I enable the following: Two Sided Separate Pass and Use Lit Translucency Depth Pass. I just hope the material isn't overly expensive, since I'm combining triplanar projection and DepthBiasedAlpha in it.
In the vertex-blending one, it looks good because a material is shared across the transition and the transition is fairly evenly, and brightly, lit.
So to be sure of your final result you'll need to have lighting in place and built. (building is very important)
You have some stronger options though.
1: Build better meshes. Make sure the normals at your transition areas are all aligned so the lighting builds smoothly across them. Actually design your set to fit together well so you don't have to pull shader voodoo to get the result you want.
Really this is like having a car with dented fenders and hoping you can make it look smooth and new by finding the right paint. You possibly COULD concoct a paint that reflects light in such a way as to make things nearly featureless from every view angle.. or you could bolt on new fenders.
2: Use other meshes to hide glaring intersections. This is done ALL THE TIME. Seriously. Some small rocks, some sand, even some darker drippy decals along the wall base. It will go a long way to selling the scene AND covering up issues with lighting. Remember that static meshes are lit either per vertex or by light maps. This isn't really affected by the lighting model unless you go unlit. Custom lighting still uses shadow maps for directional and dominant light sources.
I agree that the meshes are not very good. I'm currently working on some meshes that have rougher topology that will hopefully make the transition less noticeable.
You guys have been a fantastic help with explaining the triplanar projection and mesh blending techniques that other people have used, so I just wanted to say thanks again to everyone.
Unfortunately I can't go and make modular pieces that fit seamlessly together without intersecting, because I don't want to restrict myself to 90° angles and having to work exactly on the snap grid. But I will try out the other options like normal bending and decals to see what works best.