Hey guys,
As you might know I've been working a lot with shaders lately and I just today learned how to get ZDepth in Mental Mill and get it working in Maya.
Anyhow, I have two questions.
1) What uses (or ideas for uses) are there for ZDepth within a shader itself? Note that this isn't a post-process effect. Experimental ideas I've been playing with so far are depth-based color tint, color ramp, contrast and saturation (there's a rough sketch of the tint idea below). Any other known uses? Ideas?
2) I already have a multipurpose real-time shader and I'm considering adding a B/W ZDepth feature to it, for those who want to play with compositing their regular render with ZDepth as a mask for effects. Would you, as an artist, find this useful in your work?
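To show what I mean with the tint/saturation idea, here's the rough kind of HLSL-style snippet I've been playing with (parameter names like gNearDist/gFarDist/gTintColor are just placeholders, not from any particular shader):

// Rough sketch: tint and desaturate the lit color with distance from the camera.
// All parameter names are placeholders.
float4 gTintColor;   // color to tint towards in the distance
float  gNearDist;    // distance where the effect starts
float  gFarDist;     // distance where the effect is fully applied

float3 ApplyDepthTint(float3 litColor, float3 worldPos, float3 cameraPos)
{
    // 0 at gNearDist, 1 at gFarDist
    float dist = distance(worldPos, cameraPos);
    float t = saturate((dist - gNearDist) / (gFarDist - gNearDist));

    // desaturate towards a simple luminance grey as we get further away
    float grey = dot(litColor, float3(0.299, 0.587, 0.114));
    float3 desat = lerp(litColor, grey.xxx, t);

    // and blend part of the way towards the tint color
    return lerp(desat, gTintColor.rgb, t * 0.5);
}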
Thanks for your feedback.
Replies
Explain what you mean by both. Would "scene depth" be the depth to the farthest point being rendered? And "pixel depth" the depth from nearest to farthest within one object?
With pixel depth you can make the object change color at a distance or fade its specular to reduce aliasing, maybe enhance a rim component.
With both you can create foggy depth stuff, like making a fade at the intersection of the two. You could make a water plane that compares the depth of the plane with the scene behind it to add thickness or fogginess to the fluid.
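Roughly like this for the water idea (only a sketch; it assumes the depth of the scene behind the water is available as a texture, which is the engine-specific part, and all the names are made up):

// Sketch of the water idea: compare the depth of the scene behind the water
// with the depth of the water surface pixel itself. gSceneDepthTex is assumed
// to hold linear depth rendered before the water pass.
sampler2D gSceneDepthTex;
float     gFogDensity;     // how quickly the water gets murky with thickness
float4    gWaterFogColor;

float4 WaterPS(float2 screenUV   : TEXCOORD0,
               float  waterDepth : TEXCOORD1) : COLOR   // linear depth of the water surface
{
    // linear depth of whatever was already rendered behind the water here
    float sceneDepth = tex2D(gSceneDepthTex, screenUV).r;

    // how much water the eye ray travels through before hitting the scene
    float thickness = max(sceneDepth - waterDepth, 0.0);

    // thicker water -> more of the fog color, so use it as alpha over the scene
    float fog = 1.0 - exp(-thickness * gFogDensity);
    return float4(gWaterFogColor.rgb, fog);
}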
What I have at the moment is ray distance; I'm using this as the "pixel depth" you described. Shift color or whatever.
Trying to comprehend what you meant by scene depth. So let's say I have a sphere inside a box-like room. The camera is inside the room, focusing on my sphere. The pixel depth would be the distance to the evaluated point on the sphere's surface, and the scene depth at the same pixel would be the distance to the wall behind the sphere?
With this I could make a fog which would be very dense at the far back and fade away as it nears the sphere's distance?
Thanks for helping btw.
You can get the depth with either per-vertex or per-pixel precision (like basically everything).
You're right, kodde, it's just a distance relative to the queried element (vert/pixel) from the near plane, the far plane being the maximum limiting factor.
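In code terms, something along these lines gives you a 0..1 depth value (just a sketch; the matrix/uniform names are generic):

// Sketch: one common way to get a 0..1 depth value, 0 at the near plane
// and 1 at the far plane. Names are generic.
float4x4 gWorldView;   // object space -> view (camera) space
float    gNearPlane;
float    gFarPlane;

float NormalizedDepth(float3 objPos)
{
    // distance along the camera's view axis
    float viewZ = mul(float4(objPos, 1.0), gWorldView).z;
    return saturate((viewZ - gNearPlane) / (gFarPlane - gNearPlane));
}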
Fog is probably the most common and simplest use of zdepth.
Fading things out based on distance.
Soft particles that fade where they're near other geometry, to avoid harsh clipping (compare the zdepth of the scene prior to the particle being drawn against the depth of the particle, plus some fudged radius value so it's not so linear). There's a rough sketch of that compare a bit further down.
Another term, when using a deferred rendering pipeline, is simply the "depth buffer". The benefit there is that you can read the depth of any pixel in view rather easily, not just the current vert/pixel.
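For the soft particle compare, roughly this (sketch only; how the scene depth gets into a texture is the engine-specific part, and the radius is just an artist-tweaked value):

// Sketch of the soft particle fade: compare the depth of the scene that was
// already drawn with the depth of the particle pixel being drawn now.
sampler2D gSceneDepthTex;   // linear scene depth, rendered before the particles
float     gSoftRadius;      // fudged distance over which the particle fades in

float SoftParticleFade(float2 screenUV, float particleDepth)
{
    float sceneDepth = tex2D(gSceneDepthTex, screenUV).r;

    // 0 where the particle touches the geometry, 1 once it's gSoftRadius in front of it
    return saturate((sceneDepth - particleDepth) / gSoftRadius);
}

// ...then multiply the particle's alpha by this fade factor.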
Just try outputting the zdepth values from your pixel shader to get a visual representation of what's going on (and obviously move the camera around).
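The debug output can be as dumb as this (assuming a 0..1 depth like above; near = black, far = white):

// Sketch: spit the depth straight out of the pixel shader as grayscale.
float4 DebugDepthPS(float depth01 : TEXCOORD0) : COLOR
{
    return float4(depth01.xxx, 1.0);   // near = black, far = white
}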
Now, anyone care to give some feedback or inspiration on what I can do with this at the shader level? I'm thinking useful/experimental shaders to play with in Maya. Since it's Maya and not some über next-next-gen deferred rendering engine of doom, I'm quite limited in what I can do, but there are still lots of possibilities. Read my two questions above.
http://www.youtube.com/watch?v=aUL3Ds_dZS4
http://www.youtube.com/watch?v=0LZMvGfIasc
Hmm... wouldn't it be possible to do an optimized depth render pass from the exact opposite camera direction and position, and thereby get rough "depth info" by comparing these two images? I mean, if you had depth from the opposite angle, you ought to get pass-through on the ears when you're looking straight at the face, or on the nose when looking at a face from the side?
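Something like this is what I'm imagining, super rough (every name here is made up, and it glosses over the usual projection/flip-Y details):

// Very rough sketch of the opposite-direction idea. Assumes a depth map
// (gBackDepthTex, storing linear distance from that camera) was rendered from
// a second camera placed opposite the main one, with gBackViewProj as its
// view-projection matrix.
sampler2D gBackDepthTex;
float4x4  gBackViewProj;
float3    gBackCameraPos;
float     gMaxThickness;    // scale used to normalize the thickness estimate

float EstimateThickness(float3 worldPos)
{
    // project the point we're shading into the opposite camera's screen space
    float4 clipPos = mul(float4(worldPos, 1.0), gBackViewProj);
    float2 uv = clipPos.xy / clipPos.w * 0.5 + 0.5;

    // distance that camera recorded at this pixel (its nearest surface)
    float backDist = tex2D(gBackDepthTex, uv).r;

    // distance from that camera to the point we're shading now
    float myDist = distance(worldPos, gBackCameraPos);

    // the gap between the two is roughly how much material sits in between
    return saturate(max(myDist - backDist, 0.0) / gMaxThickness);
}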