The wonders of technical art (Unreal Engine)


Replies

  • Obscura (veteran polycounter)
    I want to learn more about SDF ray marching, so I started experimenting with reflections as a starting point.


    The goal is to have a nicely lit cornell box at the end, with mirror-like and glossy reflections, direct lighting with area shadows, and hopefully indirect lighting as well.
  • Obscura
    Got the whole scene to reflect and to be reflected.


    So far I'm getting really good performance, and I'm curious to see how much worse it gets once I start adding a lot more complexity. Currently I'm getting nearly 500 fps on a 2080 Ti at 1080p, when the scene fills the whole screen.

    Next time I'll add reflections on reflected objects (so if I make the ball a mirror, for example, its reflection on the wall will still shade as a mirror), and I'll try adding roughness.

    The current code in the custom node looks funny, as it only has really short lines. I should also handle materials in a better way... Will think about this.

    Some functions, such as MAP (the scene description) and GetNormal, are added to a custom ush file so the custom node is less cluttered and there is some automation - I don't need to sample things one by one.

    ///////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
    // ro, rd, ti come in as custom node inputs (ray origin, camera vector, time)
    float stepcount = 0;
    float3 normal = 0;
    float opacity = 0;
    float2 curdist = 0;   // x = distance, y = material id
    float MAT = 0;
    float3 color = 0;
    float3 worldpos = 0;
    float3 reflection = 0;
    float3 reflpos = 0;
    float2 refldist = 0;
    float reflMAT = 0;

    // primary visibility - sphere tracing
    for (int i = 0; i < 150; i++)
    {
        curdist = MAP(ro, ti);

        if (curdist.x < .001)
        {
            opacity = 1;
            MAT = curdist.y;
            worldpos = ro;
            break;
        }

        stepcount += curdist.x;
        ro -= rd * curdist.x;   // rd points toward the camera, so step against it
    }

    normal = GetNormal(ro, ti);

    // base colors
    if (MAT == 1) color = float3(1, 0, 0);
    if (MAT == 2) color = float3(0, 1, 0);
    if (MAT == 3) color = .1;
    if (MAT == 4 || MAT == 5) color = .5;
    if (MAT == 6) color = 1;

    // calculate reflections
    float3 reflvec = reflect(-rd, normal);
    reflpos = worldpos + (reflvec * .01);   // small offset so we don't immediately self-hit
    float fresnel = lerp(0.04, 1, pow(1 - saturate(dot(normal, rd)), 5));

    // reflections loop
    for (int j = 0; j < 60; j++)
    {
        refldist = MAP(reflpos, ti);

        if (refldist.x < .001)
        {
            reflMAT = refldist.y;
            break;
        }

        if (reflpos.x > 1) break;   // left the box, nothing more to hit

        reflpos += reflvec * refldist.x;
    }

    // color reflections
    if (reflMAT == 1) reflection = float3(1, 0, 0);
    if (reflMAT == 2) reflection = float3(0, 1, 0);
    if (reflMAT == 3) reflection = .1;
    if (reflMAT == 4 || reflMAT == 5) reflection = .5;
    if (reflMAT == 6) reflection = 1;

    return color + (reflection * fresnel);
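    The fresnel line in the node code is Schlick's approximation. As a minimal illustration (plain Python, function name is mine), it interpolates from a base reflectance toward a perfect mirror at grazing angles:

    ```python
    def schlick_fresnel(cos_theta, f0=0.04):
        """Schlick's approximation: reflectance rises toward 1 at grazing angles.
        f0 is the reflectance at normal incidence (0.04 is a common dielectric value)."""
        cos_theta = max(0.0, min(1.0, cos_theta))
        return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

    # At normal incidence (cos_theta = 1) only the base reflectance remains;
    # at grazing angles (cos_theta -> 0) the surface becomes a near-perfect mirror.
    print(schlick_fresnel(1.0))  # 0.04
    print(schlick_fresnel(0.0))  # 1.0
    ```

    This matches the HLSL `lerp(0.04, 1, pow(1 - saturate(dot(normal, rd)), 5))` term by term.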




  • Obscura
    Glossy reflections test:


    The gif compression kinda killed the thing...

    There might be a better way of doing this with SDFs, similar to how soft shadows take advantage of the falloff of the SDF, but I'm not sure yet. What I currently do is send multiple rays in randomized directions outwards from the reflection vector, in a cone shape. The cone angle depends on the roughness value.
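    A minimal sketch of that cone sampling idea in Python (not the actual node code; the function and its parameters are my names). It picks a random unit vector whose angle to the cone axis is at most the given half-angle, which would widen with roughness:

    ```python
    import math, random

    def sample_cone(axis, cone_angle, rng=random.random):
        """Random unit vector inside a cone around unit vector `axis`.
        cone_angle is the half-angle in radians; rougher surface -> wider cone."""
        cos_max = math.cos(cone_angle)
        # Pick cos(theta) between cos(cone_angle) and 1, and a random phi.
        cos_t = cos_max + (1.0 - cos_max) * rng()
        sin_t = math.sqrt(max(0.0, 1.0 - cos_t * cos_t))
        phi = 2.0 * math.pi * rng()
        # Build an orthonormal basis around the axis.
        ax, ay, az = axis
        helper = (1.0, 0.0, 0.0) if abs(ax) < 0.9 else (0.0, 1.0, 0.0)
        # tangent = normalize(cross(helper, axis))
        tx = helper[1] * az - helper[2] * ay
        ty = helper[2] * ax - helper[0] * az
        tz = helper[0] * ay - helper[1] * ax
        tlen = math.sqrt(tx * tx + ty * ty + tz * tz)
        tx, ty, tz = tx / tlen, ty / tlen, tz / tlen
        # bitangent = cross(axis, tangent)
        bx = ay * tz - az * ty
        by = az * tx - ax * tz
        bz = ax * ty - ay * tx
        c, s = math.cos(phi) * sin_t, math.sin(phi) * sin_t
        return (tx * c + bx * s + ax * cos_t,
                ty * c + by * s + ay * cos_t,
                tz * c + bz * s + az * cos_t)
    ```

    In the shader the random numbers would come from a per-pixel noise texture instead of a CPU RNG, and the result would replace the mirror reflection vector for each sample.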
  • Obscura
    I noticed some changes in how custom ush files work in the current versions of Unreal. It used to be that if you kept your code in an external ush file, instead of embedding it into an existing one, the engine wouldn't recompile the shaders when you open the editor. What happens now is that it recompiles on open even if you use your own ush. BUT... if you edit your ush while the editor is open, save it, and then press enter in your custom node to recompile, it will recognize the changes in your ush live. BUT... if you do this live ush editing and then close and reopen the editor, it will recompile all shaders on the next open. The live ush update is cool, but the close-reopen recompilation is so stupid. What the fuck.
  • Obscura
    I also mocked up the same test with Unreal's ray tracing enabled, for reference and performance measurement, and man... the performance is far worse than what my ray marcher was producing. When I set max roughness to 1 and go fullscreen, the framerate is around 40 even with 1 sample per pixel. I'll post some comparisons later. I need to add a virtual point light to the ray marcher first, so the two scenes are visually closer to each other.

    This is kinda weird to me, because there is only one big difference between the two: while I use a mathematical scene description, the RTX one uses polygonal meshes. I know that real-time ray tracers use a BVH (bounding volume hierarchy) as an acceleration structure, but I use "sphere tracing" (reference below), so ray tracing is accelerated in both scenes.
    https://www.scratchapixel.com/lessons/advanced-rendering/rendering-distance-fields
    https://www.scratchapixel.com/images/upload/distance-fields/sphere-tracing-examples.png?
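    For reference, sphere tracing in a nutshell: step along the ray by the SDF value, which is the largest step guaranteed not to skip past a surface. A minimal single-sphere Python sketch of the technique described at the links above (names and scene are mine, not the actual node code):

    ```python
    import math

    def sphere_sdf(p, center=(0.0, 0.0, 5.0), radius=1.0):
        """Signed distance to a sphere - the whole 'scene description' here."""
        dx, dy, dz = p[0] - center[0], p[1] - center[1], p[2] - center[2]
        return math.sqrt(dx * dx + dy * dy + dz * dz) - radius

    def sphere_trace(ro, rd, max_steps=150, eps=0.001, max_dist=100.0):
        """March from origin ro along unit direction rd, stepping by the SDF value.
        Returns the distance travelled on a hit, or None on a miss."""
        t = 0.0
        for _ in range(max_steps):
            p = (ro[0] + rd[0] * t, ro[1] + rd[1] * t, ro[2] + rd[2] * t)
            d = sphere_sdf(p)
            if d < eps:        # close enough to the surface: hit
                return t
            t += d             # safe step: nothing is closer than d
            if t > max_dist:   # wandered off into empty space: miss
                break
        return None

    hit = sphere_trace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
    # The sphere's near surface is 4 units away along +z, so hit is ~4.
    ```

    In empty regions the SDF is large, so the ray covers big distances in few steps; that is the acceleration, analogous to what the BVH does for triangle meshes.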
  • Obscura
    I couldn't resist trying out some more fancy stuff before I continue pimping the cornell box.

    Here is a ray marched skylight with multiple samples. Runs nicely. I get around 120 fps with 32 samples per pixel.

  • Obscura
  • Obscura
    I also found that using a higher mip level on the cubemap reduces noise, but it also reduces shadow detail and makes the shadows softer. Which makes sense, because there is less detail in the cubemap at lower resolutions. So ideally, for high quality sky lighting with rich shadow detail, you need a certain cubemap resolution, but unfortunately you will also need many more samples to make it less noisy.
  • Obscura
    Here is a little breakdown of what happens:
    - We do the usual visibility and normal calculation routine.
    - We offset the ray a tiny bit along the normal direction, so we don't immediately fall into our hit distance threshold. Just a very small step.
    - We run a for loop for the occlusion, one iteration per sample. I generate a normalized per-pixel vector noise with an increasing z coordinate on each sample, so I get per-pixel variance in each iteration.
    - We run a nested loop for the occlusion marching. For each sample, we march along the normal direction minus our random direction vector, so we are marching in random directions inside a 180 degree cone outwards from the previously hit surface.
    - Sphere tracing is used again, to speed up the marching somewhat. If we hit our geometry, we don't do anything other than quitting this sample's loop. If we don't hit anything after a certain distance, we sample the skylight.
    - When all the loops are finished, we divide the result by the number of samples taken, to get the average.
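    The steps above can be sketched in Python as a small Monte Carlo estimator (a simplified illustration, not the shader code; all names are mine, and CPU rejection sampling stands in for the per-pixel noise texture):

    ```python
    import math, random

    def skylight_at(p, normal, scene_sdf, sample_sky, n_samples=32,
                    max_dist=10.0, eps=0.001, offset=0.01):
        """Monte Carlo skylight: average the sky over random hemisphere rays.
        scene_sdf(p) -> signed distance; sample_sky(direction) -> radiance."""
        # Tiny push along the normal so we don't re-hit the surface at once.
        p = tuple(p[i] + normal[i] * offset for i in range(3))
        total = 0.0
        for _ in range(n_samples):
            # Random unit direction, flipped into the hemisphere around the normal.
            while True:
                d = tuple(random.uniform(-1.0, 1.0) for _ in range(3))
                l = math.sqrt(sum(c * c for c in d))
                if 0.0 < l <= 1.0:
                    d = tuple(c / l for c in d)
                    break
            if sum(d[i] * normal[i] for i in range(3)) < 0.0:
                d = tuple(-c for c in d)
            # Sphere-trace toward the sky; occluded rays contribute nothing.
            t = offset
            hit = False
            while t < max_dist:
                dist = scene_sdf(tuple(p[i] + d[i] * t for i in range(3)))
                if dist < eps:
                    hit = True
                    break
                t += dist
            if not hit:
                total += sample_sky(d)
        return total / n_samples  # average over all samples taken
    ```

    With no occluders every ray reaches the sky and the result is the plain sky average; with full occlusion it is zero, which is the AO-like behavior described above.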

    If we don't use per-pixel ray direction variance, we get an image that has zero noise, but all the samples will be fully visible. We get sharp shadows with some direction offsets - it kinda looks like having several directional lights with sharp shadows and different rotations. It looks bad at a low sample count. So bad that the noise is better! The noisy version needs fewer samples to look decent, at least in the case of a skylight.

    I'll try something similar with localized lights sometime, like a stochastic sphere light. The stochastic directional light is probably the least noisy, because unless the sun is huge, it would have the smallest penumbra radius. Also, if we wanted to be fully realistic, we should use a gigantic sphere light that is very far away for this. But I don't think that would be efficient at all, and I'm also not sure we would get a considerably better result than using many samples of random directions inside a given radius.

    One more thought about this. Ideally, in a more complex and realistic system, if we use a cubemap as a stochastically sampled ray traced light source and we implement bounces (global illumination), a directional light would not be needed, because the cubemap would provide directional lighting too, as a side effect of the hdr lighting over many samples. Even my skylight in the previous post already has some. The ambient occlusion is a simplification - that's where the bounce should happen. AO as it is, is the lack of bounce: the darker the AO, the more bounces you would need to calculate the correct lighting value. So you could even use an AO map to precompute the number of bounces needed, up to a given maximum.
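    The AO-to-bounce-budget idea at the end could look something like this hypothetical mapping (entirely my sketch of the suggestion, not anything implemented in the posts):

    ```python
    import math

    def bounces_from_ao(ao, max_bounces=4):
        """Hypothetical mapping from a precomputed AO value (1 = fully open,
        0 = fully occluded) to a per-pixel bounce budget: darker AO means more
        of the lighting must arrive indirectly, so spend more bounces there."""
        darkness = 1.0 - max(0.0, min(1.0, ao))
        return int(math.ceil(darkness * max_bounces))
    ```

    Fully open areas would then get zero bounces and fully occluded ones the maximum, concentrating the GI cost where the direct sky contribution is missing.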

