Home / Technical Talk

voxel cone-tracing GI discussion

Ok so I was thinking about voxel cone-tracing for GI, especially now that UE4 is implementing a version of it, and I have a few questions as well as misgivings that I hope can be cleared up.

Foremost, is using the averaged, lower-res voxel level really a way to describe diffuse lighting properly? How does the approach take into account the BRDF of the material if it is using evenly averaged values sampled around a point, and how can it be accurate unless the 'cone' is a hemisphere?

The main way I could see the tech working for diffuse is if each point were sampled from a weighted hemisphere. If I were to sample a point whose normal faces completely to the right, perpendicular to the camera, then given Lambert shading we know that light sources from the front would have very little influence (they go toward black approaching the edges); the diffuse lighting loses influence as the surface normal becomes perpendicular to the light vector. The lighting of a surface point that faces the light source, on the other hand, gets the full diffuse value (and light from behind the object would be at a glancing angle, and picked up by specular/reflectivity).

So unless I'm misunderstanding, a weighted hemisphere would be necessary to sample any point, but using a voxel octree structure you'd be using averaged, unweighted values, right (if you're just sampling lower (higher up?) in the voxel octree)? If the tech really is using a weighted hemisphere, then that's great news, especially if you are able to manipulate the weighting and therefore manipulate the diffuse reflectance model of the object, but I haven't read anything that describes that.

Replies

  • CrazyButcher
    Offline / Send Message
    CrazyButcher polycounter lvl 16
    http://nvidia.fullviewmedia.com/gtc2012/0515-B-S0610.html

    Many cones approximate a hemisphere; you can sample more than one cone. And you do have the normal information as well, to do weighting with your own normal and the sender's normal.

    It's of course an approximation, and not like tracing tons of rays, but since diffuse and glossy reflections don't need that level of accuracy, you get away with it. You will not be able to do something like a perfect mirror reflection.
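    A toy sketch of that multi-cone idea (cone count and tilt angle are made up for illustration, not taken from the paper): one cone along the normal plus a ring of tilted cones, each weighted by the cosine of its angle to the normal, approximates exactly the Lambert weighting the OP is asking about:

```python
import math

def diffuse_cone_basis(tilt_deg=60.0, n_side=5):
    """Build a hypothetical cone set: one cone along the normal (+Z in
    tangent space) plus n_side cones tilted around it, each carrying a
    Lambert (cosine) weight; weights are normalized to sum to 1."""
    cones = [((0.0, 0.0, 1.0), 1.0)]  # (direction, cos(angle to normal))
    t = math.radians(tilt_deg)
    for i in range(n_side):
        phi = 2.0 * math.pi * i / n_side
        d = (math.sin(t) * math.cos(phi), math.sin(t) * math.sin(phi), math.cos(t))
        cones.append((d, math.cos(t)))
    total = sum(w for _, w in cones)
    return [(d, w / total) for d, w in cones]

def gather_diffuse(trace_cone):
    """Weighted sum of per-cone radiance; trace_cone(dir) stands in for
    the actual march through the voxel mip chain."""
    return sum(w * trace_cone(d) for d, w in diffuse_cone_basis())
```

    Since the weights are explicit here, you could in principle bias them to alter the diffuse reflectance model, as the OP speculates.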
  • Gestalt
    CrazyButcher wrote: »
    http://nvidia.fullviewmedia.com/gtc2012/0515-B-S0610.html

    Many cones approximate a hemisphere; you can sample more than one cone. And you do have the normal information as well, to do weighting with your own normal and the sender's normal.

    It's of course an approximation, and not like tracing tons of rays, but since diffuse and glossy reflections don't need that level of accuracy, you get away with it.

    Thank you so much! That link cleared up a lot.
  • Computron
    Offline / Send Message
    Computron polycounter lvl 7
    We had a little discussion on this topic in the unreal thread before Cliffy showed up and everything devolved from there.
  • gboxentertainment
    I have implemented my own version of this technique:

    [six screenshots of the implementation]

    The shadows are also cone traced; I still need to fix up the artifacts (I think it may be due to using surface voxelization instead of solid voxelization; to avoid self-shadowing artifacts I had to skip the base mip in the cone trace).

    No octrees are used yet, the entire scene sits inside a single 64x64x64 3d texture.
    I am getting "unlimited" bounces by creating a revoxelization feedback loop using a two-pass algorithm (this took a bit of tweaking to get right).
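    That feedback loop can be sketched roughly like this (all names invented for illustration; the real passes run on the GPU against the 64x64x64 texture):

```python
def run_frame(scene, voxel_grid, direct_light, cone_trace):
    """One frame of a hypothetical revoxelization feedback loop.
    Pass 1: revoxelize, injecting direct light plus last frame's indirect
    light (read back from the same grid), so one extra bounce accumulates
    per frame. Pass 2: cone trace the refreshed grid to shade the image."""
    new_grid = {}
    for surfel in scene:
        prev_indirect = voxel_grid.get(surfel.voxel, 0.0)
        new_grid[surfel.voxel] = direct_light(surfel) + surfel.albedo * prev_indirect
    voxel_grid.clear()
    voxel_grid.update(new_grid)   # same 3D texture is reused next frame
    return cone_trace(voxel_grid)
```

    With albedo below 1, the per-frame bounce contributions form a geometric series, so the lighting converges rather than blowing up.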

    Please watch my demo video:
    [video: Cone Tracing Test Engine - YouTube]
  • Computron
    Offline / Send Message
    Computron polycounter lvl 7
    Wow, that's awesome.

    Does the light only refract once per pixel? Does it refract again through the back faces?

    That emissive looks awesome, but a bit jerky. I wonder if that will be as noticeable with octrees implemented.

    Do you have AO working?
  • Norman3D
    Offline / Send Message
    Norman3D polygon
    Cool!!!

    Unfortunately "Sparse Voxel Octree Global Illumination" has been dropped from UE4. It was too expensive. More info here.
    "[SVOGI] was our prototype GI system that we used for Elemental last year. And our targets, given that we've had announced hardware from Sony, that's where we're going to be using Lightmass as our global illumination solution instead of SVOGI," senior technical artist and level designer Alan Willard told Eurogamer, stressing that this new iteration of the technology has evolved significantly beyond the current-gen system used in titles like Mass Effect 3. Certainly, just the presence of so much more memory on next-gen platforms should improve lightmap quality on its own.
  • Computron
    Offline / Send Message
    Computron polycounter lvl 7
    That only confirms that they won't have it in the PS4 titles. They could still have it in the editor for previz and in high-end PC games, although that part is still vague.
  • gboxentertainment
    Computron wrote: »
    Wow, that's awesome.

    Does the light only refract once per pixel? Does it refract again through the back faces?

    That emissive looks awesome, but a bit jerky. I wonder if that will be as noticeable with octrees implemented.

    Do you have AO working?

    I'm using "unlimited bounces" via a feedback loop (i.e. revoxelizing the scene each frame and re-injecting the indirect+direct color values into the same 3D texture). It's very difficult to see due to the low quality of the cone-traced refraction, but technically you can have multiple refractions through several different objects. I am tracing a single refractive cone using the specular cone tracing function (replacing reflect with refract). In my final implementation, I would probably replace the cone-traced refraction with deferred shader refraction for higher quality.

    I don't believe octrees will make a difference with the jerkiness; in fact, I might not implement octrees at all due to their inefficiency. Instead, I want to implement some sort of cascaded, camera-position-dependent detailed voxelization as introduced at http://realtimevoxels.blogspot.com.au/, because it is much faster, not having to construct, sort and look up an octree every frame.
    The jerkiness is due to each triangle entering a new voxel: there is no interpolation based on density to gradually ramp the color between empty space and solid object. So, for instance, when a little bit of a triangle enters a new voxel, the voxel value suddenly changes from 0 to 1.

    I have not purposely implemented AO in this scene - everything is naturally based purely on the diffuse/specular cone traces.
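    For anyone following along, the diffuse/specular traces mentioned here generally boil down to a front-to-back march through prefiltered voxel mips, with the mip level driven by the cone's width at each step. A rough single-cone sketch (the sample callback is a stand-in for the quadrilinear 3D texture fetch; step sizes and names are invented):

```python
import math

def cone_trace(sample, aperture, max_dist=10.0, voxel_size=1.0):
    """Hypothetical cone march: step along the cone axis, reading a
    coarser mip as the cone widens, compositing front-to-back until
    nearly opaque. sample(dist, mip) -> (color, alpha)."""
    color, alpha = 0.0, 0.0
    dist = voxel_size
    while dist < max_dist and alpha < 0.99:
        diameter = max(voxel_size, 2.0 * dist * math.tan(aperture / 2.0))
        mip = math.log2(diameter / voxel_size)  # wider cone -> coarser mip
        c, a = sample(dist, mip)
        color += (1.0 - alpha) * a * c          # front-to-back compositing
        alpha += (1.0 - alpha) * a
        dist += diameter * 0.5                  # step scales with cone width
    return color, alpha
```

    A narrow aperture gives glossy/specular behavior (fine mips, short steps); a wide one gives the cheap diffuse gather.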
  • Jerc
    Offline / Send Message
    Jerc interpolator
    I'm pretty sure the work involved in making a custom PC version with SVO would be overkill. You would need to build a whole new lighting setup for the game, as Lightmass and SVO would behave differently almost everywhere.
    Still having SVO for previz would be killer but I'm not counting on it too much.
  • RC-1290
    Offline / Send Message
    RC-1290 polycounter lvl 7
    Jerc wrote: »
    I'm pretty sure the work involved in making a custom PC version with SVO would be overkill. You would need to build a whole new lighting setup for the game, as Lightmass and SVO would behave differently almost everywhere.
    Having this kind of dynamic indirect lighting as an option would still be nice. Unity gives you the choice to switch to different lighting styles, which is very useful. Not all features are supported on all platforms, but that's no reason to cut them out entirely.

    Hopefully it is left up to the end developer to choose whether or not to spend the extra time designing scene lighting twice / faking interactive indirect and emissive lights.
  • CrazyButcher
    Offline / Send Message
    CrazyButcher polycounter lvl 16
    @gboxentertainment
    Cool stuff. Given you are not using actual sparse octrees, which would be a requirement for large scenes and reasonable precision, what you are intending to do is similar to what Crytek does for GI:
    http://www.vis.uni-stuttgart.de/~dachsbcn/download/lpv.pdf
  • Computron
    Offline / Send Message
    Computron polycounter lvl 7
    gboxentertainment wrote: »
    I'm using "unlimited bounces" via a feedback loop (i.e. revoxelizing the scene each frame and re-injecting the indirect+direct color values into the same 3D texture). It's very difficult to see due to the low-quality of the cone-traced refraction, but technically you can have multiple refractions through several different objects. I am tracing a single refractive cone using the specular cone tracing function (replacing reflect with refract). In my final implementation, I probably would replace the cone-traced refraction with deferred shader refraction for higher quality.

    I don't believe octrees will make a difference with the jerkiness; in fact, I might not implement octrees at all due to their inefficiency. Instead, I want to implement some sort of cascaded, camera-position-dependent detailed voxelization as introduced at http://realtimevoxels.blogspot.com.au/, because it is much faster, not having to construct, sort and look up an octree every frame.
    The jerkiness is due to each triangle entering a new voxel: there is no interpolation based on density to gradually ramp the color between empty space and solid object. So, for instance, when a little bit of a triangle enters a new voxel, the voxel value suddenly changes from 0 to 1.

    I have not purposely implemented AO in this scene - everything is naturally based purely on the diffuse/specular cone traces.

    I came across your gamedev thread yesterday while googling SVOGI stuff, and I read through it. I agree about the post-process refraction replacement; it seems like a good idea, but I do like the idea of everything being somewhat more physically accurate. Do you think there would be any way to solve the jerkiness problem for moving emissives? Some kind of temporal antialiasing, or something of the sort?
    Jerc wrote: »
    I'm pretty sure the work involved in making a custom PC version with SVO would be overkill. You would need to build a whole new lighting for the game as Lightmass and SVO would behave differently almost everywhere.
    Still having SVO for previz would be killer but I'm not counting on it too much.

    Do you think it would be too difficult to make them work together, and that because of this one of them wouldn't make the cut?

    I mean, I don't doubt it would be difficult, but doesn't Unreal Engine 3 already have to deal with somewhat similar problems in matching its full and mobile renderers? There are also several games that have gone the route of implementing their own lighting systems or middleware, such as what Medal of Honor did with Beast. I imagine that since they already have both SVOGI (which they have been touting since the beginning, and which starred in a GDC talk) and Lightmass in place (which their wording seems to indicate is the low-end replacement for SVOGI), there is a good possibility that both will make the cut. Or am I totally off base?
  • o2car
    Offline / Send Message
    o2car polycounter lvl 12
    SVOGI might look good and perform well in a test scene for a SIGGRAPH paper, but it's not feasible for lighting a real game environment.
    Many have tried but failed. Maybe the next next-gen can figure out a way to use this tech. : )
  • Computron
    Offline / Send Message
    Computron polycounter lvl 7
    One would hope that by then we'll be on to full ray tracing, at least at the level of what the Brigade engine can do.
  • EarthQuake
    Do do do do want.
  • radiancef0rge
    Offline / Send Message
    radiancef0rge Polycount Sponsor
    [video: Radeon Sky - Ruby Off-Screen Tech Demo - GDC 2013 - YouTube]

    /relevant. I'm pretty positive this is their latest build with voxel-based lighting.
  • Computron
    Offline / Send Message
    Computron polycounter lvl 7
    That's CryEngine 3; it's what the Illfonic guys were up to for the last several months.
    It's basically a benchmark for AMD cards, as well as a reboot of the Ruby character, if you will. It doesn't have anything to do with voxels.

    Here's some bootleg-ass GDC slides about the making of this benchmark.


    ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


    Anywho, back on topic: the Bitsquid engine is using voxel cone-tracing to drive some really nice, coherent AO (shown @ 2:12).

    [video: GDC 2013 Showreel - YouTube]
    EarthQuake wrote: »
    Do do do do want.

    Is this in reply to gbox's SVOGI engine, or the Brigade mention? Both are amazing... :)
  • radiancef0rge
    Offline / Send Message
    radiancef0rge Polycount Sponsor
    I'm pretty sure that AMD integrated their voxel tech into CryEngine; they've tech-demoed voxel lighting in the past, in 2009.
    Edit: However, looking at the slides, it does appear to be IBL. Blurgh, I'm wrong! :poly127:

    Anyway, back to the subject: doesn't dynamic GI take a large part of the artistic control out of a lighter's hands? I will admit it looks nice in philipk's work in that video.
  • Computron
    Offline / Send Message
    Computron polycounter lvl 7
    radiancef0rge wrote: »
    I'm pretty sure that AMD integrated their voxel tech into CryEngine; they've tech-demoed voxel lighting in the past, in 2009.

    Anyway, back to the subject: doesn't dynamic GI take a large part of the artistic control out of a lighter's hands?

    You will have to elaborate on that AMD voxel stuff then.




    On the topic of dynamic GI 'removing artistic control', I would say it's the opposite.

    The enhanced workflow from rapid iteration alone would make everything look better.

    Plus, given that you would effectively have access to cheap realtime area lights (and if you do the shadowing directly in the voxel pass, as gbox's engine ostensibly does, you would also get extremely cheap shadowed lights), you would be about as close to "painting with light" as we have ever been.

    Just an idea: you could probably, quite literally, open a scene in Mudbox and paint area light sources. Given the right tools to go along with it, you could use them in interesting ways, such as restricting the direct visibility of the light source, putting some lights into ambient-only/specular-only layers, separating out various lighting phenomena into passes for more specific per-object compositing, etc., kinda like CryEngine and many film workflows do now. There is a lot of control and flexibility in that.

    I think many people get the feeling that the newer deferred-shading engines give them less control over the scene's lighting, and feel that the tradeoff of extremely cheap realtime point lights as a replacement for lightmaps did not work out that well. It's partly true, and part of the solution would be to allow for more varied light types/shapes and controls, starting with something like realtime area light sources with true soft/area shadows.

    The Blender game engine and the new CryEngine (from Crysis 3) recently started to offer something to this end. I posted about it earlier in the tech art thread:
    Computron wrote: »
    Martins Upitis got realtime area lights of rectangular shape and arbitrary texture (even video! shown at 7:16 in the first video) working in the Candy build of Blender 3D. Specular and diffuse area lighting work excellently and are supposedly very efficient.

    http://www.youtube.com/watch?v=3JUYFGkmE1Y


    http://www.youtube.com/watch?v=f8Foaf7u7Qk

    Supposedly, up until recently (before Wall-E, I think?), even Pixar weren't simulating actual, full-on GI for most of their shots (although their renderer supported it), but were instead faking the effect with area lights for faster render times. I can't wait until more engines get this feature; there are so many things you can do with area lights. The CryEngine for Crysis 3 will also supposedly have this.

    Now all we need is an efficient realtime area soft shadowing method.
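    For intuition, the diffuse contribution of a rectangular area light can be brute-forced by averaging Lambert-weighted samples over the rectangle (a naive sketch with invented names; real implementations such as the Candy build presumably use far cheaper analytic or representative-point approximations):

```python
import math

def rect_light_diffuse(p, n, corner, edge_u, edge_v, samples=8):
    """Average the Lambert term (with inverse-square falloff) over a
    grid of sample points on the rectangle defined by corner + u*edge_u
    + v*edge_v, as seen from surface point p with normal n."""
    total = 0.0
    for i in range(samples):
        for j in range(samples):
            u = (i + 0.5) / samples
            v = (j + 0.5) / samples
            s = [corner[k] + u * edge_u[k] + v * edge_v[k] for k in range(3)]
            d = [s[k] - p[k] for k in range(3)]
            dist2 = sum(x * x for x in d)
            cos_t = max(0.0, sum(n[k] * d[k] for k in range(3)) / math.sqrt(dist2))
            total += cos_t / dist2  # Lambert term with inverse-square falloff
    return total / (samples * samples)
```

    Sampling a texture (or video frame) at (u, v) instead of a constant is what gives the textured area lights shown in the videos.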

    Here's an embed of those videos:

    [video: BGE Candy features - textured area lights - YouTube]

    [video: BGE Candy features - area lights - YouTube]

    The AMD demo slides had a lot of good to say about area light sources as well.


    Also, I agree, PhillipK is a BOSS.
  • gboxentertainment
    Computron wrote: »
    Do you think there would be any way to solve the jerkiness problem for moving emissives? Some kind of temporal antialiasing, or something of the sort?

    My main goal currently is to solve this problem, and so far I'm making a little bit of progress. The jerkiness is caused by each new voxel lighting up completely as soon as a triangle touches it. What I have been able to do so far is create a voxel density by calculating the percentage of the voxel that each triangle occupies, based on its dominant axis. It doesn't cost much, either. So now the opacity of the voxel is based on its density.

    Still running through some teething issues but I will let you know how it goes.
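    The occupancy idea described above can be sketched like so (a rough stand-in, not gbox's actual code; a real voxelizer would clip the triangle polygon itself, whereas clipping its bounding box, as here, overestimates coverage):

```python
def dominant_axis(normal):
    """Index (0, 1, 2) of the axis along which the triangle projects largest."""
    return max(range(3), key=lambda i: abs(normal[i]))

def voxel_occupancy(tri, normal, voxel_min, voxel_size):
    """Fraction of the voxel face covered by the triangle, viewed along
    its dominant axis: clip the triangle's 2D bounding box to the voxel
    cell on the two remaining axes and multiply the clipped extents."""
    axes = [a for a in range(3) if a != dominant_axis(normal)]
    frac = 1.0
    for a in axes:
        lo = max(min(v[a] for v in tri), voxel_min[a])
        hi = min(max(v[a] for v in tri), voxel_min[a] + voxel_size)
        frac *= max(0.0, hi - lo) / voxel_size
    return frac
```

    Driving voxel opacity by this fraction ramps the value smoothly from 0 to 1 as a triangle slides into a voxel, instead of snapping.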
  • osman
    Offline / Send Message
    osman polycounter lvl 15
    Is there anything available right now to play around with voxel cone tracing stuff? (Sorry if I missed the post with that info in it.)
  • Computron
    Offline / Send Message
    Computron polycounter lvl 7
    osman wrote: »
    Is there anything available right now to play around with voxel cone tracing stuff? (Sorry if I missed the post with that info in it.)

    Yea.

    Kurt Loeffler won runner-up in Unity's DX11 competition several months ago with his Unity implementation of SVOGI. Play around with that.
  • osman
    Offline / Send Message
    osman polycounter lvl 15
    Thank you! I was hoping for something I could drop my own assets into and play around with placing lights, etc. :)
  • EarthQuake
    osman wrote: »
    Thank you! I was hoping for something I could drop my own assets into and play around with placing lights, etc. :)

    There is a playable demo here: http://www.altdevblogaday.com/2013/0...-cone-tracing/

    Though I don't think you can drop your own assets in; maybe play around with the source content and mod it?
  • o2car
    Offline / Send Message
    o2car polycounter lvl 12
    I was super hyped about SVOGI a year ago, but the more examples I see, the less sense it makes to use it. It's a performance hog, and I honestly don't know if I like how it looks. Seems like the indirect falloff is just way too short.
  • e-freak
    Indirect falloff can/should be controllable. Performance will need to get better, yes, but I'm pretty sure it can be done within a reasonable amount of time.
  • gboxentertainment
    o2car wrote: »
    I was super hyped about SVOGI a year ago, but the more examples I see, the less sense it makes to use it. It's a performance hog, and I honestly don't know if I like how it looks. Seems like the indirect falloff is just way too short.

    Every VCT example I have seen varies quite a bit, although I do believe each one uses a slightly different methodology.
    With my engine, I did have to add some "fudge" factors to tweak it to look as good as I can make it. Compared to a path-traced scene, it does differ significantly.

    In terms of performance, the biggest issue is the requirement for octrees (unless someone can come up with an alternative way of rendering large scenes). With the next generation of GPUs, whether it be NVIDIA's dynamic parallelism or AMD's sparse 3D textures, we should be able to get much better performance.
  • gboxentertainment
    So I've just updated my voxel cone tracing engine. Please check out my video below:

    [video: Cone Tracing Experiment 2 - YouTube]
  • o2car
    Offline / Send Message
    o2car polycounter lvl 12
    Wow, that's cool! What's the performance like, though?
  • gboxentertainment
    o2car wrote: »
    Wow, that's cool! What's the performance like, though?

    20 fps at 1024x768 on a GTX 485M.