
Is this VR demo real time, CGI or both?

JordanN interpolator
I was watching this Nvidia presentation from last year where they talk about bringing iRay rendered scenes to Virtual Reality and something stuck out to me.

https://www.youtube.com/watch?v=uFahmqmnKX0

For those unaware, iRay is a ray tracing/path tracing renderer. It uses the CPU & GPU to render scenes, but only one frame at a time (so not real time). And because it works on a frame-by-frame basis and is ray traced, there is theoretically no polygon, texture, or shader limit.

However, Huang says they rendered out light probes, and at 5:19 he says "everything is real time" and plays with the HTC headset.
https://www.youtube.com/watch?v=uAVJ3QsJ0fY



Keep in mind, the virtual reality we see in games right now requires a ton of polygon budgeting to keep up with the 90-120 fps requirement. So what is going on with the final output? Is the demo in fact playing back, in real time, the same polygon count and path-traced lighting of a standard iRay scene, or is it a video being delivered to both eyes (and if so, how would you interact with it)?
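To put that fps requirement in numbers (my own back-of-envelope arithmetic, not figures from the talk):

```python
# Rough per-frame budget for VR at common headset refresh rates.
# Stereo rendering roughly halves the budget per eye (ignoring shared work).

def frame_budget_ms(target_hz: float) -> float:
    """Milliseconds available to render one frame at the given refresh rate."""
    return 1000.0 / target_hz

for hz in (90, 120):
    per_frame = frame_budget_ms(hz)
    print(f"{hz} Hz -> {per_frame:.1f} ms/frame, ~{per_frame / 2:.1f} ms/eye")
# 90 Hz -> 11.1 ms/frame, ~5.6 ms/eye
# 120 Hz -> 8.3 ms/frame, ~4.2 ms/eye
```

An offline path tracer that takes minutes per frame is orders of magnitude away from an 11 ms budget, which is why the "real time" claim stood out to me.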





Replies

  • pior
    pior grand marshal polycounter
    The only thing realtime here is the headset tracking. Everything is baked, and there's probably not even a 3d scene in that VR environment - just a sphere that everything is baked to. Look closely - there is no parallax of scene elements occluding each other.

    Basically what he is demoing is a one button export solution to "make a quicktime 360 thingie" from a given iray scene at a given vantage point. Great for archviz/interior design presentations, but not VR in the sense you are thinking.

    TLDR: it's just like a VR porn video.
  • JordanN
    JordanN interpolator
    Yeah, that sounds correct. Unless he's able to look behind the staircase, he was only going around in a circle, as if standing in the middle of a bubble.

    Still, it has me excited for the future of CG. Instead of rendering stills from one camera spot, every viewpoint could be rendered at once. Effortlessly translating the entire scene so you can move around it in VR would be the next step.

  • pior
    pior grand marshal polycounter
    The visual quality of his little "semi-VR" 360 bubble is already achievable in fully walkable VR environments anyway. Robo Recall, heck, even the default HTC space deck environment, looks pretty much just like this lobby scene. It's here.
  • spacefrog
    spacefrog polycounter lvl 15
    Just for completeness, a small correction to the QuickTime thingy mentioned above: thank god QuickTime VR is a thing of the past, and displaying pano imagery on websites etc. is now possible without any plugins. What's shown in the Nvidia/Vive demo seems to be a stereo pano rendering mapped onto a sphere. And yes, this is all without any real 3D content: just two spheres (one for each eye) with equirectangular renderings mapped onto them. This can give great results and pretty good immersion, except of course that you can't move (only rotate your head), and thus no parallax happens when you move your head/body...
    BTW: this doesn't require a Vive or Rift at all. A Gear VR reaches the same quality level (mobile performance is well enough for such applications), and even on a plain Android phone with Google Cardboard you can have this experience, though on Cardboard with some quality tradeoffs.
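    To make the sphere mapping concrete: when the headset samples such a panorama, each eye's view direction is converted to lat-long texture coordinates in the pre-rendered equirectangular image. A minimal sketch of that lookup (my own convention; real renderers may orient the axes differently):

    ```python
    import math

    def direction_to_equirect_uv(x, y, z):
        """Map a unit view direction to (u, v) in an equirectangular panorama.

        u spans [0, 1) around the horizon (longitude), v spans [0, 1] from
        top pole to bottom pole (latitude). Assumes -z is "forward" and
        +y is "up"; that axis choice is illustrative, not iRay's.
        """
        longitude = math.atan2(x, -z)                   # angle around vertical axis
        latitude = math.asin(max(-1.0, min(1.0, y)))    # angle above the horizon
        u = longitude / (2.0 * math.pi) + 0.5
        v = 0.5 - latitude / math.pi
        return u, v

    # Looking straight ahead samples the center of the panorama:
    print(direction_to_equirect_uv(0.0, 0.0, -1.0))  # (0.5, 0.5)
    ```

    Since the lookup depends only on direction, not position, translating your head changes nothing: exactly why there's no parallax.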
  • Bletzkarn
    Bletzkarn polycounter lvl 6
    No movement, so it's pre-rendered, not real time.
  • JordanN
    JordanN interpolator
    So I did a bit more research/reading on iRay VR, and there's actually more to it.

    At a basic level, iRay can take snapshots for both a left and a right eye, which are used to make the 360-degree panorama. This is the version that runs on any headset and isn't performance-taxing.

    What Nvidia showed in the above videos was actually a bit more complex. It uses light field technology, which gives more control and interactivity: toggling lights, changing the color of objects interactively, and "teleporting" around the scene. This was running off their Quadro cards, so in a way it is real time, just not enough for you to modify or interact with the entire scene.

    There's one more thing I could try for myself (since I already work in iRay): apparently there is a way to render scenes in real time for VR, using the ActiveShade mode within 3ds Max. This is what I was getting at in my first post, because I wanted to know if you can take the exact same scene made for a CGI/offline render and render it again in real time for virtual reality.

    You might be wondering, "why not just use Unreal Engine 4/Marmoset to view such projects in real time?" Originally, that was my plan. But since I already have everything made in 3ds Max with the iRay renderer, I would still have to make certain changes when porting assets over and re-rendering them for real time. Since solutions already exist for iRay to render out scenes and display them in VR, this saves me a lot of going back and forth moving assets to Unreal Engine 4 just to achieve the same goal of a VR-ready image! :)

    Sources:
    https://forum.nvidia-arc.com/showthread.php?14883-nvidia-IRAY-VR
    http://www.tomshardware.co.uk/nvidia-iray-vr-gtc-2016,news-52753.html
    http://www.nvidia.com/object/iray-for-3ds-max.html
    https://blogs.nvidia.com/blog/2015/12/02/architects-use-nvidia-iray/
  • pior
    pior grand marshal polycounter
    Yeah, so basically, Nvidia made a cool little export tool to render iRay scenes to simple spheres, so that they can be loaded and viewed on any cheap VR-friendly device, from mere phones to headsets connected to beefy gaming rigs.
    (See: it all makes much more sense as soon as the word "realtime" is banned altogether from the description.)

    Bottom line : if you want to make "sphere" VR images, use any renderer that can render to a sphere camera. If you want to make a proper explorable VR environment, use a game engine. Nothing new here :)
  • JordanN
    JordanN interpolator
    pior said:
    Bottom line : if you want to make "sphere" VR images, use any renderer that can render to a sphere camera. If you want to make a proper explorable VR environment, use a game engine. Nothing new here :)
    I hope I'm not sounding difficult, but another reason I made this thread is that I've been making discoveries that blur the line between game development and offline rendering.

    For example, after I read all those links, I went and updated my software and found that the real-time render mode was dramatically improved. Why is this important? Before the update, the real-time mode was barely usable, so the only practical alternative was sticking to the slow, frame-by-frame path tracer. Maybe it's just a coincidence, but after Nvidia started making their big push into VR, they revisited this function of iRay and gave it better support for real-time shading. Certain materials like glass or SSS are no longer broken, and scenes meant to render for hours I can now explore instantly, with a bit of noise here and there.
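    That "noise here and there" comes from progressive refinement: the viewport shows a running average of random samples per pixel, which starts grainy and converges as frames accumulate. A toy one-pixel illustration (my own sketch, nothing iRay-specific):

    ```python
    import random

    def sample_radiance():
        """Stand-in for one noisy path-traced sample (true mean here is 0.5)."""
        return random.random()

    accumulated, frames = 0.0, 0
    for _ in range(10_000):
        accumulated += sample_radiance()
        frames += 1
        estimate = accumulated / frames  # displayed value, refining every frame

    print(f"estimate after {frames} frames: {estimate:.2f}")  # converges toward 0.50
    ```

    Early frames are noisy but appear immediately, which is what makes the mode interactive; a final-quality frame is just the same loop left running longer.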

    It's still the same software, but even traditional ray tracers are now starting to borrow from real-time technology and using it to push both mediums forward. See what I'm getting at now?
  • low odor
    low odor polycounter lvl 17
    Seurat (and light fields in general) is going to be a game changer. It's more than just a photosphere.

    https://www.youtube.com/watch?v=_rAHM70jfzo


    Nvidia is working on something similar, as well as Otoy.


  • Joopson
    Joopson quad damage
    Somewhat related question: isn't all computer-generated imagery called CGI, whether it's realtime or prerendered?
  • JordanN
    JordanN interpolator
    For convenience, CGI has usually referred to pre-rendered imagery first.

    But that's all starting to blur now. Even Pixar is starting to develop a new hybrid renderer that emphasizes real time. 

    RenderMan 22: live linking to compatible DCC applications
    So far, Pixar hasn’t posted a lot of information about the new features in RenderMan 22, but it has said that the release will feature “always-on” rendering in artist applications.

    Any changes made to scene geometry, materials, lighting or cameras in a compatible application – currently, Maya, Katana, Houdini and Blender – will propagate to RenderMan in real time.

    Pixar describes the system as delivering “incredible interactive frame rates” with the same renderer also used for batch renders.

    Other features due in RenderMan 22 include “fast vectorized OSL shader network evaluation on Intel scalable-SIMD CPUs”.

    http://www.cgchannel.com/2017/08/pixar-unveils-renderman-22-and-renderman-xpu/

    Looks like the race for real-time ray tracing is now on. I feel like I bet on this technology at the right time back in 2016, but instead of waiting, I've been keeping pace with Nvidia's and Pixar's solutions, especially with VR now being available on shelves.
  • danr
    danr interpolator
    Young Terry Trailblazer was the luckiest boy in the whole of Canada. For all the way back in 2016 his inventor grandfather gave him the key to the secrets of offline raytracing, which only he could bend to his will. The other boys were left with their old-fashioned Realtime toys, and young Terry was the envy of the whole village!

    Read an incredible new adventure each week in Spazzer & Chips. In the next issue : a free Viewmaster reel for every reader!