
GDC 2017 - State of Unreal "Realtime Rendering for Feature Film: Rogue One a Case Study."

ingbue
So I've watched the recorded 'State of Unreal' presentation from GDC 2017 (https://youtu.be/K6tRt7c2elU, starting at 12:56). One thing that left me overflowing with curiosity, and a slight sense of "WTF did I just see?", was John Knoll's very short piece on the ADG GPU renderer integrated into UE4 at ILM for actual final shots used in Rogue One, with a teaser for more information at their presentation later that day, "Realtime Rendering for Feature Film: Rogue One a Case Study", of which there are unfortunately no recordings or articles...

Has anyone attended the presentation, or does anyone know more about what these Jedi masters are up to?!

Replies

  • Swaggletooth
    Realtime engines are increasingly being used for movies and the like, though from what I've heard they're typically used for early prototyping, before the 3D team completely goes to town. It makes sense, in the same way that blocking out or whiteboxing does for us.
  • Ged
    That is very impressive. I really didn't expect it to be such a complicated shot; I'm curious how that was done too.
  • ingbue
    Swaggletooth said:
    Realtime engines are increasingly being used for movies and the like, though from what I've heard they're typically used for early prototyping, before the 3D team completely goes to town. It makes sense, in the same way that blocking out or whiteboxing does for us.
    Swaggletooth, I completely agree with what you said, but in this case in particular it seems to have gone far beyond blocking, toward final-image rendering. I am curious what exactly the ADG renderer is and how it is implemented. They clearly label the shots in the film that were produced with the ADG renderer (apparently within UE4), and they fit seamlessly.

    I am thinking that it is probably a GPU renderer plugin that feeds from a UE4 scene. So instead of using Maya to build a scene and RenderMan to render it, they are using UE4 to build the scene and the ADG renderer to render it, with the added benefit that you could capture and visualise the data in-camera on set with virtual cinematography. The question is: does the ADG rendering happen in real time at the film's framerate, or do they still have to progressively render out the photoreal frames after the initial in-engine real-time 'pre-vis'?

    Is it similar to other GPU renderers like Redshift or V-Ray RT, which still take a couple of minutes to progressively render a frame, or is it actually rendering in real time?
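    To make my speculation concrete, here is the rough shape of the bridge I'm imagining, as a minimal C++ sketch. To be clear: nothing about ILM's actual plugin is public, so every type and function name below is hypothetical.

    ```cpp
    // Purely speculative sketch of a "UE4 builds the scene, an external GPU
    // renderer draws it" bridge. All ADG-side types here are hypothetical
    // stand-ins; none of ILM's real interfaces are public.
    #include <vector>

    struct Mesh   { /* vertex/index buffers, material binding ... */ };
    struct Light  { /* type, transform, intensity ... */ };
    struct Camera { /* transform, focal length, aperture ... */ };

    // Hypothetical interface to the external GPU path tracer.
    class ExternalGpuRenderer {
    public:
        void UploadScene(const std::vector<Mesh>&, const std::vector<Light>&) {}
        void SetCamera(const Camera&) {}
        // Each call accumulates more samples into the same framebuffer, so
        // the image refines progressively instead of blocking until converged.
        void RenderSamples(int samplesPerPixel) { (void)samplesPerPixel; }
    };

    // Called once per engine tick: the engine stays interactive (virtual
    // cinematography on set) while the renderer converges in the background.
    void TickBridge(ExternalGpuRenderer& renderer, const Camera& liveCamera)
    {
        renderer.SetCamera(liveCamera);  // follow the tracked/virtual camera
        renderer.RenderSamples(16);      // small batch per tick -> progressive
    }

    int main()
    {
        ExternalGpuRenderer renderer;
        renderer.UploadScene({}, {});    // scene authored in UE4, then exported
        for (int tick = 0; tick < 3; ++tick)
            TickBridge(renderer, Camera{});
    }
    ```

    That architecture would be consistent with either answer to my question: run a few samples per tick and you get a noisy real-time preview; let it keep accumulating and you get the progressive minutes-per-frame behaviour of Redshift or V-Ray RT.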

    Rendered with the UE4/ADG renderer:


    Rendered conventionally with RenderMan:

  • oglu
    On a VCA they might be realtime... the thing is, ILM does a lot of in-house coding to make it happen...
    http://www.nvidia.com/object/visual-computing-appliance.html
  • ingbue
    oglu said:
    On a VCA they might be realtime... the thing is, ILM does a lot of in-house coding to make it happen...
    http://www.nvidia.com/object/visual-computing-appliance.html
    :o That would be really impressive. They'll probably keep this proprietary ADG renderer tech in-house for many, many years to come...
  • Swaggletooth
    ingbue said:
    in this case in particular it seems to have gone far beyond blocking, toward final-image rendering. I am curious what exactly the ADG renderer is and how it is implemented. They clearly label the shots in the film that were produced with the ADG renderer (apparently within UE4), and they fit seamlessly.
    Yep, and this is indeed pretty impressive! One of the reasons real-time engines are (or were) used for the early blocking stage is that they look good; it's really interesting that we're at a stage where a modified engine is good enough for some movie scenes.
  • igi
    I didn't expect these days to come this early.
  • oglu
    I don't see impressive stuff, it's just the robot...
  • ingbue
    oglu said:
    I don't see impressive stuff, it's just the robot...
    I'll wait to see some kind of tech demo of their workflow...
  • thomasp
    Cool! Think of directors using viewfinders or VR headsets to see something close to the final film in realtime, then add the ability to make changes :open_mouth:

    Somehow I imagine it may improve things for those few very visuals/FX-savvy directors; for the others it will just cause them to issue more and more last-minute change requests, which then filter down to the VFX people, adding to their workload. ;)

    Like this: http://www.upcomingvfxmovies.com/wp-content/uploads/2014/08/PhilTippet_Quote.png


  • oglu
    thomasp... Phil nailed it... it's how it's happening here...
  • ZacD
    In theory, having VFX on the set is better than having it be an afterthought that goes through a middleman. That leaves the director free to make choices that matter, and it allows the VFX artists to do their jobs. That sort of leads into the film-vs-digital argument: is it better to get instant feedback on a shot, or to have to commit and wait for the film to develop? Sometimes it's good to be stuck with a decision and not second-guess it; other times it's good to iterate. There's an argument for both.
  • Ged
    oglu said:
    I don't see impressive stuff, it's just the robot...
    Hey, it's not super impressive, but did you watch the first video posted by the OP? It's two characters, the robot and a squad of stormtroopers, and it feels very similar to the rest of the shots. Really natural-looking camera blur and depth of field, etc. Maybe it's not super impressive, but I thought it was a more ambitious shot than I was expecting (I was expecting just the robot in a room).
  • oglu
    I would like to see the stuff straight from the renderer... not the graded, Nuked one...
    Maybe I'd be more impressed by the raw footage...
  • beefaroni
    thomasp said:
    Cool! Think of directors using viewfinders or VR headsets to see something close to the final film in realtime, then add the ability to make changes :open_mouth:
    https://www.youtube.com/watch?v=ihyRybQmmWc

    A bit old now, but you may find it interesting. I can only imagine that it's gotten exponentially better since this.
  • ingbue
    beefaroni said:
    thomasp said:
    Cool! Think of directors using viewfinders or VR headsets to see something close to the final film in realtime, then add the ability to make changes :open_mouth:
    https://www.youtube.com/watch?v=ihyRybQmmWc

    A bit old now, but you may find it interesting. I can only imagine that it's gotten exponentially better since this.
    Thanks for the great contribution, beefaroni. I was happy to see this. It might be two years old already, but it definitely gives some great insight. The proof is in the pudding here.
  • oglu
    In the latest V-Ray for Max update video there's something similar... live rendering to VR...
  • oglu
    Found the article about the video above...
    They are using 14 VCAs... one costs about $100k...
    So the rig they are using is ~$1.4 million.


    http://www.cgsociety.org/index.php/cgsfeatures/cgsfeaturespecial/using_nvidia_vca_and_chaos_groups_gpu_accelerated_v_ray_rt_for_final_frame
  • Demno
    ingbue said:
    So I've watched the recorded 'State of Unreal' presentation from GDC 2017 (https://youtu.be/K6tRt7c2elU, starting at 12:56). One thing that left me overflowing with curiosity, and a slight sense of "WTF did I just see?", was John Knoll's very short piece on the ADG GPU renderer integrated into UE4 at ILM for actual final shots used in Rogue One, with a teaser for more information at their presentation later that day, "Realtime Rendering for Feature Film: Rogue One a Case Study", of which there are unfortunately no recordings or articles...

    Has anyone attended the presentation, or does anyone know more about what these Jedi masters are up to?!
    I did attend the GDC session and I'm beyond impressed. They rendered final frames (beauty, object ID and depth) straight from Unreal at 9K, using 64 frames of supersampling for motion blur. Then they piped that into the same comp network as the RenderMan renders and got results that were close enough that they were used in the final movie. So the impressive part is that they rendered final footage in minutes per frame instead of hours. On top of that, they were able to light shots in real time at near-final quality. Unheard of.
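    To unpack the motion-blur part: accumulation-style supersampling just means rendering N sub-frames spread across the shutter interval and averaging them. Here is a rough sketch of the idea (my own reconstruction of the general technique, not ILM's code; RenderAtTime is a hypothetical stand-in for posing and rendering the scene at one shutter sample):

    ```cpp
    // Accumulation motion blur: render subFrames sub-frames across the shutter
    // interval and average them. A generic reconstruction, not ILM's code.
    #include <cstddef>
    #include <vector>

    struct Image {
        std::vector<float> rgb;  // width * height * 3 floats, linear light
        explicit Image(std::size_t valueCount) : rgb(valueCount, 0.0f) {}
    };

    // Hypothetical stand-in: pose the scene at time t, render one sub-frame.
    Image RenderAtTime(double /*t*/, std::size_t valueCount)
    {
        return Image(valueCount);
    }

    Image RenderMotionBlurred(double shutterOpen, double shutterClose,
                              int subFrames, std::size_t valueCount)
    {
        Image accum(valueCount);
        for (int i = 0; i < subFrames; ++i) {
            // Sample times spread evenly across the shutter interval.
            double t = shutterOpen
                     + (shutterClose - shutterOpen) * (i + 0.5) / subFrames;
            Image sub = RenderAtTime(t, valueCount);
            for (std::size_t p = 0; p < accum.rgb.size(); ++p)
                accum.rgb[p] += sub.rgb[p];
        }
        for (float& v : accum.rgb)
            v /= static_cast<float>(subFrames);  // average -> blurred frame
        return accum;
    }

    int main()
    {
        // A 180-degree shutter at 24 fps is open for half of the 1/24 s frame.
        Image frame = RenderMotionBlurred(0.0, 0.5 / 24.0, 64, 4 * 4 * 3);
        (void)frame;
    }
    ```

    It also explains why a renderer that feels real-time in preview still takes minutes per delivered frame: each final frame is really 64 renders at 9K.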

    Of course they used the engine for prototyping and camera work, but that's old news these days.

    When the presentation comes up on the Vault, check it out. So cool!
  • Ged
    Aha, that makes sense; very cool, thanks for that! Kinda cool that they could just pass the footage through the same comp network as the RenderMan stuff and get similar results, and it sounds like they didn't have to do an in-between pass.
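    For anyone wondering why those extra passes matter: an object-ID pass lets the compositor pull a clean per-object matte without rotoscoping, which is presumably part of why the Unreal frames could drop straight into the existing comp network. A toy illustration of the general idea (my own sketch, not ILM's pipeline):

    ```cpp
    // Build a matte (0 or 1 per pixel) for one object from an object-ID pass,
    // so that element can be graded or merged separately downstream.
    #include <cstddef>
    #include <cstdint>
    #include <vector>

    std::vector<float> MatteForId(const std::vector<std::uint32_t>& idPass,
                                  std::uint32_t targetId)
    {
        std::vector<float> matte(idPass.size(), 0.0f);
        for (std::size_t i = 0; i < idPass.size(); ++i)
            if (idPass[i] == targetId)
                matte[i] = 1.0f;  // pixel belongs to the target object
        return matte;
    }

    int main()
    {
        std::vector<std::uint32_t> ids = {7, 7, 3, 0};       // toy 2x2 ID pass
        std::vector<float> droidMatte = MatteForId(ids, 7);  // isolate object 7
        (void)droidMatte;
    }
    ```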
  • ingbue
    Demno said:
    I did attend the GDC session and I'm beyond impressed. They rendered final frames (beauty, object ID and depth) straight from Unreal at 9K, using 64 frames of supersampling for motion blur. Then they piped that into the same comp network as the RenderMan renders and got results that were close enough that they were used in the final movie. So the impressive part is that they rendered final footage in minutes per frame instead of hours. On top of that, they were able to light shots in real time at near-final quality. Unheard of.

    Of course they used the engine for prototyping and camera work, but that's old news these days.

    When the presentation comes up on the Vault, check it out. So cool!
    Yay! I am so glad this thread received input from someone who attended the presentation; no need for speculation anymore... I can't wait to see it, it sounds great. Do you mean the GDC Vault? (http://www.gdcvault.com/) Please post here (anyone) if you notice that the presentation has been posted somewhere.