
Next-gen rendering trends

samnwck polycounter lvl 9
With PBR now the norm (and doing a pretty good job at mimicking reality, I might add), I want to know: what's next?

For example, I've seen VR starting to get its claws into more full-length titles (Fallout and Doom, for example) versus titles that resemble tech demos. One of the major issues I see with realistic rendering techniques for VR is that normal maps don't work well enough, since they essentially fake extra geometry, while in VR you have actual depth perception, which breaks that illusion. Do you see something completely new coming to replace the normal map, or will it come down to being able to throw more polygons into scenes to make those illusions more imperceptible?

Another example: smarter rendering with machine learning. Nvidia debuted this bit where they use "deep learning" to help their ray tracing engine fill in the gaps and 'guess' what certain pixels will be, to lower rendering times:

https://www.youtube.com/watch?v=1tbHkWmOuAA

Will the next step rely more on better hardware specs (being able to pack more on screen) or more on revolutionary code-writing? The simple answer is obviously both, but I'm hoping to get some more specific insight on what technologies the people here think are starting to mature to the point where we may be seeing them in the games of tomorrow.

Replies

  • oglu
    oglu polycount lvl 666
    i would use more poly... like they did in alien isolation..
  • Mstankow
    Mstankow polycounter lvl 11
    A slow decade or two transition to Disney's full on PBR system.

    https://disney-animation.s3.amazonaws.com/library/s2012_pbs_disney_brdf_notes_v2.pdf

    And also Nvidia's real-time global illumination is really cool. I can see that being improved and used in games in the future. And I imagine that as computers get more powerful, it will be able to handle more reflections in real time.

  • samnwck
    samnwck polycounter lvl 9
    Mstankow said:
    A slow decade or two transition to Disney's full on PBR system.

    https://disney-animation.s3.amazonaws.com/library/s2012_pbs_disney_brdf_notes_v2.pdf

    And also Nvidia's real-time global illumination is really cool. I can see that being improved and used in games in the future. And I imagine that as computers get more powerful, it will be able to handle more reflections in real time.

    Real-time ray tracing definitely has a long way to go before it's ready for any type of gaming application; reflections are generally as expensive as anything else in the scene, since it comes down to how many light bounces each ray is given and such, or at least that's my ape understanding of it. But I really like the idea of machine learning being implemented in rendering applications. It seems AI is creeping into every facet of technology.
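    (To make the bounce-count point concrete, here's a toy cost model in Python - the function and numbers are purely illustrative, not from any real renderer: every hit spawning a couple of secondary rays makes the ray count grow geometrically with the bounce budget.)

```python
# Toy cost model (not a renderer): each surface hit spawns a few
# secondary rays (reflection, refraction, shadow), so the total ray
# count grows geometrically with the allowed bounce depth.
def rays_traced(max_bounces, rays_per_hit=2):
    """Total rays in a fully branching trace of one camera ray."""
    if max_bounces == 0:
        return 1  # the primary ray terminates without bouncing
    # one ray at this level, plus rays_per_hit sub-trees one bounce shallower
    return 1 + rays_per_hit * rays_traced(max_bounces - 1, rays_per_hit)

for depth in range(5):
    print(depth, rays_traced(depth))  # 1, 3, 7, 15, 31 -- geometric growth
```

    Which is why capping bounce depth is the first lever any real-time tracer pulls.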

    I don't think I've seen the Nvidia GI thing you were talking about. I'll need to look into it. 
  • kadeschui
    kadeschui greentooth
    samnwck said:
    I don't think I've seen the Nvidia GI thing you were talking about. I'll need to look into it. 
    https://www.youtube.com/watch?v=DKkgHx_tK2s
  • EarthQuake
    The idea that normal maps don't work well in VR is somewhat of a myth. Normal maps can't replace major forms (this isn't new - they never have been able to), so your lowpoly should have all of the important shit modeled into it, but there is still a lot you can get from a normal map, like micro detail and shading compensation.
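    (A quick sketch of what EarthQuake means, in Python with made-up numbers: the mesh normal gives the silhouette, but the per-pixel shading normal can come from the map, so two points on the same flat triangle light differently.)

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def lambert(normal, light_dir):
    """Basic diffuse term: clamp(dot(N, L), 0, 1)."""
    d = sum(a * b for a, b in zip(normalize(normal), normalize(light_dir)))
    return max(0.0, d)

# The mesh normal is flat, but a normal-map texel tilts the shading
# normal, so pixels on the same flat triangle light differently -
# detail in the shading with zero extra geometry (and zero effect on
# the silhouette, which is why major forms still need real polys).
geometric_normal = (0.0, 0.0, 1.0)
mapped_normal = (0.3, 0.0, 1.0)   # a texel leaning toward +X
light = (1.0, 0.0, 1.0)

flat = lambert(geometric_normal, light)   # ~0.707
bumped = lambert(mapped_normal, light)    # ~0.881, brighter
```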
  • Chimp
    Chimp interpolator
    samnwck said:
    Mstankow said:
    A slow decade or two transition to Disney's full on PBR system.

    https://disney-animation.s3.amazonaws.com/library/s2012_pbs_disney_brdf_notes_v2.pdf

    And also Nvidia's real-time global illumination is really cool. I can see that being improved and used in games in the future. And I imagine that as computers get more powerful, it will be able to handle more reflections in real time.

    Real-time ray tracing definitely has a long way to go before it's ready for any type of gaming application; reflections are generally as expensive as anything else in the scene, since it comes down to how many light bounces each ray is given and such, or at least that's my ape understanding of it. But I really like the idea of machine learning being implemented in rendering applications. It seems AI is creeping into every facet of technology.

    I don't think I've seen the Nvidia GI thing you were talking about. I'll need to look into it. 
    No, it's very close. Unity, for example, have Otoy's Brigade path tracing implemented right now for realtime use, to be introduced publicly this year, and GPU manufacturers are following suit. We'll probably also see audio tech use this system for sound propagation too.

    I give it 5 years until the PC master race compete to see who has the littlest noisiness.

    On the topic of AI, noise, as it happens, is what researchers are working on right now - clever renderers that prioritise only what matters to the clarity of the image (as opposed to all of it equally) etc. 

    edit: watched the OP's video - yeah exactly they're moving in this direction fast. 


    edit: here's otoy's renderer - unity's new renderer - running realtime 2 years ago:
    https://www.youtube.com/watch?v=FbGm66DCWok

    or five years ago:
    https://www.youtube.com/watch?v=aKqxonOrl4Q


    With the right settings, current top end machines run it fine. With developer support as engines start to integrate this stuff, gpu manufacturers have no option but to respond and performance will rise.
  • throttlekitty
    throttlekitty ngon master
    There's definitely a future in machine learning, I'd love to feed a bunch of pictures of some material into a program and say "this is concrete, give me a parametric set of textures that loosely follow the rules you find". I'm a huge fan of using things the wrong way, could be fun to have a ML "game" that understands only half of the game world and player actions and make up the other half as it plays out.

    I'm not sure I "get" what the difference in that nVidia denoiser demo is versus current routines and what makes it faster, but it looks promising.
  • Chimp
    Chimp interpolator
    I talked to Allegorithmic a while back about this - I think they've got the rich data in the form of their massive Substance graph libraries; it's not infeasible that they could put something together that would recreate an input using nodes based on the library. Not necessarily appropriate as a "make material" button, but more to speed up getting a rough block-out of a specific surface right, and manually take it from there.
  • radiancef0rge
    radiancef0rge ngon master
    The idea that normal maps don't work well in VR is somewhat of a myth. 
    Accurate statement. 
  • JordanN
    JordanN interpolator
    This technology generation has put me off following realism 100%. I believe PBR has its place and it should continue to be researched. But I'm a bit disappointed that stylized art hasn't received the same attention. I want to see art branch out so that future games and movies don't have to mimic one common artstyle just so they can claim to be realistic.

    Arc System Works did a presentation on how they got their game to look like traditional anime but in 3D; I'm hoping we get to see more technology like that.
    https://www.youtube.com/watch?v=Qx-ISmfOF4g&t=3s
    https://www.youtube.com/watch?v=Eqm_MiONvtU
  • Joopson
    Joopson quad damage
    JordanN said:
    This technology generation has put me off following realism 100%. I believe PBR has its place and it should continue to be researched. But I'm a bit disappointed that stylized art hasn't received the same attention.
    PBR does not exclude stylized art. PBR is a system, not a style.
  • JordanN
    JordanN interpolator
    I know. But I felt like all the attention to realistic lighting that PBR does has "sidelined" other art that doesn't aim for the same thing.

    I'll be happy when the two videos I posted make headway for providing artists with new tools.
  • Chimp
    Chimp interpolator
    PBR was invented FOR stylised art lol. It's physically based but they were working on stylised projects. If you want actual heavy stylisation though, those very same people have since done loads of stuff including hand drawn looks etc.

  • samnwck
    samnwck polycounter lvl 9
    Chimp said:
    No, it's very close. Unity, for example, have Otoy's Brigade path tracing implemented right now for realtime use, to be introduced publicly this year, and GPU manufacturers are following suit. We'll probably also see audio tech use this system for sound propagation too.

    I give it 5 years until the PC master race compete to see who has the littlest noisiness.


    With the right settings, current top end machines run it fine. With developer support as engines start to integrate this stuff, gpu manufacturers have no option but to respond and performance will rise.
    I had seen that; it's definitely exciting, because ray tracing is the best way to get realistic lighting. It's very close but still not there yet, and I think it's naturally the next step.

    As far as the normal map VR thing,  I understand the mechanics of it and why it still works as it should, I was merely providing an example of what I was talking about but I agree that it was not a great example. 

    @JordanN I think what you're describing is that you want more stylized shaders, which tell the PBR system how to react to lighting and all that other stuff; I believe a PBR system would be good for any stylized application.
  • Mstankow
    Mstankow polycounter lvl 11
    Chimp said:
    PBR was invented FOR stylised art lol. It's physically based but they were working on stylised projects. If you want actual heavy stylisation though, those very same people have since done loads of stuff including hand drawn looks etc.

    Was it invented for the movie Tangled, or did it exist in some form before that? That Disney doc I linked said Tangled but I am sure other companies had to have been working on PBR systems before that.
  • Chimp
    Chimp interpolator
    Well, "invented" is hard to say - bad choice of words - the guys that we in the games industry stole it from were the Disney/Pixar guys, who did their work, so far as I know, based on the stuff these guys did:
    http://www.pbrt.org/
    in 2004
  • lotet
    lotet hero character
    @JordanN
    wow...that was really cool, had no idea anything like that existed.
  • Lt_Commander
    Lt_Commander polycounter lvl 10
    I look forward to a future without light baking, second UV sets, or probe placement.

    Between TB3's GI and the super experimental stuff buried in CryEngine, I'm excited to see Voxelized RT GI move into a deployable state for games. Seeing specular reflections and lighting being generated on the fly based off of voxelized scenes that can run on a consumer GPU really is something. It's not quite ray tracing, but it's way more feasible in the near term and produces promising results.
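    (Very roughly, the voxel GI idea looks like this - a heavily simplified Python sketch, every name hypothetical: inject direct lighting into a coarse grid, then gather indirect light by marching through the grid instead of tracing exact rays.)

```python
GRID = 8  # a real implementation would use a mipmapped 3D texture

def make_grid():
    return [[[0.0] * GRID for _ in range(GRID)] for _ in range(GRID)]

def inject(grid, pos, radiance):
    """Store direct lighting into the voxel containing pos."""
    x, y, z = (min(GRID - 1, max(0, int(c))) for c in pos)
    grid[x][y][z] += radiance

def gather(grid, origin, direction, steps=GRID):
    """March a 'cone' through the grid, accumulating whatever
    radiance the path passes, with a crude distance falloff."""
    total, pos = 0.0, list(origin)
    for step in range(1, steps):
        pos = [p + d for p, d in zip(pos, direction)]
        if any(p < 0 or p >= GRID for p in pos):
            break  # left the voxelized scene
        x, y, z = (int(p) for p in pos)
        total += grid[x][y][z] / step
    return total
```

    The trade that makes it real-time: a handful of coarse grid lookups per pixel instead of thousands of ray-triangle intersections.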
  • marks
    marks greentooth
    oglu said:
    i would use more poly... like they did in alien isolation..
    Please don't, it was horrible on so many levels. 
  • almighty_gir
    almighty_gir ngon master
    JordanN said:
    I know. But I felt like all the attention to realistic lighting that PBR does has "sidelined" other art that doesn't aim for the same thing.

    I'll be happy when the two videos I posted make headway for providing artists with new tools.
    I don't think i've ever met someone as willfully ignorant as you man...

    https://disney-animation.s3.amazonaws.com/library/s2012_pbs_disney_brdf_notes_v2.pdf <--- where PBR in the games industry was directly copied from.

    https://www.youtube.com/watch?v=87E6N7ToCxs

    The movie it was developed for. The very first iteration of a fully PBR production.
  • JordanN
    JordanN interpolator
    PBR used in stylized art =/= All stylized art

    I said in my first post, I'm interested in what Arc System Works did. You can find a presentation they did and it goes into detail about it.

    I also think people are either deliberately ignoring this sentence, or need to read it again.

    "I want to see art branch out so that future games and movies don't have to mimic one common artstyle just so they can claim to be realistic."

  • CrackRockSteady
    @JordanN I think you are confusing rendering technique with art style.  Nothing about PBR systems forces anyone to mimic a particular art style.  There are plenty of games and films that use PBR that all have wildly differing art styles.
  • Neox
    Neox godlike master sticky
    If PBR were the solution, there wouldn't be as much research into NPR ;)

    There are styles which profit from PBR and others where there is no gain in using it.
  • lotet
    lotet hero character
    I'm gonna side with @JordanN on this one.
    There are more things than PBR out there, and rendering technique is absolutely important to art style and stylized art; there were a lot of things that weren't possible before PBR. Personally I love what it did for stylized art, but it's by no means an all-around solution.

    @CrackRockSteady
    It absolutely locks you into a certain style, just as the old specular system did, for that matter. Things like Sin City or Prince of Persia 4 would be 100% impossible to do with PBR.

    Now, I don't think Jordan's point was "this is better than that", but rather "the more the better". That's at least what I'm gonna take out of it :)
  • CrackRockSteady
    @lotet I'm not saying that rendering technique doesn't affect art style, or even limit it in some respects, but it absolutely does not force everyone to "mimic one common art style".  

     A realistically styled FPS like CoD and a heavily stylized game like Sunset Overdrive or Overwatch all use PBR.  If you're going to try to tell me that these games are all the same art style, you're batty.

    Edit:  I also was not saying that anything and everything can (or should) be done with PBR
  • Mstankow
    Mstankow polycounter lvl 11
    Nobody is going to build a unified platform for non-traditional/super-stylized rendering, because the needs of each project will probably be drastically different. If you want an out-there look, I think studios will have to work for it with some hefty in-house research and tool building.


  • lotet
    lotet hero character
    CrackRockSteady said:
    @lotet I'm not saying that rendering technique doesn't affect art style, or even limit it in some respects, but it absolutely does not force everyone to "mimic one common art style".  

    That's a fair point, but I don't think we should downplay the importance of NPR rendering techniques either. And I will still say that PBR limits you in a heavy way to the kind of Disney/Pixar type of stylization, which is only a small part of "stylized art".
  • CrackRockSteady
    lotet said:
    @lotet I'm not saying that rendering technique doesn't affect art style, or even limit it in some respects, but it absolutely does not force everyone to "mimic one common art style".  

    That's a fair point, but I don't think we should downplay the importance of NPR rendering techniques either. And I will still say that PBR limits you in a heavy way to the kind of Disney/Pixar type of stylization, which is only a small part of "stylized art".
    I wasn't trying to downplay the importance of other rendering techniques, the only thing I said was that JordanN was confusing rendering technique and art style.  His post seems to imply that anything using the same rendering technique is limited to one common art style, which is simply not true.  While rendering technique will absolutely influence art style, rendering technique =/= art style.  
  • lotet
    lotet hero character
    @CrackRockSteady - that I can get behind; let's leave it at that, cause I'm not gonna start arguing how to interpret Jordan's posts - that's a can of worms I really don't wanna dig into :P 
  • pior
    pior grand marshal polycounter
    Regarding the OP: normal maps "not working in VR" is a myth. Sure, if a model is super lowpoly the fake shading will look odd, and a first-person weapon does benefit from having all its small details modeled in to keep the surface from looking melted. But normal maps on, say, a full character look fantastic in VR. The presence is amazing, with characters feeling just like the life-size sculptures seen at conventions.

    That being said !

    There's one thing that clearly doesn't work well in VR at this time : realtime reflections. They end up being rendered independently for each eye (which makes sense) but the result is a constant shimmering effect because of the small differences between left and right eye. It's not noticeable in real life, but in VR it is very jarring. I really hope for this to be addressed soon.
  • Justin Meisse
    Justin Meisse polycounter lvl 19
    I've worked on 3 VR titles as a character artist, there was really no difference from any other game. I actually wasn't aware I was doing VR characters for 2 of the games.
  • Neox
    Neox godlike master sticky

    CrackRockSteady said:
    A realistically styled FPS like CoD and a heavily stylized game like Sunset Overdrive or Overwatch all use PBR.  If you're going to try to tell me that these games are all the same art style, you're batty.
    I wouldn't call Overwatch PBR; it may be PBR-esque, in the sense of using a metalness-like workflow, but it breaks the PBR rules way too much and follows its own ruleset to be called a PBR game. 
  • Chimp
    Chimp interpolator
    Agreed, @Neox. It's a custom shader for both style and speed reasons.

    Next gen is path tracing, guys; there's no two ways about it. That's the next step. That, and leveraging machine learning to make it, and any post FX we have in the meantime, less costly.
  • Froyok
    Froyok greentooth
    marks said:
    oglu said:
    i would use more poly... like they did in alien isolation..
    Please don't, it was horrible on so many levels. 
    Would love some details on that, because on paper it sounds like a good win (texture memory, and easier/faster modeling and re-use).


  • EarthQuake
    From everything I understand about it, real-time path tracing, and I mean real time in the sense that you can hit 60fps while doing all the other things a game engine needs to do, is still many years off.
  • Chimp
    Chimp interpolator
    From everything I understand about it, real-time path tracing, and I mean real time in the sense that you can hit 60fps while doing all the other things a game engine needs to do, is still many years off.
    It's running absolutely fine right now :) Unity have already implemented Otoy's Brigade in alpha and will be releasing the first edition this year.

    I give it a maximum of 5 years before the PC master race are competing with each other for speed, and from there not long before Nvidia/AMD respond properly. In the meantime PowerVR hardware is speeding ahead - mobile hardware can currently do it better than desktop - but it's not a case of far-off future tech; it's just that we're on hardware not designed _for_ it.
    https://home.otoy.com/otoy-and-imagination-unveil-breakthrough-powervr-ray-tracing-platform/ 

    That isn't to say we'll have perfect offline style renders on average gamer hardware this year, it's gonna be noisy and slow if it's all turned up, but if all you want is proper reflections, proper transparency etc on normal hardware then it's here already and working fast with no noisiness whatsoever.
    I suspect devs will start to offer it as an optional setting and let the PCMR push hardware to improve.

    Although, PVR ain't so bad - might see PVR consoles again; the Dreamcast was one :)

    Dual top end Nvidias will do a fairly noise free beautiful 60fps render right now though, all on the GPU, so we're not terribly far off games.

    Unity actually have been playing with it for years, I've seen a number of demos that run in the thousands of FPS when it's just the reflections you want. Even seen mobiles do it absolutely fine.

    2-3 years old demo of what all of us are getting this year:
    https://www.youtube.com/watch?v=FbGm66DCWok

    From Otoy's own slides (and whilst I understand a company will market its tech, I've seen it myself):



    On the software side, the clever people doing research right now are working on renderers that minimise the work needing to be done by intelligently deciding what parts of a render actually needs samples etc, as opposed to the brute methods of the last 10 years.
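    (The "only what matters" idea can be sketched like this - Python, purely illustrative, all names hypothetical: estimate per-pixel variance from a few pilot samples, then spend the remaining ray budget where the image is noisiest.)

```python
import random
random.seed(0)  # deterministic for the example

NOISY = {3, 7}  # pretend these pixels cover glossy reflections

def sample_pixel(pixel_id):
    """Stand-in for tracing one ray; 'noisy' pixels vary far more."""
    spread = 2.0 if pixel_id in NOISY else 0.05
    return 0.5 + random.uniform(-spread, spread)

def adaptive_render(num_pixels=10, pilot=8, budget=200):
    # Pilot pass: a few samples per pixel to estimate variance.
    samples = {p: [sample_pixel(p) for _ in range(pilot)]
               for p in range(num_pixels)}

    def variance(vals):
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals) / len(vals)

    var = {p: variance(v) for p, v in samples.items()}
    total_var = sum(var.values()) or 1.0
    # Refinement pass: spend the remaining ray budget where the
    # image is noisiest, instead of spreading it evenly.
    extra = {p: int(budget * var[p] / total_var) for p in range(num_pixels)}
    for p, n in extra.items():
        samples[p].extend(sample_pixel(p) for _ in range(n))
    return extra

budget_map = adaptive_render()  # nearly all extra rays go to the noisy pixels
```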
  • Joopson
    Joopson quad damage
    Chimp said:

    PVR hardware......PVR ain't so bad...PVR consoles...

    ???
    https://en.wikipedia.org/wiki/PVR

    My money is on plant variety rights
  • Chimp
    Chimp interpolator
    PowerVR
  • throttlekitty
    throttlekitty ngon master
    Chimp said:
    PowerVR
    Power Voltage Regulator? Like an ATM machine, right?
  • Chimp
    Chimp interpolator
    Potential Vaginal Reflow
  • EarthQuake
    Chimp said:
    From everything I understand about it, real-time path tracing, and I mean real time in the sense that you can hit 60fps while doing all the other things a game engine needs to do, is still many years off.
    It's running absolutely fine right now :) Unity have already implemented Otoy's Brigade in alpha and will be releasing the first edition this year.


    This really depends on your definition of "fine". Do you have personal experience using this in production? From the information I can find on it it's nowhere near ready for 60fps real-time for real games (and the Otoy devs say as much as well).
     
    Here is some footage of it in action, it takes ~6 seconds to resolve a complex frame that isn't a mess of noise, which is very fast compared to a traditional CPU renderer, but needs to be roughly 350x faster (even more than that if we consider this is only rendering on half the screen) for real-world end consumer use. And again, this is rendering a canned cinematic, not dynamic gameplay.

    https://www.youtube.com/watch?v=RxoH_Cwvwe0

    Otoy's light field solution is, as far as I can tell, a light baking system which for all I know may be very fast, but is not real time path tracing.
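    (The ~350x figure is easy to sanity-check with rough arithmetic:)

```python
# 6 seconds to resolve one clean frame vs. a 60 fps budget.
frame_time = 6.0          # seconds per resolved frame (from the video)
target = 1.0 / 60.0       # 60 fps budget, ~16.7 ms per frame
speedup_needed = frame_time / target
print(speedup_needed)     # ~360x -- and more again if only half the
                          # screen was being path traced
```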
  • CrazyButcher
    CrazyButcher polycounter lvl 20
    Machine/deep learning imo will revolutionize several industries. Yes, a bit of a hype train atm, but still, the results so far are quite impressive and promising for the future. Working for NVIDIA, I am not exactly unbiased ;)

    For example:

    Style Transfer
    https://www.youtube.com/watch?v=0ueRYinz8Tk

    Animation
    https://www.youtube.com/watch?v=Ul0Gilv5wvY

    The tech is still a bit too heavy (like the raytracing denoiser) to be used in games today, but down the line, say 2-5 years from now, we will have more of the "magic" usable in mainstream.
    Likewise, raytracing-like effects are already dominating games today (screen-space reflections, voxel tracing, etc. are all forms of "tracing"). That trend will continue. 

    We also get more and more games that use alternative ways to render their scenes (distance fields)
    https://www.youtube.com/watch?v=q6flyIrvKCA

    When I see the E3 videos today and compare to, say, 10 years ago, it's amazing how far things have come - the craftsmanship, the amount of detailing, etc. That's why imo we will not see huge jump advances overnight anymore, cause we have incremental improvements on so many fronts already. Another aspect is that we have to work with the "incrementally" changing hardware as well, so it's not like overnight we have infinitely more/different power accessible. High-end capabilities take a few generations to trickle down.

    Even more than in applying it as effects, machine learning will help a lot in content creation, clean-ups, deriving data from paint/photo/video etc., and will therefore drive the costs of rich environments down. There will not be a "make art" button, but we will get much closer to automating processes. So yes, I expect a lot of movement on tools built around this.
    All major research studios of game/film etc. are looking into this technology, so I suspect in the next years we will see a lot of movement there.

    Therefore, as usual, exciting times ahead :) Maybe even more than before ;)
  • cptSwing
    cptSwing polycounter lvl 11
    Claybook, jesus christ. That looks great.
  • Chimp
    Chimp interpolator
    EarthQuake said:
    Chimp said:
    From everything I understand about it, real-time path tracing, and I mean real time in the sense that you can hit 60fps while doing all the other things a game engine needs to do, is still many years off.
    It's running absolutely fine right now :) Unity have already implemented Otoy's Brigade in alpha and will be releasing the first edition this year.


    This really depends on your definition of "fine". Do you have personal experience using this in production? From the information I can find on it it's nowhere near ready for 60fps real-time for real games (and the Otoy devs say as much as well).
     
    Here is some footage of it in action, it takes ~6 seconds to resolve a complex frame that isn't a mess of noise, which is very fast compared to a traditional CPU renderer, but needs to be roughly 350x faster (even more than that if we consider this is only rendering on half the screen) for real-world end consumer use. And again, this is rendering a canned cinematic, not dynamic gameplay.

    https://www.youtube.com/watch?v=RxoH_Cwvwe0

    Otoy's light field solution is, as far as I can tell, a light baking system which for all I know may be very fast, but is not real time path tracing.
    Not talking about the baking, but as it turns out it might be next GDC before full Brigade is launched. I'm mixing various things though, to be honest; what I was getting at with very fast, noise-free realtime right now is that you can do a hybrid renderer, like what Imagination Tech have been doing for years with Unity on PowerVR hardware, at hundreds of frames per second.

    Traditionally replacing all your rasterisation code with raytracing code, so that every pixel is drawn by emitting rays into the scene like a full offline renderer (30 rays per pixel etc.), is going to be slow for now.

    However we can use it in other ways, for example a hybrid renderer whereby you submit geometry to the path tracer, build a database of the scene and use shaders to define ray behaviour for much better shadows, true reflections, transparency, hell, even AI and sound propagation.

    In the case of Unity deferred, the Gbuffer can be reused as the input for the raytracer. For every pixel of the Gbuffer, you have the properties of the surface that was visible to the camera - normal, position, color and material ID - and you use these properties to generate primary rays. Rather than emitting rays from a camera, you emit rays from the surface that is defined by the Gbuffer.

    It's not a dichotomy - you do not have to have only one or the other. Path tracing can coexist with traditional rendering and go beyond into all sorts of other uses, like sound, which is something that excites me a lot. That's what you were seeing in that noise-free Nvidia demo I shared, AFAIK. That's why it wasn't 5fps and full of noise like your video - not because of endless graphics cards, but because it's a hybrid renderer - all the benefits like reflections etc.
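    (A rough Python sketch of that Gbuffer-to-rays step - all names hypothetical, nothing here is Unity's actual API: each Gbuffer texel already stores a surface position and normal, so reflection rays can start there instead of at the camera.)

```python
def reflect(incident, normal):
    """Mirror the incident direction about the surface normal."""
    d = sum(i * n for i, n in zip(incident, normal))
    return tuple(i - 2.0 * d * n for i, n in zip(incident, normal))

def reflection_rays(gbuffer, camera_pos):
    """One secondary ray per Gbuffer texel: origin is the stored
    surface position, direction is the view direction reflected
    about the stored normal. No primary rays are traced at all -
    rasterization already answered 'what does the camera see'."""
    rays = []
    for texel in gbuffer:
        pos, normal = texel["position"], texel["normal"]
        view = tuple(p - c for p, c in zip(pos, camera_pos))  # unnormalized
        rays.append((pos, reflect(view, normal)))
    return rays

# A single texel of a mirror floor, seen from above and behind.
gbuf = [{"position": (0.0, 0.0, 0.0), "normal": (0.0, 1.0, 0.0)}]
rays = reflection_rays(gbuf, camera_pos=(0.0, 1.0, -1.0))  # bounces up+forward
```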

    Lots of power coming here! :)

    the GDC vault might have a talk from 2013 or 14 on this but i don't have access so I can't share.

    I am a mere artist at the end of the day quoting researchers i've seen at conventions so I'm going to have a talk with my more knowledgeable programmer brother tonight or tomorrow about this topic and see what I can glean.

  • EarthQuake
    Chimp said:

    I am a mere artist at the end of the day quoting researchers i've seen at conventions so I'm going to have a talk with my more knowledgeable programmer brother tonight or tomorrow about this topic and see what I can glean.

    Cool yeah I'm in the same boat being an artist that has to ask engineers about this stuff, it seems like we're talking about slightly different things in general here. When I talk about path tracing I mean a full on path-tracer rather than a hybrid system, which is what I thought you were referring to as well. The graphics programmers I've talked to tell me this is still many years off.
  • CrazyButcher
    CrazyButcher polycounter lvl 20
    The hybrid renderers give you little noise on direct lighting, but all indirect is still noisy due to the nature of path tracing and can take up a good deal of the frame's complexity. I've seen the iray denoiser applied in a real-time scenario, so only very few rays per pixel accumulated, and it gives a fairly stylistic look, and it is not cheap in terms of real-time effect costs. It's cheap compared to ray-tracing accumulation ;)
    The costs of tracing are just one portion of the problem, the other is shading the things you hit. These can be substantial, and it's not trivial to manage it efficiently as rays may diverge on the types of surfaces they hit etc.

    There is another interesting technique that somehow relates to this future, which is called decoupled shading, or texture space shading. Basically it's like doing all shading in "lightmap" space. But the trick is that you only shade those texels that you actually need in the frame. This includes only updating the appropriate mipmap level of the shading texture. The benefit is that you also get implicit shader anti-aliasing, because we now sample the shading texture for final results, and texture sampling is "smoothing" out things and temporal stable across frames.
    Finding out which texels to shade is typically done by geometry pass and rasterization, but you can imagine that another pass could collect all the indirect texels required by any means that are "fast enough".
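    (A toy version of that bookkeeping in Python, purely illustrative: rasterization marks which texels the frame touches, and only those get the expensive shade; the rest keep their cached values.)

```python
class ShadeCache:
    """Texture-space shading sketch: shade only the texels a frame
    actually touches; everything else keeps its cached value."""
    def __init__(self, size):
        self.texels = [0.0] * size
        self.shaded = 0  # count of expensive shader evaluations

    def expensive_shade(self, texel_id):
        self.shaded += 1
        return float(texel_id)  # stand-in for a real BRDF evaluation

    def render(self, visible_texels):
        # 'visible_texels' would come from a rasterization pass that
        # records which mip/texel each screen pixel maps to; a real
        # system would also skip texels whose shading inputs are
        # unchanged since the previous frame.
        for t in visible_texels:
            self.texels[t] = self.expensive_shade(t)
        return [self.texels[t] for t in visible_texels]

cache = ShadeCache(size=1024)
cache.render([1, 2, 3])  # frame 1: 3 shades, not 1024
cache.render([2, 3, 4])  # frame 2: 3 more; untouched texels keep old values
```

    Sampling the shading texture afterwards is what gives the implicit anti-aliasing and temporal stability mentioned above.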

    In the end, the programmer's toolbox for improving anti-aliasing and indirect lighting effects keeps growing.
  • ZacD
    ZacD ngon master
    Note, I'm posting this before reading the entire thread. But I saw some mention of machine learning and wanted to share a practical example of it being applied to game rendering.

    Neural Network (Screen Space) Ambient Occlusion
    http://theorangeduck.com/media/uploads/other_stuff/nnao.pdf
    http://theorangeduck.com/page/neural-network-ambient-occlusion

    I'm still interested to see what will be the next big thing for artists to learn, like normal maps and PBR. I'm expecting to see a lot more semi-procedural applications to speed up asset creation like Quixel and Substance, but hopefully in other areas. 
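    (Structurally, NNAO just swaps the hand-designed AO heuristic for a small learned function - sketched below in Python with random, untrained weights, purely to show the shape of the computation, not the paper's actual network.)

```python
import math
import random
random.seed(1)

def tiny_mlp(features, hidden=8):
    """Untrained stand-in for a learned AO function: local geometry
    features in, one occlusion value out. Real NNAO uses trained
    weights; these are random, just to show the data flow."""
    n = len(features)
    w1 = [[random.gauss(0.0, 0.5) for _ in range(n)] for _ in range(hidden)]
    w2 = [random.gauss(0.0, 0.5) for _ in range(hidden)]
    h = [max(0.0, sum(w * f for w, f in zip(row, features)))  # ReLU layer
         for row in w1]
    z = sum(w * a for w, a in zip(w2, h))
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid keeps AO in (0, 1)

# Per-pixel features would be depth/normal differences to a ring of
# neighbouring samples - the same inputs classic SSAO heuristics use.
ao = tiny_mlp([0.1, -0.3, 0.02, 0.4])
```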
  • FourtyNights
    FourtyNights polycounter
    I guess the new real-time rendering trend is "real-time ray tracing", revealed at GDC by multiple parties: Nvidia (with Remedy and Unreal Engine 4), Electronic Arts, and Microsoft:

    https://www.youtube.com/watch?v=jkhBlmKtEAk

    https://www.youtube.com/watch?v=J3ue35ago3Y

    https://www.youtube.com/watch?v=LXo0WdlELJk

    https://www.youtube.com/watch?v=81E9yVU-KB8

    Can't wait to light and render my portfolio material with this technology soon!
  • JordanN
    JordanN interpolator

    FourtyNights said:
    Can't wait to light and render my portfolio material with this technology soon!
    You know, if ray tracing eventually becomes the standard, what's the difference between a portfolio that was rendered with Vray/Mental Ray and one that is rendered with UE4 + ray tracing? 

    Besides the obvious that UE4 is real time, but if employers are mostly looking for static images first, would they even be able to tell the difference? 

    These ray tracing demos are running on $80,000 workstations, so couldn't anyone just claim their offline work is real time, given enough render power?