
Unreal Engine 5 - Epic Games Announces New Engine

littleclaude

Replies

  • zachagreg
    I have basically no words for what I just saw. Are they releasing the demo anytime soon?
  • Justo
This should be the real next-gen reveal shown to people; the marketing show Microsoft put up last week was a joke compared to this :o

    This looks so impressive. No more lightmap or texture baking? Millions of polygons? Unbelievable stuff... unreal, even, oh ho ho!
  • zachagreg
    @Ryandec probably pretty close to film texturing and stuff like Zhelong Xu
  • Ryandec
    zachagreg said:
    @Ryandec probably pretty close to film texturing and stuff like Zhelong Xu
Film texturing is based around painting onto polygons? Not too familiar with this.
  • zachagreg
I imagine as long as the UVs weren't causing issues that could be done, but it wouldn't be ideal. https://magazine.substance3d.com/texturing-hero-assets-on-the-movie-pacific-rim-uprising-with-dneg/

    Here is a link covering a little of the texturing workflow on Pacific Rim Uprising: a shit ton of UDIMs and unwrapping at a grander scale.
  • GlowingPotato
More technical information about this demo here.
    https://www.eurogamer.net/articles/digitalfoundry-2020-this-is-next-gen-unreal-engine-running-on-playstation-5

    "And that, in a nutshell, is the definition of a micro-polygon engine. The cost in terms of GPU resources is likely to be very high, but with next-gen, there's the horsepower to pull it off and the advantages are self-evident. Rendering one triangle per pixel essentially means that performance scales closely with resolution. "Interestingly, it does work very well with our dynamic resolution technique as well," adds Penwarden. "So, when GPU load gets high we can lower the screen resolution a bit and then we can adapt to that. In the demo we actually did use dynamic resolution, although it ends up rendering at about 1440p most of the time.""


  • Obscura
I suspect the crazy amount of geometry is possible using something like the "mesh shaders" Nvidia was demoing last year.

  • GlowingPotato
    More from the article above...

    "A number of different components are required to render this level of detail, right?" offers Sweeney. "One is the GPU performance and GPU architecture to draw an incredible amount of geometry that you're talking about - a very large number of teraflops being required for this. The other is the ability to load and stream it efficiently. One of the big efforts that's been done and is ongoing in Unreal Engine 5 now is optimising for next generation storage to make loading faster by multiples of current performance. Not just a little bit faster but a lot faster, so that you can bring in this geometry and display it, despite it not all fitting and memory, you know, taking advantage of next generation SSD architectures and everything else... Sony is pioneering here with the PlayStation 5 architecture. It's got a God-tier storage system which is pretty far ahead of PCs, bon a high-end PC with an SSD and especially with NVMe, you get awesome performance too."

But it's Sweeney's invocation of REYES that I'm particularly struck by. The UE5 tech demo doesn't just show extreme detail at close range, it's also delivering huge draw distances and no visible evidence whatsoever of LOD pop-in. Everything is seamless, consistent. What's the secret? "I suppose the secret is that what Nanite aims to do is render effectively one triangle per pixel, so once you get down to that level of detail, the sort of ongoing changes in LOD are imperceptible," answers Nick Penwarden. "That's the idea of Render Everything Your Eye Sees," adds Tim Sweeney. "Render so much that if we rendered more you couldn't tell the difference, then as the amount of detail we're rendering changes, you shouldn't be able to perceive the difference."
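    One way to picture the "imperceptible LOD" idea (a hypothetical sketch of the principle only, not how Nanite actually selects detail): keep coarsening the mesh as long as its triangle edges would still project to less than about one pixel.

        import math

        # Hypothetical continuous-LOD pick: estimate how many screen pixels a
        # triangle edge covers at a given distance, then choose the coarsest
        # level whose edges are still roughly sub-pixel.
        def projected_pixels(edge_len_m, distance_m, fov_deg=60.0, screen_h_px=1440):
            px_per_m = screen_h_px / (2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0))
            return edge_len_m * px_per_m

        def pick_lod(base_edge_m, distance_m):
            lod, edge = 0, base_edge_m
            # each step doubles edge length (quarters the triangle count)
            while projected_pixels(edge * 2.0, distance_m) <= 1.0:
                edge *= 2.0
                lod += 1
            return lod

        for d in (1, 10, 100, 1000):                        # camera distance in meters
            print(f"{d:>5} m -> LOD {pick_lod(0.001, d)}")  # 1 mm source triangles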


  • zachagreg
It's got a God-tier storage system which is pretty far ahead of PCs, but on a high-end PC with an SSD and especially with NVMe, you get awesome performance too."

This line is odd; surely everyone in the game industry is using SSDs and some NVMe drives at this point? Is the storage in the PS5 a different architecture than this? Is it faster?
  • s1dK
The tech demo looks amazing!!! I can't believe we finally get rid of baking. @Udjani probably we can just create a mid-poly model, throw it into Painter, and use normal map decals for the small details.
  • Udjani
@Obscura I remember when AMD's CEO said they would support raytracing, but only when it could be done right; maybe this is it. Also, there are a lot of rumors about the new Nvidia cards, which are supposed to be much better at raytracing.
  • oglu
Just do a good base with UVs and go from there. The UVs won't get destroyed if you decimate the model.
    We do that all day for offline rendering.
  • Laughing_Bun
I wonder if Epic is going to help turn Quixel Mixer into something viable for texturing these dense assets. It doesn't need much, just an auto-UV algorithm that doesn't completely suck.
  • oglu
I'm sure that new SSD hardware won't be cheap, and you'll need a new GPU and CPU too. I fear my next rig will cost around 3k.
    https://www.anandtech.com/show/15352/ces-2020-samsung-980-pro-pcie-40-ssd-makes-an-appearance
  • JamesBrisnehan
    This is amazing, but I have so many questions.

    What are next gen art pipelines going to be like?

    Will skipping the optimization phases help reduce the infamous 'Crunch Time' at AAA studios? 

    If game studios truly start skipping the optimization parts of the art pipeline, what are the file sizes going to be like? Are the mid-late generation UE5 games going to be like 9 discs to install, and one disc to play? What about downloading games? 

    Will this have any impact on the film/VFX/Animation industry? The last few years I've seen Unreal Engine start popping up in behind the scenes content, like for "The Mandalorian". Are render farms going to one day be a thing of the past?

    Will I even be able to run Unreal Engine 5 on my current gen PC?
  • littleclaude
JamesBrisnehan said:
    This is amazing, but I have so many questions. [...] Will I even be able to run Unreal Engine 5 on my current gen PC?
It's all about the Cloud; it's all covered in the Next Gen Console thread. This generation will be the end of what we know as consoles in the house, with SSDs and discs.
  • thomasp
Was I the only one who got his next-gen expectations crushed early by those hair physics?

Anyway, the big thing seems to be about scalability, which I imagine can only mean nice things for the whole PITA that is LODs. The rest of the lofty promises I will treat with tempered expectations. As should you. :)

Oh, and apparently we are going to need all-new rigs and all-new content creation software to even handle the asset complexity some people are dreaming about. Sounds nice...?

  • PolyHertz
    What are next gen art pipelines going to be like?
    The next gen pipeline should be the same, just minus the need to make a lowpoly and bake normals to it.

    Will skipping the optimization phases help reduce the infamous 'Crunch Time' at AAA studios? 
    Doubtful. Bad management will still be bad management.
  • sqrRedwend
zachagreg said:
    This line is odd; surely everyone in the game industry is using SSDs and some NVMe drives at this point? Is the storage in the PS5 a different architecture than this? Is it faster?
    Obscura said:
What's kinda weird to me is how this is suddenly running like this on console hardware. The specs of the PS5 are good but not unbelievable. They say the SSD has 5 GB/s bandwidth, but a lot of NVMe PC SSDs are similar. Also, NVMe hasn't shown much difference in games compared to a SATA SSD until now. I'm also wondering if this "Lumen" thing is hardware-independent raytracing? RTX raytracing on PC has bad performance and needs to be used with a lot of caution. What has changed?
One thing that most people overlook about SSDs is that to this day they remain an optional piece of hardware. Basically all software is required to work properly without one; you don't see it listed as a hard requirement anywhere. That is to say, no software is actually designed with SSDs in mind. Yes, you can speed up some processes by using an SSD, but it works transparently to most current-gen software. Things get different with next-gen consoles, where an SSD is a given, and while it may not be the fastest in terms of raw bandwidth, it is still lightning fast compared to what current-gen software is designed to work with, on top of being heavily customized to specific needs. So I believe that having an SSD as a standard component of every console unit will lead to a new software design which will make SSD technology really shine, far beyond simply improving loading times in specific places.
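    Putting rough numbers on that (the bandwidth figures below are ballpark assumptions for illustration, not measured specs): designing around a guaranteed SSD changes the per-frame streaming budget by well over an order of magnitude.

        # How much asset data can be pulled from storage in one 60 fps frame,
        # per storage tier. Bandwidth numbers are rough assumed figures.
        STORAGE_MB_PER_S = {
            "7200rpm HDD (what current software must assume)": 100,
            "SATA SSD":                                        550,
            "NVMe SSD (PCIe 3)":                               3500,
            "PS5 SSD (claimed)":                               5500,
        }

        FPS = 60
        for name, mb_s in STORAGE_MB_PER_S.items():
            print(f"{name}: ~{mb_s / FPS:.1f} MB streamable per frame")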
  • tynew
This is every artist's wet dream; finally a step forward. The workflow of making a high poly only to throw it away after baking it down to a lower-res model is obsolete and over a decade old, so the fact that this has come out will unleash our potential!

Don't be scared, guys; this purely benefits us and our artistic qualities as artists. The poly count isn't what stands out here (that's just for marketing); it's the fact that we can now drop the obsolete baking/retopo workflow for environment/hard-surface assets. We will have to learn to texture assets more like film/VFX artists do, but it will be worth it. Real-time engine film making has been crossing over into gamedev for a while now, most prominently with shows like The Mandalorian using Unreal Engine as a backdrop.

Unfortunately, I don't see how this could benefit character artists, since they will still be constrained to making usable topology for animators.

We should all be celebrating the bright future we have. We get more creative freedom to be actual artists, rather than wasting time cranking out technical art that conforms to budgets like machines.
  • Udjani
I was wondering about the hard-surface workflow too, so I made a test.

I took an AK-47 subd model, removed all of the very simple meshes like bolts and cylinders that would be easy to make an optimized version of, and then decimated the rest of the model. It ended up at 460k tris, and the FBX is 44 MB. Compared with an optimized version of another low-poly gun that I have, whose FBX weighs only 880 KB, that's quite a difference.
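    (For what it's worth, those numbers work out to roughly 100 bytes per triangle in the FBX. A quick sanity check; the raw-buffer comparison assumes an uncompressed position + normal + UV vertex layout:)

        # Sanity check on the AK test above: 44 MB FBX at 460k triangles.
        fbx_bytes, tris = 44 * 1024**2, 460_000
        print(f"FBX cost: ~{fbx_bytes / tris:.0f} bytes per triangle")  # ~100 B/tri

        # A raw GPU-style buffer is leaner: 8 floats per vertex (pos + normal + UV,
        # 32 B), ~1 vertex per triangle in a dense mesh, plus 3 x 4-byte indices.
        # Assumed layout, ignoring any engine-side compression.
        raw_bytes = tris * (32 + 3 * 4)
        print(f"Raw buffer: ~{raw_bytes / 1024**2:.0f} MB")  # ~19 MB for the same mesh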

Even if Unreal had no problem at all with a mesh like this, exporting/importing and storing it could be quite a pain.

Anyway, I still think and hope that we can say goodbye to baking normals on razor-sharp edges like we do now. Having a 3-5 edge bevel with weighted normals would make the mesh look much nicer and wouldn't be heavy either.
  • Pain
I hope we can get rid of retopology and baking in the near future. UE5 is so crazy, and it seems like it would make my dream come true.
  • Cless
Are any of the AAA devs here able to tell us the average file size of the meshes in a whole AAA project?

It would be interesting to know whether, or how, it would be feasible to just decimate the highpoly meshes. We would get absurdly high-detail meshes for around 50x or 100x the file size of current meshes (or at least that's what I calculated off the top of my head).
  • Obscura
    As large as your highpoly file is.
  • Prime8
    It looks amazing, especially considering this is running on a console.
I cannot imagine what kind of resources would be needed to create a full-scale game with this level of detail; if just one asset looks "average", it ruins the whole scene. :lol: Would it be feasible?
    As much as I love this eye-candy, in a game I would prefer visual variety to visual awesomeness, if I had to choose.

  • joebount
Would I be wrong to assume that we could see practical use of displacement mapping (with proper subpixel tessellation)? Not really what the presentation was about, but if the engine can handle that kind of polycount, it might become possible to use it in production...
  • zachagreg
    joebount said:
Would I be wrong to assume that we could see practical use of displacement mapping (with proper subpixel tessellation)? Not really what the presentation was about, but if the engine can handle that kind of polycount, it might become possible to use it in production...

I think you'd be in the realm of reality with that assumption. The computation of tessellation and displacement will eventually add up, but rather than super-highpoly assets being thrown into the engine at a couple million polys each, I think we will start to see a great rise in displacement map and tessellation usage in games. That seems like a more workable solution to me than the large file sizes everyone is talking about.

But it may not be the case at all, depending on how "Nanite" works.
  • Obscura
    joebount said:
Would I be wrong to assume that we could see practical use of displacement mapping (with proper subpixel tessellation)? Not really what the presentation was about, but if the engine can handle that kind of polycount, it might become possible to use it in production...
Not really. Based on the little hints here and there, it's more like micropolygons in REYES renderers. Both of these have a wiki page, though. It's some heavily technical stuff, and I think artists don't need to know how it works, because it happens under the hood. It basically recreates your mesh with some dense topology, and auto-LODs with less dense but still dense topology, like mesh shaders.
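    (For the curious, the REYES idea boils down to "split and dice until each micropolygon is smaller than a pixel". A toy one-dimensional sketch of just that principle; real REYES renderers, and whatever Nanite actually does, are far more involved:)

        # Toy REYES-style dicing: split a patch until each micropolygon's
        # projected size drops below ~1 pixel, then it is ready to shade.
        def dice(patch_size_px, max_micropoly_px=1.0):
            if patch_size_px <= max_micropoly_px:
                return 1                   # one micropolygon, shade it
            half = patch_size_px / 2.0     # split into two sub-patches
            return dice(half) + dice(half)

        for size in (1, 8, 64, 512):       # patch sizes in projected pixels
            print(f"{size:>4} px patch -> {dice(size)} micropolygons")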
  • guitarguy00
I have my doubts about this. The demo looks amazing, but the whole 'no baking' talk is nonsensical (from what we currently know). Importing hundreds of static meshes that have been subdivided and dynameshed straight into the engine would send the frame rate into the toilet. The file sizes are massive compared to a baked low poly with a normal map. I can barely import meshes of a few million polys into Marmoset Toolbag on my beefy PC (2070, 32 GB RAM, 3700X, etc.) without long loads and occasional crashes.


  • gnoop
Would be interesting to see how this new system copes with trees, 3D grass, etc. - things that traditionally use alpha clip/blend even in movies.

    And if it's micropolygon-based, my guess is it would still be necessary to bake displacement textures? Streaming all the geometry in and out all the time, as-is, would be insane.

    Even Clarisse, which is able to render billions of polys in a viewport, needs optimization, so I'm not sure how they could do it with the engine alone. Some super-clever AI guessing it all, together with GI, without actual ray tracing?


  • thomasp
Isn't there some presentation coming up soon for the next generation of Nvidia GPUs? I'm betting they are going to one-up this demo with another round of game<->movie convergence.

    Looking forward to people registering to ask if you can have a career in games just sculpting really high poly rocks. ;)

  • oglu
That Nvidia presentation was today, and there was nothing new for gamers, just server GPUs.
  • rollin
Hey, now I know what these >100 TB hard drives will be good for!

    Should be enough for... one or two games... ahhh, com'on... let it be three!

  • J4yst3r
    marks said:
Hundreds of GB, just for meshes. For textures, you're looking in the terabytes range most likely - that's typically what large numbers of Painter files run into, especially at the resolutions they're talking about here.
    So I don't know how any of this works, but out of curiosity, are there not also huge savings to be made by upping the model resolution? As you said, textures are orders of magnitude bigger than meshes, so I would assume removing the need for some of those textures by increasing model complexity would be a net positive for final game size. For example, would a low poly gun with high-res normal and AO maps be larger than a high poly gun that doesn't need those maps?
  • Kanni3d
J4yst3r said:
    ...would a low poly gun with high-res normal and AO maps be larger than a high poly gun that doesn't need those maps?
The highpoly raw mesh of the gun will still need textures somehow... It can't be vert paint, since you'll need multiple channels for the different specular/roughness/etc. values. It'll most likely need a ridiculously high-res texture a la Megascans.
  • Obscura
I don't think we need to worry about the memory cost of high-res textures anymore. Virtual texturing in Unreal does a crazy job of compressing them. Just today I was comparing a sparse 8k non-virtual texture to a virtual one; it had an alpha channel too. The standard one used 65 MB of memory while the virtual one used only 6.
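    (Those numbers line up with simple math, assuming a block-compressed format at ~1 byte per texel and virtual texturing keeping only the sampled pages resident; the 10% residency figure is an assumption:)

        # Fully resident 8k texture at ~1 byte/texel (e.g. a BC7-class format):
        full_mb = 8192 * 8192 * 1 / 1024**2
        # Virtual texturing keeps only the pages actually sampled; for a sparse
        # texture where ~10% of the pages are ever touched:
        resident_mb = full_mb * 0.10
        print(f"fully resident: {full_mb:.0f} MB, VT resident: ~{resident_mb:.0f} MB")
        # -> 64 MB vs ~6 MB, matching the figures above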
  • gnoop
    Obscura said:


Hehe, exactly my thoughts :)
  • J4yst3r
    Kanni3d said:
The highpoly raw mesh of the gun will still need textures somehow... It can't be vert paint, since you'll need multiple channels for the different specular/roughness/etc. values. It'll most likely need a ridiculously high-res texture a la Megascans.
Sure, but you need those textures regardless of which rendering method you use. What I'm wondering is whether the file-size increase from the higher triangle count can be offset by dropping the normal/AO maps.
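    (Rough numbers suggest it's at least in the same ballpark. All figures below are assumptions for illustration: block-compressed 4k maps with mip chains versus extra raw vertex data:)

        # Maps dropped: a 4k normal map at ~1 byte/texel (BC7-class) plus a 4k
        # AO map at ~0.5 byte/texel (BC4-class), each with ~33% mip overhead.
        normal_map_mb = (4096 * 4096 * 1.0 / 1024**2) * 1.33
        ao_map_mb     = (4096 * 4096 * 0.5 / 1024**2) * 1.33

        # Geometry added instead: say 400k extra triangles at ~44 bytes each
        # (vertex + index data, same assumed layout as earlier in the thread).
        extra_geo_mb = 400_000 * 44 / 1024**2

        print(f"maps dropped:   ~{normal_map_mb + ao_map_mb:.0f} MB")
        print(f"geometry added: ~{extra_geo_mb:.0f} MB")  # comparable order of magnitude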
  • poopipe
I'm going to work on the assumption that Nanite revolves around vector displacement or voxelization, since I can't think of any other way to make a mesh resolution-independent. Whether that's exposed to the user or done as part of the build process is anyone's guess at this stage.

Assuming that's the case (it probably is), they'll be streaming in whichever mip of the displacement texture is required for the current view distance and generating the geometry from that, so in terms of resource cost at runtime it should be pretty dynamic and flexible (and efficient).
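    A hypothetical sketch of that mip pick (the texture size, surface size, FOV, and screen height are all assumed parameters): choose the mip whose texel density works out to roughly one texel per screen pixel at the current distance.

        import math

        # Pick which mip of a streamed displacement map a surface needs,
        # targeting ~1 texel per screen pixel at the current view distance.
        def mip_for_distance(distance_m, texture_px=8192, surface_size_m=4.0,
                             fov_deg=60.0, screen_h_px=1440):
            px_per_m = screen_h_px / (2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0))
            texels_per_m = texture_px / surface_size_m
            mip = max(0.0, math.log2(texels_per_m / max(px_per_m, 1e-6)))
            return min(int(mip), int(math.log2(texture_px)))  # clamp to the mip chain

        for d in (1, 5, 25, 100):
            print(f"{d:>4} m -> stream mip {mip_for_distance(d)}")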



You'll still have to make good models with UVs and LODs though, I'm afraid; this stuff can only add information, not take it away.

  • rollin
Btw, does anybody know if this is true for all meshes or only static stuff? Like, can skinned meshes also be made out of a zadrillion triangles?
  • oglu
    You have to animate them. Clean topo is still needed. 