
Professionals, is Nanite still leading the way as the new paradigm in the industry?

jiggywattart polycounter lvl 6
I'm still a bit confused by the carefree "polycount doesn't matter" attitude around Nanite. I know it performs amazingly well for static meshes, but I can't wrap my head around the logistics of importing 5M-poly meshes into UE5.

1. If you're importing ~1GB+ FBX files into UE5, wouldn't that be impractical just for syncing data, and wouldn't it accumulate an absurd amount of disk space in the long run? (Rough back-of-envelope below.)

2. Do programs like Substance Painter and RizomUV even handle insanely high polycounts well? Hard to imagine Rizom was designed to unwrap a 5M-poly asset with ease.
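
For point 1, here's the rough back-of-envelope that worries me. This is only a sketch; the per-vertex sizes below are my assumptions about uncompressed FBX-style attribute data, not measurements of any real file:

# Rough raw size of a 5M-triangle mesh before any compression.
# Per-element sizes are assumptions (typical uncompressed attribute layout).
tris = 5_000_000
verts = tris // 2              # ~0.5 verts per tri for a closed mesh; UV seams and hard edges add more

bytes_per_vert = (
    3 * 4      # position, float3
    + 3 * 4    # normal, float3
    + 4 * 4    # tangent + sign, float4
    + 2 * 4    # one UV set, float2
)
index_bytes = tris * 3 * 4     # 32-bit indices

total = verts * bytes_per_vert + index_bytes
print(f"~{total / 1024**2:.0f} MB of raw vertex/index data")   # ~172 MB

And that's one asset with a single UV set and no vertex colors; an ASCII FBX or a mesh with extra attributes gets into the GB range quickly, which is exactly the syncing/disk problem I'm asking about.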

Replies

  • zetheros interpolator
    IMO polycount will always matter and good topology flow will always matter. Until further developments are made to handle and work with limitless amounts of polys, as well as rig, animate, and texture-map Nanite assets, I don't see Nanite as a particularly useful feature.

    Also, why would a game developer want to bloat their game's file size any more than it has to be? Most studios are already dismal at optimizing their games. This leads into a disappointing trend that I'm noticing: increasingly powerful hardware and software compensating for developers' lack of fundamentals.
  • Neox godlike master sticky
    "Most studios are already dismal at optimizing their games. This leads into a disappointing trend that I'm noticing; increasingly powerful hardware and software compensating for developer's lack of fundamentals."

    I mean, there is your answer ^^
    Nanite is a way to save money on optimization, on training people in good workflows, on many steps in production. Only at the cost of bigger hard drives. Some will totally go for it.

    That being said, nobody forces you to feed abysmally inefficient data into Nanite.
  • zetheros interpolator
    There are certainly niche cases where Nanite shines, but for general-purpose development you can achieve plenty of good results with texture maps and lighting. Oh, another thing to consider is that Nanite will lock you into using UE5; as far as I know, no other game engine has this tech. If UE5 decides to pull a Unity, you won't be able to change engines as easily. UE5 already kind of has a monopoly if you're wanting to make a high-fidelity 3D game, and a number of studios have switched from their own engines to UE5 or are in the process of doing so, like CD Projekt Red. It might only be a matter of time before something bad happens, as it does with most monopolies.
  • Joao Sapiro sublime tool
    I think you're conflating two issues: one is Nanite being used as a "mop" for bad optimization skills, and the other is complaining about the lack of optimization in recent games. Both are true in my opinion, but Nanite, like all the other new advancements, is a tool that, if used correctly, has clear advantages and saves time in production, and sadly, saving time is more important than saving resources in production.
  • thomasp hero character
    Nanite-like tech is in the Chinese-market-specific (for now, anyway) branch of Unity. Seems fair to expect this stuff to show up in other engines in time as well.

    In my opinion, disk space/download size concerns will make sure this tech gets applied very selectively on projects that actually have to ship. Download size restrictions for games on online marketplaces are still a thing, I believe.
    And as for whether the tools can handle all that data - of course not (well). ;) You always had to be patient (and sometimes work around stability issues) when doing really high-spec work; the same is bound to apply in this case.
  • Neox godlike master sticky
    Again, you can use Nanite with optimized geometry and textures and still profit from the things Nanite does. You do not have to put in millions of polys with shitty UVs and gigantic textures. It does not have to be huge-filesize stuff.
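
    For what it's worth, flipping that switch on a normally budgeted mesh is trivial. A minimal editor-scripting sketch, assuming UE5.x's Python API exposes the nanite_settings / MeshNaniteSettings names the way recent 5.x releases do (the asset path is made up; check against your engine version):

    # Sketch: enable Nanite on an already-optimized static mesh from editor Python.
    # Assumption: UE5.x exposes StaticMesh.nanite_settings (MeshNaniteSettings).
    import unreal

    asset_path = "/Game/Props/SM_Rock_Optimized"   # hypothetical asset path
    mesh = unreal.EditorAssetLibrary.load_asset(asset_path)

    nanite = mesh.get_editor_property("nanite_settings")
    nanite.set_editor_property("enabled", True)
    mesh.set_editor_property("nanite_settings", nanite)

    # The editor regenerates the Nanite data when the change is applied/saved
    # (exact rebuild behavior may vary by engine version).
    unreal.EditorAssetLibrary.save_asset(asset_path)

    Same asset, same UVs, same textures - Nanite just takes over the cluster/LOD handling from there.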
  • myclay polycounter lvl 10
    Rizom can load big files quickly, like a hot knife through butter, and stays performant while you do the UVing.
    Here's a video from RizomUV VS 2018 showing that 600k+ triangles are easily possible.
    Substance Painter can import big data for baking with little sweat.
    It all depends on the hardware you have.

    Some treat GPUs as if they were cheap to upgrade; in reality, GPU prices have skyrocketed in recent years.
    GPUs are by far the most expensive upgradable part in a PC.
    SSDs are (again) getting more expensive.
    I also occasionally check storage sizes in portable devices, or test directly on them.
    From an indie gamedev view (without sponsorship from hardware manufacturers), I keep those things in mind and optimize.

    The addition of AutoLOD to all major game engines (closed as well as open source) was amazing.
    Knowing the fundamentals and still being able to implement the occasional manually created LOD for important meshes is helpful.
    Fundamentals span many fields, but it's an enjoyable process.
    FSR/DLSS felt, imo, more exciting than Nanite.
    Tech like FSR/DLSS I treat as a cherry on top; I optimize my project(s) to run as fast as possible without it enabled.

    EDIT: forgot to add:
    Unreal is used in games, moviemaking, etc., and caters to many fields. Some features can be utilized better in some fields than in others.
    Build around the hardware requirements of the people you want to reach. Gamedev and film have different needs and demands, both during creation and when enjoyed as a final product.

    The majority of people here are doing gamedev, and that's why you will see gamedev/artist-centric answers.
    Answers come from a wide variety of people: big studios, outsourcing studios, indies, tech artists, generalists, etc.

    Thing is, Unreal is not only used in gamedev but also in all kinds of filmmaking projects.
    Depending on the filmmaking setup with Unreal, you don't have to care (that much) about realtime, and non-realtime recording can be done well with Unreal.
    The VFX Recommended Hardware list gives you a glimpse into professional filmmaking setups, which differ vastly from what the majority of players use.
    https://docs.unrealengine.com/4.26/en-US/WorkingWithMedia/InCameraVFX/InCameraVFXRecommendedHardware/

    For gamedev-centric stuff, I would look, for example, at the Steam Hardware Survey and build games roughly around the hardware mentioned there. Depending on how long it takes to build your game, you could take some educated guesses and use a better PC for testing.
    https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam


  • zetheros interpolator
    Apparently UE5.5 has Nanite for skeletal meshes now. The question is how performant it will be. The guy in the video is getting around 25 fps, but that's on a laptop of unknown make, and with several dozen characters on screen. https://www.youtube.com/watch?v=-I1Y0s_tZa0

  • Shrike interpolator
    Nanite is cool but 

    The thing is, just because you can render a 5-million-poly mesh does not mean you want to work with a 5-million-poly mesh.
    Painfully slow export and import, terrible UV mapping, crashes, memory issues, insane repositories, long pull and push times, bloated hard drives.
    Then we get things such as scripting, booleans and whatnot struggling to iterate over millions of polys on the CPU.
    Or your Houdini pipeline suddenly taking minutes instead of seconds.

    So the best practice is to be reasonable and not use it just because you can.
    Especially if you use a standard workflow with diffuse + a full stack of maps, you'll accumulate a huge amount of file size (rough numbers below).

    Also, Nanite right now is not an optimization. Nanite is still a performance downgrade.
    Nanite is a graphics upgrade. You can render far greater things relatively cheaply, but it's not cheaper than classic meshes + LODs. It simply scales upwards amazingly well.
    Same with Lumen. Lumen is amazingly cheap for what it does, but it's never cheaper than just direct lighting. So you get this trickle-down / anchoring effect of taking the features which offer extreme bang for the buck, but they still cost.
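
    To put rough numbers on the filesize point: the map list and bytes-per-pixel figures below are assumptions about a generic 4K metal/rough set with block compression and full mips, not measurements from any particular project.

    # Rough disk footprint of one 4K PBR texture set (block-compressed, with mips).
    # Map list and bytes-per-pixel values are assumptions for a generic stack.
    res = 4096
    pixels = res * res
    mip_factor = 4 / 3                 # a full mip chain adds roughly a third

    maps_bpp = {
        "basecolor (BC7)": 1.0,        # ~1 byte per pixel after block compression
        "normal (BC5)": 1.0,
        "roughness/metal/AO packed (BC7)": 1.0,
        "emissive (BC7)": 1.0,
    }

    total = sum(pixels * bpp * mip_factor for bpp in maps_bpp.values())
    print(f"~{total / 1024**2:.0f} MB per 4K set")   # ~85 MB

    Multiply that by a few hundred unique assets and the texture stack alone is already tens of gigabytes, before any multi-million-poly mesh data gets checked in.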
  • poopipe grand marshal polycounter
    yeah - it's important that people appreciate that while Nanite and Lumen are excellent ways to get really high-quality results very efficiently, that doesn't mean they're cheap.

    It's not really a downgrade once you get past the overhead - in fact, there is little reason not to use Nanite if the platforms you are targeting can handle it. It is a better way to handle meshes than Unreal's alternative, and it really shines when used with upscaling. Also, IMO it is the only way to get foliage that doesn't look shit at a distance.

    Lumen is a good choice in any scenario where baked lighting doesn't work well, or if you need to deal with dusk/dawn in an open world.
  • sprunghunt polycounter
    If you look at the example Nanite meshes that come in Epic-made projects like Valley of the Ancient, they're not raw 1GB meshes. So the idea that using Nanite means you never reduce the polycount on meshes or do any kind of optimization isn't practical.

    Even for non-realtime rendered projects (e.g. film) you usually can't use raw 1GB meshes. An offline render engine still has memory limits on what you can load at any one time. The last time I made something pre-rendered, we had a limit of 1GB for the entire scene.

    Just like with any other advance in rendering technology, you have to adjust your poly limits a bit. But you can't just do anything you want.