I'm still a bit confused by the carefree "polycount doesn't matter" attitude around Nanite. I know it performs amazingly well for static meshes, but I can't wrap my head around the logistics of importing 5M-poly meshes into UE5.
1. If you're importing ~1GB+ FBX files into UE5, wouldn't that be impractical just for syncing data, and wouldn't it accumulate an absurd amount of disk space in the long run?
2. Do programs like Substance Painter and RizomUV even handle insanely high polycounts well? It's hard to imagine Rizom was designed to unwrap a 5M-poly asset with ease.
Replies
Also, why would a game developer want to bloat their game's file size any more than it has to be? Most studios are already dismal at optimizing their games. This feeds into a disappointing trend I'm noticing: increasingly powerful hardware and software compensating for developers' lack of fundamentals.
I mean, there is your answer ^^
Nanite is a way to save money on optimization, on training people in good workflows, on many steps of production, at the cost of nothing more than bigger hard drives. Some will totally go for it.
That being said, nobody forces you to feed abysmally inefficient data into Nanite.
Here's a video of RizomUV VS 2018 showing that 600k+ triangles are easily possible.
GPUs are by far the most expensive upgradable part in a PC.
Fundamentals span many fields, but learning them is an enjoyable process.
Tech like FSR/DLSS I treat as a cherry on top: I optimize my project(s) to run as fast as possible without it enabled.
EDIT: forgot to add:
Unreal is used in games, filmmaking, etc., and caters to many fields. Some features can be utilized better in some fields than in others.
The majority of people here are doing gamedev, and that's why you'll see gamedev/artist-centric answers.
You get answers from a wide variety of people: big studios, outsourcing studios, indies, tech artists, generalists, etc.
Thing is, Unreal is not only used in gamedev but also in all kinds of filmmaking projects.
Depending on the filmmaking setup, you don't have to care (that much) about realtime, and non-realtime recording can be done well with Unreal.
The In-Camera VFX Recommended Hardware list gives you a glimpse into professional filmmaking setups, which vastly differ from what the majority of players use.
https://docs.unrealengine.com/4.26/en-US/WorkingWithMedia/InCameraVFX/InCameraVFXRecommendedHardware/
For gamedev-centric stuff, I would look at the Steam Hardware Survey, for example, and build games roughly around the hardware mentioned there. Depending on how long it takes to build your game, you could make some educated guesses and use a better PC for testing.
https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
The thing is, just because you can render a 5M-poly mesh doesn't mean you want to work with a 5M-poly mesh.
Painfully slow export and import, terrible UV mapping, crashes, memory issues, enormous repositories, long pull and push times, bloated hard drives.
Then we get things such as scripting, booleans, and whatnot struggling to iterate on the CPU over millions of polys.
Or your Houdini pipeline suddenly taking minutes instead of seconds.
So the best practice is to be reasonable and not max out polycounts just because you can.
Especially if you use the standard workflow with diffuse + a full stack of maps, you'll accumulate a huge amount of file size.
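For a rough sense of scale (back-of-the-envelope numbers, not from the thread: assuming 4K maps, BC7 compression at ~1 byte per texel, and ~33% extra for the mip chain):

```python
# Back-of-the-envelope texture footprint per asset with a "full stack" of maps.
# Assumptions (illustrative only): 4K resolution, BC7 (~1 byte per texel),
# ~33% overhead for the mip chain.

def texture_size_mib(resolution=4096, bytes_per_texel=1.0, mip_overhead=1.33):
    return resolution * resolution * bytes_per_texel * mip_overhead / (1024 ** 2)

map_stack = ["base color", "normal", "roughness/metallic/AO", "emissive"]
per_map = texture_size_mib()
print(f"~{per_map:.0f} MiB per 4K map, ~{per_map * len(map_stack):.0f} MiB per asset")
# -> roughly 21 MiB per map and ~85 MiB per asset; a few hundred unique
#    assets puts you in the tens of gigabytes from textures alone.
```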
Also, Nanite right now is not an optimization. Nanite is still a performance downgrade.
Nanite is a graphics upgrade. You can render far more detailed scenes relatively cheaply, but it's not cheaper than classic meshes + LODs. It simply scales amazingly well upwards.
Same with Lumen. Lumen is amazingly cheap for what it does, but it's never cheaper than just direct lighting. So you get this trickle-down / anchoring effect of picking up the features that offer extreme bang for the buck, but they still have a cost.
It's not really a downgrade once you get past the overhead. In fact, there's little reason not to use Nanite if the platforms you're targeting can handle it: it's a better way to handle meshes than Unreal's alternative, and it really shines when used with upscaling. Also, IMO it's the only way to get foliage that doesn't look like shit at a distance.
Lumen is a good choice in any scenario except where baked lighting works well, or where you don't need to deal with dusk/dawn in an open world.
Even for non-realtime rendered projects (e.g. film), you usually can't use raw 1GB meshes. An offline render engine still has memory limits on what you can load at any one time. The last time I made something pre-rendered, we had a limit of 1GB for the entire scene.
Just like with any other advance in rendering technology, you get to raise your poly budgets a bit. But you can't just do whatever you want.
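To put a number on that (a quick sketch with assumed values, not figures from the reply above), a dense triangle mesh already costs a lot of memory before textures even enter the picture:

```python
# Rough memory footprint of a dense triangle mesh, assuming a typical layout:
# position + normal + UV per vertex (32 bytes) and 32-bit triangle indices.
# Real assets add tangents, vertex colors, subdivision, and textures on top.

def mesh_footprint_mib(triangles, bytes_per_vertex=32, bytes_per_index=4):
    vertices = triangles // 2                    # closed meshes have ~T/2 vertices
    vertex_bytes = vertices * bytes_per_vertex
    index_bytes = triangles * 3 * bytes_per_index
    return (vertex_bytes + index_bytes) / (1024 ** 2)

print(f"5M-triangle mesh: ~{mesh_footprint_mib(5_000_000):.0f} MiB")  # ~134 MiB
# A handful of such meshes eats a 1GB scene budget before a single texture loads.
```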
You're not loading the whole mesh with Nanite.
It's streaming in the appropriate level of detail, based on screen coverage, for the clusters you can see.
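Conceptually (a greatly simplified sketch, not Epic's actual algorithm or code), each cluster's level of detail is chosen so that its geometric error stays below roughly a pixel once projected to the screen:

```python
# Greatly simplified sketch of error-driven cluster LOD selection: pick the
# coarsest representation whose world-space error projects to <= ~1 pixel.
# All names and numbers here are illustrative, not Nanite internals.
import math

def projected_error_px(world_error, distance, fov_y_deg, screen_height):
    # World-space error at a given distance, converted to pixels on screen.
    return (world_error / distance) * screen_height / (2 * math.tan(math.radians(fov_y_deg) / 2))

def pick_lod(lods, distance, fov_y_deg=90.0, screen_height=1080, max_error_px=1.0):
    # lods: list of (lod_index, world_space_error), coarsest first.
    for lod, err in lods:
        if projected_error_px(err, distance, fov_y_deg, screen_height) <= max_error_px:
            return lod                      # coarsest LOD that still looks lossless
    return lods[-1][0]                      # otherwise fall back to the finest LOD

lods = [(0, 0.20), (1, 0.05), (2, 0.01), (3, 0.002)]   # hypothetical cluster
print(pick_lod(lods, distance=50.0))  # far away -> coarse LOD (1)
print(pick_lod(lods, distance=2.0))   # up close -> fine LOD (3)
```

Because only visible clusters are considered and the error threshold is tied to screen resolution, the rendering cost tracks screen coverage rather than the source polycount.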