
Strategy for combining 100 Texture sets (Blender > Substance > Blender > Unreal?)

GoLionMk2 polycounter lvl 2
Polycount is one of the most hardcore CG modeling / rendering resources online. Because of that, I'm really hoping someone can advise on something I've been struggling with for months.

My end goal is a game-ready composite asset in Unreal, made up of around 100 medium-sized models (a dinosaur skeleton from scan data).
I have high-res bone scans, and have successfully reduced the polys of a single bone in Blender (from 1 million to 5k), then baked normal / albedo / roughness maps in Substance Painter from the high-res scan onto the low poly and exported 2K textures.
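
(For anyone following the same workflow: the reduction step can be done with a collapse Decimate modifier. Here's a rough bpy sketch of that step; the object name and face target below are just placeholders.)

    import bpy

    # Rough sketch: collapse-decimate one high-res bone scan down to ~5k faces.
    # "Bone_High" is a placeholder name for the imported scan mesh.
    obj = bpy.data.objects["Bone_High"]
    target_faces = 5000

    mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
    mod.decimate_type = 'COLLAPSE'
    # Ratio = target faces / current faces, clamped for meshes already under target.
    mod.ratio = min(1.0, target_faces / len(obj.data.polygons))

    # Apply the modifier so the low poly can be exported for baking in Substance.
    bpy.context.view_layer.objects.active = obj
    bpy.ops.object.modifier_apply(modifier=mod.name)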

That looks amazing for a single bone in Unreal (a 5,000-poly bone with 2K normal/albedo/roughness maps baked from the 1-million-poly scan).

Now here's the problem: I don't think I can have 100 5k models in Unreal, each with its own 2K texture set. From what I've read, that might be too many draw calls. The poly count doesn't seem to be a problem (500k total), but the textures are likely too much.

So, do I then import everything back to Blender, connect all the textures up to the low-res models, and re-bake / combine the 2K bone texture sets in groups of 5 or 10, so that every 10 models share one set of 2K maps? That's going to be a super labour-intensive process, and one I'm not even sure will work. If anyone has any suggestions for me I'd be really grateful.




Replies

  • pior grand marshal polycounter
    UE4 has a built-in tool to do exactly that on static meshes.
    Window > Developer Tools > Merge Actors.

    Assuming that your individual UVs are not too wasteful to begin with, it will pack all the source textures into atlases and create a single static mesh with a single mat ID, filled by a single lightweight material combining everything (and that also includes any color tinting done inside the original materials). Incredibly useful stuff.
  • Eric Chadwick
    What do you mean exactly by "game-ready"? Depends on the game, the hardware platform, and how the model is being used. A mobile game has very different needs than a console game.
    http://wiki.polycount.com/wiki/Polygon_Count#Typical_Triangle_Counts

    Decide your target resolution, then work backwards from there. 200k tris with 1 material and a 4096x4096 texture set? 30k tris with 1 material and 512x512 textures?

    The Smithsonian released a bunch of scanned meshes as open source models recently. You could examine those to see how they're assembled. Triceratops scan:
    http://www.3d.si.edu/object/3d/triceratops-horridus-marsh-1889:d8c623be-4ebc-11ea-b77f-2e728ce88125

    If you download the low-resolution GLB file, you can load it in https://sandbox.babylonjs.com/ and see how it's built, and just how much detail you get from a 150k-triangle model with 4096 textures.

    Plenty of resolution for a showcase asset, like if a player needs to hide inside the skeleton to avoid being eaten by a zombie horde, or whatever.



  • Obscura grand marshal polycounter
    You can easily have 100 5k models and 100 2K texture sets. It will eat up a great amount of video memory, but most modern graphics cards should be able to handle this with ease. You wouldn't be able to make a full game like this, but I guess that's not the goal here. I once worked on a project that had more than 100 8K textures, and it was still ok. I don't suggest going that way, I'm just saying that it's clearly doable if you have a decent machine.

    A 2K texture uses around 8 MB of video memory if it isn't a normal map; for a normal map, it's double. So even if you don't pack the grayscale textures, and we count 4 textures + a normal, that's 48 MB for one texture set. Multiplied by 100, that's still only 4800 MB. Meshes use very little video memory, so the whole thing could end up being less than 6 GB. Packing the grayscales would reduce the texture memory from 4800 to 3200 MB. Enabling virtual texturing would reduce this even further; depending on how much of the UV space is filled and how high-frequency the detail in the textures is, it might even cut it in half. With tightly packed UVs or high-frequency detail, not so much though.
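
    As a quick sanity check, here's the same math in a few lines of Python, using the same rough per-texture figures as above (real sizes depend on the compression format and mip chain, so treat these as ballpark numbers):

        # Ballpark VRAM estimate, using ~8 MB per compressed 2K color/grayscale map
        # and double that for a normal map (rough figures, not exact).
        MB_PER_2K_COLOR = 8
        MB_PER_2K_NORMAL = 16
        SETS = 100

        # Unpacked: albedo + 3 grayscale maps + 1 normal map per bone.
        per_set_unpacked = 4 * MB_PER_2K_COLOR + MB_PER_2K_NORMAL      # 48 MB
        # Packed: grayscales combined into the channels of one texture.
        per_set_packed = 2 * MB_PER_2K_COLOR + MB_PER_2K_NORMAL        # 32 MB

        print(SETS * per_set_unpacked)  # 4800 MB
        print(SETS * per_set_packed)    # 3200 MB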

    Otherwise, I can confirm that the solution suggested by pior works great.
  • GoLionMk2 polycounter lvl 2
    Wow - you guys are THE BEST. Polycount experts dropping the knowledge!

    @pior that's a great tip, I had no idea and will take a look.

    @Eric Chadwick thanks for the amazing info, and sorry for the generality of 'game ready'. It's for VR on an Oculus with a 1070 GPU.
    The skeleton is the main focus of the experience, with relatively few other assets in the world except for the hall it's in.
    I'm reducing the bones to 5k each, so across 130 bones that's 650k polys. Does that sound reasonable?
    That's a fantastic link to that Smithsonian model! I will go look at that right away. Regarding textures: that model looks like it shares one texture set, right?
    One challenge I face is that the Blender file is way too big to join / decimate the whole model at once, so I'm exporting bones one by one to decimate, then baking the high to low in Substance and exporting the maps. Where I'm a little confused is how to join the 100 models / 100 texture sets back together (I have a bake tool for Blender and am hoping that will work, but I'm worried my UVs aren't efficient enough for the atlas). Although if @Obscura's suggestion is correct (just have the 100 models with their texture sets in there), then maybe I can get away with that? Maybe it would be better to use 1K textures for all 100 bones (rather than a 4K atlas) to save on video memory?

    Here's an image of what one of my UVs looks like on the low poly.

    If you guys have any other tips, let me know (especially anything to be aware of with the pior approach in Unreal, as opposed to loading and baking texture atlases in Blender, etc.).

    You guys are GOLD. Thank you so much. I've been asking on the Blender Artists and Unreal forums and many more, but have never gotten the amazing knowledge I have here. I need to spend way more time on Polycount.





  • GoLionMk2 polycounter lvl 2
    @pior does that look like an overly wasteful UV? It's using Blender's Smart UV unwrap.
  • Obscura grand marshal polycounter
    It definitely is. 
  • GoLionMk2 polycounter lvl 2
    Obscura said:
    It definitely is. 

    Definitely overly wasteful? Do you have any suggestions on what I should be doing differently?
    Maybe I should be defining one long seam on the inside of the bone. It seems like it would be super slow to click each vertex to split, especially in a 100+ bone workflow. Any tips appreciated.

  • Obscura grand marshal polycounter
    You should use a better packing tool; I'm pretty sure there are many for Blender. All those tiny UV islands could be placed around the large pieces, filling up the huge gaps between them. You also don't need as many splits. Such a piece could be unwrapped into 1-2 islands, which would give you a more performant mesh overall (UV splits are not entirely free), less visible seams, better mip mapping, and better use of texture space -> better resolution.
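
    Even before reaching for an add-on, Blender's built-in operators can re-pack what you already have. Here's a minimal sketch to run with one bone in Edit Mode (the margin value is just a guess; a dedicated packer add-on like UVPackmaster will usually pack tighter):

        import bpy

        # Minimal sketch: re-pack the existing UV islands of the object in Edit Mode,
        # averaging island scale first so big and small pieces share texel density.
        bpy.ops.mesh.select_all(action='SELECT')
        bpy.ops.uv.select_all(action='SELECT')
        bpy.ops.uv.average_islands_scale()
        bpy.ops.uv.pack_islands(margin=0.005)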
  • GoLionMk2 polycounter lvl 2
    For example, should I be going in and marking seam edges by hand? I can't seem to click a long edge loop, since it's scan data that's been auto-decimated.

  • GoLionMk2 polycounter lvl 2
    Sorry, I just missed your most recent post.
    I will start searching for best UV unwrapping practices. Sorry for the noob questions.
    If you have any keywords or links, let me know.

  • Obscura grand marshal polycounter
    You could probably select faces instead of edges and split based on the face selection. I'd enable back-face selection and cut it in half. If it's stretching too much, add 1-2 more cuts. Then, if needed, add a few edge cuts here and there. This sounds fast enough to me. Yes, it would probably take like a day of work on such a complex mesh, but you would end up with a few UV islands per piece instead of hundreds.
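
    If you want to semi-automate that, here's a rough bpy sketch of the idea. It assumes you've already selected roughly half of the bone's faces in Edit Mode, and the margin value is just a placeholder:

        import bpy

        # Rough sketch: with ~half of the bone's faces selected in Edit Mode,
        # turn the border of that selection into a seam and unwrap again.
        bpy.ops.mesh.region_to_loop()        # face selection -> its boundary edge loop
        bpy.ops.mesh.mark_seam(clear=False)  # mark the boundary edges as a seam

        # Unwrap the whole piece using the new seam, which packs the few islands too.
        bpy.ops.mesh.select_all(action='SELECT')
        bpy.ops.uv.unwrap(method='ANGLE_BASED', margin=0.003)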
  • GoLionMk2 polycounter lvl 2
    Obscura said:
    You could probably select faces instead of edges and split based on the face selection. I'd enable back-face selection and cut it in half. If it's stretching too much, add 1-2 more cuts. Then, if needed, add a few edge cuts here and there. This sounds fast enough to me. Yes, it would probably take like a day of work on such a complex mesh, but you would end up with a few UV islands per piece instead of hundreds.

    Thanks for the workflow suggestion -- that sounds like a great plan. Thanks BIG TIME