
How do I bake maps from an extremely high-poly model?

Hello! I recently created a 3D model using photogrammetry. It consists of about 180 million polygons. I tried to bake a normal map and a vertex color map from it using xNormal, but the program renders only a blue square. Please help! (Sorry for my English, it's not my native language.)

Replies

  • musashidan
    Can't you optimise it further? You would need a huge-resolution map or multiple large UDIMs to even capture that much detail.

    Working with that much data in a single mesh is not a good workflow. I suggest you revisit and aim for a much lower res at source.
  • darkperson8
    musashidan said:
    Can't you optimise it further? You would need a huge-resolution map or multiple large UDIMs to even capture that much detail.

    Working with that much data in a single mesh is not a good workflow. I suggest you revisit and aim for a much lower res at source.
    Yes, I need a huge-resolution texture from this mesh. I set the resolution to 4K just for a test bake. I would like to bake an 8K-16K texture, and I don't want to decimate the mesh.
  • gnoop
    xNormal is definitely capable. I once baked 15 meters of dirt road surface, in the form of 750 million polygons, from Agisoft Metashape (formerly Photoscan). It took forever just to load the OBJ, and a whole night to bake.
    Decimation (optimization) is not really an option for such high-res meshes, because it needs much more RAM than producing the mesh itself did, in both RealityCapture and Metashape, and it starts to eat details after 5-7x decimation anyway.

    Better to just do it in parts.
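    A rough sketch of one way to "do it in parts" without ever opening the scan in a DCC: stream the OBJ and route its faces into spatial chunks along the X axis. This is only an illustration, not gnoop's exact workflow; it assumes a triangulated, Metashape-style OBJ with positive indices, it drops any vt/vn data, the filename and chunk count are hypothetical, and the vertex table for a 180-million-poly file still needs plenty of RAM, so test on a decimated copy first.

    ```python
    from array import array

    def split_obj(path, n_chunks=4):
        # Pass 1: collect vertex X coordinates (the split key) and keep the
        # raw "v ..." lines so they can be re-emitted per chunk.
        xs = array("f")
        vlines = []  # this table is the RAM hog for very large scans
        with open(path) as f:
            for line in f:
                if line.startswith("v "):
                    xs.append(float(line.split()[1]))
                    vlines.append(line)
        lo, hi = min(xs), max(xs)
        width = (hi - lo) / n_chunks or 1.0  # avoid div-by-zero on flat input

        outs = [open(f"chunk_{i}.obj", "w") for i in range(n_chunks)]
        remap = [{} for _ in range(n_chunks)]  # global -> local vertex index

        # Pass 2: route each face to a chunk by its first vertex's X position,
        # writing each vertex on first use so local 1-based indices stay valid.
        with open(path) as f:
            for line in f:
                if not line.startswith("f "):
                    continue
                idx = [int(tok.split("/")[0]) for tok in line.split()[1:]]
                c = min(int((xs[idx[0] - 1] - lo) / width), n_chunks - 1)
                local = []
                for gi in idx:
                    if gi not in remap[c]:
                        remap[c][gi] = len(remap[c]) + 1
                        outs[c].write(vlines[gi - 1])
                    local.append(remap[c][gi])
                outs[c].write("f " + " ".join(map(str, local)) + "\n")
        for o in outs:
            o.close()

    split_obj("scan_highpoly.obj", n_chunks=4)  # hypothetical filename
    ```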

  • darkperson8
    gnoop said:
    xNormal is definitely capable. I once baked 15 meters of dirt road surface, in the form of 750 million polygons, from Agisoft Metashape (formerly Photoscan). It took forever just to load the OBJ, and a whole night to bake.
    Decimation (optimization) is not really an option for such high-res meshes, because it needs much more RAM than producing the mesh itself did, in both RealityCapture and Metashape

    Okay, I will check my bake plane; maybe it's corrupted. Thanks for the explanation.
  • darkperson8
    gnoop said:
    xNormal is definitely capable. I once baked 15 meters of dirt road surface, in the form of 750 million polygons, from Agisoft Metashape (formerly Photoscan). It took forever just to load the OBJ, and a whole night to bake.
    Decimation (optimization) is not really an option for such high-res meshes, because it needs much more RAM than producing the mesh itself did, in both RealityCapture and Metashape, and it starts to eat details after 5-7x decimation anyway.

    Better to just do it in parts.

    I'm sorry, I don't know how to do this. Can you explain it to me, or point me to an article or video on the topic?
  • musashidan
    You would need a 16K map; 8K will only capture about 64 million polys.

    Split your mesh into parts as gnoop advised: break it into manageable chunks and composite-bake the map, so you would be baking to the same map each time but using different areas of UV space. I haven't used xNormal in years, but I'm sure it can composite-bake like this.
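    The arithmetic behind the 16K advice: an 8K map is 8192 x 8192 ≈ 67 million texels, roughly one texel per source triangle for a 64-million-poly mesh, while 16K gives four times as many. As for compositing, here is a minimal sketch of the merge step done outside the baker with NumPy and Pillow (not xNormal's own compositing), assuming every chunk is baked against the same low-poly UV layout at the same resolution and the background is left at flat tangent-space blue (127, 127, 255); the filenames are hypothetical.

    ```python
    import numpy as np
    from PIL import Image

    # Assumed bake background: flat tangent-space blue. Set the same colour
    # in the baker's options, or change this constant to match yours.
    BACKGROUND = np.array([127, 127, 255], dtype=np.uint8)

    def composite(paths, out_path):
        result = None
        for p in paths:
            img = np.asarray(Image.open(p).convert("RGB"))
            if result is None:
                result = img.copy()
                continue
            # Take only texels this partial bake actually rendered, i.e.
            # anywhere the pixel differs from the background colour.
            mask = np.any(img != BACKGROUND, axis=-1)
            result[mask] = img[mask]
        Image.fromarray(result).save(out_path)

    # Hypothetical filenames: one partial bake per mesh chunk, shared UV layout.
    composite(["bake_part1.png", "bake_part2.png", "bake_part3.png"],
              "normal_16k_composite.png")
    ```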
  • darkperson8
    musashidan said:
    You would need a 16K map; 8K will only capture about 64 million polys.

    Split your mesh into parts as gnoop advised: break it into manageable chunks and composite-bake the map, so you would be baking to the same map each time but using different areas of UV space. I haven't used xNormal in years, but I'm sure it can composite-bake like this.
    Thank you very much! But forgive the stupid question: how do I split the mesh into parts? It is very high-poly, and my computer can't edit such a huge number of polygons.
  • gnoop
    I actually meant doing "chunks" in Metashape, or building one part of the initial low-res point cloud in RealityCapture, then another part. Although I agree it's a huge pain in the.... I think 60 million would already be OK for Max and ZBrush 2019.

    xNormal can bake to 32K textures too, but it all depends on how much RAM you have. 32GB is an absolute minimum for such things. Photogrammetry is an extreme RAM hog.

    Clarisse can load and bake up to 300 million polygons, I think. It takes forever to load there too.

    xNormal is still definitely the champion.
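    A quick back-of-envelope calculation supports the 32GB figure: the bake buffer alone, at 4 channels of 32-bit float, scales like this, and the baker still has to hold the mesh, its ray-acceleration structure, and any supersampling buffers on top.

    ```python
    # Rough size of one square bake buffer at 4 channels x 32-bit float.
    def buffer_gib(size, channels=4, bytes_per_channel=4):
        return size * size * channels * bytes_per_channel / 2**30

    for s in (8192, 16384, 32768):
        print(f"{s} x {s}: {buffer_gib(s):.0f} GiB")
    # 8192 x 8192:    1 GiB
    # 16384 x 16384:  4 GiB
    # 32768 x 32768: 16 GiB
    ```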


  • darkperson8
    gnoop said:
    I actually meant doing "chunks" in Metashape, or building one part of the initial low-res point cloud in RealityCapture, then another part. Although I agree it's a huge pain in the.... I think 60 million would already be OK for Max and ZBrush 2019.

    xNormal can bake to 32K textures too, but it all depends on how much RAM you have. 32GB is an absolute minimum for such things. Photogrammetry is an extreme RAM hog.

    Clarisse can load and bake up to 300 million polygons, I think. It takes forever to load there too.


    Thank you! I'll try Clarisse, whatever that is. But I still have a question: if I do not have enough RAM, will xNormal be unable to bake the maps correctly?
  • gnoop
    It should show you something, an error message: "not enough RAM" or similar. If it doesn't, it's more likely something in the settings, an axis mismatch maybe. I recall I once saved the hi-res as OBJ and the low-poly target as FBX, and their scales were mismatched.

    xNormal can be quite puzzling; it's hard to tell what's wrong. Try increasing the ray cast distance, maybe: try 5 instead of the default 0.5, for both front and rear.
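    One cheap way to confirm a scale or unit mismatch before re-baking: compare the meshes' bounding boxes. A minimal sketch in plain Python for OBJ files (export the FBX bake plane to OBJ for the check, or read its size from your DCC's transform panel); the filenames are hypothetical, and streaming line by line keeps memory flat even on a multi-gigabyte scan.

    ```python
    def obj_bounds(path):
        lo = [float("inf")] * 3
        hi = [float("-inf")] * 3
        with open(path) as f:
            for line in f:
                if line.startswith("v "):
                    for i, v in enumerate(map(float, line.split()[1:4])):
                        lo[i] = min(lo[i], v)
                        hi[i] = max(hi[i], v)
        return lo, hi

    for name in ("scan_highpoly.obj", "bakeplane.obj"):  # hypothetical names
        lo, hi = obj_bounds(name)
        print(name, "extents:", [round(h - l, 3) for h, l in zip(hi, lo)])
    # If one mesh comes out ~100x the size of the other, the exporters
    # disagreed about units (cm vs m) and the rays never hit the high poly.
    ```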
  • darkperson8
    gnoop said:
    It should show you something, an error message: "not enough RAM" or similar. If it doesn't, it's more likely something in the settings, an axis mismatch maybe. I recall I once saved the hi-res as OBJ and the low-poly target as FBX, and their scales were mismatched.

    xNormal can be quite puzzling; it's hard to tell what's wrong. Try increasing the ray cast distance, maybe.

    And this is probably my case. You've opened my eyes! Thank you! I will try to fix that.
  • darkperson8
    gnoop said:
    It should show you something, an error message: "not enough RAM" or similar. If it doesn't, it's more likely something in the settings, an axis mismatch maybe. I recall I once saved the hi-res as OBJ and the low-poly target as FBX, and their scales were mismatched.

    xNormal can be quite puzzling; it's hard to tell what's wrong. Try increasing the ray cast distance, maybe.

    My highpoly is an OBJ, but the bake plane is an FBX.
  • musashidan
    darkperson8 said:
    Thank you very much! But forgive the stupid question: how do I split the mesh into parts? It is very high-poly, and my computer can't edit such a huge number of polygons.
    Well, that's why I said working with extremely dense meshes is not a good workflow. Everything slows to a crawl and becomes unmanageable unless you have a monster machine.

    Even scanning in parts and patching them together at a high polycount is a nightmare. What asset are you working on?
  • Mark Dygert
    (Looks at desktop. Runs screaming) AHHHHH! 

    But seriously, 180 million polys is a lot. Anything you can do to bring that number down will help. I've never baked anything that high in either baker, but if one baker fails, maybe another will succeed?

    You might want to try MightyBake: https://www.mightybake.com/product-features/

    It sounds like you should break your object up into separate meshes/files. MightyBake lets you break your high-res model into several files, similar to xNormal, but you can also use name matching in a single file, so several objects can be rendered independently as long as the names match. xNormal might do this too, but I've never done it and can't find a way to do it; I'm probably missing something.

    MightyBake might do it in a slightly different way that might allow the bake to squeak by, but really... 180 million polys is a lot...

    I think MightyBake uses both the GPU and the CPU to bake; xNormal might just use one or the other, I don't think it uses both. Maybe I'm wrong. Documentation has never been xNormal's strong suit.
  • darkperson8
    musashidan said:
    Thank you very much! But forgive the stupid question: how do I split the mesh into parts? It is very high-poly, and my computer can't edit such a huge number of polygons.
    Well, that's why I said working with extremely dense meshes is not a good workflow. Everything slows to a crawl and becomes unmanageable unless you have a monster machine.

    Even scanning in parts and patching them together at a high polycount is a nightmare. What asset are you working on?
    i5-4460, GTX 1060 3GB, 16GB DDR3. (I originally built it for video games, but now I'm completely devoted to 3D. Well, almost completely.)
  • darkperson8
    Mark Dygert said:
    (Looks at desktop. Runs screaming) AHHHHH!

    But seriously, 180 million polys is a lot. Anything you can do to bring that number down will help. I've never baked anything that high in either baker, but if one baker fails, maybe another will succeed?

    You might want to try MightyBake: https://www.mightybake.com/product-features/

    It sounds like you should break your object up into separate meshes/files. MightyBake lets you break your high-res model into several files, similar to xNormal, but you can also use name matching in a single file, so several objects can be rendered independently as long as the names match. xNormal might do this too, but I've never done it and can't find a way to do it; I'm probably missing something.

    MightyBake might do it in a slightly different way that might allow the bake to squeak by, but really... 180 million polys is a lot...

    I think MightyBake uses both the GPU and the CPU to bake; xNormal might just use one or the other, I don't think it uses both. Maybe I'm wrong. Documentation has never been xNormal's strong suit.
    Thank you for the advice! About my desktop: genius rules over chaos.
  • gnoop
    My guess is that the real target here is a texture rather than a mesh. For all the impressive things people do in Substance Designer, it's still days of R&D and non-stop tweaking, sometimes a whole week if you are doing something new. Photogrammetry gives you a super-realistic result in maybe half a day.
    And with software like Artomatix and a scanned chunk 2-3 times bigger than your target texture, you can make countless variations that look better than SD ones, where in fact only the one you are working on looks truly real and half of what the random seed gives you is quite bizarre.

  • darkperson8
    gnoop said:
    My guess is that the real target here is a texture rather than a mesh. For all the impressive things people do in Substance Designer, it's still days of R&D and non-stop tweaking, sometimes a whole week if you are doing something new. Photogrammetry gives you a super-realistic result in maybe half a day.

    Totally agree. Can I ask one more question: is it possible to make money with photogrammetry?
  • gnoop
    I have no idea; I only do it for our current projects, not for sale.
    Megascans makes money, I guess. But it looks like they don't bother building anything bigger than 2x2 meters, probably because of the same pain in the..., and unfortunately such things are of little help for what we do.

  • darkperson8
    gnoop said:
    I have no idea; I only do it for our current projects, not for sale.
    OK, thank you! I will try to fix it and then report back.
  • musashidan
    What asset are you working on?
    i5-4460, GTX 1060 3GB, 16GB DDR3. (I originally built it for video games, but now I'm completely devoted to 3D. Well, almost completely.)
    I meant: what mesh/subject (asset) are you working on? It helps a bit if we know.

    But still, your RAM is going to let you down a LOT.
  • darkperson8
    It works! Thank you all, and especially gnoop! It was all about the file formats.
  • darkperson8
    musashidan said:
    What asset are you working on?
    i5-4460, GTX 1060 3GB, 16GB DDR3. (I originally built it for video games, but now I'm completely devoted to 3D. Well, almost completely.)
    I meant: what mesh/subject (asset) are you working on? It helps a bit if we know.

    But still, your RAM is going to let you down a LOT.
    Bricks, as you can see.
  • gnoop
    Bricks are a thing Substance Designer does really quickly and easily, though; basically anything man-made is. It's pure Mother Nature where photogrammetry helps, IMO.
  • darkperson8
    gnoop said:
    Bricks are a thing Substance Designer does really quickly and easily, though; basically anything man-made is. It's pure Mother Nature where photogrammetry helps, IMO.
    It's just a kind of training exercise. I want to go through the whole texture-creation process myself.