
How to decimate a very big photoscanned mesh (over 370 million faces)?

Hi!
I've photoscanned a piece of tree bark with Agisoft Photoscan. The decimation stage at the end of building the mesh would have taken so long that I set the face count to unlimited and skipped it.
The result is a 370,720,451-face, 33 GB mesh that can't be opened in ZBrush because of its 100-million-face limit.
What tool would you suggest for decimating the mesh down to a level where it can be opened in ZBrush for further work?

Replies

  • oglu (polycount lvl 666)
    You could try Mudbox... I had meshes with around 300 million faces working in Mudbox.
  • poopipe (grand marshal polycounter)
    I'd suggest you let Photoscan decimate it, tbh. ZBrush can handle more than 100 million verts; is it an import restriction?
  • oglu (polycount lvl 666)
    On a single polyshell it's hard to get that high in ZBrush...
  • poopipe (grand marshal polycounter)
    I guess if you're short on memory it might be an issue. Buy more RAM?

    My point really is that it seems prudent to tackle the problem at the source; in this case, by waiting a couple of hours or fiddling with some settings.
  • NullPointer
    oglu said:
    You could try Mudbox... I had meshes with around 300 million faces working in Mudbox.
    I tried Mudbox; it starts importing, then goes "not responding" and nothing happens.
    poopipe said:
    I'd suggest you let Photoscan decimate it, tbh. ZBrush can handle more than 100 million verts; is it an import restriction?
    Photoscan's decimation would take 36 days. I don't have that much time for it.
    poopipe said:
    I guess if you're short on memory it might be an issue. Buy more RAM?

    My point really is that it seems prudent to tackle the problem at the source; in this case, by waiting a couple of hours or fiddling with some settings.
    I have 32 GB (8x4 GB) of DDR4 and I don't even want to think about buying more; the maximum is 64 GB and it would cost a horrible amount of money :(
  • MatejCH (polycounter lvl 6)
    This is probably a long shot, but maybe try MeshLab or Instant Meshes?
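    For what it's worth, MeshLab's quadric edge-collapse decimation can also be run from its Python wrapper, pymeshlab. A minimal sketch, assuming placeholder file names; loading a 33 GB OBJ still needs a lot of RAM, so test on a cropped piece first, and note that older pymeshlab releases (before 2022.2) call the filter simplification_quadric_edge_collapse_decimation instead:

    import pymeshlab

    ms = pymeshlab.MeshSet()
    ms.load_new_mesh("bark_highpoly.obj")   # placeholder path

    # Quadric edge-collapse decimation to get under ZBrush's 100M-face ceiling.
    ms.meshing_decimation_quadric_edge_collapse(
        targetfacenum=50_000_000,
        preservenormal=True,
        preservetopology=True,
    )

    ms.save_current_mesh("bark_decimated.obj")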
  • poopipe (grand marshal polycounter)
    36 days suggests you're feeding Photoscan more than it's designed to handle. I'm not familiar with it, but surely there's a way of either breaking it up into smaller chunks or settling for lower fidelity?
  • Noren (polycounter lvl 19)
    Do you need all those faces? Do they actually add finer detail? This doesn't seem like a practical approach, nor necessary for a scan of tree bark.
    You could already start working with less detail: thin out your point clouds, use medium-detail mesh generation, etc.

    You could also try to split your project into smaller chunks and reassemble it later.
  • musashidan (high dynamic range)
    Noren said:
    Do you need all those faces? Do they actually add finer detail? This doesn't seem like a practical approach, nor necessary for a scan of tree bark.

    100% this

    370 million is insane for the subject. You wouldn't even be able to capture that detail in a single-sheet texture map; you'd need a 128k-resolution map!

    You definitely need to optimise the point cloud at source.
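
    As a back-of-envelope check on those numbers (a rough sketch, assuming roughly one texel per triangle and perfect UV utilisation, which real layouts never reach):

    faces = 370_720_451

    # Texels available per triangle on a single square map of each size.
    for size in (8_192, 16_384, 32_768, 131_072):   # 8k, 16k, 32k, 128k
        print(f"{size // 1024}k map: {size * size / faces:.2f} texels per face")

    # Prints roughly: 8k: 0.18, 16k: 0.72, 32k: 2.9, 128k: 46 texels per face.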
  • TTools (polycounter lvl 4)
    There's an easier way to do this in Agisoft. We do this all the time. So you export your high-poly model (320 million polys, with the texture you generated from your dense cloud). FWIW, I also agree that 370 million is way too much, but having said that: once you have your high-poly model exported and your Agisoft scene saved, simply go back to Workflow > Build Mesh, except switch the source from dense cloud to sparse cloud and choose a more reasonable polycount in the drop-down settings.

    You can get a model in the 30,000 to 60,000 poly range that you can then lay out UVs for and bake the high-poly data down to. It's super simple and really fast, without having to find an application to decimate the high poly itself. Just be sure not to save your Agisoft scene after you've done this, as you don't want to save over the processed high-poly model and texture data within the Agisoft .psx scene.

    Hope this helps :)
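
    If you prefer scripting it, the same Build Mesh step is exposed through the Photoscan/Metashape Python console. A rough sketch of the workflow above, not an exact recipe: argument and enum names differ between PhotoScan 1.x and Metashape 1.x/2.x, so check the Python API reference for your version, and the file name is a placeholder:

    import Metashape

    doc = Metashape.app.document
    chunk = doc.chunk

    # Duplicate the chunk first so the original high-poly model is kept.
    lowpoly_chunk = chunk.copy()

    # Rebuild the mesh from the sparse cloud (tie points) at a small,
    # bakeable face count instead of from the dense cloud.
    lowpoly_chunk.buildModel(
        surface_type=Metashape.Arbitrary,
        source_data=Metashape.TiePointsData,   # the "sparse cloud" in the GUI
        face_count=Metashape.CustomFaceCount,
        face_count_custom=50_000,              # the 30k-60k range mentioned above
    )

    lowpoly_chunk.exportModel("bark_lowpoly.obj")   # placeholder path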
  • NullPointer
    TTools said:
    There's an easier way to do this in Agisoft. We do this all the time. [...]
    Thanks, this worked!
    BTW, you just need to duplicate the model first and then build again from the sparse cloud, so you keep both the high- and low-poly meshes.
  • gnoop (sublime tool)
    TTools said:
    There's an easier way to do this in Agisoft. We do this all the time. [...]
    What do you use to bake the high-poly data? I bet not every 3D package/baker can load a 320-million-poly model or support UDIM, which is usually necessary for such meshes. I use xNormal, but unfortunately there's no UDIM support there. Substance Designer just hangs on such meshes, for example.
  • TTools (polycounter lvl 4)
    gnoop said:
    What do you use to bake the high-poly data? I bet not every 3D package/baker can load a 320-million-poly model or support UDIM, which is usually necessary for such meshes. I use xNormal, but unfortunately there's no UDIM support there. Substance Designer just hangs on such meshes, for example.
    I typically use Substance Painter or Marmoset for baking, even for my photogrammetry data. I'd agree that not many bakers are going to be able to handle 320 million polys. Honestly, I never bother working with content at that resolution; if you can't get it through your pipeline, all that resolution doesn't do you any good. I've found that for my needs I rarely need to exceed 20 million polys, and in my most extreme cases I've hit the 100 million mark.

    If I had to deal with anything larger than that, I would break the data up into separate chunks. I wish I could provide more info about UDIM support, but it's not something I deal with, so the software I mentioned above with smaller polycounts has been my workflow.
  • gnoop (sublime tool)
    Thanks, TTools, for making that clear. 320 million is what caught my eye. I once managed to bake a 700-million-poly mesh in xNormal, but gosh, it took forever just to load it. There are a few subjects, like terrain, rocky mountains and other macro things, that could still benefit from such a polycount. Doing it in parts brings its own problems with proper stitching, overlapping, etc. Decimation at that scale really only works well with Reality Capture's "simplify", and it outputs faceted meshes which may be baked in the same faceted way onto your target surface, while smoothing them hampers the crispness of the resulting normal map.

    I wonder why software like Photoscan and RC can't generate a displacement texture on its own. Photoscan (Metashape) can at least build a depth map for each camera pretty quickly. Why couldn't it do the same for a low-poly shell mesh and then calculate the difference/deviation, to be projected the same way the colour images are, or maybe ray-trace world coordinates from the dense cloud to the low-poly shell? I bet Houdini could probably do it. Too bad I'm not that good with it.
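
    The deviation part of that idea can be scripted outside Photoscan/RC. A rough sketch using trimesh (my own pick, not something mentioned above; file names are placeholders), which measures the signed distance from dense-cloud points to a low-poly shell; turning those values into an actual displacement texture would still require rasterising them into the shell's UV space:

    import numpy as np
    import trimesh

    lowpoly = trimesh.load("bark_lowpoly.obj", force="mesh")
    cloud = trimesh.load("bark_dense_cloud.ply")   # point cloud export
    pts = np.asarray(cloud.vertices)               # subsample first for a quick test

    # Closest point on the low-poly shell for every cloud point.
    closest, dist, tri_id = trimesh.proximity.closest_point(lowpoly, pts)

    # Sign the distance with the face normal: positive = outside the shell.
    offset = pts - closest
    sign = np.sign(np.einsum("ij,ij->i", offset, lowpoly.face_normals[tri_id]))
    deviation = sign * dist

    print("deviation range:", deviation.min(), deviation.max())
    # To get a displacement map, rasterise each point's deviation at the UV
    # location of its closest triangle (barycentric interpolation of
    # lowpoly.visual.uv), then dilate/fill the gaps.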
