
Removing AO from photogrammetry

loggie24 polycounter lvl 3
Hi!

Been thinking of doing some photogrammetry again soon, and there's something I've always wondered about: how do you remove AO from a scanned object? This piqued my interest after seeing the method Epic used on their Kite demo a while back, where they briefly explained that they shot 18% grey and chrome balls and then delighted the scans using an unexplained technique. That's where I'm stuck, since to my knowledge they haven't released any more information on this.

Cheers!

Replies

  • gilesruscoe polycounter lvl 10
    Although I'm not familiar with how Epic have done it, from my understanding, you can somewhat successfully remove lighting from a texture by doing the following:

    1. Photograph your object from many angles.
    2. Photograph a chrome ball in place of your object.
    3. Process your object, outputting a mesh + texture.
    4. Create a "lit sphere" texture from your chrome ball images by cropping the ball out and centering it in a square texture.

    This gives you everything needed for the colour correction step:
    5. Sample the corresponding surface normal for each pixel in the texture.
    6. Using this normal, look up the pixel colour in the lit-sphere texture.
    7. You now have two values for this pixel: the scanned texture value and the lit-sphere lookup value.
    8. Since a lit pixel is (in its most basic form) albedo * light, we can recover the unlit surface colour by dividing the scanned value by the lit-sphere value.

    That's the very basic approach; in practice there is a lot more complexity to it, such as self-shadowing on the object, specular highlights, surface roughness, etc. To compensate for these you need to calculate many more parameters for each pixel lookup, and possibly process the lit sphere to match the surface roughness before sampling. For ambient occlusion removal you can apply the same reverse technique, calculating an AO value and dividing it out against an ambient light colour approximated from the lit sphere.
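
    As a minimal sketch of steps 5-8 in Python with NumPy and imageio: this assumes a normal map baked in the same space the ball was photographed in and in the same UV layout as the scanned texture, and that the lit sphere already encodes a diffuse response (a grey ball is closer to that than a raw chrome one). All file names are placeholders:

        import numpy as np
        import imageio.v3 as iio

        scan    = iio.imread("scan_lit.png").astype(np.float32) / 255.0    # step 3 output
        normals = iio.imread("scan_normals.png").astype(np.float32) / 127.5 - 1.0
        sphere  = iio.imread("lit_sphere.png").astype(np.float32) / 255.0  # step 4 output

        # Steps 5-6, matcap-style lookup: a normal's x/y maps straight to
        # lit-sphere UVs in [0, 1] (y flipped because image rows run downwards).
        h, w = sphere.shape[:2]
        u = np.clip(((normals[..., 0] + 1.0) * 0.5 * (w - 1)).astype(int), 0, w - 1)
        v = np.clip(((1.0 - (normals[..., 1] + 1.0) * 0.5) * (h - 1)).astype(int), 0, h - 1)
        light = sphere[v, u]

        # Steps 7-8: scanned = albedo * light, so divide the light back out.
        albedo = np.clip(scan / np.maximum(light, 1e-4), 0.0, 1.0)
        iio.imwrite("scan_delit.png", (albedo * 255).astype(np.uint8))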

    TLDR: it's a sizable code implementation, and I'm not aware of any software that can do it for you.



  • ProperSquid polycounter lvl 12
    I have another method that should work just as well. I think this is how Epic did it.
    1. Go through the photogrammetry as usual. Make sure you capture an HDR chrome ball of the environment, or do a full 360° HDR capture and run the images through a photo-stitching program to get an environment map. The key things are that the environment is captured as close as possible to the thing you're scanning, and that you end up with a high-quality, high-dynamic-range image to pull the light information from.
    2. Go into your favourite 3D package and put your mesh (with UVs generated at some point) in the scene. Apply a basic diffuse shader to the mesh. I used a shader with the values 0.8/0.8/0.8, but feel free to play around with the values until something seems good. (A Blender/Cycles sketch of steps 2-5 follows this list.)
    3. Plug the environment map into the environment light/sky light/whatever it is called in your renderer or 3D package of choice.
    4. Make sure global illumination is turned on. If you're using a path tracer like Cycles in Blender, Arnold, or RenderMan's RIS engine, you get this for free.
    5. Bake out the light texture to a 32-bit float image (EXR) at the same resolution as the source texture for the mesh. You may be able to get away with a regular 8-bit or 16-bit image, but I don't like to sacrifice quality until the last step.
    6. You should now have one texture map that is the albedo and light information, and another texture map that is just the light information.
    7. The formula for light (at least in Cycles) is albedo * light = final. So, if we do final / light, we get the albedo (sketched below, after the explanation). You may want to check how your render engine handles light, but if it is a modern path tracer such as Cycles, RenderMan RIS, or Arnold, it should match the formula I gave.
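
    Here is roughly what steps 2-5 could look like scripted in Blender with Cycles. This is only a sketch: the mesh name, resolution, and file paths are placeholders, and the same setup can just as easily be clicked together in any package's UI:

        import bpy

        scene = bpy.context.scene
        scene.render.engine = 'CYCLES'

        # Step 3: use the captured HDRI as the environment light.
        world = scene.world
        world.use_nodes = True
        env = world.node_tree.nodes.new('ShaderNodeTexEnvironment')
        env.image = bpy.data.images.load('//capture_environment.hdr')  # placeholder path
        bg = world.node_tree.nodes['Background']
        world.node_tree.links.new(env.outputs['Color'], bg.inputs['Color'])

        # Step 2: plain 0.8 diffuse material on the scanned mesh.
        mat = bpy.data.materials.new('delight_diffuse')
        mat.use_nodes = True
        nt = mat.node_tree
        for node in list(nt.nodes):
            nt.nodes.remove(node)
        out = nt.nodes.new('ShaderNodeOutputMaterial')
        diff = nt.nodes.new('ShaderNodeBsdfDiffuse')
        diff.inputs['Color'].default_value = (0.8, 0.8, 0.8, 1.0)
        nt.links.new(diff.outputs['BSDF'], out.inputs['Surface'])

        obj = bpy.data.objects['ScanMesh']  # placeholder object name
        obj.data.materials.clear()
        obj.data.materials.append(mat)

        # Step 5: bake into a float image and save as EXR. Baking DIFFUSE
        # with only DIRECT+INDIRECT leaves the lighting without the 0.8
        # base colour baked in (it still tints indirect bounces off the
        # mesh itself).
        img = bpy.data.images.new('light_bake', 4096, 4096, float_buffer=True)
        tex = nt.nodes.new('ShaderNodeTexImage')
        tex.image = img
        nt.nodes.active = tex

        bpy.context.view_layer.objects.active = obj
        obj.select_set(True)
        bpy.ops.object.bake(type='DIFFUSE', pass_filter={'DIRECT', 'INDIRECT'})

        img.filepath_raw = '//light_bake.exr'
        img.file_format = 'OPEN_EXR'
        img.save()
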
    The reason this works is that we're exploiting how good path tracers are at recreating physically accurate lighting. We recreate the local lighting on the mesh using only the mesh, the environment light captured alongside the photos that generated the mesh, and a really simple diffuse shader. This gives a very close approximation of the lighting conditions at capture time, so it does a pretty good job of removing the light from the captured textures. The only drawback is that it's pretty slow, since you have to do a full path-traced bake for every mesh you generate.
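
    And the divide in step 7, sketched with NumPy and imageio; the file names are placeholders, and EXR support in imageio may need an extra backend (e.g. pip install imageio[freeimage]):

        import numpy as np
        import imageio.v3 as iio

        final = iio.imread('scan_texture.exr')[..., :3].astype(np.float32)  # albedo * light
        light = iio.imread('light_bake.exr')[..., :3].astype(np.float32)    # light only

        # Clamp the denominator so near-black shadow pixels don't blow up,
        # and stay in float right to the end, as suggested above.
        albedo = final / np.maximum(light, 1e-4)
        iio.imwrite('albedo.exr', albedo)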

    Hopefully this made sense. If you have any questions, feel free to let me know!
  • alekseypindrus polycounter lvl 10
    Some info about Epic's approach [they will post a more detailed guide later, AFAIK]:

    To remove the lighting information from the assets, Antoine created a semi-automated process which re-generated the lighting conditions at the time of each asset's shoot using the captured HDR, and used that information to create a 'delighting' pass.

    https://www.fxguide.com/featured/epics-unreal-engine-open-world-behind-the-scenes/
    https://www.unrealengine.com/blog/creating-assets-for-open-world-demo
    https://www.unrealengine.com/blog/imperfection-for-perfection
    https://www.youtube.com/watch?v=clakekAHQx0
    https://www.youtube.com/watch?v=eYDn2sQ8uKs
    https://www.youtube.com/watch?v=hNgcQ0BeGCs

    https://moritzweller.wordpress.com/2015/03/13/the-tech-behind-epics-open-world-demo-in-unreal-engine-4/
    http://morganmcd.com/?p=6776

    Update: plus, check this topic:
    To delight you have to capture a 360° HDRI at the time of model capture and use something like V-Ray to render a lightmap that exactly matches reality. Then, in Photoshop, blend the delighting texture and the photoscanned texture with the Divide blend mode. If done correctly, the lighting gets stripped out.
    It's blooming hard though. Getting your delighting lightmap to perfectly match reality is super hard; if it's off by even a little, the small errors will literally glow once blended.
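
    A quick worked example of that "glow", nothing beyond the arithmetic: suppose a pixel's true albedo is 0.5 and the real light on it was 0.8, but the rendered lightmap comes out 10% too dark there:

        true_albedo = 0.5
        true_light  = 0.8
        scanned   = true_albedo * true_light  # 0.4, what the camera saw
        baked     = true_light * 0.9          # 0.72, lightmap 10% too dark
        recovered = scanned / baked           # ~0.556, 11% brighter than the true 0.5

    A lightmap that's slightly too bright overshoots the other way; either way the error lands directly in the albedo, which is why the match has to be near perfect.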
  • loggie24 polycounter lvl 3
    Thanks a lot, really good info guys!!
  • alekseypindrus polycounter lvl 10
    And here are the promised details from Epic Games about the delighting process:
    https://www.unrealengine.com/blog/imperfection-for-perfection-part-2
  • loggie24 polycounter lvl 3
    Yess!! Thanks :smile: 
  • dorodo polycounter lvl 3
    I've been wanting to try this out, but I've only got a normal photo head + the standard 18-35 Nikon zoom lens (on a tight budget atm). Will a grey + chrome probe setup be as effective as an HDRI panorama? Also, if you're using a circular polarizer to photograph your assets, should you also use it when taking the photos for the HDRI panorama / grey + chrome probes?
  • JFinn_UK polycounter lvl 7
    Thanks for all the info on this!