Hi!
Been thinking of doing some photogrammetry again soon, and there's something I've always wondered how to do: removing AO from your scanned object. This piqued my interest after seeing the method Epic used on their kite scene a while back, where they briefly explained that they used 18% gray and chrome balls to then de-light it using an unknown technique. That's where I'm stuck, since to my knowledge they haven't released any more information on this.
Cheers!
Replies
1. Photograph your object from many angles.
2. Photograph a chrome ball in place of your object.
3. Process your object, outputting a mesh + texture.
4. Create a "Lit Sphere" texture from your chrome ball images by cropping and centering the ball in a square texture.
This gives you everything needed for the colour correction step:
5. Sample the corresponding surface normal for each pixel in the texture.
6. Using this normal, look up the pixel colour in the lit-sphere texture.
7. You now have two values for this pixel: the scanned texture value and the lit-sphere lookup value.
8. Since lighting is (in its most basic form) albedo * light, we can recover the unlit surface colour by dividing the scanned value by the lit-sphere value (a sketch of this lookup-and-divide step follows below).
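To make steps 5-8 concrete, here's a minimal Python sketch of the lookup-and-divide. This is not Epic's actual pipeline: the file names are placeholders, the normals are assumed to be baked into the scan's UV layout (3-channel, stored remapped to [0, 1]) and expressed in the same view space the chrome ball was photographed in, and imageio is an assumed choice for I/O (its EXR support may need an extra plugin).

```python
import numpy as np
import imageio.v3 as iio

# Placeholder file names; the scanned texture and the baked normal map
# are assumed to share the same resolution and UV layout.
scanned = iio.imread("scan_texture.exr").astype(np.float32)[..., :3]
normals = iio.imread("baked_normals.exr").astype(np.float32)[..., :3]
normals = normals * 2.0 - 1.0  # unpack from [0, 1] storage to [-1, 1]
litsphere = iio.imread("lit_sphere.exr").astype(np.float32)[..., :3]

h, w = litsphere.shape[:2]

# Lit-sphere (MatCap) lookup: the normal's x/y components index directly
# into the sphere image, remapped from [-1, 1] to pixel coordinates.
u = np.clip((normals[..., 0] * 0.5 + 0.5) * (w - 1), 0, w - 1).astype(int)
v = np.clip((1.0 - (normals[..., 1] * 0.5 + 0.5)) * (h - 1), 0, h - 1).astype(int)
light = litsphere[v, u]

# Step 8: lighting = albedo * light, so divide to estimate the albedo.
eps = 1e-4  # avoid division by zero in black regions
albedo = scanned / np.maximum(light, eps)
iio.imwrite("albedo_estimate.exr", albedo.astype(np.float32))
```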
That's the very basic approach; in practice there is a lot more complexity to it, like accounting for self-shadowing on the object, specular highlights, surface roughness, etc. To compensate for these, you need to calculate many more parameters for each pixel lookup and possibly process the lit sphere to match the surface roughness before sampling. For ambient occlusion removal you can apply the same reverse technique, but by calculating and dividing out an AO value and an ambient light colour approximated from the lit sphere (sketched below).
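One crude interpretation of that AO step, under the same multiplicative light model, might look like the following. The baked AO map and the whole-image mean of the lit sphere as an ambient estimate are both assumptions for illustration, not a known implementation:

```python
import numpy as np
import imageio.v3 as iio

# Placeholder inputs: the scanned texture, an AO map baked from the mesh
# (white = unoccluded), and the lit-sphere image from the previous step.
scanned = iio.imread("scan_texture.exr").astype(np.float32)[..., :3]
ao = iio.imread("baked_ao.exr").astype(np.float32)
if ao.ndim == 3:
    ao = ao[..., 0]  # treat a colour AO bake as grayscale

# Very rough ambient estimate: average the lit-sphere image (ideally you
# would mask to the ball itself; the whole-image mean is used for brevity).
litsphere = iio.imread("lit_sphere.exr").astype(np.float32)[..., :3]
ambient = litsphere.reshape(-1, 3).mean(axis=0)

# Model scanned ~= albedo * ambient * AO and divide both factors out.
eps = 1e-4
albedo = scanned / np.maximum(ao[..., None] * ambient, eps)
iio.imwrite("albedo_no_ao.exr", albedo.astype(np.float32))
```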
TLDR: it's a sizable code implementation; I'm not aware of any off-the-shelf software that can do it for you.
- Go through the photogrammetry as usual. Make sure you capture an HDR chrome ball of the environment, or do a full 360° HDR capture and throw the images into a photo-stitching program to get an environment map. The key things are that you capture the environment information as close to the object you're scanning as possible, and that you have a high-quality, high-dynamic-range image to get the light information from.
- Go into your favourite 3D package and put your mesh (with UVs generated at some point) in the scene. Apply a basic diffuse shader to the mesh. I used a shader with RGB values 0.8/0.8/0.8, but feel free to play around with the values to get something that looks good.
- Plug the environment map into the environment light/sky light/whatever it is called in your renderer/3D package of choice.
- Make sure global illumination is turned on. If you're using a path tracer like Cycles in Blender, Arnold, or RenderMan's RIS engine, you get this for free.
- Bake out the light texture to a 32-bit float image (EXR) at the same resolution as the source texture for the mesh. You may be able to get away with a regular 8-bit or 16-bit image, but I don't like to sacrifice quality until the last step.
- You should now have one texture map containing the albedo and light information, and another texture map containing just the light information.
- The formula for light (at least in Cycles) is albedo * light = final. So, if we do final / light, we get the albedo (see the sketch after this list). You may want to check how your render engine handles light; if it is a modern path tracer such as Cycles, RenderMan RIS, or Arnold, it should follow the formula I gave.
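As a minimal sketch of that last division step (file names are placeholders; imageio is an assumed choice for EXR I/O):

```python
import numpy as np
import imageio.v3 as iio

# The photogrammetry output texture and the light map baked in your 3D
# package, at the same resolution and UV layout. Names are placeholders.
final = iio.imread("source_texture.exr").astype(np.float32)[..., :3]
light = iio.imread("baked_light.exr").astype(np.float32)[..., :3]

# final = albedo * light  =>  albedo = final / light
eps = 1e-4  # guard against division by zero in fully shadowed texels
albedo = final / np.maximum(light, eps)

iio.imwrite("delit_albedo.exr", albedo.astype(np.float32))
```

Keeping everything in 32-bit float until the final write matches the advice above about not sacrificing quality until the last step.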
The reason this works is that we are exploiting the fact that path tracers are really good at recreating physically accurate lighting. We recreate the local lighting on the mesh using only the mesh, the environment light captured alongside the photos used to generate the mesh, and a really simple diffuse shader. This should give a very close approximation of the lighting conditions at capture time, so it does a pretty good job of removing the light from the captured textures. The only drawback to this method is that it is pretty slow, since you have to do a full path trace for every mesh generated.

Hopefully this made sense. If you have any questions, feel free to let me know!
https://www.fxguide.com/featured/epics-unreal-engine-open-world-behind-the-scenes/
https://www.unrealengine.com/blog/creating-assets-for-open-world-demo
https://www.unrealengine.com/blog/imperfection-for-perfection
https://www.youtube.com/watch?v=clakekAHQx0
https://www.youtube.com/watch?v=eYDn2sQ8uKs
https://www.youtube.com/watch?v=hNgcQ0BeGCs
https://moritzweller.wordpress.com/2015/03/13/the-tech-behind-epics-open-world-demo-in-unreal-engine-4/
http://morganmcd.com/?p=6776
UPD: Also check these:
https://www.unrealengine.com/blog/imperfection-for-perfection-part-2
https://www.youtube.com/watch?v=DmRu0Ze8gwY
https://www.youtube.com/user/ICTGraphicsLab/videos?spfreload=10