Hi Polycount!
I’ve spent the last year obsessed with a single problem: Why do we need to take 50+ photos and wait for cloud processing just to capture the surface of a brick or a piece of wood?
I’m developing Nack — an Android tool that uses Shape-from-Shading (SfS) to reconstruct geometry from a single flash photo. No LiDAR, no cloud, no tripod. Just one "click" and you get raw 16-bit data.
Why I think this matters: Traditional photogrammetry often struggles with micro-relief (it either blurs it out or turns it into noise) and needs a carefully controlled setup. Nack instead analyzes the shading falloff produced by the phone's flash to reconstruct the surface at sub-millimeter scale.
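For anyone curious about the underlying idea, here is a toy sketch of the classic shading-to-slope relation (Python/NumPy, purely illustrative, not Nack's actual pipeline). It assumes a Lambertian surface, a flash aligned with the view direction, and a roughly constant albedo; "brick_flash.jpg" is just a placeholder filename:

```python
import cv2
import numpy as np

# Toy Shape-from-Shading relation, for illustration only (not Nack's code).
# Assumptions: Lambertian surface, flash along the view direction,
# roughly constant albedo "rho". "brick_flash.jpg" is a placeholder.
img = cv2.imread("brick_flash.jpg", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0
rho = img.max()                              # crude constant-albedo estimate
shading = np.clip(img / rho, 1e-3, 1.0)      # I / rho = cos(theta) = n_z
# n_z = 1 / sqrt(1 + p^2 + q^2), so the per-pixel slope magnitude is:
slope = np.sqrt(1.0 / shading**2 - 1.0)      # |grad z| recovered from shading
```

The real challenge (and where the interesting engineering is) lies in handling non-constant albedo and the near-field flash falloff, which this toy version ignores.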
Key Specs:
16-bit Depth Maps (PNG): Ready for ZBrush displacement (see the quick loading check after this list).
Instant Normal Maps: High-frequency detail capture.
On-device processing: All math happens in OpenCV locally.
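If you want to sanity-check the 16-bit output before baking, something like this works (Python + OpenCV; "nack_depth.png" is just a placeholder name for whatever the app exports):

```python
import cv2
import numpy as np

# Load the depth PNG without letting OpenCV downcast it to 8-bit.
depth = cv2.imread("nack_depth.png", cv2.IMREAD_UNCHANGED)
print(depth.dtype, depth.min(), depth.max())   # expect uint16 and a 0..65535 range
# Convert to 0..1 floats if your DCC wants a normalized displacement map.
disp = depth.astype(np.float32) / 65535.0
```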
The "Proof" (Compare for yourself):
https://youtu.be/ORexhPYd4wA
I’m looking for a "Stress Test": I’ve put a Free Demo on Gumroad. I want to see if this can handle the most complex surfaces you guys can find. Try it on concrete, fabric, or old tree bark and post your bakes here. I’m ready for brutal feedback.
Get the Demo / Full Version: https://rkey.gumroad.com/l/sdohp
Current Goal: Improve the Poisson solver (the step that integrates the recovered gradients into the final depth map) for even cleaner maps. Let me know what you think about the reconstruction fidelity!
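For context on what the Poisson step does: the gradient/normal field has to be integrated into a height map, and a least-squares (Poisson) solution is the standard way to do it. Below is a minimal FFT-based sketch (the Frankot-Chellappa variant, which assumes periodic boundaries); it's a reference illustration, not the solver shipped in Nack:

```python
import numpy as np

def frankot_chellappa(p, q):
    """Integrate a gradient field (p = dz/dx, q = dz/dy) into a depth map
    via the FFT-based Frankot-Chellappa least-squares (Poisson) solution.
    Assumes periodic boundaries; returns a zero-mean height map."""
    h, w = p.shape
    wx = np.fft.fftfreq(w) * 2 * np.pi          # frequencies along x (columns)
    wy = np.fft.fftfreq(h) * 2 * np.pi          # frequencies along y (rows)
    WX, WY = np.meshgrid(wx, wy)
    P, Q = np.fft.fft2(p), np.fft.fft2(q)
    denom = WX**2 + WY**2
    denom[0, 0] = 1.0                           # avoid division by zero at DC
    Z = (-1j * WX * P - 1j * WY * Q) / denom
    Z[0, 0] = 0.0                               # DC term is unconstrained; pin to zero
    return np.real(np.fft.ifft2(Z))
```

Feeding it p = -nx/nz and q = -ny/nz from a normal map gives a quick baseline to compare any solver's output against.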