After poring over the Quixel Megascans page, I want to learn the process behind capturing real-world surfaces.
"...each pixel of every map has been physically scanned, and every material component has been fully separated. Every reflectance value has been measured, and each normal reveals how light interacts with the surface and subsurface."
I'm familiar with capturing diffuse, and with using polarizing filters to capture and derive the specular contribution, but that is just careful use of a DSLR. Creating normal maps from photos is hit or miss from there, and generally not best practice.
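For anyone following along, the cross-polarization trick mentioned above can be sketched in a few lines. This is a minimal illustration, not Quixel's pipeline: it assumes you already have two pixel-aligned, exposure-matched shots (one with the lens polarizer parallel to the light's polarizer, one crossed), loaded as float arrays in [0, 1]. The function name and the tiny synthetic frames are mine, just for demonstration.

```python
import numpy as np

def separate_specular(parallel: np.ndarray, cross: np.ndarray) -> np.ndarray:
    """Approximate the specular contribution as (parallel - cross).

    The cross-polarized shot blocks most first-surface (specular)
    reflection, so it approximates the diffuse term; subtracting it
    from the parallel-polarized shot leaves the specular residue.
    Assumes both frames are aligned and exposure-matched.
    """
    spec = parallel.astype(np.float32) - cross.astype(np.float32)
    return np.clip(spec, 0.0, 1.0)

# Tiny synthetic example: a highlight present only in the parallel frame.
parallel = np.array([[0.2, 0.9],
                     [0.2, 0.2]], dtype=np.float32)
cross    = np.array([[0.2, 0.3],
                     [0.2, 0.2]], dtype=np.float32)
spec = separate_specular(parallel, cross)
# The cross-polarized frame itself doubles as the diffuse map.
```

In practice you would also want to linearize the RAW data before subtracting, since the math only holds in linear light.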
They hint at some proprietary technical wizardry in their video without coming right out and claiming so. Pretty sweet Quixel-branded PCB, though.

TL;DR
My question is:
How do you capture (rather than derive) these maps?
- Ambient Occlusion
Are we talking about some crazy laser/radar/holographic projection/scanning tech, or just clever use of a nice camera?

To anyone with experience capturing these materials who can guide me in the right direction: thanks.
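For context on the "nice camera" end of the spectrum: the classic way to capture (rather than derive) normals from photos is photometric stereo, i.e. several shots of the same surface under known light directions, solved per pixel. Here is a minimal Lambertian sketch under assumptions of my own (synthetic light directions, grayscale linear images, no shadows or specular); I'm not claiming this is what Quixel's rig does.

```python
import numpy as np

def photometric_stereo(images: np.ndarray, lights: np.ndarray) -> np.ndarray:
    """images: (k, h, w) grayscale shots; lights: (k, 3) unit light directions.

    Solves I = L @ (albedo * n) per pixel by least squares and returns
    (h, w, 3) unit surface normals.
    """
    k, h, w = images.shape
    I = images.reshape(k, -1)                        # (k, h*w)
    G, *_ = np.linalg.lstsq(lights, I, rcond=None)   # (3, h*w): albedo * normal
    n = G / np.maximum(np.linalg.norm(G, axis=0), 1e-8)
    return n.T.reshape(h, w, 3)

# Synthetic sanity check: a flat patch facing straight up (+z),
# photographed under three known lights (placeholder directions).
lights = np.array([[0.0, 0.0, 1.0],
                   [0.6, 0.0, 0.8],
                   [0.0, 0.6, 0.8]])
true_n = np.array([0.0, 0.0, 1.0])
shading = lights @ true_n                            # Lambert's cosine law
images = np.tile(shading[:, None, None], (1, 2, 2))  # 2x2 "photos"
normals = photometric_stereo(images, lights)
```

With more than three lights the least-squares solve also averages out noise, which is presumably part of why scan rigs use many LEDs rather than the minimum three.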