Hey everyone, we just published a tutorial about how to get good photogrammetry results using a smartphone camera, Photoscan, and Substance.
This one is focused on a 3D outdoor scan; we will also publish another tutorial in a few weeks covering seamless material creation.
Please let us know what you think! If you already have experience with DIY photogrammetry, or if you try our method, we would love to hear your feedback.
Replies
Used Agisoft and ZBrush for the baking; you can check it out here: http://jakobappleby.com/Photogrammetry-Test
Is there a specific reason why you chose to create the height map in Substance instead of baking it in Blender as well?
In my experience, photogrammetry-based materials work so-so with Substance Designer/Painter. Neither is well suited to very high-res sources. It only works okay when you scan a small fragment and then "make it tile" across the whole texture.
But the main point of photogrammetry is the opposite: the ability to capture a unique, real surface across a pretty big extent. Once you try to do something like that, Substance Designer will have you hitting your head against a wall.
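For anyone wondering what that "make it tile" step boils down to, here is a minimal numpy sketch of the naive offset-and-crossfade idea. This is just an illustration, not what Substance Designer's Make It Tile nodes actually do; the `make_tileable` helper and its blend weights are invented for the example.

```python
import numpy as np

def crossfade_tile(img, axis):
    """Offset-and-crossfade along one axis: blend the texture with a copy
    shifted by half its size, so the opposite edges end up matching."""
    size = img.shape[axis]
    shifted = np.roll(img, size // 2, axis=axis)
    # Triangle weight: 0 at the edges (use the shifted copy), 1 in the middle
    # (keep the original untouched).
    t = np.linspace(0.0, 2.0, size, endpoint=False)
    weight = 1.0 - np.abs(t - 1.0)
    shape = [1] * img.ndim
    shape[axis] = size
    weight = weight.reshape(shape)
    return img * weight + shifted * (1.0 - weight)

def make_tileable(img):
    """Very naive 'make it tile' pass for a scanned patch (float HxW or HxWx3 array)."""
    return crossfade_tile(crossfade_tile(img, axis=1), axis=0)
```

The crossfade makes opposite edges continuous because each border pixel comes from the copy shifted by half the texture, at the cost of some ghosting in the blend region, which is exactly why this only holds up on small, fairly uniform fragments.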
We could have. Generating the height in Designer is just faster and automatically updates any time we rebake or tweak the normal. You'll find that with organic, high-frequency surfaces, the height generated by the filter in SD is very close to what you would have baked, and the quality is good enough for this asset, which doesn't have large low-frequency details.
For hard-surface scans or organic stuff with very protruding elements such as big rocks, you probably want to bake the height instead.
The results from normal-to-height conversion can be pretty good, and it can be useful for generating height in cases where you only have a normal map.
I've been messing with foliage capture here, and using normal to height has looked pretty decent even though my setup is pretty flawed at the moment.
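For readers who want to see what a normal-to-height conversion does under the hood, here is a rough numpy sketch of a Fourier-domain (Frankot-Chellappa style) gradient integration. It is only an illustration under a few assumptions (tangent-space normals encoded 0..1, OpenGL-style green channel, periodic borders), and the `normal_to_height` function below is not the actual filter Substance Designer ships.

```python
import numpy as np

def normal_to_height(normal_rgb):
    """Sketch of normal-to-height via Fourier-domain gradient integration.
    `normal_rgb` is an (H, W, 3) float array with tangent-space normals
    encoded in the 0..1 range (OpenGL-style green channel assumed)."""
    # Decode 0..1 texture values to -1..1 vectors.
    n = normal_rgb * 2.0 - 1.0
    nz = np.clip(n[..., 2], 1e-3, None)      # avoid division by zero
    gx = -n[..., 0] / nz                     # dz/dx implied by the normal
    gy = -n[..., 1] / nz                     # dz/dy implied by the normal

    h, w = gx.shape
    wx = np.fft.fftfreq(w) * 2.0 * np.pi     # frequency grids (radians/pixel)
    wy = np.fft.fftfreq(h) * 2.0 * np.pi
    WX, WY = np.meshgrid(wx, wy)

    Gx, Gy = np.fft.fft2(gx), np.fft.fft2(gy)
    denom = WX**2 + WY**2
    denom[0, 0] = 1.0                        # DC term handled separately
    Hf = (-1j * WX * Gx - 1j * WY * Gy) / denom
    Hf[0, 0] = 0.0                           # zero-mean height

    height = np.real(np.fft.ifft2(Hf))
    # Normalize to 0..1 so it can be saved as a grayscale displacement map.
    height -= height.min()
    return height / max(height.max(), 1e-8)
```

Because the integration assumes periodic borders it suits tiling textures, but large low-frequency shapes are exactly where it drifts from a true bake, which lines up with the advice above about baking the height for big rocks and very protruding elements.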