The future is coming along great boys.
https://img-9gag-fun.9cache.com/photo/a47vMep_460svvp9.webm
There was also a video feature shown months ago about actually scanning live-action MOVING scenes into complete 3D scenes.
I really wonder how far and fast this will develop in the next couple of years.
Replies
Imagine you go online and you find someone has a 3D model of you.
Or imagine standing outside of Hollywood and scanning celebrities. Free Bruce Willis model!
https://blog.kenkaminesky.com/photography-copyright-and-the-law/
http://www.dmlp.org/legal-guide/california-right-publicity-law
Oops, I was totally wrong. This is something new. Decent hand-held scanning on mobile!
I'm amazed by the metallic bird: see how the blue and pink papers are reflected dynamically on its belly as it moves. Are they creating light probes from the environment?
I think most of the reflections are simply baked into the diffuse, the highlights on its back, etc. And it's probably not generating a normal map, nor proper roughness/specularity. But still pretty neat.
https://developer.apple.com/documentation/arkit/adding_realistic_reflections_to_an_ar_experience
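To make the baked-vs-dynamic distinction concrete, here's a tiny sketch in Python (my own illustration, not ARKit code; the function names and the toy two-color probe are made up for the example). A reflection baked into the diffuse texture never changes, while a probe-based reflection is re-sampled along the reflection vector every frame, which is what would make the pink/blue bounce follow the bird as it moves:

```python
# Minimal sketch (hypothetical, not the app's pipeline): baked reflection
# vs. sampling an environment probe at runtime.

def reflect(view, normal):
    """Reflect the view direction about the surface normal: r = v - 2(v.n)n."""
    d = sum(v * n for v, n in zip(view, normal))
    return tuple(v - 2 * d * n for v, n in zip(view, normal))

def shade_baked(diffuse_texel):
    # Baked: the colored bounce light was captured into the texture at scan
    # time, so it stays frozen no matter how the object or camera moves.
    return diffuse_texel

def shade_probe(albedo, view, normal, env_probe):
    # Dynamic: sample the environment probe along the reflection vector every
    # frame, so nearby colored surfaces show up on the belly live.
    r = reflect(view, normal)
    env = env_probe(r)
    return tuple(a + 0.5 * e for a, e in zip(albedo, env))

# Toy probe: pinkish light toward -x, bluish light toward +x.
def env_probe(direction):
    return (1.0, 0.4, 0.7) if direction[0] < 0 else (0.3, 0.4, 1.0)
```

If the video were only using baked reflections, tilting the bird wouldn't shift those pink/blue patches at all; if it's probe-based, they should slide across the surface.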
The bread is not even the same. The structure of the front of the bread is totally different, and so is the crust; no way that is projected.
I'm sure they modelled/scanned these things by hand and then composited it all into a video. The bird clearly has different base lighting and is way too well defined. Of course you're amazed: it's totally fake.
Also, there's no way you're getting higher-resolution textures on your scanned model than your mobile camera can even capture; that's ridiculous.
I'm recording a blurry mess, and then my texture projection is suddenly sharper and looks totally different? Yeah, sure.
That's not even close to a projected texture; it's clearly not the same. Modeled by hand and composited in After Effects. Probably trying to get funding with a composited demo. Or they have state-of-the-art neural-network-aided mesh and texture interpretation, which I heavily doubt.
Which is a shame, because technology-wise we're really not very far from that - Qlone on Android is pretty damn neat, and even though it requires a printed grid for calibration and a slow pan around the object, I could totally see it being optimized further especially with phones with multiple cameras.
But yeah, regardless, this video is just BS hype material at this stage (or just part of a media package to raise funding). I am constantly amazed by how easily people fall for that kind of stuff as soon as some shaky cam and relatively clean compositing are thrown in. This reminds me of the initial Pokemon Go trailers - even though it was obvious that what was shown was an artistic/ideation trailer, some folks still wanted to believe that this would be what the game would look like.
(Also, the wireframe having varying thickness is a dead giveaway. There's no reason for it to look like that ... unless it was done as a post-processing effect, of course ... just like here:
http://www.mustaphafersaoui.fr/quick-tip-01-c4d-wireframe-render-generator/ )
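The thickness tell boils down to basic perspective math; here's a toy Python sketch of it (my own illustration, with made-up focal length and wire width). A real-time debug wireframe is rasterized at a constant pixel width regardless of depth, while a wire rendered as actual geometry, or faked in post like the C4D generator linked above, projects thinner the farther away it is:

```python
# Toy sketch: projected width of a wire under a pinhole camera model.
# width_px = focal_px * width_world / depth (all values here are made up).

def projected_width_px(focal_px, width_world, depth):
    """Screen-space width in pixels of a wire of given world-space width."""
    return focal_px * width_world / depth

# Geometry/post-effect wire: thickness varies with distance across the mesh.
near = projected_width_px(focal_px=1000, width_world=0.002, depth=0.5)  # ~4 px
far  = projected_width_px(focal_px=1000, width_world=0.002, depth=2.0)  # ~1 px

# A GPU debug overlay instead rasterizes every edge at a fixed width
# (say 1 px), so a live scan preview should look uniform everywhere.
```

So a preview wire whose lines swell and taper across the model is exactly what a 3D-rendered (or composited) wire looks like, not what a live overlay looks like.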
(BTW : "Its", not "it's" )
pior said:
Qlone on Android is pretty damn neat, and even though it requires a printed grid for calibration and a slow pan around the object, I could totally see it being optimized further especially with phones with multiple cameras.
Geez! Now that is indeed very neat, although it's a pity that AFAIK the app is currently iOS-centric, and the Android version is still not fully featured?
https://3dscanexpert.com/qlone-smartphone-3d-scanning-app-ios-android-review/
Anyway, regardless, as a 3D scanning/photogrammetry noob, IMO it looks to be a low-cost entry into getting my mitts on a piece of this not-so-new/newfangled tech.
It's not all that far-fetched; we've actually tried it out here. TBH, though, it's no end-game tech: cell-phone lenses suck, and the point cloud is pretty low-res. But it's still a great proof of concept.
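For anyone curious what "low-res point cloud" means in practice, here's a generic Python sketch of voxel-grid downsampling (a standard point-cloud operation; the numbers and function name are mine, not from any scanning app). The voxel size is effectively your spatial resolution: a sparse, noisy phone scan forces a coarse grid, which is why fine surface detail doesn't survive into the reconstructed mesh:

```python
# Rough sketch (generic technique, not a specific app's pipeline):
# voxel-grid downsampling averages all points that fall into the same
# cubic cell, so detail smaller than voxel_size is lost.
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """Replace each occupied voxel's points with their centroid."""
    buckets = defaultdict(list)
    for p in points:
        key = tuple(int(c // voxel_size) for c in p)  # which voxel p lands in
        buckets[key].append(p)
    return [
        tuple(sum(c) / len(pts) for c in zip(*pts))  # centroid per voxel
        for pts in buckets.values()
    ]

# Two nearby points collapse into one centroid; the distant point survives.
cloud = [(0.01, 0.0, 0.0), (0.02, 0.01, 0.0), (0.9, 0.9, 0.9)]
coarse = voxel_downsample(cloud, voxel_size=0.1)
```

With a dense DSLR photogrammetry capture you could shrink `voxel_size` by an order of magnitude and keep the detail; with a phone-lens cloud there's simply not enough data below that scale.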
Somebody at my work fascinated everybody by showing off the new IKEA application on his smartphone. The guy was able to seamlessly place, rotate, and scale furniture around the room! The angle, placement, and position of the scanned (?) asset were consistent throughout the whole thing...
Apple's ARKit gives you the scaling/moving/lighting tech already all figured out. So developers like IKEA/Wayfair can just plug in their models, and it just *works*. Google has similar tech in its ARCore toolkit for Android.
The scanning thing though is new.