Sorry if this is a newbie question; I'm still a novice with 3D, but I couldn't find an answer to this anywhere.
From what I've gathered this is a pretty unconventional question, but bear with me, I'll try to explain myself.
I'm a college student currently doing an internship at an aircraft repair shop. Part of my project is to create and process 3D scans. My product owner wants to be able to scan parts as they come in, but because many of the parts share the same shape, storing multiple meshes would be redundant. His idea is to keep one universal model and transfer the texture from each 3D-scanned part onto it. Because of the large number of parts, minimizing storage size is important, and with this method only the texture files would need to be saved.
The universal model would be made with a traditional 3D scanner, so it has a very high poly count. The other scans are made with photogrammetry or neural radiance fields, so their poly count is much lower.
So would it be possible to take the color data of the low-poly models and transfer/bake it onto the high-poly model? I've tried it with xNormal, but it failed to produce anything.
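For intuition, what such a bake does under the hood is roughly this: for every point on the target mesh, look up the closest bit of surface on the source scan and copy its color. Here's a minimal conceptual sketch in plain Python (all point and color data are made up for illustration; a real baker works per texel and uses ray casting against triangles, not just nearest vertices):

```python
# Conceptual sketch: transfer per-point color from a sparse (low-poly)
# scan onto a dense (high-poly) reference mesh by nearest-neighbor lookup.
# This is the core idea behind a texture bake, minus UVs and ray casting.

def nearest_color(point, source_points, source_colors):
    """Return the color of the source point closest to `point`."""
    best_i, best_d = 0, float("inf")
    for i, sp in enumerate(source_points):
        d = sum((a - b) ** 2 for a, b in zip(point, sp))  # squared distance
        if d < best_d:
            best_i, best_d = i, d
    return source_colors[best_i]

# Sparse scan: a few points with RGB colors (made-up data).
low_points = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
low_colors = [(255, 0, 0), (0, 255, 0), (0, 0, 255)]

# Dense reference mesh vertices inherit the nearest scanned color.
high_points = [(0.1, 0.1, 0.0), (0.9, 0.1, 0.0), (0.1, 0.9, 0.0)]
high_colors = [nearest_color(p, low_points, low_colors) for p in high_points]
print(high_colors)  # → [(255, 0, 0), (0, 255, 0), (0, 0, 255)]
```

This also hints at why alignment matters so much: if the two scans drift apart, the "nearest surface" lookup grabs color from the wrong spot, which shows up as artifacts.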
Replies
But after fiddling with some tutorials I found one that does a caged texture bake in Blender (Cycles engine). Instead of a normal-map bake, I swapped the source model for the low-poly one and the destination model for the high-poly one, and that worked surprisingly well. There were still some artifacts, but I think those have more to do with the models shifting slightly between scans, so that needs some more experimentation on my part.
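For anyone wanting to script this workflow, here's a rough sketch of the same selected-to-active diffuse bake using Blender's Python API (bpy). The object names, image size, and extrusion value are placeholders, and this only runs inside Blender, not as a standalone script:

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

low = bpy.data.objects["scan_lowpoly"]          # photogrammetry scan (source)
high = bpy.data.objects["universal_highpoly"]   # universal mesh (target)

# Cycles bakes "selected to active": select the source, make the target active.
bpy.ops.object.select_all(action='DESELECT')
low.select_set(True)
high.select_set(True)
bpy.context.view_layer.objects.active = high

# The target material needs an active Image Texture node to receive the bake.
img = bpy.data.images.new("baked_diffuse", 4096, 4096)
mat = high.active_material
node = mat.node_tree.nodes.new('ShaderNodeTexImage')
node.image = img
mat.node_tree.nodes.active = node

# Bake only base color (no lighting), with some extrusion to bridge
# small gaps between the two slightly misaligned scans.
scene.cycles.bake_type = 'DIFFUSE'
scene.render.bake.use_pass_direct = False
scene.render.bake.use_pass_indirect = False
scene.render.bake.use_selected_to_active = True
scene.render.bake.cage_extrusion = 0.02  # tune to how far the scans diverge
bpy.ops.object.bake(type='DIFFUSE')

img.save_render(filepath="//baked_diffuse.png")
```

Disabling the direct and indirect passes keeps scene lighting out of the bake, so you only capture the scan's color; increasing `cage_extrusion` (or using an explicit cage object) is the usual fix when misalignment between the scans causes missed rays.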