As the title says. I've read that you shouldn't do that, but this time I have to (: I tried baking with Marmoset and Blender, and every time the normal map looks different from the original. Do I have any options, or should I go cry in a corner?
"Baking normal to duplicated mesh with different UV" Try to be as clear as you possibly can. You are barely making any effort to describe your problem - no pictures, no examples.
Do you mean that you have a model with its normal map, and another one which is a duplicate of the first model but with different UVs, and you want to transfer the normal map from one to the other? If that's the case:
Yes, this is 10000% doable in Blender, and the resulting normal map will actually be accurate (in the sense that it will not just be a transfer of the pixels, but an actual recalculation of the surface data taking both the model and the source normal map into consideration). It works perfectly between two models, and it also works perfectly within one model with two UV sets. I personally do this all the time when working on modular assets, as it allows changing UVs on the fly without ever having to rebake from the high poly.
Says the Polycount forum. But those answers were from 2011.
I couldn't post any pictures because the problem was inside a project at work, but I see you understood me perfectly. I had tried to bake the normal map as an albedo (a diffuse bake with only the "Color" option selected in the baking section). Baking it as a normal map fixed it. I was also told that if I want to bake it as an albedo, I have to keep the original rotation of the UV islands: I can change the scale of a UV island on the duplicated mesh, but I can't change its rotation. I hope this helps someone in the future.
If it's a tangent-space map, it should be fine to rotate the shells and scale them up and down; object/world-space maps obviously rely on the mesh normals/tangents in conjunction.
But come to think of it, embedded height details like rivets/grooves in a tangent map may look strange once rotated. Still, I'm quite certain I've re-UVed/transferred UV channels and had some shells rotated without issues.
While you could certainly transfer the "colored pixels" information if the change of UVs only consists of scaling and translation, you actually don't need to concern yourself with any of that if you perform the bake in Blender using the Normal option. As said, it will accurately take the source normals (source object + source normal map) into account and transfer them as desired. This allows for all kinds of tricks, like having a source model with hard edges and a target set to soft, to create hard-surface normal maps without a high poly model.
As said, I do this constantly when working on assets in order to combine parts, turn tiling details into unique unwraps, and so on. It works flawlessly either across two UV channels or from one model to another, with a cage bake.
Also, in the case of two models, it is possible to recreate a secondary UV channel with the desired target UVs thanks to the very useful Data Transfer modifier.
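For what it's worth, this selected-to-active bake can also be scripted. Below is a minimal sketch (to be run inside Blender with Cycles), assuming the source object's material already uses the old normal map, and the target object's active material has an Image Texture node selected and pointing at a blank image; the object names are purely hypothetical:

```python
# Sketch of a selected-to-active Normal bake in Blender (run inside Blender).
# Assumes: source "Asset_oldUV" has its existing normal map applied in its
# material, and target "Asset_newUV" has an Image Texture node (active in
# its material) pointing at the image the bake should fill.
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.bake_type = 'NORMAL'
scene.render.bake.use_selected_to_active = True   # transfer source -> active
scene.render.bake.cage_extrusion = 0.02           # small offset; tweak per asset

source = bpy.data.objects['Asset_oldUV']   # hypothetical names
target = bpy.data.objects['Asset_newUV']

bpy.ops.object.select_all(action='DESELECT')
source.select_set(True)
target.select_set(True)
bpy.context.view_layer.objects.active = target    # active object = bake target

bpy.ops.object.bake(type='NORMAL')
```

For the single-model, two-UV-channel variant, you would instead bake without Selected to Active and switch which UV map is active for render between reading and writing.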
I guess what 2011 Polycount said is that you can't transfer normal maps the way you transfer albedo maps. A normal map's "colors" encode 3D directions, and those directions depend on the U direction of the UVs to define their (tangent) space. If you rotate your UVs, every direction stored in the normal map has to be re-evaluated, hence the "colors" will change. That's also the reason why you can't simply rotate a normal map in Photoshop.
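To make that concrete, here is a small sketch in plain Python (the 90-degree angle and pixel values are just illustrative) of what rotating the UVs implies for the stored colors: the X/Y components of every decoded normal have to be rotated along with the island, not merely moved to a new pixel position:

```python
import math

def decode(rgb):
    """Map an 8-bit tangent-space normal color to a vector in [-1, 1]."""
    return tuple(c / 255.0 * 2.0 - 1.0 for c in rgb)

def encode(n):
    """Map a vector back to 8-bit color channels."""
    return tuple(round((c + 1.0) / 2.0 * 255.0) for c in n)

def reexpress(rgb, degrees):
    """Re-express a tangent-space normal after its UV island is rotated.

    Rotating an island rotates its tangent frame with it, so the X/Y
    components of every stored normal must be rotated too. (The exact
    sign depends on your baker's handedness convention.)
    """
    x, y, z = decode(rgb)
    a = math.radians(degrees)
    xr = x * math.cos(a) - y * math.sin(a)
    yr = x * math.sin(a) + y * math.cos(a)
    return encode((xr, yr, z))

# A normal leaning fully toward +X (a reddish pixel): after a 90-degree
# island rotation it must lean along Y instead.
print(reexpress((255, 128, 128), 90))   # -> (127, 255, 128)
```

Simply copying the pixel to its rotated position would leave the color at (255, 128, 128), i.e. a normal still pointing along X, which is exactly the error you see when treating a normal map like an albedo.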
I don't know if there's one good, definitive resource online explaining what tangent space and normal maps really are, because it looks like a lot of artists believe baking good normal maps is some kind of voodoo, while it's actually pretty simple. Once you get how it works, you know what to do and what not to do.
IMO there is great documentation and there are YouTube tutorials covering the basics, but the problems start when you want to do anything more than a standard bake from high poly to low poly.
For example, I still don't understand the tangent orientation (right/left-handed) and calculation (per-pixel/per-vertex) options in MT4. All the documentation I found was aimed at programmers. Not that I've ever used these options; actually, I don't even know if I've ever needed them.
The second thing is that with long UV islands, the baked details usually fade out halfway along the island. I've always had to paint over it in SP.
I know I've found a solution to the main problem from the topic title (: but I wonder if you've had the same issues.
Well, those options depend on how the mesh and tangent space are handled in your final shader. There's no real standard, though it's getting better with mikktspace. Basically you just have to check which configuration is correct for your engine, and then never touch it again. But yeah, this is very programming oriented. You can find some good information here: https://bgolus.medium.com/generating-perfect-normal-maps-for-unity-f929e673fc57
And I don't get your second question. But what I can say is that manually fixing your normal maps is usually a terrible idea.
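On the right/left-handed question: in practice it most often surfaces as the Y+ vs Y- green-channel convention (commonly labeled OpenGL vs DirectX). Converting a map between the two is just an inversion of the green channel. A minimal sketch in plain Python (the pixel values are illustrative):

```python
def flip_green(rgb):
    """Convert a tangent-space normal pixel between the Y+ (OpenGL)
    and Y- (DirectX) conventions by inverting the green channel."""
    r, g, b = rgb
    return (r, 255 - g, b)

# A normal leaning toward +Y in one convention leans toward -Y in the other:
print(flip_green((128, 200, 255)))              # -> (128, 55, 255)
# Applying the flip twice returns the original pixel:
print(flip_green(flip_green((128, 200, 255))))  # -> (128, 200, 255)
```

This only addresses the handedness of the stored data; the per-pixel vs per-vertex tangent calculation still has to match what the target engine's shader does, which is exactly why there is no universally safe default.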
While the OP hasn't done the greatest job at explaining/showing the issue, this doesn't seem like a case of the high poly not having details that the rays can catch - after all, the screenshot clearly shows a slant. He even mentioned that "It happens when I deal with rectangular uv islands", suggesting that the issue doesn't happen otherwise.
If anything, that's a great example of how it is near impossible to give any advice when the person asking for it doesn't lay things out clearly. Come on man, you can do better than cryptic one-liners!
The screenshot shows a slant at the bottom where we're getting some normal information coming through; the upper part is doing what I'd expect given the models. If you look closely at the normal bake, you can see that the left side of the upper half has a very slight slope to it.
However, pior as usual has a point. I have no clue what the OP has actually done. I'm assuming the OP has cropped the normal bake image; if that's the case, I'm not sure there's actually anything wrong.
Can we see the full normal map compared to what happens when the UV island is not rectangular?
TBH, two users understood clearly what might be wrong from the screenshot, and you're just complaining that it's impossible. But I have to admit that sending a .blend file would have been much better.
Well, it's not impossible to guess; it's impossible to get straight to the point *without having* to guess, especially since you even mentioned that the issue only (seemingly) happens in certain cases related to UVs...
Says who?
There are no circumstances where that will work.
Also, mikktspace is the standard, if we all just refuse to accept anything else.
I guess that's the case, because I used to make my cuts perpendicular to the surface XD
Aaaaanyways.