Posting a lot of normal map questions today. I'm doing some documentation and need to figure some of this stuff out.
This was taken from this site:
http://ddnetworkofartists.net/index.php?itemid=34&catid=7
The fact is, each vertex on a 3D model can only have one UV; therefore when you split your UVs you are actually creating two vertices sitting on top of each other. Software like Maya and Max conceals this fact to make things more manageable for the user. By doing this, the software breaks the tangency between one side of the vertex and the other, and since most normal maps are tangent based, this will lead to seams wherever your UVs are split. The same applies to normals: if you harden your normals you are creating multiple vertices, which is why collision objects need to have their normals softened.
I'm really confused by what they mean when they say "the software breaks the tangency between one side of the vertex and the other." Are they trying to say that all UV border edges = split vertex normals?
This confuses me because you can have averaged vertex normals (a single smoothing group) along UV border edges, and even if you did have duplicate vertex normals, why wouldn't they be averaged?
That entire method kind of confuses me, and I really would like to understand it better.
If anyone can help me out I would really appreciate it.
Replies
However, since a vertex can only have one set of information, the software must split the vertex, even if the normals for both vertices are identical; if the tangent is different, you need a new vertex.
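To make that concrete, here's a rough Python sketch (made-up data, not any particular exporter) of how a mesh gets turned into a vertex buffer. A renderable vertex is the unique combination of all its attributes, so two face corners that share a position but disagree on UV (or normal, or tangent) end up as two separate vertices.

    # Sketch: how an exporter might build a vertex buffer (hypothetical data,
    # no particular tool). A renderable vertex is the unique tuple of ALL its
    # attributes, so face corners that share a position but differ in UV (or
    # normal, or tangent) become separate vertices.

    def build_vertex_buffer(corners):
        """corners: list of (position, normal, uv) tuples, one per face corner."""
        vertex_buffer = []   # unique vertices
        index_buffer = []    # one index per face corner
        seen = {}            # attribute tuple -> index into vertex_buffer
        for corner in corners:
            if corner not in seen:
                seen[corner] = len(vertex_buffer)
                vertex_buffer.append(corner)
            index_buffer.append(seen[corner])
        return vertex_buffer, index_buffer

    # Two triangles sharing an edge, but the edge sits on a UV border, so the
    # two shared positions get different UVs on each side and end up duplicated.
    corners = [
        ((0, 0, 0), (0, 0, 1), (0.0, 0.0)),
        ((1, 0, 0), (0, 0, 1), (0.1, 0.0)),  # shared position, UV island A
        ((1, 1, 0), (0, 0, 1), (0.1, 0.1)),  # shared position, UV island A
        ((1, 0, 0), (0, 0, 1), (0.9, 0.0)),  # same position, UV island B -> new vertex
        ((2, 0, 0), (0, 0, 1), (1.0, 0.0)),
        ((1, 1, 0), (0, 0, 1), (0.9, 0.1)),  # same position, UV island B -> new vertex
    ]
    verts, indices = build_vertex_buffer(corners)
    print(len(verts))  # 6, not 4: the two edge vertices were split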
Or do you get two verts and two vertex normals, and the vertex normals just happen to have the exact same angle?
Also, I'm still confused about why breaking the UVs creates a split tangent basis along that edge. I can see it happening in Xnormal, so you are indeed correct, but I don't understand why the tangents would be pointing in different directions if your vertex normals were averaged.
EDIT:
Actually, I looked at this a bit closer and it looks like the tangents are doing exactly the same thing the vertex normals are doing.
If I split the vertex normals I get split tangents, and if I average the vertex normals I get averaged tangents. This confuses me even more. What I'm seeing here doesn't match up with what that article is stating.
And yes, you will then have two vertices with two vertex normals, and the vertex normals "just happen" to have the exact same angle - this is because the vertex normals are derived from the surrounding geometry, not the surrounding UVs - and the geometry (the faces themselves) hasn't changed.
Basically the normals and the tangents are all derived from mathematical processes based on the geometry - so if any of the inputs change, the result will change too. Which is why if you move a vertex, the face changes angle, which affects the normal. Since the normal changes, this also affects the tangent.
However, breaking UVs will only affect the tangent, and not the normal - this is because UV data isn't used as an input to calculating the normal.
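For anyone following along, here's a tiny sketch of that last point in plain Python (nothing tool-specific assumed): a face normal is computed purely from vertex positions, so UVs never enter into it and moving UVs around can't change it.

    # Sketch: a face normal is computed from vertex positions only; UVs never
    # appear anywhere in the calculation, so editing UVs cannot change it.

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def face_normal(p0, p1, p2):
        n = cross(sub(p1, p0), sub(p2, p0))
        length = sum(c * c for c in n) ** 0.5
        return tuple(c / length for c in n)

    print(face_normal((0, 0, 0), (1, 0, 0), (0, 1, 0)))  # (0.0, 0.0, 1.0)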
I think this makes sense now.
Normally changing the UVs and thus changing the tangents isn't an issue because the baker takes this change into account and changes the baked data in the normal map to compensate.
The issue shown in the explanation only applies to situations where you are applying a normal map to a model that wasn't baked for that geometry.
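Here's a small Python sketch of why a synced bake compensates (hypothetical numbers, and it assumes an orthonormal tangent basis): the baker writes the detail normal relative to the low poly's basis, and the shader multiplies by that same basis to get it back, so the particular choice of tangents cancels out as long as both sides agree.

    # Sketch: why a synced bake compensates. The baker stores the detail normal
    # in the low poly's tangent basis; the shader multiplies by the same basis
    # to recover it. As long as both sides agree on the basis, the particular
    # tangents cancel out. (Hypothetical numbers, orthonormal basis assumed.)

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def to_tangent_space(n_world, t, b, n):
        # What the baker writes into the map (before the 0..1 remap).
        return (dot(n_world, t), dot(n_world, b), dot(n_world, n))

    def to_world_space(n_ts, t, b, n):
        # What the shader does with the sampled texel.
        return tuple(n_ts[0] * t[i] + n_ts[1] * b[i] + n_ts[2] * n[i] for i in range(3))

    detail = (0.0, 0.6, 0.8)                       # high-poly normal to capture
    basis_a = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
    basis_b = ((0, 1, 0), (-1, 0, 0), (0, 0, 1))   # tangent rotated 90 degrees

    for t, b, n in (basis_a, basis_b):
        baked = to_tangent_space(detail, t, b, n)  # different numbers per basis...
        shaded = to_world_space(baked, t, b, n)    # ...same surface normal comes back
        print(baked, '->', shaded)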
By moving the break in tangency (the UV border edge) to the middle of a planar face, you are no longer breaking continuity: the two resulting tangents will be the same because they lie on a planar face and receive the same data from the adjacent vertex normals.
Not necessarily. :shifty:
When switching a UV border to a different mesh edge, even if the vertex normals are exactly the same, like in the middle of a planar surface, the tangents could still be very different.
Two of the tangent axes are derived directly from the U and V axes, not from the vertex normal (which creates the third tangent axis). Those two axes are usually the same as U and V but sometimes the tangents have to be adjusted in code to solve cases like UV mirroring.
So if you change your UV border to a different location on the mesh, but you rotate or mirror the UVs on either side of that border, the tangents will change as well to compensate.
When the direction in UV space changes, the direction in tangent space changes as well.
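If it helps, this is a rough Python sketch of the per-triangle tangent derivation most tools use (roughly the widely cited Lengyel formula; exact details vary by engine and baker, so treat the function name and layout as illustrative): the tangent points in the direction U increases across the surface, so mirroring or rotating the UVs changes it even though positions and vertex normals stay put.

    # Sketch: per-triangle tangent from positions + UVs. The tangent follows
    # the direction in which U increases across the surface, so flipping or
    # rotating the UVs changes it while the geometry stays untouched.

    def sub3(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
        e1, e2 = sub3(p1, p0), sub3(p2, p0)
        du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
        du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
        r = 1.0 / (du1 * dv2 - du2 * dv1)   # signed area of the UV triangle
        return tuple(r * (dv2 * e1[i] - dv1 * e2[i]) for i in range(3))

    p0, p1, p2 = (0, 0, 0), (1, 0, 0), (0, 1, 0)
    print(triangle_tangent(p0, p1, p2, (0, 0), (1, 0), (0, 1)))  # (1.0, 0.0, 0.0)
    print(triangle_tangent(p0, p1, p2, (1, 0), (0, 0), (1, 1)))  # U mirrored -> tangent flipped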
It seems like the article is suggesting that a break in the tangents will result in the Z axis (up axis) of the tangents pointing in a different direction, but my tests have shown that this will only ever happen if you split your vertex normals.
The X and Y axes, while admittedly derived from the orientation of the UVs, shouldn't create the "break" that the image on that site is depicting, because it's the angle of the up axis that ultimately determines the default angle of the resulting texel normals in tangent space.
Could it be that the site is only talking about a situation that occurs in certain game engines, where the tangent basis is calculated in such a way that the resulting tangents have an up axis pointing in different directions, and that usually this isn't an issue?
Thanks for all the responses so far, everyone; this has been extremely informative.
It sounds more like what you said last there, that it's an issue coming from some engine/shader/baker or whatever not calculating stuff correctly? I'm not entirely sure, though; it could be a valid problem.
It's a bit of a deviation from what this thread was trying to cover; you may want to start your own thread and explain exactly what you are trying to do so that we can give you some insight into the best course of action.
@MoP: I think you're right. I was starting to arrive at that conclusion myself. I would love it if someone could chime in and explain what is going on there, since it appears, on the surface at least, to be a legitimate solution to a legitimate problem.
However, my inability to explain what they are doing makes me suspect that it's a special case, and the result of some proprietary tangent-space calculation that I haven't ever encountered.
They do specifically say that this should only occur on models where the normal map wasn't specifically baked for the low-poly geometry you're putting it on. Just to clear this all up, I ran this test:
I wasn't able to get the seam, so as far as I'm concerned I'm labeling this as "myth busted" unless someone else chimes in. Thanks so much for all your help.
Sorry, but this made my day :poly142:
You can usually bake various surface properties from a single mesh into a map or CAV color, be it the object's normals, U/V basis, etc. Not sure how to do it in Max, mind you, but it's easy enough in Softimage.
I'm flattered to see my article has been a subject of such debate.
I've updated the post with a video that walks through the steps I described.
http://www.digitaldouble.net/blog/index.php?itemid=34
Though your particular situation may not warrant this as the best solution to the problem, it does present an alternative to most other solutions out there.
Also, keep in mind that a special set of circumstances has to be in place in order for a seam to even show up. It's quite possible you may not have a seam at all. But in the event that you do, you can consider this as a solution, especially if procedurally generating your normal maps from an existing high-res model is not an option (e.g. making normal maps in Photoshop through the nvidia filter, etc.).
If nothing else, I hope my article was at least educational. That was the original intent, anyways :poly121:
best wishes,
Kamal
Perhaps I misunderstood the issue. I thought it wasn't to do with mirrored UVs at all, but was instead to do with the fact that the normal map wasn't baked from the geometry. I'll have a look at the updated post later today and see if it all makes more sense.
Unfortunately I need to head out and take care of a few things, but when I get back I'm going to look at the updated explanation.
Really happy to see everyone pitching in to clear this up. It's nice to know the nature and specifics of how these issues work.
Nope.
See, I want to get the look my existing smoothing groups create for my model in Max into Maya, through the use of a normal map. Is that a stupid idea? I don't see how... you can have that separate smoothing groups/hard edge look without actually having to go through all that edge-hardening nonsense in Maya. The work is already done in Max, why re-do it? (Also, upon export/import the smoothing groups got messed up, so that's why I am having to do this.)
Is there a way to do it?
If you want your "smoothing groups" aka vertex normals to export from 3dsmax to Maya, you shouldn't have a problem if you use a file format that supports them. Most do; in fact, I'm not aware of one off the top of my head that doesn't.
Just use OBJ or FBX and be mindful of any options in the export/import process that allow you to include/exclude the vertex normals.
Normal maps don't store smoothing groups; they store data to create normals for the pixels of a texture applied to the model. Two completely different things.
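To illustrate that, here's a tiny Python sketch of what a single texel of a tangent-space normal map holds: just a per-pixel normal remapped from the -1..1 range into 0..255 RGB, with no smoothing-group data anywhere (the function names are made up for the example).

    # Sketch: one texel of a tangent-space normal map is a per-pixel normal
    # remapped from -1..1 into 0..255 RGB. No smoothing-group info is stored.

    def encode_texel(n):
        # n is a unit normal in tangent space; a flat surface is (0, 0, 1).
        return tuple(int(round((c * 0.5 + 0.5) * 255)) for c in n)

    def decode_texel(rgb):
        return tuple(c / 255.0 * 2.0 - 1.0 for c in rgb)

    print(encode_texel((0.0, 0.0, 1.0)))  # (128, 128, 255), the familiar flat blue
    print(decode_texel((128, 128, 255)))  # roughly (0.0, 0.0, 1.0)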
BH, I never did it in Max, but I tried it some time ago in Maya (or was it Xnormal?) and it worked there. Basically I was putting hard edges/smoothing groups where I wanted them to appear on the lowpoly (not everywhere, just in specific locations, to get an old-school 'smoothing groups' look) and baked that to the same mesh, but with smoothing everywhere. It basically created a normal map of the carefully hard-edged lowpoly... on itself.
It does sound useless, but! With some smart post-processing in Photoshop you can extract the hard-edge information and blur it just enough to create a fantastic filleted-edge effect. It's very cheap and hacky, but it can save a huge amount of time on props, for instance.
(At that time I was wondering how the guys behind MGS4 managed to get such crisp results on their assets. I don't think they used that one technique in particular, but still, it can look great.) I'll post an example soon.
I originally thought you meant just baking the lowpoly to the identical lowpoly, which of course would have no net result since the source and destination mesh normals are 100% identical.
I understand that any time you break your UVs you're doubling the vertices along the resulting UV border edge so they can hold the two sets of UV coordinates.
I also understand that the tangents are derived, in part, from the UV coordinates, so shifting around your UVs will create a change in the tangents. Since the UV border edges involved don't share the exact same UV coordinates, we are going to get a "break" in the resulting tangents along that edge.
That all makes sense. What doesn't make sense is why making the border edge "inline", or on the surface of a planar face (where the two adjacent edges share the same plane in 3D space), fixes the seam. Shouldn't the tangents be just as mismatched as they were along the original edge?
The only change I can see is the position of the UVs; the vertex normals for either of the two resulting vertices should be identical in either case (if all your normals are averaged).
Isn't the issue more to do with the fact that your UVs along the border edge have different coordinates in general, and not with the location of the edge in 3D space?
Basically, what I'm trying to say is that I thought the issue was a 2D one, not a 3D one, so the position of the border edge in 3D space shouldn't matter; it's the fact that you have a border edge, period.
Obviously, this isn't the case but that's what my intuition is telling me.
If anyone could help me better understand what's going on I would greatly appreciate it.
I still don't think that example of changing the UV split position is really a valid one? Personally I wouldn't get too worried about it - it sounds like your understanding of the way normals, tangents and "local space" work is now better than most game artists' on the planet.
I'm not worried about it so much for the sake of making art. The fact that it works or doesn't work is all I need to know to do that. I'm just trying to better understand what's going on so I can explain it to others and also troubleshoot normal map issues in general.
I just saw this particular problem as a very interesting phenomenon that didn't sync up with what I already know.
blenderhead... you should also be able to render out an object or world space normal map from the smoothed object to the non-smoothed one in Maya...
pior, love those sorts of tricks
edit - yeah, I don't get it at all. Tried a couple of shaders... and no luck reproducing it... which makes me think it's not to be worried about at all... having said that... I learnt something about tangents :poly121:
My research suggests that the tangency is being calculated after the vertices are split, which would make the edge appear as if it had hard-edged normals, whereas if the tangency were calculated before the verts were split, they'd both end up with the same tangent basis. I'll have to ask around to confirm that this is the case.
It's possible, then, that this is an issue with the shader engine itself, and that would explain why some of you are seeing the issue while others aren't.
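Purely as a guess at what "calculating tangency before the split" could mean in practice, here's a speculative Python sketch of welding tangents across vertices that share a position and normal (the function names and data layout are made up for illustration). An engine or baker that does something like this wouldn't show the seam; one that skips it would.

    # Speculative sketch: average tangents across vertices that share a
    # position and normal, even though the UV split forced them into separate
    # vertices. A tool that skips this step would show the hard-edge-like
    # break discussed above.

    def normalize(v):
        length = sum(c * c for c in v) ** 0.5
        return tuple(c / length for c in v)

    def weld_tangents(vertices):
        """vertices: list of dicts with 'position', 'normal' and 'tangent' keys."""
        groups = {}
        for v in vertices:
            groups.setdefault((v['position'], v['normal']), []).append(v)
        for group in groups.values():
            summed = [sum(v['tangent'][i] for v in group) for i in range(3)]
            averaged = normalize(summed)
            for v in group:
                v['tangent'] = averaged  # coincident vertices now share one basis
        return vertices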