-Symmetrized meshes
-Hard edges along UV seams (I tried with soft edges, it's worse)
-UVs are not overlapped for the bake
-It's an FBX export into xNormal
-Triangulated before bake
I tried with a Maya cage, an xNormal cage, and a custom cage, and I get the same errors no matter which.
I tried OBJ too, nothing changed.
On some meshes everything is fine. I don't understand!
The issue on the left looks like unwelded verts and the one on the right looks like your cage wasn't pushed out quite far enough to encompass the high poly entirely.
But, of course, I could be wrong. And likely am...
And for the artifact on the left: I've checked, the cage is good, the artifact isn't symmetrical, and it appears ONLY in Marmoset Toolbag! I don't see it in the xNormal viewer!
Bit of a basic question, but I assume you've frozen transforms before exporting, and turned off two-sided lighting to check for any flipped faces or goofy normals?
Thanks for the help though, WarrenM.
This mesh, with the WRONG normal map on it, has the same kind of artifact! Good or wrong map, the artifact is there!
I presume it has no relation to the map!
EDIT: in OBJ I don't have this artifact! Must be my FBX export.
If the problem persists I would try exporting as an OBJ, then importing it back into Maya, then out as an FBX. Or possibly the FBX converter I've heard about a few times.
Or you could just export an MB file and import it into a new scene to try and clear whatever weird geometry history might be going on.
Would you mind posting an image of the model with no normal map in Marmoset? Just the low poly.
http://www.polycount.com/forum/showthread.php?t=124191
and found an answer here:
http://www.polycount.com/forum/showthread.php?t=116474
It could be a problem that if you export the normals, Toolbag, xNormal or UDK will not recalculate the tangents and will show some strange effects in the lighting.
Offset one half of your mirrored UVs outside the 0-1 UV space and then post a screenshot. That problem usually shows up if you have overlapping mirrored UV islands, because of the way Marmoset renders normals.
You are right, the overlapping was the problem!
Thanks everyone for the fast answers!
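For anyone who wants to script the offset trick: moving the mirrored half exactly one full UV unit keeps the texture mapping identical while taking the islands out of the 0-1 square. A minimal Maya Python sketch (it assumes the mirrored half's UVs are already selected):
[code]
# Shift the selected (mirrored) UVs one full tile to the right.
# A whole-unit offset leaves the sampled texels unchanged, but the
# islands no longer overlap inside 0-1 for baking/rendering purposes.
from maya import cmds

cmds.polyEditUV(u=1.0, v=0.0)  # relative offset on the current UV selection
[/code]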
Hello, is there another way to get nice shading in your normals without adding extra loops?
On the left I added an extra loop to control the shading in my normal map... and the right one is without, and you can see what ugly shading I get. Also, where I have a UV split I have a different smoothing group. Adding extra loops on all your models is, I think, too much geo. Is there no other way? Please, someone enlighten me.
If you have a synced workflow (look at the Handplane thread) it does not matter how harsh the shading on the normal map looks. The first thing is to make sure your high poly and low poly match as closely as possible. You could get better shading by adding only one extra loop, if it really bothers you or you cannot have a synced normal map workflow.
Thanks Zac for your answer. I have a good workflow and everything is going fine, but when I have this type of surface I don't think one edge would make it good enough... because if I add one edge on the right side, then I will have a bad gradient in between and not a nice solid color.
Gradations in your low poly like that get translated to your normal map bake in unsynced workflows.
Bit of a pedantic remark, I know, but I just wanted to point out that this isn't accurate, as I understand it.
Syncing has nothing to do with whether or not there are gradients in your normal map; they're there to counteract low-poly smooth shading interpolation - the only issue being that compression artifacts and tangent basis mismatches can make those problematic.
I'd say the more the normal map has to compensate for the low poly, the more shading issues you will get with an unsynced workflow. And indeed the DDS compression (and even the 8-bit format) is an issue; I wouldn't recommend having these large gradients on big surfaces for the best result. It depends on the size of the thing in the end, and whether it's worth adding chamfers, splitting the UVs, etc., because it adds vertices. Anyway, everything is explained in the 1st post.
Bit of a pedantic remark, I know, but I just wanted to point out that this isn't accurate, as I understand it.
Syncing has nothing to do with whether or not there are gradients in your normal map; they're there to counteract low-poly smooth shading interpolation - the only issue being that compression artifacts and tangent basis mismatches can make those problematic.
Oops yea, my wording is off. Gradations like that get baked in all scenarios where the low poly geo creates that gradient. In an un-synced workflow the gradation may cause (severe)issues, less issues happen in synced workflows. Sorry about my bad wording.
I wouldn't even have pointed it out if it weren't for the fact this is a thread aimed at teaching people. Wouldn't want it to get picked up the way 'MAEK ERR'TIN QUADS' has been.
Yep no problem man. Maybe I should go back and fix that so I don't confuse...
Blender:
- Modeled and unwrapped (high and low poly, both smooth shading)
- Exported as .fbx (also tried .obj), including normals
xNormal:
- Loaded the UnityTangentSpace calculator
- Baked normals (both averaged and "use exported normals")
Unity3D:
See the weird shading? I am running out of things to try... any ideas?
If your FBX version doesn't match the version that xNormal uses, you get bad baking errors.
http://www.polycount.com/forum/showthread.php?t=116474
I got nearly the same problem.
Hi there, I'm trying to get to grips with normal baking, and I get the feeling there's something fundamental I haven't figured out in regards to smoothing.
The cube has six UV islands and six smoothing groups, yet there's still a visible seam.
The second mesh is 2 UV islands and 2 smoothing groups. Where the smoothing groups meet there's a seam while the rest of the mesh smooths properly.
Is there anything I have done wrong to cause this, or should I use a cage to get rid of these seams?
Check this wiki entry for an explanation: http://wiki.polycount.com/NormalMap?action=show&redirect=Normal+Map#Working_with_Cages
Even with a cage I can't seem to get it better than this. It's just a low poly cube surrounded by a high poly smoothed cube surrounded by the cage/scaled up low poly.
There's got to be something I'm missing ...
Been trying for 2 hours now and my results only seem to be getting worse.
Do you bake in xNormal, exporting the models as .obj from Blender?
The low poly is a cube with flat shading, or smooth shading with an Edge Split modifier; it doesn't seem to matter. Plenty of room for edge padding.
High poly is just a smoothed cube around the LP.
The cage is a scaled up version of the LP.
They're individually exported as .obj. I don't know the ideal export settings though (Include Edges, Smooth Groups, Include Normals).
Using the default bucket renderer with 32 pixels of edge padding in xNormal.
I'm using these settings for LP and cage meshes with the latest Blender builds (this is with an Edge Split modifier on both):
For 2.68 and before you can (need to?) also check "Include Normals".
In case you don't get this working, I'd suggest uploading a .blend file so someone else can give it a try.
I put a .blend on my Dropbox.
https://www.dropbox.com/s/yh5smavrellamfp/BakingTest.blend
Made a quick test and everything worked fine. All I did was move the LP and cage mesh to the same location as the HP and add a Triangulate modifier, then export with the settings above. http://www.abload.de/img/cubecagebake001rgjyv.jpg
I wasn't able to compare it with your normal maps because they haven't been packed with the blend file.
Hmmmm, I keep getting these when I bake.
Getting there; the right edge looks how I want, it's just the others that still have a seam in them. (Or am I asking too much here?)
Also, do you know of any way to change Blender to Y-? I do most of my work for CryEngine, so it would be easier if I could preview my maps in Blender without having to do any manual inverting.
As for inverting the green channel in Blender, you can probably use a simple texture node setup to do so.
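If a node setup feels like overkill, flipping the map once on disk also works. A minimal sketch using Pillow (the filenames are just examples):
[code]
# Convert a tangent-space normal map between Y+ and Y- by inverting
# the green channel. Minimal sketch using Pillow.
from PIL import Image, ImageOps

img = Image.open("normal_map.png").convert("RGB")   # example filename
r, g, b = img.split()
g = ImageOps.invert(g)                              # G -> 255 - G flips Y
Image.merge("RGB", (r, g, b)).save("normal_map_flipped.png")
[/code]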
I'm writing a document for my team about best practices for baking normal maps. There are a couple of things I'm still not sure about, so I thought I would ask here and see what people think. I've read the wiki and a lot of this thread, but still the answers to these two questions escape me! Here goes:
1) I understand that placing hard edges can give you cleaner results with less distortion, but I'm wondering why specifically. Is this because of the altered surface normals, or is it just because the normal map doesn't have to work so hard to correct the shading on the low?
When baking using averaged projection/cage the low poly mesh normals are ignored... so surely the improved normal map can't be because of the surface normals?
2) Assuming this statement is correct: "During the baking process, rays are cast outward from the low poly mesh, and the high poly's surface is sampled at the point where the rays intersect it." How then does a recessed part of the high res mesh get sampled, a part that sits beneath the surface of the low res? At what point do the rays get cast the other way?
I've gotten to a point where I am fairly happy with my workflow, and have been asked to share it. I just want to make sure I understand why I'm doing the things that I do. I'd hate to pass on bad information to my peers.
I'd be really grateful for any advice.
Have my babies, it works perfectly now.
I painted the grunge on this skateboard wheel in Mudbox. I remade it a couple of times and made sure there are no seams in Mud, but a seam always shows up when rendering with the normal map in Maya.
I think it's something to do with vertices sharing two UVs, which I don't understand. I've tried splitting UVs and cutting UV edges - nothing worked.
Rendered image:
Texture map:
Please help, all suggestions appreciated!
1) I understand that placing hard edges can give you cleaner results with less distortion, but I'm wondering why specifically. Is this because of the altered surface normals, or is it just because the normal map doesn't have to work so hard to correct the shading on the low?
Essentially, it's because the gradient doesn't have to be accounted for as strongly. Where a normal map is that light blue colour (R/G/B:127/127/255) it means that the normal points in exactly the same direction as the interpolated vertex normal at that pixel. So the more of that colour you have, the less the normal map is having to adjust the normal. That's why if you bake a cube with hard edges, you get a mostly blue map back out and if you bake a cube that's all softened, you get some pretty large gradients in your normal map.
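To make that colour-to-vector mapping concrete, here's a minimal Python sketch (generic math, not tied to any particular baker) of how an 8-bit pixel decodes to a tangent-space normal:
[code]
# Decode an 8-bit tangent-space normal map pixel to a unit-ish vector.
# Each channel maps [0, 255] -> [-1, 1]; blue (Z) normally stays positive.
def decode_normal(r, g, b):
    return tuple(c / 255.0 * 2.0 - 1.0 for c in (r, g, b))

print(decode_normal(127, 127, 255))  # ~(0, 0, 1): same direction as the vertex normal
print(decode_normal(255, 127, 127))  # ~(1, 0, 0): bent fully toward +X in tangent space
[/code]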
Neither is the best way. So long as your tangent baker/renderer are properly synced, then you should be fine.
When baking using averaged projection/cage the low poly mesh normals are ignored... so surely the improved normal map can't be because of the surface normals?
The normals may be ignored for baking, which is fine. However, they might also be ignored when it comes to creating the tangent basis, and this is bad. The vertex normals form a part of the tangent basis, and if the normals used to create it are not the normals from your mesh, you'll get a bad result.
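To illustrate what "the vertex normals form a part of the tangent basis" means, here's a rough sketch of one common per-triangle tangent computation (Lengyel-style; real bakers and engines differ in averaging, orthogonalisation and handedness, which is exactly what "unsynced" means):
[code]
# Rough sketch of a common per-triangle tangent computation.
# p0..p2 are positions, uv0..uv2 the matching UVs (tuples of floats).
def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    e1 = [p1[i] - p0[i] for i in range(3)]
    e2 = [p2[i] - p0[i] for i in range(3)]
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    r = 1.0 / (du1 * dv2 - du2 * dv1)   # assumes non-degenerate UVs
    return [(e1[i] * dv2 - e2[i] * dv1) * r for i in range(3)]

# The per-vertex basis is (tangent, bitangent, vertex normal), so swapping
# in different vertex normals changes the basis - and thus the shading.
[/code]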
2) Assuming this statement is correct: "During the baking process, rays are cast outward from the low poly mesh, and the high poly's surface is sampled at the point where the rays intersect it." How then does a recessed part of the high res mesh get sampled, a part that sits beneath the surface of the low res? At what point do the rays get cast the other way?
Technically, the rays are cast inwards, not outwards. The point the ray is cast from is actually projected outwards from the surface along the normal (or the cage, if there is one) and then the ray is fired back down that same path to see if it can hit any high-res geometry. As for beneath the surface, some applications will continue the ray to an infinite distance if there is no high resolution geometry above the low-res surface, some will continue the ray for the same distance it's already travelled, some might double the distance inwards, etc...
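A rough sketch of that per-texel projection, under the stated assumptions (the raycast callable is a stand-in for the baker's real intersection test, and numpy is only used for the vector math):
[code]
import numpy as np

def sample_highpoly(point, normal, raycast, cage_distance=0.1, extra_inward=0.1):
    # point/normal: low-poly surface position and (averaged) normal, np.array(3).
    # raycast(origin, direction, max_dist): stand-in returning a hit or None.
    origin = point + normal * cage_distance  # 1. push the start point OUT along the normal/cage
    direction = -normal                      # 2. fire the ray back down the same path
    # 3. let the ray continue PAST the low-poly surface by some app-specific
    #    amount, so recessed high-poly detail can still be sampled.
    return raycast(origin, direction, cage_distance + extra_inward)
[/code]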
http://www.flickr.com/photos/106486657@N03/10457954894/
Should I change the projection cage? I tried changing it, but it didn't have any effect on the diagonal seams.
I have a question regarding UDK and normals. EQ states that you don't NEED to introduce hard edges where you have UV splits. And it totally makes sense. If I have a mesh in Marmoset it works like a charm (I only have separate SGs where the separate UV islands are). But with Unreal I run into a problem: when I import the mesh, the vertices running along all the UV shell borders are not only split (which is to be expected), but their normals are no longer unified (facing slightly different directions, resulting in a not very prominent hard edge). The import options are all fine.
Here is a part of a mesh:
1) One Mesh, One SG. The seams are where the mesh would be least visible.
2) The same mesh:
A) Imported as is into Marmoset. Everything remains the same. All the vertex normals are unified.
B) Imported as is into UDK. Unreal automatically splits the vertex normals along the UV borders and changes their direction.
3) And as the expected result we have:
C) Smooth mesh with a normal map in Marmoset with no seams.
D) A visible seam in UDK.
Syncing the normals by means of Handplane doesn't help, of course, as, essentially, the mesh that was used for the normals projection and the final mesh in the engine have different vertex normals along the UV seams as a result of this Unreal behavior.
The same thing can be observed in Cryengine Sandbox.
My question is: why does this happen? What is the purpose of this automatic normal direction alteration? The vertex split is to be expected, of course (it is always there in the first place), but why change the direction of the normals?
To solve this I might have to manually introduce hard edges (with explicit normals for the split vertices) along the UV borders, which is a chore. Especially in Max, where my options are to manually assign SGs or use Edit Normals, which doesn't really have any loop selection options (when you select Normals by Edge). TexTools only allows converting separate shells into separate SGs; it does not introduce hard edges along ALL the UV edges.
Have you flipped the green channel of your normal map?
Of course, baked in xNormal with Y-. Had I not flipped it, the artifacts would not have appeared as a mere seam. There was another mesh, which I rendered with Max RTT (I needed the Mat IDs), and it had exactly the same thing (as expected). The mesh changes on import - that much is obvious.
The projection is fine. All the channels are in order. Everything triangulated before bake. Cage is used (though it's a single SG anyway).
This thing is mentioned in the wiki. It's just that I wanted somebody more tech-savvy than me to explain this engine peculiarity and possibly better ways to work with it.
Same thing has happened before (the last post):
http://www.game-artist.net/forums/support-tech-discussion/13103-normal-seams-unreal.html
I believe UDK will always split UV seams, so you do need to make those edges hard before you bake, because those edges will be hard in UDK. You can try messing with the "Import Tangents and Explicit Normals" option in UDK when importing the model and see if that helps.
so you do need to make those edges hard before you bake
Yeah, I understand that. If only I could find a quicker way of converting all of the UV seams to hard edges...
You can try messing with the "Import Tangents and Explicit Normals" option in UDK when importing the model and see if that helps.
It doesn't, unfortunately. Unreal splits and alters the direction of the normals along the UV seam no matter what.
Just out of curiosity, it would still be interesting to know why the engine does that.
I think TexTools for 3ds Max has a button for that: http://www.renderhjs.net/textools/ - if not, there's a script for it in this thread: http://www.polycount.com/forum/showthread.php?t=71406
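For the Maya users hitting the same problem, the "harden every UV border edge" step can be scripted with a selection constraint. A minimal sketch from memory (the object name is an example, and the flag values may need checking against your Maya version):
[code]
# Harden all UV border edges of a mesh (Maya Python sketch).
from maya import cmds

edges = cmds.polyListComponentConversion("pCube1", toEdge=True)  # example object
cmds.select(edges)
# Filter the selection down to texture (UV) border edges only.
cmds.polySelectConstraint(mode=2, type=0x8000, textureborder=True)
cmds.polySoftEdge(angle=0, constructionHistory=False)   # angle 0 = hard edges
cmds.polySelectConstraint(mode=0, textureborder=False)  # reset the constraint
[/code]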
When a mesh is actually rendered on the GPU, any UV seam has to be split into 2 separate edges; that's why the seam happens. UDK technically could keep the original vertex normals, but it doesn't, it just reads the mesh and splits it.
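The split itself is easy to demonstrate: to the GPU a vertex is the whole tuple of attributes, so the same position with two different UVs must become two vertices. A toy Python sketch of vertex-buffer building (the normals stay identical on purpose, which is exactly the point disputed below):
[code]
# Toy vertex-buffer builder: a "vertex" is the full (position, normal, uv)
# tuple, so a UV seam forces a duplicate even when the normals match.
from collections import OrderedDict

def build_vertex_buffer(corners):
    buffer, indices = OrderedDict(), []
    for corner in corners:                      # corner = (position, normal, uv)
        indices.append(buffer.setdefault(corner, len(buffer)))
    return list(buffer), indices

corners = [
    ((0, 0, 0), (0, 0, 1), (0.0, 0.0)),  # corner as used by face A
    ((0, 0, 0), (0, 0, 1), (0.5, 0.0)),  # same position AND normal, new UV
]
verts, idx = build_vertex_buffer(corners)
print(len(verts))  # 2 - split happened, yet nothing forced the normals to change
[/code]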
When a mesh is actually rendered on the GPU, any UV seam has to be split into 2 separate edges; that's why the seam happens.
I don't think that's exactly it. The vertices are always split along UV borders, Mat ID changes, and SG changes in any engine - that's true. But the split of the vertices does not automatically imply that the vertex normals should change their direction. Hence my question. In my original post I showed in the picture that, for example, in Marmoset NO such vertex normal alteration happens. It is, naturally, also rendered on a GPU. The vertices along the UV border are still split, but the normals of the split vertices are aligned (facing the same direction), thus the shading causes no seam and one might even think there is no split whatsoever (which is a mistake). Long story short - a vertex split doesn't mean misaligned vertex normals. Here's a simple example. In essence it's the same as the one I posted before:
1. A sphere with one SG
2. Each of the polygons is a separate UV shell.
3. The blue lines show the normals. And while there is only one blue line per vertex in the editor, in actuality there is more than one vertex at each of these points and, thus, more than one vertex normal (the UV splits cause that). But the mesh is smooth. The Max Nitrous viewport is also rendered predominantly on the GPU, but there is no seam, because the normals are unified/aligned.
4. We import the sphere into Marmoset - the vertex normals are intact. And to me it seems logical.
5. If we look at the tangent basis we will see that the vertex normals are still aligned (same good old blue lines).
6. And now we import into UDK. The mesh now has its vertices split and its vertex normals misaligned, resulting in a hard edge.
UDK technically could keep the original vertex normals, but it doesn't, it just reads the mesh and splits it.
Yeah, it splits the vertices and changes the vertex normal directions, and I just don't get it :poly142: I'm no programmer and I'm almost certain there is a reason for that.
I understand that the extra hard edges that I need to make are no big deal and do not bring any extra verts, due to them already being split because of the UVs.
But this wonderful stickied thread might be a bit confusing for some. EQ writes that:
You do not however, NEED to use hard edges wherever you split your UVs (as some may suggest)
And it's true and it's logical and it's how it should be, but it just isn't the case with Unreal *shrug*. With Unreal you MUST always use hard edges wherever you have UV borders, otherwise you'll get a seam.
Or I could be just dead tired and missing some well hidden option in UDK :smokin: