Hello polypeople,
I have created several high poly models and retopo'd them so they have an acceptable polycount. When it comes to creating UVs for these low poly models, I seem to be having problems with visible seams in my end result. I've looked up a lot of guides and asked on many a forum, but so far nobody has really managed to solve my problems.
I think it's easiest to explain by using an example.
I've created this concrete barrier. I didn't use a cage to bake it, so the 90° angles on the model aren't going to be perfect. If I had used a cage, though, I would say these UVs are pretty much perfect, since there are almost no visible seams. Yet most people tell me I should completely change my UV layout for better results. The biggest problem in the example below is that the sides of the barrier have really weird lighting results that I can't get rid of, even though they're flat.
https://www.artstation.com/artwork/0zxYV
The second example, with 'cleaner' UVs, ends up adding visible seams to my model. As you can see on ArtStation, where the concrete is broken it adds a hard line between the broken and non-broken concrete. I really want to know how I can hide these seams.
And again I'm stuck with really weird lighting on the sides of the barrier.
https://www.artstation.com/artwork/WPNK3
If it helps, I could share the low/high poly .obj files so you can do it yourself and educate me in the process.
Replies
http://wiki.polycount.com/wiki/Normal_Map_Technical_Details#Synched_Workflow
I also managed to get rid of most of the seams. My best guess is that my low poly topology has to be optimized so I can blend in the smoothing groups better.
If I could ask a quick question about syncing two programs to solve seams: I read the guides you linked, but I'm having trouble understanding parts of them. What would I have to do, more or less, if I made a high poly in ZBrush, exported the high/low poly files as .obj files and baked normal maps with xNormal? If I then wanted to place these in both Marmoset Toolbag and Cryengine, would I have to do anything specific to sync the workflow?
A synced workflow means that both the baking and destination software are using the same tangent basis - in the case of xNormal, UE4, and Substance, for instance, that's mikktspace.
It means worrying about UV splits/smoothing groups isn't really an issue, and it eliminates the usual problems that arise from baking to a non-synced target.
TB2 has a mikkt option, I believe, but I don't know about Cryengine.
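If it helps to see the idea written out, here's a rough numpy sketch (purely illustrative, not any engine's actual code) of why the basis has to match: the baker stores the high-poly normal relative to the low-poly's tangent frame, and the renderer rebuilds a world-space normal from whatever frame it computes itself.

import numpy as np

def encode(world_normal, T, B, N):
    """Baker: express the high-poly normal in the low-poly's tangent frame."""
    tbn = np.column_stack((T, B, N))   # columns are the frame axes in world space
    return tbn.T @ world_normal        # roughly what gets stored in the normal map

def decode(stored_normal, T, B, N):
    """Renderer: rebuild a world-space normal from the stored tangent-space value."""
    tbn = np.column_stack((T, B, N))
    return tbn @ stored_normal

# decode(encode(n, T, B, N), T, B, N) only gives back n if both sides use the
# same T/B/N. A renderer that derives a different tangent basis than the baker
# (non-synced) reconstructs a different normal - that's your shading errors and seams.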
The following normal map has been baked in Xnormal with a cage.
http://i63.tinypic.com/142699u.jpg
Cryengine:
Substance Painter:
Marmoset TB2:
Anyway, I've done similar tests to his and sometimes I get better results than normal. But when I change the program I view the model + normal map in, the results are usually worse. So when everyone keeps asking me whether I synced the workflow or whatever, I have no clue. And even though I received some links with information on it, it still doesn't make sense to me.
Could you try to explain what this syncing and tangent basis mumbo jumbo is somehow?
in 3dsmax:
-give everything the same smoothing group
-export as an fbx with only smoothing groups, triangulate and preserve edge orientation selected
in SP:
-upon importing select 'compute tangent space per fragment'
-bake your normal map in SP (not much is specified here, I just kept the default settings while adjusting the max frontal and rear distance a little. Default being where 'relative to bounding box', 'average normals' and 'ignore backfacing' are selected.)
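From what I gather, those distance settings roughly control how far the baker searches for the high poly. Here's my rough mental model in pseudocode (illustrative only, not Substance Painter's actual code - the raycast helper is made up), so correct me if it's wrong:

def bake_texel(low_point, low_normal, highpoly, frontal_dist, rear_dist):
    # 'average normals' = low_normal is the smoothed vertex normal rather than
    # the hard face normal, so rays don't leave gaps at hard edges
    origin = low_point + low_normal * frontal_dist   # start a bit outside the surface
    direction = -low_normal                          # shoot back through the low poly
    max_len = frontal_dist + rear_dist               # total search depth
    # 'relative to bounding box' would scale these distances by the mesh size
    hit = highpoly.raycast(origin, direction, max_len)   # hypothetical helper
    if hit is None:
        return None          # nothing found in range -> baking artifact on that texel
    return hit.normal        # written to the map after converting to tangent space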
My results:
-the seams seem to have mostly disappeared, which is a very good start! However, if you look at the right side there is a lot of funny stuff going on. (I remade my lowpoly because I was desperate, and I suppose this could have something to do with it)
-Yikes!
I want to fit my end result into Cryengine eventually, but I'll forget about that step for now until I'm able to bake normals like an absolute boss in Substance Painter. What do you think the issue is right now? My lowpoly? Take a look: https://www.artstation.com/artwork/oxaDq
Using this method, and since apparently both 3dsmax and Substance Painter are using mikktspace, does this mean that I can use this model + normal map in any program that also uses mikktspace and get the same result? (I still have no clue what mikktspace is; googling doesn't really help me either.)
mikktspace is just the name given to this particular tangent basis algorithm. Thankfully, most software that deals in normal maps seems to be adopting it. This is good news, as it eliminates a lot of the old issues with trying to get similar results in different software with NMs.
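The core of any tangent basis algorithm is actually fairly simple. Here's a stripped-down numpy sketch of the per-triangle step (the real mikktspace adds careful vertex grouping, angle weighting and orthogonalization on top of this, so treat it as illustration only):

import numpy as np

def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Find the surface direction on the triangle that follows the +U axis of the UVs."""
    e1, e2 = p1 - p0, p2 - p0                 # position edges
    du1, dv1 = uv1 - uv0                      # UV edge 1
    du2, dv2 = uv2 - uv0                      # UV edge 2
    r = 1.0 / (du1 * dv2 - du2 * dv1)         # inverse of the UV edge determinant
    tangent = (e1 * dv2 - e2 * dv1) * r       # world-space direction of +U
    return tangent / np.linalg.norm(tangent)

# Two programs that average/orthogonalize these per-triangle results differently
# end up with slightly different tangents, and therefore shade the same normal
# map slightly differently. "Syncing" just means making them agree.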
And what do you mean by specifying 'normals'? Do you mean checking the option 'Tangents and Binormals' when exporting an .fbx file in 3dsmax?
This might not be your issue, but it might be one thing throwing it off if everything else is working as it should.
I'm still looking for the right knowledge and thinking process of what normal mapping is exactly, so that when I have to create one for a different object or in a different rendering program I just know what to do. I believe I know what steps I have to follow to get a decent normal map inside SP and Cryengine, but I don't really understand most of them sadly. :c
By 'specifying normals' in UE4, I mean there is an option upon .fbx import for 'Import Normal Method' - choose 'Import Normals'. This tells UE to import the normals from the .fbx but calculate the tangents/binormals itself, and that's why it's important to UNCHECK the 'Tangents and Binormals' option upon .fbx export - UE will calculate them in the same way SP does when the 'compute tangent space per fragment' option is checked.
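If you want a concrete picture of what 'compute tangent space per fragment' roughly means, here's a small illustrative sketch (not SP's or UE's actual code): the binormal isn't read from the file at all, it's rebuilt per pixel from the interpolated normal and tangent plus a handedness sign.

import numpy as np

def per_fragment_tangent_frame(interp_N, interp_T, handedness_sign):
    """Rebuild the frame per pixel instead of importing the binormal."""
    N = interp_N / np.linalg.norm(interp_N)
    T = interp_T / np.linalg.norm(interp_T)
    B = handedness_sign * np.cross(N, T)   # the +/-1 sign flips B on mirrored UV islands
    return T, B, N

# Since both SP and UE4 can derive the frame this way themselves, the
# tangents/binormals in the .fbx aren't needed - hence unchecking them on export.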
I know it seems a bit confusing, but it really isn't. A synced workflow avoids so many of the pitfalls associated with the more traditional method of normal map creation.
You might be asking yourself: "What in the hell did you do with your UV unwrap, bro?" Well, since I'm trying to find out how to bake like a boss, it shouldn't really matter how many UV splits I use!
So I know I didn't upload a picture of the normal map baked without a cage, but take it from me, it looked great! (in SP at least)
Why does my normal map start to look like a world space normal when I use a cage? The baking process looked just fine when I used the cage in Xnormal.
What do you mean exactly by "don't use the substance normal export"? I exported out of 3dsmax as an .fbx and unchecked bi-normals/tangents, and on importing into SP I checked the 'compute tangents' option, which, if I understand correctly, should be the right way to do it.
Thanks for all the help, everyone! (especially musashidan)
Great stuff. Glad it worked out for you in the end mate.