It looks like you haven't set up your smoothing groups properly. What I usually do is give my low poly soft edges, and then anywhere there is a hard angle I set that edge to hard. Then you need to split your UVs at your hard edges; if you don't, you will get those weird lines. Hope this helps.
Don't even bother with UV/smoothing splits. Set the entire mesh to smooth, export the .fbx with Tangents & Binormals unchecked, and import to SP with 'compute tangent space per fragment' checked (the default). The bake should be perfect at the default ray distance (.02).
musashidan Can you tell me how the hell that works? I just tried it on a cube and it looks amazing! The normals look weird, but it looks fine in the viewport. If you can't be bothered explaining this to a beginner like myself, a shortcut to some documentation of any sort would be appreciated. I'm very confused but happy about this technique. Thanks.
@Nerdicon3000 It's an averaged-vert bake using mikkt tangent space, so any other software that also uses mikkt (which is most of them now) will be 'synced'. I'm off to work now, but when I get home I'll post some thread links from here where I helped others, and explain the method.
That's cool, I have never dared to do a cageless bake before, but this might save some time. A little nitpick: by default 'compute per-fragment' is turned off :P
@ant1fact Yes, a cage is not needed in this workflow. Strange. I've been baking exclusively with SP for 18 months and 'compute per-fragment' has always been on by default for me.
Thanks heaps for the info @musashidan. I really hated having to split at hard edges all the time; some of my UVs were a mess because of it. I will be using this technique most of the time from now on, I think. I will be keeping an eye on some of your other threads too. I have always been afraid to move away from sub-d modelling, but some of your techniques with booleans look pretty awesome.
@Nerdicon3000 Here's a description of the workflow:
- Low and high poly silhouettes matching closely
- Single smoothing group (no UV/SG splits or cage necessary)
- Export .fbx - low poly has smoothing groups checked and has already been triangulated; Tangents & Binormals unchecked (SP will look after this - see next step, and the export sketch after this list)
- SP import - 'compute tangent space per fragment' checked
- Bake with a ray distance between .01 and .04
- Perfect results every time (provided you follow the steps), and because the tangents and binormals are calculated by the baker using mikkt, you are guaranteed it will be perfect in any other software that is synced to mikkt.
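If you want to script the export step, here is a minimal sketch of my own, assuming Maya and its bundled FBX plugin (the mesh name and output path are hypothetical placeholders; other DCCs expose the same options under different names):

# Minimal sketch of the export settings above, assuming Maya's FBX plugin.
# "lowpoly_mesh" and the output path are hypothetical placeholders.
import maya.cmds as cmds
import maya.mel as mel

cmds.loadPlugin("fbxmaya", quiet=True)

mel.eval("FBXResetExport")                          # start from clean defaults
mel.eval("FBXExportSmoothingGroups -v true")        # smoothing groups checked
mel.eval("FBXExportTangents -v false")              # Tangents & Binormals unchecked
mel.eval("FBXExportTriangulate -v true")            # triangulate on export

cmds.select("lowpoly_mesh", replace=True)
mel.eval('FBXExport -f "C:/bakes/lowpoly.fbx" -s')  # -s exports the selection only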
The normal map looks like that because it is an averaged-vert bake: the gradients are compensating for a single smoothing group across perpendicular surfaces. But it doesn't matter, as baker and destination engine are synced and both compute the same tangent basis. A UV/smooth-split normal map looks the way it does because the normals terminate at the splits and isolate flat surfaces, which means the RGB data encoded in the map is consistent.
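To make 'averaged-vert' concrete, here is a small illustration of my own (not anything from SP) of how a single smoothing group averages the face normals at each vertex:

# Illustration only: per-vertex normals as the area-weighted average of
# the normals of the triangles sharing each vertex - which is what a
# single smoothing group gives you.
import numpy as np

def averaged_vertex_normals(verts, tris):
    """verts: (V,3) float array, tris: (T,3) int array of triangle indices."""
    normals = np.zeros_like(verts)
    v0, v1, v2 = verts[tris[:, 0]], verts[tris[:, 1]], verts[tris[:, 2]]
    # The cross product's length is twice the triangle's area, so summing
    # the unnormalised face normals area-weights the average automatically.
    face_normals = np.cross(v1 - v0, v2 - v0)
    for corner in range(3):
        np.add.at(normals, tris[:, corner], face_normals)
    return normals / np.linalg.norm(normals, axis=1, keepdims=True)

# Two perpendicular triangles sharing an edge: the shared vertices end up
# with a normal halfway between the two faces, and the bake stores the
# gradients that compensate for exactly that.
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
tris = np.array([[0, 1, 2], [0, 3, 1]])
print(averaged_vertex_normals(verts, tris))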
Also, an extra benefit of this workflow is that you can unwrap your UVs without needing to worry about splitting islands, which, in turn, means lower on-card vert count.
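As a rough sketch of where that saving comes from (my own illustration, not tied to any particular engine): the hardware duplicates a vertex wherever its face corners disagree on normal or UV, so the on-card count is just the number of unique position/normal/UV combinations:

# Rough illustration: the hardware vertex count equals the number of
# unique (position, normal, uv) combinations across all face corners,
# so every hard edge or UV seam duplicates the verts along it.
def on_card_vert_count(corners):
    """corners: iterable of hashable (position, normal, uv) tuples,
    one per face corner."""
    return len(set(corners))

# Two triangles sharing an edge, averaged normals, contiguous UVs:
# the shared corners collapse and only 4 verts go to the card. Split
# the normals or UVs along that edge and you pay for 6.
soft = [((0,0,0), (0,0,1), (0,0)), ((1,0,0), (0,0,1), (1,0)), ((0,1,0), (0,0,1), (0,1)),
        ((1,0,0), (0,0,1), (1,0)), ((1,1,0), (0,0,1), (1,1)), ((0,1,0), (0,0,1), (0,1))]
print(on_card_vert_count(soft))  # -> 4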
The only trade-off/caveat I see with this workflow is that sometimes you can get compression/mipping artifacts in-engine, but that is something you want to test out yourself and may be unnoticeable in most cases.
Have a look at this thread, where I baked out the OP's problem mesh using the workflow I'm describing (second-last post):
http://polycount.com/discussion/comment/2443944#Comment_2443944
And also this thread, where I did the same for the OP using the same workflow:
http://polycount.com/discussion/comment/2441974#Comment_2441974
And this thread:
http://polycount.com/discussion/comment/2428355#Comment_2428355
And those issues are only a big problem when you have a really low-poly model with lots of sharp angles and very little geometry. Throwing an extra support loop into those problem areas can easily and quickly fix those artifacts. Or, if you want to get fancy, face-weighted geometry.
But for a portfolio piece, you probably don't need to worry about mips, compression, or LODs.
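For anyone curious what mipping actually does to the map, here is a rough sketch of my own, assuming the simple 2x2 box filter most mip chains amount to:

# Rough illustration of the mipping caveat: box-filtering a normal map
# averages neighbouring texels, and across the steep gradients of an
# averaged-vert bake the average points somewhere neither texel did.
import numpy as np

def next_mip(normal_map):
    """normal_map: (H,W,3) array of unit normals, H and W even.
    Returns the 2x2 box-filtered next mip level, renormalised."""
    h, w, _ = normal_map.shape
    mip = normal_map.reshape(h // 2, 2, w // 2, 2, 3).mean(axis=(1, 3))
    return mip / np.linalg.norm(mip, axis=-1, keepdims=True)

# Texels leaning hard in opposite directions average to nearly straight
# up - the compensation the bake encoded is gone at that mip level.
m = np.array([[[0.8, 0.0, 0.6], [-0.8, 0.0, 0.6]],
              [[0.8, 0.0, 0.6], [-0.8, 0.0, 0.6]]])
print(next_mip(m))  # -> [[[0. 0. 1.]]]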
Thanks man! I'll go right ahead and try this workflow when I get home!
@Nerdicon3000 I should have a video on my channel soon about this very process.
I wouldn't pack in sub-d modelling just yet. It is still the backbone of the majority of hard-surface modelling I do. These boolean techniques are great, but should be used wisely: they work excellently on some models and are inferior to sub-d on a lot of others.