
Making sense of hard edges, UVs, normal maps and vertex counts


Replies

  • EarthQuake
    Bit off topic but in light of the recent sexist awareness I have to point out that the title of this thread is potentially offensive.

    LOL is this some early april fools trolling?
  • Chase
    Let me see if I'm finally understanding all of this:

    1) Whether you use an Average Projection cage or an Explicit Projection cage determines how the normals will bake. Average Projection does just that: it averages the normals between the low and high poly, completely ignoring any smoothing groups you apply. Explicit Projection uses the low poly's normals specifically to determine the direction for the bake, so any smoothing groups will show up in the bake in the form of gaps. Think of an Average Projection cage as a clone with one smoothing group that, when baked, will ignore hard edges by filling these projection gaps and eliminating any potential seams. Explicit Projection wouldn't do anything to counter these seam errors. Average Projection is the preferred choice.

    2) Once we're past which projection type to use, we get to the meat and potatoes. Whenever you have a hard edge you need to split the UVs there. If I'm only using a normal map on part of a model, does this rule apply on the parts that are not being normal mapped? You can apply the hard edges manually and then make each UV island based on those smoothing groups, or create the UV islands where the smoothing groups will be and run a script to create the hard edges automatically. Don't add any unnecessary hard edges.

    However, if you're using a synced workflow with an Average Projection, it means your baker and game engine are going to provide an accurate representation of the normal map. This gives you the opportunity to avoid the whole "you must have UV splits where there's a hard edge" rule. In most cases you'll want to continue to use hard edges because they cut down on the gradients, and you can use the same normal map for LODs and won't have horrendous mip maps. The best part is they're free, meaning you aren't adding any additional geometry. Maybe that doesn't matter to you, so you stick with just using one smoothing group to make unwrapping/texturing easier. That's fine and dandy; you're just being more artistically efficient and less technically efficient. If you're not using a synced workflow, you're stuck with having to split your UVs wherever there's a hard edge. With that in mind you'll want to cut down on the number of hard edges you use. This is best done by applying bevels, which in turn will cut down the in-game vert count. How do you know if you're using a synced workflow? Is there some sort of list of engine/program combos?

    3) Before you go about baking you should triangulate your mesh. If you do this after you bake, the normal map becomes useless, since the map was originally associated with the actual normals of the low poly. Triangulating the mesh can alter the direction of the normals, so at the very least run a test bake to see if there's a difference.

    4) What's the difference between padding the UVs in the unwrap vs baking with padding?
  • leleuxart
    I've been following this thread since the beginning, but haven't been able to keep up with each page, so I apologize if this was answered, but

    How would this apply to someone that currently only works in Maya, xNormal, and UDK? Some of the information was losing me, just because of the translation from Max to Maya. I don't know enough of Max to completely switch either. For a recent asset, my workflow was:

    1) Model basic high poly asset and low poly in Maya
    2) Sculpt high poly and use that as the final high poly version
    3) Smooth all edges on the low poly in Maya, then harden all of the UV seams/edges

    Used default settings in xNormal, except with a cage from the 3D Viewer, and got a pretty good result. My low poly had to have lots of bevels though; would I normally have to harden the UV seams and add bevels to the low poly?

    Sorry that this seems relatively basic; just having a hard time grasping some of the info.
  • Chase
    From what I understand you want to always use an Average Projection. What this means for xNormal is you need to export a cage. There's another alternative to that within xNormal but I'm spacing on it. It's a really easy alternative. The reason you want to use an Average Projection is that the cage will ignore any hard edges when you bake. An Explicit Projection uses the normals from the low poly and transfers the hard edges to the normal map, so when applied to the model the normal map will show the hard edges.

    A synced workflow is the next thing to worry about. From what I've been told, the chances you're using a synced workflow are slim. That's where Alec Moody's software Handplane comes in. I assume creating a synced workflow involves a bunch of technical jargon. Alec's Handplane creates a makeshift synced workflow: you bake like normal, but you bake out an object space normal map instead of a tangent space one. You import the object space normal map and the low poly into Handplane and it kicks out a tangent space normal map as if it were baked in a synced system. A synced workflow with an Average Projection mesh has pros and cons:

    1. You can use one smoothing group, but you'll run into shading errors when you apply a lower resolution normal map. These shading errors will look worse and worse the lower the resolution of the normal map, which you will see with LODs.
    2. To counter these shading artifacts you can add extra edge loops that control the shading, but they also inflate the vert count. You get to keep the UV seams to a minimum, however, since you don't have any hard edges to worry about.
    3. The third option makes use of hard edges. You place hard edges wherever there's a UV seam, which is why everyone says you get hard edges for free: there's already a split there. I guess the reasoning behind this method is that if you're using a synced workflow you don't have to have hard edges everywhere and can have a combo of soft edges? No matter what you do, you have to split your UVs where you have hard edges. This method can potentially create more seams but limits the vert count. The best thing to do is combine this method with the smoothing group method that has you adding some extra edge loops to control the shading.

    Not sure if that's confused you more. Hopefully I'm accurate in all areas. Don't want to mislead anyone or myself. Basically, a synced workflow or using Handplane lets you use one smoothing group or any combination of soft and hard edges. Using no hard edges is the worst option.
  • Eminor
    Thanks for this info!
  • Nihlus
    [image]

    I don't know why I can't get a decent map out of anything. Ever. I've spent an entire night until morning trying different things. Hardening edges on seams, making sure the right stuff is checked during obj export, baking using a cage, splitting the UVs, different smoothing groups, the works. I don't know what to do at this point. I've looked on here and read tons of posts by EarthQuake and whatnot, and I still get this. It slopes in all funky on these faces and I just don't understand what's up with these gradients... I feel so dumb!
  • metalliandy
    Nihlus wrote: »
    I don't know why I can't get a decent map out of anything. Ever. [...]

    There is nothing wrong with the gradients you are seeing. They are there to compensate for the shading on the low poly mesh and are perfectly correct.

    What are you baking/viewing in?
    Have you made sure that the normal map swizzle of the baker matches the engine/application you are viewing the mesh + normal map in? (Maya, Marmoset and Unity should be X+Y+Z+; Max, UDK and CryENGINE 3 should be X+Y-Z+.)
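    The X+Y+Z+ vs X+Y-Z+ difference is just the green channel flipped. As a rough sketch (plain Python over 8-bit RGB tuples; in practice you'd use your baker's export option or a channel invert in Photoshop), converting between the two conventions looks like:

```python
def flip_green(pixels):
    """Convert a normal map between Y+ (Maya/Marmoset/Unity) and
    Y- (Max/UDK/CryENGINE 3) by inverting the green channel of
    8-bit RGB pixels."""
    return [(r, 255 - g, b) for (r, g, b) in pixels]

# Only the green channel changes between the two conventions;
# red (X) and blue (Z) are left alone.
print(flip_green([(100, 200, 255)]))  # [(100, 55, 255)]
```

    Most bakers expose this as a "flip Y/green" toggle, so you rarely need to do it by hand.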

    If that is correct then following these rules should get you a great bake.
    1. If it has a hard edge in 3d space, it must have a break in UV space and padding between it and other islands.
    2. Triangulate your mesh before baking
    3. Use the same mesh that you baked with in game
    4. Use a cage.

    Try this out and see if it works.
    1. Unwrap the mesh as you would normally
    2. Triangulate your mesh
    3. Set the borders of the UV islands to hard
    4. Make and set up a cage/set up the envelope correctly so that it fully encompasses the high poly mesh
    5. Bake
    Hope that helps!
  • Chase
    Noob question but what do you mean by the gradients are trying to compensate for the shading of the low poly? You mean the low trying to match the high?
  • WarrenM
    Like, look at the mesh in lighting only mode in UDK. If you see weird smoothing artifacts running across it, the normal map will have a gradient in it to try and remove that artifact. Basically the more smoothing artifacts you have, the harder the normal map has to work to compensate and make the surface appear smooth.

    There's nothing inherently wrong with making a normal map work for its money (aka having gradients) but just be aware that fixing the smoothing on the low poly is probably a better option through vertex normals or baking with a proper cage.
  • metalliandy
    Chase wrote: »
    Noob question but what do you mean by the gradients are trying to compensate for the shading of the low poly? You mean the low trying to match the high?

    The gradients from the smoothing that you see on the low poly are baked directly into the normal map. If you look at the examples below you can see that the same gradients are present on the low poly and the normal map, though the gradients on the normal map are a direct inverse of the gradients on the low poly mesh.

    Hard edges/multiple smoothing groups to the left, Soft normals/one smoothing group to the right
    [image]

    Hard edges/multiple smoothing groups to the left, Soft normals/one smoothing group to the right
    [image]
    This information is used to work out the difference in the shader, so that even when the gradients are really strong it will still render correctly when applied (assuming the baker and renderer are using the same matching tangent basis).

    Hard edges/multiple smoothing groups to the left, Soft normals/one smoothing group to the right
    [image]
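    A toy recreation of that cube example (my own numbers, not the actual bake): with one smoothing group, the vertex normals at a face's left and right edges are the 45-degree averaged normals, and the shader interpolates between them across the face. The map has to encode the inverse of that sweep so the flat face still renders flat, and that inverse is the gradient.

```python
import math

def interp_x(t):
    """x-component of the renormalized shading normal at position t
    across the face, interpolating between the two averaged edge
    normals (-1, 0, 1) and (1, 0, 1) as the shader does."""
    x = (1.0 - t) * (-1.0) + t * 1.0  # lerp the edge normals' x
    z = 1.0                           # both edge normals share z
    return x / math.sqrt(x * x + z * z)

for t in (0.0, 0.5, 1.0):
    print(round(interp_x(t), 3))
# Prints -0.707, 0.0, 0.707: the shading normal sweeps smoothly
# across the face, so the baked map stores the opposite sweep,
# which is exactly the gradient visible in the images above.
```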
  • joeriv
    Maybe someone else can give a more technical answer, but this is how I understood it: normal maps are basically encoded in a certain way, and they need to be decoded.

    Maybe a bit stupid to explain it like this and probably doesn't make that much sense but:

    Synced would be:
    Baking: 5+5=10
    Renderer: there is a 10 here, so that means I should do 5+5

    Not synced, would be baking with 5+5=10 but rendering with 5*2=10
    It's not wrong, but it just isn't the same, so you have to do things to compensate that. (this being breaking smoothing groups at 90 degree angles).

    X/Y/Z being different would just be a matter of the data being -10 instead of 10, and if you know that, it's basically just a matter of inverting it (inverting a channel in PS).
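    The analogy above can be sketched in a few lines (purely illustrative numbers, nothing to do with real tangent-basis math): the baker encodes with one rule, and only a renderer that applies the exact inverse of that rule gets the original value back.

```python
def bake(value):
    return value * 2 + 1      # the baker's "encoding" rule

def decode_synced(stored):
    return (stored - 1) / 2   # exact inverse of bake(): synced

def decode_unsynced(stored):
    return stored / 2         # close, but not the inverse: unsynced

original = 5
stored = bake(original)
print(decode_synced(stored))    # 5.0 -- recovered exactly
print(decode_unsynced(stored))  # 5.5 -- slightly off, i.e. shading errors
```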
  • EarthQuake
    Chase wrote: »
    Ok so the gradients show up whenever there is a smoothing error which is simply just having soft edges in places that would normally be hard. If you have a synced workflow or using Handplane you can get away with these gradient errors in your normal map no matter how many shading errors there are. Just know that the more smoothing errors your low poly has the more intense the gradient in the normal map will be. This is because the normal map is trying to compensate for these shading errors of the low poly to make the model appear smooth and error free with the normal map applied. Again, this is ok only if you have a synced workflow or are using Handplane. Did I get the gist?

    No, gradients do not denote errors or anything of the sort. Gradients simply are there to compensate for your lowpoly vertex normals, no matter what those vertex normals look like. The more extreme your vertex normals are, the more extreme the gradients you'll see in your baked map. You can never spot an "error" by looking at the content of the normal map, smoothing errors and things of that nature will only show up when the normal map is applied to a model.

    Gradients aren't in any way inherently bad; however, the more your normal map has to compensate for extreme mesh normals, the more likely you will have smoothing errors or other artifacts pop up, especially if your pipeline isn't synced.
    Side note about a synced workflow.... I know this is when the baker that renders the normal map matches the engine that will display it. What does this exactly mean? Is it when your baker and engine's X, Y, and Z are all the same?
    Most apps calculate what is called a "tangent basis" a little differently, specifically, the normals, bi-normals and tangents. Put simply, the math varies a bit, for the best results you want that math synced up between the app you use to bake the map and whatever you're using to render/view the map.
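    To make "the math varies a bit" concrete, here is one common per-triangle tangent computation (a Lengyel-style sketch, not any specific baker's code). Bakers and engines differ in how they average, orthogonalize and normalize these per vertex, which is exactly why bases end up mismatched between apps.

```python
import math

def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Tangent of one triangle from its positions and UVs:
    the direction in 3D space that follows the U axis of the UVs."""
    e1 = [p1[i] - p0[i] for i in range(3)]
    e2 = [p2[i] - p0[i] for i in range(3)]
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    r = 1.0 / (du1 * dv2 - du2 * dv1)
    t = [(e1[i] * dv2 - e2[i] * dv1) * r for i in range(3)]
    length = math.sqrt(sum(c * c for c in t))
    return [c / length for c in t]

# A triangle whose UVs line up with its position: the tangent follows +X.
print(triangle_tangent((0, 0, 0), (1, 0, 0), (0, 1, 0),
                       (0, 0), (1, 0), (0, 1)))  # [1.0, 0.0, 0.0]
```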
  • Chase
    So gradients aren't a bad thing to have? Alec's video on Handplane was about how these gradients are bad. They show up to compensate for what the low poly's normals are doing, to create the appearance of the high poly. When you say they compensate for what the low poly's normals are doing, does that entail making the low match the high in general, as well as when you use a combo of smoothed and hard edges like Andy's box example?
  • metalliandy
    Chase wrote: »
    So gradients aren't a bad thing to have? [...]

    The gradients are not bad in and of themselves. They are totally acceptable, correct and work perfectly well if the tangent basis of the render and baker match (The encoder is the exact inverse of the decoder).

    However, problems can and do arise when the tangent basis of the encoder and decoder (baker and engine etc.) do not match, but this does not mean that the gradients are bad, just that the math is incorrect between the normal map and renderer. This is when Handplane comes into its own as it allows direct conversion between different tangent spaces. Handplane will not remove any gradients that are present in the normal map, they will just be encoded correctly for the target engine.

    Having said that, there are most definitely situations where having a bake with as few gradients as possible is beneficial and even desirable. There are a few reasons for this.
    1. Gradients resize and compress badly.
    2. It can be harder to texture when strong gradients are present.
    3. There may not be enough texel density to draw gradients accurately, even on large textures, and smaller textures make this worse.
    4. When using mipmaps, normal maps with strong gradients will break faster, whereas textures with fewer gradients will hold up longer.
    5. Often the tangent basis between the baker and renderer does not match, which can cause errors with normal maps that contain strong gradients, even without mipmaps.
    Here you can see the cube test I did earlier with various declining texture sizes from top left to bottom right.
    Notice that as the resolution gets smaller, the cubes that have the normal map with the strong gradients become much more pixelated.

    [image]

    This is compounded when DXT (or in this case 3Dc) compression is used alongside declining resolution. This is the worst-case mesh that EQ made a while back, but it shows the problem perfectly.
    [image]

    In most situations you're probably best off reducing the gradients, simply for ease of use and because the maps are more flexible when using a baker that doesn't match the renderer, which allows for greater transferability. In most cases you should be fine just adding hard edges around the borders of your UV islands, as this will reduce the gradients enough that they are much less of a problem. It also makes sense to do this as it is a free way to reduce the issue (no extra vert cost).
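    A toy illustration of point 4 above (my own numbers, not the cube test): a mip step box-filters neighbouring texels, and the more two neighbouring normals disagree (i.e. the stronger the gradient), the less of the original direction survives the averaging.

```python
import math

def unit(v):
    """Normalize a 3-vector."""
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

def average_length(n0, n1):
    """Length of the box-filtered average of two unit normals before
    renormalization: 1.0 means nothing lost, lower means detail destroyed."""
    avg = [(a + b) / 2.0 for a, b in zip(n0, n1)]
    return math.sqrt(sum(c * c for c in avg))

# Nearly-agreeing texels (flat area) vs strongly disagreeing texels
# (steep gradient) after one mip step:
gentle = average_length(unit([0.05, 0.0, 1.0]), unit([-0.05, 0.0, 1.0]))
steep = average_length(unit([0.7, 0.0, 0.7]), unit([-0.7, 0.0, 0.7]))
print(round(gentle, 3), round(steep, 3))  # 0.999 0.707
```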
  • EarthQuake
    Having less gradation in your normal maps is generally a good thing, for all of the points Andy mentions here (as well as the points in the original post, and Alec's video).

    However, I really like to stress that gradients are not in some way evil. I've seen inexperienced artists try to "remove" gradients by painting over them in photoshop somehow thinking that gradients are the cause of smoothing errors. This is a very silly and totally incorrect thing to do, and will only make matters worse.

    If your pipeline isn't synced up, what you should really be thinking about is using less extreme mesh normals; don't worry about how much gradation your normal map has, that will come naturally with a well set up lowpoly mesh. So this means using hard edges + UV splits, or bevels wherever appropriate, to limit harsh normal angles. If your normal map is full of extreme gradients, that's a good sign it won't play nice with a non-synced workflow. However, you should be able to tell simply by looking at the shading of your lowpoly mesh whether you have extreme changes in vertex normals before you even bake.
  • mister_s
    Oh my god, I understand normal maps. Thank you EQ. Up until this point whenever something went terribly wrong (and it happened often) I just shrugged and said that magic is never perfect.
  • mister_s
    The above being said that reminded me I actually had a question. I have a test bake and I wanted to know if there is anything that can be done (or maybe something I did wrong) about very ugly lines when viewed from near.

    [image]

    When you get near that cube some of the lines inside get really pixelated. When I import that into UDK and a character walks near it it looks hideous. Any suggestions?
  • EarthQuake
    mister_s wrote: »
    I have a test bake and I wanted to know if there is anything that can be done (or maybe something I did wrong) about very ugly lines when viewed from near. [...]

    Upping the texture resolution, upping the anti-aliasing, and upping the super-sampling (depending on what app you're baking in) would all help.
  • mister_s
    Sounds good, sir. I'll give it a go. The texture resolution was 2048, but the anti-aliasing and super-sampling could definitely be upped more. I just wasn't sure if it was a rendering thing or if I missed something important.

    Thanks again!
  • Nihlus
    metalliandy wrote: »
    There is nothing wrong with the gradients you are seeing. They are there to compensate for the shading on the low poly mesh and are perfectly correct. [...]

    [image]
    (Now I just have to worry about this AA issue a friend of mine brought to my attention. As you can see... YUCK! The rest of the stuff I do will definitely need a smoother high poly.)
    [image]
    Thank you, sir! That worked out pretty well.
  • Chase
    Gradients are due to the vertex normals being altered in some way. This means you have edges sharing the same SG instead of having a hard edge. Where the gradients show up on the normal map will directly correlate to where the vertex normals have been altered on the model. The more softened normals there are the more the normal map has to work to compensate for them being there in order to match the smoothing between lp and hp, and the more likely you are to get shading errors/artifacts. Gradients aren't the cause of errors, editing the vertex normals are. To lessen these errors you need to use a synced workflow/Handplane. You should also be limiting the amount of softened vertex normals in the model by using hard edges at least where the uv splits are. This will lessen the amount of gradation since there'd be less soft edges, ergo less errors.
  • EarthQuake
    Chase wrote: »
    Gradients are due to the vertex normals being altered in some way.

    Nope. Nothing to do with altering vertex normals. Vertex normals are generally soft (averaged) or hard (split along an edge). Vertex normals can also be manually edited, but that's a more advanced topic.
    This means you have edges sharing the same SG instead of having a hard edge.
    Neither here nor there.
    Where the gradients show up on the normal map will directly correlate to where the vertex normals have been altered on the model.
    Nope. More extreme gradients equate directly to more extreme vertex normals. By extreme vertex normals I mean the vertex normal is facing in a direction other than the face normal. Two adjacent faces at 90 degrees will create an averaged vertex normal between those two faces, which essentially means harsh shading.
    The more softened normals there are the more the normal map has to work to compensate for them being there in order to match the smoothing between lp and hp, and the more likely you are to get shading errors/artifacts.
    Sort of, depends on what you mean by "softened". The less extreme your vertex normals are, or the less harsh your shading is, the less work your normal map will have to do. In this case having "softer" vertex normals would be a good thing, so "soft" is an ambiguous term here.
    Gradients aren't the cause of errors,
    Yep
    editing the vertex normals are.
    Nope
    To lessen these errors you need to use a synced workflow/Handplane.
    Yep

    or

    By using less extreme vertex normals, which can be accomplished via using hard edges, or by adding bevels to soften your shading.
    You should also be limiting the amount of softened vertex normals in the model by using hard edges at least where the uv splits are. This will lessen the amount of gradation since there'd be less soft edges, ergo less errors.
    Again sort of odd wording, less extreme vertex normals are what you want, but you've got the idea I think.

    honestly though dude, you need to spend like 10x more time trying these theories out, and then posting when you run into issues. Trial and error is really how you're going to learn to understand this stuff.
  • WarrenM
    This might be a lack of caffeine or something but a problem popped up in my head.

    Say I have a set of custom vertex normals on my low poly (say, face weighted) and I then use that low poly mesh to generate a baking cage for XNormal.

    When XNormal is ray casting the normal map is it using straight interpolation from vert to vert or is it respecting the vertex normals on the cage and/or low poly and factoring them in? I imagine that it IS using them but for some reason my head is having a hard time reconciling this morning...
  • Farfarer
    They're separated out into two different sets of vectors, essentially. One is used for baking an object space map (uses the cage), one is used for converting the object space map to a tangent space map (uses your low poly's normals).

    The baking process raycasts out along the cage vectors and records the normal of the high poly (it records them as if they're object space normals for the low poly).

    If you don't give it an explicit cage, it will simply make one up based on your choice (explicit or averaged vertex normals). This only affects which bit of the high poly mesh is recorded - in object space - to which bit of the low poly UVs.

    Once that's done, it takes those object space normals and converts them into a tangent space normal using the vertex normals of your low poly.

    Net result: it doesn't matter if your bake cage and vertex normals point in different directions; you'll still get the correct bake result for your low poly mesh.
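    The second step described above can be sketched with minimal math of my own (the T, B, N vectors are assumed already interpolated from the low poly's vertex tangents and normals): re-express the recorded object-space normal in the low poly's per-pixel tangent frame.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def object_to_tangent(obj_normal, tangent, bitangent, normal):
    """Project an object-space normal onto the tangent-frame axes,
    yielding the value stored in the tangent-space map."""
    return (dot(obj_normal, tangent),
            dot(obj_normal, bitangent),
            dot(obj_normal, normal))

t, b, n = (1, 0, 0), (0, 1, 0), (0, 0, 1)
# If the recorded high-poly normal matches the low poly's vertex normal
# exactly, the stored tangent-space normal is (0, 0, 1): a "flat blue" pixel.
print(object_to_tangent((0, 0, 1), t, b, n))  # (0, 0, 1)
```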
  • Chase
    I think my phrasing of "altering vertex normals" when I really meant "changing from hard to soft" might have been misleading. Here's what I should've said as it seems like we're on the same page. I'm well aware I need to just put this into practice to fully comprehend what I'm saying though.

    Gradients only show up in the normal map when harsh edge angles are set to soft. For example, a box has all of its edges set to 1 SG. The gradients in the normal map are the direct inverse of the shading of the box; the shading result of 1 SG produces the gradients in the normal map. Having said that, gradients are neither bad nor good. They're just there to compensate for what the shading of the model is doing. The gradient is in essence going the opposite direction of the shading on the model to compensate for it. If your pipeline is synced, or you are using Handplane, you can certainly use 1 SG for the entire model.

    What you need to remember is when you apply a lower resolution normal map you'll get artifacts showing up because the normal map isn't of a high enough resolution to compensate for the gradients....not sure about this part

    This leads to the idea of using both edge loops and hard edges to control potentially bad shading. You can put hard edges at UV splits because that's where the model's been split already and it won't add any extra verts. This is what you mean by having "free" smoothing splits. Edge loops will also minimize any other shading problems. Using one over the other, or even both, is a case-by-case decision based on how the vert count is affected.

    With Handplane you're also free to unwrap how you wish, whereas if you weren't synced/using Handplane you'd have to split the UVs wherever your model needed a hard edge. You'd have to do this because one edge's pixel information would be trying to blend over into the other, but isn't doing so equally. You need the padding of a smoothing split in order to get equal pixel blending; if you don't, artifacts will show up along the edge. Assuming this is mildly correct, I think it's a less jumbled version of what I had written out before.
  • ratatatatat
    Hi. I've read this post for a couple hours now and have a few questions. I didn't get through all the pages, so I may be asking already-answered questions, but I was searching for answers and decided asking might be faster.

    1. I use 3ds Max to bake my normals and AO, using projection and cages. I use Marmoset to render assets and UDK for environments. Is there any benefit to using xNormal as far as quality is concerned?

    2. I use Alec Moody's workflow from the Briefcase tutorial on 3Dmotive, EXACTLY the same for all my non-organic assets. Is this workflow still good?

    3. Explained basically, what exactly is a synced normal workflow? Can anyone give an example? Is Alec Moody's technique a synced workflow?

    4. Lastly, when baking an asset in xNormal with multiple parts, I get pieces baking onto each other. For example, if I have an overhanging cord or pipe, it will bake onto the faces it is attached to. In 3ds Max you can use a multi-sub material and material IDs, set RTT to "hit only matching material IDs", and not have this problem. Is there a way to do this in xNormal? Or is there a different method to bake multiple-part assets in xNormal to avoid this problem?

    ***EDIT***
    5. one more:P Is there only a need for split UV's with padding on 90 degree angles?

    These questions are what brought me to this thread, would love some help:):)

    Thanks

    Cheers
  • Obscura
    Hi!
    1. I prefer 3ds Max because I don't see much difference, and the "hit only matching material IDs" option is very useful. UDK and normal maps are not the best of friends, so you can't 100% hide the seams and the bad shading.

    2. I don't know, because I haven't seen that video.

    3. A synced workflow is when you are using a normal map that matches the target engine.
    http://www.farfarer.com/temp/unityTangentSpace3.png
    The one in the middle is synced to Unity.

    4. You can use an exploded mesh to bake a normal map, but it doesn't work properly when you want to bake AO.

    5. Well, as I said, UDK and normal maps are not the best of friends, so you can do it however you want, but the result won't be perfect. Using UV/smoothing splits does work better with UDK though. When you use one smoothing group you'll get slightly ugly shading, and when you use UV/smoothing splits you'll get hard edges in UDK.

    These are my experiences.
  • ratatatatat
    Obscura wrote: »
    Hi!
    1. I prefer 3dsmax because I don't see too much difference [...]


    Thanks so much for this info. I'll keep going with my current workflow then :) Sounds like it's going to be easier to do Max projection bakes, and if the quality isn't better in xNormal I'll skip it for hard-surface stuff. Thanks again.
  • cptSwing
    Offline / Send Message
    cptSwing polycounter lvl 11
    xNormal's AO baking is way faster though. And you can use blockers to stop rays from hitting areas you don't want them to.
  • ratatatatat
    cptSwing wrote: »
    xNormal's AO baking is way faster though. And you can use blockers to stop rays from hitting areas you don't want them to.

    I'm not much fussed about render speeds. I'd rather use one program, one that I know like the back of my hand. For rocks, tree trunks and organic assets made in ZBrush, though, I use xNormal with a Max cage. Do you think this is alright to go with? Hard edges in Max, using splits and smoothing groups (matched to UVs with TexTools), and xNormal for organic assets, characters, etc.? Or whatever works for you?
  • Giankharlo
    Offline / Send Message
    Giankharlo polycounter lvl 5
    Hello guys, I'm simply going to ask for your advice with this baking process in Maya and xNormal. I'm following the metalliandy workflow to bake normal maps, but I'm still getting some issues when baking. I hope the images I'm going to show here help you give me some tips to improve these results.
  • Giankharlo
    Offline / Send Message
    Giankharlo polycounter lvl 5
    I just wanted to upload my UV layout, which I did just for the sake of testing. As I read in some of the posts, hardened edges have been split to avoid issues, but I'm still getting some black spots on my low-res mesh. Any advice, guys?
  • .Wiki
    Offline / Send Message
    .Wiki polycounter lvl 8
    Have you set enough padding in your baking setup?

    Your UVs are really poor. Use the full space of your UV layout, not only the upper two thirds.
    Your UVs also look distorted; make them straight where the geometry is straight. When they are distorted, there will be jagged edges in your normal map. When those jagged edges meet insufficient padding, you get those really nasty black lines.

    Your UV islands also don't look equally scaled relative to each other. Scale them so that every piece of your mesh gets the same texture resolution. You can check this with a checker pattern.
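(Editorial aside: the padding .Wiki mentions can be sketched in code. Bakers add padding by dilating each UV island: empty texels bordering the island copy a neighbor's value, so bilinear filtering and mip-mapping near the island edge don't pull in the background color. A toy version in plain Python, with one value per texel and a hypothetical 5x5 map:)

```python
# Minimal sketch of edge padding (dilation): each pass, empty texels that
# touch a filled texel copy that value, pushing island colors outward so
# texture filtering and mip-mapping don't sample the background.

def dilate(grid, passes):
    h, w = len(grid), len(grid[0])
    for _ in range(passes):
        out = [row[:] for row in grid]
        for y in range(h):
            for x in range(w):
                if grid[y][x] is not None:
                    continue
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and grid[ny][nx] is not None:
                        out[y][x] = grid[ny][nx]
                        break
        grid = out
    return grid

# A 1-texel "island" in a 5x5 empty map, padded by 2 texels.
tex = [[None] * 5 for _ in range(5)]
tex[2][2] = "N"
padded = dilate(tex, passes=2)
print(padded[2][0])  # 'N' - reached by the second pass
print(padded[0][0])  # None - corner is outside the padding radius
```

Real bakers do the same thing per channel with proper distance handling; the principle is identical.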
  • EarthQuake
    .Wiki wrote: »
    Have you set enough padding in your baking setup? [...] Scale them so that every piece of your mesh has the same texture resolution.

    Pretty much all of this.

    Your UVs need to be straightened; otherwise you need a lot more texture resolution to represent the same information. You only need one row of pixels to represent a straight line, but two or three to represent an angled line. Pixels are square, so make sure your right angles are square in your UV map as well.

    The edges of your high poly are also too tight. Soften them up; again, you need a lot of texture resolution to represent such small bevels.
  • 3DKnight
    Offline / Send Message
    3DKnight polycounter lvl 17
    Hey guys, I've been seeing a lot of baking questions, so I thought I would share a doc I made to help my team clear up confusion about the baking process and all the variables you have to think about. After years of baking I'm sure everyone has their own process, but I found this to be the most efficient way to teach new bakers how to start.

    It also includes the rough approval process we used, as well as checks to think about before baking. This helps avoid a lot of custom unwrapping rework. Nothing sucks more than spending time doing a great unwrap, then having to make major changes to the model when it goes out for approval (which happens more often than not). I've found that quick atlas unwraps and bakes are a good way to quickly see how your high-res is reacting with your low-res topology.

    However, it mainly focuses on the use of Max/ZBrush, xNormal and Unreal... though if you bake in object space and use Handplane, the engine won't really matter.


    BakingWorkflow2.jpg

    I've also included some models the team tested out to show some examples of problem areas and the fixes to remove the seams. Might be a little hard to see the seams on the examples, Here is a link to a larger res if you want more clarity!

    HUGE SIZE

    Let me know your thoughts and suggestions/issues!
  • Chase
    Offline / Send Message
    Chase polycounter lvl 9
    Hey, thanks for the post, Knight. What do you mean by a quick atlas unwrap?
  • 3DKnight
    Offline / Send Message
    3DKnight polycounter lvl 17
    Hey Chase, sorry, my terminology is a bit different. Basically just a quick auto unwrap. You can do a quick one with Roadkill as well, with a few quickly placed seams. The goal is not to spend too much time on the unwrap before the model is approved :)
  • joeriv
    Offline / Send Message
    joeriv polycounter lvl 7
    I had a look at it; one thing is a bit confusing to me.
    On the left you have "A", which is in one smoothing group (which probably also means the unwrap is stitched together as much as possible) and has 70 verts, which should make it the option with the fewest verts.

    But then option C, which seems to have some hard edges, has 68 verts.

    So assuming A doesn't have unneeded UV splits, I don't understand how it has 2 more verts. I'd expect A and C to be equal in the best case (same UVs, C only having hard edges on UV splits).
  • 3DKnight
    Offline / Send Message
    3DKnight polycounter lvl 17
    Oh, you're right, joe, good catch! I think I got those two flipped :)

    There is a point where the two can be pretty similar in Max vert count. I should update the jpg to show the ENGINE vert count, including all the verts duplicated for UV island and smoothing splits!

    I'll do that tonight :) Thanks for the feedback.
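(Editorial aside: the engine vert count 3DKnight mentions can be estimated mechanically. A GPU vertex is a unique (position, normal, UV) combination, so hard edges and UV seams both duplicate the vertices along them. A rough sketch with made-up data:)

```python
# Rough sketch of an "engine" vertex count: GPUs index unique
# (position, normal, uv) tuples, so every hard edge (split normal) and
# every UV seam (split uv) duplicates the vertices along it.

def engine_vert_count(corners):
    # corners: one (position_id, normal, uv) tuple per triangle corner.
    return len(set(corners))

# A quad split into 2 triangles, fully smooth, one UV island:
# 4 positions, each with a single normal and uv -> 4 engine verts.
smooth = [(p, "n_avg", ("u", p)) for p in (0, 1, 2, 0, 2, 3)]
print(engine_vert_count(smooth))  # 4

# Same quad with a hard edge between the triangles: corners on the shared
# edge (positions 0 and 2) carry two different normals -> 6 engine verts.
hard = [(0, "nA", ("u", 0)), (1, "nA", ("u", 1)), (2, "nA", ("u", 2)),
        (0, "nB", ("u", 0)), (2, "nB", ("u", 2)), (3, "nB", ("u", 3))]
print(engine_vert_count(hard))    # 6
```

This is also why a hard edge placed on an existing UV seam is "free": the verts were already duplicated by the UV split.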
  • mister_s
    So there are times when you have to sacrifice vertex count for a clean bake when working with hard surfaces where the normal map affects an edge? Am I correct? I'm sure there are exceptions, where a seam in the bake doesn't matter if you can't see it, but generally there is no other way to get a clean bake at smoothing group splits?
  • 3DKnight
    Offline / Send Message
    3DKnight polycounter lvl 17
    Not sure I follow your question, but if you follow the simple rule of one smoothing group per UV island, you should get seamless bakes.

    I've found that with xNormal, outside of one-smoothing-group, one-UV-island bakes, you NEED to bake with a cage. But that's just me.
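(Editorial aside: the "one smoothing group per UV island" rule can be checked in a script: every hard edge should coincide with a UV seam, while extra UV seams on soft edges are harmless. A minimal sketch with hypothetical edge lists:)

```python
# A minimal checker for the "one smoothing group per UV island" rule:
# every hard edge must also be a UV seam, or the bake will show a
# visible line there. (UV seams that are NOT hard edges are fine.)

def rule_violations(hard_edges, uv_seams):
    # Edges are (vert_a, vert_b) pairs; vertex order shouldn't matter.
    seams = {frozenset(e) for e in uv_seams}
    return [e for e in hard_edges if frozenset(e) not in seams]

hard_edges = [(0, 1), (4, 5)]
uv_seams = [(1, 0), (2, 3)]   # (0, 1) is split in the UVs, (4, 5) is not

bad = rule_violations(hard_edges, uv_seams)
print(bad)  # [(4, 5)] - a hard edge with no matching UV split
```

Scripts like TexTools' "smoothing groups from UV shells" effectively enforce the same invariant from the other direction.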
  • bharatnag
    So I use Blender for my regular 3D work and have no idea how to create a cage mesh for baking inside Blender. I used TopoGun to bake out some textures, as it gives me a cage option to tweak, and I solved most of the problems that I was facing (http://www.polycount.com/forum/showthread.php?t=124843) with TopoGun, but it comes with its own little problems: the bakes look noisy and fragmented in some places. It's easy for me to go into Photoshop and correct them, but I want to know what the possible reasons for these artifacts might be.
    topogun%20bake%20problem.jpg
  • metalliandy
    Offline / Send Message
    metalliandy interpolator
    bharatnag wrote: »
    So I use Blender for my regular 3D work and have no idea so as how to create a cage mesh for baking inside blender. [...] I want to know what might be the possible reasons for these artifacts.

    I'm glad you posted to ask for help with this, as the last thing you should ever do is edit your normal maps in Photoshop to remove stuff like this. It opens a huge can of worms and can really mess your bake up :)

    For the custom cage, I use this method.
    1. Triangulate your LP mesh using the Triangulate modifier set to Beauty. If you have already triangulated your mesh you can skip this step.
    2. Set up your smoothing groups/hard edges in Blender and add the EdgeSplit modifier to the modifier stack. Set this to 'Sharp Edges'.
    3. Make a duplicate of your LP mesh (without the Edge Split modifier applied, though still active in the stack)
    4. Add a solidify modifier to it and move it above the edge split modifier in the stack.
    5. Uncheck "Fill Rim" and check "Even Thickness" and "High Quality Normals"
    6. Change the "Thickness" to a negative number, so that it covers your HP mesh entirely.
    7. Apply the Solidify modifier
    8. Go into Edit mode and select one of the faces of the outermost mesh.
    9. Invert the selection and delete the inner mesh.
    10. Go back into Object mode and apply the Triangulate & Edge Split modifiers.
    11. Export the mesh with UVs and normals.
    12. Done!
    Make sure that "Keep Vertex Order" is checked in the OBJ export settings or you will get an error saying that the cage doesn't match the LP mesh.

    That should export a perfect cage mesh with all the smoothing groups intact, and you should now get a perfect bake.

    I have seen people say that they use the Displace modifier on the LP mesh to get a cage, but the results of the push are not as accurate, so I wouldn't recommend it.
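(Editorial aside on why the naive push is less accurate: a push-style cage just offsets each vertex along its averaged normal, so at corners the resulting distance to the original faces shrinks. A tiny sketch in plain Python with a single made-up corner vertex:)

```python
# Sketch of the simple "push" cage: offset each low-poly vertex along its
# averaged vertex normal. An even-thickness solidify compensates for the
# angle between faces; this naive push does not, which is why it can
# under-cover tight corners.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def push_cage(verts, vert_normals, distance):
    # Offset every vertex along its (normalized) averaged vertex normal.
    cage = []
    for p, n in zip(verts, vert_normals):
        n = normalize(n)
        cage.append(tuple(p[i] + n[i] * distance for i in range(3)))
    return cage

# A right-angle corner vertex whose averaged normal is the diagonal
# between a +X face and a +Y face:
verts = [(0.0, 0.0, 0.0)]
normals = [(1.0, 1.0, 0.0)]
cage = push_cage(verts, normals, 0.1)
print(cage[0])
# The vertex moves 0.1 along the diagonal, but its distance to each
# original face plane is only 0.1 / sqrt(2) ~= 0.071, so the naive push
# under-covers tight corners - the inaccuracy being warned about above.
```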
  • bharatnag
    metalliandy wrote: »
    I'm glad you posted to ask for help with this [...] That should export a perfect cage mesh, with all the smoothing groups intact and you should now get a perfect bake.
    Hey, thanks for the help. OK, so are you completely sure that this is a cage problem? Because I've tried tweaking the cage values in TopoGun and get the same result.
  • metalliandy
    Offline / Send Message
    metalliandy interpolator
    I guess the only way to find out is try a bake in xNormal. ;)

    Seriously though, it looks to me like one of the following:
    1. The cage is intersecting with the HP mesh.
    2. The mesh has been decimated and some of the verts are in positions that cause the edges to overlap in an odd way (a concave face).
    Make sure the cage fully encompasses the HP mesh with zero intersections.
  • EarthQuake
    1. It has been decimated and some of the verts are in such a position that

    Yeah, actually, it looks a lot like this, more so than the cage thing, because with intersections from a poorly set up cage you generally won't get a fine noise pattern like this; larger uniform chunks will usually be missing instead.

    I've had similar-looking results with poorly decimated meshes, though.
  • bharatnag
    metalliandy wrote: »
    I guess the only way to find out is try a bake in xNormal. ;) [...] Make sure the cage fully encompasses the HP mesh with zero intersections.
    OK, I'll give this a try too. Well, xNormal and Blender Internal are really not working for me here, giving me this result: http://www.polycount.com/forum/showthread.php?t=124843

    But TopoGun works just fine, and I don't get it; changing the application changes the result. I really don't understand why a DECIMATED cage is required, and many other things. Are there any written resources to study as far as baking and maps go?
  • BeachBum
    Offline / Send Message
    BeachBum polycounter lvl 4
    I'm having trouble with my normal bake.

    http://www.flickr.com/photos/14183466@N02/9581023088/

    I know I have a pole, but I thought that if you baked and projected the HP to the LP unwrap, it would not cause this. I have the LP as one smoothing group. Is there something I am doing wrong to cause this?
  • Quack!
    Offline / Send Message
    Quack! polycounter lvl 17
    Gradients in your low poly like that get translated into your normal map bake. To fix that in an unsynced workflow you need to make your low poly match your high poly's smoothing a bit more closely. You can do this by adding more geometry through chamfers, or by splitting up your smoothing groups instead of using one.

    I recommend that, if this is a hero/portfolio object, you just spend the extra triangles and add some smart chamfers to ease the smoothing.