
Baked normals in different engines.

So I've been having a problem in Unreal with smoothing errors. It looks like Unreal is displaying my normal map much the way the Max viewport does: the big triangular smoothing artifacts that appear on the low poly in Max's viewport are still there on the final normal-mapped asset in Unreal. When I view it in the Marmoset viewer, though, it looks perfect.

I've also realized that Max's bakes appear to have an inverted green channel for some reason. A Max bake looked really bad in Marmoset until I flipped it; after that it looked more or less like the bake I got from xNormal. Does Unreal require this inverted green channel, or perhaps some other channel to be inverted? The problems I'm getting look like the ones I had in Marmoset before I inverted the green, which is why I ask.
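
For reference, flipping the green channel just means G becomes 255 - G per pixel. A minimal sketch with numpy/Pillow, in case it helps anyone (the filenames are placeholders, not my actual files):

import numpy as np
from PIL import Image

# Flip the green (Y) channel of an 8-bit tangent-space normal map to
# convert between the Y-up and Y-down conventions.
img = np.array(Image.open("bake_from_max.png").convert("RGB"))
img[:, :, 1] = 255 - img[:, :, 1]
Image.fromarray(img).save("bake_flipped.png")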

I've been doing my baking in xNormal with a high and low OBJ that I exported, but I noticed that it also accepts .ase, which is what I've been feeding UE3, so I thought I'd give that a go too. xNormal gives me this error, though:
Invalid function parameter 'msh': The mesh has vertex UV indices but the count is not the same than the position vertex indices. Please sure th mesh->UVIndices.size() is ZERO or the same than the mesh->PositionIndices.size() as specifyed in the SDK documentation.

...which I don't understand at all.

Why does Unreal give me shading errors while Marmoset displays everything correctly? Would getting my .ase file into xNormal and baking with that fix it?


[edit] Oops, I didn't mean to post this yet; I was still editing it into something legible. I'm installing the ActorX plugin now and I'm going to run some tests with it as well.

Replies

  • EarthQuake
    glib wrote: »
    So I've been having a problem in Unreal with smoothing errors. It looks like Unreal is displaying my normal map much the way the Max viewport does: the big triangular smoothing artifacts that appear on the low poly in Max's viewport are still there on the final normal-mapped asset in Unreal. When I view it in the Marmoset viewer, though, it looks perfect.

    This makes sense: xNormal computes tangents/binormals in a way that's closer to Marmoset's engine than to Max or UE3. Max bakes its normal maps to display correctly in the offline renderer, not in real time, which explains the smoothing errors in Max's realtime viewport. And UE3 probably displays them the same way Max's realtime shaders do (again, not matching Max's bakes).
    I've also realized that Max's bakes appear to have an inverted green channel for some reason. A Max bake looked really bad in Marmoset until I flipped it; after that it looked more or less like the bake I got from xNormal. Does Unreal require this inverted green channel, or perhaps some other channel to be inverted? The problems I'm getting look like the ones I had in Marmoset before I inverted the green, which is why I ask.
    Marmoset uses the Maya/xNormal convention for the green channel; afaik UE3 uses Max's.
    I've been doing my baking in xNormal with a high and low OBJ that I exported, but I noticed that it also accepts .ase, which is what I've been feeding UE3, so I thought I'd give that a go too. xNormal gives me this error, though:
    Invalid function parameter 'msh': The mesh has vertex UV indices but the count is not the same than the position vertex indices. Please sure th mesh->UVIndices.size() is ZERO or the same than the mesh->PositionIndices.size() as specifyed in the SDK documentation.

    This means your mesh doesn't have UVs, or your UVs are messed up. Check your export options, and make sure you don't have multiple UV channels in Max (it may be exporting the wrong ones).
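
    To make the error concrete, the invariant it's complaining about looks roughly like this (a sketch; the names are illustrative, not the actual xNormal SDK):

    # Per-corner UV indices must either be absent entirely or line up
    # one-to-one with the position indices.
    def validate_mesh(position_indices, uv_indices):
        if len(uv_indices) not in (0, len(position_indices)):
            raise ValueError("UV index count %d != position index count %d"
                             % (len(uv_indices), len(position_indices)))

    validate_mesh([0, 1, 2], [0, 1, 2])  # OK
    validate_mesh([0, 1, 2], [])         # OK: no UVs at all
    validate_mesh([0, 1, 2], [0, 1])     # raises: truncated/garbled UVs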

    The best solution would be to feed xNormal a format that stores the mesh tangents/normals/binormals exactly as they display in UE3. I'm not sure xNormal supports such a format, though.
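
    For anyone wondering why apps can disagree at all: the per-vertex tangent basis is derived from positions and UVs, and every implementation makes slightly different choices along the way. A common per-triangle derivation, as a sketch (apps then differ in how they average these across vertices, orthonormalize, and pick the bitangent sign):

    import numpy as np

    def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
        e1, e2 = p1 - p0, p2 - p0           # position edges
        du1, dv1 = uv1 - uv0                # UV edges
        du2, dv2 = uv2 - uv0
        r = 1.0 / (du1 * dv2 - du2 * dv1)   # assumes non-degenerate UVs
        tangent = (e1 * dv2 - e2 * dv1) * r
        bitangent = (e2 * du1 - e1 * du2) * r
        return tangent, bitangent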
  • glib
    Thanks for the reply, EQ. I know that Rikki used Max's bakes for his Eat3D broken pillar tutorial, and I seem to remember Kevin Johnstone or someone else at Epic mentioning 3ds Max bakes as well. That, plus the matching inverted Y/green channel in both Max's default settings and Unreal's coordinate system, makes me think maybe they do use Max bakes?


    Here are images that describe my issues a lot better:

    So the low-poly geometry has smoothing errors in the Max viewport. The normal map baked from it contains undulations that match these errors, and I assume they are equal and opposite, so they counteract the errors once the map is applied (I'm looking mainly at the big flat part of the main platform, which has those long triangular smoothing errors in it). In every case except the Unreal viewport, the mesh viewed without the normal map shows the smoothing errors, and applying the normal map corrects them. In the Unreal viewport the mesh appears *without* errors, so when the normal map is applied it tries to correct something that isn't there, and thereby *creates* the errors.
    [image: 2D59f.jpg]

    This is the frustrating part: the following two images show the Unreal material editor, first with the normal texture node disconnected, then connected. Here the smoothing errors are visible, and they are fixed when the normal map is applied. Why the hell does the viewport work differently from the material preview window!?

    [image: 2D5N.jpg]


    I know a lot of people use this engine; how does everyone else handle this? I couldn't find much on this issue in my Google travels.
  • SHEPEIRO
    Is it re-triangulating the mesh differently? Try triangulating it before export and see if that fixes the problem.
  • glib
    I don't think so; the section I've been trying to fix has already been manually triangulated:
    [image: 2DfWV.jpg]

    I double-checked by triangulating it, but the errors were still there.
  • Vailias
    Glib:

    This could have to do with Unreal's unit granularity. I recall that on export, verts are effectively snapped to a grid, albeit a fairly fine one; I think it was on the order of 100ths or 1000ths of a UU. That is still much coarser than a 3D app, where vertex positions are 32-64 bit floats. If the differences between vertices fall below that threshold, they will still show in Max (and potentially in other engines), but in UT the surface will be planar.
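
    In other words, something like this (the grid size is a guess, purely for illustration):

    import numpy as np

    GRID = 0.01  # hypothetical snap distance in Unreal units
    verts = np.array([[0.0, 0.0, 0.003],
                      [1.0, 0.0, 0.004]])   # sub-threshold height detail
    snapped = np.round(verts / GRID) * GRID  # both z offsets collapse to 0.0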

    Try this: export the model from Unreal back to an OBJ and load it into Max to see if the smoothing errors are still there. If not, you've found the issue. Make the faces planar and rebake.
  • Xoliul
    On another note, I would just add a measly 2 to 10 extra tris at the problematic part to make your life easier. You can more than compensate for those by reducing polys on those 6 little cylinders.

    edit: and even more off-topic: you could easily share UV space between those 6 identical parts; there's no reason to unwrap all 6 of them uniquely.
  • glib
    Vailias wrote: »
    Try this: export the model from Unreal back to an OBJ and load it into Max to see if the smoothing errors are still there. If not, you've found the issue. Make the faces planar and rebake.

    The OBJ that Unreal spits out looks like a single triangle, but it's actually 906 tris sitting in the exact same place. The UVs are totally messed up as well. Is this indicative of a problem, or does Unreal's exporter just suck?

    Xoliul: the problem is there are also smoothing errors on each of the 6 flat pieces that sit on top of the large platform, and those already have bevels to fix smoothing issues. I don't want to just keep adding geo until I'm back at my high poly; I shouldn't need to add any more, especially since it displays properly in another engine. This seems like an Unreal issue to me, but maybe I'm wrong. And yes, only one of those side pieces has a unique unwrap. If I ever get this working properly and do some tests, I may reduce it to only two unique unwraps and just move the other pieces around to mix and match. But first, the issue at hand.
  • Vailias
    O_o Hmm... I haven't seen Unreal's export function do that before.

    Would you mind posting the geo and normal map so I can mess with it in UT3 later on?
  • glib
    Vail: will do. Do you want my .max, my .obj from Max, or both?


    On another note, I know I can 'fix' this problem by assigning more smoothing groups to my mesh (i.e., hardening a bunch of edges). Here I ran an autosmooth at 30 degrees, then re-baked the normals:
    [image: 2GEL.jpg]

    The issues are visible in the blown-up section. This solves the large-scale smoothing errors at the expense of smaller-scale errors from all the hardened edges on the mesh. I consider this a last resort rather than a fix. The Epic guys always talk about thinking of your low poly as 'putty' that your normal map gives definition to; hard edges break this concept and really mess with the look, imo.
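
    (For reference, 'autosmooth at 30 degrees' just means any edge whose adjacent face normals differ by more than the threshold becomes hard; a rough sketch:)

    import numpy as np

    def edge_is_hard(face_normal_a, face_normal_b, threshold_deg=30.0):
        # Hard edge if the dihedral angle between the unit face normals
        # exceeds the autosmooth threshold.
        cos_angle = np.clip(np.dot(face_normal_a, face_normal_b), -1.0, 1.0)
        return np.degrees(np.arccos(cos_angle)) > threshold_deg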
  • Vailias
    OBJ, as I'm not sure I'll have the right version of Max. :)
  • Xoliul
    glib wrote: »
    On another note, I know I can 'fix' this problem by assigning more smoothing groups to my mesh (i.e., hardening a bunch of edges). [...]

    Read this:
    http://www.svartberg.com/tutorials/article_normalmaps/normalmaps.html
    Lots of people miss the basic concepts of smoothing groups and normal maps; these are especially important when working with mechanical stuff.
  • Snight
    On mechanical-type stuff, the only solution I have found is to break the smoothing groups where you have a UV seam, because you will have a seam there anyway. A lot of the time I will chamfer the low poly (if it's in your budget) and you won't notice this problem as much. Large flat surfaces are usually where you see the incorrect "bowed" shading, so I tend to only break the smoothing groups on large surfaces. On rounder, organic surfaces I rarely see this happening.

    I've always wondered if there was a better way of doing this for mechanical objects, because using multiple smoothing groups shoots your vert count through the roof and can get quite expensive. I'm hoping someone has a better solution to this problem.
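
    (The vert-count hit happens because a GPU vertex carries exactly one normal, so a vertex shared by faces in different smoothing groups has to be duplicated once per group. A toy count for a fully hard-edged cube, as a sketch:)

    corners, faces_per_corner = 8, 3
    smooth_vert_count = corners                   # 8: one averaged normal each
    hard_vert_count = corners * faces_per_corner  # 24: split once per face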
  • Ott
    As someone who is more than intimately familiar with this issue, I'm actually surprised it isn't brought up more here. I've seen it multiple times on various hard-surface stuff, and it usually just gets hack-worked around.

    Can it be seen when you finally get a texture on it? Ignore it.
    Can you add some chamfers and fix it? Do that.

    But today, a few of us here who were aware of it kicked around a few ideas and tested it, and I think we found what the issue is.

    Apparently, something wonky happens when you either:

    A) use Reset XForm on the mesh before exporting (this can visibly change the mesh inside Max);
    B) simply import it into Unreal after collapsing everything together; or
    C) simply attach pieces together inside Max.

    What seems to be happening is that attaching or resetting the XForm in Max alters the object's normals, and when you bring in an object that appears correct in Max, Unreal decides to "auto reset" it for you and breaks the mesh.

    The screenshot below shows my workaround. Long story short: reset the XForm BEFORE you bake, and it should mostly take care of the issues. In extreme cases, the only real workaround is to use different smoothing groups.

    [image: 25jcz75.jpg]
  • SHEPEIRO
    Interesting. WTF does the XForm do to the normals? That's pretty pants.
  • Pope Adam
    That's a pretty sharp observation, Ott. How'd you figure this out?
  • Ott
    I first noticed it when I would attach objects together and see a visual change in the low-poly mesh. When I rendered the attached/collapsed mesh in Max, the normals would look messed up. Send it to the engine: messed up. So I textured it and ignored it. The guys at work considered me crazy, because we could see it but had no idea how I'd done it. Insert "LearnToModelNoob" joke.

    Then, when using our custom exporter, which has a Reset XForm option, I would again notice a visual change in my meshes in 3ds Max after running the script. Again I thought it was an Unreal problem. We'd noticed it before and just adjusted smoothing groups instead.

    Realistically, it's rarely noticeable on small details, and you shouldn't be attempting 90-degree angles on large surfaces without chamfering anyhow, so I think it slipped past most of the artists. I'm usually given a pretty low budget for some of my props and have definitely seen it on several objects.

    We had a little downtime today, so we ran some tests. It looks like this fixes the biggest offenders so far.
  • Ben Apuna
    Nice catch Ott, thanks for sharing :) So many things to keep in mind when normal mapping stuff...
  • glib
    I appreciate the post, Ott, but that doesn't seem to be my problem. I double-checked and made sure to reset the XForm, but I get the same results.

    It just looks like I need to use much more geometry in Unreal than in other (better?) engines. A lot of what I had hoped to pick up with the normal map will instead need to be modeled in. Frustrating.
  • Snight
    @ Ott: Learn To Model Noob
  • Ott
    I appreciate the post, Ott, but that doesn't seem to be my problem. I double-checked and made sure to reset the XForm, but I get the same results.
    Did you basically start over with the low: reset the whole low poly, rebake, and re-import the new normal map and mesh? And still get it?

    If you don't mind posting the low/high poly Max files, it would be easier for me to check it out and test where you might be having problems. The only other time I've seen this happen is when you have overlapped UVs and you have to go to the low poly and turn edges for the collapsed/attached overlapping faces.

    I have a feeling this is what sometimes happens when you collapse the mesh, hence the reason I see a visual change in the geometry when collapsing/resetting the XForm... for no obvious reason.
  • glib
    Oops! I had forgotten that I switched the green channel from 'Down' to 'Up' on my normal bake in 3ds Max while I was testing in Marmoset, and forgot to switch it back. Once I inverted the green channel, I ended up with a normal map that works great in the model viewer and material editor windows, but spits out the same smoothing errors in the main viewport.

    This is infuriating. I'll post the normals, .max and .obj files tomorrow. I'm going to finish my drink and go to bed before I punch something.

    [image: 2IwB1.jpg]
  • CrazyButcher
    It's a known issue that 3ds Max's normal-baking results are meant for the offline renderers, which work slightly differently from the viewport renderer. Hence a 3ds Max bake will always look good in the "renderer" (i.e., the material editor or a straight rendering) while having glitches in the realtime viewport. And sadly, custom shaders can only do so much, as they currently rely on Max feeding them certain attributes used for tangent creation.
  • JordanW
    OK, without having your mesh myself and knowing what exactly is causing your problem, I can't guarantee that I know what will fix it, so I'm just going to spit out a lot of things.

    1. Yes, reset your XForms; I thought everyone knew this about making any art in Max whatsoever. Basically, if you ever scale ANYTHING in Max at the object level, or mirror it, it will screw up your normals. It has something to do with the normals keeping their information from the original scale while the geometry moves (see the sketch at the end of this post).

    2. I've said this in other threads, but I really do not trust 3ds Max to bake my normals. I haven't baked a normal map in Max in 3 years, and my life has been easier because of it. It will crash when you get lots of geometry, and it just doesn't bake normals well. (Obviously this is my opinion, and there are other people out there who use Max just fine; I just find it a pain to get it to do what I want.) I recommend xNormal for baking normals; it's faster and my results are more consistent.

    3. Having said that, I export an .obj from Max to xNormal. I make sure my whole object is one smoothing group. (This is an area of debate in other threads, but in Gears 2 all of my low-poly meshes use one smoothing group and I'm happy with my results.)

    4. I don't think your base low-poly geometry is helping matters any. There's a lot of waviness in the smoothing, and you could help the renderer get past some of your problems by adding more supporting geometry, maybe a ring of edges around the top of your mesh. Just try to get rid of those big dark triangles.

    5. When you bake, are you exploding your mesh or baking it all together? You should really be exploding if you aren't. It'll make processing a lot easier.
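
    On point 1, the underlying math (a sketch, not Max's actual code): normals don't transform like positions; they need the inverse-transpose of the object's matrix. A leftover non-uniform scale or mirror in the transform, which is exactly what Reset XForm bakes away, therefore skews them:

    import numpy as np

    M = np.diag([2.0, 1.0, 1.0])                 # leftover non-uniform scale
    n = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)   # a 45-degree surface normal

    wrong = M @ n                   # transformed like a point: visibly skewed
    right = np.linalg.inv(M).T @ n  # correct normal transform
    right /= np.linalg.norm(right)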
  • HAL
    1. Listen to Xoliul and use smoothing groups; you just have to use unique UVs (split the UVs along the hard edges).
    And yes, this works perfectly.

    2. In addition, use exploded baking like JordanW suggested.
  • EarthQuake
    Alright, since there has been much discussion of "which is better, one smoothing group or hard edges?", I'd like to write a little something on the topic.

    First off, it's important to understand why these issues arise in the first place. The recurring theme here is that none of this stuff is standardized; pretty much every baker, engine, and file format calculates tangents/normals/binormals slightly differently. Now, a few apps/engines have actually gotten this right. Doom 3 was great in this regard, for the most part, if you rendered your maps inside of Doom 3. And if you're rendering and viewing inside of Maya, your results will be nearly perfect without much extra work.
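
    Even something as basic as averaging a vertex normal isn't standardized. Whether face contributions get weighted by area, by corner angle, or not at all varies from app to app; as a sketch:

    import numpy as np

    def vertex_normal(face_normals, weights=None):
        # weights can be per-face areas, corner angles, or None (uniform);
        # every baker/engine makes its own choice here, which is one
        # source of the app-to-app deviation.
        n = np.average(np.asarray(face_normals, dtype=float), axis=0,
                       weights=weights)
        return n / np.linalg.norm(n)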

    OK, so now that we know the issue is slight deviation from app to app, we can explain why the two accepted methods tend to help.

    The greater the change across your normals, the more apparent any deviation is going to be. So the smoother/flatter your normals are, the less likely you are to have visible errors.

    When you're adding geometry, beveling edges, etc., what you're really doing is softening the normals, making the transitions less harsh. This helps with smoothing errors, and it also helps the bake come out more accurately (less skewing, etc.) because the normals are closer to the actual surface. This is of course the ideal solution, but far from the only thing that should be considered.

    When you add in hard edges, what you're doing is breaking the normals, and this gives you pretty much the same result as adding more geometry, in that it makes a more "suitable" surface for displaying a normal map that is essentially created incorrectly (mathematically). The main problems here are: 1. you need to split your UVs along the same edges that you make hard, which isn't really a huge deal if you're aware of it going in; and 2. the less geometry you have in your cage (assuming your cage is averaged and not using the split edges, which would give you even more problems), the less accurate your bake is going to be. So small details may well end up projected skewed, etc.

    Now, the really important thing is that you need to take various factors into account: what sort of budget you're dealing with in the engine, how important the object is, etc. It's silly to say "always add more edges", because that simply isn't always an option. It's also silly to say "always add hard edges whenever you have problems". You need to understand how this stuff works and make these decisions on a case-by-case basis.
  • glib
    I was slowly typing out a response to everyone who offered advice when I stumbled on the issue: Unreal doesn't take the material into account when lighting! I thought I had checked this when I viewed the 'lighting only' mode (as posted in that large image above), but it turns out that mode isn't the lightmap on a flat-shaded model; it's the lightmap on the shaded model. I only realized this when I clicked on the package where these lightmaps were stored and saw what they looked like:
    [image: 2KrX.jpg]

    So what must have been happening is this:
    - the base mesh has smoothing issues
    - the baked lighting takes these into account, effectively correcting the smoothing issues as it lights the surface
    - the normal map then tries to correct smoothing issues that have already been fixed, and thus re-creates them
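
    A toy Lambert-only illustration of that double correction (one light pointing straight down the true surface normal; the numbers are purely illustrative):

    import numpy as np

    light = np.array([0.0, 0.0, 1.0])
    flat = np.array([0.0, 0.0, 1.0])     # true surface normal
    bent = np.array([0.5, 0.0, 0.866])   # erroneous smoothed vertex normal

    # The bake stores the correction that takes `bent` back to `flat`.
    # Applying that correction to shading that already uses `flat` tilts
    # the normal past flat, so the original error reappears, mirrored:
    overcorrected = 2.0 * np.dot(bent, flat) * flat - bent
    print(np.dot(flat, light), np.dot(overcorrected, light))  # 1.0 vs ~0.87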

    This explains why the mesh looks okay in the static mesh viewer and the material editor preview windows: they must use a different lighting system (the material editor preview would be rather useless if it *didn't* take the material into account when lighting). It also explains why I would need to use hard edges/smoothing groups before baking for Unreal, since this base geo is what the lighting is generated from. Finally, it explains why the base geo with no normal map applied looks different in Unreal versus the Max viewport or other engines.

    So with this in mind, I deleted the point light that was previously the sole light in the scene and added a skylight (i.e., an ambient light). The result was exactly the same as what I'd seen in the material and static mesh editors:
    [image: 2Ktof.jpg]


    So now that I've figured out what the issue is, I'm not quite sure how to fix it. I guess I'm stuck with the lighting the way it is, despite it not making sense to me. I suppose this is why I'd need to use smoothing groups and hard edges: to get a more consistent result with Unreal's inconsistent lighting setup. On a side note, is this per-vertex lighting when I was expecting per-pixel?