
normal mapping for games

ZippZopp polycounter lvl 12
hey guys, quick question. I come from a film background and haven't done any game work, but I was wondering about normal mapping for games. I know that with games, UV and texture space must be optimized, so a lot of the time the left and right arm will have the exact same UVs. How does this affect normal map generation in a package like ZBrush or Mudbox? What is the general workflow for the normal map process on a game model?

Replies

  • Vitor polycounter lvl 18
    Usually you hide/detach the mirrored UV parts of the model so you don't get it all messed up. Once you're done with the normal mapping and you reattach the whole model, there shouldn't be any problem, as most (I'd almost say all) normal map shaders/game engines support mirrored UVs plus normal maps.

    As for the general workflow on normal mapping: after modeling the high-poly model, I prefer to get the low-poly version done, well optimized, and UV it. Run your favourite normal mapping tool (mine is Render To Texture or xNormal) and voilà.
  • Mark Dygert
    If you start out with shared UVs and later want to paint different details on the mirrored pieces in ZBrush or Mudbox, you'll have trouble.
  • rawkstar polycounter lvl 19
    If the left and right arms have the same UVs and you bake your normal map all at the same time, it'll project one on top of the other and you'll get a total mess. If you want mirroring and your high-poly geo is the same, you will have to offset one arm's UVs so it sits outside the 1x1 UV square, or just scale it down into a corner somewhere.

    DO NOT detach or delete the mirrored parts, because you want the smoothing to interpolate over the whole model. If you delete one side, the normals won't be correct, you'll get seams, and it won't look right from certain angles.
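
    To make the offset trick concrete, here's a minimal Python sketch of the idea (the flat uvs list and face tuples here are hypothetical, just to show the bookkeeping; in practice you'd do this with your app's UV tools):

    ```python
    # Hypothetical sketch: shift the mirrored half's UVs by a whole unit in U
    # so the baker (which only samples the 0-1 square) skips those faces,
    # then shift them back afterwards.
    def offset_uvs(uvs, faces, du=1.0):
        """Offset the U coordinate of every UV index used by the faces."""
        for i in {i for face in faces for i in face}:   # unique indices only
            u, v = uvs[i]
            uvs[i] = (u + du, v)

    uvs = [(0.1, 0.2), (0.3, 0.2), (0.2, 0.5)]   # made-up UVs for one arm
    mirrored_arm = [(0, 1, 2)]                   # faces of the mirrored copy

    offset_uvs(uvs, mirrored_arm, du=1.0)    # move outside 0-1, then bake
    offset_uvs(uvs, mirrored_arm, du=-1.0)   # snap back after the bake
    ```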

    general workflow is:

    Build a base model for Mudbox/ZBrush sculpting (or use one you already have) and do all the detail work you need in Mudbox/ZBrush. Then build a game-ready mesh on top of that, mostly because the topology for ZBrushing/mudding is going to be way different from the topology optimized for games. Then unwrap the game model and project the normal maps, either in Mudbox or in another application like Max; some people use xNormal (I haven't tried it, so I can't say).

    I've gotten good results out of Mudbox. ZBrush is good if it's all one mesh, like say an organic body for a humanoid creature, but if your game model isn't just the lowest resolution of your high-poly model, I wouldn't recommend ZBrush; the same goes if you have a bunch of extra pieces you built with subds, etc.

    It also all depends on where you're going to render it and where it's going to be presented. If you're going to make your final presentation in Max, then Render To Texture it out of Max. Some game engines have normal mappers built in; the Doom 3 engine has renderbump, and that's the best way to generate normal maps for it: it looks the best and usually totally eliminates seams.

    What sucks about normal maps is that there isn't one clear standard; every app and every game uses different code for generating and rendering them. So you can render a normal map in Max and it'll look fine in Max, then you put it in a game and it looks horrible. And there are differences beyond just having the Y axis (green channel) facing different directions: apps differ in how they render normal maps in general, so in some you'll get horrible seams no matter what, while others will handle the same map without seams; and if you then put that normal map into a game engine whose code handles normal maps differently, it'll look different again. In general, every normal map will look about right just about anywhere, but every app has its own subtle differences, and sometimes that's just enough to aggravate the hell out of you.
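
    For the Y-axis difference specifically, converting a map between the two green-channel conventions is just an invert of that channel. A minimal sketch, assuming Pillow and NumPy are available (the file names are made up):

    ```python
    # Convert a normal map between Y-up and Y-down conventions by
    # inverting the green channel (tangent-space y -> -y).
    import numpy as np
    from PIL import Image

    pixels = np.array(Image.open("normal_map.png").convert("RGB"))
    pixels[..., 1] = 255 - pixels[..., 1]   # invert green
    Image.fromarray(pixels).save("normal_map_flipped.png")
    ```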

    Oh, and in Mudbox, do NOT invert the binormal. That'll basically swap your red and green channels and invert them. It's bad. Real bad.
  • Mark Dygert
    rawkstar, is it possible to assign the mirrored parts to another material ID so they output to another file instead of stacking? Or would that still produce seams? And by moving one piece of the mirror off the UV area, do you need to line it up as if the texture were tiled?
  • Fuse polycounter lvl 18
    rawkstar, do you mean to UV both arms on the low poly, and then, after baking the normals, delete and mirror the arm?
  • EarthQuake
    No. He means UV both of the arms onto the same UV space, duplicate the model, scale down the UVs on one of the arms so they don't trace onto each other, and then render your normal map. Then just apply it to the original model.

    I think I've gotten away with mirroring things in xNormal and not even worrying about messing with the UVs at all. It just worked. I could be completely off here, but I vaguely remember doing that once.
  • rawkstar polycounter lvl 19
    what EQ said.

    Offsetting is the easiest way I've found. I use the chuggnut UV tools, so all I have to do is offset by 1 on the U. It doesn't matter whether it tiles or not, because when the normal map is generated it only uses the 1x1 square and nothing outside of it. You technically don't need to clone the model either; you can just select the offset piece and repeat the offset with a negative value after you generate the normal map, to snap it back.
  • Eric Chadwick
    I don't worry about snapping the mirrored UV pieces back... they're still mapped correctly as long as they were offset in whole units. Textures tile by default, right?
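
    (A quick way to convince yourself: with tile/repeat addressing a sampler effectively takes UVs modulo 1, so whole-unit offsets land on exactly the same texels. A tiny sketch:)

    ```python
    # With repeat (tile) addressing, a sampler wraps UVs into 0-1,
    # so offsetting by a whole unit changes nothing at render time.
    def wrap(u):
        return u % 1.0   # Python's % keeps negative inputs in [0, 1)

    for u in (0.25, 0.75):
        assert wrap(u + 1.0) == wrap(u) == wrap(u - 1.0)
    print("whole-unit offsets sample identically")
    ```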

    I learned that the differences in how programs render each other's normal maps exist because a normal map relies on more than just its pixels. The real-time lighting code in your game also needs to generate special per-vertex tangents (bitangents, binormals, etc.).

    These are used to transform the light from world space down into the local tangent space the normal map was created in; then, finally, the per-pixel normals affect the lighting direction. Programs calculate these vertex tangents differently, so apparently, for the best normals, your game should use the same vertex-tangent generation code that your normal map baker used (or the exporter should grab and export the tangents themselves).
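
    To make that dependency concrete, here's a rough sketch of one common per-triangle tangent/bitangent derivation from positions and UVs. This is just one formulation, not any particular app's code; bakers and engines differ in exactly these details, plus how they average and orthogonalize per vertex, which is why maps travel badly between apps.

    ```python
    # One common per-triangle tangent/bitangent derivation (a sketch).
    # Real tools also average per vertex, orthogonalize against the normal,
    # and pick a handedness; they differ in all of those steps.
    import numpy as np

    def triangle_tangent_basis(p0, p1, p2, uv0, uv1, uv2):
        e1, e2 = p1 - p0, p2 - p0            # position edges
        du1, dv1 = uv1 - uv0                 # UV edges
        du2, dv2 = uv2 - uv0
        r = 1.0 / (du1 * dv2 - du2 * dv1)    # inverse of the UV determinant
        tangent = (e1 * dv2 - e2 * dv1) * r      # direction of +U on the surface
        bitangent = (e2 * du1 - e1 * du2) * r    # direction of +V on the surface
        return (tangent / np.linalg.norm(tangent),
                bitangent / np.linalg.norm(bitangent))

    # A right triangle mapped 1:1 to UV space: tangent is +X, bitangent is +Y.
    p = [np.array(v, float) for v in [(0, 0, 0), (1, 0, 0), (0, 1, 0)]]
    uv = [np.array(v, float) for v in [(0, 0), (1, 0), (0, 1)]]
    print(triangle_tangent_basis(*p, *uv))
    ```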
  • CrazyButcher polycounter lvl 18
    Eric: and even then you can get problems (as in Max), because the interpolation along a triangle might differ. Max generates normal maps that it renders correctly (offline renderer and the normal map material), but that it shows with seams and so on in the viewport renderer. Which sucks, because on export you only get the same "game-type" interpolation and values that Max feeds to the viewport. Well, I guess that's when people write the tracer themselves. It's just a bit dumb of Autodesk that they don't push more toward the game stuff, like the UV-channel access in viewport shaders and the other things we discussed. It's still as if Render To Texture were only meant to reduce offline render times, instead of being made more game-tech focused; or you rely on 3rd-party software, which also adds to the costs. </rant>
  • EarthQuake
    We created a custom importer for xNormal that lets us take the actual game mesh exported from Maya and render our normals using that. Works perfectly every time!
  • Eric Chadwick
    Cool. So you guys only use xNormal for generating your normal maps? No ZBrush/Modo/Max/Maya/etc.?