Toying around with normal map approaches

kodde polycounter lvl 14
Didn't want to resurrect the old "Approach to techy stuff" thread so I'll post it here. Please do add your ideas and views on this subject.

I was tempted to try this shape myself to see what I got working best. Baked with Maya's Transfer Maps.

So what really is the best approach for a surface like this?

nmapwall1.jpg

nmapwall2.jpg

Replies

  • MoP
    Far right one is what I'd use wherever possible. Not too many polys, no vertex normal splits, and the normalmap looks flat and clean.

    I would not bevel something like the one 2nd from left unless it's going to be really huge or in the player's face all the time: you don't get a much better normalmap result, in fact it can be harder to UV-map too, and you waste polys.

    There is a time and a place for geometry like that, though.
  • Mark Dygert
    Far left if you plan to take it into a sculpting app. Otherwise it's just a waste for an object that simple.

    If all you were doing was beveling the edges, I wouldn't use a normal map for that. Adding bevels to the geometry isn't nearly as costly as it once was, especially on big objects like the edges of buildings. If you can eliminate one texture from going through the pipeline by adding a few polys to the geometry, then it's a smarter move resource-wise.
  • MoP
    Vig: Haha, the far left one was his highpoly source object for baking, not any lowpoly...
  • Mark Dygert
    o.O

    hahaha I wasn't commenting on which would be best. Oops! I agree, far right for that. Or possibly the second from the left, depending on the object's place and prominence in the world.

    I was trying to say I would only use far left as a high poly if I planned to take it into a sculpting app. It's a huge waste to sub-d something that much just for a few bevels... and it's even more of a waste to normal map something just for a bevel, when you might be able to get away with tossing a few more polys on the object.

    But yea, to actually answer the question, far right. But make the most out of your normal map, toss in more stuff than just a weak bevel!
  • timwiese
    Depends where you are rendering it. If this is just going into a scene that is rendered in your 3d app, then the far right one is the best.

    But if you plan on taking the object into the UT3 engine, those hard edges with smoothing groups applied across them wouldn't light properly. So the two middle ones would work best for that.
  • Chai
    Ehm, that just doesn't make any sense.
    If you have all edges soft (1 smoothgroup) you're going to get smoothing errors with the one at far right - what you posted here looks like a render rather than realtime.
    If you bevel edges like the 2nd one from the left, you won't get many smoothing errors, however the mesh will still look a bit funky in a lot of engines.
    If you have hard edges (multiple smoothgroups), you'll be best off using the approach in the image below, and you'll have to break the UVs into islands to avoid seams. (Though the added edges are not likely to affect performance, it still doesn't hurt to have a clean mesh.)

    Hardsurface meshes with normalmapping are cool, but smoothing errors hinder things unfortunately.

    screenshot392qs6.jpg
  • EarthQuake
    Really, the far right one? I don't know of any engine that will actually give you results that good. What sort of shader are you using? One thing you need to make sure to do is apply some high spec to these, which oftentimes will show some glaringly obvious errors that you would miss with simple diffuse + normals.
  • kodde
    These are screengrabs from Maya 2009, High Quality Rendering in the viewport. Doesn't this make them "realtime"?

    They are using a default Lambert material, meaning no spec. One directional light is lighting all of them.

    Oh and of course the 3 to the right each have a bump2D node + file node with normal maps that are generated by comparing them to the far left one.
  • EarthQuake
    Here's pretty much the same example, using a realtime shader in Max, which should be fairly representative of what you would actually see in a game engine. You can see that only the example using hard edges really gives you a clean result; the other two examples have pretty obvious smoothing errors.

    #3 isn't absolutely terrible, but at the cost of extra polys, and worse results, I wouldn't really recommend using it. You'll have to add a lot more edges to get a result comparable to using smoothing groups, and at that point it really isn't worth it.

    nmtests.jpg
  • rasmus
    Definitely an issue that should get more attention! Basically I'm with Chai on this one... I guess it could be argued that mapping the all-smoothed version as one continuous element would eliminate draw calls as opposed to the separate UV elements of the hard-edged one, but personally I'm still reluctant about taking something that low poly and smoothing it all over - call me old-fashioned. Light also has a tendency to "spill over" this way from angles where it really shouldn't be able to.

    EDIT: Touche, EQ.
  • pior
    That really makes me wonder why object-space normal maps are not more widespread. They are basically perfect, just need a simple flag system for symmetry, and look as good as example 1 for a fraction of the time spent (no splitting required)...
  • EarthQuake
    pior wrote: »
    That really makes me wonder why object-space normal maps are not more widespread. They are basically perfect, just need a simple flag system for symmetry, and look as good as example 1 for a fraction of the time spent (no splitting required)...

    I personally blame the widespread misconception that models with OS maps cannot be animated or deformed - neither of which is true.
  • kodde
    pior wrote: »
    That really makes me wonder why object-space normal maps are not more widespread. They are basically perfect, just need a simple flag system for symmetry, and look as good as example 1 for a fraction of the time spent (no splitting required)...

    Flag system as in making it possible to reuse texture areas with overlapping UVs? Like a flag for each UV shell?
  • EarthQuake
    kodde wrote: »
    Flag system as in making it possible to reuse texture areas with overlapping UVs? Like a flag for each UV shell?

    We've discussed this a few times here, and I think there are a few different ways to do this. One of the simplest, imo, is to just offset the mirrored UVs out of the 0-1 range, thus making it easy for the shader to "tag" them as mirrored.

    Another would be to use a different material on the mirrored half that has a tag to mirror? This wouldn't be very efficient.

    And a system I've talked to CrazyButcher about a few times: a sort of hybrid tangent/OS system, where you use the tangents to figure out the direction/rotation/etc. I think that could be a very robust system if it's possible.
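EarthQuake's first idea, tagging mirrored shells by pushing their UVs out of the 0-1 range, can be sketched in a few lines. This is a rough illustration only (Python standing in for shader code; the shift-by-one-along-U convention, the function names, and the toy texture fetch are all invented for the example):

```python
def sample_normal(u, v, fetch):
    """Sample a tangent-space normal, un-mirroring if the UV was offset
    out of the 0-1 range to tag it as mirrored (hypothetical convention:
    mirrored shells are shifted +1 along U)."""
    mirrored = u > 1.0
    if mirrored:
        u -= 1.0          # wrap back into the 0-1 range for the fetch
    nx, ny, nz = fetch(u, v)
    if mirrored:
        nx = -nx          # flip the X component for mirrored geometry
    return nx, ny, nz

# toy texture fetch: constant normal leaning slightly +X
fetch = lambda u, v: (0.5, 0.0, 0.866)
print(sample_normal(0.25, 0.5, fetch))  # → (0.5, 0.0, 0.866)
print(sample_normal(1.25, 0.5, fetch))  # → (-0.5, 0.0, 0.866)
```

The appeal is that no extra vertex data or material split is needed; the UV coordinate itself carries the mirror flag.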
  • kodde
    In theory, shouldn't a normal map solve the crappy smooth shading you get with any of these versions?

    I mean, some of them which share the same smoothing areas look bad due to the lack of polygons and extreme angles, right? When these are compared to a highres perfectly smoothed version, shouldn't the normal map neutralize the smoothing errors on the lowres version by adding the correct color variation?
  • kodde
    The OS version with flags sounds really interesting. I'm itching to play around with comparing OS to TS normal maps. Still at home drinking my coffee, need to get to work now :)
  • EarthQuake
    kodde wrote: »
    In theory, shouldn't a normal map solve the crappy smooth shading you get with any of these versions?

    I mean, some of them which share the same smoothing areas look bad due to the lack of polygons and extreme angles, right? When these are compared to a highres perfectly smoothed version, shouldn't the normal map neutralize the smoothing errors on the lowres version by adding the correct color variation?

    With more accurate, slower, offline rendering (scanline in Max, for example), yes, that is the case. But with the way most realtime engines do it, no, it just doesn't work out how you would want/expect. Welcome to the biggest annoyance of dealing with NM crap =)

    I blame the damned programmers.
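The reason the bake cancels smoothing errors in an accurate offline renderer but not in most engines is that the compensation only works if the renderer rebuilds the exact same tangent basis the baker used. Here is a toy numeric sketch of that idea (all vectors and the slightly-perturbed "engine" basis are made up; this is not any particular engine's basis code):

```python
import math

def normalize(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def make_basis(n, t_hint):
    # Gram-Schmidt an orthonormal tangent frame (T, B, N) around normal n
    n = normalize(n)
    d = dot(t_hint, n)
    t = normalize(tuple(h - d * c for h, c in zip(t_hint, n)))
    b = (n[1]*t[2] - n[2]*t[1], n[2]*t[0] - n[0]*t[2], n[0]*t[1] - n[1]*t[0])
    return t, b, n

def encode(world_n, basis):   # world -> tangent space (what the baker stores)
    return tuple(dot(world_n, ax) for ax in basis)

def decode(ts_n, basis):      # tangent -> world space (what the shader does)
    t, b, n = basis
    return tuple(ts_n[0]*t[i] + ts_n[1]*b[i] + ts_n[2]*n[i] for i in range(3))

true_n = (0.0, 0.0, 1.0)                          # high-poly surface normal
baker  = make_basis((0.3, 0.0, 1.0), (1, 0, 0))   # basis from skewed interpolated normal

ts = encode(true_n, baker)                        # what ends up in the map
same  = decode(ts, baker)                                     # engine basis == baker basis
other = decode(ts, make_basis((0.3, 0.1, 1.0), (1, 0, 0)))    # slightly different basis

angle = lambda a, b: math.degrees(math.acos(min(1.0, dot(normalize(a), normalize(b)))))
print(angle(true_n, same))    # ~0 degrees: the smoothing error fully cancels
print(angle(true_n, other))   # several degrees off: the mismatch shows as shading errors
```

In other words, the map stores a correction relative to the baker's basis; decode it with a different basis and the correction is rotated, which is exactly the waviness people see in engines that derive tangents their own way.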
  • kodde
    Oooo... this is good ammunition for whining at my colleague (programmer).

    Yeah I figured this was the case, reality kicking theory's ass as usual.
  • ivars
    Anyone know what's up with the Maya viewport HQ render? It usually displays NMs way better than Unreal or FarCry, for instance. What's the actual difference, and how much slower is it really :P
  • Chai
    Ya I have to agree there, the whole smoothing-errors workaround thing is a big pain in the ass.

    EarthQuake, I remember seeing assets you've made using OS, but I've never seen mirrored OS - how would you say that's working out? Are there any noticeable seams around the mirrored areas?

    And if not, why the hell isn't that becoming mainstream? I would assume a lot of companies would have benefited from it since it saves polycount and eases up the work for their artists.
  • NZA
    Isn't it a compression thing as well? Tangent normal maps compress more readily because of the range of colors they use.
  • Peris
    hard edge is the way to go, that way you can also reuse the texture in different situations without breaking the smoothing.
  • kodde
    nmapwall3.jpg

    Still looking quite good imo. Not using Maya's "high quality render"; it won't do anything for you with the CGFX shaders.
  • MoP
    The strong colours in Kodde's normal-maps are what I'd expect to see from baking an object like this.

    EQ: What app did you bake / display those meshes in? It looks like the baker and/or shader aren't calculating things correctly. As you can see from the Maya bakes and shaders, the output normals are much stronger, and the previews are much less blobby (which implies it's calculating tangent basis better). You can see a couple of small issues on the far right mesh but not hugely noticeable.

    Basically if you can't display a normalmap like that in an engine then something is probably wrong with your shaders, or your normal map baker.
  • Chai
    MoP wrote: »
    The strong colours in Kodde's normal-maps are what I'd expect to see from baking an object like this.

    EQ: What app did you bake / display those meshes in? It looks like the baker and/or shader aren't calculating things correctly. As you can see from the Maya bakes and shaders, the output normals are much stronger, and the previews are much less blobby (which implies it's calculating tangent basis better). You can see a couple of small issues on the far right mesh but not hugely noticeable.

    Basically if you can't display a normalmap like that in an engine then something is probably wrong with your shaders, or your normal map baker.

    Funny, in all the engines I've worked with, I had smoothing errors like EQ pointed out.
    In the Doom 3 engine you guys have been working with, the tangent support is a bit more advanced - the engine treats separate UV islands kind of like smoothing groups.

    Also note that kodde removed polys from the bottom and, I believe, one of the sides, which eases the smoothing errors a bit.
    EarthQuake's mesh is all closed, and in most situations you would deal with a lot of closed meshes with sharp corners.
  • MoP
    Yeh, if it was a closed mesh with bottom corners then you'd definitely need more supporting polys, or split normals.

    The other solution too, which a lot of people overlook I think, is to manually edit your normals. A little tweaking here and there can produce a perfectly nice normal bake with no extra overhead. No time lost either; the time you'd spend adding cuts or edge loops can instead go toward rotating/unifying a few vertex normals.

    Also, we no longer use the id-style "unsmoothedTangents" which you refer to, Chai - our game model formats now use vertex normal data directly from Maya, so if it looks correct in Maya then it looks correct in the game.

    In the meantime I wrote a Maya script for doing what unsmoothedTangents does, it just makes all UV island borders hard - produces excellent bakes for most objects, mostly useful on mechanical/hard surface things.
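The core of a script like the one MoP describes is finding the edges that lie on UV island borders, i.e. edges whose two adjoining faces disagree about the UVs, and hardening those. A toy version of just that test (the data layout, names, and sample mesh are invented for illustration; this is not MoP's actual Maya script):

```python
# Toy mesh: faces as lists of (vertex_id, uv) pairs. An edge is a UV
# border when the two faces sharing it assign different UVs to its
# vertices -- those are the edges such a script would set hard before baking.
def uv_border_edges(faces):
    seen = {}        # (min_v, max_v) -> the edge's UV assignment as first seen
    borders = set()
    for face in faces:
        for i in range(len(face)):
            (v0, uv0), (v1, uv1) = face[i], face[(i + 1) % len(face)]
            key = (min(v0, v1), max(v0, v1))
            uvs = frozenset([(v0, uv0), (v1, uv1)])
            if key in seen and seen[key] != uvs:
                borders.add(key)      # same edge, different UVs: a seam
            seen[key] = uvs
    return borders

# two quads sharing edge (1, 2); the second quad maps that edge to shifted UVs
faces = [
    [(0, (0.0, 0.0)), (1, (0.4, 0.0)), (2, (0.4, 0.4)), (3, (0.0, 0.4))],
    [(1, (0.6, 0.0)), (4, (1.0, 0.0)), (5, (1.0, 0.4)), (2, (0.6, 0.4))],
]
print(uv_border_edges(faces))   # → {(1, 2)}
```

In Maya terms, the equivalent step would be selecting those border edges and applying a hard edge to them before the bake; interior edges with continuous UVs stay soft.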
  • kodde
    Here's another huge comparison picture. I'm preparing stuff for a lecture anyways so might as well post it here.

    Yes, I know that the Object Space version shouldn't differ whether the lowpoly has soft or hard edges. But since what works in theory doesn't always work in practice, I'm not taking any chances.

    These are once again screengrabs from Maya 2009 viewport. Regular Lambert materials using Bump2D nodes, no specular. High quality rendering enabled to be able to view normalmap in realtime.

    The UV seam is on the edges pointing at the camera. Notice the errors in the Tangent Space Soft Edges version.

    NormalMapComparison.jpg
  • kodde
    Regarding the Tangent Space All Soft Edges version above, the normal map + Maya viewport implementation does a good job countering the nasty smoothing errors imo. But the generated normal map seems flawed where the UV seam is. Anyone know any good workarounds for this?
  • perna
    It's time someone made a test object OBJ and people ran it through different techniques and software so we could get some proper comparisons going.
  • kodde
    Word, nice idea.

    I can share my shape in the latest picture. It has quite nasty smoothing when using 1 smoothing group/softening all edges.
  • kodde
    Here's the test object.

    Included is:
    -Lowpoly version
    -Highpoly version
    -Tangent Space Normal Map, All soft edges
    -Tangent Space Normal Map, All hard edges
    -Object Space Normal Map

    Also try generating your own normal maps.

    Please do post results here.
  • NZA
    Here is how kodde's maps came out in Max when I used them. Note this is a scanline render. I've never used object space normal maps before and I don't know how you display them real-time in the Max viewport. Also I had to swizzle some of the green channels:
    3253576380_6a375a3b92_o.jpg
    Here's a tangent map rendered from Max compared to kodde's:
    3253576384_6233dc01b8_o.jpg
    Here's a scanline render of the same scene:
    3253576386_a83a881edc_o.jpg

    It's amazing how much of a better result you get with scanline :poly105: It would seem to me that the built-in realtime shaders in Maya (that is what you're using, right kodde?) produce a more accurate result than Max's. I dunno which one is more representative of how it would appear in game. I guess it varies from engine to engine.
  • JordanW
    Perna has a point, it's -useless- to compare these techniques in Max or Maya, how many games do you guys know that are rendered in these software packages? Even if they're using the same .cg shaders as the engine the content is getting treated differently. Any time you bring a mesh into an engine the tangents and bitangents have to be read/generated for each vertex.

    Some of the examples shown in this thread look surprisingly good for how low-res the low poly is, but I am sure you will run into problems when rendering in a proper engine; it'll look soft, there will be waviness, and you'll just hate it.

    This also kind of goes along with why I'm pretty much opposed to using object space normal maps. See, if I build my low poly tight enough (a decent amount of support edges, nothing crazy high) and I use tangent normal maps, I can reuse chunks of that normal map over and over. I can break off part of a desk to make a window trim, I can use a chunk of a pillar as a floor tile. All I have to do is build another low poly mesh and unwrap it to my previous texture. If my normal maps are in object space, or my normal map is fucking crazy wavy from supporting some super low poly mesh, I can't do this.

    oh BTW, I use xnormal to render all my normals so max and maya can eat it!
  • EarthQuake
    Alright here is the test case I used. It's 3 objects, all using the same UVs so you can bake them all at once. Make sure the middle object in the low has smoothing set to 45 degrees if the normals aren't saved correctly in the obj.

    http://dl.getdropbox.com/u/499159/nmtest.rar

    Also, make sure if you guys are viewing an NM from Maya in Max, or vice versa, that you invert the green channel. It looks like the main difference in NZA's result is that the right one doesn't have the green channel flipped as it should.
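The Max/Maya difference EQ mentions is just an opposite sign convention on the tangent-space Y component, which on an 8-bit map amounts to inverting the green channel. A minimal sketch of the operation (plain tuples here; a real pipeline would do this through an image library or the baker's own swizzle options):

```python
# Invert the green channel of 8-bit RGB normal-map pixels, converting
# between the +Y and -Y tangent-space conventions (e.g. Maya vs Max).
def flip_green(pixels):
    """pixels: list of (r, g, b) byte tuples; returns a new list."""
    return [(r, 255 - g, b) for r, g, b in pixels]

print(flip_green([(128, 200, 255)]))   # → [(128, 55, 255)]
```

Red (X) and blue (Z) are left alone; only the Y direction disagrees between the two conventions.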
  • perna
    JordanW: I did some work for your company, and they provided me with a max shader that represented the ingame look, and also the maps were generated in max... so I'm not sure what you're saying :) I think it's directly beneficial for many of us to delve into normal map generation and rendering as deeply as possible.
  • kodde
    A question that popped into my mind is this.

    If Maya's or Max's viewports are not representative of how most game engine normal map implementations work, then what other app should an artist use to preview normal maps? That is, if the artist isn't already working directly with a specific engine at a games company.

    xNormal's 3D viewer perhaps? Any ideas?
  • EarthQuake
    Viewing in-engine will always be the best solution.

    If you can get a plugin for xnormal to read the exact bi/tangents from your file format, that can be a huge plus too.
  • Mark Dygert
    If you have an engine, use it to test. If you don't, treat 3ds or Maya as your engine and play by real time rules?

    EDIT: err yea what EQ said.
  • EarthQuake
    "Real time rules?" No such thing. Nearly every app/engine does things a little differently, so really the only accurate way to tell anything is to preview it in your engine. It's really a pain because in most cases, what you're using to generate and what you have in your engine will not match up. People generally have to settle for "good enough" when doing anything with normals.
  • Mark Dygert
    By real time rules I mean:
    - Avoid hugely expensive rendering techniques you don't find in real time games.
    - Stick to standard point lights.
    - Don't use photometric lighting it could give you drastically different results.
    - Don't use final gather, Global illumination, radiosity or just about anything that calculates bounce lighting or Ambient occlusion at render time.
    - Don't use 50 lights to flood the scene and wash out shadows, faking some kind of sky light, when you'd never get that many lights around a character in game.

    I'd also toss in that, even if you preview it in engine, the lighting setup in your test scene might be different than what players will finally see... so really, even testing it in engine isn't going to be as great as actually testing it in game =/
  • JordanW
    perna wrote: »
    JordanW: I did some work for your company, and they provided me with a max shader that represented the ingame look, and also the maps were generated in max... so I'm not sure what you're saying :) I think it's directly beneficial for many of us to delve into normal map generation and rendering as deeply as possible.

    Ah, I'm not saying that it's not beneficial to delve into this as deep as possible. I'm actually saying that a lot of the tests previously posted in this thread pretty much stop at viewing their results in Max or Maya and saying "yep this works" or "this looks like ass", and this isn't a very good test because Max/Maya rendering is not accurate, especially if you render using scanline/mental ray (these are actually super best-case scenarios because somehow they manage to make shit look perfect no matter how low or high poly the low res mesh is).

    Also even if a dev provides an artist with a .fx shader for max the chances of it looking different in engine are still pretty high just because of the way different devs handle importing geometry and generating all of the additional mesh info that we artists never see.

    I would also say that the low poly methods that look the best in one engine are probably going to look different in another engine. Heck i know that even in UE3 using different lighting methods on a mesh will reveal different problems from normal mapping. Using vertex lighting for example will be much less forgiving of super low poly normal mapped assets than using light map lighting.
  • kodde
    So a solution for aspiring 3D artists who are trying to learn the techniques would be to use mod-friendly engines like UE3, for instance, to preview their work? At least that way they would face the kinds of problems which are likely not to show in Maya/Max?
  • JordanW
    I would say so, yes. It will also help you get used to bringing assets into an engine and dealing with making something look good "in game", lit using an engine rather than Max/Maya.
  • EarthQuake
    The most important thing is just to take some time to make sure your asset looks ok in game. This doesn't mean that every time you make a change to your model, rebake, test bake, etc. you load it up in UE3. Just that you should try it at least once before your final bake, because there very well might be some differences you can account for.

    Now as far as UE3 goes, and a few other engines like idtech, don't these engines ship with their own normal map generators? You would think that you're going to get the best possible results using these custom-made apps, simply because they use the correct model format, with the correct bi-tangents/tangents etc.

    I'm curious Jordan, is there a reason you don't use the Epic tool for generating normals? I've never done anything with UE3 so I may be missing something obvious here.
  • EarthQuake
    Vig wrote: »
    By real time rules I mean:
    - Avoid hugely expensive rendering techniques you don't find in real time games.
    - Stick to standard point lights.
    - Don't use photometric lighting it could give you drastically different results.
    - Don't use final gather, Global illumination, radiosity or just about anything that calculates bounce lighting or Ambient occlusion at render time.
    - Don't use 50 lights to flood the scene and wash out shadows, faking some kind of sky light, when you'd never get that many lights around a character in game.

    I'd also toss in that, even if you preview it in engine, the lighting setup in your test scene might be different than what players will finally see... so really, even testing it in engine isn't going to be as great as actually testing it in game =/

    Do you mean image based lighting when you say photometric? Because this is pretty common in games these days. And really, whether you're using a point light or convoluted (is that the right word, lol?) lighting from a cube map, the problems we're talking about here are artifacts from smoothing errors. So really, if your normal is facing the wrong direction, you're going to get similar errors whether you're using a point light, or PRT, or image based lighting, etc.
  • JordanW
    I believe there was a tool once used called shtools or something but i don't think it's as fast as the tools typically used now. I'm just speaking from what others have told me, I never used that tool and I don't believe anyone here does currently.

    These days I use xNormal to render my normals. It's quick as hell and I don't have to worry about running out of memory. Max has become such a pain in the ass for rendering models, especially if they are in the multi-millions of polies, let alone over 10 mil.
  • JordanW
    When he says photometric I believe he's referring to the photometric lights used in Max/mental ray by default now. They are way different from what we use in games because they don't use a falloff distance; you give an intensity and it uses the inverse square function to determine how fast the light fades. They also have a kelvin rating and have to be combined with some form of exposure control to render correctly... all stuff we don't really worry about in games.
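The falloff difference Jordan describes is easy to see numerically: an inverse-square photometric light never quite reaches zero, while a typical game light fades to zero at an artist-set falloff distance. A toy comparison (the numbers and the linear fade are illustrative; real engines use a variety of falloff curves):

```python
# Photometric (Max / mental ray) style: intensity falls off with the
# inverse square of distance and never quite reaches zero.
def inverse_square(intensity, distance):
    return intensity / (distance * distance)

# Typical game-style light: linear fade to zero at a falloff distance.
def linear_falloff(intensity, distance, falloff_end):
    return intensity * max(0.0, 1.0 - distance / falloff_end)

for d in (1.0, 5.0, 10.0, 20.0):
    print(d,
          round(inverse_square(100.0, d), 2),       # still nonzero at any range
          round(linear_falloff(100.0, d, 10.0), 2)) # hits exactly 0 at d = 10
```

That long, never-zero tail (plus exposure control) is why lighting previewed under photometric lights can look quite different from the same scene lit with game-style lights.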
  • nome_sane
    Kodde, I noticed on your second example you are using the same UV layout on both the "all soft" and the "hard normal" instances. With soft normals it's a good idea to keep UVs connected, but where you have hard/split normals the UV shells should be kept separate.
    This is why you are getting nasty-ass seams on your hard normal instance.

    I would definitely favour the hard edge method out of your examples; the kind of wavy/multi-coloured interpolation in your soft edge cases is bad juju in my opinion.
    A sexy smooth lavender colour is going to behave better.

    -Things may look fine in a simple example like this, but if you perform deformation on your asset it could totally change the way the normals are interpolated across a face. Also, if you delete part of the asset, that will ruin the result too.

    A hard edge (flatter normal map) will not be affected so much and is more flexible.


    -Also if your mesh is not triangulated I suppose the end result in game could be triangulated differently and not match.


    -These are very simple shapes just inheriting bevels from their high poly brethren.
    Things would be different if you have more shapes (screws, vents whatever) modelled in the high poly. With soft normals across 90 degree edges your only option is to have the surface transfer rays fire out interpolated around the edge.
    This may warp and distort the shapes on the surface. If the edge is hard the rays can be made to fire straight and parallel on flat areas and not distort the forms.

    With a single bevel the 90 degree angle has been turned into 45; this is still going to interpolate across the flat areas and give some slightly sketchy results.

    To echo the point JordanW made about tangent space maps over object space ones: having this kind of interpolation over flat areas means you can't use the same flat area of the map on some other geometry. It becomes bespoke and locked down to the original sample geometry.
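nome_sane's point about bake ray directions can be visualised numerically: with soft normals the sample direction is interpolated between the face normal and the averaged corner normal, so rays bend sideways as they approach the edge, while a hard edge lets every ray on the face fire along the face normal. A toy sketch (invented vectors; this is not any baker's actual cage logic):

```python
import math

def normalize(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(round(c / l, 3) for c in v)

def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

face_normal = (0.0, 0.0, 1.0)             # top face of a 90-degree edge
corner_soft = normalize((1.0, 0.0, 1.0))  # averaged vertex normal at the edge

# soft normals: the bake ray direction bends toward the corner as t -> 1,
# dragging high-poly detail sideways near the edge
for t in (0.0, 0.5, 1.0):
    print(t, normalize(lerp(face_normal, corner_soft, t)))
# t = 0 gives the pure face normal (0, 0, 1); larger t tilts further toward +X

# hard normals: every ray on the face fires along face_normal,
# so detail on flat areas projects without distortion
```

The tilt at t near 1 is exactly the "interpolated rays around the edge" that warps screws and vents baked close to a soft 90-degree corner.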
  • EarthQuake
    JordanW wrote: »
    I believe there was a tool once used called shtools or something but i don't think it's as fast as the tools typically used now. I'm just speaking from what others have told me, I never used that tool and I don't believe anyone here does currently.

    These days I use xNormal to render my normals. It's quick as hell and I don't have to worry about running out of memory. Max has become such a pain in the ass for rendering models, especially if they are in the multi-millions of polies, let alone over 10 mil.

    Sure, I remember people complaining about it long ago.

    I generally use xNormal when I can. At 8ml one of our programmers used the SDK to create an import plugin for xNormal, so that we can read the exact normals, tangents/bi-tangents, etc. from our in-game file format. This helped us get as accurate results as possible with the tech we had.

    So, what sort of format do you use in xNormal? Do you export SBM from Max? I guess if you get all of that info from Max when you export to Unreal, and the same info is being exported from Max when you export an SBM, you wouldn't really have any problem.

    I'm just curious if anyone has brought up these issues with your tech team. Or ever had the need to, really (you would notice some differences between what you get in xNormal and what you get in game).
  • JordanW
    I just use .obj. I tend to use 1 smoothing group on my objects, and I think that helps keep me from running into any problems.