
Why you should NOT trust 3ds Max's viewport normal-map display!


Replies

  • MoP
So what would happen if this mesh was UV-mapped with far fewer seams? Then you wouldn't be able to make the edges hard along the UV borders, and it would revert to looking crappy in a Max bake, and better in a Maya bake :)
  • Xoliul
Yeah, true. The unwrap here isn't really optimal (2 min job), but I would never do it with the pelt/relax style part EQ did, way too much distortion for a straight, hard-surface object.
In this case, if I had to texture that, I wouldn't even mind the amount of seams I have there: I always bake with Max, so I always have such an amount of seams and I've learnt to live with them, they're not that hard to texture for me anymore. Seams are overrated :p !
(not that I would mind correct bakes in Max, regardless of seams)
  • EarthQuake
Yeah, it was a worst-case-scenario mesh of course, so taking it and making it into a best-case-scenario mesh doesn't really help to solve the problem =)
  • perna
Yeah, you can't call making splits everywhere a viable solution when it madly increases the vertex count; you might as well add chamfers.
  • pior
Hehe, yeah Xol, I took it straight from EQ's zip, baked it in Max and XN and threw it in Unreal. I understand what you did to fix it and it's good to see that it is possible to get something clean out of Max+Unreal ... but to me it also stresses the big weaknesses of this combination.

What are your results out of a bake from the original UVs? I could barely believe what I saw, honestly; it looked so bad when I tried.

    Also on top of increasing the vert count, splitting everything would really make texturing a pain... (I mean texture overlays)

    Still have to try out D3 renderbump!
  • Michael Knubben
I've had arguments where I say I find it annoying that splitting the UVs so excessively seems necessary, the argument arising from people just not being able to stomach any negativity or curiosity about the tools we use on a daily basis. You know the kind... 'it's not the tool...'. Well, obviously I am living with it, but I still find it a horrid waste of tris as well as a big pain in the arse while unwrapping and texturing, so I'm glad to see the debate in this thread, even when it's people arguing against the point (like Xoliul). It makes for interesting discussion.

    It's only too bad we have practically 0 influence on Autodesk to change any of this, so we're still limited to talking about workarounds.
  • Sage
    I always thought it was me when I made normal maps, it's nice to see that it isn't me. So is it safe to say that you are keeping the extra edges to keep the model looking nice?
  • JordanW
I've kind of been reading this thread off and on. It's good to evaluate techniques, but I feel like this is something this board kinda goes in phases about; I really thought this same thread occurred with the same examples like 6 months or a year ago? :) At any rate, here's my take on normal mapping:

it sucks, there are so many problems and every engine has its pros and cons. One thing you have to keep in mind is that a lot of times you may get more visual inconsistencies because there are approximations in the shader math. I think someone mentioned earlier this is why offline renderers look the best.

    Just looking at that worst case example makes me cringe :) It contains like every case where normal maps fall apart.

I don't know if it's good work ethic or being lazy, but every time I make a normal mapped asset I try to make it as shrinkwrapped as possible and with as many supporting edges as I can without making an uncomfortable polycount :) I also try to just go ahead and put a construction seam where I know I might have some wonkyness, to hide some of the ugly shading.
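The "approximations in the shader math" point above is easy to see with a little arithmetic: the rasterizer linearly interpolates unit vertex normals across a triangle, and the interpolated vector comes out shorter than unit length. A minimal NumPy sketch (the 60-degree angle is just a made-up example, not tied to any particular engine):

```python
import numpy as np

# Two unit vertex normals, 60 degrees apart.
n0 = np.array([0.0, 0.0, 1.0])
n1 = np.array([0.0, np.sin(np.pi / 3), np.cos(np.pi / 3)])

# Halfway across the edge, plain linear interpolation (what the
# rasterizer does before any per-pixel normalize) shortens the vector:
mid = 0.5 * (n0 + n1)
print(np.linalg.norm(mid))  # ~0.866, noticeably shorter than 1.0
```

Whether and where a given engine renormalizes (per vertex, per pixel, or not at all) is exactly the kind of baker/shader mismatch this thread is about.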
  • EarthQuake
I don't know man, is it laziness or just a desire to be more efficient and get better results? Even with a very clean mesh, you're still going to have some errors if your baker and engine aren't synced up. So you're doing extra work AND your end result is still worse than what it would be if your engine/pipeline was simply doing it correctly. Again: laziness, or having the sense not to want to waste time while still producing an inferior result? You tell me.
  • JordanW
Hmm, I think you missed my point. I was saying that normal maps suck in general and I've yet to see a perfect result in a game engine. So I usually go the lazy route of adding supporting geo and a more shrinkwrapped shape vs splitting UVs, smoothing groups, etc.

I've never run into major warping issues in UE3, but I guess I could be using 'too much' geometry, so it minimizes the areas where the normal map has to work hard. There's nothing special about my pipeline, just Max + xNormal.
  • EarthQuake
Oh wait, I think I read you as saying laziness as in the people here who are trying to find better solutions being the lazy ones, creating poor meshes etc., which didn't make a whole lot of sense to me.

I think the idea that it will always suck, so we might as well buck up and accept it, is a flawed notion. Sure, there will be things that are always going to be difficult to do, but I think most workflows can be improved a considerable amount (as shown with the Maya examples). With a little bit of tools work, you should be able to save a LOT of time you would otherwise spend debugging poor shading/adding in extra supporting geometry/splitting your UVs to hell/etc. and still end up with a better overall result, which to me is just win-win in every way.

I think it's easy to get into these cycles of thought in game dev where we just accept things like this as "facts of life". In fact, I bet if you polled graphics programmers working in the games industry, 90%+ of them would tell you that it is impossible to animate models using object-space normals, and that you have to use tangent space for anything that animates/deforms. Which is simply incorrect. I see the sentiment that normals are always going to suck being in the same boat as that thought process. Most workflows simply aren't doing it correctly/well enough, so we've become accustomed to certain ways of working which are actually pretty inefficient, adding tons of small, thin triangles to our meshes in the form of bevels to correct rendering errors etc.: inefficient in time spent in asset creation and in terms of render performance.

So while subjectively these sorts of workflows may be alright, or someone might not notice the difference in a game running around at high speed, that doesn't have a whole lot of bearing on what is a technical discussion. And an entirely solvable technical problem. (Just wait to see what CB is working on for Max =P )
  • MoP
    Yeah it seems like the main difference here is that people using Max & UE3 have just put up with adding extra supporting geometry, or increasing the number of UV splits in problem areas, mainly due to Max's baker being dodgy with regard to real-time tangents.

    I have a feeling that if people were using Maya for UE3 stuff they might not have to worry about the lowpoly and UVs quite as much, simply because the baker's tangent space is more representative of the real-time display method, therefore you don't have to "fight it" as much.

    I agree that normal-maps are not an ideal solution, but there are varying degrees of this, and I'd much rather be using the "best" version of that sort of solution rather than a mediocre one that I have to manually take into account and work around.
  • vargatom
Just wait until you get to use tessellation + displacement mapping in addition to normal maps. That'll be the real fun... ;)
  • [HP]
    vargatom wrote: »
Just wait until you get to use tessellation + displacement mapping in addition to normal maps. That'll be the real fun... ;)

    It's gonna be the same, just an extra bake map. (Heightmap)
  • MoP
    [HP], but now you have to worry about displacement accuracy and mesh density too, for example a standard 8-bit displacement map will look "stepped" due to the lack of range. That's a whole different discussion though :)

    I imagine vargatom knows what he's talking about since IIRC he works on prerendered cinematics and probably already uses this stuff every day.
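The stepping MoP describes is just a precision budget: an 8-bit displacement map has only 256 levels to cover its full height range. A quick sketch of the arithmetic, assuming a hypothetical 10 cm displacement range (the numbers are made up for illustration):

```python
# Hypothetical displacement range in metres.
height_range = 0.10

# Smallest height change an 8-bit map can represent over that range:
step_8bit = height_range / 255
print(f"8-bit step:  {step_8bit * 1000:.2f} mm")   # 0.39 mm per level

# The same range stored in a 16-bit map:
step_16bit = height_range / 65535
print(f"16-bit step: {step_16bit * 1000:.4f} mm")  # 0.0015 mm per level
```

Any gentle slope that spans only a few of those 8-bit levels shows up as visible terraces, which is why higher-precision height maps are usually wanted for displacement.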
  • Ben Apuna
    fozi wrote: »
    Hi guys, Diogo here:

    What I was suggesting more specifically was a universal Object Space to Tangent Space converter tool. It would accept the mesh (xyz + uv), a normal map in OS and custom (scriptable, or imported) tangent basis (with a few presets). The output would be a TS normal map in whatever basis your pipeline requires.

    Frontends could be made afterwards to integrate the tool with Max, Maya, Blender, etc..
    It could also allow importing any OS normal map into Max, converting to Max's native tangent basis in the process.

I'm all for this, especially the "universal" part, as this isn't really a Max issue but an issue affecting all apps that bake normal maps and all engines that display them. Not everyone is going to have access to any particular app to do "corrected" bakes.
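The core of the tool fozi describes is a per-texel change of basis: rotate each object-space normal into the low-poly tangent frame the target engine uses. A minimal NumPy sketch under that assumption; the 1x1 "map" and identity frame are made up purely as a smoke test, and a real tool would build the frames from the mesh, UVs and the chosen (scriptable) basis:

```python
import numpy as np

def os_map_to_ts(os_map, tbn_map):
    """Convert an object-space normal map (H, W, 3; floats in [-1, 1])
    to tangent space, given a per-texel tangent frame (H, W, 3, 3)
    whose rows are the tangent, bitangent and normal."""
    ts = np.einsum('hwij,hwj->hwi', tbn_map, os_map)   # rotate per texel
    return ts / np.linalg.norm(ts, axis=-1, keepdims=True)

# Smoke test: with an identity frame, the normal is unchanged.
os_map = np.array([[[0.0, 0.6, 0.8]]])
tbn_map = np.eye(3).reshape(1, 1, 3, 3)
print(os_map_to_ts(os_map, tbn_map))   # same direction (0, 0.6, 0.8)
```

Swapping in a different tbn_map is exactly the "custom tangent basis" part of the idea: the same OS bake could then be targeted at Max's, Maya's, or an engine-specific basis.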
  • vargatom
    MoP wrote: »
    [HP], but now you have to worry about displacement accuracy and mesh density too, for example a standard 8-bit displacement map will look "stepped" due to the lack of range. That's a whole different discussion though :)

Exactly. Also remember that the normals you want to modify are the normals after the tessellation and displacement... so all the problems discussed here would apply, with the extra problem of having less control over the original normals (because tessellation will be view- and camera-distance dependent).

The added bonus is when you have relatively small but highly displaced details, like spikes, carvings etc. In these cases the tessellation is probably not going to be detailed enough to sufficiently reproduce the shapes and forms, so you get some blobby mess of a mesh. Then you add your normals on top of it and you get an even bigger mess... So you may have to selectively exclude details from one map or the other... or between LOD changes... I can only guess at this point, but the potential for suffering will definitely increase.
MoP wrote: »
I imagine vargatom knows what he's talking about since IIRC he works on prerendered cinematics and probably already uses this stuff every day.

Yeah, and we abandoned PRMan with its quasi-unlimited tessellation quite a while ago, so we have to work with displacement and normal maps combined. Can be a real pain in the ass at times; we've ended up building more detailed highres models to sidestep the problem.
  • Neox
Pior, do you import your normal map as a normal map into Unreal? I didn't try, and maybe you did, but if you didn't you should try it, as importing it as an ordinary texture always messes up things in Unreal.
  • perna
We've done work for clients who used Maya, supposedly had their ingame shaders tuned to Maya, and yet used OBJs exported to xNormal for bakes. Also, like someone mentions above I think, clients who didn't state a baking preference at all (to them, a normal map is a normal map; this is frighteningly common).

So when we finally got around to using the full and proper Maya path we were blown away by the increase in quality. After being used to adding thousands of polygons exclusively to make normal maps look better for Unreal Engine content (it seems most of the devs out there these days use Unreal, way to go Epic :) ), we can now use all those polies to define shape and silhouette instead. For a great deal of the stuff we're building you have to check to be able to tell whether you're looking at the hipoly or lowpoly model. We wouldn't be able to do that without the Maya path; it would just be too geometrically heavy.

    Now everyone start lobbying for better normal mapping standards. Honestly we don't want to do max/unreal style normals anymore, it won't look as good in our portfolio ;)
  • Mark Dygert
    I would like to add one thing:
    bakes004b.jpg
    Sometimes imperfections like seen on the right (teal arrow) can help a model look more real world as it could easily be a mold imperfection or a dent. In our search to make everything perfect and mathematically correct we could be stripping out the kinds of detail we try hard to build in.

    We might actually want to learn why and how these things pop up so we can force them in certain situations rather than eradicate them.

    The real world objects we seek to replicate have tons of imperfections, we need to be careful we don't eradicate all traces and judge them to be errors... The imperfections you see in Maya bakes (as above) seem to be more passable than the ones in max (hidden edges killing surface detail).
  • pior
Well, yeah, but in the case where you REALLY want something smooth (like a polished object with an env map reflection on it) you really want accuracy ...

Now I understand that in such cases one might be better off increasing the density and not even using normal maps at all (like on car driving simulators), but still, it shouldn't prevent us from looking for a perfect normal mapping solution :P

    I say, IRRELEVANT!
  • Mark Dygert
I agree: figure out what causes it, so you can stamp it out if you need to. Just take a second and ask yourself, "can this be used as imperfect detail?" before taking the extra time to kill it just to be mathematically perfect.

    In our quest for the perfect bake we can end up taking more time and end up with less detail.
  • MoP
    If it's not a detail you want, then surely it shouldn't be there in the first place? :)

I see what you mean, Vig, but if I want an imperfection, I'll model or texture it in rather than trying to make some bake error fit into my texture.
  • EarthQuake
Right, smoothing errors to me are never a good thing. If I want a specific look I will create that specific look, not rely on bad software to... maybe give it to me.
  • perna
    Right, Vig, I've got to side with MoP and EQ here (begrudgingly, as they are very bad people) in that I would rather specifically make something look the way it should on purpose than sit and hope for the infinitely small chance that a baking error should represent the exact effect that I intended ;)
  • Mark Dygert
Yeah, good points. I agree, if it's a planned detail you've probably already thought about it, and it's not like it would be hard to paint in. I agree I probably wouldn't waste time trying to get a spot to bake badly, but I might not nuke it if it pops up... maybe...
  • dur23
    MoP wrote: »
    dur23: Well, the whole point of a normal-map is to represent the high-poly normals using the texture, it's not really anything to do with having "the inverse of the low poly shading" as you seem to be suggesting.

    All it represents is the difference between the interpolated vertex normal and the high-poly normal, per pixel. It just so happens that a lot of the time, to make that look correct, you end up with per-pixel normals which are "counteracting" the shading of the lowpoly normals, resulting in the stuff you're identifying in those images.

Yar MoP, I do understand the per-pixel representation of the high poly normals. I think I just worded it incorrectly, because in the second paragraph you kind of say exactly what I was saying with "counteracting the shading of the lowpoly normals". Which translates to (imo) the baker taking the normal of the lowpoly, inverting it (per pixel) and overlaying it over the per-pixel high poly normals. Which means the low poly shading (normals) = zero. So when you add the high poly normals they do 100% of the shading. Which means the low poly's shading represents 0% of the overall shading.

Now when I look at the Max render, it appears it is not properly adjusting for the low poly mesh's normals (smoothing groups). Which to me looks like the whole problem starts at the interpolation of the low poly normals. Anyone wanting to use Max baking in their engine would have to interpolate the low poly normals the same way that Max does (which is why it only looks right in Max) in order to replicate the shading within Max. Am I crazy? Am I being useless by pointing this out?
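dur23's description of the bake as "counteracting" the low-poly shading can be sketched as a round trip: express the high-poly normal in the low-poly texel's tangent frame at bake time, then rotate it back out at render time. A toy NumPy example with a made-up orthonormal frame (the thread's whole issue is that the baker's and renderer's frames often don't match):

```python
import numpy as np

def encode(n, tbn):
    """Bake step (simplified): express a high-poly normal in the
    low-poly texel's tangent frame (rows of tbn = T, B, N)."""
    v = tbn @ n
    return v / np.linalg.norm(v)

def decode(n_ts, tbn):
    """Render step: rotate the stored normal back out with whatever
    tangent frame the engine builds at runtime."""
    v = tbn.T @ n_ts
    return v / np.linalg.norm(v)

# Made-up orthonormal frame at one texel; its N row is tilted away
# from the high-poly surface normal.
tbn = np.array([[1.0,  0.0, 0.0],
                [0.0,  0.8, 0.6],
                [0.0, -0.6, 0.8]])

n_hp = np.array([0.0, 0.0, 1.0])   # flat high-poly surface
stored = encode(n_hp, tbn)         # not neutral: it counteracts the tilt
print(encode(tbn[2], tbn))         # lowpoly's own normal -> (0, 0, 1)
print(decode(stored, tbn))         # same frame at render time -> (0, 0, 1)
```

The round trip only recovers the high-poly normal because the same frame is used in both steps; if the renderer's frame differs from the baker's (the Max situation in this thread), decode() returns something else and you see the smoothing errors.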
  • MoP
Dur, not really useless, it's clear you get it - and that's the whole point of the thread. And it's not really the lowpoly normals as such, more the tangent space it creates from the lowpoly at render time, which is different to the viewport tangent space.
  • CrazyButcher
    3psbake-test1.png

Single smoothing group applied, raytrace projection; the Crytek method generates its own custom normals (however those won't affect the bake ray direction).
Further tests need to be done, whether smoothing groups help Crytek's way at all and so on... but anyway, proof of concept is done. How this is taken further within 3PS and outside (considering the associated additional work of rebuilding the Maya way) still needs evaluation.
  • Joshua Stubbles
Wow, that is quite interesting, Crazy. I still see some minor issues, but nothing that would be noticeable once textured. That's nice.
  • CrazyButcher
Bear in mind that I am no baking expert; I just applied the standard stuff, set the projection method to "Raytrace" and that's it. As said, whether it's really worth further effort will be found out by fellow artists.
  • onionhead_o
    bugo wrote: »
    I use Turtle inside Maya, I wonder why nobody use that, it's so good.

    wuts turtle? links plz
  • pior
CrazyButcher, got to love how major behemoth apps are being fixed by their users, again ...
Not really sure if I fully understand what you are doing, but yeah, that screenshot looks odd and awesome at the same time :P Like something that should have always been here, but only happens now, thanks to you!
  • MoP
    CrazyButcher: Very cool. That could turn into an extremely useful plugin.
    bugo wrote: »
    I use Turtle inside Maya, I wonder why nobody use that, it's so good.
    Because it costs money and you have to email Illuminate Labs for pricing information?
    How much does a single user license cost, bugo?
  • Ark
    MoP wrote: »
    How much does a single user license cost, bugo?

    http://www.illuminatelabs.com/shop/Turtle/purchasing-turtle

Never tried it, but it's always praised in 3D World.
  • vargatom
    Turtle also makes a mess of all your Maya scenes with extra nodes that you can't remove if you don't have the plugin installed. We've done a few outsourcing jobs where the client sent us these nice surprises ;)
  • undoz
    This is an awesome thread!

    I did my own tests with EQ's mesh and my conclusion is that Max uses a different method to interpolate the normals in the software render no matter what you do.

The only partial solution is to try to bring the normals in the viewport as close as possible to what the offline engine does.

The best result with EQ's mesh was after I ran a Retriangulation in Edit Poly, added an Edit Normals modifier and retested them.
  • glib
    All hail Crazybutcher!
  • MoP
Haha, no wonder more people don't use Turtle if it's $1500 per license... that kinda prices it out of reach for anyone but a studio that does a lot of lightmap baking.
  • pior
Undoz, can you elaborate on that? Please post your results too.
undoz wrote: »
The best result with EQ's mesh was with after I ran a Retriangulation in Edit Poly and I added an Edit Normals modifier and Retested them.

I don't see why or how this would change anything at all, but I totally trust you, and wouldn't be much surprised if it was yet another one of these special Max cases...
  • undoz
    The difference is not that spectacular, and pretty far from perfect.

    ExAPG.jpg
  • bugo
    MoP wrote: »
    CrazyButcher: Very cool. That could turn into an extremely useful plugin.


    Because it costs money and you have to email Illuminate Labs for pricing information?
    How much does a single user license cost, bugo?

I have no idea, I use it here at the company, so I don't know the prices. I can research that though.
But yes, it might be expensive for a single user.

Price aside, it is a great plugin, the best I've ever used.
  • cw
    Aargh what is this magic?

So I retested it, because I get like that sometimes: maybe I can crack this impossible problem. But of course I can't.

Interesting though, that adding an Edit Normals modifier in Max fixes some of the errors! How so? Who knows! It must be magic!

    here:

    from left to right, scanline, shader before edit normals, shader after edit normals.

    good old max, the package that keeps on giving! :D

    nrmtest.jpg

Oh, I think this is the same as undoz posted, sorry!
  • fozi
    I've also got some progress going on: I'm well on my way developing the conversion tool, but decided to test the theory first.

    In the shots below, taken from the viewport, I'm using Max's own tangent basis. Both shaders are using an object space normal map as a normal source. The mesh on the right, however, is running a shader that converts OS to TS on the fly.

    specular.jpg

    nmaps.jpg

    I should be able to get the tool up and running in the next couple of weeks.
  • MoP
    fozi, that looks perfect :)
  • engelik
    that's pretty amazing fozi, looks flawless :)
  • bugo
    that's pretty good!
  • undoz
    That's awesome fozi! Looking forward to see what you come up with.
  • cw
So fozi, what we see there on the right is what the generated normal map from Max would have to look like in order to work perfectly with a realtime shader in the Max viewport? Interesting stuff! It would be interesting to check the delta between that shader output of yours and Max's default generated normal map, to see what the differences are.

    good work! :D