
Why you should NOT trust 3ds Max's viewport normal-map display!


Replies

  • Xoliul
    Offline / Send Message
    Xoliul polycounter lvl 14
    ELD: the triangulation is not the issue. Regardless of it, the shading still looks worse. Also, the left mesh has the same bad triangulation, yet the problem isn't visible there.

    Also, the default way of computing normals is not an optimization, as you seem to think; it's just the default, straightforward way. The special method Max uses is an improvement on this default method. Many people seem to think it's too heavy for realtime since it's seemingly been reserved for software renderers, but Max can actually do it in realtime without a problem.
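    For reference, here's a rough sketch of what that plain, per-triangle tangent derivation typically looks like (the textbook position/UV approach; an illustration, not Max's actual code for either method):

    ```python
    import numpy as np

    def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
        """Tangent/bitangent of one triangle from its positions and UVs."""
        e1, e2 = p1 - p0, p2 - p0              # position edges
        du1, dv1 = uv1 - uv0                   # UV edges
        du2, dv2 = uv2 - uv0
        r = 1.0 / (du1 * dv2 - du2 * dv1)      # inverse of the UV-edge determinant
        tangent   = (e1 * dv2 - e2 * dv1) * r
        bitangent = (e2 * du1 - e1 * du2) * r
        return tangent, bitangent

    # Example: a right triangle in the XY plane with a simple planar UV mapping.
    t, b = triangle_tangent(np.zeros(3), np.array([1.0, 0, 0]), np.array([0.0, 1, 0]),
                            np.zeros(2), np.array([1.0, 0]), np.array([0.0, 1]))
    ```

    Per-vertex tangents are then these face tangents averaged over the triangles sharing each vertex and orthogonalized against the vertex normal; the "higher quality" variants differ mainly in how that weighting and orthogonalization is done.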
  • eld
    Offline / Send Message
    eld polycounter lvl 18
    Xoliul, I know, read my post above. My question is: why is it a Shader Model 3 setting in Max?

    It's really strange to hear that Unreal Engine, known for its visual tools and features, would go halfway on such a core feature just because it was the default, easy way. There has to be some reason.
  • eld
    Offline / Send Message
    eld polycounter lvl 18
    edges.jpg

    This is in the 2.0 realtime view in Max, with a flat normal map on it.

    1. Max bakes with the scanline renderer, and from its point of view the mesh looks perfect, like the one on the right.

    2. The left is rendered with the results (and agony) of this thread, while the SM3 shader would render it like the one on the right, regardless of how the triangulation looks.

    3. The Max normal map output assumes that the shading is perfect, like it is on the right, since that's how the software renderer will make it look, just like the improved SM3 shader will make it look in realtime.

    4. Maya will render the normal map KNOWING that the shading is fucked up from the start, and will not assume that the mesh is all neatly clean.



    So what I've got from this thread is: most games will render the lowpoly and its normals in some way that is not so pretty. Maya knows about the problem and will render its normal map to compensate for it. Max doesn't care and will render a normal map with the assumption that the lowpoly has perfect software shading, so when we bake in Max we have to make sure that the triangulation is correct to make up for the artifacts of the lowpoly shading.

    Now read through this a couple of times so I'm not misunderstood as not understanding the thread :)
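    To make that summary concrete, here's a hedged sketch of what a tangent-space baker conceptually stores per texel, and what a viewer reconstructs from it (illustrative only, not Max's or Maya's actual code; the whole argument is about which tangent frame each app plugs in here):

    ```python
    import numpy as np

    def bake_texel(n_high, tbn_baker):
        """Express the high-poly normal in the tangent frame the baker assumes."""
        # tbn columns = tangent, bitangent, interpolated low-poly vertex normal
        n_ts = np.linalg.inv(tbn_baker) @ n_high
        return n_ts / np.linalg.norm(n_ts)

    def shade_texel(n_ts, tbn_viewer):
        """What the viewport/engine reconstructs from that stored texel."""
        n = tbn_viewer @ n_ts
        return n / np.linalg.norm(n)

    # If tbn_baker == tbn_viewer, shade_texel() returns the high-poly normal and the
    # lowpoly's own shading quirks cancel out; if they differ, the mismatch shows up
    # as exactly the kind of artifacts this thread is about.
    ```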
  • MoP
    Offline / Send Message
    MoP polycounter lvl 18
    eld wrote: »
    So what I've got from this thread is: most games will render the lowpoly and its normals in some way that is not so pretty. Maya knows about the problem and will render its normal map to compensate for it. Max doesn't care and will render a normal map with the assumption that the lowpoly has perfect software shading, so when we bake in Max we have to make sure that the triangulation is correct to make up for the artifacts of the lowpoly shading.

    Yep, I think this is a good and correct summary.

    Maya simply seems to acknowledge that the "cheap" way of calculating tangents works fine, and uses that across the board, meaning the results are consistent.
    Whereas Max has two different methods, and uses the "cheap" one in the viewport unless that SM 3.0 method is enabled, at which point it uses the "more expensive" one.

    Which invites the question - why is the more expensive (and presumably more accurate?) method used at all, when the cheap method appears to work just fine?
  • Neox
    Online / Send Message
    Neox veteran polycounter
    Why does someone use Mental Ray or Maxwell or V-Ray or any other offline renderer and not a realtime renderer? It's always quality and accuracy. I personally hate offline rendering, it takes ages and I'm sooooo impatient when it comes to this, but well, that's how it is.
  • eld
    Offline / Send Message
    eld polycounter lvl 18
    MoP wrote: »
    Which invites the question - why is the more expensive (and presumably more accurate?) method used at all, when the cheap method appears to work just fine?

    Yeah, but Maya just fixes the symptoms, not the problem :)

    There are many cases where I could have a cylinder shape, let's say a pipe, and just use a flat normal map to build the end-result normal map, because I'd know I had a perfectly cylindrical surface, as long as I fixed the triangulation to make up for how the engine handles it.

    But if I didn't care about the triangulation and let Maya bake a cylinder highpoly to a normal map, then I wouldn't be able to move the mesh around, or treat the map as if it could just as well be flat.

    I know many environmental artists work on surfaces without ever baking from a highpoly, and they'll use the same textures for a flat wall or a curved corner wall; in that case, the non-baked normal map will not fix any problems with the lowpoly.


    PS: if this thread ends with a happy conclusion, explanation, party, whatever, I will die a happy man.
  • OBlastradiusO
    Offline / Send Message
    OBlastradiusO polycounter lvl 11
    Autodesk needs to overhaul most of Max's modifiers and their normals process. One of the reasons I left Max for Maya and Modo was their shitty updates and outdated tools.
  • Fang
    Offline / Send Message
    Fang polycounter lvl 7
    What I've gathered from this thread so far is: I'm using xNormal from now on :)

    I've had these weird normal map issues a lot; it might be my fault half the time, but I bet it's Max's just as often.

    I just thought of a new term: Normalmapoholic
  • MoP
    Offline / Send Message
    MoP polycounter lvl 18
    eld wrote: »
    Yeah, but Maya just fixes the symptoms, not the problem :)

    There are many cases where I could have a cylinder shape, let's say a pipe, and just use a flat normal map to build the end-result normal map, because I'd know I had a perfectly cylindrical surface, as long as I fixed the triangulation to make up for how the engine handles it.

    But if I didn't care about the triangulation and let Maya bake a cylinder highpoly to a normal map, then I wouldn't be able to move the mesh around, or treat the map as if it could just as well be flat.

    I know many environmental artists work on surfaces without ever baking from a highpoly, and they'll use the same textures for a flat wall or a curved corner wall; in that case, the non-baked normal map will not fix any problems with the lowpoly.

    Hmm, this is a good point. So basically, to fix the "problem", everything should really use the high quality tangent derivation that Max's renderer uses, I guess?
  • odium
    Offline / Send Message
    odium polycounter lvl 18
    I'm watching this thread with a twitching weener in the hope it can turn into a full blown wooden apocalypse. This issue has hurt our dev process for far too long, and in the end the only way we knew how to fix it was chamfering or extruding with extra edges which is not a good way at all. Not if it can be helped. Party on Wayne.
  • Blaizer
    Offline / Send Message
    Blaizer interpolator
    No doubt Maya and xNormal do a slightly better job, but as eld said, it does not fix the problem.

    I'm stuck with the Max scanline renderer for generating normal maps (supersampler on, Adaptive Halton); the results I obtain are good enough that I'm only fighting nitpicks. My conclusion, after losing hours and hours like an idiot, is that it can't be helped. All games with normal maps have issues with them, artifacts; it seems like something inherent to this technique.

    From experience I've learned that the best solution so far is to have a few more polygons and that's all: a bevel or some polygon cuts. Then you won't care whether the normal map is from Max, Maya, or xNormal. If you see a well-shaded model in the viewport, the normals will be nice. The cost will be more tris, but I think that's not a problem with current hardware; shaders are the most expensive thing to render.

    For some reason, with object-space normal maps, the results are far better for me.

    My two cents
  • CrazyButcher
    Offline / Send Message
    CrazyButcher polycounter lvl 18
    I did the "local space" to "custom tangent space" conversion, for the reason that local space will be the same for all apps (unless UV-sampling interpolants are also different in scanline). Then you can transform the normal into your custom space... ideally you would do this with high-precision local normals... but anyway, from the past: http://boards.polycount.net/showthread.php?p=662442
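    Roughly, that conversion looks like this (a minimal sketch, assuming you can sample the local/object-space map and know the target engine's per-texel tangent frame; the names are mine, not from the linked thread):

    ```python
    import numpy as np

    def to_custom_tangent_space(n_local, tangent, bitangent, normal):
        """Re-express a local/object-space normal in a given (T, B, N) frame."""
        tbn = np.column_stack([tangent, bitangent, normal])  # frame axes as columns
        n_ts = tbn.T @ n_local        # transpose == inverse only if the frame is orthonormal
        n_ts /= np.linalg.norm(n_ts)
        return n_ts * 0.5 + 0.5       # pack back into the 0..1 range of a texture
    ```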
  • Sage
    Offline / Send Message
    Sage polycounter lvl 19
    I'm not sure, but when I was working in Max learning to do tangent-based normal maps, what I eventually noticed was that when Max bakes normal maps it takes the shading of the lowpoly into account. As you know, the reason to have nice even quads is to get good lighting because of vertex lighting. To me this is utter crap. That's why you made the high poly and made sure it was nice and neat, so the freaking 3D renderer would steal that information, and when this got applied to the low poly it wouldn't care about the low poly normals but would use the information of this glorified map. However, this is not what happens. I think the problem is the normal map itself, as in it sucks, and the fact that the Max viewport doesn't display things the way the render does is a separate issue. Both issues suck though.

    Max scanline renders should be better than what the viewport displays, because the viewport needs to be fast; that's common sense. However, the viewport should display things the way the scanline renders them out. That is also common sense.

    I think Maya renders shit out better because it doesn't use smoothing groups like Max does, and setting things up in Maya is a lot easier for the user since the software automates this process.

    Back to the normal map issue. If the low poly lighting doesn't look smooth in the viewport, then when you bake your normals that information gets mixed in with the highpoly. The question is why? I'm questioning how the baker was programmed to get this information. This doesn't happen with a bump map on a low poly; it just uses the shade of gray to tell the engine where the bumps are. Why doesn't the normal map do the same? It has the colors that indicate the direction of a surface. I'm wondering why this can't be improved.

    For baking I just make sure my low poly cage has smooth lighting in the viewport in whatever 3D app I'm using, with one smoothing group (edges set to soft), and then I duplicate that and optimize it, taking out the edges that make it smooth well. This seems to work fine. I have been told it's not a good way to do it, but it's the only way I've found to keep the low poly lighting from messing up the normal map.
  • eld
    Offline / Send Message
    eld polycounter lvl 18
    The tangent-space normal map works off the tangent of a surface, meaning it relies on the information of the surface it is on.

    The object-space normal map will just use the pixels as the normal direction, meaning that if you moved the polygon around it would still shade in exactly the same way; this wouldn't be possible to animate.

    Animations are the reason we have tangent-space normal maps.
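    As a rough illustration of the difference (made-up texel and frame values, not any app's actual decode code):

    ```python
    import numpy as np

    texel = np.array([0.55, 0.45, 0.95])        # raw 0..1 RGB sampled from the map

    # Object space: the colour *is* the normal; just unpack it.
    n_object = texel * 2.0 - 1.0
    n_object /= np.linalg.norm(n_object)

    # Tangent space: unpack, then rotate by the surface's per-pixel frame, so the
    # interpolated tangent/bitangent/normal of the lowpoly directly shape the result.
    t = np.array([1.0, 0.0, 0.0])
    b = np.array([0.0, 1.0, 0.0])
    n = np.array([0.0, 0.0, 1.0])               # interpolated vertex normal
    n_tangent = np.column_stack([t, b, n]) @ (texel * 2.0 - 1.0)
    n_tangent /= np.linalg.norm(n_tangent)
    ```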
  • odium
    Offline / Send Message
    odium polycounter lvl 18
    OK guys, just to chime in here... we have done a few tests with this. First of all, let me know if this is correct as a low poly in terms of creation. I made sure it was GOING to have some triangulation issues, hence why it does, simply to test this out. Now maybe these are too harsh to fix and would need an extrude/chamfer, not sure, but check these out.

    First of all, here is the low poly:

    triangulation_1.jpg

    And here is the high:

    triangulation_2.jpg

    These are the files:

    http://www.teamblurgames.com/odium/triangulation_models.zip

    NOTE: These models were just created for testing, so use them how you wish guys, they won't be used for anything.

    Right, first of all, I did a Doom 3 test. This is the result:

    normals_doom3.png

    Secondly, I did an xNormal test. This is the result:

    normals_xnormal.png

    And finally I did a Max test, and this was the result:

    normals_max.png

    As you can see, the same issues are present in every app tested, and I've tested each map in Doom 3, OverDose, xNormal and Crazy Bump, each still obviously displaying the problem. In this situation with that harsh 90 degree angle, would that need a chamfer? Because in my experience I've seen models get around that without cutting up the UVs at all or using more smoothing groups.

    UPDATE:

    This is what each one looks like when applied to the mesh. I won't sit here and take pics of each one in every engine, so just take my word for it that each one looks exactly the same in whatever I try:

    triangulation_3.jpg
  • CrazyButcher
    Offline / Send Message
    CrazyButcher polycounter lvl 18
    eld wrote: »
    Animations are the reason we have tangent-space normal maps.

    That's not true, you can have animated local-space normal maps just fine, even organic deformations (BF2 used this). I think this myth will prevail though, no matter how often people state the opposite. Think about it: if your lighting happens in world space, then you must turn your tangent-space normal into world space to do the lighting, i.e. you must compensate for animation as well. The same transformation would happen with a local-space normal.... If animation were a problem with normal maps, we could not even have vertex lighting on animated stuff, as vertex normals are also in "local space".
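    A small sketch of that point, with a made-up bone rotation (illustrative only): whichever space the map is authored in, the animated transform gets applied before lighting, just like it already is for vertex normals.

    ```python
    import numpy as np

    def rotation_y(deg):
        a = np.radians(deg)
        return np.array([[ np.cos(a), 0.0, np.sin(a)],
                         [ 0.0,       1.0, 0.0      ],
                         [-np.sin(a), 0.0, np.cos(a)]])

    bone = rotation_y(30.0)                     # animated local -> world transform

    vertex_normal = np.array([0.0, 0.0, 1.0])   # stored in local space like any mesh normal
    map_normal    = np.array([0.2, 0.1, 0.97])  # unpacked texel from a local-space map

    lit_vertex = bone @ vertex_normal           # what vertex lighting already does
    lit_texel  = bone @ map_normal              # the local-space map needs the same transform, nothing more
    ```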

    @odium, it's pointless comparing normal-maps like that, when you cannot be sure that the tangent-space is the same for all. They all must look "weird" with the smooth vertex normals causing strong interpolation changes.
  • OBlastradiusO
    Offline / Send Message
    OBlastradiusO polycounter lvl 11
    Is it safe to say that when dealing with more detailed hard-edged objects you should bevel the edges for correct normals? Is it also safe to say that when dealing with less detailed hard-edged objects you should split the edges?
  • odium
    Offline / Send Message
    odium polycounter lvl 18
    @odium, it's pointless comparing normal-maps like that, when you cannot be sure that the tangent-space is the same for all. They all must look "weird" with the smooth vertex normals causing strong interpolation changes.

    If you read my post at the bottom, you will see I tested them out in engines too. I have very good eyes for normal differences, working with them so bloody much :p But yeah I can edit the post to show the results in real time if it makes things easier for everybody?
  • pior
    Online / Send Message
    pior grand marshal polycounter
    Odium (and all), correct me if I am wrong but just 'looking at the colors' of the baked normals will never take us very far.

    It would be fine if we were just dealing with bump or displacement, for instance. But here it's a much more intertwined issue. To represent the faked highpoly we fetch and play with data from: the lowpoly's vertices, their normals, their tangents (essentially the UV 'combing', so to speak, also editable), and the triangulation guesses (which don't apply in your case since you triangulated everything). And then you have shaders able to respond to strong changes in tangent orientation in order to detect mirrored UVs and mirror the normal map 'shading' accordingly. Some shaders are fine with stuff being offset by 1 unit in UV space, some are not. And more! Hence, no standard.

    The fact that you feed similar or identical maps to D3, Xnormal and the Max viewport, does not mean they will be displayed the same in those three environments...

    Also, regarding D3 renderbump: try fiddling with the command-line parameters. If you find the one very specific parameter they used for Doom 3, you will notice VERY different colors in your maps: more yellows/oranges at the sharp turns in the lowpoly, IIRC. From old memories, those were the cleanest normal maps I ever dealt with. I guess xNormal doesn't count since the viewport is not exactly realtime? (It recalculates stuff on launch.)
  • eld
    Offline / Send Message
    eld polycounter lvl 18
    That's not true, you can have animated local-space normal maps just fine, even organic deformations (BF2 used this). I think this myth will prevail though, no matter how often people state the opposite. Think about it: if your lighting happens in world space, then you must turn your tangent-space normal into world space to do the lighting, i.e. you must compensate for animation as well. The same transformation would happen with a local-space normal.... If animation were a problem with normal maps, we could not even have vertex lighting on animated stuff, as vertex normals are also in "local space".

    @odium, it's pointless comparing normal-maps like that, when you cannot be sure that the tangent-space is the same for all. They all must look "weird" with the smooth vertex normals causing strong interpolation changes.

    It's quite possible I'm all wrong, but I thought an object-space normal map meant discarding any lowpoly normals, since the normal map itself would decide the normal of the pixel without using the underlying surface at all; so if you deformed it, how would it know how to interpolate on the surface?

    And then why do we use the clearly inferior tangent-space normal map over object space?

    I thought the conversion for an object-space normal map was influenced directly by the object's rotation in the world, while tangent space was influenced by that plus the angle of the normals of the surface it is on.
  • odium
    Offline / Send Message
    odium polycounter lvl 18
    I really can't believe you guys can't read normal maps :p But either way, point taken, I've updated the post with screenshots of it in realtime for ya.
  • pior
    Online / Send Message
    pior grand marshal polycounter
    Eld, nope, object-space maps are completely usable on rigged/deformed meshes, and while transformed in the world as well. It's just a matter of looking at the correct reference and working from there. It's less straightforward than tangent (from a thinking-process point of view) but in code it's not heavier. Some programmers just have a better handle on those things than others.

    My guess is that since they don't allow much creative UV use (stretching the texture over a random shape would never work with OS), studios generally go for the unified TS route. But yeah on problematic objects like the ones we are talking about here (triangulation issues and whatnot), OS would look absolutely gorgeous and error-free, easily.
  • MoP
    Offline / Send Message
    MoP polycounter lvl 18
    odium: I can read the differences between flat normal maps perfectly fine, and it's quite clear to me that there's a big difference between the normals in the XNormal bake and the Max/renderbump bakes. The most obvious part is on the left side, where Max and renderbump have given a smoother gradient across 2 triangles, while XNormal has made the gradient sharper across the triangle edge.
    This is no substitute for viewing it on the model though; you can't just look at a flat normal map for specific geometry and say "this is wrong" unless it's intensely wrong, in the sense that your green channel has been swapped with the blue, or there are backward-facing normals etc. You have to see it in the environment it's intended to be displayed in.

    Mainly the problem there is that you've used a mesh with long triangles going over 90-degree corners, which is always going to look worse. It might be interesting to see how Maya handles a bake like that, though.

    Edit: Per, I dunno, I mean XNormal can do the same as Maya (and so can ZBrush actually - oddly enough, ZBrush has more options than any other baker I've seen; it can emulate the tangent-space calculations used in any other app, to the extent that you can actually simulate baking in Max or Maya just using ZMapper!). Basically, when you bake and preview in the same app they all tend to look "correct"; it's just this Max setup, where the baking calculations are different from the default viewport calculations, that causes the problem.
  • CrazyButcher
    Offline / Send Message
    CrazyButcher polycounter lvl 18
    @eld: some more discussion around the topic (local vs tangent) was in this thread: http://boards.polycount.net/showthread.php?t=60694. If you want, I can walk you (or anyone else) through the math involved.
  • Xoliul
    Offline / Send Message
    Xoliul polycounter lvl 14
    pior wrote: »
    Eld, nope, object-space maps are completely usable on rigged/deformed meshes, and while transformed in the world as well. It's just a matter of looking at the correct reference and working from there. It's less straightforward than tangent (from a thinking-process point of view) but in code it's not heavier. Some programmers just have a better handle on those things than others.

    My guess is that since they don't allow much creative UV use (stretching the texture over a random shape would never work with OS), studios generally go for the unified TS route. But yeah on problematic objects like the ones we are talking about here (triangulation issues and whatnot), OS would look absolutely gorgeous and error-free, easily.

    Also, I never managed to overlay extra tangent-space detail maps over baked object-space normal maps... A bit of a bummer if you can't edit the normal maps afterwards.
  • eld
    Offline / Send Message
    eld polycounter lvl 18
    Thanks for explaining the downsides of it, even if, contrary to what I believed, it supports animation.

    Editing normal maps and playing around with UVs and optimizations is a big, important thing for me with normal maps, so I can see why tangent space is used more as the standard.
  • Eric Chadwick
    Great thread! Glad I'm not using tangent normalmaps... :)
  • [HP]
    Offline / Send Message
    [HP] polycounter lvl 13
    Great thread! Glad I'm not using tangent normalmaps... :)

    Well, I guess you're one of the few lucky bastards.

    Also guys, don't forget the UVs also play a very important role in all this.
    I saved this pic a few months ago, by zackf! :)

    353avrd.jpg
  • DarthNater
    Offline / Send Message
    DarthNater polycounter lvl 10
    HP, that's a good point. UVs play a huge role in proper normal baking and I think most people overlook that. Sometimes it seems like people think that just because their high and low poly meshes look stellar, it's going to end up looking amazing. Proper UV work, I'd say, is probably 70% of the battle.
  • odium
    Offline / Send Message
    odium polycounter lvl 18
    Trouble is, sometimes you don't want to, or simply can't, put a UV break in certain places. I'm one of those people who likes continuous UVs, because then your specular falloff on the edge looks much nicer and the whole piece blends together a lot better too, but then I'm mostly using this stuff in a PPL area, so those kinds of things show up worse. In that case I would simply use a chamfered edge, and it would look a lot nicer, but not perfect.

    Swings and roundabouts I guess...
  • Mark Dygert
    Great thread guys! Awesome info.

    I think this sort of starts to fall into a pipeline issue.

    Maybe they figure the viewports are good enough for most of the major tweaks and you should be checking it in-engine for the last 10%?

    Or maybe they assume (like MLichy pointed out) that you should streamline the export/import process and make it painless for artists to see the models in the native environment, which would fall on the internal dev team?

    It would be awesome if Max/Maya had access to simple external 3D viewers for a few of the major engines, like an Export+ feature: export this model, place it in a scene and launch a simple viewer. Maybe have some kind of SDK for plugging in your own engine/viewer or updating existing viewers. Something quick and light where you can set the engine and choose a scene file (or load a default). That way they're not having to force Max or Maya to work in a slow, buggy way that is counter to how they're written, or rewrite them for each specific case.
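    Purely as a hypothetical sketch of that Export+ idea (the viewer executable, scene file and exported mesh below are placeholders, not any real engine's API): export the asset from the DCC app however you normally do, then a tiny helper drops it next to a template scene and launches the external viewer.

    ```python
    import shutil
    import subprocess
    from pathlib import Path

    def export_plus(exported_mesh: Path, viewer_exe: Path, template_scene: Path) -> None:
        """Copy a freshly exported mesh beside a viewer scene and launch the viewer."""
        staging = template_scene.parent / exported_mesh.name
        shutil.copy2(exported_mesh, staging)                       # place the model next to the scene
        subprocess.Popen([str(viewer_exe), str(template_scene)])   # fire-and-forget launch

    # Hypothetical usage:
    # export_plus(Path("tank_turret.fbx"), Path("EngineViewer.exe"), Path("preview_scene.lvl"))
    ```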

    I agree a set standard would be awesome and probably would have cleared up this mess ages ago, but I think there are subtle reasons each handles things a little differently?

    That way you're not stuck loading the editor, updating the asset, then loading the entire game/level (even though that would be ideal if it were automated) just to see it.

    Updates to the viewports should be external and not left up to Autodesk.
  • EarthQuake
    People keep saying that this is all a matter of triangulation in Max; I think that is a load of shit. You can have an entirely triangulated mesh in Max and still get bogus errors you wouldn't get in Maya. On the other hand, you can have a mesh comprised half of 5-sided faces in Maya and still get a very accurate result.

    As far as Maya just fixing the symptoms, I don't really agree. You see, the problem is that Max's renderer and realtime display do not match up; that is the problem. Maya fixes this *problem* by doing it correctly (or at the very least much better than any other app). Maya may not be the perfect solution to the problem, as of course there are other factors, but using Maya to bake, previewing in Maya, and having your engine synced up to Maya is clearly the best solution to the problem I've ever seen. Max falls apart in this regard even if you have your engine synced up to it, simply because it does not accurately preview your results.

    And yeah, you can get decent results in any application; I don't think that is really what the discussion is about here. We're trying to discuss how to get the BEST results, not something that is merely "good enough", as at least to me personally, spending 2-3 days on a high-res mesh only to have the lighting break down into shit in the end is not acceptable in any case. I think a good portion of the people who say "I just make it good enough" work mainly on characters, and because of the "soft" and organic nature of characters, these errors will always be less apparent than on precise, hard-surface models.



    I made some examples for Jeffdr at 8ml to help him improve the tangents in Marmoset, as it doesn't really match up to Maya, nor to .mesh from XN, nor .obj from XN, even though he's grabbing the tangent info directly from the .mesh format which is used to display in-game. I didn't even bother testing Max in this setup, because it always produced the worst results in similar tests. Here is basically a copy-paste of what I sent him:

    nmtest01:
    1. high
    2. standard shading
    3. "high quality" shading
    4. hq textured
    5. wires

    This is pretty awful geometry in general, but the smoothing errors in Maya are very slight, forgivable in pretty much every instance here.

    nmtest02:

    Interesting "smoothing errors" related to image resolution: as you drop the resolution, there are fewer pixels to accurately account for the gradations in smoothing, so you see artifacts. Cool stuff.

    nmtest03:
    OBJ, rendered in XN, and displayed in XN


    nmtest04:
    MESH, rendered in XN, and displayed in MESH
    Both XN examples look significantly worse than Maya, but likely much better than Max (not even going to include Max examples because it sucks so hard). Interestingly, where extra supporting geometry has been added (beveled edges) the errors seem to be worse, likely due to having those small thin triangles and maybe resolution issues or something? Generally these sorts of beveled edges should HELP, but they seem to hurt here. In Maya you can see that, because it is done correctly, the beveled edges on the square shapes in the top image produce a better result than the simpler geometry in the bottom image.


    nmtest05:
    All 3 test NMs displayed in Marmoset with the same settings. You can see all are wrong in different ways! None are really much better than the others; the Maya one may actually be the best here, which is a little surprising, I assumed it would be the .mesh.

    Just one more note on the resolution thing there, and WHY it is interesting: even if your renderer is doing everything correctly, you will still need to use hard edges on very complex assets, as oftentimes you won't have the resolution to properly deal with the angle change.
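    A tiny sketch to put rough numbers on that (my own toy example, not from these tests): spread an even 90-degree normal sweep across fewer and fewer texels and the per-texel jump gets big enough to read as steps.

    ```python
    def max_step_degrees(sweep_degrees: float, texels: int) -> float:
        """Largest angle change between neighbouring texels for an even sweep."""
        return sweep_degrees / max(texels - 1, 1)

    for texels in (64, 16, 4):
        print(texels, "texels ->", round(max_step_degrees(90.0, texels), 1), "deg per texel")
    ```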

    MonkeyScience (Andres): it looks like on the last set the lighting gets darkened on triangle edges in the OBJ and MESH examples. Most notably you'll see it in the highlight on the left-most nub, and also the more pronounced dark diagonals in the stair-steps. How come that's different?

    nmtest06, 07:
    me: In these areas where you've got small thin triangles and major direction changes, there's like..... less room for error, perhaps? Because you actually have a pretty large range in such a small area. I'm not sure if that makes sense, but it's a theory? The Maya example visually looks similar in the texture itself, but because the calculations are done correctly there, it isn't an issue. You see those sorts of areas break down the worst as they are downsized, however, because you're lacking the resolution to show small angle changes.

    Here's a rar of crap to play with. You can load the .ma in Maya, hit 6 for textured mode, and in the viewport click Renderer -> High Quality. You may also need to right-click on the mesh, go to material attributes and make sure the normal map has the correct path. You can load the other maps here too. You'll notice that some of the different smoothing errors you get in Toolbag are from triangulation (either by XN in the obj, or Maya for the stooge format, though these should be the same triangulation); you can tell it's a triangulation issue when there is a "cross"-shaped smoothing error, i.e. a gradation running against the edge.

    http://dl.dropbox.com/u/499159/nmtest.rar

    ^ source files, so anyone can test the same asset in Max, triangulate it, etc. I would be very curious to see if anyone can get results anywhere near as good as Maya on a "worst case scenario" mesh like this. WITHOUT USING HARD EDGES. =)


    01
    nmtest01.jpg
    02
    nmtest02.jpg
    03
    nmtest03.jpg
    04
    nmtest04.jpg
    05
    nmtest05.jpg
    06
    nmtest06.jpg
    07
    nmtest07.jpg
  • ivars
    Offline / Send Message
    ivars polycounter lvl 15
    MoP wrote: »
    It might be interesting to see how Maya handles a bake like that, though.

    Rendered at 512 resolution in Maya and viewed in the Maya viewport:

    normalv.jpg

    Viewing it in our current engine, shading errors become quite visible.
  • Xoliul
    Offline / Send Message
    Xoliul polycounter lvl 14
    The solution is really simple though: Autodesk just needs to add a dropdown to select the tangent basis calculator. Simply because the "correct" type of normals will look incorrect when rendered with scanline, and there are still people who will want to do that.

    But in any case, I'd jump to Xnormal completely if it had decent supersampling (upscaled rendering isn't quite as good as Supersampling since it blurs when downsizing).
  • OBlastradiusO
    Offline / Send Message
    OBlastradiusO polycounter lvl 11
    Is it safe to say that when dealing with more detailed hard-edged objects you should bevel the edges for correct normals? Is it also safe to say that when dealing with less detailed hard-edged objects you should split the edges?

    Is that a sensible way to go?
  • eld
    Offline / Send Message
    eld polycounter lvl 18
    ivars wrote: »
    ...Viewing it in our current engine, shading errors become quite visible.

    and this is why I'm concerned :)

    it seems like everything _but_ maya does it wrong.

    edit: or is it the other way around?
  • ironbearxl
    Offline / Send Message
    ironbearxl polycounter lvl 18
    Xoliul, have you tried enabling 4x AA before baking?
  • MoP
    Offline / Send Message
    MoP polycounter lvl 18
    eld wrote: »
    and this is why I'm concerned :)

    it seems like everything _but_ maya does it wrong.

    edit: or is it the other way around?

    Well, as I think CrazyButcher and others have mentioned, it's not "wrong", it's just different.
    It only becomes wrong if the tangent basis you're baking with is different to the tangent basis it will be viewed with (which is the case with Max, therefore "wrong" between baking and viewing in viewport).

    It will never be "wrong" as long as it's consistent between the things you're using it for. Obviously, if your game engine uses a different tangent basis to Maya's baker/viewport, then your Maya normal maps, which looked perfect in Maya, will of course look wrong in your engine. The solution there is to make sure your engine calculates the tangent basis in the same way as Maya.
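    To make that "same basis in and out" point concrete, here's a small illustrative sketch (made-up vectors, not any app's real tangent code): bake against one basis, display with a slightly rotated one, and measure the resulting shading error.

    ```python
    import numpy as np

    def frame(tangent_hint, normal_hint):
        """Build an orthonormal T, B, N frame (as matrix columns) from rough inputs."""
        n = np.asarray(normal_hint, float); n /= np.linalg.norm(n)
        t = np.asarray(tangent_hint, float)
        t -= t.dot(n) * n
        t /= np.linalg.norm(t)                      # Gram-Schmidt against the normal
        return np.column_stack([t, np.cross(n, t), n])

    n_high = np.array([0.3, 0.1, 0.95]); n_high /= np.linalg.norm(n_high)
    vertex_normal = [0.0, 0.2, 0.98]

    tbn_bake   = frame([1.0, 0.0, 0.0],  vertex_normal)   # basis the baker assumed
    tbn_engine = frame([0.97, 0.24, 0.0], vertex_normal)  # engine tangent, rotated a bit

    stored = tbn_bake.T @ n_high      # what lands in the tangent-space map
    shaded = tbn_engine @ stored      # what the engine reconstructs at display time
    error  = np.degrees(np.arccos(np.clip(shaded @ n_high, -1.0, 1.0)))
    print(f"shading error from the mismatched basis: {error:.1f} degrees")
    ```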
  • Joao Sapiro
    Offline / Send Message
    Joao Sapiro sublime tool
    An image showing the same thing as Joe's, but in Max and with the addition of a Max-generated normal map.

    nmbakedd.jpg

    So yeah, so far I barely have to add any supporting edges for shading when dealing with a Maya bake, and when I do, I just use the UV splits as hard edges (not adding vertex count since there's already a split there), and it's almost like an object-space bake in terms of "cleanness".

    In Max, to bake something like those stairs I would have to chamfer a lot / play a lot with hard edges and UV splits (making the UV map less optimal and causing problems for the texturing process in the long run), and cast a voodoo spell.

    Edit: replaced the image with one where the text isn't cropped kekeke...
  • EarthQuake
    Yes, that pretty much sums it up; Maya bakes are nearly as clean as Max's scanline renders. Anyone need more proof that Max's workflow is broken, and Maya's is *by far* superior?
  • OBlastradiusO
    Offline / Send Message
    OBlastradiusO polycounter lvl 11
    EarthQuake wrote: »
    Yes, that pretty much sums it up; Maya bakes are nearly as clean as Max's scanline renders. Anyone need more proof that Max's workflow is broken, and Maya's is *by far* superior?

    What about Modo's normal renders compared to Maya's? Also, can you answer this question:

    Is it safe to say that when dealing with more detailed hard-edged objects you should bevel the edges for correct normals? Is it also safe to say that when dealing with less detailed hard-edged objects you should split the edges?
  • EarthQuake
    Sorry, I've never rendered anything in Modo. I assume that, like most apps, Modo does things just a little bit differently, so you're in the same boat as far as needing to sync your engine up with the correct tangent calculations. Bitmap has done some baking in Modo recently; he may be able to tell you.

    As far as hard edges versus beveled edges, there are no overall rules for these things. You should be able to get visually nearly the same results using a hard edge, provided that A. you have a UV split there, and B. you're using a cage system that doesn't split the edges when rendering normals (in Max this means using a cage, not an offset, and in XN this means editing your cage to make sure the verts are welded).

    A case where you may want to use hard edges: at the end caps of objects, where you will naturally have a UV seam anyway. You may want bevels in areas close to view, and where you don't want a UV split but need to help the shading. Detail doesn't really have much to do with it, and it's decided case by case; even within one mesh you will likely want to do both in some areas.
  • pior
    Online / Send Message
    pior grand marshal polycounter
    The irony being that Maya basically sucks when it comes to highpoly modeling hehe :P
    DERAIL!!
  • EarthQuake
    Sadly yes
    Maya got a thousand problems but a bake aint one
  • Joao Sapiro
    Offline / Send Message
    Joao Sapiro sublime tool
    OK, someone on MSN said "oh, but by using hard edges in 3ds Max you get a great bake."
    I did what I do in Maya (setting up hard edges on UV splits; basically I do the hard edges while doing the UV layout :) ) and the same process in Max:

    nmbakedd2.jpg
  • EarthQuake
    This mesh isn't a good example for using hard edges, but yes, in general, even using hard edges in Maya vs Max, you'll still get better overall results in Maya. I posted an image earlier in the thread showing an example in Max with hard edges etc. that still had obvious errors.

    However, you can clearly see bad shading, seams etc. on that beveled cylinder which, for all intents and purposes, shouldn't produce any errors.
  • eld
    Offline / Send Message
    eld polycounter lvl 18
    MoP wrote: »
    Well, as I think CrazyButcher and others have mentioned, it's not "wrong", it's just different.
    It only becomes wrong if the tangent basis you're baking with is different to the tangent basis it will be viewed with (which is the case with Max, therefore "wrong" between baking and viewing in viewport).

    It will never be "wrong" as long as it's consistent between the things you're using it for. Obviously, if your game engine uses a different tangent basis to Maya's baker/viewport, then your Maya normal maps, which looked perfect in Maya, will of course look wrong in your engine. The solution there is to make sure your engine calculates the tangent basis in the same way as Maya.

    It was wishful thinking that there would be a perfect choice in this world! The funny part in all this is that Maya and Max are both owned by the same company.
  • sprunghunt
    Offline / Send Message
    sprunghunt polycounter
    Vig wrote: »

    It would be awesome if Max/Maya had access to simple external 3D viewers for a few of the major engines, like an Export+ feature, export this model, place it in a scene and launch a simple viewer..

    I've seen a few engines that have this feature. Gamebryo is one I remember having a viewer you could quickly launch from inside max.

    For UE3 I just have the editor running in the background and do a quick reimport.
  • Pedro Amorim
    OK! Enough yapping and more tutorials!

    How do I bake shit in Maya? Cause I have no idea how to use Maya!
    Can't even make a cube!
  • MoP
    Offline / Send Message
    MoP polycounter lvl 18
    bitmap: baking in maya - http://en.9jcg.com/comm_pages/blog_content-art-51.htm
    making a cube: shift+right click and hold in the viewport, you'll get a radial menu with a choice of primitives.