
Why does UDK suck so bad at NMs


Replies

  • Xoliul polycounter lvl 14
    cptSwing wrote: »
    D'oh. Is part of the problem their UV channel 0 lookups for the specular lightmap calculation? So annoying.

    It's just part of the grand scheme of getting all that complex stuff to render at decent performance on current-gen technology. The UV channel specular thing isn't necessarily related. Normally you never notice this stuff in any engine, and whoever wrote it probably didn't notice either, since no one was aware of these issues back then (probably around 2005-2006).

    Ace-Angel: no, there is no way to properly sync. It is completely impossible, since it's something along the lines of the tangent accuracy being dropped after the normal map has been applied. The best you can do is build your meshes and UVs to avoid the problem. That's what we do.
  • EarthQuake
    Ace-Angel wrote: »
    ^
    Wait, hold on, I thought if you used the old broken baker from Max (2008, 2009, etc) you could get the correct tangents for UDK to render properly?

    Unless I'm mistaken from what I've read around the net, that sounds like an issue where no one wants to use that 'broken' math anymore and simply doesn't want to implement it in its own bakers, which is kinda like saying "Oh, we have an old 'broken' solution for a 'broken' issue, but we won't implement it to fix the issue because we only like clean, working stuff".

    See the paradox?

    Max's RTT normal map baking hasn't really changed at all since it was introduced, as far as tangent space etc. is concerned. The "baker" has never really been broken; it's the RT shaders, and specifically how the tangents are stored/exported from Max. The 3Point Shader, for instance, simply saves the correct tangent information in an extra vertex channel to sync up the rendered result with the baked result.

    Unreal/UDK has never had synced tangents with Max; any documentation you find that says otherwise is simply incorrect.

    Max's normal map baker isn't (and wasn't in the past) inherently broken; it's simply a matter of miscommunication between the baker and the end result (viewport shader, game engine, etc.).
  • EarthQuake
    [HP] wrote: »
    AFAIK this problem is currently being looked at internally; as soon as we get it and it's tested properly, the plugin will be released along with the FreeSDK.
    So far, Crytek's art teams have been struggling with the same problems as everyone else. Hopefully this pain will stop for everyone using CE3, and we'll have no more tangent space shading discrepancies between the baker and the in-game renderer.

    Just wanted to say this is awesome! It seems like it has taken forever, but the issue is finally catching on at larger studios. Great to see.
  • Ashaman73 polycounter lvl 6
    Don't hack on Epic; there are technical reasons why a real-time rendering engine doesn't work like a modelling tool.

    The problem is not really the baking of the normal map, but the per-vertex tangent space matrices. Mirroring a normal map means you end up with two different tangent space matrices at the shared vertices, and, even nastier, those two matrices have different handedness. So even when you duplicate the vertex in the game engine to hold both tangent space matrices, you still need to handle the different handedness too.

    If you don't, you can't extract the normal from the tangent space matrix (in Max it is most likely stored as a separate value), and you get really nasty interpolation artifacts when interpolating across a tri whose tangent matrices have different handedness (some artifacts run through a model like a line).

    A solution requires modifying the complete pipeline, from the exporter up to the shaders.
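    For anyone who wants to see where that handedness flag comes from, here is a minimal sketch in plain Python of the standard per-triangle tangent computation. The helper names are made up for illustration, and this is not any particular engine's code; the point is that the sign of the UV-space determinant flips exactly where the UVs are mirrored:

        # Per-triangle tangent, bitangent and handedness from positions + UVs.
        # Illustrative only; a real baker/engine also smooths and splits these.

        def sub(a, b):
            return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

        def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
            """Return (tangent, bitangent, handedness) for one triangle."""
            e1, e2 = sub(p1, p0), sub(p2, p0)
            du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
            du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
            det = du1 * dv2 - du2 * dv1      # negative when the UVs are mirrored
            if det == 0.0:                   # degenerate UVs; real code would skip/handle this
                return None
            r = 1.0 / det
            tangent   = tuple(r * (dv2 * e1[i] - dv1 * e2[i]) for i in range(3))
            bitangent = tuple(r * (du1 * e2[i] - du2 * e1[i]) for i in range(3))
            handedness = 1.0 if det > 0.0 else -1.0
            return tangent, bitangent, handedness

        # On a quad mirrored down the middle, the two halves come back with opposite
        # handedness, which is why the shared seam vertices can't hold a single TBN.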
  • pior grand marshal polycounter
    Yeah, but... the 3Point guys clearly showed that it is possible, and they even offer their help with implementing their pipeline properly. And regardless: Doom 3, one of the first games to use this kind of visual tech, had the problem figured out, thanks to more accurate syncing.

    From what I have noticed, a certain amount of normal map display approximation does indeed come from conscious optimisation choices; but a very big portion of the issues comes from a lack of quality control, and from a studio settling for something that seems "good enough" on test meshes yet ends up costing hundreds of artist hours, because meshes have to be tweaked just to avoid the issue.

    This problem has been plaguing this workflow for years now... For that gain alone, modifying exporters and shaders seems very much worth it to me!
  • Ace-Angel polycounter lvl 12
    Ashaman73 wrote: »
    Don't hack on Epic; there are technical reasons why a real-time rendering engine doesn't work like a modelling tool.

    The problem is not really the baking of the normal map, but the per-vertex tangent space matrices. Mirroring a normal map means you end up with two different tangent space matrices at the shared vertices, and, even nastier, those two matrices have different handedness. So even when you duplicate the vertex in the game engine to hold both tangent space matrices, you still need to handle the different handedness too.

    If you don't, you can't extract the normal from the tangent space matrix (in Max it is most likely stored as a separate value), and you get really nasty interpolation artifacts when interpolating across a tri whose tangent matrices have different handedness (some artifacts run through a model like a line).

    A solution requires modifying the complete pipeline, from the exporter up to the shaders.

    And this is bad because...? Without criticism, how is anything going to improve? The worst part is that some people still refuse to 'improve' something even after you've told them it needs improving to save time, money and headaches.

    Are you honestly telling me that having each artist spend, on average, 15-45 minutes on each non-main-character piece to fix up a normal map issue is sane thinking?

    Sorry for what I'm about to say, but just like Max, Epic isn't exactly making AMAZING industry changes at this point, and even less so as a game company publishing AMAZING titles, last I checked.

    So the least they could do with their current talent is write a tangent baker that works and has an outside 'source' people can easily access, or give out the information Tech Artists need so they can at least 'complement' the pipeline. Having a somewhat 'bleed-synced' normal map is far better than having a completely non-synced normal map.

    It's not going to take all year, or even all month, and normal maps aren't some industry secret whose writing and interpretation need to be kept under wraps. It's all out there, it's free and easy to read and write, and open-source engines and bakers have done a better job of synchronizing this stuff in the span of a couple of weeks than multimillion-dollar companies did in 10 years.

    So yeah, not really impressed. If this keeps up, the next engine from ANY company isn't going to be worth the hassle you put into normal maps at that point, with tessellation being the new solution (which, amazingly enough, has most of its kinks worked out really well).
  • CrazyButcher polycounter lvl 18
    Ashaman73 wrote: »
    So even when you duplicate the vertex in the game engine to hold both tangent space matrices, you still need to handle the different handedness too.

    This sounds weird to me. IF you do the split along the mirror seam, each UV chart will have a common "handedness".

    Yes, there are a couple of precision issues, but you could account for them at baking time as well. For experiments I wrote a custom baker plugin for 3ds Max that takes game-typical optimizations into account (say, no per-pixel-normalized tangent space), and if I recall correctly the results didn't suck ;)

    The situation can be improved if people spend a little time on it. And I'd argue that there aren't that many tools being used for baking/exporting most game assets.

    At some point they have to create the TBN data, and I would be surprised if the recreation of the tangent space were non-deterministic ;) At least when I looked at UE3 shader code years ago, everything mesh-related ran through the same TBN code.

    The quickest way to improve this:
    * Allow the FBX importer in UDK to source tangent/binormal/normal data from custom UVW channels also stored in the FBX. That is trivial work for the importer, and it avoids changing exporters, which are in Autodesk's or other people's hands.
    * Tell us what you do with this information. I'd guess only T and N are used, and the binormal only encodes handedness. Typically B is reconstructed as the cross product of the other two, multiplied by the handedness (1 or -1), and no per-pixel normalization of the TBN matrix is done (see the sketch at the end of this post).
    * The community can then provide better data, since we can easily add UVW channels in any 3D package to encode data or mess with the baker's output.

    Unity, UDK and CryEngine on the one end, Max, Maya and xNormal on the other; can't be that hard ;)
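    To make that second point concrete, the decode side usually looks something like the following (a plain-Python sketch of the idea, not Epic's actual code; whether it is cross(N, T) or cross(T, N) is a per-engine convention):

        # Store T, N and a handedness sign per vertex, rebuild B at runtime.

        def cross(a, b):
            return (a[1] * b[2] - a[2] * b[1],
                    a[2] * b[0] - a[0] * b[2],
                    a[0] * b[1] - a[1] * b[0])

        def reconstruct_bitangent(normal, tangent, handedness):
            """B = cross(N, T) * sign; any non-orthogonality in the source basis is lost."""
            return tuple(handedness * c for c in cross(normal, tangent))

        n, t = (0.0, 0.0, 1.0), (1.0, 0.0, 0.0)
        print(reconstruct_bitangent(n, t, +1.0))   # (0.0, 1.0, 0.0)
        print(reconstruct_bitangent(n, t, -1.0))   # the bitangent flips for a mirrored UV island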
  • Mark Dygert
    1st, +1 to everything that Pior pointed out.

    2nd,
    Sorry for what I'm about to say, but just like Max, Epic isn't exactly making AMAZING industry changes at this point, and even less so as a game company publishing AMAZING titles, last I checked.
    I agree with this, and at the same time I think they have largely stalled out on pushing the tech envelope because they've already taken 3-4 massive steps past what the current generation of console hardware can support.

    This should give them time to go back and solve a lot of the "minor" issues that eat up massive amounts of time; fixing them would make artists' lives so much easier and allow for greater optimization and reuse of assets. Still, I can't fault them too much for not wanting to dig into these things. They must be horribly complex, and working around them is probably much easier than fixing them.

    Personally, this issue took the wind out of the sails of my latest personal project. I had big plans to use, in UDK, a lot of the same tricks I've been using in Max and our engine at work for the last few years. But this blew that out of the water and really drained most of the enthusiasm I had for the project...

    If it were my regular 9-5 I would push through, but it's not, so it's not as fun when I'm just hiding seams, using larger texture sheets to get the same pixel density I could get with smaller ones, and covering things up just to get "close enough" to what I know I can get normally...
  • CrazyButcher polycounter lvl 18
    Epic does a lot of wonderful things. Seeing the Samaritan demo running on a single GPU these days should be pretty amazing for anyone in realtime graphics. Yes, it's a tech demo, but still.

    What we are talking about is a matter of raising awareness of the issue at hand and of how important it is. Let's not blame Epic for a problem that exists in most major engines.
    It costs money: if people have to do a little extra work here and there to paper over minor issues, it adds up.
    The question then is whether spending that extra time is necessary, or whether you can take the quality hit (I am not talking about portfolio pieces but production art). Because if you argue that improving quality costs time, the question becomes whether that quality is needed; if yes, then you have a nice argument for investing some technical time to save money. If not, you could argue that with a little tech time you improve quality for everyone, not just iconic assets.
    If some people complain on forums about personal projects, it doesn't carry as much weight as major studios realizing they are "losing" money on something that could be improved without a major undertaking.

    I do think this is something important to improve for everyone, just like the quality normals in our shader were really a "revealing moment". But we have to be fair that X different wishes go into middleware/game making. In the end, what gets done is driven by business decisions, which means you have to have a strong case and not just be "making the world a better place" ;)

    I am sure Crytek could make a good argument for their investment in realtime lighting solutions and improved iteration times over money spent on "baking". Licensing technology is a business.

    If only a handful of people complain, it simply isn't important enough. You can either accept that or keep pushing it. I wouldn't judge Epic's or others' decision making on a single problem; they wouldn't be in that position if they hadn't solved a lot of things well.
  • Computron polycounter lvl 7
    Are people actually bashing Epic's games in a thread about Unsynced Tangent Basis? :(

    Either way, how can it be that this issue isn't fixable, as someone in this thread suggested earlier? Are we to expect inaccurate normals until Unreal 4?
  • Xendance polycounter lvl 7
    You should probably ask Tim Sweeney (is he in charge of UE 4 development?) :D
  • Ashaman73 polycounter lvl 6
    This sounds weird to me. IF you do the split along the mirror seam, each UV chart will have a common "handedness".
    Imagine a plane, subdivided in the middle, with the normal map mirrored at the middle edge. The normal of the plane is pointing up.

    The issue is that you derive the TBN from the texture's UV coordinates in relation to the vertices.

    I.e. for the right side you would get a left-handed TBN, but for the left side you would get a right-handed TBN, because you only mirror one axis (x). To correct the handedness you would need to invert the normal, so that for the left side it points down. The effect would be weird lighting behaviour on the left side.

    A modelling tool uses a more or less theoretical, slower rendering technique, whereas game engines use a more hacked rendering technique. Take a look at game engines: they all have a different look and feel even though they theoretically use the same lighting calculations; it is the 'hacks' that make them unique.

    I think that Epic will change it, but I think, or fear, that for the current technology generation (still Xbox 360/PS3) it is better performance-wise to invest more artist manpower than to sacrifice some of the performance. Otherwise Max/Maya would be the better game rendering engine, wouldn't it?
  • AlecMoody ngon master
    Ashaman73 wrote: »
    Imagine a plane, subdivided in the middle, with the normal map mirrored at the middle edge. The normal of the plane is pointing up.

    The issue is that you derive the TBN from the texture's UV coordinates in relation to the vertices.

    I.e. for the right side you would get a left-handed TBN, but for the left side you would get a right-handed TBN, because you only mirror one axis (x). To correct the handedness you would need to invert the normal, so that for the left side it points down. The effect would be weird lighting behaviour on the left side.

    A modelling tool uses a more or less theoretical, slower rendering technique, whereas game engines use a more hacked rendering technique. Take a look at game engines: they all have a different look and feel even though they theoretically use the same lighting calculations; it is the 'hacks' that make them unique.

    I think that Epic will change it, but I think, or fear, that for the current technology generation (still Xbox 360/PS3) it is better performance-wise to invest more artist manpower than to sacrifice some of the performance. Otherwise Max/Maya would be the better game rendering engine, wouldn't it?


    I keep reading your posts and can't tell exactly what you mean. However, there are plenty of game engines that have synced their tangent basis with their baker and handled mirroring easily. id has always done this, and Splash Damage had Maya's tangent basis running perfectly in Brink. This isn't an issue of "a modelling tool uses a more or less theoretical, slower rendering technique, whereas game engines use a more hacked rendering technique."


    My understanding of what Unreal is doing is that it isn't fixable without low-level changes to the render code (unrelated to mesh tangent creation).
  • artquest polycounter lvl 13
    AlecMoody wrote: »
    I keep reading your posts and can't tell exactly what you mean. However, there are plenty of game engines that have synced their tangent basis with their baker and handled mirroring easily. id has always done this, and Splash Damage had Maya's tangent basis running perfectly in Brink. This isn't an issue of "a modelling tool uses a more or less theoretical, slower rendering technique, whereas game engines use a more hacked rendering technique."


    My understanding of what Unreal is doing is that it isn't fixable without low-level changes to the render code (unrelated to mesh tangent creation).

    I don't mean to hijack the thread but since the discussion is quite related to tangents I figured this is the best place to ask:

    So at work... we're about to start a new project. What exactly am I asking for when I say to the programmers at work, "I would like you to sync the new game's tangent basis with Maya"?

    After dealing with the problems discussed in this thread on the last project I'd really like to get it sorted out.

    At what location do the calculations differ?

    Anyone know of any good documentation on this matter?
  • ZacD ngon master
    Point them to this thread, http://www.polycount.com/forum/showthread.php?t=72861
    It's not directly about this, but it spends a lot of time talking about the issues.

    and
    perna wrote:
    Developers can write shader@3pointstudios.com
  • AlecMoody ngon master
    artquest wrote: »
    I don't mean to hijack the thread but since the discussion is quite related to tangents I figured this is the best place to ask:

    So at work... we're about to start a new project. What exactly am I asking for when I say to the programmers at work, "I would like you to sync the new game's tangent basis with Maya"?

    After dealing with the problems discussed in this thread on the last project I'd really like to get it sorted out.

    At what location do the calculations differ?

    Anyone know of any good documentation on this matter?

    Tell them you want to match mesh tangent generation in your engine to whatever tool you guys bake with. You also need to make sure your engine is creating normals the same way as your baking app and that the tangents are interpolated across a triangle using the same method.

    Edit:
    Actually, what engine are you guys using? If you are on Unreal you are pretty much SOL, and the qualified/explicit normals workflow is your best bet.

    More info on the qualified normals workflow
    http://www.3dmotive.com/exportingqualifiednormals/

    If you are working with Maya you can skip all the 3ds Max stuff; just make sure you are exporting tangents with your FBX, and then check the correct box on import.
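    A cheap way to check all three of those things at once: dump the per-vertex tangents your baker used and the ones your engine generates for the same mesh, and diff them. Something along these lines (plain Python; the data layout and threshold are made up for illustration):

        import math

        def angle_deg(a, b):
            """Angle between two (assumed normalized) 3-vectors, in degrees."""
            d = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
            return math.degrees(math.acos(d))

        def report_tangent_mismatches(baked_tangents, engine_tangents, tol_deg=1.0):
            """Flag vertices where the two tangent sets diverge beyond tol_deg."""
            for i, (tb, te) in enumerate(zip(baked_tangents, engine_tangents)):
                a = angle_deg(tb, te)
                if a > tol_deg:
                    print(f"vertex {i}: baked and engine tangents differ by {a:.2f} deg")

    If nothing gets flagged but the shading still differs, the mismatch is in the normals or in how the basis is interpolated/normalized per pixel, not in the tangent generation itself.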
  • artquest polycounter lvl 13
    AlecMoody wrote: »
    Tell them you want to match mesh tangent generation in your engine to whatever tool you guys bake with. You also need to make sure your engine is creating normals the same way as your baking app and that the tangents are interpolated across a triangle using the same method.

    Edit:
    Actually, what engine are you guys using? If you are on Unreal you are pretty much SOL, and the qualified/explicit normals workflow is your best bet.

    More info on the qualified normals workflow
    http://www.3dmotive.com/exportingqualifiednormals/

    If you are working with Maya you can skip all the 3ds Max stuff; just make sure you are exporting tangents with your FBX, and then check the correct box on import.

    We're using our own in-house engine. When I mentioned this issue to a programmer co-worker of mine, he said there's only one equation to generate tangents, and that was the end of the discussion. :P I feel like I'm going to need to understand more to convince them that this is a worthwhile pursuit.
  • EarthQuake
    artquest wrote: »
    We're using our own in-house engine. When I mentioned this issue to a programmer co-worker of mine, he said there's only one equation to generate tangents, and that was the end of the discussion. :P I feel like I'm going to need to understand more to convince them that this is a worthwhile pursuit.

    In Maya's documentation the tangent space info is actually covered pretty well:

    http://download.autodesk.com/us/maya/2010help/files/Appendix_A_Tangent_and_binormal_vectors.htm

    Programmers generally don't like to be told they are wrong by artists, but he clearly is in this case. You should make a quick test mesh without any hard edges, bake in Maya, view the mesh in Maya's HQ viewport and compare it to how it looks in game; the difference should be obvious, and it has nothing to do with the game engine being "optimized".

    Here's a big-ass thread on the issue; it is a bit more Max-centered, but it's all the same problem. http://www.polycount.com/forum/showthread.php?t=68173
  • Ashaman73 polycounter lvl 6
    EarthQuake wrote: »
    In Maya's documentation the tangent space info is actually covered pretty well:

    http://download.autodesk.com/us/maya/2010help/files/Appendix_A_Tangent_and_binormal_vectors.htm
    Yep, the calculation is always the same, BUT a modelling tool does not work like a game engine. Mainly, the data structures for vertices, edges, tris etc. differ a lot. A modelling tool has a data structure for manipulating meshes in many different ways, whereas a game engine uses a data structure built for high-performance rendering on the GPU.

    The main reason game engines (sometimes) have problems with tangent space matrices is that a game engine tries to minimise the number of vertices as much as possible.

    When you have a model with 10k tris and, let's say, 15k vertices, a modelling tool has no problem keeping this in its data structure, because a vertex can hold multiple different attributes, for example different UV coords for different triangles. A game engine can assign certain attributes (e.g. UV coords) only once per vertex, so the number of vertices often increases when the model is imported into a game engine. The worst case is that every tri gets its own set of unique vertices, in which case our model would have 10k tris and 30k vertices.

    UV coords and normals (hard edges) are often the reason a game engine has to increase the number of vertices. Handling the tangent space matrices properly would add even more splits; leaving this out is more or less an optimisation to keep the vertex count down, even when both the game engine and the modelling tool use the same calculation!
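    Roughly, that import step boils down to something like the sketch below (illustrative Python, not any particular engine's importer):

        # A GPU vertex can hold only one value per attribute, so every unique
        # combination of attributes at a face corner becomes its own vertex.

        def build_render_vertices(triangles):
            """triangles: list of 3 corners, each corner a tuple
            (position_index, uv, normal, handedness)."""
            remap = {}          # corner attributes -> render vertex index
            vertices = []
            index_buffer = []
            for tri in triangles:
                for corner in tri:
                    if corner not in remap:
                        remap[corner] = len(vertices)
                        vertices.append(corner)
                    index_buffer.append(remap[corner])
            return vertices, index_buffer

    Shared corners collapse into one vertex, but the moment a UV split, hard edge or handedness flip makes two corners differ, you get an extra vertex, which is where the 15k-to-30k growth comes from.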
  • MoP polycounter lvl 18
    EarthQuake wrote:
    ...and it has nothing to do with the game engine being "optimized".

    Actually, in this case I think it does. Our main graphics programmer (who has many years of experience with Unreal Engine) has said that the main reason your lighting will never look correct on normal-mapped meshes in Unreal is that they use an approximated calculation for the lighting pass, due to not having enough vertex buffers to do the "correct" calculation.
    This is basically (I think) what Ashaman73 has been trying to say.

    The result is that the normals and/or tangents are all slightly off when calculating lighting/shading/specular.

    This is regardless of whether your tangent space has been synced perfectly between your 3D app and the engine. Even if you used their exact calculations for tangent basis, the result would still look slightly wrong when viewed in-game. It is an engine quirk.

    Edit: It's been a while since he explained this to me so I may be slightly hazy on some details. I'll check again today and try to get a more scientific response :P
  • almighty_gir ngon master
    Surely the best solution here would be:

    AUTODESK LEADING THE CHARGE!!!

    By that, I mean that Autodesk synchronise their tangent basis calculations between all three apps (3ds Max, Maya, Softimage); they would then all bake normals exactly the same way, so no matter which app any given studio or artist uses, they KNOW that their normals are all going to be the same.

    From there, Autodesk just make their tangent basis calculation freely known to any games developer who wants it, so that they can use that calculation in their engines.

    Surely this is the simplest way of doing it? Whereas right now we have 3ds Max using calculation A, Maya using calculation B, Softimage using calculation C, xNormal using calculation D, ZBrush/Mudbox using calculation E, Unity using calculation F, UDK using calculation G and Crytek using calculation H.

    I mean, I know it's a long shot hoping for that to happen, but isn't that what really NEEDS to happen?
  • sprunghunt polycounter
    Surely the best solution here would be:

    AUTODESK LEADING THE CHARGE!!!

    Unfortunately this is a problem with any engine. For many features, depending on something that is locked into a software program like Max or Maya means either extra work from your programming team responding to each new release, or never upgrading, because a minor change from Autodesk may mean the engine programmers need to do a lot of work to support it.

    I suspect that this is part of the reason why UDK has such a complicated and fully featured editor: relying on Max or Maya as the interface for these features would be a nightmare of extra work for Epic.
  • almighty_gir ngon master
    But is it really so hard for them to go:
    "okay, so 3ds Max, Maya and Softimage now use (x*y+z=t)"
    as their tangent calculation, and keep it that way forever? I mean... why would the tangent basis change with every new edition of Max? That makes no sense at all!
  • sprunghunt polycounter
    But is it really so hard for them to go:
    "okay, so 3ds Max, Maya and Softimage now use (x*y+z=t)"
    as their tangent calculation, and keep it that way forever? I mean... why would the tangent basis change with every new edition of Max? That makes no sense at all!

    That would be nice. But it's just like setting any other standard. You'd need a concerted effort on the part of all stakeholders to agree to a certain method.

    It does happen, but it's a massive effort and involves a lot of politics, so I wouldn't say it's something that's easy to do. There's also no pressure on them to do this, since they have a near monopoly on 3D graphics software.
  • EarthQuake
    MoP wrote: »
    Actually, in this case I think it does. Our main graphics programmer (who has many years of experience with Unreal Engine) has said that the main reason your lighting will never look correct on normal-mapped meshes in Unreal is that they use an approximated calculation for the lighting pass, due to not having enough vertex buffers to do the "correct" calculation.
    This is basically (I think) what Ashaman73 has been trying to say.

    The result is that the normals and/or tangents are all slightly off when calculating lighting/shading/specular.

    This is regardless of whether your tangent space has been synced perfectly between your 3D app and the engine. Even if you used their exact calculations for tangent basis, the result would still look slightly wrong when viewed in-game. It is an engine quirk.

    Edit: It's been a while since he explained this to me so I may be slightly hazy on some details. I'll check again today and try to get a more scientific response :P

    From what I understand that is specific to Unreal, though? The poster I was responding to said he had an in-house engine.
  • pior grand marshal polycounter
    Sure, it can all come from an optimization. But still, such an optimization might have been made on the assumption that "it looks close enough", by people who are great at what they do but might not know all the intricacies of normal map baking in practice. (Not saying that was the case here, just a hypothetical scenario!)

    I don't think that any "tech reason" is valid when it comes to an issue with such impact in terms of wasted man hours...
  • CrazyButcher polycounter lvl 18
    Ashaman73 wrote: »
    Imagine a plane, subdivided in the middle, with the normal map mirrored at the middle edge. The normal of the plane is pointing up.

    The issue is that you derive the TBN from the texture's UV coordinates in relation to the vertices.
    ...
    A modelling tool uses a more or less theoretical, slower rendering technique, whereas game engines use a more hacked rendering technique.

    The handedness problem only happens if your middle vertices (along the seam) are not split UV-wise. That's what I meant: if you split the UV vertices there, there is no problem. That is, you can select each "half" of the divided plane in your UV editor and move it around freely. The middle vertices still share positions and normals.

    Now, if some engine decides to merge those seam vertices back together and ignore the handedness, that is an importer problem, and probably what you meant. But I'd say that as an artist the best you can do is make that split in your UVs.

    Whilst there is only "one way to compute tangent space", it is a per-triangle space. The same can be said about normals. However, we use smooth normals (and a smooth tangent space) in games. Where people differ in how they go about the TBN is in how they combine those per-triangle vectors into a single "game vertex", and in what they store in the end.

    The typical game optimizations are (and I think UE uses both):
    - Instead of storing all three (T, B, N), store only two plus the handedness information and reconstruct the third as perpendicular. That means a quality loss if the original third vector was not perfectly perpendicular (likely).
    - When passing the TBN from the vertex to the pixel shader, not renormalizing it, again losing quality.

    Ashaman73, I am a graphics coder for a living (NVIDIA) and the author of the 3Point Shader tech, so I've been "in the trenches" with this stuff ;) 3ds Max bakes through a plugin that can be changed. If your game uses those "optimized TBN vectors", you also have to use them in the baker. Everything is just software, no magic :)

    Now, I don't know how easy it is to change the Maya baker, but I think xNormal was also quite open about this. So at least two of the three main apps people use for baking/exporting can be changed to improve the situation and bring the encoder and decoder closer together.
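    For what it's worth, the "combine the triangle vectors into a game vertex" step usually looks something like the sketch below (plain Python with illustrative names; a real pipeline would split the vertex at a handedness flip instead of skipping it):

        import math

        def normalize(v):
            l = math.sqrt(sum(c * c for c in v)) or 1.0
            return tuple(c / l for c in v)

        def smooth_vertex_tangents(vertex_count, triangles, tri_tangents, tri_signs):
            """triangles: (i0, i1, i2) index tuples; tri_tangents/tri_signs are per-triangle."""
            accum = [(0.0, 0.0, 0.0)] * vertex_count
            sign = [0.0] * vertex_count
            for tri, t, s in zip(triangles, tri_tangents, tri_signs):
                for i in tri:
                    if sign[i] not in (0.0, s):
                        continue          # handedness conflict: real code splits the vertex here
                    sign[i] = s
                    accum[i] = tuple(a + b for a, b in zip(accum[i], t))
            return [normalize(t) for t in accum], sign

    How you weight the accumulation (by angle, by area, or not at all), whether you orthogonalize against the vertex normal, and where you split is exactly the part that differs between bakers and engines, which is why "there is only one equation" doesn't save you.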
  • EarthQuake
    Also, from a real-world perspective, when doing these sorts of destructive tangent optimizations you have to look at what they do to the entire pipeline.

    Are your artists spending hours more per asset?
    Do your artists need to use 30% more geometry to get the same result?
    If the answer to both is yes (likely), what are you *really* saving?
  • CrazyButcher polycounter lvl 18
    Well, in the end you want to squeeze out more performance. Adding more geometry is not as costly as increasing the per-vertex or per-pixel memory/setup costs, so it's understandable that those optimizations are done. I would avoid saying you have to sacrifice one for the other, and rather try to focus on delivering on both needs.
    I guess it also depends on how long this stuff has gone without being re-evaluated; one might be able to add that additional accuracy back for today's hardware (not for the old consoles, though).
  • artquest polycounter lvl 13
    Wow. Lots of knowledge being dropped in this thread! Thanks to everyone posting here, I'm starting to get a better picture of what's going on. :D
  • System admin
    Don't quote me on it, but I believe that Maya's normal map calculations are the closest of all the apps to Unreal's. Or at least this is what I've been told by polygoo, Marcus Dublin and a few PMs at Autodesk.
  • Ashaman73 polycounter lvl 6
    The handedness problem only happens if your middle vertices (along the seam) are not split UV-wise. That's what I meant: if you split the UV vertices there, there is no problem. That is, you can select each "half" of the divided plane in your UV editor and move it around freely. The middle vertices still share positions and normals.

    Now, if some engine decides to merge those seam vertices back together and ignore the handedness, that is an importer problem, and probably what you meant. But I'd say that as an artist the best you can do is make that split in your UVs.
    Yes, that is what I mean. The converter/importer will most probably re-evaluate all the mesh data to optimize the vertex/tri setup, and I think things like handedness are more or less ignored. Splitting the UVs, or using a hard edge to force the converter/importer to handle the tangent space correctly, is a valid "hack".
    The typical game optimizations are (and I think UE uses both):
    - Instead of storing all three (T, B, N), store only two plus the handedness information and reconstruct the third as perpendicular. That means a quality loss if the original third vector was not perfectly perpendicular (likely).
    - When passing the TBN from the vertex to the pixel shader, not renormalizing it, again losing quality.
    Assuming you can re-orthonormalize the tangent space matrix without introducing too much distortion, you can even store the TBN as a quaternion, though you still need to encode the handedness.
    Ashaman73, I am a graphics coder for a living (NVIDIA) and the author of the 3Point Shader tech, so I've been "in the trenches" with this stuff ;) 3ds Max bakes through a plugin that can be changed. If your game uses those "optimized TBN vectors", you also have to use them in the baker. Everything is just software, no magic :)
    Hehe, I'm a coder too, but only at hobby level when it comes to graphics coding. I stumbled upon the TBN fun while implementing normal mapping in my engine (I use the quaternion approach), though I never reached the art level needed to stress the tangent space generation (currently I'm moving away from normal-mapped meshes for "art" reasons). That's the reason I never really noticed the role of the baker; thanks for clearing this up for me :)
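    On the quaternion note, the usual trick (as I understand it, sketch only) is to exploit the fact that q and -q encode the same rotation, so the sign of one component can carry the handedness for free:

        def encode_qtangent(q, handedness, eps=1e-6):
            """q = (x, y, z, w), assumed to come from an orthonormalized TBN.
            Force sign(w) to match the handedness flag; real implementations
            bias w away from zero and renormalize afterwards."""
            x, y, z, w = q
            if abs(w) < eps:                       # avoid an ambiguous sign at w == 0
                w = eps if handedness > 0 else -eps
            if (w > 0) != (handedness > 0):
                x, y, z, w = -x, -y, -z, -w
            return (x, y, z, w)

        def decode_handedness(q):
            return 1.0 if q[3] > 0 else -1.0

    Decoding then rebuilds the TBN by rotating the basis vectors with the quaternion and multiplying the bitangent by decode_handedness(q).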
  • JordanW polycounter lvl 19
  • Norman3D polycounter lvl 14
    I think I can die happy now.
  • MrOneTwo polycounter lvl 12
    Woooooooot? What a day. ZBrush 4R4, and now this? One question I've got: can I export as SBM to xNormal, just for baking the maps, then export again as FBX and put that into UDK? Triangulation shouldn't be a problem, since I can do that in Max, so the SBM and FBX would have the same triangulation. What about tangents and normals? Will it break this new workflow?
  • JordanW polycounter lvl 19
    Your low poly sent to xNormal should be FBX. I don't think the high poly matters as much.
  • MrOneTwo polycounter lvl 12
    Yeah, I know the high poly doesn't matter. Sending the low-poly mesh as SBM to xNormal usually gave me the best results, since it can export the cage with the mesh. Since the low-poly mesh sent to xNormal should have the same tangents and normals as the one sent to UDK, I just wonder if I can still use SBM for baking. If xNormal handles tangents and normals the same way for FBX and SBM it should work fine; I don't know if that's the case though.
  • osman polycounter lvl 18
    Man, this is great news, downloading now, can't wait to try it out. I hope I'm not too off topic, JordanW, but I was wondering if you could help with something. I have noticed that UDK can bake specular from Lightmass and that there's a checkbox in the material to turn that off. But I was wondering if there is a way to access that specular map in the material somehow?

    Edit: I ask because, now that we have nearly 100% control over the normal map rendering, those baked specs are still something I don't have control over in UDK.
  • Pola polycounter lvl 6
    It blows my mind.

    Seriously this is awesome :D
  • Electro polycounter lvl 18
    :D yay! Thanks Jordan