
Following good/necessary technical practices in production

dzibarik polycounter lvl 10
Since I started working in the industry, I've had a chance to test out the practices and methods I learned here on Polycount and share them with my colleagues and people from other companies.

The response hasn't been universally positive so far. Some people absorb the knowledge, but they are certainly in the minority. For example, most people can't get their heads around baking with a synced normal tangent basis; it sounds like some occult bullshit to them. They continue baking in their 3D package without any plugins and dumping those normal maps straight into an engine. Even when they use xNormal they won't use Farfarer's plugin for a synced tangent basis with Unity (I'm not even talking about Handplane, since baking an additional map is apparently too much of a hassle). Showing them the difference falls on deaf ears.

Then people will endlessly dispute how PBR works even though it's documented in detail, and even necessary practices like triangulating before baking get neglected because "we don't have time for this in production / the source model can be lost / whatever reason".

Maybe I'm too much of an idealist? How was it for you when you started, and how do you deal with this unwillingness to accept good practices?

Replies

  • Amsterdam Hilton Hotel
    Amsterdam Hilton Hotel insane polycounter
    seek perfection in your own work but don't demand it in others. offer help when requested and offer encouragement in all other scenarios. practicality uber alles
  • huffer
    huffer interpolator
    It's one thing here on polycount, where everyone is passionate about this industry and about art, self-improvement and so on, and another thing out there, where it's just another job for many.

So there are employees who are interested and eager to learn by themselves, good employees who are open to improvement, and then the bad kind who are just there to earn money... or who lost their drive (or never had it in the first place).

Age difference could also be a problem - accepting a practice suggested by someone younger (or much younger). These problems are just part of any job, I guess.
  • dzibarik
    dzibarik polycounter lvl 10
    huffer wrote: »
Age difference could also be a problem - accepting a practice suggested by someone younger (or much younger). These problems are just part of any job, I guess.
yes, this one is especially true. You can swap age for years of experience.
  • Panupat
    Panupat polycounter lvl 15
There are times when these things are requested by clients. And when that happens, everyone just has to do it.

We used to get comments from clients about our rendered edges being very jagged. I did a comparison and it definitely happened during compositing (straight out of the render, our image was sharp and smooth). The compositors kept insisting it was due to Z-depth. It definitely didn't look like a Z-depth issue to me but rather an artifact from overusing the sharpen filter... The compositors refused to even listen to my suggestion, and it wasn't until the supervisor himself gave the word that it was finally looked into. Which indeed turned out to be over-sharpening.
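The artifact described here is easy to reproduce in miniature. The sketch below (plain Python with made-up numbers, not production compositing code) runs a basic unsharp mask over a smooth 1-D anti-aliased edge: the sharpened result overshoots past the original value range, which on screen reads as harsh, jagged edges.

```python
# Tiny 1-D illustration of the over-sharpening artifact (unsharp mask):
# a smooth anti-aliased edge gains overshoot/ringing after sharpening.
# Toy numbers only - real compositors work on 2-D images.

def sharpen(signal, amount=1.0):
    """Unsharp mask: original + amount * (original - box-blurred)."""
    blurred = [
        (signal[max(i - 1, 0)] + signal[i] + signal[min(i + 1, len(signal) - 1)]) / 3.0
        for i in range(len(signal))
    ]
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

edge = [0.0, 0.0, 0.25, 0.5, 0.75, 1.0, 1.0]  # smooth anti-aliased edge
out = sharpen(edge, amount=2.0)
print(min(out) < 0.0 and max(out) > 1.0)  # True: overshoot past [0, 1]
```

The overshoot is inherent to sharpening, not a Z-depth problem: the filter pushes values past the edge's original range on both sides.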
  • sprunghunt
    sprunghunt polycounter
    dzibarik wrote: »
Since I started working in the industry, I've had a chance to test out the practices and methods I learned here on Polycount and share them with my colleagues and people from other companies.

The response hasn't been universally positive so far. Some people absorb the knowledge, but they are certainly in the minority. For example, most people can't get their heads around baking with a synced normal tangent basis; it sounds like some occult bullshit to them. They continue baking in their 3D package without any plugins and dumping those normal maps straight into an engine. Even when they use xNormal they won't use Farfarer's plugin for a synced tangent basis with Unity (I'm not even talking about Handplane, since baking an additional map is apparently too much of a hassle). Showing them the difference falls on deaf ears.

Then people will endlessly dispute how PBR works even though it's documented in detail, and even necessary practices like triangulating before baking get neglected because "we don't have time for this in production / the source model can be lost / whatever reason".

Maybe I'm too much of an idealist? How was it for you when you started, and how do you deal with this unwillingness to accept good practices?

    My preferred approach in this situation is to automate the pipeline more.

For example, you mention triangulating before baking - why not have a tool that automatically exports and bakes the mesh? Then they won't have the option to skip triangulating.

    This will actually make things faster - then they can't say it 'takes too long'
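As a rough sketch of what such a tool might do - note the mesh format (faces as vertex-index tuples) and the tool itself are hypothetical, and a real pipeline tool should reuse the DCC's own triangulation so the bake and the export split quads identically:

```python
# Minimal sketch of an automated pre-bake step: triangulate every quad
# and n-gon before handing the mesh to the baker, so artists can't skip
# it. The mesh representation here (faces as vertex-index tuples) is a
# stand-in for whatever your DCC/exporter actually uses.

def triangulate_faces(faces):
    """Fan-triangulate every face with more than 3 vertices.

    Quads and n-gons become triangles; existing triangles pass through.
    Caveat: a real tool should call the DCC's own triangulation so the
    bake and the final export split quads the same way.
    """
    tris = []
    for face in faces:
        if len(face) < 3:
            continue  # degenerate face, skip it
        for i in range(1, len(face) - 1):
            tris.append((face[0], face[i], face[i + 1]))
    return tris

# Example: one quad and one triangle
faces = [(0, 1, 2, 3), (4, 5, 6)]
print(triangulate_faces(faces))  # [(0, 1, 2), (0, 2, 3), (4, 5, 6)]
```

Hooking a step like this into the export/bake script removes the "takes too long" objection entirely: the triangulation happens whether or not the artist remembers it.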
  • dzibarik
    dzibarik polycounter lvl 10
    sprunghunt wrote: »

    This will actually make things faster - then they can't say it 'takes too long'

yeah, if I were in charge I would do this and many other things. But I'm not, so for now I just have to deal with it.
  • Teclis
    Teclis polycounter lvl 15
    dzibarik wrote: »
For example, most people can't get their heads around baking with a synced normal tangent basis; it sounds like some occult bullshit to them. They continue baking in their 3D package without any plugins and dumping those normal maps straight into an engine.

Actually, I might be one of those people. I bake normal maps in Max without any plugins: I bake in 3ds Max and change the channel orientation (Y+ / Y- and so forth) to match the target engine, then bake it down as a tangent-space normal map. I don't see where you need any extra tools for that? It always works fine for me.

As for the other points: in my experience you can't expect the same interest in knowledge and technique from every artist. I always try to show the things I think optimize workflows; some adapt to them, some won't. But it's their right to keep doing it the way they've done it for 10 years.
So try to improve things in the department or for other artists, but don't try to force things, and don't expect people to always follow logic or reason ^^
  • Fansub
    Fansub sublime tool
sprunghunt wrote: »
My preferred approach in this situation is to automate the pipeline more.

    This.

I think tasks like this should be a technical artist's problem, not an artist's. And even as an artist, I've started learning how to code specifically to deal with this sort of problem.
Not only does it help me understand things and help others by making their workflows faster, it also makes you stand out as an artist :)
  • dzibarik
    dzibarik polycounter lvl 10
    Teclis wrote: »
Actually, I might be one of those people. I bake normal maps in Max without any plugins.

frankly speaking, I don't even consider this a bad practice, because I've done it myself for a variety of reasons and my normal maps looked good most of the time, except for a couple of cases where Handplane actually improved the shading. If it looks good and shades alright, then everything is fine. What actually bothers me is the aggressive "these new practices are wrong" approach. I can understand that it may look time-consuming to some people, and that's fine, but when they try to persuade me that the whole approach is *wrong* just because they don't want to use it - now that looks strange.
  • Neox
    Neox godlike master sticky
    Teclis wrote: »
Actually, I might be one of those people. I bake normal maps in Max without any plugins: I bake in 3ds Max and change the channel orientation (Y+ / Y- and so forth) to match the target engine, then bake it down as a tangent-space normal map. I don't see where you need any extra tools for that? It always works fine for me.

    just a simple word, sync

your max normal map will most likely not be synced to any engine. you can get lucky with workarounds and get fine enough results, but if you want flawless results you need to work in sync with your target platform.

    just because you did something for 10 years, doesn't make it a good habit

that said, for some productions we have to bake with maya. it's god awful and I really need to find a way to get 100% reliably synced results - sadly, neither handplane nor mightybake manages it so far. but I need time to set up a proper NDA-free test case for this
  • marks
    marks greentooth
I think there are some serious misconceptions about normal map "syncing" going on here, tbh. I don't really have the time right now to type out an enormous reply, though. I've been considering writing a blog post or something covering all of the actual details - is that something you guys would appreciate?

You know, such things as: did you know that the renderer in the engine doesn't remotely give a shit what program you baked in? E.g. as long as all the data is exported correctly through your pipeline to the renderer, you can bake in pretty much anything you want and it should work correctly.

And the "sync" people talk about isn't really syncing "to an engine"; what's important to get right is the sync between the baked normal map and the tangent/bitangent vertex data stored in your model by the time it reaches the renderer.
  • dzibarik
    dzibarik polycounter lvl 10
    marks wrote: »
I've been considering writing a blog post or something covering all of the actual details - is that something you guys would appreciate?

    I think everybody would certainly appreciate it.
  • Spoon
    Spoon polycounter lvl 11
    marks wrote: »
    is that something you guys would appreciate?

    Very much, yes!
  • thomasp
    thomasp hero character
i guess you can sync all you want for baking, but then you give that deformation-friendly quad version of your lowpoly to a TD who uses maya or something else largely unconcerned with maintaining consistent triangulation, and have that exported into the game.

    is anybody actually making sure that triangulation stays consistent from bake to export, across apps, across multiple iterations of the mesh and with several people involved?
  • dzibarik
    dzibarik polycounter lvl 10
    thomasp wrote: »
    is anybody actually making sure that triangulation stays consistent from bake to export, across apps, across multiple iterations of the mesh and with several people involved?

actually, that situation was the reason given for "we don't do triangulation". A mesh can go through several people working in 4 different packages. But I think the mesh can stay in quads as long as every revision is triangulated before baking and exporting into the engine. It's one click anyway, and you don't have to erase your quad mesh.
  • Teclis
    Teclis polycounter lvl 15
    marks wrote: »
I think there are some serious misconceptions about normal map "syncing" going on here, tbh. I don't really have the time right now to type out an enormous reply, though. I've been considering writing a blog post or something covering all of the actual details - is that something you guys would appreciate?

You know, such things as: did you know that the renderer in the engine doesn't remotely give a shit what program you baked in? E.g. as long as all the data is exported correctly through your pipeline to the renderer, you can bake in pretty much anything you want and it should work correctly.

And the "sync" people talk about isn't really syncing "to an engine"; what's important to get right is the sync between the baked normal map and the tangent/bitangent vertex data stored in your model by the time it reaches the renderer.


That's something I also think, but I'm not quite sure. I tend to think I know my stuff when it comes to baking and understanding normal maps, and to me this syncing thing seems useless if you do it right from the beginning (correct me if I'm wrong).
A tangent-space normal is always relative to the object's normals, tangents and binormals - that's all it needs to function. The difference between engines is (as far as I know) only the interpretation of the channels, like X+/X-, Y+/Y-. But that's what you set up before baking (along with setting up smoothing groups correctly, etc.), and I always get good bake results.
But I might be missing something that hasn't occurred to me so far. I'd really like to know, because I actually want to understand the topic as well as possible :)

@Neox: I didn't mean that doing something for 10 years is the best way to do things, and I'm always open to trying new and maybe better ways. My point is that some people adapt and some don't, and you can't force new workflows on them, even if those workflows are better. As long as they get their work done on time and at quality, they can use their own workflow. If they get too slow or don't achieve the quality, then it's a different story - but also one a lead needs to tackle.
  • huffer
    huffer interpolator
    Teclis wrote: »
A tangent-space normal is always relative to the object's normals, tangents and binormals - that's all it needs to function. The difference between engines is (as far as I know) only the interpretation of the channels, like X+/X-, Y+/Y-. But that's what you set up before baking (along with setting up smoothing groups correctly, etc.), and I always get good bake results.

I used to think this too, but no - a tangent-space normal map is relative in the sense that the three values are encoded and decoded using the vertex normals, tangents and binormals, BUT there are different implementations - the Y vector is not just flipped in Max, it has a *different* value altogether. This is most apparent when you use soft edges a lot (big gradients in the normal map) - if you use bevels and hard edges (and, above all, export tangents and binormals all along the way from program to baker to engine) you're pretty much safe.
  • marks
    marks greentooth
    huffer wrote: »
I used to think this too, but no - a tangent-space normal map is relative in the sense that the three values are encoded and decoded using the vertex normals, tangents and binormals, BUT there are different implementations - the Y vector is not just flipped in Max, it has a *different* value altogether.


No, in Max the Y vector is just flipped. The relationship between the vector stored in the normal map and the tangent-space basis is the same as it would be if you baked in any other program.

What you're talking about is that 3ds Max has a different algorithm for generating the tangent/binormal for each vertex than many other programs.

    Assuming that the process of getting your model from your DCC app into the renderer of your engine doesn't alter the tangent/bitangent vertex data:

    A normalmap baked in max and applied to a mesh exported from max should render correctly.
    A normalmap baked in maya and applied to a mesh exported from maya should render correctly.
    A normalmap baked in max and applied to a mesh exported from maya will not render correctly.
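These three cases can be illustrated with a toy round-trip in plain Python (hand-picked unit vectors, not real mesh data; the flipped bitangent stands in for any basis mismatch between baker and renderer): encode a world-space normal with the baker's basis, then decode it with the renderer's. Matching bases recover the normal; mismatched ones don't.

```python
# Toy illustration of tangent-basis "sync". A baker stores the normal
# relative to its per-vertex tangent basis; a shader rebuilds it with
# whatever basis it has. Hand-picked vectors for clarity.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def encode(n_world, t, b, n):
    """What a baker stores: the world normal projected onto the basis."""
    return (dot(n_world, t), dot(n_world, b), dot(n_world, n))

def decode(n_ts, t, b, n):
    """What a shader does: rebuild the world normal from the stored value."""
    return tuple(n_ts[0] * t[i] + n_ts[1] * b[i] + n_ts[2] * n[i]
                 for i in range(3))

n_world = (0.0, 0.6, 0.8)

# Baker's per-vertex basis...
t_a, b_a, n_a = (1, 0, 0), (0, 1, 0), (0, 0, 1)
# ...and a renderer basis with the bitangent flipped (Y+ vs Y- style).
t_b, b_b, n_b = (1, 0, 0), (0, -1, 0), (0, 0, 1)

stored = encode(n_world, t_a, b_a, n_a)
print(decode(stored, t_a, b_a, n_a))  # same basis: n_world comes back
print(decode(stored, t_b, b_b, n_b))  # mismatched basis: wrong normal
```

Nothing in the stored map says where it came from - only whether the decode basis matches the encode basis determines whether the result is correct, which is marks' point exactly.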
  • JasonLavoie
    JasonLavoie polycounter lvl 18
    Dustin nailed it!

    Once the big (or small) production machine starts whirling, it's very hard to introduce new workflow / pipeline to a team that is already in full swing.

    Jot down notes, concerns, ideas, suggestions about workflow / pipeline / tool issues and if you have the means and time, try and do some research on your own to find possible solutions.

    It may be harder to introduce these solutions during your current project, but as Dustin said, once you start on a new project (hopefully you get some pre-prod / R&D time) take the opportunity to talk with your team about improving workflow and pipelines. Break it down for them and hopefully have some examples to show that can help illustrate the gains.

Also... +1 million to building a stronger automated pipeline. More than ever we have fantastic tools and software that allow this... it just takes some time to learn and to teach your crew.
  • JedTheKrampus
    JedTheKrampus polycounter lvl 8
    marks wrote: »
I think there are some serious misconceptions about normal map "syncing" going on here, tbh. I don't really have the time right now to type out an enormous reply, though. I've been considering writing a blog post or something covering all of the actual details - is that something you guys would appreciate?

You know, such things as: did you know that the renderer in the engine doesn't remotely give a shit what program you baked in? E.g. as long as all the data is exported correctly through your pipeline to the renderer, you can bake in pretty much anything you want and it should work correctly.

And the "sync" people talk about isn't really syncing "to an engine"; what's important to get right is the sync between the baked normal map and the tangent/bitangent vertex data stored in your model by the time it reaches the renderer.

Also, some tangent spaces interpolate between the vertex data in different ways. See CryEngine for a wacky example.
  • marks
    marks greentooth
JasonLavoie wrote: »
Once the big (or small) production machine starts whirling, it's very hard to introduce new workflow / pipeline to a team that is already in full swing.

    This x1000
  • GarageBay9
    GarageBay9 polycounter lvl 13
    The fundamental rule of art production is "if it looks right, it is right."

    The corollary to that is "if it looks good enough, it's good enough."

    Technically perfect art is really satisfying and usually looks awesome, but unless there's a demonstrable gain from changing gears to do it that way versus what is already established in a 10+ person team that has milestones to hit on a budget... well, the response is usually a shrug and a rueful grin and "next version".
  • Farfarer
    @marks
    There's also way more to it than the per-vertex vectors matching between baker and renderer.

    There's whether the bitangent gets re-generated (if at all) - either per-vertex or per-pixel. There's whether the vectors are re-normalized per-pixel. Finally, there's whether they're re-orthogonalised per-pixel.*

(Technically these last two have per-vertex variants too, although for normal maps the vectors are usually unit length, and the tangent and bitangent are at least orthogonal to the normal if not to each other.)


    * I think that's all the usual variables... although there could be others I've overlooked.
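The cross-product regeneration mentioned above can be sketched in plain Python - a minimal version of the MikkTSpace-style convention where only a handedness sign is stored per vertex instead of a full bitangent (toy vectors, not real mesh data):

```python
# Sketch of bitangent regeneration (MikkTSpace-style convention):
# instead of storing the bitangent, store the tangent plus a handedness
# sign and rebuild B = sign * cross(N, T). Baker and renderer must both
# do this - and at the same stage (per-vertex or per-pixel) - to stay
# synced. Toy vectors for illustration.

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def regenerate_bitangent(n, t, sign):
    """Rebuild the bitangent from normal, tangent and a +1/-1 sign."""
    c = cross(n, t)
    return tuple(sign * x for x in c)

n = (0.0, 0.0, 1.0)
t = (1.0, 0.0, 0.0)
b_plus = regenerate_bitangent(n, t, +1.0)   # standard handedness
b_minus = regenerate_bitangent(n, t, -1.0)  # mirrored UVs flip it
print(b_plus, b_minus)
```

The sign is where mirrored UV islands come in: if the baker and renderer disagree on the sign (or on when the cross product is taken), the rebuilt bitangent points the wrong way and the map decodes incorrectly on mirrored geometry.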
  • marks
    marks greentooth
    Okay I'm happy to be corrected here if someone knows better than I do, but as far as I understand it:

The renderer just reads the vertex data from the mesh, reads the texture data from the normal map (optionally reconstructing the blue channel if you ditched it for a two-channel compressed format - which is pretty much impossible to get wrong, as you're just reconstructing a normalized vector), and then puts the two together.

All issues to do with orthogonalization, vertex tangent/bitangent regeneration, triangle winding orders... that's 100% a tools issue. If your mesh and texture reach the renderer with the same data you baked with, the data in the mesh and the texture should be in sync and it should render correctly.
Tangent-space syncing is a tools issue, not a rendering issue.

Where you most commonly run into issues is when this data is changed somewhere in the pipeline between content creation and rendering. A great example is Unreal Engine, where every time you import a mesh into the editor it regenerates the vertex tangents/bitangents using the Mikk (MikkTSpace) tangent space.


  • Farfarer
    The per-pixel bitangent regeneration happens with Unreal. That's why you need to tick the "Compute Per-Pixel Bitangent" option in the Mikk plugin settings for xNormal.

When it comes to baking, the bitangent has already been regenerated per-vertex, because that's what Mikk does for normal maps (via the cross product of the normal and tangent, inverted if the dot product of the "true" bitangent and the generated bitangent is less than zero). In Unreal, this happens again per-pixel to create the final tangent basis for the conversion. If you don't enable that option in xNormal, you don't get synced results - the mesh data can be retained exactly, but it won't sync up without matching the rendering option.

In versions of Unity prior to 5, this was done at the per-vertex level: the vectors were converted into tangent space per-vertex and then fed into the fragment shader. That's why even synced normal maps look strange in forward rendering in Unity < 5 - all the data is perfectly in sync, but the conversion methods don't line up at the renderer's end. In deferred rendering this wasn't an issue, as the tangent-space conversion happened per-pixel.

Max and Maya renormalize and re-orthogonalize the interpolated vectors per-pixel. I don't think any game engine does this (although Marmoset does if you select the correct tangent basis option), so even if you've baked your normal maps in Max and the vectors get through to the renderer untouched, you're still not going to get synced results unless the renderer does those extra steps.
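That per-pixel cleanup step can be sketched in plain Python (toy vectors standing in for values interpolated across a triangle; real renderers do this in the pixel shader): after interpolation the basis vectors are no longer unit length or orthogonal, and renormalization plus Gram-Schmidt restores that.

```python
# Sketch of per-pixel renormalize / re-orthogonalize. Interpolated
# (lerped) tangent-basis vectors shrink below unit length and drift
# out of orthogonality; this restores both. Toy values only.
import math

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

def orthogonalize(t, n):
    """Gram-Schmidt: remove from T its component along N, renormalize."""
    d = sum(a * b for a, b in zip(t, n))
    return normalize(tuple(a - d * b for a, b in zip(t, n)))

# Stand-ins for interpolated per-pixel values: T is not perpendicular
# to N until we fix it up.
n = normalize((0.0, 0.5, 1.0))
t = orthogonalize(normalize((1.0, 0.2, 0.0)), n)
print(abs(sum(a * b for a, b in zip(t, n))) < 1e-9)  # True: orthogonal again
```

A renderer that skips this decodes the same normal map against a slightly skewed basis, which is why a Max/Maya-baked map can look subtly off in-engine even when the vertex data survives the pipeline untouched.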