Since I started working in the industry, I've had a chance to test out all the practices and methods I learned here on Polycount and share them with my colleagues and with people from other companies.
The response hasn't been universally positive so far. Some people absorb the knowledge, but they're certainly in the minority. For example, most people can't get their heads around baking with a synced normal/tangent basis; it sounds like some occult bullshit to them. They keep baking in their 3D package without any plugins and dumping those normal maps straight into the engine. Even when they use xNormal they don't want to use Farfarer's plugin for a synced tangent basis with Unity (I'm not even talking about Handplane, since baking an additional map is apparently too much of a hassle). Showing them the difference falls on deaf ears.
Then people constantly dispute how PBR works even though it's documented in detail, and even necessary practices like triangulating before baking get neglected because "we don't have time for this in production / the source model could get lost / whatever reason".
Maybe I'm too much of an idealist? How was it for you when you started, and how do you deal with this kind of unwillingness to accept good practices?
Replies
So there are employees who are interested and eager to learn on their own, good employees who are open to improvement, and then the bad kind who are just there to earn money... or who maybe lost their drive (or never had it in the first place).
Age difference can also be a problem - some people have trouble accepting a practice from someone younger (or much younger). These problems are just part of any job, I guess.
We used to get comments from clients about our render edges being very jagged. I did a comparison and it definitely happened during compositing (straight out of the render, our image was sharp and smooth). The compositors kept insisting it was due to Z-depth. It definitely didn't look like a Z-depth issue to me, but rather an artifact from overusing a sharpen filter... The compositors refused to even listen to my suggestion, and it wasn't until the supervisor himself gave the word that it was finally looked into - and indeed it was over-sharpening.
My preferred approach in this situation is to automate the pipeline more.
For example, you mention triangulating before baking - why not have tools that automatically export and bake a mesh? Then they won't have the option not to triangulate it (see the sketch below).
This will actually make things faster, too - then they can't say it 'takes too long'.
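This sort of thing is trivial to script. A minimal sketch of the idea, assuming Blender as the DCC just because its Python API is handy (the same concept applies via MaxScript or Maya's Python; the object name and file path are placeholders):

```python
import bpy  # Blender's Python API

def export_triangulated(obj, filepath):
    """Export a triangulated copy of the mesh without touching the quad source."""
    # Non-destructive: the modifier only affects the exported data,
    # not the artist's working mesh
    mod = obj.modifiers.new(name="TriangulateForBake", type='TRIANGULATE')

    bpy.ops.object.select_all(action='DESELECT')
    obj.select_set(True)
    bpy.context.view_layer.objects.active = obj

    # use_mesh_modifiers=True applies the triangulation in the exported FBX,
    # so nobody gets the option to skip the step
    bpy.ops.export_scene.fbx(filepath=filepath, use_selection=True,
                             use_mesh_modifiers=True)

    obj.modifiers.remove(mod)  # the working mesh stays in quads

# hypothetical usage - object and path are placeholders
export_triangulated(bpy.data.objects["lowpoly"], "/tmp/lowpoly_bake.fbx")
```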
Yeah, if I were in charge I would do this and many other things. But I'm not, so for now I just have to deal with it.
Actually, I might be one of those people. I bake normal maps in Max without any plugins: I bake in 3ds Max, set the channel orientation (y+ / y- and so forth) to match the target engine, then bake it down as a tangent-space normal map. I don't see where you'd need any extra tools for that? It's always worked fine for me.
As for the other points: in my experience you can't expect the same interest in knowledge and technique from every artist. I always try to show people the things I think optimize workflows; some adapt to them, some won't. But it's their right to keep doing it the way they've done it for 10 years.
So try to improve things in the department and for other artists, but don't try to force things, and don't expect people to always follow logic or reason ^^
This.
I think tasks like this should be a technical artist's problem, not an artist's. And even as an artist, I've started learning how to code specifically to deal with this sort of problem.
Not only does it help me understand things and help others by making their workflows faster, but it also makes you stand out as an artist.
Frankly speaking, I don't even consider this a bad practice, because I've done it myself for a variety of reasons and my normal maps looked good most of the time, except for a couple of cases where Handplane actually improved the shading. If it looks good and shades alright, then everything is fine. What actually bothers me is the aggressive "these new practices are wrong" attitude. I can understand that it may look time-consuming to some people, and that's fine, but when they try to persuade me that the whole approach is *wrong* just because they don't want to use it - now that looks strange.
Just one simple word: sync.
Your Max normal map will most likely not be synced to any engine. You can get lucky, work around it, and get good enough results, but if you want flawless results you need to work in sync with your target platform.
Just because you've done something for 10 years doesn't make it a good habit.
That said, for some productions we have to bake with Maya. It's god-awful, and I really need to find a way to get 100% reliably synced results - sadly, neither Handplane nor MightyBake manages it so far. But I need time to set up a proper NDA-free test case for this.
You know, things like: did you actually know that the renderer in the engine doesn't remotely give a shit what program you baked in? I.e. as long as all the data is exported correctly through your pipeline to the renderer, you can bake in pretty much anything you want and it should work correctly.
And the "sync" people talk about isn't really syncing "to an engine"; what's important to get right is the sync between the baked normal map and the tangent/bitangent vertex data stored in your model by the time the model reaches the renderer.
I think everybody would certainly appreciate it.
Very much, yes!
Is anybody actually making sure that triangulation stays consistent from bake to export, across apps, across multiple iterations of the mesh, and with several people involved?
Actually, exactly that situation was the reason given for "we don't do triangulation": a mesh can go through several people working in 4 different packages. But I think the mesh can stay in quads as long as every revision gets triangulated before baking and exporting into the engine. It's one click anyway, and you don't have to throw away your quad mesh.
That's something I also think, but I'm not quite sure. I tend to think I know my stuff when it comes to baking and understanding normal maps, and to me this syncing business feels useless if you do everything right from the beginning. (Correct me if I'm wrong.)
A tangent-space normal is always relative to the object's normals, tangents and binormals; that's all it needs to function. The difference between engines is (as far as I know) only the interpretation of the channels, like x+/x-, y+/y-. But that's what you set up before baking (along with getting your smoothing groups right, etc.), and I always get good bake results.
But I might be missing something that hasn't occurred to me so far. I'd really like to know, because I actually want to understand the topic as well as possible.
@neox: I didn't mean that doing something for 10 years makes it the best way to do things, and I'm always open to trying new and maybe better ways. My point is that some people aren't, and you can't force new workflows on them, even if those workflows are better. As long as they get their project work done on time and at quality, they can use their own workflow. If they get too slow or don't achieve the quality, then it's a different story, but also one a lead needs to tackle.
I used to think this too, but no: a tangent-space normal map is relative in the sense that the three values are encoded and decoded against the vertex normals, tangents and binormals, BUT there are different implementations - the Y vector is not just flipped in Max, it has a *different* value altogether. This is most apparent when you use a lot of soft edges (big gradients in the normal map); if you use bevels and hard edges (and above all export tangents and binormals all the way from program to baker to engine), you're pretty much safe.
No, in Max the Y vector is just flipped (see the snippet after this post). The relationship between the vector stored in the normal map and the tangent-space basis is the same as it would be if you baked in any other program.
What you're talking about is that 3ds Max has a different algorithm than many other programs for generating the tangent/binormal at each vertex.
Assuming that the process of getting your model from your DCC app into the renderer of your engine doesn't alter the tangent/bitangent vertex data:
A normalmap baked in max and applied to a mesh exported from max should render correctly.
A normalmap baked in maya and applied to a mesh exported from maya should render correctly.
A normalmap baked in max and applied to a mesh exported from maya will not render correctly.
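Side note: because the y+/y- part is just a green-channel flip, it's even fixable after the fact, unlike a tangent-basis mismatch. A quick Pillow/numpy sketch (filenames are placeholders):

```python
import numpy as np
from PIL import Image

img = np.array(Image.open("normalmap.png"))
img[..., 1] = 255 - img[..., 1]   # y+ <-> y-: only the green channel changes
Image.fromarray(img).save("normalmap_flipped.png")
```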
Once the big (or small) production machine starts whirling, it's very hard to introduce a new workflow / pipeline to a team that is already in full swing.
Jot down notes, concerns, ideas, and suggestions about workflow / pipeline / tool issues, and if you have the means and time, try to do some research on your own to find possible solutions.
It may be harder to introduce these solutions during your current project, but as Dustin said, once you start on a new project (hopefully you get some pre-prod / R&D time) take the opportunity to talk with your team about improving workflow and pipelines. Break it down for them and hopefully have some examples to show that can help illustrate the gains.
Also... +1 million to building a stronger automated pipeline. More than ever, we have fantastic tools and software that allow this... it just takes some time to learn and to teach your crew.
Also, some tangent spaces interpolate between the vertex data in different ways - see CryEngine for a wacky example.
This x1000
The corollary to that is "if it looks good enough, it's good enough."
Technically perfect art is really satisfying and usually looks awesome, but unless there's a demonstrable gain from changing gears to do it that way versus what is already established in a 10+ person team that has milestones to hit on a budget... well, the response is usually a shrug and a rueful grin and "next version".
There's also way more to it than the per-vertex vectors matching between baker and renderer.
There's whether the bitangent gets re-generated (if at all) - either per-vertex or per-pixel. There's whether the vectors are re-normalized per-pixel. Finally, there's whether they're re-orthogonalised per-pixel.*
(Technically these last two have per-vertex variants too, although for normal maps the vertex vectors are usually unit length already, and the tangent and bitangent are at least orthogonal to the normal if not to each other.)
* I think that's all the usual variables... although there could be others I've overlooked.
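For what it's worth, here's a numpy sketch of what those per-pixel fix-ups amount to; whether a given renderer performs either step (or neither) is exactly the variable in question (names are mine):

```python
import numpy as np

def fix_up_basis(n, t, b, renormalize=True, reorthogonalize=True):
    """Per-pixel clean-up of interpolated tangent-basis vectors."""
    if renormalize:
        # linear interpolation across a triangle shortens unit vectors
        n, t, b = (v / np.linalg.norm(v) for v in (n, t, b))
    if reorthogonalize:
        # Gram-Schmidt: push tangent and bitangent back to perpendicular
        t = t - n * np.dot(n, t)
        t = t / np.linalg.norm(t)
        b = b - n * np.dot(n, b) - t * np.dot(t, b)
        b = b / np.linalg.norm(b)
    return n, t, b
```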
The renderer just reads the vertex data from the mesh, reads the texture data from the normal map (optionally reconstructing the blue channel of the normal map if you ditched it to use a 2-channel compressed format, which is pretty much impossible to get wrong since you're just reconstructing a normalized vector), and puts the two together.
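(That reconstruction amounts to solving for z on a unit vector; a numpy sketch, with the function name being mine:)

```python
import numpy as np

def reconstruct_z(x, y):
    # x and y already unpacked to [-1, 1]; the clip guards against
    # compression noise pushing x*x + y*y slightly over 1
    return np.sqrt(np.clip(1.0 - x * x - y * y, 0.0, 1.0))
```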
All issues to do with orthogonalization, vertex tangent/bitangent regeneration, triangle winding orders... that's 100% a tools issue. If your mesh and texture reach the renderer with the same data you baked with, the data in the mesh and the texture should be in sync and it should render correctly.
Tangent-space syncing is a tools issue, not a rendering issue.
Where you most commonly run into issues is when this data is changed somewhere in the pipeline between content creation and rendering. A great example is Unreal Engine: every time you import a mesh into the editor, it regenerates the vertex tangent/bitangent using the Mikk (MikkTSpace) tangent space.
When it comes to baking, the bitangent has already been regenerated per-vertex, because that's what Mikk does for normal maps (via the cross product of the normal and tangent, inverted if the dot product of the "true" bitangent and the generated bitangent is less than zero). In Unreal, this happens again per-pixel to create the final tangent basis for the conversion. If you don't enable that option in xNormal, you don't get synced results: the mesh data can be retained exactly, but it won't sync up without the matching rendering option.
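The bitangent derivation described there boils down to something like the following numpy sketch (names are mine; engines commonly store just the resulting ±1 sign in the tangent's w component rather than the vector itself):

```python
import numpy as np

def derived_bitangent(normal, tangent, true_bitangent):
    """Derive the bitangent as cross(N, T), flipped when its handedness
    disagrees with the baked 'true' bitangent."""
    b = np.cross(normal, tangent)
    if np.dot(b, true_bitangent) < 0.0:
        b = -b
    return b
```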
In versions of Unity prior to 5, that tangent-basis conversion was done at the per-vertex level: the vectors were converted into tangent space per-vertex and then fed into the fragment shader, which is why even synced normal maps look strange in forward rendering in Unity < 5. All the data is perfectly in sync, but the conversion methods don't line up at the renderer's end. In deferred rendering this wasn't an issue, since the tangent-space conversion happened per-pixel.
Max and Maya re-normalize and re-orthogonalize the interpolated vectors per-pixel. I don't think any game engine does this (although Marmoset does if you select the correct tangent basis option), so even if you've baked your normal maps in Max and the vectors get through to the renderer untouched, you're still not going to get synced results unless you do those extra steps.