Xnormal supports Marmoset's native .mesh format, which you would think would lead to better baking results. There are still some issues there, however, and the results you get are only slightly better than (or about the same as) OBJ. I've had many chats with the 8ml guys, but it hasn't really progressed.
I've done some messing around, and baking in Maya plus "locking normals" before you export gives some pretty good results in Marmoset; of course, you need a supported version of Maya to do this.
It's interesting: many people have claimed that Max is the proper tool to use with UE, and that if you use Max, you won't have any issues. However, I don't think anyone has actually shown examples of this? Any of you UE3 guys up to the task? I'd really like to see that stress-test mesh baked in Max and thrown into UE3; I've got a decent hunch it will not display correctly, or anywhere near as close as Maya's display.
I feel like any time the issue is pushed, the answer is always "oh we just add a bunch of edges until it looks ok, or use hard edges like everyone else" which is to say the ue3/max combo is on par with most pipelines, but still could be improved.
There are great efforts from artists to make sure that their output is of the highest quality. Examples can be seen in any unreal game that was made using 3dsmax.
this and the 5k euro you have to spend for the normal-map baking package, Maya :P
but yeah, I agree it should just do it right from the beginning; hopefully it's getting fixed soon
Hear, hear.
This is why I'm hoping xNormal will come to the rescue of us Max users... I think there's more chance of that program hearing our call than of Autodesk changing based on what is probably a relatively small slice of their market calling for it.
I'm also actually not too keen on using a post-2010 max version. Feel free to accuse me of being afraid of change but I don't like where they're going with the interface.
I'm thinking the ideal temporary fix would be baking everything in object space and using some program to convert it to tangent space with the correct basis. It's still a bother, but surely it'd be at least a little more efficient than going back and baking it again with a different calculator. The idea that normal maps will simply not work between different engines really makes me rage. And if, as has been theorised, this is all more or less a product of people keeping their innovations sealed off and proprietary so other companies can't appropriate them, then this is just another ugly mark for me on the case for the industrialisation of our art form.
that probably sounds more dramatic than it actually is
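For what it's worth, the object-space-to-tangent-space conversion idea above is mostly a per-texel change of basis. Here's a minimal pure-Python sketch (all helper names are mine, and it assumes you can reproduce the target engine's per-pixel tangent frame, which is of course the hard part):

```python
# Sketch of rebasing an object-space normal-map texel into a given
# tangent frame (T, B, N). Assumes the frame is orthonormal; a real
# converter would derive the frame per-texel from the mesh + UVs.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    l = sum(x * x for x in v) ** 0.5
    return tuple(x / l for x in v)

def decode(rgb):
    """Map a 0..1 texel to a -1..1 normal vector."""
    return normalize(tuple(c * 2.0 - 1.0 for c in rgb))

def encode(n):
    """Map a -1..1 normal back to a 0..1 texel."""
    return tuple(c * 0.5 + 0.5 for c in n)

def object_to_tangent(texel_rgb, T, B, N):
    """Project one object-space texel onto the frame (T, B, N).

    For an orthonormal frame, projecting onto each basis vector is the
    inverse of the usual tangent->object transform.
    """
    n = decode(texel_rgb)
    return encode(normalize((dot(n, T), dot(n, B), dot(n, N))))

# A texel whose object-space normal points straight up (+Z), on a
# surface whose tangent frame is also axis-aligned, comes out as the
# flat "pale blue" (0.5, 0.5, 1.0) in tangent space, as expected.
flat = object_to_tangent((0.5, 0.5, 1.0), (1, 0, 0), (0, 1, 0), (0, 0, 1))
```

The whole disagreement in this thread is about how that (T, B, N) frame gets built and interpolated, which is why the same object-space bake can produce different tangent-space maps per engine.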
Anyone else caught that one?
http://www.naughtydog.com/site/post/gdc_sessions_saturday_march_13/
At one of the Naughty Dog sessions at GDC, he mentioned they used xNormal with a custom plugin for their tangent basis. They also rely heavily on Maya as their main tool, but I don't know if they use the same basis as in Maya...
What can I do to fix this horrible obvious triangulation? (noticeable on the edge of this axe blade)
Rebuild the mesh somehow? Surely it might not be as optimized when it comes to where the triangles are. But I'm not so good with these things.
You can see it much more obviously in Marmoset or in any other engine.
This particular map was generated in Maya; however, I have been making other attempts in XN too, with similar results, so I think the problem is in the mesh or UVs. Thanks in advance!
Break the UVs along the sharp edge, and add a hard edge along the UV seam.
They are already broken up in the UVs, and I think that all the other sharp edges are looking OK.
Will try and harden the edges along the seam. But I think that I've already tried this...
Edit: Ok, this helped. Thanks!
Although I got some of those black seams in between...
However, how will this work with the smoothing groups in, let's say, Unreal for example?
Edit 2:
Worked really well to export the smoothing groups to Unreal as well. Thanks for the advice!
Aside from the fact that the viewport shaders aren't aligned in 3ds Max, I just had to come to the conclusion that the scanline renderer is not 100% aligned in 3ds Max 2009 either!
I hope I made a mistake, and I'll run some more in-depth tests during the week, but that would be really disappointing after the scanline renderer worked almost perfectly with Max's normal maps in 3ds Max 8.
Is this a known problem?
edit: Ok, false alarm. It just expects bump strength set to 100 now.
Thanks a lot for the good read, guys. This has been a very helpful thread. I'd like to share what I have learned from it - and this is mostly aimed at people who are writing FX shaders for the Max viewport:
The standard way to create a normal vector from a tangent-space normal map in the pixel shader (the method that I and many others have used) has been like this:
//sample normal map and expand to -1 to 1
float3 NormalMap = tex2D(NormalMapSampler, In.texCoord.xy).xyz * 2 - 1;
//create the tangent basis
float3 Nn = normalize(In.worldNormal);
float3 Tn = normalize(In.worldTangent);
float3 Bn = normalize(In.worldBinormal);
//transform the normal from tangent space to world space
float3 N = normalize(float3((NormalMap.x * Bn) + (NormalMap.y * -Tn) + (NormalMap.z * Nn)));
This is the method that I've been using for years. However, after looking at fozi's code and investigating the output from Max's Hardware Shaded viewport, it would appear that there's another method:
//sample normal map and expand to -1 to 1
float3 NormalMap = tex2D(NormalMapSampler, In.texCoord.xy).xyz * 2 - 1;
//create the tangent basis
float3 Nn = normalize(In.worldNormal);
float3 Bn = normalize(cross(In.worldBinormal, In.worldNormal));
float3 Tn = cross(In.worldNormal, Bn);
if (dot(Bn,In.worldTangent) < 0.0) Bn = -Bn;
float3x3 toWorld = transpose(float3x3(Tn, Bn, Nn));
float3 N = float3(normalize(mul(toWorld, NormalMap)));
When I edit an FX shader to use this new method for creating the tangent basis, I'm able to get perfect results when using normal maps that were baked in Maya and also normal maps that were created using fozi's object->tangent space conversion utility. Win!
However, I don't get good results from a normal map baked with Max's RTT with either of these methods. The first method (the one I've been using all along) yields a slightly better result, but they're still both wrong.
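For anyone who wants to poke at the difference between the two snippets without firing up Max, here they are transcribed into plain Python (the input numbers are hypothetical). Even with a clean, axis-aligned orthonormal frame the two reconstructions disagree, with one axis of the result sign-flipped, which lines up with the familiar flipped-channel look people report:

```python
# Both HLSL snippets above, transcribed as literally as possible into
# pure Python so their outputs can be compared on the same inputs.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    l = sum(x * x for x in v) ** 0.5
    return tuple(x / l for x in v)

def scale(v, s):
    return tuple(x * s for x in v)

def add(*vs):
    return tuple(sum(c) for c in zip(*vs))

def method_one(nm, T, B, N):
    # N = normalize(nm.x * Bn + nm.y * -Tn + nm.z * Nn)
    Nn, Tn, Bn = normalize(N), normalize(T), normalize(B)
    return normalize(add(scale(Bn, nm[0]), scale(Tn, -nm[1]), scale(Nn, nm[2])))

def method_two(nm, T, B, N):
    # Re-orthogonalized basis, as in the Hardware Shaded viewport snippet.
    Nn = normalize(N)
    Bn = normalize(cross(B, N))
    Tn = cross(N, Bn)
    if dot(Bn, T) < 0.0:
        Bn = scale(Bn, -1.0)
    # mul(transpose(float3x3(Tn, Bn, Nn)), nm) == nm.x*Tn + nm.y*Bn + nm.z*Nn
    return normalize(add(scale(Tn, nm[0]), scale(Bn, nm[1]), scale(Nn, nm[2])))

# Even with an axis-aligned orthonormal frame the two snippets disagree:
# one axis of the result comes out sign-flipped.
T, B, N = (1, 0, 0), (0, 1, 0), (0, 0, 1)
sample = normalize((0.3, 0.5, 0.8))
n1 = method_one(sample, T, B, N)
n2 = method_two(sample, T, B, N)
```

On real meshes, where the interpolated tangent and binormal also drift out of orthogonality across a triangle, the re-orthogonalizing version diverges further still.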
The link CrazyButcher posted to the source code for how tangents/binormals are created in Maya is pure gold. That's exactly what anyone needs to get Maya normal maps working perfectly in their game engine. Now we just need Autodesk to post similar source code for how the tangents are generated in the process of baking normal maps in Max.
Fozi's project is cool in that it yields good results in the Max viewport - but the drawback is you have to rebake your normal map. That's no big deal for one test model here and there - but what about the thousands of assets that we've already created? What we really need is to get all of the normal maps that we've already created using Max's RTT to look good in our game engines. And for that, we need Autodesk to release the code that they're using to generate tangent/binormal data for RTT normal map generation and the scanline renderer - just like they did for Maya.
CrazyButcher, does your modifier convert normal/tangent/binormals so that they work correctly with normal maps baked with Max's RTT, and if so - how can I get my hands on that secret sauce?
Sorry to come in after some constructive discussion with confessions of mediocrity again, but I am still at a loss when it comes to baking continuous lighting over UV seams.
edit: nevermind, I just got it... using Blender. Without even having to define hard edges along the seams. Of course, it still looks crap in anything but the immediate Blender viewport. So, no win. I've been googling everything I can about tangent basis, baking, rendering and just feel helpless. Almost like everything to do with tangent space normal maps has become entirely fucked up since their invention and no one really knows what methods to use because there are no standards, etc; it is distressing.
This thread comes up in the top results every time. Feels weird that shit would be so brokenly approximated for so long before this topic received much discussion. Why is tangent space so broken? Or am I overreacting, and confused?
One theory I've heard is that all this code was developed around the same time by different folks, all of them thinking they're right.
Honestly, I wouldn't worry about it too much. Just talk to your tech artists and figure out what works best for your particular project.
To be fair, there is no real "right or wrong" here; there are just different approaches to turning a per-triangle "space" into something that works per-vertex and interpolates. Obviously different strategies were taken, so technically, if it worked for each of them, they were "right" for their purpose. There was no real need to look into how others do it; I think this problem simply emerged later with the rising popularity of cross-app/engine workflows.
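As a concrete illustration of that per-triangle "space": one common construction (not the only one, which is exactly the problem) derives the tangent and bitangent from the UV gradients of each triangle, then averages the per-triangle results at shared vertices. A sketch with hypothetical data:

```python
# One common way to build a per-triangle tangent/bitangent from
# positions and UVs, then average per-triangle results at shared
# vertices. Apps differ in the averaging step too: area/angle
# weighting, smoothing-group splits, Gram-Schmidt vs. not, etc.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def normalize(v):
    l = sum(x * x for x in v) ** 0.5
    return tuple(x / l for x in v)

def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Solve e1 = du1*T + dv1*B, e2 = du2*T + dv2*B for T and B."""
    e1, e2 = sub(p1, p0), sub(p2, p0)
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    r = 1.0 / (du1 * dv2 - du2 * dv1)      # assumes non-degenerate UVs
    T = tuple(r * (dv2 * a - dv1 * b) for a, b in zip(e1, e2))
    B = tuple(r * (du1 * b - du2 * a) for a, b in zip(e1, e2))
    return normalize(T), normalize(B)

def average_at_vertex(tangents):
    """Per-vertex tangent = normalized sum of face tangents sharing it."""
    summed = tuple(sum(c) for c in zip(*tangents))
    return normalize(summed)

# A unit quad in the XY plane with an identity UV mapping: both of its
# triangles agree that the tangent follows +X (the U direction).
tA = triangle_tangent((0,0,0), (1,0,0), (1,1,0), (0,0), (1,0), (1,1))[0]
tB = triangle_tangent((0,0,0), (1,1,0), (0,1,0), (0,0), (1,1), (0,1))[0]
shared = average_at_vertex([tA, tB])
```

Every decision in there (weighting, splitting, orthogonalization, when to flip handedness) is a point where two implementations can legitimately diverge.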
Yep, I like to think of it kinda like power sockets in different countries.
In each country everyone has their own type of power socket and plug, and they all work well internally.
But as soon as you move between countries, you have to get some sort of adapter, because the methods developed wherever you're going don't match wherever you're coming from.
Obviously the ideal solution for this is for everyone to use the same power socket. But who defines that? And anyway, it's going to be a huge amount of work for everyone to conform.
That said, standardising tangent space is probably easier than standardising power sockets...
From left to right:
1. EQ's normal map that he baked in Maya applied in the 3ds Max viewport with the "Show Hardware Map in Viewport" option.
2. EQ's normal map that he baked in Maya applied in the 3ds Max viewport with a custom FX shader that I wrote.
3. A normal map that I baked using 3ds Max's RTT applied in the 3ds Max viewport with the "Show Hardware Map in Viewport" option.
4. A normal map that I baked using 3ds Max's RTT applied in the 3ds Max viewport with a custom FX shader that I wrote.
A couple of things that I learned while doing this:
1. This stuff is super sensitive to the way edges are turned. I had to request a new version of EQ's "Qbert Tower" model that was all triangles because my OBJ importer was turning the edges of the quads incorrectly. You can't make your NM with one model and then turn a bunch of edges and expect it to still look good.
2. This screenie was done in 3ds Max 2010. I tried this same experiment in 3ds Max 2008, but my results weren't nearly as good. I think Autodesk must have done something to improve the viewport tangent basis between 3ds Max 2008 and 2010.
3. The normal map rendered in Maya looks near perfect in the Max viewport if you use the "Show Hardware Map in Viewport" feature - or if you use an fx shader that creates its normals in an equivalent manner (see my last post for the code.)
4. The NM baked using Max's RTT looks decent, but there are visible seams along UV borders. These go away completely when using this NM with the scanline renderer as others have also noticed.
5. I contacted Autodesk about this (Ken P and Neil Hazzard), and it sounds like they're planning to make some sort of official announcement/statement about it and perhaps provide some insight into how the situation can be improved and/or how you can get the best results in your game engine using Max's RTT baked normal maps.
http://www.bencloward.com/nmtest_ben.rar
I've included a .max file in both 2008 and 2010 versions so you can see the difference.
Nice experiment Ben, thanks for posting the explanations!
And yes, of course, triangulation of "quadded meshes" is going to be an issue. This should never be a problem as long as you're exporting directly into the engine using an exporter that preserves internally calculated tangents and internally calculated triangulation from your 3D app.
The more levels of indirection you go through to reach your end result, the harder it's going to be to preserve all the original information from your 3D app.
This is one reason why I think that Doom3 (id Tech 4) and related stuff works best, since you're then baking your maps using the final triangulated and processed "in-game model" anyway, so you don't have to worry about things not matching between your baking app and your game, because your game is your baking app!
It kinda makes me wonder why engines/pipelines working that way are so rare. Writing a little command-line-based ray-casting utility sure isn't that hard? Processing every asset through it might be painful and time-consuming, but for the few sensitive cases needing top accuracy it's so awesome. My first bake ever was in Doom 3's renderbump, and yeah, that thing is just perfect...
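For the curious, the core of such a ray-casting utility really is small. A toy sketch (geometry is hypothetical; a real renderbump-style tool loops this over every texel of the lowpoly, using its interpolated surface points and normals as ray origins and directions):

```python
# Toy core of a renderbump-style baker: cast a ray from a point on the
# lowpoly surface and record the normal of the highpoly triangle it hits.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    l = sum(x * x for x in v) ** 0.5
    return tuple(x / l for x in v)

def ray_triangle(orig, direction, v0, v1, v2, eps=1e-8):
    """Moller-Trumbore intersection; returns hit distance t or None."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < eps:
        return None
    inv = 1.0 / det
    tv = sub(orig, v0)
    u = dot(tv, p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = cross(tv, e1)
    v = dot(direction, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = dot(e2, q) * inv
    return t if t > eps else None

def bake_sample(origin, direction, highpoly):
    """Return the face normal of the nearest highpoly triangle hit."""
    best = None
    for v0, v1, v2 in highpoly:
        t = ray_triangle(origin, direction, v0, v1, v2)
        if t is not None and (best is None or t < best[0]):
            best = (t, normalize(cross(sub(v1, v0), sub(v2, v0))))
    return best[1] if best else None

# One texel: lowpoly point at the origin, ray straight up (+Z), a single
# flat highpoly triangle one unit above; the sampled normal is +Z.
tri = ((-1.0, -1.0, 1.0), (2.0, -1.0, 1.0), (0.0, 2.0, 1.0))
n = bake_sample((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), [tri])
```

The part a toy version skips, and the part this whole thread is about, is encoding that sampled normal into the exact tangent space the engine will decode it with.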
I tried the theory about using hard edges around each UV island. Looks a bit better than the soft-edge-only Max bake, but still more artifacts than the Maya bake.
I also changed the camera to orthographic for an easier compare, don't think that affects it though.
It kinda makes me wonder why engines/pipelines working that way are so rare. Writing a little command-line-based ray-casting utility sure isn't that hard? Processing every asset through it might be painful and time-consuming, but for the few sensitive cases needing top accuracy it's so awesome. My first bake ever was in Doom 3's renderbump, and yeah, that thing is just perfect...
Well, in my experience, it's because the programmers writing the tools/engine/shaders use best-case test meshes, like a well-subdivided sphere, or simply test a normal generated from a bump on a plane/cube/etc. and call it done. Anything that doesn't work out well is blamed on the artist, or the baking app, or someone other than the code team.
Heh, in my experience it's the complete opposite, the programmers here are very careful about making sure all test cases are not just simple examples which don't have any bearing on a final mesh... I guess it depends who you work with
Ok, so I went ahead and got all excited about Max 2011 being out (yeah I know, my bad.).
Tangents are still broken.
EarthQuake: you use modo yeah? Everything seems pretty cool in that? Thinking about maybe switching app, as the workaround I have for this involves using Max AND Maya, and going between the 2 is just ridiculous for something that should be trivial to fix.
For the record, 3ds Max had a first solution for normal maps in its renderers before a "standard" emerged. The viewport normal maps were implemented later and follow that "standard". Understand that today, 3ds Max simply uses a different solution for normal maps than what you guys are now used to.
I'm currently gathering information and looking at what we'll be doing next.
I'd like to hear from you: would providing a solution similar to Maya's or xNormal's be the right thing for you? And most importantly, if such a solution were implemented, would it affect people who have managed to use the current implementation?
Well... Let me know, I'm all ears!
JF Yelle, Autodesk, M&E technical product manager.
Jean-Francois, thanks for dropping by.
I think the solution to this is really simple. Just add a "Tangent Basis" dropdown to the NormalMap renderelement that allows the user to select between the current method (which works in the renderer) OR a new, realtime-correct basis such as Maya's or Xnormal's method that works in viewport.
I second Xoliul's suggestion, make it an option. Probably best to leave the current tangent basis as the default, so as not to mess up existing customers, and also I think because a scanline/MR-centric tangent basis would best serve projects focused on Max output (which probably is the top % of the user base, from the product mgmt perspective).
I think it would be very helpful if Autodesk added RTT presets for as many TBs as possible. Similar to how the guruware OBJ exp/imp dialog has presets for a bunch of different apps.
Would also help tremendously to allow users to easily add new tangent bases to RTT, so for example a game that uses their own custom TB could add that to the RTT presets list. MAXScript and SDK access would help a ton here.
Also I wonder if the viewport shaders could have TB presets as well? It might be a good first step to add a dropdown within the Normal Bump map type, so the hardware shaders could be set to use the same TB that a user chose in RTT.
Also I wonder if the viewport shaders could have TB presets as well? It might be a good first step to add a dropdown within the Normal Bump map type, so the hardware shaders could be set to use the same TB that a user chose in RTT.
That would cut viewport shaders such as mine a bit short, since those don't use Normal Bump.
My biggest request would be that whatever is done in terms of tangent basis, please make it transparent either somewhere in the RTT dialogue, or documented somewhere in the help docs.
I don't really care if it's "correct", I just care if I can get a description of how you derived that tangent space so I can pass it on to a rendering programmer to match our engine up to it.
Xoliul, I think your proposal makes a lot of sense: not breaking existing functionality while adding more. I'm copying Eric's comments for review as a possible solution.
GLib, I note your request for better documentation.
Thanks for your feedback; don't hesitate if you have some more comments, I'll keep reading for a bit.
JF
There is another issue with RTT, where sometimes it seems to disregard the orientation of the invisible edges in polygons, choosing its own orientations instead.
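A quick bit of arithmetic shows why that matters: a non-planar quad is a genuinely different surface depending on which diagonal triangulates it, so a bake made against one triangulation can't match a display using the other. Sketch with a hypothetical quad:

```python
# Why turned/invisible edges matter: a non-planar quad triangulated
# along one diagonal is a different surface than the same quad
# triangulated along the other. Quad with one lifted corner:
a, b, c, d = (0, 0, 0.0), (1, 0, 0.0), (1, 1, 1.0), (0, 1, 0.0)

def midpoint(p, q):
    return tuple((x + y) / 2.0 for x, y in zip(p, q))

# The quad's centre lies on whichever diagonal the triangulation uses,
# so its height depends entirely on the edge orientation:
center_ac = midpoint(a, c)   # diagonal a-c: centre height 0.5
center_bd = midpoint(b, d)   # diagonal b-d: centre height 0.0
```

If RTT silently picks its own diagonals, the rays it casts leave from a different surface than the one the engine renders, and the baked normals inherit that mismatch.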
I was very excited to get your email and even more excited to see that you have visited the thread!
I agree that a drop-down box in RTT to select tangent calculation/interpolation etc. would be excellent, especially if it's extendable via the SDK or MAXScript, to enable studios using proprietary engines to sync up one way or another; plus presets for whatever you deem appropriate, and a default which matches the current behaviour so as not to upset anybody's applecart.
Just want to say it's really heartening to see Autodesk guys here on the threads taking feedback...
On top of the tangent thing, I like baking in xNormal better than Max or Maya due to its solid production pipeline. Once you've linked in the relevant OBJs and texture paths, it's very quick to make the myriad of changes to smoothing group angles, adding chamfers, UV splits etc. without breaking the render setup. To get a perfect bake I will always need these tweaks, and I found Max and Maya both quite flaky here, needing to be set up again to prevent issues with the bake...
It would also be really handy to save simple render setups for Render To Texture, including skylight etc. Again, speed of use: in xNormal I just press go most of the time.
Forgive me if this is already possible, it's been a while since I baked in Max, but I would go back to it, especially if I could use MR easily for bakes.
Max's RTT is pretty good at preserving settings in between bakes, as long as I'm invoking it from the same mesh. Switching to a different mesh either resets it all, or loads whatever settings were used last with that particular mesh.
It also has the "Object and Output Settings Presets" which save most of the settings (not bitmap name!), plus it has the Render Presets where you can flip it quickly to MR or Scanline or whatever render presets you've saved off.
On top of the tangent thing, I like baking in xNormal better than Max or Maya due to its solid production pipeline. Once you've linked in the relevant OBJs and texture paths, it's very quick to make the myriad of changes to smoothing group angles, adding chamfers, UV splits etc. without breaking the render setup. To get a perfect bake I will always need these tweaks, and I found Max and Maya both quite flaky here, needing to be set up again to prevent issues with the bake...
I'm not sure what you're saying here, because for me, any time I make a change to normals/hard edges/etc. I've got to re-set up the cage in xNormal's 3D viewer, which is a huge pain (I posted on this in the XN thread). Sure, for stuff that only has one smoothing group I think it's a great workflow: simply re-export your mesh and you're good to go without changing anything. But mesh normal changes require more painful work than Max/Maya, IMO. I would like to know if you're doing something different here. In XN you've got to go into the cage editor and save your cage, or else you'll get broken edge projection along your hard edges.
I generally like how Maya works; I rely very little/virtually not at all on doing custom cages or cage tweaks (as Maya's cage/envelope only controls ray distance, not direction anyway) and fix any issues with geometry instead. Then any changes I make to the mesh are instantly updated, and all I have to do is click the transfer button again, no need to re-set up my cage like in Max or XN.
Oddly enough the code snippet he posted about calculateTangentAndBinormal looks (from my rough memory and code conversion) to be the same as the one that Maya uses to calculate the tangent and binormal too, which makes sense, but... weird.
A first step on normal maps is to document what's going on in 3ds Max 2011.
The code provided is the tangent basis used to compute the rendered(baked) normal maps in 3ds Max 2011.
It is different from the viewport's tangent basis used in 3ds Max 2011. Max's viewport basis is, AFAIK, closer to (if not the same as) the one found in Maya.
In other words, the viewport and the scanline renderer do not use the same tangent basis. This accounts for a lot of confusion around 3ds Max's normal mapping.
stay tuned, it's not over
Jean-Francois Yelle, technical product manager, Autodesk
Replies
Xnormal supports Marmoset's native .mesh format, which you would think would lead to better results baking with. There are still some issues there however, and the results you get are only slightly better/about the same as OBJ. I've had many chats with the 8ml guys but it hasn't really progressed.
I've done some messing around, and baking in maya + "locking normals" before you export gives some pretty good results in marmo, of course you need to have a supported version of maya to do this.
Unreal, Infernal, Vision, and proprietary.
I feel like any time the issue is pushed, the answer is always "oh we just add a bunch of edges until it looks ok, or use hard edges like everyone else" which is to say the ue3/max combo is on par with most pipelines, but still could be improved.
There are great efforts from artists to make sure that their output is of the highest quality. Examples can be seen in any unreal game that was made using 3dsmax.
this and the 5k euro you have to spend for the normalmap baking package maya :P
but yeah i agree it should just do it right from the beginning, hopefully its getting fixed soon
here here
This is why i'm hoping xNormal will come to the rescue of we max users... I think there's more chance of that program hearing our call than autodesk changing based on what is probably a relatively small slice of their market call for.
I'm also actually not too keen on using a post-2010 max version. Feel free to accuse me of being afraid of change but I don't like where they're going with the interface.
I'm thinking the ideal temporary fix would be baking everything in object space and using some program to convert it to a tangentspace with the correct basis. It's still a bother but surely it'd be at least a little bit more efficient than going back and baking it again with a different calculator. The idea that normalmaps will simply not work between different engines really makes me rage, and if, as it's been theorised, this is all more or less a product of people keeping their innovations sealed off and proprietary so other companies can't appropriate them then this is just another ugly mark for me on the case for the industrialisation of our art form
that probably sounds more dramatic than it actually is
Anyone elese caught that one?
http://www.naughtydog.com/site/post/gdc_sessions_saturday_march_13/
Rebuild the mesh somehow? Surely it might not be as optimized when it comes to where the triangles are. But I'm not so good with these things.
You can see it much more obvious in marmoset or in any other engine.
This particular map is generated in maya, however I have been making other attempts in XN too. With similar results, so I think the problem is in the mesh or uv's.
Thanks in advance!
They are already broken up in the uv's and i think that all the other sharp edges are looking ok.
Will try and harden the edges along the seem. But I think that I've already tried this...
Edit: Ok this helped, Thanks!
Although I got some of those black seems in between..
However, how will this work with the smoothing groups in lets say Unreal for example?
Edit 2:
Worked really well to export the smoothing groups to Unreal aswell. Thanks for the advice!
I hope I made a mistake and will run some more indepth tests during the week but that would be really disappointing after at least the scanline-render working almost perfectly with max's normalmaps in 3ds max 8.
Is this a known problem?
edit: Ok, false alarm. It just expects bump strength set to 100 now.
The standard way (the method that I have used and many others) to create a normal vector from a tangent space normal map in the pixel shader has been like this:
//sample normal map and expand to -1 to 1
float3 NormalMap = tex2D(NormalMapSampler, In.texCoord.xy).xyz * 2 - 1;
//create the tangent basis
float3 Nn = normalize(In.worldNormal);
float3 Tn = normalize(In.worldTangent);
float3 Bn = normalize(In.worldBinormal);
//transform the normal from tangent space to world space
float3 N = normalize(float3((NormalMap.x * Bn) + (NormalMap.y * -Tn) + (NormalMap.z * Nn)));
This is the method that I've been using for years. However, after looking at fozi's code and investigating the output from Max's Hardware Shaded viewport, it would appear that there's another method:
//sample normal map and expand to -1 to 1
float3 NormalMap = tex2D(NormalMapSampler, In.texCoord.xy).xyz * 2 - 1;
//create the tangent basis
float3 Nn = normalize(In.worldNormal);
float3 Bn = normalize(cross(In.worldBinormal, In.worldNormal));
float3 Tn = cross(In.worldNormal, Bn);
if (dot(Bn,In.worldTangent) < 0.0) Bn = -Bn;
float3x3 toWorld = transpose(float3x3(Tn, Bn, Nn));
float3 N = float3(normalize(mul(toWorld, NormalMap)));
When I edit an FX shader to use this new method for creating the tangent basis, I'm able to get perfect results when using normal maps that were baked in Maya and also normal maps that were created using fozi's object->tangent space conversion utility. Win!
However, I don't get good results from a normal map baked with Max's RTT with either of these methods. The first method (the one I've been using all along) yields a slightly better result, but they're still both wrong.
CrazyButcher's link that he posted to the source code for how tangents/binormals are created in Maya is pure gold. That's exactly what anyone needs to get Maya normal maps working perfectly in their game engine. Now we just need Autodesk to post similar source code for how the tangents are generated in the process of baking normal maps in Max.
Fozi's project is cool in that it yields good results in the Max viewport - but the drawback is you have to rebake your normal map. That's no big deal for one test model here and there - but what about the thousands of assets that we've already created? What we really need is to get all of the normal maps that we've already created using Max's RTT to look good in our game engines. And for that, we need Autodesk to release the code that they're using to generate tangent/binormal data for RTT normal map generation and the scanline renderer - just like they did for Maya.
CrazyButcher, does your modifier convert normal/tangent/binormals so that they work correctly with normal maps baked with Max's RTT, and if so - how can I get my hands on that secret sauce?
edit: nevermind, I just got it... using Blender. Without even having to define hard edges along the seams. Of course, it still looks crap in anything but the immediate Blender viewport. So, no win. I've been googling everything I can about tangent basis, baking, rendering and just feel helpless. Almost like everything to do with tangent space normal maps has become entirely fucked up since their invention and no one really knows what methods to use because there are no standards, etc; it is distressing.
This thread comes up in the top results every time. Feels weird that shit would be so brokenly approximated for so long before this topic received much discussion. Why is tangent space so broken? Or am I overreacting, and confused?
One theory I've heard is that all this code was developed around the same time by different folks. All of them thinking they're right.
Honestly I wouldn't worry about it too much. Just talk to your tech artists and figure out what works best for your particular project.
to be fair, there is no real "right or wrong" here, there is just different approaches on how to turn a per-triangle "space" into something that works per-vertex and interpolates. Obviously different strategies were taken. So technically if it worked for each of them, they were "right" for their purpose. There was no real need into looking how others do it, I think this problem simply emerged later with the rising popularity and cross-app/engine workflow.
In each country everyone has their own type of power socket and plug, and they all work well internally.
But as soon as you move between countries, you have to get some sort of adapter, because the methods developed wherever you're going don't match wherever you're coming from.
Obviously the ideal solution for this is for everyone to use the same power socket. But who defines that? And anyway, it's going to be a huge amount of work for everyone to conform.
That said, standardising tangent space is probably easier than standardising power sockets...
From Left to Right:
1. EQ's normal map that he baked in Maya applied in the 3ds Max viewport with the "Show Hardware Map in Viewport" option.
2. EQ's normal map that he baked in Maya applied in the 3ds Max viewport with a custom FX shader that I wrote.
3. A normal map that I baked using 3ds Max's RTT applied in the 3ds Max viewport with the "Show Hardware Map in Viewport" option.
4. A normal map that I baked using 3ds Max's RTT applied in the 3ds Max viewport with a custom FX shader that I wrote.
A couple of things that I learn while doing this:
1. This stuff is super sensitive to the way edges are turned. I had to request a new version of EQ's "Qbert Tower" model that was all triangles because my OBJ importer was turning the edges of the quads incorrectly. You can't make your NM with one model and then turn a bunch of edges and expect it to still look good.
2. This screenie was done in 3ds Max 2010. I tried this same experiment in 3ds Max 2008, but my results weren't near as good. I think Autodesk must have done something to improve the viewport tangent basis between 3ds Max 2008 and 2010.
3. The normal map rendered in Maya looks near perfect in the Max viewport if you use the "Show Hardware Map in Viewport" feature - or if you use an fx shader that creates its normals in an equivalent manner (see my last post for the code.)
4. The NM baked using Max's RTT looks decent, but there are visible seams along UV borders. These go away completely when using this NM with the scanline renderer as others have also noticed.
5. I contacted Autodesk about this (Ken P and Neil Hazzard) and it sounds like they're planning to make some sort of official announcement/statement about it and perhaps provide some insight into how the situation can be improved and/or how you can get the best results in your game engine using Max's RTT baked normal maps.
http://www.bencloward.com/nmtest_ben.rar
I've included a .max file in both 2008 and 2010 versions so you can see the difference.
And yes, of course, triangulation of "quadded meshes" is going to be an issue. This should never be a problem as long as you're either exporting directly into the engine using an exporter that preserves internally calculated tangents and internally calculated triangulation from your 3D app.
The more levels of indirection you go through to reach your end result, the harder it's going to be to preserve all the original information from your 3D app.
This is one reason why I think that Doom3 (id Tech 4) and related stuff works best, since you're then baking your maps using the final triangulated and processed "in-game model" anyway, so you don't have to worry about things not matching between your baking app and your game, because your game is your baking app!
I tried the theory about using hard edges around each UV island. Looks a bit better than the soft-edge-only Max bake, but still more artifacts than the Maya bake.
I also changed the camera to orthographic for an easier compare, don't think that affects it though.
Well, in my experience, it's because the programmers writing the tools/engine/shaders use best-case test meshes, like a well-subdivided sphere, or simply test a normal generated from a bump on a plane/cube/etc. and call it done. Anything that doesn't work out well is blamed on the artist, or the baking app, or someone else other than the code team.
Tangents are still broken.
EarthQuake: you use modo yeah? Everything seems pretty cool in that? Thinking about maybe switching app, as the workaround I have for this involves using Max AND Maya, and going between the 2 is just ridiculous for something that should be trivial to fix.
For the record, 3ds Max had its first solution for normal maps in its renderers before a "standard" emerged. The viewport normal maps were implemented later and follow that "standard". Understand that today, 3ds Max simply uses a different solution for normal maps than what you guys are now used to.
I'm currently gathering information and looking at what we'll be doing next.
I'd like to hear from you: would providing a solution similar to Maya's or xNormal's be the right thing for you? And most importantly, if such a solution were implemented, would it affect people who have managed to get the current implementation working?
Well... let me know, I'm all ears!
JF Yelle, Autodesk, M&E technical product manager.
I think the solution to this is really simple. Just add a "Tangent Basis" dropdown to the NormalMap renderelement that allows the user to select between the current method (which works in the renderer) OR a new, realtime-correct basis such as Maya's or Xnormal's method that works in viewport.
I think it would be very helpful if Autodesk added RTT presets for as many TBs as possible. Similar to how the guruware OBJ exp/imp dialog has presets for a bunch of different apps.
Would also help tremendously to allow users to easily add new tangent bases to RTT, so for example a game that uses their own custom TB could add that to the RTT presets list. MAXScript and SDK access would help a ton here.
Also I wonder if the viewport shaders could have TB presets as well? It might be a good first step to add a dropdown within the Normal Bump map type, so the hardware shaders could be set to use the same TB that a user chose in RTT.
That would cut viewport shaders such as mine a bit short, since those don't use Normal Bump.
You could certainly write in support for different TBs, via Methods.
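For what it's worth, the preset idea being discussed could be sketched like this (Python, purely illustrative; the names and the registry mechanism are invented, and a real RTT/SDK hook would look nothing like this): a table mapping a preset name to the routine that computes the tangent frame, which a studio could extend with its own engine's basis.

```python
# Hypothetical sketch of the "TB presets" idea: a registry mapping a
# preset name (e.g. a target engine) to the function that computes the
# tangent frame. All names here are invented for illustration.

TANGENT_BASIS_PRESETS = {}

def register_tangent_basis(name):
    """Decorator so a studio can plug in its own engine's basis."""
    def wrap(fn):
        TANGENT_BASIS_PRESETS[name] = fn
        return fn
    return wrap

@register_tangent_basis("max_scanline")
def max_scanline_basis(mesh):
    # placeholder for whatever the scanline renderer currently does
    return "scanline-style tangents for %s" % mesh

@register_tangent_basis("maya_viewport")
def maya_viewport_basis(mesh):
    # placeholder for a Maya/viewport-style basis
    return "viewport-style tangents for %s" % mesh

def bake_normal_map(mesh, preset="max_scanline"):
    basis = TANGENT_BASIS_PRESETS[preset]   # the user-selected dropdown value
    return basis(mesh)

print(bake_normal_map("stress_test_mesh", preset="maya_viewport"))
```

The point of the registry shape is exactly what was asked for above: the default preset keeps current behaviour, and new bases can be added without touching the baker itself.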
I don't really care if it's "correct", I just care if I can get a description of how you derived that tangent space so I can pass it on to a rendering programmer to match our engine up to it.
Nice to see an autodesk guy collecting feedback.
GLib, I note your request for better documentation.
Thanks for your feedback. Don't hesitate if you have some more comments, I'll keep reading for a bit.
JF
A thread about the issue.
Skewed Specular Highlight?
Perhaps this happens because RTT flattens the model, which could cause Editable Poly to turn its interior edges?
I was very excited to get your email and even more excited to see that you have visited the thread!
I agree that a drop-down box in RTT to select tangent calculation/interpolation etc. would be excellent, especially if it were extendable via the SDK or MAXScript to enable studios using proprietary engines to sync up one way or another, with presets for whatever you deem appropriate, and a default that matches the current behaviour so as not to upset anybody's applecart.
thx
Props for awesome use of the correct plural of "basis". I am a word nerd or something; that cheered me up!
On top of the tangent thing, I like baking in xNormal better than Max or Maya due to its solid production pipeline. Once you've linked in the relevant OBJs and texture paths, it's very quick to make the myriad of changes to smoothing group angles, chamfers, UV splits, etc. without breaking the render setup, and to get a perfect bake I will always need these tweaks. I found Max and Maya both quite flaky with this, needing to be set up again to prevent issues with the bake...
It would also be really handy to save simple render setups for Render To Texture, including the skylight etc. Again, for speed of use, in xNormal I just press go most of the time.
Forgive me if this is already possible, it's been a while since I baked in Max, but I would go back to it, especially if I could use mental ray easily for bakes.
It also has the "Object and Output Settings Presets" which save most of the settings (not bitmap name!), plus it has the Render Presets where you can flip it quickly to MR or Scanline or whatever render presets you've saved off.
I'm not sure what you're saying here, because for me, any time I make a change to normals/hard edges/etc. I've gotta re-set up the cage in xNormal's 3D viewer, which is a huge pain (I posted on this in the XN thread). Sure, for stuff that only has one smoothing group I think it's a great workflow: simply re-export your mesh and you're good to go without changing anything. But mesh normal changes require more painful work than Max/Maya IMO. I would like to know if you're doing something different here. In XN you've got to go into the cage editor and save your cage, or else you'll get broken edge projection along your hard edges.
I generally like how Maya works. I rely very little, virtually not at all, on custom cages or cage tweaks (as Maya's cage/envelope only controls ray distance, not direction anyway) and fix any issues with geometry instead. Then any changes I make to the mesh at all are instantly updated, and all I have to do is click the transfer button again, with no need to re-set up my cage like in Max or XN.
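The distance-vs-direction distinction here can be sketched in a few lines (toy Python with invented names, not code from Maya or xNormal): an envelope only moves the ray's start point out along the existing vertex normal, while an editable cage can also change where the ray points.

```python
# Sketch: envelope (distance only) vs cage (distance AND direction).
# Toy numbers; not how either app actually stores its projection data.

def normalize(v):
    length = sum(c*c for c in v) ** 0.5
    return tuple(c / length for c in v)

vertex = (0.0, 0.0, 0.0)
normal = (0.0, 0.0, 1.0)

def envelope_ray(vertex, normal, distance):
    # Maya-style envelope: the slider only changes how far out the ray
    # starts; the direction is always along the vertex normal.
    origin = tuple(p + distance * n for p, n in zip(vertex, normal))
    direction = tuple(-n for n in normal)   # cast back toward the surface
    return origin, direction

def cage_ray(vertex, cage_vertex):
    # Cage-style projection: the ray aims from the (editable) cage vertex
    # back at the low-poly vertex, so moving the cage changes direction too.
    direction = normalize(tuple(v - c for v, c in zip(vertex, cage_vertex)))
    return cage_vertex, direction

_, d1 = envelope_ray(vertex, normal, 2.0)
_, d2 = envelope_ray(vertex, normal, 5.0)
print(d1 == d2)          # True: only the distance changed

_, d3 = cage_ray(vertex, (1.0, 0.0, 2.0))  # cage vertex nudged sideways
print(d3 == d1)          # False: the direction changed as well
```

That's why a distance-only envelope survives mesh edits with no re-setup, while an edited cage has to be saved and maintained.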
http://area.autodesk.com/blogs/chris/how_the_3ds_max_scanline_renderer_computes_tangent_and_binormal_vectors_for_normal_mapping
Oddly enough the code snippet he posted about calculateTangentAndBinormal looks (from my rough memory and code conversion) to be the same as the one that Maya uses to calculate the tangent and binormal too, which makes sense, but... weird.
looky: http://area.autodesk.com/blogs/chris/how_the_3ds_max_scanline_renderer_computes_tangent_and_binormal_vectors_for_normal_mapping
A first step on normal maps is to document what's going on in 3ds Max 2011.
The code provided is the tangent basis used to compute the rendered (baked) normal maps in 3ds Max 2011.
It is different from the viewport's tangent basis used in 3ds Max 2011. Max's viewport basis is, AFAIK, close to if not the same as the one found in Maya.
In other words, the viewport and the scanline renderer do not use the same tangent basis. This accounts for a lot of confusion around 3ds Max's normal mapping.
stay tuned, it's not over
Jean-Francois Yelle, technical product manager, Autodesk
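For readers following along: the per-triangle construction that code like calculateTangentAndBinormal is generally built on (this is the common textbook derivation, not Autodesk's actual source) solves edge1 = du1·T + dv1·B and edge2 = du2·T + dv2·B for the tangent T and binormal B using the inverse of the 2x2 UV-delta matrix.

```python
# Generic per-triangle tangent/binormal from positions and UVs.
# Solves  edge1 = du1*T + dv1*B,  edge2 = du2*T + dv2*B  for T and B.
# A sketch of the common construction, not any app's actual source.

def tangent_binormal(p0, p1, p2, uv0, uv1, uv2):
    e1 = [b - a for a, b in zip(p0, p1)]
    e2 = [b - a for a, b in zip(p0, p2)]
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    r = 1.0 / (du1 * dv2 - du2 * dv1)   # 1 / det of the UV-delta matrix
    tangent  = [r * (dv2 * a - dv1 * b) for a, b in zip(e1, e2)]
    binormal = [r * (du1 * b - du2 * a) for a, b in zip(e1, e2)]
    return tangent, binormal

# Axis-aligned sanity check: the tangent should follow +X (the U
# direction) and the binormal +Y (the V direction).
t, b = tangent_binormal((0, 0, 0), (1, 0, 0), (0, 1, 0),
                        (0, 0), (1, 0), (0, 1))
print(t)   # [1.0, 0.0, 0.0]
print(b)   # [0.0, 1.0, 0.0]
```

Where implementations actually diverge (and where the Max-vs-Maya-vs-engine mismatches come from) is in everything around this core: how per-triangle frames are averaged per vertex, orthogonalized against the normal, split at UV seams, and interpolated.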