Since we know that bakes are not absolute, and that it is pointless to just 'look at them' as pretty rainbow images, because they are really relative to the engine used to display and/or bake the maps, to how an exporter sees triangles after export, and so on, what if we do the following:
- we/someone/per128 makes a scene with a low and a high, with all the potentially problematic features: split UVs to support sharp highpoly shapes, some continuous UVs too, 90 degree (or greater) angles in the high shape, mirrored UVs, and so on. The low mesh could be provided as is (quads and tris), but whoever posts a result is free to triangulate, edit normals, re-split UVs, or do anything else that could improve the bake. The file format for the low could be .OBJ, but that format is known to have problems storing tangent data; I think FBX would be safer. .SBM could work too, but we would need to make sure that the exporter used to generate it stores the tangents perfectly. The highpoly mesh could be anything, so .OBJ is fine for that.
- instead of posting cases where stuff gets broken, why not post cases and descriptions where stuff renders perfectly, the way it should, in realtime? And describe exactly what was done to get the results (moving UVs away, recalculating the tangents, anything you can think of).
If I understand correctly, the following workflows work (but this needs to be backed up by practical examples):
> high and low in Maya ---> TS bake in Maya ---> display with Maya Blinn in realtime high quality mode
> high and low in Xnormal ---> TS bake in XN ---> display in the XN realtime viewer (but it recomputes stuff)
> high exported to OBJ, low exported to SBM from Maya ---> TS bake in XN ---> display with Maya Blinn in realtime high quality mode
> high exported to .ase or .lwo, low exported to .ase ---> bake with Doom3 renderbump ---> display in Doom3 ingame
> high and low in Max ---> bake using Max RTT ---> scanline render in Max.
This is all off the top of my head, but I will double-check it all and make examples as soon as I can.
What do you think? Can't wait to see examples!!
I did this thing a long time ago, before Doom 3 launched. I remember I used a plugin to generate the normal maps, maybe Mankua or something similar. Well, I updated the file and generated a normal map in Max. This is how it looks in standard quality: the edges are very visible, but after adding some cuts the issue almost disappears. When rendering, it looks fantastic. I don't know if it would be a good model for testing normals; let me know and I'll pack up a file.
I know, Bitmap, that's just a test. As I said, adding more polygons makes the problem almost disappear. I've tried using Xnormal, and it didn't generate a better normal map, and this issue with the edges is still present. Using an Xnormal normal map in Max viewports gives even worse results. At least that's what I have seen.
Here's a link to a rar file with the Max scene and OBJs (there are different meshes; I exploded the different parts to make them easier to work with). It would be nice if someone could generate a normal map in Maya; I would like to see the result in Max's viewports.
http://dl.dropbox.com/u/571164/NM_TEST.rar
Blaizer, I'll give that a try in Maya tonight, if someone doesn't get there first
I have a feeling that on a Maya bake you won't be able to see those large triangulation issues that are visible in your screenshot in Max's viewport.
Pesky Max! I have tried this process myself just to see what happens.
It looks perfect in the scanline render, broken in the viewport. How irritating. I might look into Maya for baking. Has anyone used XSI for baking before? What about ZBrush and Mudbox's built-in normal map tools? I've not used either for absolutely ages and I don't have them installed here at the moment.
Why can't it all just work nicely?
I started on a Max plugin that will bring the Maya way (actually any custom way) to Max. I need to do some analysis of the Max baking process to check whether it can be done the way I hope, but it should work.
Awesome Christoph, I was talking to Diogo today about this, he had a similar idea too. Basically the ideal solution is just having a drop-down or radio-button setting somewhere in the Render To Texture dialog where you choose what tangent basis to use when baking.
I've been saying this for a long time but never took the time to demonstrate it nicely.
At least this will open some eyes.
Yes, the only thing is it's a shame that Maya's baker is single-threaded. I'm sitting here on an Intel Core i7 with 6GB of RAM and it's taking well over half an hour to render a 2048x2048 normal map at "high quality" sampling.
I can do the same thing in other apps in less than 2 minutes, which is... really quite dreadful when you think about all the time this is wasting.
Our in-house baker (which uses Maya normals & tangents, therefore gets identical results) can do normals, ambient occlusion and colour down from a highpoly mesh at 2048x2048 with 4x multisampling in just a few minutes.
This is one of the problems I noticed here at the studio. Tell me if I've got this correct?
I actually don't know the term for what Maya is doing differently from the rest. But what I noticed it is doing is completely eliminating the low-poly normals: it bakes the inverted normal of the low-poly mesh into the normal map, bringing its value to 0, and then blends that in with the normals generated from the high poly. Is this correct?
Because that is what I see when I look at this:
It's completely missing the negated low-poly normal blended into the normal maps.
The problem I ran into here was people doing one of two things:
1: adding just a couple edges AFTER baking. Little black spots would start to appear all over your mesh.
2: not removing the blue channel when adding extra details in Photoshop.
Have I got this right? Am I just repeating what everyone has already said? Also, can someone tell me what it's called when they add that inverted low-poly normal to the bake?
What I was suggesting more specifically was a universal Object Space to Tangent Space converter tool. It would accept the mesh (xyz + uv), a normal map in OS and custom (scriptable, or imported) tangent basis (with a few presets). The output would be a TS normal map in whatever basis your pipeline requires.
Frontends could be made afterwards to integrate the tool with Max, Maya, Blender, etc.
It could also allow importing any OS normal map into Max, converting to Max's native tangent basis in the process.
CB, I'm just laying it out here to avoid having both of us wasting time on the same thing. Let me know what you think.
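Not an actual implementation of the tool, just a minimal sketch of the core per-pixel operation such a converter would have to perform, assuming the chosen tangent basis is orthonormal (all names and the numpy usage are mine, purely illustrative): build the TBN at a texel and project the object-space normal onto it. A real tool would loop over every texel, rasterizing the mesh in UV space to find which triangle (and which interpolated TBN) covers each one.

```python
import numpy as np

def object_to_tangent_space(n_os, tangent, bitangent, normal):
    """Convert one object-space normal into the tangent space
    defined by the interpolated (T, B, N) at that pixel.
    All inputs are 3-component numpy arrays."""
    # Rows of the TBN matrix are the basis vectors expressed in object space.
    tbn = np.array([tangent, bitangent, normal], dtype=np.float64)
    # For an orthonormal basis the inverse of TBN is its transpose,
    # so multiplying by TBN maps object space -> tangent space.
    n_ts = tbn @ n_os
    return n_ts / np.linalg.norm(n_ts)

def encode_to_rgb(n_ts):
    """Pack a [-1, 1] tangent-space normal into the usual [0, 1] map colours."""
    return n_ts * 0.5 + 0.5

# Example: a pixel whose object-space normal already matches the surface
# normal should encode to the familiar flat blue (0.5, 0.5, 1.0).
t = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
n = np.array([0.0, 0.0, 1.0])
print(encode_to_rgb(object_to_tangent_space(n, t, b, n)))  # -> [0.5 0.5 1. ]
```

The "scriptable basis" part is exactly the interesting bit: the T, B, N fed into this would come from whatever tangent generator your pipeline uses, which is why the output map would match that pipeline and no other.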
dur23: Well, the whole point of a normal-map is to represent the high-poly normals using the texture, it's not really anything to do with having "the inverse of the low poly shading" as you seem to be suggesting.
All it represents is the difference between the interpolated vertex normal and the high-poly normal, per pixel. It just so happens that a lot of the time, to make that look correct, you end up with per-pixel normals which are "counteracting" the shading of the lowpoly normals, resulting in the stuff you're identifying in those images.
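To put a number on that "counteracting", here is a tiny illustration (Python/numpy; the names, the 30 degree tilt, and the orthonormal basis are my assumptions): a perfectly flat high-poly surface baked against a tilted low-poly vertex normal produces a baked normal leaning the opposite way, and the renderer's reconstruction brings back the flat result.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

# Interpolated low-poly frame at some pixel: the vertex normal is tilted
# 30 degrees away from the true (flat) high-poly surface normal.
a = np.radians(30.0)
n_low = normalize(np.array([np.sin(a), 0.0, np.cos(a)]))   # interpolated vertex normal
t_low = normalize(np.array([np.cos(a), 0.0, -np.sin(a)]))  # a matching tangent
b_low = np.cross(n_low, t_low)                             # bitangent
tbn = np.array([t_low, b_low, n_low])                      # rows = basis vectors

n_high = np.array([0.0, 0.0, 1.0])  # the flat high-poly surface normal

# What the baker stores: the high-poly normal expressed in the low-poly frame.
baked = tbn @ n_high
print(baked)      # tilted *against* the low-poly normal, i.e. "counteracting" it

# What the realtime shader does: rebuild the shaded normal from the same frame.
restored = tbn.T @ baked
print(restored)   # back to ~[0, 0, 1]: the flat high-poly shading is recovered
```

The same arithmetic also shows why the bake only stays valid if the low-poly frame at render time is the one the baker used.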
1: adding just a couple edges AFTER baking. Little black spots would start to appear all over your mesh.
AREAFHGHGGHFHF, this kills me. If they're using unique normals baked to an unwrapped lowpoly mesh, this is a terrible plan. I hear people saying they do this all the time, but it's really not a good idea: as soon as you change your geometry after a bake, you're (usually) changing your lowpoly normals, and therefore invalidating your bake. The only way this wouldn't affect your bake is if you left the normals intact, which pretty much defeats the point of adding the geometry in the first place! Tell people not to do this!
edit: Fozi, sounds good
Mop, just relax and let it happen, you can't stop the general usage of normalmaps
they're fancier bumpmaps!, and they're generally made from the diffuse!
One trick I _unfortunately_ was taught was that if I copied a normalmap and overlayed it onto itself, it would make the normalmap pop more.
btw, has Jogshy been looking into this thread yet?
fozi, I did that some years ago as a command-line app, and afaik xnormal has that feature built in as well; it's just not very exposed/known. If you want to give it another shot, go for making a simple, easy-to-use one. I'll go for integrating it with the Max baker.
Well, we popped generated normals from Max, XSI, Maya and XNormal into Unreal and Max showed the best results.
Yeah, this is what the Epic guys, or Jordan at least, have been saying. Apparently UE is synced up with Max's tangent calculations?
I would be curious to see some examples in UE, and see how "correct" it is, if its just better than the others or if they're actually doing it right. I've never used UE so have no clue.
This seems to be the case with regards to UE and Max. It seems like, though, if you can get something that looks really great in the Max viewport with XNormal that must mean XNormal is baking tangents that are closer to how Max displays them. Shouldn't Epic then have their own renderer that does something closer to how UE displays them?
Because even though the Max bakes look best in Unreal, they were nowhere near as perfect as XNormal bakes showed up in Max, or Maya's bakes in the Maya viewport. Far fewer warping errors.
I could be missing something here though. It's just that UE supports multiple 3d packages that all seem to generate different normal maps even if slightly, so they should have a standard normal map renderer like Doom3 did.
We were thinking that perhaps this is a gap XNormal could fill, even if temporarily, by finding out exactly what tangents Unreal uses and adding a bake option for that.
AnimeAngel : For what purpose?
Screenshots taken within xnormal? (yeah well, safe bet)
Export to a game engine?
If yes, what engine?
And if so, did your tools guys write an SBM exporter for proper tangent export?
Does the engine in question support mirroring?
Do you get artifacts using Blaizer's example?
Please be specific if you have found a perfectly artifact-free solution (preferably in a game engine; the Xnormal viewer doesn't really count as one, in a way...)
Yeah, XN is a very good option because there's an easy way to just load your own custom mesh format with custom tangents etc., from what I understand at least.
OBlast, it all depends on how nicely the SBM and OBJ were exported in the first place.
If I understand correctly, SBM can potentially store EVERYTHING needed: normals, tangents, and maybe other things I am not familiar with. Kind of like an ultimate 3D format.
With OBJ I am not sure; it *might* not export tangents at all. I think Xnormal can recalculate them fine for its own workings, hence a bake on an imported OBJ should look fine in the XN realtime viewer. But that does not guarantee it would look the same back in the app used to export the OBJ, and even more so in the final game engine the bake is intended for.
Also, I think the SBM exporter for Max is not accurate (based on a test I did some time ago). So:
If you plan to use Xnormal's internal viewer, pretty much anything is fine. Good for portfolio pieces.
If you plan to use XN for demanding, hard-surface smooth bakes in a game engine, you might have to ask your tools person to contact Jogshy (the XNormal creator) to chat about the SBM export specifics and verify that the bake resulting from such an exported SBM still works correctly in the final engine. Then the tools person can write a very specific SBM exporter for the project.
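For illustration only, and emphatically not the real SBM layout, here is the kind of thing such a project-specific exporter boils down to: a binary dump where the normals, tangents and handedness come straight from the app or engine rather than being recomputed by the baker, so both ends agree. The field layout and names below are entirely hypothetical.

```python
import struct

# Hypothetical layout, NOT the real SBM spec: per vertex a float3 position,
# float2 UV, float3 normal, float3 tangent and a handedness sign, then indices.
def write_custom_mesh(path, vertices, indices):
    """vertices: list of dicts with 'pos', 'uv', 'normal', 'tangent'
    (sequences of floats) and 'sign' (+1.0 or -1.0 for mirrored UVs);
    indices: flat list of ints, three per triangle."""
    with open(path, "wb") as f:
        f.write(struct.pack("<II", len(vertices), len(indices)))
        for v in vertices:
            f.write(struct.pack("<3f", *v["pos"]))
            f.write(struct.pack("<2f", *v["uv"]))
            # The whole point: the normal/tangent data comes from the DCC or
            # engine, the baker never recomputes it, so bake and display match.
            f.write(struct.pack("<3f", *v["normal"]))
            f.write(struct.pack("<3f", *v["tangent"]))
            f.write(struct.pack("<f", v["sign"]))
        f.write(struct.pack(f"<{len(indices)}I", *indices))
```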
@Blaizer - did Mankua's Kaldera do a decent job? I know that a Friend of Eric Chadwick used it a lot and his results were really great from what I saw. Didn't know if Kaldera does something different or not, just wondering.
For all the tests I did, the renders looked very nice, with very good results in the map, back when I was using Kaldera (Max could not show them in realtime then). I don't know if the plugin still supports the newer 32/64-bit versions of Max; it was a nice alternative.
If Max normals give the best results in UE, then all these issues with normals must be related to how the software handles the normals, so I'm thinking it can't be helped. Normal maps generated with xnormal don't look better than Max's.
Diego over at Mankua is very responsive, I think I might have suggested he create Kaldera in the first place (or not, it's been awhile) but he was very good at incorporating suggestions.
You could ask him to integrate one of your models into the Kaldera demo. His setup for the demo is to limit it to specific meshes. It looks like he's recompiled Kaldera for Max 2010, just not the demo yet.
But I suspect it doesn't overcome the issue; the problem as I see it ultimately lies with the end shader. The smoother the map, the better, but the shader gates the result.
Edit... I bet he incorporated the same Kaldera baking code into his new open source renderer, http://www.crackart.org/overview.php Sounds like it's command-line though.
Alright! I spent too much time on this, but at least now I see things a bit more clearly.
Conclusions :
- Maya HQ viewport, even before the bake and without a shader, is a very good tool to spot potential bake problems likely to appear at baking time.
- Max bakes might be compatible with Unreal when it comes to organics or very intricate meshes, but they are in no way perfect for clean angular stuff, and require either hard edges + UV splits or less extreme angle changes. The screens were taken in the asset viewer; the actual game might be different, but I doubt it.
- Maya Blinn and Brice's shader are very good by default when fed with the Maya bakes, but the results are still not perfect in the Doom3/id Tech sense. Some hard edges/UV splits are required; once performed, the results ARE perfect.
- The Maya SBM importer does not support hard edges.
Bedtime!
People here seem to think that there are many different tangent basis types for realtime apps. As far as I know almost any realtime app uses straightforward linear interpolation between vertex normals. That includes Max viewport, Unreal engine, Marmoset, etc...
The problem we have here is that certain offline software renderers (like Max's scanline) use a different tangent basis, because they're all about making things look as good as possible. We are using these software renderers to generate baked normal maps that are then used in a hardware, realtime rendering environment.
There might be some really small discrepancies between certain realtime renderers but these are hardly noticeable compared to the problem we have with 3DS Max.
So in short: if it looks correct/incorrect in Max viewport with a shader like mine, this should be 99% similar to any other realtime application.
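As a reference point for what "straightforward linear interpolation" means in practice, here is the usual realtime reconstruction written out as a Python/numpy sketch rather than actual shader code (names are mine, and this is the common pattern rather than any specific engine's source): the per-vertex basis is linearly interpolated across the triangle, renormalized, and used to rotate the sampled tangent-space normal.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def shade_pixel(bary, tri_t, tri_b, tri_n, sampled_rgb):
    """bary: barycentric weights of the pixel inside the triangle (3 floats);
    tri_t / tri_b / tri_n: per-vertex tangent, bitangent, normal (3x3 arrays);
    sampled_rgb: the normal-map texel in [0, 1]."""
    # Plain linear interpolation of the per-vertex basis, then renormalize.
    t = normalize(bary @ tri_t)
    b = normalize(bary @ tri_b)
    n = normalize(bary @ tri_n)
    # Unpack the map and rotate the tangent-space normal out of the basis.
    n_ts = np.asarray(sampled_rgb) * 2.0 - 1.0
    return normalize(n_ts[0] * t + n_ts[1] * b + n_ts[2] * n)
```

Where engines differ is mostly in how the per-vertex T, B, N were generated and split in the first place, which is the point CrazyButcher makes next.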
xoliul, that's not really true. How you generate the tangent space at the end (i.e. introduce new splits based on tangent vectors...) is really up to the final implementation, and I would say differences definitely exist here.
Tangent space is a per-triangle feature (same as the surface normal), but the same issues that exist when turning per-face normal data into per-vertex data apply to TS. For normals we can visibly work with "smoothing groups" and the like, but the TS "smoothing/splitting" is invisible to the user. He can influence it through UVs and such, but ultimately how the TS is turned into a per-vertex attribute is a black box (well, not if you read up on the details for each engine, depending on the public info).
Crytek have their method published in ShaderX4.
http://www.crytek.com/fileadmin/user_upload/inside/presentations/gdc2007/Triangle_mesh_tangent_space_calculation.pdf
And regarding rendering, it's also not the same for all; for example, one point from those slides:
"Storing n and reconstructing u or v does not cope well with shearing"
but exactly that is done in many renderers (including UE3).
Basically there are two issues (see the sketch below):
- how to create the per-vertex TS from the per-face TS
- how much data you use at rendering: all 3 vectors, or 2 and reconstruct the 3rd (which many do to save vertex space), and then which 2 do you use...
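A rough sketch of both steps, under assumptions (positions and UVs as sequences of numpy vectors, the standard per-triangle tangent derivation from UV gradients, angle-weighted accumulation as the Crytek slides describe, and a handedness sign so the shader can rebuild the bitangent as sign * cross(N, T)). A real engine additionally inserts splits for mirrored UVs, UV seams and smoothing breaks, which is exactly where implementations diverge.

```python
import numpy as np

def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Per-face tangent/bitangent from the UV gradients of one triangle."""
    e1, e2 = p1 - p0, p2 - p0
    du1, dv1 = uv1 - uv0
    du2, dv2 = uv2 - uv0
    r = 1.0 / (du1 * dv2 - du2 * dv1)            # assumes non-degenerate UVs
    t = (e1 * dv2 - e2 * dv1) * r
    b = (e2 * du1 - e1 * du2) * r
    return t, b

def corner_angle(p, pa, pb):
    a, b = pa - p, pb - p
    a, b = a / np.linalg.norm(a), b / np.linalg.norm(b)
    return np.arccos(np.clip(a @ b, -1.0, 1.0))

def build_vertex_basis(positions, uvs, triangles):
    """Accumulate per-face tangents into per-vertex ones, weighted by the
    corner angle, then orthonormalize against the averaged normal and keep
    only a handedness sign for the bitangent."""
    nv = len(positions)
    acc_n = np.zeros((nv, 3))
    acc_t = np.zeros((nv, 3))
    acc_b = np.zeros((nv, 3))
    for i0, i1, i2 in triangles:
        p = [positions[i0], positions[i1], positions[i2]]
        face_n = np.cross(p[1] - p[0], p[2] - p[0])
        face_n /= np.linalg.norm(face_n)
        t, b = triangle_tangent(p[0], p[1], p[2], uvs[i0], uvs[i1], uvs[i2])
        for k, idx in enumerate((i0, i1, i2)):
            w = corner_angle(p[k], p[(k + 1) % 3], p[(k + 2) % 3])
            acc_n[idx] += w * face_n
            acc_t[idx] += w * t
            acc_b[idx] += w * b
    out = []
    for i in range(nv):
        n = acc_n[i] / np.linalg.norm(acc_n[i])
        t = acc_t[i] - n * (n @ acc_t[i])          # Gram-Schmidt against the normal
        t /= np.linalg.norm(t)
        sign = 1.0 if np.cross(n, t) @ acc_b[i] >= 0.0 else -1.0
        out.append((n, t, sign))                   # shader rebuilds B = sign * cross(N, T)
    return out
```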
If the basic method of coming up with the "raw" per-vertex tangent space (before doing splits), i.e. averaging the neighbouring triangle tangent spaces, is the same between all applications, then the differences should at least be minimal, I would think?
I'm not a coder like you, but when I did work with DirectX and had to calculate the normals/tangent space for meshes (a terrain in my case), the basic average-the-neighbouring-triangles method seemed canon?
I understand that generating splits based on tangent vectors, mirroring and angles can differ a lot, but this shouldn't affect how the normals are being returned per pixel.
I would imagine that software renderers perform some more advanced calculations on top of this, which leads to the problem we're seeing with Max bakes.
In short, I'm just trying to say there's not really a reason to worry that much about differences between realtime viewers. Purely visually, I saw the exact same problems (incorrect baking from Max) persist between Unreal, Max and CryEngine. There might be other issues, like how Unreal deals with mirrored UVs and so on, but that's a separate issue from the tangent basis.
I hope I was clear, I'm not really used to talking about this kind of stuff anymore
edit: also interesting in the Crytek paper is the "L-shape problem", where normals differ depending on triangulation. They say they fix it by weighting by angle. It seems to me like Max doesn't do this, leading to the skewed highlights; Tycho posted a topic about that a while ago: http://boards.polycount.net/showthread.php?t=66651
Autodesk really need to get off their asses and address some of this stuff...
i guess it just depends on the tools provided for you by your work then? if you're using the Unreal engine, then Max... everything else, Maya or xnormal, amirite?
i just use xnormal anyways
Well, kinda, but you can theoretically use any baking tools in any engine, you just have to make sure that your engine is set up to derive tangents in the same way as the baking tool.
This is usually most straightforward if your engine is your baking tool (such as id Tech 4), since you can guarantee consistency that way.
I dunno Garrio, after seeing my results with the EQ test mesh in UDK... it makes me feel like not using Unreal at all, be it with Max bakes or anything else...
I understand it was an extreme example but still.
Anyone managed to make that mesh look correct in UDK?
Sure, I just had to unwrap it again and change the smoothing groups accordingly:
There are still some small smoothing problems in the areas where the smoothing was still a tad too extreme, the problem this whole thread is about. It actually looks a little bit better than in Max, save for the artifacts due to DXT1 compression.
UVs:
Smoothing splits on every open UV edge.
I don't know what you did wrong Pior, but it was pretty straightforward?
So what would happen if this mesh was UV-mapped with far fewer seams? Then you wouldn't be able to make the edges hard along the UV borders, and it would revert to looking crappy with a Max bake, and better with a Maya bake.
Yeah, true. The unwrap here isn't really optimal (a 2-minute job), but I would never do it with the pelt/relax style unwrap EQ did; way too much distortion for a straight, hard-surface object.
In this case, if I had to texture that, I wouldn't even mind the amount of seams I have there: I always bake with Max, so I always have this many seams and I've learnt to live with them; they're not that hard to texture for me anymore. Seams are overrated!
(not that I would mind correct bakes in Max, regardless of seams)