What I need to know is how Blender computes the tangents, internally. In this case, I can find out by just having a look at the source.
The real problem is closed-source apps like Max/Maya. AFAIK, in Max it seems to boil down to "geom.dll" and ComputeTangentAndBinormal(). Is there anyone out here familiar with Max's tangent basis, or maybe connected to someone who may be able to help unfold the mystery?
Or we could do it the hard way, like CB suggested, and waste time "guessing" how it's done.
Generally, whenever this topic is brought up, that seems to be the dead end: no access to the hardcoded math in apps like Max. So hopefully SOMEONE out there has some info!
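For reference, the "standard" per-triangle tangent derivation that most tools appear to start from (solving the triangle's edge vectors against its UV deltas, as popularized by Eric Lengyel) can be sketched like this. This is only a guess at the common baseline, not Max's actual internal code; how an app averages, orthogonalizes, and splits these per vertex is exactly the unknown part:

```python
# Sketch of the common per-triangle tangent derivation (UV-gradient method).
# NOT Max's actual code -- closed-source apps may differ in how they average,
# orthogonalize, and split these per-vertex, which is the whole mystery here.

def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Return the (unnormalized) tangent and bitangent of one triangle."""
    # Position and UV deltas along the two triangle edges
    e1 = [p1[i] - p0[i] for i in range(3)]
    e2 = [p2[i] - p0[i] for i in range(3)]
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]

    # Solve  e1 = du1*T + dv1*B,  e2 = du2*T + dv2*B  for T and B
    det = du1 * dv2 - du2 * dv1
    if abs(det) < 1e-12:
        return None, None  # degenerate UVs, no stable basis
    r = 1.0 / det
    tangent   = [(dv2 * e1[i] - dv1 * e2[i]) * r for i in range(3)]
    bitangent = [(du1 * e2[i] - du2 * e1[i]) * r for i in range(3)]
    return tangent, bitangent

# A unit triangle mapped 1:1 to UV space: tangent follows +X, bitangent +Y
t, b = triangle_tangent((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0), (1, 0), (0, 1))
```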
EQ, having access to that info could help make things easier, but it's not a deal breaker. We have a few options to get around the problem. The worst case scenario might involve a plugin that interfaces with the tool to provide the necessary data. There will be no dead end this time.
I have to say that I'm really impressed by where this is going. Could be really handy for contractors having to deal with that "special" kind of leads and programmers.
Why are object-space maps not used more often for static env work? The only bump maps I have seen for env work have had tangent colors, but I thought that object space was cheaper because it doesn't compute tangents. Sorry for going off on a tangent, just wondering.
While the typical mantra of "you can't animate obj space maps" is false, it's not always the best solution. For one (I don't know the specifics of it) your engine has to do more calcs on the normals IIRC (can someone clarify?).
But the bigger downside is that you can't really edit them in post like you can with tangent space. So overlaying new layers and details won't work. Any details you want in the normals, you need to make sure are modeled out before the bake.
Also object space stuff isn't as modular. You couldn't really re-use parts all over the place because the normals are too specific. With tangent-space you can essentially say "hey this whole texture is flat relative to my lowpoly!" whereas that won't work with object-space stuff.
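To clarify the "more calcs" question above: a tangent-space normal has to be rotated out of its per-vertex basis before lighting, while an object-space normal can be used almost directly. A minimal sketch of that extra per-pixel step, with made-up example vectors (the decode convention and basis here are illustrative assumptions, not any engine's actual code):

```python
def decode(texel):
    # Map an 8-bit RGB texel from [0, 255] to a [-1, 1] normal
    return [c / 127.5 - 1.0 for c in texel]

def tangent_to_object(n_ts, tangent, bitangent, normal):
    # The per-pixel TBN rotation that object-space maps get to skip:
    # express the sampled normal in object space via the vertex basis.
    return [
        n_ts[0] * tangent[i] + n_ts[1] * bitangent[i] + n_ts[2] * normal[i]
        for i in range(3)
    ]

# A "flat" tangent-space texel (128, 128, 255) decodes to roughly (0, 0, 1)
# and maps onto whatever the interpolated surface basis is at that pixel:
flat = decode((128, 128, 255))
n_obj = tangent_to_object(flat, (1, 0, 0), (0, 1, 0), (0, 0, 1))
```

With an object-space map, `decode` alone (plus the model's rotation) gives the shading normal, which is why the per-pixel cost is lower, at the price of the reusability issues described above.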
Great thread, mad I'm late. I seem to recall one of the graphics programmers at NS figuring out the max tangent generation pretty exactly, I'll try to bug him and see if he can share.
I never did understand why so much attention is put into the quality of the high poly, in hiring amazing anatomy and hard surface modelers, critiquing the hell out of fingernail length and shoe tread depth, and then letting it fall apart in the technical conversion into how it displays on screen in the game engine.
Mop it's funny tho - supporting both types of maps on environments certainly is not hard at all, yet I can feel it might seem like a big production problem to some people. Like fearing that artists are too dumb to work with the two kinds instead of just one, or fearing that the pipeline might break because of that somehow. Which (I think) is unfounded, but I can see this being the reason behind 'oh lets go for TS, plus at least they can be deformed better huh right?' kind of arguments.
* Because the normal is relative to the face, Z always faces up.
- That means normals with more precision when data is quantized, usually to 8bit per-axis. (128 vs 256 positive Z values.)
- Z can be omitted and quickly derived on-the-fly without any extra info.
- Storing only two channels helps reduce DXT compression artifacts.
- What MoP said.
In some cases, when doing projection baking, the quantization argument falls apart due to the basis rotating really fast across the triangle surface.
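The "derive Z on the fly" point in the list above works because a tangent-space normal is unit length with Z always positive, so only X and Y need to be stored and Z follows for free:

```python
import math

def reconstruct_z(x, y):
    # Tangent-space normals are unit length with Z >= 0,
    # so Z is fully determined by the stored X and Y channels.
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))

# e.g. a normal tilted 30 degrees along X:
x, y = math.sin(math.radians(30)), 0.0
z = reconstruct_z(x, y)  # ~= cos(30 deg) ~= 0.866
```

This is also why two-channel DXT-style formats work for tangent-space maps but not for object-space ones, where Z can legitimately be negative.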
poop, I did a bit of "guessing" last night and was able to figure it out. Now I should have everything I need to get this working without the need to interface with Max.
I've added max's own TS to the custom TS plugin and now the viewport very very closely matches max's scanline renderer. The same artifacts remain however and I even stumbled upon a bug of max's scanline renderer itself...
this is independent of fozi's standalone solution, i.e. it has access to max's internal functions. It basically uncovers 2 bugs:
1: max feeds the wrong tangent and binormal vectors to the viewport renderer (nothing new really hehe), but this can be fixed with a custom plugin/shader
2: max's scanline renderer is broken with tangent-maps once critical vertices are not part of the view... don't care for this
It looks almost perfect and way better than what we have at the moment.
It seems that max feeds bad and different tangents everywhere.
The scanline bug is very strange.
i think that whoever is using in-studio engines that are synced with max tangents should show this to the programmers, in case you also have the same shading errors in engine. Obviously it would affect the pipeline, but it would make them think about it for future projects.
and while we are at it de-myth object space maps :P
yeah, i don't think there is much you can do about those little things, IMO that is totally forgivable. in the end we'll always have "resolution-based" smoothing errors using tangent maps.
I slacked off from work today and did some tests in Cryengine 2. From what I've picked up in this thread, you should be able to take a mesh with lots of 90 degree angles, put it in all one smooth group, and then when baked with a tool that has tangents synced with your engine, it should look fine.
However none of the bakes provided satisfactory results, not even the bakes done with Polybump, Crytek's own tool.
Now, it's likely that I'm doing something wrong, but from what I can see, Cryengine 2 and Max are both synced up, as the bakes are very close to each other, and not only that but the official tools are all for max. I do think that the proper title for this thread should be "why your baking app should be synced with your engine," so as not to sound like it's purely hating on max. :poly124:
Wow, interesting test, Racer445. I find it pretty surprising that Crytek's own normalmapping utility bakes in a different tangent space to what the game uses to render.
I'm a bit behind on CryEngine tech, is PolyBump actually part of the latest CryEngine tools? I remember they were one of the first studios to get something out years ago with PolyBump, I wonder if it's just old and they haven't updated it to match their latest shader methods?
CrazyButcher: Nice results. As EQ said, you're never going to be able to get rid of those little "line" errors, they show up even in the Maya bakes, because (as you know I'm sure) the pixels are crossing the boundary between two triangles and therefore can't share the exact same tangent basis.
Am I right to think that, in cases where averaged/smooth vertex normals are used to bake, splitting normals where UV splits occur would help, as those verts would have their own basis then? Or does this factor not matter with these better tangent basis generation methods?
I was under the impression that to get a perfect result there should be splits in normals wherever UV splits occur. Maybe this approach is essentially a workaround though? Just wondering.
It's a shame we have to go through all this bullshit with cages, splitting edges, etc just to get a clean normal map. Hopefully in the future things will improve.
Tessellation needs heightmaps, so things will slightly improve for games that actually use tessellation.
but there is another problem:
should one render out the heightmap from the highpoly sculpt to the divided lowpoly, or to the non-divided lowpoly?
divided: + offset is scalable, - more complex tessellation
not divided: - offset not scalable, + easier tessellation algorithm
simple tessellation isn't anything else than meshsmooth or tessellate (0 stretch) in max
models still look crappy, but with a heightmap you can displace the mesh and you get a model that is somewhat similar to the original highpoly model
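The displacement step described above, reduced to its core: subdivide (tessellate) the low poly, then push each vertex along its normal by the sampled height. A rough sketch with a hypothetical height value; real pipelines sample the heightmap per vertex in the shader:

```python
def displace(position, normal, height, scale=1.0):
    # Push a (tessellated) vertex along its vertex normal by the
    # heightmap sample, optionally scaled by a global displacement amount.
    return [position[i] + normal[i] * height * scale for i in range(3)]

# A vertex at the origin with an up-facing normal and a height sample of 0.25
v = displace((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), 0.25)
```

The "divided vs. not divided" trade-off above is about where that `height` was measured from: baking against the already-divided lowpoly matches the smoothed base surface, while baking against the raw lowpoly keeps the tessellation step simpler but fixes the offset.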
Get the Unigine engine benchmark if you really want to see it in action. There were some executables on the NVIDIA developer page showing the technique years ago. Subdiv models are another thing :P
typically the reason I've seen people getting problems from max is because they do not know how to properly edit their cages. things have gone from completely horrible to amazing for me with just an extra push or adjustment of the projection cage. it also helps to have good render settings on. sure it may take longer, but as long as it gets you what you want.
for rendering with scanline maybe, but mainly the discussion started about how maya bakes for realtime use the best of all the packages. It doesn't matter if you are a cage editing wizard in max; if your tangent basis generation is screwy, you will never get a good result for realtime.
Yeah, if you read the thread, you'll find this is a discussion about smoothing errors, not the skewed projection you get from a poorly set up cage, which has no effect on smoothing.
as far as render settings, i was just talking about render to texture, not actual rendered shots. and i had assumed that i may have stated something in the thread, but i wasn't going to go through 10 pages to figure that out.
The new FBX-plugin for Maya 2010.2 supports exporting of tangents and binormals.
Did a test with our rendering and it produced slightly better results than our own calculations.
Still not as good as maya viewport though...
Vassago: It would as long as you convert to your game engine's tangent space!
There'd be no point converting to an arbitrary tangent space (eg. Maya, Max etc) if your game engine doesn't use exactly the same calculations.
Ideally either the engine would be set up to match a certain preset, or you would add a specific preset to the tool based on your engine's tangent space.
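The conversion described here could look roughly like this: decode each texel using the baker's basis into object space, then re-encode it against the engine's basis. Both bases below are hypothetical inputs and assumed orthonormal (so the transpose acts as the inverse); the hard part in practice is obtaining the engine's exact per-vertex basis, which is the point of this whole thread:

```python
def retarget_normal(texel, baker_tbn, engine_tbn):
    """Re-express one normal-map texel from the baker's tangent basis into
    the engine's tangent basis. Each basis is three (T, B, N) row vectors,
    assumed orthonormal so the transpose serves as the inverse."""
    n = [c / 127.5 - 1.0 for c in texel]                # decode [0,255] -> [-1,1]
    # tangent space -> object space using the baker's basis
    obj = [sum(n[j] * baker_tbn[j][i] for j in range(3)) for i in range(3)]
    # object space -> tangent space: dot with each of the engine's axes
    ts = [sum(obj[i] * engine_tbn[j][i] for i in range(3)) for j in range(3)]
    return [round((c + 1.0) * 127.5) for c in ts]       # re-encode to 8-bit

# With identical bases, a texel should round-trip unchanged:
identity = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
out = retarget_normal((200, 90, 230), identity, identity)
```

In a real tool this would run per texel with the bases interpolated across each triangle's UV footprint, but the per-texel math is just these two rotations.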
Replies
I do, I do
O fozi
I can do a Blender bake of the above scene, if that's what you need.
fozi: Those examples look fan-fucking-tastic.
More info on the compression benefits:
http://blogs.msdn.com/shawnhar/archive/2008/10/30/dxt-compression-for-normalmaps.aspx
those ugly lines show up when lower resolution is baked...
they also show in scanline, just not as strong.
you can see it clearly here in maya too:
What do you mean, tessellation needs heightmaps?
http://developer.download.nvidia.com/presentations/2008/GDC/Inst_Tess_Compatible.pdf
http://unigine.com/download/#heaven
EDIT: for the tessellation you need a DX11 card U.U. Messiah and Sacrifice use tessellation and you don't need DX11... damn programmers XDD.
http://developer.download.nvidia.com/SDK/10.5/Samples/InstancedTessellation.zip
http://research.microsoft.com/en-us/um/people/cloop/accTOG.pdf
sounds good, the fewer fiddly things artists have to remember, the better!
If you're just rendering you'll probably be ok!
http://download.autodesk.com/us/maya/2009help/index.html?url=Polygons_nodes_Tangent_Space.htm,topicNumber=d0e228932
http://download.autodesk.com/us/maya/2009help/index.html?url=Appendix_A_Tangent_and_binormal_vectors.htm,topicNumber=d0e216694