Having problems using external cage files with xN lately.
EDIT: this seems to be a problem with the SBM format; if I export as OBJ or even FBX I don't have this issue.
The xN SBM exporter automatically triangulates the model using Maya's API. Since triangulation can depend on the normals and edges, the resulting topology may be different from the external cage's. Try triangulating the models manually... and remember to Freeze Transformations before exporting.
Remember also that some old xN versions had a bug with the "use shortest diagonal" option. Try the latest xN.
When you say it triangulates with the API, are you talking about just using the API to access the render triangles and exporting those, or is there some other form of triangulation in OpenMaya I don't know about?
This question relates more to a Max-xNormal workflow than to xNormal itself, but maybe someone has a clue.
Basically I'm having problems exporting meshes with edited 'face-weighted' normals while also using a Projection modifier (to export via SBM). The modifier itself seems to throw out my normal edits - the only way I've managed to bake edited normals with a cage is the longer route of exporting the mesh as FBX and then manually exporting the cage via the Projection modifier's menu.
Hi guys!
I've had quite an issue with xNormal for the past few days. I tried to solve it on my own and with the help of the internet, but I didn't find anything helpful...
Just to be clear, I've used xNormal plenty of times. It's an essential part of my workflow and I know how to use it pretty well.
My problem is that xNormal is just giving me blank maps for everything, from normal map to polypaint.
I checked in the viewer and both meshes are visible.
I tried changing the scale of both meshes.
I tried with other meshes.
I tried reinstalling it.
Nothing seems to work.
I use 3ds Max 2013 and the latest version of xNormal.
If you guys know how to fix it, please share the tips.
Thanks
Can someone tell me if xNormal bakes with Mikk tangent space (mikktspace) by default? We're trying to add support for it but having issues; first off I need to be sure mikktspace is what xNormal is using. It's in the tangent-space plugin list, but I don't see any option to enable/disable it... So yeah, is it just on by default or what?
I believe it's the default one; however, mikktspace only computes the tangents/binormals of your mesh if your mesh file doesn't provide them.
@EQ, Yes xNormal uses mikktspace by default.
What sort of problems are you guys having? PM me with the info if you like and I can pass it over to Morten as he doesn't visit PC very often or I can get him to email you guys directly.
I'm sure he will be willing to help you out with getting the implementation sorted.
Yep, modern xN versions use mikktspace. Older ones ( < 3.17.5 ) don't.
You can open the plug-in manager by clicking the yellow plug-in icon on the bottom-left of the window.
Anyway, if your LP mesh contains tangents, normals and binormals then they will be used. Formats like SBM, FBX and dotXSI/Collada should be able to export them.
My problem is that xNormal is just giving me blank maps for everything, from normal map to polypaint.
I've received several emails about that. Apparently, it's a ZB4R4+ bug: if you try to export an OBJ while polypaint is assigned to a layer in ZB you'll have problems.
Basically I'm having problems exporting meshes with edited 'face-weighted' normals while also using a Projection modifier (to export via SBM).
If you want to send me your .max meshes compressed I can take a look, pls.
How do I know which versions of CUDA and OptiX need to be installed for the latest version of xNormal, 3.17.16?
None. The xN installer automatically deploys the required CUDA and OptiX DLLs locally into your install dir.
I'm stuck figuring out how to bake an AO map from a low-poly mesh only, so NO high poly; it keeps telling me to add a high-definition mesh...
Depends on what you need.
1. If you plan to compute AO via retopo and ray-tracing, you'll need a HP mesh and a LP mesh.
2. If you just want to compute fast and inaccurate AO for a mesh, use the Simple AO Generator tool.
Alternatively, you can just use the LP mesh as the HP mesh too, but you'll need to remove T-junctions and set up a cage / use MatchUVs.
Andy: Thanks, I'll get in touch if we continue to have problems. I mainly wanted to make sure that what I was baking as test content was actually in mikktspace so we're not going on a wild goose chase debugging it.
Alright, so more mikktspace stuff. Right now our main issue is just that using xNormal-baked maps with mikktspace loaded as the tangent space produces noticeably worse quality than using Maya-baked maps loaded with Maya's tangent space (which we've got matching Maya's viewport perfectly).
So I took a look at how xNormal displays the same mesh in the 3D viewport, and the results are even worse than what I'm seeing in our app (sorry, can't show any of that yet).
This is the OBJ with the xNormal-baked map, presumably using mikktspace, viewed in xNormal's 3D viewer:
Now, our result doesn't look nearly that bad, which makes me think that xN's 3D viewer isn't loading the right tangent space for this mesh or something. Really I just wanted a test case showing the best that mikktspace is capable of, because what we have looks a lot better than this (just not as good as Maya).
So I figured maybe it was an issue loading OBJs. I set up a cage, then re-saved as .SBM and re-baked. Then I get this in xNormal's 3D viewer:
Which is way better. Unfortunately, the baked content is totally different from the mikktspace content that mostly works for us, so it seems that when xNormal saves .SBM files it's using a different tangent space yet again. When I try to load the baked content from this .SBM file, it's much, much worse in our app than the content baked from the simple OBJ file.
So, any ideas?
I'm really looking for a way to preview the most accurate implementation of mikktspace to compare it to what we've got. If mikk isn't as accurate as Maya that's fine, I just want to make sure we're doing everything we can on our end.
Since an OBJ file doesn't provide any tangents/binormals, it's normal that you don't get the same result as with an SBM, which contains the tangents/binormals from Maya; with an OBJ it's mikktspace that creates them. There is probably a difference between them.
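For anyone wondering what "creating them" involves: a tangent generator derives, for each triangle, the 3D direction in which the U coordinate grows, using only positions and UVs. The sketch below shows that per-triangle step in plain Python - it is NOT the real mikktspace code, which additionally averages per vertex, handles mirrored UVs, and orthogonalizes against the vertex normal:

```python
# Per-triangle tangent from positions and UVs -- a simplified sketch of the
# first step any tangent-basis generator (including mikktspace) performs.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Return the (unnormalized) tangent: the 3D direction along which
    the U texture coordinate increases across this triangle."""
    e1, e2 = sub(p1, p0), sub(p2, p0)            # 3D edge vectors
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]  # UV edge deltas
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    det = du1 * dv2 - du2 * dv1                  # 2x the signed UV area
    if det == 0:
        return None                              # degenerate UV mapping
    r = 1.0 / det
    return tuple(r * (dv2 * a - dv1 * b) for a, b in zip(e1, e2))

# A triangle mapped 1:1 to UV space: the tangent points along +X.
t = triangle_tangent((0, 0, 0), (1, 0, 0), (0, 1, 0),
                     (0, 0), (1, 0), (0, 1))
```

Different tools disagree in the later averaging and orthogonalization steps, which is exactly why Maya's basis and mikktspace diverge even on identical geometry.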
There's currently a bug in xN; that's why that model is shown differently than in Blender. It will be fixed in the next release.
Anyway, if you export the SBM from Maya and check the "export tangent basis" option, it will export Maya's tangent basis and not mikktspace.
If you need the SBM to use mikktspace, just don't export the tangent basis, so xN will compute it on the fly when the model is loaded (then you can enter the 3D viewer and re-save the meshes).
Btw, we want to add another LP + HP example to xN. Any volunteer interested in donating a model, pls? Must contain the LP mesh and a 2M+ HP mesh ready to compute normal map + AO without MatchUV.
Because we get similar issues when displaying mikktspace. So I'm really curious to know whether you pinned down what was causing the difference in the xNormal display vs Blender.
Yep. It was a problem with my vertex hashing plus a desync with mikktspace. It's fixed internally now and a patch will be available soon.
Does xNormal support something similar to material IDs in 3ds Max?
No, we don't support MultiSubObj mats.
1 mesh = 1 material.
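Given that constraint, the usual workaround for a Max multi-sub-object mesh is to split it into one mesh per material ID before export. The grouping step itself is trivial; here's a sketch with a made-up face layout (a real exporter would pull the IDs from the Edit Mesh modifier, not from tuples like these):

```python
from collections import defaultdict

def split_by_material(faces):
    """Group faces into one bucket per material ID.
    `faces` is a list of (material_id, face) pairs -- a hypothetical
    layout chosen just for illustration."""
    buckets = defaultdict(list)
    for mat_id, face in faces:
        buckets[mat_id].append(face)
    return dict(buckets)

# Each resulting bucket becomes a separate export: one mesh = one material.
meshes = split_by_material([(1, "faceA"), (2, "faceB"), (1, "faceC")])
```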
This might have been asked in the past, and probably was, so I apologize in advance. I am trying to bake a normal map for a very large prop; the poly count of the high-res OBJ I am trying to bake is 14 million. When I try to run the bake I get an error in xNormal saying that the OBJ is void or doesn't exist. I am able to open the file in Maya, so it does exist and doesn't seem to have any issues. Any ideas how to avoid this? Am I trying to bake something past the allowed poly count? Is there a fix or a workaround for this case?
I'm having a big problem baking vertex colour from a large .ply file.
I've unchecked "Ignore per-vertex-color" on the high-poly mesh and I'm baking using 'Bake highpoly's vertex colors'. When I load the meshes in the 3D viewer I can see the vertex colour on the high-poly mesh, but I can't get anything to bake. No vertex colour, normal, AO or anything. Where should I be looking to troubleshoot this?
The only other information I can offer is that when the low-poly mesh is a .ply file, I get an error that it doesn't have UV coordinates attached (it does).
edit: okay, baking ray fails = all red. So that's a pretty big problem I can work from.
edit2: Okay, the program generating the UVs is also scaling the mesh by 1000x for no goddamn reason...
edit3: Okay, and the program I'm using to rescale the meshes breaks UVs. Don't ask. I should mention at this point that I do not have access to... "traditional" 3D packages.
edit4: Set HP mesh to 1.0 scale. Set LP mesh to 0.001 scale. It works! I fucking love you Santi; any program that can work with the BS software I have to put up with is worth its weight in gold!
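For reference, the per-mesh scale fields used in edit4 are just a uniform multiply on vertex positions at load time, so 0.001 exactly cancels the tool's unwanted 1000x. If you'd rather bake the fix into the file itself, the operation is nothing more than this (plain-Python sketch over a bare vertex list):

```python
def rescale(vertices, factor):
    """Uniformly scale vertex positions -- the same thing xNormal's
    per-mesh 'scale' setting does when it loads the file."""
    return [tuple(c * factor for c in v) for v in vertices]

# Undo an unwanted 1000x blow-up:
fixed = rescale([(1000.0, 0.0, 2000.0)], 0.001)
```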
I am trying to bake a normal map for a very large prop; the high-res OBJ I am trying to bake is 14 million polys. When I try to run the bake I get an error in xNormal saying that the OBJ is void or doesn't exist.
N-gons present? Try triangulating it before exporting and see what happens.
Is your HP mesh tagged as visible in xN? What version of xN are you using, pls?
Did you use an old version of Blender or 3dsmax to export the OBJ?
Hello, I just started using ZBrush and xNormal today. So far everything went OK when I made a normal map for a previous texture, but for some reason I'm having difficulties getting good normal/AO/cavity/height maps with my current model. Obviously I did something wrong; my question is: what is it? :P
I've tried changing a few export settings in ZBrush, and tried changing the UVs of my low poly, but that didn't help. Hopefully you guys can help me.
NVM, I've fixed it: my normals were locked. Such a small mistake actually cost me more time than sculpting the damn thing...
Hi, I was wondering: is there an exploded normal-map baking workflow in xNormal? I would like to give it 3 lows, 3 highs, and 3 cages and then move a slider for each so they don't interpenetrate when they bake. The reason I want this is that it's a hassle to move the low models/cages in Maya and then also move the high models in Mudbox one at a time. In ZBrush I imagine this is even more painful, since it doesn't have a channel-box-style transform tool.
I've increased the contrast of the image below to visualize the difference in quality of the final normal map when saving to an 8-bit format from xNormal vs. converting from a 16-bit format to 8-bit in Photoshop. It would be nice if xNormal had a better internal conversion to 8-bit than it does at the moment, as baking to an 8-bit format in xNormal yields a significantly reduced-quality normal map with regard to the shading of smooth surfaces.
For now I'm definitely sticking to 16-bit outputs from xNormal.
Sure that's not PNG compression? Try saving to 8bit tga from xnormal. Although if you're going to be editing the nm in photoshop anyway you'd want 16bit I believe.
PNG is lossless, and it's irrelevant whether I want to edit it in Photoshop or not. I'm merely seeing that Photoshop seems better at downsampling color depth than xNormal is internally.
Is something wrong with the last two versions of the SBM exporter for 3ds Max 2013 (versions 3.18 and 3.18.1)?
This is what my bake of a mesh exported via SBM looks like; note the weird colors. In the 3D viewer, most of the mirrored parts of the mesh look inverted and there are some really weird artifacts in some areas.
I export with an Edit Mesh and a Projection modifier stacked, using the "low poly" options in the SBM exporter.
Baking with FBX and a Push'd version as a cage gives me this result, more like what I'd expect. So what's up?
EDIT: Okay, I might now be missing a couple of bugfixes and whatnot, but moving back to version 3.17.16 fixed everything. I'm using Max 2013 64-bit, by the way.
Is there a way to use xNormal the same way as 'Render Surface Map' in 3ds Max? I want to bake maps just for high-poly models. I've tried using the same model in both the high- and low-definition slots but the maps just render blank.
What maps are you trying to bake ?
For AO I don't know, but for displacement/normal maps the result is blank because there is no difference between your high-poly mesh and your low-poly mesh, since they are exactly the same. The baking process outputs to a texture the difference between two meshes.
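To make that "difference between two meshes" concrete: per texel the baker stores the high-poly normal expressed in the low-poly tangent frame, remapped from [-1,1] into 8-bit [0,255]. With identical meshes that vector is (0,0,1) everywhere, i.e. the uniform flat-blue color rather than useful detail. A sketch of the remap (whether flat lands on 127 or 128 depends on the baker's rounding convention; this one rounds to nearest):

```python
def encode_normal(n):
    """Map a tangent-space normal from [-1, 1] per component to
    8-bit [0, 255], round-to-nearest."""
    return tuple(int(round(0.5 * (c + 1.0) * 255.0)) for c in n)

# Identical HP and LP meshes -> every texel holds the 'no difference'
# normal (0, 0, 1), the familiar flat-blue pixel:
flat = encode_normal((0.0, 0.0, 1.0))
```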
I'm getting the same problem as cptSwing; ever since installing the new 3.18 official release, the Max SBM exporter has been causing major issues with bakes.
Reverting only the Max exporter to the older 3.17 version temporarily fixed the problem (as did just baking with an OBJ low poly), but I get errors every time Max 2013 64-bit starts, about the exporter and importer not being loaded properly.
PNG is lossless, but 8bit generally isn't enough precision to accurately capture a normal bake.
Mostly it doesn't matter as the normal maps get compressed in the game anyway, but if you're doing object > tangent space conversions, you'll want to store (at least) the object space map in PNG 16bit (or another 16bit format).
Makes sense to store the tangent space bakes as 16bit until the last minute, too.
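The precision argument is easy to put numbers on: one 8-bit channel allows only 256 distinct values across the [-1,1] range, so near a flat normal the smallest representable tilt is a visible step, while 16 bits is roughly 256x finer. A rough back-of-envelope sketch:

```python
import math

def max_angular_step_deg(bits):
    """Angle of the smallest representable tilt near a flat (0, 0, 1)
    normal: one component can only change in steps of 2 / (2**bits - 1)."""
    step = 2.0 / (2 ** bits - 1)
    return math.degrees(math.atan(step))

step8 = max_angular_step_deg(8)    # roughly 0.45 degrees per step
step16 = max_angular_step_deg(16)  # roughly 0.002 degrees per step
```

Those ~0.45-degree jumps are exactly the banding you see on smooth, gently curved surfaces in an 8-bit bake.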
Trying to convert my tangent-space normal map into an object-space normal map; however, xNormal crashes when it gets to the "Converting mesh default normal map" stage.
Any ideas?
OBJ and TGAs used. xNormal version 3.18.2. Maya 2014.
I duplicate my LP (push normals so it overlays my model).
I export it (tried OBJ, tried SBM).
And I import HP and LP, browse my cage, click to use cage.
(Freeze Transformations, delete history, and the pivot point is centered on all models.)
Same amount of verts on both models.
UV'd the low poly. Nothing is overlaying.
I've locked and unlocked normals. What else can I do, my fellow PC!!
This is a setting in xNormal. To reach the settings, click the plug icon that is located at the bottom left of the GUI.
When processing the meshes before a bake, xNormal will automatically triangulate them if they contain quads and with the "Shortest diagonal first" option checked, it can sometimes triangulate the LP mesh and the cage differently, even though the topology is the same for both the cage and LP mesh.
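"Shortest diagonal first" is literally a per-quad rule: measure both diagonals and split along the shorter one. Because a cage's vertices are pushed away from the LP mesh, a diagonal that was shorter on the LP quad can become the longer one on the cage quad, flipping the triangulation even though the topology matches. A sketch of the rule:

```python
import math

def triangulate_quad_shortest_diagonal(v0, v1, v2, v3):
    """Split quad v0-v1-v2-v3 along its shorter diagonal.
    Ties break toward the v0-v2 diagonal."""
    if math.dist(v0, v2) <= math.dist(v1, v3):
        return [(v0, v1, v2), (v0, v2, v3)]   # diagonal v0-v2
    return [(v0, v1, v3), (v1, v2, v3)]       # diagonal v1-v3

# A flat unit quad splits along v0-v2; push one vertex outward (as a cage
# does) and the choice can flip to the other diagonal:
flat_tris = triangulate_quad_shortest_diagonal((0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0))
cage_tris = triangulate_quad_shortest_diagonal((0, 0, 0), (1, 0, 0), (1, 1, 1), (0, 1, 0))
```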
Is the acceleration limited to AO baking only, or are other maps like curvature, cavity etc. supported?
Normal maps and AO maps only. Other kinds of maps aren't really possible because they're so complex they would exceed the GPU's 32 registers/thread and performance would suck.
Trying to convert my tangent space normal map into an object space normal map however xnormal crashes when it gets to the "Converting mesh default normal map" stage.
Any ideas?
N-gons = evil
Is there a way to set up a batch render?
You must save your settings as XML files (for instance, 1.xml, 2.xml, etc.).
Then just pass the filenames on the command line:
xNormal.exe 1.xml 2.xml
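The same invocation is easy to script if you have a pile of settings files. A sketch (the install path below is an assumption - point it at your own xNormal.exe):

```python
import subprocess

# Assumed default install location; adjust to your own install dir.
XNORMAL = r"C:\Program Files\xNormal\x64\xNormal.exe"

def build_batch_command(xml_settings_files):
    """xNormal bakes each settings XML passed on the command line in turn."""
    return [XNORMAL] + list(xml_settings_files)

cmd = build_batch_command(["1.xml", "2.xml"])
# subprocess.run(cmd, check=True)  # uncomment to actually launch the bakes
```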
Trying to bake an ambient occlusion map with the Simple ambient occlusion generator results in a humanly unreadable error.
After the baking is done and the map has been saved to disk, xNormal just throws an error box at me with the name of the file extension - .tga or .tif, for example.
No further information is given - just that extension. What the heck? Is this some kind of quiz game?
Replies
Santiago: Thanks for confirming.
Source engine isn't a million miles off Mikk, either. Might be worth a punt?
(Source for the OBJ vs SBM tangent explanation above: http://eat3d.com/forum/questions-and-feedback/custom-tangent-space-basis-ignores-binormal )
thx
Are you referring to this: http://eat3d.com/forum/questions-and-feedback/bug-xnormal-viewer-vs-blender-glsl ?
I'm having trouble with normals generated on the beveled edge of my sword:
They have this "stepped" effect rather than being smooth. This shows on the model too:
Any ideas?
obj and tga's used. 3.18.2 xnormal version. maya 2014
Not with default or custom tangent calc.
Any ideas on what could be causing the crash? My mesh, perhaps?
Maya 2014 + xnormal (latest build)
fixed thanks to this!
Is there a way to set up a batch render? I tried adding in all the meshes and even checked the batch protection boxes, but no luck.
Because you must erase all but the external cage mesh before exporting.