Got a problem with the hxGrid network bake.
I want to bake a highpoly onto a lowpoly.
I'm using hxGrid for network baking. It works so far, but takes much, much longer than rendering it myself.
The problem is that the highpoly is very large; I think xNormal tries to send the complete data to the different agents. Is it possible to speed this process up somehow?
If I render the whole thing on my own computer it takes around 2 minutes.
If I bake it via hxGrid it takes an hour??
I thought it should be much faster.
sendDublicateTasks -> Try with 0, so the task won't be sent again if it's taking a long time to complete
Tried this ini setting too, but no luck.
What I've seen is that the agents get a task from my IP. After a few seconds they discard it, but still show "working for my IP" while doing nothing, then get a new "working for my IP". This loops as well; after 10 minutes the whole log is spammed with "working for IP..." for every agent.
Did they time out, or what are they doing?
If I manually kill the agents and restart them, they sometimes start for real.
What's your "userMaxMemoryUsage" set to? Perhaps the task is reset due to out-of-memory errors? Try increasing it from physical RAM/8 to physical RAM/2 or /3.
Also, test the number of packets lost and the ping time: open a command line and ping the agent/coordinator IPs. You should get <1% packet loss and a ping <5 ms.
Btw, you'll probably need to add agent.exe, coordinator.exe and xNormal.exe to the Windows firewall's allowed-applications list.
@About the backface AO: a question... do you want it enabled by default or disabled?
The highpoly mesh I want to bake is 450 MB,
the lowpoly only a few KB.
userMaxMemoryUsage is set to 3 GB.
The pings to the agents and the coordinator are perfect:
1 ms without any losses.
The firewall is disabled.
The normal map seems to complete, but as soon as it starts to create the height map the agents drop their task.
They get it and immediately drop it without doing anything.
Can you send me the OBJ meshes saved with 3 decimals only and compressed with 7-Zip ultra compression, please? That way I could debug the problem in depth.
Exporting cages in the newest version of xNormal looks funny when reimported into Max.
Can you post a screenshot showing the problem, pls?
Coolness. I just wanted to mention that the other way around, importing cages into xNormal, is perfect!
I only go from xNormal to Max because I get better results with the AOs from Max than from xNormal, but that's my own fault: I haven't looked into the exact settings for generating great AOs from xNormal. I'll get around to it; I'm sure it will be awesome when I do, or someone can suggest their settings.
On another note, I'd just like to mention that, for me, the UI is obstructive when working with cages.
I had an idea: have the option to hide all the other options surrounding the viewport (like you have for F1), and put the cage editing options off to the side so they don't obstruct your view of the object and cage. For example, while in F1/hidden-UI mode, F2 could enable cage editing, with its controls kept away from both the cage and the object so you have a full view of the object.
Here's a visual mock-up for you:
I hope I can mention this; I'm not trying to take anything away from your design, only suggesting a more efficient (at least for me) way to work with the cage options in xNormal.
Thanks for at least reviewing it and taking it under advisement.
On another note, I would just like to mention for me the UI when playing with the cages is obstructive.
The complete UI is a mess, I agree. The program grew too fast and ran out of control. xn4 will solve all that.
I had an idea which would be to have the option to hide all other options surrounding the viewport (like you have for F1) and have the editing options for the cage off to the side not obstructing your view of the object and cage?
If you know Lua scripting you can customize the UI as you want. Just look inside the ui.lua file and move the windows as you need.
When you set the resulting bake images to smaller than 128 pixels, you can bake within xNormal, but these settings won't load from the XML file.
The minimum size of a map is determined by the "bucket size". If you set a bucket size of 128 but you're rendering to a 32x32 image then you'll have problems...
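In other words, the renderer fills the map in square buckets, so the output can't be smaller than one bucket. A tiny sanity check along those lines (the function name is mine, not part of xNormal):

```python
def check_bake_size(width, height, bucket_size=128):
    """Reject map sizes smaller than one render bucket.

    This mirrors the constraint described above: a 32x32 map can't be
    rendered with a 128-pixel bucket, so either lower the bucket size
    or enlarge the map.
    """
    if width < bucket_size or height < bucket_size:
        raise ValueError(
            f"{width}x{height} map is smaller than the {bucket_size}px "
            "bucket size; lower the bucket size or enlarge the map")
    return width, height
```

So a 32x32 bake needs a bucket size of 32 or less.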
Is there a way I can make a lowpoly cage inside Maya that I can use in xNormal? Thank you, sorry, total newb here.
AFAIK, nope. There is nothing similar to 3dsmax's Projection modifier or xNormal's cage in Maya.
However, you can clone your object in Maya. Keep the topology INTACT (don't add/delete faces, don't add/delete vertices, etc.), extrude the mesh a bit, and move some vertices until the cloned mesh completely covers the highpoly mesh. Finally, export the mesh into a separate .OBJ and use it as the "External cage file" in xNormal.
AFAIK, nope. There is nothing similiar to the 3dsmax's Projection modifier or xNormal's cage in Maya.
However, you can clone your object in Maya. Then, keep the topology INTACT(don't add/delete faces, don't add/delete vertices, etc), extrude a bit the mesh, move some vertices until the cloned mesh covers completely the highpoly mesh. Finally, export the mesh into a separate .OBJ and use it as "External cage file" in xNormal.
Thanks man. Actually, I was thinking about extruding the faces, but I wasn't really sure, so I asked for confirmation.
One more little thing; this may have been brought up before, but in the simple AO tool "Spread Angle" is shown as an option, but greyed out. Is there any reason we can't use this option there?
One more little thing, this may have been brought up before, but in the simple AO tool the "Spread Angle" is an option there, but greyed out. Is there any reason we cant use this option here?
The spread angle only works if you set the CPU mode.
Hello.
I'm baking an AO map from an HP model which has one smoothing group, same as the LP version.
Why do I get an effect like this?:
I'm using version 3.16.8.1301 of xNormal.
In the newest one I get this same posterizing.
1. Try to increase the number of rays per pixel.
2. Use the cosine distribution.
3. Try the 3.16.12 version, because it fixes several AO problems.
4. Make sure your highpoly mesh's normals are set correctly (it seems it's using face normals instead of averaged ones).
5. Limit the spread angle to avoid edge artifacts (usually 162 deg is ok).
6. Increase the bias parameter a bit (for instance, from 0.0001 to 0.01).
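Points 2 and 5 are about how the AO rays are scattered over the hemisphere above each surface point. As an illustration only (xNormal's exact sampler isn't published), a cosine-weighted direction clamped to a spread cone can be generated like this:

```python
import math
import random

def cosine_weighted_ray(spread_deg=162.0, rand=random.random):
    """One AO sample direction around the surface normal (+Z here).

    Cosine-weighted: directions near the normal are picked more often,
    matching how much they contribute to the occlusion term.  The spread
    angle caps how far a ray may tilt from the normal (a 162 deg cone
    means an 81 deg half-angle), which hides the edge artifacts that
    point 5 mentions.
    """
    half_angle = math.radians(spread_deg) / 2.0
    # For a cosine-weighted pdf, sin^2(theta) is uniformly distributed,
    # so clamping the cone just shrinks the uniform range.
    s2 = rand() * math.sin(half_angle) ** 2
    theta = math.asin(math.sqrt(s2))
    phi = 2.0 * math.pi * rand()
    return (math.sin(theta) * math.cos(phi),
            math.sin(theta) * math.sin(phi),
            math.cos(theta))
```

Every returned direction is a unit vector tilted at most 81 degrees from the normal, with more samples clustered near it.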
Just followed the external cage file tutorial for 3ds Max 9.
Numerous errors cropped up using the .ase extension as instructed. Not sure if it's my setup or what, but the errors were with all the files, not just the cage. To fix it, everything was re-exported using the .obj format and it worked perfectly.
(Btw, I had errors with just the AO bake in Max (the normal map was great), so after messing around in xNormal with ray distances, using the calculator and so on, this was the only solution that worked.)
AFAIK, nope. There is nothing similiar to the 3dsmax's Projection modifier or xNormal's cage in Maya.
However, you can clone your object in Maya. Then, keep the topology INTACT(don't add/delete faces, don't add/delete vertices, etc), extrude a bit the mesh, move some vertices until the cloned mesh covers completely the highpoly mesh. Finally, export the mesh into a separate .OBJ and use it as "External cage file" in xNormal.
Easiest way is to select all the verts on your lowpoly and change the move tool setting to "Normal". Then drag the handle that has the "N" at the end of it and export with the SBM plugin. Remember to freeze transforms and delete history before you do, though.
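The "Normal" move handle (like Max's Push modifier) just offsets every vertex along its averaged normal, which is the only kind of edit a cage is allowed. A pure-geometry sketch of that operation (no Maya API involved):

```python
def inflate_cage(vertices, normals, distance):
    """Push every vertex outward along its (unit) averaged normal.

    This is the whole trick behind a cage: same topology as the lowpoly,
    with vertices only moved, never added or deleted.
    """
    return [(vx + nx * distance, vy + ny * distance, vz + nz * distance)
            for (vx, vy, vz), (nx, ny, nz) in zip(vertices, normals)]
```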
This is probably going to sound like a contradiction of my last post, lol, but the same error crops up every time...
Everything has been checked:
The cage has the same number of verts, edges, polys and smoothing groups as the low poly. All the meshes have been X-formed and stack-collapsed with all sub-object types deselected, pivots centered and meshes aligned to each other.
On top of that I tried different file formats, exporting selected/individual meshes, and finally merged the objects into a new max file and re-exported; nothing seems to work.
Edit: fixed this without using the cage. The results were pretty good, though I got better ones with a cage before, so it would be nice to know how to fix this problem for the future.
The cage has the same no of verts, edges, poly's and smooth groups as low poly.
The lowpoly mesh and the external cage must use the same topology. Having an equal number of verts, edges, etc. is good, but the face indices must match too.
That error dialog appears when the external cage's indices don't match the lowpoly's. Please do this test: edit the external cage .OBJ with a text editor and look at the first "f" element; do the same for the lowpoly mesh. You'll see one uses something like "f 1/2/3/4" while the other uses something like "f 23/45/29/2".
Possible reasons: some vertices/faces were added/removed, or 3dsmax triangulated the mesh differently (it's better to triangulate first, then clone, move verts and save).
Btw... if you are using 3dsmax, why use an external cage? Isn't it easier to use the Projection modifier and save as an .SBM?
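The "f"-line test above is easy to automate. A rough sketch that compares the face indices of two OBJ files (file paths are placeholders; assumes plain "f v/vt/vn" statements):

```python
def face_lines(path):
    """Return the index part of every 'f' statement in an OBJ file."""
    with open(path) as fh:
        return [line.split()[1:] for line in fh
                if line.startswith("f ")]

def cages_match(lowpoly_obj, cage_obj):
    """True if both meshes use identical face indices (same topology).

    Only vertex positions may differ between a lowpoly and its cage;
    the 'f v/vt/vn' index triplets must be identical line for line.
    """
    return face_lines(lowpoly_obj) == face_lines(cage_obj)
```

If this returns False for your pair, the cage was rebuilt/re-triangulated somewhere along the way.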
The cage was made by cloning the low poly, extruding locally and deleting the excess borders - so it was this that caused the vertex/face order to be re-arranged?
Thanks for the info; this totally went over my head, as only vert/edge/poly count were mentioned as important factors in the tutorial, so the error made little sense.
Could you please explain what you mean by an .SBM file? Sometimes I get better results in xNormal and sometimes in Max, but if there's a more precise way of calculating ray distances using just one application, that would be great.
Hah, you know, I never even thought about using the Push modifier here, but I have used it for displaying overlaid wireframes in the past. I just didn't consider it a useful modelling tool!
What do you mean about the UVs? The low poly wasn't altered, but the clone (cage) was; are you saying the cage requires UVs too?
Hello,
I'm really having a hard time trying to bake normals. Can somebody explain this to me: when I bake normals in tangent space I get a normal map, but not the one I expect. It is blue and all, but when I use the same model and settings except with object-space normals turned on, the normal map looks way better. I'd expect them to look pretty much the same in the viewport, but they don't. I've been fiddling with this for at least a week now. I hope my question makes sense; what am I doing wrong? If somebody is willing to help, I can send my models if needed. Help please, my head is just about to explode.
The cage was made by cloning the low poly, extruding locally and deleting excess borders - so it was this that caused the vertices/face order to be re-arranged?
Erasing any vertex/face/edge will invalidate the topology, yep.
So... the initial external cage must be a perfect clone of the lowpoly mesh. The only operation allowed is to move vertices/faces/edges. If you add/erase any face, edge or vertex then the error dialog will appear!
Please could you explain what you mean by .SBM file?
xNormal >= 3.12.0 (if I remember well) includes a mesh exporter for 3dsmax. It saves the meshes using an .SBM file extension.
The SBM files have several advantages over other formats:
1. They use xNormal's native format, so they are faster to load than any other file format because no conversion/triangulation is needed.
2. They store all the information required by xNormal. That includes cages (using the 3dsmax Projection modifier's data), so you can edit the cage in 3dsmax and use it directly in xNormal...
They also include the normals/smoothing groups, UVs and per-vertex occlusion!
3. They don't lose precision. If you look at the OBJ format you'll notice it's really a text format; the decimals there are usually rounded to 3, 4 or 5 positions, and that can make your mesh lose accuracy. SBM files save the data as raw floating-point numbers (no precision is lost in the process).
4. xNormal includes this exporter for 3dsmax and Maya... and the SBM files can also be imported.
5. It's an open format. Using the xNormal SDK you can read/write SBM files easily (for direct usage in your 3D engine, DCC tool, etc.).
Soooooooo... use the xNormal SBM exporter if you can!
ps: Btw, if you ever wondered what SBM means... Simple Binary Mesh.
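Point 3 is simple to demonstrate: a coordinate written as text with a few decimals loses data, while the raw float bits round-trip unchanged. A sketch of the idea (not the actual SBM layout):

```python
import struct

x = 0.1234567890123            # a vertex coordinate as a Python float

# OBJ-style: the coordinate is written as text with 4 decimals,
# then parsed back on import; the extra digits are gone for good.
obj_roundtrip = float(f"{x:.4f}")

# SBM-style (the idea, not the real file layout): the raw 32-bit float
# is written and read back; nothing beyond float32 quantisation is lost.
sbm_roundtrip = struct.unpack("<f", struct.pack("<f", x))[0]

assert obj_roundtrip != x                     # text rounding lost data
assert abs(sbm_roundtrip - x) < 1e-6          # binary kept ~7 digits
```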
hah you know I never even thought about using the push modifier here but have used it for displaying overlayed wireframes in the past. Just didn't consider it a useful modelling tool!
What do you mean about the uv's? The low poly wasn't altered but the clone (cage) was, are you saying the cage requires uv's too?
The cage was made by cloning the low poly, extruding locally and deleting excess borders - so it was this that caused the vertices/face order to be re-arranged?
Doing edits like the above (extruding and deleting faces) is going to screw up the vert order, and possibly screw up the UVs as well. Basically, any modeling operation (extrude, cut, bevel, etc.) is going to reorder the verts. The vert count matching up isn't the only important thing, but the vert order as well (as you asked).
And yes, AFAIK your cage needs to have UVs, or else how would it know where to render the map to? =D The cage needs to be an exact duplicate of your low (with verts just pushed/moved around in some manner).
Interesting, and definitely reassuring to hear, jogshy. Previously I had been using OBJs with 12-decimal precision, but if, like .tga, there is no data lost in translation, then the .sbm format will be used exclusively from now on. Thank you :thumbup:
ps: just downloaded and installed the latest update to xNormal (v3.16.12); as you say, it addresses some ambient occlusion problems. Can't wait to try it out with the new file format; hoping for some cracking results.
--
Understood; so if there is a cage in the future requiring more volume to encompass other meshes, the Push modifier will be used to avoid that problem. Thanks, EarthQuake.
I'm trying to get a firm grasp on the mechanics of this process, but it seems the well of information is a lot deeper than anticipated: I expected a cage to provide vertex count and 3D-space XYZ coordinate data to cast rays through to the high and low poly surfaces, without any need for UVs.
I have 3 parts of this latest model left to bake, so I will test this out by messing up the cage UVs and keeping those fingers crossed!
Edit: baking out a final part now with some smallish text. Getting good results with OBJ using an external cage, so going to try out the SBM format using a 3ds Max projection cage; will post the results here.
Cheers
Some results:
3ds Max RTT projection cage (mental ray)
xNormal, OBJ format (external cage)
xNormal, SBM format (inc. Max projection cage)
It might be difficult to see, but the text is clearest using xN's SBM format; even though the cage inside Max was not altered, the result is cleaner than 3ds Max...
I have a quick question for anyone using xNormal regularly: does xNormal cause a BSOD when you are baking AO maps? For me, when I try to bake an AO map at 2048, it does all of the calculations, but then, when it shows the map being baked, my system crashes.
Here are my specs:
Core i7 920 2.66 GHz, OC'd to 3.0
6 GB DDR3 RAM at 1333 MHz
Nvidia GeForce 8800GT with ForceWare driver 190.62
OS: Windows Vista Home Premium 64-bit
I have a quick question for anyone using Xnormal regularly... does xnormal cause a BSOD when you are baking AO maps? For me when i try to bake an AO map at 2048, it does all of the calculations, but them when it shows the map being baking, my system crashes.
here are my specs:
Core i7 920 2.66Ghz O.C. to 3.0
I like to OC too! xNormal really fires up your CPU when it computes the AO, in a similar way that FurMark does for the GPU.
Have you run OCCT/LinX for almost 2 h (or however long the AO takes) and "Memtest for Windows" up to 600%? What happens if you disable the OC?
Hey Santy, I think the problem has gone away... I recently updated my drivers, so things are going smoothly. It may have been the model I was using for reference (25 mil), and it may have really taxed my CPU...
Btw, I really love xNormal. Since ZB 3.5 removed ZMapper, I will be using this more often. Also, in the preview window, how do I zoom in so I can see my normals at full size?
Cool man... and if I can push my luck, what about the ability to have the window maximised? I would love that, so I can see what model I am using in the viewer. And also allowing the viewer to show your normal maps tiled, to check for seams...
And if you are looking for testers for xn4, count me in. I am upgrading my system next month...
Hi guys, love xNormal, but I've got one big problem.
The baked normal map has this nasty UV seam; from a distance it's not really visible, but up close you can see it.
Here's the problem:
So, as you can see, there's an edge following the whole UV seam.
When I apply the normal map as a texture you can really see the problem.
You can see from the arrows in picture 4 how the coloring isn't even close to the same on both sides of the seam.
I get the same problem on whatever kind of map I try to render.
Settings like padding or ray distance don't change the problem in any way.
neigan: Bear in mind that since tangent-space normal maps take into account UV seams as part of the tangent basis, there's no reason why the "colours" (as you show in your last image, normalmap applied as a diffuse) should match up on either side of the seam. In fact, more often than not they must be different in order to make the resultant per-pixel normals display correctly. They'd only be the same if the surface was perfectly flat at the seam, and the UVs were perfectly aligned on the same axis.
You can't just apply the image as a diffuse map and say "look! a seam!", since they're naturally there anyway.
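The point above can be shown in a few lines: encoding the same world-space normal against two different tangent bases (as on either side of a UV seam) gives two different map colours, even though both decode to the same surface normal. A toy example with hand-picked orthonormal bases (illustrative only, not taken from a real mesh):

```python
def encode(world_n, t, b, n):
    """Project a world-space unit normal onto an orthonormal TBN basis
    and map each component from [-1, 1] to a 0-255 channel value."""
    dot = lambda u, v: sum(a * c for a, c in zip(u, v))
    return tuple(round((dot(world_n, axis) * 0.5 + 0.5) * 255)
                 for axis in (t, b, n))

# The same world-space surface normal, seen from both sides of a seam.
world_normal = (0.6, 0.5657, 0.5657)

# Hand-picked bases: the geometric normal N is shared, but the tangent
# runs the opposite way on each UV island, as is typical at a seam.
N = (0.0, 0.7071, 0.7071)
left  = encode(world_normal, (1.0, 0.0, 0.0), (0.0, 0.7071, -0.7071), N)
right = encode(world_normal, (-1.0, 0.0, 0.0), (0.0, -0.7071, 0.7071), N)

# left and right come out as different colours (the red channel flips),
# yet both represent the identical world-space normal once decoded with
# their own tangent basis.
```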
I have overlapping UVs, as I'm trying to conserve texture space, and I'm getting a normal map issue; I was hoping to get some help with it. Is there a way around this, or do I just need to not overlap UVs? Thanks in advance.
I tried flipping the green channel and it wasn't the issue. It turns out I had to offset the mirrored UVs, then bake the normals. That mostly fixed the problem, but I still have a small seam.
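For reference, "offsetting the mirrored UVs" usually means shifting the mirrored shell exactly one UV tile (e.g. +1.0 in U) before baking: a whole-tile shift is invisible at render time because UVs wrap, but it moves the shell out of the 0-1 bake range so the two halves stop fighting over the same pixels. A sketch (the data layout is made up):

```python
def offset_shell(uvs, shell, offset_u=1.0):
    """Shift the UVs of one shell by a whole tile in U.

    uvs:   list of (u, v) pairs
    shell: indices of the UVs belonging to the mirrored shell

    A whole-tile shift changes nothing visually (UVs wrap at 1.0) but
    moves the shell out of the 0-1 bake range, so only the unique half
    is written during the bake.
    """
    shell = set(shell)
    return [(u + offset_u, v) if i in shell else (u, v)
            for i, (u, v) in enumerate(uvs)]
```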
neigan: Bear in mind that since tangent-space normal maps take into account UV seams as part of the tangent basis, there's no reason why the "colours" (as you show in your last image, normalmap applied as a diffuse) should match up on either side of the seam. In fact, more often than not they must be different in order to make the resultant per-pixel normals display correctly. They'd only be the same if the surface was perfectly flat at the seam, and the UVs were perfectly aligned on the same axis.
You can't just apply the image as a diffuse map and say "look! a seam!", since they're naturally there anyway.
But there is a pretty obvious seam when it's applied as a normal map too, as you can see, so what you said doesn't really help.
I just applied it as a texture to make it more obvious for you to see.
I have overlapping uvs as im trying to conserve texture space and i'm getting a normal map issue and was hoping to get some help on it. Is there a way around this or do I just need to not overlap uvs? thanks in advance.
But there is a pretty obvius seam when its applied as a normal map too as you can see so what you said doesn't rly help.
Just applied it as texture to make it more obvius for you to see.
I think you missed my point: applying normal maps as diffuse textures will show the "seams" even if they are perfectly hidden when the texture is correctly applied as a normal map.
Showing the normal-map as a diffuse texture doesn't prove anything. The seams are clear enough in the other images. I could show a screenshot of a normal-mapped asset with perfectly hidden seams and then display the normal-map as diffuse and you'd see the UV seams clearly.
I appreciate that this doesn't help solve your problem, I was just trying to point out that you seem to have misunderstood how normal-maps work.
I surely don't know how normal maps work 100%, more like 43%. I thought an offset bake would have solved my problems, and it did handle most of it, but there is still a hint of a seam in Marmoset, so I've been trying to fix it with this tut: http://boards.polycount.net/showthread.php?t=51088&page=2
with mixed results. Thanks for helping to troubleshoot the problem, guys; I'm open to everything.
The compression is disabled in the ini file.
Thanks in advance.
Creating cage edges bigger than 1 or 2 creates funny-looking artifacts on the normal maps when objects have close geo <- is that more of a cage problem?
Never mind, spoke too soon, ugh.
Can you post a screenshot showing the problem, pls?
This has been driving me crazy for 3 days.
I will send you the SBM of the lowpoly, the OBJ of the highpoly, and the XML file for xNormal.
Thanks in advance.
Screenshots
Love XNormal still.
Bug procedure:
open a settings file
set the texture size of the rendered maps to smaller than 128, e.g. 32 x 32
save the settings
load those settings
-> the texture size switches back to 128
Another question:
is it better to render the mip stages, or to create them via the downscaler in NVIDIA's DDS tools?
The new floater AO thing works perfectly. The AO seems "cleaner" as well; great job!
The Push modifier is a much better bet.
But as Santi says, just use a Projection modifier.
Here's how you fix that: http://boards.polycount.net/showthread.php?t=60615&highlight=pooh