I would like to point out a potential problem with the SBM exporter in Maya: I discovered that when the Export Tangent Basis option is on, normal maps baked with xNormal come out weird (see image). If the option is not checked, the normal map bakes perfectly fine. I am baking from a high-res OBJ file and using an SBM file for the low-res. All other settings are default.
Thanks. I'll fix it.
Btw... is Maya 2010 available?
What's up with the xNormal site? It's been down for a few days, anyone know?
Important: Maya 2009 Service Pack 1 has recently been updated
If you installed it before April 16th, 2009 , you must uninstall the existing version, then download and install this new version. The new service pack includes the following important fixes for you:
* mia_material_x_passes shader and shadows
Previously, the mia_material_x_passes shader caused shadows to behave incorrectly. This has been fixed.
* mental ray for Maya and processor limits
Previously, the mental ray for Maya renderer was not using all available processors when hyperthreading was turned on. This has been corrected so that all available processors are used for rendering.
* Color per vertex shading
Due to an optimization in CG 2.0, color per vertex shading was incorrectly rendered in the Render View and in batch rendering. This has been fixed.
* Stereo cameras and ATI graphics cards
Due to an update in CG 2.1, stereo cameras in Maya caused display errors when using an ATI graphics card. This has been fixed.
In addition to the above fixes, Autodesk Maya 2009 Service Pack 1 includes over 100 fixes for Autodesk Maya 2009 across several functional areas, including: Rendering, Assets, Animation and Rigging, Dynamics and Effects, Modeling, Python scripting and API.
Ok, I've tried searching through the forums and the internet and I can't find anything on this.
I have a character model I built a bit modularly, so the arms and legs clip a bit. I wanted everything baked to one UV map, so that's set up. I thought xNormal's batch protection wouldn't overwrite the pixels of something that's already rendered, but I'm still getting a lot of nasty artifacts that look exactly like things are being overwritten.
The not-so-fun way would be to render each piece out and then composite everything in Photoshop, but isn't there an easier way?
Also, as far as settings go, I'm just working with the defaults right now; there's no cage either. I just don't understand why, if I render these one at a time, it's fine, but if I try to do it all in one go I get lots of stuff clipping through and other weirdness.
Sorry in advance if I'm just missing something very basic.
Because your cage/rays are picking up the other pieces where they intersect or are close. Just make an exploded version of the high and low poly, and an unexploded version for the AO map.
An idea that I would love to see: a screensaver built in that, when started, opens a certain port on the user's computer and the hx agent starts broadcasting from it to a master server. This way, while people are away from their desks, a user who is rendering could steal some CPU ticks to help batch render their object. Right now I'm not even sure you can do anything but LAN renders? And each box must have the agent started manually. I don't have another box to use, so I can't test all the features of hxGrid.
In other words:
- User A leaves their system.
- The screensaver starts; the hx agent starts as well and broadcasts to a set server that it's available through an unfirewalled port.
- User B, who is rendering: their local hx coordinator not only looks on the LAN, but also contacts this server to see what's available.
- To keep processing load down on the server, the master server only lists what's available. It does not coordinate; the local user's coordinator does that.
- If User A comes back to their system, the screensaver shuts off and quits the agent, while sending a signal to the master server and to User B's coordinator that User A's system is no longer available.
- The coordinator then goes back to the master server and coordinates with the next available box on the list.
Another idea with this: the agent also reports how long it has been in screensaver mode. The longer the time, the more data the coordinator can send, since it's more likely the user is away for a while; for a screensaver that has just started, smaller portions of work would be sent in case that user comes back (see the sketch below).
This could be a way to make a few bucks as well, by charging extra for access to this master server list.
It's difficult trying to continue working in the background when xNormal is rendering an AO. This way, people who don't have access to large render farms could work off one another.
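To make the idle-time heuristic concrete, here is a tiny sketch of what I mean (the function name and numbers are made up; it has nothing to do with hxGrid's real API):

```python
def work_chunk_size(idle_seconds, base_chunk=1, max_chunk=32):
    """Scale how many render tiles the coordinator hands to a remote box by
    how long that box has been idle: a freshly started screensaver gets tiny
    chunks (cheap to throw away if the user returns), while a box idle for an
    hour gets large ones. All numbers are arbitrary placeholders."""
    minutes_idle = idle_seconds / 60.0
    chunk = base_chunk * (1 + minutes_idle // 5)  # grow every 5 idle minutes
    return int(min(chunk, max_chunk))

# work_chunk_size(30)   -> 1 tile
# work_chunk_size(3600) -> 13 tiles
```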
00Zero: There's no such thing as "too high" padding. It doesn't overwrite anything, higher is always better - the higher it is, the less likely you will get mip-mapping seam issues when viewed at a distance. Obviously it takes a bit longer to process, but apart from that there are only benefits to high padding values.
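For reference, this is roughly what a padding/dilation pass does; a minimal numpy sketch, not xNormal's actual code: every empty texel copies the color of an already-rendered neighbour, so the UV islands grow outward and mip-mapping always averages valid colors at the seams.

```python
import numpy as np

def dilate_padding(rgb, mask, passes=16):
    """Grow rendered texels outward into empty space.

    rgb  : (H, W, 3) float array with the baked colors
    mask : (H, W) bool array, True where a texel was actually rendered
    Each pass copies color from any filled 4-neighbour into still-empty
    texels, expanding the islands by one texel per pass. (np.roll wraps at
    the image border; ignored here for brevity.)
    """
    rgb, mask = rgb.copy(), mask.copy()
    for _ in range(passes):
        empty = ~mask
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            neighbour_mask = np.roll(mask, (dy, dx), axis=(0, 1))
            neighbour_rgb = np.roll(rgb, (dy, dx), axis=(0, 1))
            take = empty & neighbour_mask   # empty texels with a filled neighbour
            rgb[take] = neighbour_rgb[take]
            empty &= ~take
        mask = ~empty
    return rgb, mask
```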
binopittan: As 00Zero says, it looks like your ray casting distance is set too low, so it's not hitting the highpoly mesh. Or maybe it's too high, and it's hitting that bowl geometry. What's the aim of that, to have more occlusion under and around the sides?
I would like to say thanks to the creator of xNormal. I tried to bake a scene using Mudbox, which is all buggy; it crashed all the time. In 30 minutes I installed xNormal, watched a tutorial, and baked my object without any trouble.
So I got the new version, 3.16 I believe, and followed the linked video tutorial on the xNormal site exactly, but I still get messed-up bakes. My roommate gets similar buggy results.
Imported the HP, then the LP, and set smooth normals to Average normals. Ran the ray distance calculator and waited 30 seconds. The numbers never changed while it was calculating; is that odd? My geo was just a simple plane, FYI.
When I generate maps, they come out split diagonally in half for some reason. It happened with both the normal and AO maps.
A Max cage render works fine when baking. I tried a different object with more complex geo, same buggy results.
This is what it looks like and my baking options.
Edit: The results look like this in PS too, so it's not just a render-viewer error.
Are your UVs in a strict [0,1] range? It seems the top-left and top-right corners could be at 1.001. Triangles crossing the [0,1] boundaries can be problematic (not the ones lying completely outside). It could also be a good idea to snap the borders to pixels. (A quick sanity-check sketch follows this post.)
Btw, a piece of advice: give the UV borders some margin for the dilation filter.
Are you sure your UVs aren't overlapping? Are your normal map's UVs assigned to channel 1?
Did you use the dreadful old 3ds Max max2obj exporter? Better to use gw::obj or the xN SBM exporter... and remember to ResetXForm before exporting (an Edit Mesh modifier can also help the file load faster, because that way the triangulation step can be skipped).
Btw... are you using cages or ray blockers? Or uniform ray distances?
What happens if you render a "Wireframe and ray fails" map?
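Regarding the strict [0,1] question above: if the lowpoly is an OBJ, a throwaway script like this can list any vt coordinates that stray outside the range. Just a sketch; the file name is hypothetical.

```python
def check_obj_uvs(path, eps=1e-6):
    """Print any OBJ 'vt' coordinates that fall outside the [0,1] range."""
    bad = []
    with open(path) as f:
        for line_no, line in enumerate(f, start=1):
            if not line.startswith("vt "):
                continue
            u, v = (float(x) for x in line.split()[1:3])
            if not (-eps <= u <= 1 + eps and -eps <= v <= 1 + eps):
                bad.append((line_no, u, v))
    for line_no, u, v in bad:
        print(f"line {line_no}: UV out of [0,1] range ({u}, {v})")
    return not bad

# check_obj_uvs("lowpoly.obj")  # hypothetical file
```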
I'm using xNormal to bake my DWIV entry - a big chunky thing with lots of hard edges - and keep finding that xNormal overwrites the exported vertex normals of my lowpoly at render time ("Averaging vertex normals..."), which is causing havoc for my maps. I basically want a ray distance render from a hard-edged mesh to render correctly without needing to split up the mesh manually before exporting, which should work considering that the smoothing setting in xNormal says "Use exported normals"... Help?
"Use exported normals" For me has always worked. It may be another issue, like having much to harsh angles in your mesh and just general smoothing errors. Post some screenshots.
When you create a normal map, make sure you bake the triangulated version... Maya, for example, triangulates differently than xNormal, which produces normal map errors.
All the highpolies I bake come out as if all their faces had hard edges. I export from Max, Maya, Mudbox, etc. and it's always the same thing; I export as I always have, no issues with that, and even if I set average normals or exported normals it always renders as if it were all hard edges, producing unsatisfactory normals and AO. I can't give you a scene example, but just export any OBJ from Max or Maya as a high and a low and you'll see what I mean when rendering it. Thanks!
Are you using the (defective, old, buggy) max2obj exporter that comes with the old versions of 3ds Max? Better to use gw::obj or the SBM.
Remember to apply a ResetXForm before exporting in 3ds Max, and "Freeze Transformations" in Maya.
The AO, by nature, is computed using the face normals. Using per-vertex normals produces errors near the triangles' edges... but, anyway, you can compute a per-vertex AO and then UNcheck "Ignore per-vertex AO" so the AO will be interpolated from the per-vertex values (a small illustrative sketch follows this post).
The smiley example (.obj) was exported from Maya but I cannot reproduce that error you mention... I'm afraid I need a more concrete description or example to reproduce the bug.
Btw... are you using cages or constant ray distances? Have you played with the weld/break feature? Perhaps it's a bug related to that.
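To illustrate what "interpolated from the per-vertex values" means, a toy example (not xNormal's code): each texel takes the barycentric mix of the AO stored at the triangle's three vertices.

```python
def interpolate_vertex_ao(ao_v0, ao_v1, ao_v2, bary):
    """Blend per-vertex AO across a triangle using a texel's barycentric
    coordinates (b0, b1, b2), with b0 + b1 + b2 == 1."""
    b0, b1, b2 = bary
    return b0 * ao_v0 + b1 * ao_v1 + b2 * ao_v2

# A texel near the middle of a triangle whose vertices have AO 0.2, 0.9, 0.8:
print(interpolate_vertex_ao(0.2, 0.9, 0.8, (0.3, 0.4, 0.3)))  # -> 0.66
```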
Hey jogshy, I made this image to show the issues I am getting. It happens even with ResetXForm etc., with the simplest geo. Anyway, I made different renders and attached them. Sorry about this; it's just that I use xNormal in my workflow daily, and if the AO comes out like this I can't use it :( Anyway man, hope you can help me out as you always do ^^
Thanks, now I see the problem better... but I'm afraid that's because I changed the way the AO is computed.
For 3.14.6 I was using the vertex normals (which is not correct and can cause artifacts near the edges... this is probably why you're getting those white artifacts). Now I'm using the face normals instead (which is correct, but gives a less smooth result if the mesh is not subdivided).
Why do I say using the vertex normals is not correct? See this image:
Some ways to solve the incorrect self-occlusion problem are:
1. Use the face normals instead (which I'm doing now). You can make the AO smoother by subdividing the highpoly mesh a bit (ideally, so that each highpoly face occupies less than 2 texels).
2. Apply some bias to the ray origin point (hard to measure accurately... and it can skip a lot of detail). I discarded this because I could not get decent results. I'm aware DreamWorks applies some bias along the normal, pushing the ray origin outwards, when dealing with micro-displaced surfaces... the problem is that it can miss a lot of detail and it's really hard to set up manually.
3. Use the Simple GPU AO calculator to compute per-vertex AO... then UNcheck the "Ignore per-vertex AO" option on the corresponding highpoly mesh slot, so the per-texel AO will be computed by interpolating the per-vertex AO. This gives you a softer but less accurate AO.
4. Apply some jitter so the shadows turn noisy. You can do that and then filter the AO map with a median/Gaussian blur/bilateral filter in Photoshop. (A rough sketch of options 2 and 4 follows this list.)
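Here is the rough sketch mentioned in point 4, covering points 2 and 4 together. It's plain illustrative Python, not xNormal's source; trace_any_hit is a made-up stand-in for the real ray cast.

```python
import math
import random

def random_hemisphere_dir(n):
    """Uniform direction on the hemisphere around normal n (rejection sampling)."""
    while True:
        d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        length = math.sqrt(d[0] ** 2 + d[1] ** 2 + d[2] ** 2)
        if 0.0 < length <= 1.0:
            d = (d[0] / length, d[1] / length, d[2] / length)
            if d[0] * n[0] + d[1] * n[1] + d[2] * n[2] > 0.0:
                return d

def ao_at_point(p, n, trace_any_hit, bias=1e-3, rays=64):
    """Point 2: offset the ray origin a little along the normal (bias) so a ray
    doesn't immediately hit the triangle it starts on. Point 4: the directions
    are random (jittered), so banding becomes noise that a median/blur filter
    can clean up later. trace_any_hit(origin, direction) returns True on a hit."""
    origin = (p[0] + n[0] * bias, p[1] + n[1] * bias, p[2] + n[2] * bias)
    hits = sum(trace_any_hit(origin, random_hemisphere_dir(n)) for _ in range(rays))
    return 1.0 - hits / rays
```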
It's a hard problem to solve (I'm open to suggestions)... for example, mental ray uses the face normals to compute the AO and adds noise to hide the "hard edges" effect, while Blender uses the interpolated per-vertex normals + some subdivision... I bet that's why SH point/disk-based AO is becoming more and more popular, hehe!
For xn4, since I'm integrating the GPU AO renderer into the "standard" process, you'll be able to choose between interpolating the per-vertex AO or computing the per-pixel AO using the face normal (and perhaps I'm going to add the option to use the interpolated vertex normal, although it's incorrect). You'll see it... in a few days.
I hope this clarifies it.
Btw... what happens if you decrease the spread angle (for example, to 150 degrees) or if you apply linear/quadratic attenuation? And use the cosine distribution instead of the uniform one. That might help smooth the hard-edge appearance a bit.
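For reference, "cosine distribution instead of the uniform one" means roughly this; a sketch in the local frame around +Z, where the directions would still have to be rotated into the hemisphere around the surface normal:

```python
import math
import random

def cosine_weighted_dir():
    """Cosine-weighted direction around +Z: sample a point on the unit disk,
    then project it up. Rays cluster toward the normal, so near-grazing rays
    (the ones most likely to graze neighbouring faces) contribute less."""
    r = math.sqrt(random.random())
    phi = 2.0 * math.pi * random.random()
    x, y = r * math.cos(phi), r * math.sin(phi)
    return (x, y, math.sqrt(max(0.0, 1.0 - x * x - y * y)))

def uniform_dir():
    """Uniform direction around +Z, for comparison."""
    z = random.random()                      # cos(theta), uniform in [0,1]
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * random.random()
    return (r * math.cos(phi), r * math.sin(phi), z)
```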
Hey jogshy, I tried what you suggested and none of it works; the mesh keeps coming out faceted and unusable. In 3.14.6 or so this doesn't happen, apart from that weird white stuff. Is it possible to reuse the 3.14.6 method but try to get rid of the white stuff? Thanks man, I really appreciate it.
I tried what you suggested and none of it works; the mesh keeps coming out faceted and unusable.
Even if you subdivide the mesh one or two times, or you use the Simple AO calculator to compute the per-vertex AO and then interpolate it per texel?
Is it possible to reuse the 3.14.6 method but try to get rid of the white stuff?
The white stuff is due to the incorrect self-occlusion with the vertex normals. I'm afraid there's no easy solution for that... if the vertex normals are used, you get smooth AO but some artifacts can appear. If the face normals are used, the artifacts won't be present but you'll get a faceted look.
Probably the best way to get a smoother result is to subdivide the mesh a bit, or to use the Simple AO calculator so the AO is interpolated for each texel.
Well... I'm going to see what I can do and release 3.16.9.
thx
Haven't found any info on this, is there no way to change the focal length of the camera in the 3D viewer? Would be so handy.
For a depth-of-field effect? No, I have not implemented that yet.
However, if you just want to rotate the camera around a point, you can use the CTRL key + mouse. To control that point you have two options: check the Auto-camera orbit option, or move the camera target distance slider.
Nah, I don't mean DOF (that would be a nice addition though =D), nor moving the camera, but the actual focal length of the camera lens; in Max it's called field of view, I think.
I just thought of something that >might< be easy to add?
What about an actual ground plane that's incorporated into the AO renderer, choosable as an option? That way, people who want their AO to have darker areas near the object's ground (z?) plane for static objects won't have to remember to export their AO version with a plane but the normal-map version without one. Another reason: if the ground plane is exported from a 3D program as part of the high-res mesh, the ray-tracing distance calculator will try to measure it, making the distance farther than it should be. If the ground plane were already built into the engine, the calculator could ignore it.
Since the size of the imported objects will vary, this ground plane would have to dynamically enlarge and move its position so it sits below the lowest point of the high-res model without hitting any portion of the cage or ray-tracing distance.
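A rough sketch of that auto-placement, assuming a Y-up mesh given as a plain list of vertex positions (purely illustrative, not an existing xNormal feature):

```python
def auto_ground_plane(verts, margin=1.25, clearance=0.0):
    """Return the 4 corners of a square ground plane sized a bit larger than
    the mesh footprint and placed just below its lowest point.

    verts     : iterable of (x, y, z) high-poly vertex positions (Y-up)
    margin    : how much larger than the footprint the plane should be
    clearance : extra distance below the lowest point, e.g. the cage push /
                ray distance, so the plane never pokes into the cage."""
    xs = [v[0] for v in verts]
    ys = [v[1] for v in verts]
    zs = [v[2] for v in verts]
    cx, cz = (min(xs) + max(xs)) / 2.0, (min(zs) + max(zs)) / 2.0
    half = max(max(xs) - min(xs), max(zs) - min(zs)) * margin / 2.0
    y = min(ys) - clearance
    return [(cx - half, y, cz - half), (cx + half, y, cz - half),
            (cx + half, y, cz + half), (cx - half, y, cz + half)]
```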
I have a problem while baking... The normal map does not render properly / black spots on the map...
Version 3.16.8
I have compared the map with Mudbox; the map below is rendered out in Mudbox.
This one is from xNormal:
Here is the UV shot:
This is not only a problem with this model; I tried a different model and it turned out to have the same problem...
Am I missing some setting here?
In xNormal I have set average normals and set the max/min ray distances with the ray distance calculator...
Looks like the models aren't placed correctly in xNormal. Try loading the low poly and high poly in the xNormal 3D viewer and check that they overlap as they should; maybe your pivot points got moved around when exporting.
It's overlapping correctly.
Btw, I have reset the settings and the map is rendering well now...
As you see, there are edge seams. Is it because the low poly is too low-res to hold the details?
It looks like it's because you have hard edges on your lowpoly. If you need to use hard edges, you should split the UV islands where the edges are hard. If you have hard edges on areas like this where the UVs are still connected, you'll get bad artifacts.
But for this object, I don't see any reason why you would want hard edges.
I have a question. I tried searching for it here, but man... 61 pages is a lot. Anyways...
When I export from ZBrush to 3ds Max they align perfectly, and when I export from Max to ZBrush it aligns perfectly... but when I export the high poly from ZBrush to xNormal and the low poly from Max to xNormal, they don't seem to match up... I'm getting the normals of my character's head in the UV space for the stomach!
I have "rotate model" off in 3ds Max when exporting and importing... I guess I don't know the correct way to export from ZBrush to xNormal... can someone help, please?
Edit: In the 3D viewer the low poly from Max is OK, it's standing up and looking forward... I guess it really is the way it's exporting from ZBrush... is there a "right" setting for the ImportExport option in ZBrush?
Second edit: I figured it out. I had to export from ZBrush with "Switch YZ on export" and "Flip Y on export" checked.
gaganjain: There is a problem with your smoothing groups. Assign only 1 smoothing group to your model; it should fix the hard edges.
Applying smooth also does not work. It still shows the same edge seam.
But applying a smoothing group will increase the poly count...
Have a look at this: the normal map applied to the base mesh from Mudbox.
As you see, those seams appear where the edge flow is also on the border of the mesh.
Max may be transforming it on import; if possible, you could import your high into Max and export it again.
Remember to ResetXForm in 3ds Max before exporting (or Freeze Transformations + triangulate in Maya)... and don't use the super-buggy 3ds Max max2obj exporter!
gagan: Show the normals in Maya's viewport to see if they're made hard or not.
Make sure you export it with the Maya OBJ exporter's "Export smoothing groups" option enabled... or, even better, use the xN Maya SBM exporter.
And... what happens if you use a cage? Do you see a continuous cage or a broken one? Are you using the "Use exported normals" option in the corresponding lowpoly mesh?
Yes, I am using exported normals while exporting the low mesh...
Tried with the SBM exporter...
Exported with "Export smoothing groups"...
Tried using Normals > Soften Edge...
but no use...
Tried a different UV set, same UV seams...
jogshy, I didn't get this line:
what happens if you use a cage? Do you see a continuous cage or a broken one?
Is it the "use cage" option on the low-res mesh while baking? I am using a cage also...
Show it in the xN 3D viewer please (and enable the "Show tangent basis" option too, in order to see if your UVs are welded)... and render a "Wireframe and ray fails" map to see the UV seams... and post a screenshot of your lowpoly mesh settings too, please.
Wish it were so, but it shows up the same in Blender and pretty weird in xNormal too.
Anyway, I've tried extracting normals off a simpler mesh and it worked, no artifacts and such, so it must be mesh-related... right?
you may size it down
xnormal.net works for me
Btw, I've just uploaded the 3.16.8.
The latest version of Maya is 2009 SP1.
I always have a hard time downloading it...
http://usa.autodesk.com/adsk/servlet/ps/dl/item?siteID=123112&id=12715497&linkID=9242259
Quickie using my DW entry.
Also, what's with the dome?
Go to the low-definition mesh tab in xN and increase the ray distance, max and min.
MoP: yes, it's because I want evenly distributed AO around my mesh. It's just how I usually bake my AO in Max.
Xnormal rocks!
Any help is much appreciated.
Been using it quite a bit.
Feel free to test it.
thx
The server does allow resuming the download once it stops... errr, slow net...
EQ: low-poly wireframe shot...
Spread the UVs, but then there are also UV edge seams...
Here is the same model's normal map rendered in Mudbox 2009 SP2:
The one below is from Mudbox:
I have uploaded the file... anyone want to try?
http://www.mediafire.com/download.php?zh5znxytjbm