Make sure you use the Beta 3b ( not the Beta 3 ) or you'll have some severe problems... ( /whistle whistle BSOD ) :poly136:
What ForceWare drivers are you using? It seems Optix runs well in Windows 7 only with the FW 195.62. The 196.21/192.34b are broken apparently.... I'm not sure about XP or Vista.
Well... that, or your mesh really does need more VRAM.
Btw, I think NVIDIA is gonna release a new version of Optix this Friday, so perhaps I can post a new release soon ( including unbiased ray tracing in the 3D viewer too ).
As always, thank you for the speedy reply.
My Forceware drivers are 195.62, as instructed on your website. The installer that I used is the latest from your website named "xnormal_3.17.0_beta_3b_installer.exe". I am in fact running on Windows 7 x64 and I have tried with both the 64bit version and the 32bit version of xNormal, just to rule out that I wasn't using the "wrong" one.
The video card I'm using is an NVIDIA GeForce GTX 275 with 896MB of RAM on a system with 8GB of RAM. While not the highest it could go, I can't imagine that memory is really the problem. The high poly mesh I'm trying to bake is only 760MB as an .obj. I will experiment with the new OpenCTM format you mention on your site just in case.
Again, thank you for your help and assistance to the community.
My Forceware drivers are 195.62, as instructed on your website. The installer that I used is the latest from your website named "xnormal_3.17.0_beta_3b_installer.exe". I am in fact running on Windows 7 x64 and I have tried with both the 64bit version and the 32bit version of xNormal, just to rule out that I wasn't using the "wrong" one.
It's correct then.
The video card I'm using is an NVIDIA GeForce GTX 275 with 896MB of RAM on a system with 8GB of RAM. While not the highest it could go, I can't imagine that memory is really the problem. The high poly mesh I'm trying to bake is only 760MB as an .obj.
Question: can you render the Smiley example's AO ( 2k x 2k, tile size 512 ) using the Optix renderer? If you get that out of memory error then it's a driver problem ( well, or a beta problem ).
If the Smiley example renders OK then it's a real out of memory error... so you could try to subdivide the highpoly mesh a bit less and try again. You could also try to reduce the tile size to 256 and see what happens.
If nothing solves it then I'm afraid you need to wait for the next release which should fix some bugs.
I will experiment with the new OpenCTM format you mention on your site just in case.
Nope. OpenCTM just saves disk space; the mesh will occupy the same amount of RAM/VRAM once it's uncompressed.
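For intuition, here's a rough back-of-the-envelope of what a decompressed mesh occupies. The per-vertex layout and the counts are illustrative assumptions, not xNormal's actual internals:

```python
# Rough footprint of a decompressed triangle mesh. The layout below
# (8 floats per vertex, 32-bit indices) is an assumption for illustration,
# not xNormal's actual internal format.

def mesh_bytes(num_vertices, num_triangles):
    vertex_bytes = num_vertices * 8 * 4   # pos(3) + normal(3) + uv(2) floats
    index_bytes = num_triangles * 3 * 4   # three 32-bit indices per triangle
    return vertex_bytes + index_bytes

# A dense sculpt of ~12M triangles / ~6M vertices:
print(f"{mesh_bytes(6_000_000, 12_000_000) / 2**20:.0f} MiB")  # -> 320 MiB
```

On top of the raw buffers, the ray tracer's acceleration structures need their own memory, so a 896MB card can run out well before the vertex data alone would fill it.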
Question: can you render the Smiley example's AO ( 2k x 2k, tile size 512 ) using the Optix renderer? If you get that out of memory error then it's a driver problem ( well, or a beta problem ).
If the Smiley example renders OK then it's a real out of memory error... so you could try to subdivide the highpoly mesh a bit less and try again. You could also try to reduce the tile size to 256 and see what happens.
I was in fact able to render the Smiley mesh at your recommended settings, so I guess it is a memory problem. I will try lowering the HP mesh subd levels, but the ZBrush sculpting I did has very fine details that I am sure will get lost if I lower the level. I'll experiment some with Decimation Master to see if I can come up with something that actually works. For now, the tried and true slow way of baking still works with my high poly mesh, so I will keep using that.
The speed of Optix that you have is extremely impressive for sure. Very awesome addition!
Btw, nex, there is an option called "Show stats" in the Optix renderer's options. Enter the plugin-manager and press the "configure" button.
I put a bar with the memory used, so you can see how much VRAM is free for your model.... but you need to render a map successfully to see it.
I'm hoping for some help with xNormal. I have ZBrush 3.5 and xNormal 3.17.0 beta 3b.
I have a mesh in ZBrush w/ 4 subtools. (The meshes were created from one low poly mesh which was created in Maya 2009.)
All meshes were imported with mesh scaling set to 100
Two of those subtools have produced normal maps w/o any issue in xNormal. However, the other 2 are just producing "blobs" (see attached pictures). It's like xNormal isn't even recognizing the UVing of the mesh, which was done in Maya before ZBrushing. I have examples here: http://sarahdwip.blogspot.com/2010/03/xnormal-hates-me.html
About the UV problems:
1. Make sure your UVs do NOT overlap. If you need to work with multiple UV sets then you must separate each mesh by material and export each part independently.
One mesh = one texture.
2. It seems some rays are failing to hit the highpoly model. You can confirm this by rendering a "Wireframe and ray fails map". If you see a lot of red pixels then the rays are failing.
Have you set up a cage? Is the cage completely covering the highpoly mesh?
3. Before exporting from Maya, I suggest you perform a Triangulate + Freeze Transformations. Then, export as .SBM.
4. Those "blobs" are probably the "Edge padding". xNormal dilates the normal map texels to avoid problems with mipmapping. It's a good thing and completely normal.
I've found on my system 2 million is about the max I can work with in Modo. Does xNormal handle high poly meshes better? Would I be able to import more than 2 million there?
1. Make sure your UVs do NOT overlap. If you need to work with multiple UV sets then you must separate each mesh by material and export each part independently.
One mesh = one texture.
2. It seems some rays are failing to hit the highpoly model. You can confirm this by rendering a "Wireframe and ray fails map". If you see a lot of red pixels then the rays are failing.
Have you set up a cage? Is the cage completely covering the highpoly mesh?
3. Before exporting from Maya, I suggest you perform a Triangulate + Freeze Transformations. Then, export as .SBM.
Thanks for the help. I'm confused by a few things:
1) Rendering a "Wireframe and ray fails map" - I've never heard of this. Should this be done in xNormal, ZBrush, or Maya?
2) The cage - I've heard about using a cage before, but I've never used one in the past (however, I've only made 2 models with xNormal's help in the past). Are there any tutorials out there about setting up a cage?
3) .SBM - I'm using .obj. Does .sbm work basically the same way? I'm new and have just never heard of that extension before.
2) The cage - I've heard about using a cage before, but I've never used one in the past (however, I've only made 2 models with xNormal's help in the past). Are there any tutorials out there about setting up a cage?
Cages allow you to specify a per-face ray distance visually. The constant uniform ray distances are not visual and, as the name indicates, "constant": the same ray distance is applied to the whole mesh.
Usually, complex or organic models require a cage to limit the rays correctly, because specifying the same distance for the whole mesh will lead to incorrect ray inter-penetration if the distance is excessive ( e.g., a distance large enough for the torso can make rays at thin fins punch through to the wrong surface ), or to ray misses if it's too short.
See the http://www.xnormal.net/Tutorials.aspx tutorials:
Ray distance measurement method 2A: xNormal built-in cage editor
Ray distance measurement method 2B: External cages
Ray distance measurement method 3: 3DSMAX 9 .SBM exporter
and http://www.poopinmymouth.com/tutorial/normal_workflow.htm
Using the cage is really easy: just extrude it/move the vertices until the cage completely covers the highpoly model.
3) .SBM - I'm using .obj. Does .sbm work basically the same way? I'm new and have just never heard of that extension before.
It's xNormal's native format ( SBM = simple binary mesh ).
xNormal comes with exporters/importers for 3dsmax and Maya.
The SBM format has lots of advantages vs the .OBJ format ( loads faster, uses less RAM, can include vcolors/cage data, has improved vertex normal/smoothing group accuracy, etc... )
I'm having problems using the Dilation filter for PS... I just can't get it to work.
I wasn't able to find details about where I should put the mask and which layer should be selected for the filter to work.
Help would be appreciated.
Two methods:
1. Work with a transparent background. Perform the dilation.
2. Flatten the image. Go to channels. Make RGB+Alpha visible. Select RGB+Alpha. Perform dilation. Note: If your background is not transparent, you must flatten the image or you won't be able to select the RGB+Alpha.
The alpha channel controls which pixels will be dilated: alpha 0 means dilation, alpha >0 means no dilation... so you'll always need RGB+Alpha selected and visible.
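For anyone curious what the filter is doing, that alpha rule can be written out in a few lines. This is a toy sketch of the idea, not the plugin's actual code:

```python
import numpy as np

def dilate_once(rgb, alpha):
    """One edge-padding pass: every pixel with alpha == 0 takes the average
    of its covered 4-neighbours; pixels with alpha > 0 stay untouched."""
    h, w = alpha.shape
    out, a_out = rgb.copy(), alpha.copy()
    for y in range(h):
        for x in range(w):
            if alpha[y, x] > 0:
                continue  # covered texel: no dilation, as described above
            near = [(y + dy, x + dx)
                    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
                    if 0 <= y + dy < h and 0 <= x + dx < w
                    and alpha[y + dy, x + dx] > 0]
            if near:
                out[y, x] = np.mean([rgb[p] for p in near], axis=0)
                a_out[y, x] = 255  # filled texels spread on the next pass
    return out, a_out
```

Each pass grows the padding by one pixel, which is why the covered texels must be marked in the alpha channel before running the filter.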
Hey jogshy, I'm having trouble working out how to even use the .ctm format. Maybe it's just me, but can you explain how to even load them? They don't appear in the file formats. I have 3.16.12 installed, and just installed 3.17.0 Beta 3b without uninstalling the old one :poly122:
OpenCTM is only available in x86 mode. The OpenCTM author does not provide a library for x64 and I have no idea how to build it from the source code :poly136:
Thanks again jogshy! Very strange, it is working now! I have not done anything since this morning when it was not available in the list, and now it appears! I had a .ctm file which I couldn't see, and now I can. Many thanks for sticking with me..... heh
So I've been meaning to bring this up for a while. I'd like to open the discussion back up on averaging the cage normals, as a feature got implemented (which was cool) but it is a much, much slower way of doing the same thing than in Max or Maya.
So first off, let's go over what it takes to get an "averaged" projection cage in xNormal.
you need to:
1. load the 3d viewer
2. turn on the cage editing tools
3. select all of the verts
4. weld all verts
5. expand the cage as you normally would
6. save out your mesh as .sbm (or whichever xNormal format it is, I don't remember)
7. make sure XN is loading the correct mesh
.....
8. repeat the process any time you change your mesh
In max:
1. Set up your RTT; default settings will give you an averaged projection cage. To toggle methods, simply click "use offset" in the RTT options. Essentially a one-click workflow.
In maya:
1. Set up your transfer maps; by default your setting for "match using" should be set to "geometry normals" (averaged projection). To toggle, simply set it to "surface normals". Again, another one-click workflow.
So what this all boils down to is a very complex workflow in xNormal that A) most users do not understand, B) is excessively complicated, and C) needs to be entirely redone every time even a small mesh change is made.
Most models I do don't require a cage. I could do my meshes a little differently and rely more on the cage to fix various issues, but I like to rely on the placement of geometry and smart planning to avoid issues like skewed and intersecting rays. But if I want a nice seamless bake, I'm forced to use this complicated process, when all I really should have to do is click a box, or select from a dropdown in the lowpoly tab, to choose whether I want to use the exporter normals or averaged normals for my projection. And again, the issue of needing to redo the entire process is a HUGE detraction. I think there are a lot of people that do not rely on setting up a cage for every mesh, and requiring them to do so creates a big disadvantage for the XN workflow.
This issue is so important that I personally have stopped using XN for pretty much anything but generating AO, and that is really sad because XN is a great tool and you know how much I support all of the work that you've done. This whole matter has a lot of people coming to the conclusion that using XN for anything but a mesh that has entirely smooth normals is just not a viable solution.
I'm one of those guys - never use XNormal for any hard-surface work, and that's a shame. Both because of what EQ says and because I've had the "Use Imported" smoothing settings ignored one too many times (a bug that may or may not be fixed by now). If it was as lovely to use when it comes to hard-surface stuff as it is with single smoothing group models, one could just fire off two sets of bakes with that single checkbox on or off, then combine the results in PS for perfect hard-surface goodness. Alas...
1. load the 3d viewer
2. turn on the cage editing tools
3. select all of the verts
4. weld all verts
The latest versions use welding by default.
7. make sure XN is loading the correct mesh
Just ask "yes" to the auto-assign dialog.
8. repeat process any time you change your mesh
Yep, that's a problem. Perhaps I should integrate the xN renderer into Maya.
I think there are a lot of people that do not rely on setting up a cage for every mesh, and requiring them to do so creates a big disadvantage for the XN workflow.
Setting up a cage can be as easy as: enter the 3D viewer, extrude the cage with the "global cage extrusion" slider until it covers the highpoly, and save meshes. For simple objects you don't need to move the vertices.
For 3dsmax you should already be using the Projection modifier, so it won't be a problem.
And well... with luck, everybody will use DX11's tessellation so you can use Match UVs (dual parameterization) to skip the cage setup and ray distances.
Even if it is improved in the latest version, you're still missing the point. It's still quite a bit easier to switch methods in the other main baking apps, which puts XN at a disadvantage. Setting up a cage is a really slow process in XN no matter how you look at it. Again, there's the HUGELY important aspect that, whenever you make even a slight change to your mesh, you have to redo the entire process. This results in a very slow workflow, when all I really need is a checkbox in the lowpoly tab to set which method to use, with an offset. When I'm making a change, it's automatically updated and I don't have to worry about anything, which would be very fast.
If your highpoly mesh is just the subdivided lowpoly mesh + sculpting, you don't need cages. Just mark the MatchUV and you'll get a perfect normal map/AO without having to set up the cage/ray distances.
What's this about the DX11 stuff?
Your DX11 engine can then use the lowpoly mesh and subdivide it using the hull/domain shaders + vertex displacement with a direction/normal map to reconstruct the highpoly model in a very easy way.
But you'll need DX11. DX10/DX9 can use instanced tessellation but it's not optimal.
So... DX11 = no need for cages/ray distances.
However there's a problem with this method: you can't retopo, just subdivide... so forget about using this for non-organic meshes. If you subdivide a box... you will get a ... nice sphere :poly124:
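In code, the reconstruction jogshy describes is just a displacement along the interpolated normal. A minimal sketch of that single step, with the subdivision itself left out and all names being illustrative:

```python
import numpy as np

def displace(positions, normals, heights):
    """Reconstruction step only: p' = p + n * h, with h sampled from a
    displacement map baked using Match UVs. The hull/domain-shader
    subdivision is assumed to have produced positions/normals already."""
    return positions + normals * heights[:, None]

# Tiny made-up example: a flat patch pushed up into a bump.
pos = np.array([[0., 0., 0.], [1., 0., 0.], [0., 0., 1.], [1., 0., 1.]])
nrm = np.tile([0., 1., 0.], (4, 1))
print(displace(pos, nrm, np.array([0.0, 0.2, 0.2, 0.0])))
```

This also makes the retopo limitation above obvious: the formula can only push the subdivided lowpoly along its own normals, so it can't reach a surface with different topology.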
Well, all I really need is a checkbox in the lowpoly tab to set which method to use, with an offset.
Sorry, I don't understand. Do you mean a method like Maya's?
Select face normals/smooth normals and a ray distance? You can do that in xNormal too, just by assigning "Smooth normals"/"Harden normals" and specifying the constant ray distances manually... although I bet Maya is using some black magic to smooth the hard edges a bit.
Yeah, but that only accounts for a small fraction of character work; you'll virtually never use that for environment/hard-surface work, and only a very small number of times will you end up with the exact same cage as your low. This is a situation that covers probably about 1% of all models that need baking.
Yep, that's a problem. Perhaps I should integrate the xN renderer into Maya.
While this would be very cool, that doesn't actually fix the problem, it simply *moves* the problem. Anyone not using Maya still has the same issue. I think relying on app-specific fixes is a bad idea, since XN can be a great universal tool.
Really, again: as a user there shouldn't be any need to do anything but simply import your OBJ and select the projection method. Am I missing something here? I don't know how XN is structured and whether that is a really super difficult thing to change or what, so let me know. Having extra tools to go beyond that is all well and good, but the basic functionality is the most important aspect.
Sorry, I don't understand. Do you mean a method like Maya's?
Select face normals/smooth normals and a ray distance? You can do that in xNormal too, just by assigning "Smooth normals"/"Harden normals" and specifying the constant ray distances manually... although I bet Maya is using some black magic to smooth the hard edges a bit.
What I want is a way to average JUST the projection, not the entire mesh. Averaging the smoothing on the entire mesh is really bad, as it causes smoothing errors etc. But in Max/Maya the projection is averaged, so you can have your lowpoly set up with hard edges but not get the missed detail on said edges because of the splitting rays. Changing the normals of the lowpoly itself isn't an option; it must be only on the "projection mesh".
Now you can do this by manually setting up a cage, but it's a very slow process. What you want here is essentially an "invisible" (to the user) cage that is generated automatically using the offset distances and does not require anything more than turning the feature on (a drop-down to select averaged normals, or use exported normals, for *only* the projection). Sure you can go in and set up a custom cage, but the good majority of assets do not need a custom cage, and again there's the problem of redoing the entire process for even simple changes.
This is basically what Maya does: when you add a mesh to your projection, it automatically creates a cage for you based on an input value; in Maya's case it's a %, which for all intents and purposes is the exact same thing to the user as an offset distance. You set the projection method (either averaged or using the mesh's normals) and you're done. After the bake finishes, the "invisible mesh" is deleted, and another is created any time you adjust the offset distance. There is an option to save the invisible "envelope/cage" to tweak as you see fit, but it is not a requirement.
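To make the request concrete, here is one way to sketch such an auto-generated "invisible" cage. This is my own illustration of the idea, not Maya's or xN's actual code: weld the normals by position so hard edges don't split the cage, then push every vertex out by the offset.

```python
import numpy as np
from collections import defaultdict

def averaged_projection_cage(positions, normals, offset):
    """positions/normals: (N, 3) float arrays from the lowpoly vertex
    buffer, where vertices may be split at hard edges; offset: world units."""
    by_pos = defaultdict(list)
    for i, p in enumerate(positions):
        by_pos[tuple(np.round(p, 6))].append(i)   # weld verts by position

    cage_normals = np.empty_like(normals)
    for idxs in by_pos.values():
        n = normals[idxs].sum(axis=0)
        length = np.linalg.norm(n)
        # every split copy gets one shared direction, so the cage has no gaps
        cage_normals[idxs] = n / length if length > 0 else normals[idxs]

    return positions + cage_normals * offset      # the "invisible" cage
```

The lowpoly's own shading normals stay untouched; only the generated cage uses the averaged directions, which is exactly the "average just the projection" behaviour being asked for.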
What I want is a way to average JUST the projection, not the entire mesh.
Ok, I understand now. Thanks for the in-depth explanation.
The latest xN version uses welding by default. You just need to perform something very similar to what you're doing in Maya: enter the 3D viewer, extrude the cage with the "global cage extrusion" slider until it covers the highpoly model, hit save meshes and answer "yes" to the auto-assignment dialog.
This can take some time, but consider that Maya already has all the meshes loaded and is displaying a 3D viewport... while xNormal needs to load everything from zero.
I think this could make the process more agile:
1. I could export existing Maya envelopes as cages when saving to a .SBM file. I've seen that Transfer Maps copies the object and extrudes it, generating an envelope called [objectName]+shape1Envelope. Perhaps I could try to find those envelopes automatically and save them to the SBM.
Another option is to create a script that could act like 3dsmax's Projection modifier.
2. I could weld/break the cage a bit faster: if there are no vertices selected, I can pop a dialog asking to perform the break/weld over the complete cage.... so you won't need to select all the vertices manually.
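A rough Python-in-Maya sketch of both ideas from the list above; the '*Envelope*' search pattern and the offset value are assumptions taken from the posts, not a tested exporter:

```python
# Runs inside Maya. A sketch only: the '*Envelope*' naming pattern is
# assumed from the "[objectName]shape1Envelope" convention mentioned above.
import maya.cmds as cmds

def find_transfer_maps_envelope(mesh):
    """Try to locate an existing Transfer Maps envelope for `mesh`."""
    hits = cmds.ls(mesh + '*Envelope*', type='transform') or []
    return hits[0] if hits else None

def make_cage(mesh, offset=0.5):
    """Fallback: clone the mesh and push its vertices out along their
    normals, roughly what 3dsmax's Projection modifier does."""
    cage = cmds.duplicate(mesh, name=mesh + '_cage')[0]
    cmds.polyMoveVertex(cage, localTranslateZ=offset)  # along vertex normals
    cmds.delete(cage, constructionHistory=True)        # bake the offset down
    return cage
```

Either result could then be exported with the lowpoly (e.g. as an external cage .OBJ) so the SBM exporter or xN can pick it up.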
I'm using the NVIDIA driver 196.21 on Win7 x64 and xNormal 3.16.13. Will I have to upgrade to the latest xNormal for the cage average-weld mode?
I'm not sure. I think the 3.16.13 uses averaged cages by default.
Do a simple test: export a hard-edged mesh into a SBM file with the "Export cage" UNchecked ( well, or from Maya where that option is not present... or use a .OBJ ). Enter the xNormal 3D viewer and extrude the cage. See if the cage breaks.
I can add an option to the Maya SBM exporter in order to save the Transfer Maps envelope. I think Maya just adds a "shape1Envelope" suffix to the original mesh, so I could search for it automatically. You can save those envelopes without problems, can't you? If not, I could create a script to compute the envelopes like 3dsmax's Projection modifier, so the SBM exporter could use that info to export the file.
I can also add a button to weld the cage without having to select all the vertices.
Alright, I could totally be wrong here, but I do use xNormal for my bakes and just ran a few tests. When you specify an envelope in the baking dialogue in Maya, the envelope that gets created does not have averaged normals but draws from the lowpoly normals. Is it possible then that it only gets averaged during the baking process, if you so specify?
EQ? Any light on this?
I also tried manually averaging the envelope/cage and importing it into xNormal, but I still got results that did not have averaged cage normals.
Can the averaged cage only be done at bake time in Max/Maya? Is it not possible to import an "averaged" cage into xNormal?
Is it not possible to import an "averaged" cage into xNormal?
Yep, it is:
1. For Max: use the Projection modifier. Save as SBM with the "Export cage" option enabled.
2. For Maya: clone your lowpoly mesh. Average normals. Extrude some faces/vertices. Save as .OBJ.
In xNormal, assign the external cage file.
3. In xN, using the cage editor: load your mesh, enter the 3D viewer. Show cage->Global cage extrude until it covers the highpoly mesh. Press "save meshes" and answer "yes" to the auto-assign dialog. You can control whether you want the cage broken or averaged by selecting some vertices.
But I agree with EQ... I must find a method to set up the cages in a more agile way.
I think you can save the transfer maps envelopes for later using some tricks ( ctrl+z, clone, etc )
Yeah, that's how I've been preserving the envelopes and importing them into xNormal; the problem is that despite averaging the normals it still doesn't work properly. Do I have to manually "weld" vertices in the xNormal editor despite averaging the normals already in Maya?
1. For Max: use the Projection modifier. Save as SBM with the "Export cage" option enabled.
2. For Maya: clone your lowpoly mesh. Average normals. Extrude some faces/vertices. Save as .OBJ.
In xNormal, assign the external cage file.
3. In xN, using the cage editor: load your mesh, enter the 3D viewer. Show cage->Global cage extrude until it covers the highpoly mesh. Press "save meshes" and answer "yes" to the auto-assign dialog. You can control whether you want the cage broken or averaged by selecting some vertices.
Again, isn't this a redundant process? I already modified my envelope/cage to wrap my highpoly and already averaged the normals. My problem is that despite averaging normals on the cage in Maya, when it gets imported into xNormal the averaged normals seem to get ignored. So again, do I have to manually "weld" it all in Maya for it to work? Normals seem to be preserved in the lowpoly; does it not work when importing the external cage?
I already modified my envelope/cage to wrap my highpoly and already averaged the normals. My problem is that despite averaging normals on the cage in Maya, when it gets imported into xNormal the averaged normals seem to get ignored.
Have you enabled the "use cage" option? Can you see your cage correctly imported if you enter the 3D viewer and show the cage?
If you use cages, the lowpoly mesh's normals aren't used to fire rays. Just the direction vector ( lowpoly.pos - cage.pos ) is taken into consideration. The external cage's vertex normals won't be used.
Have you enabled the "use cage" option? Can you see your cage correctly imported if you enter the 3D viewer and show the cage?
If you use cages, the lowpoly mesh's normals aren't used to fire rays. Just the direction vector ( lowpoly.pos - cage.pos ) is taken into consideration. The external cage's vertex normals won't be used.
Btw... I'm not sure about Maya, but Max seems to use vertex extrusion from the pivot point and not averaged normals... it's very curious :poly142:
OK, I think I may have been confused a little before...
So this is how I understand it, please correct me if I am wrong.
1. When you use ray distance (xNormal) / offset (Max) / surface normals (Maya), it just shoots the rays from the lowpoly mesh's vertex positions along its normals.
2. A cage is used to shoot the rays in the direction of the lowpoly relative to the cage.
So what is this "averaged projection" thing that EQ is talking about? Do you basically want to use a cage in xNormal? I just export mine out of Maya. Is this what you are looking for?
1. When you use ray distance (xNormal) / offset (Max) / envelope % (Maya), it just shoots the rays from the lowpoly mesh's vertex positions along its normals.
Well, I can really only tell you about xN.
Nope. xN has four methods to fire rays:
1. If you use cages, rays are fired from the cage position following a direction equal to normalize(lowpoly.pos-cage.pos). The vertex normal is not used by rays ( it will be used just to transform to tangent space ).
You can subdivide this into two more: an averaged cage or a "broken" one. The averaged one generates a continuous cage surface (useful for organic models) while the broken one is discontinuous (useful for hard-edged meshes).
2. If you use uniform ray distances, rays are fired from lowpoly.position+(lowpoly.normal*distance) using a direction equal to negate(lowpoly.normal).
3. If you use MatchUVs, you don't need to set up cages or play with ray distances... BUT your highpoly mesh must be the lowpoly mesh + subdivision + sculpting. DX11 games are gonna use this massively with vector/displacement mapping and tessellation.
4. Ray blockers. Like "anti-portals". Rays simply stop when they hit a blocker.
Each method has some pros and cons. Just choose the one that's better for each case, your pipeline and your work method.
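Transcribed into a small sketch, methods 1 and 2 look like this (the same formulas as in the list above, nothing more):

```python
import numpy as np

def cage_ray(lowpoly_pos, cage_pos):
    """Method 1: origin = cage position,
    direction = normalize(lowpoly.pos - cage.pos); vertex normals unused."""
    d = lowpoly_pos - cage_pos
    return cage_pos, d / np.linalg.norm(d)

def uniform_ray(lowpoly_pos, lowpoly_normal, distance):
    """Method 2: origin = lowpoly.pos + lowpoly.normal * distance,
    direction = negate(lowpoly.normal)."""
    return lowpoly_pos + lowpoly_normal * distance, -lowpoly_normal
```

This makes the difference plain: with a cage, moving a cage vertex changes both the ray origin and its direction, while the uniform-distance method is locked to the lowpoly's vertex normals.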
Is there a way to separate each highpoly mesh to render in conjunction with its lowpoly counterpart? Right now I'll export each individual low/high mesh and have to hide/unhide pieces as I go and compile each map in PS.
The batch protection doesn't seem to split the highpoly meshes, so any intersecting normals pass on to other parts in the render order.
In xN, how do I control whether an imported external cage is averaged or hard-edged?
If you use 3dsmax you can put a Projection modifier on the top of the stack and export as .SBM with the "Export cage" option enabled. That will use an averaged cage.
If you use external cages or the xN cage editor, enter the 3D viewer. Enable show cage->edit cage. Select some vertices (or all) and press "weld" to average it or "break" to detach the cage faces using hard edges.
Btw, for the next release I'm gonna solve some problems I detected yesterday playing with the averaging algorithm, and make it a bit easier to use.
For xn4 I'll put an option to render a map for each object.
So is there any chance to get good results with multiple objects at the same time, without Photoshop and in less time? I've a ZBrush file with 20 subtools and 20 retopologized lowpolys...
I've tested many things - but every time I get overlapping object seams.
Any solution for xNormal, or a trick in ZBrush to switch between two positions to get clean normal / AO maps?
Hello,
Got a problem using the Optix renderer, xNormal V3.17.0.3:
Invalid context (Details: Function "rtContextLaunch2D" caught exception: Encountered a CUDA error: cuGLRegisterBufferObject(buffer->getGLId()) returned (3): Not initialized [458802])
Cadavr: You can't link files from your hard drive, you have to upload it to some sort of hosting service (such as a private FTP if you have one, or a free site like photobucket, dropbox or imageshack).
Hello,
Got a problem using the Optix renderer, xNormal V3.17.0.3:
Invalid context (Details: Function "rtContextLaunch2D" caught exception: Encountered a CUDA error: cuGLRegisterBufferObject(buffer->getGLId()) returned (3): Not initialized [458802])
Quadro FX 1800 on board and 191.87 drivers
You need the 195.62 or the new 197.13 beta. Optix doesn't work with the 196.XX either.
If you use WinXP and you have only one graphics card, you won't be able to use Optix due to the WinXP "watchdog". With Vista/7 you can.
I came into this discussion halfway through and might be way off the mark, but I've noticed that no matter what I do in xNormal there are split normals in the cage at UV seams. The exported cage has averaged normals; I tried using exported normals and forcing "smooth normals" in xN - neither seemed to average the UV seams. I think this is part of the same group of problems.
I've tried exporting as, like, obj, fbx, and a couple of others... but I don't think I've tried SBM yet. Is this more reliable when it comes to loading the correct normals?
I hear you can open the 3D viewer and force a weld on the cage to solve the issue, but as EQ said... every time?
There are three ways to set up a cage:
1. Use the xNormal built-in cage editor ( enter the 3D viewer, edit the cage and press the Save meshes button ).
or
2. Place a Projection modifier in 3dsmax and save as .SBM with the "Export cage" option enabled.
or
3. Use an external cage file ( like a .OBJ ).
http://www.incgamers.com/News/21293/nvidia-19675-kills-video-cards
http://www.brightsideofnews.com/news/2010/3/4/nvidia-retracts-whql-certified-gpu-killing-drivers.aspx
The 196.34/196.21 break Optix...
The 195.62 are old but seem to work OK.
So... I gave the .ctm format a go in x86 with no luck...
Windows 7
xnormal_3.17.0_beta_3b_installer
I uninstalled all older versions, even reverted my drivers back to the latest good one, 195.62.
Any ideas?
Can you see a file called [xN installation path]\x86\plugins\MeshImporter_CTM.dll ? It should be there.
Enter xNormal and go to the plugin manager->Mesh importers. Search for the OpenCTM Mesh Importer.
If you can't find it, uninstall xNormal and completely erase the install folder. Re-install xNormal 3.17.0 b3b and try again.
The newest drivers for the Quadro FX 1800 are 191.87.
I have Win7.
I'm having some problems baking normals;
this image will illustrate my problem. What did I do wrong?