If you add/insert any vertex or face in the external cage, it will show the mismatch message. The topology of the external cage must exactly match the lowpoly mesh's.
I shouldn't load the external cage if you haven't checked the "Use cage" option, though... I'll fix it.
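To make that topology rule concrete, here's a minimal sketch of such a check (the data layout is an assumption for illustration, not xNormal's internal format):

```python
# Hypothetical sketch: a cage is valid only if vertex count and face indices
# match the lowpoly exactly; vertex *positions* may differ (that's the push),
# topology may not.
def cage_matches_lowpoly(low_faces, low_vert_count, cage_faces, cage_vert_count):
    return low_vert_count == cage_vert_count and low_faces == cage_faces

low = [(0, 1, 2), (2, 1, 3)]
print(cage_matches_lowpoly(low, 4, low, 4))                # True
print(cage_matches_lowpoly(low, 4, low + [(3, 1, 4)], 5))  # False: face/vertex added
```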
3.16.13 works for me. How much memory do you have free? Please do a CTRL+ALT+DEL and check the available free memory. If it's less than 256MB it simply won't run... Also make sure your graphics drivers are updated and that you're running the final 3.16.13 and not a Release Candidate/Beta.
Btw, I write a txt log with the errors to My Documents\xNormal\xNormal_debugLog.txt if you want to take a look.
Sorry, I'm really not clued up on PC tech. I pressed ctrl+alt+delete and it says mem usage for xNormal is about 90,000K? Not sure if that's what you mean? Commit Charge 943M/3049M? I am certain I have the official 3.16.13 download. As for the graphics driver, it's the same one I got from the boot disc, so I guess it's dated, but I'm having trouble finding an update that works. I'm led to here:
http://support.amd.com/us/gpudownload/windows/Legacy/Pages/radeonaiw_xp.aspx?type=2.4.1&product=2.4.1.3.18&lang=English
but I run that and it crashes during install.
When I bake my normals, two pieces of the mesh have an artifact in the normal map, which looks like low poly geom clipping through it. The hi res mesh does not have any intersecting geom like this, since it's exploded for baking. I've tried unhiding all geom, still nothing. After trying everything I could to find hidden geom, I tried baking it out in 3dsmax, and it's perfectly fine - no intersecting geom. Any ideas what could cause this?
In 3dsmax I'm using the default "10.0" offset for the bake. In xNormal, I used the default 0.5, but I also tried baking at 1, 5 and 10, all with the same results (it gets worse the higher you go).
**EDIT** I'm a moron - got it solved. Those parts were inset into the mesh, so I had to use a higher rear ray distance...DURRRRR!
Still crashing for me. Does this setup seem normal? The RAM bar is only 50% full, maybe it needs more?
What's the value shown when you mouse over the RAM bar, please? If you have less than 256MB free you'll have problems. What's the polycount on your highpoly mesh? Can you run any of xNormal's examples successfully?
It's usually 350 to 550 on the RAM so it should be fine? The polycount on the HP is 180k. Yep, I loaded the African head example and was able to mess with parameters etc. I tried my mesh again but it still wouldn't work; it seems to load a few full bars but then just crashes.
I'm new to the normal baking stuff, so forgive me if this is a trivial problem I'm having trouble with. (I'm working in 3ds Max 2010.)
I was trying to bake normals to a simple cube:
http://img266.imageshack.us/img266/8881/screenshot011v.jpg
I've unwrapped the right one like this:
http://img209.imageshack.us/img209/4632/dasdf.jpg
I exported both to OBJ using default settings, but xNormal renders a map that looks like this:
http://img266.imageshack.us/img266/2229/sdfnormalss.jpg
As I've said, I'm a total newbie to this stuff and I don't know where to start looking for bugs. I'm testing these cubes because I had problems with more complicated objects, so I wanted to see how it works with some simple stuff.
Thanks for the help!
Regards.
Hmmm... strange.
Are you using 3dsmax's old max2obj exporter, or the gw:obj one?
Is the cube strictly mapped into the [0,1] UV range, or are you applying tiling/wrapping? It looks like either your normals aren't exported well or there's a problem with the smoothing groups.
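One quick way to rule out the export itself is to count the records the OBJ actually contains; `vn` lines are the exported normals and `vt` lines the UVs (the file name here is a placeholder):

```python
# Count the record types in an exported OBJ; "cube.obj" is a placeholder path.
counts = {"v": 0, "vn": 0, "vt": 0, "f": 0}
with open("cube.obj") as obj:
    for line in obj:
        tag = line.split(None, 1)[0] if line.strip() else ""
        if tag in counts:
            counts[tag] += 1
print(counts)  # zero "vn" records would mean the exporter wrote no normals at all
```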
hey jogshy,
It's usually 350 to 550 on the RAM so it should be fine?
Yep.
Btw... three questions:
1. Are you sure your laptop is completely stable? It wouldn't be the first time I've seen a laptop cause strange memory problems due to temperature. Have you passed an OCCT/Memtest for Windows and a LinX run?
2. Please tell me you aren't using that thing called vLite, nor compressing the Windows\WinSxS folder...
3. Have you tried reinstalling xNormal from scratch? This really sounds like a problem with your OS or with .NET.
Eh... as usual, the problem lies between the chair and the screen. It seems I didn't understand how exactly xNormal works until I saw one of the tutorials from your site.
I forgot that when you export selected, the root coordinate system is still the global one taken from the scene, not the local object one. Because of that, you MUST align the low/high objects as closely as possible.
Sorry for flooding the thread.
Maybe there's a contest for the most trivial fail, because I'm certainly a first-place candidate.
I've never done proper normal mapping before, but I'm trying to learn it. I've been trying xNormal some, but I'm doing something wrong and I can't really figure it out.
This is what I get when baking the normal map with xN:
http://dl.dropbox.com/u/1889765/3d/xn%20problem/xn.jpg
Normal map:
http://dl.dropbox.com/u/1889765/3d/xn%20problem/xn_map.jpg
I tried with Render To Texture in Max and it looks much better:
http://dl.dropbox.com/u/1889765/3d/xn%20problem/rendertotexture.jpg
Normal map:
http://dl.dropbox.com/u/1889765/3d/xn%20problem/r2t_map.jpg
Here is the Max scene:
http://dl.dropbox.com/u/1889765/3d/xn%20problem/butter%20knife.max
It's quite possible I'm doing something fundamentally wrong and it has nothing to do with xN >.> I've been trying to follow advice from http://www.svartberg.com/tutorials/article_normalmaps/normalmaps.html and others, but it's not helping.
Thanks
So I tried using xNormal for the first time today on a pretty high-res mesh, and I'm really pleased with the quality of the resulting map - very good, and very quick.
However, I noticed that when I bring the map into my shader in Max, the normal map is offset slightly so the detail doesn't match the diffuse texture. I haven't altered any settings beyond the defaults, so I guess I'm missing something pretty elementary?
Original UV layout in Max:
And the resulting map as generated by xNormal:
It's not offset by much, but it's enough to throw off the detail alignment. Any suggestions are appreciated!
Don't worry about that, it's perfectly normal. It's the "dilation filter" (aka edge padding), which is used to avoid artifacts near the UV seams and with mipmapping.
You can disable it to see how beneficial it is: set the edge padding to 0... and you'll notice black lines appear when you move the object some distance from the camera...
Try to keep it always at a minimum of
log2 ( max(texture.width, texture.height) )
For example, for a 2048x1024 texture a good value is:
log2 ( max(2048,1024) ) = 11
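As a quick sketch of that rule (assuming padding is measured in whole pixels):

```python
import math

def min_edge_padding(width: int, height: int) -> int:
    # The log2(max(w, h)) rule from the post above, rounded up to whole pixels.
    return math.ceil(math.log2(max(width, height)))

print(min_edge_padding(2048, 1024))  # -> 11
print(min_edge_padding(512, 512))    # -> 9
```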
Thought it would be easier with a picture...
*Tested everything in Max with no errors, then saved the files with "save selected" in .sbm format. So they were all individual meshes, though they had not been exploded for this reason.
In light of this I'm guessing xNormal groups all the geometry together to produce a final map instead of masking then layering the output from individual meshes.
EDIT: got it working; this exploding thing is a pain in the ass though. Still, I found out I could select all the low/high poly parts and export the selection as one file instead of all those^^ That saves some time.
In light of this I'm guessing xNormal groups all the geometry together to produce a final map instead of masking then layering the output from individual meshes.
Correct.
Btw, if you have overlapping UVs and you don't want those pixels rendered over the previous result, you can use the "Batch protection" option.
For xn4 I'll put an option to render a map for each object.
Ahh thanks, that's a really handy feature, wasn't aware of that. Your app saves so many headaches, it's high time I donated something :thumbup:
I also use it to generate object space normals for masks.
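To picture the masking-then-layering idea from the posts above, here's a toy sketch (real bakers work on pixel buffers; this overlap is exactly what "Batch protection" guards against):

```python
# Each per-object bake is modeled as {texel: value}; later bakes overwrite
# earlier ones wherever their coverage overlaps.
def layer_bakes(bakes):
    out = {}
    for bake in bakes:
        out.update(bake)  # last writer wins on overlapping texels
    return out

obj_a = {(0, 0): "A", (1, 0): "A"}
obj_b = {(1, 0): "B"}                 # overlaps obj_a at texel (1, 0)
print(layer_bakes([obj_a, obj_b]))    # {(0, 0): 'A', (1, 0): 'B'}
```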
Hi, jogshy
First of all I would just like to thank you for creating such an amazing tool.
I mean, being able to load huge models for baking as hassle-free and with such speed and ease as this software allows is just a dream.
And now to my question.
Now that ZBrush 3.5 exports polypaint vertex colors in its OBJ format, are there any plans to support this in xNormal?
It would be such a great feature if there were an option in the bake base texture slot where you could bake any vertex colors onto your low-poly.
In TopoGun there is a similar feature, but since you'd probably be unable to load your highest detail level it's sort of just halfway there. http://www.cgbootcamp.com/tutorials/2009/12/4/topogun-zbrush-polypaint-transfer.html
I know you can export your polypaint as a texture and load it in xNormal, but for obvious reasons this is not the best solution, and it's also fairly tedious.
Anyway, thanks for an amazing tool. Keep up the good work!
Having found out that ZBrush 3.5 does indeed export vertex colors, I can only second bilbana's request for such a mighty feature... It would complete the ZBrush workflow in a way that I think a lot of people have dreamed about for a long time, and thereby kick all kinds of ass, dear sir! I can only offer a long-overdue PayPal donation to help ignite your inspiration.
Would be great to hear your views on this and any roadmap to possible implementation.
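For anyone curious what such a bake would involve, here's a minimal sketch of the idea (a nearest-vertex color lookup; an illustration of the request, not an existing xNormal feature):

```python
# For each point sampled on the lowpoly surface, fetch the polypaint color of
# the nearest highpoly vertex. Brute force; a real baker would use a KD-tree.
def nearest_vertex_color(point, hp_verts, hp_colors):
    best = min(range(len(hp_verts)),
               key=lambda i: sum((p - v) ** 2 for p, v in zip(point, hp_verts[i])))
    return hp_colors[best]

verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
colors = [(255, 0, 0), (0, 0, 255)]
print(nearest_vertex_color((0.9, 0.1, 0.0), verts, colors))  # -> (0, 0, 255)
```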
I'm curious, jogshy: is it possible to render the AO a lowpoly casts on itself using xNormal, or does it only do projection bakes?
Yep, it is. Just specify the lowpoly mesh as the highpoly mesh.
Cheers jogshy,
I gave it a try, and I had two problems:
1. The intersection didn't line up. (I couldn't use 0 ray distance, so instead I used the same mesh as the cage with hardened normals.)
2. The AO didn't seem to work double-sided (the white squares surrounded by shadows should be black). I tried duplicating/flipping the highpoly source, but that only created artifacts (such as AO in the corners).
Awesome, thank you.
You are the man!
I ticked off the "ignore backface hits" option, and it added AO that shouldn't be in the corners. (See image below.)
Sorry for my ignorance, but is it possible to do something like this in xNormal? Having the ray hit only the matching ID?
At the moment I do my baking inside 3ds Max on one mesh, without using a cage. I just set an offset value and some IDs to avoid intersections, and I'm done.
Nope. I plan to support that in xn4 though.
A question: have you tried using cages? You should, if you're using the lowpoly as the collider.
Yeah, like I said, I use the same mesh as the cage (can't use 0 ray distance without cages).
Using a cage with a push, though, won't give good results, as it will have intersection issues (the same reason exploding meshes away from each other is popular for normal map bakes).
This is also why I've chosen two cubes intersecting with each other for testing purposes.
I have a question to ask.
I've been using ZBrush 3.5 R3 for sculpting lately. I recently did an experiment making some high poly rocks.
So I made a basic box with a SubD on it, reset its XForm and its pivot, and placed it at 0,0,0 before export.
I then sculpted out said rock. After feeling comfortable, I took my rock down from SubD 6 to SubD 3 and used ZBrush's decimation tool to get it to a low poly count.
I then took the low poly rock and unwrapped it in 3ds Max, where my model still had its pivot at 0,0,0.
I then exported the newly UV-unwrapped low poly object plus the highest SubD of the rock from ZBrush.
I then imported them into xNormal and they wouldn't bake.
Thinking it was an issue with them lining up, I decided to take the high poly rock back into Max and see if they were offset from one another.
I saw that both rocks were perfectly in sync with each other, but their pivots were slightly off between the two of them (the high poly was SLIGHTLY to the right).
I reset both of their pivots, and it baked easy peasy.
I thought it was a weird ZBrush issue with the resetting of the pivot (that part I understand),
but in the past I've had xNormal bake shapes whose pivots were COMPLETELY off, as long as the high and the low poly matched up mesh-wise.
I thought it was just a weird coincidence with this particular mesh, until I created 5 new rocks and had to go through the above process with each one.
So my question to you is two things:
1. Does xNormal now require all the meshes it bakes from to have the exact same pivots?
2. Is there a way to turn this off, if it's a new feature I'm unaware of?
I am also using your latest release, 3.16.12.
Also, one other question/suggestion:
In the future, will there be a way to export high poly models that have parts color-coded by materials in Max/Maya etc. (to get a basic color scheme down)? That is, will there be an ability to export with materials (not textures) applied to the high poly, to bake a base diffuse for the low? (I know you have an option to bake the texture from ZBrush high poly exports down to the low, but I wasn't sure about materials from Max/Maya etc.)
Thank you so much for all your efforts and your amazing program. I couldn't imagine creating anything next-gen without it!
I remember that there were some interesting elements of your CUDA implementation, particularly with geometry handling. You had mentioned that 512MB of graphics memory was enough to load and bake a 10-million-poly model. When using the CPU version, 9 million polys use 1.9GB of memory during baking.
Is the GPU's handling of geometry that much more efficient?
Yeah, like I said, I use the same mesh as the cage (can't use 0 ray distance without cages).
Perhaps you should remove that T-junction by creating shared vertices.
What spread angle are you using? Perhaps you should also increase it to 179.5 or decrease it a bit... but I really don't know why those white lines appear.
Some interesting things have been happening regarding GPU acceleration. Some people over at Luxrender have been experimenting with OpenCL rendering:
http://www.luxrender.net/forum/viewtopic.php?f=21&t=2947&p=28569&hilit=opencl#p28569
I'm experimenting with CUDA and OpenCL too. The main problem is that OpenCL drivers are still very immature and barely optimized. Also, older cards like the ATI 3xxx series support only a reduced OpenCL feature set.
CUDA is very good... but it only works on NVIDIA cards... and I especially don't like having to write the same code 18 times, once for each hardware vendor... I'd much rather code it once and run it on multiple hardware and platforms.
Another problem is the small amount of VRAM (256 or 512MB usually... very dense geometry can't be stored there).
For 2D processing (hm2nm, etc.) it's very good though.
You had mentioned that 512MB of graphics memory was enough to load and bake a 10-million-poly model. When using the CPU version, 9 million polys use 1.9GB of memory during baking.
Well, the system memory used includes a lot of extra data (like the enormous output image buffer). The GPU data is more compact, and it also shares some parts of the CPU-side data over PCI Express... so yes, 512MB of VRAM should be enough for a 10M mesh if you share some data with the CPU.
Ideally, I would like to have all the meshes loaded into VRAM... but that's impossible because there are no cheap mid-range GPUs with the 2-4GB of memory required to store a big mesh plus the acceleration structures.
Currently I'm experimenting a lot with GPGPU... but we still have a lot of problems to solve.
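A back-of-envelope check of those numbers, with assumed data layouts rather than xNormal's actual internals:

```python
# Assumed compact GPU layout: 32-bit floats for position+normal, 32-bit indices.
tris = 10_000_000
verts = tris // 2                       # rough vertex count for a closed tri mesh
vertex_bytes = verts * (3 * 4 + 3 * 4)  # position + normal
index_bytes = tris * 3 * 4
print((vertex_bytes + index_bytes) / 2**20)  # ~229 MB: fits 512 MB with room for a BVH

# One reason the CPU side balloons: a float RGBA output buffer alone is huge.
print(4096 * 4096 * 4 * 4 / 2**20)           # 256 MB for a single 4K float image
```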
1. Does xNormal now require all the meshes it bakes from to have the exact same pivots?
Nope. The lowpoly and highpoly can have different pivot points.
Btw... are you using the SBM exporter or exporting as .OBJ? Try the SBM.
If that doesn't solve it, or you need the meshes as .OBJ, make sure you use the same precision when you export (the OBJ format is text-based... it writes the numbers with a certain number of decimals... if your meshes have a very small radius, suppressing a decimal can have terrible consequences).
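A tiny demonstration of that precision point (the decimal counts are assumptions; different exporters write different precision):

```python
# OBJ stores vertices as text, so the exporter's decimal count quantizes them.
x = 0.00012345  # a coordinate on a mesh with a very small radius
print(f"v {x:.4f} 0 0")  # -> "v 0.0001 0 0": ~19% error, enough to misalign a bake
print(f"v {x:.6f} 0 0")  # -> "v 0.000123 0 0": much closer to the true value
```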
When exporting from 3dsmax, make sure you triangulate them (Edit Mesh on top of the stack), then Collapse and apply a ResetXForm. Collapse again into an Edit Mesh, and then export the highpoly and lowpoly meshes in the same session.
Be careful with ZBrush when exporting... if you zoom or pan the viewport, the program can alter the mesh! Better to open the ZTL/OBJ file and export immediately, without moving the mesh or the camera!
If nothing solves it... could you compress the files and send them to me, please, so I can debug them in depth and see what's causing the misalignment?
In the future, will there be a way to export high poly models that have parts color-coded by materials in Max/Maya etc. (to get a basic color scheme down)? That is, will there be an ability to export with materials (not textures) applied to the high poly, to bake a base diffuse for the low?
Yep. xn4 will support MultiSubObj parts and also multi-file-baking.
In xn3 you must assign a texture to the whole mesh. xn4 will respect the mesh's polygroups/face clusters when you import it... so it will be possible to assign a material to each polygroup independently.
I'm having a problem with a cylinder-like shape. Have any of you guys found a way to deal with a 90-degree angle and a circular curve?
When I'm baking with "use exported normals" I'm getting the usual ugly seams,
and while using average normals the seam is gone, but other artifacts show up:
The low poly mesh is pretty basic:
EDIT:
I've checked the cage in the 3D viewer, and for most of the model it's intersecting the highpoly. Now, I know that's no good, but there seems to be no distinction between the places mapped correctly and the ones generating errors. Plus, when I altered the cage so that it covers the whole highpoly and ran the bake again, the same errors popped up.
The seams go as shown in the picture above.
It's for sure something I'm doing wrong, but I have no idea where the mistake lies.
Also, normal mapping could require a bit of beveling there, as shown here:
http://www.poopinmymouth.com/tutorial/normal_workflow.htm
(see the "one smoothing group vs two" figure).
I've figured out how to fix that, thanks for the reply though.
I used the "use exported normals" option. The problem was actually the UV mapping: I hadn't separated the polygons at the 90-degree angles. When I moved them apart, giving them a little space, the dark seams along the long vertical edge disappeared (after cage tweaking, of course).
Here's the result:
I'm still a beginner when it comes to normal mapping. Going with the "average normals" option required a lot of cage tweaking, and even then I couldn't completely get rid of those ugly stains on the curved area.
As for the beveling, I didn't want to go that way, to keep the model as low poly as possible. I think the results are not that bad.
I have a question regarding the dark lines shown in the picture above. Is there a way to get rid of them? I don't think they're a cage mistake, as they disappear completely if you shift the light source. My guess is that this is the result of the lowpoly mesh there.
Plus, is there any way to make that edge a perfect curve without adding more geometry? I mean, can normal mapping do that? (Forgive me if that's a stupid question.)
What I had in mind was the highlight curve in the middle of that stripe, not the silhouette. There are two highlight curves there; the first one is smooth and nice, but the second one is edgy.
On the other hand, the surface on the first one is at a different angle, and because of that it allows the normal map to create the illusion of a smooth curve.
Just wanna throw this out there... I seem to be able to overcome these types of problems by turning edges (3ds Max). Even with 90-degree angles the shading looks correct, provided you have enough geometry to allow surrounding edges to flow together or be separated, whatever's required...
...BUT... a question specific to this problem: seeing as the side of this object is flat, and this rounded seam appears to be a separate object with isolated UVs? Does this mean that, if it's not connected to the rest of the mesh by vertices or UVs, less tangency/curvature will occur? And I'm not talking just xN here, but any software.
Perhaps you should remove that T-junction by creating shared vertices.
What spread angle are you using? Perhaps you should also increase it to 179.5 or decrease it a bit... but I really don't know why those white lines appear.
Jogshy, I have verified that the white lines issue is due to the cage projection.
Your program bakes between objects, but not an object by itself, which is why I asked this at first.
The cubes are two separate meshes, and I've made them intersect on purpose.
You'll always have meshes intersecting with each other, unlike when baking AO from a highpoly, where it's best to separate them.
For example, in 3dsmax, if you use the "render to texture" dialogue without a highpoly source, it actually works differently: it captures the pixel data directly from the mesh (without projection)... which is the only way to properly bake the AO an object casts on itself.
I was hoping xNormal could do it, so I could kill two birds with one stone.
I'm afraid that to get the desired result you must collapse both cubes into one object and solve the T-junctions. If not, the cage projection will overlap, causing a lot of artifacts.
Speaking visually, do this:
Btw, if you don't extrude the cage, it won't work (I use a small epsilon value to avoid self-hits).
I'm not completely sure, but you probably should use an averaged cage instead of an unwelded one.
Then assign the lowpoly mesh as the highpoly and render (make sure you enable the "use cages" option). The per-pixel AO should now work without those white lines.
If none of this works, you could perhaps try subdividing (tessellating, really) the meshes and using the Simple GPU AO tool to render per-vertex AO.
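For intuition about that epsilon, here's a minimal sketch of hemisphere AO sampling; the epsilon value, the trace callback, and the sampling scheme are all illustrative assumptions, not xNormal's internals:

```python
import math, random

EPSILON = 1e-4  # assumed; just "small relative to the mesh"

def hemisphere_dir(normal):
    # Rejection-sample a unit direction on the hemisphere around `normal`.
    while True:
        d = [random.uniform(-1.0, 1.0) for _ in range(3)]
        length = math.sqrt(sum(c * c for c in d))
        if 0.0 < length <= 1.0:
            d = [c / length for c in d]
            if sum(a * b for a, b in zip(d, normal)) > 0.0:
                return d

def ambient_occlusion(point, normal, trace, rays=64):
    # Rays start EPSILON along the normal so the surface can't occlude itself
    # at the origin (the "self hits" mentioned above).
    start = [p + EPSILON * n for p, n in zip(point, normal)]
    hits = sum(1 for _ in range(rays) if trace(start, hemisphere_dir(normal)))
    return 1.0 - hits / rays

# Toy usage: a tracer that never hits anything -> fully unoccluded.
print(ambient_occlusion((0, 0, 0), (0, 0, 1), lambda o, d: False))  # -> 1.0
```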
Well, merging won't really solve anything; this is just a test model. Imagine: I work on a daily basis with 6k+ meshes with hundreds of different intersection areas.
There's only one way to do it properly, and that is to bake an object by itself, without projection.
Most packages (3dsmax, maya, modo, etc.) can do both types of baking; however, I find your program superior at projection baking (which is the more complex kind).
I would have used modo, but your AO engine is light-years more advanced.
But T-junctions in the lowpoly meshes are bad because they cause tons of Z-fighting problems, aren't they?
What happens if you UNcheck the "use cages" option and set the forward/back ray distances to a very small value (like 0.001)?
I get the same results regardless of the value I put in (though extreme values obviously give bad results).
From my tests I'd say these kinds of maps just can't be baked properly in projection mode.
Fortunately, I think I can use modo for self-AO bakes. I gotta say thanks for your time trying to sort this out, jogshy.
- Vertex colors support (ZB/polypaint, Max, Maya) + render vertex colors map
- Support for OpenCTM compressed meshes.
- 10% speed increase courtesy of VS2008 + some optimizations.