Hi everyone! I'm having a problem with xNormal bakes. Whenever I bake on my computer at home, the bakes don't come out correctly. When I go to the school computers and bake, it works great.
NapoleonX0X
You have at least 7 UV islands that are inverted and will bake inside out on your UV map.
Hit the left or right UV flip keys on the keyboard to flip them.
And your UV coverage is horribad; so much wasted space.
Run Optimise UV Islands for 1-2 minutes in UVLayout so that the islands match in size.
And use a cage and set up proper smoothing groups, etc.
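A quick way to spot inverted islands outside your UV tool is to check the signed area of each island's triangles: a negative total means the island is wound backwards and will bake inside out. A minimal sketch, not tied to any particular package; the `island` input is just a list of UV-space triangles:

```python
def signed_area(tri):
    """Signed area of a UV triangle: negative if wound clockwise (i.e. flipped)."""
    (ax, ay), (bx, by), (cx, cy) = tri
    return 0.5 * ((bx - ax) * (cy - ay) - (cx - ax) * (by - ay))

def island_is_flipped(island):
    """An island (list of UV triangles) is inverted if its total signed area is negative."""
    return sum(signed_area(t) for t in island) < 0

def flip_island_u(island):
    """Mirror an island across u = 0.5 to un-invert it."""
    return [[(1.0 - u, v) for (u, v) in tri] for tri in island]
```

This is essentially what a UV editor's flip key does; mirroring on either axis negates the signed area.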
Hey guys - any trick to using more than one high poly mesh at once in XNormal? I'm trying to use two, but it looks like only one is baking out details, and the other just bakes out a flat gray color. Both are exported from Blender as FBX files, with smoothing on.
Also, the colors look way off to me. I reset the defaults in XNormal thinking I had changed something on accident, but no luck. Any ideas?
There's no trick to using multiple high-poly pieces; it should just work. Have you verified in xNormal's 3D viewer that your meshes have imported correctly?
Thanks for the suggestion. I tried that, and it throws an error:
"The highpoly mesh [...] is too big ( too many vertices, max allowed by your graphics card is 16777215 and contains 32437763 at the moment, so can't be loaded into VRAM). Please, disable this mesh in the xNormal mesh list table or try the OpenGL, DX10 or the realtime raytracing graphics driver to solve it."
I'm not seeing options for any of those, though...
I'm using a GTX970 with 4GB RAM, if that matters.
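As an aside, that limit of 16,777,215 is 2^24 − 1, which suggests the DX9 viewer uses a 24-bit vertex index, so a mesh only needs splitting into enough pieces to get each piece under that count. A rough sketch (my own helper names; `pieces_needed` ignores the vertices that get duplicated along the cuts):

```python
# 16,777,215 = 2**24 - 1: the error message suggests a 24-bit vertex index in the DX9 viewer
MAX_VERTS = 2**24 - 1

def count_obj_vertices(path):
    """Count vertex ('v ') records in a Wavefront OBJ file."""
    with open(path) as f:
        return sum(1 for line in f if line.startswith("v "))

def pieces_needed(vert_count, max_verts=MAX_VERTS):
    """Minimum number of pieces a mesh must be split into to fit under the limit."""
    return -(-vert_count // max_verts)  # ceiling division
```

By this math the 32,437,763-vertex mesh above needs at least two pieces for the viewer; switching drivers or decimating avoids the split entirely.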
Try decimating your high poly mesh and see if that helps.
PlateCaptain > Just use the latest xN 3.19.2, which uses DX10 by default.
For older versions, please go to the plug-in manager ( the small yellowish icon on the bottom-left, near the close button ) and change xN's default graphics driver from DX9 to DX10 ( or OpenGL ).
About your problem with the 32M mesh ( !!! :poly122: ), try another format like .OBJ and see if that fixes the problem.
I couldn't get very good results with decimation, so I ended up splitting the high res model into four different pieces, and now they actually load in XNormal. DX10 also seems to have helped.
However, now I'm getting some weirdness in the bake. Here's a close up of part of it:
Those aren't at all the details it should be showing, and the colors also seem off to me. I don't have a very good grip on the 3D Viewer, but it seems to be loading the meshes fine, and it's using a custom cage file.
I wouldn't doubt I'm doing something wrong again, but I'm not sure what it is.
I did another test this morning, where I kept everything the same except for the following:
1) Removed the armature from the low poly mesh before exporting to FBX.
2) Removed the placeholder material from the cage mesh (just a solid color so I could see it more easily) before exporting to FBX.
This time, the colors look more like my first image above, except that all meshes are now baking.
It's kind of hard to see in that image, but there are basically only two colors in the bake, a pink and a bluish-gray, and there are no transitions between the two colors, just abrupt edges. Anything that might be causing that?
Okay, getting closer: I found I can bake an Object Space normal map just fine. Even if I take that map and use xNormal's converter tool, however, I get the weird normal map.
Any ideas? I feel like I'm messing up something really simple/obvious.
Edit: Just for kicks, I tried converting the Object Space map one more time, and it suddenly worked. The only thing I may have done differently is use an OBJ instead of the original FBX for the low poly mesh - should that matter?
In any case, it finally works, so I'm not asking any more questions. Thanks for the earlier suggestions, guys.
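For anyone hitting the same wall: converting an object-space map to tangent space just re-expresses each texel's normal in the low poly's per-pixel tangent basis, so the converter needs exactly the same low-poly mesh (UVs and tangents included) as the bake, and a mismatch between the FBX and OBJ exports could plausibly explain the difference. A minimal sketch of the per-texel math (the function names are mine, not xNormal's):

```python
def object_to_tangent(n_os, tangent, bitangent, normal):
    """Express an object-space unit normal in the surface's TBN basis."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    return (dot(n_os, tangent), dot(n_os, bitangent), dot(n_os, normal))

def encode(n):
    """Map a unit normal's components from [-1, 1] into [0, 1] RGB for storage."""
    return tuple(0.5 * c + 0.5 for c in n)
```

When the sampled normal equals the surface normal, the result encodes to the familiar flat-blue (0.5, 0.5, 1.0) texel.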
Tried to bake this one... is it even possible? My bake just looked like total crap.
Or should I solve it some other way? Any tips thrown my way are gladly accepted.
Use a custom-made cage, and cut the UVs at all the hard edges.
Does anyone have any tips on the Blender 2.7x > xNormal workflow? A lot has changed in Blender since the last version I used with xNormal (2.69). I'm not expecting a tutorial, just quick tips on:
1. preferred export format
2. whether marking sharp edges is necessary for an organic model (an octopus, save for some spines which I haven't modeled yet, it's as squishy/round as they get)
3. any tips for optimizing for ogre 1.10 (i know this isn't ogre3d.org, but i might as well ask.)
4. how you handle multiple materials in Blender: do you cut your object into sets of faces per material, or do you move the UVs of the faces for the materials you don't want out of the 0-1 UV space and bake the same geometry once per material, so that each low-poly copy has only one material's UVs in its 0-1 UV space?
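Re: question 4, a common approach is the second one: keep one low-poly copy per material and shift every other material's UVs out of the 0-1 tile so the baker skips those faces. The shift itself is trivial; a sketch on plain UV dictionaries, deliberately not tied to Blender's API:

```python
def offset_islands(uvs, keep_material, material_of_face, shift=1.0):
    """Shift UVs of faces not belonging to keep_material out of the 0-1 tile.

    uvs: {face_id: [(u, v), ...]}, material_of_face: {face_id: material_name}.
    Returns a new UV dict; faces of keep_material are left in place.
    """
    out = {}
    for face, coords in uvs.items():
        if material_of_face[face] == keep_material:
            out[face] = list(coords)
        else:
            out[face] = [(u + shift, v) for (u, v) in coords]
    return out
```

Run it once per material (with that material as `keep_material`) to produce each bake-ready copy.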
I have a question about bent normals in xNormal. I am noticing that the bent normal map in tangent space is the same as the normal map in object space, and the bent normal map with "tangent space" unchecked is exactly the same as the normal map. When baking bent normals in tangent space through Substance or Maya, I get a different result: similar to the normal map but with more light areas. Why can I not achieve this in xNormal?
Hey, so I was looking on the web for an answer to this question... Is it possible to change the file name suffix for each map? I saw a post that mentioned xN4 would have this option; however, that was posted back in September 2013.
Hey, here's an XML that I'm using to bake a test mesh. Unfortunately, for the Normal Map and Bent Normal Map, the specified Background Color doesn't affect anything in the final bake while the AO Background Color works as expected... any ideas why?
>> Perhaps a noob question, but is there a full list of all the available XML arguments? I'm looking to bake a Bent Normal Map specifically via XML.
I think we include the XML schema in the xN's SDK. Not sure if it's 100% updated tho.
Your XML file seems to be ok. No idea what could be wrong there, except that it doesn't make much sense to use a full-red normal map background color ( which I assume you set just to test, right? ).
Btw, are you loading that XML file in user mode ( loading the xNormal app using the UI ) or just passing it as a param to the batch renderer?
Anyways, if you want, send me the XML file and the LP & HP meshes. I'll debug it in depth.
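If you're scripting bakes through XML, it can be handy to patch a saved settings file per-bake rather than editing it by hand. A stdlib-only sketch; the `GenerateMaps`/`GenNormals` names used in the example are placeholders I made up for illustration, so check a real settings file saved from the UI for the actual schema:

```python
import xml.etree.ElementTree as ET

def set_attr(xml_text, tag, attr, value):
    """Set attr=value on the first <tag> element and return the new XML text."""
    root = ET.fromstring(xml_text)
    node = root if root.tag == tag else root.find(f".//{tag}")
    if node is None:
        raise KeyError(f"no <{tag}> element found")
    node.set(attr, value)
    return ET.tostring(root, encoding="unicode")

# Hypothetical usage against an invented fragment:
patched = set_attr('<Settings><GenerateMaps GenNormals="false"/></Settings>',
                   "GenerateMaps", "GenNormals", "true")
```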
>> Tips for optimizing for Ogre 1.10
Was 1.10 released? In theory, xN supports the latest stable 1.9.0 atm. Not sure if 1.10 is 100% compatible.
>> Is xNormal 4 still in development?
Of course! We just delayed it to early 2016 to be sure everything works properly and to wait for more stable drivers ( we're using some exciting new tech APIs and the whole thing is not very mature yet hehe ). We'll release a public open beta for Linux, OSX and Windows.
>> Is it possible to change the file name suffix for each map?
Not currently, but I'll add it to the TODO list. Thanks for the feedback!
>> Does anyone know where the ignore-per-vertex option on the high mesh has gone? The latest version of xNormal doesn't seem to have it.
Use the bar on the bottom to scroll right in the HP list hehe
Btw, I currently can't answer in Eat3D forums because they're solving some spam problems, so better contact me here or email me directly if you have questions.
Hello guys, I'm working on a project modelling a wooden house. I can bake the outside of the house and the result was great, but inside the house the normal map was terrible; it was overlapping with the texture across the wall. My question is: how can I bake the inside of the house without exploding the mesh? I have used a cage, but the result is the same.
I would like to see a layers system in xNormal, where each layer is rendered in isolation, and automatically composited. Each layer could contain a list of low poly models, and a list of high poly models, which are only visible when that layer is processed.
This would fix the destructive workflow of exploding meshes, making it easier to make changes after baking.
Exactly! I ran into this just a couple days ago.
One thing I will say about xNormal, this being my first post in this thread: it's one of my favourite pieces of free software! I use Blender, so having xNormal is a must; Blender's own baker is nowhere near as good a lot of the time.
Probably not the first person that is asking this question. But I've got some problems regarding baking normal map in xNormal. I'm working in Cinema 4D and want to render with OTOY Octane (for C4D).
I've modelled an object and UV-mapped it (http://puu.sh/lCVM8/923243850d.png), then made a high-poly object from it [copy and paste the low-poly, put it in a subdivision object and make an editable poly from it]. Then I exported my scene as .OBJ files [first I deleted the high-poly and saved the low-poly, then did the same for the high-poly].
In xNormal I did the standard thing: import high-poly mesh, import low-poly mesh, tick Normal Map, X+ Y+ Z+, tangent space ticked on. Other settings here: http://puu.sh/lCVXg/2eabc84e45.png I clicked Generate Maps and I get a problem: the edges of the holes that I made while UV unwrapping are getting strange shapes. You can see that a lot of corners are not smooth (at all). See here: http://puu.sh/lCVZJ/ef6e7d0171.jpg
At this point I'm getting really frustrated because no one can help me at this point. I hope there is someone here that has a solution, a tip or a tutorial that can help me solve this. I want to enjoy making nice textures for my 3D models just as anybody else does using Quixel *tears are flowing*
Lately xNormal has been spending a large amount of time on the "Dilating Map Pass". Is anybody familiar with this? Is there a way to fix it? The only thing I can think of is a reinstall and setting up from a blank slate (usually I load previous settings and edit). Could it be that I'm rendering to a mid-poly low? It's 60k triangles. Could that be it?
It's taking a very long time to dilate every single map.
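For context, the dilation pass is edge padding: empty background texels get filled by spreading the border colours of each island outward, so its cost grows with map resolution and with how much empty UV space there is (one more reason tight packing helps). One pass looks roughly like this; a naive sketch, not xNormal's actual implementation:

```python
def dilate(pixels, mask, w, h):
    """One dilation pass: fill each empty texel with the average of its filled 4-neighbours.

    pixels: flat row-major list of grayscale values; mask: flat list of booleans
    (True = baked texel). Returns new (pixels, mask); repeating until the mask
    is full gives a complete edge-padding pass.
    """
    new_px, new_mask = list(pixels), list(mask)
    for y in range(h):
        for x in range(w):
            i = y * w + x
            if mask[i]:
                continue  # already baked, leave it alone
            neigh = [pixels[ny * w + nx]
                     for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1))
                     if 0 <= nx < w and 0 <= ny < h and mask[ny * w + nx]]
            if neigh:
                new_px[i] = sum(neigh) / len(neigh)
                new_mask[i] = True
    return new_px, new_mask
```

Since every pass touches every texel, a 4k map costs 16x what a 1k map does, which fits the "very long dilation" symptom.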
Anybody heard about any updates? Santiago's blog was deleted, it seems like. Last I heard he was talking about working on xNormal 4, but it's been a little while!
I've been wondering about that too, he wasn't posting on eat3d due to some spam issues the forum was having, and the last post over here was in like November.
I'm thinking about building a new rig and I'm not sure whether to go with a GeForce or a Radeon. I've been using CUDA to bake maps with xNormal on my GeForce. However, I'm looking at a GeForce 970 (3.5 GB of fast VRAM) or a Radeon R9 390 with 8 GB. Double the amount of RAM for baking on the graphics card is cool, but CUDA doesn't work on the Radeon. What is your experience with baking maps on Radeons (and on graphics cards instead of the CPU in general)? Does it work just as well, or would you go with a GeForce? How important would you judge the extra 4 GB of RAM on the Radeon to be?
Beyond a few little tests I haven't really used GPU baking. When it was first implemented in xNormal I tried it, but with the lack of anti-aliasing, and me being too lazy to use the workaround (render at a higher resolution and resize), I never used it that much. That said, using the GPU in xNormal or any of the other bakers is a hell of a lot quicker.
My preference would be for a GeForce card; they seem to have better drivers and are quicker to fix things, but there are folks who say the AMD cards work fine for them. If you do go the GeForce route, I would recommend getting a 980 instead of the 970. While the 970 is a good card and good for gaming, it apparently is pretty crippled in Substance Painter when working with larger textures: once you hit that 3.5 GB limit it starts accessing that 500 MB of super-slow RAM. Radeons in general don't fare well with Allegorithmic products across the board, and while I hoped the new Crimson drivers would signal better driver development, it hasn't turned out to be the case so far, so you could save yourself the headache and get a GeForce.
If you want to build the rig really soon (next month or 2), try for the 980, a non TI version is fine. If you can wait until June, that is when the new GeForce cards are rumored to be released and you could get a 1070 or whatever they end up calling them. Keep an eye out for announcements at GDC this month and more importantly at GTC (Nvidia's GPU Technology Conference) where they would confirm an estimated timeline for release. If they don't give an actual month of release it will likely be later than June and you could just build your rig then instead of waiting, even an announcement of new cards on the horizon could mean lower prices on existing cards too.
Thank you M4dcow, for your detailed reply! You even answered some questions that I had but didn't ask in my post (about release dates for the upcoming GeForce cards). It seems that the 970's and Radeon's drawbacks are even bigger than I already thought (compatibility with Allegorithmic software). The 980 is too expensive for me, unfortunately. I guess I'll see if there is anything more specific announced at the conferences and then decide what I am going to do. I'm open to any additional advice, though!
Sorry for being so silent lately, I was a bit occupied
Some friends are taking care of the xN3 releases; we're migrating the infrastructure ( so some emails / blog / links may be broken atm. Don't worry, we'll fix that soon ).
xn 3.20.0 will be released soon with support for Maya/Max 2017 and other fixes/updates.
xN4 is delayed due to the new APIs & exciting tech it's using (Vulkan, etc). The drivers (today) are still a big LOL. We must wait a bit ( and I can't be more precise, I'm afraid ).
SergeyXDD -> xN3 uses a transparent background (using alpha) by default, so you'll be able to composite the maps in Photoshop. You could open the map in PS/Gimp/Paint.NET and perform a flatten-image operation with your desired background color. I can't remember, but I think there's also an option in xN's BMP exporter to write only the RGB components.
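If you'd rather script the flatten than open an editor, compositing over a solid colour is one line of math per texel. A sketch on raw 8-bit RGBA tuples, with no particular image library assumed:

```python
def flatten(rgba_pixels, back_rgb):
    """Composite straight-alpha RGBA pixels (0-255 channels) over a solid background colour."""
    out = []
    for r, g, b, a in rgba_pixels:
        t = a / 255.0  # 1.0 = fully baked texel, 0.0 = empty background
        out.append(tuple(round(c * t + bc * (1 - t))
                         for c, bc in zip((r, g, b), back_rgb)))
    return out
```

Fully opaque texels keep their baked colour; fully transparent ones become the chosen background.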
I would love to request one feature in xNormal. After probably 10 years of baking with xNormal, I recently switched to Substance because of the name-matching feature. Name matching is so awesome that I just can't imagine baking without it.
Another small tweak with big impact would be limiting the AO ray length correctly. The feature is there and speeds up AO, and in some cases improves the results. But it would need its own value, just like the curvature ray length. Right now it's limited by the cage distance (it doesn't get any further) and the general tracing distance, which makes it nearly unusable.
I am watching a tutorial on Gumroad and the guy makes a stunning little lightmap by combining an AO pass he baked in 3ds Max (which I can replicate just fine) with an "Occlusion" and "Diffuse" pass from CrazyBump... These are the two I'm having trouble replicating in xNormal... Here is what he's generating.
CrazyBump's "Diffuse" output:
CrazyBump's "Occlusion" output:
What are the equivalent passes in xNormal? I can't seem to replicate his results.
@JoshuaG When you say it doesn't work, what do you mean? Does it tell you to triangulate your high-poly, or does it just never load it entirely, or something else?
I'm also having a problem with xNormal. This seems new, but if I leave it baking for any longer than about 10 minutes, it will bluescreen my PC. If I reduce quality settings it extends the time until it bluescreens, but I don't understand why, as I have a strong GPU and plenty of RAM. It's never done this before. On Win 7 64-bit, 16 GB RAM, GTX 980.
Can't bake anything above 2k res at the moment, and only with an AO ray setting of 128.
What baker are you using, and are you sure your system is fully stable? If you are using the default baker, which uses the CPU, download RealTemp and monitor your temps during the bake. Also try downloading Prime95 and doing the CPU stress test; if you get a blue screen with that, it points to system instability. If you use the Optix baker, which is GPU-based, maybe roll back a driver and see if that helps.
I use the default renderer. Having read what you said, though, I gave the Optix/CUDA renderer a go and it instantly crashed my PC. I just tried out OpenRL and it worked fine, with no real noticeable reduction in quality. Would I be better off using this one in the future? I'll admit I haven't dived into each renderer to see what the pros/cons of each are.
>> Anyone know when the SBM exporter for Max 2017 comes out?
Finishing the 3.20.0 atm, so I would say soon ... August or early September.
>> I don't understand why as I have a strong GPU and plenty of RAM. Never done this before. On Win 7 64 bit, 16GB RAM, GTX 980
For GPU rendering, I would recommend you boot with the IGP enabled and your monitor attached to it, and use your GPU as a secondary card for rendering. That way your PC will be much faster & more responsive, and you'll avoid the Windows driver "watchdog".
There's a property in the registry called "TdrLevel" which controls the GPU driver reset (VDR) when a kernel exceeds 5 secs. By default, xN disables it ( == zero value ) in the installer. Try removing it; that may help you ( or not ).
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers\TdrLevel
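For reference, checking or removing that override from an elevated command prompt looks like this. These are standard Windows `reg` commands rather than anything xNormal-specific; deleting the value restores the OS's default watchdog behaviour, and a reboot is needed for the change to take effect:

```shell
:: Show the current value (the query fails if the value is absent, i.e. OS defaults apply)
reg query "HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers" /v TdrLevel

:: Delete the zero value the installer wrote, as suggested above (requires admin)
reg delete "HKLM\SYSTEM\CurrentControlSet\Control\GraphicsDrivers" /v TdrLevel /f
```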
On the other hand, I should recompile xN with the latest Optix libs ... I'm working on that atm.
You should test your GPU's stability with FurMark or OCCT. Maybe it's getting too hot or exceeding your PSU's rail amperage.... And avoid the infamous 364.XX drivers.
@1813 -> You might need to adjust the ray distances or setup a cage.
I use ZBrush to make the models, paint in ZBrush, and bake everything in xNormal, even vertex colors, but the polypaint colors come out different from the high-poly mesh in the vcols texture.
I usually use GIMP and change the hue to 70 or 90 depending on the colors, but I would like to know how to get the same colors straight out of the bake.
Here's a pic showing the different colors (left = high-poly in ZBrush, right = the vcols texture):
In this case, if I change the hue I can return the skin color to the same as the high-poly, but then the red inside the eyes changes to yellow. It's too much work to fix completely, and I would like to know how to solve this for good in xNormal.
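If you do keep fixing it in post, the GIMP hue change is just a rotation in HSV space, which is easy to batch over many bakes. A stdlib-only sketch; the 70-90 figure mentioned above would go in as `degrees`:

```python
import colorsys

def shift_hue(rgb, degrees):
    """Rotate the hue of an RGB colour (0-255 per channel) by the given angle."""
    h, s, v = colorsys.rgb_to_hsv(*(c / 255.0 for c in rgb))
    h = (h + degrees / 360.0) % 1.0  # hue wraps around the colour wheel
    return tuple(round(c * 255) for c in colorsys.hsv_to_rgb(h, s, v))
```

Note this rotates every colour by the same amount, which is exactly why the eye reds drift when the skin is corrected; a per-region fix still needs masking.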
Examples:
Home: http://imgur.com/Aoj92i8&yuEg6Kw#0
School: http://imgur.com/Aoj92i8&yuEg6Kw#1
I have a good computer with a good GPU, CPU, RAM, etc. I am wondering if it's a problem with newer graphics cards?
If anyone can help, that would be deeply appreciated.
I'm really at a loss what to do here.
thx!
http://eat3d.com/forum/questions-and-feedback/xnormal-4-eta
So, is there any way to change the suffix in the latest xNormal release?
and
Is Xnormal 4 still in development?
cheers
Greetings,
Cazure
After some tweaking in the UV layout (I got some more space between the UVs) I get better results, but I still get these weird errors.
http://puu.sh/lFNgJ/9147bc9fff.jpg
http://puu.sh/lFNlo/ca677ce3f3.jpg
How do we get our scene lights into xNormal? What settings are worth using, etc.? Any information, really; I haven't been finding much around on the web.
Thanks
Sorry for my English.
I've never had it tell me to triangulate my high, and I've never had an issue.
This just happens with this model.
This thread should help. Basically, it's behaving exactly the way it's expected to. There are ways around it, though:
http://polycount.com/discussion/81154/understanding-averaged-normals-and-ray-projection-who-put-waviness-in-my-normal-map/p1