Has anyone had any luck using hxGrid and xNormal? I have both installed, and the agent and coordinator see each other, but I cannot figure out how to assign a "task" to the coordinator. Is there something I have to change in xNormal to get this to work?
I have xNormal v3.18.8.36 and hxGrid v1. Thanks.
xN 3.18.X is 64-bit only and, therefore, no longer supports hxGrid (which is 32-bit only). I asked hxGrid's developer and he said he was not going to port it to 64-bit... so, if you want to use hxGrid, the only solution is to use xN 3.17.16, which still supports 32-bit.
I've noticed that 3ds Max has an option when baking maps to only bake matching material IDs. Does xNormal have anything like this, or any solution for baking multiple matching mesh parts in one go? Or is it still a case of having to explode the model?
So I'm new to normal mapping, and while I haven't had problems in the past with organic models (I've mostly done human character modeling), I've been having some trouble creating a normal map for a futuristic shotgun I'm working on. I started out trying to generate my normals in ZBrush, couldn't get that to work, and downloaded xNormal as a result.
If anyone has a chance to take a look I'd really appreciate it! I'm new to normal mapping (so far I've only taken 2 courses on 3D modeling, and I wasn't required to create normal maps, or even UV maps, for them) and I know a lot of people on polycount have years of experience, so any feedback would be much appreciated!
How can I bake a normal map for an object that has multiple UV tiles?
As far as I'm aware, xNormal only supports one UV set. I guess the workaround would be to clone multiple dummy meshes from the 'master' mesh, then copy the correct UV channel into the first channel for each one. Then use these dummy meshes for the baking, and apply the maps to the 'master' mesh when complete.
Hi there! For some reason that I am not aware of, my xNormal is giving errors when I attempt to bake something. I have tried reinstalling, restoring the default settings, and using different objects for baking.
This happens when I try to bake a height map,
and this happens when I try to bake a normal map.
Any ideas?
EDIT: Never mind, I changed the renderer and that fixed it. Will this make a difference to how the actual bake comes out?
If you set it to the default it should be slower (way slower), but more accurate with fewer artifacts (ideally none). You will also be able to render height maps (and maybe some other maps I don't know of too).
I use the Offset values you can find at the end of the columns in the low- and high-poly object lists. But it would be gorgeous to be able to use that MatID.
Hello everybody.
I've got a problem baking normals in xNormal, many many times.
Look at the screenshot...
There are some waves on the normal map that cause those strange wave errors on the model, especially when you look closer at a surface and put any glossy material on it. I've already tried baking in Maya too; the result is the same.
Preferences are at default.
xNormal saves the normal map as a 4096x4096 .tiff (16-bit).
Has anybody encountered these difficulties? How did you fix it?
Switching to 16 bit will double the number of bands, but decrease the contrast between bands. If the gradient takes place over a larger number of pixels than bands, there will still be visible banding, even in 16 bit. Have you tried converting the normal maps to 8 bit after exporting the 16 bit image? This will introduce dithering which should reduce the appearance of banding further.
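To make the dithering point concrete, here's a minimal pure-Python sketch. It is not any image editor's actual algorithm; a 1-D error-diffusion pass stands in for the dithering you get when converting 16-bit to 8-bit, and a 1-D "scanline" stands in for the image:

```python
# Sketch: why converting 16-bit to 8-bit with dithering hides banding.

def quantize_naive(row16):
    """Truncate each 16-bit value to 8 bits (banding-prone conversion)."""
    return [v >> 8 for v in row16]

def quantize_dithered(row16):
    """1-D error diffusion: push each pixel's rounding error onto the next."""
    out, err = [], 0.0
    for v in row16:
        ideal = v / 257.0            # the exact 8-bit value this pixel wants
        q = max(0, min(255, round(ideal + err)))
        err = (ideal + err) - q      # carry the leftover error forward
        out.append(q)
    return out

# A very shallow 16-bit gradient: 1024 pixels spanning only ~8 eight-bit levels.
row = [2000 + 2 * i for i in range(1024)]
naive = quantize_naive(row)
dith = quantize_dithered(row)

def longest_run(row8):
    """Longest run of identical values; long runs read as visible bands."""
    best = cur = 1
    for a, b in zip(row8, row8[1:]):
        cur = cur + 1 if a == b else 1
        best = max(best, cur)
    return best

print(longest_run(naive))  # long flat bands
print(longest_run(dith))   # shorter runs: neighbouring levels interleave
```

The dithered row trades long flat bands for short alternating runs of adjacent levels, which is exactly why the banding stops reading as hard lines.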
Hi, is it possible to install/manage different versions of xNormal on the same machine? I have a case here where a custom tangent-space plugin only wants to cooperate with a rather old version, but I also need to use a different version for work on another project.
It seems the different xNormal installations fight over the same settings files/registry entries, making a right mess of things on my setup.
You don't stop it.
Normal maps will automatically be baked with the correct directional information along the UV borders no matter how the UV islands are oriented and you will always get that seam if you view the normal map as a diffuse.
It will display correctly once it is applied to your mesh as a normal map.
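A quick numeric sketch of why that seam is cosmetic. The tangent frames below are made up, and the standard 0.5*n+0.5 color packing is assumed: the same world-space normal encodes to different colors on two differently oriented UV islands, yet both decode back to the identical world direction:

```python
# Sketch: one world normal, two UV island orientations, two different colors.

def encode(n_world, tangent, bitangent, normal):
    """World normal -> tangent space -> [0,1] RGB (0.5*n + 0.5 packing)."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    t = (dot(n_world, tangent), dot(n_world, bitangent), dot(n_world, normal))
    return tuple(0.5 * c + 0.5 for c in t)

def decode(rgb, tangent, bitangent, normal):
    """RGB -> tangent-space vector -> world space."""
    t = tuple(2.0 * c - 1.0 for c in rgb)
    return tuple(t[0] * tangent[i] + t[1] * bitangent[i] + t[2] * normal[i]
                 for i in range(3))

n_world = (0.6, 0.0, 0.8)   # some tilted surface normal (unit length)

# Island A: tangent along +X.  Island B: same surface, but its UVs are
# rotated 90 degrees, so its tangent points along +Y instead.
frame_a = ((1, 0, 0), (0, 1, 0), (0, 0, 1))
frame_b = ((0, 1, 0), (-1, 0, 0), (0, 0, 1))

rgb_a = encode(n_world, *frame_a)
rgb_b = encode(n_world, *frame_b)

print(rgb_a)                    # different colors across the seam...
print(rgb_b)
print(decode(rgb_a, *frame_a))  # ...but identical world normals after decoding
print(decode(rgb_b, *frame_b))
```

The color discontinuity you see when viewing the map as a diffuse is just the change of tangent frame between islands; the shader undoes it per island.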
Hello guys. Does anyone know why the UVs showing up in xNormal aren't the ones I created? Every time I export my mesh from 3ds Max to OBJ or to the xNormal format, xNormal shows random UVs instead of the ones I made, and my bake comes out messed up with very nasty UVs. I should mention that I use 3ds Max 2009.
Make sure you delete any extra/garbage UV sets. OBJ only supports one UV channel, so if you have more than one, you don't know which is getting exported.
The workflow I use has a lot of overlapping UVs. In Max, I can adjust the W coordinate of my model's texture coordinates to force Mental Ray to render overlapped UVs in front of the UVs they're sitting on. I'm trying to get xNormal to recognize W coordinates, but it doesn't seem to work with them. Is there a method to get xNormal to recognize overlapped UVs and render accordingly?
Hey people, I've got some questions about xNormal normal bakes vs. Max (3.18.9 vs. 2013 PU6).
This may be an extreme example and some of these problems are likely to be invisible in game but I'd like to hear your thoughts regardless.
Note:
All maps were baked uncompressed with no supersampling or antialiasing at 2048x2048 and downsized to 1024x1024 in Photoshop with Bicubic resampling, and normalised with xNormal's filter.
All bakes used the same high and low poly mesh and the cage was exported from Max to be used in xNormal too (xNormal's own cage wasn't looking so good).
xNormal bakes used the default bucket renderer as the CUDA ones looked bad.
Viewing the TIFF as a PSD in TB2 looked great, no real banding.
1. High poly (ignore the front face weirdness)
2. TGA baked in Max (default scanline)
3. TGA baked in xNormal
4. 16 bit TIFF in xNormal - converted to 8 bit, downsized, normalised, saved as TGA
5. 16 bit TIFF in xNormal - downsized, converted to 8 bit, normalised, saved as TGA
6. 16 bit TIFF in xNormal - downsized, normalised, converted to 8 bit, saved as TGA
My thoughts:
1. High poly looks smooth, apart from the front face but this does not show in Max and appears to have no effect on the bakes.
2. I hear Max bakes have a bit of noise in them to reduce banding and this is clear here. The bake has uniform noise on the curved faces and no banding. To me, the best looking 8 bits/channel bake of the bunch.
3. Has clear banding on the cylindrical parts, would like to avoid this result. Baking 16 bit TIFF improves this.
4 & 5. Look very similar. Converting to 8 bit introduces noise on curved areas but in the same bands that appear in 3. Second best results after Max.
6. More uniform noise, but stronger than Max's. Also considerable noise on flat faces, although banding has been reduced. Third best result, due to the strong noise.
Admittedly I have just started using xNormal but I feel like my best bake came out of Max anyway, and without having to bake and convert different formats.
So I guess my questions are:
What do you make of all this? Do you have any fixes?
When do you convert your 16 bit maps to 8 bit, before or after any editing?
xNormal vs. Max for normal map bakes? How do game engines handle them?
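For anyone curious what the "normalised with xNormal's filter" step in the notes above is actually doing, here's a rough pure-Python sketch (not xNormal's code): downsizing averages texel colors, and the decoded vectors of averaged texels come out shorter than unit length until they are renormalised:

```python
# Sketch: resampling a normal map denormalises its vectors.
import math

def decode(rgb):   # [0,255] channel -> roughly [-1,1] component
    return tuple(c / 127.5 - 1.0 for c in rgb)

def encode(v):     # vector component -> [0,255] channel
    return tuple(round((c + 1.0) * 127.5) for c in v)

def length(v):
    return math.sqrt(sum(c * c for c in v))

def normalize(v):
    l = length(v)
    return tuple(c / l for c in v)

# Two neighbouring texels pointing in quite different directions.
a = decode((200, 128, 230))
b = decode((60, 128, 230))

# Downsizing averages their colors, which averages the vectors too.
avg = tuple((x + y) / 2.0 for x, y in zip(a, b))

print(length(avg))              # < 1: the averaged normal is too short
print(encode(normalize(avg)))   # renormalised texel, safe to save out
```

A too-short normal darkens and flattens shading, which is why renormalising after a resize (and before or after the 8-bit conversion, as the tests above compare) is worth doing.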
Hello all. I was wondering if anyone had any experience in dealing with this problem with xNormal. The issue arises when I try to bake the ambient occlusion map of this simple wall with a bent pipe protruding from it. The shading information on the wall appears correct except for a white patch behind the underside of the pipe. Is there some setting I'm missing, or am I doing something wrong? I've included pics to alleviate any confusion. If I render the AO in something like Blender it bakes out just fine, but I like the quality and speed of xNormal, hence why I'm asking.
Thanks ahead of time.
On a side note, I've tried searching Google and these forums for things like "xnormal ambient occlusion glitches", "xnormal ambient occlusion shadow error", and variations thereof. Is there some kind of specific term to describe what is happening here? I've also tried fiddling with the settings.
The cage/ray distance is too large. The bake captures light from the tube/cylinder instead of from the wall itself.
Ahh, thank you so much! I cranked down the "maximum frontal ray distance" and "maximum rear ray distance" and got better results. Just a note: I wasn't using a bake cage for this model as it was relatively simple, but should I use bake cages even for simple models in the future?
Hello everyone. I'm looking for a tutorial or something on how to bake my asset in xNormal with one smoothing group, to go into UE4.
It's completely new to me at the moment so I need to know what options to choose, buttons to press, etc.
That's the way my friend and tutor suggests. I used to faff around with the smoothing groups so much, optimising them for the unwrap and then never getting a perfect result. So then I'm told, "The way most people do it now for UE4 is use one smoothing group, xNormal and mikk tangent space."
So, even if it isn't the only or best way to do it for everyone, I've seen it working perfectly so I'd like to give it a go myself. But I'm afraid I can't find much information on the workflow online.
One smoothing group for an entire model doesn't make sense for everything, it needs to be treated as a case-by-case basis.
There's no way you'll get a decent-looking hard-surface piece with one smoothing group, for example. UE4 isn't vastly different from any other deferred PBR engine; it still follows many of the common workflow techniques we have learned as artists.
No, this is not how game artists do it.
Set your smoothing groups by your UV islands to start. There are scripts that do this.
In general, at very hard edges with angles >75° you want to set a hard edge/smoothing group split. Wherever you have a hard edge/smoothing group split, you need to detach that as a UV island.
Force-triangulate before you bake.
These few steps will get you close to what you need, with some caveats that you will learn as you go.
Using one smoothing group is very situational and far from the standard.
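The ">75° means a hard edge" rule of thumb above boils down to a dihedral-angle check between the two faces sharing an edge. A pure-Python sketch; the face normals and edge names below are made up for illustration:

```python
# Sketch: classify edges as hard or soft from the angle between face normals.
import math

def angle_between(n1, n2):
    """Angle in degrees between two unit face normals."""
    dot = sum(a * b for a, b in zip(n1, n2))
    dot = max(-1.0, min(1.0, dot))   # guard acos against float drift
    return math.degrees(math.acos(dot))

def classify_edges(edges, threshold=75.0):
    """edge name -> 'hard' (also split the UVs here) or 'soft'."""
    return {name: ("hard" if angle_between(n1, n2) > threshold else "soft")
            for name, (n1, n2) in edges.items()}

edges = {
    "box_corner":   ((0, 0, 1), (1, 0, 0)),                      # 90 degrees
    "gentle_bevel": ((0, 0, 1), (0, math.sin(math.radians(20)),
                                 math.cos(math.radians(20)))),   # 20 degrees
}

# box corner -> hard (and a UV split), gentle bevel -> soft
print(classify_edges(edges))
```

Scripts that auto-set smoothing groups are doing essentially this per edge, then making sure every hard edge coincides with a UV island border.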
Hmm, it's very confusing. There are a lot of mixed opinions around the internet about it.
I've done a test with boxes, and it didn't seem to matter what angle the edges were at, as long as I broke the UVs accordingly.
And I've seen so many people get perfect results from using one smoothing group with xNormal.
I have also been confused by the whole smoothing group thing and have been told by several people that one smoothing group would work.
All the assets I have baked so far use one group with the correct tangent space in Marmoset. Most worked, but I'm not sure if this is the correct way, from what you guys are saying.
Here is an older asset I baked using one group.
Is there anything wrong with the way I did that asset?
Using one smoothing group can work, yes, when using a synced workflow.
However, using this method will generate a normal map that has a metric ton of gradients in it. Which in and of itself is fine as long as it looks good in the viewport.
In a game engine, though, these gradients will be compressed for realtime rendering, usually DXT5 (old, bad) or BC5 (new, better). This compression can wreak havoc on heavily gradated normal maps. So if your end goal is to put the object uncompressed into a viewport, then go crazy with one smoothing group. But if you want the smoothest normal map WHILE compressed, you have to give the normal map a break and use smoothing groups/hard edges.
At the end of the day, do what looks good to you. But do it because you understand it, rather than because you're confused.
At the very least, outside of very specific reasons not to, set your smoothing groups by UV islands via a script, as it is 'free' in engine resource cost, and can help reduce unnecessary gradation.
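To illustrate the compression point, here's a toy 1-D codec in the spirit of BC-style block compression. It is only a sketch (real BC5 works on 4x4 pixel blocks per channel, with different index counts): each block stores two 8-bit endpoints and snaps every pixel to one of 8 values interpolated between them, and a shallow gradient comes out as flat runs with jumps between them, i.e. the banding described above:

```python
# Sketch: what BC-style block quantisation does to a shallow gradient.

def compress_block(block):
    lo, hi = round(min(block)), round(max(block))      # 8-bit endpoints
    palette = [lo + (hi - lo) * i / 7.0 for i in range(8)]
    return [min(palette, key=lambda p: abs(p - v)) for v in block]

def compress(row):
    out = []
    for i in range(0, len(row), 4):                    # 4-pixel "blocks"
        out.extend(compress_block(row[i:i + 4]))
    return out

# A very shallow gradient, like one channel of a "1 smoothing group" bake.
row = [100 + 0.1 * i for i in range(64)]
comp = compress(row)

max_jump = max(abs(a - b) for a, b in zip(comp, comp[1:]))
flat_pairs = sum(1 for a, b in zip(comp, comp[1:]) if a == b)
print(max_jump)    # bigger than the original 0.1-per-pixel slope
print(flat_pairs)  # flat runs appear: flats plus jumps read as banding
```

Hard edges break the mesh into regions whose normals sit near constant values, which this kind of quantisation handles far better than one long gradient.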
Thanks for the help, Quack. It just gets a little confusing with many places suggesting different things; I just assumed the one-smoothing-group way was correct as it looked fine in viewports.
But I didn't take compression into account.
Is there a good summary about that on the Polycount wiki? I know there is a normal map page that already talks about a lot of things...
I don't think there is one resource that has all of this info concisely put together. It is spread out here, on the wiki, in EQ's thread, and in random threads like this.
Okay that's very helpful, thank you Mr. Duck.
Can you recommend any scripts or tutorials on this? For now, I will be learning it without scripts, as I don't like not knowing how to do it myself first.
Epic does recommend using the "1 turbo smooth" method, so I will use that too, as it's so easy and comes out fine in UE4.
The problem is, if I do this and try to render an AO map using these options, I get pure white AO unless I use the CUDA renderer. If I put the mesh settings back to default (scale at 1.0 and ray distances at 0.5), it renders fine.
No other tutorial that I can find ever mentions adjusting these settings, so my question is: am I doing something wrong, or is Crytek's tutorial wrong?
It seems as if the cage isn't getting scaled along with the object, but it works fine when baking normals, and it is scaled up in the 3D viewer.
Make sure you scale the lowpoly and highpoly, but if the defaults work for you, might as well use them.
Sometimes scaling up is recommended if you have really small meshes which can result in precision errors and xNormal will warn you if that is the case, but even in those situations I have rarely had any issues.
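One plausible reading of that precision warning (an assumption about renderer internals, not documented xNormal behaviour): positions get stored as 32-bit floats, and ray tracers typically offset rays by small fixed epsilons, so for a very small mesh a fixed epsilon is huge relative to the mesh. A Python sketch with a made-up epsilon:

```python
# Sketch: why tiny meshes can hit precision trouble that scaling up avoids.
import struct

def to_f32(x):
    """Round-trip a Python float through 32-bit storage."""
    return struct.unpack("f", struct.pack("f", x))[0]

EPS = 1e-5                       # a hypothetical fixed ray-offset epsilon

for mesh_scale in (0.001, 1.0, 16.0):
    vertex = to_f32(0.123456789 * mesh_scale)   # note the lost digits
    print(mesh_scale, vertex, EPS / mesh_scale)

# At scale 0.001 the epsilon is 1% of the mesh's size; at scale 16 it is
# negligible, which is one reason scaling everything up can stabilise a bake.
```

As the reply above says, though, if the defaults bake cleanly there is no need to bother.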
Can you post a link to Epic's recommendation of the one-smoothing-group method? Also remember, TurboSmooth is a 3ds Max mesh modifier, not smoothing groups.
I can recall that when I began baking, for the first while I made the same mistake of using one smoothing group, and the results were very bad.
Using one smoothing group can work in some cases, for example if you have bunched-up cloth or wavy curtains. But in my experience, for most meshes it won't make sense, should be avoided, and is a shortcut that will eventually come back to bite you when you discover, for whatever reason, on whatever rendering platform, that your mesh looks like garbage.
Also I'd be interested to see the in-engine results you got from using one smoothing group on your mesh.
There is a plethora of support/tutorials here and elsewhere. One of the most important points is to make sure to have a UV split at every hard edge on your mesh.
I am having an issue where, when I try to use the ray distance calculator, it gets stuck on "analyzing". The same thing happens when I try to bake. I've read postings here and on Eat3D, but haven't found a fix. My job doesn't provide ZBrush for me to clean the mesh. I have triangulated and cleaned up the mesh via Maya. Nothing works!
You really shouldn't use the ray distance calculator unless your low-poly mesh has no hard edges; otherwise, you'll get discontinuities in your normal map where those hard edges are, since ray distances don't create a proper averaged projection mesh. Also, if I were you I would export .fbx for your low-poly mesh if your pipeline supports it, since that format has much better support for per-vertex normals and synced workflows (again, whether you should export tangents and bitangents depends on your pipeline). Instead of using ray distances, I would go into the 3D viewer and slide out the global cage extrusion slider until your high-poly mesh is entirely enclosed by the cage. If your meshes are too dense to load in the 3D viewer, I would grab MeshLab (a free program) and decimate the high-poly just for cage editing, then use the ordinary high-poly for your bake.
The 3D viewer can also be helpful with diagnosing problems with Xnormal, as you'll be able to see exactly where Xnormal thinks your meshes are, or if it can load your meshes at all.
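A small sketch of why the averaged cage matters at hard edges (toy numbers, not xNormal's code): extruding each face along its own normal splits a shared vertex into two points, opening a gap rays can slip through, while extruding along the averaged vertex normal keeps the cage welded:

```python
# Sketch: per-face extrusion vs averaged-normal (cage) extrusion at a hard edge.
import math

def add(a, b): return tuple(x + y for x, y in zip(a, b))
def scale(v, s): return tuple(x * s for x in v)
def normalize(v):
    l = math.sqrt(sum(x * x for x in v))
    return tuple(x / l for x in v)

vertex = (0.0, 0.0, 0.0)               # a vertex on a 90-degree hard edge
n_top, n_side = (0, 0, 1), (1, 0, 0)   # the two face normals meeting there
d = 0.5                                # extrusion distance

# Per-face extrusion: the shared vertex ends up in two different places.
p_top = add(vertex, scale(n_top, d))
p_side = add(vertex, scale(n_side, d))
gap = math.dist(p_top, p_side)

# Averaged-normal extrusion: one welded cage vertex for both faces.
n_avg = normalize(add(n_top, n_side))
p_cage = add(vertex, scale(n_avg, d))

print(gap)      # nonzero: the per-face "cage" has split open here
print(p_cage)   # single cage position shared by both faces
```

The global cage extrusion slider mentioned above is doing the averaged version, which is why it behaves better than plain ray distances on meshes with hard edges.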
I'm getting seams along my UV shells; it looks to me like a problem with the smoothing groups during baking.
My workflow is basically this:
- Unwrap the lowpoly
- Use TexTools UVs->SmoothingGroups tool to break my smoothing groups along the UV shells
- Duplicate the mesh, add a push modifier and collapse it to get the cage
- Send both off to XNormal for baking
I've done this tons of times, I'm not sure what's gone wrong here. Anyone have any advice?
Edit: Solved this by unchecking smoothing groups in my FBX cage export.
Apparently unifying the cage normals or setting them all to a single smoothing group was not enough for this model; the cage had to be exported without SGs. Leaving the post up in case anyone else has similar problems.
Hi, I've been having trouble with normal map baking lately, so I decided to go back to basics, and I'm still having problems:
I am exporting everything from Maya using the .SBM format, my UVs are split, and the edges are hard.
It looks like the rays aren't being shot backward, only forward (I've noticed this problem on other meshes as well). Any idea what's causing that and how I could fix it?
Replies
My thread here: http://www.polycount.com/forum/showthread.php?p=2084712
Change the renderer. Height maps don't take much time anyway (unlike AO or cavity).
Thank you in advance!
Have you checked whether it happens in different engines too?
You can see the waves in Maya, Marmoset Toolbag 2, and so on.
The problem is in the normal map itself. But what's wrong, even with all options at default?
Is it possible to bake an AO map to your HP's vertex colors so you can transfer it to your LP? If so, do you have to UV your HP?
Reed
If you want to have a closer look you can get the scene and files here: https://dl.dropboxusercontent.com/u/45171669/Polycount/XLR/XLR.zip
Thanks
Has anyone experienced this before?
Please respond urgently! Thank you!!
I also came across some stuff in the UE4 tutorials that mentions using one smoothing group.
https://docs.unrealengine.com/latest/INT/Engine/Content/Types/Textures/NormalMaps/Creation/index.html
Thanks for clearing that up for us.
They say that you should set the scale to 16 and both ray distances to 50, and go from there.
http://docs.cryengine.com/display/SDKDOC3/Ambient+Occlusion+and+Normal+map+bake+using+Xnormal
If so, that's down to how the cage works. With this particular low-poly mesh, there's no real easy way around it.
You could bevel the edges and then, in the cage, only push the main faces straight outwards.