Using Maya for modeling and X-Normal for baking, I always advise using one smoothing group. With one smoothing group, your low-poly will inherit all the detail from your high-poly whether it be soft edges or hard edges. I always run into issues when I try to bake with multiple smoothing groups.
However, if I am not going to be baking and just using height maps, that is when I will use multiple smoothing groups.
You need to set up a cage in xNormal's 3D viewer, or export a cage (envelope) from Maya; using the basic ray distance setting in xNormal will give you gaps along your hard edges. This was explained a bit in this thread, and you can find some more info here too (read the whole thread): http://www.polycount.com/forum/showthread.php?t=81154
So I still don't quite understand this. When they say Synced Normals, what are they Synced to? xNormal? Did they basically make it so the normals in unreal match the normals generated by xNormal? What about the ones generated in Max? What's the difference anyway?
AFAIK, Epic isn't calling this "synced"; they're calling it "improved". The OP called it synced. From everything I know about it, there are some deep-down technical issues that prevent "proper" syncing, but even then, a nice improvement over the old workflow is very welcome.
Earthquake, I understand you're skeptical, but you should totally give this a try and see what kind of results you get.
Well, considering that mbullister's comparison uses my "worst case" mesh from one of the older threads, I'm not sure what more I could do to try it out. From what I've seen here, on unrigged meshes it's a massive improvement, but I don't really know anything about rigging etc., so someone else will have to test that stuff out anyway.
Though maybe I can find some time to try out a more realistic asset, or at least provide an asset for other people to play around with that's a bit more representative of what would end up in a game.
For me, the cases where this sort of thing would really show a big impact would be rigged/animated weapon models, so if the results there haven't really improved that will be disappointing. It would be good to know if there is anything extra people should be doing with animated meshes and this workflow.
I know you probably think I'm being a bit of a wet blanket here, but I'm just really curious to see the full extent of how this works as the example in the official doc wasn't all that in depth.
I just want to touch on this quickly. In general terms this is bad advice. Even with a perfectly synced workflow, you still want to split your edges/SGs at your UV borders. Because:
1. It's free; there is no extra vertex cost.
2. Your normal maps will have fewer artifacts, even with a 100% synced workflow.
3. Your normal maps will generally compress better (fewer crazy gradients, etc.).
4. Reuse may be easier in some instances.
5. A script can do this for you in 1 second in Max/Maya.
And there are literally no cons. The only possible con is if you've got a weird baking workflow where your cage is not averaged (i.e. the offset method in Max, or not using a proper cage in xNormal); then your projection mesh gives you a gap at your hard edges. But nobody should be baking like this.
TL;DR: "Using one smoothing group" isn't the point of a synced workflow, it is easier to do with a synced workflow, but it isn't the end goal.
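To make point 1 above concrete, here is a small, tool-agnostic Python sketch (toy data, not from any real mesh or DCC app) of how GPUs count vertices: one vertex per unique position/UV/normal combination. A hard edge placed exactly on a UV seam reuses the split the seam already paid for, so it adds nothing.

```python
# Illustration of point 1: the GPU stores one vertex per unique
# (position, UV, normal) combination, so a hard edge costs extra
# vertices ONLY where the UVs are continuous. On a UV seam the
# vertices are already duplicated, so hardening that edge is free.

def gpu_vertex_count(wedges):
    """wedges: one (position_index, uv, normal) tuple per face corner."""
    return len(set(wedges))

N_UP, N_SIDE = (0, 0, 1), (1, 0, 0)

def two_tris(uv_b, n_b):
    """Two triangles sharing the edge between positions 1 and 2.
    The second triangle's shared-edge UVs and normal are parameterized
    so we can toggle the seam and the hard edge independently."""
    (uv1b, uv2b) = uv_b
    return [
        (0, (0.0, 0.0), N_UP), (1, (1.0, 0.0), N_UP), (2, (1.0, 1.0), N_UP),
        (1, uv1b, n_b), (3, (2.0, 0.0), n_b), (2, uv2b, n_b),
    ]

cont = ((1.0, 0.0), (1.0, 1.0))   # same UVs as the first triangle
seam = ((0.0, 0.0), (0.0, 1.0))   # shared edge mapped elsewhere in UV space

print(gpu_vertex_count(two_tris(cont, N_UP)))    # continuous UVs, smooth: 4
print(gpu_vertex_count(two_tris(cont, N_SIDE)))  # continuous UVs, hard:   6
print(gpu_vertex_count(two_tris(seam, N_UP)))    # UV seam, smooth:        6
print(gpu_vertex_count(two_tris(seam, N_SIDE)))  # UV seam, hard:          6 (free)
```

A hard edge on continuous UVs costs two extra vertices here, but on the seam the count stays at six either way.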
I tested smoothing groups along UV texture border edges vs. one smoothing group quite extensively a few months ago, on statics within the UDK (from xNormal and Maya bakes with custom cages). I was less worried about the compression artifacts and more concerned about the seams once imported in-game. I honestly thought I would notice fewer seams with the additional smoothing groups, but the results were quite inconsistent, even across similar assets (similar topology, UVs, etc.). I should also note my border pixel bleeding was plenty adequate.
I isolated the assets with just normal maps and standard lighting (no post either), so that I wasn't being confused by Lightmass seams and additional elements that could have skewed the results. Maybe this was me losing my mind, or something else happening, but I also noticed some of the seams reversing after updating drivers and switching to a newer UDK (I think Dec to Feb). By reversing I mean I had to switch from the one smoothing group bake to the UV border edge method, and vice versa, after upgrading to re-fix the seams. I wish I had more time to test this, and even had some examples to show you, but I'm a bit swamped. Hopefully when I have time I can try out this July UDK and compare it to my prior results and bakes.
By the way, your explanation makes sense to me, and that's why I was puzzled by my results. It's certainly possible I screwed something up during the process, but I definitely dedicated a lot of hours to testing multiple setups and configs through xNormal and Maya into UDK via FBX (even multiple versions of FBX). During those tests I would import the assets from scratch and overwrite completely, in case there were any bugs related to re-imports (we had this a few years ago at my last studio). I should also note that I encountered some funky bugs with normal maps in DX11 mode that nobody else had reported, AFAIK (I have a fairly rare video card among game artists: a Quadro 5000).
5. A script can do this for you in 1 second in Max/Maya.
Earthquake, any chance you could link me to this MEL script? I'm currently doing lots of mechs in an in-house engine (which doesn't have these synced normals :.( ) and manually splitting the UVs is killing me. I tried to look for the script online but can't find anything referencing it.
Hey, I'm not trying to sprinkle hate on anything, but I don't necessarily agree with using xNormal as a baker.
A lot of the pipeline stuff I was taught at the old outsourcing studios showed me ways to COMPLETELY speed up the workflow via certain bake settings and exports, etc.
I won't go into the details, but I'm more concerned that these synced normals only come out of xNormal, as it said.
Do similar results come out of baking in Max? Or in Maya? Or is it strictly xNormal that gives the "best" results, as they showed?
It does appear to be that way, however if you use the SBM exporter from Max for instance, you can use all the max tools for setting up the cage etc etc that speeds things up a bit over XN's cage editor. Just make sure to triangulate before exporting or the cage gets wanked.
Ah crap, last time I messed with xNormal it still couldn't do unexploded bakes based on material IDs like Max can; that was kinda crucial to my workflow for a few maps I would bake (and overall not killing myself, for similar results, by exploding my high and low meshes).
If that was fixed somehow I would be willing to give xNormal a chance again, but honestly I still feel that having to work outside of your 3D modeling application to achieve results is just bad practice...
A better example: it's like someone telling me to go model in Modo, export to Maya to get it to calculate triangles right, export that as FBX to Max, bake in Max, and then export to Unreal/Unity/CryEngine. Sure it will work, but those are just needless steps in my opinion (OK, maybe I'm exaggerating, but you get the idea of what I mean; it's just a pain to work with more tools to achieve the same result).
What exactly are you using multi IDs for? If it's masking, why not write a script that will take the RGB from your material and instead lock it into your vertex color, and bake that out as your mask? Since, last I checked, you still need to clean up stuff in UDK for multi IDs.
If, on the other hand, you could give an example of why you need them and how it directly affects the workflow... then of course, why not bake out your NMs in XN and your IDs in something else?
Also, you need to realize that so far xNormal is the cleanest software in terms of normal maps, not to mention speed; in other apps you need to sacrifice certain settings. In my case in Max, I get these weird layered lines like a mountain range unless I really crank up the settings, with Maya coming in at a close second and the rest following in a mess (PS: I mean xNormal and Maya are the only two decent bakers so far, with Max and the rest really falling behind in terms of 'logic' math).
Lastly, I don't see how using an 'outside' package to get stuff done is bad practice, especially for something that relates only to baking out maps. It's kinda like telling me "You should only learn how to render in Mental Ray instead of creating your own shaders", or that using scripts vs. doing everything with the standard tools is bad. You're not rigging models or anything complex that might break from one package to another.
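As a rough illustration of the vertex-color masking idea mentioned above (the mapping table and function name are made up for this example; this is not any real Max/Maya API), the script would just turn each face's material ID into a flat R, G, or B color, which then gets baked out as a mask texture:

```python
# Hypothetical sketch: replace multi material IDs with a baked RGB mask.
# Each material ID maps to one flat color channel; the engine can then
# use the baked mask to blend materials instead of needing multiple IDs.

ID_TO_COLOR = {1: (255, 0, 0), 2: (0, 255, 0), 3: (0, 0, 255)}

def ids_to_vertex_colors(face_material_ids):
    """Per face, return the mask color assigned to its material ID."""
    return [ID_TO_COLOR[mid] for mid in face_material_ids]

# Four faces: two of material 1, one each of materials 2 and 3.
print(ids_to_vertex_colors([1, 1, 2, 3]))
# [(255, 0, 0), (255, 0, 0), (0, 255, 0), (0, 0, 255)]
```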
No no, you're missing the point. In Max you can tell it to have the rays only hit matching material IDs. Usually you'd have to explode your mesh. But with this technique you just assign the same material ID to the highpoly as a piece of the lowpoly, and you don't get weird ray intersections. It's the best way I know of to get really clean bakes.
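The ID-matched projection described here can be sketched in a few lines of illustrative Python (hypothetical data, not Max's actual ray code): instead of taking the nearest high-poly hit, the baker discards any hit whose material ID differs from the low-poly face being baked, so overlapping parts don't bleed into each other.

```python
# Sketch of "rays only hit matching material IDs": filter candidate ray
# hits by ID before picking the nearest one, so a nearby part with a
# different ID can't intersect the projection.

def pick_hit(low_face_id, hits):
    """hits: list of (distance, material_id) candidates a ray found.
    Returns the nearest hit with a matching ID, or None."""
    matching = [h for h in hits if h[1] == low_face_id]
    return min(matching, default=None)

# A ray fired from a low-poly face with ID 2 first grazes a nearby part
# with ID 1, then reaches the intended high-poly surface with ID 2:
hits = [(0.02, 1), (0.05, 2), (0.30, 2)]
print(pick_hit(2, hits))   # (0.05, 2) -- nearest *matching* hit wins
print(pick_hit(1, hits))   # (0.02, 1)
```

Without the ID filter, the (0.02, 1) hit would win and the wrong part's detail would end up in the bake; with it, no exploding is needed.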
Edit:
Since we're on the topic of xNormal's bakes, how do you guys handle its cage? That's the reason I still prefer to use Max to bake normals. You can just reset the cage in the Projection modifier, and then Push it outwards and you get 99% of the way there. Then you can just tweak the points. Is there a way to do this in xNormal? Just having it use the closest ray hit is kinda bad.
In Maya, I just duplicate my mesh and move verts along their normals, then export that as the cage. Even in xNormal when you create a cage, you still have to save it out and load it into the low poly cage slot to be able to use it.
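For anyone unsure what the "push verts along their normals" cage actually is, here is a tool-agnostic Python sketch (hypothetical data; real cages come from Max's Projection modifier or a duplicated Maya mesh). The key detail, which also explains the gap problem mentioned elsewhere in the thread, is that each vertex is pushed along its *averaged* normal, so the cage stays watertight across hard edges.

```python
# Minimal cage construction sketch: offset every vertex outward along
# the average of ALL face normals touching it. If each face pushed its
# own corners along face normals instead, the cage would split open at
# hard edges and the bake would show gaps there.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def build_cage(vertices, vertex_face_normals, push=0.1):
    """vertices: list of (x, y, z) positions.
    vertex_face_normals: per vertex, the normals of every face touching
    it (hard edges contribute several normals per vertex).
    Returns the cage vertex positions."""
    cage = []
    for pos, normals in zip(vertices, vertex_face_normals):
        # Average across smoothing splits so the cage stays closed.
        avg = [sum(n[i] for n in normals) / len(normals) for i in range(3)]
        avg = normalize(avg)
        cage.append(tuple(p + push * a for p, a in zip(pos, avg)))
    return cage

# A cube corner vertex touched by three perpendicular faces:
verts = [(1.0, 1.0, 1.0)]
face_normals = [[(1, 0, 0), (0, 1, 0), (0, 0, 1)]]
print(build_cage(verts, face_normals, push=0.1))
```

The corner moves diagonally outward along (1,1,1) normalized, i.e. each coordinate grows by 0.1/sqrt(3), which is exactly what "reset the cage, then push it outwards" does in Max.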
Yeah, BigJohn called it. As I said, I don't want to turn this into the normal map witch hunt that some other threads turn into. All I'm getting at is that I'd rather not have to leave my 3D modeling package to do something that's already built in, and change current workflows to integrate another tool. That's where the bad practice comes in: using something outside your workflow for a one-off result is fine, but making that part of a major pipeline is another story. Say, oh I dunno, EA all switched to using xNormal instead of Max or Maya for baking; or go even further and say they have an internal baker like id does/did (I dunno if they still do, I just remember it from some work I did with them a long time ago). Now you're changing your workflow to fit the needs of the program, instead of the program fitting your needs.
In my case, if Epic decided xNormal is the base for all of their normals, then xNormal should become less of a cheap/free app, get full support from a company (Autodesk), and be integrated into Max and Maya as a new renderer. That way I don't have to sacrifice any of my workflow because someone decided otherwise for their game engine. Make tools that are universally used, is all I'm getting at. If I was a Maya user I would be harping on the same thing, because I'd probably have a very specific workflow for whatever game production I was doing at some all-Maya company. Whereas right now mine is tailored for Max from previous/current jobs (well, my current one is more engine work than real 3D, but you get the idea).
Anyway, the short answer: AWESOME, EPIC!!!
The long answer: EPIC, GET AUTODESK TO INTEGRATE XNORMAL'S MATH INTO THEIR PRODUCTS!!!
It would be cool if there were a way to switch between different types of tangent calculation in Autodesk apps, not just use what xNormal uses.
Also, I don't see a huge problem in using xNormal within a pipeline. A lot of the time your DCC app is shit at doing something, be it UV mapping, rendering, etc., and plugins are often used to supplement it.
I could see a problem if xNormal cost something, but it's free even in commercial situations. It flies through AO maps, and it also has the ability to bake a whole lot of other maps that require a bunch of setup in most DCC apps.
Don't get me wrong, I am pretty averse to introducing new tools to a pipeline, but Epic just introduced the use of xNormal normals in the July beta, so if you're in production right now, you don't need to worry about it anyway. In any case, this is a hell of a lot better than it was before, when the recommendation was to waste time adding a whole bunch of supporting geometry to limit artifacts, which one can continue to do if they so desire.
An alpha of xNormal 4 is planned to drop before the end of the year, and maybe it will be better suited to integration with the DCC apps via some sort of bridge, like GoZ does with ZBrush.
Seforin, I understand your general complaint, but for studios with a mostly Maya-focused pipeline, baking with xNormal not only gets you pretty nice results (especially AO); the multi-threaded support also makes quite a big difference speed-wise compared to Maya's single-threaded baking. On top of that, it's free, of course. Maya integration would be nice though (maybe a "send low-poly + cage to xNormal" function). It does seem to get a little overwhelming these days, how many specialized apps we game artists get tempted into adding to our workflows.
I don't really find the quality of xNormal's AO bake all that great. Another reason why I don't use it. I think at best it's comparable to Mental Ray or Max's Skylight. But it's not as good as a vRay irradiance bake.
And as far as I know all of those are multithreaded (maybe not the Max skylight as it's using the scanline)
But really the cage thing is what kills it for me.
BigJohn - xNormal has cage editing, you just have to use the weird first person 3D preview interface to do it. It's as clunkily charming as the rest of xNormal.
Big John I haven't tried Vray's irradiance bake so maybe I don't know what I'm missing. Yeah it's pretty sad that maya still doesn't have multi-threaded map bakes (unless 2013 added this and I didn't realize it).
I don't really find the quality of xNormal's AO bake all that great. Another reason why I don't use it. I think at best it's comparable to Mental Ray or Max's Skylight. But it's not as good as a vRay irradiance bake.
xNormal's AO quality is great. If you want a broad wash of AO like 3ds Max's skylight or Vray's irradiance, you'll just need to configure it and set up a proper ground plane and/or sphere environment.
Btw, last time I checked, the irradiance map was an interpolated solution; why would you want to use interpolation when baking maps? Also, not everyone can keep a dedicated and expensive rendering engine around just for baking ambient occlusion maps.
Edit:
Since we're on the topic of xNormal's bakes, how do you guys handle its cage? That's the reason I still prefer to use Max to bake normals. You can just reset the cage in the Projection modifier, and then Push it outwards and you get 99% of the way there. Then you can just tweak the points. Is there a way to do this in xNormal? Just having it use the closest ray hit is kinda bad.
You can use the cage editor in XN, but it's pretty painful. Just use the SBM exporter and set your cage up in Max; really easy.
Hi EarthQuake, I gotta ask: what do you guys at 3Point commonly use for baking out your stuff? Vahl told me it was a toss-up between Maya and Max depending on your client, but which is the more common one you guys handle? Because from the way you and the other guys at 3Point talk about baking, I rarely hear any of you mention using xNormal (if I'm talking shit or wrong, please let me know). And to follow up on what you said: I remember before, I would do that and it would never freakin' work right. Exporting the SBM, it would never read my cage setup in Max (we're going back a year or two, so if it's improved, I hope it's much less painful, because cage editing is the big deal breaker for me, since I deal with TONS of hard surface models vs. organics/characters).
We use whatever our clients request; we don't really have an "in-house" method, though Max, Maya and xNormal are really the only methods requested. Usually our clients have very specific requirements for baking etc., and we try to stick to those workflows as much as possible. In an ideal world, we use whatever the client's engine is synced with, but this isn't always possible. On Brink we used Maya because SD's pipeline was synced to that, but I can't really talk about anything else due to NDAs (and I only mention Brink because it's been brought up before by MoP etc.).
With the SBM exporter, the easiest thing to mess up is forgetting to triangulate your mesh. Throw an Edit Mesh modifier (or whichever one triangulates) on before exporting; if you don't, the cage won't be exported correctly. Other than that... it just works. Make sure you have the correct options checked in the exporter, etc. Post in the XN thread if you're still having issues.
It's a matter of preference, really. I'm not trying to say xNormal's bakes suck, just that I don't like it. I can't seem to get the high-poly detail to show as clearly as I can with Max. Also, when I wanna bake generic AO from just the low-poly without a high-poly there, using Vray's irradiance is the only way I know of to get the renderer to take the normal map into account. I don't believe xNormal can do that.
xNormal should become less of a cheap/free app, get full support from a company (Autodesk), and be integrated into Max and Maya as a new renderer.
Hey man, I'm not trying to be a jerk. The cheapskate in me says "Great, we have a simple solution."
But the business professional in me says: "This is a free app that some guy happens to develop in his spare time, for free, that is now being used by a MAJOR developer in the 3D game/movie/engine world. So if his product has ISSUES at one point or another and people need support on it, he can just give up on it and say fuck off to the world, and then we are all screwed. Whereas if Autodesk owned said product and integrated it, and I was working for a company that has inside support with them, then we'd have access to a team to keep developing the toolset, or flat-out support as part of a product."
Anyway, I feel I'm derailing this thread. Overall I'm VERY happy that UDK has synced tangents. It's a shame it's in a 3rd-party app, but whatever; if it works and it's available, we can't really complain, right? I just need to get used to the idea of exploding my bakes, exporting things pre-triangulated, and combining all sorts of instanced meshes by hand so it won't try to bake over a hundred pieces or something. It's nothing bad; I just need to get used to a different workflow. Off topic, but I would love to see the workflow of some people who handle REALLY highly detailed hard surface models in xNormal; every example I've seen has been for organics or characters, and every other test is usually a cube (nothing complex).
@John: The settings for xNormal right off the bat are pretty lousy; for example, set your AO bake to something like Cosine within the 120-180 ray range. The defaults aren't that grand, and it doesn't help that most tutorials don't cover this stuff in depth; you need to read the PDF, which gets installed in the xNormal directory and isn't available as a quick separate download.
The same can be said of its normals: you gotta tinker with the settings.
@Seforin: We can all see how well it went with Mudbox: half of the features are lagging behind schedule, those that do come out usually arrive in SP-pack form, and out of the blue, just because Maya and Mudbox shared the same tangents, Autodesk decided to say "Oh, you know what, Maya tangents are the new standard in the industry", when so far Maya is only the 'cleanest' within its own app and maybe in a couple of engines, because people worked around it, not because Autodesk knew what they were doing.
Oh, also, somehow they started a war with Mari over Mudbox's giga-texture stuff, which nobody in the majority asked for apart from a few crazy peeps; and yet somehow using Dynamesh in ZB and vertex painting from there feels easier... I mean, hell, half of the current texturing tools in Mudbox are only truly bug-free if you texture and keep on texturing in Mudbox. Good luck importing symmetry models with a premade texture from Canvas or PS, 'cause Mudbox will claim it can't read the coordinates correctly unless you start from within it.
Hell, Morten even offered up the code and all to the public, fully open, and even showed it to Autodesk, yet AD refuses to even acknowledge that it exists; I mean, they literally said "Oh cool, so anyhow, we use Maya all the time for our tangents". Same with shaders for the 2.0 viewports: they said "Oh cool, you peeps wanna work with us to get these issues solved, come on in", only to go silent on the matter later on and make no progress on the issues.
The problem is that the code and information for xNormal's normal map tangents are there, out in the open; any person with half a brain can access them right now. It's just that AD doesn't want to use it or acknowledge it, for some odd reason.
And not every company out there will have access to both apps, especially indies, so XN is the safest bet. If AD only got their heads out of their arse for a nanosecond and saw the benefits of having a streamlined pipeline, especially from a visual perspective, then we wouldn't be in this mess we're in right now. I mean, 3,000 bucks ain't cheap, nor is a 1,000 buck upgrade price tag in some cases; that thing can bleed you dry real quick if you're not prepared. And frankly, when you have Blender performing better than a 3,000 buck program, you know your studio did something very, VERY wrong.
BTW, I hope I don't come off as someone who is shouting at you; I'm just very much vexed at how free software is better than paid software, thanks to the laziness of the industry.
Well, it's one of the reasons I disliked that XSI got bought by Autodesk: sure, they got more money to work with, but progress seemed to slow down, with fewer ideas and more "make it look the same".
I mean, 3,000 bucks ain't cheap, nor is a 1,000 buck upgrade price tag in some cases; that thing can bleed you dry real quick if you're not prepared. And frankly, when you have Blender performing better than a 3,000 buck program, you know your studio did something very, VERY wrong.
I think it's more of a comfort thing, like Microsoft Office. They don't have to try anymore, so they don't. That's one of the reasons I'm really championing Modo right now.
Luxology are hungry and want your business. Every release comes with a ton of great features and new stuff.
Autodesk has been phoning it in for years.
EDIT: Oh, and Modo is 1/3 of the price of Max. It's a no-brainer to me.
Yeah, if you compare Autodesk to Luxology, it shows how lazy AD is. Modo has to make an effort to fight for the customer; AD just releases new features without even checking if they work... When they announced the new UV unwrap features, everyone was excited... then they released it and it was unusable. Ohhh AD, why so... AD.
I hope Modo will become more friendly for game art, though. I usually need to go through Modo > Max > xNormal to get a low poly with good maps and smoothing. Whole CG and game industry: sort your shit out!!!
The only thing Modo really needs is proper support for smoothing groups. I know they announced them in the last version but, no, that's an unusable train wreck. But I have faith they'll sort it out ... until then, I have to cut/paste the pieces of my model where I want hard edges. Not the end of the world but a lot less friendly than the Max Edit Poly way.
until then, I have to cut/paste the pieces of my model where I want hard edges. Not the end of the world but a lot less friendly than the Max Edit Poly way.
If you're using that mesh to bake you really should not be doing that, you'll get bad seams from manually detaching geometry because of gaps in the projection mesh. Same thing as using a basic ray distance vs a proper cage.
@John: The settings for xNormal right off the bat are pretty lousy; for example, set your AO bake to something like Cosine within the 120-180 ray range. The defaults aren't that grand, and it doesn't help that most tutorials don't cover this stuff in depth; you need to read the PDF, which gets installed in the xNormal directory and isn't available as a quick separate download.
The same can be said of its normals: you gotta tinker with the settings.
Thanks for the advice man. I'll try it again. Admittedly it's been quite a while since I used xNormal. I couldn't get a decent result when I first used it, so I never got back to it. So I'm probably all full of shit on this one.
What do you recommend as the settings for the Normal-Map bake? The settings dialog for it seems very simple. Are there other settings elsewhere that affect it?
I tried it with a simple beveled cube baked down to a hard-edged cube, and it seems to be OK, but I was hoping for other people's opinions on it.
I'm guessing so because of:
"This workflow currently relies on using XNormal to bake normal maps but produces much higher quality shading than before"
...but haven't tried it yet
This. SBM is really comfy for exporting a cage with the low-poly, and it has never caused me any problems.
EarthQuake, any chance you could link me to this MEL script? I'm currently doing lots of mechs in an in-house engine (which doesn't have these synced normals :( ) and manually splitting the UVs is killing me. I tried to look for the script online but can't find anything referencing it.
Maybe EarthQuake has a better one, but I use this for Maya 2010 and it works pretty well. Just add it to your shelf or a hotkey as MEL (not Python):
string $objList[] = `ls -sl -o`;
string $uvBorder[];
string $edgeUVs[];
string $finalBorder[];
for ($subObj in $objList) {
    select -r $subObj;
    // Unlock vertex normals so soft/hard edge changes actually take effect.
    polyNormalPerVertex -ufn true;
    // Soften everything first, then re-harden only the UV border edges.
    polySoftEdge -a 180 -ch 1 $subObj;
    // Select all UVs, then shrink the selection to the UV shell borders.
    select -r ($subObj + ".map[*]");
    polySelectBorderShell 1;
    $uvBorder = `polyListComponentConversion -te -in`;
    $uvBorder = `ls -fl $uvBorder`;
    clear( $finalBorder );
    for ( $curEdge in $uvBorder ) {
        $edgeUVs = `polyListComponentConversion -tuv $curEdge`;
        $edgeUVs = `ls -fl $edgeUVs`;
        // An edge whose two vertices use more than 2 UVs sits on a UV seam.
        if ( size( $edgeUVs ) > 2 ) {
            $finalBorder[ size( $finalBorder ) ] = $curEdge;
        }
    }
    if ( size( $finalBorder ) > 0 )
        polySoftEdge -a 0 -ch 1 $finalBorder;
}
select -r $objList;
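For anyone reading the script who doesn't speak MEL, the key test is the `size($edgeUVs) > 2` check: an edge only touches two vertices, so if converting it to UVs yields more than two UV points, those vertices are split in UV space and the edge sits on a shell border. Here's a minimal pure-Python sketch of that classification; the data layout and names (`edge_to_uvs`, `find_seam_edges`) are illustrative, not any Maya API:

```python
# Pure-Python sketch of the test the MEL script above performs.
# An edge is shared by two vertices; if those vertices map to more
# than two UV points across adjacent faces, the edge sits on a UV
# seam and should be hardened before baking.

def find_seam_edges(edge_to_uvs):
    """edge_to_uvs maps an edge id to the set of UV ids its two
    vertices use across all adjacent faces."""
    return [edge for edge, uvs in edge_to_uvs.items() if len(uvs) > 2]

# Two quads sharing one edge: if the shared edge's vertices are split
# in UV space, each vertex contributes two UV points (4 total).
edges = {
    "e0": {"uv0", "uv1"},               # interior edge, UVs welded
    "e1": {"uv2", "uv3", "uv4", "uv5"}, # UV border edge, UVs split
}
print(find_seam_edges(edges))  # ['e1']
```

On the toy data only the split edge comes back, which is exactly the set the script hardens.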
A lot of the pipeline stuff I was taught in the old outsourcing studios showed me ways to COMPLETELY speed up workflow via certain bake settings, exports, etc.
I won't go into the details, but I'm more concerned that these synced normals only seem to come out of xNormal, as the doc said.
Do any similar results come out of baking in Max? Or in Maya? Or is it strictly xNormal that gives the "best" results, as they showed?
Sure: https://dl.dropbox.com/u/499159/UVShellHardEdge.mel
I believe Mop wrote this BTW, and I haven't tested it in anything newer than 2008.
It does appear to be that way; however, if you use the SBM exporter from Max, for instance, you can use all the Max tools for setting up the cage etc., which speeds things up a bit over XN's cage editor. Just make sure to triangulate before exporting or the cage gets wanked.
Ah crap, last time I messed with xNormal it still couldn't do unexploded bakes based on material IDs like Max could; that was kinda crucial to a workflow for a few maps I would bake (and overall to not killing myself exploding my high and low meshes for similar results).
If that was fixed somehow I would be willing to give xNormal a chance again, but honestly I still feel that having to work outside of a package to achieve results, in a way that isn't native to your 3D modeling application, is just bad practice...
A better example: it's like someone telling me to go model in Modo, export to Maya to get it to calculate triangles right, export that as FBX to Max, then bake in Max, then export to Unreal/Unity/CryEngine... sure it will work, but that's just needless steps in my opinion (OK, maybe I'm exaggerating, but you get the idea of what I mean; it's just a pain to work with more tools to achieve the same crap).
/end of comment/rant/
If you could give an example of why you need them, on the other hand, and how it directly affects the workflow... of course, why not bake out your normal maps in XN and IDs in something else?
Also, you need to realize that so far xNormal is the cleanest software in terms of normal maps, not to mention speed. In other apps you need to sacrifice certain settings; in my case in Max, I get these weird layered lines like a mountain range unless I really crank up the settings, with Maya coming in at a close second and the others following in a mess. (PS: I mean xNormal and Maya are the only two decent bakers so far, with Max and the rest really falling behind in terms of the 'logic' math.)
Lastly, I don't see how using an 'outside' package to get stuff done is bad practice, especially for something which relates to only baking out maps? It's kinda like telling me "You should only learn how to render in Mental-Ray instead of creating your own Shaders" or that using scripts vs. doing everything by standard tools is bad. You're not rigging models or anything complex, where it might break from one package to another.
Edit:
Since we're on the topic of xNormal's bakes, how do you guys handle its cage? That's the reason I still prefer to use Max to bake normals. You can just reset the cage in the Projection modifier, and then Push it outwards and you get 99% of the way there. Then you can just tweak the points. Is there a way to do this in xNormal? Just having it use the closest ray hit is kinda bad.
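For what it's worth, the "push" operation being described isn't magic: a cage is essentially the low-poly with every vertex offset outward along its averaged vertex normal, and tweaking points afterwards is just per-vertex offsets on top of that. A minimal sketch of the idea in plain Python (this is an illustration of the concept, not Max's Projection modifier or xNormal's cage editor; `push_cage` is a made-up name):

```python
# Minimal sketch of what "pushing" a cage does: offset each low-poly
# vertex along its (normalized) vertex normal by a fixed distance.
# The resulting envelope bounds the high-poly so projection rays cast
# inward from it instead of using a blind ray distance.

def push_cage(vertices, normals, distance):
    cage = []
    for (vx, vy, vz), (nx, ny, nz) in zip(vertices, normals):
        # Normalize the vertex normal before offsetting.
        length = (nx * nx + ny * ny + nz * nz) ** 0.5
        nx, ny, nz = nx / length, ny / length, nz / length
        cage.append((vx + nx * distance, vy + ny * distance, vz + nz * distance))
    return cage

verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
norms = [(0.0, 1.0, 0.0), (0.0, 2.0, 0.0)]  # un-normalized is fine
print(push_cage(verts, norms, 0.1))
# [(0.0, 0.1, 0.0), (1.0, 0.1, 0.0)]
```

The per-point tweaks people do afterwards are just for spots where a uniform push either misses the high-poly or intersects neighboring geometry.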
Anyway, the short answer: AWESOME, EPIC!!!
The long answer: EPIC, GET AUTODESK TO INTEGRATE XNORMAL'S MATH INTO THEIR PRODUCTS!!!
It would be cool if there were a way to switch between different types of tangent calculation in the Autodesk apps, not just use what xNormal uses.
Also, I don't see a huge problem in using xNormal within a pipeline. A lot of the time your DCC app is shit at doing something, be it UV mapping, rendering, etc., and plugins are often used to supplement it.
I could see a problem if xNormal cost something, but it's free even in commercial situations. It flies through AO maps, and it also has the ability to bake a whole lot of other maps that require a bunch of setup in most DCC apps.
Don't get me wrong, I am pretty averse to introducing new tools to a pipeline, but Epic just introduced the use of xNormal normals in the July beta, so if you're in production right now you don't need to worry about it anyway. In any case, this is a hell of a lot better than it was before, when the recommendation was to waste time adding a whole bunch of supporting geometry to limit artifacts, which one can continue to do if they so desire.
An alpha of xNormal 4 is planned to drop before the end of the year, and maybe it will be better suited to integration with the DCC apps via some sort of bridge, like GoZ does with ZBrush.
And as far as I know all of those are multithreaded (maybe not the Max skylight as it's using the scanline)
But really the cage thing is what kills it for me.
Btw, last time I checked, irradiance map was an interpolated solution; why would you want interpolation when baking maps? Also, not everyone can keep a dedicated and expensive rendering engine around just for baking ambient occlusion maps.
You can use the cage editor in XN, but it's pretty painful. Just use the SBM exporter and set your cage up in Max, really easy.
Hi EarthQuake, I gotta ask: what do you guys at 3 Point commonly use for baking out your stuff? Vahl told me it was a toss-up between Maya and Max depending on the client, but which is the more common one you guys handle? Because from the way you and the other guys at 3 Point talk about baking, I rarely hear any of you mention xNormal (if I'm talking shit or wrong, please let me know). And to follow up on what you said: I remember before, I would do that and it would never freakin' work right. Exporting the SBM, it would never read my cage setup from Max (we're going back a year or two, so if it's improved, I hope it's much less painful, because cage editing is the big deal-breaker for me since I deal with TONS of hard-surface models vs. organics/characters).
We use whatever our clients request; we don't really have an "in-house" method, though Max, Maya and xNormal are really the only methods requested. Usually our clients have very specific requirements for baking etc., and we try to stick to those workflows as much as possible. In an ideal world, we use whatever the client's engine is synced with, but this isn't always possible. On Brink we used Maya because SD's pipeline was synced to that, but I can't really talk about anything else due to NDAs (and I only mention Brink because it's been brought up before by Mop etc.).
With the SBM exporter, the easiest thing to mess up is forgetting to triangulate your mesh. Throw an Edit Mesh modifier (or whichever one triangulates) on before exporting; if you don't, the cage won't be exported correctly. Other than that... it just works. Make sure you have the correct options checked in the exporter, etc. Post in the XN thread if you're still having issues.
It's a matter of preference really. I'm not trying to say xNormal's bakes suck, just that I don't like it. I can't seem to get the high-poly detail to show as clearly as I can with Max. Also, when I wanna bake generic AO from just the low-poly without a high-poly there, using V-Ray's irradiance is the only way I know of to get the renderer to take the normal map into account. I don't believe xNormal can do that.
Pretty sure both use MaxStar 2.5.
Nope, xNormal uses a tangent basis calculation created by Morten S. Mikkelsen, and I think Blender started using that too.
You might want to do a little more research on the topic at hand.
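For anyone wondering what a "tangent basis calculation" even is: per triangle, the tangent is the 3D direction that follows the U axis of the UV layout, found by inverting the 2x2 matrix of UV deltas. Below is only the textbook per-face formula, as a sketch (`triangle_tangent` is a made-up name); full implementations such as Mikkelsen's mikktspace additionally average per vertex, orthonormalize against the normal, and handle mirrored UVs, which is exactly where different bakers end up disagreeing:

```python
# Textbook per-triangle tangent: solve for the 3D direction that
# corresponds to +U in the UV mapping, given two triangle edges and
# their UV deltas. Not a full tangent-basis implementation.

def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    # Position and UV deltas along two triangle edges.
    e1 = [p1[i] - p0[i] for i in range(3)]
    e2 = [p2[i] - p0[i] for i in range(3)]
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    # Invert the 2x2 UV-delta matrix to express the U direction in 3D.
    r = 1.0 / (du1 * dv2 - du2 * dv1)
    return tuple(r * (dv2 * e1[i] - dv1 * e2[i]) for i in range(3))

# Axis-aligned triangle: U runs along +X, so the tangent points at +X.
t = triangle_tangent((0, 0, 0), (1, 0, 0), (0, 1, 0),
                     (0, 0), (1, 0), (0, 1))
print(t)  # (1.0, 0.0, 0.0)
```

The per-face part is the easy, agreed-upon bit; the averaging, handedness, and normalization choices are what make one app's tangents "synced" with one baker and not another.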
please don't give them any ideas
Hey man, I'm not trying to be a jerk. But the cheapskate in me says, "Great, we have a simple solution."
But the business professional in me says: this is a free app that some guy happens to do in his spare time, for free, that is now being used by a MAJOR developer in the 3D game/movie/engine world. That means if his product has ISSUES at one point or another and people need support for it, he can just give up on it, say "fuck off" to the world, and then we are all screwed. Whereas if Autodesk owned said product and integrated it, and I was working for a company that has inside support with them, then we'd have access to a team/development to upgrade the tool set, or flat-out just support that's part of a product.
Anyway, I feel I'm derailing this thread. Overall I'm VERY happy that UDK has synced tangents. It's a shame it's in a 3rd-party app, but whatever; if it works and it's available we can't really complain, right? I just need to get used to the idea of exploding my bakes, exporting things pre-triangulated, combining all sorts of instanced meshes offhand so it won't try to bake over a hundred pieces, and so on. It's nothing bad, just a different workflow to get used to. Off topic, but I would love to see some people who handle REALLY highly detailed hard-surface models and see their workflow for xNormal; every example I've seen has been for organics or characters, and every non-test is usually a cube (nothing complex).
The same can be said of its normals; you gotta tinker with the settings.
@Seforin: We all can see how well it went with Mudbox; half of the features are lagging behind schedule, and those that do come out usually arrive as service packs. And out of the blue, just because Maya and Mudbox shared the same tangents, Autodesk decided to say "Oh you know what, Maya tangents are the new standard in the industry," when so far Maya is only the "cleanest" within its own app, and maybe in a couple of engines because people worked around it, not because Autodesk knew what they were doing.
Oh, also, they somehow started a war with Mari over Mudbox's giga-texture stuff, which hardly anyone asked for apart from a few crazy peeps, and yet somehow using DynaMesh in ZB and vertex painting from there feels easier... I mean hell, half of the current texturing tools in Mudbox are only truly bug-free if you texture and keep on texturing in Mudbox. Good luck importing symmetry models with a premade texture from Canvas or PS; Mudbox will claim it can't read the coordinates correctly unless you start from within it.
Hell, Morten even offered up the code fully open to the public and even showed it to Autodesk, yet AD refuses to even acknowledge that it exists; I mean, they literally said "Oh cool, so anyhow, we use Maya all the time for our tangents." Same with shaders for the 2.0 viewports: they said "Oh cool, you peeps wanna work with us to get these issues solved, come on in," only to go silent on the matter later and make no progress on the issues.
The problem is that the code and information for XN's normal map tangents are out in the open; any person with half a brain can access them right now. It's just that AD doesn't want to use or acknowledge it, for some odd reason.
And not every company out there will have access to both apps, especially indies, so XN is the safest bet. If AD only got their heads out of their arses for a nanosecond and saw the benefits of having a streamlined pipeline, especially from a visual perspective, then we wouldn't be in this mess we're in right now. I mean, 3,000 bucks ain't cheap, nor is a 1,000-buck upgrade price tag in some cases; that can bleed you dry real quick if you're not prepared. And frankly, when you have Blender performing better than a 3,000-buck program, you know your studio did something very, VERY wrong.
BTW, I hope I don't come off as someone who is shouting at you; I'm just very much vexed at how free software is better than bought software, thanks to the laziness of the industry.
Luxology are hungry and want your business. Every release comes with a ton of great features and new stuff.
Autodesk has been phoning it in for years.
EDIT: Oh, and Modo is a third of the price of Max. It's a no-brainer to me.
I hope Modo will become more friendly for game art though. I usually need to go through Modo > Max > xNormal to get a low poly with good maps and smoothing. Whole CG and game industry: sort your shit out!!!
If you're using that mesh to bake you really should not be doing that, you'll get bad seams from manually detaching geometry because of gaps in the projection mesh. Same thing as using a basic ray distance vs a proper cage.
Is there another way via Modo? That smoothing group menu is just a flat out NO so I don't see another way to break smoothing other than cut/paste.
Thanks for the advice man. I'll try it again. Admittedly it's been quite a while since I used xNormal. I couldn't get a decent result when I first used it, so I never got back to it. So I'm probably all full of shit on this one.
What do you recommend as the settings for the Normal-Map bake? The settings dialog for it seems very simple. Are there other settings elsewhere that affect it?