
Blender 2.8 Baking OBJECT Space Normals

Gigante polycounter lvl 7
Hi
I am trying to use this workflow: https://youtu.be/Nz7oY1q470w 

TLDR:
Basically, you create your high poly and low poly as usual, but when it's time to lay out your UVs, you do it more aggressively (ignore hard edges, etc.), which results in fewer UV seams, which leads to a lower vertex count in game. And it all works because instead of baking a tangent-space normal map you bake an OBJECT-space normal map, then you take that object-space normal map into xNormal and convert it into a tangent-space normal map.

My problem is that I am not getting a proper object-space normal map in Blender.

This is what Blender "bakes" as an object-space normal map:



And this is what xNormal creates after converting that map into a tangent-space normal map:



As you can see it is all wrong.

I leave you with the blend file containing the objects I am having trouble with.

https://drive.google.com/file/d/1Fl8gG-sHRcrPhoNKJgqqCzQ_ercjS-Mb/view?usp=sharing

Thanks for your time.

Replies

  • pior
    "Basically you create your high poly and low poly as usual,but when is time to lay out your UVs, you make them more aggressively (Ignore hard edges etc) which results in less UV seams which leads to less vertex counts in game. And it all works because instead of baking a Tangent Space Normal Map you bake an OBJECT space normal map, then you take that object space normal map into xNormal and convert it into a Tangent Space normal map."

    This is basically all false :D  Or more precisely : you are imagining exotic workflows because you are thinking about all this backwards.

    It doesn't "all work" because of the OS map ; it works because whoever is using this workflow doesn't have a way to bake directly in the target TS space needed for their project. Just like when doing Dota assets, which require a very specific TS space that most baking tools don't know the math for, or the rare cases of some engines tied tightly to the deprecated ways Max or Maya output their TS maps - hence the need to bake to OS and then convert to TS in Handplane - or in XN, I suppose, using some special plugin converting to something other than Mikkt.

    Outside of these edge cases you don't need to do that at all (for instance, you absolutely do not need to do that for a Toolbag portfolio piece, or for anything made for Unity or Unreal, which is probably 99% of the cases).

    Blender bakes to Mikkt TS natively. Just do that.

    Also : even though it looks good with the TS map applied, that door example in the video is awful. Plenty of shading artifacts will show up once the texture gets reduced. The "gains" absolutely do not outweigh the cons of the asset behaving badly in practical ingame use. Unless I missed it, the asset has also not been triangulated before the conversion to TS, which is a recipe for disaster :D

    I'd make the wild guess that the person using this workflow is probably doing so because Modo doesn't offer an accurate way to bake TS maps in the first place. If it did, there would never have been a need for such a conversion at all (regardless of the asset having a lot of hard edges or not), as TS bakes would just work.

    Meanwhile, the Blender devs are always making a strong effort to follow standards. Don't embrace overly complicated workflows when you don't need them at all :)
  • Gigante
    Thanks for not answering the question and instead acting all snarky, very useful dude.  :smile:

    "This is basically all false :D  Or more precisely : you are imagining exotic workflows because you are thinking about all this backwards."

    Gee, I guess I imagined the entire video? Is it actually in my imagination then? Am I the only one that can see it? Pls confirm.

    "It doesn't "all works" because of the OS map ; it works because whoever is using this workflow doesn't have a way to bake directly in the target TS space needed for their project"

    I am pretty sure Modo can bake TS normal maps just fine; as you can see in the video (again, I don't know if I took my pills today or if it is actually real), he compares the TS map baked in Modo against the converted object-space map from xNormal. Timestamp is 33:12. https://youtu.be/Nz7oY1q470w?t=1992   Also, interestingly enough, he is using UE4 to demonstrate how it would look in game, and it looks great, not some engine that requires any special map baking.

    "Unless I missed it, the asset has also not been triangulated before the conversion to TS, which is a recipe for disaster :D"

    You missed it. I guess you didn't watch the video from the start? I'm not crazy, yay!

    Again, this is for static objects with no movement; it saves on vertex count, which is more important these days than polycount. Please try to answer the question of why my baked object-space normals look wrong in Blender, which was the actual question. I don't care if it's "not standard practice", that's on me.

  • pior
    Hi there man (dude).

    Well, there is a difference between being snarky ... and being blunt by going straight to the point in the hope of saving you hours of work later down the line. Let's unpack this.

    I'll reiterate : there is literally zero benefit to this workflow *if* the user has access to an A-to-Z pipeline in which TS normals are written and read in synced unison. And again, the statement below :

    "And it all works because instead of baking a Tangent Space Normal Map you bake an OBJECT space normal map" 

    ... is, indeed 100% false. It doesn't "all" work because of the conversion. It only "works" in the case of a pipeline that doesn't allow the user to write a TS map that is accurately written and/or read at the end of the pipe. I am not a Modo user hence I am not familiar with its limitations, but the fact that it can write TS maps "just fine" doesn't guarantee you that it does so in a way that is natively accepted by the end of a pipeline (game engine, renderer, and so on). TS calculations are complex and take more into account than one may think (I'll let more technically oriented people elaborate on that), and the proper shading from them also depends on the options that the end context allows you to tick on import. UE4, Unity and TB3 offer a lot of options in that regard but there is no guarantee that everything is covered - which is the very reason why a tool like Handplane exists in the first place.

    So my bet here is that in the case of Modo, some important information gets dropped or gets written in a non-standard way - hence the need to resort to such a workflow using OS maps as a clean proxy. Again : with a TS workflow properly synchronized from A to Z, that door here would have baked directly to a TS map perfectly readable by UE4 and SP with perfect shading even with aggressive UVs.

    Also : in a production environment, an asset like this would probably be flagged as needing to be redone, as it relies too strongly on the nmap info, making MIPs and LODs very unlikely to work well, if at all.

    Now all that said there are indeed some uses for the OS<>TS conversions : as a way to compensate for the lack of options in a baking tool, but also as a way to retrieve a good approximation of a lost highpoly model. Done that many times and it works fantastically well.
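    For anyone curious, the conversion itself is just a change of basis. Here is a minimal single-point sketch in Python/numpy of roughly what converters like xNormal or Handplane do per pixel (the function names are illustrative, not any tool's actual API):

```python
import numpy as np

def os_to_ts(n_os, tangent, bitangent, normal):
    # Rows of the TBN matrix are the tangent-frame axes expressed in
    # object space; multiplying projects the object-space normal onto
    # that frame, yielding the tangent-space normal.
    tbn = np.array([tangent, bitangent, normal], dtype=float)
    n_ts = tbn @ np.asarray(n_os, dtype=float)
    return n_ts / np.linalg.norm(n_ts)

def encode(n):
    # Map a unit normal from [-1, 1] into the [0, 1] texture range.
    return n * 0.5 + 0.5

# For a face whose tangent frame happens to align with the object axes,
# a straight-up object-space normal becomes the familiar "flat" pixel:
t, b, n = np.eye(3)
flat = encode(os_to_ts([0.0, 0.0, 1.0], t, b, n))  # -> (0.5, 0.5, 1.0)
```

    A real converter builds that TBN per pixel by rasterizing the low-poly mesh in UV space, which is exactly why the low poly (and its tangent basis) must match between the bake and the conversion.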

    I do understand that it is frustrating to not be given the answer that you expect ; but if all you want to know is how to bake OS maps in Blender properly (which is a valid question in and of itself), then I would advise to post a question about that and only that. Because here, the extra context given actually hints at you potentially wasting a lot of time and effort, and developing convoluted practices. And since everyone is genuinely trying to help here, you are indeed going to receive advice on these peripheral topics.

    Oh, also : Blender 2.8 is beta software, hence any issue encountered with it may simply be a bug. You'll want to try this in 2.79 instead, and then try it in 2.8 once it's out.
  • AMajesticSeaFlapFlap
    Hi, this problem is most likely due to mismatched transforms. For baking, use +X for R, +Z for G, and -Y for B, and make sure the object you use doesn't have any unapplied transforms.
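    If you want to sanity-check an axis order like that before committing to a bake, here is a rough numpy sketch of the channel remap on a decoded map (assuming the +X/+Z/-Y order suggested above; the correct order depends on your exporter's up axis):

```python
import numpy as np

def swizzle_os_map(decoded):
    # Reorder a decoded object-space normal map (values in [-1, 1])
    # so that R = +X, G = +Z, B = -Y.
    x, y, z = decoded[..., 0], decoded[..., 1], decoded[..., 2]
    return np.stack([x, z, -y], axis=-1)

# Decode an 8-bit pixel to [-1, 1], swizzle, then re-encode to [0, 1]:
raw = np.array([[[128, 255, 128]]], dtype=np.uint8)  # a pure +Y normal
decoded = raw.astype(np.float64) / 127.5 - 1.0
out = swizzle_os_map(decoded) * 0.5 + 0.5
```

    A quick test cube baked and run through a remap like this makes it obvious which channel needs flipping.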
  • musashidan
    @Gigante This is an absolutely shitty attitude to bring on here. @pior happens to be an industry vet who knows his business and took time out to try and help you with your issue. Is it really necessary to behave like an entitled child?
  • gnoop
    While much of what pior said is true, there might be reasons, IMO, to do object-space baking.
    I don't use Blender 2.8; it's still too crashy and unreliable. Besides, I don't like the new interface, though a few of the new tools are cool indeed. As of Blender 2.79, it ignores edited (face-weighted) normals while baking with the Cycles renderer. Not sure about 2.8.
    Also, sometimes it's just easier to bake object-space normals without a hi-res object at all, and IMO it's less pain with hard-edge support loops, etc. You could keep your model non-triangulated and edit it later. In a word, I wouldn't say it's a totally unreasonable workflow.

    All that said, I haven't baked in Blender for quite a while. A note in my Evernote base from years ago regarding this exact issue says you should set Swizzle to +X; +Z; -Y, in that exact order (right beneath the space selector), for a default OBJ export. For FBX it might be a totally different order. Anyway, it's just a simple quest of finding the right axis order with a test cube.
    You could also do the conversion in Mari Indie if you happen to have it (full Mari too), or in Substance Designer. It's better to render a 32-bit EXR as the object-space texture; I got better-looking seams. But xNormal doesn't like linear EXR, so give it a 16-bit PNG or something.

  • Gigante
    musashidan I only responded in the same tone he used with me. I find it sad that this community thinks "veterans" can do no wrong and can talk down to people like that. As for taking time to help me with my issue, he didn't; in no part of his answer did he help with my issue, he just told me how dumb I was for trying to bake a texture and how I am clearly thinking about it all backwards. I also find it baffling that no one can be bothered to watch the video in its entirety to understand the workflow. I know this is not standard and not for everything, but I think it is a good tool to have in your toolbox, and I wanted to share it and bring it to Blender.

    pior I tried to bake it in 2.79 and had the same issue as in 2.80. I took offense at how you answered in a condescending manner the first time, but, be honest, did you watch the video in its entirety? Or did you just think I was trying to bake an object-space normal map and use it instead of a TS normal map in UE4, and go with that without watching the whole video? I added the video for context on what I am trying to achieve, so people would understand why part of the process is to create an object-space normal map, which is what isn't working for me in Blender. Again, the proposed workflow is to use fewer UV islands and save on vertex count in UE4, or any engine really.

    I know I am not a veteran and don't come close to anything you guys have done, or will probably ever come close to your skill level. All I ask is for you guys to give fair treatment to everyone and read/watch the questions in their entirety to understand what I'm trying to do. Because you guys keep explaining why baking an OS normal map is not standard and why I shouldn't use it, and yes, you are right, but this isn't about that. Baking the OS normal map is only part of the process. Please watch the full video and listen carefully.

    I think everyone, no matter their skill level, deserves respect, and I was taught to treat people the way they treat me. I just wanted to add a new tool to my toolbag and had trouble at that step.

    AMajesticSeaFlapFlap Thanks, I will try that.
  • Gigante
    gnoop said:
    "...you should set Swizzle to +X; +Z; -Y. In that exact order (right beneath the space selector) for a default obj export. For fbx it might be a totally different order..."

    Thank you this solved my problem!
  • Mink
    So there is absolutely no reason to be using an object-space normal map with the two assets you've provided, OP. Just bake a tangent-space map from high to low. That's it.
  • Gigante
    Mink said:
    So there is absolutely no reason to be using an object-space normal map with the two assets you've provided, OP. Just bake a tangent-space map from high to low. That's it.

    Please watch the added video for context.
  • gnoop
    BTW, is Blender 2.8 able to bake properly over edited face-weighted normals now? Does somebody know? It had a cool face-weighting modifier last time I tried, but it still couldn't bake a proper tangent-space normal map, producing gradients everywhere.

    I believe face-weighted normals are a proper answer to the LOD mismatch issue.
  • pior
    Yes, I did watch the video - and I will reiterate : beyond a mere curiosity about the math/conversion process (which is interesting in and of itself), or the edge case of having no access to a properly synced workflow in the first place (which is not your case), there are zero advantages to either the process or the output. The gain is negligible, and even *if* it were a huge gain ... it would be attainable without the conversion anyway.

    There is nothing snarky or condescending here. For the sake of clarity :

    - No, the output (mesh and nmap) in the linked video doesn't have any advantages over a version with "less aggressive UVs" - neither performance-wise nor visually. And the cons (very bad behavior of such an asset when mipping down/LODing) far outweigh the 10% gain in geometry - a point actually brought up in one of the comments on the video.

    - Also, since you do have access to a toolset that *does* follow the norms of all the major engines out there, you do not need the conversion at all to begin with. Try it for yourself : bake a TS map in Blender (even with "very aggressive UVs"), then do the whole OS>TS conversion ; the results will be exactly the same down to the individual pixels of the map, and the two versions will look exactly the same when displayed in an engine that follows proper norms in that regard.
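    That "exactly the same down to the pixels" claim follows from the TBN basis being orthonormal: TS -> OS and back is a lossless change of basis. A toy numpy check (illustrative only, not any baker's actual code):

```python
import numpy as np

rng = np.random.default_rng(0)
# A random orthonormal TBN basis (rows = T, B, N) via QR decomposition.
tbn, _ = np.linalg.qr(rng.normal(size=(3, 3)))

n_ts = np.array([0.2, -0.1, 0.97])
n_ts /= np.linalg.norm(n_ts)       # a unit tangent-space normal

n_os = tbn.T @ n_ts                # what an object-space bake stores
roundtrip = tbn @ n_os             # what the OS -> TS conversion recovers

assert np.allclose(roundtrip, n_ts)  # same normal, up to float precision
```

    The identity only holds when the same low poly and the same tangent basis are used on both sides, which is the whole "synced workflow" argument in a nutshell.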

    Now, all that said, I do understand that some of this may sound convoluted - because it really is, no doubt about it. If anything, my advice would be for you to keep working on many different assets while keeping an open mind - and without a doubt things will start to click, and you will naturally steer towards the most appropriate workflows, giving you both great results and great speed. Just bookmark this thread for later. It'll be fine :)

    At the end of the day, I am of course glad that you found an answer to the issue you had. But I remain convinced that it is indeed very much appropriate to point out that the convoluted workflow discussed here is not something "better" or beneficial in any way, and might even make you produce assets that wouldn't pass review in a demanding environment. Let's just say this is me anticipating you potentially wasting your time on something you don't need and might actually hinder your progress. Whether or not this is perceived as condescending doesn't really matter :D

    @gnoop : Not sure, but it doesn't seem to even have the option to face-weight normals by default, at least not in the version of 2.8 I am currently running. I personally used to rely on an add-on for face-weighted normals in 2.78 and did all the baking externally in TB3 anyway. And this seems like a safer option than baking directly in Blender - the maps it writes are perfectly fine, but the UX cannot compete with the flexibility of TB3 as a baker.

    As for whether or not the baker now bypasses/ignores FWNs : I personally don't know, as I've never attempted that. Definitely valuable stuff to investigate and report.
  • gnoop

    pior said:
    "@gnoop : Not sure, but it doesn't seem to even have the option to faceweight normals by default, at least not in the version of 2.8 I am currently running."
    I have just checked it in the build I downloaded a week ago. It does do face-weighted normals with its new super cool "Weighted Normal" modifier. I wish I had one like it in 3ds Max. But the weird thing is, even if I apply this modifier to the geometry, and even if I export to FBX and bring it back, Cycles still bakes the normal map as if it had default vertex normals.

    So, speaking about Blender 2.8 : suppose someone uses edited vertex normals, and I believe that's a typical workflow now for many cases, since even in UE4 (from what I have seen at least) a tightly compressed 8-bit normal map can't compensate for vertex-based shading gradients with 100% precision, synced or not. Not in our custom engine, for sure. And 2.8 also doesn't have the legacy Blender renderer, which did it right, if I am not wrong. So in that case, baking to object space and then converting to TS is the only option for a Blender 2.8 user.
  • pior
    Online / Send Message
    pior grand marshal polycounter
    Well, it would be the only option ... if there weren't any specialized baking tools out there :D

    Anyone involved with this stuff owes it to themselves to have a license of TB3 - and from there all these issues become irrelevant. I did a fully FWN project about 2 years ago, modelled in Blender, baked in TB3, with UE4 as its end target, and the question of baking in Blender didn't even come up, since TB3 is such a superior tool for that anyway (in terms of user-friendliness, that is).
  • gnoop
    Agreed, I don't bake in Blender either, but I still use this object-space-to-TS conversion pretty often. Just because every 3D program out there had always baked world- or object-space normal maps without any extra trouble, long before the MikkT synced workflow surfaced, and to be honest it only takes a few clicks longer. In Mari the conversion is real time, just an adjustment layer.

    I tried TB once, shrugged, and moved on. It couldn't do what I wanted: couldn't bake UDIMs, couldn't bake super hi-res 300-million-poly objects, couldn't bake a special half-AO, half-direct-lighting map. Maybe I just didn't give it enough time to find all the cool stuff.
  • Gigante
    @pior "- No, the output (mesh and nmap) in the linked video doesn't have any advantages over a version with "less aggressive UVs" - neither performance-wise nor visually. And the cons (very bad behavior of such an asset when mipping down/LODing) far outweigh the 10% gain in geometry - a point actually brought up in one of the comments on the video."

    Yep, you were right: the resulting geometry is terrible and the LODs break horribly, making this whole endeavor a big waste of time. Thanks for trying to save me time and effort.
  • pior
    Hi there man -

    Well, if anything, it would be cool if you could post your results! It would be a great way to wrap it all up, by illustrating the sometimes wide chasm that can exist between something that sounds like it would work great in theory, vs. what happens in the practical context of game optimisation, portability to different platforms (hence various texture resolutions), and so on.

    And the silver lining is that ... knowing how to do this conversion back and forth is a useful skill to have anyways, as it may save one's back in production (very unlikely in order to create assets proper, but rather, as a way to salvage data if a source highpoly gets lost for instance).

    And you will likely need OS bakes to fake top-down data in some project too (useful for mobile art, texture weathering, and Substance graphs), so that's good as well.
  • WarrenM
    Did you ever figure this out?  I'm getting the same thing you're getting ... the object space normal map that comes out of Blender won't convert in xNormal or Substance Designer.  It's perplexing to say the least...
  • gnoop
    WarrenM said:
    the object space normal map that comes out of Blender won't convert in xNormal or Substance Designer.  It's perplexing to say the least...
    For converting in xNormal, use these:
    RGB : +X +Z -Y

    It's for an OBJ export from Blender. I would apply transforms too, just in case.

    No idea if it works the same way with FBX; it probably needs a different axis order.

    And if I am not wrong, xNormal doesn't do it right with EXR - some gamma troubles. I did it with a 16-bit PNG.
  • rollin
    Even if this thread is old, there is one point not coming out clearly enough beside all that's already been said above, IMO.

    You don't save on vertex count!
    All the support edges you add to your low poly to improve the shading are still necessary.
    If you convert the object-space map back to tangent space, you - again - need these edges/verts.

    So if you do this workflow you end up with:
    - a low poly without any supporting edges, but with properly set hard edges and UV cuts
    - and a tangent-space map

    Which leads us to the point where you could just as easily have baked your model straight to tangent space in the first place, with the same bad results.

    So if you found this thread online and scrolled down this far: you're saving nothing!
  • gnoop
    It's just a convenient and universal way to do normal maps that works in literally any 3D software, without the necessity of doing cages, un-skewing details, etc.

    Still, you could perfectly well cope without it in modern bakers. Well, until your hi-res is too huge to be loaded in a baker. A whole landscape, for example.