Texture baking features?

MediumSolid polycounter lvl 10
From your personal perspective and experience, what baking features would you like to see in your software of choice, regardless of whether it's Blender, Maya, Max, Substance Painter, Marmoset Toolbag, etc.? Would per-object baking settings be something you'd like to see? For instance, say you have a hero character model with clothes on, and you'd like to bake a cavity map for the whole model, where the strength (for lack of a better term) of the cavity map varies between apparel pieces. Would that be something you'd need or like? Or is the traditional way good enough for you, where you bake a color ID map for those individual pieces and use it as a selection mask in Substance Painter to adjust the cavity through a levels modifier/filter? A small sketch of what I mean by per-object strength is below.
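
To make that concrete, here's a rough, hypothetical numpy sketch of "per-object cavity strength" driven by a color ID mask (the ID colors and strength values are made up purely for illustration):

    import numpy as np

    def scale_cavity_by_id(cavity, id_mask, strengths, neutral=0.5, tol=0.1):
        # cavity: (H, W) float array in 0..1; id_mask: (H, W, 3) float array of flat ID colors
        # strengths: {(r, g, b): strength}, where 1.0 = keep as baked and 0.0 = flatten to neutral
        out = cavity.copy()
        for color, strength in strengths.items():
            region = np.all(np.abs(id_mask - np.array(color)) < tol, axis=-1)
            out[region] = neutral + (cavity[region] - neutral) * strength
        return out

    # e.g. full-strength cavity on the jacket (red ID), a much softer one on the trousers (green ID):
    # result = scale_cavity_by_id(cavity, id_mask, {(1.0, 0.0, 0.0): 1.0, (0.0, 1.0, 0.0): 0.35})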

Also, what are some of the drawbacks you most dislike when it comes to your current baking process?

Replies

  • Eric Chadwick
    One thing that comes to mind right away is the lack of support for baking passes with surfaces that have transparency. Especially troubling if you need to bake AO for vegetation or hair or whatever that's opacity-mapped. 

    Also the trouble I get with overlapped UVs. Often game assets have symmetry or reused parts that can reuse the same UV space. To bake those, you typically have to move all the overlaps outside the (0,1) UV space, and you have to make sure whatever remains inside is not actually facing backwards (and thus invisible to most bakers). Example: http://wiki.polycount.com/wiki/Texture_Baking#UV_Coordinates It would be nice to not have to do that extra work; the baker should be smart enough to figure it out (only render one forward-facing copy of the overlapped UVs).
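
    For what it's worth, the manual offset step can be scripted; a rough Blender (bpy) sketch, purely for illustration and assuming the duplicated/mirrored faces are already selected, might look like this:

        import bpy

        obj = bpy.context.active_object
        bpy.ops.object.mode_set(mode='OBJECT')    # UV data is only writable from Object Mode
        uvs = obj.data.uv_layers.active.data

        for poly in obj.data.polygons:
            if poly.select:                        # only the faces marked as the duplicate copies
                for loop_index in poly.loop_indices:
                    uvs[loop_index].uv.x += 1.0    # shift the island into the 1-2 UV range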

  • MediumSolid
    Eric Chadwick said:
    One thing that comes to mind right away is the lack of support for baking passes with surfaces that have transparency. Especially troubling if you need to bake AO for vegetation or hair or whatever that's opacity-mapped.

    Could you elaborate on that or provide an example, please? I have trouble understanding it. Are you talking about baking from actual high-poly surfaces (presumably sculpted), such as a plant, onto a low-poly object that, for instance, contains planes/quads meant to catch rays from the high-poly leaves, where the remainder should stay transparent and not black? Or are you talking about something else, for instance where we have leaf textures on the planes/quads of our 3D plant model and we want to self-bake an ambient occlusion map where the transparent areas are excluded from baking?

    Also the trouble I get with overlapped UVs. Often game assets have symmetry or reused parts that can reuse the same UV space. To bake those, you typically have to move all the overlaps outside the (0,1) UV space, and you have to make sure whatever remains inside is not actually facing backwards (and thus invisible to most bakers). Example: http://wiki.polycount.com/wiki/Texture_Baking#UV_Coordinates It would be nice to not have to do that extra work; the baker should be smart enough to figure it out (only render one forward-facing copy of the overlapped UVs).

    I get you here completely; it must be frustrating having to do that all the time when baking, then placing the mirrored UVs back into (0,1) UV space to examine the results, only to find out that not everything went as expected, and now you're stuck repeating the same process over and over until all the baking mistakes have been properly addressed. It's a rinse-and-repeat process. However, I have to admit I did not expect this, since I've long held the belief that this method is a relic of the past, given how much hardware has improved over the past decade or so and that SP became an integral part of every 3D artist's pipeline way back in 2015.
  • Eric Chadwick
    If the highpoly has transparency, like glass windows, or foliage textured planes, then baking from that is problematic in most bakers… whether directly back onto the highpoly, or onto a lowpoly.

    Substance Painter doesn’t alter the need for UV offsets. This is an issue about UV space savings, and how intentional overlapping causes problems with baking.
  • MediumSolid
    Okay, I think I understand better now. For instance, say we have a car with windshields, just like any other car, and we decide to bake the car's AO as is, without removing them from the baking process. Naturally, they will contribute to the AO bake, and depending on the AO baking settings as well as the size of the car in the world, it could potentially, if not most likely, blacken out the whole interior, which is the last thing we want. If that's the case, can't we just keyframe the glass elements in their "original" position, move them to a location where they won't contribute to AO, keyframe that at a different point on the timeline, bake, and then return to the previous keyframe, thus resetting them back to the correct position? Sorry if these workaround steps don't make much sense, but this is how I'd do it in Blender - there's a rough scripted version of the same idea below.
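
    Something like this, as a loose sketch (the object names are hypothetical, and it just swaps the keyframe trick for temporarily hiding the glass from the bake):

        import bpy

        glass_names = ["Windshield", "Window_L", "Window_R"]   # hypothetical object names

        hidden = []
        for name in glass_names:
            obj = bpy.data.objects.get(name)
            if obj and not obj.hide_render:
                obj.hide_render = True           # the glass no longer occludes anything
                hidden.append(obj)

        bpy.ops.object.bake(type='AO')           # bake with whatever settings are active

        for obj in hidden:
            obj.hide_render = False              # restore the original visibility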

    I realize the purpose and benefit of stacking mirrored UVs on top of each other; it's just that I falsely assumed that doing so creates problems in SP, or in any texture-painting software that uses materials, since most materials by default project their texture from a cube onto the 3D model with blending, which creates "broken" textures unless you've imported the mesh with the mirrored parts removed. And I know that this projection method can be changed to a UV one, but that mode of operation introduces unwanted seams, which kind of defeats the purpose of texturing effectively in 3D space in the first place.

    I wouldn't be surprised if what I've just explained is flat-out wrong, so I'm anticipating harsh criticism, especially after seeing your portfolio full of industry experience.  :s

    By the way, if you don't mind me asking, which software do you find yourself using for baking?
  • Eric Chadwick
    Currently using this one
    https://rapidpipeline.com/

    Of course, I work there, ha ha. But it is a pretty cool baker. When baking AO (and also if you want to remove hidden meshes) we simply ignore surfaces with transparent materials. 
  • gnoop
    I still love ancient-style baking cameras in regular 3D renderers, where any mesh can become a ray-casting camera along its normals. Octane has a nice one, for example. Arnold had one last time I checked. It bakes alphas, cryptomatte or whatever custom AOV you might want, perfectly-looking GI, and usually gives full control over rays and over what rays see and what they don't - like fur on a mesh, small scattered geometry, or a whole landscape if you need it. Setup takes some time, though. I miss it a lot in typical baking tools.
    ps. Clarisse FX had a perfect one, with the ability to paint ray distance on a low-poly mesh instead of tweaking cages.
  • sacboi
    For Blender, 'match-by-name' high-to-low-poly baking is basically the feature I'm wishing for, and I'm hoping the Foundation might eventually think about adding it, perhaps as a branch commit in the latest 5.0 build series. Add-ons such as EZ Baker or maybe SimpleBake do, at present, make the experience somewhat less of a grind, but still, imo, with limited flexibility when compared to the proprietary DCC solutions currently available.
  • MediumSolid
    sacboi said:
    For Blender 'match-by-name' high to low poly bakes
    Is that so you could isolate certain objects from one another during the bake, so they don't contribute to each other? In other words, you'd like to avoid bake-bleeding, if that makes sense. Something like this?
  • sacboi
    Correct - I meant matching via standard naming conventions, as opposed to utilising an 'exploded' bake option. As referred to earlier, I have tested a couple of scripts/addons, for instance EZ Baker, which at this point I have to say is quite a functional little app; its UX offers the choice between the native Cycles baker or bridging bakers, in this case either Marmoset or Handplane, which I'm more familiar with. A rough sketch of the name-matching idea is below.
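
    (To illustrate the naming idea only - a minimal Python/bpy sketch that pairs objects by an assumed _low/_high suffix; real add-ons each have their own conventions:)

        import bpy
        from collections import defaultdict

        pairs = defaultdict(lambda: {"low": None, "high": []})

        for obj in bpy.context.scene.objects:
            if obj.type != 'MESH':
                continue
            if obj.name.endswith("_low"):
                pairs[obj.name[:-4]]["low"] = obj
            elif obj.name.endswith("_high"):
                pairs[obj.name[:-5]]["high"].append(obj)

        for base, group in pairs.items():
            if group["low"] and group["high"]:
                # a baker would now restrict ray hits for this low poly to its own high polys
                print(base, ":", group["low"].name, "<-", [h.name for h in group["high"]])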
  • iam717
    MediumSolid said:
    Also, what are some of the drawbacks you most dislike when it comes to your current baking process?

    I'm not hip to the new stuff and use what works; however, if I had to name anything I'd like, it's this:
    I dislike not being able to see live previews for any and every map, so some sort of live-update window to see the results, or to play with the results to "test" looks and experiment with things - perhaps even (if it's not a grayscale/black & white img) being able to go "deeper" into the R, G, B channels and play with those with live updates too. A rough idea of what I mean is sketched below.

    I'd say that would be a "new" thing, and if it isn't, someone put me onto what does this, because yeah, I'd like this.
    Oh yeah, I have to add: nowadays it would have to be something that is NOT online; sadly, if there is an online one, I will not be using it.
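
    Something along these lines, as a plain numpy sketch of the per-channel tweaking part (nothing app-specific; the function and values are just for illustration):

        import numpy as np

        def channel_levels(img, channel, black=0.0, white=1.0, gamma=1.0):
            # img: (H, W, 3) float array in 0..1; channel: 0 = R, 1 = G, 2 = B
            out = img.copy()
            c = np.clip((out[..., channel] - black) / max(white - black, 1e-6), 0.0, 1.0)
            out[..., channel] = c ** (1.0 / gamma)
            return out

        # e.g. lift the midtones of just the green channel and push the result to a live preview:
        # preview = channel_levels(baked_map, channel=1, black=0.05, gamma=1.4)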


  • MediumSolid
    @sacboi
    I've never used any of the add-on bakers that Blender's community provides, so I have to ask whether the problem with the exploded bake option is that it's not automatic and has to be done manually, or whether it's destructive in the sense that it separates high-poly objects that should be baked together. I don't have any visual examples at hand right now, but I think I'll be able to explain it in words. Say we have a high-poly keyboard with retractable legs and, obviously, numerous keys. We plan on baking the base keyboard along with the keys onto a single low-poly mesh object, and the high-poly legs onto their own low-poly representation; naturally this requires us to explode the keyboard in order to avoid bake-bleeding. However, the problem now arises from the fact that our keys have also been exploded, as they don't have the same origin as the large low-poly base keyboard mesh, and thus they won't be captured by it during the baking process. Or is the problem something else?

    And what's your opinion on skew-offset normals? Would the inclusion of that in Blender be a big benefit?

    And how would you feel about a Blender add-on that non-destructively automates most of the aforementioned work, with UI-friendly fine-tune controls, with the downside of being a bit slower at baking in comparison to Marmoset Toolbag? To be clear, I'm not offering anything at the moment, I'm just curious.
  • MediumSolid
    @iam717
    So if I understood this correctly, you essentially want a form of post-processing for baked textures? If that's the case, I'm curious to know for what bake types and what exact benefit? Would channel packing and compression (i.e. normal XYZ to normal ZY) also be something of value to you? A tiny sketch of what I mean by that is below.
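
    (For illustration only - a generic numpy sketch of dropping one normal channel and rebuilding it later; this isn't any particular tool's packing format:)

        import numpy as np

        def pack_normal_xy(normal_map):
            # normal_map: (H, W, 3) floats in 0..1, standard tangent-space encoding
            return normal_map[..., :2]                  # keep only two channels

        def unpack_normal_xy(xy):
            n = xy * 2.0 - 1.0                          # back to the -1..1 range
            z = np.sqrt(np.clip(1.0 - n[..., 0]**2 - n[..., 1]**2, 0.0, 1.0))
            return np.dstack((xy[..., 0], xy[..., 1], (z + 1.0) * 0.5))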

    Not to stray too far off topic, but I've personally written a semi-practical add-on for Blender that, amongst other bake types, bakes an AO map and automatically puts it in a post-processing stage where you can control the smoothness between huge height discrepancies. Hope I'm not breaking any rules or TOS here, but here's a link to it on blenderartists; it's free of charge.
  • poopipe
    Eric Chadwick said:
    If the highpoly has transparency, like glass windows, or foliage textured planes, then baking from that is problematic in most bakers… whether directly back onto the highpoly, or onto a lowpoly.

    Substance Painter doesn’t alter the need for UV offsets. This is an issue about UV space savings, and how intentional overlapping causes problems with baking.

    To add: it's not just about alpha-blended surfaces (those are hard anyway).
    Proper support for alpha-clip / alpha-test on the source mesh is a really useful thing that very few bakers have.
    i.e. the ability to render through clipped pixels in the materials on the source mesh and capture the underlying opaque pixels - see the sketch below.
    Use cases include baking foliage cards from arrangements of alpha-clipped leaf meshes and opaque branch meshes, or baking hair cards down to a surface.

    What I'd really like is to be able to push arbitrary data  from the source object's materials onto the target object (essentially, let me write a shader and you bake the result to the target) but that's probably pushing it :D 
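
    (A tiny, purely illustrative Python sketch of what "render through clipped pixels" means for a single bake ray; the hit tuples are hypothetical and stand in for whatever ray caster is in use:)

        def first_opaque_hit(hits, clip=0.5):
            # hits: all intersections along one bake ray, nearest first, as (distance, alpha, value) tuples
            for distance, alpha, value in hits:
                if alpha >= clip:
                    return value    # capture the underlying opaque pixel
                # alpha below the clip threshold: treat this surface as not there and keep going
            return None

        # e.g. a clipped texel on a leaf card sitting in front of an opaque branch:
        # first_opaque_hit([(0.1, 0.0, "leaf card"), (0.4, 1.0, "branch")])  ->  "branch"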


  • MediumSolid
    @poopipe
    Would it be correct to assume that what you're trying to describe is essentially baking LODs from foliage, where the lowest LOD has a billboard for each leaf (plus trunk and branch meshes) and each successive LOD incorporates an even larger billboard that encompasses more leaves, all the way until we're left with only two large billboards that essentially capture the whole tree from the Y and X axes?

    poopipe said:
    (essentially, let me write a shader and you bake the result to the target) but that's probably pushing it :D 
    Well... I believe Blender already kinda allows for that. But I wouldn't be surprised if you're using other software, since the most common way of creating shaders in Blender is by constructing them via shader nodes rather than by writing them (even though it's possible via OSL shaders), and you specifically wrote "write". Could you kindly provide an example of how and where this would be beneficial?
  • poopipe
    Yes - although the LOD chain part doesn't bother me personally, since both Simplygon and SpeedTree cope well enough with LODding/billboard generation. I'm more interested in baking the high-LOD leaf atlases.

    And, to the shader question:
    The ability to gather any data I might happen to need from a mesh, composite it, and transfer it as a texture from a source mesh to a target mesh means I can reformat data into something useful for my target engine.
    e.g. certain tools pack data into UV sets and vertex color, and I might want to gather that, mess with it, and bake it to a texture.
    Where I work we write a fair amount of custom tooling that runs as part of our build process to manipulate data in this way, which works great, but it's hard for artists to debug as the output is not readily available for viewing.

    If we're talking about Blender and you already plan to support 'any' material, then it's likely that it'll cover most of my potential use cases - OSL is OK and Blender's node-based tools seem to be fairly well featured.

  • sacboi
    MediumSolid said:
    I've never used any of the addon bakers that Blender's community provides, so I have to ask if the problem with exploded baked option is that it's not automatic and has to be done manually or if it's destructive in sense that separates highpoly objects that should be baked together...
    As an aside, I mostly use TexTools for Blender anyway, and since its introduction I don't bake without a custom cage.

    "Or is the problem in something else?"
    Primary issue as I mentioned in my initial response. Rather than rely on plugins authored by an enthused userbase I'm nonetheless very thankful of for streamlining an otherwise convoluted process, is to enable artists to effectively work natively due too Blender's version update frequency ones addon library can break, when a dev ceases maintenance. 

    "And what's your opinion on the skew-offset normals, would inclusion of that in Blender be a big benefit?"
    Not sure what you mean?

    "And how would you feel about a Blender addon that non-destructively automates most of the aformentioned work with UI friendly fine-tune controls"
    I work from a laptop and textools for the time being at least to extent resolves my personal workflow as a hard surface modeler by using Blender 3.6.1 (I've found is fairly stable) but tbh if push comes to shove, 2016 3ds Max would be DCC app of choice.
  • MediumSolid
    sacboi said:
    "And what's your opinion on the skew-offset normals, would inclusion of that in Blender be a big benefit?"
    Not sure what you mean?
    I'm mainly talking about the skew-normal correction that, at least to my knowledge, only Marmoset Toolbag has. It eliminates the need for creating support loops in your low-poly mesh, which not only reduces the polycount but also saves time. I know the consensus is that modern GPUs can handle a vast number of triangles, but that's not exactly the full story - there's a thing called overdraw, and limiting the number of triangles in a scene is very beneficial. The rough idea behind the correction is sketched below.
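
    (Not Toolbag's actual implementation - just a loose numpy sketch of the general idea behind a painted skew map: per sample, blend the averaged vertex normal toward the flat face normal and cast the bake ray along the blended direction:)

        import numpy as np

        def skewed_ray_direction(smooth_normal, face_normal, skew):
            # skew: 0.0 = follow the averaged vertex normal, 1.0 = follow the flat face normal
            d = (1.0 - skew) * np.asarray(smooth_normal, float) + skew * np.asarray(face_normal, float)
            return d / np.linalg.norm(d)

        # e.g. skewed_ray_direction((0.0, 0.7, 0.7), (0.0, 0.0, 1.0), 0.5)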
  • sacboi
    Ah, got it - skewing errors. Although it's also important to note how 'waviness' may arise without an understanding of averaged normals and ray projection.

    I might also add this Blender workaround as well, and afaik the Cycles engine, a path tracer, is CPU-reliant.
  • MediumSolid
    @sacboi
    I've seen the workaround video before, and it's quite tedious, requires additional (unnecessary) geometry, and is thus incredibly time-consuming. This could be "easily" circumvented by procedurally generating the additional geometry and carefully capturing/storing the custom split normals from the original mesh along with the standard normals around its edges. That way you'd be able to bake any map without skewing errors and display it on your original low-poly mesh - yes, the one from before adding the additional geometry. I know that what I wrote doesn't make much sense, but hopefully in a couple of months I'll finish the algorithm that does exactly that and I'll demonstrate it.

    As for "waviness", I'm not too sure about a solution for it, but I believe it could also be addressed in a similar manner, with a few extra steps introduced into the aforementioned "algorithm".
  • MediumSolid
    poopipe said:
    yes - although the lod chain part doesn't bother me personally since both simplygon and speedtree cope well enough with lodding/billboard generation.  im more interested in baking the high-lod leaf atlases

    So I assume the source object (leaf) is an actual high-poly mesh, and during the AO bake, where the rays (projected from the quad/billboard) happen to miss, we're left with black instead of transparency, and we can't write a shader that uses the black parts of the leaf texture as a transparency/opacity factor, since some occluded parts of the leaf are also black - is that the issue? Because I always thought that's why we bake a simple emission mask, so that we could discard unnecessary pixels, but I'm guessing the idea is to avoid baking additional textures.
    and - to the shader question. 
    the ability to gather any data I might happen to need from a mesh, composite and transfer that as a texture from a source mesh to a target mesh means that I can reformat data into something useful for my target engine
    eg. certain tools pack data into UV sets and vertex color and I might want to gather that, mess with it and bake it to a texture 
    Where I work we write a fair amount of custom tooling that runs as part of our build process to manipulate data in this way which works great but it's hard to debug for artists as the output is not readily available to for viewing. 
    That's really interesting, but I'm struggling to see how that would be useful, since with my limited knowledge and experience I always assumed it's better to simply export the data stored in the vertices as is and do whatever manipulation you want in your shader at run time, instead of baking it to a texture and "wasting" precious resources.
    if we're talking about blender and you already plan to support 'any' material then it's likely that it'll cover most of my potential use cases - OSL is ok and blenders node based tools seem to be fairly well featured

    Well, I'm no expert when it comes to Blender's shader nodes, but I've noticed that they lack some fundamental stuff, such as partial derivatives, and that they're object-bound shader nodes, which means you can't perform any operations on the image output for viewport display purposes, e.g. an outline effect. Thus I find them fairly limited - great for prototyping, though.

  • iam717
    MediumSolid said:  @iam717
    "a form of post processing for baked textures?"  YES

    "for what bake types?"  Is it possible for Combined & Individual, bake types?

    "exact benefit?"  Experimental Testing, purposes.  With hopeful usefulness in production, "new styles", kind of things.

    "Would channel packing and compression(i.e. Normal xyz to Normal zy) also be something of a value to you?"
    Me & others might find an interest, yes.

    "  I've personally written a semi-practical add-on for blender that amongst other bake types bakes an AO map that automatically puts it in a post-processing stage where you can control the smoothness between huge height discrepancies. Hope I'm not breaking any rules or TOS' here but here's a link to it on blenderartists, it's free of charge.  "

    Quoting from the 1st post:
    "  Would per object baking settings be something you'd like to see? 

    you'd like to bake a cavity map for the whole model, where the strength (for the lack of a better term) of cavity map would vary between apparel pieces, would that be something you'd need or like?  "

    Sounds good to me.
    ^Inside the quote I address the questions with answers (originally marked in color).^

    I grabbed the AO doc to review; I haven't looked at it yet & am still working my way through Blender.
    Also, for the record, the apps I play with when baking:
    Max, SubsD, SubsP, xNormal.
    More SubsD than Max; xNormal is a last-resort kind of thing, though it usually shows me "issues", so it's a neat tool for me to figure out what's wrong with the model - I've run into one issue where it helped.

    Just a rant, no real need to address it: 
    // Tried to do it with Marmoset but it never works, so I just bail on that one.  
    It's probably a me thing, but then I also think (for me) it isn't "easy" to just put your cage, low & high together & press go.
    I get black renderings; again, it's probably a me thing, but I can't be bothered, so I just go with what works.  
    I'll figure it out one day.




  • poopipe
    So I assume the source object(leaf) is an actual high poly mesh and during AO bake where the rays (projected from the quad/bilboard) happen to miss we're left a black color instead of transparency, and we can't write a shader that uses black parts of the leaf texture as a transparency/opacity factor since some occluded parts of the leaf are black, is that the issue? Because I always thought that's why we bake a simple emission mask, so that we could discard unnecessary pixels, but I'm guessing the idea is to avoid baking additional textures.
    The source object would be an arrangement of opaque branches with individual leaves represented as 'planes' with a transparent texture applied - these will overlap with each other / the branch, and in most cases when you bake the result to a plane the transparency is not respected.
    S = source 
    A = what you get 
    B = what you want


    That's really interesting, but I'm struggling how that would be useful, since with my limited knoweldge and experience I always assumed that it's better to  simply export data stored in vertices as is and do whatever manipulation you want to do in your shader during run time, instead of baking them to a texture and "waste" precious resources.
    'Waste' is dependent on use case and available resources. It's not uncommon to run out of channels to store data in vertex color, and it's generally true that texture data can be streamed more efficiently than mesh data.
    Admittedly this is a pretty niche area and use cases are highly context dependent :) 
  • Celosia
    Not to stray too much off topic but I've personally written a semi-practical add-on for blender that amongst other bake types bakes an AO map that automatically puts it in a post-processing stage where you can control the smoothness between huge height discrepancies. Hope I'm not breaking any rules or TOS' here but here's a link to it on blenderartists, it's free of charge.
    At the risk of sounding like an ass, this is almost an add-on but it can't be used like one because it's not structured like one. I turned the general baking panel into one to quickly test it, feel free to @ at me if you want a copy. I can't guarantee everything is working because I didn't try out all features yet.

    I'm not sure how the post-processing from general_baking.py is supposed to work - whether the user has to rebake every time for it to apply or what; I couldn't get it to do anything - but the basic height baking seems solid so far. In Blender you usually have to resort to pointiness or AO or GN to get something similar, and the first method has banding while the second can be awfully slow. Yours baked quickly on a small test object. I'm pretty curious to see how it fares on heavy organic meshes and will try with one later. It'd be very useful to have a way to bake quality cavity/convexity maps directly in Blender!

    I also really like the high and low poly lists. I didn't get around baking multiple objects yet, but it's pretty handy and intuitive. Nice work!

    Edit: Oh, I see why the pp didn't work! It's using a method for OpenGL and I'm using Vulkan. It's also a really cool feature, made even better by updating immediately so you get to see it affect the material in real time. That's great when texturing!
  • MediumSolid
    poopipe said:

    the source object would be an arrangement of opaque branches with individual leaves represented as 'planes' with transparent texture applied - these will overlap with each other / the branch and in most cases when you bake the result to a plane the transparency is not respected . 
    S = source 
    A = what you get 
    B = what you want


    That's really interesting, but I'm struggling how that would be useful, since with my limited knoweldge and experience I always assumed that it's better to  simply export data stored in vertices as is and do whatever manipulation you want to do in your shader during run time, instead of baking them to a texture and "waste" precious resources.
    'waste' is dependent on use case and available resources.  its not uncommon to run out of channels to store data in vertex color and its generally true that texture data can be more efficiently streamed than mesh data. 
    admittedly this is a pretty niche area and use cases are highly context dependent :) 
    Thank you for the precise explanation, I really appreciate it.
    I just can't believe I forgot about the limit on the number of attributes per vertex; now it's clear to me how "baking" those attributes could be useful. However, would it be correct to assume that you're talking about data extraction from the low-poly object itself, and thus it would be most efficient to "bake" those values to a 1D texture that we can later access via vertex IDs in the vertex shader, saving on texture space while keeping the interpolation of those values between vertices? A rough sketch of what I have in mind is below.
    Of course it goes without saying that this is only valid as long as our shader language of choice allows texture reads in the vertex shader in the first place, as thankfully most do nowadays. I'm asking all of this because if that's the case (and just to be clear again), then technically we wouldn't need to actually bake anything at all, but just iterate through the vertices in ascending order and write that specific attribute data into a 1D texture via scripting.
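
    (Something along these lines - a loose bpy sketch that writes one made-up per-vertex value into a width-N, height-1 image; the image name and attribute are illustrative, and it assumes the vertex count fits within the maximum image width:)

        import bpy

        obj = bpy.context.active_object
        values = [v.co.z for v in obj.data.vertices]      # stand-in for any per-vertex value

        img = bpy.data.images.new("vertex_data", width=len(values), height=1, float_buffer=True)
        pixels = []
        for v in values:
            pixels.extend([v, v, v, 1.0])                  # store the value in R, G and B
        img.pixels = pixels

        # a GLSL vertex shader could then read it back with something like:
        #   texelFetch(vertex_data, ivec2(gl_VertexID, 0), 0).r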
  • MediumSolid
    @Celosia
    Celosia said:
    At the risk of sounding like an ass, this is almost an add-on but it can't be used like one because it's not structured like one. I turned the general baking panel into one to quickly test it, feel free to @ at me if you want a copy. I can't guarantee everything is working because I didn't try out all features yet.

    No worries, you don't sound pretentious at all. You're quite right and I agree - it's not an add-on one bit; it's my fault that I called it an add-on by mistake in this thread. And if you refer back to the blenderartists page you'll see that it's not posted in the Released Scripts and Themes section but in the testing section, and never once did I originally call it an add-on, but rather a script, since I'm aware that it's not that; again, you and I agree on this one 100%.

    I'm not sure how the post processing from general_baking.py is supposed to work, if the user would have to rebake every time for it to apply or what, I couldn't get it to do anything, but the basic height baking seems solid so far. On Blender you usually have to resort to pointness or AO or GN to get something similar, and the first method has banding while the second can be awfully slow. Yours baked quickly on a small test object. I'm pretty curious to see how it fares on heavy organic meshes and will try with one later. It'd be very useful to have a way to bake quality cavity/convexity maps directly in Blender!

    I know you've already figured it out, but I'll just explain it for the sake of explaining, so no need to read this.
    The post-processing does not require rebaking every time; you just have to place a grayscale texture in the post-processing image slot, and the changes performed on it are real time. Sadly it's a bit slow when it comes to textures larger than 2048x2048, though that also depends on your device.
    The simplistic explanation of how the height map construction works is as follows: a world-position map is baked for both the high and the low poly, plus a normal map for the low poly; from these three textures a height map is extracted using the most basic arithmetic operations, and then offset and clamp operations are applied so that the range is mapped between 0.0 and 1.0, not leaving any unused bit space (a rough numpy version of that step is sketched below). So while it's not the slowest, it's definitely not the fastest. I could theoretically speed up the height map construction by switching from Python's NumPy methods (CPU-bound) to GLSL (GPU-bound).
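
    (Simplified numpy sketch of that extraction step - not the add-on's actual code, just the idea of projecting the high-to-low position difference onto the low-poly normal and remapping to 0-1:)

        import numpy as np

        def extract_height(pos_high, pos_low, normal_low):
            # all inputs are (H, W, 3) float arrays; normal_low already decoded to -1..1
            delta = pos_high - pos_low                      # high-poly surface relative to the low poly
            height = np.sum(delta * normal_low, axis=-1)    # signed distance along the low-poly normal
            lo, hi = height.min(), height.max()
            return (height - lo) / max(hi - lo, 1e-8)       # offset + scale into the full 0..1 range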
    I'll see what I can do about cavity maps in Blender, but as far as I know cavity is more or less AO with a really small sample radius, so while it's possible to create in Blender, it's going to run really slowly as it will be CPU-bound - but that's just my guess. I'm going to have to explore that more deeply. Convexity has already been developed, with fine-tune control and a preview per high-poly object!
  • Celosia
    Thanks for the explanation! I've done something similar using geometry nodes in the past and I assure you that one is the slowest way to do it! Your baker was pretty fast considering the other options for getting height maps in Blender.
    I'll see what I can do about cavity maps in blender, but as far as I know cavity is more or less AO with a really small sample radius so while possible to create in Blender it's going to run really slow as it will be CPU bound, but that's just my guess. I'm going to have to explore that deeper. Convexity has already been developed and with a fine-tune control preview per each high poly object!
    It could be that I just don't know how to get a good AO-derived cavity map in Blender, but even after a lot of trial and error it's the worst way to get maps of tiny details like pores and wrinkles. It's the slowest, noisiest method, with abysmal results that either accidentally pick up larger surface features or ignore smaller ones, when not both at the same time:



    I've been using pointiness, which also isn't great because it's still a bit slow for dense geometry and you're forced to use workarounds to avoid a couple of serious float-precision glitches. Although it's a shader, it's derived from the geometry, so geo resolution has a direct impact on the final quality.



    A streamlined equivalent that doesn't use tens of GB of RAM to bake would be very welcome. Unfortunately I think the memory-eating sluggishness is just an inherent limitation of Blender's baking. It's so inefficient.
  • poopipe
    MediumSolid said:

    Thank you for the precise explanation, I really appreciate it.
    I just can't believe I forgot about the limitation of number of attributes per vertex, now it's clear to me how "baking" those attributes could be useful. However, would it be correct to assume that you're talking about data extraction from low poly object itself and thus it would be most efficient to "bake" those values to a 1D texture where we can later access it via vertex ID's in the vertex shader and thus save on texture space but also keep the interpolation of those values between vertices?

    Not just that, no -
    I've not done a great job of explaining myself :) 

    there are a number of use cases 

    As a high-res to low-res transfer example - and sticking with trees:
    In your source model (the picture I drew before), you can use SpeedTree to assign vertex color values by index to leaves/branches, and you can also apply a gradient running from the base to the tip of the branches, etc., all of which is very useful information for seasonality, plant degradation and so on.
    This can be baked to your target mesh (a plane) and used to control masking in runtime shaders that handle seasonality, degradation, etc.
    It isn't possible to encode this data as vertex color in the low-res because there aren't enough vertices.

    The benefit of being able to run a shader on the source mesh is that I would be able to manipulate the values at bake time - e.g. index into a noise function to randomise the indices, composite channels together, etc.
    I generally do the sort of manipulation I describe here as a post-process step by scripting Substance Designer, but that adds another link in the chain, and links are where problems occur. (A toy version of that kind of manipulation is sketched below.)

    For low-res to low-res transfer there are lots of use cases, but the most common would probably be taking a complex shader and baking it down to something simpler, e.g. if you're doing a Switch port of a console title or you want to aggregate a bunch of meshes to generate a LOD or a background asset.
    There are very few tools that allow you to do this sort of thing with arbitrary shader code, and even fewer that aren't a colossal ballache to operate (looking at you, Simplygon...).
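
    (Standalone numpy toy, not any real pipeline - take a baked per-leaf index map and a baked base-to-tip gradient, scramble the indices with a cheap hash, and pack both into one texture for a runtime shader to mask against:)

        import numpy as np

        def composite_foliage_data(index_map, gradient_map, seed=7.0):
            # index_map: per-leaf index baked down to 0..1; gradient_map: base-to-tip gradient in 0..1
            randomized = (np.sin(index_map * 127.1 + seed) * 43758.5453) % 1.0      # stable pseudo-random value per leaf index
            return np.dstack((randomized, gradient_map, np.zeros_like(index_map)))  # R = randomised mask, G = gradient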



  • gnoop
    I recall that around 2017 the 3ds Max to Arnold plugin had a baking camera that could bake whatever you wanted, including custom AOVs and a whole set of Arnold utilities, and could bake through clip textures for alpha leaves. I used it to render small scattered geometry to a surface. Or maybe it was in Arnold standalone. Can't find a trace of it now.

    ps. Or perhaps it was Maya.
    ps2. It should probably be possible in Gaffer using the Arnold renderer, but I never tried it there.


     
     