
[Blender 2.8] Bevel shader normal map baking artifacts on flat surfaces

sqrRedwend polycounter lvl 10
Hello, everyone.

I decided to try out the Blender 2.8 bevel shader for Low-poly normal baking. While the results seem quite impressive, I get some artifacts, mostly on flat surfaces, which seem to be caused by the High-poly topology.

Here is how High-poly looks without bevel shader applied:



Here is the Low-poly:



Here is the Low-poly with baked normal map (Baked from HP with bevel shader) applied:


As you can see, there are these random lines showing up for some reason.


Is this a known bug with the bevel shader? Or is there something else that I'm missing?
Also, baking a normal map from the same models but without the bevel shader applied does not result in the aforementioned artifacts.

Here is the Low-poly with normal map baked from High-poly without bevel shader applied:



Any help is greatly appreciated. 

Replies

  • V!nc3r polycounter lvl 8
    You didn't show us the highpoly with the bevel modifier, but I suspect it's just its topology that is wrong (the artefacts look like an ngon that's not really flat). Also, your normal map resolution seems low.
  • sqrRedwend polycounter lvl 10
    V!nc3r said:
    You didn't show us the highpoly with the bevel modifier, but I suspect it's just its topology that is wrong (the artefacts look like an ngon that's not really flat). Also, your normal map resolution seems low.
    Please keep in mind that I'm using the bevel shader (https://docs.blender.org/manual/en/latest/render/shader_nodes/input/bevel.html), not the bevel modifier, i.e. it does not affect the High-poly topology at all. The topology is actually a mess, but I believe that should not matter as long as the end result looks good, since it is just for baking purposes. 

    In fact, Cycles render does not produce these artifacts when rendering just the High-poly. It seems that only baking is affected. 

    Here is the Cycles render of the High-poly with bevel shader applied:


    P.S. The initial screenshots are just a test bake; the normal map resolution was set to 1024x1024px, but just for reference I tried baking a 4K 32-bit float map and the issue was still there.
  • V!nc3r polycounter lvl 8
    OK, I had misread. Probably this use case is not suitable for the bevel shader; I suppose it's camera-dependent, and texture baking doesn't understand how to render it?
  • sqrRedwend polycounter lvl 10
    Just in case anyone wants to give it a try, here are the meshes in OBJ. 
  • sqrRedwend polycounter lvl 10
    V!nc3r said:
    OK, I had misread. Probably this use case is not suitable for the bevel shader; I suppose it's camera-dependent, and texture baking doesn't understand how to render it?
    I'm sure this is not the case, since the edges themselves turn out pretty good; it is just these strange artifacts that ruin it for me. It is definitely not camera-dependent, and it is intended to work with baking, as evidenced by the developer discussion of the bevel shader feature here: https://developer.blender.org/D2803

    Seems like a bug to me honestly, or maybe some precision issue?

    Still, I'd want to know if anyone found a way to avoid these issues without fiddling with High-poly topology.
  • Justo polycounter
    @musashidan

    It is said weird people from Ireland might know of this 
  • pior grand marshal polycounter
    Bevel shader or not, this highpoly is really asking for trouble : 



    And the low is no different :



    I understand the sentiment of "if it looks good in a render, then it should bake okay" but here this is so extreme that it makes the test basically pointless imho.

    The artefacts you are getting have basically nothing to do with the round edge shader, and everything to do with the models themselves.
  • Justo polycounter
    Yeah, that might be a good starting point ha...
  • sqrRedwend polycounter lvl 10
    pior said:
    Bevel shader or not, this highpoly is really asking for trouble : 



    And the low is no different :



    I understand the sentiment of "if it looks good in a render, then it should bake okay" but here this is so extreme that it makes the test basically pointless imho.

    The artefacts you are getting have basically nothing to do with the round edge shader, and everything to do with the models themselves.
    While I know that the High-poly topology here is really messy, I don't really understand why it should affect the bake in this case. It makes sense that this might "really ask for trouble", yet there seems to be none when baking without the bevel shader. Here is a quote from the initial post: 
    Here is the Low-poly with normal map baked from High-poly without bevel shader applied:



    Any help is greatly appreciated. 
    So, the viewport renderer as well as Cycles are both able to render the individual models without an issue, but fail when trying to bake? I don't know, but to me it doesn't sound right...


    Another thing I forgot to mention: I needed the normals to be constructed from smoothing groups, so right after importing I clicked "Clear Custom Split Normals Data" (under Object Data -> Geometry Data) for both the HP and LP models. I'm unsure whether this is the correct way to do it in Blender, but without it the whole model seemed to be smoothed, which obviously resulted in horrible shading. Is there anything wrong with these steps, maybe?

  • sqrRedwend polycounter lvl 10
    Tried baking this in Max (same models but without bevel shader obviously). Came out just fine:
    model
    :/
  • V!nc3r polycounter lvl 8
    Just fix your topology.


    The bevel modifier is a faster option when it works, but sometimes fails on complex or messy geometry.


  • sqrRedwend polycounter lvl 10
    V!nc3r said:
    Just fix your topology.


    The bevel modifier is a faster option when it works, but sometimes fails on complex or messy geometry.


    Once again, you are confusing the bevel modifier with the bevel shader. I am not using the bevel modifier here. Your quote from the docs is pulled out of context; it actually implies that the bevel shader might be a good alternative when the bevel modifier fails, with the downside of slower renders. 

    Here is what it says:

    Note that this is a very expensive shader, and may slow down renders by 20% even if there is a lot of other complexity in the scene. For that reason, we suggest to mainly use this for baking or still frame renders where render time is not as much of an issue. The bevel modifier is a faster option when it works, but sometimes fails on complex or messy geometry.

  • V!nc3r polycounter lvl 8
    You're right, I hadn't noticed this paragraph was talking about the modifier. But I still think that the simpler solution is to get a cleaner topology. A lot of very sharp edge angles, even on a flat surface, could probably trip up the bevel algorithm.
  • pior grand marshal polycounter
    I understand that it seems confusing that you get a clean surface on that side panel when baking from your source without the shader, yet getting some artefacts when baking with the added bevel shader. But I still maintain that this test is hardly relevant anyways, because it isn't representative of the kind of lowpoly you'll likely want to be working with ultimately.

    And even if you *do* plan to bake from such highs to such lows for a pipeline reason (CAD/Nurbs to game engine), going bruteforce into it at this time won't give you the full picture - hence this thread.

    It would make much more sense to first try to bake from this messy high down to a clean, manually crafted low. It may not give you the whole be-all-end-all solution to your inquiry but at the very least by doing so you'd be manipulating only one variable at a time (messy high + clean low), as opposed to two variables like you are doing now (messy high + messy low). One variable at a time is the only way to debug an issue.

    Without doing this test (clean low) for now you'll have no way to know if the artefacts are caused by either the low or the high since you can't isolate either. Which is precisely why you are not getting anywhere near to an answer at this time :)

    On top of it all, clean lows will end up being a requirement for performance anyways if your goal is an interactive realtime presentation. So by *not* doing a clean low test, you might be setting yourself up for later problems too.
  • sqrRedwend polycounter lvl 10
    pior said:
    And even if you *do* plan to bake from such highs to such lows for a pipeline reason (CAD/Nurbs to game engine), going bruteforce into it at this time won't give you the full picture - hence this thread.
    Well, I'd say that it is completely the opposite here. This is a test, so of course I want to "bruteforce" it to see what this technique is capable of, rather than relying on some simplified specific case that works fine. I would argue that this is very far from the worst case (High-poly geometry-wise).

    To make myself a bit more clear: yes, these artifacts seem to be related to the High-poly geo. I tried cleaning it up a bit, and while it helps in some areas, it introduces the same artifacts in others. Since the root cause is unclear to me, cleaning up the High-poly seems unreliable, if anything. I mean, what changes exactly should I make? This is not to mention that extensive cleanup somewhat defeats the intention of this workflow.

    I really appreciate your input, but a "Your geo is bad, hence the problems" kind of response doesn't help much. If you are sure that this particular geo is the problem, could you please elaborate on how exactly it affects the baking process in a way that causes these artifacts?

    To sum it up one more time: the Low-poly renders without artifacts on completely flat surfaces, same as the High-poly with the bevel shader applied. If my understanding of general raytraced baking is correct, the expected result here is a flat-colored normal map in these areas. So, to me it seems like either I'm doing something wrong when setting up the bake in Blender or there is something wrong with Blender's baking process itself.

    Also, I updated the Sketchfab scene with a comparison between different bakes, as well as a cleaned-up low-poly (it makes no difference though, as expected): 


  • FrankPolygon grand marshal polycounter
    After testing the OBJ files in this thread:
    With a clean low-poly mesh, the baked normals still had artifacts.
    With a clean high-poly mesh, the baked normals looked fine.

    The long thin triangles on the high poly were causing artifacts when baking but not when rendering in cycles. Increasing the samples improved the quality of the curved surfaces but didn't remove artifacts on the flat surfaces.

    To clean up the high poly I tried a couple of things but the quickest solution that provided an acceptable result was the remesh modifier on the high poly. Here's the bake result using the unmodified low poly and a remeshed high poly.



    A few options are:
    Planar decimate modifier delimited to Sharp edges.
    Collapse decimate modifier with an edge split modifier above it in the stack.
    Remesh modifier or remesh the object in Zbrush.

    The bevel shader seems to struggle with long thin triangles. When using the decimation modifier, the mesh density needs to be fairly even. The sides cleaned up OK, but the oval divot was much denser than the surrounding geometry, and it left a lot of long thin triangles that artifacted.

    Remeshing natively or in Zbrush provided the best results and is in line with what other artists are doing in their boolean or CAD workflows. The downside is remeshing adds another step and Blender can slow down or become unstable if you have a lot of dense geometry with remesh modifiers.

    Overall, I can understand the expectation that things should "just work" but I have to agree with pior and V!nc3r: sub-optimal geometry is already causing problems and is going to be a bigger pain down the line. It's my opinion that the bevel shader works better with less geometry and native boolean operations.

    The optimal workflow here is probably somewhere between what Ben Bolton does with booleans in Max / remeshing in Zbrush and what Tor Frick does with bevel shading in MODO. You'll have to run some tests to see what works best for what you're trying to do.




  • pior grand marshal polycounter
    At the end of the day it's pretty simple really.

    For your high : you need to find a way to trick your CAD program so that it doesn't output such extreme triangle fans. The fact that the model renders fine without the bevel shader is *irrelevant*, since you already know you want to use it. So you need to either tweak your exporter settings, or maybe even simply slice the object a few times along its length to provide more edges. Or who knows - maybe the round edge shader will get an update addressing this extreme case. But betting on the future is always risky.

    For your low : regardless of the results you manage to get out of this specific test, I can guarantee you that relying on this kind of extreme fan geometry will cause issues at some point way beyond the scope of this test. This is the cornerstone of realtime art : clean, carefully crafted geometry is always the winner. I understand the sentiment of wanting to cover all your bases with this test and figuring everything out from it, questioning everything. But at some point it's also important to rely on the good practices developed by veteran artists over years (decades) of working with this kind of stuff. While it wouldn't really matter for still renders, it matters a whole bunch for realtime content.

    Also, while admittedly being a good worst case scenario in the case of this simplistic shape, this test is far from covering every possible shape you'll likely end up building and processing - so who knows what other surprises you'll run into with that CAD exporter. Better playing safe imho by at the very least spending the few minutes needed to craft the low carefully.
  • sqrRedwend polycounter lvl 10
    @FrankPolygon
    Thanks for trying this out!

    Well, full disclosure, I'm actually using Ben's workflow with the ZBrush step replaced by the Blender bevel shader approach here ;)  I'm familiar with the Remesh modifier, and yes, I've managed to achieve great results with it. But as you said, it can have some problems when used on particular types of objects. The bevel shader seems like a really promising alternative. 

    Overall, I can understand the expectation that things should "just work" but I have to agree with pior and V!nc3r: sub-optimal geometry is already causing problems and is going to be a bigger pain down the line. It's my opinion that the bevel shader works better with less geometry and native boolean operations.

    Well, I don't expect it to "just work". As you have noticed yourself, it handles this geometry without any trouble when rendering just the High-poly with the bevel shader applied. I would not have any questions otherwise.

    P.S. Now that you mention it, I'm curious to see MODO results with this same geo...
  • sqrRedwend polycounter lvl 10
    @pior
    Let's get this out of the way :)  The single reason I did not spend the time to clean the low-poly in the first place is that I believe there is nothing "bad" about it that could potentially affect the bake. I could be wrong, though, but the tests show the opposite. Why would I care about issues that I might get "at some point way beyond the scope of this test"? I'm obviously not intending to use such geo in production, nor do I disregard the good practices developed by veteran artists over the years; they are just not relevant here. There was no plan to cover all the bases, as it was just a quick try. But if there are problems already, what is the point of trying to test further with more complex geo?

    As for the High-poly, I think you are missing the point here: it renders perfectly with the bevel shader applied outside of the baking scenario. Also, I already mentioned that I tried cleaning it up, and while it helps, it does not eliminate the problem completely, at least in my case.


    pior said:
    At the end of the day it's pretty simple really.
    It sure is :) 
    My question can be brought down to: "Is this behavior of baking in Blender expected? And if not, what can possibly be done to avoid the issues without modifying the High-poly geo?" I hope this makes it more clear.

  • FrankPolygon grand marshal polycounter
    There are really two fundamental issues here: Cycles normal baking isn't as accurate (reliable? developed?) as the render side of things, and long thin triangles are difficult to calculate accurately. The source will render correctly but it won't always bake correctly. Is it a bug? Maybe, but it could also be that baking with the bevel node in Cycles isn't all that accurate. So it's a case of a little of column A and a little of column B.

    Adding a planar decimation modifier to the source model turns the triangle fans into Ngons. This helps, but the density mismatch between curved and flat surfaces means you're still going to have residual issues when it re-triangulates. The bevel shader is viable, but it needs cleaner, less dense source geometry to work reliably.

    Below is about as good as it gets running a decimation modifier on the source. Of course this breaks the no modification requirement (not as badly as using a remesh modifier) and it still has some minor artifacts.



    If you have the boolean mesh objects and can export all the pieces I'd be interested in seeing if native boolean functions in Blender work better with the bevel shader.

    *Edit*
    After some more experimentation: less mesh density seems to work better but enabling the bevel shader still causes the baker to miss rays on the edges of long triangles. This may be a bug or an accuracy issue.


  • sqrRedwend polycounter lvl 10
    cycles normal baking isn't as accurate (reliable? developed?) as the render side of things, and long thin triangles are difficult to calculate accurately. The source will render correctly but it won't always bake correctly. 
    I'd say this is a pretty bold statement; do you have any references for that, by any chance? It would be really sad if this is actually the case, since the effect on its own looks great.

    Adding a planar decimation modifier to the source model turns the triangle fans into Ngons. This helps, but the density mismatch between curved and flat surfaces means you're still going to have residual issues when it re-triangulates. The bevel shader is viable, but it needs cleaner, less dense source geometry to work reliably.

    Below is about as good as it gets running a decimation modifier on the source. Of course this breaks the no modification requirement (not as badly as using a remesh modifier) and it still has some minor artifacts.
    Well, I'd consider automatic optimizations like that acceptable here, since they don't involve extensive manual cleanup, but yeah, the results still have issues. And I believe the reason is that these triangles initially come from the triangulation performed before exporting the mesh, so the only optimization here comes from collapsing vertices. Even at an angle setting of 0.1 I can spot a difference in shading, making this solution far from ideal.

    When using Max ProBoolean there is an option to "Make quadrilaterals" which significantly reduces these long thin triangles, but that unfortunately still does not eliminate the artifacts completely.

    If you have the boolean mesh objects and can export all the pieces I'd be interested in seeing if native boolean functions in Blender work better with the bevel shader.
    Unfortunately I don't have those anymore :(




    I think I'll try filing a bug report at this point and see where that goes.
  • FrankPolygon grand marshal polycounter
    cycles normal baking isn't as accurate (reliable? developed?) as the render side of things, and long thin triangles are difficult to calculate accurately. The source will render correctly but it won't always bake correctly. 
    I'd say this is a pretty bold statement; do you have any references for that, by any chance? It would be really sad if this is actually the case, since the effect on its own looks great.
    Sure, I've put the sources in the spoilers. But I think the overall point was that the tests demonstrate that there's a difference in accuracy between cycles renders and normal textures baked from cycles renders.

    The Blender foundation has a limited resource pool and they have to be somewhat strategic about how they allocate developer time. Cycles was added to Blender in 2012 and didn't receive any baking capability until well into 2014. Looking at cycles roadmap (both current and past) baking updates account for a very small portion of it. The original mission statement and the current design goals for cycles are geared more for rendering animations and scenes rather than game development. That's not necessarily a bad thing either. I think the Blender devs have done a great job working within the constraints they face.


    A lot of the resource allocation depends on demand, donations and outside development. As Blender becomes more widely used in game development, I think we'll continue to see new add-ons and features being pushed up the pipe from the studios. Earlier this year both Epic Games and Ubisoft pledged support to the Blender Foundation. Ubisoft Animation Studio (part of Ubisoft Film and Television) picked up Blender and assigned some of their own developers to contribute to Blender's code projects.


    So the comment was less of a damning condemnation and more of an observation from someone who's been using Blender to make game content since 2.28c. With Blender it sometimes takes a while for features to make their way into the core code. Over the next few years we'll probably see a little bit of a priority shift to better support game content creation.

    Back to the artifacting:

    After running more tests with geometry created natively in Blender, I'd have to agree that these artifacts are either a bug or a limitation of the bevel node. The last image in my previous post is a similar model I created and baked entirely in Blender. The geometry is also slightly cleaner with shorter thin triangles and it still struggles with similar artifacts. +1 for opening a bug report with some sample files.
  • sqrRedwend polycounter lvl 10
    Thanks for the info @FrankPolygon!

    Fun fact, found this statement in docs regarding baking (https://docs.blender.org/manual/en/latest/render/cycles/baking.html):
    Cycles uses the render settings (samples, bounces, …) for baking. This way the quality of the baked textures should match the result you get from the rendered scene.
    So at least it seems like the intentions are close to what other software strives to achieve :) 


    Regarding the issue: after trying to alter the render settings, it seems that lowering global render samples to 1 removes the artifacts completely, while also making the result very noisy. The noise can be somewhat reduced by cranking the Bevel node samples to 16 and rendering at high resolutions, then downsizing manually. The bevel effect itself still looks a bit noisier up close than what is achieved with default settings, so this solution is still not ideal. Here is what it looks like with the normal maps applied (Samples here are the Bevel node "samples" property):


    Now, considering this, I think cranking the Bevel shader samples higher than 16 could potentially be a solution here. However, it is currently capped at 16. I tried using the console to raise it beyond this value, but without success.

  • Justo polycounter
    @sqrRedwend  This is a great research effort you're doing :) I also think it's very useful to test how much we can push this to achieve good results, since in some cases this could be the fastest thing to do. How long did that 8k Cycles render take? Do you think it's a viable solution for more complex models?
  • FrankPolygon grand marshal polycounter
    You're correct. Settings from the render panel are used when baking. Sample quality values will be the same but that doesn't necessarily guarantee identical results.

    I'm not a developer so I don't know what's going on under the hood or if there's any limitations on how baking is handled. What I do know is that there's a difference between how it's rendering and how it's baking and other factors like geometry layout, UV unwrapping and texture size also play a role in the accuracy of a bake.

    Going off the assumption that if it renders correctly it should bake correctly: the expectation is that a normal map baked directly from the bevel shader to an unwrapped high poly should be virtually indistinguishable from the rendered version.

    To test this I unwrapped the high poly model and baked the bevel node normals. The result: it exhibits the same artifacting as baking from a high poly to a low poly. (Bevel node shader above and baked normal map below.)


    Let's talk about long thin triangles. The example below is a representation of a single tri in the UV editor. On the left is a regular triangle and on the right is a long thin triangle. Each square is a pixel. White is where the rays were calculated as a miss and blue is where the rays were calculated as a hit.

    With long thin triangles there's a greater chance the rays will be calculated incorrectly since the edges straddle pixels. A fan of long thin triangles means lots of opportunity for incorrect calculations. It can also introduce a wavy effect into the bake.

    This is a gross simplification of what's happening and how it's calculated but it illustrates the point.
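    The hit/miss picture above can also be sketched numerically. Below is a small, self-contained Python illustration (not Cycles code; the grid size and triangle coordinates are made up for the demo) that counts, for a compact triangle and a long thin triangle of roughly equal area, how many pixels are fully covered versus merely straddled by an edge. Straddled pixels are exactly where a baker's hit/miss decision can go wrong.

```python
# Toy rasterization sketch: classify each pixel of a grid as fully inside
# a triangle or straddled by one of its edges, for two equal-area triangles.

def edge(ax, ay, bx, by, px, py):
    # Cross product sign: which side of edge (a -> b) the point p lies on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def inside(tri, px, py):
    (ax, ay), (bx, by), (cx, cy) = tri
    d1 = edge(ax, ay, bx, by, px, py)
    d2 = edge(bx, by, cx, cy, px, py)
    d3 = edge(cx, cy, ax, ay, px, py)
    # Inside (or on an edge) if the three signs don't disagree.
    return not ((d1 < 0 or d2 < 0 or d3 < 0) and (d1 > 0 or d2 > 0 or d3 > 0))

def footprint(tri, grid=64):
    full = straddled = 0
    for y in range(grid):
        for x in range(grid):
            corners = [inside(tri, x + dx, y + dy)
                       for dx in (0.0, 1.0) for dy in (0.0, 1.0)]
            if all(corners):
                full += 1          # pixel entirely inside the triangle
            elif any(corners):
                straddled += 1     # an edge passes through this pixel
    return full, straddled

# Two triangles of roughly equal area (~128 px^2) on a 64x64 pixel grid.
compact = [(8.0, 8.0), (24.0, 8.0), (16.0, 24.0)]   # base 16, height 16
thin = [(2.0, 30.0), (62.0, 30.0), (2.0, 34.3)]     # base 60, height ~4.3

for name, tri in (("compact", compact), ("thin", thin)):
    full, straddled = footprint(tri)
    print(f"{name}: full={full} straddled={straddled} "
          f"straddle ratio={straddled / (full + straddled):.2f}")
```

    The thin triangle's footprint comes out with a noticeably higher straddle ratio even though both triangles cover the same area, which matches the intuition in the post.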



    Larger textures provide more space for information. Rendering a massive texture and downsampling it will smooth out some issues, but there are diminishing returns, and this strategy doesn't address the fundamental problems caused by the geometry. There are also problems relating to how smaller elements are calculated. With lots of long thin triangles there are more chances the system will make a mistake. Having this type of geometry on both meshes means artifacts are almost inevitable.
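    One detail worth noting for the render-big-then-downsample strategy: normal maps shouldn't be averaged as raw colors, because averaged unit vectors come out shorter than unit length. A minimal NumPy sketch of a 2x2 box downsample (the function name and the standard 0.5 * n + 0.5 tangent-space packing are assumptions for illustration, not anything from Blender):

```python
import numpy as np

def downsample_normal_map(img):
    """img: (H, W, 3) float array in [0, 1], H and W even."""
    n = img * 2.0 - 1.0                          # unpack to [-1, 1]
    h, w, _ = n.shape
    blocks = n.reshape(h // 2, 2, w // 2, 2, 3)
    avg = blocks.mean(axis=(1, 3))               # 2x2 box filter
    length = np.linalg.norm(avg, axis=-1, keepdims=True)
    avg = avg / np.maximum(length, 1e-8)         # restore unit length
    return avg * 0.5 + 0.5                       # repack to [0, 1]

# A flat area (every texel pointing straight up) stays exactly flat:
flat = np.full((4, 4, 3), (0.5, 0.5, 1.0))
print(downsample_normal_map(flat)[0, 0])         # stays (0.5, 0.5, 1.0)
```

    Averaging directly in packed color space would work for flat areas too, but across curved regions the renormalization step keeps the decoded normals unit-length instead of slightly shortened.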

    There was an excellent article detailing the technical aspects behind triangle shapes and texture baking, but I can't find it at the moment. Maybe someone else knows where it's at? If you're interested in reading some related articles about rendering triangles, these are worth a quick skim:

    My testing and the results of the small-sample / large-texture experiment indicate that this is more of a geometry problem. I've found that reducing the scale of the bevel shader also reduced the intensity of the artifacts. The only thing I can come up with is that maybe the shader is miscalculating along the edges of steep triangles? As much as I'd like to blame it on a bug, I think it's worth noting this comment from the developer during testing: "I'll investigate the baking issues separately, doesn't seem to be about bevel specifically." https://developer.blender.org/D2803#68556

    At this point I think we've narrowed it down to: poor source geometry or some kind of bug / limitation with baking the bevel node in cycles. If it's a bug I hope they can fix it. If it's a geometry problem then subdivide the starting shape to keep the geometry fairly even when running booleans.

  • sqrRedwend polycounter lvl 10
    So, I decided to check out the code to see if I could find the cause, and while searching I found this OSL shader present in the repository:
    https://developer.blender.org/diffusion/B/change/master/intern/cycles/kernel/shaders/node_bevel.osl;26f39e6359d1db85509a0ee1077b6d0af122a456

    /*
     * Copyright 2011-2013 Blender Foundation
     *
     * Licensed under the Apache License, Version 2.0 (the "License");
     * you may not use this file except in compliance with the License.
     * You may obtain a copy of the License at
     *
     * http://www.apache.org/licenses/LICENSE-2.0
     *
     * Unless required by applicable law or agreed to in writing, software
     * distributed under the License is distributed on an "AS IS" BASIS,
     * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
     * See the License for the specific language governing permissions and
     * limitations under the License.
     */

    #include "stdosl.h"

    shader node_bevel(
        int samples = 4,
        float Radius = 0.05,
        normal NormalIn = N,
        output normal NormalOut = N)
    {
      /* Abuse texture call with special @bevel token. */
      NormalOut = (normal)(color)texture("@bevel", samples, Radius);

      /* Preserve input normal. */
      NormalOut = normalize(N + (NormalOut - NormalIn));
    }
    This seems to leverage the actual bevel shader via some weird hack. Now, there is no validation of the input values like in the actual Bevel node, therefore it is possible to raise the sample count higher than 16. 

    So, I used this code via a script node in place of the Bevel node, and it looks like my assumption about Bevel samples was correct: the resulting normal map is perfect, without any artifacts so far, at 1024 Bevel samples. Here is an example (Baked at 1024px, 1 Render sample, 1024 Bevel samples):

    This also bakes roughly 2 times faster at 1024px on my machine (vs 128 Render samples, 16 Bevel samples) :p 
    Another observation is that there is seemingly no loss of quality when lowering bevel samples while baking at higher resolutions (baking a 2048px map with 512 bevel samples produces a perfect result), so it should potentially scale well to such use cases. So far this approach looks superior to the default one in every way.

    Attaching the .blend if anyone wants to try this out.

    @Justo
    I hope this post covers your questions ;)
  • pior grand marshal polycounter
    Now that's some deep investigation ! Very cool that you've narrowed things down that way.

    Now ... at the risk of sounding like a broken record, I would still encourage you to test things further on something that looks more like an *actual* prop for your project. The fact that you managed to get rid of these artefacts on this test shape is fantastic, but we are still talking about a 1024+ texture used to capture the surface of an extremely simplistic shape, which really doesn't warrant it resource-wise.

    Note that I am very interested in this kind of stuff, as I am a strong believer in approaches not relying on manual handling of rounded edges and the like. But performance as well as practicality could very well dictate that you use a different approach altogether (weighted normals without normal maps *at all*, for instance).

    I guess what I am trying to get at is : while it is great that you worked around the issue revealed by these CAD models, you might want to still keep your guards up (and an open mind) in relation to what comes next for your project. I hope this makes sense !

    (IIRC this OSL version of the rounded edge shader has been around for a while, at least... 2 years or so? I remember experimenting with it quite a bit, fun stuff.)
  • FrankPolygon
    FrankPolygon grand marshal polycounter
    Seems like a viable workaround for normal map baking.

    I can understand additional bevel samples increasing the accuracy of the bevel shader but why would increasing render samples above 1 reintroduce the artifacting? Does this point to the issue being somewhere outside of the bevel node?
  • sqrRedwend
    sqrRedwend polycounter lvl 10
    pior said:
    Now ... at the risk of sounding like a broken record, I would still encourage you to test things further on something that looks more like an *actual* prop for your project. The fact that you managed to get rid of these artefacts on this test shape is fantastic, but we are still talking about a 1024+ texture used to capture the surface of an extremely simplistic shape, which really doesn't warrant it resource-wise.

    I guess what I am trying to get at is: while it is great that you worked around the issue revealed by these CAD models, you might want to still keep your guard up (and an open mind) in relation to what comes next for your project. I hope this makes sense!
    Sure, I still have some other curious cases in mind that might be worth checking out. Btw, I've already stumbled upon another unrelated, unfortunate limitation with seemingly no workaround :(
    I can understand additional bevel samples increasing the accuracy of the bevel shader but why would increasing render samples above 1 reintroduce the artifacting? Does this point to the issue being somewhere outside of the bevel node?
    Well, I can't say for sure, but from a quick look at the code it might have something to do with the Render samples parameter being used for different purposes when baking compared to usual renders (it is mostly referred to as "AA" in a baking context). In any case, I reported this as a bug, so maybe the devs will see what is really going on there.

  • fexcg
    The OSL version of the rounded edge shader is CPU-only, right?

    Thanks
  • Lillya
    Lillya polycounter lvl 3
    Hi, this thread caught my attention back then, and I'm bumping it because the new 2.83.6 LTS release increased the bevel shader sample limit.

    The soft limit is still 16 samples, but if you manually type the value you can use up to 128 samples. No need to use OSL shaders, so it is possible to render on the GPU.

    If you don't use the LTS versions of Blender, this change might be included in the next 2.90.1; if not, then only when 2.91 releases.
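    If you'd rather set it from a script than type it into the node editor, a small configuration sketch using Blender's Python API should do it (the material and node names here are placeholders, not part of the thread):

    ```python
    import bpy  # Blender's bundled Python API; only runs inside Blender

    # "Material" and "Bevel" are hypothetical names for this sketch --
    # substitute the names from your own scene.
    mat = bpy.data.materials["Material"]
    bevel = mat.node_tree.nodes["Bevel"]

    # The UI slider soft-caps at 16 samples, but in 2.83.6 LTS the property
    # itself accepts typed values up to 128.
    bevel.samples = 128
    ```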
  • Justo
    Justo polycounter
    @Lillya Thank you for reporting this! I am getting the new Blender release for this alone :D 
  • Zeist
    Zeist polycounter lvl 8
    Still seems too low to be useful, though. For an 8k texture, 512 samples seems to be just enough to hide most noise. And having to bake a higher-res texture on the GPU only to downscale it to reduce the noise seems to be slower.

    I did some simple tests; maybe my GPU bake settings could be better, but my results were these.

    GPU outperformed the CPU bake at a 4k texture, 128 samples, by 55 seconds.

    But if I go to higher samples for the CPU bake, then to get comparable results regarding noise I need to bake on the GPU at a higher resolution and downscale the image afterwards.

    So, GPU at 8k, 128 samples versus CPU at 4k, 256 samples: the GPU lost by 28 seconds, but after reducing the tile size for the GPU bake from 4k×4k to 64×64, it tied.

    Baking a 4k texture on the CPU at 512 samples took a minute and 46 seconds. The GPU at 16k, 128 samples, 128×128 tile size took 3+ minutes; it didn't finish processing the file before I closed it.

    Seems to me that the OSL bevel shader is still better to bake with; maybe the GPU baker handles higher-poly meshes / more objects faster, but I didn't try that.

    Render sampling was set to 1 for both; if I increased it for the GPU to reduce noise rather than increasing texture size, it would bring back the random artifacts we're getting above.
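    One note on why those particular pairings came out so close: the total sample budget (pixel count × samples per pixel) is identical for an 8k bake at 128 samples and a 4k bake at 512 samples, so a near-tie once tile-size overhead is tuned out is roughly what you'd expect. A quick check in plain Python (just arithmetic on the numbers quoted above, not Blender code):

    ```python
    def total_bevel_samples(resolution, samples_per_pixel):
        # Total sample budget for a square bake: pixel count times samples per pixel.
        return resolution * resolution * samples_per_pixel

    gpu_8k = total_bevel_samples(8192, 128)   # GPU: 8k texture, 128 samples
    cpu_4k = total_bevel_samples(4096, 512)   # CPU: 4k texture, 512 samples
    print(gpu_8k == cpu_4k)  # True -- both are ~8.6 billion samples

    # The 16k-at-128-samples bake carries 4x that budget, which lines up
    # with it taking far longer than the 4k-at-512 CPU bake.
    print(total_bevel_samples(16384, 128) // gpu_8k)  # 4
    ```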