
Distorting Procedural Textures in Blender

PinkTortoise
Hello!
Recently I've been learning about geometry nodes in order to create procedural landscapes, and have come across something I can't find information about elsewhere.

When distorting a procedural image texture using procedural noise, the distortion isn't uniform across the surface. There is a spherical gradient of strength surrounding the object's origin point: towards the centre the distortion is reduced, and it gets stronger towards the outside, giving a bald-spot effect (I've tested changing the location of the object's origin point, which confirmed this).

I suspect this hasn't been an issue for most people because the object I'm working on is quite large, and the problem is less obvious on smaller objects.

For reference, I am using Blender 3.6.


I had a look at the same type of pattern made with a material instead, and it has the same issue, though less noticeable.


I also tested setting all the texture coordinates to UV Map; this moved the bald spot to the (0, 0) point of the 0-1 UV space. However the object is mapped, the pattern becomes less distorted as it approaches the (0, 0) coordinate.


So I'm wondering:

What causes this effect?
Is it possible to have procedural pattern based distortion without this effect?
If so, how can it be achieved?

Thanks! :)

Replies

  • okidoki
    Blender uses UVs by default, so add a Texture Coordinate node and plug Generated or Object into the Vector input of the Noise and Voronoi textures, maybe with a Mapping node in between to control it.
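    For illustration, a minimal sketch of that wiring in Python via bpy (the material name "Landscape" is just a placeholder for whatever material you're working with):

        import bpy

        mat = bpy.data.materials["Landscape"]     # placeholder name, use your own material
        mat.use_nodes = True
        nodes = mat.node_tree.nodes
        links = mat.node_tree.links

        coords = nodes.new("ShaderNodeTexCoord")  # Texture Coordinate
        mapping = nodes.new("ShaderNodeMapping")  # optional, to offset/scale the coordinates
        noise = nodes.new("ShaderNodeTexNoise")   # the distorting noise

        # Generated (or Object) coordinates instead of the default UVs
        links.new(coords.outputs["Generated"], mapping.inputs["Vector"])
        links.new(mapping.outputs["Vector"], noise.inputs["Vector"])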
  • PinkTortoise
    okidoki said:
    Blender uses UVs by default, so add a Texture Coordinate node and plug Generated or Object into the Vector input of the Noise and Voronoi textures, maybe with a Mapping node in between to control it.

    Thanks :)
    I previously thought Blender used Generated by default, unless a non-procedural image texture was applied.

    Even with these extra nodes, though, the bald spot remains. Applying an offset with the Mapping node does not change its location or strength.

  • Michael Knubben
    A few things I'm noticing: you're going to want to use a Texture Coordinate node so you can set the mapping of your noise/textures separately.
    I see you're feeding a vector into a Color Mix, but the output is going into 'Scale'. You can tell from the yellow-to-grey cable that a conversion is happening, and in this case you're losing information, since Scale only takes a float (a number or a greyscale value).
    You could distort your vectors rather than your scale; vector inputs take three channels (so potentially RGB). That's what I usually do, in shaders at least; see the quick sketch at the end of this post.
    With Node Wrangler you can also isolate nodes to preview them in the viewport, which helps with pinning down where in the chain your problems come from.

    If you still have the issue after that, share your file and I'll take a look!
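    As an aside on the "what causes this" question: the Scale input of the procedural textures effectively multiplies the incoming coordinates, so a wobble in Scale shifts the sampled position in proportion to how far the shaded point is from the origin of its coordinate space, while a wobble added to the Vector input shifts it by the same amount everywhere. A rough sketch of that difference (plain Python; the base scale of 2 and distortion of 0.3 are arbitrary example numbers):

        def lookup_via_scale(p, d):
            # distortion d added to the Scale input: the texture is sampled at (2 + d) * p
            return (2.0 + d) * p

        def lookup_via_vector(p, d):
            # distortion d added to the Vector input: the texture is sampled at 2 * (p + d)
            return 2.0 * (p + d)

        for p in (0.01, 1.0, 10.0):  # distance of the shaded point from the object origin
            shift_scale = abs(lookup_via_scale(p, 0.3) - lookup_via_scale(p, 0.0))
            shift_vector = abs(lookup_via_vector(p, 0.3) - lookup_via_vector(p, 0.0))
            print(f"|p| = {p:5}: Scale route shifts the sample by {shift_scale:.3f}, "
                  f"Vector route by {shift_vector:.3f}")

    At the origin the Scale route barely moves the sample, which is the bald spot; far from the origin it overshoots instead, while the Vector route distorts evenly everywhere.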


  • okidoki
    Ohh, I didn't look too deeply into the nodes after spotting the missing coordinates. In addition to what @Michael Knubben said: taking a value and/or a vector and then mixing them with a color-modification node (Linear Light, color ramp) might not be such a clever idea; better to use a Math node or a Converter -> Float Curve node. The color management is also going to change this drastically.
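    As a rough illustration of the Math-node route (again via bpy, with placeholder names; the existing noise node is assumed to still have its default "Noise Texture" name):

        import bpy

        mat = bpy.data.materials["Landscape"]    # placeholder material name
        nodes = mat.node_tree.nodes
        links = mat.node_tree.links

        noise = nodes["Noise Texture"]           # assumes the default node name
        strength = nodes.new("ShaderNodeMath")   # plain float math, untouched by color handling
        strength.operation = "MULTIPLY"
        strength.inputs[1].default_value = 0.5   # distortion strength

        links.new(noise.outputs["Fac"], strength.inputs[0])
        # wire strength.outputs["Value"] to wherever the color mix output used to go

    A Converter -> Float Curve node (ShaderNodeFloatCurve) can be dropped into the same spot if a non-linear response is needed.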
  • Michael Knubben
    I don't think the color mix will influence things too greatly. Not sure it does _any_ colour management at all, does it? I've certainly used it a lot in cases where the math equivalent is more steps.
  • PinkTortoise
    @Michael Knubben Sorry for the late reply, and thank you very much for the advice. I'll give that a try!