
Animating a single variable of an exposed float3 input in Substance Designer?

Diethyl node

I'm trying to figure out how to animate a texture by changing the value of the Y Offset on a Voronoi texture, using a Value Processor node connected to the exposed Offset input parameter. I tried the following tutorial (https://www.youtube.com/watch?v=GApNZ9adh-E&list=PL0HuX8PmDSdbEBwU1KCHsKsNErK2KsSTx&index=10) and it works for parameters with a single float value. The problem is that the Offset parameter is a float3 containing Offset[X], Offset[Y], Offset[Z], and I'm trying to animate just the Y value.

I don't fully understand how values are passed between the input and the Value Processor. I need to assign the $time variable in the Value Processor to a float and cast that into a float3 that can be output to the exposed Offset input, but the documentation isn't very clear and Substance just does things a little differently. I think I have to somehow use the Sequence node to cast three individual values into a single float3 that the input can use, but I'm not sure - like I said, the documentation can be kind of confusing. I'm really banging my head against the wall, so any help would be appreciated. Thanks.

Replies

  • poopipe
    poopipe grand marshal polycounter

    you just need to break the float3 apart and put it back together again - you'd need to adjust your swizzles/merging depending on which channel you're messing with
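    in code terms it's just this kind of thing (shader-style pseudocode, made-up names - not actual Substance function syntax):

        // keep the channels you aren't touching, rebuild the float3 with the new value
        vec3 replaceY(vec3 v, float newY)
        {
            return vec3(v.x, newY, v.z);  // swizzle out X and Z, merge back around the new Y
        }
        // for X or Z you'd just shuffle which components you keep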



    the sequence node is useful for storing variables for use elsewhere - same as declaring them at the top of a function and referencing them later. The top input is executed first, then the bottom input is executed and becomes the output of the sequence node.
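    in code terms a sequence works out to something like this (pseudocode analogy, made-up example values):

        float sequenceExample(float a, float b)
        {
            float stored = a * 2.0;   // top input: evaluated first, typically a Set that stores a variable
            return stored + b;        // bottom input: evaluated second, its result becomes the node's output
        }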

    you would use it like this to get the same result (untested but you get the idea)

    The important thing to consider is that the sequence node does not seem to be entirely reliable. It manifests mostly in larger graphs, but it's not uncommon for the upper input to just not execute and leave you with empty variables - which is a bit of a dick to debug.

    I find it's much safer to store data in the parent graph - or just avoid storing anything where possible

  • Diethyl
    Diethyl node

    Thanks for taking the time to help. Your graph works for animating only one value of the float3, which is what I want, but I want Y = Time and I'm having a hard time understanding how to get there. I guess I don't really understand what the swizzle function is meant to do. It feels like I'm being made to play musical chairs with these variables, but then being told that I can only use the first two chairs. To me this whole swizzle function seems like a workaround for a problem in the program that couldn't be fixed by simple refactoring, but maybe there is something I'm not seeing.

    Let me try to re-read the documentation and see if I can figure this out so I don't waste any more of my time or yours (I've been trying to figure this out for almost two days now). Learning Designer has been a major disappointment for me when using it for anything but creating completely static textures; it's really starting to break my game dev education stride and is eating up so much of my time. Learning ZBrush and relearning Blender was so much easier and more fun. I'll make a post once I figure it out and share my work.

  • iam717
    iam717 interpolator

    @Diethyl Just for the record, while learning similar practices (not in Substance Designer yet, and I don't know if I need it), others who have used it or are using it seem disappointed too, feeling much like your last post (from videos I've seen and threads I've crossed), and if I remember correctly they just went to Unity/Blender to do the things they couldn't with Designer. Just stating you are not alone in your thoughts. I haven't played with this stuff, but when I did attempt it I couldn't get my noodle around any of this process in the pipeline. I get that it's texture manipulation and you can do a bit with it, but from my experience so far it seems pointless unless you're making nothing but 2D tiles. Hopefully when or if I even get to this point I'll understand more. Best of luck with your goal. Just appreciating the thread as a "warning" of what I might expect.

  • poopipe
    poopipe grand marshal polycounter

    swizzle extracts the indicated components out of a vector - this is perfectly normal and is equivalent to writing input.x / input.xy / input.yzx etc. in code

    the vector float<n> nodes append components to a vector - in the example it appends a vec2 to a vec1 resulting in a vec3

    it is equivalent to writing something like vec3 x = vec3(1.0f, vec2(1.0f, 1.0f))

    there's a merge_float3 node you can use to plug them back together instead of the vector nodes - it's more explicit about what's getting plugged into which channel


    to do what you want:

    swizzle X and Z out of your input using swizzle float1 nodes, plug them into the X and Z inputs of a merge_float3 node, and plug $time into Y
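
    i.e. the function you end up building is equivalent to this (shader-style pseudocode; offset is the exposed float3, time stands in for $time):

        vec3 animateOffsetY(vec3 offset, float time)
        {
            float x = offset.x;        // swizzle float1 - X
            float z = offset.z;        // swizzle float1 - Z
            return vec3(x, time, z);   // merge_float3 - with $time plugged into Y
        }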

  • Diethyl
    Diethyl node

    @iam717 and @poopipe, thanks for the replies. I don't know, I was kind of having a bad week. I guess I'm having a hard time learning Substance Designer because I've never really done much texturing in the past, and I'm just going through the growing pains, which is to be expected if I want to get any better. The main thing I'm struggling with is fitting Substance into a VFX pipeline; I should have clarified that. I've worked as a graphic designer in the past, so I can understand and appreciate the utility of procedurally generated graphics - procedural textures are essentially the vector equivalent of raster graphics, if that makes sense. The ability to just seed a new variation of a texture or generate a higher-resolution version is extremely useful, but as far as using Substance for VFX goes, I don't think it's the way to go at this time.

    I may just end up rendering the individual outputs and then compositing and animating in Unity or Unreal, because I'm pretty sure there are a lot of nodes that don't translate to either of those engines or would be too computationally expensive to be practical. I'm opting for doing things 50/50 now; I just need to figure out which nodes I should stay away from when designing textures intended for porting over into a game engine, which I think could be a post of its own.

    Anyway, I ended up figuring things out later the night I posted, like @poopipe described in the second post. It ended up looking something like this...

    The values in the float2 (the X and Z offsets) are just values that make it look good, and the 0.1 * is just to slow the animation down. I just created a Gumroad account and published it for free: https://lanceuppercutlu.gumroad.com/l/Slime . I published the .sbs and .sbsar files, so you can look under the hood, tweak it, and use it however you like. Just credit me if you use it in a commercial production. If you make something cool with it, I'd love to check it out. I doubt anyone would want to use my amateur-hour texture, but hey, whatever.
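
    For anyone following along, in the same shader-style pseudocode the function works out to roughly this (the numbers are placeholders, not the actual values I used):

        vec3 slimeOffset(float time)
        {
            vec2 xz = vec2(0.25, 0.75);            // constant float2 holding the X and Z offsets - pick what looks good
            return vec3(xz.x, 0.1 * time, xz.y);   // merge into a float3, with 0.1 * $time driving Y
        }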

  • poopipe
    poopipe grand marshal polycounter

    Shit. I've never seen a swizzle used like that and I'm actually sad I hadn't thought of it.


    Designer is an image processor - nothing more.

    It's not a VFX package, it's not a magic bullet - it's a platform for programmatically fucking with pixels.


    Can you generate animations for use in VFX scenarios? Yes.

    Can you create very clever effects in 3d space? Yes

    Is it doing anything you can't theoretically achieve in a post-process shader if you're not bothered about performance? No

  • Diethyl
    Diethyl node

    I've come to appreciate it a bit more as time has progressed, and I've since found some tutorials that show how it can be integrated into a VFX workflow. You're right, it's not a magic bullet, but it is a very powerful tool. I don't know where I got the idea that it was a VFX tool; I guess I just thought it should be. Maybe in the future Designer will incorporate VFX creation tools/workflows - I think it would be a nice and somewhat logical progression. At any rate, I'm learning that VFX in general is deceptively difficult, and there's no one tool but more like three or more tools in any VFX workflow.

  • poopipe
    poopipe grand marshal polycounter

    I'd love it to support animation natively but I doubt we'll see it - Adobe already have tools for that and they want you to pay for them

    it would be nice to see sbsar support as filters in AFX and Photoshop - I don't think that's entirely unlikely
