Does anyone have an example of a working while loop in the pixel processor? Whatever I do, it does exactly nothing. What I want to do is a sort of slope blur.
Replicating the video example in a pixel processor works fine - I get a final value of 10, so we know that while loops do work in them.
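For reference, the tutorial-style counting loop behaves like this Python sketch (an assumption about what the video example computes - a counter incremented to 10 - not its literal node layout):

```python
# Rough Python equivalent of a counting while-loop function graph:
# "i" is initialised to 0 (a set node before the loop), the loop
# condition is i < 10, and the body increments i, so the value read
# after the loop finishes is 10.
def while_loop_example():
    i = 0          # "set i" before the loop runs
    while i < 10:  # loop condition
        i += 1     # loop body: "set i" to "get i" + 1
    return i       # value read once the loop has finished

print(while_loop_example())  # 10
```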
Looking at your graph... I'm not sure you've quite got your head around how the Sequence node works. To quote the docs:
The Sequence node gives you control over the execution flow of Substance function graphs, by making sure the first branch is fully executed before the second branch.
The output of the second branch is then passed to the node's output. The obvious problem is that you read i and j before they have had values assigned. Flow of execution basically runs along the second branch, and when it hits another Sequence node it attempts to execute what's in that node's first branch - your graph executes roughly in that order (ish - I couldn't be bothered to actually debug it). You can probably declare and set i and j as input parameters on the parent graph to initialise them for the first run.
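The Sequence node's behaviour can be mimicked in plain Python (a toy model of the semantics quoted above, not Designer's actual evaluator - `set_var`/`get_var` are illustrative stand-ins for set/get nodes):

```python
# Toy model of the Sequence node: evaluate the first branch fully
# (usually for its side effects on named values), then evaluate the
# second branch and pass its result through as the node's output.
variables = {}  # stands in for the function graph's named values

def set_var(name, value):
    variables[name] = value

def get_var(name):
    # A get node that executes before the matching set node would fail
    # here (KeyError) - the "reading i and j before they have values"
    # problem described above.
    return variables[name]

def sequence(first_branch, second_branch):
    first_branch()          # executed only for its side effects
    return second_branch()  # this value becomes the node's output

# Correct order: the set runs in the first branch, the get in the second.
result = sequence(lambda: set_var("i", 5), lambda: get_var("i") * 2)
print(result)  # 10
```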
I'm not convinced that nesting Sequence nodes is supported (where it goes from 5/6 to 7 to 8) - it hasn't worked reliably for me in the past. If the above doesn't work, I think you'll want to restructure your logic a bit to avoid that.
And - speculation time... I wouldn't be particularly surprised if Designer prevents you from nesting while loops.
Thank you poopipe. I made a few mistakes when I tried to recreate this node flow from the YouTube video. I obviously wasn't attentive enough. Sorry.
I'm attaching a working example.
I still don't understand a thing about how it really works - I've wasted a day already. The same question of how i and j work before they even have values assigned still stands. And why are there so many "set" nodes in there?
My goal is to recreate a sort of slope blur with unlimited shifts - something I could easily do in Filter Forge a decade ago with their "loop" node.
So far I use a custom one I built from a gazillion vector warps, which lets me "grow" something along the flow map. I hoped a loop node could be faster, but now, looking at this blur example which is slow as hell, I have my doubts.
It's not really slow when you think about it: that 128-pixel square kernel means you're sampling 16,384 times for each pixel. On a 2k map that's a total of 68,719,476,736 sampled points - which is quite a big number.
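Those figures check out with quick arithmetic (kernel and map sizes taken from the post above):

```python
# Sample-count arithmetic for a 128x128 kernel evaluated per pixel
# of a 2048x2048 (2k) map.
kernel = 128
samples_per_pixel = kernel * kernel        # 16,384 samples per pixel
map_size = 2048
total = samples_per_pixel * map_size ** 2  # 68,719,476,736 total samples
print(f"{samples_per_pixel:,} samples/pixel, {total:,} total")
```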
I'm seeing about 160 ms with Unreal 5 running in the background on my machine (3080); I'd say that's acceptable given what it's doing.
If you reduce the kernel size it gets faster very quickly; you can divide $size to compensate for the reduction in total distance (or just use a smaller image - same thing).
The 2 ms version here uses a kernel size of 16, and I've divided $size by 8. This obviously isn't as accurate, but for your simple shape the only real difference is that the cheaper one isn't as smooth.
TBH, you're probably best off chaining a few of these with smaller distances & kernels together - much like the built-in nodes do.
You don't really have much choice - it's a kernel-based filter whichever way you look at it, and since it's parallelised already you have limited options for making major speed improvements. The only way it really gets faster is to do less work. If you chain 8 of these with kernel size 16 in a row, it will be a lot faster than a single one at 128. That should get you the same(ish) sized blur - it might not be quite as smooth, but you save 60 billion operations.
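The saving from chaining can be checked with the same arithmetic as before (eight 16-wide passes versus one 128-wide pass on a 2k map):

```python
# Total sample counts: one pass with a 128-wide kernel versus
# eight chained passes with a 16-wide kernel, on a 2048x2048 map.
pixels = 2048 * 2048
single_pass = 128 * 128 * pixels  # ~68.7 billion samples
chained = 8 * (16 * 16 * pixels)  # ~8.6 billion samples
saved = single_pass - chained     # ~60 billion fewer samples
print(f"saved {saved:,} samples")
```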
Yeah, I got it. I see almost the same speed now after doing what you suggested. I'd rather try to make a simple repeating UV shift, aka slope blur, instead of a kernel-based blur filter, by replacing the scale input with a sort of vector shift - but for some uncertain reason I got something weird.
I sort of did what I initially wanted by modifying their blur example, but I still don't really understand a thing. Why do I have to set the "result" and "u" parameters in two different places, and why does this loop node output only a flat number, so that I have to use another sequence to see the actual result in the end?
Besides, it looks like the "result" that is supposed to repeat with a new shift each time doesn't actually sample the directional vector anew on each iteration, the way it did when I constructed the "loop" manually in previous SD versions by connecting a number of warp nodes one after another. I tried setting the other branches as "before" and "after" but never managed to make it work.
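The repeated-warp idea being described - sample, shift the UV along a direction vector, sample again - can be sketched in plain Python. This is a hedged illustration of the loop structure, not Designer's API: `sample` and `direction_at` are invented stand-ins for a sample node and a slope/gradient input. It also shows why "result" gets set in two places: once to initialise it before the loop, and once to update it inside the loop body.

```python
# Sketch of an iterative slope blur: each step shifts the sampling
# position along a direction vector and accumulates, mirroring a chain
# of warp nodes. "sample" and "direction_at" are illustrative callables,
# not actual Designer nodes.
def slope_blur(sample, direction_at, u, v, steps, step_size):
    result = sample(u, v)            # set "result" once, before the loop
    for _ in range(steps):           # the while-loop body
        du, dv = direction_at(u, v)  # re-read direction at the shifted UV
        u += du * step_size          # shift the sampling position
        v += dv * step_size
        result = max(result, sample(u, v))  # set "result" again each pass
    return result                    # read "result" after the loop ends
```

A quick toy run: with `sample = lambda u, v: u` and a constant direction of (1, 0), four steps of size 0.1 walk the sample point from u = 0 to u = 0.4 and return the maximum sampled value along the way.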