I'm creating a base material for Substance Painter in Substance Designer.
Everything works as expected in Substance Painter, except for the Edge Detect node I used in Substance Designer.
As you can see in the Substance Designer screenshot, the Edge Detect node created the desired black-and-white mask.
The Substance Painter screenshot, however, shows a mess of thin lines/edges, randomly painted in different shades of grey.
I see two solutions for this problem: a) get the Designer Edge Detect node to work in Substance Painter somehow.
b) Achieve the same / similar effect with a different node setup in Designer, which then also works in Substance Painter.
My team and I would be extremely grateful for a solution to this problem!
I was able to isolate the problem to just the Edge Detect node, because the previous node in the graph still gives an identical result in Designer and Painter.
I've already tried solving the problem by setting the Edge Detect node to an absolute resolution of 2K, but this made no difference compared to the "Relative to Parent" setting I had before.
Replies
Nothing appears to fix the issue in Substance Painter.
Did you mean setting the graph's Output Format to Absolute and choosing 16 bits? I tried that.
Or is there another setting to force this that I am missing?
How to reproduce the issue:
1. Create a black-and-white shape in SD, for example with the Arc Pavement node.
2. Connect an Edge Detect node afterwards and hook it up to an output, such as base color or a blending mask.
3. In the Edge Detect node, select a reasonably large Width, to be able to see the issue clearly.
(Selecting a very thin edge makes the issue hard to see.)
4. Export the .SBSAR and import it into Substance Painter as a base material. You can see the issue there.
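For reference, here is what a correct result conceptually looks like: a strictly black-and-white mask where only pixels within the chosen Width of a 0/1 boundary are white, with no intermediate grey values. This is only a minimal pure-Python sketch of that idea (4-connected neighborhood, dilation to grow the edge), not Substance Designer's actual Edge Detect algorithm:

```python
def edge_detect(mask, width=1):
    """Return a mask that is 1 only near a 0/1 boundary of `mask`.

    Conceptual sketch only, not Substance Designer's implementation.
    Uses a 4-connected neighborhood; borders are clamped.
    """
    h, w = len(mask), len(mask[0])

    def neighbors(y, x):
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                yield mask[ny][nx]

    # A pixel is on the edge if any 4-neighbor differs from it.
    edge = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if any(n != mask[y][x] for n in neighbors(y, x)):
                edge[y][x] = 1

    # Grow the 1-pixel edge to the requested width by dilation.
    for _ in range(width - 1):
        grown = [row[:] for row in edge]
        for y in range(h):
            for x in range(w):
                if edge[y][x] == 0 and any(
                    0 <= y + dy < h and 0 <= x + dx < w and edge[y + dy][x + dx]
                    for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1))
                ):
                    grown[y][x] = 1
        edge = grown
    return edge


# A 5x5 binary mask with a filled 3x3 square in the centre.
mask = [
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
result = edge_detect(mask, width=1)
```

Note that the output contains only 0s and 1s regardless of the width. The random grey shades in Painter suggest the node's output is not being evaluated the same way there, which is exactly the discrepancy described above.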
Are others able to reproduce this issue?
So what would be the best way / setting to resolve this issue?
I feel like I've tried pretty much everything by now, and nothing works, which is super frustrating.
Everything works fine in Substance Player.
However, using Player instead of Painter for texture export would cost us at least a few thousand dollars across the whole production, so I don't think that's acceptable.
Generally speaking, we try to use smart materials/anchor points to layer up custom filters etc. rather than building them into materials. In most use cases this performs better, as the material isn't having to re-evaluate a load of expensive operations every time something changes underneath it.
The stuff you can't do that way tends to be for generating very specific maps that feed shaders.
Obviously something is borked, though, and I'd strongly recommend logging the issue with Allegorithmic.