Does anybody know a way to get rid of this? I recall the same thing happening with Photoshop smart objects a decade ago; they eventually fixed it.
In SD it has always been this way when you scale down by some arbitrary factor rather than exactly 1/2. I mean since the very first version, so I've avoided scaling imported bitmaps as much as possible. But maybe there is a simple workaround, a secret trick or something?
Here is Photoshop vs Substance Designer at roughly the same pixel/detail ratio, for example:
Replies
Does it happen with the software renderer?
The exact repro case is a little unclear from the posts in here. Can you explain it step by step and I'll try it out at my end?
why ?
Bilinear filtering is bad at minification (downsampling): it only ever reads the few texels nearest the sample point, so when you shrink by more than 2x it skips source texels entirely and you get aliasing.
Photoshop uses more appropriate filtering methods for downsampling (you can choose which one when you apply transforms, IIRC). These are slower and possibly not easily GPU accelerated. Trilinear and anisotropic filtering are better, but I believe they rely on distance/angle information. I imagine Affinity uses bilinear filtering for the preview and then applies a better filter once you commit the change.
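To see why bilinear minification aliases while an area filter doesn't, here's a toy 1D sketch (function names and the 3.7x factor are illustrative, not from Designer or Photoshop). Bilinear only ever blends the 2 nearest source texels per output sample, so a high-frequency checker signal comes out jumping around; a box filter averages every texel each output sample covers, so the same signal settles near its mean:

```python
def bilinear_sample(src, x):
    """Linearly interpolate src at fractional position x (touches only 2 texels)."""
    i = min(int(x), len(src) - 2)
    t = x - i
    return src[i] * (1 - t) + src[i + 1] * t

def downsample_bilinear(src, factor):
    """Point-sample with bilinear at the new, coarser grid (what a GPU sampler does)."""
    n = int(len(src) / factor)
    return [bilinear_sample(src, (k + 0.5) * factor - 0.5) for k in range(n)]

def downsample_box(src, factor):
    """Average every source texel covered by each output texel (area filter)."""
    n = int(len(src) / factor)
    out = []
    for k in range(n):
        lo, hi = int(k * factor), int((k + 1) * factor)
        window = src[lo:max(hi, lo + 1)]
        out.append(sum(window) / len(window))
    return out

# A high-frequency checker signal: 0, 1, 0, 1, ...
src = [k % 2 for k in range(64)]
print(downsample_bilinear(src, 3.7))  # values swing widely: aliasing
print(downsample_box(src, 3.7))       # values stay near 0.5: detail averaged away
```

Mipmapping fixes this for the GPU case by pre-averaging the texture into half-size levels, so the sampler's 2-texel reads already cover the right footprint.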
So, back to Designer.
On the left of the attached image I've set the mipmap mode on my Transform 2D to Automatic; on the right it's set to Manual with the mip level set to 0. I've scaled the image down non-uniformly, by around 30%.
This is roughly equivalent to setting mip bias on a texture sampler.
If you scale down further, you'll probably want to choose a larger mip level to avoid sparkling artifacts.
To fix Smart Auto Tile you'll have to break open the graph and expose the mip-mapping parameters from any Transform 2D nodes inside it, so that'll be fun (I recommend making a copy and putting it somewhere safe first).