
What can improve Substance Designer performance?

gnoop

I am struggling with complex materials. Things work more or less fine with something simple, but whenever I need a mix of several materials my whole PC gets super slow because of Substance Designer.

I see memory at 99% in red, and everything freezes forever with the blue progress bar stuck.

I wonder which memory it needs most: RAM, VRAM, or both? Would going to 64 GB of RAM solve the issue, or is my video card's 6 GB of VRAM the main problem?

Or is it a cache issue, and I need a rather good SSD for the cache?

Would keeping materials as sub-graphs work better than having everything in the top-level graph?

Replies

  • gnoop

    Thanks Finnn. I usually follow that page's instructions, but they don't help all that much. It also doesn't specify which hardware is critical for SD performance. I recall that when I upgraded my video card last time I noticed nothing improving in SD, so I wonder what it really needs.

    My 12-core CPU is taking a nap while SD is working. The GPU graph also shows no spikes, neither in Compute_0 nor at any other point in the drop-down list of graphs. Yet SD is typically slow as hell, especially if lots of 4k bitmap inputs are involved. So I assume it's a memory issue. Would a 12 GB VRAM video card make any difference, or is it mostly system RAM and my 32 GB is not enough?
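
    One way to narrow that down is to log both system RAM and VRAM while SD cooks a heavy graph and see which one actually fills up. A minimal sketch, assuming an NVIDIA card (nvidia-smi on the PATH) and the psutil package installed:

    ```python
    # Log system RAM and NVIDIA VRAM usage once per second while
    # Substance Designer cooks a graph, to see which one fills up.
    import subprocess
    import time

    import psutil

    def vram_used_mib():
        # nvidia-smi prints used GPU memory as a bare number in MiB
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=memory.used",
             "--format=csv,noheader,nounits"],
            text=True,
        )
        return int(out.strip().splitlines()[0])

    while True:
        ram = psutil.virtual_memory()
        print(f"RAM {ram.percent:5.1f}%   VRAM {vram_used_mib():6d} MiB")
        time.sleep(1.0)
    ```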

  • Finnn

    Generally, huge graphs just take a lot of computing power. SD calculates everything and then holds that information in its cache; when a change occurs in the graph, only the nodes affected by the change are recalculated. So check your system stats again right after you change something.

    It's just the nature of stacking up lots and lots of calculations: at some point every system will take a noticeable time to work through them. Working in 4k obviously requires a lot more calculation than working at a lower resolution. Imagine you have 16,777,216 pixels in your 4k bitmap, and each operation in your graph (they are mostly pixel-based) has to do its thing for every single one of those pixels. When you work at 512x512, you only have 262,144 pixels. You see where I am going with this.
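
    The scaling is easy to see with a bit of arithmetic; a quick sketch of just the pixel counts (the cost ratio is relative to working at 512):

    ```python
    # Per-node workload grows with the square of the resolution, since
    # most Substance nodes touch every output pixel once per operation.
    for side in (512, 1024, 2048, 4096):
        pixels = side * side
        relative = pixels // (512 * 512)  # work relative to 512x512
        print(f"{side}x{side}: {pixels:>10,} pixels ({relative}x the work)")
    ```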

    So you have to adapt accordingly. For instance, when working with a complex graph you might want to lower the resolution you work at, and only go up in resolution to check that everything looks alright or to render your output maps. When you have a material that blends several materials together, you might want to work on them separately.

    Also, you can optimize complex graphs by reusing noises, avoiding expensive nodes, etc., as described in the performance guide.

  • Finnn

    About your system specifications: an SSD is pretty much mandatory imo. But I am pretty sure that SD holds its calculations in RAM. I personally have 32 GB and have never really had problems with it. I am not entirely sure, but I think most of the operations are calculated by your GPU, because it consists of many cores optimized for mathematical operations. So having a fast GPU will help performance a lot.

    Decluttering your PC might help as well; it might not be SD that is flooding your memory, another program could be causing it too. One thing I came across a lot when researching performance issues of any kind is the suggestion to set your process to "realtime" priority (when you are on Windows). Please don't ever do that, it can cause a lot of issues in your system. Operating systems generally know how to prioritize processing power the right way.

  • poopipe

    You need a GPU with a wide bus and a lot of memory to deal with big graphs.

    Your CPU and RAM make little difference provided they're relatively modern.

    To work at 4k smoothly you want something recent with an 80 or 90 on the end of its name, and even that requires that your graphs stay small.

    Generally you want to make your input bitmaps inherit their size from the graph, and drop the resolution while working.

    You can improve responsiveness to parameter changes by modularizing your graphs and arranging them so that you're editing parameters as close to the output as possible.
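
    A toy model of why that last point helps, assuming node results are cached as described above (the graph and node names here are invented for illustration): a parameter change only recooks the edited node and everything downstream of it, so an edit near the output touches fewer nodes.

    ```python
    # Toy model: cached node results mean a parameter change only
    # recooks the edited node plus everything downstream of it.
    graph = {  # node -> consumers of its output (invented example)
        "noise": ["warp"],
        "warp": ["blend"],
        "late_levels": ["blend"],
        "blend": ["output"],
        "output": [],
    }

    def recooked(changed):
        # collect the changed node and all nodes downstream of it
        dirty, stack = set(), [changed]
        while stack:
            node = stack.pop()
            if node not in dirty:
                dirty.add(node)
                stack.extend(graph[node])
        return dirty

    print(recooked("noise"))        # early edit: 4 of 5 nodes recook
    print(recooked("late_levels"))  # edit near the output: only 3 recook
    ```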

  • gnoop

    So do you think adding more system RAM, up to 64 GB, wouldn't change much? It's just the cheapest option I'm considering. Perhaps it would keep the whole cache in system RAM. I honestly didn't notice any difference when I upgraded my video card from 4 to 6 GB last time.

    As for editing parameters toward the end: that's not really possible, since the parameter I tweak most is usually a scale, typically in the base FX-maps.

    By the end of the graph there is not that much left to tweak.

    Also, the most terribly slow things in SD are opening and switching graphs, like when you want to copy something from one graph to another. It sometimes takes forever just to open a graph if it uses lots of input bitmaps. And I usually have to drop them in at twice the target resolution at least, because SD constantly craps over every bitmap image, so I use the same approach I do in Painter: working at twice the target resolution at least.

    It's sometimes genuinely quicker to use the content-aware move tool plus an action recording to make a new "seed" in Photoshop than to do the same in SD.


    ps. I am using my own nodes and FX-maps that I composed a decade ago in SD. Not sure, maybe that's the reason and it's time to redo them from scratch? Many of them I no longer remember how I made. Although I thought they should update automatically with the new atomic nodes.

  • poopipe

    You need more GPU. More RAM won't hurt, but it also won't help.


    A 1070 is workable at 1-2k, a 1080 is fine at 2k. To work at 4k all the time you need something like a 2080 Ti or a 3080, and even that won't be buttery smooth.

    You also need the rest of your PC to be able to feed the GPU - we tested a 2080 Ti against a 1080 on one of our old 16-core 2.5 GHz Xeons and there was bugger all difference. The same test on a 9700K showed a marked difference.


    If it's slow accessing the disk (loading bitmaps), get a faster disk to put them on - fast NVMe drives aren't prohibitively expensive unless you're looking at large capacities (4-8 TB) - just make sure you get a fast one rather than the cheap shit ones.



    But ultimately - work at a lower resolution and turn it up only when you need to - otherwise you're pissing time away.

  • gnoop

    Thanks for the info, poopipe.

    I tried the idea of working at target resolution without downscaling at final export, but I always got a kind of low-res result, with rough-looking details.

    When you have pebbles of just a few pixels, or 2-pixel-wide scratches/dents at target resolution, they always look better when downscaled at export. Less artificial or something. In fact, Photoshop still downscales way better than SD imo, with a kind of special magic in-between pixels.
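
    That work-at-2x-then-downscale-on-export step can also be scripted outside of SD; a rough sketch with Pillow, where the filenames are placeholders:

    ```python
    # Downscale a 2x working-resolution texture to target size.
    # Lanczos resampling averages neighbouring pixels, which is why
    # tiny 1-2 px details survive better than when authored at size.
    from PIL import Image

    hi_res = Image.open("albedo_4096.png")  # placeholder filename
    hi_res.resize((2048, 2048), Image.LANCZOS).save("albedo_2048.png")
    ```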


    Also, I have just figured out why my sbs sometimes takes 10 minutes to load: it's the cryptomatte masks I feed into the sbs as PSD files. SD takes forever to load layers from PSD files. Wish they would do something about that. I bet it's totally unrelated to GPU computing power.
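
    If the layer data isn't needed inside SD, one possible workaround is to flatten those PSDs to PNG before importing them, so SD never has to parse PSD layers at load time. A minimal sketch, assuming the psd-tools package (the folder path is a placeholder):

    ```python
    # Flatten cryptomatte-mask PSDs to PNG before feeding them to SD,
    # so it never has to parse PSD layer data at load time.
    from pathlib import Path
    from psd_tools import PSDImage

    for psd_path in Path("masks").glob("*.psd"):
        flat = PSDImage.open(psd_path).composite()  # merge all layers
        flat.save(psd_path.with_suffix(".png"))
    ```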
