Home › Technical Talk

What Blender Tasks Are Still Wasting Your Time? Looking for Real Automation Targets

I build Blender tools with a focus on automation and production speed, primarily for game art workflows.

Before deciding what to build next, I want to hear directly from people who use Blender seriously and run into the same problems over and over—the kind where you think “this should not be manual in 2026.”

I’m not looking for abstract feature ideas or wishlist items. I’m looking for concrete, repeatable pain points that:

  • You hit frequently (daily or weekly)

  • Are tedious, mechanical, or error-prone

  • Feel like they should be solvable with a tool or add-on

  • Aren’t already cleanly solved by an existing add-on

If you respond, please be specific. Context matters more than volume.

Helpful examples:

  • “When doing X, I always have to manually do A → B → C because Blender doesn’t expose Y.”

  • “This part of my pipeline breaks unless I double-check Z every time.”

  • “I waste time fixing the same class of mistake over and over.”

Less helpful:

  • “Blender’s UI is bad”

  • “I want Maya feature X”

  • “Everything should be procedural”

If it helps, you can include:

  • Your role (environment artist, character artist, tech artist, etc.)

  • The kind of assets you work on

  • Where the friction shows up (modeling, UVs, baking, export, scene management, cleanup, validation, etc.)

I can’t promise I’ll build everything that’s mentioned, but I will prioritize problems that show up across multiple workflows and can be solved cleanly.

If you’ve ever thought “I should script this, but I don’t have time,” that’s exactly the kind of problem I want to hear about.

PS: Anyone whose suggestion I build will get a free copy with long-term support.

Replies

  • gnoop
    Simple automation tools are easily done with ChatGPT; that is actually where Blender shines.

    What I’d love to see is deeper Blender customization: a 2D vector editor mode with editable spline brushes driven by Grease Pencil. The whole "draw" mode is quite inconvenient currently.

    I want something like the "skeletal" brushes from the old Creature House Expression 3 (https://en.wikipedia.org/wiki/Creature_House_Expression) or what Microsoft later made from it: a Grease Pencil stroke with up to 10 alternating pieces of RGBA image (not necessarily a square dab) rolling out along the stroke one after another in random order. Same as roll-out brushes in ZBrush, but with a different random order each stroke, plus start and end caps.

    Not like the ID animated brushes in Substance Painter, which use square dabs only, switching by ID and only rotating to follow the spline tangent; rather, a true smooth curve deform by the stroke spline. Plus the ability to randomly shift the alternating pieces around, both dab image planes and instanced geometry. In a word, a whole new way to draw with Grease Pencil for both textures and geometry scattering, with a seed option.

    Also the ability to give flat polyline shapes a texture/material fill, with their scale and move gizmos snapping to the supposed final output pixel grid to avoid sub-pixel blurring. Plus an option to feather their edges with noise, like in old Expression. All for the purpose of easy material compositing, which is a pain in both Substance Designer and Painter. It would be nice to employ Blender’s compositor for this, but I think flat shapes in a regular plane with Eevee are okay.

    A view lock for a top ortho camera vs. 3D. A quick switch of all brush strokes and shape fills to normal map/roughness or whatever. I doubt Blender has enough real-time power to do all of that at once; just an export script that would replace all the source bitmaps with the corresponding versions, keeping everything in place. Having only RGBA channel input is hard-coded in Blender, I think.

    Most of this is perfectly possible with geometry nodes + shader nodes, and Eevee is okay for real-time preview. Yet the whole system you’d have to maintain is a puzzle and a headache to keep synced. But I think all of this is scriptable.

    A further step could be Grease Pencil brushes doing fluid simulation, but I don’t think that would be real-time or easy. For now, I just want those Expression tools I’ve been missing for decades.
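    For what it’s worth, the pixel-grid snapping part is the easiest piece to sketch: it is just quantizing a coordinate to texel-sized steps. A minimal, illustrative sketch of that logic only; the `canvas_size` and `resolution` names are assumptions, not Blender API, and wiring it into a gizmo or driver would be the Blender-specific work:

```python
def snap_to_pixel_grid(value, canvas_size, resolution):
    """Quantize a coordinate (in scene units) to the nearest texel
    boundary of the final output image, to avoid sub-pixel blurring.

    canvas_size: extent of the painting plane in scene units
    resolution:  output image size in pixels along the same axis
    """
    texel = canvas_size / resolution        # size of one pixel in scene units
    return round(value / texel) * texel

# Example: a 2-unit canvas rendered at 1024 px; one texel is 2/1024 units,
# so a slightly-off coordinate lands back on the pixel grid.
snapped = snap_to_pixel_grid(0.7503, 2.0, 1024)   # -> 0.75
```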

  • anvilinteractive
    gnoop said:
    Simple automation tools are easily done with ChatGPT; that is actually where Blender shines. What I’d love to see is deeper Blender customization: a 2D vector editor mode with editable spline brushes driven by Grease Pencil. …

    This is beautiful! Thank you for the suggestion. I will start working on this and offer it to you for free for life once it is finished.
  • pixelb

    Yess! My man!

    I’m an environment artist with a background in organics. There are not many problems left that I would not myself try to script around, but the biggest choke point for me has been retopologizing and adding UV seams to lowpoly organic models. I'll use these rocks as an example:

    Overwatch 2 - Wuxing University

    When I do a highpoly to lowpoly workflow for a large rock, I typically sculpt the model in ZBrush, dynamesh the pieces of the highpoly together, bring the result into Blender, and decimate it with the decimate modifier.

    The resulting lowpoly is 500 - 5k tris. I usually then have to spend 10 - 15 minutes cleaning up the lowpoly, spinning edges to get rid of skinny triangles and welding verts to kill short edges. 

    Then I have to define the UV seams. Autoseams provide a starting point but I invariably end up with too few or too many seams. Because my lowpoly doesn’t have a clean edge flow it takes a while, again 10 minutes or so, of selecting and marking edges as seams to split my rock up into three to five UV islands.

    Then of course I have to export my models to Toolbag and bake normal and cavity maps, update the PSD or Substance Painter file, and export my final textures. That part’s not a big deal.

    If I’m testing a rock in-engine it might take 3-4 iterations of this whole loop to get a rock shape that works well and intersects nicely with other rocks.

    I have been looking for years, with no success, for a way to automate cleaning up and UVing the lowpoly in a way that doesn’t require 30 to 60 minutes of manual labor. I’ve tried doing it in Houdini but wasn’t impressed with the results and I prefer to stay in Blender as much as possible.
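    For context, the decimate, weld, and auto-seam loop I keep doing by hand looks roughly like this as a script. This is only a sketch: the target tri count, merge distance, and angle limit are illustrative guesses, and the Smart UV Project pass only reproduces the "starting point" seams I described, not the final islands.

```python
import math

# Sketch of the decimate -> weld -> auto-seam loop described above.
# The bpy parts only run inside Blender; the thresholds are guesses.
try:
    import bpy
    import bmesh
except ImportError:          # allow the pure helper to be used outside Blender
    bpy = bmesh = None

def decimate_ratio(tri_count, target_tris):
    """Decimate-modifier ratio that lands roughly on target_tris."""
    return min(1.0, target_tris / max(tri_count, 1))

def clean_lowpoly(obj, target_tris=2000, merge_dist=0.005, seam_angle_deg=66.0):
    """Decimate obj, weld short edges, and lay down starting-point seams."""
    tri_estimate = len(obj.data.polygons) * 2      # rough: one quad ~ two tris
    mod = obj.modifiers.new("AutoDecimate", 'DECIMATE')
    mod.ratio = decimate_ratio(tri_estimate, target_tris)
    bpy.context.view_layer.objects.active = obj
    bpy.ops.object.modifier_apply(modifier=mod.name)

    # Weld verts closer than merge_dist to kill the shortest edges.
    bm = bmesh.new()
    bm.from_mesh(obj.data)
    bmesh.ops.remove_doubles(bm, verts=bm.verts, dist=merge_dist)
    bm.to_mesh(obj.data)
    bm.free()

    # Auto seams as a starting point, to be adjusted by hand afterwards.
    bpy.ops.object.mode_set(mode='EDIT')
    bpy.ops.mesh.select_all(action='SELECT')
    bpy.ops.uv.smart_project(angle_limit=math.radians(seam_angle_deg))
    bpy.ops.object.mode_set(mode='OBJECT')
```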

  • anvilinteractive
    pixelb said:

    Yess! My man!

    I’m an environment artist with a background in organics. … I have been looking for years, with no success, for a way to automate cleaning up and UVing the lowpoly in a way that doesn’t require 30 to 60 minutes of manual labor.

    I read this carefully, and you’re not crazy; this is a real gap in most organic pipelines.

    Your problem isn’t “retopo” in the traditional sense. It’s that decimation destroys semantic edge information, and every downstream step (cleanup, seam placement, iteration) is paying that tax again and again. Autoseams fail because they’re blind to intent, not because you’re using them wrong.

    We’ve been working on automated quad-based retopology (Quadify) that already handles messy organic inputs well, including partial targeting. What’s interesting here is pairing retopo and UV generation as a single decision, instead of two disconnected steps. If the edge flow is intentional, seam placement becomes almost trivial, especially for rocks that only need 3-5 islands and don’t care about deformation.

    If you’re open to it, I think this can be pushed into a mostly one-click Blender-side step that replaces the 30-60 minutes you described, not just speeds it up.

    What engine are you validating these assets in, and are these rocks tiling, hero, or pure set-dressing?

  • pixelb

    You’re right, decimation is a better word for what I’m doing than retopo. Ideally yes, the decimation and unwrapping would all be handled in one step. Best case would be a modifier that I put on the model once and forget about.

    I know there are quad-based topology solutions out there that would probably give me better auto UV seams, but triangulated meshes are better for adaptive density (putting more detail where I want it) and give me the absolute control I need over the rock's silhouette.

    I’m brainstorming here, but if I could define where I want my UV islands on my highpoly, say via the vertex color, then do a decimate that preserves those island borders, followed by a UV unfold and pack of the results, that would get me close to where I want to go.

    My example is from a proprietary engine but I’m mostly targeting Unreal these days. The assets I want to generate are hero or generic playspace rocks. I have a different workflow I use for tiling rocks.
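    The vertex-color idea reduces to: before unwrapping, mark any edge whose two faces carry different island colors as a seam. A sketch of just that step, assuming an island-ID color attribute has been painted onto the mesh; the layer name `island_id` and the tolerance are made up, and the decimation-that-preserves-borders part is not covered here:

```python
# Mark UV seams where painted "island" vertex colors change between faces.
# The bpy/bmesh parts only run inside Blender; colors_differ is pure Python.
try:
    import bpy
    import bmesh
except ImportError:
    bpy = bmesh = None

def colors_differ(col_a, col_b, tol=0.1):
    """True if two RGBA vertex colors belong to different islands."""
    return any(abs(a - b) > tol for a, b in zip(col_a, col_b))

def mark_seams_from_vertex_color(obj, layer_name="island_id"):
    """Mark seams on edges whose adjacent faces have different average
    vertex colors in the given color attribute layer."""
    bm = bmesh.new()
    bm.from_mesh(obj.data)
    layer = bm.loops.layers.color.get(layer_name)
    # Average each face's loop colors into one RGBA value per face.
    face_col = {}
    for f in bm.faces:
        n = len(f.loops)
        face_col[f] = tuple(sum(l[layer][i] for l in f.loops) / n
                            for i in range(4))
    for e in bm.edges:
        faces = e.link_faces
        if len(faces) == 2 and colors_differ(face_col[faces[0]],
                                             face_col[faces[1]]):
            e.seam = True
    bm.to_mesh(obj.data)
    bm.free()
```

    With seams in place, a standard unwrap and pack would follow; the open problem remains making the decimate step respect those color borders.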

  • okidoki
    The "problem" simply is that there is no "make it so" button on any computer.. except if one chooses to use some AI, trained on the knowledge that dozens and dozens of artists accomplished over the years.. but then one finally does not have to hire an artist to do it.

    Humankind evolved a lot when people stopped doing everything for themselves and specialized into crafts.. of course, one had to trade something one could do "better" or faster than others for something they made. This got easier again when one didn't have to trade directly but could use money as a general trading exchange..

    <irony sarcasm-level="high">
    Now, when "everybody" can do "anything" with just the push of one or two buttons (keys).. then nobody has to specialize anymore.. but then also nobody has to buy anything from anyone else.. so we do not need any money.. and will be free!!
    </irony>

    But then.. this may be a bit off topic.. :tongue:
  • pior
    "I build Blender tools with a focus on automation and production speed, primarily for game art workflows."

    What are these tools ?