Hey Michael, that sounds promising indeed. Fingers crossed he'll be back for real.
I saw that Pie Menu Editor just got an update. It seems it only fixes the most glaring issue with 3.6, with further fixes planned, so I'm still holding out. But it's promising, to say the least.
Another one I can't seem to live without - and I might not be the only one who's been a little anxious about its development status - is Modifier List, which finally got updated today to support newer Blender features:
https://blenderartists.org/t/modifier-list-1-7-5/1147752
Good times!
I've stopped using Modifier List because it often lags behind current releases, and breaks in significant ways. I loved it, but just can't rely on it :/
Hello all,
When color picking in 3D texture paint mode, is there a way to have the color picker source its color from the actual raw pixel data of the current texture, even though the full result of the shading is being displayed (shader + viewport color management)?
Note that I am not just talking about avoiding the influence of viewport color management here, or even just avoiding the influence of the 3D shading of the model - I believe these two aspects have been addressed, at least partially, in recent releases. What I mean is, for instance, being able to color pick and paint the white and black of a texture mask while displaying the result of the whole shader; or color picking from and painting on an RGB texture even though its display may be affected by a hue shift node somewhere in the shader.
I hope this makes sense somehow :D FWIW I am using 3.1.0.
Yeah, I guess if you try to stay current then you're in a far worse spot than I am, since I'm only updating my production setup once or twice a year. Still looking at replacing Pie Menu Editor with a custom solution longer term, though, since that one is just far too crucial for me to be able to use the software at all. I'm so lost when I do a quick test install without it. 😂
Here's something neat: https://blenderartists.org/t/weighttoolbox-addon/1460981
This addon adds the usual skinning tools you can get elsewhere, like value set/copy/paste and mirror, but it also offers what I think is the first Blender implementation of a skinning workflow I missed dearly from 3ds Max: skinning a low-poly mesh and transferring/interpolating the weights onto a higher-res one. In Max that's done with the Skin Wrap modifier; here it's called 'Wrap Weight' - 0:58 into the video:
In Max I used to stack like three models deep for skinning - a very basic proxy like the one shown in this video, driving a low-poly but articulated mesh with simple fingers and proper joint topology, and the actual model driven by that. This way I was mostly just editing the weights on the skinning proxies instead of having to deal with skinning higher-res geometry.
Of course, skinning in Max was also a total disgrace by comparison, so working this way was more essential to staying sane there than it is in Blender - but still. Looking forward to giving this a spin with my new work.
Unless I am mistaken, isn't this just a button calling a weight transfer operation (with the appropriate setting for source changed from the default "active layer" to "name") ? Transferring weights that way from and to models of varied densities has been possible for a long while now. Would you mind describing a practical use case of such transfer that would not be achievable using the default toolset ?
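(For reference, here is roughly what that built-in transfer looks like as a script - a minimal sketch with placeholder object names, assuming the weights live on a skinned low-poly proxy and should land on a high-res mesh. The operator transfers from the active object to the selected ones.)

```python
import bpy

# Placeholder names - swap in your own objects.
proxy = bpy.data.objects["SkinProxy_lowpoly"]    # source: skinned low-poly proxy
target = bpy.data.objects["Character_highres"]   # destination: dense mesh

bpy.ops.object.select_all(action='DESELECT')
target.select_set(True)
proxy.select_set(True)
bpy.context.view_layer.objects.active = proxy    # transfer goes from active to selected

bpy.ops.object.data_transfer(
    data_type='VGROUP_WEIGHTS',
    use_create=True,                    # create missing vertex groups on the target
    vert_mapping='POLYINTERP_NEAREST',  # interpolate weights across the nearest source face
    layers_select_src='ALL',            # take every group on the proxy...
    layers_select_dst='NAME',           # ...and match the target groups by name
)
```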
On a different topic: after years of writing off the Blender implementation of Metaballs as unusable because of the seemingly nonsensical way it handles negative shapes (as opposed to how other software does it), I've finally made sense of it. Nothing in the UI suggests it: Blender metaballs have a concept of "stacked value", using hidden positive and negative numbers and resulting in a behavior quite different from that of Booleans.
- If two positive Metaball objects overlap, the common volume hidden inside the resulting volume (which would be called the intersection in Boolean terms) is now a "double positive".
- When a Metaball gets set as "negative" in the Metaball properties panel, it is set as "single negative". You see where this is going ...
- ... this means that bringing a subtractive Metaball over two positively blended Metaballs will *not* remove the volume where the two original volumes intersect.
- To get rid of it, the subtractive Metaball needs to be duplicated, for the resulting negative volume to reach a value of -2.
Like so :
I for sure have never seen this explained anywhere, and I don't think it is self-evident to anyone - and this unusual behavior (compared to other Metaball systems relying on an outliner/relationship tree) is what led me to write off this tool as unpredictable originally. I hope this helps!
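For anyone who would rather poke at this from the Python console, here is a minimal sketch of the same setup (object names, positions and radii are arbitrary): two overlapping positive balls plus two stacked negative elements in the overlap, since a single negative only cancels a value of 1.

```python
import bpy

# Build a metaball object with two overlapping positive balls.
mb = bpy.data.metaballs.new("StackedDemo")
obj = bpy.data.objects.new("StackedDemo", mb)
bpy.context.scene.collection.objects.link(obj)

for x in (-0.5, 0.5):
    el = mb.elements.new(type='BALL')
    el.co = (x, 0.0, 0.0)
    el.radius = 1.0

# The overlap region is "double positive", so one negative element is not
# enough: stack two negatives there to reach -2 and actually carve it out.
for _ in range(2):
    neg = mb.elements.new(type='BALL')
    neg.co = (0.0, 0.0, 0.0)
    neg.radius = 1.0
    neg.use_negative = True
```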
Yeah, they're quite powerful, just a pain to use! I'm looking forward to more SDF features being implemented, as some have recently landed in master. Not exactly the same, but the same kind of application and results (when turned into polygons).
Following up on this topic (metaballs, remeshing, SDFs): what are the options currently available beyond the default metaballs and the Remesh modifier?
The Remesh modifier could be great, but the fact that it cannot be applied to multiple objects at once makes it hardly usable ... Surely there has to be some clever workaround making the continuous remesh of multiple objects possible? Like running a script manually to fetch a bunch of objects from a list and make a merged duplicate of them as one single mesh, ready for remeshing. That would be a rather viable semi-realtime solution ...
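A rough sketch of that script idea, with placeholder names (a collection called "RemeshParts" and an arbitrary voxel size): duplicate every mesh in the collection, join the copies into one throwaway object, and drop a voxel Remesh modifier on the result.

```python
import bpy

def build_remesh_proxy(collection_name="RemeshParts", voxel_size=0.05):
    parts = [ob for ob in bpy.data.collections[collection_name].objects
             if ob.type == 'MESH']

    # Duplicate the source objects so the originals stay untouched.
    copies = []
    for ob in parts:
        dup = ob.copy()
        dup.data = ob.data.copy()
        bpy.context.scene.collection.objects.link(dup)
        copies.append(dup)
    if not copies:
        return None

    # Join all duplicates into a single mesh.
    bpy.ops.object.select_all(action='DESELECT')
    for dup in copies:
        dup.select_set(True)
    bpy.context.view_layer.objects.active = copies[0]
    bpy.ops.object.join()

    # Add the Remesh modifier to the merged result.
    merged = bpy.context.view_layer.objects.active
    merged.name = "RemeshProxy"
    remesh = merged.modifiers.new("Remesh", 'REMESH')
    remesh.mode = 'VOXEL'
    remesh.voxel_size = voxel_size
    return merged

build_remesh_proxy()
```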
Oh, there is a very easy solution to this: you can just put your parts into a collection, instance that collection in a geometry nodes graph (it may need a Realize Instances node), add this graph to a placeholder object, and add the Remesh modifier after the geometry nodes modifier - and voila.
Well @kio , I must say I wasn't expecting it to work *that* well ! Excellent stuff, and so simple. Much thanks !
Is there some clever way to mute the "Realize Instances" node directly from the 3D viewport (to act as a final preview toggle, allowing to work on the assembly at full performance without the remesh slowdown) ?
pior said: Is there some clever way to mute the "Realize Instances" node directly from the 3D viewport (to act as a final preview toggle, allowing to work on the assembly at full performance without the remesh slowdown) ?
Well, I don't think so - just the usual: turn off the modifier for the time being. But it would be trivial to write a little operator which toggles all the modifiers on the selected object, or something like that.
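Something along these lines should do it - a minimal sketch of such an operator (the id name and label are made up), toggling viewport visibility of every modifier on the selected objects so it can be bound to a hotkey or a pie menu:

```python
import bpy

class OBJECT_OT_toggle_modifier_preview(bpy.types.Operator):
    """Toggle viewport display of all modifiers on the selected objects"""
    bl_idname = "object.toggle_modifier_preview"
    bl_label = "Toggle Modifier Preview"

    def execute(self, context):
        for ob in context.selected_objects:
            for mod in ob.modifiers:
                mod.show_viewport = not mod.show_viewport
        return {'FINISHED'}

def register():
    bpy.utils.register_class(OBJECT_OT_toggle_modifier_preview)

def unregister():
    bpy.utils.unregister_class(OBJECT_OT_toggle_modifier_preview)

if __name__ == "__main__":
    register()
```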
Introducing Baked Blender Pro Suite: Elevate Your Digital Art Workflow!
Hey fellow artists and Blender enthusiasts!
I'm thrilled to announce the launch of Baked Blender Pro Suite, a Blender plug-in designed to make your digital art creation process a little bit more fun. Whether you're a seasoned artist or just getting started, this plug-in is packed with features to take your Blender experience to the next level.
With Baked Blender Pro Suite, you can:
- Access a vast library of high-quality assets, including models, materials, and geometry node setups, to enhance your projects.
- Place assets with simple drag-and-drop usability.
- Get regular updates with new assets, providing fresh inspiration and expanding your creative possibilities.
- Engage with our vibrant community, share your work, and exchange ideas, tips, and tricks.
We believe in the power of collaboration and continuous improvement, so we invite you to try out Baked Blender Pro Suite and provide us with your valuable feedback. We're committed to refining and expanding our offerings based on user input.
Visit our Gumroad to download the plug-in and explore the free assets.
Join us on this exciting journey and let's shape the future of digital art together!
Download link: https://bakeduniverse.gumroad.com/l/bbps
YouTube: https://youtube.com/@bakeduniverse
Discord: https://discord.gg/bakeduniverse
Hey! Thought I would share my addon with this crowd: https://github.com/pixelbutterfly/vertex_color_stylizer
It's primarily for "hardening" your vertex colors, giving you that "polygonal illustration" look, but I also threw in a few utility functions I like having in my day-to-day as an environment artist. For instance, Blender doesn't have a built-in way to invert a single color channel (or alpha!), but I often need to do just that when using vertex colors to blend textures, so I added a button for it. It's also nice being able to set vertex colors directly from Edit Mode rather than dropping into Vertex Paint mode all the time.
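For the curious, the channel-invert idea boils down to something like the sketch below (not the addon's actual code): flip one channel of the active color attribute on the active mesh object, run from Object Mode.

```python
import bpy

ob = bpy.context.object
attr = ob.data.color_attributes.active_color  # active color attribute (3.2+ API)
channel = 3  # 0 = R, 1 = G, 2 = B, 3 = A

for item in attr.data:
    col = list(item.color)
    col[channel] = 1.0 - col[channel]  # invert just that channel
    item.color = col
```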
@okidoki I miss being able to use Shift+1, +2 etc. to access the second set of layers; that looks a bit overkill.
I think that's only a case of adjusting your keymap!
Do a search for 'Collection' in the keymap, scroll down to Object Mode and study how the first 10 are set up. Currently shift+numbers toggle collections on and off (which I like), but you can disable those (I recommend this over changing or deleting), and set up your own alternatives. As another option, you could use alt+numbers to reach the 10 higher collections?
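If you would rather script it than click through the keymap editor, a sketch like the one below (addon-style keymap registration, assuming the standard object.hide_collection operator) binds Alt+1 through Alt+0 to collections 11-20:

```python
import bpy

addon_keymaps = []

def register():
    kc = bpy.context.window_manager.keyconfigs.addon
    if not kc:
        return
    km = kc.keymaps.new(name="Object Mode", space_type='EMPTY')
    keys = ['ONE', 'TWO', 'THREE', 'FOUR', 'FIVE',
            'SIX', 'SEVEN', 'EIGHT', 'NINE', 'ZERO']
    for i, key in enumerate(keys):
        kmi = km.keymap_items.new("object.hide_collection", key, 'PRESS', alt=True)
        kmi.properties.collection_index = i + 11  # reach collections 11-20
        addon_keymaps.append((km, kmi))

def unregister():
    for km, kmi in addon_keymaps:
        km.keymap_items.remove(kmi)
    addon_keymaps.clear()

if __name__ == "__main__":
    register()
```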
I miss this too - the simplicity of it. All the new "replacements" still don't keep Blender from turning into a mess of doubly hidden objects, hidden groups, and time wasted untangling the Gordian knot of scene management.
@Michael Knubben I'm aware of these hotkeys, but changing them won't do it, because in some projects I'd have one set of assets I'm working on and two or more others. I don't think there is a workaround for this; even if you could change the "index" of a collection, you would have to do it per project.
It's very annoying to do it all in the outliner with the hotkeys as they currently work; for instance, working on a project with two sets of assets that will all have hp/lp/nanite versions becomes a pain in the ass. It would be so much faster and easier if you had the old system: 1 hp, Shift+1 nanite, Alt+1 lp, etc...
Does anyone feel that snapping has been messed up in recent Blender releases? I've been using Blender professionally every day since 2.93 and I think it smokes Autodesk software any day of the week, but since I updated from 3.4.1 to 3.6.0 there is some majorly annoying behaviour and I've been pulling my hair out.
Snapping doesn't seem to work properly anymore. If I try to snap a vertex to another vertex (in Edit Mode, on the same mesh), or use any axis constraint, it feels as if snapping has backface culling active even though it doesn't. I use this all the time when modeling, and in 3.6.0 I'm forced to orbit the view or hide faces to be able to snap to vertices on the other side of the mesh. These small adjustments and extra operations compound over a day, so an operation that used to work flawlessly now takes me two or three times as long. Settings, addons and snap options are identical between 3.4.1 and 3.6.0; here are two screenshots for comparison:
3.4.1 behaving as expected
and here is 3.6.0
The model here is in X-ray mode. I suspect something wonky is going on with the depth detection or culling, because it snaps to other vertices that are closer to the camera. It's also not a camera issue, because as I said the settings are identical across the two versions - I've set them manually, one by one, in both. If it helps, I always run Blender as a portable version; there is no Blender installed locally on my system, because I want to keep different versions from interfering with each other.
Any suggestion is greatly appreciated if anyone has an idea of what is going on here.
I have always used the Bevel modifier with few segments, maybe 4 max, but for one project I need to use more and I'm seeing some weird artifacts - not sure if they were always there or if it's a bug. These corners are all wonky when using 12+ segments; it doesn't happen for all meshes though.
@Udjani iirc it's not a new thing. For 270-degree angles and steeper it is mostly not an issue. It starts below 12 segments already, but gets more obvious the more segments you have, and it depends on the "bevel flow" as well. In the rare cases where I used it, setting the profile to ~0.6-0.66 helped, depending on the model.
I'm preparing a couple of models for my demo reel, and am finding it to be more difficult than expected to render curved wireframes on a model in Blender. This is what I'm talking about:
The model is subdivided, but the wireframe has the resolution of the original low-poly geometry and smoothly follows the curvature of the subdivided mesh - a Subdivision modifier with 'Optimal Display' activated, so to speak, but renderable in EEVEE or Cycles. Is there an easy way to do this? I have found some complicated setups online using geometry nodes for this, but to be honest I feel little motivation to reconstruct an overly complex setup for such a seemingly obvious and simple effect. How would you do it? Doesn't matter if it's a material or geometry.
I think the easiest way to render the wireframe is using the 3D viewport with *Viewport Overlays* set to wireframe and *Viewport Shading* set to rendered.. and then directly using *View -> Viewport Render Image*.. and for a more subdivided wireframe.. just apply as many levels as you want..
Shading breaking on small objects: in the second image the shading is not looking good. If I increase the size of the object and apply the transform it gets fixed, but I need the object to stay small, with transforms applied.
Is there a way to fix this without having to apply the weighted normals modifier?
Is that Solid shading mode? Curious if these artifacts also appear in Material Preview or Rendered modes.
This might be a math precision problem.. because I tried to recreate it, and after a while I realized I hadn't applied the scale.. so..
..there is a slight difference in shading just because the scale is normalized to 1.0.. So I assume the WeightedNormal modifier also computes a slightly different result.. and now I see that I got it wrong: you don't want the modifier in the first place at all... ..okay, left: small scale, right: normalized scale; top: without WeightedNormal modifier, bottom: with (and with a badly chosen angle..)
..I assume you want this edgy part slightly beveled and the round part.. well, (perfectly) round.. IDK if something like a normal transfer, or a normal bake from a higher-poly object (with explicitly chosen bevels on the edgy part), would suit you.. or "just" adding some more detail to the round part..
Textures made in Marmoset 4 and rendered in Blender 3.6 show seams. I've hunted down all the settings I could find in the overlays and texture nodes, but no luck yet.
Quick render.
Seams.
UV shells in marmoset (above).
Marmoset shell padding seems fine. This is something recent as the character doesn't show seams from earlier Blender versions.
I'm pretty sure it is a checkbox I can't find. Any help appreciated.
Hmm.. I remember someone asking a similar question (somewhere), and indeed the UVs were (somehow) accidentally "shrunken".. they didn't fit anymore.. Another (weird) idea: usually in Blender (default theme) the overlay color is white, and some orange if selected.. additionally marking the faces in a faded orange.. like so:
You don't have a texture with the UV borders included in yellow (as a helper layer from Marmoset, or generated earlier)?? It seems they are yellow in the render and in the texture?? (Maybe you are even using some multi-layer texture.. with a UV-border layer.. ???) ..because I'm not aware of any Blender checkbox to show UV island borders on the rendered scene (or any addon to do so).. only UV Editor -> UV -> Export UV Layout. (Just seeing now: the texture image seems to be from Marmoset?)
Thanx okidoki :} Going through your ideas I opened the albedo map in GIMP and found the problem:
Toolbag isn't baking the padding. The padding is slim, but it shouldn't export like this. The model was exported with smoothed normals from Blender. For now I'll fix it in post. If anyone has any ideas I'd appreciate them. Once again thanx for your time.
I'm preparing a couple of models for my demo reel, and am finding it to be more difficult than expected to render curved wireframes on a model in Blender.
Turn on the Freestyle line renderer in Blender, this is on 4.0.2:
Select your mesh with the Subdivision Surface modifier, enter mesh Edit mode, select all edges, press Ctrl + E for the Edge tools popup and choose "Mark Freestyle Edge". Only the original object edges will be marked, not the generated ones.
Enable Freestyle in the render settings (it's in the Render tab, usually at the bottom).
Then go to View Layer tab, scroll down to the Freestyle render settings, enable only "Mark" and "Contour" edge types, so it only produces lines on your marked edges as well as the overall object silhouette.
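If you need to do this on more than one model, the same steps can be scripted - a rough sketch, assuming the mesh with the Subdivision Surface modifier is the active object:

```python
import bpy

scene = bpy.context.scene
view_layer = bpy.context.view_layer

# Enable Freestyle and restrict the line set to marked edges + contours.
scene.render.use_freestyle = True
fs = view_layer.freestyle_settings
lineset = fs.linesets.active or fs.linesets.new("Wireframe")
lineset.select_edge_mark = True
lineset.select_contour = True
lineset.select_silhouette = False
lineset.select_border = False
lineset.select_crease = False

# Mark every edge of the active (unsubdivided) mesh as a Freestyle edge.
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.select_all(action='SELECT')
bpy.ops.mesh.mark_freestyle_edge(clear=False)
bpy.ops.object.mode_set(mode='OBJECT')
```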
How can I pass a point attribute, stored for certain selected vertices in geometry nodes, as a vertex group for further modifiers - for a Data Transfer, for example?
I did it before in earlier versions but totally forgot how?
The other option is to use a 'Store Named Attribute' node.
Is there some clever way to see UV orientation (and possibly density as well) as a viewport mode/overlay ?
I am aware that TexTools has a bunch of checker generators (all sneakily hidden behind a *single* button! Bad UX, BAD!); and I also use my own checker texture (here indicating UV direction with arrows pointing diagonally up-right), which I simply add to any current material without connecting it, to display it in single-texture viewport mode.
But is there something better out there, not involving modifying the current material? Any suggestion is welcome.
@pior No, but Geometry Nodes can override all materials on a model, and I'm playing around with something where you can choose (and if I can figure it out in PME: hotkey) between UV Color Checker, orientation, stretching, individually coloured uv islands etc. Don't really have time to finish it right now, but even just having UV checker on a modifier has been a real timesaver!
Here's what the simplest form of that setup looks like. It overrides every material slot, so it's a great way to get a material of your choice (that you'll have to have in the scene) onto your entire model.
Edit: To show you what that looks like when you display 'Attributes', this is 'UV Stretching/Area', another mode showing a different colour per mesh island, and another showing a different colour per UV island.
And the nice thing about it being an Attribute is you can easily switch between Attribute and Texture.
I used to also have a UV checker disconnected in the material like you, but it grew to be a pain. I even tried scripting a way to add it to materials and selecting it, but I think this way is better.
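In case anyone wants the bare-bones version of that override as a script, here is a sketch using the Blender 3.x node-group API; it assumes a material named "UV_Checker" already exists in the file, and just routes everything through a Set Material node.

```python
import bpy

# Build a tiny geometry nodes group that replaces every material slot.
tree = bpy.data.node_groups.new("UV Checker Override", 'GeometryNodeTree')
tree.inputs.new('NodeSocketGeometry', "Geometry")    # 3.x API; 4.x uses tree.interface
tree.outputs.new('NodeSocketGeometry', "Geometry")

group_in = tree.nodes.new('NodeGroupInput')
group_out = tree.nodes.new('NodeGroupOutput')
set_mat = tree.nodes.new('GeometryNodeSetMaterial')
set_mat.inputs['Material'].default_value = bpy.data.materials["UV_Checker"]

tree.links.new(group_in.outputs['Geometry'], set_mat.inputs['Geometry'])
tree.links.new(set_mat.outputs['Geometry'], group_out.inputs['Geometry'])

# Drop it onto the active object as a Geometry Nodes modifier.
mod = bpy.context.object.modifiers.new("UV Checker", 'NODES')
mod.node_group = tree
```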
I have a glass BSDF shader in Blender that I need to be fairly rough, because there are things behind it that need to be visible blurry when looking through the glass. I do not want the glass itself to be very rough though, meaning there needs to be a high amount of clearcoat to it. The refractions need to be blurry, not the glass itself. There is no clearcoat setting in the glass BSDF shader though, and I have no idea how to add it. Do I need to add this in compositing, or is there some way to add clearcoat to the glass BSDF shader? Using the mix shader node and... some other shader maybe. No idea which one though, didn't work at all in my tests.
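One way to get there with the Mix Shader idea mentioned above - a sketch only, using Blender 3.x node identifiers: keep the Glass BSDF rough so the refraction blurs, and mix in a sharp Glossy BSDF by a Fresnel factor so the surface reflections stay crisp.

```python
import bpy

mat = bpy.data.materials.new("FrostedGlassWithCoat")
mat.use_nodes = True
nt = mat.node_tree
nt.nodes.clear()

out = nt.nodes.new('ShaderNodeOutputMaterial')
mix = nt.nodes.new('ShaderNodeMixShader')
glass = nt.nodes.new('ShaderNodeBsdfGlass')
coat = nt.nodes.new('ShaderNodeBsdfGlossy')
fresnel = nt.nodes.new('ShaderNodeFresnel')

glass.inputs['Roughness'].default_value = 0.35  # blurs what is seen through the glass
coat.inputs['Roughness'].default_value = 0.0    # keeps the surface reflection sharp

nt.links.new(fresnel.outputs['Fac'], mix.inputs['Fac'])
nt.links.new(glass.outputs['BSDF'], mix.inputs[1])   # base: rough glass
nt.links.new(coat.outputs['BSDF'], mix.inputs[2])    # "coat": sharp glossy
nt.links.new(mix.outputs['Shader'], out.inputs['Surface'])
```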
Amazing stuff! @Michael Knubben I tried my luck with just Quick Favorites and added a continuous animation for the UV. Selecting one Modifier and copying it over to other meshes works. Would love to see a PME version.
Any suggestions on how to unwrap a model like this efficiently and without much stretching? I'm currently doing it in Blender, but it takes a lot of time, and ZBrush's auto unwrap leaves too much stretching.
I made a Greentooth generator and triangle counter in Geometry Nodes. Free to download!
@Michael Knubben Going to try it, looks great!