Anyone more technically minded than me, it'd be useful to chime in and spark some inspiration for the dev of this addon, where we character artists who intend to rig and skin our characters (properly) could tremendously ease our weight-painting workflow, without the usual horrors and micromanaging of weights in general.
Hi, I came across a small problem which I believe has a simple solution, but I am new to Blender and to sculpting and still learning the basics. I tried joining two meshes together by activating the Bool Tool add-on in Preferences, selecting the objects, and pressing Shift+Ctrl+Num+ (direct union). But the neck now looks like it's sewn onto the torso; there is this line that I can't get rid of no matter how many clay strips I add, and when I smooth it still appears. Is there any way I can fix this?
Assuming that the merging itself went well, then it might just be some custom vertex normals data that needs to be cleared (near the bottom of the Object Data Properties panel / "green triangle").
I tried your advice, but it doesn't seem to be the issue. I will leave it how it is for now; it was only for practice anyway.
Does anyone know if the sculpting performance improvements described here last year have been implemented yet? I wasn't able to find release notes that would suggest so.
I'm on 3.0 Alpha. Apparently it's not implemented yet; I did the same test here and it's still lagging behind at 4.2M verts.
Pablo took a break from Blender dev, let's pray for him to get back some day
Any idea why Blender crashes when following this tutorial? It happens when I switch from 3D to 4D noise texture. MSI Afterburner showed that my GPU goes up to 1800MHz and full memory. Got an RTX 3080 Ti - doesn't matter if I use the stock profile or undervolted.
is there a blender equivalent of polyStein /PlugIt / MeshBlend ?
The closest thing I ever saw in blender is the ''plug'' option from mesh machine, but it's not designed for subdiv. MESHmachine - Plug preview 3 - YouTube
Thanks guys. I checked out both of them just now. MESHmachine seems to just project the mesh on top, and KIT OPS deals with booleans. Both addons are amazing, don't get me wrong, but it's not what I'm looking for. I'm surprised that Blender doesn't really have a direct equivalent.
Hi. Can somebody explain to me please why I start seeing sharp edges when I scale the top face of the cylinder? Shading set to Smooth and auto smooth with 30° is enabled.
It is perfectly smooth when it's just a cylinder. Then I scale down the top face and here is what I get:
I can fix it by adding many edge loops on the sides that are supposed to be smooth, or by adding a Subdivision Surface modifier. But I just want to know why it happens 🙂
That is normal for long/thin triangles on bent surfaces; it happens not only in Blender. You found the solution already: you need to add loops.
Hi! In Max you have the "Channel Info" tab to display how much your model weighs - your UVs, your vertex color/alpha, and other info. Is there a built-in option for this in Blender? If not, an addon you'd recommend? Thanks!
Hello all - A little while back, wasn't there an addon which allowed to display the currently worked on UVs as an overlay in the corner of the 3d viewport ? This strikes me as being potentially very useful if implemented well, even if only to display the current UV channel. I might be completely mistaken though. Does that ring a bell ?
@another caveman : I would say that for this kind of things it might be best to ask about the one specific thing you want to retrieve, as it is rather unlikely that everything will be in the same place.
@pior I definitely remember seeing it but I don't remember what it is. I was thinking HardOps, MeshMachine or one of those, but I can't find it in the wiki and I don't own them so...
Out of the box, I don't know of one. Addon answer: here are two that offer it: UV-Packer and UVPackMaster2.
@pior Visualizing the weight of UVs for instance I found interesting. So you know if your UVs are snapped on a grid or not as they would be more heavy with random values
UPD: looks like everything I wrote below is wrong, so please ignore it🙂
Hi.
After you do UV → Unwrap in Adjust Last Operation popup (press F9 to show this popup) you will see the Margin option. This is basically what you need. It will control the margin (or "padding" if you like).
The value of 0.01 means padding will be 1% of texture size. That is if you use 512px textures 1% padding will be 5.12 px. If you need to calculate what padding to specify you need to divide padding size by texture size. For example if you use 1024px textures and want to have 8px padding you need 8 / 1024 = 0.0078125. So you need to set 0.0078125 in the Margin field. In that way you will get 8px paddings.
256 = 2px
512 = 4px
1024 = 8px
2048 = 16px
For all of these cases the margin should be 0.0078125, but you can round it to 0.008.
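If it helps, that arithmetic is easy to sanity-check in a couple of lines (plain Python; note the correction later in the thread, though - this simple ratio turned out not to match Blender's actual Margin behavior exactly):

```python
def uv_margin(padding_px, texture_px):
    """Naive estimate: desired padding expressed as a fraction of texture size."""
    return padding_px / texture_px

# 8 px of padding on a 1024 px texture:
print(uv_margin(8, 1024))   # 0.0078125
# The ratio stays the same when padding scales with resolution:
print(uv_margin(16, 2048))  # 0.0078125
print(uv_margin(4, 512))    # 0.0078125
```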
@littleclaude you don't need to use plugins to achieve what I wrote above. It's done out of the box in Blender. Tell me if you need further assistance.
@another caveman : interesting ! I have no idea if this specific bit of information is accessible by default, but it sure would be handy to have when working with some ultra low-end tilesets and atlases. If that is of any help there is something similar available for vertex group weights :
@littleclaude : your question is way too unclear - what do you *actually* mean, in simple straightforward terms ?
It could mean that you want the unwrap tool to space out islands by a certain distance, or it could mean that you want baked textures to have a certain amount of bleed, or it could mean that you want a packing tool that takes such distances into account, and so on.
@Oaken : the plot thickens ! I am getting a bit of a Mandela effect vibe with this thing now
Ha, good find ! HardOps "UV display" indeed. It certainly has potential, it's just a bit of a shame that it isn't quite realtime, only works on a selection of components, and also fades away after a few seconds ... I'll definitely be playing around with it to see if it is helpful, still. Thank you for the fruitful investigation !
When a game model uses a single texture sheet (Texture atlas) the image will have areas that are used for the textures and blank areas between them. The used areas are often called UV shells, and the blank areas are often called gutters.
When a game engine renders a scene it uses Texture filtering to smoothly render the texture, in a process called downsampling. If the gutters have colors that are significantly different from the colors inside the shells, then those colors can "bleed" creating seams on the model. The same thing happens when neighboring shells have different colors; as the texture is downsampled eventually those colors start to mix.
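The mixing is easy to demonstrate: one mip level of box-filter downsampling averages each 2×2 block, so a texel on the shell border immediately picks up gutter color. A minimal sketch with made-up single-channel values:

```python
def mip_down(img):
    """Average 2x2 blocks: one mip level of box-filter downsampling."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            block = [img[y][x], img[y][x + 1], img[y + 1][x], img[y + 1][x + 1]]
            row.append(sum(block) / 4)
        out.append(row)
    return out

# A white shell (255) bordered by a black gutter (0): after one mip,
# the border texels are already a mix of shell and gutter color.
img = [
    [0,   0,   0,   0],
    [0,   0, 255, 255],
    [0, 255, 255, 255],
    [0, 255, 255, 255],
]
print(mip_down(img))  # [[0.0, 127.5], [127.5, 255.0]]
```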
To avoid this, edge padding should be added in the gutters between each UV shell. Edge padding duplicates the pixels along the inside of the UV edge and spreads those colors outward, forming a skirt of similar colors.
When the UV layout is created, the spacing between the shells should be done with edge padding in mind. If the gutters between the UV shells aren't wide enough, there won't be enough edge padding to prevent bleeding.
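Edge padding itself is essentially repeated dilation - each pass copies the nearest shell color one texel further into the gutter. A rough single-channel sketch (not how any particular baker implements it):

```python
def pad_edges(img, mask, passes=1):
    """Spread shell colors into the gutter: on each pass, an empty texel
    takes the color of any filled 4-neighbour from the previous pass."""
    h, w = len(img), len(img[0])
    for _ in range(passes):
        new_img = [row[:] for row in img]
        new_mask = [row[:] for row in mask]
        for y in range(h):
            for x in range(w):
                if mask[y][x]:
                    continue  # already a shell (or padded) texel
                for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and mask[ny][nx]:
                        new_img[y][x] = img[ny][nx]
                        new_mask[y][x] = True
                        break
        img, mask = new_img, new_mask
    return img, mask

# One shell texel in an empty gutter: its color spreads outward one texel per pass.
img  = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
mask = [[False, False, False], [False, True, False], [False, False, False]]
padded, _ = pad_edges(img, mask, passes=1)
print(padded)  # the 4-neighbours of the shell texel now hold its color
```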
@littleclaude Hehe yes, I know what edge padding means, thanks. What I mean is: what do *you* actually want to know ? How to space out the islands by a given value ? How to generate the pixel bleed ? Something else ?
Overall it really isn't rocket science - you start from a loosely estimated distance on a test UV layout, (whichever way works best for you, either in % of UV space or in pixels), use some strongly contrasting color for the background on a test texture, see how it behaves when mipping aggressively in your target environment, and adjust accordingly. That's all there is to it.
Hi! I've developed a new free animation addon for Blender! It's a global killer: a tool to transfer animation from a root to its children and clear the animation on said root bone. Or you can just transfer the animation onto an empty or any bone(s) you select. Often used in animation and game studios, so I thought it would be helpful!
What is a "native 3d package" ?
Hi
What I'm asking for is: in Blender, how do I select an 8-pixel distance (or 16, 32 and so on) depending on the size of the texture?
1024 = 8px
2048 = 16px
4096 = 32px
And what I mean by native 3D package, is what you normally use for 3d creation, Max, Rhino, Modo, Maya, Houdini and so on.
If you were to transition to Houdini for example and you got stuck on an issue, the temptation would be to fix it in Blender quickly rather than research it or ask people on forums.
There is a distance setting in the Blender UV tools that I assume you've found already (obviously), as it is right there in the options for UV>Unwrap and UV>Pack Islands. It goes from 0 to 1, with the value being (I think) linked to the averaged size of all current islands. There is probably a clever way to convert this value to an absolute distance but there is no way to input that by default.
But anyways, it would still be best for you to *actually* explain what you are trying to do in practical terms. Are you trying to unwrap a whole model at once from seams ? Are you trying to pack already unfolded islands ? And so on.
And regardless I do stand by the remark that even if there was a way to input a desired absolute distance in UV space for the spacing of islands during UV>Unwrap for instance, it would still not be a guarantee that your unwrap will mip elegantly in your target environment, hence it would *still* be necessary to test out the mipping by hand using a dark background. The values you are mentioning are loose guidelines at best, but not something to take as a definite formula.
Now of course I am not saying that being able to input a minimum distance as an absolute UV space value or as pixel value in relation to the currently displayed image wouldn't be useful - it sure would be. All I am saying is that if you want to get closer to a solution to your actual problem, you might start by explaining what you are actually trying to do as opposed to mentioning broad concepts ...
Regarding "native" packages : this is a rather odd take imho, as Blender has already been making some of these apps obsolete in quite a few aspects.
Hi, apologies for another Blender nooby question: have you got any tips on the best settings to get rid of edges in the Weighted Normal modifier? I am finding it hard to get rid of some normals, and I am trying to stick to Blender and not export back to Maya/3ds Max to fix little issues.
Any tips would be appreciated, you can see the issue in the last image compared to Maya in the first image. I will check to see if there is any difference in UE4 as maybe I am over thinking this.
In Blender I seem to be having a problem straightening the vertex normals; this is what is causing my issues, as you can see in the second image. Any ideas?
Hi. What is the reason you cranked up Threshold parameter in Weighted Normals modifier? Looks like it's causing the problem. When I put it from 10 to 0.01 (default value) shading issues disappear.
Hi, I have tried everything, it's more to do with lining up the vertex normals. Blender does not seem to allow me to straighten them, as you can see here.
Normally this is not an issue as you tend to split and harden edges on UV shells, but for things with repeating textures like vehicles and spaceships (like Star Citizen) you need to take a weighted normal approach. As you can see below, there is clearly a normal issue due to it being averaged; I need to be able to control their direction.
I have asked over on Blender Artists, if there is a fix for “Weighted Normals” & “Pixel Edge Padding” I will be sure to post it back here if I get a solution.
Can someone explain to this old grandpa what's the hype, or the reason to be excited, for the amount of development and updates surrounding Geometry Nodes? Is it something that'll benefit VFX artists a lot? From a video game art perspective, I'm kinda scratching my head here on what its uses would be. Can things be done parametrically in Blender now and exported to game engines, like the Houdini tools that were used for The Ascent? Or is something like that still very far away?
Well, anything parametric can save hours and hours of time, even for modeling.
It's a bit of a lost art these days (as newer releases of Max seem very poorly put together, at least based on the demo I've recently installed), but basically Max circa version 9 and 2009-ish had an incredibly robust workflow revolving around the Edit Poly modifier and the modifier stack. The key point is that it allowed you to select parts of a base model and apply parametric operations to that, non-destructively, and with an excellent refresh rate.
Imagine this : a base mesh, with a subdivision modifier applied to it. Then being able to select some faces from that resulting subdivided geo, and delete them (non-destructively). Then instructing the mesh to be given thickness, with a parametric profile shape even. And at any time, adding a bend modifier (or even a shrinkwrap) to the base unsubdivided shape, without losing any of the added operations on top. Or even, going back to the base shape and editing it further at the polygon level, that too without losing anything.
This workflow is extremely powerful, but as far as I am aware is only available in Max in such a fluid manner ... and the recent releases of Max are kinda awful. At the moment the Blender modifiers are not able to operate that way, but I believe that the push for nodes will likely introduce some new concepts which, eventually, would allow a fully non-destructive workflow close to oldschool Max.
@pior I see, that does sound attractive for modelers. I just hope the Blender devs do have that as a goal, since most of what I'm seeing people do, and seems to be currently oriented for, is to do very abstract art.
Well, abstract stuff is actually a very good testbed I think. I would say that the first step would be to replicate all modifiers as nodes (which is probably achieved already) ; then add solid ways to select components from any node, even if just randomly ; and then add ways to select specific elements at any step in the chain. I suppose that this last part is probably quite complex because unlike Max, Blender probably doesn't have the data structure to fully allow that, and it certainly doesn't have the UX for it either.
There's hope though : the Multires modifier already allows you to merge/edit objects without breaking their respective levels, so I can only assume that someone somewhere is already thinking about all this and that the data structure is at least partially in place.
If anything, if they manage to solve this, this will probably mean that Blender will also eventually get a stackable "Edit Poly" modifier (because as great as nodes may be, most of modeling needs can be achieved with a linear 1D stack anyways).
Does anyone know if Geometry Nodes-based instance scattering allows you to make instances "real", i.e. export them as instanced objects rather than a single object?
Been scratching my head and googling this one the entire weekend... I'm preparing multiple beauty shots for render - is it possible to have both vertical (portrait) shots and horizontal (landscape) ones in one scene? Normally I'd do this by adjusting the dimensions in the Output Properties tab, but this affects all shots, all cameras. How can I have both in one scene?
@kio I could try that, sure. But, how would you go about working on a scene like that? What I do now is have in different workspaces a viewport window with its own focal length and framing (horizontal or vertical), for the way I want the respective shot to be. Could you give me any suggestions on how to create a script like that?
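Not kio, but for what it's worth, a script along these lines should do it (a sketch; the camera names and resolutions below are made up, and it assumes one camera object per shot and that you run it from Blender's Text Editor, where `bpy` is available). The idea is to keep one resolution per camera in a table, and swap `scene.render.resolution_x/y` and `scene.camera` before each render:

```python
# Hypothetical shot list: camera object name -> (resolution_x, resolution_y).
SHOTS = {
    "Cam_Landscape": (1920, 1080),
    "Cam_Portrait": (1080, 1920),
}

def resolution_for(camera_name, shots=SHOTS):
    """Look up the output resolution for a given camera."""
    return shots[camera_name]

def render_all(shots=SHOTS, out_dir="//renders/"):
    """Render every shot, switching camera and resolution per shot.
    Only callable inside Blender (imports bpy lazily)."""
    import bpy  # available only when run inside Blender
    scene = bpy.context.scene
    for name, (rx, ry) in shots.items():
        scene.camera = bpy.data.objects[name]
        scene.render.resolution_x = rx
        scene.render.resolution_y = ry
        scene.render.filepath = out_dir + name  # "//" = relative to the .blend
        bpy.ops.render.render(write_still=True)
```

Run it once and every shot renders at its own aspect ratio, so portrait and landscape cameras can coexist in one scene.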
Replies
https://www.youtube.com/watch?v=lCvP917Z8uc
https://www.kit-ops.com/
Here I wrote an illustrated explanation on another forum where I asked the same question: https://blenderartists.org/t/issue-with-shading-on-a-surface-that-suppose-to-be-smooth-why-it-happens/1332556/3?u=yevhen_kutishchev
Hope it's not against the rules to post links to other forums 🙂
Maybe it will be useful for someone like me to better understand how things work.
https://youtu.be/aSdNAKLSrBk?t=774
It was in hardops after all
Hi, please ignore my explanation of how to calculate padding size. It is not correct.
This is what it looks like in Maya, FYI.
Danger: the edge padding and UV shells are way too close, which will cause baking issues.
Solution
I found this video where the dude calculates the math behind the Margin field in Blender:
https://youtu.be/eh3pNZ9RpBw?t=1992
I didn't check myself if his math is correct. Here is his explanation in textual form: https://blenderartists.org/t/i-might-have-found-a-great-way-to-determine-the-margin-value-for-uvs-pixel-padding/1156290
I have asked over on Blender Artists, if there is a fix for “Weighted Normals” & “Pixel Edge Padding” I will be sure to post it back here if I get a solution.
https://blenderartists.org/t/help-with-weighted-normals-pixel-edge-padding/1335624
Isn't this what you're trying to achieve? Or am I missing something?
If you want to see the result of the Weighted Normal modifier in Edit mode, make sure to toggle the button I've pointed to on the screenshot above.
Thank you a thousand times, so simple, it works perfectly. It's always that little check box.
Like this?