Blender stops responding when I try to go into the rendered viewport. That only happens when I try to render a volume (trying to give a liquid a color, but it's solid, not fluid). Maybe my hardware is just not good enough? i5 2500K, 16 GB RAM, RTX 480. I am using Cycles with GPU Compute selected. Or could it be that my HDD is too slow?
Yesterday everything stopped moving. I had sound, but both screens were frozen after hitting Rendered Viewport. The viewport also won't change anything when I change the volume settings, so maybe it's the GPU?
That definitely sounds like your GPU is overloaded, if even things outside of Blender stop responding. Or your CPU is maxed out and there's not enough compute around for your OS to even send draw calls.
Is there anything I can do? It even crashes when I just add it and limit the render viewport to just the corner of the object.
You could try different performance options in the render properties. And since this is a volume, you could also try reimporting the cache as an OpenVDB.
This sounds like your GPU is not stable under certain loads. In this situation, I would use something like MSI Afterburner to underclock the GPU. It might be enough to do -50 MHz on the Core Clock to get the GPU stable again; if that fails, go further down with the numbers.
Does anybody know a way to make a group of objects (several LODs, for example) behave as a single object accepting modifiers, like groups or linked objects in 3ds Max? Kind of a "library override" for a linked collection that would keep the linked collection as a single object available for modifiers, so I could switch the LODs in the link origin?
With some addons maybe? Sverchok? Animation Nodes? An easier way?
Hi, I'm getting back into modelling and I'm helping some friends with their project. They wanted to use Blender, and since I don't want to pay for Max (or pirate it) I've decided to give it a shot, and I'm having a ton of fun. But there are things that confuse me a lot. I'm feeling confused about sharp/smooth edges; my friend developed a pipeline in which they usually use weighted normals (for some reason), and that's where problems start for me. I'm used to making a high-poly model, doing a low poly and baking, making each UV island a separate smoothing group. But Blender works differently, so I don't know where to place a sharp edge, or how they relate to UV seams. Should I make a seam wherever I have a sharp edge?
@HammerB Sharp edges would be the same as hard edges in Maya afaik, or smoothing group divisions in Max. Other than that, same rules apply. Different names or methods, but the same thing happening in all of them: you're breaking the vertex normals in those edges you select.
Generally speaking, you'd always want sharp edges to be UV seams, but not all UV seams need to be sharp edges.
Right, like the unavoidable seam in a cylinder, right? That one wouldn't be sharp. I think I get it a little bit. Thanks Justo.
Honestly, I also wouldn't say that all hard edges need to be seams. Sometimes you want to use them as a stylistic tool, and a seam would just make it more complicated to maintain the model afterwards.
Yes, I know that hard edges and seams each duplicate the vertex data on their own, and that when you have both on an edge you kinda "save" data, because you only have double the vertex data and not four times the amount. But still, I wouldn't take this as the word of a higher power and say it always has to be this way. Know the reasons why you would do it, and then decide each case for yourself. The other argument I keep seeing is that it gives you better bake results, but this becomes less relevant when you do either low-poly or mid-poly models, especially when the latter go directly into a game and you achieve the smoothing with shaders, for example.
Yup, exactly.
@f1r3w4rr10r I agree with you, that's why I began that statement with the caveat of "Generally speaking". I'm working on mobile games here, and sometimes the resolution of the textures will be so low that having one big shell with a few hard edges applied is easier to work with in Photoshop, and looks identical to if I had cut the seams at the hard edges... And then some "purists" on the team will not like this and tell me to go back and separate the shells.
At the end of the day I think that what matters most is how the end result looks, and if you're saving a minimal amount, or adding it, by using unconventional methods, it shouldn't scare away people from using such techniques simply because 'it's not what the rulebooks say we should do'. If you think it's worth the trouble, absolutely do it.
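The "double versus four times the vertex data" point above can be made concrete with a toy model: for the GPU, a vertex is stored once per distinct (normal, UV) combination used by the face corners around it. This is a plain-Python illustration with made-up attribute IDs, not a Blender API:

```python
# Toy accounting of GPU vertex duplication: a vertex is stored once per
# distinct (normal, uv) combination among the face corners meeting at it.
# Attribute IDs like "n1"/"u1" are purely illustrative.

def gpu_vertex_copies(corner_attrs):
    """corner_attrs: list of (normal_id, uv_id) pairs, one per face corner
    meeting at the vertex. Returns how many copies the GPU needs."""
    return len(set(corner_attrs))

# Hard edge and UV seam on the SAME edge: both attributes change together
# across one split, so the vertex only needs 2 copies.
same_edge = [("n1", "u1"), ("n1", "u1"), ("n2", "u2"), ("n2", "u2")]

# Hard edge on one edge, seam on a different edge through the same vertex:
# the combinations no longer line up, so you can get up to 4 copies.
different_edges = [("n1", "u1"), ("n1", "u2"), ("n2", "u2"), ("n2", "u1")]
```

Running `gpu_vertex_copies` on these gives 2 for the coinciding case and 4 for the separated one, which is the "saving" being discussed.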
Can I use vertex snapping in object mode? Meaning, I have two identical objects, in different positions in 3D space, and I need to match both objects perfectly in 3D space. To do that, I'd like to use any single vertex of the first object and snap that to the same vertex (same position in the topology, different position in 3D space) of the second object, while the shape of the entire object remains unaltered of course.
How do I do it?
EDIT: found a solution. Snap the 3D cursor to the position of the vertex of first object, put the pivot in the position of the vertex of the second object, then use Object -> Snap -> Selection to Cursor.
Any other solution that requires fewer steps, some kind of drag-and-drop-like functionality in the viewport?
@wilson66 That's similar to what I do yeah, I just skip the 3D cursor tweaking and use the move tool.
1. Set the pivot to a vertex in object A (I hotkeyed this in a context-sensitive script, so I just press E)
2. Move-snap object A to the vertex you want in object B
There is one edge case where it is easier, and that is with the "Closest" option for "Snap With". This one snaps the vertex closest to the target element onto that element. That covers about 90% of my cases.
EDIT: Just to have it mentioned once: Remember that the hotkey Ctrl+. in Object mode switches between normal Object transforms or Origin transforms. You can then just snap the Origin to any vertex (or edge or whatever else).
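Whichever UI route you take (3D cursor, "Closest" snapping, or origin transforms), the operation reduces to a single rigid translation: move object A by the world-space vector from its chosen vertex to the target vertex, so the shape is unchanged. A minimal sketch with plain tuples; the function name is mine, not a Blender API:

```python
# Toy model of "snap object A so one of its vertices lands on a vertex of B".
# Vertices are plain (x, y, z) tuples in world space.

def translate_to_match(obj_a_verts, anchor_index, target_point):
    """Translate every vertex of A by the vector that moves the anchor
    vertex onto target_point, leaving A's shape unchanged."""
    ax, ay, az = obj_a_verts[anchor_index]
    tx, ty, tz = target_point
    dx, dy, dz = tx - ax, ty - ay, tz - az
    return [(x + dx, y + dy, z + dz) for (x, y, z) in obj_a_verts]

# Object A is a unit triangle; snap its vertex 0 onto B's vertex at (5, 5, 0).
a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
moved = translate_to_match(a, 0, (5.0, 5.0, 0.0))
```

Every vertex shifts by the same (5, 5, 0) offset, which is exactly what the cursor-and-pivot tricks above accomplish interactively.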
Thanks guys! Addon looks interesting, will take a look at it.
Different question: What's the best way to transfer shape keys from Blender to Maya? The Alembic format does not seem to store shape keys, or Blender at least does not load them from the Alembic I have just exported. If I re-import the Alembic into Blender, there simply are no shape keys on the model.
Are shape keys stored in the FBX file format exported from Blender? I can't test it directly, unfortunately, because I don't have Maya at home... I could export each shape key as a separate geometry object from Blender, but I'd like to avoid that and try to load them directly from a single object if possible.
When exporting an animation, the final, evaluated mesh is written to USD. This means that the following meshes can be exported:
[...]
Deforming meshes; here the topology of the mesh does not change, but the locations of the vertices change over time. Examples are animated characters or bouncing (but not cracking) objects.
Is it possible to automatically make an edge (or edit an existing one) between 2 vertices while achieving a variable-distance flow? I don't know how to explain this, so I just made an image. As you can see, the desired result is a new edge (or an existing one) that creates new vertices (or changes the position of existing ones) with spacing that constantly and linearly decreases.
I know about the Connect Vertex Path tool. It will not provide the desired result by default for complicated shapes.
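The placement being asked for can be computed directly: distribute the gaps between consecutive vertices so they shrink linearly along the segment. A plain-Python sketch (my own helper, not a Blender operator; in Blender you would feed the edge's two endpoints in and create vertices at the returned points):

```python
# Place n intermediate vertices along segment AB so that successive gaps
# shrink linearly (largest gap at A, smallest at B). Gap i is proportional
# to (n + 1 - i), giving n + 1 gaps in total.

def graded_points(a, b, n):
    """Return n intermediate points between a and b with linearly
    decreasing spacing."""
    gaps = list(range(n + 1, 0, -1))   # n+1 gaps: n+1, n, ..., 1
    total = sum(gaps)
    pts, t = [], 0.0
    for g in gaps[:-1]:                # the final gap just reaches b
        t += g / total
        pts.append(tuple(pa + t * (pb - pa) for pa, pb in zip(a, b)))
    return pts

# Three vertices on the unit X edge: gaps of 0.4, 0.3, 0.2, 0.1.
pts = graded_points((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), 3)
```

For n = 3 this lands points at x = 0.4, 0.7, 0.9, so each gap is exactly 0.1 shorter than the previous one.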
Hello all - Do you guys know of any clever way to lock the View clipping values per scene?
I am often switching between scenes at x1 and x100 scale respectively, meaning that more often than not I have to adjust these values right after opening a scene. Is there a way to set these things on startup on a per-scene basis? And if yes, would it be possible to also Frame All?
[Edit] I've resorted to using two Pie Menu Editor macros that I run right after opening a scene, depending on the scale. They also adjust the grid to something appropriate. FWIW, the one for "big models" (100x bigger than default units) is:
bpy.context.space_data.clip_end = 5000 ; bpy.context.space_data.clip_start = 1 ; bpy.ops.view3d.view_all(center=True) ; bpy.context.space_data.overlay.grid_scale = 10
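The per-scale numbers in such a macro can be derived from a single scale factor rather than hand-tuned per macro. A pure-Python sketch (`view_settings` and `DEFAULTS` are my own names, not a Blender API; in Blender you would assign the returned values to `bpy.context.space_data` the way the macro does, and hand-tuned values may still read better than strict multiples):

```python
# Derive view clipping and grid settings from one scene-scale factor,
# starting from Blender's default-ish viewport values.

DEFAULTS = {"clip_start": 0.01, "clip_end": 1000.0, "grid_scale": 1.0}

def view_settings(scale):
    """Scale the default view clipping and grid for a scene whose models
    are `scale` times bigger than default units."""
    return {key: value * scale for key, value in DEFAULTS.items()}

big = view_settings(100)   # settings for the x100 scene
```

This keeps the two macros in sync: the only thing that differs between them is the factor passed in.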
Has anyone found a way to sync viewports across workspaces? What I mean is, have the 3D viewport be oriented the same, or display isolated/hidden objects the same way, across all viewports? There is a long ongoing design task about this, but who knows when it will come to fruition, and whether it will even be what I'm looking for by the time the devs get to modify this based on their design ideas.
Has anyone got a fast solution for modeling embroidery photorealistically?
Something like this.
I am not looking for a texture workflow, purely modeling in Blender.
Every workflow I have found does not give very good results. I tried to find a solution with fur, but I'm not really an expert (yet) with the particle & fur system in Blender.
Does anybody have a suggestion for how to subdivide a face selection to the same result you would get when applying a Subdivision Surface modifier set to "Catmull-Clark" (to the object)? Sadly, the Subdivide command does not seem to cut it for this purpose, regardless of what I'm playing with in the options. The surface forms it generates do not compare.
Just using the operator on a selection is of course going to be different, because you are missing the information from the neighboring faces. You could experiment with duplicating the mesh object, adding a subdiv modifier to it, then subdividing your selection, then adding a Shrinkwrap modifier to the original mesh with the subdivided duplicate as a target. I have not tried this, but it's an idea that came to mind.
Is there a way to use the clone brush to clone diffuse, roughness, height, and normal textures at the same time from one part of a mesh to another? I wonder if there are efforts being made to improve the texturing tools in Blender.
The viewer node is not showing up in the shader editor when I press Ctrl+Shift and left- or right-click. Node Wrangler is enabled in the preferences addon tab. Is anyone else having this issue?
Peculiar problem. I have an object with some shape keys, and a subdivision surface modifier on it. I have then created a duplicate for editing purposes (clicked the 'Create duplicate for editing' in the drop-down menu in the Shape Keys window).
So far so good, but I didn't remove the subdivision modifier first, so the duplicated object is still displayed in subdivision mode (smooth surface and all). It is displayed as subdiv surface in the viewport, although no subdivision surface modifier is there on the object.
How do I remove the subdivision surface from the object?
As much as I like Blender, I am getting more and more frustrated with its handling of scale. I work with a Unit Scale of 1, set to centimeters. My objects are at the correct scale in my scene, but every single time I try to import or export a mesh to or from Blender there's a scale problem of 10.
Any advice on how to get some kind of consistency? It definitely refuses to play nice with other packages...
@joebount Are you saying that exporting the object and then importing into the same scene with the same unit settings, is giving an unexpected result?
It'd make sense to me to have a wrong result if you exported something with a certain Length type and Unit Scale value, and then imported it into another scene with completely different unit settings.
When exporting I think what matters is the raw Blender units -- the values that you see when you set the Length type to None in the unit settings. If some other app interprets these raw units in a different way (so a raw unit in the file translates to a small part of some metric unit), then you'll have these problems. Blender doesn't have a "Centimeter" Length type. The Centimeter preset rather uses the "Meter" length type and a Unit Scale of 0.01, so that one raw unit is interpreted as 0.01 (or 1%) of a meter, which is 1 centimeter.
My scene was indeed set up in Metric with centimeters as the unit, but the Unit Scale value was 1. That scale value is relative to meters, not cm, and setting it to 0.01 fixed my scale issues (I had to rescale everything by a factor of 100, though). Basically, the unit is just for show, just to display values.
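The raw-units explanation above is easy to check with arithmetic: the file stores raw Blender units, and the Length type plus Unit Scale only control what number is displayed. A plain-Python sketch (the function is mine, purely illustrative):

```python
# Files store raw Blender units; display = raw * unit_scale meters.
# The "Centimeter" preset is the Meter length type with Unit Scale 0.01.

def display_in_cm(raw_units, unit_scale):
    """Raw Blender units -> meters (via unit_scale) -> centimeters."""
    meters = raw_units * unit_scale
    return meters * 100.0

# With the Centimeter preset, one raw unit displays as exactly 1 cm.
one_raw_as_cm = display_in_cm(1.0, 0.01)

# With Unit Scale left at 1, the same raw unit displays as 100 cm --
# the factor-of-100 rescale described in the post above.
mismatch = display_in_cm(1.0, 1.0)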
I need help with the curve deformer in Blender, never used it before. I'd like to re-create the method used here: https://www.youtube.com/watch?v=I7-z1bi_Epc (starting at ~5:00)
The problem is that I don't know how to use the second curve deformer, or what settings to use. When I activate the second curve deformer, Blender deforms the object, but additionally rotates the plane 90 degrees in Y, I have no idea why (see screenshot). I'd like the orientation of the plane to remain the same.
I'm sure that this is some really obvious and simple solution, but if someone could give me a hint I'd be really grateful.
EDIT:
Figured it out myself. For some reason I needed to orient the plane and curves in the X direction. It works fine then. I don't really understand why, though.
Hello all - I remember finding a little add-on (probably here in this thread, but not 100% sure about that) that added a useful functionality to the node editor: double-clicking an image node would make the relevant image current in the image editor. Does anyone have a link to that?
On a somewhat related topic, what exactly happens when selecting Unpack File > Remove Pack? I understand the other two options (Create and Use) but not this first one, and I cannot seem to find where said image gets unpacked in this case. Any ideas?
Maybe it just becomes a virtual texture, only stored in memory but not anywhere on disk?
Lastly, when using Create, is there a way to dictate a certain path for the unpacking/creation of the file? Like using the Desktop as a place to temporarily unpack a work file.
Hello there! As I was blown away recently by works I saw using some procedural magic in Blender, I want to try and learn this as well. What I want to achieve is a tree (or any cylinder, really) overgrown by some alien goo, with tendrils and stuff like that. Any tips on where to start? I have a tree - that was the easy part - but now, how do I generate structures like slimy and sticky goo? Normally I would sculpt it, but I want to learn something new. Cheers!
Heya - If that is even possible, how does one revert the look of the search box to what it was just a few versions ago? The current one is truly harder to read...
OMG, I second that! I still haven't had the time to hunt for a solution, but I hate this so much that I only use 2.9+ when it really can't be avoided.
Replies
I found Blender Cloud tutorials useful.
For me "From Maya and Max to Blender" made the switch easy.
For AMD cards, you could try looking into their help:
https://www.amd.com/en/support/kb/faq/dh2-020
Build:
https://blender.community/c/graphicall/sqbbbc/
Discussion:
https://devtalk.blender.org/t/parallax-occlusion-mapping/15774
https://www.youtube.com/watch?v=EeVC3tsSBFY
https://www.youtube.com/watch?v=e6htnCpWPGA
https://www.youtube.com/watch?v=Y_rV5FM_PHo
You could test this by exporting an FBX and reimporting it in Blender, seeing if the shape keys (AKA blendshapes, morph targets) are preserved.
bpy.context.space_data.clip_end = 5000 ; bpy.context.space_data.clip_start = 1 ; bpy.ops.view3d.view_all(center=True) ; bpy.context.space_data.overlay.grid_scale = 10
But it looks like I can't make it work in the current Blender version. I'm getting some kind of error window.
https://www.youtube.com/watch?v=jvXhp8sYctY
[edit] Ha, nevermind, found it. The feature came with the "Extra Image List" addon, which makes it even more useful than it already is.
https://meshlogic.github.io/posts/blender/addons/extra-image-list/