@rollin maybe I'm misunderstanding your intention, but why not edit the position/rotation at the mesh level (origin remains at 0,0,0 and object transforms are unaffected), instance, mirror, and you have the result: 2 separate but instanced objects with mirrored topology. You'll still have to make them unique and flip normals though.
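For what it's worth, a rough scripted sketch of that workflow (assuming the source object is active, and mirroring across X at the object level) might look like this:

```python
# Rough sketch, untested: linked-duplicate the active object and mirror
# the copy at the object level, leaving the shared mesh data untouched.
import bpy

src = bpy.context.active_object
dup = src.copy()                          # linked duplicate: shares src.data
bpy.context.collection.objects.link(dup)
dup.scale.x *= -1.0                       # mirror across X via negative scale

# Later, to "make unique and flip normals":
# dup.data = dup.data.copy()              # make the mesh single-user
# ...then flip normals in Edit Mode (Mesh > Normals > Flip)
```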
@musashidan hey Danny, thanks for the fast response. That looks like what I'm looking for - the videos shown, however, focus on creating HDRIs from scratch, which isn't what I'm interested in. But the feature list seems to confirm it is possible:
"Mix any Image with procedural lights or use a FlatColor or a Gradient Fill. "
I'm now asking the devs in their BA thread. Thanks for showing me this addon.
@rollin maybe I'm misunderstanding your intention, but why not edit the position/rotation at the mesh level (origin remains at 0,0,0 and object transforms are unaffected), instance, mirror, and you have the result: 2 separate but instanced objects with mirrored topology. You'll still have to make them unique and flip normals though.
Thank you, but I don't think this applies to my case. This would be what the modifier should be able to do imo. But as I only have one mesh datablock, I don't really understand how I should edit it to get two positions/rotations out of it 0.o I want to edit the left side and the right side should follow. Just like with a mirror modifier - just not with a mirror modifier, as it's currently not capable of what I want to do.
The workflow I want to have is: mirror objects like with the workaround I have described, just with one command, then revert the scale of the mirrored object to positive values and re-invert on the mesh level with the not-yet-existing mirror modifier v2. Hope this makes sense.
Is there a board or a thread dedicated to having someone review your topology? I'm working on a particular mesh for a character I plan on animating for a game and I'd like to have an experienced person review how I've made it before I get started.
Not sure how to delete the above post, but eh I'll just post a picture of it here.
The above image is a little thing I've been working on for a while. Now that I take a look at this monstrosity I've created, this thing has no kind of grace or consideration for good topology, especially on the arms. Now I'm wondering: is it worth attempting to retopologize this mess, or should I just start over from square one?
I'm asking this because I have no experience in retopologizing. From what I understand it is a way of correcting poor topology such as this, but I have only ever seen it done on high-res characters, not so much on low-poly ones. So I'm trying to get some feedback on how I should proceed from here.
Is there any Blender 2.8+ script that can cut seams across sharp (or even Bevel'd) edges to create UV islands? I found the opposite: to set hard edges based on the islands, but not the inverse...
You've been using it obsessively for months!! Maxivs Interactive tools has this feature and can also create seams from sharps and unwrap in one click. They're at the bottom of the panel in UV utilities.
The workflow I want to have is: mirror objects like with the workaround I have described, just with one command, then revert the scale of the mirrored object to positive values and re-invert on the mesh level with the not-yet-existing mirror modifier v2. Hope this makes sense.
Still not exactly clear on your intention, but I'm thinking you want the same outcome as using Max's Mirror tool set to 'Geometry'? Max's Mirror used to invert scale and flip normals before they added that extra geometry feature. I don't think Blender can do this without workarounds like you've tried.
A remark on making a selection based on explicitly marked hard edges: as far as I am aware there is no way to do it through the interface (the "select hard edges" command is not doing what it says; it is really just a filter based on angle). However, the bmesh library has it in the form of the "smooth" property. Meaning that you can do a script that filters edges based on that, and then do any edge operation after - like marking seams for UVs.
Is there any Blender 2.8+ script that can cut seams across sharp (or even Bevel'd) edges to create UV islands? I found the opposite: to set hard edges based on the islands, but not the inverse...
You've been using it obsessively for months!! Maxivs Interactive tools has this feature and can also create seams from sharps and unwrap in one click. They're at the bottom of the panel in UV utilities.
Looool indeed I have, oh god! Maxi must be shaking his head in disgust somewhere right now...Didn't know it was there hahaha. Now I'll need to watch some tuts on UVs too, since my God they've made even previewing UVs in that editor kind of strange and cumbersome...
I've just started experimenting with the unwrap workflow myself. I've been using Rizom for a few months now so I didn't really bother, as the Blender workflow seemed very frustrating with the weird selection disconnect between the 3D view and the UV view. I'd like to master unwrapping in Blender though, so I spent some time at it yesterday, and after some fiddling it actually doesn't seem that bad. One thing I found absolutely VITAL is the UV Highlight addon. This will somewhat alleviate the weirdness (the sync toggle helps too, but it's a bit hacky and some tools/addons don't seem to work when it's on). Also, TexTools and UV Packmaster are pretty essential.
On another note, I think I read a while back that you were looking for an easy way to hide boolean cutters. Boxcutter automatically creates a Cutter collection and there's a checkbox option to hide them automatically. This is great, but when you use Hops boolean tools it doesn't do this. I found that just hitting M > move to collection > Cutters is a fast way to do this. Then you can use Shift+1 (or whatever number the Cutters collection is in the outliner list) to toggle the global visibility of all your cutters in that collection.
Is there any Blender 2.8+ script that can cut seams across sharp (or even Bevel'd) edges to create UV islands? I found the opposite: to set hard edges based on the islands, but not the inverse...
Is that not possible with the select similar tool?
Well, Select Similar is indeed a workaround, but having to manually select one "starter" edge like that would prevent any automation of the process. I ran into exactly this issue when looking for a way to automate hard-edge to beveled-edge conversion (which is similar to what Justo is trying to do), and unfortunately the default exposed tools don't allow for that. Hence the need to script: deselect all > filter for non-smooth edges using bmesh > mark as desired.
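A minimal sketch of such a script, assuming it runs in Edit Mode on the active mesh object (bmesh exposes the hard/smooth flag as edge.smooth and seams as edge.seam):

```python
# Minimal sketch: deselect all > filter for non-smooth (hard) edges via
# bmesh > mark them as UV seams. Assumes Edit Mode on the active object.
import bpy
import bmesh

me = bpy.context.edit_object.data
bm = bmesh.from_edit_mesh(me)

for e in bm.edges:
    e.select = False              # start from a clean selection
for e in bm.edges:
    if not e.smooth:              # edge explicitly marked as hard/sharp
        e.seam = True             # mark it as a UV seam
        e.select = True           # keep it selected for inspection

bmesh.update_edit_mesh(me)
```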
Is it beneficial or advisable to represent the same details in both a normal map and a Bump Displacement at the same time? It's unclear whether having both improves the effect or amounts to the same thing, or what.
@Zablorg As I understood it, the outcome of both is the same, but they take different inputs. Normal maps use an actual normal map, encoding the perturbed normal directions directly, while I believe bump maps use a height map as input.
Ah, I should have been more clear: I am talking about plugging something into the normals input of a shader's surface (using either technique from your first two links), as well as the equivalent detail in its Displacement input (like in your last link).
But, the shader would be set to use "Bump" for its displacement method rather than "True Displacement", so the mesh would be unchanged.
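For reference, a hedged scripted version of that setup in Cycles might look like the following; the image texture node is left as a placeholder for the actual height map, and the displacement method is assumed to be exposed via the material's Cycles settings:

```python
# Hedged sketch (Cycles assumed): height map -> Bump node -> BSDF Normal,
# with displacement left at bump-only so the mesh itself is never moved.
import bpy

mat = bpy.data.materials.new("HeightAsBump")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

bsdf = nodes["Principled BSDF"]
height = nodes.new("ShaderNodeTexImage")   # assign your height map here
bump = nodes.new("ShaderNodeBump")

links.new(height.outputs["Color"], bump.inputs["Height"])
links.new(bump.outputs["Normal"], bsdf.inputs["Normal"])

# 'BUMP' keeps geometry unchanged; 'DISPLACEMENT'/'BOTH' would move it.
mat.cycles.displacement_method = 'BUMP'
```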
Question about OBJ export. I have an asset that needs to be textured in Substance Painter. It is essential that the individual objects within the OBJ file are named correctly (e.g. I need to name the objects 'screws_low', 'wheels_low' etc.).
When I export an OBJ file from Blender and then re-import it into Blender or Modo, the names of the objects have changed. Blender has attached some kind of suffixes to each object, e.g. the object 'griff_low' in the blender file is now named 'griff_low_griff_low_test_for_modo:002:griff_test_for_modo:010' in the OBJ file, making it unusable in Substance Painter.
This is really undesirable behavior. How do I tell Blender to keep the names of the objects within the OBJ file exactly the same as in the blend file it was exported from?
There also is an option to 'keep the vertex order' in the Blender OBJ export window which is deactivated by default. I always want to keep the vertex order (obviously?), so is there some way to have that option activated by default?
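One possible workaround, assuming the Python-based OBJ exporter that ships with 2.8x, is to drive the export from a short script so the flag is always set explicitly (the path here is just a placeholder):

```python
# Hedged workaround, assuming the 2.8x Python OBJ exporter: calling it
# from a script means 'Keep Vertex Order' never depends on UI defaults.
import bpy

bpy.ops.export_scene.obj(
    filepath="//export/asset.obj",   # placeholder, blend-relative path
    use_selection=True,              # export selected objects only
    keep_vertex_order=True,          # always preserve vertex order
)
```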
@wilson66 I have not used the OBJ export in a while, but I believe the names in the exported OBJ file come from the mesh names, not the object names. The object names are completely lost, and Blender generates new ones from the mesh names when importing from OBJ.
And if your target is substance painter, have a look at FBX export as well.
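If the exported names really do come from the mesh datablocks, a quick sketch to sync them to the object names before export might look like this:

```python
# Sketch, assuming OBJ names are taken from mesh datablocks:
# copy each selected object's name onto its mesh before exporting.
import bpy

for obj in bpy.context.selected_objects:
    if obj.type == 'MESH':
        obj.data.name = obj.name
```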
Thanks for the quick answer. Of course, it was the mesh names. Sorry, I'm a Modo user and have been testing Blender for just a couple of months now. Still a lot to learn. I renamed the meshes, and the object names within the OBJ are now correct. I will definitely test the FBX format as well.
They need to work on the OBJ importer, this is not usable as is. Blender needs almost 10 minutes (!) to import a 250 MB OBJ file, whereas Modo imports the same file in less than 20 seconds.
That's why I prefer to use FBX instead of OBJ in Blender. They need to polish too many basic things.
Is it beneficial or advisable to represent the same details in both a normal map and a Bump Displacement at the same time? It's unclear whether having both improves the effect or amounts to the same thing, or what.
It depends on the microdetail frequency of the height/disp map, but generally you don't need the normal map if you have a high detail disp map and use disp+bump. Renderers like Arnold/Vray/Corona have added this displacement bump feature to cut down on geometry density. Arnold displacement, for instance, works like magic and doesn't require a dense mesh for fine detail. Cycles does have this feature, but from my tests it still requires a dense mesh, even using disp+bump.
If you don't intend to displace the mesh then why are you using displacement? There is no benefit if you only intend it as a bump map, and it adds rendering overhead. You can use the height map you have as a bump map and plug it into the normal input through a bump node, but in this case you would be better off using a normal map instead. Bump maps are pretty old school and not as effective as normal maps.
FBX likely won't be any faster, if only because the FBX importer/exporter is reverse-engineered (if I understand things correctly, Autodesk doesn't allow for open-source usage of the FBX SDK, so the Blender one has to use its own code). I assume that something similar is happening with the OBJ importer/exporter being extremely slow (although admittedly OBJ is very easily readable, so maybe it simply isn't optimized).
One thing for sure: you won't find any answers/solutions to these issues here. The only proactive thing to do is to bring it up to the developers with clear data and example files to back it up.
(BTW: for transfer between Blender sessions, Ctrl+C/Ctrl+V is very handy. In some cases it might be one way to alleviate some of these issues.)
Different question: I'm building a parking lot right now, and have made a couple of paving stones that I created paving from using multiple arrays. It works great, but now I'd like to randomize the transforms of each individual paving stone so the pavement looks more natural, especially when looking at it from some distance above.
I'm aware of the Object -> Transform -> Randomize Transform feature. That requires applying the array modifiers first, though, producing lots and lots of geometry.
Is it somehow possible to randomize transforms while keeping the array modifiers alive? Can I e.g. map a noise texture to a displace modifier, then tell the modifier to not displace vertices, but instead transform entire connected pieces of geometry?
You can try to use your array as an emitter and use the stones as particles. Might sound a bit overcomplicated, but it gives you a lot of randomisation options. The array geometry can be a single vertex or face, and the emitter has to be set up accordingly to ensure that you get one stone per face/vertex.
This would be very easy to do using the Sverchok node-based geometry addon. It can be used to generate a unique transform matrix with slight randomization to place each paving stone, not to mention a lot of options for randomizing the stones themselves. Creating the array could be done within Sverchok using a grid node, or you can use the scene-geometry-in node to take in geometry from your scene (this could be a plane or a single vert acted on by an array modifier).
I suggest an easier solution below, but there are certain cases where particle systems are the way to go, because they allow for more options at the expense of a more complicated setup in the beginning.
This is how I would do it: create a control plane and cut it up to be the pattern you want your stones to have. Create a single stone block, then parent it to the plane. In the object settings for the plane, enable instancing, then add a displace modifier to the plane with a noise texture. In 2.7 this feature was called duplifaces. Example file: https://drive.google.com/open?id=1fQ7WlLmHfivEmUYiPN_d9hOWubwtoodk
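A hedged script version of that setup (the object names "PavingGrid" and "Stone" are purely illustrative):

```python
# Sketch of the face-instancing ("duplifaces") setup described above.
import bpy

plane = bpy.data.objects["PavingGrid"]   # control plane, one face per stone
stone = bpy.data.objects["Stone"]        # single paving-stone mesh

stone.parent = plane                     # children get instanced on faces
plane.instance_type = 'FACES'            # 2.8 name for 2.7's "duplifaces"

disp = plane.modifiers.new("Jitter", 'DISPLACE')
disp.texture = bpy.data.textures.new("PavingNoise", 'CLOUDS')
disp.strength = 0.02                     # subtle per-face offset/tilt
```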
There is a Sverchok branch for 2.8 available on the Sverchok GitHub. But your method seems good too. Didn't know about the duplifaces feature. Good to know.
Wow, thanks for your effort! I have already reproduced this, works really well. I think this will be the way to go here.
I have come across another little problem in the meantime. I bought the Fluent/Speedflow hard-surface modeling plugins a while ago, and have used the 'tubify' feature to create a frame for a sign board. The tube it created can be adjusted in the 'Object Data Properties' tab (preview resolution, extrude and bevel depth etc.).
How do I create actual geometry/polygons from this? There are no modifiers that could be applied.
EDIT:
I actually found the solution myself. I can do this with Object -> Apply -> Visual Geometry to Mesh.
I meant that in case you have something like this: http://www.paversearch.com/images/portfolio/big/menu_pavingstone04.jpg Of course you don't need it for a simple grid.
@ant1fact Oh, for that kind of pattern it will get tricky - wouldn't you actually need 3 different duplimeshes, one for each of the stone sizes? I think I would have to test a bit before I could approach this kind of irregular pattern.
@thomasp I'm curious to ask - I know you adopted Blender a few years ago, mainly interested in hair stuff, I think. Do you use Blender nowadays full-time for professional work (in conjunction with ZBrush, I assume), or do you find yourself going back to Max/Maya for certain tasks?
@Ruz I think you will find that Hair Tool solves that issue and saves you the roundtrip, and probably then some. Then again, I have never looked at XGen at all, so I wouldn't know how it compares. In Blender you have orientation and 'scale' at the CV level, and at least as a former longtime Max user (does that answer your question, @Justo?) these work a million times better for me than anything I ever had access to for making hair geo. They're also totally reliable in comparison, and whatever I set for the CVs stays consistent even if separated from or attached to another curve.
I am on 2.79 though. If you have 100's or in some cases 1000's of curves in the scene, I don't find 2.8 usable at this point.
The complexity I work on these days while relying on Hair Tool is something I would have totally given up on in the past; if you sat me in front of another software package I would honestly not have the faintest idea how to approach the job, and would definitely have no way of changing things after the fact.
I am tempted to buy that Hair Tool add-on, it looks pretty good. I think the same guy has done a Blender cloth simulator, which is looking really good.
Re XGen: it's nice, but it's the buggiest crash-fest I have ever used. It does do the job, though, if you can keep Maya working long enough to convert whatever curves you have from either ZBrush or Blender.
Sculpting in XGen is OK also, but you get lots of interpenetration which takes time to iron out. Hell, maybe I am just using it wrong, I don't know.
I think the reason I shy away from buying 3rd-party software like Hair Tool is that they might stop supporting it in future, but in this case it's probably worth it.
Trying to place hair cards manually is just a real pain in the bum, even with good controls like tilt or twist/orientation.