
Blender Mega Thread

Replies

  • musashidan
    Offline / Send Message
    musashidan high dynamic range
    @rollin maybe I'm misunderstanding your intention, but why not edit the position/rotation at the mesh level (origin remains at 0,0,0 and object transforms unaffected), instance, mirror, and you have the result: two separate but instanced objects with mirrored topology. You'll still have to make them unique and flip normals, though.
  • Justo
    Offline / Send Message
    Justo polycounter
    @musashidan hey Danny, thanks for the fast response. That looks like what I'm looking for - the videos shown however focus on creating HDRIs from scratch, which isn't what I'm interested in. But the feature list seems to confirm it is possible: 

    "Mix any Image with procedural lights or use a FlatColor or a Gradient Fill. "

    I'm now asking the devs in their BA thread. Thanks for showing me this addon.
  • musashidan
    Offline / Send Message
    musashidan high dynamic range
    Justo said:
    "Mix any Image with procedural lights or use a FlatColor or a Gradient Fill. "


    Yeah mate, this is the first thing I checked in the list before posting. :D

    This would be a great feature to have built in.
  • rollin
    Offline / Send Message
    rollin polycounter
    musashidan said:
    @rollin maybe I'm misunderstanding your intention, but why not edit the position/rotation at the mesh level (origin remains at 0,0,0 and object transforms unaffected), instance, mirror, and you have the result: two separate but instanced objects with mirrored topology. You'll still have to make them unique and flip normals, though.
    Thank you, but I don't think this applies to my case. This would be what the modifier should be able to do, imo. But as I only have one mesh datablock, I don't really understand how I should edit it to get two positions/rotations out of it 0.o
    I want to edit the left side and have the right side follow. Just like with a mirror modifier - except not with the mirror modifier, as it's currently not capable of what I want to do.

    The workflow I want is: mirror objects like with the workaround I described, just with one command, then revert the scale of the mirrored object to positive values and re-invert at the mesh level with the not-yet-existing mirror modifier v2.
    Hope this makes sense.
  • 99499
    Offline / Send Message
    99499 node
    Is there a board or a thread dedicated to having someone review your topology? I'm working on a particular mesh for a character I plan on animating for a game and I'd like to have an experienced person review how I've made it before I get started.
  • 99499
    Offline / Send Message
    99499 node
    Not sure how to delete the above post, but eh I'll just post a picture of it here.



    The above image is a little thing I've been working on for a while. Now that I take a look at this monstrosity I've created, this thing has no grace or consideration for good topology, especially on the arms. Now I'm wondering: is it worth attempting to retopologize this mess, or should I just start over from square one?
    I'm asking this because I have no experience in retopologizing. From what I understand it is a way of correcting poor topology such as this, but I have only ever seen it done on high-res characters, not so much on low-poly ones. So I'm trying to get some feedback on how I should proceed from here.
  • rollin
    Offline / Send Message
    rollin polycounter
    99499 said:
    Not sure how to delete the above post, but eh I'll just post a picture of it here.



    The above image is a little thing I've been working on for a while. Now that I take a look at this monstrosity I've created, this thing has no grace or consideration for good topology, especially on the arms. Now I'm wondering: is it worth attempting to retopologize this mess, or should I just start over from square one?
    I'm asking this because I have no experience in retopologizing. From what I understand it is a way of correcting poor topology such as this, but I have only ever seen it done on high-res characters, not so much on low-poly ones. So I'm trying to get some feedback on how I should proceed from here.
    You'd better create a new topic for this.
  • Justo
    Offline / Send Message
    Justo polycounter
    Is there any Blender 2.8+ script that can cut seams across sharp (or even Bevel'd) edges to create UV islands? I found the opposite: to set hard edges based on the islands, but not the inverse...
  • musashidan
    Offline / Send Message
    musashidan high dynamic range
    Justo said:
    Is there any Blender 2.8+ script that can cut seams across sharp (or even Bevel'd) edges to create UV islands? I found the opposite: to set hard edges based on the islands, but not the inverse...
    You've been using it obsessively for months!! :D  Maxivz's Interactive Tools has this feature and can also create seams from sharps and unwrap in one click. They're at the bottom of the panel in UV utilities.
  • musashidan
    Offline / Send Message
    musashidan high dynamic range
    rollin said:

    The workflow I want is: mirror objects like with the workaround I described, just with one command, then revert the scale of the mirrored object to positive values and re-invert at the mesh level with the not-yet-existing mirror modifier v2.
    Hope this makes sense.
    Still not exactly clear on your intention, but I'm thinking you want the same outcome as using Max's Mirror tool set to 'Geometry'? Max's Mirror used to invert scale and flip normals before they added that extra geometry feature. I don't think Blender can do this without workarounds like you've tried.
  • pior
    Offline / Send Message
    pior grand marshal polycounter
    A remark on making a selection based on explicitly marked hard edges: as far as I am aware there is no way to do it through the interface (the "select hard edges" command is not doing what it says; it is really just a filter based on angle). However, the bmesh library exposes it in the form of the "smooth" property. Meaning that you can write a script that filters edges based on that, and then do any edge operation after - like marking seams for UVs.
  • Justo
    Offline / Send Message
    Justo polycounter
    musashidan said:
    Justo said:
    Is there any Blender 2.8+ script that can cut seams across sharp (or even Bevel'd) edges to create UV islands? I found the opposite: to set hard edges based on the islands, but not the inverse...
    You've been using it obsessively for months!! :D  Maxivz's Interactive Tools has this feature and can also create seams from sharps and unwrap in one click. They're at the bottom of the panel in UV utilities.
    Looool indeed I have, oh god! Maxi must be shaking his head in disgust somewhere right now... Didn't know it was there, hahaha. Now I'll need to watch some tuts on UVs too, since my God they've made even previewing UVs in that editor kind of strange and cumbersome...
  • musashidan
    Offline / Send Message
    musashidan high dynamic range
    @justo Haha! Maxi to the rescue.

    I've just started experimenting with the unwrap workflow myself. I've been using Rizom for a few months now, so I didn't really bother, as the Blender workflow seemed very frustrating with the weird selection disconnect between the 3D view and the UV view. I'd like to master unwrapping in Blender though, so I spent some time at it yesterday, and after some fiddling it actually doesn't seem that bad. One thing I found absolutely VITAL is the UV Highlight addon. This will somewhat alleviate the weirdness (sync toggle too, but it's a bit hacky and some tools/addons don't seem to work when it's on). Also, TexTools and UV Packmaster are pretty essential.

    On another note, I think I read a while back that you were looking for an easy way to hide boolean cutters. Boxcutter automatically creates a Cutter collection and there's a checkbox option to hide them automatically. This is great, but when you use Hops boolean tools it doesn't do this. I found that just hitting M > Move to Collection > Cutters is a fast way to do it. Then you can use Shift+1 (or whatever number the Cutters collection is in the outliner list) to toggle the global visibility of all your cutters in that collection.
  • f1r3w4rr10r
    Offline / Send Message
    f1r3w4rr10r polycounter lvl 9
    Justo said:
    Is there any Blender 2.8+ script that can cut seams across sharp (or even Bevel'd) edges to create UV islands? I found the opposite: to set hard edges based on the islands, but not the inverse...
    Is that not possible with the select similar tool?
  • pior
    Offline / Send Message
    pior grand marshal polycounter
    Well, Select Similar is indeed a workaround, but having to manually select one "starter" edge like that would prevent any automation of the process. I ran into exactly this issue when looking for a way to automate hard-edge to beveled-edge conversion (which is similar to what Justo is trying to do), and unfortunately the default exposed tools don't allow for that. Hence the need to script: deselect all > filter for non-smooth edges using bmesh > mark as desired. A rough sketch of that is below.
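
    For illustration, a minimal sketch of that pipeline, assuming Blender 2.8x, an active mesh object in Edit Mode, and that seams are the edge operation you want (run it from the Text Editor or the Python console):

        import bpy
        import bmesh

        obj = bpy.context.edit_object            # active mesh, in Edit Mode
        bm = bmesh.from_edit_mesh(obj.data)

        # Deselect everything first.
        for e in bm.edges:
            e.select = False

        # bmesh exposes "marked sharp" as the inverted smooth flag on edges,
        # so filter on that rather than on angle, then mark seams.
        for e in bm.edges:
            if not e.smooth:
                e.select = True
                e.seam = True

        bmesh.update_edit_mesh(obj.data)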
  • ant1fact
    Offline / Send Message
    ant1fact polycounter lvl 9
  • Zablorg
    Offline / Send Message
    Zablorg polycounter lvl 6
    Is it beneficial or advisable to represent the same details in both a normal map and a Bump Displacement at the same time? It's unclear whether having both improves the effect or amounts to the same thing, or what.
  • f1r3w4rr10r
    Offline / Send Message
    f1r3w4rr10r polycounter lvl 9
    @Zablorg As I understood it, the outcome of both is the same, but they take different inputs. Normal mapping takes a normal map as input, which encodes the angles of the perturbed normals directly, while bump mapping, I believe, takes a height map and derives the normal perturbation from it.


    You could use displacement with a height map instead to get a different outcome (with true displacement, the mesh itself is deformed).
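
    For illustration, a minimal Python sketch of the two hookups in Cycles, assuming Blender 2.8x, a node-based material that still has its default "Principled BSDF" node, and placeholder material/image names:

        import bpy

        mat = bpy.data.materials["Mat"]                # hypothetical material, with use_nodes enabled
        nodes, links = mat.node_tree.nodes, mat.node_tree.links
        principled = nodes["Principled BSDF"]

        # Option A: normal map -> Normal Map node -> shader's Normal input
        nrm_tex = nodes.new("ShaderNodeTexImage")
        nrm_tex.image = bpy.data.images["normal.png"]  # hypothetical image
        nrm_tex.image.colorspace_settings.name = "Non-Color"
        nrm_map = nodes.new("ShaderNodeNormalMap")
        links.new(nrm_tex.outputs["Color"], nrm_map.inputs["Color"])
        links.new(nrm_map.outputs["Normal"], principled.inputs["Normal"])

        # Option B: height map -> Bump node -> shader's Normal input
        # (both options target the same socket, so use one or the other)
        # hgt_tex = nodes.new("ShaderNodeTexImage")
        # hgt_tex.image = bpy.data.images["height.png"]  # hypothetical image
        # bump = nodes.new("ShaderNodeBump")
        # links.new(hgt_tex.outputs["Color"], bump.inputs["Height"])
        # links.new(bump.outputs["Normal"], principled.inputs["Normal"])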

  • Zablorg
    Offline / Send Message
    Zablorg polycounter lvl 6
    f1r3w4rr10r said:
    @Zablorg As I understood it, the outcome of both is the same, but they take different inputs. Normal mapping takes a normal map as input, which encodes the angles of the perturbed normals directly, while bump mapping, I believe, takes a height map and derives the normal perturbation from it.


    You could use displacement with a height map instead to get a different outcome (with true displacement, the mesh itself is deformed).

    Ah, I should have been more clear: I am talking about plugging something into the normals input of a shader's surface (using either technique from your first two links), as well as the equivalent detail in its Displacement input (like in your last link).

    But, the shader would be set to use "Bump" for its displacement method rather than "True Displacement", so the mesh would be unchanged.
  • f1r3w4rr10r
    Offline / Send Message
    f1r3w4rr10r polycounter lvl 9
    But as far as I can see, plugging something into Displacement and then disabling displacement should not have any effect on the render.
  • wilson66
    Offline / Send Message
    wilson66 polycounter lvl 8
    Question about OBJ export. I have an asset that needs to be textured in Substance Painter. It is essential that the individual objects within the OBJ file are named correctly (e.g. I need to name the objects 'screws_low', 'wheels_low' etc.).

    When I export an OBJ file from Blender and then re-import it into Blender or Modo, the names of the objects have changed. Blender has appended some kind of suffix to each object name, e.g. the object 'griff_low' in the Blender file is now named 'griff_low_griff_low_test_for_modo:002:griff_test_for_modo:010' in the OBJ file, making it unusable in Substance Painter.

    This is really undesirable behavior. How do I tell Blender to keep the names of the objects within the OBJ file exactly the same as in the Blender file it was exported from?

    There is also an option to 'keep vertex order' in the Blender OBJ export window, which is deactivated by default. I always want to keep the vertex order (obviously?); is there some way to have that option activated by default?
  • f1r3w4rr10r
    Offline / Send Message
    f1r3w4rr10r polycounter lvl 9
    @wilson66 I have not used the OBJ export in a while, but I believe the names in the exported OBJ file come from the mesh data names, not the object names. The object names are completely lost, and Blender generates new ones from the mesh names when importing from OBJ.

    And if your target is Substance Painter, have a look at FBX export as well.
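
    If that is the case, a quick workaround sketch (assuming Blender 2.8x and that each object has its own mesh data) would be to copy the object names onto their mesh datablocks before exporting:

        import bpy

        # Sync mesh data-block names to object names before OBJ export,
        # so the exported groups read 'screws_low', 'wheels_low', etc.
        # Note: objects sharing one mesh datablock will fight over the name.
        for obj in bpy.context.selected_objects:
            if obj.type == 'MESH':
                obj.data.name = obj.name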
  • wilson66
    Offline / Send Message
    wilson66 polycounter lvl 8
    f1r3w4rr10r said:
    @wilson66 I have not used the OBJ export in a while, but I believe the names in the exported OBJ file come from the mesh data names, not the object names. The object names are completely lost, and Blender generates new ones from the mesh names when importing from OBJ.

    And if your target is Substance Painter, have a look at FBX export as well.
    Thanks for the quick answer. Of course, it was the mesh names. Sorry, I'm a Modo user and have only been testing Blender for a couple of months now. Still a lot to learn. I renamed the meshes, and the object names within the OBJ are now correct.

    I will definitely test the FBX format as well.
  • wilson66
    Offline / Send Message
    wilson66 polycounter lvl 8
    They need to work on the OBJ importer, this is not usable as is. Blender needs almost 10 minutes (!) to import a 250 MB OBJ file, whereas Modo imports the same file in less than 20 seconds.
  • Blaizer
    Offline / Send Message
    Blaizer polycounter
    wilson66 said:
    They need to work on the OBJ importer, this is not usable as is. Blender needs almost 10 minutes (!) to import a 250 MB OBJ file, whereas Modo imports the same file in less than 20 seconds.
    That's why I prefer to use FBX instead of OBJ in Blender. They need to polish too many basic things.
  • f1r3w4rr10r
    Offline / Send Message
    f1r3w4rr10r polycounter lvl 9
    Is the FBX import faster for the same content? If so, it might really be the importer; otherwise it's Blender's internal structure that is at fault in this case.
  • musashidan
    Offline / Send Message
    musashidan high dynamic range
    Zablorg said:
    Is it beneficial or advisable to represent the same details in both a normal map and a Bump Displacement at the same time? It's unclear whether having both improves the effect or amounts to the same thing, or what.
    It depends on the microdetail frequency of the height/disp map, but generally you don't need the normal map if you have a high detail disp map and use disp+bump. Renderers like Arnold/Vray/Corona have added this displacement bump feature to cut down on geometry density. Arnold displacement, for instance, works like magic and doesn't require a dense mesh for fine detail. Cycles does have this feature, but from my tests it still requires a dense mesh, even using disp+bump.

    If you don't intend to displace the mesh, then why use displacement? It brings no benefit if you only intend it as a bump map, and it adds rendering overhead. You could take the height map you have, use it as a bump map, and plug it into Normal through a Bump node, but in that case you would be better off using a normal map instead. Bump maps are pretty old school and not as effective as normal maps.
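
    For context, a minimal sketch of where that disp+bump choice lives in Cycles, assuming Blender 2.8x with the Cycles engine and placeholder material/image names:

        import bpy

        mat = bpy.data.materials["Mat"]               # hypothetical material, with use_nodes enabled
        mat.cycles.displacement_method = 'BOTH'       # 'BUMP', 'DISPLACEMENT' or 'BOTH'

        nodes, links = mat.node_tree.nodes, mat.node_tree.links
        height = nodes.new("ShaderNodeTexImage")
        height.image = bpy.data.images["height.png"]  # hypothetical height map
        disp = nodes.new("ShaderNodeDisplacement")
        links.new(height.outputs["Color"], disp.inputs["Height"])
        links.new(disp.outputs["Displacement"],
                  nodes["Material Output"].inputs["Displacement"])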
  • pior
    Offline / Send Message
    pior grand marshal polycounter
    FBX likely won't be any faster, if only because the FBX importer/exporter is reverse-engineered (if I understand things correctly, Autodesk doesn't allow open-source usage of the FBX SDK, so the Blender one has to use its own code). I assume that something similar is happening with the OBJ importer/exporter being extremely slow (although admittedly OBJ is very easily readable, so maybe it simply isn't optimized).

    One thing is for sure: you won't find any answers/solutions to these issues here. The only proactive thing to do is to bring it up to the developers with clear data and example files to back it up.

    (BTW: for transfer between Blender sessions, Ctrl+C/Ctrl+V is very handy. In some cases it might be one way to alleviate some of these issues.)
  • f1r3w4rr10r
    Offline / Send Message
    f1r3w4rr10r polycounter lvl 9
    I just remembered something: OBJ is actually plain text, while FBX is binary I believe. So that might be one of the speed factors.
  • kio
    Offline / Send Message
    kio polycounter lvl 16
    The main issue is that the OBJ parser is written in Python, which can't really compete with a proper C-based implementation.

  • xrg
    Offline / Send Message
    xrg polycounter lvl 10
    Faster I/O is on their todo. They had a GSoC project to do it, but the student didn't finish, unfortunately.

  • wilson66
    Offline / Send Message
    wilson66 polycounter lvl 8
    Thanks guys for your answers.

    Different question: I'm building a parking lot right now and have modelled a couple of paving stones, from which I created the pavement using multiple array modifiers. It works great, but now I'd like to randomize the transforms of each individual paving stone so the pavement looks more natural, especially when seen from some distance above.
    I'm aware of the Object -> Transform -> Randomize Transform feature. That, however, requires applying the array modifiers first, producing lots and lots of geometry.
    Is it somehow possible to randomize transforms while keeping the array modifiers alive? Can I e.g. map a noise texture to a displace modifier, then tell the modifier not to displace vertices but to transform whole connected pieces of geometry?
  • f1r3w4rr10r
    Offline / Send Message
    f1r3w4rr10r polycounter lvl 9
    Hmm, I'd also be interested in this. The only way I've been able to do per-hull randomization so far was to put the pieces in separate objects.
  • Prime8
    Offline / Send Message
    Prime8 interpolator
    wilson66 said:
    Thanks guys for your answers.

    Different question: I'm building a parking lot right now and have modelled a couple of paving stones, from which I created the pavement using multiple array modifiers. It works great, but now I'd like to randomize the transforms of each individual paving stone so the pavement looks more natural, especially when seen from some distance above.
    I'm aware of the Object -> Transform -> Randomize Transform feature. That, however, requires applying the array modifiers first, producing lots and lots of geometry.
    Is it somehow possible to randomize transforms while keeping the array modifiers alive? Can I e.g. map a noise texture to a displace modifier, then tell the modifier not to displace vertices but to transform whole connected pieces of geometry?
    You can try to use your array as an emitter and use the stones as particles.
    Might sound a bit overcomplicated, but it gives you a lot of randomisation options.
    The array geometry can be a single vertex or face, and the emitter has to be set up accordingly to ensure that you get one stone per face/vertex.
  • kwyjibo
    Offline / Send Message
    kwyjibo polycounter lvl 7
    @wilson66

    This would be very easy to do using the Sverchok node-based geometry addon. It can be used to generate a unique transform matrix with slight randomization to place each paving stone, not to mention a lot of options for randomizing the stones themselves. Creating the array could be done within Sverchok using a grid node, or you could use its scene-geometry-in node to take in geometry from your scene (this could be a plane or a single vert acted on by an array modifier).
  • ant1fact
    Offline / Send Message
    ant1fact polycounter lvl 9
    kwyjibo said:
    @wilson66

    This would be very easy to do using the Sverchok node based geometry addon. This can be used to generate a unique transform matrix with slight randomization to place each paving stone, not to mention a lot of options for randomizing the stones themselves. Creating the array could be done within sverchok using a grid node or use the scene geometry in node to take in a geometry from your scene (this could be a plane or a single vert acted on by an array modifier).
    Sverchok is for 2.79 only, if I'm not mistaken

    Prime8 said:
    You can try to use your array as an emitter and use the stones as particles.
    Might sound a bit over complicated, but give you a lot of randomisation options.
    The array geometry can be a single vertex or face and the emitter has to be setup accordingly to assure that you get one stone per face/vertex.
    I suggest an easier solution below, but there are certain cases where particle systems are the way to go, because they allow for more options at the expense of a more complicated initial setup.
    wilson66 said:
    Thanks guys for your answers.

    Different question: I'm building a parking lot right now and have modelled a couple of paving stones, from which I created the pavement using multiple array modifiers. It works great, but now I'd like to randomize the transforms of each individual paving stone so the pavement looks more natural, especially when seen from some distance above.
    I'm aware of the Object -> Transform -> Randomize Transform feature. That, however, requires applying the array modifiers first, producing lots and lots of geometry.
    Is it somehow possible to randomize transforms while keeping the array modifiers alive? Can I e.g. map a noise texture to a displace modifier, then tell the modifier not to displace vertices but to transform whole connected pieces of geometry?
    This is how I would do it: create a control plane and cut it up into the pattern you want your stones to have. Create a single stone block, then parent it to the plane. In the object settings for the plane, enable face instancing, and then add a displace modifier with a noise texture to the plane. In 2.7 this feature was called duplifaces. A rough script version of the setup is sketched below the link.

    https://drive.google.com/open?id=1T_PNbMfDHDpHwIqEdC1DFbjIp2ucYw0E
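
    For illustration, roughly the same setup in Python (a sketch, assuming Blender 2.8x; the object names 'Plane' and 'Stone' are placeholders):

        import bpy

        plane = bpy.data.objects["Plane"]   # control plane, cut into the paving pattern
        stone = bpy.data.objects["Stone"]   # single paving stone

        # Parent the stone to the plane and instance it on the plane's faces
        # (this is what 2.7x called duplifaces).
        stone.parent = plane
        plane.instance_type = 'FACES'

        # Displace the plane with a noise texture; the instanced stones follow.
        tex = bpy.data.textures.new("PavingNoise", type='CLOUDS')
        disp = plane.modifiers.new("RandomizeStones", type='DISPLACE')
        disp.texture = tex
        disp.strength = 0.05                # tweak to taste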

  • kwyjibo
    Offline / Send Message
    kwyjibo polycounter lvl 7
    @ant1fact


    There is a Sverchok branch for 2.8 available on the Sverchok GitHub.
    But your method seems good too. Didn't know about the duplifaces feature. Good to know.
  • Prime8
    Offline / Send Message
    Prime8 interpolator
    ant1fact said:
    ...
    This is how I would do it: Create a control plane, cut it up to be the pattern you want your stones to have. Create a single stone block then parent this to the plane. In the object settings for the plane enable instancing and then add a displace modifier to the plane with a noise. In 2.7 this feature was called duplifaces.

    https://drive.google.com/open?id=1T_PNbMfDHDpHwIqEdC1DFbjIp2ucYw0E

    Sure, duplifaces/-verts works as well; you can also use it with an array, so there's no need to create the pattern by hand.
  • ant1fact
    Offline / Send Message
    ant1fact polycounter lvl 9
    @Prime8
    I meant that in case you have something like this:
    http://www.paversearch.com/images/portfolio/big/menu_pavingstone04.jpg

    Of course you don't need it for a simple grid.
  • wilson66
    Offline / Send Message
    wilson66 polycounter lvl 8
    ant1fact said:
    wilson66 said:
    ...
    This is how I would do it: Create a control plane, cut it up to be the pattern you want your stones to have. Create a single stone block then parent this to the plane. In the object settings for the plane enable instancing and then add a displace modifier to the plane with a noise. In 2.7 this feature was called duplifaces.

    https://drive.google.com/open?id=1T_PNbMfDHDpHwIqEdC1DFbjIp2ucYw0E

    Wow, thanks for your effort! I have already reproduced this; it works really well. I think this will be the way to go here.

    I have come across another little problem in the meantime. I bought the Fluent/Speedflow hard-surface modeling plugins a while ago and used the 'tubify' feature to create a frame for a sign board. The tube it created can be adjusted in the 'Object Data Properties' tab (preview resolution, extrude and bevel depth, etc.).

    How do I create actual geometry/polygons from this? There are no modifiers that could be applied.

    EDIT:

    I actually found the solution myself. I can do this with Object -> Apply -> Visual Geometry to Mesh.
  • Prime8
    Offline / Send Message
    Prime8 interpolator
    ant1fact said:
    @Prime8
    I meant that in case you have something like this:
    http://www.paversearch.com/images/portfolio/big/menu_pavingstone04.jpg

    Of course you dont need it for a simple grid
    @ant1fact
    Oh, for that kind of pattern it will get tricky. Wouldn't you actually need three different duplimeshes, one for each of the stone sizes? I think I would have to test a bit before I could approach that kind of irregular pattern.
  • Ruz
    Offline / Send Message
    Ruz polycount lvl 666
    Is there any way to export Blender curves (2.80) to Maya curves? I have hit a brick wall here; nothing seems to work.
  • kio
    Offline / Send Message
    kio polycounter lvl 16
    @Ruz Maybe via SVG? I haven't tried this, but at least Google mentions some tooling for it.

  • ant1fact
    Offline / Send Message
    ant1fact polycounter lvl 9
    Prime8 said:
    Oh, for that kind of pattern it will get tricky. Wouldn't you actually need three different duplimeshes, one for each of the stone sizes? I think I would have to test a bit before I could approach that kind of irregular pattern.
    I think you probably would :)
  • Ruz
    Offline / Send Message
    Ruz polycount lvl 666
    Kio - yeah, I tried SVG and it seems to export, but then Maya will not read the curves.
  • thomasp
    Offline / Send Message
    thomasp hero character
    Curves can be transferred via OBJ. They need to be NURBS inside Blender, and the NURBS write option in the OBJ exporter needs to be checked.
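
    For reference, a minimal sketch of doing that export from Python, assuming Blender 2.8x's bundled OBJ exporter and a placeholder file path (the option names are worth double-checking in your version):

        import bpy

        # Export the selected NURBS curves to OBJ, writing them as curves
        # rather than converting them to mesh.
        bpy.ops.export_scene.obj(
            filepath="//curves_for_maya.obj",
            use_selection=True,
            use_nurbs=True,   # the NURBS write option in the exporter UI
        )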
  • Justo
    Offline / Send Message
    Justo polycounter
    @thomasp I'm curious to ask - I know you adopted Blender a few years ago, mainly interested in hair stuff, I think. Do you use Blender full-time nowadays for professional work (in conjunction with ZBrush, I assume), or do you find yourself going back to Max/Maya for certain tasks?
  • Ruz
    Offline / Send Message
    Ruz polycount lvl 666
    Cheers thomasp, that works fine.

    It's still trickier to convert my nice Blender hair to a final hair-card version, but I am working on it.

    The problem with Blender is that there is no quick way to align the hair cards to the hair cap. XGen has 'align to surface' using the twist brush.

    Blender has minimum, tangent and z-up, none of which provide good results.
    You can manually twist the orientation of the hair cards, but when you have quite a lot of them it becomes a real pain.

    Currently Blender has only a partial solution, unless you want to place all the cards manually.

    There's a nice addon called Hair Tool, I think it was, but it costs money :)


  • thomasp
    Offline / Send Message
    thomasp hero character
    @Ruz I think you will find that Hair Tool solves that issue and saves you the roundtrip, and probably then some. Then again, I have never looked at XGen at all, so I wouldn't know how it compares. In Blender you have orientation and 'scale' at the CV level, and at least as a former longtime Max user (does that answer your question, @Justo?) these work a million times better for me than anything I ever had access to for making hair geo. They are also totally reliable in comparison, and whatever I set for the CVs stays consistent even if separated from or attached to another curve.
    I am on 2.79 though. If you have hundreds, or in some cases thousands, of curves in the scene, I don't find 2.8 usable at this point.

    The complexity I work on these days while relying on Hair Tool is something I would have totally given up on in the past, and if you sat me in front of another piece of software I would honestly not have the faintest idea how to approach the job, and definitely no way of changing things after the fact.

  • Ruz
    Offline / Send Message
    Ruz polycount lvl 666
    I am tempted to buy that Hair Tool add-on; it looks pretty good. I think the same guy has made a Blender cloth simulator, which is looking really good.
    Re XGen: it's nice, but it's the buggiest crash-fest I have ever used. It does do the job, though, if you can keep Maya working long enough to convert whatever curves you have from either ZBrush or Blender.
    Sculpting in XGen is ok too, but you get lots of interpenetration, which takes time to iron out. Hell, maybe I am just using it wrong, I don't know.

    I think the reason I shy away from buying third-party software like Hair Tool is that they might stop supporting it in the future, but in this case it's probably worth it.
    Trying to place hair cards manually is just a real pain in the bum, even with good controls like tilt or twist/orientation.