
Looking for fur texture workflow ideas -- possibilities with XGen and Arnold

Alex_J grand marshal polycounter
I have little experience with offline rendering in Maya and will be looking into this further, but I have a few basic ideas on how I could derive some texture maps for animal fur using XGen.

1. Set up and groom the fur, then render with Arnold. Right now I know that I can render an AO pass, and I assume a flat color as well, but I also need a normal channel. Is it possible to render the normal channel with Arnold? Or would I need to use displacement and find a way to convert that (see the height-to-normal sketch just after this list)? I could edit any grayscale texture and turn that into height in Substance Painter, but this usually doesn't have as good fidelity as a normal channel. The idea here is to take lots of high-resolution shots from the render and project the different channels in Substance Painter.

2. Export the XGen splines as curves, assign geometry to them somehow, and use this for baking (see the ribbon sketch at the end of this post). This doesn't seem as feasible, since I could have tens of thousands of curves... The main benefit of this idea is that I could use XGen's excellent grooming tools, as opposed to something like FiberMesh in ZBrush, which outputs bakeable geometry easily enough but with poorer control.
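For reference, the height-to-normal conversion mentioned in idea 1 is just a gradient operation. A minimal numpy/Pillow sketch (file names are placeholders; as noted above, fidelity will still be lower than a true normal pass):

import numpy as np
from PIL import Image

def height_to_normal(height_png, normal_png, strength=2.0):
    # Load the grayscale height render as floats in [0, 1].
    h = np.asarray(Image.open(height_png).convert("L"), dtype=np.float32) / 255.0
    # Screen-space gradients of the height field.
    dy, dx = np.gradient(h)
    # Tangent-space normal: x/y from the negated gradients, z pointing up.
    nx, ny, nz = -dx * strength, -dy * strength, np.ones_like(h)
    length = np.sqrt(nx * nx + ny * ny + nz * nz)
    n = np.stack([nx, ny, nz], axis=-1) / length[..., None]
    # Remap from [-1, 1] to 8-bit; flip the green channel if your target
    # engine uses the other Y convention.
    rgb = ((n * 0.5 + 0.5) * 255.0).astype(np.uint8)
    Image.fromarray(rgb, "RGB").save(normal_png)

height_to_normal("fur_height.png", "fur_normal.png")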

Have you done anything like this? Any ideas/tips are really appreciated. 


What I am doing right now is baking FiberMesh, and it looks alright, but I lack the level of grooming control I could get with XGen, and it's also a pretty slow workflow that requires guessing, waiting, and so on. Not an ideal way to work...
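Regarding idea 2 above: once the guide curves are out of XGen as point lists, turning them into thin bakeable ribbons is mostly bookkeeping. A hedged, XGen-agnostic sketch in plain numpy that writes an OBJ (getting the points out of XGen is assumed to happen elsewhere):

import numpy as np

def strand_to_ribbon(points, width=0.02, up=(0.0, 1.0, 0.0)):
    # points: ordered 3D samples along one strand, root first.
    # Pick a different `up` vector for strands that run parallel to it.
    p = np.asarray(points, dtype=np.float64)
    up = np.asarray(up, dtype=np.float64)
    denom = max(len(p) - 1, 1)
    verts, faces = [], []
    for i in range(len(p)):
        # Direction along the strand at this point (one-sided/central difference).
        d = p[min(i + 1, len(p) - 1)] - p[max(i - 1, 0)]
        d = d / (np.linalg.norm(d) + 1e-9)
        # Side vector perpendicular to the strand direction.
        side = np.cross(d, up)
        side = side / (np.linalg.norm(side) + 1e-9)
        # Taper toward the tip, keeping a little width so tip triangles
        # are not degenerate.
        w = width * (1.0 - 0.8 * i / denom)
        verts.append(p[i] - side * w)
        verts.append(p[i] + side * w)
    for i in range(len(p) - 1):
        a, b, c, d2 = 2 * i, 2 * i + 1, 2 * i + 2, 2 * i + 3
        faces.append((a, b, c))
        faces.append((b, d2, c))
    return np.asarray(verts), faces

def write_obj(path, verts, faces):
    with open(path, "w") as f:
        for v in verts:
            f.write("v %.6f %.6f %.6f\n" % tuple(v))
        for a, b, c in faces:  # OBJ face indices are 1-based
            f.write("f %d %d %d\n" % (a + 1, b + 1, c + 1))

# One short strand as an example; real input would be the exported curves.
verts, faces = strand_to_ribbon([(0, 0, 0), (0.0, 0.3, 0.05), (0.05, 0.6, 0.1)])
write_obj("strand.obj", verts, faces)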

Replies

  • Starboxx
    Starboxx polycounter lvl 11
Welp, I am looking into exactly the same thing as you are.

    For now, I haven't committed to grooming my animal in ZBrush, because I don't really like FiberMesh; I think it always looks too fake.
    What I have done is groom in ZBrush, then export the geometry (so far I've managed up to 60M polys) and bake in Marmoset, but that doesn't work well because of the back-facing normals (strands are either created facing the camera or not, and when they aren't, their orientation is just arbitrary, so in both cases baking the normals doesn't work).

    I am considering purchasing Ornatrix, since apparently it can output tube geometry.

    Otherwise, I've seen this guy's horse: https://www.artstation.com/artwork/xlJEO
    Apparently the fur was done in Max (which has the advantage of outputting tube geo that can be baked into a normal map), with the color coming from the underlying mesh and baked in Max (which saves the hassle of the export, but I didn't manage to get a good cage projection of the fur in Maya - it's also very slow). Grooming in Max must have taken some serious commitment, though.

    I'll keep trying. A friend of mine is a technical artist at one of those AAA studios; I'm going to ask him if he knows a workaround to output tube geo from XGen.
  • Alex_J
    Alex_J grand marshal polycounter
    Hey @Starboxx ,

    I actually ended up finishing my tiger using FiberMesh baked in Toolbag. I had the exact same issue you described with the back-facing normals, but I was able to solve it. I do not believe outputting splines and converting them to geo from any other fur simulation app would give a different result. It's all triangles in the end, right? So the benefit of more robust fur sims over FiberMesh is just that you get greater control over the groom. However, I think if I can get a good result on a tiger, you should be able to do almost any short-fur animal with FiberMesh.

    Basically, I just had to do a lot of trial and error with the grooming. If you groom the fibers too far inward toward the mesh you will get that bad back-face bake, but if you are gentle with it you can get the groom you need and a good bake. It does take a while to get a feel for it, and it would be better to have a more predictable way of working than that, but given how easy FiberMesh is to use compared to other systems I think it's a fair trade-off.

    Also, I had originally been exporting the fur in small chunks that were several million tris each. But I found that reducing the fibers to small, single sided triangles still gave a realistic look and allowed me to export all the fur for the entire tiger in one go.

    Also, be sure to bake with the high-poly mesh and the fibers together; this helps alleviate issues with the baking. If you need extra detail for the fur, you can do a separate bake with only the fur and use that to selectively blend into certain areas in your texturing app.

    You can see the final results of this process in my thread "Durga:Hindu Goddess" or on my ArtStation. If you have any questions about the process, feel free to ask here or on those other threads. I may make a short tutorial at some point, because it's a pretty simple workflow that gives a nice realtime result, one that I think looks better than those multi-pass shaders -- at least when the camera is up close.
  • musashidan
    musashidan high dynamic range
    Is it possible to render the normal channel with Arnold?

    Yes, you can bake Xgen down to a normal map.
  • Starboxx
    Starboxx polycounter lvl 11
    Not really. Yes, you can do it if you bake from a single view, like for hair strands, but if you want to project it using a cage, for example, you'll have an issue: if you create the hairs facing the camera, then from other directions they'll have face-normal issues or be too thin (since they're just strands and not tubes), and if you don't create them facing the camera, they'll have arbitrary face-normal orientations, which also leads to baking issues.
  • musashidan
    musashidan high dynamic range
    Yes, I'm talking about atlas baking for cards.
  • Alex_J
    Alex_J grand marshal polycounter

    So, for example, if you had an unwrapped cube and you covered it with XGen fur, you could do a bake that would transfer the normal map accordingly to the cube's UVs?

    My understanding was, like Starboxx's, that you could only render normals relative to the camera.
  • gnoop
    gnoop sublime tool
    That's why I bought Clarisse when they had an X-mas sale a few years ago. It could scatter billions of hi-res objects in a few clicks, without making proxies, over a 300M-poly RealityCapture scan without decimation.
    You could rotate and edit the scene in the viewport, and bake (UDIMs too) depth, curvature, and world-space normals in the form of special materials or a special "AOV store", to be converted to tangent space afterwards in xNormal, Mari, or SD.
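    (For reference, the conversion those apps perform is just a change of basis against the low poly's tangent frame. A minimal numpy sketch of the per-sample math; the TBN values would come from your bake data:)

    import numpy as np

    def world_to_tangent(n_world, tangent, bitangent, normal):
        """Express a world-space normal in the surface's tangent frame (TBN)."""
        tbn = np.stack([tangent, bitangent, normal], axis=0)  # rows = T, B, N
        n_tangent = tbn @ np.asarray(n_world, dtype=np.float64)
        return n_tangent / np.linalg.norm(n_tangent)

    def encode(n):
        """Remap a unit normal from [-1, 1] to the usual [0, 255] map encoding."""
        return np.round((n * 0.5 + 0.5) * 255.0).astype(np.uint8)

    # A surface facing +Z with a standard tangent/bitangent gives the familiar
    # flat normal-map colour (128, 128, 255).
    print(encode(world_to_tangent([0, 0, 1], [1, 0, 0], [0, 1, 0], [0, 0, 1])))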

    I also got tired of doing it in ZBrush. It's just torture there, and half of the time it hangs.

    The Clarisse renderer is actually very similar to Arnold, so perhaps all of this is still possible there too with the same approach. At least, I did it the same way in Max 2008 a decade ago using a special "world space" material.

    Clarisse also doesn't have any issue with back-facing normals, since every material is double-sided by default.

    I bet Modo Indie could do it too, but I haven't tried it there yet.

  • gnoop
    gnoop sublime tool
    I just tried baking world-space normals in Max 2020 Arnold. No problems, actually. It bakes just fine with the Map to Material node, though it shows some error message.
  • Alex_J
    Alex_J grand marshal polycounter
    Thanks @gnoop , I'll have to check these out. Sounds promising.
  • gnoop
    gnoop sublime tool
    You don't even have to make any fancy mix of gradients. Just use the "NS" (normals shaded) utility node from its drop-down list and feed it into the Map to Material node (or whatever the equivalents are in Maya).
    It works pretty much the same as in Clarisse.
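    (A rough Maya-side equivalent, assuming MtoA is loaded: assign an aiUtility shader to the geometry and bake with Render Selection to Texture. The arnoldRenderToTexture flags and the object names below are assumptions, so treat this as a sketch rather than a verified recipe:)

    import maya.cmds as cmds

    def assign_normal_utility(objects):
        """Assign an aiUtility shader to `objects`. Set its Color Mode to
        'ns' (shaded normals) in the Attribute Editor afterwards, since the
        enum index varies between MtoA versions."""
        util = cmds.shadingNode("aiUtility", asShader=True, name="furNormalUtil")
        sg = cmds.sets(renderable=True, noSurfaceShader=True, empty=True,
                       name="furNormalUtilSG")
        cmds.connectAttr(util + ".outColor", sg + ".surfaceShader", force=True)
        cmds.sets(objects, edit=True, forceElement=sg)
        return util

    # Placeholder object names: the fur geo and the unwrapped bake target.
    assign_normal_utility(["furGeo", "furBakeTarget"])
    cmds.select("furBakeTarget", replace=True)
    # Same as Arnold > Utilities > Render Selection to Texture; the flag names
    # here are assumed, so double-check them against your MtoA version.
    cmds.arnoldRenderToTexture(folder="C:/bakes/fur", resolution=2048)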
  • musashidan
    musashidan high dynamic range
    @Alex Javor I hope to have a working example tomorrow. It's 4:17am here and I really should be asleep...
  • Alex_J
    Alex_J grand marshal polycounter
    Thanks guys. Yeah, that stuff is still a bit over my head, so I have to put some time aside to research it. I haven't had a strong need to get into offline rendering, so whenever I hit a big wall with it I just put it off for later. But this method of making fur is something I'll definitely revisit, so I'll have to make time to look at it more in depth. It would definitely be worthwhile if I could use these more powerful fur sim apps instead of FiberMesh. That was the key limitation of my tiger textures -- I had to keep a pretty basic groom. It worked alright, but any time you can get better control you're gonna get a better result.

    @musashidan, please get some sleep! I really appreciate your help, but I am not doing anything related to this for a few months at least. It sounds like Starboxx may be putting it to use sooner, though. I'm gonna put together a short video on how I used FiberMesh, but most likely I'll go with a more robust solution like you guys are suggesting when I do another furry critter.

  • musashidan
    musashidan high dynamic range
    So, I did some tests and got a result using Ornatrix in Max and just the scanline renderer. Pretty straightforward, but ideally I would only use this for an underlying base layer for hair cards. Hair cards will still give you the best results because of the volume and layering you can achieve. Either way, though, FiberMesh has had its day in the hair pipeline. There are much better solutions like XGen, Ornatrix, and even Blender. Still, I'd be interested to see your workflow video. We can never have enough workflow methods to play around with. :D

    For a more dedicated solution there are NeoFur and gFur, although NeoFur hasn't been active in two years and may well be dead.


  • Alex_J
    Alex_J grand marshal polycounter
    gFur looks really nice. I think I'll check that out. I didn't think you could get actual silhouette changes from a shader like that. It may be really useful for long-fur animals. I think fur cards would still look the best, but maybe rigging and animating an animal with 100k+ tris worth of cards wouldn't be as simple or performant as a simple mesh that only uses a shader?

    I'm planning on making my video sometime next week. It's nothing special really: you just make FiberMesh and bake it onto a mesh. But there are a few little quirks I learned to overcome that can save anybody else who wants to try it some headaches.
  • musashidan
    musashidan high dynamic range
    Well, gFur uses a shell system, which is similar in a way to cards. It is expensive performance-wise, but you get the rigging/dynamics/secondary motion for free. Your tiger turned out well and looks convincing.
  • musashidan
    musashidan high dynamic range
    Also, NVIDIA HairWorks is worth a look.
  • Starboxx
    Starboxx polycounter lvl 11
    So, I managed to get a decent result with ZBrush FiberMesh, which is the pic below. (This is just the bent normal, which seems to give a better result than a simple normal map, plugged into the detail normal slot, plus the albedo from the vertex color of the FiberMesh; flat roughness & spec values.) The groom is far from optimal, but this was just a test. For the density I'm going for, I'm planning on doing it in parts (9M tris for this one).



    artstation.com/gabrielnadeau <=== Props to this guy for coming up with the two solutions below:

    There is a workaround that works pretty well for baking XGen fur with the normals facing the right way. I started my fur grooms by creating a groomable splines description, but if you convert it to an interactive groom (or just start your groom that way), you can use the twist brush with the "align to surface" checkbox on, and it will solve the problem of arbitrarily oriented face normals, so you can just convert to geo and bake it correctly. I'll post some results later; the one I have now is definitely not worth showing.

    Another method could be to convert the XGen primitives to polygons once with a curvature of 1 (and, say, 1 span), then convert them a second time with a curvature of -1 and invert the normals. That might not be as good, because you'd still have a hard edge on the hair unless you merge vertices, which could be difficult for high-density grooms, which is my case. It also leads to a higher polycount than simply exporting strands without spans (even though I prefer to have at least one span to get some roundness into the baked hair).
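    (If the strands are already exported as geometry, a simpler variant of the same idea is to duplicate every face with reversed winding over the same vertices, so there is nothing to merge afterwards. A small sketch for a simple triangulated OBJ; file names are placeholders, and texture/normal references are dropped:)

    # Make strand geometry double-sided by duplicating every face with reversed
    # winding while sharing the same vertices. Keeps only v/f lines.
    def double_side_obj(src_path, dst_path):
        verts, faces = [], []
        with open(src_path) as f:
            for line in f:
                parts = line.split()
                if not parts:
                    continue
                if parts[0] == "v":
                    verts.append(parts[1:4])
                elif parts[0] == "f":
                    # keep only the vertex index of each "v/vt/vn" token
                    faces.append([int(p.split("/")[0]) for p in parts[1:]])
        with open(dst_path, "w") as f:
            for v in verts:
                f.write("v %s %s %s\n" % tuple(v))
            for face in faces:
                f.write("f %s\n" % " ".join(str(i) for i in face))
                # reversed winding = flipped face normal, same vertices
                f.write("f %s\n" % " ".join(str(i) for i in reversed(face)))

    double_side_obj("fur_strands.obj", "fur_strands_doublesided.obj")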

    For now, I don't know yet whether I'll use a ZBrush FiberMesh or XGen fur. So far, ZBrush gave better results, but that's probably because I didn't spend much time grooming the XGen one properly, and I also only had the normal map baked from the XGen one (the albedo contributes quite a lot to the realistic look, imo). I still need to find a way to transfer the polypaint of my model to the XGen fur (if there is one, I'll need to look into it).

    Also, I'm going for this type of fur baked onto a low poly because this is basically workflow research for a potential project where any kind of fur system is too big a performance hit, and I had pretty bad-looking results with "fur cards" (hair cards): too regular and, in the end, not very realistic. It will also need rig/anim. I'm no rigger, but I've rigged a character once, and even a belt that was not attached to the base character mesh was a pain to skin, so I can't even imagine fur. Still, I'll try adding just a few cards to break up the flatness of the silhouette in certain areas and see how it goes.

    PS: Alex, please tell your tiger not to eat my deer; I'm a bit nervous about leaving them alone together in here.
  • Alex_J
    Alex_J grand marshal polycounter
    I'm a bit too busy with other work to digest what you've written here. Gonna bookmark it and go over it later, along with what others posted above. Looks like good stuff. Thanks for sharing.

    And don't worry about that tiger, it only eats people. :)
  • BjornarAarset
    BjornarAarset polycounter lvl 6
    Hey, this is the only thread I've found on the internet that goes into fur baking for games. I would love some more breakdowns. For example, how would you transfer the vertex color from your hair to the low-poly mesh? Is there a way to bake in the diffuse? And when using XGen, you would just make the groom directly on the model, correct? I've heard from some people that they groom on a plane which represents the UVs.
  • Alex_J
    Alex_J grand marshal polycounter
    I used Marmoset Toolbag to bake the vertex color to the low poly.

    I'm not sure about an XGen-specific workflow, but basically I create geometry hair and then bake that onto the low-poly model. So whether you are using a spline simulation for the hair or whatever, the idea is to turn it into geometry with vertex color and then bake that.

    The problem is that this can get extremely heavy, so you have to send it in patches to bake. A really tedious and time-consuming workflow. The results were pretty good, but if I did it again I would consider an alternative. You could probably get away with just covering the animal in fur cards and using a simple cloth-type simulation on top of that in game.
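    (For the patching, something as simple as binning strands by their root position keeps each bake under a triangle budget. A rough sketch, assuming each strand is available as an ordered list of 3D points:)

    import numpy as np

    def split_into_patches(strands, n_x=4, n_z=4):
        # Bin strands by the XZ position of their root (assumes a Y-up scene)
        # into an n_x * n_z grid of patches.
        roots = np.array([s[0] for s in strands], dtype=np.float64)
        lo, hi = roots.min(axis=0), roots.max(axis=0)
        span = np.maximum(hi - lo, 1e-9)
        patches = {}
        for strand, root in zip(strands, roots):
            ix = min(int((root[0] - lo[0]) / span[0] * n_x), n_x - 1)
            iz = min(int((root[2] - lo[2]) / span[2] * n_z), n_z - 1)
            patches.setdefault((ix, iz), []).append(strand)
        return patches  # dict: grid cell -> list of strands to export/bake

    # Usage: export each patch separately, bake it, then composite the bakes.
    # patches = split_into_patches(all_strands)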

    One thing I considered at the time but didn't fully test is that I think you could render different passes onto the texture from Arnold. In other words, render out AO, normals, etc., and have them applied to the UV space as a texture. I believe that's possible, but at the time getting into Arnold and all that was too much for me and I haven't looked further into it since.
  • BjornarAarset
    BjornarAarset polycounter lvl 6
    Thanks for answering. I am currently testing it out like that now, but I haven't converted to geo yet, so I'm nervous to see how my PC handles it. I'm not sure how you would make a normal map in Arnold if you are not doing it the way the other guy described: creating a plane which represents your UVs, grooming on that plane, then just rendering out the normal and AO passes from Arnold.
  • Alex_J
    Alex_J grand marshal polycounter
    Yeah, that was my idea, or even to grab orthographic shots of the model, and then I could just project those pretty easily too.
  • melviso
  • poopipe
    poopipe grand marshal polycounter
    IMO hair cards are a waste of time on anything that isn't similar to long human hair (ie thin, long and shiny) 

    What you see in the image below is fundamentally a fin and shell based system on PCP and is the correct way to approach anything that requires large areas of dense fur. 



  • BjornarAarset
    BjornarAarset polycounter lvl 6
    poopipe said:
    IMO hair cards are a waste of time on anything that isn't similar to long human hair (ie thin, long and shiny) 

    What you see in the image below is fundamentally a fin and shell based system on PCP and is the correct way to approach anything that requires large areas of dense fur. 



    Hey, I've seen this technique before, but not a lot of information about it. Do you know of any good resources where I can find out more about it and potentially implement it in Unreal?
  • Alex_J
    Alex_J grand marshal polycounter
    There is a fins-and-shells shader on the Unreal asset store by a guy named Skulls or something like that. There are a couple, I think, but his was pretty good. I tried it out.
  • BjornarAarset
    BjornarAarset polycounter lvl 6
    Would you be thinking of gFur, perhaps? It looks like a plugin for Unreal which provides a similar effect.
  • Alex_J
    Alex_J grand marshal polycounter
  • poopipe
    poopipe grand marshal polycounter


    This old NVIDIA paper probably explains the system best. It's worth noting that this is a very basic implementation, and more modern implementations both perform and look significantly better.

    If you're willing to author the shells and fins manually, it would be fairly trivial to get it working in Unreal, although it's unlikely to be very performant. To do the job in a production-usable fashion you'd need to dig into the C++.

    http://developer.download.nvidia.com/SDK/10/direct3d/Source/Fur/doc/FurShellsAndFins.pdf
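    (The geometry half of the shells is simple to sketch: duplicate the base mesh N times, push each copy out along the vertex normals, and give each layer a rising alpha-test threshold. A rough, engine-agnostic Python sketch of just that part; the shader/C++ side is what the paper above covers:)

    import numpy as np

    def build_shells(verts, normals, n_shells=8, fur_length=0.05):
        # Copy the base mesh N times, offset each copy along its vertex
        # normals, and record the alpha-test threshold for that layer.
        shells = []
        for i in range(1, n_shells + 1):
            t = i / n_shells  # 0..1 along the fur length
            layer = verts + normals * (fur_length * t)
            # In the shader, pixels whose fur-density texture value is below
            # `t` get clipped, so higher shells keep fewer, thinner strands.
            shells.append({"verts": layer, "alpha_threshold": t})
        return shells

    # Tiny example: one upward-facing quad.
    quad = np.array([[0, 0, 0], [1, 0, 0], [1, 0, 1], [0, 0, 1]], dtype=np.float64)
    up = np.tile([0.0, 1.0, 0.0], (4, 1))
    for shell in build_shells(quad, up, n_shells=4):
        print(shell["alpha_threshold"], shell["verts"][0])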





  • bkost
    bkost interpolator
    Here's a relevant fur workflow reddit thread 

    I'm beginning to dive down the realtime fur rabbit hole. It looks like the workflow is leaning towards in-engine shader/spline builds.
    I'd love to see how Planet Zoo built their fur shader; the models turned out great.

    Have you all found any successful workflows?
