
Zbrush to PBR Workflow & Limitations

arcandio polycounter lvl 9
Alright, I'm coming back to 3d after a few years off my game, and I'm trying to get my workflow down.

The pipeline I want to use is Blender (basemesh) -> Zbrush (sculpting/highpoly, retopo) -> Blender (texture painting) -> Unity (PBR material, poor-man's Toolbag)

So far, I've gotten everything to "work" except that I've found that Zbrush doesn't export very good normal maps. I'm reading up on fixing some of these errors, but a lot of it comes down to "use xnormal instead," which would be more like

Blender -> Zbrush -> Xnormal -> Blender -> Unity

My problem with this is that I want to work on some really high-res meshes, using all the neat Zbrush tools like Nanomesh and Geometry HD, and it doesn't sound like this is feasible using Xnormal to bake.

Are there any tips or tricks for getting Zbrush to emit decent maps? Or should I just learn to deal with not being able to get the most out of Zbrush? What do people do with these absurdly-high-poly models you can make in Zbrush? Just beauty renders?

Replies

  • arcandio polycounter lvl 9
    Also, a state-of-the-industry type question: what resolution are modern RPG characters and assets, in polycount and texture size? What's your experience lately? (Granted, it depends on how many are on screen, but things have changed a bit since my day.)
  • gnoop sublime tool
    There are a number of workarounds, but it's still a huge pain in the a... and something Pixologic seems to have never wanted to address, unfortunately.
       
    1. You export the low-res and a mid-res mesh, with a normal map of the tiny details baked for that mid-res object (which should have no hard edges). Then in xNormal you use the mid-res object as the high-definition mesh, loading that detail normal map as its base texture with "Base texture is a tangent-space normal map" checked. xNormal will then combine it with its own baked normal map in the base texture output (see the sketch after this list). With this method you won't be able to bake all those tiny things into the height map, but that's usually not necessary.


    2. You do mesh decimation in ZBrush, preserving the vertex colors, and then proceed as usual in xNormal.

    3. A combination of 1 and 2 - point 2 for scattered nanomeshes, for example, as long as they don't use alpha.

    4. Same as point 1, but with the mid-res carrying a displacement map. Then bake it in Max/Maya - not Blender, unfortunately, since it can't import and work smoothly with even a mid-res mesh of a couple million polygons. Limited support for nanomesh alpha is possible.
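
    If you ever want to combine two tangent-space normal maps yourself instead of relying on xNormal's base-texture option, one common approach is a "whiteout"-style blend: add the XY slopes, multiply the Z components, renormalize. A rough Python sketch (untested, file names are placeholders - I'm not claiming this is exactly what xNormal does internally):

        import numpy as np
        from PIL import Image

        def load_normals(path):
            """Load an 8-bit tangent-space normal map as [-1, 1] vectors."""
            rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32)
            return rgb / 127.5 - 1.0

        def whiteout_blend(base, detail):
            """Add XY slopes, multiply Z, then renormalize per pixel."""
            n = np.empty_like(base)
            n[..., 0] = base[..., 0] + detail[..., 0]
            n[..., 1] = base[..., 1] + detail[..., 1]
            n[..., 2] = base[..., 2] * detail[..., 2]
            n /= np.linalg.norm(n, axis=-1, keepdims=True) + 1e-8
            return n

        base = load_normals("baked_midres_normals.png")    # placeholder
        detail = load_normals("tiny_details_normals.png")  # placeholder
        out = (whiteout_blend(base, detail) + 1.0) * 127.5
        Image.fromarray(out.astype(np.uint8)).save("combined_normals.png")

    Both maps need to be the same resolution and in the same tangent basis for this to make sense.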

  • arcandio polycounter lvl 9
    I was all excited when I saw that Multi-Texture exporter in Zbrush had a flip-maps option. Thought that would work for me.

    What happens if you run something with nanomesh/GeoHD through Decimation Master? Just scrubs it all out?
  • gnoop sublime tool
    You can decimate them before scattering; it's more logical. Once scattered, you turn them into a true mesh and subtool, and then export as usual.

    There is also a pretty old, quick approach: bake the mid-poly into a single displacement texture, with all UV patches oriented consistently and with texel size kept as equal as possible. Then displace that texture from a square plane and add all the nanomeshes there, even hairs (though not across seams). After that you can just do a screen render in ZBrush. For the normal map, be sure to get rid of ambient light in the lighting settings, and bake your own normal-sphere picture somewhere for the normalRGB material, since the ZBrush one has the wrong gamma curve. Also, for hairs and nanomeshes you have to do a "best" render first and, if ZBrush still hasn't hung after that, a BPR render. Save 16-bit outputs. Then go with option 1 from my previous post.
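
    If you want to bake that normal-sphere picture yourself, here's a minimal Python sketch (my own assumption of the encoding: normals mapped linearly into RGB with +Y up and no gamma applied - adjust to match your normalRGB material):

        import numpy as np
        from PIL import Image

        size = 512
        ys, xs = np.mgrid[0:size, 0:size]
        # Map pixel centers to [-1, 1], with +Y pointing up.
        x = (xs + 0.5) / size * 2.0 - 1.0
        y = 1.0 - (ys + 0.5) / size * 2.0
        r2 = x * x + y * y
        z = np.sqrt(np.clip(1.0 - r2, 0.0, 1.0))  # unit sphere: x^2 + y^2 + z^2 = 1
        inside = r2 <= 1.0

        # Background pixels get the flat normal (0, 0, 1).
        normals = np.stack([np.where(inside, x, 0.0),
                            np.where(inside, y, 0.0),
                            np.where(inside, z, 1.0)], axis=-1)

        # Encode [-1, 1] -> [0, 255] linearly, no gamma curve.
        img = ((normals + 1.0) * 127.5).astype(np.uint8)
        Image.fromarray(img).save("normal_sphere.png")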

    A depth pass can be used to make a quick curvature/cavity map with High Pass and Levels in Photoshop (a script version is sketched below). You can then re-bake it onto your target low-poly mesh, with the final UV packing, in xNormal.

    This method works well for objects without too many small UV islands, and it's a real time saver.
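
    Here's roughly what that High Pass + Levels trick looks like as a script, if you'd rather skip the Photoshop round trip (untested sketch; radius and contrast are knobs to tweak, file names are placeholders):

        import numpy as np
        from PIL import Image
        from scipy.ndimage import gaussian_filter

        # Load the depth pass (16-bit grayscale) and normalize to [0, 1].
        depth = np.asarray(Image.open("zbrush_depth_pass.png"), dtype=np.float32)
        depth = (depth - depth.min()) / max(depth.max() - depth.min(), 1e-6)

        # High pass: subtract a blurred copy so only small relief remains.
        radius = 8.0  # works like Photoshop's High Pass radius
        highpass = depth - gaussian_filter(depth, sigma=radius)

        # "Levels": push contrast around mid-gray and clamp.
        contrast = 4.0
        cavity = np.clip(0.5 + highpass * contrast, 0.0, 1.0)

        Image.fromarray((cavity * 255).astype(np.uint8)).save("cavity_map.png")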


  • pior grand marshal polycounter
    - Don't bother baking in Zbrush. Instead use a dedicated, industry-compliant baking tool like Xnormal/Substance Painter/Toolbag3, as these can bake anything you throw at them, with the accuracy you'll need. And if the highres models are too heavy, that just means it's time to decimate :)

    - Pipeline-wise, focus your attention on being able to easily export things out of Zbrush - either directly to your baking application, or to your main 3d program, which will serve as a hub for your scene. That forces you to work lightly and efficiently, and also opens you up to all the traditional modeling tools. Don't believe the hype about "AAA characters created fully in Zbrush": while this is indeed possible for portfolio pieces, it's a nightmare in production, as the amount of cleanup and the time wasted on redoing things go through the roof.
  • arcandio polycounter lvl 9
    pior said:
    - Pipeline-wise focus your attention on being able to easily export things out of Zbrush
    Yeah, this seems to be the bottleneck for me at the moment. I'm trying to do most of this on a cheapo old laptop, which is seriously underpowered for baking and takes a long while even when exporting big OBJs. I can move the files to my workstation (this is all happening in my free time in the evenings, while watching/listening to tv etc) and bake from there, but that sort of breaks my flow of working from my comfy chair. Any ideas on that front? Should I export separate subtools or something? Or just cultivate some patience?
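
    One half-formed idea: let the workstation bake unattended while the laptop stays the comfy-chair machine. Something like this untested sketch could poll the shared export folder and run xNormal in batch mode on whatever shows up (assuming your xNormal build accepts a saved settings .xml on the command line - worth checking - and all paths here are placeholders):

        import subprocess
        import time
        from pathlib import Path

        XNORMAL = r"C:\Program Files\xNormal\x64\xNormal.exe"  # placeholder path
        WATCH = Path(r"\\LAPTOP\exports")  # shared folder the laptop exports into

        done = set()
        while True:
            # Each export drops an OBJ plus a pre-saved xNormal settings
            # .xml that references it; react only to the .xml files.
            for settings in WATCH.glob("*.xml"):
                if settings not in done:
                    done.add(settings)
                    print(f"baking {settings.name} ...")
                    subprocess.run([XNORMAL, str(settings)], check=True)
            time.sleep(10)  # poll every 10 seconds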

  • pior grand marshal polycounter
    Well, this seems to vary a lot from one person to another. Some artists seem to have no problem working with very heavy files and very slow response times - which is fine, I guess, but that definitely creates a lot of inertia when the need to edit something arises somewhere down the line. In a sense, working on an underpowered workstation is a bit similar.

    What you can do is think of it the other way around: "If this scene is not fluid and cannot be displayed at 60fps, then something is wrong and I need to address it." This might force you to work at lower densities than you'd like, but in the long run (after baking, texturing, and so on) the result will probably look just as good as if you had worked with gazillions of polygons. And if a scene is light enough to respond quickly in Zbrush, then it will likely be very fast to write out as an OBJ too, and Blender will have no problem displaying it.


  • DavidCruz interpolator
    ^ I really hate to say this, but I am in the same boat of very low memory, and having to separate a LOT of tools and export them all one at a time is horrendous. Seems your under-powered laptop suffers a bit from the same?

    I would go the route of exporting and working on each item separately in its own .ztl file before exporting. I also do not decimate any meshes, as I find it ruins my normal maps (I don't know if I am doing it wrong or if this is a flaw in the design, but the normals are never the same as from the raw subtool, FOR ME; I might be missing a step somewhere).

    Just for the record's sake, the piece I am working on now has 103 subtools. Usually, once I have them together at a moderate level and with the final silhouette, I can place them into their own .ztl files and work on them more there if need be. (Mentioning this in case you need that type of process to get a bit more out of the details.)

    HOWEVER, I find that just using your program of choice, tessellating and subdividing there, is a quicker route to a better end result (a recent finding from the sticky here in Tech, since I only recently started to learn more about hard-surfacing), using ZBrush as a detail-only app.

    I still don't get the "keep all objects together when making the AO bake" advice - it never works for me, and I still explode-bake those separately. (The only time I would decimate is for a screenshot of the high-poly for, say, the portfolio.)

    Thought I'd attempt to help with the subtool separation "method".
    Hope this helps someone somewhere on the forums.
  • arcandio polycounter lvl 9
    Pior: part of what's wacky to me is that this laptop has no trouble at all sculpting and viewing scenes in Zbrush up to about 1-2 million polys, the furthest I've pushed it so far. Baking maps and exporting OBJs, on the other hand, are more like, "well, at least I'm already watching tv, I guess."

    David: Yeah, I'm not excited about having to decimate my highpolys, but if that's what it takes to get them OUT of zbrush, then I guess I'll have to deal. I'm reading that xnormal can handle something like 30 million polygons for a highpoly, so that doesn't seem like it'd be a problem - unless I go nuts on a sculpt up to a million and THEN try to do GeoHD or Nanomesh on top of that. That might be pushing it. Who knows.
  • pior grand marshal polycounter
    That's the thing though - since you know that the process will involve getting out of Zbrush at some point or another, the fact that you can sculpt on a gazillion polygons in it is irrelevant. I've run into that issue countless times with "Zbrush-centered" artists: they produce extremely dense meshes (and they can sometimes do it fast, so credit where it's due), but if there is no way to art-direct the proportions of the models in a fluid manner in an external 3d program, then what's the point...

    Again, tackle the problem the other way around: work with what you have, identify the bottlenecks, and adjust your pipeline accordingly. For instance, you could decide that an export should never take more than 5 seconds, and determine the density of your models from there.
  • arcandio polycounter lvl 9
    I mean, I get that, and I'm glad to see that a lot of the commentary I'm reading these days (here and elsewhere) is much more about the smoothness of the pipeline than excessive levels of detail. And for sure, that's how I'll end up forming my own workflow. But for this particular process I'm just practicing, trying to push my limits. Since I've never sculpted something up that high, I'm just trying to figure out how to give it a go without tearing my hair out.

    (and to be sure, I can spin hardware limitations as part of the process, if I decide to show these pieces for some sort of interview or something)
  • musashidan high dynamic range
    Not to sound like a prick, but if you have a dedicated workstation in the next room that would solve a lot of your issues, yet you would prefer to sit on the couch and watch the telly as you work, then I have to ask: is your heart really in this? ;)
  • arcandio polycounter lvl 9
    It's a work-life balance thing. I could (and do sometimes) spend all day at my work station, but I don't want to spend my whole life at my workstation. I do like spending time with my family, so I'm trying to figure out how to do this and that at the same time. When you work freelance, it's very easy to over-devote yourself to your work and not leave any time for anything else. Now, some people that I've met in the industry can do that, and that works for them. I, personally, only have so much fuel for a given day. This way I can try to coax a little more art time out of my schedule without disappearing into the office forever.

    That aside, what is the "industry standard" workflow for Zbrush? Or would that be a question better suited for ZBC? Maybe I should make an account there too.