
A few workflow questions regarding ZBrush, 3D Scan Store and TexturingXYZ assets

Hi!
So I've been sculpting this character using the MetaHuman topology. I now want to project a 3D Scan Store basemesh for the body, and use TexturingXYZ for the head.

My MetaHuman basemesh is at real scale, that is, 180 cm tall, whereas the Scan Store mesh is 1.8 cm tall.
What is the correct workflow you all use in ZBrush? Scaling has always been so confusing to me.
I'm familiar with the export scale settings. If I load up the 3D Scan Store ZTL it is 2 units high, which is to be expected, as is my own MetaHuman. The export scale of the Scan Store mesh is 0.9, of course, since exporting it should produce the 1.8 cm model again. My MetaHuman's scale is 90, though, because it should produce the 180 cm model when exported.

I can change the scale of the Scan Store mesh to 90 and then append it to my MetaHuman ZTool; they will then be the same scale and the high-frequency details stay intact (as opposed to actually scaling up the ZTool itself). I'm just wondering: is this the preferred workflow, or am I doing things backwards? Do you all scale your models before bringing them into ZBrush, or what scale do you actually use for your content?
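Just to write the Export Scale arithmetic down for myself as a quick Python sketch (the 2-unit tool height and the sizes are the numbers above; the function name is only for illustration):

    # ZBrush keeps a tool at roughly 2 scene units internally; the Export Scale
    # factor converts those units back to real-world size on export:
    #   exported_size = tool_height_in_units * export_scale

    def export_scale_for(target_height_cm, tool_height_units=2.0):
        """Export Scale value that makes a tool export at target_height_cm."""
        return target_height_cm / tool_height_units

    print(export_scale_for(180.0))  # MetaHuman:  90.0 (2 units * 90  = 180 cm)
    print(export_scale_for(1.8))    # Scan Store:  0.9 (2 units * 0.9 = 1.8 cm)

    # Matching the appended Scan Store subtool to the MetaHuman just means giving
    # it the MetaHuman's factor of 90, i.e. the "change the scale to 90" step above.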

When it comes to the actual details from Scan Store and XYZ, what is the preferred workflow?
With both Scan Store and XYZ I project the meshes with Wrap4D and use "TransferTexture" to bake the textures to my MetaHuman UVs. I end up with maps for the body and the head, but these won't be seamless, so how do I approach that?
The displacement details from Scan Store are only in the ZTool, so do I have to project them onto my actual mesh in ZBrush and then export them?
Do I bake down the XYZ displacement maps for the head in ZBrush and blend them with the body, which already has the Scan Store details?
Then for all the color textures like diffuse, roughness and specular, do I bake them down similarly to polypaint data and export them at the end?
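My rough understanding is that fixing those seams comes down to a feathered cross-fade between the two bakes in UV space, whether that happens in Mari, Substance or a quick script. A minimal sketch with OpenCV/NumPy, assuming both maps are already baked to the same MetaHuman UV layout and I have a rough hand-painted head mask (all file names are placeholders):

    import os
    os.environ["OPENCV_IO_ENABLE_OPENEXR"] = "1"  # newer OpenCV builds gate EXR I/O behind this flag
    import cv2
    import numpy as np

    # Placeholder files: both bakes are assumed to sit on the MetaHuman UV layout already.
    head_bake = cv2.imread("head_color.exr", cv2.IMREAD_UNCHANGED).astype(np.float32)
    body_bake = cv2.imread("body_color.exr", cv2.IMREAD_UNCHANGED).astype(np.float32)

    # Rough black/white mask of where the head bake should win.
    mask = cv2.imread("head_mask.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0
    mask = cv2.GaussianBlur(mask, (0, 0), sigmaX=25.0)  # feather so the neck transition is gradual
    mask = mask[..., None]                              # broadcast over the color channels

    blended = head_bake * mask + body_bake * (1.0 - mask)
    cv2.imwrite("blended_color.exr", blended)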

Replies

  • pxgeek
    For me, since I absolutely loathe dealing with scale in ZBrush, I always bring source meshes into Max, make sure my scale is set there (cm), and work from that. The Scan Store meshes I've worked with are always really tiny... some I have to scale up 20k-50k percent.

    On working with scan displacement maps (and other super high-res textures), I really like using Mari. I think it's a great companion app to ZBrush.
  • Geuse
    pxgeek said:
    For me, since I absolutely loathe dealing with scale in ZBrush, I always bring source meshes into Max, make sure my scale is set there (cm), and work from that. The Scan Store meshes I've worked with are always really tiny... some I have to scale up 20k-50k percent.

    On working with scan displacement maps (and other super high-res textures), I really like using Mari. I think it's a great companion app to ZBrush.
    Thank you. So should I just bake the geometry of the Scan Store model down to a displacement map from ZBrush, then import all the other maps into Mari and blend the seams there?
    I mean, all the other assets are already maps, so it seems like the wrong way to go to project them onto ZBrush geometry and polypaint and then export that again. I'm sure I would lose quality, and the multichannel fidelity of the XYZ maps.
  • pxgeek
    It's just how I like to do it. If I have different displacement scans that I need to integrate together on a single mesh, Mari is going to give me the most control.

    Oh, so if you're using multi-channel scan assets then that method is probably not as viable, since Mari doesn't paint multiple channels at the same time. Substance Painter is better with that.
  • Geuse
    Yeah. I should have explained better.
    Scan Store provides full-body assets. They come with a five-subdivision-level ZTool and the following maps: color, normal, roughness, specular roughness.
    So the geometry detail is not baked out to a displacement map.
    You wrap and transfer the geometry detail to your own model and export the displacement yourself. The other maps are easy to transfer with Wrap4D.

    Then TexturingXYZ has these head assets, which are a low-poly OBJ basemesh with EXR maps: albedo, displacement (three channels, where each channel carries a different level of detail, the last one holding really fine high-frequency details like pores, so you can control the channels separately), and ID maps to isolate the nose, ears etc. for control in the shader.
    The thing is, I'm sure it's intended to be used just by wrapping the low-poly head to your own model and then rendering with the XYZ model as-is, since all the maps come along without the need to bake or transfer anything. Otherwise you'd have to transfer every map to new UVs for your own model, which is what I'm thinking of doing (rough sketch at the end of this post).

    So since the Scan Store asset comes with its details only in a ZTool, I wonder if I should add the XYZ displacement in ZBrush on a new layer and blend it into the details I get from the Scan Store ZTool, or whether I'd be better off baking the Scan Store details out and blending the maps in other software (second sketch below).
    But this might be overkill. Scan Store also provides heads with HD maps that should be enough and easier to manage than the multichannel displacement.
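    To make the multichannel part concrete for myself: as far as I can tell, "controlling the channels separately" is just a weighted sum of the three frequency bands, done either in the shader or beforehand. A rough NumPy/OpenCV sketch; the file name and weights are placeholders, and the channel-to-frequency mapping is my assumption from the description above (last channel = finest pore detail), so it should be checked against the asset docs:

        import os
        os.environ["OPENCV_IO_ENABLE_OPENEXR"] = "1"  # newer OpenCV builds gate EXR I/O behind this flag
        import cv2
        import numpy as np

        disp = cv2.imread("xyz_displacement.exr", cv2.IMREAD_UNCHANGED).astype(np.float32)
        disp = disp[..., ::-1]  # OpenCV loads BGR; flip back to the RGB order the asset uses

        # Assumed mapping: first channel = coarse forms, last channel = fine pores.
        coarse, mid, fine = disp[..., 0], disp[..., 1], disp[..., 2]

        # Per-band intensities -- placeholder values, tuned to taste.
        combined = coarse * 1.0 + mid * 0.8 + fine * 0.5
        cv2.imwrite("xyz_disp_combined.exr", combined)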
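    And if I do bake the Scan Store details out to a 32-bit displacement map, blending the two sources outside ZBrush would just be an additive, layer-style mix, assuming both maps are single-channel, zero-centered and on the same UV layout. Again a rough sketch with placeholder names:

        import os
        os.environ["OPENCV_IO_ENABLE_OPENEXR"] = "1"
        import cv2
        import numpy as np

        # Both assumed single-channel, 32-bit, zero-centered, baked to the same UV layout.
        scanstore_disp = cv2.imread("scanstore_disp_baked.exr", cv2.IMREAD_UNCHANGED).astype(np.float32)
        xyz_disp = cv2.imread("xyz_disp_combined.exr", cv2.IMREAD_UNCHANGED).astype(np.float32)

        # Optional feathered mask so the XYZ pores only affect the head region.
        head_mask = cv2.imread("head_mask.png", cv2.IMREAD_GRAYSCALE).astype(np.float32) / 255.0
        head_mask = cv2.GaussianBlur(head_mask, (0, 0), sigmaX=25.0)

        layer_strength = 1.0  # like a ZBrush layer intensity slider; tune to taste
        combined = scanstore_disp + xyz_disp * head_mask * layer_strength
        cv2.imwrite("final_displacement.exr", combined)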

  • pxgeek
    Geuse said:
    So since the Scan Store asset comes with its details only in a ZTool, I wonder if I should add the XYZ displacement in ZBrush on a new layer and blend it into the details I get from the Scan Store ZTool, or whether I'd be better off baking the Scan Store details out and blending the maps in other software.

    Without knowing the details of the work and what end result you're after, I'd say whichever way is more efficient for you... as it sounds like they achieve the same thing.
  • Geuse
    I just want to animate a MetaHuman, but make it look as good as possible.
    Thank you.
    I think I'll try using Substance Painter.