Awesome! Thanks for posting this. I was wondering, when it was first announced, whether this would be possible. Would be cool to play around with baking matcaps down onto the fiber mesh as well. Probably get some interesting results.
I tried something similar recently using Max hair converted to a mesh and basically found the results too noisy for use in a normal map. You're having the same problem here, I think, although it's not too bad on account of the hair being relatively short.
I find normal mapping hair on a per-strand basis always looks a bit crap regardless of technique, though, so it's not a weakness in the process. I've always found it better to work on clumps and leave the specular and diffuse to pick out strands.
Your AO and diffuse came out nice, so this is definitely a workable technique, and since you can get the shapes you want out of FiberMesh ten times faster than using Max hair, I'd call it a win.
You can make the strands very chunky and stylised easily, which works better for baking. The default strands are flat planes, but you can make them volumetric and much bigger just by playing around with the sliders. Lots of potential not demonstrated in the Pixologic preview vid.
I did it too fast, without testing thoroughly... but I liked the result.
Only the normal comes out "weird", but I believe that with some configuration it's possible to get a good result.
@Bonebrew22 Everything can be configured: width, thickness, twist, quantity, distribution, base color, mid color, top color...
I am wondering, though, why the look isn't that realistic. What's the problem with the normal maps?
Because normals only capture normals, you don't really get a layered, self-shadowed, volumetric look, and what you do get is confined to the silhouette of the mesh.
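To make that concrete: a normal map texel is nothing but an encoded direction. Here's a minimal Python sketch of the standard decode, purely illustrative and not tied to ZBrush or any particular baker:

[code]
# Minimal sketch (plain Python, nothing tool-specific): all a tangent-space
# normal map stores per pixel is a direction, remapped from 0..1 RGB to a
# -1..1 vector. No depth, no layering, no self-occlusion anywhere in there.
def decode_normal(r, g, b):
    """Map an RGB texel (0..1 floats) back to a unit normal vector."""
    x, y, z = 2.0 * r - 1.0, 2.0 * g - 1.0, 2.0 * b - 1.0
    length = (x * x + y * y + z * z) ** 0.5
    return (x / length, y / length, z / length)

# The 'flat' default colour (128, 128, 255) decodes to roughly straight up:
print(decode_normal(128 / 255, 128 / 255, 255 / 255))  # ~(0.0, 0.0, 1.0)
[/code]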
Here's a quick test with thicker fibres, rendered with BPR and then ZAppLinked to the diffuse. I didn't spend too much time grooming, as evidenced by that terrible clusterfff near the mouth, and I used the default light, which led to baking in shadows behind and under the head. Perhaps with some more time spent, it could have some potential for certain uses.
I imagine you could do the same thing with some simpler planes, which would also let you bake in the edges of the strands for an alpha (flat-color black-on-white document, or vice versa). I'm not sure how that would compete against just doing it in Photoshop.
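If anyone goes the planes route, pulling the alpha out of a flat-colour render is trivial to script. A rough sketch with Pillow, where 'strands_render.png' is just a hypothetical name for your saved render:

[code]
# Hedged sketch with Pillow: threshold a flat-colour strand render into a
# black/white alpha. 'strands_render.png' is a made-up filename; adjust the
# threshold (64 here) to taste depending on your render's anti-aliasing.
from PIL import Image

img = Image.open("strands_render.png").convert("L")   # to greyscale
alpha = img.point(lambda v: 255 if v > 64 else 0)     # opaque where bright
alpha.save("strands_alpha.png")
[/code]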
How about instead using it to make very, very thick fur, like clusters of fur for a more 3D look, and then projecting smaller and smaller ones over this... no idea how you'd place planes of alpha fur texture to be captured, though...
I imagine, since the fibres are just tens of thousands of individual, unconnected polygon strands, it's a terrible amount of data to try and bake or even give UVs to (all those strands would probably be their own crazy island?). Whereas xNormal can just load the polypaint data from the OBJ and bake it to a lowpoly mesh that isn't so UV-intensive.
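Some napkin maths on that, with made-up but plausible numbers, just to show the scale:

[code]
# Napkin maths, all numbers made up but in the right ballpark.
strands = 50_000                        # "tens of thousands" of fibres
segments = 8                            # length segments per strand
verts_per_strand = (segments + 1) * 2   # a flat two-column ribbon
total_verts = strands * verts_per_strand

bytes_per_vert = 4 * (3 + 3 + 2)        # float32 position + normal + UV
print(f"{total_verts:,} verts, ~{total_verts * bytes_per_vert / 2**20:.0f} MiB")
print(f"{strands:,} UV islands if every strand got its own shell")
[/code]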
FiberMesh can be exported, although whatever you have in the viewport is a temporary blockout, and when you render it gets smoothed out. The exported mesh can be just as good, though, if you give it enough subdivisions.
It's been a while since I've done it, but if memory serves it was along the lines of:
1. Give the mesh a texture. Create a new blank one if you must.
2. Do your BPR Render and save it. It's optional, but you might want to store the camera position in case you accidentally move it.
3. Drop with Projection Master and Zapplink it to Photoshop
4. Load the BPR Render in Photoshop and save it
5. Send it back to ZBrush and pick it up. The result should be transferred to the texture instead of polypaint.
I used xNormal to bake. Basically you need to transform the FiberMesh into a 3D mesh (Polymesh3D).
Export the high poly OBJ.
(You can paint the model/fibers with polypaint and export; the polypaint information will be embedded inside the OBJ file, as plain vertex paint, and xNormal also reads this info to do the texture projection. There's a sketch of that format after these steps.)
Inside xNormal, load your high and low poly models to bake everything (the low poly model needs to have UVs).
Bake
This works best with thicker fibers; extremely thin fibers can make the normal map jagged.
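For the curious, a rough sketch of what that embedded polypaint looks like. If memory serves, ZBrush writes it as '#MRGB' comment lines in the OBJ; treat the details as an assumption, and the filename below is just a placeholder:

[code]
# Hedged sketch: if memory serves, ZBrush stores polypaint in the OBJ as
# '#MRGB' comment lines, 8 hex digits per vertex (mask, R, G, B), which
# readers like xNormal pick up alongside the geometry. 'fibres_high.obj'
# is just a placeholder name.
def read_polypaint(path):
    colours = []
    with open(path) as f:
        for line in f:
            if line.startswith("#MRGB"):
                data = line.split(None, 1)[1].strip()
                for i in range(0, len(data), 8):
                    chunk = data[i:i + 8]   # MMRRGGBB
                    r = int(chunk[2:4], 16)
                    g = int(chunk[4:6], 16)
                    b = int(chunk[6:8], 16)
                    colours.append((r, g, b))
    return colours  # one (r, g, b) per vertex, in vertex order

print(len(read_polypaint("fibres_high.obj")), "painted vertices")
[/code]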
The Micromesh BPR view can easily be converted to a mesh by pressing the 'Convert BPR To Geo' button, which you can find above the 'DIVIDE' button in the Geometry tab. You can press it only in BPR display mode. Hope that helps.
Would you be kind enough to share your process of transferring the fibermesh to the UVs of your lowpoly head? I'm new to ZBrush and I'm having a hard time converting FiberMesh fur into a texture. Any help would be greatly appreciated.
-Ali
The low poly result looks stylised to me. Maybe some black lines here and there and bam (The Walking Dead style).
Seems you're losing some of the effect to noise.
Good stuff. :thumbup:
Now, another thing with a lot of potential for generating maps is MicroMesh:
[ame="http://www.youtube.com/watch?v=pvTK-R5sK6E"]ZBrush 4R2b Basics of MicroMesh - YouTube[/ame]
[ame="http://www.youtube.com/watch?v=d67oTqaRtYA"]ZBrush 4R2b Fibers with MicroMesh - YouTube[/ame]
Would MicroMesh generate maps, though, or be exportable? I haven't played around with it, but it reads more like something that only occurs during BPR.
You're right, there is no way to export the MicroMesh.
Maybe some plugin could do it.
I think with MicroMesh it would be easy to do something like a hauberk, by creating a tiled micromesh.
Seems like a hand sculpt of the fur, or using a photosource, is more realistic?
keep us updated ...
Also look at your RAM consumption; xNormal is 64-bit, ZBrush is not.
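For anyone wondering why the 64-bit point matters, some illustrative (entirely made-up) numbers:

[code]
# Illustrative numbers only. A 32-bit process tops out at roughly 2-4 GB of
# address space, however much RAM the machine has; a 64-bit one doesn't care.
verts = 10_000_000                 # hypothetical dense fibre export
bytes_per_vert = 4 * (3 + 3 + 3)   # float32 position + normal + colour
print(f"raw mesh alone: ~{verts * bytes_per_vert / 2**30:.2f} GB")
# On top of that ZBrush keeps subdivision levels, undo history, buffers...
# so a heavy fibre scene can hit the 32-bit ceiling well before xNormal does.
[/code]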