What is the best way to generate normal maps from very dense meshes? I know of many ways to do it, but there are problems with those approaches.
My current workflow is as follows:
1) Model the low poly character
2) Model the high detail model, focusing on main forms, hard-surface details and other inorganic details.
3) Bring the high detail model into ZBrush. At this point it's still fairly low poly; much higher than I'd want for a game character model, but still a low poly cage mesh, since I do all the subdivision in ZBrush.
4) Subdivide the model in ZBrush, flesh out the forms, and add organic and other finer details.
5) Unwrap the low poly game model.
6) Bring the high poly model into Max.
This is where the problem arises. I can't use ZBrush to create the normal maps because I need them generated by comparing the low poly game model with the high poly model, not with the base level mesh I brought into ZBrush, which means I have to use Max's Render To Texture to get my normal maps. The problem is that the scene becomes unworkable with such a high poly mesh. Tweaking the projection cage is almost impossible because the viewport moves so slowly, but I can't hide the high poly mesh because I need it visible to make sure it isn't penetrating the projection cage anywhere.
I've read somewhere, I think, that Epic and maybe others broke models up into several parts so the rest could be hidden while working, making the scene more manageable. Wouldn't this cause seams to show up in the normal maps between the parts? And even if I could hide the parts I'm not working on to improve performance, I'm not sure my computer could handle the memory requirements of having the entire model loaded, even if only parts of it are being rendered. Would you bring each part in one at a time, make a normal map for each part, and then composite them together in Photoshop?
A little clarification on an efficient system for obtaining normal maps from very dense meshes would be greatly appreciated. Thanks
Replies
Or, you could use xNormal, which works great:
xNormal. Again, it lets you bake normals for arbitrary meshes.
If you definitely have to or want to use Max for some reason, I'm pretty useless since I use Maya. But I think poop has a tutorial where you make a new layer and then hide it before you import to make it workable?
I haven't tried xNormal; I may look into it if it can do what I'm trying to do reasonably fast.
Thanks for the reply
[ QUOTE ]
The problem using ZMapper is that it only works between subdivision levels of the same mesh.
[/ QUOTE ]
Actually that's not true. If you look at the .pdf that comes with ZMapper, go to section 3, "Projecting Normal Maps Onto New Topology". You use the "projections" tab in ZMapper to "capture" the hi-res mesh, then close ZMapper, then open your game-res mesh, open ZMapper again, and bake the normal map. Again, pretty clunky but it works.
That being said, I've been much happier with results from xNormal.
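For what it's worth, the projection step in all of these tools boils down to roughly the same idea per texel. Here's a minimal sketch in Python, just to illustrate the idea; bake_texel, raycast_hires and low_poly_tbn are made-up names for the example, not anything from an actual tool:

[ CODE ]
import numpy as np

def bake_texel(cage_point, cage_normal, raycast_hires, low_poly_tbn):
    """Bake a single texel: cast a ray from the projection cage in toward the
    surface, read the hi-res normal at the hit point, and express it in the
    low poly's tangent space so it can be stored in the map.

    raycast_hires(origin, direction) is assumed to return the world-space
    hi-res normal at the nearest hit, or None on a miss. low_poly_tbn is a
    3x3 matrix whose columns are the low poly's tangent, bitangent and normal.
    """
    hit_normal = raycast_hires(cage_point, -cage_normal)  # cast inward from the cage
    if hit_normal is None:
        return np.array([0.5, 0.5, 1.0])                  # flat normal where nothing is hit
    n = low_poly_tbn.T @ hit_normal                       # world space -> tangent space
    n /= np.linalg.norm(n)
    return n * 0.5 + 0.5                                  # remap [-1, 1] to [0, 1] for the bitmap
[/ CODE ]

Since the bake only cares about whatever surface each ray hits, baking the hi-res mesh in chunks shouldn't create seams by itself, as long as the rays near the cuts still find the right surface.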
A certain ZBrush competitor had similar problems baking the normals correctly, and just like ZBrush it did not provide obvious, useful controls to improve the accuracy. I wonder how people deal with this.
Keeping the imported mesh as an Editable Mesh, not an Editable Poly, is also quicker in the viewport.
Yeah, compositing them in Photoshop is the way to go. With the unwrap all laid out, it's a simple case of deleting the unwanted areas.
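Here's a rough sketch of what that compositing step amounts to as a script, assuming each chunk was baked to its own full-size map over the shared UV layout and the untouched texels come out as the flat normal colour (128, 128, 255); the file names are made up:

[ CODE ]
from PIL import Image
import numpy as np

FLAT = np.array([128, 128, 255], dtype=np.uint8)   # "empty" flat normal colour
chunk_bakes = ["bake_head.tga", "bake_torso.tga", "bake_legs.tga"]  # hypothetical files

result = None
for path in chunk_bakes:
    img = np.asarray(Image.open(path).convert("RGB"))
    if result is None:
        result = np.empty_like(img)
        result[...] = FLAT                          # start with an all-flat map
    covered = np.any(img != FLAT, axis=-1)          # texels this chunk actually baked
    result[covered] = img[covered]

Image.fromarray(result).save("normal_composited.tga")
[/ CODE ]

In practice you'd probably key off the bake's alpha or a padded mask rather than an exact colour match, but the idea is the same as deleting the unwanted areas by hand.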
Slice your ZBrush hi-res mesh up and export the chunks.. cast for each chunk.. then assemble in Photoshop. I still tend to do this because I use Kaldera in Max 6; I think it generates the best normal maps. Another solution that most of us now use is XSI. It handles a lot of polys no problem.
peace
-mike
PS: Haven't tried xNormal but everyone seems to like it.. might give that a shot as others have posted.