My art pipeline is currently set up around FaceGen, which gives me a number of blendshapes. Unfortunately, we didn't have time to get blendshapes implemented in our engine, but I'd still like to include facial animation, so I'm looking to do it via bones, which the engine is set up for.
Is there any way to have blendshapes drive bone animation? Any MEL or MaxScript for this sort of thing? Does anyone have any experience or ideas? I'm going to ask a Tech Artist/CharTD about it today and see if he can come up with anything, but I thought I may as well ask you all anyway (and perhaps you can come up with a native 3dsmax solution, instead of doing it in Maya and importing the animation into Max, since I can only export from Max).
What I'm thinking right now is to rivet some locators to certain vertices, then aim-constrain a relatively simple facial bone rig to those locators, so the bones' rotations follow the vertex motion. Then I would skin the face's base mesh to the bones and hopefully get a slightly simplified version of the animation. I don't need anything super-duper complicated, only a couple of visemes and expressions (mostly angry ones).
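Conceptually, each aim-constrained bone just rotates so its axis points at the riveted locator on the morphing mesh. A minimal sketch of that math in plain Python (no Maya API; the jaw pivot and chin locator positions are made-up numbers, and it works in a single plane for simplicity):

```python
import math

def aim_angle(pivot, target):
    """Angle (degrees) that rotates a bone at `pivot` so its +X axis
    points at `target`, working in the XY plane for simplicity."""
    dx = target[0] - pivot[0]
    dy = target[1] - pivot[1]
    return math.degrees(math.atan2(dy, dx))

# Hypothetical jaw pivot and a riveted chin locator sampled on one frame:
jaw_pivot = (0.0, 0.0)
chin_locator = (2.0, -1.0)
print(aim_angle(jaw_pivot, chin_locator))  # the rotation the jaw bone would key
```

Sampling this per frame as the blendshapes play is exactly what the aim constraint does for you; the sketch is just to show there's nothing magic being lost in the conversion.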
Any help you can give would be much appreciated. Thanks.
EDIT: And as long as we're on the subject, what sort of facial bone setup would you suggest? I'm thinking:
(1) jaw
(1) tongue
(2) corners of the mouth
(2) upper lip
(2) eyelids
(2) eyebrows (should I do 4 total?)
(6) cheeks/sides of face, responsible for pulling the laugh lines, etc.; basically most of the deformation not already taken care of
Replies
About facial skeletal structure, there have been threads about this in the past IIRC. A search might turn up some screenshots & tips. UDN has a facial rig illustrated in their wiki too.
I created a bunch of Rivets (each attaches a locator/dummy to the center of a particular face). I then made a skeleton consisting of a root/head bone, plus a bone for each locator (and extra ones for the jaw, eyes, etc.). I point-constrained each bone to its locator, then skinned a new mesh to this skeleton. It works as well as I could have hoped.
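Since the point-constrained bones just copy the locator positions, what the engine ultimately sees is each bone's translation relative to the root/head bone. A rough sketch of that conversion (plain Python; assumes the head bone is unrotated, which skips the full inverse-transform Maya would apply, and the positions are invented):

```python
def world_to_local(head_pos, locator_pos):
    """Translation of a face bone relative to the head/root bone,
    assuming the head bone carries no rotation (a simplification)."""
    return tuple(l - h for l, h in zip(locator_pos, head_pos))

head = (0.0, 10.0, 0.0)           # hypothetical head-bone world position
brow_locator = (1.5, 12.0, 3.0)   # riveted brow locator, world space
print(world_to_local(head, brow_locator))  # (1.5, 2.0, 3.0)
```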
However, I'm having trouble doing this in Max, because the bones don't work the same way.
I use an Attachment Constraint to constrain a dummy box to the morphing face mesh, but I'm lost after that... I tried position-constraining the last "nub" bone to the dummy, but it doesn't work. I'm used to working with joints rather than Max bones, so I'm sure there's something I'm missing conceptually.
EDIT: Alright, I've imported from Maya and I'm using Point helpers instead of Max bones, which seems like it should work... I'll let you know.
So you need a setup that translates the baked vertex motion into bone rotations. IK is the key thing I think you're forgetting to try... that lets the Attachment controller be separated from the bone itself, so the bone <u>has to</u> rotate to match it.
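To see why IK forces the rotation: with the bone lengths fixed and the goal pinned to the Attachment-controlled dummy, the only free variables are the joint angles, so the solver has no choice but to rotate the bones. A hedged two-bone planar sketch using the law of cosines (all lengths and goal positions invented):

```python
import math

def two_bone_ik(l1, l2, goal):
    """Planar two-bone IK: returns (root, elbow) angles in degrees so a
    chain rooted at the origin reaches `goal`. Assumes goal is in reach."""
    gx, gy = goal
    d = math.hypot(gx, gy)
    # Elbow bend from the law of cosines (clamped against rounding error)
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Root angle: direction to the goal, minus the triangle's inner angle
    root = math.atan2(gy, gx) - math.atan2(l2 * math.sin(elbow),
                                           l1 + l2 * math.cos(elbow))
    return math.degrees(root), math.degrees(elbow)

print(two_bone_ik(1.0, 1.0, (1.0, 1.0)))  # forward kinematics lands back on (1, 1)
```

The same principle holds for a single facial bone: the Attachment controller moves the goal, and the bone's rotation is fully determined by having to reach it.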
It depends on your exporter, though: whether it's OK with no keyframes on the bones themselves. If it isn't, you could bake the bone rotations down into keyframes.
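Baking just means sampling the constraint-driven channel once per frame and writing explicit keys the exporter can read. A minimal sketch of the idea (standalone Python; `sample_rotation` is a stand-in for whatever the constraint evaluates to on each frame):

```python
def bake_keys(sample_rotation, start, end):
    """Sample a procedurally driven channel once per frame and return
    explicit (frame, value) keyframes for export."""
    return [(f, sample_rotation(f)) for f in range(start, end + 1)]

# Stand-in driver: a jaw that opens 2 degrees per frame.
keys = bake_keys(lambda frame: 2.0 * frame, 0, 4)
print(keys)  # [(0, 0.0), (1, 2.0), (2, 4.0), (3, 6.0), (4, 8.0)]
```

In Max or Maya the equivalent is the built-in bake/plot tools, which do this sampling and then delete the constraints.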
http://www.softimage.com/products/face_robot/game_export/acting_to_game/deformation/default.aspx
http://udn.epicgames.com/Two/SkeletalSetup.html#Highly%20Articulated%20Facial%20Skeleton
What I did was basically just what I said. I Attachment-constrained a bunch of dummy boxes to key areas of the morphing mesh (as per the XSI Face Robot example bone structure), then position-constrained a bone to each dummy object. For the eyelids, eyes, and jaw, I aim-constrained a bone to the dummy to get the proper rotation. The skinning was the hardest part, but not too bad. Thanks again, your examples were definitely a big help.