
Swappable animation rigs (or something)

ned_poreyra polycounter lvl 4
I stumbled upon funny videos from this series https://www.youtube.com/watch?v=Ia4ziBah5hc where someone apparently could apply animations of any character in the game to any other character in the game. Although the deformations broke in places, I'm curious how it was at all possible. Does it mean that all characters in the game share exactly the same rig setup, just with scaled bones and different weight painting? So, let's say, there are bones Head1, Head2, Arm1, Arm2 etc., and if I create an animation strip for one character, can it just be copied to any character using the same rig? Is it a common thing in game dev?
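
Roughly what I have in mind, as a toy sketch (made-up data structures, not any real engine's API): if a clip only stores curves per bone name, then any character whose skeleton uses the same names could play it back, and how well it deforms would just depend on proportions and skin weights.

```python
# Toy sketch (made-up data structures, not any real engine's API): an animation
# clip keyed on bone *names* can be played back on any character whose skeleton
# uses the same names; deformation quality then depends on proportions/weights.
from dataclasses import dataclass

@dataclass
class Character:
    name: str
    bone_names: set        # e.g. {"Head1", "Arm1", "Arm2", "Leg1", ...}

@dataclass
class AnimClip:
    name: str
    curves: dict           # bone name -> list of keyframed transforms

def can_play(clip: AnimClip, character: Character) -> bool:
    # The clip is usable as long as every bone it animates exists on the target.
    return set(clip.curves).issubset(character.bone_names)

a = Character("CharacterA", {"Head1", "Arm1", "Arm2", "Leg1", "Leg2"})
b = Character("CharacterB", {"Head1", "Arm1", "Arm2", "Leg1", "Leg2", "Tail1"})
run = AnimClip("run", {"Arm1": [], "Arm2": [], "Leg1": [], "Leg2": []})

print(can_play(run, a))  # True
print(can_play(run, b))  # True -- same bone names, so the clip "just copies over"
```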

Replies

  • Obscura grand marshal polycounter
    This is called retargeting.
  • oglu polycount lvl 666
    mGear does have some fine tools to make that easier.

    https://youtu.be/pjPzCBN5KB0

    Also Maya's HumanIK:
    https://youtu.be/k2SmByZDIjw?t=318
  • Mark Dygert
    Very common.

    In general having a base skeleton for almost every character that shares the same physiology is pretty standard. I typically have... 
    One skeleton for bipeds (humans, most humanoid monsters, kangaroos, a lot of dinosaurs, monkeys and birds).
    One for quadrupeds (bears, goats, cows, horses, otters, rats and big cats). 

    It's important to note that just because they share the same bone names, and their transforms are in roughly similar locations pointing in similar directions, it doesn't mean we're using the exact same rig to animate with. Often we'll use the same skeleton but rig it up inside different meshes to suit the needs of that character/creature and their behavior.

    If a suitable rig already exists and the behavior is roughly the same, we might just use the same rig and animations (think donkey and mule); we're not making two separate rigs unless one sprouts tentacles or wings.

    Unreal has an extensive set of tools to make the process of re-targeting animations pretty smooth.
    https://docs.unrealengine.com/en-US/Engine/Animation/RetargetingDifferentSkeletons/index.html 
    This method uses the same skeleton and the same animations, with a transform (skeleton preset) inserted between the original animation and the character it's being applied to.
    https://docs.unrealengine.com/en-US/Engine/Animation/Persona/BasePoseManager/index.html
    Unreal will figure out the difference between the rigs and apply a transform to the animation so it doesn't distort horribly. Even so, you can see the limitations when things diverge too far. Likewise, you could take Tracer's animations that were built using her rig and apply them to any other character.

    Basic workflow:
    1. You import your base skeleton, "GenericBipedSkeleton" or whatever; this will be the skeleton that everything else uses.
    2. Import the Tracer Skeletal Mesh and point it to the GenericBipedSkeleton.
    3. On the GenericBipedSkeleton, in the Retarget Manager, create a skeleton preset for Tracer.
    4. Import Tracer's anims and set them to use the Tracer preset.
    5. Import the Junkrat Skeletal Mesh and create a preset for Junkrat, just like you did for Tracer.
    6. Import Junkrat's anims and set them to use the Junkrat preset.
    Now either skeletal mesh can use either set of animations because they use the same skeleton, even though their joint positions are different. 
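
    If you ever script the import side of that, it looks roughly like this in Unreal's editor Python (paths and asset names below are made up; the Retarget Manager preset steps I'd still set up in the editor UI):

```python
# Rough sketch of steps 1-2 via Unreal's editor Python (paths/names are made up).
# The skeleton preset / retarget source steps are easier to do in the
# Retarget Manager UI, so they're not shown here.
import unreal

# The shared skeleton asset that everything else will point at.
shared_skeleton = unreal.load_asset("/Game/Characters/GenericBipedSkeleton")

options = unreal.FbxImportUI()
options.set_editor_property("import_mesh", True)
options.set_editor_property("import_as_skeletal", True)
options.set_editor_property("skeleton", shared_skeleton)  # point the mesh at the shared skeleton

task = unreal.AssetImportTask()
task.set_editor_property("filename", "D:/Source/Tracer_SkeletalMesh.fbx")  # made-up path
task.set_editor_property("destination_path", "/Game/Characters/Tracer")
task.set_editor_property("options", options)
task.set_editor_property("automated", True)
task.set_editor_property("save", True)

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
# Anim imports work the same way with the same skeleton assigned; each AnimSequence
# then gets pointed at its character's preset in the Retarget Manager.
```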

    It will mostly compensate by applying the difference between the base skeletal poses, but there is a limit to what it can do. You can see that in the video, where Genji's anims fit pretty well but Zen's legs and McCree's neck are horrible. Zen has poofy pants, and there are some seriously wonky skin weights on Tracer's lower legs when his anim is applied to her Skeletal Mesh. Aside from the wonky skin weights or missing rigging, his pants are clearly much poofier than her skinny legs, so the pose doesn't work, and some kind of correction or compromise would need to be made if they were both going to share that anim.
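
    The "difference between the base poses" part boils down to something like this per bone (a simplified sketch of the general idea with local transforms as 4x4 matrices, not Unreal's actual code; real systems handle rotation, translation and scale separately):

```python
# Simplified sketch of base-pose compensation (the general idea, not Unreal's
# actual implementation): the per-bone delta between the two characters' base
# poses is applied on top of the source animation's pose.
import numpy as np

def retarget_pose(anim_pose, source_base, target_base):
    """Each argument is a dict of bone name -> 4x4 local transform."""
    out = {}
    for bone, anim_xform in anim_pose.items():
        # Delta that turns the source character's base pose into the target's.
        delta = target_base[bone] @ np.linalg.inv(source_base[bone])
        out[bone] = delta @ anim_xform
    return out

# e.g. the target's spine sits 10 units higher at rest, so the retargeted pose
# inherits that offset instead of snapping to the source's proportions.
src_base = {"Spine": np.eye(4)}
tgt_spine = np.eye(4)
tgt_spine[1, 3] = 10.0
tgt_base = {"Spine": tgt_spine}
print(retarget_pose({"Spine": np.eye(4)}, src_base, tgt_base)["Spine"])
```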

    There will always be some wonkiness if the base poses are off by too much. SO... you can use poses in Unreal to fix base pose discrepancies and minor issues like broken fingers or arms clipping into torsos, but really, you never want to be in a position where you need to make extreme cases like that work. Maybe it's OK for rapid prototyping or just to block things in on a new character that doesn't have its anims yet.

    In those cases where you want to edit the anims for a specific character, you are better off using the copy/retarget workflow, so you have a new set of anims for that character that you can tweak and change.

    https://docs.unrealengine.com/en-US/Engine/Animation/AnimationRetargeting/index.html
    Copy/Retarget will clone the anims and apply the transform difference in the base pose to the entire set of copied animations, blendspaces, AimOffsets, etc. Then you can edit them in Unreal to some degree, but most people export them back to whatever DCC they're using.
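
    In sketch form, the difference from the shared-skeleton route is that the base-pose delta gets baked into a brand new set of clips that you then own and edit per character (again made-up structures, not the engine's API):

```python
# Toy sketch (made-up structures, not the engine's API): copy/retarget clones
# the clips and bakes the base-pose delta into every key, producing a separate
# set of assets you can edit per character -- and now have to maintain separately.
import numpy as np

def copy_and_retarget(clips, source_base, target_base, suffix="_Copy"):
    """clips: dict of clip name -> {bone name: [4x4 key transforms]}."""
    new_clips = {}
    for clip_name, curves in clips.items():
        baked = {}
        for bone, keys in curves.items():
            delta = target_base[bone] @ np.linalg.inv(source_base[bone])
            baked[bone] = [delta @ key for key in keys]  # delta baked into every key
        new_clips[clip_name + suffix] = baked            # an independent, editable asset
    return new_clips
```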

    This is also good if the characters are similar in physiology but different in behavior. "These background grunts only run and shoot; they don't need all of the complex systems the player needs to parkour around the map." So don't use the same pawn or animation blueprint. Copy it and cut out what you don't need; it will save on perf. You might then import other grunts that DO use skeleton presets but plug into this new animation system.

    The copy/retarget system can be a bit of a headache to maintain because it is a snapshot of the original system at that moment in time. Any work you need to add to both has to be done in two places, whereas if they share the same skeleton, anims and animation blueprints, you only have to edit one place and all characters are updated.

    Case in point: we want to add look-at to every character. If you copy/retarget, that could be dozens of ABPs (Animation Blueprints) you have to add the exact same nodes to. If it's shared, that is probably one ABP edit and everyone has it.
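
    As a toy illustration of that maintenance cost (just lists of node names, not real Animation Blueprints):

```python
# Toy illustration (lists of node names, not real Animation Blueprints): a shared
# graph gets look-at added once; copied graphs each need the same edit and have
# to be kept in sync by hand.
shared_abp = ["Locomotion", "AimOffset", "UpperBodySlot"]

copied_abps = {
    name: ["Locomotion", "AimOffset", "UpperBodySlot"]
    for name in ["GruntA_ABP", "GruntB_ABP", "GruntC_ABP"]  # ...one per copied character
}

# Shared skeleton/ABP: one edit, every character referencing it gets look-at.
shared_abp.append("LookAt")

# Copy/retargeted ABPs: the exact same nodes have to be added to every copy.
for graph in copied_abps.values():
    graph.append("LookAt")
```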

    SOO... it really depends on the needs of the game and what you're doing.

    Also, real-time control rigs are in Unreal.
    https://docs.unrealengine.com/en-US/Engine/Animation/ControlRig/index.html
    Slightly expensive to use and a little buggy, but there are a lot of potential uses in a lot of cases. I won't get into all that; just know they exist and they can be useful if your game can support it.

    Also, joint culling in the Skeletal Mesh Levels of Detail is possible. Unfortunately it's a fairly new feature and not documented yet, but it works and it's awesome. It's the same implementation as Simplygon. You can delete joints from LODs to reduce the cost of the skeleton, and it assigns the verts to whatever is left of the hierarchy. So you might have a cat with a really long articulated tail in LOD0, and in LOD1 most of the tail joints are removed. Usually this factors into a single mesh at a distance, but you can use joint culling on the lowest LOD to do things like remove fingers and toes from grunts that don't need them. 40+ bones and all of their transforms are just gone.
    Example: you have a giant dinosaur that needs its feet and toes rigged up and animating, but you also have a chicken that uses the same skeleton and doesn't need feet and toes; cut them out without separating the rig or actually deleting the joints.
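
    The weight side of that is essentially this (a simplified sketch of the general technique, not Unreal's or Simplygon's exact implementation):

```python
# Simplified sketch of LOD joint culling (general technique, not Unreal's or
# Simplygon's exact implementation): removed joints hand their skin weights to
# the nearest surviving ancestor, so the verts still follow something.
def cull_joints(skin_weights, parents, removed):
    """skin_weights: list of dicts {joint: weight} per vertex.
    parents: dict joint -> parent joint (None at the root).
    removed: set of joints culled from this LOD."""
    def surviving_ancestor(joint):
        while joint in removed:
            joint = parents[joint]
        return joint

    culled = []
    for weights in skin_weights:
        merged = {}
        for joint, w in weights.items():
            target = surviving_ancestor(joint)       # e.g. Toe_L -> Foot_L
            merged[target] = merged.get(target, 0.0) + w
        culled.append(merged)
    return culled

# e.g. a chicken LOD that drops the toe joints the dinosaur still needs:
parents = {"Foot_L": "Leg_L", "Toe_L": "Foot_L", "Leg_L": None}
weights = [{"Toe_L": 0.7, "Foot_L": 0.3}]
print(cull_joints(weights, parents, {"Toe_L"}))      # [{'Foot_L': 1.0}]
```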

    So yeah, it's really common; that's mostly what they went through to get that set up and working. It sounds like a lot of work but honestly it's not that hard. Skinning a character takes more time.

    Epic put a lot of thought and work into the process and it rocks. I don't look forward to using another engine that is missing all of that... 