
Face capture: What do you suggest?

Lorenzo444 polycounter lvl 5

Hi everyone, I'm working on a project. I have my MetaHuman character modified in Maya; the rig is exactly the same and works perfectly. I'm using a live link between UE5 and Maya, so when I change the controls of my MetaHuman in Maya I can see the result in UE5 as well.

Now my question: is there any face capture software that, while it is capturing my face, drives the MetaHuman controls in Maya through Live Link? I have looked at software like Rokoko and Faceware. The problem is that Faceware Studio apparently has no workflow between the software and Maya, and its plugin only works with Unreal Engine 4. Rokoko uses an iPhone and I don't have any Apple device, but I know they have a plugin that streams to Maya via Live Link using blendshapes while it captures your face; I just don't know whether it works with the MetaHuman controls (I sent them an email).

What do you suggest? Do I need to buy an iPhone? I know Rokoko is working on an Android app too, but there is no update on that. Do you know any other software that could be useful for this kind of Live Link workflow? I also know about Facegood, NVIDIA Audio2Face and iClone, but the first two require retargeting and some fixing before the final result, and iClone also uses an iPhone. Can you help me please? Thank you


Sincerely,


Lorenzo

Replies

  • pior grand marshal polycounter

    While I can't answer most of your questions, I can at least clarify a few things about the iPhone Live Link side of this.

    ARKit + Live Link Face are actually not streaming blendshapes per se - all they do is stream the 52 channels that can drive the corresponding blendshapes of a model if they are present (plus a yaw/pitch/roll rotator for the neck). It sounds obvious, but it's an important point to keep in mind, because it means the channels can be used to drive anything, really. In the case of MetaHumans, if I understand correctly, there is some clever remapping going on that uses this 52-channel information to drive the many facial bones. For instance, the 8 channels used for the eyeballs (up/down/outwards/inwards for each eye) are converted into a rotator that controls the eyeball direction using bones. And even if some of them do turn out to be blendshapes, they are definitely not the 52 default ARKit ones, meaning there is at least some kind of remapping going on. After all, all the official Live Link Face examples from Epic use a combination of bones and blends, even though that is absolutely not necessary.
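
    As a rough illustration of that kind of remapping, here is a small Python sketch that turns the four ARKit eye-look channels for one eye into a yaw/pitch pair for an eyeball joint. The angle ranges are made-up placeholders, not Epic's actual MetaHuman calibration values:

        # Rough sketch: convert ARKit eye-look channel weights (0..1) into a
        # yaw/pitch rotator for one eyeball. The angle ranges below are
        # assumptions for illustration only.
        MAX_YAW_DEG = 30.0    # assumed horizontal range of the eyeball
        MAX_PITCH_DEG = 25.0  # assumed vertical range of the eyeball

        def eye_channels_to_rotator(look_in, look_out, look_up, look_down):
            """Map the four per-eye ARKit channels to (yaw, pitch) in degrees."""
            yaw = (look_out - look_in) * MAX_YAW_DEG
            pitch = (look_up - look_down) * MAX_PITCH_DEG
            return yaw, pitch

        # Example: eye looking mostly outwards and slightly up
        print(eye_channels_to_rotator(0.0, 0.8, 0.2, 0.0))  # -> (24.0, 5.0)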

    Now, whether or not that channel mapping is possible in Maya as opposed to UE, I have no idea. I would assume the channels can be streamed just fine, but the mapping of that information onto the MetaHuman facial bones is probably only possible in UE, using the dedicated plugins/blueprints/rig.
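
    That said, if you did manage to get the channel values into Maya somehow (Rokoko's plugin, a custom client, etc.), applying them would just be a matter of setting attributes on the face rig controls. Here is a minimal Maya Python sketch of that idea; both the channel-to-control table and the attribute names are hypothetical examples and would need to match your actual MetaHuman face board:

        # Minimal Maya Python sketch: push a dict of ARKit channel values onto
        # face-rig control attributes. The control/attribute names below are
        # hypothetical, not the real MetaHuman control names.
        import maya.cmds as cmds

        CHANNEL_TO_CONTROL = {
            "jawOpen":        ("CTRL_C_jaw", "translateY"),
            "browInnerUp":    ("CTRL_L_brow_raiseIn", "translateY"),
            "mouthSmileLeft": ("CTRL_L_mouth_cornerPull", "translateY"),
        }

        def apply_channels(channel_values):
            """channel_values: {arkit_channel_name: weight 0..1} from the capture app."""
            for channel, weight in channel_values.items():
                target = CHANNEL_TO_CONTROL.get(channel)
                if target is None:
                    continue  # channel not mapped to a control yet
                node, attr = target
                if cmds.objExists(node):
                    cmds.setAttr("{}.{}".format(node, attr), weight)

        # Example frame of streamed data
        apply_channels({"jawOpen": 0.35, "mouthSmileLeft": 0.6})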

    On a side note, even though ARKit does require an iPhone with a TrueDepth camera/LiDAR sensor for depth perception, it actually works perfectly well when simply pointing the camera at a video of a performance playing on a monitor. So that makes me think it is only a matter of time until we see some other version of Live Link Face that doesn't require an Apple device.
