[iPhoneX] facial capture to Maya game character [experiment]

effectzero
I've been experimenting with using the iPhone X for facial capture to bring life into the wild, immortal Baby characters I've been working on for our VR game, Bebylon. Since we're a team of about 5, it's pretty hard to justify facial animation for our game at this stage, considering how time-consuming it can be. This is an early test, but it shows a lot of promise for capturing decent data right out of the phone (depending on your quality bar), and it's super duper fast and easy.

I'm pretty confident it can get a whole lot better with a little time spent perfecting the blendshapes (that should fix a lot of the mouth issues). Adding proper wrinkle maps will also go a long way, as will using the captured data to drive secondary blendshapes, which should add more life and expressivity to the character.
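For anyone curious what I mean by secondary shapes: in Maya you can hang a corrective shape off one of the captured channels with a set driven key, so the corrective ramps in automatically as the capture plays back. A rough sketch (the babyFace_BS node and the jawOpen / jawOpenCorrective target names are just placeholders, not our actual rig):

```python
# Rough sketch: let the captured jawOpen coefficient drive a sculpted
# corrective (secondary) blendshape via a set driven key.
# Node and attribute names are placeholders for illustration.
import maya.cmds as cmds

driver = "babyFace_BS.jawOpen"            # channel keyed from the capture
driven = "babyFace_BS.jawOpenCorrective"  # hand-sculpted secondary shape

# Corrective stays off at rest and ramps in as the jaw opens wide.
for driver_value, driven_value in [(0.0, 0.0), (0.6, 0.2), (1.0, 1.0)]:
    cmds.setDrivenKeyframe(driven, currentDriver=driver,
                           driverValue=driver_value, value=driven_value)
```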

The next test is strapping the iPhone to a helmet and combining that with our Xsens mocap suit for a full-body-and-face test. I'm also going to get this data into UE4 and hooked up to the game version of this character.
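As for getting it into UE4: I haven't built that part yet, but the transport can be as simple as firing the per-frame coefficients over UDP and reading them with a custom socket receiver on the engine side. Here's a purely hypothetical sketch of a desktop-side test receiver (the port and the name=value packet format are made up; the phone app and the engine reader just need to agree on something):

```python
# Hypothetical test receiver for blendshape coefficients streamed over UDP.
# The port (9845) and the "name=value;name=value" packet format are
# placeholders; any format both ends agree on would work.
import socket

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9845))

while True:
    packet, sender = sock.recvfrom(4096)
    coeffs = {}
    for pair in packet.decode("utf-8").split(";"):
        if "=" in pair:
            name, value = pair.split("=", 1)
            coeffs[name] = float(value)
    # e.g. watch one channel scroll by while you talk into the phone
    print(coeffs.get("jawOpen", 0.0))
```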

If you guys are interested in this, I'll keep posting progress.

You can skip to the middle, where the Maya section starts.
http://youtu.be/w047Dbo-fGQ


Here's a loose snapshot of the process of getting the data from the iPhone X into Maya.
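To make the Maya side concrete, here's a minimal sketch of bringing the data in, assuming the per-frame ARKit blendshape coefficients were dumped to a CSV (one header row of shape names, one row of 0-1 weights per frame). The file name and the babyFace_BS blendShape node are placeholders, and it assumes the node's targets are named after the ARKit shapes (jawOpen, mouthSmileLeft, etc.):

```python
# Minimal sketch: keyframe ARKit blendshape coefficients onto a Maya
# blendShape node. Assumes a CSV with a header row of ARKit shape names
# and one row of 0-1 weights per frame. Paths and node names are placeholders.
import csv
import maya.cmds as cmds

CSV_PATH = "capture.csv"         # dumped from the phone (placeholder path)
BLENDSHAPE_NODE = "babyFace_BS"  # targets named after the ARKit shapes

with open(CSV_PATH) as f:
    reader = csv.reader(f)
    shape_names = next(reader)   # header: jawOpen, mouthSmileLeft, ...
    for frame, row in enumerate(reader):
        for name, weight in zip(shape_names, row):
            attr = "{}.{}".format(BLENDSHAPE_NODE, name)
            if cmds.objExists(attr):
                # assumes the Maya timeline fps matches the capture fps
                cmds.setKeyframe(attr, time=frame, value=float(weight))
```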

Replies

  • viperthreon
    Hello, I'm not an iPhone user. Does this only work with the iPhone X, or with all iPhone models?
  • Hito
    Looks damn good already! I'm sure it'll look awesome with some tweaks in Maya before the final export to engine.

    How are you getting the captured data out to text? Is this using the new face-recognition emoji system in iPhones? I take it the analysis and conversion to a Maya anim file is your own software?

    Edit: whoop, saw some info in the earlier YouTube vid. Great stuff. I wonder what the analogue is on Android; probably a long way behind, since Google didn't buy out Faceshift. Buggers :D
  • effectzero
    viperthreon said:
    Hello, I'm not an iPhone user. Does this only work with the iPhone X, or with all iPhone models?

    It's using Apple's ARKit API, which for face tracking uses the iPhone X's depth camera, so at the moment that's the only hardware that works with this.
  • effectzero
    Hito said:
    Looks damn good already! I'm sure it'll look awesome with some tweaks in Maya before the final export to engine.

    How are you getting the captured data out to text? Is this using the new face-recognition emoji system in iPhones? I take it the analysis and conversion to a Maya anim file is your own software?

    Edit: whoop, saw some info in the earlier YouTube vid. Great stuff. I wonder what the analogue is on Android; probably a long way behind, since Google didn't buy out Faceshift. Buggers :D

    Thanks, Hito! Even though ARKit and the iPhone X make it easy for this kind of thing, face tracking using a normal phone camera (on Android) should deliver pretty comparable results. But as you say, not as convenient as Faceshift!
  • effectzero
    Following up with a new iPhone X test!
  • Hito
    New test video is great! Yeah, I was just looking at ARCore; still in beta, I guess. It only supports a few devices at the moment.
  • effectzero
    Here's the latest test (#3): this time strapping the iPhone X to a DIY mocap helmet and using our Xsens suit for simultaneous full-body and face capture!
    https://youtu.be/i51CizUXd7A
  • Arturow
    Wow, that's so powerful!!! *cries without a job in the future* haha jk, awesome stuff!