Has anyone successfully used Leap Motion for animating first-person hands? Before investing in it, I was curious to see if anyone had attempted this with any success. I'm currently working on a project that has a considerable number of first-person animations, most of them with props (using scissors, pressing buttons, using a phone, firing a weapon, using a knife), and my animation skills are not the best (I'm a programmer by day, hobby game dev by night).
Replies
I'm not sure it's where I'd spend my money tbh. If you want super high quality animation then you want mocap gloves and some animators to clean it all up. If you're more concerned with getting to 70-80% good enough, then I think you can do it cheaper, with roughly the same coding-time investment, using camera-based approaches.
Google's camera-based hand tracking stuff looks pretty solid to me. I've not worked with it yet, but if it's as good as their face tracking stuff it's well worth a look (it's all part of MediaPipe, I think).
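For reference, a minimal sketch of what driving this from MediaPipe's hands solution could look like in Python. This assumes the `mediapipe` and `opencv-python` packages and a webcam at index 0; the library imports are kept inside the capture function so the flattening helper works standalone:

```python
def landmarks_to_rows(frame_idx, hand_landmarks):
    """Flatten one detected hand's landmarks into (frame, joint_index, x, y, z)
    tuples -- a shape that's easy to export or retarget onto a hand rig."""
    return [(frame_idx, i, lm.x, lm.y, lm.z)
            for i, lm in enumerate(hand_landmarks.landmark)]

def capture_loop(max_frames=300):
    """Record hand landmark rows from the default webcam.

    Assumption: uses MediaPipe's 'solutions.hands' API; imports are local so
    this module can be loaded (and the helper above tested) without the libs.
    """
    import cv2
    import mediapipe as mp

    hands = mp.solutions.hands.Hands(max_num_hands=2,
                                     min_detection_confidence=0.5)
    cap = cv2.VideoCapture(0)
    rows = []
    for frame_idx in range(max_frames):
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB; OpenCV captures BGR.
        result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_hand_landmarks:
            for hand in result.multi_hand_landmarks:
                rows.extend(landmarks_to_rows(frame_idx, hand))
    cap.release()
    hands.close()
    return rows

# Call capture_loop() with a camera attached to collect landmark rows.
```

Landmark x/y come back normalized to the image, and z is relative depth, so you'd still need to map those into your rig's space.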
There's also Intel RealSense - you can think of it as Kinect on steroids. The outgoing generation of depth cameras are cheap at around £150 and are probably good enough. There's a skeletal tracking example in the SDK, iirc.
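Getting depth frames out of a RealSense camera is a few lines with the `pyrealsense2` bindings. A rough sketch, assuming a camera is plugged in (the import is local so the unit-conversion helper works without the SDK installed):

```python
def depth_raw_to_meters(raw_values, depth_scale):
    """Convert raw 16-bit depth units to meters using the camera's depth scale
    (typically 0.001 m per unit, but always query it from the device)."""
    return [v * depth_scale for v in raw_values]

def grab_center_distances(n_frames=30):
    """Print the distance to the centre pixel for a handful of depth frames.

    Assumption: requires the pyrealsense2 package and an attached camera.
    """
    import pyrealsense2 as rs

    pipeline = rs.pipeline()
    profile = pipeline.start()
    scale = profile.get_device().first_depth_sensor().get_depth_scale()
    print(f"depth scale: {scale} m per raw unit")
    for _ in range(n_frames):
        frames = pipeline.wait_for_frames()
        depth = frames.get_depth_frame()
        if depth:
            # get_distance() already returns meters for a given pixel.
            print(depth.get_distance(depth.get_width() // 2,
                                     depth.get_height() // 2))
    pipeline.stop()

# Call grab_center_distances() with a camera attached.
```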
I'd try the google stuff first
You can dump a stream of joint positions out of both systems, and they both have Python bindings, so it should be possible.
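The "stream of positions" part could be as simple as writing per-frame joint positions to CSV for a DCC tool or a custom importer to retarget onto the hand rig. A stdlib-only sketch (the joint names and column layout here are my own convention, not any SDK's format):

```python
import csv
import io

def write_landmark_csv(stream, frames):
    """Write per-frame joint positions as CSV rows.

    `frames` is an iterable of (frame_index, {joint_name: (x, y, z)}).
    Joints are sorted by name so the column order is stable across frames.
    """
    writer = csv.writer(stream)
    writer.writerow(["frame", "joint", "x", "y", "z"])
    for frame_idx, joints in frames:
        for name, (x, y, z) in sorted(joints.items()):
            writer.writerow([frame_idx, name, x, y, z])

# Usage with made-up sample data:
buf = io.StringIO()
write_landmark_csv(buf, [(0, {"wrist": (0.0, 0.1, 0.2),
                              "index_tip": (0.3, 0.4, 0.5)})])
print(buf.getvalue())
```

From there it's a per-engine question of importing the curves and cleaning up jitter, which is where the "70-80% good enough" caveat above comes in.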