Microsoft Kinect Learns to Read Hand Gestures, Minority Report-Style Interface Now Possible

The new Kinect API may replace your mouse with hand motions

Not only is the Microsoft Research Cambridge team finally releasing its 3D modeling API, Kinect Fusion, it's also bringing you gesture control, complete with mouse-style clicks and multi-touch, pinch-to-zoom interactions.

Current Kinect sensors can track the joints of your body, but not finer movements such as hand gestures. That's about to change. Using machine learning, Kinect can now recognize open and closed hands. In the video below, Jamie Shotton, the researcher behind Kinect's skeletal tracking, shows how users can navigate a map or draw in a painting program using only their hands.
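To give a flavor of the machine-learning approach, here is a minimal sketch, in Python, of how a classifier might be trained to label a depth patch around the hand as open or closed. This is not Microsoft's pipeline or the Kinect API; the synthetic depth patches and the random-forest model below are purely illustrative assumptions.

```python
# Hypothetical sketch: classify a cropped depth patch as open vs. closed hand.
# Real systems use far richer features and real depth data; everything here
# (patch size, depth values, classifier choice) is an illustrative assumption.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synthetic_hand_patch(closed: bool) -> np.ndarray:
    """Fake 16x16 depth patch: a closed fist reads as a compact blob,
    an open hand as a wider spread of near-camera pixels."""
    patch = rng.normal(1000, 20, size=(16, 16))  # background depth in mm
    radius = 3 if closed else 6                  # fist is more compact
    yy, xx = np.mgrid[:16, :16]
    mask = (yy - 8) ** 2 + (xx - 8) ** 2 < radius ** 2
    patch[mask] -= 150                           # hand sits closer to the camera
    return patch.ravel()

# Build a labeled dataset: 1 = closed hand, 0 = open hand.
X = np.array([synthetic_hand_patch(c) for c in [True] * 200 + [False] * 200])
y = np.array([1] * 200 + [0] * 200)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```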

As developers explore this new Kinect API capability (and as Microsoft refines gesture recognition beyond open and closed hands), the possibilities look pretty exciting. We're hoping, of course, to finally throw away our mice and use computers like Tom Cruise in Minority Report. The API will be released in the next Kinect for Windows SDK.

And if you're waiting to build a custom Kinect avatar with your hands, without writing a program, you may not have to wait long. The Microsoft Beijing research team demoed its prototype, Body Avatar, which lets you do just that, whether you want to be a five-legged dragon or a one-legged goose. We'll be posting that video soon. Stay tuned.
