Microsoft is developing a new technology to read hand gestures, opening up new potential for game design.
The device, currently named Digits, is worn on the wrist and creates a 3D model of the user's hand gestures. It has been designed to be more comfortable than sensor gloves.
The technology was developed at the University of Cambridge, with help from researchers at Newcastle University and the University of Crete. It was unveiled at a conference on user-interface technology and a video has been posted online demonstrating the device in action.
Digits uses a camera-based sensor that detects infra-red (IR) light, coupled with software that interprets the sensor data to construct a model of a fully articulated hand skeleton. This model is then used to interpret what the user's hand is doing.
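To give a rough sense of what an "articulated hand skeleton" model involves, the sketch below models a single finger as a planar chain of three segments and computes the fingertip position from joint flexion angles. The segment lengths and the whole setup are illustrative assumptions, not details of the actual Digits software.

```python
import math

# Hypothetical sketch: one finger as a planar chain of three segments
# (proximal, middle, distal). Given joint flexion angles, forward
# kinematics yields the fingertip position -- the sort of articulated
# model the Digits software reportedly fits to its sensor data.

SEGMENT_LENGTHS = [4.0, 2.5, 2.0]  # illustrative segment lengths, in cm

def fingertip_position(joint_angles_deg):
    """Return (x, y) of the fingertip for the given flexion angles."""
    x = y = 0.0
    heading = 0.0  # cumulative direction of the chain, in radians
    for length, angle in zip(SEGMENT_LENGTHS, joint_angles_deg):
        heading += math.radians(angle)
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    return x, y

# A straight finger extends to the sum of its segment lengths:
print(fingertip_position([0, 0, 0]))  # (8.5, 0.0)
```

Fitting such a skeleton to camera data is the hard part; this only shows the model being fitted.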
The equipment involves an IR laser that projects a thin, invisible line across the user's hand, measuring the distance to the fingers and thumb to determine how far each is bent.
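The distance-to-flexion mapping described above could, under loose assumptions, be sketched as follows: when a finger curls toward the palm, the point where the laser line hits it moves closer to the wrist-worn sensor, so a measured distance can be interpolated between calibrated "straight" and "fully bent" readings. All the constants here are made up for illustration; they are not Digits' real calibration values.

```python
# Hypothetical sketch of mapping a laser-line distance reading to a
# finger flexion angle. Constants are illustrative assumptions.

STRAIGHT_DISTANCE_CM = 9.0   # reading with the finger fully extended
BENT_DISTANCE_CM = 3.0       # reading with the finger fully curled
MAX_FLEXION_DEG = 90.0       # flexion angle assigned to the bent pose

def estimate_flexion(distance_cm):
    """Map a laser-line distance reading to an estimated flexion angle."""
    span = STRAIGHT_DISTANCE_CM - BENT_DISTANCE_CM
    fraction = (STRAIGHT_DISTANCE_CM - distance_cm) / span
    fraction = min(1.0, max(0.0, fraction))  # clamp noisy readings
    return fraction * MAX_FLEXION_DEG

print(estimate_flexion(9.0))  # 0.0  (finger straight)
print(estimate_flexion(6.0))  # 45.0 (halfway bent)
print(estimate_flexion(3.0))  # 90.0 (fully bent)
```

A real system would need per-user calibration and a non-linear mapping, but the linear version conveys the principle.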
David Kim, the project leader, said: “The Digits sensor doesn’t rely on external infrastructure, which means users are not bound to a fixed space.
“Ultimately we would like to reduce Digits to the size of a watch that can be worn all the time.”
Microsoft has given some ideas of how the device could be used, including shaping your hand like a gun for first-person shooters and pressing your thumb down to fire, deepening immersion without needing the large play space that Kinect requires.
Microsoft’s team acknowledged the current device was still some way from being ready for market. It currently needs to be attached to a PC to carry out the necessary computations, making it impractical for real-world use. It also struggles if two fingers are crossed, if the hand is flattened, or if the user is holding something while making gestures. However, the researchers suggested all these issues could be overcome with further work.