Microsoft has built a wrist-worn sensor that creates 3D models of the user's hand movements in real time.
The Digits prototype is part of an effort to create a mobile device that would allow its owner to control a range of equipment using hand gestures.
The firm said it could be used as a virtual TV control, a way to operate a smartphone while it is in the user's pocket, and to play video games.
It is designed to be less cumbersome and uncomfortable than sensor gloves.
However, some experts question whether consumers would want to wear such a device during their day-to-day activities.
The Digits sensor was developed at Microsoft's computer science laboratory at the University of Cambridge, with help from researchers at Newcastle University and the University of Crete.
It was unveiled at a conference on user-interface technology in Massachusetts, and a video showing off the product has been posted online.
Digits uses a camera-based sensor that detects infrared (IR) light coupled with software that interprets the data produced to construct a model of a "fully articulated hand skeleton". This is then used to interpret what the user's hand is doing.
The equipment includes an IR laser that projects a thin line across the user's hand; by measuring the distance to the fingers and thumb, it determines to what degree each is bent upwards.
In addition, a ring of IR light-emitting diodes (LEDs) is used to illuminate the hand and determine the position of the user's fingertips.
IR light is used because it is invisible to the human eye, so it does not distract the user.
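The sensing approach described above — a laser line for finger flexion and LED-illuminated fingertip positions — can be sketched in rough outline. The snippet below is a hypothetical simplification for illustration only, not Microsoft's actual algorithm: it maps the distance at which the laser line crosses a finger onto a bend angle, on the assumption that a straight finger is crossed near its tip and a curled finger near its knuckle.

```python
# Hypothetical simplification of Digits-style sensing: the IR laser
# projects a line across the hand, and the closer that line crosses
# a finger to the knuckle, the more the finger is curled. The
# constants below are illustrative, not taken from the real device.

FINGER_LENGTH_CM = 8.0   # assumed straight-finger reach past the laser line
MAX_BEND_DEG = 90.0      # assumed angle of a fully curled finger

def bend_angle(crossing_distance_cm: float) -> float:
    """Estimate finger flexion (degrees) from where the laser line
    intersects the finger: a large crossing distance means a straight
    finger, a small one means a curled finger."""
    d = min(max(crossing_distance_cm, 0.0), FINGER_LENGTH_CM)
    return MAX_BEND_DEG * (1.0 - d / FINGER_LENGTH_CM)

print(bend_angle(8.0))  # straight finger -> 0.0 degrees
print(bend_angle(0.0))  # fully curled   -> 90.0 degrees
```

In the real prototype this per-finger measurement feeds a kinematic model of the whole hand — the "fully articulated hand skeleton" the software constructs.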
"The Digits sensor doesn't rely on external infrastructure which means users are not bound to a fixed space," said project leader David Kim.
"They can interact while moving from room to room or running down the street."
He added that the prototype had been built using existing off-the-shelf components, but there was scope to improve the equipment with customised parts.
"Ultimately we would like to reduce Digits to the size of a watch that can be worn all the time," he said.
"We want users to be able to interact spontaneously with their electronic devices using simple gestures and not even have to reach for their devices."
Suggested uses for the equipment include:
- Twisting an imaginary dial to raise the volume of music playing from a radio or TV.
- A user tapping one of their fingers at a make-believe number pad in front of their face to dial a number on a smartphone without having to take it out of their pocket.
- Playing video games without a controller. For instance a player could use their hand as a virtual gun, pointing a finger out to resemble a weapon's barrel and pressing down their thumb to fire a shot. This goes beyond what current gaming sensors can detect.
- 3D-gesture controls for tablet computers. For example, by clenching their fist a user could zoom into an image, while opening their palm would reverse the move.
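In software, the gesture-to-action mappings suggested above would amount to a dispatch layer that routes recognised gestures to device commands. The sketch below is purely illustrative — the gesture names and handler functions are assumptions, not part of any Digits API.

```python
# Illustrative gesture dispatcher for the suggested uses above.
# All names here are hypothetical, not from the Digits prototype.

def raise_volume():
    return "volume raised"        # twisting an imaginary dial

def fire_shot():
    return "shot fired"           # thumb pressed on a virtual gun

def zoom_image():
    return "image zoomed"         # fist clenched over a tablet

GESTURE_ACTIONS = {
    "twist_dial_clockwise": raise_volume,
    "thumb_press": fire_shot,
    "clench_fist": zoom_image,
}

def handle(gesture: str) -> str:
    """Look up a recognised gesture and run its action."""
    action = GESTURE_ACTIONS.get(gesture)
    return action() if action else "unrecognised gesture"

print(handle("clench_fist"))
```

One appeal of such a layer is that the same worn sensor could drive different mappings for the TV, the phone, and the games console, rather than each gadget needing its own gesture hardware.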
The team suggested that an advantage of its device over other gesture sensors was that it could be used with a variety of devices throughout the day.
But another gesture tech researcher suggested that consumers would prefer sensors to be built into the relevant gadgets, avoiding the need to wear equipment.
"This portable, mobile solution is an interesting development with potential for novel applications," Dr Richard Picking, reader in human-computer interaction at Glyndwr University, told the BBC.
"However, similar innovations, such as data-glove technologies, have failed to find mainstream application domains outside the computer games industry, and this may also prove to be the case for Digits.
"It's not clear how reliable the technology is: how accurately does the camera need to be calibrated? What happens if it gets knocked about, or inadvertently moved? How comfortable is it?
"Also, this device doesn't support tactile feedback, as is the case for some data-gloves."
Microsoft's team acknowledged the current device was still some way from being ready for market.
It currently needs to be attached to a PC to carry out the necessary computations, making it impractical for real-world use.
It also struggles if two fingers are crossed, the hand is flattened or if the user is holding something while making the gestures.
However, the researchers suggested all these issues could be overcome with further work.