Last updated at 09:23 GMT, Friday, 26 March 2010
Mark Ward
Technology correspondent, BBC News
Tapping your forearm or hand with a finger could soon be the way you interact with gadgets.
US researchers have found a way to work out where the tap touches and use that to control phones and music players.
Coupled with a tiny projector the system can use the skin as a surface on which to display menu choices, a number pad or a screen.
Early work suggests the system, called Skinput, can be learned with about 20 minutes of training.
"The human body is the ultimate input device," Chris Harrison, Skinput's creator, told BBC News.
Sound solution
He came up with the skin-based input system to overcome the problems of interacting with the gadgets we increasingly tote around.
Gadgets cannot shrink much further, said Mr Harrison, and their miniaturisation is being held back by the way people are forced to interact with them.
The size of human fingers dictates, to a great degree, how small portable devices can get. "We are becoming the bottleneck," said Mr Harrison.
To get around this, Mr Harrison, a PhD student in computer science at Carnegie Mellon, and colleagues Desney Tan and Dan Morris from Microsoft Research, use sensors on the arm to listen for input.
A tap with a finger on the skin scatters useful acoustic signals throughout the arm, he said. Some waves travel along the skin surface and others propagate through the body. Even better, he said, the physiology of the arm makes it straightforward to work out where the skin was touched.
Differences in bone density and arm mass, as well as the "filtering" effects that occur when sound waves travel through soft tissue and joints, make many of the locations on the arm acoustically distinct.
Software coupled with the sensors can be taught which sound means which location. Different functions, such as start, stop, louder and softer, can be bound to different locations. The system can even be used to pick up very subtle movements such as a pinch or muscle twitch.
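The article does not describe the software in any detail, but the basic idea of teaching a classifier which acoustic signature corresponds to which location, and binding each location to a function, can be sketched roughly as follows. This is a minimal, hypothetical Python example, not the researchers' code: the feature vectors are synthetic stand-ins for whatever the arm-mounted sensors actually produce.

# Minimal sketch (hypothetical): map acoustic features of a tap to a forearm
# location with an off-the-shelf classifier, then look up the function bound
# to that location. Real feature extraction from the sensors is assumed and
# replaced here by random feature vectors.
import numpy as np
from sklearn.svm import SVC

# Synthetic training data: one feature vector per recorded tap (for example,
# per-channel signal energy in a few frequency bands), with a label saying
# which of five forearm locations was tapped.
rng = np.random.default_rng(0)
n_taps, n_features, n_locations = 200, 12, 5
X_train = rng.normal(size=(n_taps, n_features))
y_train = rng.integers(0, n_locations, size=n_taps)

# During the short training session the classifier learns which acoustic
# "signature" corresponds to which location.
classifier = SVC(kernel="rbf")
classifier.fit(X_train, y_train)

# Each location is bound to a different function, as described above.
ACTIONS = {0: "start", 1: "stop", 2: "louder", 3: "softer", 4: "next track"}

def handle_tap(features):
    """Classify a new tap and return the action bound to that location."""
    location = int(classifier.predict(features.reshape(1, -1))[0])
    return ACTIONS[location]

# Example: classify one new (synthetic) tap.
print(handle_tap(rng.normal(size=n_features)))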
"The wonderful thing about the human body is that we are familiar with it," said Mr Harrison. "Proprioception means that even if I spin you around in circles and tell you to touch your fingertips behind your back, you'll be able to do it."
"That gives people a lot more accuracy then we have ever had with a mouse," he said.
Early trials show that after a short amount of training the sensor and software system can identify taps on five different locations with accuracy in excess of 95%.
Accuracy does drop when 10 or more locations are used, said Mr Harrison, but having 10 means being able to dial numbers and use the text prediction system that comes as standard on many mobile phones.
The prototype developed by the research team sees the sensors enclosed in a bulky cuff. However, said Mr Harrison, it would be easy to scale them down and put them in a gadget little bigger than a wrist watch.
Mr Harrison said he envisages the device being used in three distinct ways.
The sensors could be coupled with Bluetooth to control a gadget, such as a mobile phone, in a pocket. It could be used to control a music player strapped to the upper arm.
Finally, he said, the sensors could work with a pico-projector that uses the forearm or hand as a display surface. This could show buttons, a hierarchical menu, a number pad or a small screen. Skinput can even be used to play games such as Tetris by tapping on fingers to rotate blocks.
Mr Harrison would not be drawn on how long it might take Skinput to get from the lab to a commercial product. "But," he said, "in the future your hand could be your iPhone and your handset could be watch-sized on your wrist."
From: http://news.bbc.co.uk/2/hi/technology/8587486.stm