Skinput turns your arm into a touchscreen (w/ Video)

An armband projects a user interface onto the skin, giving users a larger touch surface than many mobile devices offer. Credit: Harrison, et al.

(PhysOrg.com) -- If you find yourself getting annoyed at the tiny touchscreens on today's mobile devices, you might be interested in a "new" yet overlooked input surface: yourself. A skin-based interface called Skinput lets users turn their own hands and arms into touchscreens by detecting the distinct ultralow-frequency sounds produced when different parts of the skin are tapped.

Skinput is a collaboration between Chris Harrison at Carnegie Mellon University and Dan Morris and Desney Tan at Microsoft's research lab in Redmond, Washington. The researchers have shown that Skinput lets users simply tap their skin to control audio devices, play games, make phone calls, and navigate hierarchical browsing systems.

In Skinput, a keyboard, menu, or other graphics are beamed onto the user's palm and forearm by a pico projector embedded in an armband. An acoustic detector in the armband then determines which part of the display the user has touched. As the researchers explain, variations in bone density, size, and mass, as well as filtering effects from soft tissues and joints, make different skin locations acoustically distinct. Their software matches the measured sound frequencies to specific skin locations, allowing the system to determine which skin "button" the user pressed.

Currently, the acoustic detector can distinguish five skin locations with an accuracy of 95.5%, which the researchers say provides enough versatility for many mobile applications. The prototype system then uses wireless technology such as Bluetooth to transmit the commands to the device being controlled, such as a phone, iPod, or computer. Twenty volunteers who have tested the system have given positive feedback on the ease of navigation, and the researchers say the system also works well while the user is walking or running.
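
The CHI paper is the authoritative description of how the classification works; purely as an illustration of the idea described above, the minimal Python sketch below shows how a tap's low-frequency spectral "fingerprint" could be matched against per-location templates learned from a short calibration session. The function names, frequency-band edges, and simple nearest-template rule here are assumptions made for clarity, not the Skinput authors' implementation or sensor setup.

# Illustrative sketch only: guess which skin location was tapped by comparing
# the tap's energy in a few low-frequency bands against per-location templates.
# Band edges, names, and the nearest-template rule are assumptions, not Skinput's code.
import numpy as np

BANDS = [(0, 25), (25, 50), (50, 100), (100, 200), (200, 400)]  # Hz

def band_energies(signal: np.ndarray, sample_rate: int) -> np.ndarray:
    """Summarize one recorded tap as normalized energy per frequency band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    feats = np.array([spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in BANDS])
    return feats / (feats.sum() + 1e-12)  # normalize so tap loudness doesn't dominate

def train(taps_by_location: dict[str, list[np.ndarray]], sample_rate: int) -> dict[str, np.ndarray]:
    """Average the feature vectors of labeled calibration taps for each skin location."""
    return {loc: np.mean([band_energies(t, sample_rate) for t in taps], axis=0)
            for loc, taps in taps_by_location.items()}

def classify(tap: np.ndarray, templates: dict[str, np.ndarray], sample_rate: int) -> str:
    """Return the skin location whose template is closest to this tap's features."""
    feats = band_energies(tap, sample_rate)
    return min(templates, key=lambda loc: float(np.linalg.norm(feats - templates[loc])))

In the actual prototype, the armband's sensors feed the researchers' trained recognition software, and the recognized location is then sent wirelessly (e.g., over Bluetooth) as a command to the phone, iPod, or computer being controlled.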

"Skinput turns your arm into a touchscreen (w/ Video)." PHYSorg.com. 1 Mar 2010. http://www.physorg.com/news186681149.html
Page 1/2

As the researchers explain, the motivation for Skinput comes from the increasingly small interactive spaces on today's pocket-sized mobile devices. They note that the human body is an appealing input device not only because we have roughly two square meters of external surface area, but also because much of it is easily accessible by our hands (e.g., arms, upper legs, torso). Furthermore, proprioception, our sense of how our body is configured in three-dimensional space, allows us to accurately interact with our bodies in an eyes-free manner, the researchers write in a recent paper. For example, we can readily flick each of our fingers, touch the tip of our nose, and clap our hands together without visual assistance. Few external input devices can claim this accurate, eyes-free input characteristic while providing such a large interaction area.

In April, the researchers plan to present their work at the Computer-Human Interaction (CHI) conference in Atlanta, Georgia.

More information: Chris Harrison, Desney Tan, Dan Morris. "Skinput: Appropriating the Body as an Input Surface." CHI 2010, April 10-15, 2010, Atlanta, Georgia, USA.

Skinput project: http://www.chrisha ... cts/skinput/

© 2010 PhysOrg.com

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

"Skinput turns your arm into a touchscreen (w/ Video)." PHYSorg.com. 1 Mar 2010. http://www.physorg.com/news186681149.html