Skinput is a collaboration between Chris Harrison at Carnegie Mellon University and Dan Morris and Desney Tan at Microsoft's research lab in Redmond, Washington. Skinput allows users to simply tap their skin in order to control audio devices, play games, make phone calls, and navigate hierarchical browsing systems. It represents one way to decouple input from electronic devices, with the aim of allowing devices to become smaller without simultaneously shrinking the surface area on which input can be performed. While other systems, like SixthSense, have attempted this with computer vision, Skinput employs acoustics, which take advantage of the human body's natural sound-conductive properties (e.g. bone conduction). This allows the body to be annexed as an input surface without the need for the skin to be invasively instrumented with sensors, tracking markers, or other items.

In Skinput, a keyboard, menu, or other graphics are beamed onto a user's palm and forearm from a pico projector embedded in an armband. The location of finger taps on the arm and hand is resolved by analyzing the mechanical vibrations that propagate through the body. These signals are collected using a novel array of sensors worn as an armband. This approach provides an always-available, naturally portable, on-body finger input system.

As electronics get smaller and smaller, they have become more adaptable to being worn on our bodies, but the monitor and keypad/keyboard still have to be big enough for us to operate the equipment. This can defeat the purpose of small devices; Skinput sidesteps the problem with clever acoustics and impact-sensing software. An acoustic detector in the armband determines which part of the display is activated by the user's touch. The detector contains five piezoelectric cantilevers, each weighted to respond to a certain band of sound frequencies.
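The job the five weighted cantilevers do in hardware — each responding most strongly to its own slice of the spectrum — can be illustrated in software. The sketch below measures how much energy a short tap recording carries in each of five low-frequency bands; the band edges are illustrative stand-ins, not the prototype's actual tuning:

```python
import math

# Illustrative band edges (Hz) standing in for the five weighted cantilevers;
# the real sensors' tuning is not specified here.
BANDS_HZ = [(20, 40), (40, 70), (70, 110), (110, 160), (160, 220)]

def band_energies(samples, sample_rate_hz):
    """Naive DFT: signal energy per band (adequate for short tap windows)."""
    n = len(samples)
    energies = []
    for lo, hi in BANDS_HZ:
        energy = 0.0
        for k in range(1, n // 2):
            freq = k * sample_rate_hz / n
            if lo <= freq < hi:
                re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(samples))
                im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(samples))
                energy += re * re + im * im
        energies.append(energy)
    return energies

# A simulated 90 Hz vibration should register almost entirely in the
# third band (70-110 Hz).
tap = [math.sin(2 * math.pi * 90 * i / 1000) for i in range(500)]
energies = band_energies(tap, 1000)
print(energies.index(max(energies)))  # -> 2
```

A real implementation would use an FFT (or simply read the five analog channels directly), but the per-band energy vector is the same kind of feature the weighted sensor array produces.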
Different combinations of the sensors are activated to differing degrees depending on where the arm is tapped. As the researchers explain, variations in bone density, size, and mass, as well as filtering effects from soft tissues and joints, mean different skin locations are acoustically distinct. Their software matches sound frequencies to specific skin locations, allowing the system to determine which ‘skin button’ the user pressed.
Literature survey

Touch technology — from keyboards to touch screens on the latest gadgets — is nothing new. But Chris Harrison, a graduate student at the Human-Computer Interaction Institute at Carnegie Mellon University, has taken the concept a step further with something called "Skinput." Skinput — which marries the words "skin" and "input" — is a technology with specialized sensors that gauge vibrations happening inside the human body. A tap on the wrist or finger can be read and classified as a certain command and then matched with interactive activities. The system could use wireless technology like Bluetooth to transmit commands to many types of device, ranging from smartphones and iPods to video game devices and even PCs.

Mr. Harrison first came up with the idea while working as an intern with Microsoft. Inspiration came from the term "device size paradox": the common requirement of any device is that it should be small so that it can be portable, but it should be large so that it can be usable. The motivation for Skinput thus comes from the increasingly small interactive spaces on today's pocket-sized mobile devices. The invention of Skinput came about during the exploration of ideas like using other surfaces – ranging from tabletops to the skin on one's arm – as a keypad for various electronics.

The researchers note that the human body is an appealing input device "not only because we have roughly two square meters of external surface area, but also because much of it is easily accessible by our hands (e.g. arms, upper legs, torso)." Furthermore, proprioception – our sense of how our body is configured in three-dimensional space – allows us to accurately interact with our bodies in an eyes-free manner. Proprioception means that even if someone is spun around in circles and asked to touch their fingertips behind their back, they will be able to do it; that gives people far more accuracy than we have ever had with a mouse. Few external input devices can claim this accurate, eyes-free input characteristic and provide such a large interaction area. The main advantage of the human body as an input surface is its familiarity.

When work on Skinput started, the developers weren't sure if it would be possible to turn the human arm into a virtual keypad. They tried clipping sensors to the ends of people's fingers and other strange configurations that made users feel like cyborgs. Right now, the device that senses the vibration cues is attached to the upper arm. The technology is currently only 96% accurate, with the developers working to get its accuracy closer to 99%, and it will need to become more wearer-friendly if it is to be useful.

How "Skinput" works

Skinput turns a user's own body into a touch interface for electronics, allowing people to literally control their gear by touching themselves. It uses what are called bio-acoustic sensors and tiny projectors worn in armbands. The projectors beam images onto the skin of your forearm, and when you tap on them, ripples run through your skin and bones: a tap with a finger on the skin scatters useful acoustic signals throughout the arm. Some waves travel along the skin surface and others propagate through the body. Sensors pick up on the waves and convey the commands to your device. Due to differences in bone density, tissue mass and muscle size, as well as the "filtering" effects that occur when sound waves travel through soft tissue and joints, many locations on the arm are acoustically distinct, and unique acoustic signatures can be identified for particular parts of the arm or hand (including fingers). Software coupled with the sensors can be taught which sound means which location, and different functions – start, stop, louder, softer – can be bound to different locations. The system can even be used to pick up very subtle movements such as a pinch or muscle twitch. Even better, the physiology of the arm makes it straightforward to work out where the skin was touched. The added pico projector is there just for convenience: the keypad need not even be visible if the user simply memorizes the five input points.

Fig. 3.1. Skinput Device

The device can be envisaged being used in three distinct ways:
i. The sensors could be coupled with Bluetooth to control a gadget, such as a mobile phone, in a pocket, which you don't have to bother to touch.
ii. It could be used to control a music player strapped to the upper arm.
iii. Finally, the sensors could work with a pico-projector that uses the forearm or hand as a display surface. This could show buttons, a hierarchical menu, a number pad or a small screen.

With Skinput, computing is always available – particularly since the band works even if you're running. A person might walk toward their home, tap their palm to unlock the door and then tap some virtual buttons on their arm to turn on the TV and start flipping through channels. But the most profound achievement of Skinput is proving that the human body can be used as a sensor, making computing accessible to people in a way that never would have been possible before.
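Binding different functions to different tap locations – for example, controlling a music player – can be sketched as a simple dispatch table. The location names and commands below are invented for illustration; a real system would forward the commands over Bluetooth to the target device:

```python
# Hypothetical mapping from classified tap locations to player commands.
BINDINGS = {
    "palm": "play_pause",
    "wrist": "next_track",
    "forearm": "volume_up",
}

class MusicPlayer:
    """Stand-in for a device that would receive commands, e.g. over Bluetooth."""
    def __init__(self):
        self.playing = False
        self.volume = 5
        self.track = 0

    def handle(self, command):
        if command == "play_pause":
            self.playing = not self.playing
        elif command == "next_track":
            self.track += 1
        elif command == "volume_up":
            self.volume += 1

def on_tap(location, player):
    """Dispatch a classified tap to whatever function is bound to it."""
    command = BINDINGS.get(location)
    if command is not None:
        player.handle(command)

player = MusicPlayer()
on_tap("palm", player)     # toggles playback on
on_tap("forearm", player)  # raises volume from 5 to 6
print(player.playing, player.volume)  # -> True 6
```

Because the bindings live in a table rather than in the sensing code, the same five "skin buttons" can be rebound to a phone, a TV remote, or a game controller without retraining the classifier.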
Detailed description of Skinput

The primary goal of Skinput is to provide an always-available mobile input system – that is, an input system that does not require a user to carry or pick up a device. A number of alternative approaches have been proposed that operate in this space. These devices seek to move beyond the mouse and physical keyboard, letting people communicate with their gadgets by gesturing, using sign language or, in the case of Skinput, tapping on their hands, fingers and forearms. Techniques based on computer vision are popular; these, however, are computationally expensive and error-prone in mobile scenarios (where, e.g., non-input optical flow is prevalent). Speech input is a logical choice for always-available input, but it is limited in its precision in unpredictable acoustic environments, and it suffers from privacy and scalability issues in shared environments.

Bio-sensing

Skinput leverages the natural acoustic conduction properties of the human body to provide an input system, and is thus related to previous work in the use of biological signals for computer input. Bone conduction microphones and headphones – now common consumer technologies – represent an additional bio-sensing technology that is relevant to the present work. These leverage the fact that sound frequencies relevant to human speech propagate well through bone. Bone conduction microphones are typically worn near the ear, where they can sense vibrations propagating from the mouth and larynx during speech. While bone conduction microphones might seem a suitable choice for Skinput, these devices are typically engineered for capturing human voice, and filter out energy below the range of human speech (whose lowest frequency is around 85 Hz). Thus most sensors in this category are not especially sensitive to lower-frequency signals (e.g. 25 Hz), which are vital in characterizing finger taps.

Skinput

To expand the range of sensing modalities for always-available input systems, Skinput is introduced: a novel input technique that allows the skin to be used as a finger input surface. In this prototype system, the focus is on the arm. To capture the relevant acoustic information, a wearable armband that is non-invasive and easily removable was developed.

Bio-Acoustics

In this section, the mechanical phenomena that enable Skinput, with a specific focus on the mechanical properties of the arm, are discussed. When a finger taps the skin, several distinct forms of acoustic energy are produced. Some energy is radiated into the air as sound waves; this energy is not captured by the Skinput system. Among the acoustic energy transmitted through the arm, the most readily visible are transverse waves, created by the displacement of the skin from a finger impact. When shot with a high-speed camera, these appear as ripples, which propagate outward from the point of contact. The amplitude of these ripples is correlated to both the tapping force and the volume and compliance of soft tissues under the impact area. In addition, finger impacts create longitudinal (compressive) waves that cause internal skeletal structures to vibrate; this, in turn, creates longitudinal waves that emanate outwards from the bone (along its entire length) toward the skin.

Roughly speaking, higher frequencies propagate more readily through bone than through soft tissue, and bone conduction carries energy over larger distances than soft tissue conduction. Bones are held together by ligaments, and joints often include additional biological structures such as fluid cavities. This makes joints behave as acoustic filters, and it has been shown that joints play an important role in making tapped locations acoustically distinct. While Skinput does not explicitly model the specific mechanisms of conduction, or depend on these mechanisms for analysis, the success of this technique depends on the complex acoustic patterns that result from mixtures of these modalities.

Sensing

To capture this rich variety of acoustic information, many sensing technologies were evaluated – including bone conduction microphones, conventional microphones coupled with stethoscopes, piezo contact microphones, and accelerometers – before settling on an array of highly tuned vibration sensors. Specifically, small cantilevered piezo films were employed. By adding small weights to the end of each cantilever, its resonant frequency was altered, allowing the sensing element to be responsive to a unique, narrow, low-frequency band of the acoustic spectrum. The sensors are highly responsive to motion perpendicular to the skin plane – perfect for capturing transverse surface waves and longitudinal waves emanating from interior structures. Additionally, the cantilevered sensors are naturally insensitive to forces parallel to the skin (e.g. shearing motions caused by stretching). Thus, the skin stretch induced by many routine movements (e.g. reaching for a doorknob) tends to be attenuated.

Fig. Response curve (relative sensitivity) of a sensing element that resonates at 78 Hz.

Armband Prototype

The decision to have two sensor packages was motivated by the focus on the arm for input. When placed on the upper arm (above the elbow), acoustic information was collected from the fleshy bicep area in addition to the firmer area on the underside of the arm, which has better acoustic coupling to the Humerus, the main bone that runs from shoulder to elbow. When the sensor packages were placed below the elbow, on the forearm, one package was located near the Radius, the bone that runs from the lateral side of the elbow to the thumb side of the wrist, and the other near the Ulna, which runs parallel to this on the medial side of the arm closest to the body.
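The effect of weighting the cantilever tips follows the usual spring-mass relation f = (1/2π)·√(k/m): adding tip mass lowers the resonant frequency. The stiffness and mass values below are purely illustrative (chosen so the results land near the 78 Hz element shown in the response curve and the 25 Hz signals mentioned earlier), not measurements of the actual films:

```python
import math

def resonant_frequency_hz(stiffness_n_per_m, mass_kg):
    """Idealized spring-mass resonator: f = sqrt(k/m) / (2*pi)."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

K = 24.0  # illustrative effective stiffness of the piezo film, in N/m

light = resonant_frequency_hz(K, 0.0001)  # 0.1 g effective tip mass
heavy = resonant_frequency_hz(K, 0.001)   # 1.0 g tip mass

# Ten times the mass divides the resonant frequency by sqrt(10) (~3.16x),
# which is how one film design can be retuned across several narrow bands
# simply by changing the added weight.
print(round(light), round(heavy))  # -> 78 25
```

The square-root dependence explains why only modest weights are needed to spread five copies of the same film across the low-frequency range relevant to finger taps.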
Fig. 4.1. The Armband

Conclusion

In this paper, we presented our work on Skinput – a method that allows the body to be appropriated for finger input using a novel, non-invasive, wearable bio-acoustic sensor. Skinput will make computing accessible to people in a way that never would have been possible before; with Skinput, computing is always available, literally. We have studied the following points:
i. We described the design of a novel, wearable sensor for bio-acoustic signal acquisition.
ii. We described an analysis approach that enables our system to resolve the location of finger taps on the body.
iii. We assessed the robustness and limitations of this system.
iv. We explored the broader space of bio-acoustic input through prototype applications.

References

i. Harrison, C., Tan, D., and Morris, D. "Skinput: Appropriating the Body as an Input Surface". In Proceedings of the ACM CHI Conference, 2010.
ii. Amento, B., Hill, W., and Terveen, L. "The Sound of One Hand: A Wrist-mounted Bioacoustic Fingertip Gesture Interface". In CHI '02 Ext. Abstracts, 724-725.
iii. http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5754300
iv. http://research.microsoft.com/redmond/groups/publications/harrisonskinputchi2010.pdf
v. http://www.newscientist.com/article/body-acoustics-can-turn-your-arm-into-touchscreen.html
vi. http://www.engadget.com/2010/03/02/skinput-touchscreens-never-felt-right-anyway/

A Research Paper on Skinput

Submitted by:
Swetha Iyer (21)
Nikita Neelam (42)
Sowmya Puranam (57)
Neety Sharma (61)

Usha Mittal Institute of Technology
S.N.D.T. Women's University
Mumbai - 400 049
2011-2012

LIST OF FIGURES
3.1 Skinput Device
4.1 The Armband
Abstract

Skinput is an input technology that uses bio-acoustic sensing to localize finger taps on the skin. When augmented with a pico projector, the device can provide a direct-manipulation, graphical user interface on the body. The technology was developed by Chris Harrison, Desney Tan, and Dan Morris at Microsoft Research's Computational User Experience Group. Skinput represents one way to decouple input from electronic devices with the aim of allowing devices to become smaller without simultaneously shrinking the surface area on which input can be performed. While other systems, like SixthSense, have attempted this with computer vision, Skinput employs acoustics, which take advantage of the human body's natural sound-conductive properties. This allows the body to be annexed as an input surface without the need for the skin to be invasively instrumented with sensors, tracking markers, or other items.