

The Advancement of Tactile Assistive Devices

for the Deaf and Hard of Hearing

Katelyn Pearson

Howard High School


The Advancement of Tactile Assistive Devices

for the Deaf and Hard of Hearing

Though the use of vibration is being increasingly recognized and utilized in sensory aids

for the hearing impaired, there is still an abundant amount of room for improvement in

applications. In particular, vibrotactile aids and tactile alerting devices are especially useful to

assist deaf people. To make these aids more efficient, it is desirable to

understand the accessibility and availability of potential vibration locations, the body’s

sensitivity to vibrations, and the deaf brain’s neurological structure.

Tactile assistive devices for deaf people are most commonly used to alert the user of

surrounding sound, as is done with vibrotactile aids that apply vibration to the skin, or to portray

a message of an alarm or alert. These devices can be used alone or in conjunction with devices

such as hearing aids and cochlear implants. While the latter are very important and improve the

lives of many, vibrating assistive devices are also very important due to the immense number of

people who rely on them. The number of adults in the United States who use hearing aids is

much lower than what most people seem to believe, at 16.2% as of 2006 ("Text Description,"

2012). This may be because of a variety of reasons such as price, medical limitations, stigma, or

cultural preferences. In any event, alert devices tend to be much cheaper and less invasive, and

consequently more accessible to many people.

Accessibility and availability are key factors when developing a device to help people;

little would be accomplished if an assistive device is too difficult to use. So when developing a

new aid and choosing a body location for its placement, it is imperative to be aware of the

physical and practical limitations each body location may present. For example, research on

vibrotactile aids and prototypes reveals some that are placed on body locations that may hinder

everyday life or cause discomfort. This seems to be the case with a vibrotactile aid under

development by Colorado State University students that applies vibrations to the tongue to alert the

user of surrounding sounds (Sparks, 2015). At the risk of stating the obvious, not only does this

seem uncomfortable, but it would also interfere with how a person eats, drinks, and utilizes facial

expressions. A similar problem is presented by the Mood Glove, a device intended to enhance a

wearer’s perception of movies, which is especially beneficial to deaf and hard of hearing people.

The Mood Glove works by applying vibrotactile stimulus to the hand of the wearer,

synchronized with the movie effects, in an effort to heighten perceived emotions to amplify the

film’s mood (Mazzoni & Bryan-Kinns, 2016, p. 9). Unfortunately, wearing the Mood Glove may

hinder activities such as eating, drinking, and possibly communicating with sign language.


Another factor and potential obstacle is that of size. Effective devices generally need to

be compact, since designs that span a large portion of the body are impractical for everyday use.

This factor especially applies to vibrotactile aids due to the complex messages they attempt to

convey. A prime example of this is a vibrotactile aid prototype with sensors all over the

arms that is being developed despite the fact it may interfere with clothing such as long sleeve

shirts and coats and be uncomfortable to wear (Khoo, Knapp, Palmer, Ro, & Zhu, 2013, p. 105).

The size challenge is also revealed in a vibrotactile aid in the form of a vest (Eagleman, 2015).

Though this device may seem to be comfortable and effective, the back of the device appears too

bulky, once again interfering with clothing.


Such factors should be taken into consideration when attempting to improve existing aids

or develop new ones. Two common vibrating alerting devices are vibrating alarm clocks and

vibrating baby monitors worn by a user, typically a deaf person ("Shake-n-wake Vibrating,"

2017; Taylor, 2017). Vibrating alarm clocks are generally designed to be worn on the wrist or as

a bed shaker, and vibrating baby monitors are also generally designed to be worn on the wrist.

The location availability aspect can easily be overlooked in favor of other design concerns.

Since both devices are typically worn on the wrist, the same analysis applies to each. While the location

(placement) of such aids might be considered good from the standpoint of practicality to the

wearer, there still may be room for improvement in considering other body locations for devices

to be worn from the standpoint of body sensitivity and perception.

Sensitivity to vibration varies throughout the body depending on the location.

When considering vibrotactile sensitivity, the body parts ranked most to least sensitive are the

hands, soles of feet, larynx region, abdomen, head region, and gluteal region (Weinstein &

Weinstein, 1964). In addition to considering sensitivity to vibration, a designer should also

consider sensitivity to pressure, so that a device being worn does not become uncomfortable.

This is especially true for devices that convey messages using vibrations at multiple different

body locations. For sensitivity to pressure, the body parts ranked from most to least sensitive are

the forehead and rest of the face, trunk, fingers, and lower extremities (Weinstein & Weinstein,

1964). For sensitivity to point localization, the body parts ranked from most to least sensitive are

the face region, fingers, hallux, palms, abdomen, arms, lower legs, upper chest, and thigh

(Department of Defense, 2007).
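For a device designer, rankings like these can be treated as a simple lookup when weighing candidate placements. The sketch below is purely illustrative: the site names mirror the vibration ranking cited above (Weinstein & Weinstein, 1964), but the numeric ranks are stand-ins for ordering, not measured sensitivity thresholds.

```python
# Illustrative sketch: choosing a body site for a vibrotactile aid from the
# vibration-sensitivity ordering cited above (Weinstein & Weinstein, 1964).
# The numeric ranks are hypothetical stand-ins, not measured thresholds.

# Lower rank = more sensitive to vibration.
VIBRATION_RANK = {
    "hands": 1,
    "soles_of_feet": 2,
    "larynx_region": 3,
    "abdomen": 4,
    "head_region": 5,
    "gluteal_region": 6,
}

def best_available_site(candidate_sites):
    """Return the most vibration-sensitive site among those a design allows,
    or None if no candidate appears in the ranking."""
    ranked = [s for s in candidate_sites if s in VIBRATION_RANK]
    if not ranked:
        return None
    return min(ranked, key=VIBRATION_RANK.get)

# A design that rules out the hands (often occupied) might compare these:
print(best_available_site(["abdomen", "head_region"]))  # abdomen
```

A real design process would of course also weigh the practicality and pressure-sensitivity considerations discussed in the text, not vibration sensitivity alone.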


The variations in sensitivity to vibration and pressure are due to the presence and

variability in mechanoreceptors in the skin, of which there are four main types: Meissner’s

corpuscles, Pacinian corpuscles, Merkel cells, and Ruffini's corpuscles (Purves & Williams,

2001). These mechanoreceptors sense tactile input and send the signals to the brain for analysis.

Meissner’s corpuscles account for most tactile perception on areas with glabrous skin such as the

fingers, palms, and soles. They are specialized to detect low frequency vibrations, 30-50 Hz, and

textured objects that move across the skin (Purves & Williams, 2001). Pacinian corpuscles are

not abundant in number, but are found on many of the same locations as Meissner’s corpuscles.

Pacinian corpuscles are made to detect much higher vibrations, 250-350 Hz, and finer textures

than Meissner’s corpuscles are ("Pacinian Corpuscle," n.d.). Merkel cells, found in the

epidermis, are essential for detecting light-touch sensations (Purves & Williams, 2001). Ruffini’s

corpuscles are elongated mechanoreceptors located deep in the skin, ligaments, and tendons.

They are slowly adapting and respond when skin is stretched and when bones change position in

joints (Parvizi, 2010, p. 315). The uneven distribution of these mechanoreceptors is responsible

for the differences in sensitivity among body locations. This makes some areas more sensitive than

others, regardless of the mechanoreceptor receiving information (Purves & Williams, 2001).
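The frequency bands above suggest a simple design check: a vibration motor’s drive frequency determines which receptor population it chiefly stimulates. The following sketch is a hypothetical illustration using the Meissner’s (roughly 30–50 Hz) and Pacinian (roughly 250–350 Hz) bands from the text; the function name and exact band boundaries are assumptions for illustration only.

```python
# Illustrative sketch: matching a motor's drive frequency to the
# mechanoreceptor band it chiefly stimulates. Bands follow the text
# (Meissner's corpuscles ~30-50 Hz; Pacinian corpuscles ~250-350 Hz);
# the exact boundaries here are assumptions for illustration.

RECEPTOR_BANDS_HZ = {
    "meissner": (30, 50),
    "pacinian": (250, 350),
}

def likely_receptor(freq_hz):
    """Return the receptor whose preferred band contains freq_hz, else None."""
    for receptor, (lo, hi) in RECEPTOR_BANDS_HZ.items():
        if lo <= freq_hz <= hi:
            return receptor
    return None

print(likely_receptor(40))   # meissner
print(likely_receptor(300))  # pacinian
```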

One might wonder if these mechanoreceptors tire out if stimulated too long. For instance,

can mechanoreceptors experience sensory overload or fatigue due to prolonged or repeated

vibrations on the skin? After all, this happens with smell when the nose experiences olfactory

fatigue. Olfactory fatigue occurs when the sensory receptors are bombarded by a smell and

become accustomed to it. The individual experiencing this phenomenon becomes less aware, or

entirely unaware, that the source of the smell is still present ("Lab 2: Olfactory," 2003). While secondary

research has not revealed evidence of a tactile adaptation parallel to olfactory fatigue,

mechanical vibrations have been shown to elicit muscle fatigue, where the power behind muscle

contractions decreases (Bennebach, Rognon, & Bardou, 2013). In any event, for short-lived

vibrations, any possible fatigue issue should be minimal.

While the research on the body’s sensitivity to vibration and the brain’s perception of

vibration is helpful, it must be remembered that research on those with normal hearing might not

apply to the hearing impaired. This is because, though similar, there are several neurological

differences between the brains of deaf people and those with normal hearing, namely differences

in sensory compensation, left brain activity, and white and grey matter. These

differences discussed below have varying potential to affect the input and processing of sensory

stimuli and need to be taken into consideration during the development of assistive devices

utilizing vibrations for deaf people.

Sensory compensation occurs when one sense is lost, and the others are heightened.

Hearing loss is no exception. It has been shown that the auditory cortex is reapportioned in

the brains of deaf people to process visual and haptic input, rather than auditory input as is the

case with people with normal hearing (Napoli, 2014, p. 1). This makes sense, and studies have

shown that deaf people have better peripheral vision, aiding in the interpretation of sign

language, the most common replacement for spoken language (Bavelier, Dye, & Hauser, 2006,

p. 512). Signers split their focus between the hand signs, body language, and facial expressions

of their partner in a conversation. Therefore, when communicating with a deaf person through a

translator, it is important for hearing people to look at the person whom they are conversing with

out of respect. This gesture will likely be noticed and appreciated even as the deaf participant is

watching the translator (Siple, Greer, & Holcomb, 2004).

Deaf people also display heightened left brain activity compared to those with normal

hearing. Stimulation from sign language affects the brain’s use and structure in deaf people

(Livadas, 2011). For example, more of the left parietal lobe is used for signed

languages compared to spoken languages (Macsweeney, Capek, Campbell, & Woll, 2008, p.

438). The parietal lobes are used for the spatial processing of visual input and the integration of

somatosensory information. It is known that damage to the left parietal lobe can result in the

development of Gerstmann's Syndrome, a cognitive impairment that affects the ability to write,

differentiate between left and right, understand arithmetic calculations and concepts, and identify

fingers ("Gerstmann's Syndrome," n.d.).

In addition, studies have found that the auditory cortexes of deaf people have less white matter than

those of people with normal hearing ("Brain Anatomy," 2014). The auditory cortex, located in

the temporal lobe, is used by people with normal hearing to process sounds transmitted from the

cochlea. Specific uses vary from processing natural sounds to deciphering the amplitude or

frequency of sound waves (Purves & Williams, 2001). It is perhaps not surprising that the brains

of deaf people would therefore possess less white matter in this area. Instead, those who are deaf

from birth and learn to sign at a young age are found to have more gray matter, which is neuronal

cell matter that processes information in the brain, in the right superior frontal gyrus (Olulade,

Koo, LaSasso, & Eden, 2014, p. 5617).

The end result is that deaf people have an increased tactile sensitivity compared to

hearing people (Levänen & Hamdorf, 2001, p. 75). This is supported by the fact that more of the

brain is used for tactile response in deaf people compared to hearing people, and that deaf

people experience a more intense response to vibration in the secondary auditory cortex than

hearing people (Good, Reed, & Russo, 2014, p. 560; Auer, Jr., Bernstein, Sungkarat, & Singh,

2007, p. 645). Recent research suggests that, for deaf people, the auditory cortex is used to

process somatosensory input (touch and other skin sensation), more so than visual input, contrary

to what has been commonly accepted, which may partly explain the increased tactile sensitivity

(Karns, Dow, & Neville, 2012, p. 9626).

The information presented has provided a foundation from which to understand some

important aspects of tactile perception and improve the effectiveness of tactile assistive devices

for the deaf. Such aids help many people in their daily lives, and continuing to improve these

devices for future users is imperative. Improvements should focus on the practicality and

sensitivity of the body location, the latter of which may differ for deaf people compared to those

with normal hearing and should be investigated. Assistive devices grant independence and

security to many people, and continued improvements and developments for those devices can

make a significant difference for those who need them.


References

Auer, Jr., E. T., Bernstein, L. E., Sungkarat, W., & Singh, M. (2007). Vibrotactile activation of

the auditory cortices in deaf versus hearing adults. Neuroreport, 18(7), 645-648.

This article states that deaf people experience a more intense response to vibrotactile

stimulation in the secondary auditory cortex than hearing people do.

Bavelier, D., Dye, M. W. G., & Hauser, P. C. (2006). Do deaf individuals see better? Trends in

Cognitive Sciences, 10(11), 512-518.

This journal article examines the relationship between sight and hearing loss. It states

that deaf people have better peripheral vision.

Bennebach, M., Rognon, H., & Bardou, O. (2013). Fatigue of structures in mechanical vibratory

environment: From mission profiling to fatigue life prediction. Procedia Engineering, 66,


This article examines the effect of vibration on body parts and states that mechanical

vibrations have been shown to cause muscle fatigue.

Brain anatomy differences between deaf, hearing depend on first language learned. (2014, April 14).

Retrieved April 3, 2017, from Georgetown University Medical Center website:


This article studies the differences between deaf and hearing brains. It states that there

is less white matter in the auditory cortex of deaf people.


Department of Defense Army Research Laboratory. (2007, May). The tactile modality: A review

of tactile sensitivity and human tactile interfaces (Technical Report No. ARL-TR-4115)

(K. Myles & M. S. Binseel, Authors). Retrieved from

This report analyzed how tactile stimulation is perceived by the body to help in the

creation of tactile aids. It is first stated that when a mechanism only utilizes one sense,

it can result in a sensory overload creating a need for multimodal devices. The two

main uses of tactile input are to emphasize vision or hearing and to act as an

independent information source. When vibrations are being used, they are best felt on

Pacinian tissue and on hairy, bony areas rather than fleshy, smooth ones. When

placing vibrating motors it is important to not place them too close together if they are

meant to be felt as separate inputs, though the amount of space needed varies among

different parts of the body. On the head, the face is the most sensitive followed by the

scalp, forehead, and then the temples. The hands are also recognized as being very

sensitive, however they are usually occupied causing developers to look at other areas

of the body for stimulation areas.

This article provides useful information that will directly help in the creation of the

experiment, including specifics on how close motors can be to each other and

graphical analysis of vibration detection. The next step is to find more information

along these lines to create a better understanding of the human body’s reaction to vibration.


Eagleman, D. (2015). Sensory substitution. Retrieved June 12, 2017, from David Eagleman


This website details the work of David Eagleman, specifically his vibrotactile vest

designed for deaf and hearing impaired people.

Gerstmann’s syndrome information page. (n.d.). Retrieved from


This page defines Gerstmann’s Syndrome in detail.

Good, A., Reed, M. J., & Russo, F. A. (2014). Compensatory plasticity in the deaf brain: Effects

on perception of music. Brain Sciences, 4(4), 560-574.

This article states that more of the deaf brain is used for tactile processing than the

hearing brain.

Karns, C. M., Dow, M. W., & Neville, H. J. (2012). Altered cross-modal processing in the

primary auditory cortex of congenitally deaf adults: A visual-somatosensory fMRI study

with a double-flash illusion. Journal of Neuroscience, 32(28), 9626-9638.

This article claims that the auditory cortex of deaf people is primarily used to process

tactile information, not visual input as is commonly accepted.

Khoo, W. L., Knapp, J., Palmer, F., Ro, T., & Zhu, Z. (2013). Designing and testing wearable

range vibrotactile devices. Journal of Assistive Technologies, 7(2), 102-117.

This journal article tests how off the shelf vibration motors and materials can be used

to make a useful assistive device. The motors are placed along the arms and used for conveying range information.


Lab 2: Olfactory fatigue and memory. (2003, October 6). Retrieved from Evergreen State

College website:

This source is a lab for a college science class, teaching about olfactory fatigue. The

background information was used and the lab activity not performed.

Levänen, S., & Hamdorf, D. (2001). Feeling vibrations: Enhanced tactile sensitivity in

congenitally deaf humans [Abstract]. Neuroscience Letters, 301(1), 75-77.

This article claims that deaf people have an increased tactile sensitivity compared to

hearing people.

Livadas, G. (2011, November). Unlocking the mysteries of the deaf brain. Retrieved November

16, 2016, from Rochester Institute of Technology website:

This article highlights key topics researched and conclusions made by Peter Hauser

and his team who focus on why the deaf brain differs from the hearing brain and how

sign language affects thought process. Little is understood about the human brain, but

Hauser emphasizes that deaf people may require alternative diagnosis treatment to

medical problems such as strokes. The other point Hauser emphasizes is that growing

up, deaf children require an environment based on visual learning in order to achieve

their maximum potential. As deaf children grow older, they become more aware of

their peripheral surroundings than their hearing peers, so an education based on visuals

allows them to excel and fully utilize their heightened sight processing. Hauser has

also concluded that when deaf children grow up in hearing families, their language

skills are not as advanced which inhibits executive functions. With his team, Hauser

wishes to encourage education strategies in the deaf community that will aid the

children’s development and optimize their potential.

This article focuses more on why the deaf brain is different from the hearing brain

rather than how. This makes it a good source to help bridge the gap between

mechanical innovations that assist deaf people to the biological differences between

deaf and hearing brains which is the next step of the research.

Macsweeney, M., Capek, C. M., Campbell, R., & Woll, B. (2008). The signing brain: The

neurobiology of sign language. Trends in Cognitive Sciences, 12(11), 432-440.

This journal article talks about the neurology of the deaf brain. It mentions that more

of the left parietal lobe is used for signed languages than it is for spoken languages.

Mazzoni, A., & Bryan-Kinns, N. (2016). Mood glove: A haptic wearable prototype system to

enhance mood music in film. Entertainment Computing, 17, 9-17.

This article explores how vibrations can be used to amplify emotions felt during

movies using a device called the Mood Glove. Influence for this project came from

sources such as Music For Bodies and vibrations used in video games. Rather than

using a chair as has been done in previous experiments, in this investigation a glove is

used as the medium to transmit vibrations. There are three steps to this experiment: the

annotation of movie clips for mood, the study of how vibration patterns are received

by participants, and the study of how the combination of vibrations and movie clips

affect the mood of participants. Through this process it was discovered that placement

of vibrations on the hand has no effect on mood but the intensity and frequency do.

The higher the intensity and frequency, the more pleasurable feelings are heightened.

With lower intensity and frequency, less enthused emotions can be felt.

Information on the effectiveness of vibrations, such as that different places on the

hand do not elicit different responses when vibrations are applied, was provided by

this article. This information will be useful when determining where to place the

vibrators. The next step is to research more in depth the sensitive locations on the body.


Napoli, D. J. (2014). A magic touch: Deaf gain and the benefits of tactile sensation. In H.-D. L.

Bauman & J. J. Murray (Authors), Deaf gain: Raising the stakes for human diversity (pp.

211-232). Retrieved from

This source explores the effects of tactile stimulation on hearing and seeing, deaf, and

deaf-blind people through different subtopics. The section about communication

through touch explains how greetings where people touch are used around the

world and why. How touch affects infant development is shown to be positive and

correlates with increased social behavior and weight gain. Cognitive effects of touch

include increased functioning in the elderly, the identification of what is underneath an

object through the detection of vibrations by apes and humans, and sustained

memory. Different ways the use of touch can be applied are to guide people and in

medical training. The plasticity of the brain allows the section known as the auditory

cortex to adapt to a lack of hearing and increased sensitivity to sound and touch. All of

this relates to “Deaf Gain” because touch has a large impact on Deaf culture such as

feeling the vibrations of music through a balloon or a deaf mother using touch to

communicate with her baby.

This book explicitly confirms the previously unsupported assumption that deaf

people are more sensitive to tactile sensations than hearing

people. A broader understanding of how touch is used throughout different situations

and places was also provided by this article. The next step is to research how

vibrations can be mechanically produced to assist the tangible aspect of this research.

Olulade, O. A., Koo, D. S., LaSasso, C. J., & Eden, G. F. (2014). Neuroanatomical profiles of

deafness in the context of native language experience. The Journal of Neuroscience,


This article links the brain structure of deaf people to sign language and compares it to

spoken language.

Pacinian corpuscle. (n.d.). Retrieved from Rutgers University Virtual Biology Labs website:

This is an interactive webpage from Rutgers University that details the Pacinian corpuscle.


Parvizi, J. (2010). Chapter 153 – Nerve endings. In High yield orthopaedics (pp. 315-316).

Philadelphia: Saunders/Elsevier.

This book was used for information on Ruffini’s corpuscles, a type of mechanoreceptor.


Purves, D., & Williams, S. M. (2001). Neuroscience (2nd ed.). Sunderland, Mass.: Sinauer Associates.


The information about mechanoreceptors, specifically Meissner’s corpuscles, was

used from this book. Information about the auditory cortex was also used.

Shake-n-wake vibrating alarm clock. (2017). Retrieved June 12, 2017, from Assistech website:

The “Shake-n-Wake Vibrating Alarm Clock” is a product sold at many online stores,

this one called Assistech. It is a watch-like device that is worn on the wrist and vibrates to

wake the user up.

Siple, L., Greer, L., & Holcomb, B. R. (2004). Deaf culture tipsheet. Retrieved from Pepnet



This document from Rochester Institute of Technology explains some quick facts

about Deaf culture that can be very useful for a hearing person.

Sparks, M. (2015, January 19). New device allows deaf people to ‘hear with their tongue’

[Newsgroup post]. Retrieved from The Telegraph website:


This article describes a vibrotactile aid made by students at Colorado State University

that applies vibrations on the tongue to alert the user of sound around them. This

invention has yet to be commercialized or published in a journal article as of June 2017.


Taylor, J. (2017, March 7). Mom’s guide 2017: Best baby monitor for deaf parents. Retrieved

June 12, 2017, from MomTricks website:

This source reviews the best baby monitor for deaf parents. The device chosen, called

the “Summer Infant Babble Band,” is worn on the wrist and allows the user to set it to

vibrate, flash, or both when the child makes a noise.

Text description for use of hearing aids in 2006. (2012, October 18). Retrieved June 12, 2017,

from National Institute on Deafness and Other Communication Disorders website:

This source publishes statistics collected on the number of adults with moderate to

severe hearing loss who used hearing aids in 2006.


Weinstein, D., & Weinstein, S. (1964). Intensity and spatial measures of somatic sensation as a

function of body-part, laterality, and sex. PsycEXTRA Dataset.

This document details how sensitive multiple body parts are to vibrotactile input

and pressure.