
Virtual Environment and Sensori-Motor Activities:

Haptic, Audition and Olfaction


M. Mokhtari†, F. Bernier†, F. Lemieux†,
H. Martel‡, J.M. Schwartz‡, D. Laurendeau‡ and A. Branzan-Albu‡

† Distributed Synthetic Environment Group, Defence R&D Canada - Valcartier,
Val-Belair, Qc, Canada, G3J 1X5
[marielle.mokhtari,francois.bernier,francois.lemieux]@drdc-rddc.gc.ca

‡ Computer Vision and Systems Lab., Dept of Electrical and Computer Eng.,
Laval Univ., Ste-Foy, Qc, Canada, G1K 7P4
[laurend,branzan]@gel.ulaval.ca
ABSTRACT
While the simplest Virtual Environment (VE) configuration considers only visual interfaces, the need for increased immersion drives VE designers towards the integration of additional communication modalities. Sensori-motor systems are the components of a Virtual Reality (VR) system that contribute to generating the VEs and to creating the feeling of immersion and presence, the feeling of being there. This paper briefly reviews the (1) haptic, (2) auditory and (3) olfactory sensori-motor activities and focuses on the hardware/software components that are currently available, either as commercial products or as academic prototypes.

Keywords
Virtual Environment – Virtual Reality – Haptic – Virtual Audition – Virtual Olfaction.

1. INTRODUCTION

Immersion is an essential feature of a Virtual Environment (VE) as it is central to the paradigm where the user becomes part of the simulated world, rather than the simulated world being a feature of the user's own world. The definition of perfect immersion in a VE is analogous to Turing's definition of artificial intelligence: if the user cannot tell which reality is real and which one is virtual, then the computer-generated one is immersive.

However, immersion as the result of Virtual Reality (VR) technology can be achieved to various degrees. The degree of immersion in a VE depends on how much the virtual sensory data is integrated in the proprioceptive process. Each human being uses proprioceptive information to form a mental model describing the dynamic spatial and relational disposition of his body parts. An efficient VR system must provide consistency between proprioceptive information and sensory feedback.

The degree of immersion in a VE can be increased by using not only visual, but also olfactory, auditory and haptic user interfaces, by improving the body tracking process, by using high-level body representations (avatars), by minimizing the lag between the real user movements and the resulting changes in sensory data, and so on [Laurendeau03].

This paper covers the different sensori-motor activities (other than visualization) that are encountered in VR and focuses on the hardware and software components that are currently available, either as commercial products or as academic prototypes, for building actual implementations.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

WSCG POSTERS proceedings, WSCG'2004, February 2-6, 2004, Plzen, Czech Republic.
Copyright UNION Agency – Science Press.

2. HAPTIC

The word haptic comes from the Greek word haptesthai, which means "to come in contact with". The scientific term haptic can be traced back to the German word haptik, meaning "the study of touch and tactile sensations, especially as a means of communication". Its modern extension has been expanded beyond touch to embrace contact forces in general. Haptic perceptions nowadays include kinaesthesia (sense of limb motion), proprioception (sense of limb position relative to the body), cutaneous perceptions (contact of the skin with the outside world through vibro-tactile, temperature and pain sensations), and the vestibular sense (awareness of position and motion of the body relative to the rest of the world and to the gravity force) [Sheridan97] [ETSI02].

Early haptic research was motivated by the desire to substitute touch information for the sensory information lacking to the visually or hearing impaired. Another source of motivation was the need for improving and developing telerobotic applications, where a slave device is attached, at first mechanically then electrically, to a master user-controlled device [Rodrigues02].

The increase of computer power has opened new applications for haptic research, including virtual prototyping, virtual exploration and observation, training and, of course, recreational activities and entertainment.

2.1 Types of Haptic Interfaces
Haptic displays are devices conveying haptic information from the computer to the human user. By opposition, haptic controllers convey information from the user to the computer. A haptic interface is both a display and a controller. Displays can be either passive or active. A passive display always presents the same information to the user; the perception of the information solely depends on the user's actions (active touch). On the other hand, active displays are able to transmit a variety of messages and require no activity from the user (passive touch).
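The display/controller roles above can be sketched as a simple control loop: the interface acts as a controller by reporting the stylus position, and as an active display by rendering a computed reaction force. The sketch below is a minimal one-dimensional "virtual wall" using a spring-damper contact model; the constants and the placement of this loop around a device API are illustrative assumptions, not taken from any of the SDKs discussed later in Section 2.4.

```python
# Minimal active force-feedback model: a one-dimensional "virtual wall".
# The wall surface sits at x = 0; free space is x > 0. Units: metres,
# metres/second, newtons. Gains are illustrative, not vendor values.

WALL_POS = 0.0      # wall surface position
STIFFNESS = 800.0   # N/m, virtual spring pushing the stylus out of the wall
DAMPING = 2.0       # N*s/m, damping term that helps keep the loop stable

def wall_force(x, v):
    """Reaction force felt by the user at position x with velocity v."""
    if x >= WALL_POS:
        return 0.0                        # not in contact: no force
    penetration = WALL_POS - x
    # Spring-damper contact; clamp at zero so the wall never pulls inward.
    return max(0.0, STIFFNESS * penetration - DAMPING * v)
```

In a real system this function would be evaluated inside a high-rate servo loop (commonly around 1 kHz); too low a rate or too high a stiffness makes the rendered wall feel soft or, worse, unstable, which is the stability concern discussed in Section 2.2.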
Active haptic feedback can be achieved in two ways: by motion feedback or by force feedback. Displays implementing motion feedback are usually devices with high inertia and are quite bulky, rigid and heavy; they have mostly been developed in robotics. Contrary to motion feedback devices, displays implementing force feedback are usually small, partially flexible and light. Until now, they have been the most successful haptic devices on the market. Haptic devices implementing motion feedback respond to the force that the user applies on them, while force feedback devices respond to the motion the user imposes on them.

Anholonomic devices are purely passive devices. Instead of dictating the position of the display or applying a force against the user, anholonomic devices prevent motion in one or more directions and allow motion in at least one other direction [Colgate96]. Tactile devices were first developed for converting sensory information (noise, visual data, etc.) into haptic information. Pictorial devices reproduce spatial information directly on the user's skin (e.g. [Iwata01]) and frequency-to-place devices encode spatial/temporal information into vibrations, which are perceived at a specific spot on the user's skin (e.g. [Tan01]). Different solutions have been explored to provide vestibular information to users immersed in VEs. For instance, providing the user with information on his orientation in the environment can be achieved by lifting platforms, while the illusion of motion can be provided with treadmills, rolling spheres, force-feedback shoes, bearing floors, or tools such as bicycles or roller blades.

2.2 Issues with Haptic Devices
Stability is a major concern when working with haptic devices. Stability is affected by the reaction rate of the device and by the quality of the reaction. Unstable haptic feedback will most likely destroy the user's sense of immersion and can also be potentially dangerous for the user.

Haptic systems can be stabilized by considering the haptic controller and the controller of the object modeled by the haptic device as a whole. The stability of the device will then depend on the design of the whole application. Alternatively, the haptic controller can be separated from the rest of the application and designed to guarantee stability when the application meets a given set of requirements. In this case the haptic controller is more generic and can be used in different applications without having to modify its design.

2.3 Design of Haptic Interfaces
The perceptual capabilities and limitations of human users impose requirements on the design of haptic interfaces. These requirements must cover a large number of topics such as force sensing, pressure perception, position sensing resolution, stiffness, viscosity and mass perception, human force and position control, temperature perception, texture perception and, finally, ergonomics and comfort requirements.

2.4 Commercial Haptic Controllers
[ReachinTechnologiesAB] is developing the Reachin API, a C++ library for developing "touch through tools" applications, where a stylus-like device (the Phantom haptic interface from [SensAbleTechnologies]) is manipulated as if it were an exploration stick.

[SensAbleTechnologies] distributes its GHOST SDK. It supports the entire line of Phantom haptic interfaces and provides stable 3D force feedback. The force feedback can be associated with a static 3D geometric model, which allows a "touch through tools" perception of the geometry. It can also be used to interact with dynamic objects having mass and behaviours. Force feedback can also be felt when rotating, scaling and translating objects. Finally, the SDK can be used to generate "spatial effects" force feedback, which allows motion constraints, vibrations and apparent inertia.

[NovintTechnologies] sells e-Touch, a C++ framework providing an interface for multi-modal (haptic, auditory and visual) applications. It provides core classes and can be extended through modules. The modules are distributed under the Open Module system, which allows free usage of any module for research purposes but requires a license for commercial applications. It is designed to be hardware independent; its porting to other operating systems or hardware is supported. Modules for the Phantom haptic interface from [SensAbleTechnologies] and for the Delta haptic interface from [ForceDimension] are available.

2.5 Available Haptic Devices
2.5.1 Research Projects
A system called GSS (Ground Surface Simulator) has been developed by ATR Media Integration & Communication Research Laboratories [Noma00]. It is a locomotion interface which allows free walking for the user and simulates natural terrain surfaces. The GSS could be used for rehabilitation purposes.

A locomotion device based on an array of small rollers has been developed by the Optical Design Laboratory [Hsu03]. It allows free walking using a "step-and-slide" walking pattern without wearing special shoes or using devices. Each roller is mounted on a switch, which is triggered by the pressure exerted by the user's feet. The user is kept on the array by a hoop.

VR Systems UK has developed a fully immersive spherical projection system [Cybersphere]. It is a translucent sphere, 3.5 m in diameter, in which the user is free to walk. The walking motion of the user causes the sphere to rotate, and computer-generated images are projected accordingly.

[VirtualSpaceDevicesInc.] has designed and developed an Omni-Directional Treadmill (ODT). The first-generation model provides a walking speed of 3 m/s and has an active surface of 1.3 m × 1.3 m. It is comprised of two belts, one for each axis; each belt is made from about 3400 rollers woven together. A second-generation model has also been developed, which offers a larger surface area comprised of flat belts instead of rollers. The main problem with the ODT is that it is very noisy, and users cannot hear each other when using the device.

The University of Tsukuba (Japan) carried out a project to develop a new interactive technique that combines haptic sensation with computer graphics. The goal was to present visual and haptic sensations simultaneously, providing users with a surface on which they can touch an image using any part of their hand [Iwata01]. A device called FEELEX was designed, comprising a flexible screen, an actuator array and a projector. The actuators deform the flexible screen onto which the image is projected, and the
user can touch the image and feel its shape and rigidity. Possible applications include medical simulators featuring palpation, 3D shape modeling tools, and touch-screens.

The Haptic Interface Research Laboratory of Purdue University has developed a three-by-three tactor array designed to be worn on the back [Tan03]. This device can provide attentional and directional cueing. Attentional cueing has been found to improve the wearer's reaction time in detecting a change in a visual scene. Directional cueing can provide spatial cues to the wearer, such as where to look, or provide a stable reference to users experiencing distorted spatial orientation, such as pilots and divers.

2.5.2 Commercially Available Devices
[SensAbleTechnologies] offers different models of stylus-based haptic interfaces called Phantom. [ForceDimension] sells the DELTA haptic device. Three different models are available: one with three degrees of freedom, one with six, and the cardanic model, with a more robust design, for industrial applications. [ImmersionCorporation] produces the CyberGlove, a fully instrumented glove providing 18 or 22 joint-angle measurements. The CyberTouch adds vibro-tactile feedback to the CyberGlove. It features one stimulator on each finger and one on the palm. Sensations such as pulses, sustained vibration or more complex tactile feedback patterns can be reproduced. The CyberGrasp system adds resistive force feedback to the CyberGlove on each finger. The CyberForce, finally, attaches to the CyberGrasp to provide grounded force feedback.

3. VIRTUAL AUDITION

Unlike visual information, the manipulation of sound in VEs has only recently come to broad attention. The main reason is probably that sound does not appear absolutely necessary to most VEs and users. But it has been widely recognized that virtual sound strongly contributes to the quality of immersion in a VE. Work in a VE is more efficient when actions are accompanied by appropriate sounds, seemingly emitted from their proper locations. It has been shown that response times to visual targets drop dramatically when the targets are associated with localized auditory cues. Sound can also provide an important source of feedback for events occurring out of the user's field of view. Virtual sound becomes even more important in a multi-source sound environment, because it is easier to discriminate and comprehend sounds when they are separated in space.

Virtual sound has been given several equivalent designations, such as 3D sound, virtual acoustics and spatialized sound. All of these refer to techniques allowing sound sources to be placed anywhere around the listener in the virtual space, either by hard sources or by filtered digitized sound signals rendered through headphones.

The ultimate goal of a 3D-sound system would involve the complete control and manipulation of somebody else's spatial auditory perception. Sound perception is not only characterized by a sound source's perceived direction and distance, but also by an apparent width or extent. The environmental context also plays a role in how a sound source is perceived, mostly because of reverberation, which acts as if a set of secondary sound sources were produced.

3.1 Acoustics Issues
Sound is a pressure wave produced when an object vibrates rapidly. It is characterized by (i) frequency, related to the rapidity of oscillations, (ii) intensity, related to the amplitude of the waveform, and (iii) complexity, related to the spectral content along with the manner in which that content changes over time. In the human auditory system, acoustic signals are broken down into constituent frequency components in a Fourier-like analysis. The standard range of audible frequencies is between 20 Hz and 20 kHz. Sensitivity to intensity is measured in decibels (dB) and follows a logarithmic scale. Doubling the level of a sound source causes roughly the same perceived change independently of the reference level. This gives the auditory system a large dynamic range. The auditory system is much more sensitive to temporal fluctuations than the visual system. The system is also sensitive to small fluctuations in the spectral content of the input signal for roughly the same modulation speeds [Shilling02].

3.2 Spatial Sound
The main problem with using loudspeakers for 3D sound is that control over perceived spatial imagery is reduced, since the sound is reproduced in an unknown environment. The room and loudspeakers will impose unknown transformations that usually cannot be compensated for by the designer of the 3D audio system. It is therefore difficult to harness 3D spatial imagery over loudspeakers in such a way that the imagery can be transported to a number of listeners in a predictable or even meaningful manner.

Different types of systems exist for adding virtual audition to VR applications, depending on the level of sophistication that is expected. Basically, two categories of products can be distinguished: (i) systems that are exclusively software-based (the Sound Lab [SLAB] system developed by NASA, the Mustajuuri software developed as part of the [EVE] project at Helsinki University of Technology, and the DieselPower software by [AM:3D]), and (ii) systems that include some sound-processing hardware ([AuSIM] [LakeTechnologyLtd]).

4. VIRTUAL OLFACTION

Virtual olfaction is an emerging field, and smell is generally forgotten in implementing current VR systems. However, virtual olfaction could increase the level of immersion of people using virtual technologies and enrich the range of sensitivity. Smell and taste are the only senses that allow us to perceive information from the chemical domain, and the effect of smell on the human mind is strong, especially at the subliminal level. In comparison with the ear or the eye, the human nose is much more complicated. Hundreds of different classes of biological receptors are involved in olfaction, whereas in vision only three different classes are found. There are approximately ten million sensory receptor cells in the nose, each of them being sensitive to a large number of compounds. Therefore, human subjects are able to experience a wide range of different sensations from different odours. The response of a receptor is due to the activation of biochemical processes in the cell or of ion channels in the cell membrane. Furthermore, a learning process is possible for smell, allowing a human subject to better recognize odours he is often exposed to [Davide01]. Two technologies open the way towards virtual olfaction: electronic noses and virtual olfactory displays.
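The headphone rendering discussed in Section 3.2 ultimately rests on two classical binaural cues: the interaural time difference (ITD) and the interaural level difference (ILD). The toy sketch below computes both, using the spherical-head (Woodworth) approximation for the ITD and a crude sine-law panning for the ILD; the head radius and maximum level difference are assumed values, and a real spatializer would use measured head-related transfer functions instead.

```python
import math

HEAD_RADIUS = 0.0875    # m, assumed average head radius
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def itd_seconds(azimuth_deg):
    """Interaural time difference for a distant source at the given
    azimuth (0 = straight ahead, 90 = fully to one side), using the
    spherical-head (Woodworth) approximation (a/c) * (theta + sin theta)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

def ild_gains(azimuth_deg, max_db=6.0):
    """Crude level-difference panning: boost the near ear and attenuate
    the far ear by up to max_db/2 decibels each. Real ILDs are strongly
    frequency dependent, which this sketch ignores."""
    pan = math.sin(math.radians(azimuth_deg))   # -1 (left) .. +1 (right)
    left_db = -0.5 * max_db * pan
    right_db = +0.5 * max_db * pan
    to_gain = lambda db: 10.0 ** (db / 20.0)    # dB -> linear amplitude
    return to_gain(left_db), to_gain(right_db)
```

For a source at 90 degrees azimuth the ITD comes out around 0.65 ms, which matches the commonly quoted maximum for a human head; delaying and scaling the two headphone channels by these amounts already produces a coarse lateralization effect.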
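The pattern-recognition step behind an electronic nose (detailed in Section 4.1) can be sketched as nearest-neighbour matching of a sensor-array response against known gas signatures: each sensor responds to every gas with overlapping selectivity, and it is the pattern across the array that identifies the mixture. All signature values below are invented for illustration.

```python
import math

# Invented response signatures for a toy 4-sensor array. Each sensor
# responds somewhat to every gas (overlapping selectivity); the overall
# pattern across sensors is what distinguishes the odours.
SIGNATURES = {
    "coffee":  [0.9, 0.2, 0.5, 0.1],
    "ethanol": [0.3, 0.8, 0.4, 0.6],
    "ammonia": [0.1, 0.3, 0.9, 0.7],
}

def classify(response):
    """Return the known odour whose signature is closest (in Euclidean
    distance) to the measured sensor-response vector."""
    def dist(signature):
        return math.sqrt(sum((r - s) ** 2 for r, s in zip(response, signature)))
    return min(SIGNATURES, key=lambda name: dist(SIGNATURES[name]))
```

With these made-up signatures, classify([0.85, 0.25, 0.45, 0.15]) returns "coffee". Real electronic noses use the same idea with more elaborate pattern-recognition methods, and the lack of a standard odour representation noted in Section 4.2 is visible here too: the signature vectors are device-specific.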
4.1 Electronic Noses
An electronic nose is a system that aims at characterizing different gas mixtures. It uses a number of individual sensors (typically 5 to 100) whose respective selectivities towards different molecules overlap. The response from a chemical sensor is usually measured as the change of some physical parameter, e.g. conductivity or current. The response times for these devices range from seconds up to a few minutes. This is a significant drawback, and one of the main research topics in this field is to reduce response time. Electronic noses can be useful for a variety of applications such as product or process control (food, packaging), medical diagnosis and environmental monitoring, to name only a few. However, present electronic nose technology is a long way from matching the level of complexity and sensitivity of the human olfactory system.

4.2 Virtual Olfactory Displays
A virtual olfactory display is a system composed of hardware, software and chemicals, able to provide olfactory information to the user of a VE. Virtual olfaction is defined as the act of smelling an odorant produced by a virtual olfactory display. Teleolfaction is a form of virtual olfaction defined as the act of smelling a mixture of odorants whose composition is related to a mixture present in a remote place. A virtual olfactory display should possess information about the type of smell, its concentration, its temporal dynamics and its spatial localization. This information should be provided either by an electronic nose, in the case of teleolfaction, or by a computer simulator. This poses the problem of smell coding and, because of the complexity of the mechanisms involved in smell, no general way of representing odours has been found yet.

Only a few companies already sell virtual olfactory displays. They all use a number of chemicals stored in a type of cartridge and, when receiving a signal describing an odour, release a mixture of these chemicals. Because no standardized way of describing odours has been created, different manufacturers may represent one smell in different ways: [TriSenx] with the Scent Dome, [ScentAir] with the ScentShow system and [AromaJet] with the Pinoke. It is difficult to evaluate the quality of these products and to estimate their potential for long-term development on the market. Rather than simulating a whole range of odours with a limited number of base components, future developments may rather focus on particular types of smells, for example fragrances or coffee.

5. CONCLUSION

This overview cannot offer a unique formula for creating a VE (in the case of haptic, auditory and olfactory devices) for a given application, since such a formula does not exist. VR is still an infant field, and fundamental issues about the best ways to build systems, to integrate the interfaces, and to design graphics, interactions and behaviours for modeling specific phenomena remain very open research topics. The evolution of VR is typical of new interdisciplinary fields. It can be compared to the development of software engineering, which has evolved from a chaotic and unstructured activity to a formal, process-driven discipline of engineering.

Despite the enthusiasm surrounding VR, a substantial gap still exists between the technology available today and the technology needed to bring VEs closer to reality. The current state of the specialized hardware necessary to support VE applications is not satisfactory in most cases, and true consumer-grade, high-performance VR technology is not yet available. Haptic interfaces are still in a primitive phase of development, and progress in audition and olfaction technologies is much slower than that of displays and image generation.

6. REFERENCES

[AM:3D] http://www.am3d.com
[AromaJet] http://www.aromajet.com
[AuSIM] AuSIM Engineering Solutions, http://www.ausim3d.com
[Colgate96] Colgate, J.E., Peshkin, M.A. and Wannasuphoprasit, W., "Nonholonomic Haptic Display", Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, MN, USA, April 22-28, pp. 539-544, 1996.
[Cybersphere] http://www.vr-systems.ndtilda.co.uk/sphere1.htm
[Davide01] Davide, F., Holmberg, M. and Lundström, I., "Virtual Olfactory Interfaces: Electronic Noses and Olfactory Displays", Communication through Virtual Technology, IOS Press, pp. 193-220, 2001.
[ETSI02] ETSI EG 202 048 v1.1.1, "Human Factors (HF) – Guidelines on the Multimodality of Icons, Symbols and Pictograms", August 2002.
[EVE] The Experimental Virtual Environment, http://eve.hut.fi/
[ForceDimension] http://www.forcedimension.com
[Hsu03] Hsu, Y.L., Wang, C.F., Lien, C.M. and Kao, C.Y., "Design and Comfort Evaluation of a Novel Locomotion Device for Training Overhead Traveling Crane Operators in a VE", Journal of the Chinese Institute of Industrial Engineers, Vol. 20, No. 1, pp. 62-90, 2003.
[ImmersionCorporation] http://www.immersion.com
[Iwata01] Iwata, H., Yano, H., Nakaizumi, F. and Kawamura, R., "Project FEELEX: Adding Haptic Surface to Graphics", Proc. of SIGGRAPH, Los Angeles, CA, USA, pp. 469-475, 2001.
[LakeTechnologyLtd] http://www.laketechnology.com
[Laurendeau03] Laurendeau, D., Branzan-Albu, A., Boivin, E., Drouin, R., Martel, H., Ouellet, D. and Schwartz, J.M., "Survey of the State-of-the-Art on Synthetic Environments, Sensori-Motor Activities in Synthetic Environments, Simulation Frameworks and Real-World Abstraction Models", Technical report, contract between DRDC-Valcartier and Laval University, Canada, 2003.
[Noma00] Noma, H., Sughihara, T. and Miyasato, T., "Development of Ground Surface Simulator for Tel-E-Merge System", Proc. of IEEE Virtual Reality, pp. 217-224, 2000.
[NovintTechnologies] http://www.novint.com
[ReachinTechnologiesAB] http://www.reachin.se
[Rodrigues02] Rodrigues, D.L., Campos, M.F.M. and Kumar, V., "Design, Implementation and Evaluation of Haptic Interfaces", VI SPG, Sept. 4-6, 2002.
[ScentAir] http://www.scentair.com
[SensAbleTechnologies] http://www.sensable.com
[Sheridan97] Sheridan, T.B., "Human and Machine Haptics in Historical Perspective", Workshop on Human and Machine Haptics, Monterey Peninsula, CA, USA, December 7-9, 1997.
[Shilling02] Shilling, R.D. and Shinn-Cunningham, B., "Virtual Auditory Displays", in Handbook of Virtual Environments, Lawrence Erlbaum Associates, 2002.
[SLAB] http://human-factors.arc.nasa.gov/SLAB/
[Tan01] Tan, H.Z. and Pentland, A., "Tactual Displays for Sensory Substitution and Wearable Computers", in Fundamentals of Wearable Computers and Augmented Reality, Lawrence Erlbaum Associates, Mahwah, NJ, USA, pp. 579-598, 2001.
[Tan03] Tan, H.Z., Gray, R., Young, J.J. and Traylor, R., "A Haptic Back Display for Attentional and Directional Cueing", Haptics-e, The Electronic Journal of Haptic Research, 3(1), 2003.
[TriSenx] http://www.trisenx.com
[VirtualSpaceDevicesInc.] http://www.vsdevices.com/
