
MODULE 2

Output Devices: Graphics Displays, Sound Displays & Haptic Feedback

Introduction

• The feedback from the VR simulation in response to user input takes the form of sight (through graphics displays), sound (through 3D sound displays), and touch (through haptic displays).
• Some VR systems may not incorporate 3D sound or haptic feedback interfaces, but all will have some type of graphics display.
GRAPHICS DISPLAYS
Definition: A graphics display is a computer interface that presents synthetic world images to
one or several users interacting with the virtual world.
• Images can be characterized according to the type of image produced (monoscopic or stereoscopic), their image resolution (number of pixels in the scene), the field of view (portion of the eye's viewing volume they cover), display technology (LCD- or CRT-based), and cost.
Note: With monoscopic VR, 1 image is directed to both eyes, just like a regular image or
video. With stereoscopic VR, there are 2 images, 1 for each eye.
CRT stands for Cathode Ray Tube and LCD stands for Liquid Crystal Display
Figure: Parts of the human eye.
• Another important characteristic of the human vision system
is the field of view. This is approximately 150° horizontally
and 120° vertically when one eye is used and grows to 180°
horizontally and 120° vertically when both eyes are used. A
central portion of this viewing volume represents the area of
stereopsis, where both eyes register the same image. This
binocular overlap is approximately 120° horizontally. The
brain uses the horizontal shift in image position registered by
the two eyes to measure depth, or the distance from the
viewer to the virtual object presented in the scene.
• Within the field of view the eyes register the objects
surrounding the viewer, such as object A, which is behind
object B. The eyes concentrate on a feature of B, focusing on
a fixation point F. The angle between the viewing axis and the
line to the fixation point determines the convergence angle.
This angle depends on the distance between the pupils of the two eyes, called the interpupillary distance (IPD).
• The IPD is the baseline from which a person interprets distances to objects in the real
world. The larger the IPD, the larger is the convergence angle. The point F will appear
shifted horizontally between the right and left eyes because of its different position in
relation to the two eyes. This shift is called image parallax and needs to be replicated by the
VR graphics and stereo viewing hardware in order to help the brain interpret depth in the
simulated world.
• Designing graphics displays that satisfy all these requirements inexpensively is a daunting technological task.
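The convergence geometry described above can be checked numerically. A minimal sketch, assuming a typical adult IPD of 0.065 m (the function name and all values are illustrative, not from the text):

```python
import math

def convergence_angle_deg(ipd_m: float, distance_m: float) -> float:
    """Angle between the two eyes' viewing axes when fixating a point
    straight ahead at the given distance (simple symmetric geometry)."""
    return math.degrees(2.0 * math.atan((ipd_m / 2.0) / distance_m))

# Nearby objects require a larger convergence angle than distant ones.
near = convergence_angle_deg(0.065, 0.5)   # fixation point 0.5 m away, ~7.4 deg
far = convergence_angle_deg(0.065, 5.0)    # fixation point 5 m away, ~0.7 deg
```

A stereo renderer reproduces this geometry by offsetting its two virtual cameras horizontally by the IPD, which creates the image parallax the brain interprets as depth.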
• Personal Graphics Displays: A graphics display that outputs a virtual scene destined to
be viewed by a single user is called a personal graphics display. Such images may be
monoscopic or stereoscopic, monocular (for a single eye), or binocular (displayed on both
eyes).
• Categories of personal displays are:
I. Head-mounted displays (HMDs),
II. Hand-supported displays (HSDs),
III. Floor-supported displays, and
IV. Autostereoscopic monitors.
i) Head-Mounted Displays: These project an image floating some 1-5 m (3-15 ft) in front of the user. They use special optics placed between the HMD's small image panels and the user's eyes in order to allow the eyes to focus at such short distances. The optics are also needed to magnify the small panel image to fill as much as possible of the eyes' field of view.
Figure: Head-mounted display (HMD) integration in a VR system: (a) monoscopic HMD; (b) stereoscopic HMD.
GRAPHICS DISPLAYS - Head-Mounted Displays
• Display technologies used in HMDs:
A. Consumer-grade HMDs use LCD displays. These less expensive units are designed primarily for private viewing of TV programs and for video games rather than for VR, and they accept monoscopic video input.
B. Professional-grade HMDs use CRT-based displays, which tend to have higher resolution and are designed specifically for VR interaction.
• The control unit also receives stereo sound, which is then sent to the HMD built-in
headphones.
• The user's head motion is tracked and position data sent back to the VR engine for use in
the graphics computations.
• A ratchet on the back of the HMD head support allows comfort adjustment for various head sizes.
• ii) Hand-Supported Displays (HSDs): These are personal graphics displays that the user
holds in one or both hands in order to periodically view a synthetic scene. This means that
the user can go in and out of the simulation as required by the application. HSDs are similar
to HMDs in their use of special optics to project a virtual image in front of the user. In
addition, HSDs incorporate features not present in HMDs, namely push buttons used to
interact with the virtual scene. An example of a hand-supported graphics display is the
virtual binoculars SX shown in Figure.
GRAPHICS DISPLAYS - Hand-Supported Displays (HSDs)
• These are constructed to resemble the look and feel of regular binoculars, to help the
realism of the simulation. However, virtual binoculars incorporate two miniature LCOS
displays (Liquid crystal on silicon) and a tracker, which measures the user's viewing
direction. The computer then updates the graphics based on tracker and push button
information.
• The virtual binoculars allow variable-distance focusing as well as zooming on the scene, using a mouse button on the top of the device (close to the user's fingertips). The device is connected to the computer running the simulation over a serial line. The use of LCOS displays results in a high-resolution image (1280 x 1024 pixels) and low granularity (1.6 arc-minutes/pixel).
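The quoted granularity is consistent with the resolution if one assumes a horizontal field of view of roughly 34°. A quick check (the FOV value is an inference for illustration, not stated in the text):

```python
def granularity_arcmin_per_pixel(fov_deg: float, pixels: int) -> float:
    """Angular size of one pixel: the field of view expressed in
    arc-minutes, divided by the number of pixels across that field."""
    return (fov_deg * 60.0) / pixels

# 1280 horizontal pixels over an assumed ~34 deg field of view
print(round(granularity_arcmin_per_pixel(34.0, 1280), 2))  # prints 1.59
```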
• iii) Floor-Supported Displays: The HMDs and HSDs
previously discussed rely on 3D trackers to measure the
user's head position. When the user moves his or her
head he or she expects the displayed image to move in
the opposite direction. If the time delay between
corresponding head and image motions is too large,
simulation sickness may occur. In order to alleviate the
problem of simulation sickness, it is necessary to have
almost instantaneous response to the user's head motion,
which suggests the use of a mechanical tracker.

• Floor-supported displays use an articulated mechanical arm to offload the weight of the
graphics display from the user. More importantly, floor-supported displays integrate sensors
directly in the mechanical support structure holding the display.
• This concept was developed by NASA and commercialized by Fakespace Labs.
• The major advantages of mechanical trackers are their high data rate (140 datasets/sec) and negligible latency (0.2 msec); they eliminate jitter and are affected by neither magnetic fields nor ultrasound background noise. Floor-supported displays also offer larger fields of view and superior graphics resolution than HMDs or hand-supported displays, owing to the use of larger optics and CRT tubes whose weight is supported by the mechanical arm.
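The tracker figures above can be placed in a rough motion-to-photon budget. The 140 datasets/sec rate and 0.2 msec tracker latency come from the text; the rendering time, scan-out time, and the ~50 ms comfort threshold are assumptions used only for illustration:

```python
TRACKER_LATENCY_MS = 0.2              # from the text
SAMPLING_INTERVAL_MS = 1000.0 / 140   # 140 datasets/sec -> ~7.1 ms between samples
RENDER_MS = 16.7                      # one frame at 60 Hz (assumption)
DISPLAY_SCANOUT_MS = 8.0              # assumption

def worst_case_motion_to_photon_ms() -> float:
    """Worst case: a head motion just misses a tracker sample, then waits
    through tracking, rendering, and display scan-out before the image moves."""
    return SAMPLING_INTERVAL_MS + TRACKER_LATENCY_MS + RENDER_MS + DISPLAY_SCANOUT_MS

COMFORT_THRESHOLD_MS = 50.0  # commonly cited rough limit (assumption)
print(worst_case_motion_to_photon_ms() < COMFORT_THRESHOLD_MS)  # prints True
```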

• iv) Desk-Supported Displays: Excessive display weight becomes an issue for HMDs and hand-supported personal displays due to user fatigue, which can lead to neck and arm pain. Even for floor-supported displays, excessive weight is undesirable, as it increases inertia when the display is rotated and can lead to unwanted pendulum oscillations. One category of displays where weight is not an issue is desk-supported displays. Unlike the previously discussed personal displays, desk-supported displays are fixed and designed to be viewed while the user is sitting. Thus the user's freedom of motion is limited compared to HMDs or HSDs.

• Autostereoscopic Displays: An interesting type of desk-supported display is the autostereoscopic display, which produces a stereo image when viewed with unaided eyes.
• The advantage of this approach is the ability to present a stereo image without requiring the user to wear any vision apparatus.
• One disadvantage is increased system complexity and increased cost. The weight of the display is 11.25 kg, but it is supported by the desk.
SOUND DISPLAYS
• Definition: Sound displays are computer interfaces that provide synthetic sound feedback to users interacting with the virtual world. The sound can be monaural (both ears hear the same sound) or binaural (each ear hears a different sound).
• Sound displays play an important role in increasing the simulation realism by
complementing the visual feedback provided by the graphics displays previously discussed.
• When sound is added, the user's interactivity, immersion, and perceived image quality
increase. Sound in a real room bounces off the walls, the floor, and the ceiling, adding to
the direct sound received from the source. The realism of the virtual room therefore
requires that these reflected sounds be factored in.
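Reflected sound paths can be modeled with the image-source method: a wall bounce behaves as if emitted by a mirrored copy of the source. A minimal sketch for a single floor reflection, assuming a floor plane at z = 0 and a speed of sound of 343 m/s (both assumptions for illustration):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumption)

def path_delay_ms(distance_m: float) -> float:
    """Propagation delay in milliseconds over the given distance."""
    return 1000.0 * distance_m / SPEED_OF_SOUND

def direct_and_floor_bounce_ms(src, listener):
    """Delays of the direct sound and of one floor reflection, computed
    with the image-source method: the bounce behaves as if emitted by the
    source mirrored through the floor plane z = 0. Points are (x, y, z) m."""
    direct = math.dist(src, listener)
    mirrored_src = (src[0], src[1], -src[2])
    bounced = math.dist(mirrored_src, listener)
    return path_delay_ms(direct), path_delay_ms(bounced)

d, r = direct_and_floor_bounce_ms((0.0, 0.0, 1.0), (3.0, 0.0, 1.0))
# The reflected path is always longer, so the bounce arrives later (r > d).
```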

The Human Auditory System: As illustrated in Figure, the sound source location is uniquely determined by three variables, namely azimuth, elevation, and range.
Azimuth - position left to right.
Elevation - position up and down.
Range - distance from the observer.
The brain estimates the source location (azimuth, elevation, and range) based on intensity, frequency, and temporal cues present in the sound reaching the two ears.
The architecture of our anatomy dictates how we understand
the sounds we hear: with an ear on either side of a thick skull
and spongy brain, we hear sounds enter our left and right ears
at different times.
If a dog barks by our left ear, it takes a few extra
microseconds for the bark to reach the right ear; the sound
will also be louder in one ear than the other. In addition,
sound waves interact with the physical constitution of the
listener — the pinna (or outer ear), the head, and the body —
and the surrounding space, creating listener-specific filtering known as the head-related transfer function (HRTF).
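The interaural delay in the dog-bark example can be approximated with Woodworth's classical spherical-head formula, ITD = (a/c)(θ + sin θ). A sketch, assuming a typical head radius of 0.0875 m and a speed of sound of 343 m/s (typical values, not from the text):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s (assumption)
HEAD_RADIUS = 0.0875    # m, average adult head (assumption)

def itd_seconds(azimuth_deg: float) -> float:
    """Interaural time difference for a source at the given azimuth
    (0 = straight ahead, 90 = directly to one side), using Woodworth's
    spherical-head approximation: ITD = (a / c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source directly to one side arrives ~0.66 ms earlier at the nearer ear.
print(f"{itd_seconds(90.0) * 1000:.2f} ms")  # prints "0.66 ms"
```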
HAPTIC FEEDBACK
• Named after the Greek term haptesthai (meaning "to touch").
• Haptic feedback conveys important sensorial information that helps users achieve tactile identification of virtual objects in the environment and move these objects to perform a task.
• When added to the visual and 3D audio feedback previously discussed, haptic feedback greatly improves simulation realism.
• Definition: Touch feedback conveys real-time information on contact surface geometry, virtual object surface roughness, slippage, and temperature. It does not actively resist the user's contact motion and cannot stop the user from moving through virtual surfaces.
• Definition: Force feedback conveys real-time information on virtual object surface compliance, object weight, and inertia. Unlike touch feedback, it actively resists the user's contact motion and can stop it (for large feedback forces).
• Designing good haptic feedback interfaces is a daunting task because of user safety and comfort requirements.
• While the user interacts with virtual objects, the forces he or she feels are real. These contact forces need to be large, but not large enough to harm the user.
• If haptic interfaces are too heavy and bulky, the user will tire easily.
• Heavy feedback structures can be gravity-counterbalanced, but this further increases complexity and cost.
• Haptic feedback interfaces should be self-contained, without requiring special supporting construction, piping, or wiring.
Haptic Sensing: The skin houses four types of tactile sensors. They produce small electrical
discharges, which are eventually sensed by the brain. They are
I. Meissner corpuscles, (FA)
II. Merkel disks, (SA)
III. Pacinian corpuscles, (FA)
IV. Ruffini corpuscles. (SA)

Note: Slow-adapting (SA) sensors drop their rate of discharge slowly, so they keep responding during sustained contact; fast-adapting (FA) sensors drop their rate of discharge quickly, so they respond mainly to changes in contact.

• Temperature sensing is realized by specialized thermoreceptors and nociceptors.
• The user's body position and motion are sensed by another type of sensing called proprioception (kinesthesia).
Tactile Feedback Interfaces (tactile means sensed by touching):
I) The Tactile Mouse - The computer mouse is a standard interface, serving as an open-loop navigation, pointing, and selecting device. By open-loop we mean that the information flow is unidirectional, being sent from the mouse to the computer. The standard way of using the mouse requires that the user look at the screen all the time, lest control be lost. A tactile mouse keeps the outside appearance and weight of a regular mouse while adding vibrotactile actuators inside, closing the feedback loop toward the user.
II) The CyberTouch Glove - This is another haptic interface that provides vibrotactile
feedback to the user. As illustrated in Figure, the device is a CyberGlove retrofitted with six
vibrotactile actuators (one on the back of each finger and one in the palm). Each actuator
consists of a plastic capsule housing a DC electrical motor which produces vibrations. Each
actuator applies a small force of 1.2 N, which is felt by the skin mechanoreceptors and
kinesthetic receptors (through the finger bones). During VR simulations the CyberGlove
reads the user's hand configuration and transmits the data to the host computer. The
CyberTouch glove is most suitable for applications where contact is at the fingertips, since it has the ability
to provide feedback to individual fingers.
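A driver for such a glove must map simulated contact forces to actuator commands. A hypothetical sketch, clamping at the 1.2 N per-actuator force given in the text (the function and its linear mapping are illustrative, not the real CyberTouch API):

```python
MAX_ACTUATOR_FORCE_N = 1.2  # per-actuator limit stated in the text

def actuator_command(contact_force_n: float) -> float:
    """Map a simulated contact force to a normalized vibration command
    in [0, 1]. A linear mapping is assumed for illustration only."""
    if contact_force_n <= 0.0:
        return 0.0
    return min(contact_force_n / MAX_ACTUATOR_FORCE_N, 1.0)

# One command per actuator: one per finger plus one for the palm.
contact_forces = [0.0, 0.3, 0.6, 1.2, 2.5, 0.0]
commands = [actuator_command(f) for f in contact_forces]
```

Forces above the actuator limit simply saturate the command, mirroring the hardware's inability to exceed its maximum output.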
Figure: The CyberTouch glove.
III) The Temperature Feedback Glove - This glove allows users to detect thermal
characteristics that can help identify an object material. Such variables are surface
temperature and thermal conductivity. Thermoelectric heat pumps act as actuators. A
DC current applied to dissimilar materials placed in contact creates a temperature
differential. Thermoelectric heat pumps consist of solid-state N- and P-type semiconductors
sandwiched between ceramic electrical insulators. One plate is called the heat source and the other
a heat sink, as illustrated in Figure. When current from a DC source is applied to the heat
pump, the P and N charges move to the heat sink plate, where they transfer heat. This
results in a drop in temperature of the heat source plate and a corresponding rise in temperature of the heat sink plate. The larger the current, the larger is the temperature differential between the two plates.
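The heat pumped away from the source plate follows the standard thermoelectric relation Qc = S·I·Tc − ½I²R − K·ΔT. A sketch with illustrative coefficients (all numeric values are assumptions, not data for the glove's modules); note that because the Joule term grows with the square of the current, pumping improves with current only up to an optimum:

```python
def peltier_cooling_watts(current_a: float,
                          seebeck_v_per_k: float = 0.05,
                          t_cold_k: float = 290.0,
                          resistance_ohm: float = 2.0,
                          conductance_w_per_k: float = 0.5,
                          delta_t_k: float = 10.0) -> float:
    """Net heat pumped from the heat-source (cold) plate: Peltier pumping
    minus the module's share of its own Joule heating, minus conduction
    back across the temperature differential. Coefficients are illustrative."""
    peltier = seebeck_v_per_k * current_a * t_cold_k
    joule = 0.5 * resistance_ohm * current_a ** 2
    back_conduction = conductance_w_per_k * delta_t_k
    return peltier - joule - back_conduction

# Cooling rises with current at first, peaks near I = S*Tc/R, then falls.
```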
Figure: The temperature feedback glove.
Force Feedback Interfaces
• Force feedback interfaces are devices that differ in several aspects from the tactile
feedback interfaces previously discussed.
• Force feedback interfaces need to be grounded (rigidly attached) on some supportive
structures to prevent slippage and potential accidents.
• Force feedback interfaces such as joysticks and haptic arms are not portable, since they
are grounded on the desk or on the floor.
• More portable interfaces, such as force feedback gloves, are grounded on the user's
forearm. This allows more freedom of motion for the user and more natural interaction
with the simulation
I) Force Feedback Joysticks - These are some of the simplest, least expensive and
most widespread force feedback interfaces today. These have a small number of degrees of
freedom and a compact shape and produce moderate forces with high mechanical
bandwidth.
II) The PHANToM Arm - This is used in simulations that need 3 to 6 haptic degrees of freedom as well as increased handiness compared to joysticks. Earlier force feedback arms were large mechanical structures with embedded position sensors and electrical feedback actuators, ill suited for use outside an industrial setting; the PHANToM, by contrast, is a compact desktop device.
III) The HapticMaster Arm - The HapticMaster is produced in The Netherlands by FCS Control Systems, as shown in Figure. It is a cylindrical robot that can rotate about its base, move up and down, and extend its arm radially within a 0.64 x 0.4 x 0.36 m work envelope. Its output capabilities are much larger than the corresponding PHANToM capabilities.
IV) The CyberGrasp Glove - The CyberGrasp system is a retrofit of the 22-sensor version of the CyberGlove, which it uses to measure the user's hand gestures, as illustrated in Figure. The CyberGlove interface box transmits the resulting finger position data, together with wrist position data from a 3D magnetic tracker worn by the user, to the CyberGrasp force control unit (FCU). The resulting hand 3D positions are sent to the host computer running the simulation over an Ethernet line (local area network, LAN). The host computer then performs collision detection and inputs the resulting finger contact forces into the FCU.
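The data flow just described can be sketched as one iteration of a control loop. Everything here (the types, the toy collision rule, the function names) is hypothetical, standing in for the real scene query and the FCU protocol:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class HandSample:
    finger_flexion_deg: List[float]               # condensed CyberGlove readings
    wrist_position_m: Tuple[float, float, float]  # from the 3D magnetic tracker

def collision_forces(hand: HandSample) -> List[float]:
    """Stand-in for host-side collision detection: one contact force (N)
    per finger. Toy rule: a finger flexed past 60 degrees is taken to be
    touching a virtual object; a real simulation queries scene geometry."""
    return [1.0 if flex > 60.0 else 0.0 for flex in hand.finger_flexion_deg]

def control_cycle(hand: HandSample) -> List[float]:
    """One loop iteration: glove + tracker data in, per-finger force
    commands out to the force control unit (FCU)."""
    return collision_forces(hand)

sample = HandSample([10.0, 70.0, 65.0, 20.0, 5.0], (0.1, 0.2, 0.3))
forces = control_cycle(sample)  # one force command per finger
```

In the real system this loop is split across machines: sensing on the glove hardware, collision detection on the host over the LAN, and force output on the FCU.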
