• Virtual reality (VR) – a digital environment that replaces the user’s physical
surroundings.
• Mixed reality (MR) – an integration of virtual content and the real world
environment that enables interaction among elements of both.
• Immersive VR system – the most expensive type, giving the highest level
of immersion. Its components include an HMD, tracking devices, data
gloves and others, which surround the user with computer-generated 3D
animation that gives the user the feeling of being part of the virtual
environment. One of its applications is the virtual walk-through of
buildings.
• Semi-immersive VR system, also called a hybrid system [1] or an
augmented reality system, provides a high level of immersion while
keeping the simplicity of desktop VR or utilizing some physical model.
An example of such a system is the CAVE (Cave Automatic Virtual
Environment), and one application is the driving simulator.
• Distributed VR, also called networked VR, is a newer category of VR
system that exists as a result of the rapid development of the internet.
Its goal is to remove the problem of distance, allowing people in many
different locations to participate and interact in the same virtual
world with the help of the internet and other networks. A traditional
application of this is SIMNET, a real-time distributed simulation
developed by the US military and used for combat training [3, 4].
Components of Virtual Reality Systems
• A VR system is made up of two major subsystems: the hardware and the
software.
• The hardware can be further divided into the computer or VR engine and
the I/O devices, while the software can be divided into application
software and the database, as illustrated below.
Virtual Reality System Hardware
• The major components of the hardware are the VR engine or
computer system, input devices and output devices shown here
Input Devices
• The input devices are the means by which the user interacts with the
virtual world.
• They send signals to the system about the action of the user, so as to
provide appropriate reactions back to the user through the output
devices in real time.
• They can be classified into tracking devices, point-input devices,
bio-controllers and voice devices.
• Tracking devices, sometimes referred to as position sensors, are used
to track the position of the user; they include electromagnetic,
ultrasonic, optical, mechanical and gyroscopic sensors, data gloves,
and neural or bio (muscular) controllers.
• Examples of point-input devices include the 6-DOF mouse and the force
or space ball. Their technology is an adaptation of the normal mouse,
with extended functions and capability for 3D.
• Voice communication is a common way of interaction among humans, so it
feels natural to incorporate it into a VR system. Voice recognition or
processing software can be used to accomplish this.
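The flow above can be sketched as a simple polling step. Note that `read_pose` and `recognize_command` below are hypothetical stand-ins for a real tracker SDK and a real voice-recognition engine; the sketch only shows how device signals map to actions in the virtual world.

```python
# Sketch of an input-device polling step. The device functions are
# hypothetical stand-ins, not calls into any real VR SDK.

def read_pose():
    """Stand-in for a 6-DOF tracker read: position (x, y, z) in metres
    plus orientation (yaw, pitch, roll) in degrees."""
    return (0.0, 1.6, 0.0), (0.0, 0.0, 0.0)

# Mapping of recognized voice commands to actions in the virtual world.
VOICE_COMMANDS = {"select": "pick_object", "release": "drop_object"}

def recognize_command(utterance):
    """Stand-in for a voice-recognition result mapped to a VR action."""
    return VOICE_COMMANDS.get(utterance.lower())

position, orientation = read_pose()   # tracking-device input
action = recognize_command("Select")  # voice-device input -> "pick_object"
```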
VR Engine
• In VR systems, the VR engine or computer system has to be selected
according to the requirements of the application.
• Graphic display and image generation are among the most important and
time-consuming tasks in a VR system.
• The choice of the VR engine depends on the application field, the user,
the I/O devices, the level of immersion and the graphic output required,
since it is responsible for calculating and generating graphical models,
object rendering, lighting, mapping, texturing, simulation and display
in real time.
• The computer also handles the interaction with users and serves as
an interface with the I/O devices.
• A major factor to consider when selecting the VR engine is the
processing power of the computer, that is, the amount of sensory output
(graphical, sound, haptic, etc.) that it can render in a particular
time frame.
• The VR engine is required to recalculate the virtual environment
approximately every 33 ms and to produce a real-time simulation of more
than 24 fps; furthermore, the associated graphics engine should be
capable of producing stereoscopic vision.
• The VR engine could be a standard PC with more processing power and a
powerful graphics accelerator, or distributed computer systems
interconnected through a high-speed communication network.
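The 33 ms requirement above can be sketched as a fixed-rate update loop: do the frame’s work, then sleep away whatever remains of the budget. This is a minimal illustration, with `update_world` standing in for the engine’s real per-frame computation.

```python
import time

FRAME_BUDGET = 1.0 / 30.0  # about 33 ms per recalculation, i.e. 30 fps

def update_world(dt):
    """Placeholder for the per-frame work: rendering, lighting, simulation."""
    pass

def render_loop(num_frames):
    """Run num_frames updates, sleeping out the rest of each 33 ms budget
    so the simulation keeps a steady real-time rate."""
    last = time.perf_counter()
    for _ in range(num_frames):
        start = time.perf_counter()
        update_world(start - last)
        last = start
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)
    return num_frames

render_loop(3)
```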
Output Devices
• The output devices receive feedback from the VR engine and pass it on
to the users to stimulate the corresponding senses. The possible
classifications of output devices based on the senses are: graphics
(visual), audio (aural), haptic (contact or force), smell and taste. Of
these, the first three are frequently used in VR systems, while smell
and taste are still uncommon.
• Two common options for graphics are the stereo display monitor and the
HMD, which provides a higher level of immersion. In the HMD, the two
independent views produced are interpreted by the brain to provide a 3D
view of the virtual world.
• Audio or sound is an important channel in VR; its importance is only
surpassed by that of visual.
• 3D sound can be used to produce different sounds from different
locations, making the VR application more realistic.
• Haptics is used to allow the user to feel virtual objects. This can be
achieved through electronic signals or mechanical devices.
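The two-views idea can be illustrated numerically: offset the head position by half an interpupillary distance (IPD) along the head’s right vector to get the left- and right-eye camera positions. The 0.064 m IPD below is an assumed typical value, not a device specification.

```python
IPD = 0.064  # assumed typical interpupillary distance in metres

def eye_positions(head_pos, right_dir):
    """Return the left- and right-eye camera positions: the head position
    shifted half an IPD along the head's unit 'right' vector. Rendering
    the scene from both positions yields the two views the brain fuses."""
    hx, hy, hz = head_pos
    rx, ry, rz = right_dir
    half = IPD / 2.0
    left = (hx - rx * half, hy - ry * half, hz - rz * half)
    right = (hx + rx * half, hy + ry * half, hz + rz * half)
    return left, right

left_eye, right_eye = eye_positions((0.0, 1.7, 0.0), (1.0, 0.0, 0.0))
# left_eye sits 3.2 cm left of the head centre, right_eye 3.2 cm right
```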
Virtual Reality System Software and Tools
• Phones are one of the more basic viewing options for VR content and
have been providing a super cheap yet still interactive solution for
several years now.
• The phone is put inside a headset, and the screen is split to provide
output for each eye, giving the illusion of depth.
• The accelerometer and gyroscope systems in the smartphones and
headsets of the last two decades give virtual reality applications a
true sense of motion.
How is it that the VR environment being viewed moves when
your head is moved?
• The direct physical tracking of the human body has advantages over
camera-based systems – it doesn’t need a camera! It’s not trapped in room
scale, and it also does not suffer from occlusion – imagine putting your arm
behind your back – the camera cannot see it.
• The direct measurement also means that pose data does not have to be
calculated, it can be read straight from the device in real-time, reducing lag,
which helps with immersive Virtual Reality.
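A common way to turn accelerometer and gyroscope readings into a head orientation is a complementary filter: integrate the fast but drifting gyroscope, and pull the result gently toward the noisy but drift-free accelerometer estimate. The sketch below uses made-up sensor values purely for illustration.

```python
def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Fuse one gyro/accelerometer sample pair into an updated pitch angle.
    alpha weights the integrated gyro term; (1 - alpha) weights the
    accelerometer's absolute (gravity-based) pitch estimate."""
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch

# Simulated drift correction: the gyro keeps reporting 10 deg/s while the
# accelerometer insists the head is level (0 deg). Instead of drifting
# without bound, the fused pitch stays bounded.
pitch = 0.0
for _ in range(100):
    pitch = complementary_filter(pitch, gyro_rate=10.0, accel_pitch=0.0, dt=0.01)
```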
Haptic Technology
• Haptic technology aims to simulate the sensation of touch with
various mechanisms.
• One of them is using touch as a feedback system to communicate
information to and from the user.
• As a visually oriented species, we usually don’t stop to think how
incredible our sense of touch really is. With our hands we can
determine hardness, geometry, temperature, texture and weight just by
handling something.
• Even though you might not know it, there is a good chance you are
already using haptic technology in your daily life.
• Many smartphones with touch screens use vibration as a form of
feedback.
• Unlike keypads, touchscreens are just flat plates of glass, so the
vibration function of the phone is used to simulate the tactile feel of
buttons.
• What is more, some Android smartphones detect when you pick them up
and vibrate if there are any unread notifications for you. That is
exactly what haptic technology is. How does it translate to virtual
reality?
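That button-vibration trick can be sketched as a tiny touch handler: when the flat glass is touched inside a button’s bounds, fire a short vibration pulse. `vibrate` here is a hypothetical stand-in for a platform vibration call, not a real mobile API.

```python
def vibrate(duration_ms):
    """Hypothetical stand-in for a platform vibration call."""
    return f"vibrate:{duration_ms}ms"

BUTTON = {"x": 100, "y": 200, "w": 80, "h": 40}  # on-screen button bounds (px)

def on_touch(x, y, button=BUTTON):
    """Simulate the tactile feel of a button on flat glass: pulse the
    motor briefly when the touch lands inside the button's bounds."""
    inside = (button["x"] <= x <= button["x"] + button["w"]
              and button["y"] <= y <= button["y"] + button["h"])
    return vibrate(20) if inside else None
```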
How do haptics work?
• Different technologies are used to give sensations that feel like solid
objects and resistance. Devices apply force, pressure or resistance by
using electric actuators, pneumatics and hydraulics.
• For example, gamepads use electric motors to produce force-feedback
vibrations. What is more interesting, some data gloves both track hand
motion and use air bladders to harden and restrict your grip, so you
can feel an object in virtual reality.
• Recently, Facebook CEO Mark Zuckerberg revealed a new prototype for VR
gloves, and companies such as Manus VR and the makers of the Dexmo
exoskeleton glove are working on delivering gloves that would work with
available VR headsets.
• Haptic suits are a high-end technology. “Suit up to feel every
explosion, gunshot, and sword fight in VR,” NullSpace writes on its VR
suit’s Kickstarter page. The NullSpace suit relies on tracking from
systems like the Vive and Oculus Rift to figure out where your limbs
and torso are in relation to the headset.
Haptic interfaces are divided into two main categories:
• Force feedback
• Tactile feedback
• Human part (left): controls the position of the hand.
• Machine part (right): exerts forces on the hand to simulate contact
with a virtual object.
• Both systems are provided with the necessary sensors, processors and
actuators. In the human system, nerve receptors perform sensing, the
brain performs processing, and muscles actuate the motion performed by
the hand; in the machine system, these functions are performed by
encoders, a computer and motors respectively.
Haptic Concepts
• Tactile cues include textures, vibrations and bumps; kinesthetic cues
include weight and impact. In the following section, we present some
crucial concepts and terminology related to haptics:
• Haptic: Haptic is the science of applying tactile, kinesthetic, or both
sensations to human–computer interactions. It refers to the ability of
sensing and/or manipulating objects in a natural or synthetic
environment using a haptic interface.
• Cutaneous: Relates to or involving the skin. It includes sensations of
pressure, temperature, and pain.
• Tactile: Pertaining to the cutaneous sense, but more specifically the
sensation of pressure rather than temperature or pain.
• Kinesthetic: Relates to the feeling of motion. It is related to
sensations originating in muscles, tendons, and joints.
• Force Feedback: Relates to the mechanical production of information
that can be sensed by the human kinesthetic system.
• Haptics or Haptic Technology: An emerging interdisciplinary field that
deals with the understanding of human touch (human haptics), motor
characteristics (machine haptics), and with the development of
computer-controlled systems (computer haptics) that allow physical
interactions with real or virtual environments through touch.
• Haptic Communication: The means by which humans and machines
communicate via touch. It mostly concerns networking issues.
Haptic devices
• Haptic devices (or haptic interfaces) are mechanical devices that act
as mediators in the communication between the user and the computer.
• Haptic devices allow users to touch, feel and manipulate three-dimensional
objects in virtual environments and tele-operated systems.
• Typically, a haptics system includes:
• Sensor(s)
• Actuator (motor) control circuitry
• One or more actuators that either vibrate or exert force
• Real-time algorithms (actuator control software, which we call a
“player”) and a haptic effect library
• Application programming interface (API), and often a haptic effect
authoring tool
• The Immersion API is used to program calls to the actuator into your product’s
operating system (OS).
Haptic Interface
• This consists of a haptic device and software-based computer control
mechanisms. It enables human–machine communication through the
sense of touch.
• By using a haptic interface, someone can not only feed information to
the computer but can also receive information or feedback from the
computer in the form of a physical sensation on some part of the body.
• Haptic Perception: This is the process of perceiving the
characteristics of objects through touch.
Haptic Rendering
• This is the process of calculating the sense of touch, especially force. It
involves sampling the position sensors at the haptic device to obtain the
user’s position within the virtual environment.
• The position information received is used to check whether there are any
collisions between the user and any objects in the virtual environment.
• If a collision is detected, the haptic rendering module will compute
the appropriate feedback forces that will finally be applied onto the
user through the actuators.
• Haptic rendering is, therefore, a system that consists of three parts:
a collision detection algorithm, a collision response algorithm, and a
control algorithm.
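A minimal single-axis sketch of those three parts, for a virtual wall: collision detection is a penetration test, collision response is a penalty (spring) force proportional to penetration depth, and the control step simply clamps the output. The stiffness and force-limit values are assumed for illustration, not real device parameters.

```python
STIFFNESS = 300.0   # N/m, assumed spring constant for the virtual wall
MAX_FORCE = 10.0    # N, assumed actuator limit used by the control step

def render_force(probe_x, wall_x=0.0):
    """One haptic-rendering step along a single axis.
    Collision detection: has the probe penetrated the wall at wall_x?
    Collision response: penalty (spring) force proportional to depth.
    Control: clamp the command to what the actuator can safely output."""
    penetration = wall_x - probe_x        # positive once the probe is inside
    if penetration <= 0.0:
        return 0.0                        # no collision, no feedback force
    force = STIFFNESS * penetration       # spring pushes the probe back out
    return min(force, MAX_FORCE)

# 5 mm of penetration yields a 1.5 N restoring force; outside the wall, none.
```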
Sensors and Actuators:
• A sensor is responsible for sensing the haptic information exerted by
the user on a certain object and sending these force readings to the
haptic rendering module.
• The actuator will read the haptic data sent by the haptic rendering
module and transform this information into a form perceivable by
human beings.
Tele-Haptics
• This is the science of transmitting haptic sensations from a remote
explored object/environment, using a network such as the Internet,
to a human operator. In other words, it is an extension of human
touching sensation/capability beyond physical distance limits.
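At its core this means streaming force samples across a network. Below is a minimal sketch over UDP loopback, where the three-double packet format and the use of UDP are assumptions for illustration only:

```python
import socket
import struct

def send_force_sample(sock, addr, fx, fy, fz):
    """Pack a 3-axis force sample (newtons) as three doubles and send it."""
    sock.sendto(struct.pack("!ddd", fx, fy, fz), addr)

def recv_force_sample(sock):
    """Receive and unpack one force sample."""
    data, _ = sock.recvfrom(24)
    return struct.unpack("!ddd", data)

# Loopback demo: the 'remote' side senses a force and transmits it; the
# 'operator' side receives it for reproduction by a local haptic device.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_force_sample(tx, rx.getsockname(), 0.0, 1.5, -0.2)
sample = recv_force_sample(rx)   # (0.0, 1.5, -0.2)
tx.close()
rx.close()
```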
Tele-Presence:
• This is the situation of sensing sufficient information about the
remote task environment and communicating this to the human
operator in a way that is sufficient for the operator to feel physically
present at the remote site.
• The user’s voice, movements, actions, etc. may be sensed,
transmitted, and duplicated in the remote location. Information may
be traveling in both directions between the user and the remote
location.