
Dr. Elia Gatti

UCL

elia.gatti@ucl.ac.uk

1
Introductions
Elia Gatti

• BSc Psychology
• MSc Neuroscience
• MSc Robotics
• PhD Design
• Post doc HCI
• Research scientist – Meta
• UX Researcher – Alexa
• Behavioral scientist – Alexa TTS

Simple introduction. For more info please contact me directly at elia.gatti@ucl.ac.uk

2
• Haptic stuff
• Haptic perception
• Haptic displays
• Vibrotactile
• Temperature
• Pressure/stroking
• Force feedback
• Impedance control
• Admittance control

• Beyond the haptic stuff
• Integration with vision (VR yeah yeah)
• Integration with audio (They are basically cousins)
• Integration with taste (Kinda makes sense no?)
• Integration with smell (Say whaaaaat?)
• Haptic and cognition
• Haptic illusions
• Haptic and emotions
• Haptic and communication
• Embodiment

Simply the agenda of the lecture

3
4
[Figure: objects illustrating haptic qualities – “Wet + Smooth + Cold + Light”, “Sticky”, “Hard + Solid + Stable”, “Wet + Squishy”]

Haptic perception allows for a variety of sensations

5
Perception is mediated by receptors in the muscles and the skin. Information
collected by these receptors is integrated by our brain, mostly in the parietal
cortex, in what is called the somatosensory area.

6
TOUCH

We usually refer to the sensations coming from skin receptors as touch. We have
multiple receptor types in the skin (left figure). Traditionally, the four most studied
tactile receptors in the skin are those depicted in the right figure. They are situated at
different depths in the skin. Their positioning and morphological characteristics give
them specific properties, namely the size of their receptive field and their transient
response to the presence of a stimulus (adaptation behavior). More specifically, the
deeper the receptor sits in the skin, the larger the area of skin that, if stimulated, will
activate it. In terms of adaptation, Meissner and Pacinian corpuscles only respond
when a stimulus is applied or lifted (fast adapting). Merkel and Ruffini endings will
constantly send signals to the brain as long as a stimulus is applied within their
receptive fields (slow adapting).
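As a compact way to remember these four receptor types, here is a small lookup table in code (a sketch; the depth and field-size entries are qualitative labels, not measurements):

```python
# The four classical cutaneous mechanoreceptors and their key properties.
# Deeper receptors have larger receptive fields; fast-adapting receptors
# fire only at stimulus onset/offset, slow-adapting ones fire continuously.
RECEPTORS = {
    "Merkel":   {"depth": "shallow", "field": "small", "adaptation": "slow"},
    "Meissner": {"depth": "shallow", "field": "small", "adaptation": "fast"},
    "Ruffini":  {"depth": "deep",    "field": "large", "adaptation": "slow"},
    "Pacinian": {"depth": "deep",    "field": "large", "adaptation": "fast"},
}

def responds_to_sustained_pressure(name: str) -> bool:
    """Slow-adapting receptors keep firing while a stimulus is held."""
    return RECEPTORS[name]["adaptation"] == "slow"

print(responds_to_sustained_pressure("Merkel"))    # True
print(responds_to_sustained_pressure("Pacinian"))  # False
```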

7
PROPRIOCEPTION

We call the sensations coming from our muscles proprioception. Proprioception
allows us to perceive the work of our muscles and the orientation of our joints. In
other words, it allows us to perceive forces and displacements. There are two main
receptors for proprioception: muscle spindles and Golgi tendon organs. The first are
nerve fibers wrapped around muscle fibers; when the muscle stretches, the fibers
respond and send signals to the brain. The second type of receptor (Golgi) wraps
around tendons. They sense tendon stretch, providing the brain with information
about the configuration of the joints in our body.

8
[Figure: objects illustrating haptic qualities – “Wet + Smooth + Cold + Light”, “Sticky”, “Hard + Solid + Stable”, “Wet + Squishy”]

Back to my niece, let’s admire the amazing complexity and variety of the haptic sense

9
10
• Haptic stuff
• Haptic perception
• Haptic displays
• Vibrotactile: Rough/Smooth
• Temperature: Hot/Cold, Wet/Dry
• Force feedback: Hard/Soft
• Impedance control
• Admittance control

https://haptipedia.org

Haptic displays allow us to artificially re-create the sensations discussed so far

11
• Vibrotactile actuators
• Linear Electromagnetic Actuators (LEA)
• Linear Resonant Actuators (LRA)
• Rotary Electromagnetic Actuators (REA)
• Piezoelectric actuators

Martínez, J., García, A., & Oliver, M. (2016). Identifying virtual 3D
geometric shapes with a vibrotactile glove. IEEE Computer Graphics and
Applications, 36(1), 42-51.

Vibrotactile displays deliver vibration to the user. They are very flexible, and creative
engineers can use vibrotactile displays even to signal the presence of objects and
shapes in VR.

12
Vibrotactile displays
• Linear Resonant Actuators (LRA)

One of the most common vibrotactile actuators is the LRA. In an LRA, a magnetic
field is generated by the voice coil, and this field interacts with a magnetic mass
suspended on a spring. As the magnetic field varies with the applied drive signal,
the magnet and mass move up and down as they interact with the spring. Those
familiar with vibration, RF, audio engineering, or mechanical systems will quickly
spot that attaching a mass to a spring causes a resonance effect. The combination
of spring stiffness, mass, and magnet/coil size gives the linear vibrator a natural
resonant frequency.
Typically, for Y-axis vibrators, resonant frequencies are around 175 – 235 Hz.
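The resonance described above is the standard mass-on-a-spring formula, f = (1/2π)·√(k/m). A quick sketch; the stiffness and mass values below are hypothetical, chosen only so the result lands in the typical LRA band:

```python
import math

def lra_resonant_frequency(stiffness_n_per_m: float, mass_kg: float) -> float:
    """Natural frequency (Hz) of a mass on a spring: f = (1/2pi) * sqrt(k/m)."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)

# Hypothetical moving mass of 1 g on a ~1.5 kN/m spring:
f = lra_resonant_frequency(1500.0, 0.001)
print(round(f))  # ~195 Hz, inside the typical 175-235 Hz band
```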

13
• Rotary Electromagnetic Actuators (REA)

REAs, usually called eccentric rotating mass (ERM) motors, are small motors with an
off-center load, driven by a differential plus/minus DC voltage. As the motor starts
to spin, the entire assembly vibrates due to the eccentric mass and the effect of
centripetal force. (There’s some irony here, as motor designers and fabricators
usually devote extra effort to getting a motor balanced as perfectly as possible so
that it doesn’t vibrate!) Reversing the drive voltage acts to brake the motor. ERM DC
motors are tiny, with diameters up to about 6 mm (although some are slightly
larger). They normally operate from a 1.5 V to 3 V supply, compatible with one or
two batteries.
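The vibration strength of such a motor can be sketched with the centripetal-force formula F = m·r·ω². The mass, offset, and speed below are hypothetical illustration values, not specs from any real motor:

```python
import math

def erm_force_amplitude(mass_kg: float, radius_m: float, speed_rpm: float) -> float:
    """Peak force from a rotating eccentric mass: F = m * r * omega^2,
    with omega the rotation speed in rad/s."""
    omega = speed_rpm / 60.0 * 2 * math.pi
    return mass_kg * radius_m * omega ** 2

# Hypothetical ERM: 0.5 g eccentric mass, 1 mm offset, spinning at 9000 RPM.
print(round(erm_force_amplitude(0.0005, 0.001, 9000), 3))  # ~0.444 N
```

Note how the force grows with the square of the speed, which is why ERM intensity and frequency cannot be controlled independently.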

14
• Piezoelectric Actuators (Piezo)

Piezoelectric actuators are less known and used than either LRA or ERM
devices, mainly due to the challenges of driving them. They are based on the well-
known, widely used piezoelectric effect. A piezo actuator can be built as
a stack of piezo elements, but is more commonly constructed as a
cantilevered “bender,” which offers more displacement but less force.
When a high-voltage signal is applied, the beam bends, creating the
desired motion at the tip of the beam.

15
• Texture perception

Asano, S., Okamoto, S., & Yamada, Y. (2014). Vibrotactile stimulation to increase and decrease texture roughness. IEEE Transactions on
Human-Machine Systems, 45(3), 393-398.

What is Vibration used for? Well, apart from vibrating for the sake of it (e.g. phones)
we can use vibration to replicate textures, like in this paper from 2014

16
• Texture perception

We are very good at perceiving vibrations. The table on the right shows how every
receptor in our skin has a preferential frequency band for detecting different vibrations.
Why would we need such precision? One theory is rooted in evolution. As humans,
one of the biggest advantages we have over other species is our manual dexterity. But
manual dexterity is pretty useless if we handle objects with insufficient grip. Thus, the
perception of vibration (that is, the perception of texture) is extremely important to
make the most of our manual dexterity and gain a competitive advantage over other
species. It is not unlikely that individuals with good roughness perception were also
better at manipulating tools, making them more fit to the environment than their
conspecifics, and thus favored by natural selection.

17
• Water based actuators
• Thermoelectric actuators

Let’s talk about temperature displays

18
• Water based actuators

Guo, X., Zhang, Y., Wei, W., Xu, W., & Wang, D. (2020). ThermalTex: A Two-Modal Tactile Display for Delivering Surface Texture and Thermal Information. In Haptics:
Science, Technology, Applications: 12th International Conference, EuroHaptics 2020, Leiden, The Netherlands, September 6–9, 2020, Proceedings 12 (pp. 288-296).
Springer International Publishing.

Using cold and warm water is one way to deliver temperature sensations to
participants’/users’ skin. Read the paper cited in the slide for an interesting example.

19
• Thermoelectric actuators

Using the thermoelectric effect is, however, the most used way to deliver cold/hot
sensations. Thermoelectric cooling uses the Peltier effect to create a heat flux
at the junction of two different types of materials. A Peltier cooler, heater, or
thermoelectric heat pump is a solid-state active heat-pump which transfers
heat from one side of the device to the other, with consumption of electrical
energy depending on the direction of the current. Such an instrument is also
called a Peltier device, Peltier heat pump, solid state refrigerator,
or thermoelectric cooler (TEC) and occasionally a thermoelectric battery. It can
be used either for heating or for cooling, although in practice the main
application is cooling. It can also be used as a temperature controller that
either heats or cools.
The primary advantages of a Peltier cooler compared to a vapor-
compression refrigerator are its lack of moving parts or circulating liquid, very
long life, invulnerability to leaks, small size, and flexible shape. Its main
disadvantages are high cost for a given cooling capacity and poor power
efficiency. Many researchers and companies are trying to develop Peltier
coolers that are cheap and efficient.
A Peltier cooler can also be used as a thermoelectric generator. When
operated as a cooler, a voltage is applied across the device, and as a result, a
difference in temperature will build up between the two sides. When operated
as a generator, one side of the device is heated to a temperature greater than
the other side, and as a result, a difference in voltage will build up between the
two sides. However, a well-designed Peltier cooler will be a mediocre
thermoelectric generator and vice versa, due to different design and packaging
requirements.
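The heat pumped at the cold side of a Peltier module is often approximated with a standard textbook model, Qc = S·I·Tc − 0.5·I²·R − K·ΔT: Peltier pumping, minus Joule heating (half of which reaches the cold side), minus heat conducted back from the hot side. A sketch with hypothetical module parameters:

```python
def tec_cooling_power(seebeck_v_per_k: float, current_a: float, t_cold_k: float,
                      resistance_ohm: float, conductance_w_per_k: float,
                      delta_t_k: float) -> float:
    """Heat absorbed at the cold side of a Peltier module (simple model):
    Qc = S*I*Tc - 0.5*I^2*R - K*dT."""
    return (seebeck_v_per_k * current_a * t_cold_k
            - 0.5 * current_a ** 2 * resistance_ohm
            - conductance_w_per_k * delta_t_k)

# Hypothetical module: S = 0.05 V/K, I = 2 A, cold side at 290 K,
# R = 2 ohm, K = 0.5 W/K, 10 K across the module.
print(tec_cooling_power(0.05, 2.0, 290.0, 2.0, 0.5, 10.0))  # 29 - 4 - 5 = 20.0 W
```

The model also shows the poor efficiency mentioned above: pushing more current pumps more heat linearly but wastes Joule heat quadratically.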

20
Filingeri, D., & Havenith, G. (2015). Human skin wetness perception:
psychophysical and neurophysiological bases. Temperature, 2(1),
86-104.

One cool thing about changing skin temperature is that it allows us to create the
illusion of wet skin. Humans do not have a dedicated receptor for wetness.
Instead, our brains compute the relative heat dispersion on the skin, and we use
that to decide whether what we are touching is wet or dry. Please have a look at
the paper cited in the slide for more information.

21
• Impedance control
• Admittance control

Force feedback devices allow us to touch the untouchable, to feel virtual objects as
if they were real.

22
• Impedance control

Park, H., & Lee, J. (2004). Adaptive impedance control of a haptic interface. Mechatronics, 14(3),
237-253.

Impedance control in haptics refers to the ability to control the impedance or


resistance of a haptic device, such that it behaves like a physical object with mass,
stiffness and damping. This is done to enhance the sense of touch, force, and
resistance in virtual environments, making the interaction with virtual objects feel
more realistic. The impedance of the device can be adjusted dynamically in real-time,
allowing for a wide range of tactile feedback experiences.
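A minimal sketch of the idea, assuming a hypothetical one-degree-of-freedom device rendering a virtual wall as a spring-damper (the gains are illustrative, not from any specific device):

```python
def impedance_force(pos_m: float, vel_m_s: float, wall_pos_m: float = 0.0,
                    stiffness: float = 500.0, damping: float = 5.0) -> float:
    """Impedance control sketch: read the device's motion, output a force.
    Renders a virtual wall at wall_pos_m as a spring-damper:
    F = -k * penetration - b * velocity (only while penetrating)."""
    penetration = pos_m - wall_pos_m
    if penetration <= 0.0:   # outside the wall: free space, no force
        return 0.0
    return -stiffness * penetration - damping * vel_m_s

print(impedance_force(0.01, 0.0))   # 1 cm into the wall -> -5.0 N push-back
print(impedance_force(-0.01, 0.2))  # in free space -> 0.0
```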

23
• Admittance control

Van der Linde, R. Q., Lammertse, P., Frederiksen, E., & Ruiter, B. (2002, July). The HapticMaster, a new high-
performance haptic interface. In Proc. Eurohaptics (pp. 1-5). Edinburgh University

Admittance control in haptics refers to the control of the motion of a haptic device by
adjusting its impedance such that it follows the motion of a virtual object in a virtual
environment. Admittance control is used to create a realistic, compliant response
from the haptic device, which allows users to physically interact with virtual objects in
a natural way. The admittance of the device is calculated in real-time based on the
motion of the virtual object and the impedance of the haptic device, and the
impedance of the device is adjusted accordingly to produce a compliant response.

Impedance and admittance control in haptics are two different control techniques
used to enhance the realism of haptic feedback in virtual environments.
Impedance control focuses on the impedance or resistance of the haptic device, with
the goal of making it behave like a physical object with mass, stiffness, and damping.
This allows for realistic representation of touch, force, and resistance in virtual
environments.
Admittance control, on the other hand, focuses on the motion of the haptic device
and how it responds to the motion of virtual objects in a virtual environment. The
impedance of the device is adjusted dynamically in real-time such that it follows the
motion of the virtual object, creating a compliant and natural response.
In summary, impedance control is concerned with the resistance of the haptic device,
while admittance control is concerned with the motion of the device and its response
to virtual objects.
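The contrast can be sketched in a few lines: where the impedance controller reads motion and outputs force, an admittance controller reads force and outputs motion by simulating a virtual mass-damper. All parameters below are hypothetical:

```python
def admittance_step(force_n: float, vel_m_s: float, pos_m: float, dt: float,
                    virtual_mass: float = 2.0, virtual_damping: float = 10.0):
    """Admittance control sketch: read the measured force, output motion.
    Integrates a virtual mass-damper M*a + B*v = F one Euler step,
    returning the velocity and position the device should track."""
    accel = (force_n - virtual_damping * vel_m_s) / virtual_mass
    vel = vel_m_s + accel * dt
    pos = pos_m + vel * dt
    return vel, pos

# Push with 4 N on a device at rest: it starts to move compliantly.
v, p = admittance_step(4.0, 0.0, 0.0, dt=0.001)
print(v, p)  # 0.002 m/s, 2e-06 m
```

This inversion of inputs and outputs is why admittance devices (like the HapticMaster) excel at rendering stiff, heavy objects, while impedance devices excel at free space.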

24
• Grounded vs Ungrounded devices

"Grounded haptic device" refers to the way the device responds to the force applied
to it and how it dissipates that force, rather than anything to do with electrical
charge. A grounded haptic device is usually anchored to the floor or a table surface. It
provides force feedback by "pushing back" from the ground. An ungrounded device,
usually wearable, is not anchored to the floor, and forces are not dissipated to the
ground.

As an exercise, try to think of interactions made possible only by grounded or only by
ungrounded devices.

25
• Soft robotics

Zhu, M., Biswas, S., Dinulescu, S. I., Kastor, N., Hawkes, E. W., & Visell, Y. (2022). Soft, wearable
robotics and haptics: Technologies, trends, and emerging applications. Proceedings of the IEEE, 110(2),
246-272.

Soft robotics is a rapidly growing field that combines soft materials, such as silicone,
elastomers, and fabrics, with embedded electronics and actuators to create robots
that are flexible and can change their shape. In haptics, soft robotics technology is
used to create haptic devices that can produce a wide range of realistic and engaging
tactile feedback sensations.
Soft robotic haptic devices can provide a more natural and intuitive form of haptic
feedback, as they can conform to the shape of the user's skin, produce a wide range
of vibrations, and respond dynamically to user interactions. They can also be made to
be wearable and portable, making them suitable for use in virtual reality and
augmented reality applications.
Examples of soft robotic haptic devices include wearable gloves and sleeves that
provide touch feedback, and haptic displays that use soft materials to produce
pressure sensations on the skin. In some cases, soft robotics technology is combined
with impedance and admittance control techniques to create haptic devices that
offer highly realistic and engaging tactile feedback experiences.

26
• Ultrasound technology

Carter, T., Seah, S. A., Long, B., Drinkwater, B., & Subramanian, S. (2013, October).
UltraHaptics: multi-point mid-air haptic feedback for touch surfaces. In Proceedings of
the 26th annual ACM symposium on User interface software and technology (pp. 505-
514).

Ultrasound technology is used in haptics to produce touch sensations by using high-


frequency sound waves to vibrate the skin. This technology works by generating high-
frequency (typically 20-40 kHz) sound waves that travel through the air and are
converted into pressure waves when they reach the skin. The resulting pressure
waves produce a vibration sensation on the skin that can be used to create a wide
range of touch sensations, including pressure, texture, and temperature.
Ultrasound haptic technology can be used in a variety of applications, including
virtual reality, augmented reality, and gaming. For example, ultrasound technology
can be used to create haptic feedback for touchscreens, allowing users to feel virtual
buttons, textures, and other touch-based interactions. It can also be used in wearable
devices, such as smartwatches and fitness trackers, to provide haptic feedback for
notifications and other events.
One advantage of ultrasound haptic technology is that it is non-contact, so it does not
require physical contact between the haptic device and the user's skin. This makes it
suitable for use in applications where physical contact is not desirable, such as in
medical or hygiene-sensitive environments. Additionally, ultrasound haptic
technology is flexible and scalable, allowing it to be used in a wide range of
applications and devices.
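The focusing behind mid-air ultrasound haptics relies on triggering each transducer with a delay chosen so that all wavefronts arrive at the focal point in phase. A toy sketch of that principle; the array geometry is made up for illustration:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air

def focus_delays(transducer_xy, focal_point_xyz):
    """Per-transducer trigger delays (s) so that every wavefront arrives at
    the focal point simultaneously. Transducers lie in the z = 0 plane.
    delay_i = (d_max - d_i) / c : farther elements fire first."""
    fx, fy, fz = focal_point_xyz
    dists = [math.dist((x, y, 0.0), (fx, fy, fz)) for x, y in transducer_xy]
    d_max = max(dists)
    return [(d_max - d) / SPEED_OF_SOUND for d in dists]

# 3-element toy array focusing 20 cm above its centre element:
delays = focus_delays([(-0.05, 0.0), (0.0, 0.0), (0.05, 0.0)], (0.0, 0.0, 0.2))
print([round(d * 1e6, 2) for d in delays])  # microseconds; centre waits longest
```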

27
28
Sensory fusion Crossmodal correspondences

When delivering sensory stimuli to the brain through different modalities, we can
think of multisensory integration in two ways. The first, “hard” one corresponds to the
computations the brain makes to mix information coming from different sources. The
second, “softer” way refers to semantic connections between senses, called
“crossmodal correspondences”.

29
Multisensory integration

Object length

Let’s start with the “hard” way. Imagine you estimate the length of an object by sight.
You do it repeatedly and, because perception is a stochastic process and our eyes are
(slightly) noisy sensors, every time your estimation is a little bit different

30
Multisensory integration

Object length

You can do the same length estimation using only the haptic sense. In this case, your
estimate of length will be noisier, as haptics is noisier than vision for this
particular task. The repeated estimates will distribute like a normal distribution. In
this view, the reliability of a sense is inversely proportional to the variance of that
distribution.

31
Multisensory integration

Ernst, M. O., & Banks, M. S. (2002). Humans integrate visual and haptic information in
a statistically optimal fashion. Nature, 415(6870), 429-433.

Let’s suppose that in VR I shift the position of the visual rendering of a real haptic object,
as in the figure. Users can touch the object in the real world and see its virtual
rendering in VR (this is called passive haptics, BTW!!!). I ask participants to estimate
the length of the object. Data show that participants doing such a task will optimally
integrate the sensory information as described in the picture.
S(hat) = estimated length; the optimal estimate is S(hat) = Σ_i w_i · S(hat)_i
i = sensory modality index (e.g. vision or touch)
w_i = weight per sensory modality, w_i = r_i / Σ_j r_j
r_i = normalized reliability of a given modality (the inverse of its estimate variance)
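The optimal integration rule can be sketched directly in code: each cue is weighted by its normalized reliability (inverse variance), and the fused estimate is the weighted sum. The numbers below are illustrative:

```python
def fuse_estimates(estimates, variances):
    """Maximum-likelihood cue combination (Ernst & Banks, 2002):
    reliability r_i = 1/sigma_i^2, weight w_i = r_i / sum_j r_j,
    fused estimate S = sum_i w_i * S_i."""
    reliabilities = [1.0 / v for v in variances]
    total = sum(reliabilities)
    weights = [r / total for r in reliabilities]
    fused = sum(w * s for w, s in zip(weights, estimates))
    fused_variance = 1.0 / total  # always lower than either cue's variance
    return fused, fused_variance

# Vision says 10 cm (variance 1), touch says 12 cm (variance 4):
s, var = fuse_estimates([10.0, 12.0], [1.0, 4.0])
print(s, var)  # 10.4 0.8 -- the fused estimate sits nearer the reliable visual cue
```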

32
Multisensory integration

Ernst, M. O., & Banks, M. S. (2002). Humans integrate visual and haptic information in
a statistically optimal fashion. Nature, 415(6870), 429-433.

This integration does not necessarily need multiple sensory modalities. One can
integrate single-modality estimates over time, making the previous estimates a
“prior” for the estimation. This is very Bayesian!!!
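Treating the previous estimate as a Gaussian prior and each new estimate as a noisy observation gives the standard conjugate Gaussian update, a sketch with made-up numbers:

```python
def bayes_update(prior_mean: float, prior_var: float,
                 obs_mean: float, obs_var: float):
    """Conjugate Gaussian update: combine a prior belief with a new noisy
    observation. A precision-weighted average, exactly as in cue fusion,
    but applied to successive estimates over time."""
    gain = prior_var / (prior_var + obs_var)   # Kalman-style gain
    post_mean = prior_mean + gain * (obs_mean - prior_mean)
    post_var = (1.0 - gain) * prior_var
    return post_mean, post_var

# Repeated noisy length estimates sharpen the belief step by step.
mean, var = 10.0, 4.0            # initial prior
for obs in [11.0, 10.5, 10.8]:
    mean, var = bayes_update(mean, var, obs, obs_var=2.0)
print(round(mean, 2), round(var, 2))  # variance shrinks with every estimate
```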

33
Multisensory integration

Ernst, M. O. (2006). A Bayesian View on Multimodal Integration Cue. Human body


perception from the inside out, 105.

We can use priors to describe “expected” agreement between sensory modalities!


Refer to the paper in the slide for a comprehensive (and life changing) read about
Bayesian integration of multimodal/multisensory cues

34
Sensory fusion Crossmodal correspondences
Bayesian multisensory integration
Ideal observer analysis

35
“KIKI” “BOUBA”

RELAXED JUMPY

Spence, C. (2011). Crossmodal correspondences: A tutorial review. Attention, Perception, &


Psychophysics, 73, 971-995.

Crossmodal correspondences refer to the systematic relationships that exist between


different sensory modalities, such as sight, sound, touch, taste, and smell. These
correspondences describe how information from one modality can influence our
perception of stimuli from another modality. For example, a bright light might be
associated with a loud sound, or a rough texture with a bitter taste.
Crossmodal correspondences have been the subject of extensive research in the
fields of psychology, neuroscience, and human-computer interaction. The study of
crossmodal correspondences has revealed that our sensory systems are highly
interconnected and that our experiences in one modality can shape our perceptions
in another modality.
In haptic design, crossmodal correspondences can be used to create more engaging
and immersive experiences. For example, by using crossmodal correspondences,
designers can enhance the realism of virtual environments by creating more
believable touch and sound experiences that are consistent with what users would
expect in the real world. Additionally, crossmodal correspondences can be used to
create new forms of multimodal feedback that provide more information about the
state of a system or the environment, making it easier for users to understand and
interact with their surroundings.

36
Haptic crossmodal correspondences transpiring from English language:

Let’s look for haptic related crossmodal correspondences in the English language!

37
Sensory fusion Crossmodal correspondences

Summary of the ways we can talk about multisensory integration

38
Physical hand
Virtual hand

λ =1.25
λ = 0.75

Samad, M., Gatti, E., Hermes, A., Benko, H., & Parise, C. (2019, May). Pseudo-haptic weight: Changing the perceived weight of virtual objects by
manipulating control-display ratio. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1-13).

Have a look at the paper in the slide. Here is the paper’s abstract for your
convenience:

In virtual reality, the lack of kinesthetic feedback often prevents users from
experiencing the weight of virtual objects. Control-to-display (C/D) ratio
manipulation has been proposed as a method to induce weight
perception without kinesthetic feedback. Based on the fact that lighter
(heavier) objects are easier (harder) to move, this method induces an
illusory perception of weight by manipulating the rendered position of
users' hands---increasing or decreasing their displayed movements. In a
series of experiments we demonstrate that C/D-ratio induces a genuine
perception of weight, while preserving ownership over the virtual hand.
This means that such a manipulation can be easily introduced in current
VR experiences without disrupting the sense of presence. We discuss
these findings in terms of estimation of physical work needed to lift an
object. Our findings provide the first quantification of the range of C/D-
ratio that can be used to simulate weight in virtual reality.

39
W = F · d → MORE WEIGHT!!

Samad, M., Gatti, E., Hermes, A., Benko, H., & Parise, C. (2019, May). Pseudo-haptic weight: Changing the perceived weight of virtual objects by
manipulating control-display ratio. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1-13).
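The C/D-ratio manipulation itself is tiny to implement: the displayed hand displacement is the real displacement scaled by λ, so with λ < 1 the user must physically move farther (more work W = F·d) for the same displayed lift, which is perceived as extra weight. A sketch, using the λ values from the slide:

```python
def rendered_hand_position(real_displacement_m: float, cd_ratio: float) -> float:
    """Control-to-display manipulation: the virtual hand moves cd_ratio
    times the real hand's displacement. lambda < 1 feels heavier
    (more physical movement per displayed movement), lambda > 1 lighter."""
    return real_displacement_m * cd_ratio

# Real 10 cm lift, rendered under the two slide conditions:
print(round(rendered_hand_position(0.10, 0.75), 3))  # 0.075 m shown -> heavier
print(round(rendered_hand_position(0.10, 1.25), 3))  # 0.125 m shown -> lighter
```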

40
Yau, J. M., Olenczak, J. B., Dammann, J. F., & Bensmaia, S. J. (2009). Temporal
frequency channels are linked across audition and touch. Current biology, 19(7), 561-566

Have a look at the paper in the slide. Here is the paper’s abstract for your
convenience:

Temporal frequency is a fundamental sensory dimension in audition and touch. In


audition, analysis of temporal frequency is necessary for speech and music
perception; in touch, the spectral analysis of vibratory signals has been implicated in
texture perception and in sensing the environment through tools. Environmental
oscillations impinging upon the ear are generally thought to be processed
independently of oscillations impinging upon the skin. Here, we show that frequency
channels are perceptually linked across audition and touch. In a series of
psychophysical experiments, we demonstrate that auditory stimuli interfere with
tactile frequency perception in a systematic manner. Specifically, performance on a
tactile-frequency-discrimination task is impaired when an auditory distractor is
presented with the tactile stimuli, but only if the frequencies of the auditory and
tactile stimuli are similar. The frequency-dependent interference effect is observed
whether the distractors are pure tones or band-pass noise, so an auditory percept of
pitch is not required for the effect to be produced. Importantly, distractors that
strongly impair frequency discrimination do not interfere with judgments of tactile
intensity. This surprisingly specific crosstalk between different modalities reflects the
importance of supramodal representations of fundamental sensory dimensions.

41
Parise, C. V., Rahman, S., Gatti, E., & Samad, M. J. (2020). U.S. Patent No. 10,754,428. Washington, DC:
U.S. Patent and Trademark Office

The psychophysical relationship between vibration and pure tones (psychophysics is
the name of this branch of computational neuroscience) has been used to develop a
very useful algorithm to augment/highlight specific sounds in AR.

42
Stieger, M. (2011). Texture-taste interactions: Enhancement of taste intensity by structural
modifications of the food matrix. Procedia Food Science, 1, 521-527.

Have a look at the paper in the slide. Here is the paper’s abstract for your
convenience:

The reduction of salt and sugar in food products remains a challenge due to the
importance of those ingredients in providing a highly desired taste quality, enhancing
flavor, determining the behavior of structuring ingredients, and ensuring
microbiological safety. Several technologies have been used to reduce salt and sugar
content in foods such as replacement of sugar by sweeteners, replacement of sodium
salts by blends of other salts, taste enhancement by aromas and taste boosters or
gradual reduction of sugar and salt in small steps over time. In this study we present
two alternative approaches to enhance taste perception. First, the use of an
inhomogeneous spatial distribution of sugar in food gels is introduced as a way to
enhance sweetness perception [1]. The translation of the concept of taste contrast to
bread applications is discussed which allows to reduce salt content in bread by 25%
without loss of saltiness intensity and without addition of taste enhancers, aromas or
salt replacers [2]. Secondly, it is demonstrated how the serum release under
compression of mixed polysaccharide/protein gels can be engineered to enhance
sweetness perception. An increase of serum release by 5x allowed to reduce sugar
content of gels by 30% while maintaining sweet taste intensity [3]. The translation of
this concept to low salt sausages is discussed. Sausages were engineered to exhibit
enhanced juiciness which lead to a boost of saltiness allowing for up to 40% salt
reduction [4]. These approaches can be used to further optimize the development of
products with reduced salt and sugar content while maintaining taste intensity

43
Biggs, L., Juravle, G., & Spence, C. (2016). Haptic exploration of plateware alters the perceived texture and taste of food. Food Quality and
Preference, 50, 129-134

Have a look at the paper in the slide. Here is the paper’s abstract for your
convenience:

We report two naturalistic citizen science experiments designed to highlight the


influence of the texture of plateware on people’s rating of the mouthfeel and taste of
food (specifically, biscuits) sampled from that plateware. In the first experiment,
participants tasted a biscuit from a pair of plates, one having a rough and the other a
smooth finish. In the second experiment, participants tasted biscuits and jelly babies;
participants rated the mouthfeel and taste of the two foodstuffs. The results both
confirm and extend previous findings suggesting that haptically and visually perceived
texture can influence both oral-somatosensory judgments of texture as well as, in this
case, the reported taste or flavour of the food itself. The crossmodal effects reported
here are explained in terms of the notion of sensation transference. These results
have potentially important implications for everything from the design of the tactile
aspects of packaging through to the design of serviceware in the setting of the
restaurant.

44
Gatti, E., Bordegoni, M., & Spence, C. (2014). Investigating the influence of colour, weight, and fragrance intensity on the perception of liquid bath soap: An
experimental study. Food Quality and Preference, 31, 56-64.

Have a look at the paper in the slide. Here is the paper’s abstract for your
convenience:

We report a preliminary experiment designed to investigate people’s product


expectations (for a liquid soap) as a function of its fragrance and packaging. To this
end, a series of soap bottles was produced that were identical in shape but had
different intensities of colouring (white, pink, or red). The weight of the bottles also
varied (either light −350 g, or heavy −450 g). Two different concentrations of perfume
were added to the liquid soap contained in the bottles (either low or high). The
participants evaluated the perceived intensity of the fragrance contained in each
bottle, the perceived weight of each bottle, and the expected efficacy of the soap
itself (that is, the soap’s expected “cleaning ability”). The results revealed a significant
main effect of the colour of the packaging on the perceived intensity of the soap’s
fragrance. Significant effects of the perceived weight of the container on both the
perceived intensity of the fragrance and on the expected efficacy of the soap were
also documented. These results are discussed in terms of the design of multisensory
packaging and containers for liquid body soap and, more generally, for body care and
beauty products.

45
46
Buckingham, G (2014). Getting a grip on heaviness perception: a review of weight illusions and
their probable causes. Experimental Brain Research 232: 1623-1629.

The size-weight illusion refers to a perceptual phenomenon in which objects of the


same weight but different sizes are perceived as having different weights. In other
words, larger objects are often perceived as being heavier than smaller objects of the
same weight. This illusion is thought to be a result of our brain combining information
from different sensory modalities, such as vision and touch, to form our perception of
the weight of objects.
The size-weight illusion has important implications for haptic design, as it can affect
how users perceive the weight and balance of objects in virtual environments. For
example, designers may need to take into account the size-weight illusion when
designing virtual objects to ensure that they are perceived as having the appropriate
weight and balance.
In addition to its practical applications, the size-weight illusion has also been the
subject of extensive research in psychology and neuroscience, as it provides insight
into how our brain integrates information from different sensory modalities to form
our perception of the world. Understanding the size-weight illusion can help
researchers to better understand the underlying mechanisms of sensory processing
and perception.

More info in the paper below:
Weight illusions—where one object feels heavier than an identically weighted
counterpart—have been the focus of many recent scientific investigations. The
most famous of these illusions is the ‘size–weight illusion’, where a small
object feels heavier than an identically weighted, but otherwise similar-
looking, larger object. There are, however, a variety of similar illusions which
can be induced by varying other stimulus properties, such as surface material,
temperature, colour, and even shape. Despite well over 100 years of research,
there is little consensus about the mechanisms underpinning these illusions. In
this review, I will first provide an overview of the weight illusions that have
been described. I will then outline the dominant theories that have emerged
over the past decade for why we consistently misperceive the weights of objects
which vary in size, with a particular focus on the role of lifters’ expectations of
heaviness. Finally, I will discuss the magnitude of the various weight illusions
and suggest how this largely overlooked facet of the topic might resolve some
of the debates surrounding the cause of these misperceptions of heaviness.

47
Dostmohamed, H and Hayward, V (2005). Trajectory of contact region on the fingerpad gives the illusion of haptic shape. Experimental
Brain Research 164(3): 387-394.

Have a look at the paper in the slide. Here is the paper’s abstract for your
convenience:

When one explores a solid object with a fingertip, a contact region is usually
defined. When the trajectory of this region on the fingerpad is artificially
controlled so as to resemble the trajectory that is normally present while
exploring a real object, the experience of shape is created. In order to generate
appropriate local deformation trajectories, we built a servo-controlled
mechanism that rolled a flat plate on the fingerpad during the manual
exploration of virtual surfaces so that the plate was kept tangent to a virtual
shape at the point of virtual contact. An experiment was then designed to test
which mode of exploration maximized the shape information gain: active
versus semi-active exploration, where semi-active exploration is when one
hand touches passively and the other moves the target object, and the use of
single versus multiple points of contact. We found that subjects were able to
perform curvature discrimination at levels comparable to those achieved
when using direct manual contact with real objects, and that the highly
simplified stimulus provided by the device was a sufficient cue to give the
illusion of touching three-dimensional surfaces.

48
Nakatani, M; Sato, A; Tachi, S and Hayward, V (2008). Tactile illusion caused by tangential skin strain and analysis in terms of skin deformation.
In: Proceedings of Eurohaptics, LNCS 5024 (pp. 229-237). Springer-Verlag

Have a look at the paper in the slide. Here is the paper’s abstract for your
convenience:

We describe a new tactile illusion of surface geometry that can be easily
produced with simple materials. When the fingertip skin is strained by loading
it in traction along a narrow band surrounded by two fixed traction surfaces,
the sensation of a raised surface is typically experienced. This and other
analogous cases are discussed in terms of tissue deformation created at a
short distance inside the skin where the target mechanoreceptors are
presumably located. A finite element analysis allowed us to propose that the
basis of this illusion is connected with the observation that normal loading
and tangential loading can create similar strain distribution, thereby creating
an instance of an ambiguous stimulus. In the discussion we relate this
stimulus to several other ambiguous tactile stimuli.

49
50
Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), 1161.
Moors, A., Ellsworth, P. C., Scherer, K. R., & Frijda, N. H. (2013). Appraisal theories of emotion: State of the art and future development. Emotion Review, 5(2), 119-124.

The circumplex model (RIGHT) is a framework used to describe the
relationships between different affective (emotional) states and to analyze patterns
of emotional experiences and expression. The circumplex model represents emotions
as points on a two-dimensional plane, with each point representing a specific
emotion or affective state.
The two dimensions of the circumplex model are typically defined as valence (the
degree to which an emotion is positive or negative) and arousal (the degree to which
an emotion is energetic or calming). Emotions are represented as points on the plane
that vary in their level of valence and arousal, and can be thought of as falling along a
circular continuum that ranges from highly positive and energetic emotions (such as
excitement and joy) to highly negative and calming emotions (such as sadness and
boredom). The circumplex model is widely used in psychology and neuroscience to study
the relationship between emotions, moods, and personality traits, as well as to
understand how emotions are processed in the brain. It has also been applied in
fields such as marketing, where it is used to analyze consumer behavior, and in
human-computer interaction, where it is used to design affective interfaces that can
respond to users' emotional states. The circumplex model provides a simple and
intuitive framework for organizing and analyzing emotional experiences and
expressions, making it a useful tool for researchers and practitioners in a variety of
fields.
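To make the two-dimensional layout concrete, here is a toy sketch in Python that places emotions as (valence, arousal) points and reads off their position on the circular continuum as an angle. The coordinates are rough illustrative placements, not values from Russell's data.

```python
import math

# Illustrative (valence, arousal) coordinates in [-1, 1] x [-1, 1]
EMOTIONS = {
    "excitement": (0.7, 0.7),    # positive, energetic
    "contentment": (0.8, -0.5),  # positive, calming
    "sadness": (-0.7, -0.6),     # negative, calming
    "fear": (-0.6, 0.8),         # negative, energetic
}

def circumplex_angle(emotion):
    """Angle (in degrees, 0-360) of the emotion on the valence/arousal plane."""
    valence, arousal = EMOTIONS[emotion]
    return math.degrees(math.atan2(arousal, valence)) % 360

for name in EMOTIONS:
    print(f"{name:12s} {circumplex_angle(name):6.1f} deg")
```

Each quadrant of the plane then corresponds to one combination of valence and arousal, which is exactly how the circumplex groups emotions.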

Appraisal theory of emotions (LEFT) is a framework that explains how emotions are
generated and experienced based on our interpretation and evaluation of events or
stimuli in the world around us. According to appraisal theory, emotions are not simply
triggered by events, but rather result from the complex process of appraisal, in which
we evaluate the significance of events and their personal relevance to us.
The appraisal theory of emotions has several key components. One is the idea that
emotions are generated by cognitive appraisals, which are conscious or unconscious
evaluations of events that determine the emotional significance of the event for the
individual. For example, if we appraise an event as a threat to our well-being, we are
likely to experience fear. If we appraise an event as a desirable outcome that is
consistent with our goals, we may experience joy or satisfaction. Appraisal theory has
been widely adopted and developed in psychology, neuroscience, and other related
fields, and has provided a rich and nuanced understanding of the complex interplay
between emotions, cognition, and the appraisal process. The theory has important
implications for our understanding of mental health and well-being, as well as for the
design of interventions and treatments that can help individuals manage and regulate
their emotions more effectively.

51
Gatti, E., Caruso, G., Bordegoni, M., & Spence, C. (2013, April). Can the feel of the haptic interaction modify a
user's emotional state?. In 2013 World Haptics Conference (WHC) (pp. 247-252). IEEE.

Haptic perception constitutes an important component of our everyday
interaction with many products. At the same time, several studies have, in
recent years, demonstrated the importance of involving the emotions in the
user-product interaction process. The present study was designed to
investigate whether haptic interactions can affect, or modulate, people's
responses to standardized emotional stimuli. 36 participants completed a self-
assessment test concerning their emotional state utilizing as a pointer either a
PHANToM device simulating a viscous force field while they moved the stylus,
or else a stylus with no force field. During the presentation of the emotional
pictures, various physiological parameters were recorded from participants.
The results revealed a significant difference in the self-reported arousal
associated with the pictures but no significant difference in the physiological
measures. The behavioural findings are interpreted in terms of an effect of the
haptic feedback on participants' perceived/interpreted emotional arousal.
These results suggest that haptic feedback could, in the future, be used to
modify participants' interpretation of their physiological states.

52
Obrist, M., Subramanian, S., Gatti, E., Long, B., & Carter, T. (2015, April). Emotions mediated through mid-air haptics.
In Proceedings of the 33rd annual ACM conference on human factors in computing systems (pp. 2053-2062).

Touch is a powerful vehicle for communication between humans. The way
we touch (how) embraces and mediates certain emotions such as anger,
joy, fear, or love. While this phenomenon is well explored for human
interaction, HCI research is only starting to uncover the fine granularity of
sensory stimulation and responses in relation to certain emotions. Within
this paper we present the findings from a study exploring the
communication of emotions through a haptic system that uses tactile
stimulation in mid-air. Here, haptic descriptions for specific emotions (e.g.,
happy, sad, excited, afraid) were created by one group of users to then be
reviewed and validated by two other groups of users. We demonstrate the
non-arbitrary mapping between emotions and haptic descriptions across
three groups. This points to the huge potential for mediating emotions
through mid-air haptics. We discuss specific design implications based on
the spatial, directional, and haptic parameters of the created haptic
descriptions and illustrate their design potential for HCI based on two
design ideas.

53
Obrist, M., Subramanian, S., Gatti, E., Long, B., & Carter, T. (2015, April). Emotions mediated through mid-air haptics.
In Proceedings of the 33rd annual ACM conference on human factors in computing systems (pp. 2053-2062).

54
Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), 1161.
Moors, A., Ellsworth, P. C., Scherer, K. R., & Frijda, N. H. (2013). Appraisal theories of emotion: State of the art and future development. Emotion Review, 5(2), 119-124.

Concluding, multiple lines of evidence point to the fact that haptic stimuli can, by
themselves, work as emotional triggers. They seem particularly suitable for
communicating information along the arousal dimension. Appraisal theory gives us an
interesting framework for understanding affective haptics: haptic stimuli can deliver
arousal activation, but they often need to be “appraised” to be considered positive or
negative. On the other hand, haptic stimulation can aid the appraisal process, allowing
us to further elaborate the emotional content of, for example, visual stimuli.

55
The deaf-blind alphabet, also known as the tactile alphabet or the manual alphabet,
is a system of hand gestures used by individuals who are both deaf and blind to
communicate with each other and with sighted and hearing individuals who know the
alphabet. The manual alphabet is used by placing the hands on the communicating
partner's hands, arms, or face and forming the letters of the alphabet with the
fingers.
The manual alphabet is an important tool for deaf-blind individuals, as it provides a
means of communication that does not rely on sight or sound. The alphabet is
typically taught as part of deaf-blind education programs, and is used by deaf-blind
individuals, their family members, and support workers.
The manual alphabet has been in use for hundreds of years and has been adapted
and standardized in various forms throughout history. The most commonly used
manual alphabet today is the American manual alphabet, which is based on the
English alphabet and includes 26 hand gestures, each corresponding to a letter of the
alphabet. There are also other manual alphabets used in other countries, such as the
British manual alphabet and the French manual alphabet.
Overall, the manual alphabet is a critical tool for deaf-blind individuals, allowing them
to communicate with others and to fully participate in the world around them.

56
Enriquez, M., MacLean, K., & Chita, C. (2006, November). Haptic phonemes: basic building blocks of haptic communication. In Proceedings of
the 8th international conference on Multimodal interfaces (pp. 302-309).

A haptic phoneme represents the smallest unit of a constructed haptic
signal to which a meaning can be assigned. These haptic phonemes can be
combined serially or in parallel to form haptic words, or haptic icons, which
can hold more elaborate meanings for their users. Here, we use
phonemes which consist of brief (<2 seconds) haptic stimuli composed of
a simple waveform at a constant frequency and amplitude. Building on
previous results showing that a set of 12 such haptic stimuli can be
perceptually distinguished, here we test learnability and recall of
associations for arbitrarily chosen stimulus-meaning pairs. We found that
users could consistently recall an arbitrary association between a haptic
stimulus and its assigned arbitrary meaning in a 9-phoneme set, during a
45 minute test period following a reinforced learning stage.
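The phoneme/icon idea maps naturally onto a small data structure. The sketch below is purely illustrative (field names and parameter values are not from the paper): a phoneme is a brief constant-frequency, constant-amplitude waveform, and an icon is a serial composition of phonemes.

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class HapticPhoneme:
    """Smallest meaningful haptic unit: one brief waveform at constant parameters."""
    frequency_hz: float
    amplitude: float    # normalised 0..1
    duration_s: float   # the study used stimuli shorter than 2 seconds

    def samples(self, rate_hz=1000):
        """Render the phoneme as a sine waveform for a vibrotactile actuator."""
        n = int(self.duration_s * rate_hz)
        return [self.amplitude * math.sin(2 * math.pi * self.frequency_hz * t / rate_hz)
                for t in range(n)]

def haptic_icon(*phonemes):
    """Serial composition: concatenate phonemes into a 'haptic word' (icon)."""
    signal = []
    for p in phonemes:
        signal.extend(p.samples())
    return signal

# Two illustrative phonemes and an icon built from them; the stimulus-meaning
# pairing is arbitrary, exactly as in the study.
buzz = HapticPhoneme(frequency_hz=250, amplitude=1.0, duration_s=0.5)
rumble = HapticPhoneme(frequency_hz=60, amplitude=0.6, duration_s=0.5)
icon = haptic_icon(buzz, rumble)
```

The point of the composition is that, just as spoken phonemes combine into words, a small distinguishable set of haptic primitives can yield a much larger vocabulary of icons.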

57
Botvinick, M., & Cohen, J. (1998). Rubber hands ‘feel’ touch that eyes see. Nature, 391(6669),
756-756.

The rubber hand illusion is a phenomenon that demonstrates how our sense of body
ownership and self-awareness can be manipulated. The illusion is created by placing a
fake rubber hand in front of a person, in a position similar to their own hidden hand.
The hidden hand is then stroked with a brush, while the rubber hand is also stroked in
synchrony. This leads the person to perceive the rubber hand as if it were their own,
despite knowing that it is not.
The rubber hand illusion demonstrates how the brain uses sensory information from
different sources (such as touch and vision) to create a sense of body ownership. The
illusion shows that when the visual information about the rubber hand and the touch
information from the person's hidden hand are in conflict, the brain tends to
prioritize the visual information and may create a sense of ownership over the rubber
hand.
The rubber hand illusion has been used in a variety of research contexts, such as the
study of body representation and self-awareness, the neural basis of body
perception, and the development of virtual reality technologies. It has also been used
to study disorders such as phantom limb syndrome and body dysmorphia.
Overall, the rubber hand illusion is a powerful tool for understanding how our sense
of body ownership and self-awareness is created and maintained, and for exploring
the complex interplay between our senses, brain, and body.
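In the lab, the strength of the illusion is commonly quantified as "proprioceptive drift": participants indicate the felt position of their hidden hand before and after stroking, and the drift is how far that judgment has moved toward the rubber hand. A minimal sketch with made-up numbers (positions in cm along the table):

```python
def proprioceptive_drift(pre_judgment_cm, post_judgment_cm, rubber_hand_cm):
    """Positive drift = the felt hand position moved toward the rubber hand."""
    toward_rubber = 1 if rubber_hand_cm > pre_judgment_cm else -1
    return (post_judgment_cm - pre_judgment_cm) * toward_rubber

# Hidden hand felt at 10 cm before stroking; rubber hand sits at 25 cm.
sync = proprioceptive_drift(pre_judgment_cm=10, post_judgment_cm=14, rubber_hand_cm=25)
async_ = proprioceptive_drift(pre_judgment_cm=10, post_judgment_cm=10.5, rubber_hand_cm=25)

print(sync, async_)  # synchronous stroking typically produces the larger drift
```

The numeric values are hypothetical; the typical experimental finding is simply that synchronous stroking yields a larger drift than asynchronous stroking.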

Get some more info from the paper in the slide! Here is the abstract for your
convenience:
Illusions have historically been of great use to psychology for what they can reveal
about perceptual processes. We report here an illusion in which tactile sensations are
referred to an alien limb. The effect reveals a three-way interaction between vision,
touch and proprioception, and may supply evidence concerning the basis of bodily
self-identification.

58
Kammers, M. P., de Vignemont, F., Verhagen, L., & Dijkerman, H. C. (2009). The rubber hand illusion in action. Neuropsychologia, 47(1), 204-211.
Tsuji, T., Yamakawa, H., Yamashita, A., Takakusaki, K., Maeda, T., Kato, M., ... & Asama, H. (2013, November). Analysis of electromyography and skin conductance response during rubber hand illusion. In 2013 IEEE Workshop on Advanced Robotics and its Social Impacts (pp. 88-93). IEEE.

Explore the effect of the rubber hand illusion on body perception and on threat
perception by looking at the two papers suggested in the slide. Below are the abstracts:
Proprioception (proprioceptive drift):
In the well-known rubber hand illusion (RHI), watching a rubber hand being stroked
while one's own unseen hand is synchronously stroked, induces a relocation of the
sensed position of one's own hand towards the rubber hand [Botvinick, M., & Cohen,
J. (1998). Rubber hands 'feel' touch that eyes see. Nature, 391(6669), 756]. As one
has lost the veridical location of one's hand, one should not be able to correctly guide
one's hand movements. An accurate representation of the location of body parts is
indeed a necessary pre-requisite for any correct motor command [Graziano, M. S. A.,
& Botvinick, M. M. (1999). How the brain represents the body: Insights from
neurophysiology and psychology. In D. Gopher, & A. Koriat (Eds.), Attention and
performance XVII-Cognitive regulation of performance interaction of theory and
application (pp. 136-157)]. However, it has not yet been investigated whether action
is indeed affected by the proprioceptive drift towards the rubber hand, nor has the
resistance of visual capture in the RHI to new proprioceptive information been
assessed. In the present two kinematic experiments, we show for the first time that
action resists the RHI and that the RHI resists action. In other words, we show a
dissociation between illusion-insensitive ballistic motor responses and illusion-
sensitive perceptual bodily judgments. Moreover, the stimulated hand was judged
closer to the rubber hand for the perceptual responses, even after active movements.
This challenges the view that any proprioceptive update through active movement of
the stimulated hand erases the illusion. These results expand the knowledge about
representations of the body in the healthy brain, and are in line with the currently
most used dissociation between two types of body representations so far mainly
based on neuropsychological patients [Paillard, J. (1991). Knowing where and
knowing how to get there. In J. Paillard (Ed.), Brain and space (pp. 461-481); Paillard,
J. (1999). Body schema and body image: A double dissociation in deafferented
patients. In G. N. Gantchev, S. Mori, & J. Massion (Eds.), Motor control, today and
tomorrow (pp. 197-214)].

Threat perception:
Recently, the rubber hand illusion (RHI), which is one of phenomena that the sense of
ownership is extended to the objects over the external area, attracts much attention
to explain the brain mechanism of self body recognition of human. However, most
previous research have only focused on the conditions for the occurrence of the RHI.
In this study, we measured the electromyography (EMG) of the arm and the skin
conductance response (SCR) of the end of the finger when the strong blow with a
hammer would be given to the fake hand in order to examine whether the RHI is in
fact occurred to the subject at a certain time during the experiment. As a result, we
showed that the measurement of the EMG could satisfy above requirement and it is
implied that the measurement of EMG gets closer to the tendency of introspection
report than that of SCR.

59
Giummarra, M. J., Gibson, S. J., Georgiou-Karistianis, N., & Bradshaw, J. L. (2008). Mechanisms underlying embodiment,
disembodiment and loss of embodiment. Neuroscience & Biobehavioral Reviews, 32(1), 143-160.

It is now clear that haptics is a powerful means of perceiving our body and of
creating our body image.

60
Blanke, O., Pozeg, P., Hara, M., Heydrich, L., Serino, A., Yamamoto, A., ... & Rognini, G. (2014). Neurological and robot-
controlled induction of an apparition. Current Biology, 24(22), 2681-2686.

Haptic stimulation can be used to disrupt our body perception, and even to create
out-of-body experiences! Learn more about it from the paper mentioned in the slide.
Below is the abstract for your convenience:
Tales of ghosts, wraiths, and other apparitions have been reported in virtually all
cultures. The strange sensation that somebody is nearby when no one is actually
present and cannot be seen (feeling of a presence, FoP) is a fascinating feat of the
human mind, and this apparition is often covered in the literature of divinity,
occultism, and fiction. Although it is described by neurological and psychiatric
patients [1, 2] and healthy individuals in different situations [1, 3, 4], it is not yet
understood how the phenomenon is triggered by the brain. Here, we performed
lesion analysis in neurological FoP patients, supported by an analysis of associated
neurological deficits. Our data show that the FoP is an illusory own-body perception
with well-defined characteristics that is associated with sensorimotor loss and caused
by lesions in three distinct brain regions: temporoparietal, insular, and especially
frontoparietal cortex. Based on these data and recent experimental advances of
multisensory own-body illusions [5, 6, 7, 8, 9], we designed a master-slave robotic
system that generated specific sensorimotor conflicts and enabled us to induce the
FoP and related illusory own-body perceptions experimentally in normal participants.
These data show that the illusion of feeling another person nearby is caused by
misperceiving the source and identity of sensorimotor (tactile, proprioceptive, and
motor) signals of one’s own body. Our findings reveal the neural mechanisms of the
FoP, highlight the subtle balance of brain mechanisms that generate the experience of
“self” and “other,” and advance the understanding of the brain mechanisms
responsible for hallucinations in schizophrenia.

61
• Haptic stuff • Beyond the haptic stuff
• Haptic perception • Integration with vision (VR yeah yeah)
• Haptic displays • Integration with audio (They are basically cousins)
• Vibrotactile • Integration with taste (Kinda makes sense no?)
• Temperature • Integration with smell (Say whaaaaat?)
• Pressure/stroking
• Force feedback
• Impedance control • Haptic and cognition
• Admittance control • Haptic illusions
• Affective haptics
• Haptic and communication
• Embodiment

62
63
