Alya Marwan Emara, 202200375, Section 5

Imagine discovering over the morning news that a man had been arrested for assaulting a robot. Despite the oddity, The Japan Times covered this occurrence in 2015, noting that he was detained and charged with property damage rather than assault. According to the police investigation, the man had been aggravated by a store employee and kicked Pepper, SoftBank's emotional robot. Ironically, Pepper's unique selling proposition was its capacity to identify human mental states and respond accordingly. Emotional robots are among the most divisive and reviled AI technologies: there are several disadvantages to integrating emotions into robots, but just as many advantages.
While robots are often viewed antagonistically by people who insist on human primacy, emotional robots elicit an even stronger negative response. Several skeptics argue that incorporating emotions into robots can never provide a complete picture of what a human feels. Proponents counter that, instead of idolizing machines and disparaging their shortcomings, we should treat emotional robots as valuable assets in various domains. Robots and emotions have recently grown in popularity to the extent that they are considered a national phenomenon. They therefore need to be examined in light of the following questions: first, are robots' emotions real? Second, is it possible for robots to express emotions in a way similar to humans? Finally, should robots have emotions?

In the debate on robots' emotions, there is a lot of controversy over whether robots'
emotions are genuine or not. Advocates believe that affective computing has already made
significant advances in the creation of emotional machines and that new advancements are
being made every month. An illustration is 'Kismet', a robot constructed by Cynthia Breazeal and colleagues at the Massachusetts Institute of Technology (MIT) that is capable of displaying various emotions. Kismet, for example, appears sorrowful when left alone but grins when it sees a human face, attracting attention. If someone moves too quickly, it gives a fearful expression to indicate that something is wrong. Those who interact with Kismet cannot help but sympathize with these basic forms of emotional behavior. Yet does Kismet have emotions? It evidently exhibits some emotional behavior, so if we define emotions in terms of behavior, we must acknowledge that Kismet has emotional capability (Evans, 2004). It should be noted, however, that emotional capability is not a binary concept.
While chimpanzees do not display the full range of human emotions, they can clearly feel genuine emotions. According to research presented in the article "Can robots have emotions?", the Cartesian distinction between sentient humans and emotional robots does not make much sense once robots can recognize human sentiments and act upon them following the dictates of their inner mechanisms. On the other hand, objectors contend that characterizing emotions in scientific terms creates several complications and has often been considered unfeasible, since programmed emotions are nothing more than techniques for enriching collected data with meanings and valences (Lorenz & Barnard, 2007). A study presented in the article "Why machines cannot feel" describes a computational model of emotions developed during the frontier research project NeuroSym. The goal of this research was to create a neuroscientifically and neuropsychologically informed machine perception system that could evaluate perceptual images based on emotional qualities. It was considered possible that, by adopting the brain as the paradigm for machine comprehension, technological systems with humanoid perception capacities could be designed (Velik et al., 2008). The model comprised two major components: a system responsible for exterior perception, observing the surroundings, and a system tasked with gaining insight into the body's interior environment in order to produce emotional responses. Development of the model showed that robots can acquire perception abilities comparable in efficiency to those of humans when it comes to sensing the environment. However, incorporating emotions into the technical design to analyze perceived images of the environment

presented multiple obstacles. It was proposed that the body has a vital impact on the development of emotions, at the very least throughout the early phases of development and likely throughout one's life: emotions must be anchored in the body. Integrating sentiments into a robot would therefore necessitate the creation, or simulation, of an instinctive body; however, no suitable models for replicating such a physique are currently available.
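To make the two-component idea concrete, the following minimal Python sketch shows how an emotion label might emerge from combining an exterior-perception input with an interior "body" state, in the spirit of the NeuroSym split described above. Every class, function, and constant here is a hypothetical illustration, not code from Kismet or the cited project; the sketch merely shows why defining emotion purely behaviorally (the Kismet question) differs from grounding it in an interior state.

```python
# Illustrative sketch only: a minimal two-component appraisal loop, loosely
# inspired by the exterior-perception / interior-body split described above.
# All names and numeric constants are invented for illustration; none are
# taken from Kismet or the NeuroSym project.

from dataclasses import dataclass


@dataclass
class Percept:
    """A simplified exterior observation of the surroundings."""
    face_present: bool   # does the robot currently see a human face?
    motion_speed: float  # how fast things move, 0.0 (still) to 1.0 (fast)


@dataclass
class BodyState:
    """A crude stand-in for the 'interior environment' of a body."""
    arousal: float = 0.0      # rises with fast motion, decays over time
    social_need: float = 0.5  # rises while alone, falls during interaction


def appraise(percept: Percept, body: BodyState) -> str:
    """Combine exterior perception with interior state to select an emotion
    label. Without the BodyState component, this collapses into pure
    stimulus-response rules, i.e. emotion defined only as behavior."""
    # Update the interior state from what was just perceived.
    body.arousal = min(1.0, 0.6 * body.arousal + 0.7 * percept.motion_speed)
    body.social_need = max(0.0, body.social_need +
                           (-0.2 if percept.face_present else 0.1))
    # Map the combined state to a displayed emotion.
    if body.arousal > 0.6:
        return "fear"       # fast movement signals that something is wrong
    if percept.face_present:
        return "happiness"  # a human face attracts attention and a grin
    if body.social_need > 0.7:
        return "sadness"    # left alone for too long
    return "neutral"


if __name__ == "__main__":
    body = BodyState()
    episode = [Percept(False, 0.0), Percept(False, 0.0), Percept(False, 0.0),
               Percept(True, 0.1), Percept(True, 0.9)]
    for percept in episode:
        print(appraise(percept, body))  # neutral, neutral, sadness, happiness, fear
```

The point of the interior state is that the same percept can yield different emotions depending on the body's history, which is exactly the grounding the NeuroSym researchers argued no current artificial body can yet supply.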

Aside from that, another point of contention is whether robots can display emotions in the same manner that humans do. Defenders assert that the development of new mechanisms and functionalities for humanoid robots has enabled emotional expression and human-like communication. Waseda University published the article "Various Emotional Expressions with Emotion Expression Humanoid Robot WE-4RII", which demonstrated that the humanoid robot WE-4RII could express its emotions nearly as recognizably as humans. The robot achieved coordinated head-eye movement using the vestibulo-ocular reflex, depth perception using the angle of convergence between its two eyes, a color vision system, and four senses: visual, aural, cutaneous, and olfactory (Miwa et al., 2002, pp. 2443-2448). In addition, WE-4RII's body movements are slow for melancholy yet quick for astonishment, mirroring human emotional expression. To assess how WE-4RII conveys its emotions through its six primary emotional expressions, 18 individual videos were presented to respondents, who identified the emotion they believed the robot exhibited. Most of the subjects felt that Pattern 1 represented "Sadness", Pattern 2 "Surprise", Pattern 4 "Anger", and Pattern 5 "Happiness". The study therefore concluded that WE-4RII could represent its current state through a variety of patterns closely resembling those of humans. On the contrary, critics claim that the
existing facial design of commercially available robots is frequently insufficient for
portraying emotions. Pollmann et al. (2019) proposed five facial design versions for an
emotional robot named 'Pepper' that can convey fundamental emotions (happiness, anger,
sorrow, and fear) utilizing various cartoon-inspired designs based on expressiveness
enhanced by exaggeration (Thomas & Johnston, 1995). The eyes consist of large black pupils encircled by an eye socket illuminated by eight full-color RGB LEDs; the mouth is a small black crescent that cannot be altered. This eye design first appeared on Pepper's precursor, the Nao robot, where it was originally used to display the robot's various states. Johnson et al. (2013) created and tested several LED eye designs to allow Nao to portray primary core emotions. Their findings confirm the overall principle of conveying emotions through eye patterns, but they also found that several of their designs were ambiguous or unrecognizable. Four emotive facial expressions of Pepper were created for each of the five versions, yielding 20 robot faces as cartoon drawings. According to the results, 13 of the 20 faces were matched with the intended emotion. Happiness and anger were accurately identified in all designs thanks to the exaggeration inspired by cartoon and animated-movie characters. However, this alone was insufficient to adequately resemble human expression, particularly given that overemphasis was a key factor in the designs. Only three faces conveyed the intended impression of sorrow, and fear was not correctly attributed to any of the face designs. In other words, none of the designs was optimal for all four emotions examined.
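As a rough illustration of how such studies operationalize expression, the short Python sketch below maps emotion labels to two of the display channels discussed above: body-movement speed (as in WE-4RII) and LED eye color (as in Pepper and Nao). The specific speeds and RGB values are invented placeholders, not parameters from either paper.

```python
# Illustrative sketch only: a hypothetical emotion-to-expression lookup in the
# spirit of WE-4RII's body movements and Pepper/Nao's LED eyes. The speed and
# color values below are invented placeholders, not taken from either study.

from typing import NamedTuple


class Expression(NamedTuple):
    body_speed: float              # normalized speed: slow = sadness, quick = surprise
    eye_rgb: tuple[int, int, int]  # color shown on the LED ring around each eye


EXPRESSIONS: dict[str, Expression] = {
    "happiness": Expression(body_speed=0.7, eye_rgb=(255, 200, 0)),
    "anger":     Expression(body_speed=0.8, eye_rgb=(255, 0, 0)),
    "sadness":   Expression(body_speed=0.2, eye_rgb=(0, 0, 255)),
    "fear":      Expression(body_speed=0.9, eye_rgb=(128, 0, 128)),
    "surprise":  Expression(body_speed=1.0, eye_rgb=(255, 255, 255)),
}


def express(emotion: str) -> Expression:
    """Look up display parameters, falling back to a slow, neutral default."""
    return EXPRESSIONS.get(emotion, Expression(0.3, (255, 255, 255)))


if __name__ == "__main__":
    for emotion in ("sadness", "surprise", "fear"):
        e = express(emotion)
        print(f"{emotion}: move at {e.body_speed:.1f}, eyes RGB {e.eye_rgb}")
```

Fear, the emotion the Pepper study found hardest to convey, is exactly the kind of entry where a single color choice is unlikely to suffice, which echoes the finding that none of the designs worked for all four emotions.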

Beyond that, another area of debate is whether robots should have emotions at all. Those in favor anticipate a near future in which robots are omnipresent, and because emotions are such a crucial element of human existence, it appears only reasonable to contemplate their incorporation in robot creation. One of the most noticeable benefits of incorporating emotions into robots is the establishment of mutual psychological understanding during human-robot interaction (Sciutti et al., 2018). In educational settings, studies have demonstrated that when children have prolonged interactions with robots that replicate empathetic capacities (illustrated in personalization during tasks and contingency behaviors), they can attain meaningful learning gains in school curriculum areas, in contrast to brief interactions with robots devoid of empathetic capacities (Alves et al., 2019). Furthermore, children have been shown to consider empathy-mimicking robots their friends despite being clearly told that the robot was acting as their teacher, indicating that implementing emotions in robots may result in beneficial shifts in people's perceptions of a robot's function (Alves et al., 2016). Those opposed to the integration of emotions into
robots emphasize that most people envision robots as instruments rather than human-like machines: individuals hold more positive implicit associations with one another than with robots (Sanders et al., 2016) and judge emotionalized robots more critically than they formerly thought they would (De Graaf et al., 2016). Although humans usually regard robots as acceptable for housework, several studies have revealed that people are uneasy about interacting with emotional robots, let alone about the concept of emotional robots in general (Carpenter et al., 2009). In research conducted by Broadbent et al. (2009), individuals instructed to imagine a robot with emotional characteristics showed higher blood pressure levels and more unpleasant emotions than those instructed to imagine a robot with a mechanical appearance. Emotional robots therefore receive fewer positive remarks and are less likely to be accepted than mechanical robots.

There is no doubt that the combination of robots and emotions is a highly controversial topic. Emotional robots are credited with a psychological understanding of humans, but they are also partially responsible for eliciting negative emotions in humans, as shown in the studies by Sciutti et al. (2018) and Broadbent et al. (2009). Incorporating emotions into robots for the prospect of "being there", of being humanely knowledgeable and understood (Breazeal, 2004), is beneficial: robots that mirror humans could learn to base interactions on well-established human abilities and social norms (Schmitz, 2011), facilitating human-robot interaction. However, the absence of direct interaction with robots, paired with primarily unpleasant media images, may explain why people still hold divided views of emotional robots while maintaining high expectations of them. Despite that, certain design considerations could sustain human involvement and acceptance in long-term engagement with robots, such as consistent and progressive behaviors, replicated emotional exchanges and empathy, memory use, and adaptability (Paiva et al., 2013). Therefore, since emotions are such an integral part of human life, it seems only prudent to incorporate them into the creation of robots.

References

Evans, D. (2004). Can robots have emotions? ResearchGate, 1-5. https://www.researchgate.net/publication/244106245_Can_robots_have_emotions

Giger, J. C., Piçarra, N., Alves‐Oliveira, P., Oliveira, R., & Arriaga, P. (2019). Humanization of robots: Is it really such a good idea? Human Behavior and Emerging Technologies, 1(2), 111-123. https://doi.org/10.1002/hbe2.147

Itoh, K., Miwa, H., Matsumoto, M., Zecca, M., Takanobu, H., Roccella, S., Carrozza, M. C., Dario, P., & Takanishi, A. (2004). Various emotional expressions with emotion expression humanoid robot WE-4RII. ResearchGate, 1-2. https://www.researchgate.net/publication/4140966_Various_emotional_expressions_with_emotion_expression_humanoid_robot_WE-4RII

Nitsch, V., & Popp, M. (2014). Emotions in robot psychology. Biological Cybernetics, 108, 621-629. https://doi.org/10.1007/s00422-014-0594-6

Pollmann, K., Tagalidou, N., & Fronemann, N. (2019). It's in your eyes: Which facial design is best suited to let a robot express emotions? ResearchGate, 1-5. https://www.researchgate.net/publication/335564624_It's_in_Your_Eyes_Which_Facial_Design_is_Best_Suited_to_Let_a_Robot_Express_Emotions

Velik, R. (2010). Why machines cannot feel. ResearchGate, 1-18. https://www.researchgate.net/publication/220636830_Why_Machines_Cannot_Feel