

This is the final authors’ version of the manuscript

Exploring Influencing Variables for the Acceptance of Social Robots


By Maartje M.A. de Graaf, & Somaya Ben Allouch

Full reference:
De Graaf, M. M., & Allouch, S. B. (2013). Exploring influencing variables for the acceptance of social
robots. Robotics and Autonomous Systems, 61(12), 1476-1486.

DOI: 10.1016/j.robot.2013.07.007

Exploring Influencing Variables for the Acceptance of Social Robots

Maartje M.A. de Graaf* & Somaya Ben Allouch


University of Twente, Department of Media, Communication and Organization, Enschede, The Netherlands

* Corresponding author.
Telephone: +31 53 489 6094
E-mail address: m.m.a.degraaf@utwente.nl
Postal Address: University of Twente
Att. M.M.A. de Graaf
Faculty of Behavioral Sciences
Department of Media, Communication and Organization
Drienerlolaan 5
7522 NB Enschede, The Netherlands

Abstract
In order to introduce social robots successfully, we must first understand the underlying reasons why potential users would accept these robots in their own homes. An extensive literature review was conducted and provides an overview of variables influencing the acceptance of social robots, categorized into utilitarian variables, hedonic variables, user characteristics, social normative beliefs and control beliefs. In a user study in which 60 participants interacted with a social robot, both the robot itself and the interaction experience the users had with the robot were evaluated. The results indicate that especially the variables of usefulness, adaptability, enjoyment, sociability, companionship and perceived behavioral control are important in evaluating the user acceptance of social robots. Hence, the present study contributes to human-robot interaction research by designating the variables that lead to social robot acceptance. Subsequently, this study may serve as a starting point for developing an integral model that takes into consideration the relevant determinants of social robot acceptance.

Keywords
Social robots, user study, user acceptance, utilitarian variables, hedonic variables, user characteristics.
Highlights
• An extensive literature review was performed on social robot acceptance variables.
• In a user study, users evaluated both the social robot and the interaction experience.
• Social robot acceptance research should include both utilitarian and hedonic factors.
• Important utilitarian factors are usefulness and adaptability.
• Important hedonic factors are enjoyment, sociability and companionship.
Vitae
Maartje M.A. de Graaf is a PhD candidate at the University of Twente. Her
thesis work focuses on the long-term acceptance and use of social robots in
domestic environments. The main goal is to get insight into (1) which variables
influence the acceptance or rejection of social robots in home environments, (2)
how the relationships people may build with these robots affect the process of
long-term acceptance, and (3) what societal and ethical issues might arise when
domestic use of robots becomes ubiquitous.
Dr. Somaya Ben Allouch is an assistant professor at the University of Twente.
Her main research interest focuses on the acceptance and use of technologies in
everyday life from a user-centered perspective. Current research topics are
ubiquitous computing and social robotics.
1. INTRODUCTION
Social robots are expected to increasingly penetrate our everyday lives. They are designed to interact socially with humans to
simplify communication and, therefore, increase their acceptance by users [Breazeal, 2003]. However, if social robots are to be
successfully introduced into people’s homes, we must understand the underlying reasons whereupon potential users decide to
accept these robots and invite them into their domestic environments. To be able to explain social robot acceptance and use, it is
essential to understand the determinants of the key acceptance variables (Venkatesh & Morris, 2000). Cumulatively, the fields of information systems, human-computer interaction, psychology and communication science share a long history of technology acceptance research. Prominent models such as the Technology Acceptance Model (TAM) [Davis, 1989] or the Unified Theory of Acceptance and Use of Technology (UTAUT) [Venkatesh et al., 2003] provide utilitarian variables such as usefulness and ease of use. However, these basic technology acceptance models do not take into consideration hedonic variables such as enjoyment and attractiveness.
Over the past decades, the field of human-computer interaction experienced a transformation from pragmatics and functionality
to encompassing emotional responses and positive experiences associated with the use of that technology [Norman, 2004]. Indeed,
the experience people have with interactive systems is two-fold. Currently, both the utilitarian and the hedonic views are considered
equally important when studying technology acceptance. Moreover, several studies in human-robot interaction also point to the
relevance of including the hedonic view in evaluating robots [Graaf & Ben Allouch, 2012; Heerink et al., 2010; de Ruyter et al.,
2005; Weiss et al., 2012]. Utilitarian variables are attributes connected to the practicality and usability of a product, whereas, in
contrast, the hedonic variables are attributes related to the user experience while using a product. Although a few studies on social robots have included hedonic aspects, even fewer have focused on the user acceptance of these robots (for example, Heerink et al., 2010). This study examines both utilitarian and hedonic variables as they present a broader view on robots as social actors in interaction scenarios and enable the evaluation of the affective factors of the interaction, which distinguishes social robots as a unique technological genre [Graaf & Ben Allouch, 2012; Young et al., 2007]. This paper focuses on providing insight into the influence of the various variables on the user acceptance of social robots. Subsequently, our results may serve as a starting point for the development of an integral model which takes into consideration the relevant variables of social robot acceptance.

2. THEORETICAL BACKGROUND
Because intentions are found to be good predictors of specific behavior, they have become a critical part of many contemporary
theories of human behavior (Ajzen & Fishbein, 2005). Although the details of these theories differ, they all converge on a small number of variables that account for much of the variance in behavioral intentions. These variables can be considered as
three major kinds of considerations that influence the decision to perform a particular behavior: (1) the likely positive or negative
consequences of the behavior, (2) the approval or disapproval of the behavior by respected individuals or groups, and (3) the factors
that may facilitate or impede performance of the behavior. When these three categories are applied to the acceptance of social
robots, the first category can be viewed as the user’s evaluation of (using) a robot, the second category as the social normative
beliefs the user holds about using a robot, and the third category as the contextual factors that play a role while using a robot.
The next sections address these three categories and present the corresponding variables of social robots relating to their acceptance. Additionally, we present relevant user characteristics, which are more trait-like, and thus stable, variables that have been found to be influential in the acceptance of technology in general or of social robots specifically. Before presenting an overview of hypotheses derived from findings of previous studies, this theoretical outline will describe the outcome variables of social robot acceptance.
2.1 Attitudinal Beliefs
The attitudinal belief structure involves the user’s favorable or unfavorable evaluation of a particular behavior (Ajzen &
Fishbein, 2005), or in this case the evaluation of behavioral beliefs resulting from the use of a social robot. The experience people have with interactive technologies is two-fold. According to researchers in human-computer interaction (Hassenzahl &
Tractinsky, 2006; Van der Heijden, 2004), there are both utilitarian and hedonic product aspects. To allow a broader view on the
acceptance of social robots, both utilitarian and hedonic factors of human-robot interaction have been chosen to evaluate the
interaction between humans and robots. Utilitarian factors relate to the practicality and usability of a product. Hedonic factors, on
the other hand, refer to the user experience while using a product and have no obvious relation to task-related goals. The dichotomy
of utilitarian and hedonic factors as determinants of technology acceptance also arises from motivation theory, which suggests a main classification of motivators based on the different reasons or goals people have that encourage the performance of an activity
(Ryan & Deci, 2000; Vallerand, 1997). This dichotomy is between extrinsic and intrinsic motivation. Extrinsic motivation refers to
doing something because it leads to a separate outcome and intrinsic motivation relates to the performance of an activity for no
apparent reinforcement other than the process of performing that activity itself. This holistic dichotomous view, including both utilitarian and hedonic factors of social robot acceptance, acknowledges the unique elements that distinguish social robots as a new technological genre, and demonstrates the need to include these factors alongside traditional evaluation factors in human-computer interaction.
2.1.1 Utilitarian Factors
Utilitarian factors are tied to utility and emphasize the extrinsic motivations to accept or use a technology. Widely acknowledged utilitarian variables originating from the TAM are usefulness and ease of use [Davis, 1989], and both are solid predictors of intention to use in the context of human-computer interaction (Lee, Kozar, & Larsen, 2003). In the context of robotics, usefulness is defined as the user’s belief that using the robot would enhance their daily activities, and ease of use is defined as the user’s belief that using the robot would be free from effort [Heerink et al., 2010]. Both robotics and information systems research indicate that perceived usefulness influences use attitude, use intention and actual use (Heerink et al., 2008; Heerink et al., 2010; Lee, Kozar, & Larsen, 2003; Shin & Choo, 2011). Together, research in human-robot interaction and human-computer interaction found that perceived ease of use has a direct influence on perceived usefulness, use attitude and use intention (Heerink et al., 2008; Heerink et al., 2010; Shin & Choo, 2011). Human-computer interaction research also found an influence of ease of use on use attitude (Ahn et al., 2007; Chesney, 2006). Besides being useful and easy to use, a technology must fit its intended
function. People expect a robot to look and act appropriately given the task in context (Goetz, et al., 2003). If a robot is designed for
social interaction with humans, the robot must project some amount of humanness so that the user feels comfortable enough to
socially engage with the robot [Fong et al., 2003]. The adaptability of the robot is defined as the perceived ability of the system to
be adaptive to the changing needs of the user [Heerink et al., 2010]. Perceived adaptability influences perceived usefulness,
enjoyment, attitude towards use and use intention [Heerink et al., 2010; Broadbent et al., 2009] (Heerink et al., 2008) (Shin &
Choo, 2011) (Venkatesh & Davis, 2000). Robotics research thus suggests that a robot’s ability to adapt its behavior to the user’s
preferences and personality can improve acceptability (Broadbent et al., 2009). Therefore, it is important to include this variable in
our exploration of variables influencing social robot acceptance. In addition to these general technological variables, robots face the
significant challenge of attempting to appear intelligent to provoke users to perceive them as genuine. The intelligence of the robot
is defined as the user’s evaluation of the robot’s level of intelligence [Bartneck et al., 2009]. A robot that is evaluated as more
intelligent is liked more and viewed as more realistic (Cuijpers, Bruna, Ham, & Torta, 2011). As the authenticity of the robot
depends on its intelligence, it is important to include this variable when studying the user acceptance of social robots.
2.1.2 Hedonic factors
Both consumer behavior research and information systems research have indicated various constructs related to hedonic factors
or intrinsic motivations in technology acceptance of the consumer context [van der Heijden, 2004] (Brown & Venkatesh, 2005).
Well-known hedonic variables in technology acceptance research are enjoyment and attractiveness. Enjoyment is defined as
feelings of joy or pleasure associated by the user with the use of the robot [Heerink et al., 2010]. When people evaluate a social robot, the pleasure they experience may certainly influence user acceptance. Enjoyment appears to be a crucial variable for social robot
acceptance as it directly influences ease of use, use attitude and use intention of robots [Heerink et al., 2010] (Shin & Choo, 2011).
In addition to enjoyment, visual attractiveness is also a very powerful concept for technological objects [van der Heijden, 2003] as
it affects the attribution of positive traits (Eagly, Ashmore, Makhijani, & Longo, 1991). The influence of attractiveness is explained
by the ‘what is beautiful is good’ paradigm [Dion et al., 1972]. The attractiveness of the robot is defined as the positive evaluation
of the robot’s physical appearance [Lee, Kozar, Larsen, 2003]. Perceived attractiveness has been ascertained as the most important
attribute in the preference for hedonic systems [Schenkman et al., 2000] as well as being the mediator for other qualities of
technological systems [van der Heijden, 2004]. To our knowledge, perceived attractiveness as a predictor of other aspects in
human-robot interaction has not been studied before. However, research in human-computer interaction indicates that perceived
attractiveness has an influence on usefulness, ease of use, and enjoyment [van der Heijden, 2003]. Both the variables of enjoyment
and attractiveness will, therefore, be included in our set of variables that influence the acceptance of social robots.
Together with these general variables of technology acceptance, the variables of anthropomorphism, realism, sociability and companionship also influence the user experience with social robots specifically. Bodies are salient
indicators of social identity: entities that are embodied, either physically or virtually, are expected to function in the human social
context [Groom et al., 2009]. When such an entity presents explicit cues of identity or social agency, it is not unlikely that humans
will treat these as social actors [Reeves, Nass, 1996]. It is expected that this effect may even magnify with embodied agents that
interact socially using natural language and non-verbal behaviors. Indeed, research shows that people also tend to ascribe human-
like properties, characters, or mental states to social robots [Kahn, Friedman, 2006; Lee, Park, Song, 2005]. This phenomenon is
known as anthropomorphism. Anthropomorphism is a tendency to regard and describe objects in human terms, attributing human characteristics to them with the intention to rationalize their actions [Duffy, 2003]. As people feel the need to control their environment, those who feel uncertain are more likely to anthropomorphize a social robot, as it helps them to explain, control and
predict the behavior of a nonhuman agent (Epley et al., 2008). Perceiving the presence of a social entity when using an interactive
system has been found to be a crucial factor in the process of accepting these technologies [Heerink et al., 2010; Lee, Park, Song,
2005; Lee, Jung, 2006], as it affects usefulness, attitude towards robots, social influence, companionship, use attitude and use
intention [Heerink et al., 2010] (Lee, Jung, Kim, & Kim, 2006; Lee, Kozar, & Larson, 2003; Shin & Choo, 2011). Being
anthropomorphized, social robots could also be viewed as being genuine. Several studies demonstrate that the increase of realism
may indeed improve human-robot interaction [Groom et al., 2009]. In the field of human-robot interaction, researchers have
suggested that matching the realism of robot appearance and behavior can improve interactions with humans. Human-likeness of a
robot’s appearance is preferred when it is a counterpart to the sociability of their behavior [Goetz et al., 2003]. The realism of the
robot is defined as the degree to which users believe the robot behaves and responds realistically, and research demonstrates that
realism may improve the human-robot interaction [Bartneck et al., 2009]. Moreover, more realistic robotic interfaces are perceived
as more intelligent (Bartneck, Kanda, Mubin, Al Mahmud, 2009). People across gender, age and culture tend to perceive a robot
that possesses a lifelike appearance as a friendly companion (Libin & Libin, 2004). As social robots are designed for social
interaction, they should possess a considerable amount of social skills. In observing human-human interactions, there are several
characteristics that make certain people more likeable than others in social interactions. These characteristics are conceptually
combined as social competence (Rubin & Martin, 1994) or sociability. Adapted to robotics, sociability is defined as the users’ belief
that the robot possesses the social, emotional, and cognitive skills required for successful social adaptation [Rubin, Martin, 1994].
Research studies in the field of human-robot interaction demonstrate that a more sociable robot will be more effective in its
communication and is, therefore, expected to be easier and more pleasant to interact with [Heerink et al., 2010; de Ruyter et al.,
2005; Breazeal, 2003] as well as more readily perceived as a social entity [Heerink et al., 2010]. The sociability of a robot also
affects its usefulness and people’s attitude towards use (Shin & Choo, 2012). Combining the knowledge about anthropomorphism,
realism and sociability, one might assume the possibility for people to regard social robots as potential companions. In the study of
Dautenhahn et al. [2005] on futuristic social roles of robots, 70% of the participants indicated that they would eventually prefer to have companionship with robots. Companionship is defined as the user’s perceived possibility to build a relationship with the robot [Lee, Jung, 2006]. Bonding with a social robot influences the intention to continue using it [Graaf & Ben Allouch, 2012; Kanda et al., 2007; Klamer et al., 2010]. However, the effects of companionship on other variables in the evaluation of social robot acceptance have not been studied before.
Collectively, the inclusion of both utilitarian and hedonic variables will provide a broader overview of the attitudinal belief
structure in the user acceptance of social robots. The next section will describe the social normative beliefs relevant to the process of social robot acceptance.
2.2 Social Normative Beliefs
Social normative beliefs comprise the dispositions people hold towards their general behavior. They include both the dispositions and rules that a group of people uses to determine appropriate and inappropriate values, beliefs, attitudes and behaviors. The literature on technology acceptance pays most attention to social influence and image. The effects of both variables on technology acceptance have been widely acknowledged [Lee, Kozar, Larsen, 2003; Rogers, 1995]. Social influence is defined as the user’s perception of how other people think about using the robot [Karahanna, Limayem, 2000] and it has a positive effect on usefulness,
ease of use, use attitude, use intention, and actual use (Heerink et al., 2010; Lee, Kozar, & Larsen, 2003; Malhotra & Galetta, 1999;
Venkatesh & Davis, 2000). Technology acceptance has been primarily researched in organizational settings (Taylor & Todd, 1995)
where the use is often mandatory. It could be that social influence plays a bigger role in a consumer context where technology
adoption is voluntary (Brown & Venkatesh, 2005). When technology is used within the home, users strongly consider the opinions
of the family [Venkatesh, Brown, 2001]. For the domestic use of social robots, future adopters can, therefore, be affected by
positive feedback from family, friends, and other adopters. Moreover, if significant others support a particular behavior, it is
believed that performing that behavior will tend to elevate one’s status within that group (Venkatesh & Davis, 2000). In addition to
social influence, one might thus seek public recognition achieved from adopting an innovation (Fisher & Price, 1992), such as a
robot. The use of an innovation is sometimes perceived as enhancing one’s self-image or social status in the opinion of significant
others, which, in turn, could have consequences for the user’s acceptance of that innovation (Rogers, 1995). Thus, robots, being a
relatively new technology on the consumer market, might be subject to this process as well. In the context of robotics, image is
defined as the users’ belief that the use of the robot is perceived to enhance one’s image or status in one’s social system [Venkatesh & Davis, 2000]. Studied within the context of human-computer interaction, image has an influence on perceived usefulness (Lee,
Kozar, & Larsen, 2003; Venkatesh & Davis, 2000). Both social influence and image influence the process of technology acceptance, which makes them relevant variables for studying social robot acceptance.
2.3 Control Beliefs
Psychology research in general, and TPB research in particular, have established the effects of the presence of constraints and facilitators on both the intention to perform a behavior and the behavior itself (Ajzen, 1991). Specifically, information systems research has identified several factors as barriers to technology acceptance (Mathieson et al., 2001; Taylor & Todd, 1995). Control beliefs consist of the user’s beliefs about salient control factors, that is, their beliefs about the presence or absence of resources, opportunities and obstacles which may facilitate or impede performance of the behavior. The technology acceptance literature points to three variables of control beliefs which could be relevant for the user acceptance of social robots:
perceived behavioral control, anxiety towards robots and robot related experiences. First, people need to feel confident about their
own capabilities when using a technology. Adapted to robotics, perceived behavioral control is defined as the user’s perceived ease
or difficulty of using the robot [Bandura, 1977] and is particularly relevant for novice users who have not yet acquired the necessary
skills to successfully perform the behavior in question [LaRose et al., 1994], which is the case with human-robot interaction.
Perceived behavioral control has a positive effect on perceived ease of use, intention to use, and actual use (Ajzen, 1991; Heerink et
al., 2010; Karahanna & Limayem, 2000; LaRose & Eastin, 2000; Venkatesh, 2000; Venkatesh & Bala, 2008). Emotional arousal is
important in the formation of perceived behavioral control [Bandura, 1977]. Thus, second, anxiety, being a negative emotional
arousal, might be an important variable to consider when evaluating the acceptance of social robots. Anxiety experienced by users
in relation to technology use tends to generate further anxieties [Sarason, 1975]. Anxiety towards robots is defined as the user’s
anxious or emotional reactions evoked in real and imaginary human-robot interaction scenarios [Nomura et al., 2008]. Anxiety has
an effect on perceived ease of use (Venkatesh, 2000; Venkatesh & Davis, 2000). In contrast to anxiety, enjoyment in using
technology should reduce anxiety and assist people in feeling confident about their ability to successfully execute the needed
actions [Yi, Hwang, 2003]. These implications point to a correlation between enjoyment, anxiety and perceived behavioral control,
which validates their importance for social robot acceptance. Third and last, people have varying levels of exposure to robots
through either media or personal experiences which affects people’s familiarity with robots. Robot related experience is defined as
experience gained with robots, both directly via real face-to-face interactions and indirectly via media from both objective, e.g.
news articles and scientific papers, and subjective sources, e.g. science fiction [MacDorman et al., 2009]. Previous experience with
specific technology has a positive effect on usefulness, ease of use, attitude towards robots, intention to use, and actual use of that
technology (Bartneck, Suzuki, Kanda, etc., 2007; Hackbarth, et al., 2003; Karahanna & Limayem, 2000; Lee, Kozar, & Larsen,
2003). Being familiar with robots has profound effects on the manner in which people perceive the robot’s accessibility, desirability
and expressiveness [Fong et al., 2003]. On the other hand, a lack of familiarity can be a reason to feel uncertain when confronted
with robots [Broadbent et al., 2009]. People’s robot related experiences could influence the user experience and are thus important
for the user acceptance of social robots.
2.4 User Characteristics
In addition to variables regarding the social robot itself and the users’ beliefs about their possible interactions with it, variables involving the users themselves could also be related to the acceptance of these robots. Research has found that age, gender, cultural background, personal innovativeness and the general evaluation of a particular technology influence user acceptance. First,
the influence of age has been confirmed in several studies in human-computer interaction and psychology. Age differences have
been discovered to impact abilities, attitudes, behaviors and willingness to use new technologies [Kuo et al., 2009]. Older people, as
compared to younger people, are more likely to enjoy using and to anthropomorphize a robot, are more susceptible to other people’s
opinions when using technology, show more negative emotions towards robots, and have lower intentions to use a robot
(Broadbent et al., 2009; Lessiter et al., 2001; Libin & Libin, 2004; Scopelliti, Giuliani, & Fornara, 2005; Venkatesh, Thong, and
Xu, 2012). Second, gender has been proposed to be an important variable in human-robot interaction, which might be caused by the
differences between male and female social behavior and their social roles in society [Norman, 2004]. Gender affects how users
react to a robot [Forlizzi, 2007]. Men perceive robots as more useful, show higher intention to use them in the future, and are more
willing to accept these robots in their daily lives compared to females (Arras & Cerqui, 2005; Kuo et al., 2009). This might be due
to the fact that men tend to anthropomorphize a robot more because they perceive the robot as an autonomous person (Lessiter et al., 2001; Schermerhorn et al., 2008). In contrast, the study of Shibata et al. (2009) reports that women feel more comfortable with social robots and are more willing to interact. This might be due to the type of robot, which was the zoomorphic baby seal robot Paro, as females are more open to taking care of artificial companions than males (Turkle, 2011). These results indicate gender differences in the perception of robots. Indeed, gender also affects one’s attitude towards robots and anxiety towards robots (Nomura, Kanda, Suzuki, & Kato, 2008). Third, each culture possesses its own level of exposure to robots through either media or
personal experiences [Broadbent et al., 2009]. For example, the typical “robots will take over the world” scenarios that are often
portrayed in Western cultures is less dominant in Japan (Kaplan, 2004). These cultural differences are expected to affect the
evaluation of the interaction between humans and robots. Indeed, research shows that people of different nationalities differ in their
attitude towards robots (Bartneck et al., 2006; Li, Rau, & Li, 2010) and their robot related experiences (Broadbent, et al., 2009;
Kaplan, 2004). They also tend to rate their experiences with robots differently on enjoyment, anthropomorphism, sociability, and
perceived behavioral control (Li, Rau, & Li, 2010; Libin & Libin, 2004; Nomura, Suzuki, Kanda, Han, Shin, Burke, & Kato, 2008).
Fourth, when people are willing to experiment with innovative technologies, they evaluate these technologies as easier to use
[Serenko, 2008] and have greater intentions to use new technologies [Lee, Kozar, Larsen, 2003]. Personal innovativeness is defined
as the willingness of users to try out any new technology [Agarwal & Karahanna, 2000] and is conceptualized as a trait, a relatively stable descriptor of an individual that is invariant across situational considerations [Agarwal & Prasad, 1998]. Traits are generally not
affected by any environmental or internal variables [Webster et al., 1992], thus, they are not idiosyncratic to a specific configuration
of individual or situational factors. Last, people’s general perceptions towards a technology influence how they evaluate its impact
on society and their understanding of the technology [Brosnan, 1998]. In this line of reasoning, people’s attitudes towards robots
may influence their behavior when confronted with a robot and their acceptance of it within society. Attitude towards robots is
defined as the psychological states reflecting opinions that people ordinarily have about robots [Nomura, Kanda, Suzuki, & Kato,
Research including this measure has discovered that people’s attitudes towards robots influence the level of anxiety when
confronted with a robot (Nomura, Kanda, Suzuki, & Kato, 2008) and the acceptance of assistive robots in people’s own homes
[Nomura et al., 2009]. These studies suggest several effects of attitude towards robots on human behavior towards, and evaluations of, robots, making the inclusion of this concept logical for evaluating social robot acceptance.
2.5 Dependent Variables of Social Robot Acceptance
The outcome variables of technology acceptance are attitude towards use, use intention and actual use. In psychology, an
attitude is known as a relatively stable and enduring predisposition to behave or react in a certain manner towards persons, objects,
institutions, or issues [Chaplin]. With regard to robotics, attitude towards use is defined as the user's positive or negative evaluation of
the use of the robot [Heerink et al., 2010]. If a person’s attitude towards a particular object is known, it can be used to predict and
explain reactions of that person to that class of objects. Attitude towards a certain behavior has a strong, direct and positive effect
on the intention to perform that behavior [Fishbein, Ajzen]. In reference to robotics, intention to use is defined as the indication of
the user's readiness to use the robot [Moon, Kim, 2001]. Together, both of these variables are acknowledged predictors of actual
behavior [Fishbein, Ajzen], making them relevant for studying the user acceptance of social robots.
2.6 Hypotheses
Based on the theoretical overview above, we can draw the following hypotheses:
H1: Actual use is determined by (1) use intention, (2) usefulness, (3) enjoyment, (4) companionship, (5) social influence, (6)
robot related experiences, (7) perceived behavioral control, (8) attitude towards robots, and (9) gender.
H2: Use intention is determined by (1) use attitude, (2) usefulness, (3) ease of use, (4) adaptability, (5) enjoyment, (6)
anthropomorphism, (7) companionship, (8) social influence, (9) robot related experiences, (10) perceived behavioral
control, (11) attitude towards robots, (12) personal innovativeness, (13) age, and (14) gender.
H3: Use attitude is determined by (1) usefulness, (2) ease of use, (3) adaptability, (4) enjoyment, (5) anthropomorphism, (6)
sociability, (7) social influence, and (8) anxiety towards robots.
H4: Usefulness is determined by (1) ease of use, (2) adaptability, (3) enjoyment, (4) attractiveness, (5) anthropomorphism, (6)
sociability, (7) social influence, (8) image, (9) robot related experiences, and (10) gender.
H5: Ease of use is determined by (1) enjoyment, (2) attractiveness, (3) sociability, (4) social influence, (5) robot related
experiences, (6) perceived behavioral control, (7) anxiety towards robots, and (8) personal innovativeness.
H6: Adaptability is determined by (1) realism and (2) sociability.
H7: Intelligence is determined by (1) realism.
H8: Enjoyment is determined by (1) adaptability, (2) attractiveness, (3) anthropomorphism, (4) sociability, (5) anxiety towards
robots, (6) age, and (7) nationality.
H9: Attractiveness is determined by (1) anthropomorphism.
H10: Anthropomorphism is determined by (1) sociability, (2) perceived behavioral control, (3) age, (4) gender, and (5)
nationality.
H11: Realism is determined by (1) intelligence and (2) anthropomorphism.
H12: Sociability is determined by (1) intelligence, (2) realism, and (3) nationality.
H13: Companionship is determined by (1) anthropomorphism, (2) realism, and (3) gender.
H14: Social influence is determined by (1) anthropomorphism and (2) age.
H15: Image is determined by (1) social influence.
H16: Perceived behavioral control is determined by (1) enjoyment, (2) robot related experiences, (3) anxiety towards robots, and
(4) nationality.
H17: Anxiety towards robots is determined by (1) enjoyment, (2) robot related experiences, (3) attitude towards robots, (4) age,
and (5) gender.
H18: Robot related experiences are determined by (1) nationality.
H19: Attitude towards robots is determined by (1) anthropomorphism, (2) robot related experiences, (3) age, (4) gender, and (5)
nationality.
2. METHOD
To explore the hypotheses on social robot acceptance based on the theoretical overview, we performed a laboratory-based user
interaction study with an academic robot platform.
2.1 Social Robot
To examine the user acceptance of social robots, a user study was conducted using the NAO Academic Edition, an autonomous,
programmable, medium-sized humanoid robot developed by Aldebaran Robotics. We employed the NAO for two reasons. First,
NAO is the most widely used robot for academic purposes. Second, NAO is a humanoid robot. Earlier robotics research shows that
people interact more naturally with humanoids because of their familiarity [Fong et al., 2003]. NAO’s body has 25 degrees of
freedom. It has a series of sensors: two cameras, four microphones, a sonar distance sensor, two IR emitters and receivers, one
inertial board, nine tactile sensors, and eight pressure sensors. For expression it has a voice synthesizer, LED lights, and two
speakers. NAO was programmed to interact with the participants according to the procedure described under the scenario in
section 2.2. To appear lifelike, NAO waved as the participants entered the room, and its eyes lit up when it was listening to
the participants’ answers. Salem et al. [2011] found that robots using gestures were evaluated more positively and were perceived
as more anthropomorphic, even when the gestures were incongruent, demonstrating that using any type of gesture
is better than no gestures at all.
2.2 Scenario
The procedure utilized in this study was adapted from Nomura et al. [Nomura et al., 2008]. Participants were requested to have a
casual conversation in which the robot would take the lead. As social robots are designed to interact with people in a natural way,
an informal conversation was deemed an appropriate way to evaluate their acceptance. The robot was pre-programmed to
interact with the participants in a room, and each participant communicated alone with it for a few minutes. The procedure for each
session was as follows:
1: Just prior to entering the room, participants were instructed to take a place at the table where the robot was situated and allow
the robot to take the lead in the acquaintance conversation.
2: Participants entered the room alone. NAO greeted the participant with a hand wave and by saying, “Hello” and, after a brief
interval, stated, “Take a seat, please.”
3: After the participant had taken a seat at the table, the robot said, “I am NAO. What is your name?”
4: After conversing with the robot or after a continuous amount of time (30s) passed, the robot asked, “Can you tell me
something about yourself?”
5: After conversing with the robot again or after a continuous amount of time (30s) passed, the robot then said, “Tell me one
thing that recently happened to you”, in order to encourage participants to respond.
6: After they replied to the robot or a continuous amount of time (30s) passed, the robot uttered a sentence to encourage physical
contact, “Will you touch my head?”
7: After touching the robot or a continuous amount of time (30s) passed, the session was finished by the robot saying, “Nice
meeting you. You may go back to the researcher now.”
8: After the session, the participants completed the questionnaire including age and gender. Afterwards, the researcher conducted
a short debriefing before the participants departed.
The interactions with NAO lasted on average between 5 and 10 minutes, and completing the questionnaire took between 20 and
30 minutes. Although the interaction was programmed, none of the participants mentioned being suspicious about the robot’s
interactivity during the debriefing.
2.3 Questionnaire
The variables are well established in human-computer interaction and human-robot interaction research. Each variable in the
questionnaire has been drawn from an extensive literature review. Only constructs validated in earlier studies were utilized.
However, some items have been adapted to robots. To measure the actual use of the robot by the participants, we asked them how
much they had talked to the robot (from ‘much less than I had expected’ to ‘much more than I had expected’) and whether or not
they had touched the robot when it asked them to do so (yes or no). Actual use was estimated by summing the scores of these
two items. Items were translated into Dutch. The translation was completed by two bilingual speakers using the back-
translation process. This process ensures that meaning and nuance are not lost, and that the translated versions remain as true to the
original construct as possible. Prior to the main study, a pilot test for the questionnaire was conducted to confirm the clarity of the
wording of the questions and answers as well as the relevance of the questions to the variables. The pre-test was undertaken by a
sample of both researchers and graduate students (n=8). Their comments on question variation, meaning, task difficulty, interest
and attention were employed to create the final version of the questionnaire. Completing the final version took approximately 20 to
30 minutes.
TABLE 1: VARIABLES: CODE, EXAMPLE ITEMS AND ITS SOURCE, MEAN, STANDARD DEVIATION, AND ALPHA SCORES
Variable  Source  Initial Items  Final Items  Mean  Standard Deviation  Cronbach’s Alpha
Adaptability [Heerink et al., 2010]. 3 3 3.37 1.21 .62
Anthropomorphism [Bartneck et al., 2009]. 5 5 2.46 0.98 .80
Attitude towards Use [Heerink et al., 2010]. 3 3 3.43 1.20 .75
Attractiveness [Lee, Jung, 2006]. 4 4 4.56 1.30 .85
Behavioral Control [Bandura, 1977]. 10 10 4.42 1.01 .84
Companionship [Lee, Jung, 2006]. 8 8 2.40 0.82 .74
Ease of Use [Heerink et al., 2010]. 4 4 4.40 1.10 .70
Enjoyment [Heerink et al., 2010]. 5 5 3.65 1.01 .71
Image [Venkatesh, Davis, 2002]. 3 3 2.25 0.97 .67
Intelligence [Bartneck et al., 2009]. 5 5 3.41 0.90 .74
Negative Attitude towards Robots [Nomura et al., 2008]. 14 14 4.02 0.75 .75
Personal Innovativeness [Agarwal & Karahanna, 2000]. 4 4 4.32 1.25 .82
Robot Related Experience [MacDorman et al., 2009]. 5 4 2.37 1.34 .63
Realism [Bartneck et al., 2009]. 6 6 2.92 0.93 .77
Robot Anxiety [Nomura et al., 2008]. 11 11 4.07 0.98 .85
Sociability [Rubin, Martin, 1994]. 10 9 2.86 0.67 .71
Social Influence [Karahanna, Limayem, 2000]. 3 3 4.41 1.18 .71
Use Intention [Moon, Kim, 2001]. 3 3 2.98 1.27 .85
Usefulness [Heerink et al., 2010]. 3 3 3.92 1.09 .62
The internal consistency of each construct was tested by calculating Cronbach’s alpha. Reversed coded items were recoded.
Table 1 displays the used variables, their codes, example items from the questionnaire, the mean scores and their standard
deviations, and internal consistency of the constructs. Reliability analyses made it necessary to eliminate some items to improve the
internal consistency to an acceptable level. Some researchers believe that an internal consistency of below .70 is questionable
[Nunnally, 1992], which in our study was the case for adaptability, image, robot related experiences and usefulness. Other sources
advocate that an internal consistency higher than .60 together with both the inter-item correlations and the corrected item-total
correlations ranging between .30 and .70 indicates a favorable scale [Ferketich, 1991]. Therefore, we chose to continue to use these
constructs in our further analysis.
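The reliability check described above can be sketched in a few lines of Python. The function below implements the standard Cronbach's alpha formula, α = k/(k−1) · (1 − Σ item variances / variance of the summed scale); the Likert-style scores are invented for illustration and are not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 6 respondents answering a 3-item 7-point Likert scale.
scores = np.array([
    [4, 5, 4],
    [2, 2, 3],
    [6, 7, 6],
    [3, 3, 4],
    [5, 5, 5],
    [1, 2, 1],
])
print(round(cronbach_alpha(scores), 2))  # → 0.97
```

With real questionnaire data, reverse-coded items would first be recoded, as was done in the study, before computing alpha.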
2.4 Participants
Students from the faculty of behavioral sciences of a Dutch university were recruited to participate in this study in exchange for
credits. A total of 60 students between 18 and 28 years old (M = 20, SD = 2.35) participated in this study. The gender distribution
was nearly equal, with 28 male and 32 female participants. Thirty participants were Dutch and thirty were German.
3. RESULTS
The correlations between all of the variables were examined using Spearman’s correlation. Table 2 displays the significant
correlations. Strong correlations were found between attitude and anxiety towards robots, attitude towards use and intention to use,
and realism and anthropomorphism.
TABLE 2: SIGNIFICANT CORRELATIONS BETWEEN THE VARIOUS VARIABLES
(Column labels, left to right, follow the order of the row labels below: PAD, AM, ATT, PA, PBC, COM, PEOU, PENJ, IM, PI,
NARS, PIIT, RRE, REAL, RAS, SOCO, SI, UI. Variable abbreviations, inferred from Table 1: PAD = perceived adaptability,
AM = anthropomorphism, ATT = attitude towards use, PA = perceived attractiveness, PBC = perceived behavioral control,
COM = companionship, PEOU = perceived ease of use, PENJ = perceived enjoyment, IM = image, PI = perceived intelligence,
NARS = negative attitude towards robots, PIIT = personal innovativeness, RRE = robot related experience, REAL = realism,
RAS = robot anxiety, SOCO = sociability, SI = social influence, UI = use intention, PU = perceived usefulness. The exact
column alignment of the coefficients below could not be recovered from the extraction.)
PAD
AM .29
ATT .53* .26
PA .45* .46*
PBC .33* .41* .29
COM .29 .58* .46* .62* .32 .26
PEOU .40* .36* .60*
PENJ .35 .56* .31 .61* .35* .34* .65* .38*
IM .27 .29
PI .33 .69* .34* .35* .50*
NARS -.28 -.31 -.40* - .43* -.31 -.44* -.32
PIIT -.34*
RRE .40
REAL .73* .42 .43* .42* .26 .60*
RAS -.28 -.27 -.35* .28 .74*
SOCO .42* .60* .53* .37* .60* .27 .64* -.27 .69*
SI .49* .54* .45* .44* .44* .30 .51* .39* .44*
UI .30 .64* .78* .33 .49* .68* .33* .63* .27 .32 -.32 .35* .47* .59*
PU .30* .61* .28 -.32 .30* .50*
Correlations below .30 are considered weak and those above .70 strong; correlations marked with an asterisk are significant at the .01 level.
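The Spearman analysis behind Table 2 can be sketched as follows. The data here are simulated under an assumed monotone relationship (standing in for, e.g., realism and anthropomorphism); this is an illustrative fragment, not a re-analysis of the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
# Hypothetical per-participant scale scores for two constructs (n = 60).
realism = rng.uniform(1, 7, 60)
anthropomorphism = realism + rng.normal(0, 1.0, 60)  # related, plus noise

# Spearman's rho ranks both variables first, so it captures any
# monotone (not just linear) association between the constructs.
rho, p = spearmanr(realism, anthropomorphism)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")
```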
However, correlation scores can only indicate that variables are related; they cannot determine the direction of influence, nor do
they reveal which influences are the dominant determinants of particular variables. Therefore, we used
regression analysis to test the hypotheses as derived from the literature. For each hypothesis, we established the R² value of the
regression, which can be used as an indication of predictive strength. The value of R² lies between 0 and 1; an R² of 0.50, for example,
indicates that 50 per cent of the variance of the dependent variable can be explained by the predicting variables. Table 3 displays
the results of the stepwise multiple regressions. Hypothesis 1 involved nine variables which could explain actual use of a robotic
system. In our sample, only enjoyment remained as a predicting variable for actual use, which explained 11 per cent of the variance.
For hypothesis 2, the theory found fourteen variables that could help explain use intention. In our sample, use attitude,
adaptability, companionship, and perceived behavioral control together explained 71 per cent of the variance of use intention. Hypothesis 3
indicated eight predictors for use attitude. In our sample, usefulness, enjoyment, and sociability together explained 69 per cent of
the variance of use attitude. Hypothesis 4 involved ten variables which could explain the usefulness of the robot. In our sample, only adaptability
remained as a predicting variable for usefulness, which explained only 8 per cent of the variance. For hypothesis 5, the theory found
eight variables that could help explain ease of use. In our sample, together perceived behavioral control and anxiety towards robot
explained 50 per cent of the variance of ease of use. Hypothesis 6 indicated that both realism and sociability influence adaptability.
However, in our sample only sociability explained 16 per cent of the variance of adaptability. Hypothesis 7 stated that realism
explains the intelligence of a robot. This hypothesis was confirmed within our sample, in which realism explained 35 per cent of the
variance of intelligence. For hypothesis 8, the theory provided us with seven variables that could explain enjoyment. In our sample,
both adaptability and attractiveness together explained 34 per cent of the variance of enjoyment. Hypothesis 9 indicated that
anthropomorphism could help explain attractiveness. This hypothesis was confirmed by our sample, in which anthropomorphism
explained 19 per cent of the variance of attractiveness. Hypothesis 10 involved five variables that could explain anthropomorphism. Our
sample indicates that both sociability and nationality together could explain 48 per cent of the variance of anthropomorphism.
Hypothesis 11 stated that intelligence and anthropomorphism explain realism. In our sample, it was only anthropomorphism that
explained 53 per cent of the variance of realism. Hypothesis 12 indicated that intelligence, realism and nationality could explain
sociability. This hypothesis was confirmed by our sample, in which these three variables explained 59 per cent of the variance of
sociability. Hypothesis 13 identified anthropomorphism, realism and gender as explanatory variables for companionship. In our
sample, only anthropomorphism remained as a predicting variable for companionship, which explained 20 per cent of the variance.
For hypothesis 14, the theory found that anthropomorphism and age could help explain social influence. In our sample, age
explained only 7 per cent of the variance of social influence. Hypothesis 15 indicated that social influence predicts image. However,
within our sample, we could not confirm this hypothesis as the regression analysis was not significant. Hypothesis 16 involved four
variables which could explain perceived behavioral control. In our sample, only enjoyment remained as a predicting variable for
perceived behavioral control, which explained only 10 per cent of the variance. For hypothesis 17, the theory found five variables
that could help explain users’ anxiety towards robots. In our sample, it was the users’ attitude towards robots that explained 54 per
cent of the variance of anxiety towards robots. Hypothesis 18 indicated that nationality influences robot related experiences.
However, our sample could not confirm this hypothesis as the regression analysis was not significant. Hypothesis 19 indicated five
variables that could explain users’ attitude towards robots. In our sample, both anthropomorphism and gender together explained 16
per cent of the variance of attitude towards robots.
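As a concrete illustration of how the R² values reported above are obtained, the sketch below fits an ordinary least-squares regression with NumPy and computes the share of explained variance. The predictors, effect sizes, and sample are simulated (use attitude, adaptability, and use intention are placeholder names), and plain OLS stands in for the stepwise procedure used in the paper.

```python
import numpy as np

def r_squared(X: np.ndarray, y: np.ndarray) -> float:
    """Share of variance in y explained by an OLS fit on the columns of X."""
    X1 = np.column_stack([np.ones(len(y)), X])     # add an intercept column
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)  # least-squares coefficients
    ss_res = ((y - X1 @ beta) ** 2).sum()          # residual sum of squares
    ss_tot = ((y - y.mean()) ** 2).sum()           # total sum of squares
    return 1 - ss_res / ss_tot

rng = np.random.default_rng(1)
# Simulated 7-point scale data for 60 participants (assumed effect sizes).
use_attitude = rng.normal(3.4, 1.2, 60)
adaptability = rng.normal(3.4, 1.2, 60)
use_intention = 0.5 * use_attitude + 0.2 * adaptability + rng.normal(0, 0.6, 60)

X = np.column_stack([use_attitude, adaptability])
print(f"R² = {r_squared(X, use_intention):.2f}")
```

A stepwise procedure would additionally add or drop predictors based on significance at each step; the R² computation itself is the same.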
TABLE 3: REGRESSION ANALYSIS ON THE THEORETICAL HYPOTHESES
Hypothesis Independent Variable Dependent Variable Beta t-value Sig. R²
H1 Use Intention Use .13 .804 .424 .11
Usefulness -.18 -1.411 .164
Enjoyment .35 2.875 .006
Companionship .11 .681 .499
Social Influence -.18 -1.235 .222
Robot Related Experiences .02 .151 .880
Perceived Behavioral Control .05 .359 .721
Attitude towards Robots -.16 -1.163 .250
Gender -.07 -.576 .576
H2 Use Attitude Use Intention .46 4.668 .000 .71
Usefulness .15 1.630 .109
Ease of Use -.07 -.792 .432
Adaptability .20 2.183 .033
Enjoyment .06 .574 .568
Anthropomorphism -.09 -1.169 .248
Companionship .36 2.373 .021
Social Influence .01 1.267 .210
Robot Related Experiences .01 .168 .867
Perceived Behavioral Control .18 2.244 .029
Attitude towards Robots .02 .281 .779
Personal Innovativeness -.01 -.143 .887
Age .04 .572 .570
Gender .00 .119 .906
H3 Usefulness Use Attitude .45 5.964 .000 .69
Ease of Use .12 1.508 .137
Adaptability .05 .508 .613
Enjoyment .39 4.977 .000
Anthropomorphism -.13 -1.423 .160
Sociability .37 4.860 .000
Social Influence .07 .756 .453
Anxiety towards Robots .02 .280 .780
H4 Ease of Use Usefulness .12 .952 .345 .08
Adaptability .30 2.407 .019
Enjoyment .15 1.023 .311
Attractiveness .11 .815 .418
Anthropomorphism -.09 -.719 .475
Sociability .01 .084 .933
Social Influence .20 1.419 .161
Image -.05 -.349 .728
Robot Related Experiences .12 .959 .341
Gender -.23 -1.834 .072
H5 Enjoyment Ease of Use .11 1.084 .283 .50
Attractiveness .15 1.589 .118
Sociability -.03 -.310 .758
Social Influence .08 .763 .449
Robot Related Experiences .01 .074 .941
Perceived Behavioral Control .62 6.743 .000
Anxiety towards Robots -.39 -4.225 .000
Personal Innovativeness .11 .074 .258
H6 Realism Adaptability -.10 -.584 .561 .16
Sociability .42 3.526 .001
H7 Realism Intelligence .60 5.754 .000 .35
H8 Adaptability Enjoyment .50 4.618 .000 .34
Attractiveness .23 2.085 .042
Anthropomorphism .07 .606 .547
Sociability -.03 -.270 .788
Anxiety towards Robots -.09 -.801 .427
Age .09 .791 .432
Nationality -.15 -1.393 .169
H9 Anthropomorphism Attractiveness .45 3.795 .000 .19
H10 Sociability Anthropomorphism .63 6.638 .000 .48
Perceived Behavioral Control .07 .739 .463
Age -.09 -.953 .345
Gender .01 .108 .914
Nationality -.37 -3.922 .000
H11 Intelligence Realism .18 1.503 .138 .53
Anthropomorphism .73 8.192 .000
H12 Intelligence Sociability .38 3.651 .001 .59
Realism .50 4.790 .000
Nationality .23 2.734 .008
H13 Anthropomorphism Companionship .46 3.947 .000 .20
Realism .18 1.064 .292
Gender -.05 -.436 .665
H14 Anthropomorphism Social Influence .18 1.364 .178 .05
Age -.26 -2.031 .047
H15 Social Influence Image .24 1.858 .068 -
H16 Enjoyment Perceived Behavioral Control .34 2.724 .009 .10
Robot Related Experiences .05 .420 .676
Anxiety towards Robots .14 1.095 .278
Nationality .18 1.415 .162
H17 Enjoyment Anxiety towards Robots .14 1.420 .161 .54
Robot Related Experience .05 .499 .620
Attitudes towards Robots .74 8.336 .000
Age -.05 -.566 .573
Gender .07 .728 .470
H18 Nationality Robot Related Experiences -.05 -.352 .726 -
H19 Anthropomorphism Attitude towards Robots -.25 -2.084 .042 .16
Robot Related Experiences -.03 -.252 .802
Age .02 .105 .917
Gender .36 2.997 .004
Nationality .17 1.164 .249
Besides testing the hypotheses, we also investigated the existence of socio-demographic differences in the sample. Gender
differences in the sample were examined utilizing an independent T-test. Females possessed a more negative attitude towards
robots (t= -2.937, p= .005), exhibited higher anxiety towards robots (t= -2.627, p= .011), and were more influenced by their social
network (t= -2.641, p= .013) than males. However, male participants exhibited a higher interest in innovative technologies (t=
4.475, p= .000) and had more robot related experiences (t= 3.765, p= .000) than females. Another set of independent T-tests was
used to explore the effects of different nationalities in the sample. Germans had a more negative attitude towards robots (t= -2.589,
p= .024), and were more influenced by their social network (t= -2.300, p= .015) than Dutch participants. However, Dutch
participants tended to anthropomorphize the robot more (t= 2.588, p= .013) and had more robot related experiences (t= 2.333, p=
.024) than Germans. Age effects were analyzed with regression analysis. Age affected usefulness and social influence. Usefulness
increased with age (β = 0.23, p = .049, R² = .07), but the influence of a participant’s social network decreased with age (β = -0.26,
p = .047, R² = .07).
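The gender comparisons above can be sketched with an independent-samples t-test. The robot-anxiety scores below are simulated with an assumed group difference, matching only the study's group sizes (28 males, 32 females); the resulting t and p values are illustrative, not the study's.

```python
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(2)
# Hypothetical robot-anxiety scale means on a 7-point scale.
anxiety_male = rng.normal(3.8, 0.9, 28)    # assumed lower anxiety
anxiety_female = rng.normal(4.4, 0.9, 32)  # assumed higher anxiety

# Two-sided independent-samples t-test (equal variances assumed).
t, p = ttest_ind(anxiety_male, anxiety_female)
print(f"t = {t:.3f}, p = {p:.3f}")
```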
4. DISCUSSION
This study yields some intriguing insights into the variables influencing social robot acceptance, their importance and
interrelationships. In scrutinizing the social robot acceptance variables, our study indicates both resemblances to and differences
from other studies. Despite the many hypothetically assigned explanatory variables of actual use, in our study only enjoyment was
found to explain the actual use of the robot. It could be that, within the context of our study, there are other aspects of either the user
or the interaction experiences that could better explain how the users interacted with the robot. However, another explanation lies in
the way we measured actual use. We measured actual use as the quantity of spoken words and the existence of physical contact
between the user and the robot. As our participants had only a short interaction with the robot, this seemed to be an appropriate way
to measure aspects of the actual use of the robot. However, other measures of actual use have been suggested. A review on
technology acceptance research identified that actual use is measured with a combination of the frequency of use, duration of use,
actual number of uses, and diversity of use (Lee, Kozar, & Larsen, 2003). Future human-robot interaction studies could
demonstrate whether other results emerge when the theoretically assigned explanatory variables of actual use, as indicated in
hypothesis 1, are tested with these measures of actual use. When looking at use intention, we have identified use attitude, adaptability,
companionship and perceived behavioral control as the explanatory factors. Moreover, use attitude can be explained by usefulness,
enjoyment and sociability. Together, these acceptance variables should be given more attention in human-robot interaction research
to improve users’ interaction experiences.
This paragraph will further discuss these variables of social robot acceptance. First, similar to previous studies [Fishbein &
Ajzen] [Heerink et al., 2010], we have also detected a direct positive influence of attitude towards use on intention to use. However,
when looking at the UTAUT model [Venkatesh, 2003], the direct link between attitude and intention to use was omitted. Future
studies can better clarify this relationship in the context of social robot acceptance. Second, contrary to most former TAM research
stressing the importance of usefulness in the technology acceptance process [Lee, Kozar, Larsen, 2003], it is rather surprising that,
in our study, usefulness only has a direct influence on attitude towards use. One explanation could be that in the context of our
study (the task of the robot engaging in casual conversation), usefulness does not have the same function as in earlier TAM
research. Our study thus reveals that, when the purpose of a robot is to act sociably, users will focus on other variables of that robot.
This yields interesting questions about when certain variables become important for acceptance of social robots. Is it the task
assigned to participants, the type of users, the robot itself being used, the duration of use, etcetera? A better understanding of the
variables that influence social robot acceptance is required using various types of technologies within various contexts of use.
Third, the influence of adaptability on both usefulness and use intention is consistent with earlier findings [Broadbent et al., 2009]
[Heerink et al., 2010]. This implies that a social robot must be able to adapt to the changing needs of its users. Sociability was
found to explain the adaptability of the robot in our study. However, our results could not give profound insight into what makes
users perceive social robots as adaptive to their needs, as sociability could only explain a little of the variance of adaptability. Future
robotics research should further investigate the aspects of the robot and the users’ interaction experience that could better help
explain what makes a robot perceived as adaptive to user needs. Fourth, the influence of enjoyment on use attitude is confirmed by
our study and has been established before in human-robot interaction studies (Shin & Choo, 2011). However, the existence of a
direct relationship between enjoyment and use intention in earlier studies (Heerink et al., 2010; Shin & Choo, 2011) has not been
found in our study. It could be that, in the presence of other variables in the explanation of use intention, the influence of enjoyment
is predominated by the other variables. Future research on the evaluation of human-robot interactions should further analyze the
effect of enjoyment on user acceptance of social robots. Fifth, when users perceive social robots as companions and are willing to
build a relationship with them, they are more likely to continue interacting with these robots [Kanda et al., 2007; Klamer et al.,
2010]. Our results confirm the importance of companionship with a direct positive influence on intention to use, making this
variable particularly relevant for social robot acceptance. Unfortunately, our study could only explain a little of the variance of
companionship. One explanation for this is the short interaction the users had with the robot. Relationships develop over time
(Levinger, 1983), thus to increase the validity of the companionship measurement we need long-term studies with repeated
interactions between the user and the robot. Future research should further investigate what makes people want to build
relationships with robots. As people interact with robots in a somewhat similar way as with other humans (Kahn, Friedman, 2006;
Lee, Park, Song, 2005), theories of human-human relationships might shed light on the reasons behind the existence of human-
robot relationships. Sixth, most variables that correlate with intelligence also correlate with sociability. Additionally, the
correlations with sociability were also stronger. Where intelligence measures the general cognitive competence, sociability
evaluates the social intelligence. This indicates that, for social robot acceptance, sociability is more important than general system
intelligence. This is in line with other studies advocating the need for implementing social capacities into social robots [de Ruyter et
al., 2005; Breazeal, 2003]. Seventh and finally, consistent with earlier findings [Karahanna, Limayem, 2000], our results also point to
the influence of perceived behavioral control on ease of use. This supports the notion that behavioral control is particularly relevant for the
acceptance of novice users [LaRose et al., 1994], as the participants in this study were not familiar with using robots.
In addition to examining the acceptance variables, we also presented some demographic differences. Our study has confirmed
that gender differences do exist within human-robot interaction scenarios. However, our results were not entirely in line with other
research. Shibata et al. [Shibata et al., 2009] found that females are more comfortable with social robots. Contradictory to this, we
found that females exhibited higher anxiety towards our robot than did males. One explanation could be the use of a different robot.
Paro, the baby seal robot, is more cuddlesome than NAO, the humanoid robot used in this study. A different reason for inconsistent
outcomes is the age of the participants. Elderly people participated in their study, versus students in ours. Age has been found to
have an influence on attitudes towards technology [LaRose et al., 1994] which would then explain the different outcomes. Another
demographic result was the influence of different nationalities of participants, which has been indicated in previous research.
Similar to previous findings [Broadbent et al.] [MacDorman et al., 2009], our study also demonstrates that people from different
countries vary in their robot related experiences. Moreover, we have found that being of a certain nationality has an influence on the
attitude people hold towards robots, which is in line with previous findings [Li et al., 2010]. These differences might be due to the
different levels of exposure to robots [Broadbent et al., 2009]. Additionally, the results of our study indicated that German
participants perceived the robot as less anthropomorphic compared to Dutch participants. Similar nationality differences in
anthropomorphism have been discovered in an earlier study [Li et al., 2010]. Future robotics
research should thus carefully consider their participants as demographic variety could influence established results.
As with every study, this study also has limitations. Being a short-term study, a novelty effect may have been
measured. However, this holds true for every short-term study. Our study design is based on and in line with earlier studies, e.g.
Heerink et al. [Heerink et al., 2010] and Nomura et al. [Nomura et al., 2008]. Short-term interaction studies appear to be a standard
method of addressing user experience evaluations within robotics. To avoid novelty effects in the future, our current follow-up study
focuses on long-term effects in social robot acceptance using the same variables as were used in this study. Also, the robot used in
this study creates a limitation in that the results are only generalizable to other humanoid robots. The same holds true for the chosen
scenario and the participants. The main goal of the present study is to investigate which variables to include when evaluating social
robots. Our findings have not yet been tested in different contexts. Currently, we are testing whether our findings hold when
employing other types of robots, scenarios and users.
5. CONCLUSION
The aim of this paper is to provide insight into the influence of various variables for the user acceptance of social robots. To be
able to explain social robot acceptance and use, it is essential to understand the determinants of the key acceptance variables
(Venkatesh & Morris, 2000). To accomplish this, an extensive literature review of research has been conducted and has provided us
with a set of variables identified in social robots and human-robot interaction combined with those from information systems,
human-computer interaction, psychology and communication science. Hence, we have defined a new set of variables to evaluate
the user acceptance of social robots, arguing that some specific hedonic factors in particular are currently being overlooked in social
robotics acceptance research. Some of these variables, e.g. companionship, had never before been considered for inclusion
in social robot acceptance research. Moreover, the indicated interrelationship between sociability, companionship and the original TAM
variables entails an important contribution to robotics research as, up until recently [Shin & Choo, 2011], previous studies have
only focused on either the influence of social presence or on the determinants of usefulness and ease of use. Given the autonomous
interactivity of robotics technology, the complex interrelationships between social presence and technology acceptance variables
should be further investigated and clarified by future robotics research.
Overall, while more testing of our results is needed in various contexts using different robots with various user groups, the present study provides an extensive overview of important variables that should be considered when evaluating or creating social robots. Our results confirm that usefulness, adaptability, enjoyment, sociability, companionship and perceived behavioral control are key variables in explaining social robot acceptance. Hence, the present study contributes to human-robot interaction research by designating the variables that lead to social robot acceptance. Subsequently, this study may serve as a starting point for developing an integral model that takes into consideration the relevant determinants of social robot acceptance.

REFERENCES
[1] C.L. Breazeal, Emotion and sociable humanoid robots, Int. J. of Human-Computer Studies 59 (2003) 119-155.
[2] V. Venkatesh, M.G. Morris, Why don't men ever stop to ask for directions?: Gender, social influence, and their role in
technology acceptance and usage behavior, Management Information Systems 24 (2000) 115-136.
[3] F.D. Davis, Perceived usefulness, perceived ease of use, and user acceptance of information technology, MIS Quarterly 13
(1989) 319–340.
[4] V. Venkatesh, M.G. Morris, G.B. Davis, F.D. Davis, User acceptance of information technology: Toward a unified view,
MIS Quarterly 27 (2003) 425-478.
[5] D.A. Norman, Emotional design. Basic Books, New York, NY, 2004.
[6] M.M.A. de Graaf, S. Ben Allouch, Harvey’s last appearance: Long-term use and acceptance of social robots, The 62nd Int.
Communication Association Conference, Phoenix, AZ, 2012.
[7] M. Heerink, B. Krose, V. Evers, B. Wielinga, Assessing the acceptance of assistive social agent technology by older users:
The Almere model, Int. J. of Social Robotics 2, 2010.
[8] B. de Ruyter, P. Saini, P. Markopoulos, A. van Breemen, Assessing the effects of building social intelligence in a robotic
interface for the home, Interacting with Computers 17 (2005) 522-541.
[9] A. Weiss, R. Bernhaupt, M. Tscheligi, The USUS evaluation framework for user-centered HRI, in: K. Dautenhahn, J.
Saunders (eds). New frontiers in human-robot interaction 2, John Benjamins Publishing Co, Amsterdam, The Netherlands,
2011, pp. 89-110.
[10] J.E. Young, R. Hawkins, E. Sharlin, T. Igarashi, Towards acceptable domestic robots: Applying insights from social
psychology, Int. J. of Social Robotics 1 (2007) 95-108.
[11] I. Ajzen, M. Fishbein, The influence of attitudes on behavior, in: D. Albarracín, B.T. Johnson, M.P. Zanna (eds.). The
handbook of Attitudes, Erlbaum, Mahwah, NJ, USA, 2005, pp. 173-221.
[12] M. Hassenzahl, N. Tractinsky, User experience: A research agenda, Behaviour & Information Technology 25 (2006) 91-97.
[13] H. van der Heijden, User acceptance of hedonic information systems, MIS Quarterly 28 (2004) 695-704.
[14] R.M. Ryan, E.L. Deci, Intrinsic and Extrinsic Motivations: Classic Definitions and New Directions, Contemporary
Educational Psychology 25 (2000) 54–67.
[15] R.J. Vallerand, Toward a Hierarchical Model of Intrinsic and Extrinsic Motivation, Advances in Experimental Social
Psychology 29 (1997) 271-360.
[16] Y. Lee, K.A. Kozar, K.R.T. Larsen, The Technology Acceptance Model: Past, present and future, Communications of the
Association for Information Systems 12 (2003) 752-780.
[17] M. Heerink, B. Krose, V. Evers, B. Wielinga, Measuring acceptance of an assistive social robot: A suggested toolkit, The
18th IEEE International Symposium on Robot and Human Interactive Communication, Toyama, Japan, 2009.
[18] D.H. Shin, H. Choo, Modeling the acceptance of socially interactive robotics, Interaction Studies 12 (2011) 430-460.
[19] T. Ahn, S. Ryu, I. Han, The impact of Web quality and playfulness on user acceptance of online retailing, Information &
Management 44 (2007) 263-275.
[20] T. Chesney, An acceptance model for useful and fun information systems, Human Technology 2 (2006) 225-235.
[21] J. Goetz, S. Kiesler, A. Powers, Matching robot appearance and behavior to tasks to improve human-robot cooperation,
Proceedings of the 12th IEEE Int. Workshop on Robot and Human Interactive Communication, Milbrae, CA, 2003.
[22] T. Fong, I. Nourbakhsh, K. Dautenhahn, A survey of socially interactive robots, Robotics and Autonomous Systems 42
(2003) 143-166.
[23] E. Broadbent, R. Stafford, B. MacDonald, Acceptance of healthcare robots for the older population: Review and future
directions, Int. J. of Social Robotics 1 (2009) 319-330.
[24] V. Venkatesh, F.D. Davis, A theoretical extension of the technology acceptance model: Four longitudinal field studies,
Management Science 46 (2000) 186-204.
[25] C. Bartneck, D. Kulić, E. Croft, S. Zoghbi, Measurement instruments for the anthropomorphism, animacy, likeability,
perceived intelligence, and perceived safety of robots, Int. J. of Social Robotics 1 (2009) 71-81.
[26] R.H. Cuijpers, M.T. Bruna, J.R.C. Ham, E. Torta, Attitude towards robots depends on interaction but not on anticipatory
behavior, International Conference on Social Robotics, Amsterdam, The Netherlands, 2011.
[27] S.A. Brown, V. Venkatesh, Model of adoption of technology in households: A baseline model test and extension
incorporating household life cycle, MIS Quarterly 29 (2005) 399-426.
[28] H. van der Heijden, Factors influencing the use of websites: The case of a generic portal in The Netherlands, Information &
Management 40 (2003) 541-549.
[29] A.H. Eagly, R.D. Ashmore, M.G. Makhijani, L.C. Longo, What is beautiful is good, but…: A meta-analytic review of
research on the physical attractiveness stereotype, Psychological Bulletin 110 (1991) 109-128.
[30] K. Dion, E. Berscheid, E. Walster, What is beautiful is good, J. of Personality and Social Psychology 24 (1972) 285-290.
[31] B.N. Schenkman, F.U. Jönsson, Aesthetics and preferences of webpages, Behaviour and Information Technology 19 (2000)
367-377.
[32] V. Groom, C. Nass, T. Chen, A. Nielsen, J.K. Scarborough, E. Robles, Evaluating the effects of behavioral realism in
embodied agents, Int. J. of Human-Computer Studies 67 (2009) 842-849.
[33] B. Reeves, C.I. Nass, The media equation: How people treat computers, television, and new media like real people and
places. CSLI Publications, New York, NY, 1996.
[34] P. H. Kahn, B. Friedman, D.R. Perez-Granados, N.G. Freier, Robotic pets in the lives of preschool children, Interaction
Studies 7 (2006) 405-436.
[35] B.R. Duffy, Anthropomorphism and the social robot, Robotics and Autonomous Systems 42 (2003) 177-190.
[36] N. Epley, A. Waytz, J.T. Cacioppo, On seeing human: A three-factor theory of anthropomorphism, Psychological Review
114 (2007) 864-886.
[37] K.M. Lee, N. Park, H. Song, Can a robot be perceived as a developing creature?: Effects of a robot’s long-term cognitive
developments on its social presence and people’s social responses towards it, Human Communication Research 31 (2005)
538-563.
[38] K.M. Lee, Y. Jung, J. Kim, S.R. Kim, Are physically embodied social agents better than disembodied social agents?: The
effects of physical embodiment, tactile interaction, and people's loneliness in human-robot interaction, Int. J. of Human-
Computer Studies 64 (2006) 962-973.
[39] C. Bartneck, T. Kanda, O. Mubin, A. Al Mahmud, Does the design of a robot influence its animacy and perceived
intelligence, International Journal of Social Robotics 1 (2009) 195-204.
[40] A.V. Libin, E.V. Libin, Person-robot interactions from the robopsychologists' point of view: The robotic psychology and
robotherapy approach, Proceedings of the IEEE 92 (2004) 1789-1803.
[41] R.B. Rubin, M.M. Martin, Development of a measure of interpersonal communication competence, Communication
Research Reports 11 (1994) 33-44.
[42] K. Dautenhahn, S. Woods, C. Kaouri, M.L. Walters, K.L. Koay, I. Werry, What is a robot companion: Friend, assistant or
butler?, Proceedings of the IEEE / RSJ Int. Conference on Intelligent Robots and Systems, Edmonton, Canada, 2005.
[43] T. Kanda, R. Sato, N. Saiwaki, H. Ishiguro, A two-month field trial in an elementary school for long-term human–robot
interaction, IEEE Transactions on Robotics 23 (2007) 962-971.
[44] T. Klamer, S. Ben Allouch, D. Heylen, Adventures of Harvey: Use, acceptance of and relationship building with a social
robot in a domestic environment, Proceedings of the 3rd Int. Conference on Human Robot Personal Relationships LNCS,
Leiden, The Netherlands, Springer-Verlag, 2010.
[45] E.M. Rogers, Diffusion of innovations, fifth edition, The Free Press, New York, NY, 2003.
[46] E. Karahanna, M. Limayem, E-mail and V-mail usage: Generalizing across technologies, J. of Organizational Computing and
Electronic Commerce 10 (2000) 49-66.
[47] Y. Malhotra, D.F. Galletta, Extending the technology acceptance model to account for social influence: Theoretical bases and
empirical validation, Proceedings of the 32nd Hawaii International Conference on System Sciences, Maui, Hawaii, USA,
1999.
[48] S. Taylor, P.A. Todd, Understanding information technology usage: A test of competing models, Information System
Research 6 (1995) 144-176.
[49] S.A. Brown, V. Venkatesh, Model of adoption of technology in households: A baseline model test and extension
incorporating household life cycle, MIS Quarterly 29 (2005) 399-426.
[50] V. Venkatesh, S.A. Brown, A longitudinal investigation of personal computers in homes: Adoption determinants and
emerging challenges, MIS Quarterly 25 (2001) 71-102.
[51] R.J. Fisher, L.L. Price, An investigation into the social context of early adoption behavior, Journal of Consumer Research 19
(1992) 477-486.
[52] I. Ajzen, The theory of planned behavior, Organizational Behavior and Human Decision Processes 50 (1991) 179-221.
[53] K. Mathieson, E. Peacock, W.W. Chin, Extending the technology acceptance model: The influence of perceived user
resources, The DATA BASE for Advances in Information Systems 32 (2001) 86-112.
[54] A. Bandura, Self-efficacy: Toward a unifying theory of behavioral change, Psychological Review 84 (1977) 191-215.
[55] R. LaRose, M.S. Eastin, A social cognitive theory of Internet uses and gratifications: Toward a new model of media
attendance, J. of Broadcasting & Electronic Media 48 (2004) 358-377.
[56] V. Venkatesh, Determinants of perceived ease of use: Integrating control, intrinsic motivation, and emotion into the
technology acceptance model, Information Systems Research 11 (2000) 342-365.
[57] V. Venkatesh, H. Bala, Technology acceptance model 3 and a research agenda on interventions, Decision Sciences 39 (2008)
273-315.
[58] I.G. Sarason, Anxiety and self-preoccupation, in: I.G. Sarason, D.C. Spielberger (Eds.), Stress and anxiety, second volume,
Hemisphere, Washington, D.C., 1975, pp. 27-44.
[59] T. Nomura, T. Kanda, T. Suzuki, K. Kato, Prediction of human behavior in human-robot interaction using psychological
scales for anxiety and negative attitudes towards robots, IEEE Transaction on Robotics 24 (2008) 442-451.
[60] M.Y. Yi, Y. Hwang, Predicting the use of web-based information systems: Self-efficacy, enjoyment, learning goal orientation
and the technology acceptance model, Int. J. of Human-Computer Studies 59 (2003) 431-449.
[61] K.F. MacDorman, S.K. Vasudevan, C.C. Ho, Does Japan really have robot mania?: Comparing attitudes by implicit and
explicit measures, AI & Society 23 (2009) 485-510.
[62] C. Bartneck, T. Suzuki, T. Kanda, T. Nomura, The influence of people’s culture and prior experiences with AIBO on their
attitude towards robots, AI & Society 21 (2006) 217-230.
[63] G. Hackbarth, V. Grover, M.Y. Yi, Computer playfulness and anxiety: positive and negative mediators of the system
experience effect on perceived ease of use, Information & Management 40 (2003) 221-232.
[64] I.H. Kuo, J.M. Rabindran, E. Broadbent, Y.I. Lee, N. Kerse, R.M.Q. Stafford, B.A. MacDonald, Age and gender factors in
user acceptance of healthcare robots, The 18th IEEE Int. Symposium on Robot and Human Interactive Communication,
Toyama, Japan, 2009.
[65] J. Lessiter, J. Freeman, E. Keogh, J. Davidoff, A cross-media presence questionnaire: The ITC-Sense of presence inventory,
Presence-Teleoperators and Virtual Environments 10 (2001) 282-297.
[66] M. Scopelliti, M.V. Giuliani, F. Fornara, Robots in a domestic setting: A psychological approach, Universal Access in the
Information Society 4 (2005) 146-155.
[67] V. Venkatesh, J.Y.L. Thong, X. Xu, Consumer acceptance and use of information technology: Extending the unified theory
of acceptance and use of technology, MIS Quarterly 36 (2012) 157-178.
[68] J. Forlizzi, How robotic products become social products: An ethnographic study of robotic products in the home, Proceedings
of the ACM / IEEE Int. Conference on Human-Robot Interaction, Arlington, VA, 2007.
[69] K.O. Arras, D. Cerqui, Do we want to share our lives and bodies with robots?, Lausanne, Switzerland, Swiss Federal Institute
of Technology Lausanne, EPFL, 2005.
[70] P. Schermerhorn, M. Scheutz. C.R. Crowell, Robot social presence and gender: Do females view robots differently than
males?, Proceedings of the 3rd ACM / IEEE International Conference on Human Robot Interaction, Amsterdam, The
Netherlands, 2008.
[71] T. Shibata, K. Wada, Y. Ikeda, S. Sabanovic, Cross-cultural studies on subjective evaluation of a seal robot, Advanced
Robotics 23 (2009) 443-458.
[72] S. Turkle, Alone together: Why we expect more from technology and less from each other, Basic Books, New York, NY,
USA, 2011.
[73] F. Kaplan, Who is afraid of the humanoid?: Investigating cultural differences in the acceptance of robots, International
Journal of Humanoid Robotics 1 (2004) 1-16.
[74] D. Li, P.L.P. Rau, Y. Li, A cross-cultural study: Effects of robot appearance and task, Int. J. of Social Robotics 2 (2010) 175-
186.
[75] T. Nomura, T. Suzuki, T. Kanda, J. Han, N. Shin, J. Burke, K. Kato, What people assume about humanoid and animal-type
robots: Cross-cultural analysis between Japan, Korea, and the United States, International Journal of Humanoid Robotics 5
(2008) 25-46.
[76] A. Serenko, A model of user adoption of interface agents for email notification, Interacting with Computers 20 (2008) 461-
472.
[77] R. Agarwal, E. Karahanna, Time flies when you're having fun: Cognitive absorption and beliefs about IT usage, MIS
Quarterly 24 (2000) 665-694.
[78] R. Agarwal, J. Prasad, The antecedents and consequents of user perceptions in information technology adoption, Decision
Support Systems 22 (1998) 15-29.
[79] J. Webster, J.J. Martocchio, Microcomputer playfulness: Development of a measure with workplace implications, MIS
Quarterly, 16 (1992) 201-226.
[80] M. Brosnan, Technophobia: The psychological impact of information technology, Routledge, London, UK, 1998.
[81] T. Nomura, T. Kanda, T. Suzuki, S. Yamada, K. Kato, Influences of concerns toward emotional interaction into social
acceptability of robots, Proceedings of the 4th ACM / IEEE Int. Conference on Human-Robot Interaction, Boston, MA, 2009.
[82] J.P. Chaplin, Dictionary of psychology, second ed., Dell Publishing Company, New York, NY, 1991.
[83] M. Fishbein, I. Ajzen, Belief, attitude and behavior: An introduction to theory and research, Addison-Wesley, Reading, UK,
1975.
[84] J.W. Moon, J.G. Kim, Extending the TAM for a world-wide-web context, Information & Management 38 (2001) 217-230.
[85] M. Salem, F. Eyssel, K. Rohlfing, S. Kopp, F. Joublin, Effects of gesture on the perception of psychological
anthropomorphism: A case study with a humanoid robot, Proceedings of the Int. Conference on Social Robotics, Amsterdam,
The Netherlands, 2011.
[86] J.C. Nunnally, I.H. Bernstein, Psychometric theory, McGraw-Hill, Sydney, Australia, 1994.
[87] S. Ferketich, Focus on psychometrics: Aspects of items analysis, Research in Nursing & Health 14 (1991) 165-168.
[88] G. Levinger, Development and change, In: H.H. Kelley, E. Berscheid, A. Christensen, J.H. Harvey, T.L. Huston, G.
Levinger, E. McClintock, L.A. Peplau, D.R. Peterson (eds.), Close relationships, WH Freeman, New York, NY, USA, 1983,
pp. 315-359.
