1 Introduction
1.1 Background
As vehicles with autonomous features become standard in today’s market, so too does
our need to understand the intricate role human trust plays in the operation of these
vehicles. Indeed, human trust towards autonomous vehicles (AVs) has become a salient issue in
the human-robot interaction literature. For example, when an autonomous car has
anthropomorphized features, humans are more likely to trust the vehicle [1]. However,
previous research has indicated that humans are poor at monitoring automated systems
[2–5]. Despite numerous advances in technology, autonomous systems still remain
prone to automation failures [6, 7]. In addition to technical problems, there are a
number of human factors design issues facing AV designers, such as displayed
information, situation awareness, level of training and experience, control design,
support from backup personnel or systems, data-link delays, and cognitive load
limitations [3].
© Springer International Publishing Switzerland 2015
C. Stephanidis (Ed.): HCII 2015 Posters, Part II, CCIS 529, pp. 610–615, 2015.
DOI: 10.1007/978-3-319-21383-5_102
Measuring Trust of Autonomous Vehicles 611
2 Methods
2.1 Participants
A total of 400 participants from the University of Central Florida will be recruited for
participation in this study. Participants can sign up for this study via SONA, the
university’s online research participant pool. Extra credit will be awarded in exchange
for participation.
612 D. Garcia et al.
2.2 Materials
Vignettes reflecting the five levels of vehicle autonomy as identified by the National
Highway Traffic Safety Administration (NHTSA) [8] will be presented. Each vignette describes
the features of the corresponding level of autonomy as follows:
• Level 0. The driver is in complete and sole control of the primary vehicle controls
(brake, steering, throttle, and motive power) at all times, and is solely responsible
for monitoring the roadway and for safe operation of all vehicle controls. This
vehicle may have certain driver support/convenience systems, but these do not have
control authority over steering, braking, or throttle. Examples include systems that
provide only warnings (e.g., forward collision warning, lane departure warning,
blind spot monitoring) as well as systems providing automated secondary controls
such as wipers, headlights, turn signals, and hazard lights.
• Level 1. Automation at this level involves one or more specific control functions; if
multiple functions are automated, they operate independently from each other. The
driver has overall control, and is solely responsible for safe operation, but can
choose to cede limited authority over a primary control (as in adaptive cruise
control); the vehicle can automatically assume limited authority over a primary
control (as in electronic stability control); or the automated system can provide
added control to aid the driver in certain normal driving or crash-imminent
situations (e.g., dynamic brake support in emergencies). The vehicle may have multiple
capabilities combining individual driver support and crash avoidance technologies,
but does not replace driver vigilance and does not assume driving responsibility
from the driver. The vehicle’s automated system may assist or augment the driver in
operating one of the primary controls – either steering or braking/throttle controls
(but not both). As a result, there is no combination of vehicle control systems
working in unison that enables the driver to be disengaged from physically
operating the vehicle by having his or her hands off the steering wheel AND feet off the
pedals at the same time. Examples of function-specific automation systems include:
cruise control, automatic braking, and lane keeping.
• Level 2. This level involves automation of at least two primary control functions
designed to work in unison to relieve the driver of control of those functions.
Vehicles at this level of automation can utilize shared authority when the driver
cedes active primary control in certain limited driving situations. The driver is still
responsible for monitoring the roadway and safe operation and is expected to be
available for control at all times and on short notice. The system can relinquish
control with no advance warning and the driver must be ready to control the vehicle
safely. An example of this would be "smart" (adaptive) cruise control, as seen in
some newly released cars.
• Level 3. Vehicles at this level of automation enable the driver to cede full control of
all safety-critical functions under certain traffic or environmental conditions, and in
those conditions to rely heavily on the vehicle to monitor for changes requiring
transition back to driver control. The driver is expected to be available for occasional
control, but with sufficiently comfortable transition time. The vehicle is designed to
ensure safe operation during the automated driving mode.
• Level 4. The vehicle is designed to perform all safety-critical driving functions and
monitor roadway conditions for an entire trip. Such a design anticipates that the driver
will provide destination or navigation input, but is not expected to be available for
control at any time during the trip. This includes both occupied and unoccupied
vehicles.
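For readers who want the taxonomy at a glance, the NHTSA levels described above can be condensed into a small lookup structure. This is an illustrative sketch only: the class name, field names, and one-line summaries are our own shorthand, not NHTSA wording.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class AutomationLevel:
    """One NHTSA (2013) automation level, summarized in shorthand."""
    level: int
    summary: str
    driver_monitors_roadway: bool  # is the driver responsible for monitoring?


NHTSA_LEVELS = [
    AutomationLevel(0, "No automation: driver in complete and sole control", True),
    AutomationLevel(1, "Function-specific automation of one primary control", True),
    AutomationLevel(2, "Combined-function automation of controls in unison", True),
    AutomationLevel(3, "Limited self-driving; occasional driver control", False),
    AutomationLevel(4, "Full self-driving automation for an entire trip", False),
]

for lvl in NHTSA_LEVELS:
    print(f"Level {lvl.level}: {lvl.summary}")
```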
3 Expected Results
Upon the completion of data collection, the data will be subjected to a factor analysis to
reveal the underlying factor structure of the experimental scale. The goal of factor
analysis is to condense a larger set of variables to a smaller set of factors, which account
for a sizeable proportion of variability within the items. Thus, it is desirable to have a
few factors that account for a large portion of the variance. It is anticipated that scale
ratings will converge to support one underlying factor.
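As a rough illustration of this analysis plan, the sketch below simulates ratings driven by a single latent trust factor and inspects the eigenvalues of the item correlation matrix, a common first step before factor extraction. The item count, loadings, and noise level are assumptions chosen for illustration, not properties of the actual scale.

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed setup: 400 respondents rate 12 scale items (item count hypothetical).
n_respondents, n_items = 400, 12
latent_trust = rng.normal(size=(n_respondents, 1))    # one underlying factor
loadings = rng.uniform(0.6, 0.9, size=(1, n_items))   # item loadings (assumed)
noise = rng.normal(scale=0.5, size=(n_respondents, n_items))
ratings = latent_trust @ loadings + noise

# Eigenvalues of the item correlation matrix: a single eigenvalue above 1
# (Kaiser criterion) is consistent with a one-factor structure.
corr = np.corrcoef(ratings, rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]
print(eigenvalues.round(2))
```

With a real dataset, the simulated `ratings` matrix would simply be replaced by the observed item responses.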
A one-way analysis of variance (ANOVA) will also be conducted to examine
differences in trust among the five levels of autonomy. It is anticipated that there will be
differences in trust depending on the level of autonomy; more specifically, we
anticipate that trust will decline at higher levels of autonomy.
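A minimal sketch of that comparison, using simulated data in place of the ratings to be collected; the group means, spread, and per-group sample size below are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(7)

# Hypothetical pattern: mean trust declines as the automation level rises
# (7-point scale assumed; 80 respondents per level).
group_means = [6.0, 5.5, 5.0, 4.5, 4.0]  # Levels 0 through 4
groups = [rng.normal(loc=m, scale=1.0, size=80) for m in group_means]

# One-way ANOVA across the five level groups.
f_stat, p_value = f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3g}")
```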
4 Discussion
The aim of the present ongoing investigation is to examine the factor structure of an
experimental metric designed to quantify attitudes towards different levels of autonomy
in vehicles. In particular, this will allow us to identify the factors underlying trust
towards autonomy. In addition, this study will identify differences in trust among the
levels of autonomous vehicles.
The technological capacities of vehicles have vastly increased in recent years,
leading to the advancement of both the functional capability and autonomy of current
systems. With these advancements, autonomous features have become increasingly
prevalent within everyday vehicles. This has led to a transition of the human role from
an operator to that of a supervising member, assistant, or even bystander. As such, the
intricacies of interaction have changed to where consumers must place increasing
amounts of trust in these technological features. As a result, an individual's trust in the
system plays a prominent role in the success of any interaction, and therefore in the
future use of the vehicle.
Despite the growing presence of autonomous vehicles in the consumer market,
researchers have not examined attitudes towards various autonomous features. This is
problematic, as vehicle manufacturers and the government agencies responsible for
vehicle regulation must understand whether consumers are receptive to these recent
and ongoing advancements.
The deployment of AVs today is less about technological capabilities and more about
the ability of stakeholders to implement such vehicles into an everyday environment.
One barrier to successful deployment may be a lack of consumer trust. Thus, our study
will contribute a means of quantifying trust towards autonomous features in vehicles.
Additionally, our study seeks to identify how consumers feel about different
autonomous features; more specifically, we hope to identify how trusting individuals
are of the autonomous features that are gaining popularity.
Thus, the results of our study will have implications for vehicle manufacturers, as better
understanding their consumers will help them to design more desirable vehicles,
thereby increasing profit and user adoption.
We believe that the validation of our measure will also provide fruitful avenues for
future research. According to Schaefer [9], individuals’ mental models change as trust
changes from pre- to post-interaction with a robot. Future work should examine if a
similar relationship exists within the context of autonomous vehicles. That is, future
research should examine how operating vehicles of varying autonomy changes
individuals' degree of trust and, thereby, their mental models. Moreover, our measure could be
utilized to quantify these changes in trust from pre- to post-interaction. An additional
avenue for future work is related to trust as it applies to expansion and transition of the
human role. The human element is often overlooked or even forgotten during the
design and development process [9]. Thus, future work should be conducted to further
understand the differences in trust perceptions between individuals as it applies to this
process.
References
1. Waytz, A., Heafner, J., Epley, N.: The mind in the machine: anthropomorphism increases
trust in an autonomous vehicle. J. Exp. Soc. Psychol. 52, 113–117 (2014)
2. Hancock, P., Mouloua, M., Gilson, R., Szalma, J., Oron-Gilad, T.: Provocation: is the UAV
control ratio the right question? Ergon. Des. 15(1), 7 (2007)
3. Mouloua, M., Gilson, R., Hancock, P.: Human-centered design of unmanned aerial vehicles.
Ergon. Des.: Q. Hum. Factors Appl. 11(1), 6–11 (2003)
4. Mouloua, M., Parasuraman, R.: Human Performance in Automated Systems: Recent
Research and Trends. Erlbaum, Hillsdale, NJ (1994)
5. Parasuraman, R., Riley, V.: Humans and automation: use, misuse, disuse, abuse. Hum.
Factors 39(2), 220–253 (1997)
6. Wiener, E.L., Nagel, D.C. (eds.): Human Factors in Aviation. Gulf Professional Publishing,
Houston (1989)
7. Mouloua, M., Koonce, J.: Human-Automation Interaction: Research and Practice. Erlbaum,
Hillsdale, NJ (1997)
8. National Highway Traffic Safety Administration: Preliminary Statement of Policy
Concerning Automated Vehicles. Washington, DC (2013)
9. Schaefer, K.E.: The perception and measurement of human-robot trust. Doctoral
dissertation, University of Central Florida, Orlando, FL (2013)
10. Joosse, M., Sardar, A., Lohse, M., Evers, V.: BEHAVE-II: the revised set of measures to
assess users' attitudinal and behavioral responses to a social robot. Int. J. Soc. Robot. 5(3),
379–388 (2013)
11. Parasuraman, R., Sheridan, T.B., Wickens, C.D.: A model for types and levels of human
interaction with automation. IEEE Trans. Syst. Man Cybern. Part A Syst. Hum. 30(3),
286–297 (2000)
12. Yagoda, R.E., Gillan, D.J.: You want me to trust a ROBOT? The development of a human–
robot interaction trust scale. Int. J. Soc. Robot. 4(3), 235–248 (2012)