malfunction indication. Misses, on the other hand, are completely unknown until the decision aid displays the message stating that the engine malfunctioned and was reset (i.e., “out of time” displayed in the action results window). The only way for the participants to confirm a miss is to try to prevent it entirely (i.e., continuously press the view gauges button in an attempt to catch the miss while it is occurring). Lastly, this research may find that trust does not vary with the type of error experienced by the participants.

Proposed follow-up research will need to be conducted to examine some of the questions this study is not designed to answer. Future work should investigate the effects of changing the scenario in which the participants find themselves, such as monitoring a nuclear reactor or conducting DNA tests for a criminal trial. Doing so would place participants in situations that appear to foster tolerance of one type of error over the other. Also of interest, and closely related to the scenario, is the degree to which the cost of the automation failure is varied. Lastly, a multifactor experiment should be conducted in which all of the aforementioned factors are systematically varied to determine their interactions. Those results, coupled with this single-factor experiment, will provide a more complete picture of the complex dynamics involved in determining and predicting operator trust in, and reliance on, automation.

ACKNOWLEDGMENTS

The research team would like to extend a very special thanks to Neta Ezra for her contributions to programming the interface. This research was supported in part by contributions from Deere & Company, and we thank Jerry Duncan and Bruce Newendorp for their support and advice on this research. This research was also supported in part by a grant from the National Institutes of Health (National Institute on Aging), Grant P01 AG17211, under the auspices of the Center for Research and Education on Aging and Technology Enhancement (CREATE).