Manufacturing Engineering
Human-Computer Systems
MM4HCI
2013
Lecture 3: Evaluation methods and guidelines
Professor Sarah Sharples
AN EVALUATION FRAMEWORK
Outline
1. Understanding what evaluation is for
2. Preparing for an evaluation
3. The range of evaluation techniques and their uses
4. Understanding some of the practical issues of applying evaluation methods
What is evaluation?
- Involving users, and user representatives, in the technology / ICT design and development process in a structured manner
- Capturing responses to a design or a design artefact
- Can be carried out at any point in the development process
[Figure: usability goals and user experience goals (Source: Preece et al., 2002). Usability goals: effective to use, efficient to use, safe to use, have good utility, easy to learn, easy to remember how to use. User experience goals: fun, enjoyable, entertaining, helpful, motivating, satisfying, rewarding, emotionally fulfilling, supportive of creativity, aesthetically pleasing.]
What? Who? When? Where? How?
Why evaluate?
- To ensure a user-centred design: easy to learn, easy to use, efficient, useful, satisfying to use
- Performance / efficiency
Lo-fi vs hi-fi prototypes

Lo-fi:
- Cheap
- Addresses layout
- Proof-of-concept
- Open to participatory design and comment (Erickson, 1995)

Hi-fi:
- Complete functionality
- Supports quantitative evaluation (e.g. user error rates)
- Marketing and sales tool
- A living specification

Drawbacks (hi-fi):
- Expensive
- Time consuming
- Perceived limited scope for change
[Figure: evaluation effort plotted across the development timeline (Concept → Implementation → Deployment), contrasting effort spent on requirements at the concept stage with "last-minute panic testing!!!" just before deployment.]
Formative vs summative

Formative:
- To inform the design process
- Explorative, using partially completed artefacts (prototypes)
- May be more qualitative or subjective

Summative:
- A confirmation exercise
- To ensure the design meets its intended aims
- Often against a recognised standard or set of benchmarks (or the initial requirements)
[Figure: the elements of an evaluation — users, tasks, technology, and context (ranging from simulation to the real world) — combining to shape the user experience.]
EVALUATION METHODS
Evaluation Approaches
- Analytical: predictive evaluation methods
- Field study: interpretive evaluation methods; collecting users' opinions
- Lab study: experiments and benchmarking; usability studies
Analytical - Walkthroughs
- Cognitive walkthrough: focus on ease of learning
- Scenario-based evaluation
- Three main questions:
  - Will the correct action be evident to the user?
  - Will the user notice that the correct action is available?
  - Will the user associate and interpret the response from the action correctly?
- Participatory design
Analytical evaluation

Advantages:
- Draws on experienced reviewers

Disadvantages:
- Qualitative data
- Description of performance/outcome only
Contextual Inquiry
- Originates from ethnography
- Observe the entire process of interface use, from switching on the computer to going home after task completion
Disadvantages:
- Description of performance or outcome only (in contrast, experiments provide quantitative measures)
EVALUATION IN PRACTICE
[Figure: example evaluations placed by effort along the development timeline (Concept → Implementation → Deployment): travel application concepts; indoor navigation prototype testing; lab usability study; presentation of privacy information.]
Practical issues

Ethical issues
- Develop an informed consent form
- Participants have a right to:
  - Know the goals of the study
  - Know what will happen to the findings
  - Privacy of personal information
  - Leave when they wish
  - Be treated politely
Summary (1)
There are many issues to consider before conducting an
evaluation study
References
Cockton, G., & Woolrych, A. (2002). Sale must end: should discount methods be cleared off HCI's shelves? Interactions, 9(5), 13-18.
Chevalier, A., & Kicka, M. (2006) Web designers and web users: Influence of the ergonomics
quality of the web site on the information search. International Journal of Human-Computer
Studies, 64 (10), 1031-1048.
Duh, H. B-L., Tan, G. C. B., & Chen, V. H. (2005). Usability evaluation for mobile device: a comparison of laboratory and field tests. In Proceedings of the 8th Conference on Human-Computer Interaction with Mobile Devices and Services (pp. 181-186). New York, NY: ACM Press.
Erickson, T. (1995) Notes on design practice: stories and prototypes as catalysts for
communication. In J. M. Carroll (Ed.) Scenario-based design: Envisioning work technology in
system development pp. 37-58. New York, NY: John Wiley & Sons
Nielsen, J. (2000a). The use and misuse of focus groups. Retrieved from http://www.useit.com/papers.
Nielsen, J. (2001). Ten Usability Heuristics. Retrieved from www.useit.com.
Sharp, H., Rogers, Y., & Preece, J. (2007). Interaction Design: Beyond Human-Computer Interaction (2nd edition). New York, NY: John Wiley & Sons. Chapters 12, 13, 14 & 15.
Shneiderman, B. (1998). Designing the User Interface (3rd edition). Addison-Wesley:MA.
Software Usability Measurement Inventory (SUMI). Retrieved January 2008 from http://sumi.ucc.ie/index.html.
WCAG (1999) Web Content Accessibility Guidelines 1.0. Retrieved 28th Feb 2008, from
http://www.w3.org/TR/1999/WAI-WEBCONTENT-19990505/
Summary (2)
Different evaluation approaches and methods are often
combined in one study