
HUMAN–COMPUTER INTERACTION

TOPIC 6: INTERACTION DESIGN: EVALUATION
LEARNING OUTCOME:
 The students shall be able to recognize how design and evaluation are intermeshed; identify the differences between evaluation methods and select appropriate evaluation methods for different contexts; carry out effective and efficient evaluations; and critique reports of studies done by others.

PERFORMANCE CRITERIA:

6.1 The Role of Evaluation
6.2 Usage Data: Observations, Monitoring, Users' Opinions
6.3 Experiments and Benchmarking
6.4 Interpretive and Predictive Evaluation
6.5 Comparing Methods
6.1: THE ROLE OF EVALUATION
 Without doing some evaluation it is impossible to know whether the design or system fulfills the needs of its users and how well it fits the physical and organizational context in which it will be used.
 Evaluation is concerned with gathering data about the usability of a design or product by a specified group of users for a particular activity within a specified environment or work context.
6.1.1: WHAT DO YOU WANT TO KNOW AND WHY?
 We evaluate to find out what users want and what problems they experience, because the more understanding designers have of users' needs, the better designed their products will be.
 The kind of evaluation that helps to shape a product so that it is usable as well as useful is called formative evaluation.
 Evaluations that take place after the product has been developed are known as summative evaluations.
6.1.1: WHAT DO YOU WANT TO KNOW AND WHY?
 Reasons for doing evaluations:
 Understanding the real world
 Comparing designs
 Engineering towards a target
 Checking conformance to a standard.
6.1.2: WHEN AND HOW DO YOU DO EVALUATION?
 Evaluation in the life cycle
 Kinds of evaluation:
 Observing and monitoring usage
 Collecting user opinions
 Experiments and benchmarking
 Interpretive evaluation
 Predictive evaluation
 Issues:
 Pilot studies
 Users' rights
6.2: USAGE DATA: OBSERVATIONS, MONITORING, USERS' OPINIONS
 Observation can be quite informal, and there are a number of ways of recording your observations.
 As well as observing and measuring users' performance, it is important to find out what aspects of the system users like and what they don't like.
6.2.1: OBSERVING USERS
 Direct observation
 Indirect observation
 Video recording
 Analyzing video data:
 Task-based analysis
 Performance-based analysis.
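Indirect observation often relies on software logging: the system itself records timestamped events while the user works, and the log is analyzed later. Below is a minimal sketch of such a logger, assuming Python; the class name, method names, and log format are invented for illustration and are not part of the lecture material.

```python
# Hypothetical sketch of indirect observation via software logging.
# InteractionLogger and log_event are invented names, not from the lecture.
import json
import time

class InteractionLogger:
    """Appends timestamped user-interface events to a file for later analysis."""

    def __init__(self, path):
        self.path = path

    def log_event(self, event_type, detail):
        record = {
            "timestamp": time.time(),  # seconds since the epoch
            "event": event_type,       # e.g. "click", "keypress", "error"
            "detail": detail,
        }
        with open(self.path, "a") as f:
            f.write(json.dumps(record) + "\n")

# Example usage: record two events during a task session.
logger = InteractionLogger("session.log")
logger.log_event("click", {"widget": "Save button"})
logger.log_event("error", {"message": "file name missing"})
```

A log like this supports performance-based analysis (e.g. task times, error counts) without an observer being present.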
6.2.2: VERBAL PROTOCOLS
 Video recording is usually coupled with some form of audio record, which is known as a verbal protocol.
 A think-aloud protocol is the term given to a special kind of verbal protocol in which the user says out loud what she is thinking while she is carrying out a task or doing some problem solving.
6.2.3: USERS' OPINIONS: INTERVIEWS AND QUESTIONNAIRES
 Interviews:
 Structured interviews
 Flexible interviews
 Semi-structured interviews
 Prompted interviewing
 Questionnaires and surveys:
 Closed questions
 Open-ended questions
 Ranked order
 Pre- and post-questionnaires.
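Closed questions are usually scored numerically so that opinions can be compared across users or product versions. As a concrete illustration, the sketch below scores the System Usability Scale (SUS), a widely used ten-item closed questionnaire with 1-5 ratings; SUS is not named in the lecture and is offered here only as an example.

```python
# A minimal sketch of scoring a closed-question instrument, using the
# System Usability Scale (SUS) as the example. Odd-numbered items are
# positively worded and even-numbered items negatively worded, so the
# two kinds are scored in opposite directions.

def sus_score(responses):
    """Compute a SUS score (0-100) from ten 1-5 Likert responses."""
    if len(responses) != 10:
        raise ValueError("SUS expects exactly ten responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Example usage: one respondent's answers to the ten SUS items.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # -> 85.0
```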
6.3: EXPERIMENTS AND BENCHMARKING
 Considerable skill and scientific knowledge are necessary to do well-designed experiments, and a good knowledge of statistics is also important.
 Here we discuss the main issues involved in planning and carrying out laboratory experiments, and then examine benchmarking and usability engineering.
6.3.1: TRADITIONAL EXPERIMENTS
 The important feature of experimental studies is that the investigator can manipulate a number of factors associated with the design and study their effect on various aspects of user performance; a minimal analysis sketch follows the list below.
 Fundamentals of doing experiments:
 Variables
 Selecting subjects
 Experimental design
 Critical review of experimental procedure and results
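As promised above, here is a sketch of how the results of a simple between-subjects experiment might be analysed: the interface design is the independent variable, task completion time in seconds the dependent variable, and an independent-samples t-test checks whether the observed difference is likely to be real. The timing values are invented for illustration only.

```python
# Sketch of analysing a simple between-subjects experiment with an
# independent-samples t-test. Requires scipy; the timings are invented.
from scipy import stats

design_a = [38.1, 42.5, 35.0, 40.2, 44.7, 39.9, 41.3, 37.6]
design_b = [45.9, 48.2, 44.1, 50.3, 46.8, 49.5, 47.0, 43.8]

# Is the mean task completion time different between the two designs?
t_stat, p_value = stats.ttest_ind(design_a, design_b)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value (conventionally < 0.05) suggests the difference is
# unlikely to be due to chance alone.
```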

6.3.2: USABILITY ENGINEERING
 Defined as "a process whereby the usability of a product is specified quantitatively, and in advance. Then as the product is built it can be demonstrated that it does reach the required levels of usability" (Tyldesley).
 Metrics and usability specification
 Benchmark tasks
 User opinions
 Making trade-offs
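To make the idea of a quantitative usability specification concrete, here is a minimal sketch, assuming Python, in which worst acceptable and planned target levels are fixed in advance and a measured benchmark-task session is checked against them. The metric names and numbers are invented, not taken from the lecture.

```python
# Hypothetical usability specification: target levels set in advance,
# then measured results checked against them. Lower is better for both
# metrics; all names and values are invented for illustration.

SPEC = {
    # metric: (worst acceptable level, planned target level)
    "task_time_seconds": (120, 60),
    "errors_per_task": (3, 1),
}

def meets_spec(measurements):
    """True if every measured metric is within its worst acceptable level."""
    return all(measurements[m] <= worst for m, (worst, _) in SPEC.items())

# Example usage: results from one benchmark-task session.
session = {"task_time_seconds": 95, "errors_per_task": 2}
print(meets_spec(session))  # -> True
```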
6.4: INTERPRETIVE AND PREDICTIVE EVALUATION
 6.4.1: Interpretive evaluation:
 Contextual inquiry
 Cooperative evaluation
 Participative evaluation
 6.4.2: Predictive evaluation:
 Inspection methods
 Usage simulations
 Structured expert reviewing
 Heuristic evaluation
 Discount evaluation
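Of the predictive methods above, heuristic evaluation is perhaps the most procedural: a few evaluators inspect the interface against a short list of usability heuristics and rate the severity of each problem found. The sketch below aggregates such findings; the heuristic names follow Nielsen's well-known list, and the findings themselves are invented for illustration.

```python
# Sketch of aggregating heuristic-evaluation findings. Severity uses a
# 0-4 scale (0 = not a problem, 4 = usability catastrophe); all the
# findings below are invented.
from collections import defaultdict

findings = [
    # (heuristic violated, severity)
    ("visibility of system status", 3),
    ("error prevention", 4),
    ("visibility of system status", 2),
    ("consistency and standards", 1),
]

by_heuristic = defaultdict(list)
for heuristic, severity in findings:
    by_heuristic[heuristic].append(severity)

# Report each heuristic, most severe problems first.
for heuristic, severities in sorted(
        by_heuristic.items(), key=lambda kv: max(kv[1]), reverse=True):
    print(f"{heuristic}: worst severity {max(severities)}, "
          f"{len(severities)} finding(s)")
```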

6.5: COMPARING METHODS
 Selecting appropriate methods and planning an evaluation are not trivial: many factors need to be taken into account.
 We are going to examine the issues that need to be considered when selecting evaluation methods and planning a study, and to gain experience in critiquing the studies of others.
6.5.1: DIFFERENCES BETWEEN METHODS
 The purpose of the evaluation
 Stage of system development
 Involvement of users in the evaluation process
 Practical considerations
 Scope of the information needed
 Ecological validity
END OF LECTURE
