Heuristic Studies-Merged
Made By:
Zeeshan Arshad 18 ARID 5444
Saad ul Haq 18 ARID 5422
Ameer Hamza
Outline
• Usability heuristics
• Heuristic evaluation
Usability heuristics
• Heuristics:
– rules of thumb that describe features of usable systems
• As design principles:
– broad usability statements that guide design efforts
– derived by evaluating common design problems across
many systems
• As evaluation criteria:
– the same principles can be used to evaluate a system for
usability problems
– becoming very popular
• user involvement not required
• catches many design flaws
Usability heuristics
• Advantages
– the minimalist approach
• a few guidelines can cover many usability problems
• easily remembered, easily applied with modest effort
– discount usability engineering
• cheap and fast way to inspect a system
• can be done by usability experts and end users
• Problems
– principles are at the ‘motherhood’ level
• can’t be treated as a simple checklist
• there are subtleties involved in their use
Visibility of system status
• The system should always keep users informed
about what is going on
• Appropriate feedback within reasonable time
Visibility of system status
[Figure: interface screenshots prompting the questions users ask when status is invisible: "What mode am I in now?", "What did I select?", and "How is the system interpreting my actions?"]
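The principle can be illustrated with a small sketch (a hypothetical command-line task runner, not from the slides): the program reports its status after every step, so the user is never left wondering what is going on.

```python
import sys
import time

def format_status(done, total):
    """Build a one-line status message the user can always see."""
    return f"Processing {done}/{total} ({100 * done // total}%)"

def run_with_feedback(tasks):
    """Run tasks while keeping the user informed of system status."""
    total = len(tasks)
    for i, task in enumerate(tasks, 1):
        task()
        # Appropriate feedback within reasonable time: update after
        # every step instead of going silent until the end.
        sys.stdout.write("\r" + format_status(i, total))
        sys.stdout.flush()
    sys.stdout.write("\n")

run_with_feedback([lambda: time.sleep(0.01) for _ in range(3)])
```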
Match between system and real world
• The system should speak the users' language
– Words, phrases and concepts familiar to the user
– Not system-oriented terms
• Follow real-world conventions
• Put information in a natural and logical order
User control and freedom
• Users often choose system functions by
mistake
– They need clearly marked “emergency exits”
– Let them get back without extended dialogue
• Support undo and redo
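The undo/redo requirement is commonly implemented with the classic two-stack scheme; here is an illustrative sketch (not code from the slides):

```python
class History:
    """Two-stack undo/redo: an 'emergency exit' from mistaken actions."""

    def __init__(self, state):
        self.state = state
        self._undo = []   # past states, popped on undo
        self._redo = []   # undone states, popped on redo

    def apply(self, new_state):
        self._undo.append(self.state)
        self._redo.clear()          # a fresh action invalidates redo
        self.state = new_state

    def undo(self):
        if self._undo:              # nothing to undo is a no-op
            self._redo.append(self.state)
            self.state = self._undo.pop()

    def redo(self):
        if self._redo:
            self._undo.append(self.state)
            self.state = self._redo.pop()
```

A user who selects a function by mistake calls `undo()` to get back without an extended dialogue, and `redo()` if the undo itself was the mistake.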
Consistency and standards
• Users should not have to wonder whether
different words, situations, or actions mean
the same thing.
• Follow platform conventions
Heuristic evaluation
• Evaluators inspect the interface and judge its compliance with
recognized usability principles (heuristics)
1. Pre-evaluation training
2. Evaluation
3. Severity rating (Priority)
4. Debriefing
Problems & Evaluators
• No evaluator finds everything
• Some find more than others
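These two observations are often modeled with Nielsen and Landauer's formula: if each evaluator independently finds a fraction p of the problems, then n evaluators together find a fraction 1 - (1 - p)^n. The sketch below uses p = 0.31, the average Nielsen reports; treat the exact value as an assumption.

```python
def expected_found(n_evaluators, total_problems, p=0.31):
    """Expected number of distinct problems found by n independent
    evaluators, each finding fraction p of all problems
    (Nielsen & Landauer's model)."""
    return total_problems * (1 - (1 - p) ** n_evaluators)
```

With p = 0.31, one evaluator finds about a third of the problems while five together find roughly 84%, which is why heuristic evaluation is usually run with three to five evaluators.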
Predictive Studies
This material has been developed by Georgia Tech HCI faculty, and continues
to evolve. Contributors include Gregory Abowd, Al Badre, Jim Foley, Elizabeth
Mynatt, Jeff Pierce, Colin Potts, Chris Shaw, John Stasko, and Bruce Walker.
Permission is granted to use with acknowledgement for non-profit purposes.
Last revision: January 2007.
Agenda
• Evaluation
– Overview
• Predictive evaluation
– Heuristic evaluation
– Discount usability testing
– Cognitive walkthrough
6750-Spr ‘07 2
Evaluation
Goals
Forms
• Formative
– As the project is forming; all through the
lifecycle. Early, continuous, iterative.
– “Evaluating the design”
• Summative
– After a system has been finished. Make
judgments about final item.
– “Evaluating the implementation”
Approaches
Tradeoffs
• Experimental
• Naturalistic
Evaluation Methods
• 1. Experimental/Observational Evaluation
– Typically with users
– Experiments (usability specifications)
Predictive Evaluation
• Basis:
– Observing users can be time-consuming and
expensive
– Try to predict usage rather than observing it
directly
– Conserve resources (quick & low cost)
Approach
• Best if
– Haven’t used earlier prototype
– Familiar with domain or task
– Understand user perspectives
Predictive Eval. Methods
• 1. Heuristic Evaluation
• 2. Discount usability testing
• 3. Cognitive Walkthrough
1. Heuristic Evaluation
(www.useit.com)
Essay: http://www.useit.com/papers/guerrilla_hci.html
Procedure
• 1. Gather inputs
• 2. Evaluate system
• 3. Debriefing and collection
• 4. Severity rating
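One way to support steps 2-4 is a simple record for each finding, so that results from independent evaluators can be merged at debriefing and then ranked by severity. This is an illustrative sketch; the field names are assumptions, not part of the method.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    evaluator: str
    heuristic: str      # which usability principle is violated
    location: str       # where in the interface it occurs
    description: str
    severity: int = 0   # assigned later, during severity rating

findings = [
    Finding("eval1", "Visibility of system status", "upload page",
            "No progress indicator during long uploads"),
    Finding("eval2", "User control and freedom", "wizard step 3",
            "No way to go back without losing entered data"),
]

# Debriefing: group findings by the heuristic they violate.
by_heuristic = {}
for f in findings:
    by_heuristic.setdefault(f.heuristic, []).append(f)
```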
Gather Inputs
Evaluation Method
Updated Heuristics
• Stresses
Process
Debriefing
Severity Rating
• Based on
– frequency
– impact
– persistence
– market impact
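A minimal sketch of combining the factors above into Nielsen's 0-4 severity scale. The averaging rule here is an assumption made for illustration; in practice evaluators usually judge severity holistically after weighing the factors.

```python
def severity(frequency, impact, persistence):
    """Combine three 0-4 factor ratings into one 0-4 severity score.
    0 = not a usability problem ... 4 = usability catastrophe.
    Averaging the factors is an illustrative assumption."""
    for rating in (frequency, impact, persistence):
        if not 0 <= rating <= 4:
            raise ValueError("each rating must be on the 0-4 scale")
    return round((frequency + impact + persistence) / 3)
```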
Advantages
Application
Somewhat Controversial
2. Discount Usability Testing
• Hybrid of empirical usability testing and
heuristic evaluation
3. Cognitive Walkthrough
CW Process
Requirements
Assumptions
Methodology
CW Questions
Answering the Questions
Next Question
Next Question
Next Question
Example
• Program VCR
– List actions
– Ask questions
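The VCR example can be organized as data: the sequence of user actions, crossed with the standard cognitive walkthrough questions the analysts must answer at each step. The action list below is a plausible reconstruction for illustration, not taken from the slide.

```python
CW_QUESTIONS = [
    "Will the user try to achieve the right effect?",
    "Will the user notice that the correct action is available?",
    "Will the user associate the action with the desired effect?",
    "If the action is performed, will the user see progress being made?",
]

# Hypothetical action sequence for programming a VCR recording.
VCR_ACTIONS = [
    "Press the 'Program' button",
    "Enter the start time",
    "Enter the end time",
    "Select the channel",
    "Confirm and exit programming mode",
]

def walkthrough(actions, questions=CW_QUESTIONS):
    """Yield every (action, question) pair the analysts must answer."""
    for action in actions:
        for question in questions:
            yield action, question
```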
IRB
Administrivia
Upcoming
Introduction to User Studies
Three Types of Studies
1. Traditional studies of users and usage
(often of libraries)
2. Studies that include Internet uses and
similar measures
3. Studies of Digital Libraries and their users
(“digital communities,” just starting)
Traditional Studies
Focus of the study:
• Studies of Users
– socio-economic factors
– ethnicity, national origin, language, etc.
– non-users
• Studies of Usage
– transaction data, both automated and traditional
Basic Problems
• Is there a theoretical basis to the analysis?
• Are the results reproducible?
• What can be changed based on the findings?
• How do we know changes will help?
Possible solution:
- Develop theoretical approach early
Typical Steps in Social Science Research
[Model diagram: Socio-economic factors, Institutional factors, and Funding as predictors of Research output]
Revised Bibliometric Model of Scholarly Productivity
[Model diagram: adds Computer use / Internet use to the predictors above]
[Model diagram: further adds Computer skills/literacy, feeding into Computer use / Internet use]
Major issues:
- What sample/universe to use?
- How to measure inputs?
- How to measure outputs?
- Functional form? (linear?, additive?)
List of Selected Processes
Mail Pnews archie cnrmenu elm eudora finger ftp gladis gopher
imap irc kermit lynx mail mailtool mailx melvyl netscape newmail
nn nslookup pine popper rlogin talk telnet tin trn trn-artc
weather webster xrn rsh rcp whois
Percentage of Faculty Using Each Service
[Bar chart: percentage of faculty using each of eleven services (E-mail, WWW, FTP, Telnet, Gopher, Newsgroups, Supercomputer, Listserver, E-journal, WAIS, Sequence analysis); usage ranges from 3% to 94% across services.]
% of Faculty Using Internet Services from Questionnaire and Log
[Bar chart: for each service (Gopher, FTP, WWW, Telnet, Newsgroups, E-mail), the percentage of faculty using it as measured by questionnaire responses versus system logs.]
Our main hypothesis in this research is that scholars'
Internet-use data add explanatory power to models of
scholarly productivity. The formal null and alternate
hypotheses are:
H0: Internet use data does not add explanatory
power to the Traditional Publication Model.
H1: Internet use data adds explanatory power to
the Traditional Publication Model.
We used multiple regressions to estimate the four models
listed in Figure 5. Since we are interested in whether a set
of one or more Internet-use variables “improves” the
explanatory power of the traditional model, the appropriate
measure is the F-statistic from a comparison of the
restricted (traditional) model with the unrestricted (new)
model.
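The restricted-vs-unrestricted comparison can be sketched as follows (an illustration assuming ordinary least squares fits and synthetic data; this is not the authors' code): fit both models, then test whether the drop in residual sum of squares from adding the Internet-use variables is larger than chance.

```python
import numpy as np

def f_test_nested(y, X_restricted, X_full):
    """F-statistic comparing a restricted (traditional) model against
    an unrestricted model that adds extra regressors."""
    def rss(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        residuals = y - X @ beta
        return float(residuals @ residuals)

    n = len(y)
    q = X_full.shape[1] - X_restricted.shape[1]   # regressors added
    df_full = n - X_full.shape[1]                 # error df, full model
    rss_r, rss_f = rss(X_restricted), rss(X_full)
    # Large F => the added variables explain real variance (reject H0).
    return ((rss_r - rss_f) / q) / (rss_f / df_full)

# Hypothetical data: productivity driven by a traditional factor x1
# and an Internet-use measure x2.
rng = np.random.default_rng(0)
n = 50
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 3.0 * x2 + 0.1 * rng.normal(size=n)
X_restricted = np.column_stack([np.ones(n), x1])
X_full = np.column_stack([np.ones(n), x1, x2])
F = f_test_nested(y, X_restricted, X_full)
```

Because x2 genuinely drives y in this synthetic example, the resulting F is far above any conventional critical value, so H0 would be rejected.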
Dependent Variable: PUBAV

Analysis of Variance

Source   DF   Sum of Squares   Mean Square   F Value   Prob>F
Model     6         56.72845       9.45474     1.230    0.3148
Error    35        269.07034       7.68772
Total    41        325.79879

Parameter Estimates

KEY:
N   Number of cases
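The entries in the ANOVA table fit together arithmetically; a quick check with the values copied from the table:

```python
# Mean square = sum of squares / degrees of freedom;
# F = model mean square / error mean square.
ss_model, df_model = 56.72845, 6
ss_error, df_error = 269.07034, 35

ms_model = ss_model / df_model                # 9.45474, as in the table
ms_error = ss_error / df_error                # 7.68772, as in the table
f_value = ms_model / ms_error                 # ~1.230, as in the table
r_squared = ss_model / (ss_model + ss_error)  # share of variance explained
```

With F = 1.230 and Prob>F = 0.3148 on (6, 35) degrees of freedom, the model does not reach conventional significance levels.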