Moiz Hci Quiz
2020/11/18
M.Moiz
17-Arid-6396
Qno1:
Evaluation:
Evaluation tests the usability and functionality of a system. It occurs in the laboratory, in the field, and/or in collaboration with users; it evaluates both design and implementation, and it should be considered at all stages of the design life cycle.
Evaluation is defined as "to examine and judge carefully". In order to examine and judge, we need criteria against which to base our assessment.
Why Evaluate:
In HCI we evaluate interfaces and systems to:
• Determine how usable they are for different user groups
• Identify good and bad features to inform future design
• Compare design choices to assist us in making decisions
• Observe the effects of specific interfaces on users
Importance of Evaluation:
• assess the extent of system functionality
• assess the effect of the interface on the user
• identify specific problems
• Evaluation provides a systematic method to study a program, practice, intervention, or initiative to understand how well it achieves its goals.
• Evaluations help determine what works well and what could be improved in a program or initiative.
"Quick and dirty" evaluation:
A "quick and dirty" evaluation is a common practice in which designers informally get feedback from users or consultants to confirm that their ideas are in line with users' needs and are liked. "Quick and dirty" evaluations can be done at any stage, and the emphasis is on fast input rather than carefully documented findings.
Predictive evaluation:
In predictive evaluations experts apply their knowledge of typical users, often guided by heuristics, to
predict usability problems. Another approach involves theoretically based models. The key feature of
predictive evaluation is that users need not be present, which makes the process quick, relatively
inexpensive, and thus attractive to companies; but it has limitations.
Observing users :
Observation techniques help to identify needs leading to new types of products and help to evaluate
prototypes. Notes, audio, video, and interaction logs are well known ways of recording observations and
each has benefits and drawbacks. Obvious challenges for evaluators are how to observe without
disturbing the people being observed and how to analyze the data, particularly when large quantities of
video data are collected or when several different types of data must be integrated to tell the story.
Asking users:
Asking users what they think of a product (whether it does what they want, whether they like it, whether the aesthetic design appeals, whether they had problems using it, whether they want to use it again) is an obvious way of getting feedback. Interviews and questionnaires are the main techniques for
doing this. The questions asked can be unstructured or tightly structured. They can be asked of a few
people or of hundreds. Interview and questionnaire techniques are also being developed for use with
email and the web.
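A tightly structured questionnaire can be scored automatically. As an illustration of the idea, here is a minimal sketch that scores the System Usability Scale (SUS), a widely used ten-item usability questionnaire; the responses in the example are hypothetical, not taken from the quiz.

```python
# Scoring a System Usability Scale (SUS) questionnaire.
# SUS has ten 1-5 Likert items; odd items are positively worded,
# even items negatively worded, and the total is scaled to 0-100.
def sus_score(responses):
    """responses: list of ten 1-5 ratings, items 1-10 in order."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, rating in enumerate(responses):
        # index 0, 2, ... are the odd-numbered (positive) items
        total += (rating - 1) if i % 2 == 0 else (5 - rating)
    return total * 2.5  # scale the 0-40 sum up to 0-100

# Hypothetical responses from one participant:
print(sus_score([4, 2, 5, 1, 4, 2, 5, 1, 4, 2]))  # 85.0
```

Averaging such scores across participants gives a single comparable usability number for each design.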
Asking experts:
Software inspections and reviews are long established techniques for evaluating software code and
structure. Experts step through tasks role-playing typical users and identify problems. Developers like
this approach because it is usually relatively inexpensive and quick to perform compared with laboratory
and field evaluations that involve users. In addition, experts frequently suggest solutions to problems.
User testing:
Measuring user performance to compare two or more designs has been the bedrock of usability testing.
Cognitive Walkthrough:
➢ Evaluates a design on how well it supports the user in learning a task.
➢ Usually performed by an expert in cognitive psychology.
➢ The expert 'walks through' the design to identify potential problems using psychological principles.
➢ Forms are used to guide the analysis.
Heuristic Evaluation:
• Proposed by Nielsen and Molich.
• Usability criteria (heuristics) are identified.
• The design is examined by experts to see if these are violated.
• Example heuristics:
• System behavior is predictable
• System behavior is consistent
• Feedback is provided
• Heuristic evaluation 'debugs' the design.
Review-based Evaluation:
• Results from the literature are used to support or refute parts of the design.
• Care is needed to ensure results are transferable to the new design.
Model-based Evaluation:
• Cognitive models are used to filter design options, e.g. GOMS prediction of user performance.
• Design rationale can also provide useful evaluation information.
• The GOMS model can be used to predict user performance with a user interface; the keystroke-level model can be used to predict performance for low-level tasks.
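The keystroke-level model mentioned above can be sketched very simply: each low-level operator gets a standard time estimate, and the predicted task time is the sum over the operator sequence. The sketch below uses the classic operator times from Card, Moran and Newell; the task encoding in the example is a hypothetical illustration.

```python
# Keystroke-Level Model (KLM) sketch with standard operator times
# (seconds), as published by Card, Moran & Newell.
OPERATOR_TIMES = {
    "K": 0.20,  # keystroke or button press (average skilled typist)
    "P": 1.10,  # point with the mouse to a target on screen
    "H": 0.40,  # home the hands between keyboard and mouse
    "M": 1.35,  # mentally prepare for the next action
}

def klm_predict(sequence):
    """Predict execution time in seconds for a string of KLM operators."""
    return sum(OPERATOR_TIMES[op] for op in sequence)

# Hypothetical task: home hand to mouse, think, point at a menu, click.
print(f"{klm_predict('HMPK'):.2f}")  # 3.05
```

Comparing such predicted times for alternative designs lets the evaluator filter options before any user is involved.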
Cooperative evaluation:
• A variation on think-aloud.
• The user collaborates in the evaluation.
• Both user and evaluator can ask each other questions throughout.
• Additional advantages:
• less constrained and easier to use
• the user is encouraged to criticize the system
• clarification is possible
Experimental evaluation :
• Controlled evaluation of specific aspects of interactive behavior
• Evaluator chooses hypothesis to be tested
• A number of experimental conditions are considered which differ only in the value of some
controlled variable.
• Changes in behavioral measure are attributed to different conditions
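To make the comparison of conditions concrete, here is a minimal sketch of how the behavioral measures from two experimental conditions might be compared with a two-sample t statistic; the task-completion times below are invented for illustration, and only the standard library is used.

```python
from statistics import mean, stdev

# Hypothetical task-completion times (seconds) under two conditions
# that differ only in the controlled variable (the interface design).
design_a = [42.0, 38.5, 45.2, 40.1, 39.8, 44.3]
design_b = [35.1, 33.9, 36.7, 34.2, 37.0, 32.8]

def welch_t(sample1, sample2):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    m1, m2 = mean(sample1), mean(sample2)
    v1, v2 = stdev(sample1) ** 2, stdev(sample2) ** 2
    n1, n2 = len(sample1), len(sample2)
    return (m1 - m2) / ((v1 / n1 + v2 / n2) ** 0.5)

t = welch_t(design_a, design_b)
print(f"mean A = {mean(design_a):.1f}s, mean B = {mean(design_b):.1f}s, t = {t:.2f}")
```

A large positive t here would suggest the difference in completion times is attributable to the change in condition rather than chance, pending a proper significance test.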
When you log into your account, the default "Home" page appears, so you have to go to your profile.
[Screenshot: Facebook home page]
Now go to your profile and make a single click on the profile-picture-changing icon over your current profile picture.
[Screenshot: Facebook profile page with the profile-picture options]
For best quality, your profile picture should be at least 320 pixels wide and 320 pixels tall.
Updating your picture:
When updating your profile picture, you can: