
Human Centered Design

HC1: Intro to theory of HCD

HC2: Formulating research questions and goals; methods for analyzing context-of-use; research ethics

HC3: Data analysis & interpretation

HC4: Methods for communicating insights; methods for ideation & conceptualization

HC5: Methods for low & mixed fidelity prototyping

HC6: Intro to expert review and methods for evaluating prototypes/products with users

HC7: Analyzing evaluation findings (incl. triangulation) & formulating design recommendations

Human centered design HC1
What is human-centered design?
 Human-centered design puts users’ needs and wants at the center of the design process
from start to finish.
 Human-centered design calls for involving users throughout the design process via a variety
of research and design techniques so as to create meaningful, usable, and accessible
products for them.
 A methodology? A prescribed process?
o Each design project is different, so one unique methodological approach wouldn’t
work.
o Design is too iterative, contextualized and focused on finding the best solution to
follow a standard process.
 A philosophy? A framework or way of thinking?
o We’re designing for real world design problems with real world challenges, so a
philosophy isn’t enough.
 Let’s work with both methodology and philosophy.
o Set of guiding principles
o Set of methods and techniques

Why do we involve users?


 Case 1 – the interface that killed Jenny
o A 4-year-old girl with cancer who needed a lot of fluid
o Information overload
o Information was missed (very small on screen)
o Bad interface design

 Case 2 – air inter flight 148


o Plane crashed in mountains
o Bad interface design
o Vertical speed (V/S) mode was mistaken for flight path angle (FPA) mode

 Prevents us from designing products that lead to mistakes – safe use


 Allows us to design products that are intuitive, that don’t make the user think – usability
 Allows us to design products based on users’ preferences and needs, products they actually
enjoy using – user experience
 Helps us to design products that are of value to users in real life – utility and meaningfulness

Human or user-centered design

 Stages of UX
o All stages are important to consider as a designer
 We want to focus on all stages of the user experience.
 We need to take into account more characteristics of our users than those of their role as
user alone.
 It’s a holistic view.
o Usability vs. UX
 Usability:
 Usability is a quality attribute that assesses how easy user interfaces
are to use
o Effectiveness
o Efficiency
o Satisfaction
 UX:
 User experience is a person’s perceptions and responses that result
from the use or anticipated use of a product, system or service
o Emotions
o Attitudes
o Fun
o Perceived ease of use
o Mood
o Motivation
o Expectations

The human-centered design process


 Analysis of context-of-use
 Conceptualization
 Design
 Evaluation
 Iterative process
o We keep repeating the steps with the knowledge of the previous steps.

 Understand and specify the context of use = analysis of context-of-use
 Specify the user requirements = conceptualization
 Produce design solutions to meet user requirements = design
 Evaluate the designs against requirements = evaluation

The 10 commandments of human-centered design


1. Involve users early
 To understand who we design for, what they want and need, and the environment in which
they will use your design.
 To match users’ mental model of the world and our product:
o How do they think about the product? How do they think it works?
 To prevent or correct design choices that don’t work

User roles
 User
o Don’t give the user a specific task; just watch them
 Tester
o Build a prototype and ask users to test it, specific task
 Informant
o Also ask users to reflect on other parts in the design process
 Design partner
o Involve users in the design process. They are involved in every design decision

2. Involve users often
 Iteration is core to human-centered design

3. Be polite
 Make using a product feel simple for users.
 Don’t make users think hard about how to use a product.
 Try to understand problems users may have with a product, don’t blame them.

4. Know your users


 We are never our users
 We can’t know who our users are unless we interact with them, do research with them, gain
empathy
 What do we need to know?
o Motivations
o Emotions
o Cognitive capabilities
o Preferences
o Physical capabilities
o Problems and needs

5. Design for emotion


 3 levels at which product design strikes us:
o Visceral: our reaction to the appearance
 Example: a citrus juicer that looks nice but doesn’t work well
o Behavioral: our feelings about using the product
 Example: an ugly chair that holds memories
o Reflective: how we see ourselves using the product

Glossary
 Conceptualization
 Context-of-use
 Empathy
 Evaluation
 Iteration
 Mental model
 Prototype
 Usability (testing)
 User needs
 User experience (UX)

Human centered design HC2
Analysis of context-of-use
Group project
Design team roles
 Researcher takes the lead in
o Involving users
o Preparing and carrying out data collection
o Analyzing and interpreting research data
 Designer takes the lead in
o Generating ideas based on research insights
o Visualizing design ideas
 Personas
 Storyboards
 Wireframes
 UI sketches
o Communicating plans and findings (internally and externally)
 Visually
 Communicator takes the lead in
o Team collaboration
o Project management
o Communicating plans and findings (internally and externally)
 Written
 Orally
 Developer / maker
o Translating design ideas to tangible artefacts, e.g.
 Storyboards
 Wireframes
 Prototypes
o Making and remaking prototypes

Narrowing down the project scope


Suggested approach
 First brainstorm – activate prior knowledge
o What do you (as a team) already know about the design challenge?
 Problem space
 Target group
 Existing solutions
 Relevant scientific knowledge
o Identify missing information: what don’t we know yet?
 Search for missing information
o Scientific literature
 Theories
 Known causes/predictors/factors
 Drivers and barriers
o Existing data sources
 Statistics (e.g. Eurostat, StatLine)

 User forums and online reviews
o Societal information
 Government agendas
 Popular news media topics
 Public opinion
o Existing solutions
 Which problems do these not solve yet?
 Second brainstorm – combine prior and new knowledge, choose a project focus
o Share new knowledge
o Revisit design challenge
 Probable causes/determinants/solution spaces
 When would the problem be solved?
 Choose one approach to address the problem
o Formulate preliminary project goal and scientific contribution
 Formulate project goal and research question

Project goal and scientific contribution


Research vs. design
              Research            Design
Purpose       General knowledge   Specific solution
Result        Abstracted          Situated
Orientation   Long-term           Short-term
Outcome       Theory              Realization

Research for design


 Doing research became a recognized part of designing products
 Generating knowledge that is needed to design a product
 Conducting studies to learn specific information about the situation for which the design is
made

Research into design


 Doing research into design methods

Research through design


 Design activities and designed artefacts are used to generate and communicate knowledge.
 Generating knowledge that other researchers and designers can use, e.g.
o Stimulus material others can use in their products
o A new combination of factors as a provocation for discussion
o Interactions that were not possible before
o Insight that can be shared with and used by others

Intermediate level knowledge


 Patterns
o Typical solution that can be applied in other products too.
 “Design-oriented research practices create opportunities for constructing knowledge that is
more abstracted than particular instances, yet does not aspire to the generality of a theory.
We call this middle territory intermediate-level knowledge” (Höök & Löwgren, 2012)

Project goal and intermediate level knowledge
 Besides a general project goal, you should think about the intermediate level knowledge you
aim to generate (= scientific contribution)
o Either by reflecting on your entire project
o Or by addressing a specific (scientific) research question in one of the HCD phases

Project vs. project phase


 Besides your main project goal
o Formulate separate (and smaller) goals for each HCD phase
 Make sure these contribute to your main project goal
 What is it that you want to achieve in this HCD phase and how does it help
you to reach your project goal?
o Translate each phase goal to research questions (RQ)
 Make sure these contribute to your phase goal
 Which questions do you need to answer in order to reach your phase goal?

Choosing a suitable method


Formulate questions that can be answered via user research
 What do users want and need?
o What characteristics of your users may affect their use of your product?
 Cognitive
 Physical
 Emotional
o What real-life problems do users have within your domain of focus?
o How do users think about a topic? What does the mental model of users look like?
o Which elements in the users’ context may affect their use of your product?
 Physical
 Psychological
 Social
o Which tasks/activities will your users use your product for?

Consider your constraints


 Time
 Budget

 Access to participants
 Legal and ethical issues
Triangulation
 In new media design: An approach to data collection and analysis that uses multiple
methods, measures, or approaches to look for convergence on product requirements or
problem areas.
 Different methods
 Qualitative and quantitative methods
 Theoretical approaches (and related methods)
 Use at least two methods in your analysis of the context-of-use

Triangulation in HCD

What is the role of your users?

Determine what you want to know
 What do you want to be able to say at the conclusion of your study (take another look at
your research questions and HCD phase goal)?
o Attitudes
o Motivations
o Behavior (patterns, routines)
o Values and beliefs
o Experiences
o Capabilities

Determine the best way to gain knowledge


 Behavioural or attitudinal?
 Self-report or direct observation/measurement?
 Lab or field?
 Qualitative or quantitative?

Common methods for analysis of context-of-use


Diary studies          Say/think
Interviews             Say/think
Focus groups           Say/think
Surveys                Say/think
Field studies:
  Observation          Do/use, see
  Contextual inquiry   Do/use, see, say/think
Generative methods     Do/use, see, make
Task analysis          –

Design anthropology
 “Design anthropology describes the practices of anthropologists who collaborate with
designers and team members from other disciplines in order to develop new product ideas.
The primary contribution of the anthropologists lies in the ethnographic research they
conduct with users, or potential users, of the product being envisioned, in order to learn
about the everyday practices, symbolic meanings, and forms of sociality with which a
successful new product would need to articulate. Designers and other members of product
development teams draw on findings from such research to develop design ideas that fit the
lived experience of intended users.”

Generative methods
 (Very) early design phases
 Discover latent needs
 Making and storytelling
 Cultural probes
 Sensitization and generative sessions (Sleeswijk Visser et al.)
 Sensitization
o Prepare people for different stages in the research

Preparing the data collection
Typical setup of methods section
 Participants
 Materials
 (Design)
 Procedure
 Analysis

Participants
 Formulate inclusion criteria: who are you designing for (and who is eligible to participate)?
 Determine a recruitment strategy
o Convenience sampling
o Snowball sampling
o Purposive sampling
 How many users?
o Magic numbers for user tests (per user group):
 N = 10 – average of 95% of the user problems and at least 82%
 N = 5 – average of 85% of user problems
o Saturation: the point at which no new relevant information emerges
o Sometimes it’s inspiration that you want rather than validated findings
o Two-step approach: inspiration first, validation second
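The magic numbers above follow from a simple probability model. Nielsen & Landauer model problem discovery as found(n) = 1 − (1 − λ)^n, where λ is the probability that a single participant exposes a given problem; λ ≈ 0.31 is the commonly cited average, and the exact percentages quoted above come from empirical studies, so they differ slightly from the model. A minimal sketch:

```python
# Nielsen & Landauer's problem-discovery model: expected share of
# usability problems found with n test users. lam = 0.31 is the commonly
# cited average probability that one user exposes a given problem; the
# true value varies per product and study.

def problems_found(n_users: int, lam: float = 0.31) -> float:
    """Expected proportion of usability problems uncovered by n_users."""
    return 1 - (1 - lam) ** n_users

if __name__ == "__main__":
    for n in (1, 3, 5, 10, 15):
        print(f"{n:2d} users -> {problems_found(n):.0%} of problems found")
```

With λ = 0.31 this gives roughly 84% for 5 users and 98% for 10, in the same ballpark as the averages quoted above; for subtle problems (low λ), more users are needed.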

Materials and procedure


 Design/produce the materials you will need
o E.g. diaries/cultural probes
o Interview topic lists
o Questionnaire items
o Observation goals and field note templates
o Creative material for generative sessions
 Write a protocol for your study (as much detail as possible – reproducibility)

Ethical considerations
Research ethics – comfortable experience
 Research must be beneficial and cause no harm, don’t do useless research
 Participants should never feel uncomfortable, physically or psychologically
o Note that your participants will usually be slightly nervous, no matter what
 Assess all potential risks, real and perceived, and mitigate these
o Consider the study from the participants’ perspective
o Notify participants of potential (perceived) risks
 When evaluating: stress that you are evaluating the product, not the participant
 Acknowledge the participants’ expertise (they are experts of their world of experience)

Research ethics – informed consent


 Participants have the right to be informed
o Purpose of the study

o Procedure
o Risks
o Their rights
 Using deception is not allowed without permission from an ethical board
 Working with children and vulnerable target groups is not allowed without permission from
an ethical board
 Debrief participants about nature, results and conclusions

Research ethics – right to withdraw


 Participants always have the right to withdraw
o Without penalty
o Without giving you a reason
 Do not trick people into participation, especially if you know them personally

Research ethics – permission to record


 Always obtain permission for recording audio or image, before you start recording
 Include this permission in the consent form, incl. additional permission if you plan to show
recordings to others (e.g. in class, in your report)

Research ethics – privacy and confidentiality


 Keep participation confidential
 No identifying information should be kept with actual data
o Names – use a participant ID (and de-identify a.s.a.p.)
o Contact details
o Any identifiers really
 Store data securely, no links between participant IDs and names/contact details

Research ethics – valid and reliable data


 Always ensure that the data you collect are accurate, valid and reliable
 Never collect data you know are invalid or unreliable
 Be transparent about this

Human centered design HC3
Ethics of technology design
Human or humanity-centered design
 To be agents of positive change, we as designers need to think more broadly about the
direct and secondary consequences of our work. […] To do that, we need to integrate our
discipline with systems thinking, which entails understanding how systems work and evolve
over time. This will allow us to anticipate and mitigate the negative longer-term
consequences of well-intentioned solutions. […] We have the responsibility to evolve from
human-centered design thinkers to humanity-centered designers.

Social design
 Design is more than just decoration.
 Being mindful of role and responsibility in society as a designer
 Using the design process to bring about social change
o Designing “good” products
o Using design to provoke, to speculate about the future
 Critical of market-orientedness of conventional design practice

Games against health: a player-centered design philosophy


 “these [Games for Health] studies usually imply one type of goal: the promotion of an
ideologically-informed “good” health and behavioural change among players towards that
end.”
 “A troubling aspect of Games for Health is the unacknowledged conception of a player as a
deficient or broken entity in want of repair. ”
o Would gamers consider themselves in need of repair?
 “Games for Health are instrumentalized primarily for the good of health insurance
companies.”
o Do games for health designers really act ‘for the people’s own good’?

What should be our main aim: humans or humanity?


Human-centered design is not the holy grail
 Involving users is not enough
 Risk of HCD becoming an ‘OK stamp’
 Who should we prioritize, the end-user or other stakeholders?
o What if a health insurance company or the government funds our research?
 First step is transparency about origins and underlying values of research

Is technology neutral?
Technological utopianism
 In the 1990s:
o Technology as the solution to any problem
o Technology as intrinsically empowering
 But missteps have raised ethical awakening

 Ever-increasing ubiquity of technology and data makes people skeptical about trusting tech
companies.  Is technology neutral? Or does it have its own social, political, or moral
effects?
Examples of algorithmic bias
 Algorithmic bias based on user activity and concerted action by far-right groups to skew
responses.

 Algorithmic bias based on poor training data

What do you think? Is technology neutral?


Views on tech neutrality
 Instrumentalism:
o Technology is just a tool that can be used for good or bad.
 Technological determinism:
o Technology is anything but neutral; it moulds society and culture, acting more as our
master than our servant.

 Mediation theory:
o Technology is a medium through which we perceive and manipulate our world 
we don’t control technology, nor does it control us; instead, humans and technology co-create the world.
Design as applied ethics
 Every act of design is a statement about the future: in our design choices, we disregard
alternative realities.
 Law of unintended consequences: there will always be outcomes we overlook.
 We have responsibility
o To anticipate and mitigate the worst possible outcomes
o To anticipate the impact on non-users and wider society
 Algorithmic bias is a form of negligence

So how do we take responsibility?


 Futuring and speculative design (reading suggestion: speculative everything by Dunne &
Raby)
 Moral imagination:
o Design team role play: the designated dissenter
o Design for real life: highlight moments of user crisis
 Duty ethics:
o What if everyone did what we’re about to do?
o Are we treating people as ends or as means?

Data analysis & interpretation


What is the goal of analyzing qualitative data?
 Drawing insights to inform design
 Find and confirm recurring patterns
 Interpret patterns

Starting point
 Clear statement of the problem to solve (= your project goal)
 Specific research questions to answer (= your analysis phase RQs)
 Expected output
o Match analytical strategies to desired outcomes

Personas
 Fictional character representing your user groups
 Based on real data
 Research and communication tool

o Identify and understand users
o Present insights
 Design reflection tool
o What would Emma do?
Scenario
 Fictitious story of a user (or persona) accomplishing an action or goal
o Current
o Future
 Focus on user actions and product responses

Storyboard
 Originates in visual arts (film, comic books, animation)
 Visualizes the user’s actions and circumstances under which these are performed
 Broader picture: user and product in context

Analysis: processing raw data – digitalization


 Get all the relevant data – text, pictures, video, drawing, etc. – into a format that your team
can work on together.
 Move photos, videos, etc. into file system and label with important metadata
o Date and time
o Participant ID
o Researcher ID
 Transcribe audio and video into text

Analysis: processing raw data – transcription


 Literally write down everything that is being said (and by whom)
 Aim for exact transcription
 In case of constraints:
o Paraphrase
o Drop unnecessary words
o Make a selection
 Be very careful in deciding that something is not relevant
 Revisit disregarded data in a later stage

Analysis: data immersion & high level initial coding


 Immerse yourself in the data, i.e. read the transcripts several times
 Capture initial insights

Analysis: chunking data


 Break up large chunks of data into smaller units
o Single unit of information
 Data can be part of multiple chunks

(Diagram: an interview fragment is chunked into single units of information, with chunks labelled
e.g. “medical incident”, “personal medical history”, “attitude”, “caregiver-patient relationship”,
“coping with consequences”, “limitations at work”; chunks 2-5 overlap, illustrating that data can
be part of multiple chunks.)
Analysis: sorting data into categories
 Systematic approach to data analysis: sorting data units into meaningful categories
 Understanding/interpreting categories of data is the backbone of your analysis
o Finding patterns
o Deriving insights

Analysis: sorting data into categories – step 1


 Examine each data unit carefully: close reading
 What does it really mean in context [of everything else the participant has told you]?
 Assign ‘codes’ to data units
o Duplicate data units if they belong to multiple categories
 in the end, each data unit has one or more codes
o if a unit seems irrelevant, give it the code ‘seems irrelevant’

Codes
 A code is a short, descriptive label that describes data units:
o Subject
o Nature
o Tone
o Speaker
 Data units can belong to multiple categories (assign multiple codes or duplicate data units)

Coding
 Coding is the process of sorting the data into categories and to assign codes describing the
categories
 Deductive coding – top-down
o Predetermined categories, e.g. theory, research questions, desired outcomes
 Inductive coding – bottom-up
o Data-driven or open coding: allow data to suggest new categories
 Combination of bottom-up and top-down coding, e.g.
o Main categories for each interview topic, research question, persona characteristic…
o Allow data to suggest new categories
o Create subcategories based on data
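To make the coding workflow concrete, here is a minimal sketch (all codes and data units are hypothetical) of how coded data units can be organized so that a unit carrying multiple codes ends up in multiple categories:

```python
# Hypothetical coded data units from an interview transcript. Each unit
# (chunk) carries one or more codes; inverting the mapping yields a
# coding scheme: category -> the data units sorted into it.
from collections import defaultdict

units = [
    ("P1: I always forget my evening dose", ["medication", "routines"]),
    ("P1: my daughter reminds me by phone", ["social support", "routines"]),
    ("P2: the text on screen is far too small", ["usability"]),
]

scheme = defaultdict(list)
for text, codes in units:
    for code in codes:
        scheme[code].append(text)  # duplicate units across categories

for code in sorted(scheme):
    print(f"{code}: {len(scheme[code])} unit(s)")
```

A code such as "routines" collecting several units is what you want; a code left with a single data unit may signal that more research is needed.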

Ideas for categories


 Basic (if no pre-defined categories):
o Activities
o Environments
o Interactions
o Objects
o Users
o Mental models
o Behaviours
o Goals
o Motivations (drivers and barriers)
o Decision-making processes
o Skills
o Activities and tasks
o Resources
o Experiences and attitudes

Analysis: sorting data into categories – step 2


 Within each main category, identify subcategories
 Define each subcategory and give it (and the data units in it) an appropriate subcode
o Again: duplicate data units if they belong to multiple subcategories
 Repeat for subcategories if necessary
o You can use different levels of subcategories
 (Sub)codes are always assigned to multiple units of data
o If you end up with a code that has just one data unit, but you still find this code to be
unique and valuable, you should do more research.

Analysis: sorting data into categories – step 3


 Map your coding scheme, i.e. list of all (sub)categories with definitions
 Look for patterns
o Expected and clear
o Unexpected and surprising
 Combine and divide (sub)categories
o Move data units if necessary

Coding is a messy process


 Review and revisit your categories and codes
 New codes will emerge
 Initial codes may become redundant
 Merging codes may be required
 Revisit your data (or disregarded data) once your coding scheme is complete

Analysis: analyse coding scheme and categories
 Use the patterns to generate insights that might create opportunities for change (i.e. design
ideas)
 Interpret the categories and patterns: what do these tell you regarding
o The problem you are trying to solve?
o The users you are designing a solution for?
o The context your solution will be used in?
o The activities your user will use your design with?
 Coding scheme provides reporting structure.

Recap: the ideal analysis process


 Processing raw data – digitalization and transcription
 Data immersion and high level initial coding
 Chunking data into single meaningful units
 Sorting data into categories;
1. Close reading and assigning main codes to data units
2. Identify sub-categories (and assign sub-codes)
3. Map coding system = i.e. list of (sub)categories with definitions
 Recode data
 Finalize coding system
 Analyzing coding schema and interpret categories  derive insights to inform design

What about non-textual data?


 Does it contain information/data that fits a (sub)category or warrants a new category?
o Add to categories/assign codes
 Can it be used to illustrate a (sub) category in your reporting?

Analysis medium
Paper:
 Physical manipulation of data allows for drawing connections between pieces of data
 Different people can work on the data simultaneously
 Digitized data must be transferred onto paper
 Rearrangements must be done by hand
 Requires wall space
 One overeager cleaning crew may mean the end of your analysis work

Digital:
 Flexibility: adjusting/revisiting the coding scheme is easier/speedier
 Allows for sharing
 Joint data analysis with multiple researchers is difficult
 Does not allow for (easy) adding of non-textual data
 Clouds can crash

What about surveys (or other quantitative data)?


 Descriptives mostly
 Main goal remains to derive insights to inform your design
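For a small context-of-use survey, plain descriptives usually suffice. A minimal sketch with hypothetical 5-point Likert responses, using only the standard library:

```python
# Descriptive statistics for hypothetical Likert-scale survey responses
# (1 = strongly disagree ... 5 = strongly agree).
from statistics import mean, median, stdev

responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5]  # hypothetical data

print(f"n      = {len(responses)}")
print(f"mean   = {mean(responses):.1f}")
print(f"median = {median(responses)}")
print(f"sd     = {stdev(responses):.1f}")

share_agree = sum(r >= 4 for r in responses) / len(responses)
print(f"agree (4 or 5): {share_agree:.0%}")
```

Triangulate such numbers with your qualitative findings rather than reporting them in isolation.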

Human centered design HC4
Communicating insights
Reporting findings – main insights
 Use your coding scheme as main structure for the findings section.
 For the main codes you want to report, provide:
o The brief descriptive label – this can be the header
o A summary of the insights you gained, explaining subcategories
o Supporting evidence
 Quotes
 Photos
 Sketches

Reporting findings – quantitative data


 Descriptives mainly
 Triangulation: combine qualitative and quantitative findings in your insights
 Example

Report findings – recommendations


 Different types of recommendations
o Features/functionalities of your design
o Specific user requirements
o Design recommendations
 Interaction design
 Interface design
 Look and feel
 Either for each code separately or in one general section after presenting your insights

Personas

 Fictional characters representing your user groups

Personas – purpose
 Research tool
o Identify and further understand user groups (differences between user groups)
 Communication tool
o Present and encapsulate insights from user research
o Shared focus within design team
o Allows for remembering relevant user data in subsequent design steps
 Design reflection tool
o Mental model of user groups in a design team  predict behavior
o Evoke empathy. Prevents designers from projecting their own needs and desires
o Guide design decisions by how well a design meets user needs
 Maintain the perspective of the users: what would Emma do?
 Prioritization of features
 Explore actual use of a product (e.g. interaction, navigation)

Why would we use personas rather than the real participants in our research?
 Combine data from multiple people to create a general person (average user)
 Unethical to use someone as “persona”

Data-driven personas
 Goal-directed
o Focus on what user aims for
o Focused on the goals the users have
 Role based
o Focus on the role of users (e.g. in an organisation)
o Also focused on the goals

Persona goals
 End goals – users’ motivation for performing the tasks with your products
o E.g. not forgetting medication, being aware of one’s physical condition
 Experience goals – how someone wants to feel when using a product
o E.g. feeling in control, reassured
 Life goals – personal aspirations of users beyond your design
o E.g. reaching optimal health given one’s condition

Data-driven personas
 Traditionally mostly qualitative data
 Scientific approach also focuses on using quantitative data for persona creation
 Quickly presenting some facts about your user groups

Automatic persona generation

Engaging personas
 Focus on richer, more engaging description of a persona
o Psychological characteristics
o Social background
o Emotional relationship with design focus
 Requires more data
 Combine data and storytelling techniques

Fiction-based personas
 Do not create a new persona, but use existing personas from fiction

Personas – critique

 Empiricism
 Stereotypes
 Quick and dirty methods may prevent designers from meeting real users
 Our suggestion is to work with data-based personas (which can be combined with fiction
elements)

Creating personas – content


 Many different options
 Minimum content:
o Name
o Photo
o Demographics
o Descriptive section(s)
o Needs, goals and capabilities
 Other ideas:
o Attitudes, frustrations, social relationships, problems, typical day, hobbies, interests,
… (anything that is relevant for your design goal)

Creating personas
 “The actual purpose of the [persona] method is not the persona descriptions, but the ability
to imagine the product. […] I designate these product ideas as scenarios. It is in scenarios
that you imagine how the product is going to work and be used.”
 Telling stories about your product
o How is it going to work?
o How will it be used?
o In what context will it be used?
o What are the expected consequences?
 For the group assignment you will be making a future scenario, featuring the product you
plan to design (so you can only start doing this after ideation).

Storytelling in HCD
 Can be used to both:
o Tell the users’ current situation (communication of insights)
 Context-of-use
 Problems
o Describe a user’s hypothetical future experience using a new design (conceptualisation/ideation)
 Vision on the situation after the introduction of your product

Storytelling in HCD – purpose


 Using narrative techniques to engage and create empathy (communication of insights)
o Identification
o Character building
 Shared understanding in design team
 Reflection in action (conceptualisation/ideation)
 Supports idea generation
 Allows for quick user feedback on early ideas (evaluation)

Three types of storytelling in HCD


 Scenarios
o Written narratives
 Storyboards
o Visual storytelling
 Video scenarios/storyboards
o Richer visual storytelling

Scenario
 Written story of a user (or persona) accomplishing an action or goal
 Future, hypothetical scenarios:
o Focus on high level user actions and product responses (i.e. interaction)
o Providing sufficient detail to suggest design implications

Elements of a scenario
 Persona as the protagonist
 Setting in which the product will be used
 Beginning: a problem/goal/situation is introduced and action starts
 Action/episodes
o Primary activities of the persona – e.g. attempts to reach the goal
o Decisions the persona makes (and what influences them)
o Task flow
o Tools used
o User experience
 Ending
o Expected result of the product introduced
o Resolution: how has the product changed the existing situation

Scenario – example

Storyboard
 Visual story of a user (or persona) accomplishing an action or goal
 Originates in visual arts (film, comic books, animation)
 Future, hypothetical scenarios:
o Focus on first (visual) interaction design ideas
o Focus on broader picture: user and product in context
 Zoom into key interactions
 Zoom back out to the consequences of using your design
 Adds visual storytelling elements to (further) engage and create empathy
Visual elements in storyboards
 Level of detail
o Play with zoom (context vs. interactions and interface design)
 Inclusion of text
o Usually necessary, but keep it short
 Inclusion of people and emotions
o Show how people experience the design (good and bad)
 Number of frames
o Mostly few frames per scenario

Creating storyboards – simplicity / creating storyboards – highlighting


 Using color, you can direct the user’s attention to certain things

Creating storyboards – beyond sketching

Creating storyboards – SAP scenes
 Website to create storyboards

Involving users
 Validating personas
 Evaluating early design ideas
 Codesigning scenarios and storyboards

Ideation and conceptualization


 A process of creative thinking
 Generate initial ideas and develop these into concepts, offering realistic solutions to the
design problem
o An idea is a first thought that comes to mind  mental construct
o A concept is more developed, has materials, dimensions, shape, details and
technical solution principles  tangible construct

What is creativity?
 The word creativity comes from the Latin verb creo: to create, make
 Creativity = the production of novel, useful products (Mumford, 2003), something original
and worthwhile (Sternberg, 2011)
 Psychological processes involved
o Convergent and divergent thinking (Guilford, 1967)
 Divergent = coming up with as many ideas as you can
 Convergent = focusing, narrowing down
o Generation and exploration (Finke et al., 1992)
o Conceptual blending (Fauconnier & Turner, 2002)

Divergence and convergence in HCD


 This is a simplified view. Convergence and divergence happen continuously (and in parallel)
throughout design. Not just twice.

Step 1 – from general problem to problem definition
 We started with a design challenge (general problem)
 After diverging…
o Gathering lots of insights with analysis of context-of-use
 We need to converge again…
o Identifying patterns and insights from our data
 How might we – question
 Example

Example
 General problem
o Initial general problem: Paraguay has one of the highest incidence rates of
mosquito-borne diseases in the world, and government campaigns have had very little effect.
 Insights
o Important cause of the problem
 Lack of knowledge about precautions to take  education
o In rural areas
 Few computers/smartphones  public installation
o Many illiterate users  non-verbal interactions
 Problem definition
o How might we design a public installation to educate illiterate people in rural
Paraguay to take precautions to prevent mosquito-borne diseases?

Step 2 – design vision


 Your vision on the answer to the “how might we” question
 A statement of the future situation created by your design team, including:
o The main functions of the product you’re designing
o The context in which the product will be used
o The users of your product (personas)
o The use of your product (scenario/storyboard)
 Still high-level at this stage  you will ideate about how to design your product

Example
 Design vision:
o Educational content aiming for behavior change
o Public installation

 Attractive and engaging design
 Short interaction options
o Non-verbal design
 Game elements?

Step 3: ideation
 The process of generating ideas for design solutions
 Divergence and convergence (and repeat):
o Generation of as many ideas as possible
o Evaluation of ideas, bearing in mind the design goal and user insights
o Selecting ideas
o Further ideate/conceptualize selected ideas
 Best practices (cf. research by Nielsen Norman Group):
o Involve several people in ideation
o Look to multiple sources of inspiration, and make sure to include user research as
one of these
o Employ structured ideation techniques
o Use some amount of written ideation
o Have a designated facilitator

Creativity techniques for ideation


 Creativity techniques are techniques for thinking up solutions to problems
 Generating large amounts of ideas in a short time (divergence)

Example
 Initial general focus: assisting hearing-impaired children in everyday life
 Extensive analysis of context-of-use (divergence)
o Observations
o Interviews
o Cocreation sessions

 Communicate insights
 Defining problems (convergence)
 Selecting problems (convergence) and ideation (divergence)
 Further ideation (divergence)

Creativity techniques for ideation


 Several types of ideation techniques (cf. Tassoul, 2006)
o Inventorising techniques

 Collecting and recalling all kinds of information around an issue (e.g.
mindmap)
o Associative techniques
 Generating great numbers of ideas and options within a relatively short time
through association (e.g. brainstorming/brainwriting)
o Provocative techniques
 Generating ideas by thinking outside one’s familiar frame of reference (e.g.
absurd questioning, extreme brainstorming)
o Confrontative techniques
 Identifying and breaking assumptions and preconceptions (e.g. lateral
thinking by asking questions like: “What if not?” and “What else?”).
o Intuitive techniques
 Letting go: spontaneous and intuitive idea generation and reflecting upon
the generated idea (e.g. guided fantasy)

Human centered design HC5


Introduction to card sorting
 Used a lot in the ideation phase.

Card sorting
 Emphasis on the information architecture
o What information should be included in your design?
o What is the mental model (relations, terminology, location of controls)?
o Structure and grouping of information
 Participatory method (with user/stakeholder)
 Can be done with individual participants, groups, or even remotely (digitally).

Card sorting: variations


 Open versus closed: do you allow participants to come up with groups/categories, and
add/remove/change cards? (open)

 Tree test: only show categories, check if participants can find items (seek task instead of sort
task)
 Delphi card sort (initial sort, followed by reviews/modifications only)
 Card sort for storytelling, prioritization, etc.

Card sorting: output


 Patterns across sessions (what the IA should be)
 Commentary during the sorting process (why it should be like that)
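
One common way to find patterns across card-sorting sessions is a co-occurrence matrix: for every pair of cards, count in how many sessions participants placed them in the same group. A minimal sketch (the card labels and session data are hypothetical):

```python
from itertools import combinations

def co_occurrence(sessions, cards):
    """For each pair of cards, count the sessions that placed them in the same group."""
    counts = {pair: 0 for pair in combinations(sorted(cards), 2)}
    for groups in sessions:
        for group in groups:
            for pair in combinations(sorted(group), 2):
                if pair in counts:
                    counts[pair] += 1
    return counts

# Two hypothetical open-sort sessions (each a list of groups of card labels).
sessions = [
    [{"Login", "Profile"}, {"Search", "Filters"}],
    [{"Login", "Profile", "Search"}, {"Filters"}],
]
cards = {"Login", "Profile", "Search", "Filters"}
matrix = co_occurrence(sessions, cards)
# ("Login", "Profile") co-occur in both sessions -> 2
```

Pairs with high counts are strong candidates to group together in the information architecture; pairs that split across sessions point to ambiguous terminology worth discussing with participants.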

Introduction to prototyping
What are prototypes?
 Inexpensive
 Experimental
 Fast
 Externalizing a vision
o As a designer you have an idea for the concept. Prototypes help to visualize the idea
for the concept.
 By definition: not the real product

What can be prototyped?


 Most prototypes fall somewhere within this triangle.
 Role: what the experience might be like
 Implementation: how it might work
 Look and feel: what it might look like

Goals of prototyping
 Explore design/solution space – fail fast

 Validate assumptions (formally/informally)
 Prove feasibility
 Communicate (collaboration, sell an idea)
 Positive side-effects include:
o Involving potential end-users, other stakeholders

What can be prototyped?


 Structure (screen layout, information display, navigation)
 Behavior (workflow, task design, interactivity)
 Presentation (visual style)
 Concepts and terminology
 Technical aspects
 Difficult/controversial/critical areas

Different prototype fidelities


Fidelity
 A prototype’s similarity to the end-product.

Fidelity
 Breadth
 Depth
 Visuals
 Interactions (functional)
 Data model (content)
 Context

Breadth and depth


 The number of features covered (breadth) vs. how fully each feature is worked out (depth)

Visuals

Interactions (functional)

Data model (content)

Context

Not either low or high!

Advantages of low fidelity


 Generally little effort to create
 Easy to make changes (even on the spot)
 Boosts creativity
 Free-form, generative – helps to explore user requirements
Challenges of low fidelity
 Does not provide the full experience
 Might not give you concrete answers
 Hard to measure user’s performance
 Not ready for hand-over to developers (esp. without documentation)

Advantages of high fidelity


 Convincing (sell an idea)
 Closer to the real experience (interactive, context)
 Better predictors of user performance
 Potentially usable without expert present
 Ready for hand-over to developers

Challenges for high fidelity


 Generally more time-consuming to develop
 Might create (over)expectations
 You might get different feedback

o People might hold back
o Detail-oriented

Almost always: mixed fidelity


 For example:
o High functional fidelity, low visual fidelity (=technical feasibility)

Prototyping methods
Paper prototypes
 + Allow you to experiment with several aspects of the concept: interactivity, layout, flow,
content
 + Generally fast and easy to make and alter
 - Interactions not fully natural

Interactive (digital) prototypes


 Replacing the "human computer" of a paper prototype with an actual one.
 + More realistic interactions
 + Allows testing on the actual device (e.g. phone app)
 + Basis for increasing (visual) fidelity
 - Complex animations and real data can be challenging
 - More effort to create

Clickthrough prototypes

Rich interactive prototypes

Physical prototypes
 A 3D representation of your idea
 Helps to explore physical aspects such as shape, size, weight
 Clay, foam, cardboard, paper, LEGO, 3D printed, etc.
 Can be combined with paper prototypes
 High functional fidelity extension: physical computing (sensors and actuators)

Narrative prototypes
 + Show the product being used in context (role)
 + Help to create empathy
 + Show a particular (ideal) flow throughout a process or interaction
 + Allow you to fake advanced interactions, complex technologies
 - Lack interactivity (most of the time)

 - Do not allow you to identify usability problems

Narrative prototypes: role play (feat. physical/paper prototypes)

Narrative prototypes: video

Narrative prototypes
 Think also of:
o Scale (e.g. maquette, figurines)
o Integration with paper/interactive/physical prototypes
o Stop-motion
Planning your prototyping efforts
What is the cheapest, fastest way to [goal] [feature] to/with [audience]?
1. Think before you make
2. Don't prototype everything

What is the cheapest, fastest way to communicate to our team our intended interaction between
students and an interactive wall?
 Breadth – focused
 Depth – deep
 Interactions – middle
 Visuals – sketchy
 Data model – placeholder (if it is only for teammates)
 Context – stand-in

What is the cheapest, fastest way to validate whether car drivers are able to observe our public
installation in time and without causing a safety hazard?
 Breadth – focused
 Depth – deep
 Interactions – steps
 Visuals – middle
 Data model – middle
 Context – in the field

Human centered design HC6
Group assignment prototype
 The type of prototype should match your research question in the evaluation phase.

Methods for involving users


 All about UX is a website for good evaluation methods

See-Say-Do methods
 Say – listen to what the user has to tell you about the experience (e.g. a diary)
 See – the researcher observes what the user is doing
 Do – measure what users are doing, also when the researcher is not watching (e.g. what the user clicks on)

Distinctions in evaluation methods
Goal of evaluation in the design phase
 Formative methods
o Discover insights and shape design direction
o For concepts, scenarios/storyboards, low-fidelity prototypes
 Summative methods
o Evaluate design against a set of metrics (time-on-task, success rate)
o For high-fidelity prototypes and fully functional products
o More doing

Focus of the evaluation on usability, UX, and/or utility


 Usability/ease of use
o Identify design problems that hinder effectiveness and efficiency
 Say, see and do methods
 User experience
o Assess the user’s perceptions of and responses to your design
 Say methods
 Utility/usefulness
o Test to what extent your design solves the user's problem
 Say and see methods
 Many (traditional) evaluation methods in HCD focus on usability. Make sure to focus on UX
and utility as well.

Period of experience – when to evaluate?


 During and after are most used, but you can also do an evaluation before users use the
product (for expectations).
 You can also evaluate for a longer period.

Quantitative vs. qualitative
 Quantitative
o Usability testing: evaluating user performance against metrics;
 Time-on-task
 Task success
 Error rate
 Efficiency
o Log data and analytics
o Self-reported behavior
 Scales and surveys
 Diary
 Qualitative
o Attitudes and opinions
 Interview and focus group
o Observable behavior
 Observation
o Self-reported behavior
 Diary study
 Think aloud

Psychophysiological metrics (quantitative)


 Eye tracking (attention and processing)
 Facial expressions (emotions: valence, positive or negative)
 Heart rate, skin conductance and temperature (arousal and physiological activation)
 Brain activity (EEG, fMRI)

Think aloud (qualitative)

 Verbalization of thinking process
 Helps you discover what users think (e.g. why they misinterpret your design)
 Good in addition to quantitative measures
 Might affect performance measures
o Performance time
o Eye tracking

Expert review methods


Expert review
 (Usability) inspection method
 Analysis by 3-5 UX experts to identify (usability) problems
 Evaluation against a set of criteria
 Advantages:
o Discount method: can save time and money
o Can be done in any stage in the design process
o It’s possible to review isolated design elements
 Disadvantages:
o Experts are not users
o Requires knowledge and experience

Evaluation against a set of criteria – (usability) heuristics


What is a heuristic?
 A rule or method, based on experience, that helps you to solve problems faster
 Shortcut, rule of thumb

Heuristic evaluation
 Judging compliance of a design with recognized usability principles
 Most well-known: Jakob Nielsen's 10 usability heuristics

 Used as a starting point for more specific heuristics

Nielsen’s usability heuristics: some examples


 Gmail log-in (good)
o Error prevention
 Helps users avoid creating a weak password.
 Capstone log-in (bad)
o Help users recognize, diagnose, and recover from errors
 Google drive

o Visibility of system status
 Gives the user a visualization of the upload progress
 Also shows a count ("1 of 1 uploading"), so if you uploaded 3 files, you know something is wrong.

Category-specific heuristics
Category-specific heuristics
 Nielsen’s usability heuristics are very generic and usability-oriented
 In addition, we need more specific criteria to evaluate our designs against
o User group (children, people with Down syndrome)
o Task (specific task like gamification or game design)
o Context (specific context)
o Class of products

Generating category-specific heuristics


 Existing design guidelines
 Previous scientific studies
 Scientific theory
 Competitor analysis
 Good example of intermediate level knowledge (knowledge that comes from a research
project that gives guidelines that other researchers and designers can use)

Category-specific heuristics: some examples


Interactive systems for children with autism spectrum disorder
 Existing literature on interactive systems for children with ASD was reviewed
 70 design guidelines were identified
 The guidelines were mapped to Nielsen’s heuristics
 Unmappable guidelines were regrouped and translated into 5 new heuristics
o 13. [Responsiveness] of the system
 Each action performed by children with autism (for instance click and select)
should have no latency as children with autism have shorter attention span,
typically forget quickly and can easily get frustrated.

Heuristics for mobile healthcare wearables


 Self-determination theory as a theoretical framework
o Self-determination theory
 Is about motivation
 3 basic needs for motivation
 Need for autonomy
 Need for competence
 Need for relatedness
 In-depth diary study to explore self-efficacy and user motivation for 2 fitness tracking apps
o Questionnaires
o Log app use
o Reflect on app use
 Findings were translated into heuristics for fitness tracking tools and apps

Critical look: any concerns with Asimakopoulos's paper?
 Representation bias: only highly motivated participants, while one of the problems identified
in the introduction is abandonment of fitness tracking devices
 Lack of transparency about how heuristics were created

Applying heuristics: exercise


Expert review exercise
 In your review:
o List usability strengths (incl. short explanations)
o List usability problems
 Which heuristic was violated?
 Why does this represent a problem?
 A recommendation of how to address the issue

Human centered design HC7
Which methods do we expect you to know?
| Method | See/say/do? | Self-report/observation/measurement? | Qualitative or quantitative? | Formative or summative? | Usability or UX? | Before/during/after usage or over time? |
|---|---|---|---|---|---|---|
| Usability testing | Do | Observation & measurement | Quantitative | Both | Usability | During usage |
| Interview | Say | Self-report | Qualitative | Formative | Both | Before, after, over time |
| Focus group | Say | Self-report | Qualitative | Formative | Both | Before, after, over time |
| Survey & scales | Say | Self-report | Both (qualitative for open questions only) | Both | | Before, after, over time |
| Diary study | Say | Self-report | Both | Formative | Both | Before, after, over time |
| Observation | See | Observation | Both | Formative | | During usage |
| Expert review | n/a | n/a | n/a | Formative | Both, mostly usability | n/a |
| Think aloud | Say | Self-report | Qualitative | Formative | Both | During usage |

Analyzing evaluation findings


Different types of findings
 Performance metrics
 Issue-based metrics
 Self-reported metrics

Performance metrics (do)


 Usability testing = evaluating user performance against metrics
 Making user behavior (i.e. interaction with a design) measurable:
o Task success
o Time on task
o Errors
o Efficiency
Performance metrics: task success
 Have participants reached their goal?
 Requires clear tasks and success criteria
 Unsuccessful could be:
o Participant gave up
o Time limit was reached
o Researcher intervenes
o Incorrect answer
 Many different ways to report task success rate
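
One common way to report task success is the raw completion rate, supplemented with a confidence interval because usability samples are small (the adjusted-Wald interval is a popular choice for this). A sketch with hypothetical numbers:

```python
import math

def success_rate(successes, n):
    """Raw completion rate plus a 95% adjusted-Wald interval
    (adds 2 successes and 2 failures before computing the interval)."""
    rate = successes / n
    p = (successes + 2) / (n + 4)          # adjusted estimate
    half = 1.96 * math.sqrt(p * (1 - p) / (n + 4))
    return rate, max(0.0, p - half), min(1.0, p + half)

# Hypothetical: 7 of 10 participants completed the task.
rate, lo, hi = success_rate(7, 10)
print(f"{rate:.0%} success (95% CI {lo:.0%}-{hi:.0%})")
```

The wide interval for n=10 is itself a useful finding: it reminds the reader not to over-interpret small differences between tasks or iterations.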

Performance metrics: task success
 Graph: successful completion rate by task
o Some tasks have lower success rates, so the design might be changed for those tasks.

 Squares/tasks: 1 = easy task, 10 = complex task

 Just indicating whether participants were successful doesn’t help in improving your design
o Where did they encounter problems?
o Why did they encounter problems?

Performance metrics: time on task


 How long did it take participants to reach their goal?
o i.e. efficiency
 Reporting time on task alone has no value: what is a good time on task?
o Short time  not hindered by issues
o Short time  engagement (e.g. in games or learning)
 Interpret time on task: what causes shorter/longer times?
 Never combine time on task with think aloud!
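
Because time-on-task data is usually right-skewed (a few participants take much longer), the median or geometric mean often summarizes it better than the plain mean. A sketch comparing two hypothetical iterations:

```python
import math
import statistics

def summarize_times(seconds):
    """Arithmetic mean, median, and geometric mean of task times (in seconds)."""
    geo = math.exp(statistics.fmean(math.log(t) for t in seconds))
    return statistics.fmean(seconds), statistics.median(seconds), geo

# Hypothetical times for one task, two design iterations.
v1 = [41, 38, 55, 120, 44]   # one participant struggled
v2 = [30, 35, 28, 40, 33]
for label, times in (("v1", v1), ("v2", v2)):
    mean, med, geo = summarize_times(times)
    print(f"{label}: mean={mean:.1f}s median={med:.1f}s geometric={geo:.1f}s")
```

Note how the single 120 s outlier in v1 inflates the mean far more than the median or geometric mean; interpreting the cause of such outliers matters more than the number itself.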

Performance metrics: time on task


 Graph where mean time on task is shown for different tasks. You can compare times for
different iterations to see if your new iteration is better.
 You can also do comparisons between different kinds of users. You can examine if the task is
achievable for different kinds of users.

Performance metrics: errors
 Did participants make any errors while attempting to reach their goal?
 Were participants able to recover from errors they made?
 Identifying and classifying errors helps to understand the nature of design problems
o Coding!
 Errors could be:
o Entering incorrect data
o Making a wrong choice in a menu or drop-down list
o Incorrect sequence of actions
o Failing to take action
 Reporting can include both error rates and nature of errors

Performance metrics: error rates


 Table from evaluation in which the error rates are listed
o The researchers defined possible errors beforehand, but it is easy to overlook errors that way

Performance metrics: error nature


 Usability testing of a diabetes insulin pump
 Interview findings: several participants were double dosing themselves
 Three main reasons:
o User rounded up dose amount
 Cognitive or perception error
o User did not understand difference between dose types
 Cognitive error
o User accidentally pressed a button too many times
 Action error or cognitive error

Performance metrics: efficiency


 How efficiently did participants reach their goal?
o Time-on-task is also a metric of efficiency
o Here we mean the number of actions needed to complete a task
 Gives more insight into nature of design problem than time on task
 Requires benchmark: what is the ideal path?
 Which actions to measure?
o Mouse clicks (on what?)
o Keystrokes

Performance metrics: deviation from optimal path
 Where in the sequence of actions do users do something different (compared to the ideal path)?
 Compare users’ paths to optimal path
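
One possible (not the only) way to operationalize deviation from the optimal path is the edit distance between a participant's action sequence and the ideal sequence: the number of extra, missing, or wrong steps. A sketch with hypothetical screen labels:

```python
def edit_distance(observed, optimal):
    """Minimum insertions/deletions/substitutions turning one action sequence into the other."""
    m, n = len(observed), len(optimal)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if observed[i - 1] == optimal[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution/match
    return dp[m][n]

# Hypothetical screen/action labels.
optimal  = ["home", "search", "results", "product", "checkout"]
observed = ["home", "menu", "search", "results", "product", "checkout"]
print(edit_distance(observed, optimal))  # one extra detour step -> 1
```

A distance of 0 means the participant followed the benchmark exactly; looking at *where* the sequences diverge points to the screen that caused the detour.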

Issue-based metrics
 Usability issue
o Identification and description of a design problem one or more participants
experienced
o An assessment of the underlying cause of the problem
 Examples:
o Behaviours that prevent task completion
o Expressions of frustration
o Missing crucial information
o Misinterpretation of design features or information

Issue-based metrics
 Who can identify issues?
o Experts
o Users
 In-person studies (with think aloud)
 Verbatim comments (open text fields in scales)
 In-person studies (see and say)
o Analysis of:
 Verbal expressions (think aloud)
 Confusions, frustration, dissatisfaction, surprise
 Confidence, pleasure
 Non-verbal expressions (observation)
 Facial expressions
 Bodily reactions
 Eye movements

Self-reported findings
 Provides information about the users’ perception of your design and their interactions with
it
 That is: UX
 Prone to bias, e.g. social desirability bias

Self-reported findings (say)


 Survey items and rating scales
 Comments and reflections (e.g. in interviews/focus groups, diaries)
 Post-task or post-session
o Difficulty
o Ease of use
o Satisfaction

o Task support by design
 Expectations vs. experience

Self-reported findings
 Post-task questionnaire
 Quotes can illustrate a design problem very well
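
A widely used standardized post-session questionnaire is the System Usability Scale (SUS): ten 1-5 ratings, odd-numbered items positively worded, even-numbered items negatively worded. Its scoring can be sketched as follows (the example ratings are hypothetical):

```python
def sus_score(responses):
    """System Usability Scale: ten 1-5 ratings -> one 0-100 score.
    Odd items contribute (rating - 1), even items (5 - rating); sum x 2.5."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# One hypothetical participant's ratings.
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```

The resulting number only becomes meaningful in comparison (between iterations, or against published benchmarks), which is exactly the triangulation point made later in this section.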

Reporting evaluation findings


Reporting – frequencies
 Frequency of unique issues
o Especially useful when comparing iterations
 Frequency of issues per participant
o Gives an indication of the overall usability of a design
 Frequency of participants
o Percentage of participants who encountered a specific issue
o Relevant for severity of an issue
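
The "frequency of participants" metric is simply the share of participants who encountered each issue; sorting by it gives a first ordering for severity. A sketch with a hypothetical issue log:

```python
# Hypothetical issue log: issue description -> participants who encountered it.
issues = {
    "labels unclear": {"P1", "P2", "P4"},
    "back button hidden": {"P3"},
    "slow feedback": {"P1", "P2", "P3", "P4", "P5"},
}
n = 5  # total participants

# Report issues from most to least widespread.
for issue, who in sorted(issues.items(), key=lambda kv: -len(kv[1])):
    print(f"{issue}: {len(who)}/{n} participants ({len(who) / n:.0%})")
```

Frequency alone is not severity: a rare issue that blocks task completion can still outrank a widespread cosmetic one, which is why the severity ratings below combine several factors.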

Reporting – sorting issues


 Issues by interface/interaction category, e.g.
o Navigation
o Content
 Issues by task
o Combine performance and issue-based metrics with post-task self-report comments
(i.e. what and why)
 Issues by user group
 Issues by severity

Reporting – severity ratings


 Based on impact on user experience, e.g.
1. Catastrophic, nearly unusable for all users: imperative to fix
2. Major problems, but users are still able to perform tasks: important to fix
3. Minor problems for some users: fixing is less important
4. Cosmetic issues only: interesting to fix if time
 Based on a combination of factors
o Impact on UX (from a scientific research perspective, i.e. intermediate level knowledge, this is the most important factor)
o Predicted frequency of occurrence
o Impact on business goals
o Technical implementation costs
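
When severity is based on a combination of factors, the factors can be folded into a weighted priority score; the weights below are purely illustrative assumptions, not a standard, and a team would tune them to its own goals:

```python
# Hypothetical issues scored 1 (worst) .. 4 (mildest) on each factor;
# a lower combined score means "fix this first". Weights are an assumption.
issues = [
    {"issue": "no upload progress",    "ux": 2, "frequency": 1, "business": 2, "cost": 3},
    {"issue": "weak password allowed", "ux": 1, "frequency": 3, "business": 1, "cost": 2},
    {"issue": "inconsistent icons",    "ux": 4, "frequency": 2, "business": 4, "cost": 4},
]
weights = {"ux": 0.4, "frequency": 0.3, "business": 0.2, "cost": 0.1}

def priority(issue):
    """Weighted sum of the factor scores (lower = more urgent)."""
    return sum(issue[k] * w for k, w in weights.items())

for item in sorted(issues, key=priority):
    print(f"{priority(item):.1f}  {item['issue']}")
```

Giving UX impact the largest weight here mirrors the note above that, from an intermediate-level-knowledge perspective, impact on UX is the most important factor.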

Reporting – qualitative findings


o Coding
o Results section
o Don’t report frequencies, however you can mention the number of participants
mentioning the issue
o You can summarize the issues in a table
o Introduce your list of issues in a meaningful order
o Illustrate each issue with the 2-3 most representative quotes/observations

o Describe the nature and relevance of each issue and how it is important for usability

Reporting – qualitative findings


o Example

Reporting – combining findings (triangulation)


o Different methods
o Qualitative and quantitative methods
o Theoretical approaches (and related methods)
o Use two or three of the do, see, say methods. It is easy to include at least one say method.

Reporting: why triangulation?


o More convincing, persuasive recommendations
o Greater rigour
o More in-depth understanding (“what” information + “why” information)
o More richness, varied set of data
o Reduce “inappropriate certainty” that not much is wrong with a design
o Prioritizing requirements

Reporting: wrapping up identified (and prioritized) issues


o Where to focus efforts to improve the design?
o The overall usability of the product
o Is the usability improving with each iteration?
o You can create a table of all the usability issues, with definitions, recommendations and a severity ranking (the severity ranking is the most useful part).

