SAMPLE CHAPTER
David Dylan Thomas
DESIGN FOR
COGNITIVE BIAS
Foreword by Yasmine Mustafa
1 WHAT IS BIAS?
Think about the decisions you’re making at this very
moment. How fast to read. Whether or not to keep reading.
How to sit (or stand). What to do with your arms. How fast to
breathe. What to pay attention to in the room you’re in. What
to ignore. If you had to think carefully about each and every
one of these decisions, you’d probably dissolve into a puddle
of goo right here and now.
Do you ever wonder how many decisions that adds up to?
Even in just one day? There's no good way to quantify "a deci-
sion," so to get an approximate idea, let's consider how much
information actually enters your brain every second: roughly
eleven million bits (http://bkaprt.com/dcb/01-01/). If we imagine
that between your nonconscious and your conscious mind you
are saying "yes/no" or having some other binary response to
each bit, then over the 86,400 seconds in a day we quickly get
to about 950 billion of those responses (which, for the sake of
argument, we'll call "decisions").
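If you want to sanity-check that figure, the arithmetic is just the per-second rate multiplied by the seconds in a day. A quick back-of-envelope sketch (the eleven-million-bits-per-second rate comes from the source cited above):

```python
# Back-of-envelope check of the "950 billion decisions a day" figure.
BITS_PER_SECOND = 11_000_000    # ~11 million bits entering the brain each second
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400 seconds in a day

responses_per_day = BITS_PER_SECOND * SECONDS_PER_DAY
print(f"{responses_per_day:,}")  # 950,400,000,000 -- roughly 950 billion
```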
So that’s, you know, a lot of decisions. The good news is your
mind puts the vast majority of these decisions on autopilot.
They happen without you even knowing it. In fact, scientists
estimate that about 95 percent of cognition happens below the
threshold of conscious thought (http://bkaprt.com/dcb/01-02/,
PDF). All of this saves you the time and effort of thinking care-
fully about things that probably don’t warrant it.
For the most part, the shortcuts your mind takes are a good
thing. They let you be present and focus on the big-ticket items
in life without sweating the small stuff. Sometimes, however,
the shortcuts lead to errors. We call these errors cognitive biases.
MOSTLY HARMLESS
Some of these biases are amusing. For example, illusion of control
is a bias where we like to think we’re in charge in situations
where we clearly are not. A simple example would be a game
where we have to throw a die, like craps. If we have to roll a
high number, we’ll throw the die really hard. If we have to roll
a low number, we’ll roll the die really gently (http://bkaprt.
com/dcb/01-03/). Obviously, the speed with which we throw
a die has no impact on the outcome, but we like to think it
does, and so we embody that with our throw. Silly, sure. But
mostly harmless.
Another fun bias to play with is reactance. This is the “you
can’t tell me what to do” bias. If one wall has a sign that says,
“Please do not write on this wall” and another wall has a sign
that says, “UNDER NO CIRCUMSTANCES SHOULD YOU
WRITE ON THIS WALL,” guess which wall is going to get more
graffiti (http://bkaprt.com/dcb/01-04/).
Reactance can show up in unpredictable ways. Say you have
a corridor and at the end of the corridor is a table with a bunch
of different kinds of candy on it. If you tell someone to “go
down that hallway and pick three pieces of candy, any candy
you want,” generally that person will pick three pieces of their
favorite kind of candy. If, however, the corridor is weirdly
narrow, and you give the same instruction, the subject is more
likely to pick three different kinds of candy, even if they aren’t
their favorite (http://bkaprt.com/dcb/01-05/, PDF).
The idea is that the weirdly narrow corridor primes the sub-
ject to feel like their choices are being taken away. Reactance
kicks in by the time they reach the end of the corridor; they
express their independence the only way they can by, dammit,
picking three different pieces of candy no matter what the
corridor says!
We are so weird.
This, by the way, is the first example we’ll see of design (in
this case the design of the corridor) influencing behavior. It
will not be the last.
Reactance is so strong, in fact, that you can’t even tell yourself
what to do. There’s an experiment where one group of people
writes “I will” twenty times and another group writes “Will I?”
twenty times. The ones who write it as a question are better at
solving puzzles and display a greater intent to exercise than the
ones who are, effectively, telling their future selves what to do
(http://bkaprt.com/dcb/01-06/). Marketers, take note.
NOT SO HARMLESS
Some biases aren’t so cute. Some biases lead to decisions that
cause harm. We’re going to spend most of our time together
talking about those biases. Let’s start with one you’ve proba-
bly heard of.
Confirmation bias is pretty much what you think it is. You get
an idea in your head and you go looking for evidence to confirm
that it’s true. If any evidence comes up to challenge it, you cry
“Fake news!” and move on with your life.
One of the most trenchant examples of this came up during
the Iraq War. In the buildup to the conflict, the narrative the
administration established was that Saddam Hussein, then the
president of Iraq, was hiding weapons of mass destruction
(WMDs) that, if left unchecked, could someday be used against
us. It was a compelling argument.
As it turned out, WMDs? Not so much. As early as 2004, the
President of the United States, the one who insisted there were
weapons of mass destruction in the first place, acknowledged
there were no weapons of mass destruction. Despite this, when
polled in 2015, 51 percent of Republicans (and 32 percent of
Democrats) still believed Iraq had WMDs (http://bkaprt.com/
dcb/01-07/). It should be noted that 52 percent of viewers of
Fox News, a source likely to confirm that belief, also believed
there were WMDs in Iraq.
Damn straight we’ll be coming back to this bias.
voice. However, if that same person is hooked up to a galvanic
skin-response machine, their body will react every time they
hear their own voice (http://bkaprt.com/dcb/01-08/). It’s as if
our bodies know the difference, but our minds are unwilling
or unable to acknowledge it.
Either way, there’s a lot going on under the hood that we
never get a peek at.
THE MYTH OF THE RATIONAL USER
Okay, so we have these things called biases, and they sometimes
lead us to make errors that can be harmless or harmful, but I've
got wireframes to deliver. Why should I care?
Because of choice architecture.
Choice architecture is a concept introduced in the book
Nudge by Richard Thaler (who would go on to win a Nobel Prize
for some of this work) and Cass Sunstein. The basic concept is
that how an environment is designed influences the decisions
people make in that environment.
A simple example is when you go to the store to buy pro-
duce. You may know the accepted wisdom that you never pick
from the top of the pile because the grocer will put the oldest
produce there, since that’s the produce they’re most trying to
get rid of. And why do they think that will work? Because most
people will reach for what’s right in front of them. It’s a bias
(or just laziness—it’s hard to tell sometimes). The interaction
has been architected to benefit the grocer. It could just as easily
have been architected to benefit the customer by putting the
freshest fruit on top.
As I mentioned before, we are in the business of helping
people make decisions. So, think about how your users make
decisions. And not the way a rational user makes decisions,
which it should be clear by now is how no user ever makes
decisions. But rather how a tired, busy user making most of
their decisions below the level of conscious thought makes
decisions. That’s why we need to understand bias. Because
that’s where our users live 95 percent of the time.
Luckily, we can make design and content strategy choices
that will help mitigate cognitive bias or, sometimes, even use
it for good. But first, we need to know what we’re up against.
Let’s start with biases that affect our users.