
SAMPLE CHAPTER
David Dylan Thomas

DESIGN FOR COGNITIVE BIAS
Foreword by Yasmine Mustafa
1 WHAT IS BIAS?
Think about the decisions you’re making at this very
moment. How fast to read. Whether or not to keep reading.
How to sit (or stand). What to do with your arms. How fast to
breathe. What to pay attention to in the room you’re in. What
to ignore. If you had to think carefully about each and every
one of these decisions, you’d probably dissolve into a puddle
of goo right here and now.
Do you ever wonder how many decisions that adds up to?
Even in just one day? There’s no good way to quantify “a decision,” so to get an approximate idea, let’s consider how much information actually enters your brain in a given second: roughly eleven million bits (http://bkaprt.com/dcb/01-01/). If we imagine that between your nonconscious and your conscious mind you are saying “yes/no” or having some other binary response to each bit, we quickly get to 950 billion of those responses (which, for the sake of argument, we’ll call “decisions”) in a day.
So that’s, you know, a lot of decisions. The good news is your
mind puts the vast majority of these decisions on autopilot.
They happen without you even knowing it. In fact, scientists estimate that about 95 percent of cognition happens below the threshold of conscious thought (http://bkaprt.com/dcb/01-02/, PDF). All of this saves you the time and effort of thinking carefully about things that probably don’t warrant it.
For the most part, the shortcuts your mind takes are a good
thing. They let you be present and focus on the big-ticket items
in life without sweating the small stuff. Sometimes, however,
the shortcuts lead to errors. We call these errors cognitive biases.

MOSTLY HARMLESS
Some of these biases are amusing. For example, illusion of control
is a bias where we like to think we’re in charge in situations
where we clearly are not. A simple example would be a game
where we have to throw a die, like craps. If we have to roll a
high number, we’ll throw the die really hard. If we have to roll
a low number, we’ll roll the die really gently (http://bkaprt.com/dcb/01-03/). Obviously, the speed with which we throw
a die has no impact on the outcome, but we like to think it
does, and so we embody that with our throw. Silly, sure. But
mostly harmless.
Another fun bias to play with is reactance. This is the “you
can’t tell me what to do” bias. If one wall has a sign that says,
“Please do not write on this wall” and another wall has a sign
that says, “UNDER NO CIRCUMSTANCES SHOULD YOU
WRITE ON THIS WALL,” guess which wall is going to get more
graffiti (http://bkaprt.com/dcb/01-04/).
Reactance can show up in unpredictable ways. Say you have
a corridor and at the end of the corridor is a table with a bunch
of different kinds of candy on it. If you tell someone to “go
down that hallway and pick three pieces of candy, any candy
you want,” generally that person will pick three pieces of their
favorite kind of candy. If, however, the corridor is weirdly
narrow, and you give the same instruction, the subject is more
likely to pick three different kinds of candy, even if they aren’t
their favorite (http://bkaprt.com/dcb/01-05/, PDF).

The idea is that the weirdly narrow corridor primes the subject to feel like their choices are being taken away. Reactance
kicks in by the time they reach the end of the corridor; they
express their independence the only way they can by, dammit,
picking three different pieces of candy no matter what the
corridor says!
We are so weird.
This, by the way, is the first example we’ll see of design (in
this case the design of the corridor) influencing behavior. It
will not be the last.
Reactance is so strong, in fact, that you can’t even tell yourself
what to do. There’s an experiment where one group of people
writes “I will” twenty times and another group writes “Will I?”
twenty times. The ones who write it as a question are better at
solving puzzles and display a greater intent to exercise than the
ones who are, effectively, telling their future selves what to do
(http://bkaprt.com/dcb/01-06/). Marketers, take note.

NOT SO HARMLESS
Some biases aren’t so cute. Some biases lead to decisions that
cause harm. We’re going to spend most of our time together
talking about those biases. Let’s start with one you’ve probably heard of.
Confirmation bias is pretty much what you think it is. You get
an idea in your head and you go looking for evidence to confirm
that it’s true. If any evidence comes up to challenge it, you cry
“Fake news!” and move on with your life.
One of the most trenchant examples of this came up during
the Iraq War. In the buildup to the conflict, the narrative the
administration established was that Saddam Hussein, then the
president of Iraq, was hiding weapons of mass destruction
(WMDs) that, if left unchecked, could someday be used against
us. It was a compelling argument.
As it turned out, WMDs? Not so much. As early as 2004, the
President of the United States, the one who insisted there were
weapons of mass destruction in the first place, acknowledged there were no weapons of mass destruction. Despite this, when
polled in 2015, 51 percent of Republicans (and 32 percent of
Democrats) still believed Iraq had WMDs (http://bkaprt.com/dcb/01-07/). It should be noted that among viewers of Fox News, an outlet that would confirm that belief, 52 percent also believed there were WMDs in Iraq.
Damn straight we’ll be coming back to this bias.

WHAT LIES BENEATH
Harmless or not, cognitive biases are especially difficult to
counter for a number of reasons.
First of all, you may not know you have them. There’s even a
bias called the bias blind spot, where you think you’re not biased
but you’re sure everybody else is.
Remember what we said about 95 percent of cognition
happening below the threshold of conscious thought? This is
another reason bias is hard to spot. We’re making decisions,
generally, on autopilot. So the next time someone asks you
why you did something, the most honest answer you can give
is “How the hell should I know?” (Keep in mind this is also true
when you ask your users why they do what they do.)
There’s even research to indicate your body makes decisions
before your consciousness does. In The User Illusion, Tor Nørretranders describes how your body reacts to stimuli before your
conscious mind even has a chance to register the reaction. But
your mind is extremely good at fooling you into thinking you’re
in control, so from your point of view, you saw something,
decided how to react, and then reacted. In reality, though, you
saw something, your body did a thing, your mind realized your
body did a thing, and then came up with a cover story that said
“Yeah, that was all you.”
We can even see this disconnect in our curious inability to
recognize our own voice. If someone listens to a recording of
their own voice mixed in with recordings of others’ voices,
they will be inconsistent in their ability to identify their own voice. However, if that same person is hooked up to a galvanic
skin-response machine, their body will react every time they
hear their own voice (http://bkaprt.com/dcb/01-08/). It’s as if
our bodies know the difference, but our minds are unwilling
or unable to acknowledge it.
Either way, there’s a lot going on under the hood that we
never get a peek at.

KNOWING IS, MAYBE, TWO PERCENT OF THE BATTLE
Finally, even if you are aware of the bias, you yield to it anyway,
for the most part.
A phenomenon called anchoring describes a scenario where
some of your decisions can be influenced by irrelevant but
easily retrievable factors. In practice, it looks like this: I can
ask a room full of people to write down the last two digits of their social security number, and then ask each of them to
bid on a bottle of wine. Two completely unrelated tasks, right?
The folks who wrote down a low number will bid lower for the
bottle of wine than the folks who wrote down a high number
(http://bkaprt.com/dcb/01-09/, PDF). Anchoring. It’s a thing.
Reminding people of the unrelatedness of the anchor doesn’t
help (http://bkaprt.com/dcb/01-10/). Effectively, if I tell that
roomful of people before the experiment begins exactly what’s
going on—that there’s this thing called “anchoring” and that
if they write down a high number they’ll bid high and if they
write down a low number they’ll bid low and to not do that—
they’ll still do it.
It gets worse.
Monetary incentives also fail to move the needle (http://bkaprt.com/dcb/01-11/). For all intents and purposes, if I tell
that roomful of people that I will pay them cash money not to
commit the bias, they will. Still. Do it.
Pray for us.

THE MYTH OF THE RATIONAL USER
Okay, so we have these things called biases and they sometimes
lead us to make errors that can be harmless or harmful but I’ve
got wireframes to deliver. Why should I care?
Because of choice architecture.
Choice architecture is a concept introduced in the book
Nudge by Richard Thaler (who would go on to win a Nobel Prize
for some of this work) and Cass Sunstein. The basic concept is
that how an environment is designed influences the decisions
people make in that environment.
A simple example is when you go to the store to buy produce. You may know the accepted wisdom that you never pick
from the top of the pile because the grocer will put the oldest
produce there, since that’s the produce they’re most trying to
get rid of. And why do they think that will work? Because most
people will reach for what’s right in front of them. It’s a bias
(or just laziness—it’s hard to tell sometimes). The interaction
has been architected to benefit the grocer. It could just as easily
have been architected to benefit the customer by putting the
freshest fruit on top.
As I mentioned before, we are in the business of helping
people make decisions. So, think about how your users make
decisions. And not the way a rational user makes decisions,
which it should be clear by now is how no user ever makes
decisions. But rather how a tired, busy user making most of
their decisions below the level of conscious thought makes
decisions. That’s why we need to understand bias. Because
that’s where our users live 95 percent of the time.
Luckily, we can make design and content strategy choices
that will help mitigate cognitive bias or, sometimes, even use
it for good. But first, we need to know what we’re up against.
Let’s start with biases that affect our users.

Read more when you buy the book!
