
What are Dark Patterns?

“Confirmshaming”, “Bait and Switch” and “Roach Motel” are colorful names for a group of confidence tricks in the best tradition of fraud. Embedded in user interfaces and collectively called Dark Patterns, they are ploys created by designers to confuse users and “make it difficult for users to express their actual preferences or manipulate users into taking certain actions” (Luguri and Strahilevitz 2021, p. 44). They are cheap pea-and-shell games run by hucksters, and they all share the same endgame: manipulation through deceit.

While the name Dark Patterns may suggest something related to the arts, they are no more than UI ploys. Aptly, Dark Patterns are a form of Sludge, a behavioural economics term for design that impedes users, prevents them from reaching their goals and, consequently, reduces their welfare (McKay 2020). Dark Patterns are digital Sludge. If you design using Dark Patterns, you are making Sludge. There is nothing artistic about it.

Deceptive practices in user interfaces are called Dark Patterns. But what is the term for the product managers and designers who choose to use them? According to Nodder, “Evil design is that which creates purposefully designed interfaces that make users emotionally involved in doing something that benefits the designer more than them” (Nodder 2013, p. 8). Maybe, but using Dark Patterns is certainly a choice, and the motivation to control choice is simple greed: greed at the expense of the user. What then is the best descriptive word? It is not fraudster, evil or unethical, although all are potential characteristics of designers who use Dark Patterns. A better word is dangerous. The use of Dark Patterns is dangerous to the consumer and to the brands these designers represent. Dark Patterns are a form of self-harm: a design boomerang that eventually comes back with unintended consequences, a digital backfire.


It is true that some UI design is just bad, and not a deliberate Dark Pattern at play. A website may have a poorly designed UI without any intent to rip off users. But given that designers and product managers are trained in optimal HCI, it is more likely that they decided to go purposefully down the dirty road of Dark Patterns.

Do Dark Patterns work?

Dark Patterns sadly do work to control user thinking. Preying on human psychology, Dark Patterns rely on the fact that we have a two-speed mind: we can make quick decisions by skimming over details, or we can think more slowly and take time to evaluate all the data deeply. This is Kahneman’s dual-process model of thinking, and it explains why Dark Patterns succeed.

The model describes thinking as belonging to either System 1 (fast/reactive) or System 2 (slow/deliberate) processes. System 1 is automatic, requiring little effort; it is an unconscious reaction, at play when we turn our head to a motorcycle backfiring. System 2 is deliberate, slow and deploys our full cognitive functions. It is the mode of thinking able to detect Dark Patterns in UI design.

But without two speeds of thinking we could not keep up with external stimulation; we cannot think long and deeply about every interaction we experience. Kahneman describes this as the law of least effort: “if there are several ways of achieving the same goal, people will eventually gravitate to the least demanding” (Kahneman 2011). We use what we know, instinct, to make decisions quickly. And that presents a vulnerability that Dark Patterns exploit.

Dark Patterns depend on our subconscious responses. Users fall prey to the designed manipulation because they skim over details and miss the trick. It is automatic; we cannot control this impulse to think fast. Why? Because the decisions we make impulsively are rooted in lessons learned from previous experiences, known as heuristics. And so Dark Patterns work as fraud “because a design that tricks users into doing something is likely to achieve more conversions than one that allows users to make an informed decision” (Snyder 2012). If a design is clear and does not rush the user past a decision, the user can engage System 2 thinking. And System 2 thinking is the enemy of Dark Patterns.

What Dark Patterns are not, and what they really are.

It is important not to confuse Dark Patterns with the simple business practice of wanting to make a profit. Satisfying a firm’s need for profit does not have to be, first and foremost, an exercise in dishonesty. Marketing seeks to sell the potential of ideas, goods and services to the consumer, using “processes for creating, communicating, delivering, and exchanging offerings that have value for customers, clients, partners, and society at large” (American Marketing Association 2021). The key word in this definition is value. Dark Patterns are not devices that build value; they are a short-term, fraudulent grab for sales and influence.

Dark Patterns are not anything as noble as a Nudge, the behavioural economics term for an action that shapes behaviour. Yes, Dark Patterns shape behaviour as well, but a Nudge does so without “forbidding any options or significantly changing their [the consumers’] economic incentives” (Thaler and Sunstein 2009). Nudges inform, simplify, remind, use social norms and disclose. Dark Patterns complicate, trap and prey on patterns of behaviour. They are designed with intent to leave consumers the poorer. Dark Patterns are not marketing tools, and they are not Nudges. Using them is not just another sales technique.

What Dark Patterns are, then, is outright lies. For example, TurboTax is a market leader in its product segment of tax services. It was required by the IRS to offer low-income taxpayers free software to produce a tax return. TurboTax hid its U.S. government-mandated free tax-file program on its website; in effect, the real free program’s landing page was de-indexed, its location obscured and hard to find, which is a classic Dark Pattern of hiding information. What was easy to find on the website was a ‘free’ version that ended up costing $200 (Bloomberg 2019).

This Dark Pattern example backfired badly. In December 2020, in an attempt to resolve the massive class-action suit that followed the detection of this fraudulent behavior, TurboTax agreed to pay $40 million in compensation as a class-action settlement. The settlement was subsequently rejected by the sitting judge as under-compensation (Bloomberg 2019), and the final figure is likely to be a very large amount of money. For a short-term profit, the net result to TurboTax is brand damage and a substantial penalty to come. If they had their time over, would the designers and product managers still use a Dark Pattern? Likely not.

Dark Patterns can also use subtler deception. In 2019, a Princeton study of 11,000 of the world’s most popular online shopping portals found that more than 15% used Dark Pattern practices that were “deceptive and information hiding in nature” (Mathur et al. 2019, p. 2). Deception is another way of saying fraudulent. Further, the study found that twenty of the reviewed shopping sites were using a random number generator to fake the number of users viewing a product (Mathur et al. 2019, p. 19). So what forms do Dark Patterns take?
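Before cataloguing the forms, it is worth seeing how little code that last trick requires. What follows is a minimal illustrative sketch in TypeScript, not taken from any of the studied sites; the element id, number range and refresh interval are all assumptions:

```typescript
// Illustrative sketch of a fake "live viewers" counter driven by a
// random number generator rather than real traffic data.
function startFakeViewerCount(elementId: string): void {
  const el = document.getElementById(elementId);
  if (!el) return;
  const update = () => {
    // 20-49 "viewers", invented on the spot.
    const fakeViewers = 20 + Math.floor(Math.random() * 30);
    el.textContent = `${fakeViewers} people are viewing this right now`;
  };
  update();
  setInterval(update, 10_000); // refresh every 10 seconds to look "live"
}

startFakeViewerCount("viewer-count"); // hypothetical element id
```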

What are the various types of Dark Patterns?

Perhaps because Dark Patterns are not clever instruments, they are easy to detect, which explains why there is wide consensus on their various forms. Dark Patterns began to be catalogued in 2010 by Brignull. He observed that a brand that wants to manipulate can do so by “making a page look like it’s saying one thing” when in reality it has a totally different intent: a sinister intent that will adversely impact the user (Brignull 2010).

Some Dark Patterns have imaginative titles, while others are more explanatory. But all share the single intention of manipulation by leveraging human System 1 thinking.

Consider the earliest identified Dark Pattern, the Roach Motel. The idea is as nasty as it is malicious. The recipe calls for a design that is a maze of selections, each one leading to more choices, and then more, so the user cannot easily reach their goal. Addressing the user’s goal is not the intention of the Roach Motel; the ploy is to keep them from reaching it and to tire them out, so that the user gives up. For example, opting out of a subscription should be simple; making the process complex, however, helps the vendor keep the user subscribed. This Dark Pattern variant uses obstruction to block the user, acting to impede “a task flow, making an interaction more difficult” (Gray et al. 2018, p. 5).

While there are several classification models of Dark Patterns, they all identify the same variants. As noted, if you take the time to look, Dark Patterns are obvious and blunt instruments. One model stands out for having simplified the classification to five types to which all Dark Patterns belong (Gray et al. 2018, p. 5). In this paper, the five types are further edited down to four. The four types of Dark Pattern we will now explore are Obstruction, Sneaking, Interface Interference and Forced Action (see Table 1). Table 1 captures a comprehensive collection of variants and classes them by type; after the table, each variant is explored in detail.


Table 1
Types of Dark Pattern variants (edited from Gray et al. 2018)

Obstruction: making a process more difficult than it needs to be. Variants:
• Roach Motel
• Price Comparison Prevention
• Intermediate Currency
• Nagging

Sneaking: attempting to hide, disguise, or delay the divulging of information that is relevant to the user. Variants:
• Forced Continuity
• Hidden Costs
• Sneak into Basket
• Bait and Switch

Interface Interference: manipulation of the user interface that privileges certain actions over others. Variants:
• Hidden Information
• Preselection
• Aesthetic Manipulation
• Toying with Emotion
• False Hierarchy
• Disguised Ad
• Trick Questions

Forced Action: requiring the user to perform a certain action to access (or continue to access) certain functionality. Variants:
• Social Pyramid
• Privacy Zuckering
• Gamification
• Scarcity
Types of Dark Patterns based on Table 1.


1 Obstruction
As the name implies, Obstruction uses deliberate UI design to block, stall and force users away from executing their choices. It runs counter to best-practice HCI and is a denial of user desires. There are four variants in this class.

1.1 Nagging/Obstruction
Nagging is a form of torture by distraction: design that uses pop-ups to hide parts of the interface, or audio to distract. Anything that blocks the user’s selections and appears as a repeated intrusion is Nagging. It is annoying by design, and the repeated obstruction works to interrupt thinking. A user wants to change their terms? Launch a barrage of pop-ups at them and keep the user busy. That is Nagging.

1.2 Roach Motel/Obstruction


Like a mouse in a maze, the user tries to navigate a seedy navigation ploy that always leads further away from their goal.

1.3 Price Comparison Prevention/Obstruction


The idea is to make prices and specifications on a website resistant to Ctrl-C/Ctrl-V;

the aim is to stop price shopping, which of course is a consumer right.
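As a hedged illustration of how cheaply this can be done (the .price selector is an assumption, not any particular site’s markup), a few lines of TypeScript can suppress both text selection and the copy event on price elements:

```typescript
// Illustrative sketch of Price Comparison Prevention: price text is made
// resistant to Ctrl-C by blocking selection and the copy event.
document.querySelectorAll<HTMLElement>(".price").forEach((priceEl) => {
  priceEl.style.userSelect = "none"; // text cannot be highlighted
  priceEl.addEventListener("copy", (e) => e.preventDefault()); // Ctrl-C blocked
});
```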

1.4 Intermediate Currency/Obstruction


Take real money and allow users to flip it into a virtual currency, with the aim of creating a disconnect between spending real and virtual money. Your inner System 1 takes over: you stop seeing $100 and start seeing 10,000 buy points or similar. The user then spends the 10,000 points without connecting them to the seed $100.
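The arithmetic behind the disconnect is trivial, as this illustrative sketch shows (the conversion rate, wallet and prices are assumptions):

```typescript
// Illustrative sketch of Intermediate Currency: dollars become points,
// and the UI only ever shows points.
const POINTS_PER_DOLLAR = 100; // assumed rate: $100 buys 10,000 points

function dollarsToPoints(dollars: number): number {
  return dollars * POINTS_PER_DOLLAR;
}

const walletPoints = dollarsToPoints(100); // the seed $100
const itemPointPrice = 2_500;              // really $25, but never shown as such
console.log(`Balance: ${walletPoints} points. Item: ${itemPointPrice} points.`);
```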

2 Sneaking

Sneaking is “an attempt to hide, disguise or delay” (Gray et al. 2018) the showing of valuable and insightful information to the user. Four variants fit under Sneaking.

2.1 Hidden Costs/Sneaking
Hidden Costs are undisclosed charges that appear just before check-out. We focus on checking out to place the order (System 1 thinking), not on the cost breakdown. A common method is adding expensive shipping and handling charges just before the user selects “order”; until this point, the price of the item has been as advertised. The designer’s thinking is that the user will not want to abandon the cart after spending a considerable amount of time selecting.
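A minimal sketch of the mechanic, with assumed figures and field names, might look like this:

```typescript
// Illustrative sketch of Hidden Costs: a fee exists throughout, but the
// displayed total only includes it at the final checkout step.
interface Cart {
  subtotal: number;
}

function displayedTotal(cart: Cart, step: "browse" | "checkout"): number {
  const shippingAndHandling = 14.95; // undisclosed until the last step
  return step === "checkout" ? cart.subtotal + shippingAndHandling : cart.subtotal;
}

console.log(displayedTotal({ subtotal: 50 }, "browse"));   // 50, as advertised
console.log(displayedTotal({ subtotal: 50 }, "checkout")); // 64.95, the surprise
```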

2.2 Sneak Into Basket/Sneaking


Another shopping cart trick is to load unwanted items into a user’s cart and hope they do not notice. As the user moves along the path to a purchase, pre-selected opt-ins missed in the shopping dialogue quietly add items to the cart. If the user does notice an unwanted item, they are often forced to exit the checkout and start over; the UI design punishes the user for the discovery.

2.3 Bait and Switch/Sneaking


A user is ‘baited’ with an attractive offer: a great price on a good product. If they take the bait, a different product is substituted at check-out as the ‘switch’. For example, a low-priced, good-quality suitcase is advertised on a shopping site. When the user goes to buy the case, however, it is “out of stock” and an expensive, possibly poorer-quality case is offered instead. In reality, the great-deal case was never in stock. If the user’s System 2 thinking fails to kick in, they will buy the more expensive alternative. This is Bait and Switch.

2.4 Forced Continuity/Sneaking


This ploy leverages the expiration dates of subscriptions, such as discounted trials and free-for-a-month offers. Forced Continuity depends on the user forgetting to cancel the subscription before the trial expires, and the vendor using this Dark Pattern exploits that by not sending reminders. On expiration of the trial, the seller simply switches the user over to a full-fee monthly account.

3 Interface Interference

Interface Interference is a common type of Dark Pattern. These are deceptions that rely on pre-selected fields and hidden facts: facts that, if known, would stop the user from taking an action that is not in their interest. It is polite to call them half-truths; they are malicious by design.

3.1 Hidden Information/Interface Interference


It can be as simple as placing the dropdown for returning an item in an obscure location. Alternatively, terms and conditions can be hidden by using a tiny font pitch or by choosing a color that makes them blend into the background.
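For illustration only (the wording, font size and colors are assumptions), the blending trick takes little more than two style rules:

```typescript
// Illustrative sketch of Hidden Information: terms rendered in a tiny
// font and a color that nearly matches the page background.
const terms = document.createElement("p");
terms.textContent = "By continuing you agree to our terms and conditions.";
terms.style.fontSize = "7px";  // barely legible pitch
terms.style.color = "#f4f4f4"; // almost invisible on a white page
document.body.appendChild(terms);
```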

3.2 Pre-selection/Interface Interference


Logic and fair play would suggest that selections in a UI default to “off” or “unselected”. The Pre-selection scam instead loads webpages with the offers and other inclusions the vendor wants already switched on. A pre-selected button to opt into receiving mail on “exciting new products” is a form of the Pre-selection Dark Pattern.
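A minimal sketch of the pattern, shown here with a checkbox (the id and label text are assumptions):

```typescript
// Illustrative sketch of Pre-selection: the marketing opt-in arrives
// already checked, so user inaction becomes "consent".
const optIn = document.createElement("input");
optIn.type = "checkbox";
optIn.id = "marketing-opt-in";
optIn.checked = true; // fair play would default this to false

const label = document.createElement("label");
label.htmlFor = optIn.id;
label.textContent = "Email me about exciting new products";

document.body.append(optIn, label);
```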

3.3 Disguised Ads/Interface Interference


Examples include advertisements dressed up as benign interactions: what appears to be something the user wants leads to an unwanted sales opportunity. Google places ads that look like organic returns from a query at the top of its search results pages; what appears to be wanted search guidance is, in fact, advertising.

3.4 Aesthetic Manipulation/Interface Interference

A common variant is controlling user attention and direction through design aesthetics. For example, greyed-out options for selecting subscription terms are paired with vibrant, colored options. The greyed-out options are the ones the vendor does not want selected; being greyed out makes them appear as if they cannot be selected when in fact they can. The bright, popped colors are reserved for the options the UI designer wants selected.
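An illustrative sketch of the styling trick (plan names and colors are assumptions); note that the grey option is never actually disabled:

```typescript
// Illustrative sketch of Aesthetic Manipulation: the disfavoured option is
// styled to look disabled while remaining fully clickable.
function styleOption(button: HTMLButtonElement, favoured: boolean): void {
  if (favoured) {
    button.style.background = "#ff5722"; // vibrant, popped color
    button.style.color = "#ffffff";
  } else {
    button.style.background = "#e0e0e0"; // looks greyed out...
    button.style.color = "#9e9e9e";
    // ...but button.disabled is never set, so it still works if clicked.
  }
}
```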

3.5 False Hierarchy/Interface Interference

Being lean or confusing with detail and burying an option among a variety of others defines False Hierarchy. Certain options appear to have interactive precedence over other selections, instead of presenting a flat or parallel decision surface. There appears to be only “one option or the best option” (darkpatterns.uxp2.com 2021).

3.6 Trick Questions/Interface Interference

Using double negatives to make opting in seem like opting out: that is the dark art of Trick Questions. For example, where a website wants users to opt in to use a feature, it could ask “Do you wish to not exclude this feature?” with a “yes” and “no” radio button pair on the webpage. The question is a double negative, and a “yes” answer means opting in rather than opting out. Instead of a simple ‘yes’ to activate and ‘no’ to turn the feature off, the question is posed to be duplicitous. Trick Questions are “commonly seen when registering with a service, where check boxes are shown, but their meaning is alternated, so the first choice means opt out and the second choice means opt in” (itispivotal 2021).
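A sketch of the double-negative form described above (the markup and naming are assumptions):

```typescript
// Illustrative sketch of a Trick Question: "yes" means opting IN
// (not excluding), which a System 1 skim will likely misread.
const question = document.createElement("fieldset");
question.innerHTML = `
  <legend>Do you wish to not exclude this feature?</legend>
  <label><input type="radio" name="feature" value="yes"> Yes</label>
  <label><input type="radio" name="feature" value="no"> No</label>
`;
document.body.appendChild(question);
```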

3.7 Toying with Emotion/Interface Interference

Sometimes called “Confirmshaming”, this is “the act of guilting the user into opting into something” (Mathur et al. 2021, p. 6). It is a trick to make a user rise to a challenge, and it often poses a question about the user’s intelligence: to opt out, it is implied, you must be stupid. Consider a pop-up, shown while browsing a shopping site, that offers a discount. One option reads “Accept 20% off your cart for life”; the other reads “I prefer to pay full price”. Clearly, for a rational consumer, paying full price is not a sensible option. The appeal made by the Dark Pattern is to the user’s intelligence; the user, not being stupid, selects the 20% discount. Then the trick is revealed: on selecting the discount, the user is asked to subscribe for a monthly fee to get it. The offer has strings attached. Further, the discount is useless without regular purchases. The user loses twice.

4 Forced Action
The last of the four types is Forced Action. These variants are a one-way street: the user is “required to perform a specific action” (forced) in order to use, or keep using, some other desired function (Gray et al. 2018, p. 8). It is a veiled threat of “do this or don’t get that”.

4.1 Social Pyramid/Forced Action


This variant typically requires users to allow the spamming of their social media contacts in order to recruit them to a platform; if they refuse, a desirable incentive is withheld. Called Social Pyramid or Friend Spamming, the practice famously cost LinkedIn a USD 13 million settlement in 2015. LinkedIn had been sending automated email requests to new users’ contacts asking them to join, and made the emails appear to come from the new user’s own email address to build trust with the recipient (Brownlee 2015).

4.2 Privacy Zuckering/Forced Action


The variant Privacy Zuckering is fabulously named and paints a vivid picture. It takes its name from the founder of Facebook, a company that made massive use of this variant and based business practices on it. The trick is to get users to reveal more private information than they ever intended, as the price of an attractive service on offer. For example, to use the Whatsapp message service in 2017, a user needed to agree to a set of terms and conditions, which is normal practice. However, in a piece of Privacy Zuckering, Facebook used a two-page sign-up agreement. The UI was designed so that the user could agree to the terms and conditions on page one and likely never see the additional terms on page two. It is on page two that “you realise that by agreeing to the Terms and Privacy policy the user also by default allows Facebook to use Whatsapp account data for advertising” (Mohit 2017). The UI was designed such that without opting in, you could not use Whatsapp at all; the only alternative was to opt in and share your private information with Facebook.

4.3 Scarcity/Forced Action


Content such as countdown timers to the end of a sale and low-stock warnings are all attempts to force action from the user. They are fear-mongering scarcity and pressure-sales tactics, appealing to people’s FOMO (Fear of Missing Out) anxiety.
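As an illustration (the element id and duration are assumptions), a pressure timer that quietly restarts when it reaches zero, so the “sale” never actually ends, could be as simple as:

```typescript
// Illustrative sketch of a Scarcity countdown that resets instead of ending.
function startFakeCountdown(elementId: string, minutes = 10): void {
  const el = document.getElementById(elementId);
  if (!el) return;
  let remaining = minutes * 60; // seconds left on the "deadline"
  setInterval(() => {
    remaining = remaining > 0 ? remaining - 1 : minutes * 60; // restart at zero
    const m = Math.floor(remaining / 60);
    const s = String(remaining % 60).padStart(2, "0");
    el.textContent = `Sale ends in ${m}:${s}!`;
  }, 1000);
}

startFakeCountdown("sale-timer"); // hypothetical element id
```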

4.4 Gamification/Forced Action


The last variant is the most surprising. What could Dark Patterns have to do with games? Some online games are cleverly designed to be free, but with features you are likely to buy because they make the game experience better. In these games, the player without the paid features often loses to paid-up users who have greater in-game powers. Users have two choices: build ability slowly and “grind” away to get stronger (suffering losses to more powerful opponents as they go), or short-circuit the process by buying ability. Gamification normally means using game-design elements in other genres; as a Dark Pattern, it is extortion.

Summary of the Dark Pattern Types

Obstruction, Sneaking, Interface Interference and Forced Action are four types of Dark Pattern, each with a range of variants. All the variants are designed to leave the user with a less-than-desirable outcome. Designed is the operative word: these Dark Patterns are planned and built to intentionally deceive users, directly harming users and directly benefiting the online vendors who deploy them. It is deliberate, planned deceit (Mathur et al. 2021, p. 4).

The impact of using Dark Patterns – so why is dark bad?

Dark Patterns are deceitful and unethical because they rely on tricking users. If you are religious, Dark Patterns may well be considered immoral. And it looks increasingly likely that the practice will become illegal in various major markets where consumer law is being enforced.

There is a conscious decision to use Dark Patterns because they work in the short term. In a quasi-experiment with 1,963 users (Luguri and Strahilevitz 2021), participants were exposed to an online survey on the subject of data privacy. Some encountered a survey with no Dark Patterns; others completed a survey with some Dark Patterns in play; a third group experienced a survey loaded with Dark Patterns designed to trick, confuse or wear them down. On completion of the survey, a data privacy software subscription was offered. The results confirmed the impact of Dark Patterns.

Users who completed the less aggressive Dark Pattern survey were found to be twice as likely to purchase the software subscription as the control group (the group not exposed to Dark Patterns). Users who took the survey with many Dark Patterns were found to be four times more likely to be manipulated into buying than the control group.

Further, the research found a negative correlation between participants’ education level and susceptibility to Dark Patterns: participants with lower educational opportunity were more likely to fall prey to Dark Patterns.


Considering the link between low education and lower socio-economic outcomes, the people who can least afford to waste money are the most likely to lose out to Dark Patterns.

Dark Patterns can deter people from making decisions in their own best interests, and they can adversely impact the economic and mental-health outcomes of the users they catch out.

What then of the brands that seek to benefit from bare-faced manipulation?

Backfire – the consequences for designers and brands.

If ethics does not compel, and legal ramifications do not compel, then self-interest alone should be sufficient for smart designers and brands to move away from Dark Patterns at 100 miles an hour. It is true that on finding a website using Dark Patterns, some users will merely “swear angrily under their breaths” (Brignull 2013). But beyond that, the consequences of Dark Patterns may well ‘backfire’ on the brand.

In the short term, using Dark Patterns on websites and apps leads to increased cart abandonment and user outrage that makes for negative online reviews. In the eyes of the brand, this may seem offset by the Dark Patterns’ quick fixes: upticks in sales volume and retention. This is flawed thinking. Any calculation that fails to account for longer-term brand damage is faulty, and the bigger the brand, the bigger the damage potential.

Consider the intention-outcome matrix (Stibe and Cugelman 2016), which holds that there are four ways to achieve behavioural change. The matrix is shown in Figure 1 (Intention-Outcome matrix).

The Target Behaviour method acts to create positive outcomes such as building “user trust and sales” (Stibe and Cugelman 2016). There is planning in the delivery of targeted behavioural change based on human psychology, but there is also benefit to the user. To the left of Target Behaviour sit Dark Patterns.

Dark Patterns also involve planning and an understanding of human psychology. The difference is that Dark Patterns have unwitting and/or unwilling participants. Target Behaviour, one of the four methods of achieving behavioural change, changes thinking through clarity; to use Dark Patterns is to be as far from clarity as can be imagined.

Note the third type of behavioural change, Unexpected Benefits: in effect, unplanned but welcome outcomes for brands. A company may improve the security of data privacy on its website and, as a consequence, enjoy improved trust with consumers. That knock-on effect is an Unexpected Benefit.


Backfires are the fourth type of behaviour change. They are negative and unintended, and it is here that Dark Patterns drive brand disaster. For every action there is a reaction. Where Target Behaviour works to deliver positive outcomes, there is little likelihood of it ‘blowing up’ into a Backfire. Dark Patterns are not assistive; they work not with users but against them. Dark Patterns have far more potential to cause a Backfire because the intention is to trick consumers, leading to frustrated customers who either walk away or trash the brand. That is the classic Dark Pattern Backfire.

Backfires can be minor, or they can deeply damage brands, eroding their position in the market through changed user perception. They are an unintended consequence of an intended action, and far more likely to follow user manipulation (Dark Patterns) than good intentions (Target Behaviours). The types of Backfire range from arousing defiance through to a brand discrediting itself (Stibe and Cugelman 2016). The seed of a Backfire can be a Dark Pattern.

Consider what happened when Facebook’s use of its user base’s private information was exposed. Connected to a philosophy encapsulated by the term Privacy Zuckering, when consumers and governments discovered that user data was being used to manipulate opinion for profit (the Cambridge Analytica data scandal), the financial penalties ran to billions and the brand was tarnished. Yet despite the severity of those penalties, the lesson appears not to have been learnt. As noted earlier, in 2017 new Whatsapp users faced a Forced Action Dark Pattern that ensured they allowed their private data to be used by Facebook. In 2021 the issue is ongoing, and new Whatsapp users are still required to share their private data with Facebook (Naresh 2021).

Dark Patterns are not the exclusive domain of Western/English digital applications; they can be found cross-culturally. As humans, we all make snap decisions based on fallible System 1 thinking, which Dark Patterns can manipulate; and that manipulation, once discovered, can backfire on brands long term.

After LinkedIn was found to be running the Social Pyramid scam, one user noted: “If they had used the contents of my address book just to suggest contacts on LinkedIn, I might not have minded” (Schlosser 2015).

Maybe the lesson to be learned is that users respond to openness, and that brands have more to gain by ditching Dark Patterns than by using them.
