
6/6/2018 Should Privacy Law Regulate Technological Design?

An Interview with Woodrow Hartzog | Daniel Solove | Pulse | LinkedIn

Should Privacy Law Regulate

Technological Design? An Interview with
Woodrow Hartzog
Published on April 11, 2018

Daniel Solove

Professor, GW Law School + CEO, TeachPrivacy + Organizer, P…

Hot off the press is Professor Woodrow Hartzog's new book, Privacy's Blueprint: The
Battle to Control the Design of New Technologies (Harvard Univ. Press 2018). This
is a fascinating and engaging book about a very important and controversial topic:
Should privacy law regulate technological design?

Many have argued that the law should avoid regulating technological design, as this
impedes freedom to develop new technology and is a heavy-handed way to regulate.
The law should focus on addressing harms arising after technology is developed and
shouldn't meddle early on in the development stage. In contrast, Hartzog makes a
compelling case for why the law must become involved with design. These days, as
concerns about Facebook and Cambridge Analytica boil over, a discussion about the
law's fundamental approach to privacy regulation could not be more timely. Hartzog's
book is just the book we need right now! It's a superb book, the kind that changes the
way you see and think about things. Privacy's Blueprint is one of the most important
books about privacy and technology, and it's a definite must-read.

Hartzog is a professor of law and computer science at Northeastern University School
of Law and the College of Computer and Information Science and an affiliate scholar at
the Center for Internet and Society at Stanford Law School. He has written extensively
on privacy, media, and robotics.

Below is a short interview with Professor Hartzog. We just scratch the surface of the
many great issues he tackles in his book. Privacy's Blueprint is essential reading, and I
urge you to put it at the top of your reading list.

SOLOVE: Can you provide some examples of how design affects privacy?

HARTZOG: Design affects nearly every aspect of our privacy these days, but three
areas stand out. The first is the design of the user interfaces of apps and websites,
particularly social media. Every aspect of the user experience is designed to extract data
out of you, to get you to never stop sharing, and to have you feel good about it in the
process. Companies say they respect your privacy by asking for your permission before
collecting certain kinds of data. But what’s not made clear to you is that they are going
to engineer an environment that all but assures most of us say yes. Companies use
relentless pop-ups, nudges to change your current privacy settings, and buttons buried
so deeply and presented in such a confusing way that your exposure is almost preordained.

This sort of adversarial interface is part of what enabled the incident involving
Facebook and Cambridge Analytica. Facebook said that the third party app that
collected the information on over 50 million Facebook users “requested and gained
access to information from users who chose to sign up to his app, and everyone
involved gave their consent.” But for the friends of those users, people who didn’t even
interact with the third party app in question, that consent was buried where few people
would be likely to find it, and the nature of the permission was so vague that it was
hardly meaningful.

The next area where design matters for our privacy is in technologies that are built to
make things significantly easier to find—things like biometrics and search functions.
Each of these kinds of seeking technologies erodes the obscurity that people rely upon

every day. Most people do not assume that when they are walking about in public,
their every move is being sensed, critiqued, and stored to be used against them later.
They are obscure and they make decisions about what to share and where to go based
on that. Design changes that, often dramatically and without people knowing it. Facial
recognition can destroy the obscurity of people’s whereabouts instantly. Making things
searchable can expose to a mass audience what was previously likely to be found only
by people who knew where to look and what to look for. Design decisions like whether
to use biometrics or make things searchable should matter more in law and policy.

Finally, design matters regarding the processors, sensors, and software that are getting
jammed into every device that isn’t nailed down. (And many that are!). Companies have
been remarkably cavalier in their approach to taking an object and outfitting it with
surveillance tools and connecting it to the Internet. Consumers can now choose to
connect wine openers, basketballs, toilets, and even their underwear (!) to the Internet.
Sensors in the home increase both the perception of surveillance and the risk of
unwanted and unknown surveillance. Outdated, unpatched, and uncared for software
embedded in ignored and reckless IoT devices can be the Achilles heel of an entire,
otherwise relatively secure network. It is nearly impossible for companies to give users
meaningful notice of their data collection and use practices.

SOLOVE: To what extent is design currently regulated?

HARTZOG: There aren’t many meaningful design interventions or boundaries in
privacy law. Essentially, the privacy rules that govern companies around the world are
built around three major ethics: 1) Do not lie; 2) Do not harm; and 3) Follow the Fair
Information Practices, or “the FIPs.” The rules that are built around these three ethics
fail to adequately address how the design of information technologies affects people and
can frustrate the goals of privacy law.

For example, companies too often are allowed to bury the truth in dense and unreadable
boilerplate contracts that nobody should be expected to read or understand. While they
are technically being truthful, users whose expectations are shaped by the design of
their environment often form misconceptions about how sites work and what companies
are collecting and plan to do with their personal information.

Additionally, modern privacy harms that flow from design are often diffuse and
incremental, yet courts and lawmakers usually demand concrete, financial, and visceral
harm before they will recognize a legal violation or let people recover damages.
Companies are allowed to aggregate information, make it searchable, and design their
systems to expose people to all kinds of risk without much fear of ever being held
accountable for it.

Finally, the Fair Information Practices functionally operate as the common language of
privacy around the world. They form the basis of most data protection regimes
worldwide, including Europe’s General Data Protection Regulation and many data
protection statutes in the US. But they don’t really address the design of information
technologies. The FIPs were built around the threat of databases, and they do a
relatively decent job addressing the risk of aggregated data. But privacy threats these
days come from more than databases. Automated decision-making systems, biometrics,
artificial intelligence, and manipulative user interfaces are all woven into the design of
information technologies. The FIPs are built around the threat from platforms like
Facebook or organizational actors like the government. But in the age of Facebook, Snapchat, and
Twitter, individuals also harass, betray, and expose each other. And design facilitates all
of this. Lawmakers have not created many legal boundaries to guide how these tools are designed and used.

SOLOVE: How ought design be regulated without being too heavy-handed?

HARTZOG: It’s not easy. But when people think about design rules and interventions,
they often think the only option is to heavily regulate technology. But taking design
seriously doesn’t mean pulling the plug on all digital technologies. Nor does it mean
passing micro-managing, ham-fisted rules that risk burdening companies for minimal
privacy gains. It means embracing the full range of legal responses from soft, to
moderate, to robust. It also means a general preference for flexible standards of
reasonableness, turning to specific, technology-based rules only when necessary. And
above all, it is about matching the right responses to the right problems.

Sometimes, funding, educational efforts, or standards coordination is what’s needed.

Sometimes courts need to simply be more aware of the role that design plays in shaping
people’s expectations about how a technology works and what has been promised to
them. Changes in the common law can better recognize design-based promises and
inducements to breach confidentiality. Of course, certain problems deserve robust
responses. Spyware is a scourge. The Internet of Things is a runaway train headed for
catastrophe. More significant regulation might be needed for these technologies, but a
broad spectrum of regulatory and policy options will help make sure legal responses
across the board are proportional.

SOLOVE: Critics of regulation express grave concerns that regulators lack a
sufficient understanding of technology and might inhibit innovative new
technologies. Critics say that it is dangerous to have regulators tell engineers how
to build things. How would you respond to such critics?

HARTZOG: I totally understand the intuitive appeal of these arguments. We certainly
have plenty of examples where lawmakers mucked things up because they acted in
short-sighted ways. In the past, lawmakers might not have understood how certain
technologies work or all the different considerations that needed to be taken into
account. We’re still trying to figure out how to fit modern surveillance problems into
regulatory regimes built around business models for computing in the 1980s. And
sometimes our privacy is best served when lawmakers avoid interfering with certain

kinds of technological design. The battle over encryption backdoors is a good example
of this.

But I think people’s concerns about lawmakers guiding the design of information
technologies are often unjustified. First, ignoring design is a matter of ethics. To ignore
design is to leave power unchecked. When courts and lawmakers ignore design, they
functionally sanction the way that companies leverage design to affect people’s lives.

Second, lawmakers and courts have imposed boundaries upon design in many different
contexts for quite a long time. Cars and planes must be built safely. This is why we have
seatbelts, airbags, and gas tanks that won’t explode upon contact. Buildings must be
built in structurally sound ways. There are even certain rules about how guns must be
designed. When confronted with technological complexity, there are things courts,
lawmakers, and regulators can do to mitigate uninformed, ineffective, and unjust rules.
They should hire more technologists, ethicists, psychologists, sociologists, economists,
and experts from every discipline to confront the ways in which design decisions
distribute power and affect our relationships, our economy, and our personal wellbeing.
They should make sure to articulate all of the different values at stake and use all their
different tools to serve those values.

The Internet is exceptional in many ways, but it is still of this Earth. Digital
technologies are still just tools created by people. And like other tools, companies can
use them to deceive, abuse, manipulate, and harm us. Until lawmakers take that power
seriously, our privacy rules will remain incomplete.

SOLOVE: Thanks, Woodrow, for your terrific insights. The book is Privacy’s
Blueprint, and it is available at Harvard University Press or Amazon.

Related Posts

Daniel J. Solove, Privacy by Design: 4 Key Points

This post was authored by Professor Daniel J. Solove, who
through TeachPrivacy develops computer-based GDPR training and
privacy awareness training.

Professor Solove's GDPR Training

NEWSLETTER: Subscribe to Professor Solove’s free newsletter

TWITTER: Follow Professor Solove on Twitter.