
Conversations and Controversies

Organization Theory
Volume 3: 1–79
© The Author(s) 2022
Article reuse guidelines: sagepub.com/journals-permissions
DOI: 10.1177/26317877221129290
journals.sagepub.com/home/ott

Surveillance Capitalism or Democracy? The Death Match of Institutional Orders and the Politics of Knowledge in Our Information Civilization

Shoshana Zuboff

Abstract
Surveillance capitalism is what happened when US democracy stood down. Two decades
later, it fails any reasonable test of responsible global stewardship of digital information and
communications. The abdication of the world’s information spaces to surveillance capitalism has
become the meta-crisis of every republic because it obstructs solutions to all other crises. The
surveillance capitalist giants–Google, Apple, Facebook, Amazon, Microsoft, and their ecosystems–
now constitute a sweeping political-economic institutional order that exerts oligopolistic control
over most digital information and communication spaces, systems, and processes.
The commodification of human behavior operationalized in the secret massive-scale
extraction of human-generated data is the foundation of surveillance capitalism’s two-decade
arc of institutional development. However, when revenue derives from commodification of
the human, the classic economic equation is scrambled. Imperative economic operations entail
accretions of governance functions and impose substantial social harms. Concentration of
economic power produces collateral concentrations of governance and social powers. Oligopoly
in the economic realm shades into oligarchy in the societal realm. Society’s ability to respond
to these developments is thwarted by category errors. Governance incursions and social harms
such as control over AI or rampant disinformation are too frequently seen as distinct crises and
siloed, each with its own specialists and prescriptions, rather than understood as organic effects
of causal economic operations.

Harvard Business School, Boston, MA, USA

Corresponding author:
Shoshana Zuboff, Harvard Business School, Soldiers Field, Boston, MA 02163, USA
Email: info@shoshanazuboff.com

Creative Commons Non Commercial CC BY-NC: This article is distributed under the terms of the Creative
Commons Attribution-NonCommercial 4.0 License (https://creativecommons.org/licenses/by-nc/4.0/) which
permits non-commercial use, reproduction and distribution of the work without further permission provided the original work
is attributed as specified on the SAGE and Open Access pages (https://us.sagepub.com/en-us/nam/open-access-at-sage).
In contrast, this paper explores surveillance capitalism as a unified field of institutional
development. Its four already visible stages of development are examined through a two-decade
lens on expanding economic operations and their societal effects, including extraction and the
wholesale destruction of privacy, the consequences of blindness-by-design in human-to-human
communications, the rise of AI dominance and epistemic inequality, novel achievements in
remote behavioral actuation such as the Trump 2016 campaign, and Apple-Google’s leverage of
digital infrastructure control to subjugate democratic governments desperate to fight a pandemic.
Structurally, each stage creates the conditions and constructs the scaffolding for the next, and
each builds on what went before. Substantively, each stage is characterized by three vectors
of accomplishment: novel economic operations, governance carve-outs, and fresh social harms.
These three dimensions weave together across time in a unified architecture of institutional
development. Later-stage harms are revealed as effects of the foundational-stage economic
operations required for commodification of the human.
Surveillance capitalism’s development is understood in the context of a larger contest with
the democratic order––the only competing institutional order that poses an existential threat.
The democratic order retains the legitimate authority to contradict, interrupt, and abolish
surveillance capitalism’s foundational operations. Its unique advantages include the ability to
inspire action and the necessary power to make, impose, and enforce the rule of law. While
the liberal democracies have begun to engage with the challenges of regulating today’s privately
owned information spaces, I argue that regulation of institutionalized processes that are
innately catastrophic for democratic societies cannot produce desired outcomes. The unified
field perspective suggests that effective democratic contradiction aimed at eliminating later-
stage harms, such as “disinformation,” depends upon the abolition and reinvention of the early-
stage economic operations that operationalize the commodification of the human, the source
from which such harms originate.
The clash of institutional orders is a death match over the politics of knowledge in the
digital century. Surveillance capitalism’s antidemocratic economic imperatives produce
a zero-sum dynamic in which the deepening order of surveillance capitalism propagates
democratic disorder and deinstitutionalization. Without new public institutions, charters
of rights, and legal frameworks purpose-built for a democratic digital century, citizens
march naked, easy prey for all who steal and hunt with human data. Only one of these
contesting orders will emerge with the authority and power to rule, while the other will drift
into deinstitutionalization, its functions absorbed by the victor. Will these contradictions
ultimately defeat surveillance capitalism, or will democracy suffer the greater injury? It is
possible to have surveillance capitalism, and it is possible to have a democracy. It is not
possible to have both.

Keywords
democracy, digital economy, digitalization, disinformation, information, institutional theory,
internet governance, privacy, social media, surveillance capitalism
Our Accidental Dystopia: Surveillance Capitalism as Institutional Order

In an information civilization individual and collective existence is rendered as and mediated by information. But what may be known? Who knows? Who decides who knows? Who decides who decides who knows? These are the four questions that describe the politics of knowledge in the digital age. What knowledge is produced? How is that knowledge distributed? What authority governs that distribution? What power sustains that authority? The contests over the answers to these questions will shape “the division of learning in society” as the fundamental construct of social order in an information civilization. Right now, however, it is the surveillance capitalist giants–Google/Alphabet, Facebook/Meta, Apple, Microsoft, Amazon–that control the answers to each of these questions, though they were never elected to govern.

This condition reflects a larger pattern. From the dawn of the public internet and the world wide web in the mid-1990s, the liberal democracies failed to construct a coherent political vision of a digital century that advances democratic values, principles, and government. This failure left a void where democracy should be, a void that was quickly filled and tenaciously defended by surveillance capitalism. A handful of companies evolved from tiny startups into trillion-dollar vertically integrated global surveillance empires thriving on an economic construct so novel and improbable as to have escaped critical analysis for many years: the commodification of human behavior. These corporations and their ecosystems now constitute a sweeping political-economic institutional order that migrates across sectors and economies. The institutional order of surveillance capitalism is an information oligopoly upon which democratic and illiberal governments alike depend for population-scale extraction of human-generated data, computation and prediction (Cate & Dempsey, 2017).

The consequences of this democratic failure are amplified in the global context. Since at least 2010, the Chinese state evolved a highly intentional theory and practice of digital design and deployment that advances its domestic systems of authoritarian rule and exports them to dozens of countries in nearly every region (Hoffman, 2022; Menendez, 2020; Mozur et al., 2019; Murgia & Gross, 2020; Sherman & Morgus, 2018). In contrast, the United States and other Western democracies have been compromised and ambivalent, torn between the digital seductions of surveillance-enabled social control and the rights-based principles of liberal democracy.

The political failure of the void forfeited the critical first decades of the digital century to surveillance capitalism. It deprived an increasingly connected world community of a clear alternative to the Chinese vision of the digital century. Without a path to a democratic and digital future, the democracies abandoned whole societies to new forms of digitally mediated violence from both state and market actors. Most treacherous is the potential fusion of these spheres in a digital-century incarnation of the surveillance state defined by unprecedented asymmetries of knowledge about people and the instrumentarian powers of behavioral control that accrue to such knowledge (Zuboff, 2019). Without new public institutions, charters of rights, and legal frameworks purpose-built for a democratic digital century, citizens march naked, easy prey for all who steal and hunt with human data. In result, both the liberal democracies and all societies engaged in the struggle to build, defend and strengthen democratic rights and institutions now stumble toward a future that their citizens did not and would not choose: an accidental dystopia owned and operated by private surveillance capital but underwritten by democratic acquiescence, cynicism, collusion, and dependency.

As an economic power, surveillance capitalism exerts oligopolistic force over virtually all digital information and communication spaces (Manns, 2020). But for those who would
approach analysis strictly through the lens of concentrated economic power and its remedies in economic regulation and antitrust law, there is more to consider. When the economic operations that drive revenue are founded on the commodification of the human, the classic economic playing field is scrambled. Concentration of economic power produces collateral concentrations of governance and social powers. Surveillance capitalism’s institutional development braids these three vectors of power into a hydra-headed force that leads with economic operations and then competes with democracy for governance and social control. Oligopoly in the economic realm shades into oligarchy in the societal realm.

Characteristics of the giants’ market power reflect the distinction between, on the one hand, their varied individual business models and, on the other, their shared participation in, and benefits from, the overarching economic logic of surveillance capitalism and its related strategies of institutional reproduction. These institutional elements spread through the giants’ ecosystems and a growing majority of market enterprises across the commercial universe (Power, 2022). While this institutional order functions as an oligopolistic force, a reality already reflected in the term “Big Tech,” individual firms may nevertheless exert monopoly or duopoly power within the narrower competitive spheres of their specific business models—for example, mass retail, mobile services, and online targeted advertising. In result, surveillance capitalism now intermediates nearly all human engagement with digital architectures, information flows, products, and services, and nearly all roads to economic, political, and social participation lead through its institutional terrain.

These conditions of practical and psychological “no exit” conjure the aura of inevitability that is both a key pillar of surveillance capitalism’s rhetorical structure and critical to all institutional reproduction (Zuboff, 2019, pp. 221–224). Jepperson (2021) observes that institutionalization is the opposite of action. An institutional order is judged as robustly institutionalized when its durability and elaboration do not depend upon “recurrent collective mobilization,” but rather are sustained by self-reproducing internal routines (p. 39). “Institutions,” Berger and Luckmann (1966) write, “control human conduct by setting up predefined patterns of conduct ... primary social control is given in the existence of an institution as such.” They note that external forms of human action are only required when “the processes of institutionalization are less than completely successful” (p. 55).

These formative processes do not, however, imply a one-way ticket or a solitary journey. Institutional orders form and develop, but they may also “deinstitutionalize” or even “reinstitutionalize” in a new form (Jepperson & Meyer, 2021). Such radical shifts in trajectory are triggered by contradiction that overtly challenges or implicitly undermines the aura of inevitability. For example, external shocks can unravel inevitability and erode institutionalization. Shifts can be initiated by collective action, the intensification of contradictions with competing institutional orders, or by an accumulation of internal contradictions that generate conflict among institutional elements. In each case of fundamental change, the force of contradiction is substantial enough to threaten self-acting reproductive mechanisms. Under these circumstances, the institutional order is forced to resort to active measures to protect and defend territory once considered inevitable, inviolable and invincible. Action signals threat, and because action is weaker than institutionalization, the destiny of such contests is uncertain. A return to the developmental path? Deinstitutionalization and destruction? Eventual reinstitutionalization? Each is possible.

These dynamics of contradiction require situating surveillance capitalism’s institutional development within a larger contest among institutional orders. Specifically, surveillance capitalism’s two-decade developmental trajectory can only be understood in relation to the institutional order that gave it birth and nourished it to adulthood: the liberal democratic state.
This paper examines the ways in which the institutionalization of surveillance capitalism has produced the deinstitutionalization of the democratic order through the erosion of informational, societal, behavioral and governance capabilities essential to democracy’s sustenance and reproduction. Seen from this vantage point, surveillance capitalism’s developmental thrust is revealed as an epistemic counterrevolution, an antidemocratic coup that aims for knowledge dominance and strikes at the essence of democratic viability.

Surveillance capitalism is the young challenger, its pockets full of magic. Born at the turn of a digital century that it helped to create, its rapid growth is an American story that embodies many novel means of institutional reproduction. Chief among these has been its ability to keep law at bay. The absence of public law to obstruct its development is the keystone of its existence and essential to its continued success. It is, therefore, dedicated to nourishing and rewarding the continued failures of democratic leadership (Zuboff, 2019, pp. 37–82; Chander, 2014).

Despite this history, the liberal democracies do pose an existential threat to the surveillance capitalist regime because they alone retain the requisite institutional force and capabilities to contradict, interrupt, and abolish its foundational operations. Indeed, as surveillance capitalism grows, contradictions with its grizzled but still potent antagonist have intensified. Democracy is the old, slow and messy incumbent, but those very qualities bring advantages that are difficult to rival. Foremost among these are the ability to inspire action and the legitimate authority and necessary power to make, impose, and enforce the rule of law. It now falls to the democratic order to reclaim the void for the sake of every society and people desperately trying to outrun dystopia.

In summary, the clash of institutional orders is a death match over the politics of knowledge in our information civilization, and the prize is the governance of governance. Surveillance capitalism’s intrinsically antidemocratic economic imperatives produce a zero-sum dynamic in which the deepening order of surveillance capitalism propagates democratic disorder and deinstitutionalization. Only one of these contesting orders will emerge with the authority and power to rule, while the other will drift into deinstitutionalization, its functions absorbed by the victor. Will these contradictions ultimately defeat surveillance capitalism, or will democracy suffer the greater injury? At stake is the social order of our information civilization: the many or the few? Epistemic equality or subjugation? It is possible to have surveillance capitalism, and it is possible to have democracy. It is not possible to sustain both.

The sections that follow aim to reframe the requirements for successful democratic contradiction and thus to fortify the efforts of all who seek to avert the drift into accidental dystopia. To this end, I examine surveillance capitalism as a unified field of institutional development. The developmental stages of this two-decades-old institution reveal cause-and-effect relationships between earlier novel economic operations and later dystopian harms to democratic governance and society. This unified field perspective suggests that effective strategies for the elimination of downstream dystopian harms, such as “disinformation” or the illicit modification of collective behavior exemplified in extreme “polarization,” depend upon interrupting, abolishing and reinventing the upstream economic operations in which such harms originate.

To preview the organization of this paper: the following section discusses surveillance capitalism’s institutional development from the unified field perspective. The succeeding sections each discuss one of the four already visible stages of surveillance capitalism’s development, each one marked by deeper and more comprehensive conflict with the democratic order. The concluding section anticipates the next phase of this work. Please see the Section Overview, below.
SURVEILLANCE CAPITALISM or DEMOCRACY?

Section Overview

Our Accidental Dystopia: Surveillance Capitalism as Institutional Order
The Unified Field Perspective
    Figure 1: Four Stages of the Surveillance Capitalist Institutional Order
Foundational Stage One: The Commodification of Human Behavior (Economies of Scale)
    The Economic Operations
        Supply of Human-Generated Data
        Demand for Human-Generated Data
    The Governance Vector: The Annexation of Epistemic Rights
    The Social Harm Vector (1): The Destruction of Privacy
    The Social Harm Vector (2): The Rise of Epistemic Chaos
Stage Two: The Concentration of Computational Knowledge Production and Consumption (Economies of Learning)
    The Economic Operations
    The Governance Vector: Epistemic Authority
    The Social Harm Vector: Epistemic Inequality
Stage Three: Remote Behavioral Actuation (Economies of Action)
    “The Tools”
    The Economic Operations
    The Governance Vector: The Governance of Collective and Individual Behavior
    The Social Harm Vector: The Artificial Construction of Reality
Stage Four: Systemic Dominance (Economies of Domination)
    The Economic Operations
    Revenge of the Void
    Showdown
    Apple-Google Contribute to US Failures
    The Governance Vector: The Governance of Governance
    The Social Harm Vector: The Desocialization of Society
Conclusion: The Golden Sword


The Unified Field Perspective

The public and its lawmakers are whipsawed by each day’s headlines bleating surveillance capitalism’s latest atrocities.1 Comprehension of this hourly parade is thwarted by category errors: social harms are siloed and treated as disparate crises. For example, the collapse of privacy or the rise of disinformation are regarded as discrete phenomena, each with its own etiology, specialists, and prescriptions. The unified field perspective offers a solution for this fractured tower of Babel by demonstrating the organic and temporal interdependencies across hierarchically integrated stages of institutional development. Discrete harms are revealed as the products of path dependencies, with causes and effects linked across time and developmental complexity in an overarching process of growth and institutionalization.

The four stages of surveillance capitalism’s institutional development are each identified by their novel economic operations. These are: (1) The Commodification of Human Behavior; (2) The Concentration of Computational Knowledge Production and Consumption; (3) Remote Behavioral Actuation; and (4) Systemic Dominance. An adequate understanding of each stage, however, only begins with its economic action.

Over a century ago a young Durkheim (1964) bent to the task of explaining “the division of labor in society” as the basis of social order in an emerging industrial age. He cautioned his readers, “The division of labor appears to us otherwise than it does to economists” (p. 275). So too, the developmental stages of the surveillance capitalist institutional order appear to us otherwise than they do to economists. In addition to economic accomplishments, each stage pushes further into the void produced by the democracies’ early failure to claim dominion over digital information and communication spaces. In this process two collateral vectors of dystopian consequences are set into motion by and inextricably linked to each stage’s novel economic operations. I refer to these as “the governance vector” and “the social harms vector.”

The governance vector is constituted by an accumulation of governance prerogatives enabled by newly consolidated economic operations. While it has been understood that Big Tech aims to govern (Balkin, 2017; Goodman & Powles, 2019; Klonick, 2020; Pasquale, 2017b), the unified field of institutional development clarifies the governance vector as a core reproductive mechanism of continuously expanding scope. It demonstrates the expansion and hierarchical integration of specific governance elements over time, their tight coupling with economic operations, and the ways in which earlier achievements coalesce to create the conditions for later-stage governance conquests. From the perspective of the confrontation of institutional orders, each governance function is sucked into the surveillance capitalist orbit, leading to a simultaneous hollowing-out of the democratic order. Some governance carve-outs are difficult to decipher because the governance functions themselves are not yet formally codified, as we shall see below in the case of epistemic rights. Others are explicit challenges to the rule of public law. Most concerning is the extent to which the democratic order assists in or fails to challenge these attacks.

Apple CEO Tim Cook provides an insight into the essential developmental thrust that unifies each stage of governance victories when he describes Apple’s determination to disrupt the healthcare industry. His statement captures the direction, movement, and purpose of the governance vector more generally. “We are taking what has been with the institution,” Cook says, “and empowering the individual” (Feiner, 2019, para. 72).

In a similarly rare display of candor, Uber founder Travis Kalanick once described to a group of MIT students the great “taking” that produced Uber’s success. He called it “regulatory disruption” and quickly added, “We don’t talk about that a lot in tech” (MIT Sloan School of Management, 2013, para. 6; see also, Fleischer, 2010; Riles, 2014; Terry, 2016, 2017). Both CEOs celebrate the work of carving out governance functions from an institutional zone of public law and their transfer to friction-free
market spaces where they are resurrected shorn of legal constraints and indentured to a private institutional logic. When Cook describes “empowering” the individual, he declares Apple’s prerogative to supplant existing institutions and laws. Apple Inc. asserts itself as the source of authority that “empowers” and therefore its equivalent authority to disempower individuals just as quickly with a flick of its terms of service or operating system.

The CEOs’ statements are classic renderings of corporate strategies known as “disruption,” fragrant with libertarian themes of the sovereign individual unjustly subjugated to society and its outdated institutions. Cook’s script weaves the illusion of Apple as a 21st-century Robin Hood, emancipating valuable assets held hostage by powerful institutions and redistributing them to unjustly diminished individuals. Cook’s false flag of liberation alienates the individual from society to obscure the inconvenient-to-Apple truth that only democratic society can authorize and protect the rights and laws that sustainably empower and protect individuals.

Democracy has no intrinsic value or inviolable status in the disruption equation. The market mythology of godlike omniscience that naturally optimizes for superior economic outcomes is exploited to justify its elevation over democratic institutions and laws. The point is illustrated by Clay Christensen, originator of disruption theory, and his co-authors in a 2012 essay on the great “taking” from the news industry sadistically entitled “Breaking News.” The piece quickly cites and dismisses journalism’s mission-critical role in the sustenance of democracy: “Journalism institutions play a vital role in the democratic process and we are rooting for their survival. But only the organizations themselves can make the changes required to adapt ...” This laissez-faire agnosticism and sang froid intellectual remoteness prefigures Tim Cook’s ambitions. Christensen et al. dismiss the democratic project with an Emperor’s casual thumbs-down after a poorly matched gladiatorial contest. The Fourth Estate, conceived as an essential pillar of democracy and the means of holding power to account, is brushed aside, “a function of life in the old world.” Incumbents lose because they foolishly “stay the course” on the “quality” of content. Winners are the “low end,” “low-cost,” “personalized” new entrants (Christensen et al., 2012, paras. 14, 15).

From the vantage point of this ideological fortress—Cook’s fortress—it was impossible to admit, or perhaps even to perceive, that “low end” and “low cost” were not the conditions to produce news but rather to produce fake news. Or that the “old world” stood for codified principles of information integrity, truth telling, and factualization that a decade later would not be regarded as fusty nostalgia, but rather as oases of rationality in a corrupted information hellscape. Beginning in the United States, and in spite of the stakes, the democracies stood down, bystanders to the birth of their own diminished futures.

Thanks to the disruption reduction, the news industry was quickly forced to join the surveillance capitalist order and contribute to its reproductive routines. By 2017, Princeton researchers found that news websites contained more embedded tracking codes than those of any other industry in the study, as publishers chased revenues in the new targeted ad markets established by Google and Facebook. The disciplines of surveillance capitalism’s economic imperatives shaped both print and television news with pages and newscasts specifically designed to optimize social media engagement for the purpose of maximum human data extraction (Narayanan & Reisman, 2017; NewsWhip, 2019; Stroud et al., 2014).

This defeat bites hard in Pew Research’s detailed 2020 survey of 979 tech business leaders, policy specialists, developers, innovators,
researchers, and activists. About half of those surveyed predicted that “humans’ use of technology will weaken democracy ... due to the speed and scope of reality distortion, the decline of journalism and the impact of surveillance capitalism” (J. Anderson & Rainie, 2020).

The point here is that Christensen and Cook’s disruption strategy was never intended to develop institutions, but rather to eliminate them. They envisioned new privately mediated relationships with “individuals” that first bypass institutions, then eliminate the functions that institutions were designed to protect, and ultimately render institutions irrelevant. Because institutions are the gatekeepers that develop, contain, impose, and enforce qualifying standards of conduct and content in their respective domains, their diminishment or elimination paves the way for fakes: fake news, famously, but also fake healthcare, fake education, fake cities, fake contracts, fake public squares, fake governance, and so on.

They “don’t talk about that a lot in tech” because the companies prefer to camouflage their governance moves behind the Robin Hood illusion, when in fact their carve-outs prepare the ground for the opposite: the eventual substitution of private computational governance for democratic governance. This shift emerges in stage four’s expressions of systemic dominance when the democratic order itself becomes the target for disruption.

A second vector produced at each stage is constituted by the production of new social harms, calculated as the cost of institutional reproduction and treated as externalities. The two vectors are complementary. Governance carve-outs facilitate institutional reproduction by bulking up surveillance capitalism, amplifying its authority and power at the expense of the democratic order. Social harms facilitate reproduction through direct attacks that disorient, distract and fragment the democratic order. Weaknesses produced by each governance carve-out create the conditions and opportunities for attack by successive social harms that further diminish society’s ability to repel governance carve-outs.

Each stage’s causes and effects create the conditions and construct the scaffolding for the next. Each stage builds on and extends what went before. Each is carried by the momentum of earlier means of institutional self-reproduction, and each produces novel means that sustain, extend, and elaborate the new institutional order. As is typically the case in stage-based theories of development, the stages are ideal-typical abstractions that disclose the inner logic of an institutional order in perpetual motion, compelled to survive, grow, and evolve. The stages constitute a unified field of hierarchically integrated and path-dependent causes and effects. Seemingly disparate phenomena are thus revealed as later-stage consequences of earlier-stage operations and their reproductive mechanisms. All three dimensions–economic, governance, and social–move together across time in a single comprehensive architecture of institutional growth and intensification (see Figure 1).
Figure 1. Four Stages of the Surveillance Capitalist Institutional Order.

The lesson for effective strategies of democratic contradiction is that later-stage social carnage can only be addressed effectively through direct confrontation with earlier stage economic operations. Durable solutions must be directed toward the foundations from which all harms originate.

Finally, the stage descriptions that follow sometimes rely on case evidence from the lead corporations to illustrate the interdependencies of economics, governance and social harms within and across stages. Key to a stage-based analysis is that the dynamics examined here accrue not only to the corporate protagonists featured in these cases, but to the larger institutional order in which they participate. My focus is on the development of the institution, as it accrues data, knowledge, authority, power and ambition. Each of the corporate giants, and for that matter the legions of firms across the commercial landscape already enmeshed in the surveillance capitalist order, presents a unique configuration of each stage’s achievements. Some are more advanced than others. Some have more specialized roles and capabilities within the larger spectrum. Each of them contributes to and is nourished by institutional advance.
Foundational Stage One: The Commodification of Human Behavior (Economies of Scale)

The Economic Operations

The first stage introduces the commodification of human behavior operationalized in the secret massive-scale extraction of human-generated data. This formative achievement is enshrined in the Google breakthrough that laid the foundation for all that follows.

In the year 2000, when only 25% of the world’s information was stored digitally (Hilbert & López, 2011), a tiny but brilliant Silicon Valley internet startup called Google faced an existential threat during the financial crisis known as the dotcom bust. Founders Larry Page and Sergey Brin had not yet discovered a way to turn their search engine miracle into money. Between 2000 and 2001, as the company’s investors threatened defection, the Google team stumbled into a series of discoveries that offered a rescue plan (Zuboff, 2019, pp. 63–97). Data scientists learned to identify behavioral signals embedded in the “data exhaust” left over from users’ search and browse activities. These discarded behavioral traces (Power, 2022) were a surplus—more than what was required for product improvement. The signals embedded in this behavioral surplus, they discovered, could be aggregated and analyzed to predict user behavior. The team soon broke through by learning how to predict a “click-through rate,” the solid gold computation that saved the little company from financial ruin. It launched the online targeted ad industry, best understood as surveillance advertising—the Trojan horse that conceals the complex machinery of secret massive-scale extraction of human-generated data.

Google founder Larry Page laid out the essence of Google’s business in 2001 as search and seizure. “If we did have a category,” he ruminated, “it would be personal information ... Everything you’ve ever heard or seen or experienced will become searchable. Your whole life will be searchable” (Edwards, 2011, p. 291). Google’s business plan had called for selling search engine licenses to corporate clients. Instead, the young company found a fast track to salvation by repurposing its search engine as a sophisticated surveillance medium, a loss leader for massive-scale extraction of “your whole life.” People thought they were searching Google, but Google was searching and seizing them.

The term of art was “user engagement,” a coded expression for a new subject-object social relation in which commodity resources targeted for extraction just happen to be sentient human beings. Google’s inventions depended on the secret invasion of once-private human experience to implement a hidden taking without asking. Such action is normally characterized as theft, and it was on the strength of this original sin of secret theft that users’ private lives were declared as corporate property. Some of Google’s earliest patents chronicle the company’s frank pursuit of behavioral surplus across the internet, including methods that aimed to exploit and construct user profile information (UPI) with methods that knowingly bypassed users’ agency, awareness, and intentions. For example, a 2003 application explains that UPI “may be inferred,” “presumed,” and “deduced,” even when users did not knowingly provide such information or when they intentionally left information incomplete “because of privacy considerations, etc.” It notes, “UPI for a user ... can be determined (or updated or extended) even when no explicit information is given to the system ... An initial UPI may include some expressly entered UPI information, though it doesn’t need to” (Bharat et al., 2016, sec. 4.2.3).

The invention and its antidemocratic social relations were twin born. Page feared the consequences if users, lawmakers or competitors were to grasp the true nature of its operations. Anything that might “stir the privacy pot and endanger our ability to gather data” was scrupulously avoided (Edwards, 2011, pp. 240–245).
A single law that pulled back the curtain and redefined Google as a thief would end the prospect of financial salvation.

This corporate “hiding strategy,” as it was called (S. Levy, 2011, p. 69), also served to conceal the astonishing financial implications of Google’s new capabilities. Between 2001, when surveillance economics were first applied, and 2004, when Google went public, its revenues increased by 3,590% (Google Inc., 2004, p. 19). This surveillance dividend established the secret massive-scale extraction of human-generated data as the illegitimate, illicit and perfectly legal foundation of a new economic order. Every investor would want it. Every startup would endeavor to supply it ... and there was no law to stop it.

In 2008, after a series of costly blunders that incited user rebellion, Facebook founder Mark Zuckerberg turned to Google for answers, hiring Sheryl Sandberg, Google’s head of global online advertising, as his second in command (Hempel, 2008). With Sandberg in charge of operations, Facebook quickly learned to extract behavioral surplus from every behavioral trace, irrespective of what people voluntarily shared. She recognized that Facebook had a front row seat on what Page had called “your whole life,” as unsuspecting users poured their lives onto Facebook pages. The result was a company with, as Sandberg observed, “better information than anyone else” and more “real data”, not “the stuff other people infer” (Kirkpatrick, 2011, p. 266).

A year after Sandberg’s arrival, the new executive duo changed Facebook’s privacy policy to pave the way for surveillance economics. TechCrunch summarized the corporation’s strategy: “If there is significant backlash against the social network, it can claim that users willingly made the choice to share their information with everyone” (Kincaid, 2009, para. 6). Mr Zuckerberg’s hard-won appreciation of surveillance economics steeled him to the realpolitik of a new economic order: “[W]e decided,” he explained, “that these would be the social norms now, and we just went for it” (B. Johnson, 2010, para. 15; see generally Srinivasan, 2019).

The novel economic foundations of surveillance capitalism begin with its original sin. Human experience is claimed as free raw material for market action, beginning with its secret extraction and translation into behavioral data. These data are the gateway to new realms of highly predictive inferential constructions: emotions, personality, political and sexual orientation, and more. Surplus data are immediately redefined as corporate assets, private property available for the proprietary computation of individual and collective profiles and predictions.

Surveillance capitalists compete on the power of their predictions to reduce uncertainty. This most fundamental commercial objective dictates the requirement for massive-scale commodity extraction, production, and refinement of human-generated data, comparable to tons of wheat or barrels of oil. Prediction products are sold to business customers in a new kind of commodity market that trades in human futures. The point is illustrated in a 2016 Facebook document describing its “AI Backbone”, known as FBLearner Flow. Thanks to the absence of democratic contradiction, Facebook’s AI “ingests trillions of data points every day” to produce thousands of models. These computations are fed to its “prediction service” that churns out “more than 6 million predictions per second” (Dunn, 2016). These are the building blocks of prediction products sold to companies, advertisers, political campaigns, and other buyers with an interest in knowing, reinforcing, or inhibiting the predicted behavior of individuals and groups (Biddle, 2018b).

The “click-through rate” was only the first globally successful prediction product, and online targeted advertising was the first thriving market in human futures. Surveillance capitalism was thus “declared” into being and the only witnesses to its birth were sworn to secrecy (Searle, 2010, pp. 85–86, 13).

The secret accumulation of behavioral surplus in scale and scope belies the notion of an “attention economy,” because the defining work here is accomplished outside the attentional field. Indeed, the notion has skewed public perception in a dangerous way by promoting the false belief that one can control exposure to extraction operations by controlling one’s attention. The facts are different. Withholding
lance capitalism begin with its original sin. attention. The facts are different. Withholding
Zuboff 13

one’s attention is no barrier to or protection notes that location tracking “enables businesses
from the secret extraction of signals produced to identify customer behavior ... and mitigate
and captured beyond the range of human the uncertainties in the market” (Grand View
awareness or control. One may take refuge in Research, 2022). The notion of “all data” con-
the fiction of choice (Kim, 2013; Radin, 2012) sistently evolves toward the more exquisitely
with respect to a discrete decision to share spe- predictive, such as decoding speech from brain
cific information with a corporation, but that waves or using eye gaze behavior to infer sensi-
information is insignificant compared to the tive information including personality, emo-
volume of behavioral surplus that is secretly tions, and sexual preference (Kröger et al.,
captured, aggregated, and inferred. The princi- 2020; Moses et al., 2019). Indeed, it is already
ple of concealment operationalized in secret understood that augmented reality, or the
surveillance is thus essential to this founda- “metaverse,” for all its futuristic verbiage, is
tional stage of economic operations and intended as an intensification of stage one foun-
becomes a critical mechanism of institutional dational extraction mechanisms (H. Murphy,
reproduction (Binns, 2022, p. 21). 2022; Heller, 2021; Martin, 2021).
In the second decade, the initial successes Research from the Irish Council for Civil
of surveillance capitalist pioneers, such as Liberties (ICCL) illustrates the current state of
Google and Facebook, drew surveillance eco- play (Ryan, 2022). The giants continuously
nomics into the “normal” economy, now sym- aggregate once-private personal information to
bolized in Walmart’s competition with Amazon compute user locations, behavioral surplus,
for human-generated data collection, compu- profiles, and predictions. These are broadcast to
tation, prediction and targeting (Tobin, 2022). human futures markets for real-time bidding
Surveillance capitalism metastasized across (RTB), where advertisers bid on the opportunity
diverse sectors from insurance, retail and to place their ad on your screen. IAB, the non-
finance, to agriculture and transportation, to profit research consortium that supports RTB in
the most intimate and predictive data residing the ad tech industry, lists nearly 400 data
in the two critical sectors of education and categories that refine user profiles, inclu­
healthcare. ding “Coffee/Tea,” “Road-Side Assistance,”
Every product called “smart” and every ser- “Incontinence,” “Panic/Anxiety Disorders,”
vice called “personalized” are now loss leaders “Ethnic Specific,” and “Personal Finance.” The
for the human data that flow through them. Most RTB categories also include “Incest/Abuse
“apps” begin their lives for sale and distribution Support,” “Women’s Health,” “Dating,”
through Apple’s and Google’s app stores. Once “Marriage,” “Travel,” “Pregnancy,” “Babies
downloaded, and no matter how apparently and Toddlers,” and “Adoption,” data all too eas-
benign, they function as data mules shuttling ily trained on stalking potential abortion seek-
behavioral signals from “smart” devices to serv- ers in a polarized America, where many states
ers primarily owned by the tech giants and the have criminalized a woman’s right to choose.
ad tech data aggregators. As one Silicon Valley (IAB Tech Lab, 2016, sec. 5.1).
data scientist described it to me, “The underly- Google is the largest RTB company, chan-
ing norm of virtually all software and apps neling targeting data to 4698 firms in the United
design now is data collection. All software States, or 10% of US broadcasts, and 1058 firms
design assumes that all data should be collected, in Europe, accounting for 14% of European
and most of this occurs without the user’s broadcasts. The ICCL findings suggest that cur-
knowledge” (DS I, see Note on Method). rent legal regimes mitigate but do not abolish
The concept of “all data” is ever-expanding. these operations. The average person in the
Location tracking is now institutionalized: United States, where the federal government has
global, ubiquitous and inescapable (Zekavat yet to pass basic privacy protections, has their
et al., 2021). An industry analysis forthrightly online activity and location data exposed 747
times each day. In Europe, where data protection laws lead the world, it’s 376 daily exposures. The data flow with no means to control their destinations or assess their fate.

The entire economic edifice of surveillance capitalism is built on this illegitimate but not illegal foundation. Nothing here was “technologically determined”. Nothing was or is inevitable. That the democracies failed to mount the contradictory force capable of thwarting surveillance capitalism’s development was the result of specific ideological and historical contingencies, beginning in the United States.

Supply of Human-Generated Data

Freedom from law justified by a radical and antidemocratic economic ideology guaranteed the limitless supply of human data.

Public access to the world wide web in the mid-1990s was a slow-gathering wave. Legal scholar Anupam Chander describes the US milieu of the 1990s, when all three branches of the government converged on a cobbled-together “industrial policy favoring Internet entrepreneurs” and their maximum freedom of operation unobstructed by law. Lawmakers in Europe and Asia typically sought to protect established principles of individual privacy, intermediary liability, and the sanctity of intellectual property laws as they oversaw the rise of the internet. In the United States, the Clinton administration, Congressional lawmakers and jurists rallied to the rhetoric of “innovation” by aggressively conflating freedom of internet commerce with “freedom of speech.” “U.S. authorities (but not those in other technologically advanced states) acted with deliberation to encourage new Internet enterprises by both reducing the legal risks they faced and largely refraining from regulating the new risks they introduced,” writes Chander (2014, p. 645). The upshot was the “stunning pattern” of a legal environment “specifically shaped to accommodate” the internet companies (Chander, 2014, pp. 644–645, 648–649; see also, Citron & Franks, 2020; Tribe, 2021; Pasquale, 2017a; Chander & Le, 2014; Rozenshtein, 2021).

This pattern reflects a period in which US lawmakers coalesced around a radical free market ideology forged in the aftermath of the Second World War as a response to the collectivist nightmares of German and Soviet totalitarianism and theorized by the neoliberal thought leader Friedrich Hayek (Burgin, 2012; Mirowski, 2013). The politics of knowledge defined the core of Hayek’s thesis. It was the intrinsic “ineffability” of “the market,” he argued, that necessitates maximum freedom of action for market actors, an idea that shaped the libertarian creed and its insistence on absolute individual freedom (Hayek, 2007, p. 234).

The epistemic challenge and its politics of knowledge dominated Hayek’s thinking from his 1937 preoccupation with “the division of knowledge” as a central concern of economics,2 to his 1988 description of the market as an unknowable “extended order” that supersedes the political authority of the state. “Modern economics,” he wrote, “explains how such an extended order ... constitutes an information-gathering process ... that no central planning agency, let alone any individual, could know as a whole, possess, or control” (Hayek, 1988, pp. 14–15). The genius of markets, he argued, had to be protected from any countervailing institutional force or external authority. That such systems would produce substantial inequality of rights and wealth was accepted, and even welcomed, as a necessary feature of a successful market system, a goad to human betterment, and a force for progress (Hayek, 1945, pp. 88–89; Mirowski, 2013, pp. 53–67).

It was University of Chicago economist Milton Friedman who later surpassed Hayek in popular recognition and influence. In addition to his academic research, for which he received the Nobel Prize, Friedman simplified free market ideology for public consumption. From the 1960s to just a few years before his death in 2006, Friedman tirelessly asserted his bedrock theme in every speech, lecture, essay, and public appearance: radical market freedom is justified as the one source of truth and the origin of political freedom. On an infamous 1975 visit to Chile, he counseled the dictator Augusto
Pinochet and his generals that only radical free market economics would return the country to political freedom (Friedman et al., 2012). He told students at Brigham Young University, “The economic market is a more effective means for achieving political democracy than is a political market” (Friedman, 1976, 2017a, p. 116). He warned alumni at Pepperdine University, “If you don’t have economic freedom, you don’t have political freedom” (Friedman, 2017b, sec. 5).

Historian Angus Burgin observes that Friedman’s insistence on the universal application of radical economic freedom made him an “unprecedented anomaly” even among the neoliberal economists of the late 20th-century United States. Friedman described himself as a “radical,” “kook” and “extremist” (Burgin, 2012, pp. 154–155, 183–184, 212–213, 176–177).

As the old collectivist enemies receded, many conservative economists moved toward greater recognition of the role that democratic governments play in sustaining market economies. Friedman instead doubled down on sharp dualisms, absolutes and unequivocal policy positions, naming fresh collectivist dangers to battle. As early as 1951 he worried that democratic “collectivism” would undermine “political democracy.”

By the late 1970s, he argued, that time had come. He identified the new collectivist threats: state regulation and oversight, social legislation and welfare policies, labor unions and the institutions of collective bargaining, public education, and even foundational democratic principles such as equality, social justice and majority rule. Friedman saw the field of combat clearly: the market must dominate, democratic institutions must recede. Instead of democracy regulating the market, the market must regulate democracy.3 In 1970, the professor declared in the New York Times Magazine, “There is one and only one social responsibility of business–to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game” (Friedman, 1970, para. 33).

By the late 1970s and for decades to follow, politicians and policy makers were expected to internalize contempt for their own power as they learned how to design, play, and defend a game in which the only rule was the absence of rules. Democratic submission to market truth required democratic officials, leaders and lawmakers instructed in and converted to a new role in the political ordering of knowledge, one that yields to the unconscious but superhuman truth of the market over the conscious direction of democratic governance. Unimpeded competition and supply-side reforms, including comprehensive deregulation, privatization, lower taxes, and corporate self-regulation, were to be the new solutions to growth.

Friedman’s political message “had the effect of a bombshell,” Thomas Piketty (2014) observes, and “created the intellectual climate in which the conservative revolution of 1979–1980 became possible” (p. 549). By 1979, it was Friedman who advised presidential candidate Ronald Reagan; then Friedman who advised the White House during Reagan’s eight-year tenure; Friedman who traveled the world declaiming the primacy of radical economic freedom to elites in every region; and Friedman who preached his dogma on American Public Television.

In 2002, four years before his death, Friedman reevaluated his lifelong view of economic freedom in favor of an even more antidemocratic formulation. In what would become a gross misreading of the future, Friedman explained, “Hong Kong ... persuaded me that while economic freedom is a necessary condition for civil and political freedom, political freedom, desirable though it may be, is not a necessary condition for economic and civil freedom.” Democracy, he reasoned, is after all too risky, too much of a wild card. Political freedom should no longer be understood as an inviolate good but rather as an unpredictable condition that “under some circumstances promotes economic and civic freedom, and under others, inhibits economic and civic freedom.” Economic freedom was to be the one absolute and the only necessity. Democracy, Friedman concluded, is
expendable (Ebenstein, 2012, pp. 105–106; Friedman, 2002).

Though Friedman’s dogma was adopted most comprehensively in the United States and the UK, its consequences spread to every region of the global economy. And so it was that three days after Friedman’s death in November 2006, Larry Summers, former Clinton Treasury Secretary and soon to be director of Obama’s National Economic Council, authored a memorial essay in the New York Times. “[A]ny honest Democrat will admit that we are now all Friedmanites,” he wrote, and not as lament. “Mr. Friedman ... never held elected office but he has had more influence on economic policy as it is practiced around the world today than any other modern figure.” Summers (2006) observed that Friedman transformed the “previously unthinkable” into the acceptable and necessary. With Friedman, “heresies had become the orthodoxy.” Friedman’s greatest skill was to imbue his radical ideas with the aura of inevitability (paras. 2, 5, 7).

Friedman’s libertarian conception of freedom converged with the libertarian culture of Silicon Valley and its second-generation entrepreneurial stirrings in the late 1970s (Flichy, 2007; Isaacson, 2014; Mosco, 2004; Markoff, 2005; Naughton, 1999; Turner, 2006; see also, Chafkin, 2021). With the release of the Mosaic browser in 1993, the world wide web was on its way to becoming a popular global medium of information production and consumption, communication, and commerce. These new connected spaces were quickly claimed for Friedman’s radical freedom. Former Google CEO Eric Schmidt celebrated this triumph on the first page of his 2014 book on the digital age when he described “an online world that is not truly bound by terrestrial laws ... the world’s largest ungoverned space” (Schmidt & Cohen, 2014, p. 3).

This historical nexus finds iconic expression in the 1997 Clinton-Gore white paper setting out the administration’s policy vision for “Global Electronic Commerce” (Clinton & Gore, Jr., 1997). It begins with the document’s bedrock principle: “The private sector should lead.” The people, science, capital, software, chips, wires, cables, servers, and so forth that constitute the web and its new companies were mythologically fashioned as “cyberspace,” an extra-societal zone in which the norms and laws of real-world democracies do not apply.

The white paper is a Manchurian candidate-like testament to the internalized disciplines of democratic submission. It denigrates to the point of caricature the US government’s role in the governance of digital information and communication spaces, insisting on minimal government involvement or intervention. “Business models must evolve rapidly ... government attempts to regulate are likely to be outmoded by the time they are finally enacted ... Existing laws and regulations that may hinder electronic commerce should be reviewed and revised or eliminated to reflect the needs of the new electronic age” (Clinton & Gore, Jr., 1997, secs. 2, 4). The administration’s approach was not only to get out of the way, but to proactively cede whole governance functions to the internet companies, including the most sensitive: “privacy, content ratings, and consumer protection.” The absence of constraints on privacy-invasive operations was thus gifted by democracy, and it was this gift that enabled surveillance capitalism’s explosive growth. Friedman’s dogma would eventually guarantee the secret but legal massive-scale supply of human-generated data.

Fifty years after Friedman’s declaration in the New York Times, presidential candidate Joe Biden offered his own declaration: “I think there’s going to be a willingness to fix some of the institutional inequities that have existed for a long time. Milton Friedman isn’t running the show anymore” (Grunwald, 2020; emphasis mine). In 2022, former Obama Administration Chair of the Federal Trade Commission, Jon Leibowitz, aired his frustration with US lawmakers in the Wall Street Journal. A decade earlier, Leibowitz and his FTC colleagues had submitted a report to the US Congress sounding the alarm that “industry self-regulation of privacy was not working for American consumers” and arguing for strict curbs on data collection. But in 2022, Leibowitz observed with dismay
that in the years since that report, “surveillance capitalism has only gotten worse,” yet “legislation has languished” (Leibowitz, 2022).

The lesson is that there can be no end to Friedman’s show unless and until there is an end to surveillance capitalism, the digital-century avatar of his extremist vision and information civilization’s preeminent beneficiary of his legacy.

Leibowitz was not the first to sound the alarm. As early as 2000, a majority of FTC commissioners, including Chair Robert Pitofsky, concluded that self-regulatory initiatives “cannot ensure that the online marketplace as a whole will follow the standards adopted by industry leaders ... notwithstanding several years of industry and governmental effort” (Pitofsky et al., 2000, p. 35). They proposed a detailed framework for basic federal privacy protections that would have set the United States and the world on a different path to the digital century, but history and chance had other plans.

Demand for Human-Generated Data

With supply assured, the US response to the 9/11 terrorist attacks, known as “the war on terror,” guaranteed continuous demand for human-generated data.

According to Peter Swire, chief counselor for privacy in the Clinton administration and later a member of President Obama’s Review Group on Intelligence and Communication Technologies, “With the attacks of September 11, 2001, everything changed. The new focus was overwhelmingly on security rather than privacy.” The legal provisions debated just months earlier vanished from the conversation more or less overnight, giving way to an obsession with “Total Information Awareness.” A “state of exception” was invoked to unleash a new data imperative: velocity and volume at any cost. In this new environment, “Congress lost interest in regulating information usage in the private sector,” Swire recounts. “Without the threat of legislation, the energy went out of many of the self-regulatory efforts that industry had created” (Swire, 2013, pp. 845, 846).

In the United States and across the European Union, legislation was quickly enacted to expand surveillance activities (I. Brown, 2012, p. 230; Pasquale, 2013; Prodhan & Nienaber, 2015; Rubin, 2015; Schwartz, 2012, pp. 289, 296; Scott, 2015; Voss, 2016). Instead of federal legislation to outlaw the novel surveillance practices, the new aim was to enrich the conditions for their expansion and application outside constitutional, legislative, and regulatory constraints. Yale’s Jack Balkin explained that while the US Constitution inhibits surveillance by government actors, privacy protections for information held in private servers are “limited if not nonexistent.” If the intelligence community was to indulge its obsession to ascertain the future, then it would have to “rely on private enterprise to collect and generate information for it” (Balkin, 2008, pp. 16–17, 19). The contours of a new interdependency between public and private agents of information dominance began to emerge, born of a mutual magnetism originating in complementary interests and reciprocities. The result was an unwritten doctrine of surveillance exceptionalism that guaranteed robust demand for secretly extracted human-generated data (Zuboff, 2019, pp. 112–121).

The license to steal would become the persistent elephant in the room, joining free market dogma to shape an environment in which the internet companies developed without legal impediments. But it also sentenced these companies to a future of political activism, collaboration, appeasement, lobbying, threat, confrontation, legal battle, and propaganda. These actions have been required to defend and fortify privileges precisely because they were never inevitable.

Because action is weaker than institutions, Big Tech was compelled to surpass all prior lobbying records, including those of Big Oil and Big Tobacco. Ninety-four percent of all members of Congress with jurisdiction over privacy and antitrust issues received Big Tech PAC or lobbyist contributions, amounting to US$3.2m in 2020 alone (Chung, 2021). “[T]he four biggest technology companies and their third-party groups spent $35.3 million during the first half
of 2022, a 15% increase over ... the first half of [2021]” (Diaz & Birnbaum, 2022).

In 2010, former NSA director Mike McConnell acknowledged collaboration between the internet companies and the intelligence community, noting, “the challenge is to shape an effective partnership with the private sector so information can move quickly back and forth from public to private” (McConnell, 2010, para. 15).

By 2013, that partnership was not only fully operational but normalized. Speaking at a public tech conference in March that year, CIA Chief Technology Officer Gus Hunt cheerfully described the central riddle of the CIA’s urge toward total information. “We have to connect the dots ... Since you can’t connect dots you don’t have, it drives us into a mode of, we fundamentally try to collect everything and hang on to it forever.” (Hunt, 2013, 19:14–20:11). Hunt exalted the tech companies as the means to this end.

The hallmark of the CIA’s technology mission was to “protect national security,” Hunt explained. It required the agency “to take advantage of the massive information streams that have emerged on the planet.” Hunt acknowledged the CIA’s gratitude toward Google (“a very big provider of things”), Facebook (where “35% of all the world’s digital photography” was already posted), YouTube (“the only Exabyte scale or bigger repository ... on the planet”), Twitter (4500 tweets per second), and the telcos (global text messaging and mobile phone calls). In recognition of ubiquitous location tracking, Hunt instructed his audience, “You’re already a walking sensor platform,” and he concluded with a stunning admission: “I think we’re at high noon in the information age ... It really is very nearly in our grasp to be able to compute on all human-generated information” (Hunt, 2013, 1:22–1:32, 5:55–5:57, 6:25–6:44, 7:05–7:21, 10:57–11:02, 26:05–26:17).

Despite the dramatic content of Hunt’s presentation, his audience of IT professionals and journalists barely blinked, apparently more interested in food than the shocking specter of tech power and governmental collusion. Hunt began by apologizing for keeping the group from its midday meal, and when he ended his astonishing remarks 28 minutes and 47 seconds later in a lackluster drizzle of applause, the announcer waved off his offer to take questions. “I think we’re getting ready for lunch, but people can find you, I’m sure, floating around” (Hunt, 2013, 28:00–28:05). The clutch of rumpled suits wandered off in search of sandwiches and drinks, leaving behind the one man in the room whose brain was on fire. His name was Edward Snowden (Hunt, 2013, 28:00–28:05; Snowden, 2019, pp. 247–248).

Bart Gellman broke the 2013 Snowden PRISM story in the Washington Post, chronicling the NSA’s data collection from nine companies, including Microsoft, Yahoo, Google, Facebook, PalTalk, AOL, Skype, YouTube, and Apple. In his 2020 analysis Gellman concludes that the NSA “shared more information obtained from American internet companies than from any other source.” PRISM, he writes, “had become a principal engine of the U.S. surveillance machine ... Never in history had there been richer troves of personal information” (Gellman, 2020, pp. 113, 117, 121).

The United States was not alone in its demand for the surveillance capitalists’ extra-constitutional flows of human-generated data. Cate and Dempsey (2017) describe the expansive aggregation of personal data from private sector companies by 13 countries, including the democracies of France, Germany, Israel, Italy, Brazil, Canada, the United States, Australia, India, Japan, and South Korea. They conclude that every government in their study practiced “bulk collection”–mass data collection “without particularized suspicion”–all of it to collect the dots in the hope of one day connecting them. “Every government in the world claims the power to compel disclosure of this data by the companies that hold it,” Cate and Dempsey observe. They ask if such capabilities can ever be consistent with human rights principles of necessity and proportion (p. xxvi; see also, I. Brown, 2012; Pasquale, 2013; Rubin, 2015; Schwartz, 2012; Scott, 2015; Voss, 2016). More recently, evidence of
US government practices of purchasing extra-constitutional data, such as location tracking, has been identified as a form of “data laundering” designed to evade Fourth Amendment protections (Rahbar, 2022).

Surveillance exceptionalism has meant that the United States and, to varying degrees, every liberal democracy chose surveillance capitalism and its utilities for domestic surveillance over democratic principles and governance. This choice forfeited the crucial first decade of the digital century as an opportunity to advance a distinctly democratic digital future, while secrecy deprived the public of the right to debate and combat. Gellman mourns the result: journalists were “in the dark,” “plaintiffs could bring no constitutional challenge,” “Congress faced no public pressure,” “internet companies encountered little demand for stronger defense of privacy,” and “voters and consumers could not ask for change because they did not know the truth” (Gellman, 2020, p. 127). This was the price that the democracies paid for the insatiable drive “to compute on all human-generated information.”

Thanks to these historical conditions, surveillance capitalism’s epistemic counterrevolution proceeded without constraint, preamble to a growing disjuncture between the emerging institution of surveillance capitalism and the democratic order. The audience was seated, the lights were dimmed, the orchestra had begun to play, but the curtain had yet to rise.

The Governance Vector: The Annexation of Epistemic Rights

Extraction operations invade and extract domains of human experience that in modern democratic societies have been considered as “private.” It is not surprising that these operations have been understood as violations of privacy, and that legal theory and practice have typically aimed to establish or protect a “right to privacy” and to strengthen existing privacy law (Citron, 2022; Citron & Solove, 2022). Yet, despite rising public concerns—and, in the EU at least, the historic enactment of the General Data Protection Regulation (GDPR)—privacy as it was understood as recently as the year 2000 has been extinguished.

The focus on privacy as the object of attack obfuscated to the point of invisibility an epic chapter in the history of the politics of knowledge. Elemental rights to private knowledge of one’s own experience, which I shall refer to as “self/knowledge,” were expropriated en masse, concentrated in the emerging surveillance capitalist order, and annexed to its self-asserted governance authority. With the term “elemental,” I mean to mark a distinction between tacit rights and juridical rights. Others have addressed this distinction, and Searle’s “pragmatic considerations of the formulation of rights” are useful here (Searle, 2010, pp. 194–195).

Searle argues that tacit assumptions of prerogatives, which are experienced as something like “rights,” adhere to conditions of existence in a time and place. They are crystallized as formal “human rights” only at that moment in history when they come under systematic threat. For example, the ability to speak is an elemental right born of a human condition. The right to “freedom of expression” is a juridical right, which only emerged when society evolved to a degree of political complexity that the freedom to express oneself came under threat. Searle observes that speech is no more central to human life than breathing or being able to move one’s body. No one has declared a “right to breathe” or a “right to bodily movement,” because these elemental rights have not come under attack and therefore do not require legal codification. What counts as a fundamental human right, Searle argues, is both “historically contingent” and “pragmatic.”

Elemental rights to self/knowledge belong to a larger class of “epistemic rights” that confer inalienable entitlements to varieties of knowledge and knowing (Radin, 1987). Scholars and jurists have begun to identify the speciation of these rights, such as the right to be forgotten (Allen & Rotenberg, 2016; Rosen, 2012), the right to the future tense, the right to sanctuary (Zuboff, 2019), the right to exercise human
intelligence (Risse, 2021), the right to freedom of thought (Alegre, 2022), the right to truth (Kerner & Risse, 2021), the right to revise one’s identity (Tutt, 2014).

Epistemic rights are decision rights, and in the realm of self/knowledge, such rights are the cause of which privacy is the effect. US Supreme Court Justice William O. Douglas illuminated this hidden layer of epistemic rights in a 1967 dissenting opinion on a Fourth Amendment case entailing questions of illegal search and seizure (US Supreme Court, 1967, para. 66):

Privacy involves the choice of the individual to disclose or to reveal what he believes, what he thinks, what he possesses ... Those who wrote the Bill of Rights believed that every individual needs both to communicate with others and to keep his affairs to himself. That dual aspect of privacy means that the individual should have the freedom to select for himself the time and circumstances when he will share his secrets with others and decide the extent of that sharing.

Douglas’s formulation clarifies the elemental epistemic right to self/knowledge as inalienable authority over whether to disclose or withhold that knowledge, to whom, to what degree, and for what purpose. Privacy is the effect of Douglas’s “choice” and “freedom to select.” The secret seizure of once private experience and its rendition as behavioral data extinguishes this individual freedom and resurrects it inside the surveillance capitalist order. The epistemic rights embedded in the “freedom to select” are thus accumulated as corporate rights. Since privacy is contingent upon this freedom, the result is that privacy is expropriated, sequestered and concentrated in the domain of surveillance capital (see for example, P. Roberts, 2012). These concentrations of epistemic rights and of the privacy they enable become essential mechanisms of institutional reproduction. For example, an Amazon press release on its Rekognition system is but one tiny fragment of evidence suggesting that Searle’s historical contingency is upon us. “Face analysis generates metadata about detected faces ... With this release ... we have improved accuracy for emotion detection ... and added a new emotion: ‘Fear’” (Amazon Web Services, 2019, para. 1). Though one does not grant Amazon knowledge of one’s fear, the corporation secretly takes it anyway, another data point in the trillions fed to the machines that day. Extraction methods enjoy privacy while robbing their targets of epistemic rights, including the right to decide “who knows me,” and the right to contest the nullification of such rights.

As is the case with all elemental rights, most epistemic rights have not been codified in law because it has not yet been necessary to do so. This fact reflects the many ways in which the absence of democratic contradiction is essential to surveillance capitalism’s developmental success. A new age has dawned in which individuals’ inalienable decision rights to self/knowledge must be codified in law if they are to exist at all. Searle’s historical contingency is now. His pragmatism reflects the conditions of existence into which we are thrown (Flyverbom, 2022).

The Social Harm Vector (1): The Destruction of Privacy

The destruction of individual privacy is the queen of social harms, a harsh measure but necessary to secure the foundations of a novel economic logic. Secret massive-scale extraction of the human and privacy cannot coexist. Page and Brin knew it. Zuckerberg and Sandberg knew it. All who worked with them and for them or invested in them knew it. For surveillance capitalism to succeed, privacy must fall. And fall it did. Most striking is that the destruction of privacy and all that follows from it has been perpetrated for the sake of the banality that is commercial advertising. This new era of surveillance advertising has brought great wealth to the few–the surveillance giants, their clients, investors, and ecosystems–but for the many it is the engine of the commodification of the human and all that follows from it. As a result, the battered but not broken privacy standard of the year 2000 now appears as the final hour of a long age of innocence.
The very notion of “privacy” has become a zombie category, though discussions and contests plow ahead. Most of this discourse is trained on damage control after secret massive-scale extraction has been institutionalized: data protection, minimization, portability, security, access, deletion, transparency, interoperability ...

The giants’ rhetoric, honed over two decades of crises and polished to high art in a global network of legal services and public relations war rooms, further confuses the field with deliberate campaigns of disorientation and misdirection that are themselves core strategies of institutional reproduction (see for example, Wall Street Journal, 2021b). One illustration plucked from the welter occurred in April 2019, when Mr Zuckerberg dramatically unveiled the company’s new strategic direction to his annual developer conference: “The future is private,” he solemnly proclaimed (Statt, 2019, para. 3). Less than two months later, Facebook’s attorneys were in a California courtroom, repelling a class action suit concerning privacy violations brought by its own users. The lawyers argued that on Facebook, “the social act of broadcasting your personal information ... negates, as a matter of law, any reasonable expectations of privacy” (US District Court—San Francisco, 2019, p. 8).

The Social Harm Vector (2): The Rise of Epistemic Chaos

Information civilization earns its name from new conditions of existence that require persons, and increasingly all that is animate and inanimate, to be rendered as and mediated by digital information. Inclusion in the coordinates of the world’s atlas means that one must be as, be in, and move through this computer-mediated realm (Flyverbom, 2022). Under these conditions, information stewardship is critical to the democratic project, beginning with the necessity of information integrity: its quality, fidelity, wholeness, and intactness free from inorganic, exogenous, willful, or secret corruption. In an information civilization, systemic threats to information integrity are systemic threats to society and to life itself.

Thanks to the democratic void, the question of information stewardship was never genuinely engaged. The early conflation of internet commerce and freedom of expression enabled the giants to defend their practices with a twisted notion of “free speech fundamentalism” that equated any discussion of stewardship with censorship and a denial of speech rights (Pasquale, 2017a). That the democratic order has failed to confront this condition is demonstrated in countless ways around the world, as corrupt information dominates social communication and news engagement, producing mayhem across every human domain. In the United States, Allcott and Gentzkow defined “fake news” as “distorted signals uncorrelated with the truth” that impose “private and social costs by making it more difficult ... to infer the true state of the world.” They found that in the lead-up to the 2016 US presidential election there were 760 million instances of a user reading such intentionally distorted signals online, or about three corrupt stories for each adult American (Allcott & Gentzkow, 2017). Research conducted by the German Marshall Fund concluded that in the last quarter of 2020, before and after another American presidential election, Facebook posts linked to deceptive sites received 1.2 billion interactions–nearly one-fourth of all posts that included links to the 200,000 US-based sites in the study. During the same period, the sharing of corrupt content on Twitter reached a new high, with 47 million “shares” by verified accounts, nearly a third of the 155 million “shares” linked to US sites (Goldstein, 2021). By 2022, 69% of Americans regarded disinformation as a more critical social problem than “Infectious disease outbreaks,” “Gun violence,” “Quality of education,” “Illegal drug use or abuse” or “International terrorism” (McCorkindale & Henry, 2022, p. 7).

Disinformation on Facebook is known to have distorted elections, produced violence, and degraded social discourse in nearly every world region (Akinwotu, 2021; Del Vicario et al., 2016; Hao, 2021a, 2021b; Mozur & Scott, 2016; Silverman et al., 2022; Wong, 2021). In 2020, 81 countries suffered the
effects of “cyber troop activity,” defined as “government or political party actors” engaged in the systematic manipulation of public opinion online (Bradshaw et al., 2021). In the second decade, 2010–2020, as Facebook became a fully operational surveillance capitalist medium of global social communications, the share of the world’s population living in autocracies increased from 48% to 68%. A key mechanism in this transformation has been disinformation campaigns spread through social media, principally Facebook (Alizada et al., 2021; Reed, 2019, 2022).

In a May 2022 CNN interview, the Biden-appointed Commissioner of the US Food and Drug Administration, Dr Robert Califf, discussed his finding that “misinformation” had become the leading cause of death in the United States, with a “disturbing” effect on Americans’ life expectancy (Califf, 2022; Doctor Radio NYU, 2022). How is a failure of information integrity on this scale even possible in an affluent, connected, information-rich society?

Dr Califf’s tragic observation rests on Facebook’s shocking, even murderous, record as the COVID-19 pandemic collided with the world’s most populous social media platform to produce an endless wave of corrupt health information (Allington et al., 2021; Frenkel et al., 2020; ISD, 2020; Kouzy et al., 2020; Simon et al., 2020; World Health Organization, 2020). In August 2020 Avaaz documented the failure of Facebook’s weak mitigation efforts as COVID disinformation websites attracted billions of views, far in excess of the leading public health sites (Avaaz, 2020). By October researchers determined that as many as 60% of the then 217,000 COVID deaths in the United States were unnecessary; the primary causes of these deaths originated in corrupt information (Redlener et al., 2020). As vaccines became available, the focus of information corruption shifted to anti-vaccine campaigns, obstructing the most vital public health solutions to the COVID-19 pandemic (Germani & Biller-Andorno, 2020; Loomba et al., 2021; Perri et al. v. Robinhood Markets, Inc. et al., 2021; Wilson & Wiysonge, 2020).

There are many sources of disfigured information, and Facebook does not author this corruption. In the examples reviewed here and countless others, Facebook’s machine systems simply worked as engineered, continuously reproducing operations that advance economic imperatives. How do the imperatives of surveillance capitalism create the conditions for corrupt information to flourish, yielding epistemic chaos within and across societies? My examination of this question focuses primarily on Facebook, the world’s largest social network, but the answer actually begins with a 74-year-old treatise.

Claude Shannon’s 1948 publication “A Mathematical Theory of Communication” blazed across established intellectual boundaries toward the new frontier of machine-to-machine communications and the information science it would spawn. “The fundamental problem of communication,” Shannon wrote, “is that of reproducing at one point either exactly or approximately a message selected at another point.” The exclusive focus was the perfect reproduction of the signal. Shannon wrote,

Frequently the messages have meaning; that is, they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen, since this is unknown at the time of design (Shannon & Weaver, 1963, pp. 31–32; emphasis mine).

The engineering solution to “the engineering problem” was this blindness by design. Systems were optimized for fidelity to the signal while structurally indifferent to questions of the signal’s fidelity to its subject. Engineered blindness equated to formal disinterest in the content of messages. The machines convey signals according to a priori instructions; they do not decipher or evaluate meaning.
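Shannon's formal notion of information can be made concrete with his entropy measure, H = −Σ p log₂ p, which is computed from symbol frequencies alone and never consults meaning. The minimal Python sketch below is my own illustration (the example strings are invented, not drawn from the sources cited here): a meaningful sentence and a meaningless scramble of the same characters carry identical "information" in Shannon's sense.

```python
from collections import Counter
from math import log2

def entropy_bits(message: str) -> float:
    """Shannon entropy H = -sum(p * log2(p)) over the message's symbol frequencies."""
    n = len(message)
    return -sum((k / n) * log2(k / n) for k in Counter(message).values())

meaningful = "the cat sat on the mat"
nonsense = "eht tac tas no eht tam"  # the same characters, scrambled word by word

# Identical symbol statistics yield identical "information,"
# even though one message means something and the other does not.
print(abs(entropy_bits(meaningful) - entropy_bits(nonsense)) < 1e-9)  # → True
```

Because the measure sees only symbol statistics, any system optimized against it, or against its operational cousins of volume and velocity, is by construction indifferent to whether its signals are true.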
With meaning out of the way, Shannon’s systems were free to focus on the communication challenges faced by the Allies in the Second World War and their need for high-volume, high-velocity, precise machine-to-machine communications for signals processing, codebreaking, and encryption. These demands born in emergency later became fixtures of the Cold War milieu. The application domains were primarily mathematical content for machine-to-machine communications in high-level military research and complex engineering projects, as well as telephony, TV, and radio. In these contexts, it made sense to narrow the engineering focus to the accuracy, scale, and speed of transmission. A wholly new automated communications domain was thus constructed to serve the efficacy of transmission, while indifferent to the substance of what is transmitted.

Shannon’s biographers described this “bomb” of an insight and observed, “If some humans can achieve indifference to meaning only with great, practically ascetic effort, our machines are wired for this indifference: they have it effortlessly” (Soni & Goodman, 2018, p. 135). This essential point, overlooked almost entirely in contemporary discussion, was articulated by mathematician and computer science pioneer Warren Weaver in his preface to the 1963 edition of Shannon’s text:

The word “information,” in this theory, is used in a special sense that must not be confused with its ordinary usage. In particular, information must not be confused with meaning. In fact, two messages, one of which is heavily loaded with meaning and the other of which is pure nonsense, can be exactly equivalent, from the present viewpoint, as regards information. (Shannon & Weaver, 1963, p. 8; emphasis mine)

Shannon feared that existing appropriations of his work, and many more still to come, would ignore its highly restrictive definitions of “communication” (Grafton et al., 2021, p. 246). Despite his misgivings, the engineering solution to high volume and velocity of digital communications is considered the source of “all the advanced signal processing that enables us to send high-speed data” (Soni & Goodman, 2018, p. 275). Surveillance capitalism’s information project is the unanticipated heir to Shannon’s breakthroughs, and Facebook is the Pandora’s Box that finally justifies Shannon’s fears.

Facebook is a signals-processing behemoth based on the massive-scale extraction and bulk commodity processing of human-generated data. However, the signals that Facebook processes differ from Shannon’s in that they are human-generated. In these systems, blindness by design is no longer a feature restricted to machine-to-machine communications. Now it is applied to social communications in the interests of high-speed massive-scale human signals processing. This mash-up produces systems for social communications in which the “semantic aspects of communication are irrelevant to the engineering problem.” Such systems can neither decipher the meaning of what is transmitted nor assess its fidelity to “physical or conceptual entities.” The result is automated human-to-human communications systems that are blind by design to all questions of meaning and truth.

From the ancestral disciplines of oral witness to the traumatic shift from the spoken to the written word, each turn in the material history of information and communication further abstracted “reality,” imposing a problematic distance between the living subject and the knowable world. Every fundamental advance in symbolic media–the alphabet, mathematical notation, printed text–produced social and psychological upheaval. These conditions eventually required societies to bridge the gap between symbol and “reality” with explicit standards of information integrity and new institutions to govern those standards (Clammer, 1976; Clanchy, 1979; Goody, 1986; Ong, 1982; Stock, 1983).

Now the computer mediation of vast swathes of human existence introduces a wholly new threat paradigm to the reality problem. The extreme intensification of abstraction transforms society, social relations and social communications into symbolic objects without recourse to sentient channels of lived experience. The “death of distance,” once celebrated,
imposes a new quality of distance expressed in the untraversable fissure between the sentient subject and “reality,” “truth,” “fact,” “meaning,” and related measures of information integrity. Fissure becomes chasm, as blind-by-design systems saturate society—systems that are not only antisocial but aggressively, eerily, asocial.

Executives have occasionally spoken frankly about this institutionalized indifference to the meaning of information. In Facebook’s case, a leaked document from executive Andrew Bosworth describes the perfect complementarity between Shannon’s engineering solution and Facebook’s growth imperatives as they converge on blindness by design:

We connect people. That can be good if they make it positive. Maybe someone finds love ... That can be bad if they make it negative ... Maybe someone dies in a terrorist attack ... The ugly truth is that ... anything that allows us to connect more people more often is *de facto* good ... The best products don’t win. The ones everyone uses win ... make no mistake, growth tactics are how we got here. (Mac et al., 2018, sec. 2)

Shannon declared the irrelevance of meaning to the engineering problem. Surveillance capitalism declares the irrelevance of meaning to the economic problem. Economies of scale in behavioral signals processing require that all data are treated as equivalent though they are not equal, just as Weaver described. Facts or truth cannot have formal standing because such criteria restrict data flows. Human-generated data are and must be treated as a bulk commodity and driven through machine systems that function as supply chains, computational factories and prediction markets.

I have called this institutionalized relationship to human-generated data “radical indifference” (Zuboff, 2019, pp. 376–377). It means that neither the machines nor their owners care if your messages are fact or fiction, malicious or angelic, fashioned to produce violence or joy. They have no interest in curing your disease or what you do, or say, or buy, or eat, or think, or whom you love, or why you grieve. They simply must insist that these and every other facet of your existence is lived in ways that allow their machines to extract the predictive signals that reduce others’ uncertainty about what you will do next and thus contribute to others’ profit.

The aim is always to widen and accelerate the inextricable cycle of engagement > extraction > prediction > revenue (henceforth, I refer to this cycle as ‘EEPR’). A consequential example was Facebook’s 2018 decision to launder the kinds of sensationalized and defactualized “news” stories known to boost EEPR through the standardized presentation of its News Feed. “All news stories looked roughly the same as each other ... whether they were investigations in the Washington Post, gossip in the New York Post, or flat-out lies in the ‘Denver Guardian,’ an ‘entirely bogus newspaper’” (Thompson, 2018, para. 22). Similarly, Facebook and Google “bankroll” clickbait farms (Hao, 2021b) as just another means of driving EEPR. The institution wants you perpetually engaged but does not, cannot, and must not–for the sake of its own economics–care what engages you. In this way blindness by design is a crucial systems principle and an essential mechanism of institutional reproduction.

These observations imply two conclusions: one cause and the other consequence. First, economics: radical indifference is an economic imperative because corrupt information is good for business. Recent research demonstrates that Facebook amplifies misinformation because it drives EEPR. Median engagement with misinformation is higher than engagement with reliable information regardless of its political orientation. But because the bulk of misinformation is produced by publishers on the far right, its superior economic value means that right-wing misinformation is consistently privileged with every EEPR optimizing algorithmic amenity (Edelson et al., 2021). Freed from the burdens of meaning and the responsibilities of witness that such burdens entail, corrupt information drives EEPR, which proceeds with the discipline of the cyclops voraciously consuming all it can see and blind by design to meaning or truth.
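The economic logic of radical indifference can be sketched in a few lines of code. The ranker below is a hypothetical toy of my own construction, not Facebook's actual system: its objective consults only a predicted-engagement score, so a truth attribute, though present in the data, has no formal standing and cannot influence distribution.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_engagement: float  # the only signal the objective consults
    is_true: bool                # carried in the data, never read by the ranker

def rank_feed(posts: list[Post]) -> list[Post]:
    # Blindness by design: the sort key is engagement alone.
    # Truth cannot restrict the data flow because it is not in the key.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

feed = [
    Post("careful investigative report", 0.31, True),
    Post("sensational fabrication", 0.87, False),
    Post("routine factual update", 0.12, True),
]

print(rank_feed(feed)[0].text)  # → sensational fabrication
```

If false content happens to correlate with higher engagement, as Edelson et al. found, the correlation passes straight through to distribution; nothing in the objective can filter it.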
Second, social consequence: in the 20th century, the engineering imperative of blindness by design yielded an historic breakthrough in the volume and velocity of machine-to-machine signals transmission. In the 21st century, the engineering imperative of blindness by design yields catastrophic harms as all bits and bytes are welcomed into the global information bloodstream for immediate human-to-human transmission. The lesson: When social information is transmitted as a bulk commodity through blind systems optimized for volume and velocity, the result is epistemic chaos characterized by widespread uncontrollable information corruption.

Confirmation of these propositions is provided by yet another leaked internal document, this time composed in 2021 by Facebook privacy engineers on the Ad and Business Product Team. The group’s functional responsibility is described as “the center” of Facebook’s “monetization strategy and is the engine that powers Facebook’s growth” (Franceschi-Bicchierai, 2022, para. 4). The document chronicles the growing anxiety of key employees who know that the engineering of Facebook’s systems is incompatible with emerging and anticipated democratic contradiction. “We face a tsunami of inbound regulations that all carry massive uncertainty,” they write (Facebook Ad and Business Product Team, 2021).

The engineers expect regulatory action, beginning in the EU, that will restrict the use of first-party data and ultimately require user consent for any deployment of personal information in the advertising process. They anticipate that increased democratic governance will trigger a doomsday scenario in which Facebook is categorically unable to comply at the required scale. They write:

We do not have an adequate level of control and explainability over how our systems use data, and thus we can’t confidently make controlled policy changes or external commitments such as ‘we will not use X data for Y purpose.’ And yet, this is exactly what regulators expect us to do, increasing our risk of mistakes and misrepresentation. (Facebook Ad and Business Product Team, 2021)

The engineers describe their “open” systems in contrast to the “closed” systems that they regard as necessary for legal compliance. The contrast reflects the limitations of blindness by design when applied to social information and communications. “For more than a decade, openness and empowering individual contributors has been part of our culture. We’ve built systems with open borders.” This “openness” reflects back to Shannon’s original privileging of the signal irrespective of its meaning and arcs forward to the economic imperative of massive-scale human-generated data. Indeed, as justification for its open systems, the engineers cite the volume of data required to produce a single inferential ad feature: “There are 15K features used in ads models.” Approximately six thousand data tables are required to produce a single feature such as “user_home_city_moved.” The engineers compare their data flows to a bottle of ink poured into a lake. “How do you put that ink back in the bottle? How do you organize it again, such that it only flows to the allowed places in the lake?” (Franceschi-Bicchierai, 2022). Their answer? You can’t.

The engineers admit that their systems are fundamentally mismatched with lawmakers’ and the public’s expectations. Only “closed” systems would allow Facebook to “enumerate” the data it has, “where it is; where it goes; how it’s used.” Without those capabilities the company simply cannot “make commitments about it to the outside world ... We fundamentally lack closed-form properties in Facebook systems” (Facebook Ad and Business Product Team, 2021).

No matter how often and loudly the public and its lawmakers are stirred to outrage insisting that Facebook must take responsibility (Donovan, 2020; see also, Atlantic Council, 2021; Browning & Mac, 2022; Frenkel, 2021; Harwell & Oremus, 2022), the engineers say that the company simply cannot comply. Worse still, the company cannot admit that it cannot. Instead, executives and staff are forced to duck and weave, misdirect, lie, and buy time. This helps to explain why Facebook continually says
that it is acting, and its actions continually fall short or fail entirely. For example, Facebook announced it would downrank groups that repeatedly share misinformation in order to reduce their total engagement. A team of French researchers investigated the intervention's effects. They concluded that while specific posts saw a reduction of engagement, behavior and quality of content were not altered. Ultimately, "the total engagement generated by repeat offender groups" did not decrease over time (Vincent et al., 2022, sec. 2, para. 3). The pattern is consistent wherever you look: Facebook knew that labeling Donald Trump's false claims did little to limit engagement (Silverman & Mac, 2020). It failed to curb murderous activities of a Mexican drug cartel or Middle Eastern human traffickers (Scheck et al., 2021). It promised to end polarizing recommendations by political groups, but the numbers grew instead (Faife & Ng, 2021; Feathers, 2021; Waller & Lecher, 2022). It established a task force to police violent political content, then throttled back on enforcement (Silverman et al., 2022). Facebook promised, then failed, to curb hate speech in Kenya and election misinformation in Brazil (Global Witness, 2022a, 2022b). Facebook, Google and Twitter each pledged to "crack down" on anti-vax disinformation but failed to do so (CCDH & Anti-Vax Watch, 2021).

Only Facebook's privacy engineers, in the assumed confidentiality of their internal communications, admit that blindness by design is incompatible with any reasonable construction of responsible stewardship of global social communications.

The engineers propose short-term solutions that might at least create the appearance of trying to comply with expected regulations. Possibilities include "curated data sources" and "manual" visits to "tens-of-thousands of call sites and code paths" (Facebook Ad and Business Product Team, 2021, p. 10). But action is weak, and institutionalization is strong. These engineers and their bosses know that there can never be enough content moderators, fact checkers, labelers, or any other exogenous active measures to make a dent in the blind-by-design global automaton that is Facebook. In the meantime, the privacy engineers are bystanders to the endless con of information-integrity theatre, its fantasy remedies and excuses. They appear gripped by dread as they reckon with the widening gulf between reality and fiction, anticipating the day when the breakwater of lies is breached and the consequences flood their company and their careers.

The harsh truth for law and policy makers is that epistemic chaos cannot be "regulated" out of existence. When the engineers consider genuine solutions, they acknowledge the need for a discontinuous leap toward redefinition of the "fundamental problem of communication," as it is now entangled with the fundamental problem of surveillance capitalism. "Data Infra[structure] will need to have some semantic awareness of its data assets," they acknowledge. "Building this will take multiple years and will require data infrastructure (DI) investment" (Facebook Ad and Business Product Team, 2021, pp. 7–8). Their statement is an admission that information civilization has thus far been built on a novel genus of social information that blindly blends fact and fiction, truth and lies in the service of private economic imperatives that repurpose social information as an undifferentiated bulk commodity. Blindness by design is in charge, though it violates the most basic sociological principles of communication and common sense. The consequences of this sociological rupture are explored more deeply in the discussion of stage four, below.

Mr Zuckerberg, his executives and allies defend blindness by design as the protection of free speech, another rhetorical sleight of hand that aims to distract lawmakers and the public from noticing the facts that alarm Facebook's own engineers (Scola, 2019). A comprehensive Civil Rights Audit of the company commissioned by Zuckerberg and Sandberg condemns this misdirection strategy and Zuckerberg's "selective view of free expression as Facebook's most cherished value ... even where that has meant allowing harmful and divisive rhetoric that amplifies hate speech and threatens civil rights" (L. W. Murphy & Cacace, 2020, p. 9).
Indeed, blindness by design advances the aims of every purveyor of corrupt information, a windfall for those who benefit from the global dystopian drift. Any entity motivated by social destruction–repressive governments, illiberal politicians and their networks of allies and vassals, fevered groups of radicalized extremists, oligarchs, autocrats, outlaws–can exploit the affordances of blindness by design to inject defactualized content straight into the global information bloodstream without constraint. This is a social experiment without historical precedent nihilistically imposed upon "subjects" who can neither consent nor escape because the experimenters and their manipulations are both hidden and ubiquitous (Alizada et al., 2021; Allam, 2020; Bradshaw et al., 2021; Cunningham-Cook, 2021; Frantz et al., 2020; D. Gilbert, 2021; Holt, 2021).

In the weeks leading up to the 2020 US presidential election, Mr Zuckerberg disbanded Facebook's Civic Team, which had been tasked with producing solutions to mitigate epistemic chaos. One of the team's most respected leaders, Samidh Chakrabarti, spoke out on the company's failure to stem the flood of corrupt inflammatory content across its pages. In a series of tweets quoted in the Wall Street Journal's "Outrage Algorithm" podcast, Chakrabarti took aim at the destructive collision between blindness by design and social communications on Facebook's pages. According to the Journal, Chakrabarti observed that "treating all engagement equally, irrespective of content, will 'invariably amplify mis-info, sensationalism, hate and other societal harms. I wish this weren't the case, but it is so predictable that it is perhaps a natural law of social networks ... The challenge is almost a philosophical one that Facebook can't solve alone'" (Wall Street Journal, 2021a). Chakrabarti is correct. Only the democratic order can solve this one.

Three lessons from the unified field perspective can be drawn thus far. First, nothing here is inevitable. Surveillance capitalism as an institutional order was founded on the intentions of human actors in specific times and places. What is done by people can be undone by people.

Second, the social harms of privacy destruction and epistemic chaos are effects of foundational economic causes set into motion by the commodification of human behavior and the resulting imperative of massive-scale human data extraction and processing. The examination of successive developmental stages will underscore this point. The further downstream one goes, the more critical it becomes to recognize that the solutions are upstream, where the causes of harm originate.

Third, as Facebook's own engineers make clear, there is no hope for post factum solutions when the foundational mechanisms are misaligned to their task. Mitigation activities arising from outside the institution will not unseat causal operations. The purposes of democratic contradiction, as the engineers understand, will not be achieved by regulating a system that is inherently incapable of adapting to regulators' demands. A theme emerges that will be revisited several times as the unified field analysis unfolds: No amount of regulation can turn this spider into a swan, just as one does not regulate the hours a child may work in a factory. Five hours a day for a five-year-old. Ten hours a day for a ten-year-old. Regulatory initiatives caught in this vacuum waste precious time, breeding frustration and cynicism. Genuine solutions will depend upon abolition and reinvention, not post-factum regulation.

Stage Two: The Concentration of Computational Knowledge Production and Consumption (Economies of Learning)

The Economic Operations

With stage-one foundations of massive-scale extraction and datafication well established, the accelerating thrust of institutional development propelled fresh dangers to a new frontier. Stage two produces extreme concentrations of knowledge production and learning capabilities that build on earlier concentrations of human-generated data. These institutionalize competitive advantage
in ways that are self-reproducing and, in the absence of system-level contradiction from the democratic order, impregnable.

The value of the massive-scale human-generated data obtained in stage one cannot be realized without computational systems of knowledge production known as machine learning and artificial intelligence. These systems, in turn, cannot perform without an ever-expanding diet of massive-scale human-generated data. The next step in the struggle for dominance over the politics of knowledge unfolds in stage two as data are transformed into information and knowledge with these proprietary computational 'means of production', along with the authority to determine what is produced, and who consumes it.

Information scholar Martin Hilbert observed in 2012 that the world's volume of digitally stored information exceeded the global capacity to learn from that information. "The only option we have left to make sense of all the data," he counseled, "is to fight fire with fire," using "artificially intelligent computers" to "sift through the vast amounts of information ... Facebook, Amazon, and Google have promised to ... create value out of vast amounts of data through intelligent computational analysis" (Hilbert, 2012, sec. 2, para. 1). Hilbert's forecast would be realized, but with a twist: The surveillance capitalists create value, but the value is for them.

A comprehensive 2021 investigation of the global market structure of artificial intelligence by Jacobides, Brusoni and Candelon (2021) emphasizes this extreme market asymmetry, noting "the remarkable and growing concentration in AI" by a handful of Big Tech companies and the resulting "economies of learning" in which "scale begets learning through the accumulation of data and increases competitive advantage ... We find that AI is adopted by and benefits the small percentage of firms that can both digitize and access high-quality data" (p. 418).

These extraordinary knowledge rewards reflect the path dependency of institutional development and all that was done in the name of data: invasion, theft, secrecy, annexation of rights, destruction of privacy, and epistemic chaos. Jacobides et al. (2021) observe that the cumulative advantages of data aggregation (stage one) create substantial barriers to entry for AI production and consumption (stage two). This dynamic is intensified in the case of computational knowledge aimed at behavioral prediction, because larger datasets increase predictive accuracy. Competitive advantage is thus linked to the giants' massive-scale data supplies, "an extraordinarily rich set of information on their customers" (Jacobides et al., 2021, pp. 421, 418; see also, Agbehadji et al., 2020; Aiello et al., 2020; Gao et al., 2019; Hinds & Joinson, 2019; Iqbal et al., 2022; Ma & Sun, 2020).

Massive-scale volume and varieties of data also facilitate the lateral spread of predictive computation as surveillance capitalism reorders industries far from Silicon Valley. Healthcare offers an example. Each of the surveillance giants pursues grand strategies for the extraction of health data, which are coveted for their lucrative predictive power. The approaches vary, but each begins with data culled from their already established systems. Google announced its intentions to "harness the billion health-related questions people ask it every day," queries which amount to 7% of daily searches, as the foundation for its efforts "to provide better healthcare" (M. Murphy, 2019, paras. 1–2). Speaking to a gathering of health industry IT professionals, Alphabet/Google's Eric Schmidt conveyed the corporation's vision to render healthcare as the kind of computational prediction problem that only a company like Google can solve (Schmidt, 2018, 7:46–7:52, 10:17–10:59). Schmidt exhorts his audience:

The really powerful stuff right at the edge of what I do is prediction ... Can you imagine when we have the combination of sensor data plus continuous behavioral data, which you're gonna get from your smartphone and the various smartwatches that are coming, plus all the molecular data? This data explosion is profound ... Healthcare is becoming essentially an information science.
Amazon owns the limitless health-related queries, purchases, and sales of 310 million consumers and 5 million sellers. Its One Medical acquisition provides access to the 8000 companies that offer One Medical access as a health benefit, enabling the integration of medical data with apps and ads for health products and services (Amazon, 2022; Kaziukėnas, 2021; Pifer, 2022; Quaker, 2022). Apple's Chief Operating Officer Jeff Williams, who led the engineering work on the Apple Watch, notes, "We have tens of millions of watches on people's wrists, and we have hundreds of millions of phones in people's pockets" (Fitzpatrick, 2018). Facebook and Microsoft each follow distinct paths to a similar objective (Feathers et al., 2022; Guthrie & Benjamin, 2022; McGuinness, 2022).

In 2010, former NSA director Mike McConnell publicly acknowledged that "more than 90 percent of the physical infrastructure of the web" was owned by the private internet companies (McConnell, 2010). Since then, surveillance capital has doubled down on building the vast capabilities and infrastructures required to transmit, store, and compute massive-scale data. The giants dominate research and application in frontier machine intelligence and advanced microchips (Lohr, 2019; Rosenbush, 2022). They build the largest computer networks, data centers, populations of servers, and undersea transmission cables (J. M. Lima, 2017; C. Metz, 2017a). Infrastructure dominance began in North America and Europe and now relentlessly proceeds to swallow the global south (Ahmad & Salvadori, 2020; Birhane, 2020; Stowell & Ramos, 2019). Google, Amazon and Microsoft became "hyperscalers," creating cloud computing platforms first to run their own analyses and later as an important line of business. The cloud platforms broaden the corporations' AI ecosystems and extend control over AI development with customers for and contributors to their cloud solutions (Jacobides et al., 2021, pp. 415, 417, 418). In 2022, Facebook, already a leader in real-time data processing and machine learning (Chen et al., 2016; Hazelwood et al., 2018), completed work on what it describes as the world's most powerful artificial intelligence supercomputer (Bhattacharyya, 2022).

In the second decade, surveillance capitalists further leveraged their capital and institutionalized their knowledge advantage with another self-reproducing mechanism: acquisition. They acquire and hoard scarce human, technological, and scientific resources required for knowledge production (Peterson, 2020). This reproduction strategy ignited an arms race for the 10,000 or so specialists on the planet who know how to coax knowledge from surveillance capital's vast data continents (C. Metz, 2017b). The giants purchased the most promising artificial intelligence companies (Richter, 2020), decisively deepening and institutionalizing their knowledge dominance capabilities (Bass & Brustein, 2020). The 2022 Stanford Artificial Intelligence Index reports that state-of-the-art AI results depend upon "extra training data," a trend that "favors private sector actors with access to vast datasets." In 2021, private investment in AI reached US$93.5bn, more than double the private investment levels of 2020. Concentration increased as more money chased fewer companies, with 746 newly funded AI companies in 2021 compared to 1051 in 2019 (Zhang et al., 2022).

The giants also exerted control over labor markets in critical knowledge production expertise including data science and animal behavior research (McBride & Vance, 2019; Murgia, 2019a), poaching the top scientific talent and elbowing out would-be competitors as well as startups, universities, governmental bodies and less wealthy countries. Until 2004, the year of the surveillance dividend, no AI scholar had yet traded a university post for a corporate machine-learning lab (Jacobides et al., 2021, p. 420). Between 2004 and 2018, and especially after 2010, more than 211 AI professors accepted full-time or part-time industry positions, lured by high salaries and computing resources (Gofman, 2021). By 2016, 57% of American computer science PhD graduates took jobs in industry, while only 11% became tenure-track faculty (National Academies of Sciences, Engineering, Medicine, 2018).
The surveillance empires own most of the scientists and the science. Their data scientists publish most of the research papers in AI, "an extraordinary situation compared with any other field of science" (Jacobides et al., 2021, p. 420). With so few teaching faculty, colleges and universities have had to ration computer science enrollments, which has significantly disrupted the knowledge transfer between generations (Gofman & Jin, 2022). In the UK, university administrators contemplate a "missing generation" of data scientists (Sample, 2017). A Canadian scientist laments, "The power, the expertise, the data are all concentrated in the hands of a few companies" (Murgia, 2019a, para. 11).

The problem extends to even more fundamental questions of what kind of knowledge is produced in the first instance. The giants dominate AI funding both within their ecosystems and within the academy (Jacobides et al., 2021, p. 412). The needs of their commercial operations, such as search or social media, shape the global research agenda. One prominent academic researcher observes, "Where academia might explore how social media changes how people think, a corporate social network might ask: If people post sad things, do they use our product for longer?" (Waddell, 2018, para. 12).

The successful assertion of property rights over illicit data in stage one advances in stage two with a declaration of property rights to the knowledge produced from those data. Property rights converge with epistemic rights to obscure the illegitimacy of this fresh declaration that assigns to the giants and their ecosystems the unprecedented knowledge that originates in secret extraction from the lives of unsuspecting and undefended populations.

Finally, once knowledge production is narrowed to the progress of institutional interests, that knowledge itself is concealed. Observing this blackout, Jacobides et al. (2021) note of the giants, "Secrecy, rather than patenting, remains the preferred strategy to protect their research findings" (p. 420). Under these conditions, private surveillance capital's means of production and production of meaning can only be inferred from the products and services on offer.

This use of concealment as a reproductive routine also applies more generally (Pasquale, 2017a). For example, Facebook has come under broad criticism for withholding access to data sets from social media researchers who investigate algorithmic operations and their social effects, a problem that has intensified since government inquiries into Cambridge Analytica, including its role in the UK Brexit referendum and the 2016 US presidential elections (Benesch, 2021; Bruns, 2018, 2019; Ghaffary, 2021; Gibney, 2019; Puschmann, 2019; Tromble, 2021). The company imposes onerous limitations on data use and terms of research publication (Bobrowsky, 2021; J. Horwitz, 2020; Murgia et al., 2021). Legal scholar Michael Karanicolas argues for the codification of public rights to "platform information" modeled after freedom of information laws (Karanicolas, 2021). The need for such rights reflects the conditions of institutional power shaped by concentrations of epistemic rights, privacy, data, knowledge production, and knowledge consumption in combination with the absence of democratic contradiction.

The Governance Vector: Epistemic Authority

The privatization of data about people and society in stage one founds the privatization of the means of knowledge production and consumption in stage two. Illicit concentrations of data in stage one create the conditions for stage two's illicit concentrations of knowledge. What the economist sees as oligopolistic market dominance is revealed from the sociological angle as a knowledge oligarchy in which private governance privileges over the division of learning in society are institutionalized.

Oligopolistic ownership and operational control over the means of knowledge production convey the oligarchic governance prerogatives over knowledge that define epistemic authority. First, the giants govern decisions over what may become knowledge in the first
instance. As we have seen, knowledge that advances their commercial interests eliminates or narrows the pursuit of other forms of knowledge, imposing excessive opportunity costs on global society.

Second, the giants leverage the conditions of ownership of scarce AI resources and the ability to conceal what they know in order to govern knowledge distribution. Their action to control distribution, per Searle, further declares their authority over that distribution. What may be known? Who knows? Who decides who knows? In taking command of the answers, the giants exercise the epistemic authority to command the division of learning in society.

Epistemic authority underpins the giants' confidence in their abilities to set terms of engagement with the democratic order. For example, surveillance capitalists insist that AI solutions are vital to national security and economic growth, while it is they who will most benefit from public funding—a paradox for democratic governments. In some cases, this is true both of corporations and of their executives and allies who participate as individual investors. For example, Schmidt, Google's former CEO and chairman, is a leading champion of AI in national security debates, as well as an investor in these technologies (Tech Transparency Project, 2022). In 2018, the EU pledged US$24bn over three years to spur European innovation in AI (Fioretti, 2018). Without fundamental institutional change in surveillance capitalism's control of the global AI market structure, however, such outcomes remain difficult to achieve and public funding is most likely to strengthen surveillance capitalism's dominance (Waters, 2022).

This paradox has played out with unique clarity in the United States, where democratic contradiction has been weakest. The state of play between the two institutional orders as they converge over AI is illustrated in the work of the US National Security Commission on Artificial Intelligence (NSCAI), established by the US Congress in 2018 and tasked to recommend an effective path to the militarization of AI as essential to the modernization of the United States' 21st-century war-fighting capabilities.

The NSCAI's Interim Report, published in November 2019, is particularly revealing. The report opens on lofty principles, pledging "AI systems ... consistent with, and in service of, core values Americans hold dear and the rights enshrined in our founding documents" (Schmidt et al., 2019, p. 14). However, the commission's meetings, materials, and the Interim Report itself were held in secrecy. Reports were eventually made public, but only after a contested FOIA request followed by a court order.

Equally incongruous was the commission's membership. Despite the stated democratic aspirations, the NSCAI was chaired by Alphabet/Google's Eric Schmidt. The vice-chair position went to Robert Work, a former Deputy Secretary of Defense under President Obama. Mr Work is known for his "Third Offset Strategy," a doctrine that articulates an imperative for 21st-century US military supremacy in AI as necessary to maintain historical "overmatch" with military rivals, especially China (Work, 2015; Korb & Evans, 2017). Of the 13 remaining members, 11 were tech executives, including top leaders of the three hyperscale giants, Google, Amazon and Microsoft. The rest had careers in national security or computer science. There was not a single legal scholar, privacy expert, social scientist, political scientist, or historian. There were no leaders representing civil society or labor, no civil rights, human rights, privacy, or civil liberties advocates. There was not a single constitutional expert or democracy activist. There was minimal racial, gender, or age diversity.

The report appears oriented toward renewal of another doctrine: surveillance exceptionalism. As the persuasive power of the old justifications shaped by the war on terror wore thin, Work's Third Offset Strategy helped the commission articulate a new national security case to compel doctrinal renewal. "Developments in AI cannot be separated from the emerging strategic competition with China and developments in the broader geopolitical landscape," the report declares. "We are concerned that America's role as the world's leading innovator is threatened ... that strategic competitors and
non-state actors will employ AI to threaten Americans, our allies, and our values."

With "national security" claimed to be at stake, the commission demands a reassertion of the Clinton-Gore style commitment of private-sector primacy complete with guarantees of absolute freedom of action for the giants: no law, no EU-style regulation, and no interference in corporate prerogatives. "American companies remain world leaders in AI research and some areas of application," the report says. "Our market-based economy and low regulation has created three-quarters of the world's top 100 AI startups". It argues that while public funding once produced vital R&D that drove private sector innovation, "the reversal of the Cold War paradigm" has put government into "perpetual catch-up mode," a junior partner in the quest to militarize artificial intelligence (Schmidt et al., 2019, pp. 45, 20).

The democratic failure at the turn of the digital century produced the conditions that now frame the state of play in this contest for authority over the politics of knowledge. The same companies that once depended on the democratic order's encouragement and protection now, as prefigured by Gus Hunt's 2013 remarks, hold a hand full of aces as the critical suppliers of human-generated data and its computation. This helps explain the report's audacious language and blunt assessment of the power dynamics that bend a now supplicant and compromised democracy to the surveillance capitalists' control of mission-critical resources. The commission warns that should a subordinate state attempt to rein in the giants, it will simply be forsaken by the talent and capital required to achieve its objectives:

The government depends on the commercial sector, while the AI industry, far from depending on government business, often sees government regulations and bureaucracies as hindrances to their business models and therefore an unworthy pursuit ... the government lacks wide expertise to envision the promise and implications of AI, translate vision into action, and develop the operating concepts for using AI ... gains from AI-enabled systems can only be realized through transformation of organization structures and business processes; the inherent rigidity of government in this respect poses a major obstacle. (Schmidt et al., 2019, p. 22)

The NSCAI had reason to feel confident in its strident language and bold imposition of terms of engagement. In February 2019, just six months after the NSCAI was formed and eight months before the Interim Report, White House Executive Order 13859, "The American AI Initiative," signaled the Trump administration's eagerness to align with Silicon Valley and "remove barriers to AI innovation" (The White House, 2019). The accompanying "Guidance for Regulation of Artificial Intelligence Applications," sounding a lot like Clinton-Gore, pledged "to reduce barriers to the development and adoption of AI technologies" by rolling back existing regulations, overriding state laws and avoiding new regulations (Vought, 2020, p. 1).

Viewed through the lens of the giants' accretion of governance powers, the aggressive language of the NSCAI report also provides a glimpse of their Achilles heel. Institutionalization is the opposite of action, and the report's threatening and muscular rhetoric is a form of action. As such, it reminds us that the economic and governance gains of the surveillance capitalist order depend upon freezing every democratic impulse toward contradiction. The long game of dominance over the politics of knowledge rests on a bet that fragile democracies and their civil society communities can be bullied and bribed into submission—forms of action that require continuous attention, expenditure and effort.

The Social Harm Vector: Epistemic Inequality

Economic and social power blur. Information oligopoly shades into information oligarchy as it produces a wholly new axis of social inequality, expressed in the growing gap between the many and the few now defined by the
difference between what I can know and what can be known about me. Knowledge is scraped from human lives but accrues to the improvement of the few, not the many. The dogma that Big Tech democratizes knowledge obscures this new source of injustice. Early-stage harms create the conditions for later-stage harms as the destruction of privacy in stage one gives way to extreme epistemic inequality in stage two.

Durkheim again provides an historical parallel. When the young scholar wrote The Division of Labor in Society, the title itself was controversial. The division of labor was understood as a means of achieving labor productivity through the specialization of tasks in the new manufacturing concerns of the late 19th century, but that was not what held Durkheim’s fascination. Instead, he trained his sights on social transformation far from the factory floor, observing that the division of labor as “specialization” was gaining influence in politics, administration, the judiciary, science and the arts. He grasped a more general phenomenon: the division of labor was becoming the central organizing principle not just of the economy but of society. “Whatever opinion one has about the division of labor,” he wrote, “everyone knows that it exists, and is more and more becoming one of the fundamental bases of the social order” (Durkheim, 1964, p. 41).

Economic imperatives predictably mandated the division of labor in production, but what was the purpose of the division of labor in society? This new principle of social order, Durkheim reasoned, was summoned by the breakdown of traditional communities and the sources of meaning that had “mechanically” bound people to the rules and rituals of culture, place, religion, clan and kin. How would society cohere without those ancient authorities? Durkheim’s answer was “the division of labor in society” as it united diverse members of a modern industrial society in a larger prospect of “organic solidarity.” Society’s need for coherent new sources of meaning and structure was the cause, and the effect was an ordering principle capable of enabling and sustaining a modern community. The reciprocities of the division of labor would breed interdependence and mutual respect, imbuing it not only with economic advantage but with moral force. “The most remarkable effect of the division of labor,” Durkheim explained, “is not that it increases the output of functions divided, but that it renders them solidary ... it passes far beyond purely economic interests for it consists in the establishment of a social and moral order sui generis... If we specialize it is not to produce more, but it is to enable us to live in new conditions of existence that have been made for us” (Durkheim, 1964, pp. 60–61, 275).

The division of learning follows the same migratory path from the economic to the social domain once traveled by the division of labor. Now it is the division of learning that “passes far beyond purely economic interests.” How does an information civilization find a path to social solidarity? The answer is a just division of learning in society as the critical new source of social solidarity. But there are complications.

Durkheim recognized that the birth of a new social order can take a dark turn, resulting in what he called a “pathological division of labor” in which organic solidarity yields to social distance, injustice and conflict. The source of such pathology, he argued, was the destructive effect of social inequalities in societies marked by extreme asymmetries of power that make “conflict itself impossible” by “refusing to admit the right of combat” (Durkheim, 1964, pp. 353–378).

The economic interests of the surveillance capitalist institutional order depend upon extreme asymmetries of knowledge production and consumption. In result, the emerging division of learning in society is already mired in the pathology of oligopoly. Privatization and concentration of the means of knowledge production do not summon a new organic solidarity. Nor do they remedy the conditions of inequality, exclusion, and risk that already plague many democratic societies, a pattern that more or less reflects the extent to which each capitulated to the Friedman consensus.
34 Organization Theory 

On the contrary, the division of learning now subsumes the division of labor, which becomes an expression of the new conditions of epistemic inequality. This dynamic is already visible in two key developments. First, as economist Daron Acemoglu’s analysis of AI-driven automation and wage inequality concludes, AI is used to achieve “excessive” cuts in labor costs and task deskilling. These effects are not linked to the technologies per se but rather to corporate choices that favor surveillance, control and labor substitution over task enrichment and worker initiative, with destructive consequences for social solidarity and democracy (Acemoglu, 2021, pp. 2–3; Acemoglu & Restrepo, 2021; Zuboff, 1988). Second, stage two knowledge asymmetries are reflected in the international division of learning: A new kind of AI serfdom employs impoverished “gig” workers across the global south who earn starvation wages “training” AI algorithms and “moderating” content (Hao, 2022; Hao & Hernández, 2022; Perigo, 2022; see also, Birhane, 2020).

In short, the pathological division of learning obstructs the development of global society’s capacity to shape and benefit from the knowledge production and consumption that defines information civilization. This diminishes the available intelligence to wrestle with existing threats while it also produces new threats. The individual and social sacrifices of stage one are not redeemed with service to the greater good in stage two, as Hilbert once imagined. Following the trail from commerce to society and economics to sociology, there is little good news.

Stage Three: Remote Behavioral Actuation (Economies of Action)

“The Tools”

Sometimes power arrives careening down the slopes that ring the valley, stallions thrashing at a gallop in a dusty whirlwind, men with bloody slaughter in their hearts riding hard to crush us at the gates. But in these days of abstraction and mediation, a new power moves lightly, unannounced, smiling like Alice’s Cheshire cat. This new power is an odorless invisible poison whose signature is this: the very moment that we become aware of peril is the moment that is too late.

New power in stage three stands on the shoulders of earlier accomplishments. In stage one, the massive-scale extraction of human data supported behavioral prediction, which in turn necessitated the volume and velocity of blind-by-design information systems, which then facilitated the dissemination of corrupt information resulting in epistemic chaos. In stage two, those unprecedented flows of trillions of data points each day converge with unprecedented computational capabilities to produce unprecedented concentrations of illegitimate though not illegal knowledge: epistemic inequality.

In stage three, the capabilities enabled by these conditions come to fruition in the transformation of illegitimate knowledge into illegitimate power. Surveillance capitalism’s global architectures are now sufficiently knowledgeable to shift into a higher gear from behavioral monitoring, datafication, and computation to remote behavioral actuation. Expansive and intimate knowledge about people is deployed to exploit, intensify, and weaponize epistemic chaos for targeted influence operations, including behavioral modification at scale: economies of action. Such capabilities rupture the organic integrity of individual and collective behavior, challenging human autonomy and asserting a new form of power over individual lives and societal dynamics. These developments are illustrated in a brief narrative of the Trump 2016 digital campaign’s successful effort to nullify the political power of Black citizens.

In September 2020, London’s Channel 4 News reporters revealed their investigation of leaked documents from the Trump 2016 presidential campaign featuring a dataset of more than 5000 files and nearly 5 terabytes. The data were amassed by the Trump campaign, largely through Facebook and the legal purchase of commercial datasets that could be linked back to Facebook profiles. The files also included Facebook data illicitly acquired by the political consultancy
Cambridge Analytica as well as data compiled by the Republican Party (Rabkin et al., 2020).

The campaign cache contained details on almost 200 million individual US voters along with the analyses, scoring and algorithmic models used to assess personality traits, political attitudes, behavioral dispositions, interests, concerns, finances, sexual orientations, vulnerabilities, and more. These were marshalled to microtarget and manipulate voter behavior in 16 states that the campaign considered essential to a Trump victory, especially the key swing states of Michigan, Wisconsin and Ohio. Campaign political advisor Steve Bannon remarked at the time, “I wouldn’t have come aboard ... if I hadn’t known they were building this massive Facebook and data engine” (Green & Issenberg, 2016, para. 20).

Brad Parscale, Trump’s digital director, had gambled his modest budget entirely on Facebook, where corporate staff members embedded within the campaign helped the Trump team dominate surveillance capitalism’s key operational mechanisms, referred to within Facebook as “the tools” (see generally, Kreiss & McGregor, 2018; Scola, 2017). With Facebook’s help, Parscale pursued an unconventional digital strategy. The campaign identified three groups least likely to support Trump—idealistic white liberals, young women, and African Americans—and labeled these as “audiences” for “deterrence.” “The tools” were deployed to persuade these citizens, especially Black citizens, not to vote. As one Trump official boasted, “We have three major voter suppression operations underway” (Green & Issenberg, 2016, para. 17).

Despite the campaign’s determination to conceal its activities, the Channel 4 investigation amassed evidence detailing the “deterrence” efforts aimed at Black voters. Among the three selected audiences, 54% were people of color, including 3.5 million Black citizens. The disproportionate selection of Black citizens for “deterrence” held within each state. For example, in Wisconsin Black voters accounted for 5.4% of the population but were 17% of the campaign’s “deterrence” audience (Rabkin et al., 2020).

“The tools” included nothing more than the standard range of algorithmic targeting mechanisms used daily to shape the behavior of Facebook users: subliminal cues, engineered social comparisons, psychological microtargeting, recommendations, real-time rewards and punishments, gamification, and more. Trump’s data scientists, including some from the consultancy Cambridge Analytica who had worked on the Brexit “Leave” campaign, collaborated with Facebook staff to master these “tools.”

Black citizens were targeted with messaging tailored to detailed individual profiles. One prominent strategy was designed to produce negative views of Hillary Clinton. For example, a proportion of Black voters were bombarded with doctored videos showing Hillary Clinton describing Black youths as “super predators.” According to the Channel 4 investigation, nearly six million distinct versions of these and similar ads were injected directly into the feeds of target citizens. Most were delivered as “dark posts”—nonpublic messages whose viewership was tightly controlled by the campaign, right down to the individual voter. As Parscale described it, “Only the people we want to see it, see it” (Green & Issenberg, 2016, para. 18). All the messages advanced the campaign’s dominant objective: convince Black citizens that the most effective expression of Black protest was simply not to vote.

Subsequent analyses suggest that unprecedented knowledge produced unprecedented power. According to Pew Research, “The Black voter turnout rate declined for the first time in 20 years in a presidential election, falling to 59.6% in 2016 after reaching a record-high 66.6% in 2012. The 7-percentage-point decline from the previous presidential election is the largest on record for Blacks” (Krogstad & Lopez, 2017, sec. 1). A Washington Post analysis sharpened the picture. Compared to 2012, the 2016 Black voter turnout rate declined by 5.3 percentage points in swing states, where the Trump campaign targeting was said to be most focused, compared to 4.3 percentage points in non-battleground states (Fraga et al., 2017). Drilling down further, Channel 4 News
analyzed voter turnout in the City of Milwaukee, home to the majority of the state’s Black voters. In Precinct 116, 80% (1152) of the 1440 potential voters were Black. Nearly half of the precinct (636) was marked for “deterrence.” Of that targeted group, only 206 individuals cast a vote. Overall turnout in Precinct 116 dropped from 75% in 2012 to 56% in 2016.

What happened here? The episode traces the arc of institutional progression and the way that each stage summons the next. Data demands computation. Computational knowledge fulfills its economic or political promise by shaping communications tailored to bite hard on the behavioral systems of the data subject. “Microtargeting” is the euphemism for the range of digital cueing mechanisms engineered to tune, herd and condition individual and collective behavior in ways that advance commercial or political objectives. Illegitimate knowledge is thus transformed into illegitimate power. Radical indifference means that even in a democracy, microtargeting and manipulation to deter voting is equivalent to microtargeting and manipulation to sell a new jacket.

In the case of the Trump campaign, citizens of one of the world’s oldest democracies relinquished their most solemn democratic right—the right to vote—without anyone ever holding a gun to their heads or showing up in the dead of night to drag them to the gulag or the camp. Instead, these citizens ceded the right to self-govern in response to pernicious hidden mechanisms of remote behavioral actuation. Stage three’s matching and targeting capabilities exploited the abundance of corrupt information established in stage one, combined with the detailed individual profiles established at stage two until enough people chose to silence their own voices and exclude themselves from political participation in a triumph of remote actuation.

This was not the totalitarian nightmare of Big Brother ready to break bodies and bend souls to its single truth. Nor was there a presiding autocrat threatening imprisonment, terror, torture, and murder. The work here was accomplished by a specific form of epistemic power that I have called instrumentarian power (Zuboff, 2019, pp. 351–352, 379–382). It is covertly wrung from massive asymmetries of knowledge, harnessed to the diminishment of human agency through the friction-free conquest of human action, and mediated by the Big Other of connected, pervasive, blind-by-design architectures of digital instrumentation (Zuboff, 2015, 2019, Chapter 13).

Instrumentarian power is an affordance of surveillance capitalism, available to own or rent. It works its will invisibly. No violence. No blood. No bodies. No combat. Indeed, the Trump team understood that win or lose the White House, Mr Trump had bought himself a limitless power source. “We knew how valuable this would be,” Parscale said. Reflecting on the consequence of this new power, he crowed, “We own the future of the Republican Party” (Green & Issenberg, 2016, para. 22).

In January 2020, as the United States braced for another ugly election year, Bosworth laid Trump’s success at the feet of Facebook’s “tools.” Mr Trump was elected, he said, “because he ran the single best digital ad campaign I’ve ever seen from any advertiser. Period.” Bosworth praised Parscale, insisting that Trump 2016 “remains the high water mark of digital ad campaigns,” and explained that this “unbelievable work” was the product of a simple discipline: “They just used the tools we had to show the right creative to each person” (B. Gilbert, 2020, paras. 5, 10). A senior campaign official observed, “There’s really not that much of a difference between politics and regular marketing.” Parscale confirmed this conclusion with style: “I always wonder why people in politics act like this stuff is so mystical. It’s the same shit we use in commercial, just has fancier names” (Green & Issenberg, 2016, paras. 11, 29).

Surveillance capitalism, then, with its unique affordances and incentives, tipped the scales for Trump. Its limitless industrialized tons of human data, computational capabilities, and knowledge production created the instrumentarian power that leveraged all of it for remote behavioral actuation and turned the tides of
collective behavior to a Trump victory. Parscale’s dark triumph reset the political bar. By the next US presidential election cycle in 2020, an MIT report concluded that “campaign apps” in the United States, India and other countries had become “part of a larger system of surveillance capitalism” (Gursky & Woolley, 2020).

The Economic Operations

In the wake of Brexit, Trump’s election, and evidence of Russian participation in remote behavioral actuation, Facebook launched an “internal effort to understand how its platform shaped user behavior and how the company might address potential harms” (J. Horwitz & Seetharaman, 2020, para. 5). Internal research in 2017–2018 conducted by newly formed cadres of scientists and engineers called “Common Ground” and “Integrity” teams warned that the company’s own algorithms amplified polarizing content as a means of driving the cycle of EEPR. These algorithmic interventions nourished the growth of far-right extremism, both in the United States and around the world (J. Horwitz & Seetharaman, 2020, sec. 2, para. 3). For example, a 2016 internal study tracked the rise of extremist content and extremist groups in Germany, where 64% of all group joins were due to Facebook’s targeted recommendations. By early 2018, however, Zuckerberg lost interest in mitigating these and other social harms. News Feed was in trouble.

When first introduced in 2006, News Feed was a crucial blow to user privacy. Despite angry public pushback, Mr Zuckerberg was intransigent. If Facebook is a nervous system, News Feed was to be its aorta, the largest pipe carrying behavioral surplus from everyone everywhere to the corporation’s computational heart. The feed was controlled by a secretive predictive algorithm derived from Facebook’s proprietary god view of what is estimated to be more than 10,000 data elements continuously computed to assess the “relevancy” score of thousands of potential content choices. It became the most significant driver of EEPR and thus the epicenter of the corporation’s financial success (Merrill & Oremus, 2021; Oremus, 2021; Zuboff, 2019, pp. 458–461).

In early 2018, as the new teams were busy reimagining Facebook, Zuckerberg announced a dramatic shift in the News Feed paradigm. The changes were presented to the public as a happy face solution to corrupt polarizing information and a defense against foreign “interference.” Zuckerberg stressed a return to intimate positive connections that create more “value” for the individual, improve “well-being,” enrich communities, and “bring people closer together.” The revamped algorithm was to favor posts from friends and family, especially those that “spark conversations” and “inspire back-and-forth discussion.” This new approach, Zuckerberg told members of the US Congress, demonstrated the company’s determination to “police the ecosystem” (Hagey & Horwitz, 2021; Mosseri, 2018; Wall Street Journal, 2021a).

The “Facebook Files,” exfiltrated from company servers and brought to the public in the fall of 2021 by whistleblower Frances Haugen, tell a sharply different story, but one that is predicted by the theory of surveillance capitalism. According to the Wall Street Journal reporters who studied the documents, the varied atrocities of the 2016 US election produced an alarming decline in Facebook’s “engagement metrics” throughout 2017. These results triggered panic among executives, who diagnosed the cause as a shift to passive viewing of professionally produced content. The algorithmic redesign, labeled “Meaningful Social Interactions,” was engineered to reverse the decline in comments and other signals of engagement. This solution for the threat to EEPR “would reward posts that garnered more comments and emotion emojis” (Hagey & Horwitz, 2021).

The logic of system design to optimize for EEPR had been baked into the cake from the start, as noted in our discussion of epistemic chaos. Mechanisms like those that drove German extremism were well established. The difference in 2018 was that now News Feed, the driver of the nervous system, was harnessed
to this single goal. The new algorithm measured multiple dimensions of interaction, assigning points to detailed signals: likes, shares, reactions, emojis, the anger button, comments, RSVPs, and so on. The higher the score of an interaction, the more “significant” the engagement, the more widely the algorithm disseminated that interaction as bait for more engagement.

The News Feed algorithm was a social communications system engineered to follow the math not the meaning of interactions, just as Shannon had prescribed for his machines. Unsurprisingly, the News Feed system reengineered to maximize interaction did not favor content from family and friends for amplification. Instead, the content that earned the highest engagement scores, as it reliably “sparked conversations” and provoked the most “back-and-forth discussions,” was the worst content: corrupt, defactualized, grotesque, hateful, false, polarizing, extremist, and anger-inducing. By the summer of 2018 the decline in key engagement metrics had slowed and, in some cases, reversed (Hagey & Horwitz, 2021). But user surveys showed that the degraded quality of the News Feed diminished people’s sense of well-being. A slide from a 2018 Facebook presentation read: “Our algorithms exploit the human brain’s attraction to divisiveness” (J. Horwitz & Seetharaman, 2020), a phenomenon already identified in academic research (Vosoughi et al., 2018).

Once again, internal studies identified the measurable effects of engagement engineering on polarization (Hao, 2021b). The difference now was that the all-powerful News Feed was specifically reengineered to optimize EEPR. Behavioral effects spread widely and quickly as every individual user, publisher, organization, or troll farm found themselves chasing the proof of life that Facebook’s validating metrics provided. Any person, group, or entity that wanted to participate in the attentional slipstream of likes, shares, comments, audience, and so on, had to engage in the macabre “back and forth” of social outrage. As they did so, both they and their “lookalikes” were targeted with more toxic content for more EEPR, and on, and on ...

A global social network optimizes its central operations for EEPR, saturating the information and communication space with corrupt content and targeting those who can be goaded into making the largest contribution to its commercial aims. Facebook operates like a planetary Skinner box, and the consequences for collective behavior spread. Poland’s political parties joined others across Europe, claiming that the arrival of the euphemistically labeled “meaningful social interactions” in 2018 “changed the nature of politics for the worse.” The political parties observed that Facebook’s emphasis on resharing content “systematically” rewarded “provocative, low-quality content” and they found themselves adapting to Facebook by publishing “far more negative content than before” (R. Metz, 2021). In Spain, Facebook pages related to social and political debate saw a 43% increase in threats and insults (Constella, 2021). Positive headlines and policy posts simply did not contribute to EEPR.

As Facebook’s pages turned uglier, the teams proposed mechanisms to at least mitigate the intensification of chaos. For example, one proposal outlined new moderation techniques for private groups and methods to limit the number of posts on inflammatory subjects. Another team developed a method to reduce the spread of content favored by hyperpartisan hyperactive users. The teams understood that, contrary to Friedman, implementation would interfere with EEPR. Some of their ideas were “antigrowth” and implied “a moral stance” (J. Horwitz & Seetharaman, 2020, sec. 3, para. 3).

There were political concerns too. As we have seen, surveillance exceptionalism imposes an enduring political burden that ricochets between appeasement and conflict. In 2018 Facebook was already on the political defensive, accused by Republican lawmakers of systemic anti-conservative bias (US Senate Committee on Commerce, Science, & Transportation, 2016). Because polarizing hyperpartisan misinformation was disproportionately produced by conservative and right-wing sources, beginning
with then President Trump, Facebook executives feared political retaliation if they curtailed such content (Dwoskin et al., 2020; Evanega et al., 2020).

Facebook’s new teams battled with politically wary executives, but in the end their proposals were killed or fatally weakened (Roose & Isaac, 2021; Timberg, 2020). By late 2018, Mr Zuckerberg informed his staff that he had lost interest in reorienting the company toward “social good” and asked that they refrain from bringing him more proposals. The Common Ground team disbanded, and many senior staff involved in the remediation efforts left the company. Employees were told that Facebook’s priorities had shifted “away from societal good to individual value” (J. Horwitz & Seetharaman, 2020, sec. 4, para. 11).

As engagement and polarization grew, concern over the 2020 US election season deepened. In April 2019, a data scientist with the Civic Team advanced a method for reducing the spread of “deep reshares” associated with virality and known for their negativity and sensationalism. According to documents provided by Haugen, the method was proven effective in tests on civic and health information. Once again, Zuckerberg stood down, citing his concern that the change would reduce “positive” engagement. A parade of substantive proposals, from eliminating the “reshare button” to restricting comments, and more, met the same fate “because the company determined it would hurt user engagement” (Wall Street Journal, 2021a; see also, Hagey & Horwitz, 2021). Active interventions are typically only favored post-factum, once extreme damage, such as the 6 January 2021 riot on Capitol Hill, risks democratic contradiction or widespread user withdrawal (for examples, see Merrill & Oremus, 2021).

The Governance Vector: The Governance of Collective and Individual Behavior

Instead of solutions, Mr Zuckerberg wantonly pounds his algorithmic keyboard of humanity’s collective behavior, reinforcing or extinguishing the actions and attitudes of billions of people at will, as ordained by the economic imperatives to which he is pledged. He strikes this key or that from his celestial perch, and qualities of human behavior and expression rise or fall. Anger is rewarded or ignored. News stories are more trustworthy or unhinged. Corrupt information is showcased or sidelined. Publishers prosper or wither. Political discourse turns uglier or more moderate. People live or people die.

From the point of view of economic operations, remote behavioral actuation is a means to economies of action and their commercial ends (Zuboff, 2019, pp. 293–299; see also, Merrill & Oremus, 2021; Oremus, 2021). But from a governance perspective, it signals the assumption of systemic governance powers over the content, pattern, and flow of individual and collective behavior. This compromise of behavioral integrity, as seen in the case of the Trump 2016 campaign, is a stealth attack that originates from outside the social, denying its victims the right of combat by distorting the social order without triggering awareness.

Facebook has explicitly engaged in detailed tracking of collective behavior since at least 2009, when it went public with its Gross National Happiness Index continuously compiled from users’ posts. The New York Times reported that the company was also analyzing the ethnic and racial status of its users, how groups interact, and how those interactions change over time as an “important indicator of the state of the nation” (Cohen, 2009, para. 20; Siganos et al., 2014). Facebook’s metric for large-scale monitoring of “Violence and Incitement Trends” was first reported in 2020 (Mac & Silverman, 2020, para. 3). There is little public information regarding how these and other proprietary forms of behavioral knowledge are used to remotely actuate and suppress collective behavior.

Zuckerberg’s decisions between 2018 and 2020 demonstrate Facebook’s reliance on remote behavioral actuation for continued growth. This shift of focus from tracking collective behavior to actuating behavior is not new. It emerges from a long period of incubation, most clearly identified with Facebook’s
published contagion experiments (Bond et al., 2012; Kramer et al., 2014) before public backlash alerted Zuckerberg to the need for secrecy (Zuboff, 2019, pp. 304–309). Both studies designed and tested remote actuation capabilities aimed at measurably influencing collective behavior. In the first, targeting mechanisms based on engineered social comparisons and subliminal cues were deployed to influence users to vote in midterm elections. In the second, subliminal cues were manipulated to induce feelings of sadness or happiness among users. While many academics and tech observers reacted in horror, the company researchers celebrated the successes of their experiments, noting that it was possible to manipulate online triggers to influence real-world behaviors and emotions at scale without engaging user awareness. Indeed, the company publicly touted its ability to influence election outcomes on its “success stories” page, before removing those links in early 2018 (Biddle, 2018a, para. 1).

Contemporary studies by academic researchers have closely examined many of the dynamics identified in this discussion and their causal role in rising societal polarization and diminished social resilience (Cinelli et al., 2021; R. Levy, 2021; Rathje et al., 2021; Santos et al., 2021; Tokita et al., 2021; Vasconcelos et al., 2021). While there remains a great deal to learn, the preponderance of findings illuminates the relationships between algorithmic design, the swell of corrupt content and social polarization. This pattern is so prominent that it suggests the hypothesis that the corporation is now engaged in a mega-massive-scale contagion experiment, this time trained on triggering social contagions of polarization.

Considered from this perspective, the Haugen documents provide an updated view of contagion science after a decade of intensive, and intensively concealed, development. Experimental principles are similar to those in earlier work. Persons are objects of tailored stimuli and forms of reinforcement that operate outside of awareness to impel thought, feeling, and action. These interventions appear capable of evoking, selecting and reinforcing behavior in ways that disrupt human agency, alter behavioral trajectories, and abrogate “the right to the future tense” (Zuboff, 2019, pp. 329–348). The difference now, one decade after the published experiments, is that “the experimenters” employ far more powerful computational capabilities along with more vast and varied accumulations of behavioral surplus. These enable exquisitely fine-tuned delivery systems with a more diverse and effective range of targeting weaponry. Indeed, the power and capabilities to modify collective behavior at this kind of scale are closely associated with “information warfare” (Corn & Taylor, 2017; Hollis, 2018; Libicki, 2017; Lin, 2019). For example, Creţu et al. (2022) demonstrate how massive-scale anonymous datasets on human interactions reveal individual identities and sensitive information that can be used for “matching attacks” or other attacks based on “behavioral profiling.”

Information warfare is widely assumed to be a capability of the State for the purposes of political, cultural, or military destabilization, just as behavioral modification or surveillance were once considered projects of the State. Now the affordances of the surveillance capitalist order reconfigure information warfare as a market project. Indeed, it is only on the strength of this construction that State or non-State actors can succeed as parasites on surveillance capitalism’s host body, using its “tools” to wage information war on civilian populations for commercial or political purpose, as illustrated by the Trump 2016 campaign.

Equally threatening to democratic freedoms is the fusion scenario, in which the information appetites of the state produce a relentless source of demand that merges with the relentless supply of human-generated data from the surveillance giants. Such operations, once legitimated by a “war on terror,” have been turned on civilians in a full-on display of citizen-targeted information warfare that plaintively illustrates how the democratic void leaves us naked and vulnerable. Surveillance advertising once again plays the Trojan horse. For example, RTB data have been used to profile Black activists, obtain warrant-less phone
tracking, and surveil individuals (Ryan, 2022). Amazon has admitted to giving data from its Ring doorbells to law enforcement officials without warrants or the knowledge and consent of owners (Díaz, 2020; Biddle, 2022; Selinger & Durant, 2022). Law enforcement agencies at every level of government acquire location tracking data captured by apps and aggregated in the private market (Burke & Dearen, 2022; Shenkman et al., 2021).

The fusion scenario is now vividly on display in a United States where state-level abortion bans turn law enforcement and judicial authorities toward data from the giants as a means of transforming pregnant women into prey (Collier & Burke, 2022; Linebaugh, 2022; Nix & Dwoskin, 2022; Ohlheiser & Kiros, 2022; Zakrzewski et al., 2022). Most disturbing is the speed with which these operations are normalized. American women are warned that their cell phones are now a “reproductive privacy risk” (Li, 2022). An article in the Washington Post begins calmly, “Everything you do online is already tracked,” and it advises women “to avoid leaving a digital trail.” This includes warnings and guidance more typically associated with CIA spymaster training: restrict communications to secure encrypted messaging; set messages to disappear; deactivate biometric authentication and location sharing on devices; avoid apps, health-tracking wearables, Google, license plate readers and facial recognition software; beware of cross-site tracking and check-in software at your doctor’s office; scramble your identity and location on all devices; log out of accounts; maximize privacy settings; use alternate transportation ... (Kelly et al., 2022). If today communities of color, activists, and now the vast category of pregnant women and their allies can become targets of information warfare, then tomorrow it can and will be any and all persons or groups. “First they came for ... ”4

Recent scholarship in information warfare has begun to theorize these interdependencies. Dawson (2021) observes that the US government has been sounding the alarm on targeted influence operations on social media without understanding that the “digital surveillance economy” is the cause. “[T]his economic structure of trading free access for data collection about individuals’ lives poses a national security threat.” She notes that the unique commercial capabilities of this economic institution are “increasingly harnessed for mass population control ... with virtually no oversight or regulation” (p. 63).

Just as systemic threats to information integrity are threats to the social order of an information civilization, systemic threats to behavioral integrity are threats to the very possibility of social order. In recognition of this phenomenon, an important paper by an international team of scholars argues that the study of collective behavior “must rise to a ‘crisis discipline’ ... with a focus on providing actionable insight to policy makers and regulators for the stewardship of social systems” (Bak-Coleman et al., 2021). Without clarity on the source of such threats, however, there is little prospect of effective contradiction. For example, Bak-Coleman et al. repeatedly identify “technology” as the causal force. They note that “information flows” once shaped by natural selection are now at the mercy of “emerging communication technologies,” thus necessitating careful study of “the impact of emerging technology on global behavior.” But in other instances, their argument implies a generalized economic causality, citing “engineering decisions made to maximize profitability” and “vested interests” that have “taken advantage of new communication technology to spread misinformation” (pp. 1–2, 4).

The power to compromise the integrity of individual and collective behavior cannot be attributed to the internet, social media, communications technologies, the information economy, information capitalism, shadowy vested interests, or even the profit motive per se. The unprecedented power to exercise secret illegitimate governance over human behavior at scale, as described here, is produced by specific path-dependent economic operations and capabilities developed in time and guided by the historically unprecedented logic, mechanisms, and imperatives of surveillance capitalism, in
which revenues flow from the commodification of human behavior.

This distinction is good news, because it means that despite the scale and force of surveillance capitalism as an institutional order, nothing about it is inevitable. The governance achievements and social harms discussed here are not the product of technological or economic necessity. While technological evolution is a fact of modernity, surveillance capitalism is but one economic logic among many other possible logics for bringing digital technologies to life, constructing an internet or its successors, building an information economy, and structuring the social order and moral milieu of an information civilization. It is wholly within the capabilities of the democratic order to abolish surveillance capitalism and free the digital from its iron cage in order to reimagine, reinvent and reclaim our information civilization for a democratic future nourished by data, information, and knowledge.

The Social Harm Vector: The Artificial Construction of Reality

In The Social Construction of Reality, Peter Berger and Thomas Luckmann (1966) famously observe that the miracle of durable social order rests on “commonsense knowledge.” This knowledge is not the same as “shared values.” It operates at a tacit level. The phone rings. You answer with “hello,” and the caller replies, “hello.” It is “the knowledge I share with others in the normal, self-evident routines of everyday life” (p. 23). Daily traffic patterns, for example, depend upon far more than obeying laws.

“All societies are constructions in the face of chaos,” write Berger and Luckmann (p. 103). Because norms are summaries of common sense, norm violation is an important element of terrorism, stoking fear precisely because it repudiates the most taken-for-granted social certainties upon which an orderly society rests. Without the social construction of reality, everyday life is unbearably precarious. The death of the king or, in a democracy, the peaceful transfer of power are critical moments that heighten societal vulnerability, which explains why the prescriptions that guide these junctures are treated with maximum gravity. The legitimacy and continuity of institutions are essential because they buffer society from chaos by formalizing common sense in norms, rules and laws. So, for example, when Zuckerberg decided that the destruction of privacy should be the new norm, and he “just went for it,” that was a profound violation of the norms that express commonsense knowledge. It was a form of terrorism from which the democratic order failed to protect its peoples.

The social construction of reality rests on a common sense of what is “normal,” what is a deviation from the norm, and what is a destructive deviation. Formula One driving is thrilling on the track but a norm-defying form of terror in the neighborhood. Racetracks and neighborhoods can coexist because there is durable faith that everyone agrees on the appropriate behavior in each setting. This faith depends significantly on trustworthy, transparent, and respectful institutions of social discourse, especially when there is disagreement. Even destructive deviations do not require censorship in an open society because, thanks to common sense, they “normally” rise and fall and fade at the fringe, where they belong.

Surveillance economics, as we have seen illustrated at Facebook, produce the opposite of these conditions. The company operates as a chaos machine where “norm violation” drives EEPR and corrupt information is good for business. Zuckerberg, like other owners of social media, portrays his network as democracy’s “public square,” protected—in the liberal democracies, at least—by a guaranteed right to freedom of speech. Mr Zuckerberg’s “free speech fundamentalism” has provoked much academic debate and paralyzed the public response (Shapiro, 2021). Considering the US case, constitutional scholar Laurence Tribe tirelessly explains that First Amendment rights do not apply to Facebook’s discourse, because its social media spaces are not public spaces and do not operate under public law. They exist
under the jurisdiction of private capital, not governmental authority. Tribe writes, “The First Amendment, like the entire Bill of Rights, addresses only government action, not the action of private property owners. That’s not a bug but a feature” (Tribe, 2021, para. 5; see also, Citron & Franks, 2020; Fidler, 2021; Horder, 2021; Gingerich, 2021; Shattuck & Risse, 2021; Wu, 2017).

This jurisprudential analysis of freedom of expression is complemented by an implicit sociological vision that situates these rights in specific conditions of existence. US Supreme Court Justice Oliver Wendell Holmes drew upon this vision in his 1919 dissenting opinion in Abrams et al. v. United States (1919). “The ultimate good desired is better reached by free trade in ideas,” Holmes wrote. “The best test of truth is the power of the thought to get itself accepted in the competition of the market ... That at any rate is the theory of our Constitution” (para. 58). Holmes’s characterization speaks to the societal warp that complements the juridical woof of speech rights. It assumes that the social function of free speech protections is the renewal of society. Under conditions of freedom and rule of law, ideas that renew society—even those that arise at the margin—will ultimately find wide acceptance through fair and free debate. This sociological vision of a public square ordered by the natural selection of good ideas through open discourse and common sense is idealized, but it expresses the societal aspirations and purpose attached to speech rights.

On the path toward accidental dystopia, the public square and its “social construction of reality” are vanishing, replaced by a private zone of digital information and communication ruled by the imperatives of surveillance capital. Algorithmic engineering supplants the social construction of reality with an artificial construction. The corrupt information that increasingly dominates this private space does not migrate to the center of social discourse in a free and fair competition of ideas. Rather, it is placed and held at the center by a process of unnatural selection dictated by surveillance capitalism’s political-economic objectives (see also, Balkin, 2017; Riemer & Peter, 2021; Ward, 2022).

This unnatural selection, optimized for EEPR, violates the integrity of collective behavior, reshaping thought, speech, and action. Corrupt content is selected for first-class privileges of targeted dissemination and amplification. In the absence of contradictory institutional forces that impose and defend measures of information integrity already discussed, such as “truth,” “fact,” “reality,” and “meaning,” blind-by-design systems artificially marginalize commonsense speech as a means to others’ commercial or political ends. These dynamics produce the opposite of the societal qualities that juridical speech rights are intended to foster. Because these operations are hidden, the whole process proceeds theatrically, as if its consequences are organic social constructions rather than privately owned and operated artificial productions authored by economic imperatives and political accommodations.

Facebook executive Andrew Bosworth once again does his job by denying this bait and switch of an artificially constructed reality for a social construction. In the blizzard of criticism accompanying the Haugen revelations, Bosworth became the spokesperson for a desperate new line of rhetoric. Now, “individual humans” were to blame for spreading misinformation on Facebook, and he questioned the company’s right even to identify misinformation. “I’m very uncomfortable with the idea that we possess enough fundamental rightness ... to exercise that kind of power on a citizen, another human, and on what they want to say and who they want to listen to” (Feuer, 2021, para. 14; C. Lima, 2021).

Bosworth reminds us that concealment and Orwellian communications remain essential to institutional survival and development. These mechanisms intensify in an increasingly desperate bid to buy time, as the veil is drawn on Facebook’s displacement of organic social processes in favor of its own self-interested designs for humanity.
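The “unnatural selection” argument above can be reduced to a toy schematic: when a feed is ordered purely by predicted engagement, and reaction-triggering content reliably earns higher predictions, corrupt content is placed and held at the top by construction rather than by open debate. The following is a minimal sketch with invented posts and an invented scoring function standing in for a learned model; it illustrates the logic only, not any platform’s actual system.

```python
# Toy schematic, not any platform's actual ranking system: a feed ordered
# purely by predicted engagement privileges whatever content best
# triggers reactions, regardless of its integrity.

posts = [
    {"text": "local council meeting minutes", "outrage": 0.1, "accurate": True},
    {"text": "shocking conspiracy claim!!",   "outrage": 0.9, "accurate": False},
    {"text": "peer-reviewed health guidance", "outrage": 0.2, "accurate": True},
]

def predicted_engagement(post):
    # Illustrative stand-in for a learned model; the studies cited above
    # find that outrage-provoking content reliably earns more reactions.
    return 0.2 + 0.8 * post["outrage"]

# "Unnatural selection": the most corrupt item rises to the center of
# attention by construction, not through a fair competition of ideas.
ranked = sorted(posts, key=predicted_engagement, reverse=True)
```

Under these assumptions the least accurate post always ranks first; nothing in the objective function knows or cares about truth, which is the point of the argument.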
Stage Four: Systemic Dominance (Economies of Domination)

The Economic Operations

Stage four is marked by an increasingly visible competition with democracy over the governance of governance. The successes of economies of scale, learning and action in combination with the continued absence of effective democratic contradiction produced historic concentrations of data, knowledge, authority, power, capital, and infrastructure control. These come to fruition in the giants’ increasing confidence in their ability to leverage absolute control of critical digital infrastructure to bend democratic governments to their will: systemic dominance.

In earlier stages, the surveillance capitalist institutional order challenged rights and capabilities that are mission-critical to a democratic society, including privacy, information integrity, knowledge production, social solidarity, common sense, and the integrity of individual and collective behavior.

Stage four exploits the conditions produced by these early and ongoing governance carve-outs and societal attacks. A new offensive emerges aimed at democratic institutions. The giants’ control over critical digital infrastructure is leveraged as a means of weakening and then usurping the governance prerogatives of the democratic state. Examples of this emerging competition for systemic dominance appear with increasing frequency.5 In this discussion I examine one such case played out between an alliance of Apple and Google versus the democracies of the European Union, as global plague spread fear across the continent.

Revenge of the Void. It was on April 10, 2020, as the world faced the most dangerous public health emergency in a century, that Apple and Google abruptly introduced a COVID-19 exposure-notification protocol for iPhones and Androids that was incompatible with exposure-notification and contact tracing applications already in development by EU teams (Holmes & Langley, 2020; Wuerthele, 2020). A coalition of 130 data scientists from eight European countries, including teams of German, French, Italian and Spanish technologists, had been working on an approach they called “Pan European Privacy Protecting Proximity Tracing” (PEPP-PT). Their aim was a smartphone-based set of “standards, mechanisms, and services” that could (1) inform individuals of COVID-19 exposures and test results; (2) operate seamlessly across the EU and other borders; (3) support public health authorities’ research, policy development and oversight; and (4) accomplish these objectives while complying with the stipulations of the GDPR and related EU privacy laws (PEPP-PT, 2020).

Ultimately, the revenge of the void would sink its claws deep into the data scientists and privacy experts on the PEPP-PT teams, as well as many watching from the sidelines. It was a professional group all too familiar with the threat of illegitimate surveillance. Despite its leadership in privacy law and data protection—or more likely because of it—the EU became a key theater of engagement in a mounting battle between trust and fear, organic solidarity and libertarian values. This discussion examines key highlights of these events as they illustrate the contest for systemic dominance at the frontier of surveillance capitalism’s developmental progression (see Note on Method).

Some context is useful. Public health professionals routinely employed the language of “surveillance” long before the term was burdened with the dystopian themes of the digital century. Public health “surveillance systems” effectively eradicated smallpox in the 1970s, tuberculosis in the 1990s and SARS in 2003. These victories typically depended upon individual case data in some combination with epidemiological statistical tracking, as dictated by disease dynamics (Bay, 2020; Hellewell et al., 2020; Sontani et al., 2020). Public health scholars were already grappling with the implications of disease surveillance in the digital century when COVID-19 overwhelmed health authorities across the globe (Ramjee et al., 2021; Sekalala et al., 2020; Stoto, 2008).
On the eve of the pandemic, Europeans enjoyed high degrees of trust in their public health authorities (Wellcome, 2019). Eighty-three percent of Germans indicated substantial trust in their government’s medical and health advice. In the UK it was 81%, in Spain 77%, in France 70%, and in Italy 63%, compared to only 59% in the United States (Archer & Levey, 2020).

The COVID-19 pandemic evoked what was for many a new appreciation of social solidarity and the positive role of government, but it also produced exceptional conditions of vulnerability and digital dependency that some governments and corporations exploited to extract more data, consolidate more powers or both (Lyon, 2022; Rahman-Shepherd et al., 2021). In the EU, the Hungarian government used the pandemic to justify an indefinite “state of emergency,” authorizing the government to rule by decree and weaken parliamentary oversight (Zoltán, 2020). The Polish government pounced on the health crisis as an excuse to schedule elections during the first pandemic wave, in defiance of existing laws, the Polish constitution and a Senate resolution (Radjenovic et al., 2020). The UK’s National Health Service (NHS) announced a data-sharing agreement with Google, Microsoft, and the secretive data mining analytics firms Palantir and Faculty. Described as “the largest handover of NHS patient data to private corporations in history,” the NHS claimed the companies would provide a “single source of truth” with which to track the pandemic (Fitzgerald & Crider, 2020).

Similar moves in the United States were mitigated only by the Trump administration’s incompetence and general disinclination to govern. The White House unleashed a tech sector bonanza as it turned to the giants for solutions, supposedly “to leverage the tech industry’s powerful tools,” but certainly to lend a patina of competence to chaotic White House operations (Grind et al., 2020; Romm, 2020, para. 2). The Trump administration hired Palantir to build a comprehensive system for virus tracking (Banco & Ackerman, 2020). Other proposals ran the gamut from the terrifying to the slapstick. Trump’s son-in-law, Jared Kushner, announced the tech companies’ collaboration on a “National Coronavirus Surveillance System” (Cancryn, 2020). The White House consulted with the companies on comprehensive location tracking (Romm et al., 2020). Trump boasted of the immediate implementation of a nationwide system for screening and testing by an obscure Alphabet subsidiary, Verily, an organization that had neither knowledge of the project nor experience in building and managing such systems (MacMillan et al., 2020).

It was against this background of a global dystopian surge that Apple and Google encountered an extraordinary opportunity to display and enhance systemic dominance. Their success depended largely upon the revenge of the void. The democracies’ two-decade failure to claim digital spaces for democratic values and the rule of public law had abandoned citizens to the incursions of private and public mass surveillance, igniting its own pandemic of suspicion and mistrust. In the absence of epistemic rights, comprehensive new legal frameworks, and the public institutions and enforcement powers purpose-built to protect them, many feared that COVID-19 would quickly morph into COVID-1984. Without explicit legal protections, such as strict purpose limitation, sunset laws, transparent verification procedures, and more, it was easy to imagine that COVID surveillance undertaken in the name of public health would never be unwound (Amnesty International, 2020; Timberg & Harwell, 2020). These fears stirred disagreement among the PEPP-PT teams (Abboud et al., 2020; M. Johnson et al., 2020).

Led by French and German scientists, the PEPP-PT consortium published its mission statement on April 1, 2020. Citing the emerging evidence from COVID surveillance in illiberal societies, it stated a commitment to “strong European privacy and data protection laws and principles,” while maximizing the effectiveness of a “national pandemic response.” Data would be shared with health authorities to facilitate COVID monitoring, analysis and policymaking in a way that “enforces GDPR and ensures scalability” (PEPP-PT, 2020, paras. 4, 7).
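The architectural dispute that drives the events below turns on a single design question: where are a phone’s contact logs matched against the identifiers of diagnosed users? The following is a minimal schematic of the two positions, with illustrative names and none of either protocol’s actual cryptography; it is a sketch of the contested trade-off, not either system’s real design.

```python
# Schematic contrast only, not PEPP-PT's or DP-3T's actual protocols.
# Phones broadcast short-lived "ephemeral IDs" over Bluetooth low-energy
# and log the IDs they hear nearby. The dispute is over where those logs
# are matched against the IDs of users who later test positive.

alice_heard = {"eph-7", "eph-9"}   # IDs Alice's phone observed
bob_heard = {"eph-9", "eph-4"}     # IDs Bob's phone observed
diagnosed_ids = {"eph-9"}          # IDs a diagnosed user had broadcast

# "Centralized" flow: contact logs are uploaded and matched on a
# health-authority server, which can notify contacts and support
# epidemiology, but thereby accumulates a who-met-whom graph.
server_db = {"alice": alice_heard, "bob": bob_heard}
server_notifies = {user for user, heard in server_db.items()
                   if heard & diagnosed_ids}

# "Decentralized" flow: only the diagnosed user's IDs are published;
# each phone downloads that list and matches it against its own local
# log. No server ever sees anyone's contact log.
def phone_check(local_log: set) -> bool:
    return bool(local_log & diagnosed_ids)

alice_alerted = phone_check(alice_heard)   # computed on Alice's phone
bob_alerted = phone_check(bob_heard)       # computed on Bob's phone
```

Both flows reach the same people with the same alert; what differs is that the centralized server ends up holding the contact graph and the identities of the notified, while the decentralized design confines that knowledge to the handset. This is precisely the trade between public health capability and surveillance risk contested in what follows.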
A dissenting faction had emerged in March that year, led in part by Professor Carmela Troncoso, head of the Security & Privacy Engineering Laboratory at Switzerland’s École Polytechnique Fédérale de Lausanne (EPFL) (VIScon, 2020, 47:50–49:00). A security and privacy specialist and recent recipient of a Google Research Award, Professor Troncoso was alarmed at the prospect of COVID-19 exposure data held in government servers and, she argued, vulnerable to “data grab,” deanonymization, and other symptoms of “function creep.” The group’s proposed solution was a “decentralized” model in which data and computation would be strictly confined to one’s smartphone, eliminating the “risks of privacy abuse.” Troncoso swiftly recruited colleagues from EPFL and ETHZ6 to pursue this model. Experts from across Europe soon joined this team. Initially, the dissenters remained within the larger PEPP-PT undertaking. But as “things soured pretty quickly,” they withdrew to focus exclusively on their own approach, while attempting to shut down the PEPP-PT effort (Abboud et al., 2020, para. 15).

On April 3, the dissenters, by then 25 strong, published their protocol for what they called “Decentralized Privacy-Preserving Proximity Tracing (DP-3T)” (Troncoso et al., 2020a). Their technical strategy was distinguished from what they described as PEPP-PT’s “centralized approaches with very different privacy properties” (Troncoso et al., 2020b, sec. 6). The language choice of “decentralized” versus “centralized” was a public relations coup that produced an immediate media explosion in a Europe already on edge. While neither term had appeared in the PEPP-PT mission statement, the image of “decentralization” immediately conjured associations with pro-social democratic ideals and fundamental rights, while the word “centralized” triggered fears of Big Brother and Chinese-style surveillance. “Methods are introduced to strictly control data flows in order to avoid accumulating any contact data on a centralized server,” the DP-3T team wrote (Troncoso et al., 2020c, p. 2).

According to one public official I interviewed, this rhetorical structure was “a marketing triumph” that took EU leaders by surprise:

At the time, we didn’t know where the label “centralized” came from, but that language took over very quickly. Who likes “centralized”? Suddenly, everyone was scared that authorities would surveil them. No one seemed able to imagine or believe that a so-called “central” server can be used constructively and safely by the health authorities in the member states according to our laws and fundamental rights, without following you or keeping your data. (PO IV)

Indeed, a lack of trust in government was at the heart of the dissenters’ rationale for “decentralization.” In the opening lines of the April 3 DP-3T publication, decentralized system design was justified as the necessary response to a pandemic that races “across borders and jurisdictions with different levels of fundamental rights guarantees or in times where many governments are functioning under rules of exception” (Troncoso et al., 2020c, p. 1). But other arguments in the DP-3T materials suggest a broad loss of faith in the rule of law and a generalized distrust of all governments, including democratic governments. Indeed, proponents of the “centralized” model were criticized for trusting the rule of law. “They purely rely on legal norms ... This model advocates disproportionate collection of personal data, and assumes legal protections will be sufficient to protect populations which often is not the case” (Troncoso et al., 2020c, p. 1).

One DP-3T team member, ETHZ computer scientist Professor Kenneth Paterson, described to an audience the project’s “bunch ... of tight-knit academics” united in a mission “to build a system that didn’t have any unintended side effects.” He explained that only “a decentralized architecture ... where all of the work is done on the phone under the control of the user and not at some central server under the control of a government” can protect individuals from mass surveillance. “Do you trust your government?”
Professor Paterson asked his audience. “I don’t ... I actually don’t trust my government ... to competently handle that kind of data ... you shouldn’t trust your government; you should trust decentralized designs ... This is about building trust for the public in the system so they can be encouraged to use it” (VIScon, 2020, 14:43–14:48, 39:00–40:02, 40:14–40:49).

Professor Paterson’s views may sound extreme to some readers, but they place him in the mainstream of the libertarian ideology that passed through Hayek, Friedman and others to become the taproot of Silicon Valley’s worldview. Data Scientist II explained:

Libertarianism is the ideology of Silicon Valley. This community fosters a general mistrust of government and all institutions. There is a norm that centralization is always bad. Group identity in Silicon Valley is that government is out there trying to impose things on them. Government is a relic that can’t understand or keep up. There is a myopia, a focus on the technical side of things rather than seeing these larger dangers of institutional politics and power. This made sense when they were little hacker startups, but now they are the most powerful corporations in the world, but somehow folks still think this way. (DS II)

Under these perceived conditions of pervasive assault and defense, Paterson regarded manual contact tracing as “highly privacy invasive.” Epidemiological research was considered “data overreach.” “We were not then in a position to collect additional data that could be useful for ... the health systems” (VIScon, 2020, 6:44–6:56, 15:11–16:03).

There were technical hurdles. Without the cooperation of the two technology giants that control the critical digital infrastructure of Europe’s—and most of the world’s—smartphone operating systems, it would be impossible to implement either DP-3T or PEPP-PT effectively and at scale. Both required Bluetooth low-energy beaconing in lieu of GPS-based location tracking. Without technical adjustments, COVID-19 applications were not able to maintain Bluetooth “in the background,” quickly draining the cell phone battery.

A presentation by Professor Troncoso reviewed this early stage of the project in a series of PowerPoint slides, some of which feature a vividly portrayed Eye of Sauron bearing down on the words “security and privacy.” When it comes to the question, “Who decides?” the answer is stated unequivocally: “The system design. Platform decides Exposure Notification.” Another page reads: “Reality. Use Existing Infrastructure ... Apple must be involved ... Google and Apple must be involved ... Google and Apple implement the protocol and the API” (Troncoso, 2021, pp. 6, 14, 13).

The risks associated with trust were not eliminated but reassigned. Paterson stressed that without centralized authority and information, one must trust anonymous users and their “sense of morality to do the right thing” (VIScon, 2020, 35:06–36:06). Given the project’s absolute dependency on cell phone operating systems, Apple and Google joined “anonymous users” as necessary objects of social trust. In their April 3 announcement the team wrote, “Since Apple and Google provide the operating system running on mobile devices, one has to trust them, since they could potentially learn information related to the proximity tracing system (who is infected, who infected whom, social graphs, etc.)” (Troncoso et al., 2020d, p. 4; emphasis mine). Recognizing that the giants offer no possibility of “exit” or “voice,” the dissenters settled on “loyalty” as if it was a choice.

Early on, when the DP-3T first picked up speed in March, senior EPFL administrator Edouard Bugnion made contact with the office of Apple COO Jeff Williams. That led to a series of meetings between members of the DP-3T team and Apple personnel in which they discussed decentralized design and its technical implications (Barraud, 2020; Owen, 2020; VIScon, 2020, 26:03). Recall CEO Tim Cook’s 2019 statement, “We are taking what has been with the institution and empowering the individual” (Feiner, 2019, para. 72). Williams is the executive who oversees the translation of this great “taking” into concrete accomplishments, such as the Apple Watch. The DP-3T project
might have impressed him as the apotheosis of Apple’s grand strategy. The “decentralization” project had already grabbed the main stage of Europe’s COVID-19 debates, largely because of its rebellious narrative featuring the scientists and their trustworthy system as digital-age Robin Hoods dedicated to taking from untrustworthy public institutions and empowering individuals. In this narrative, the new Sheriffs of Nottingham were democratic officials, public health authorities and their dastardly servers. Data Scientist I deepened the picture:

As we undertook the exposure notification project, the working theory was to bypass epidemiology. They wanted to give information directly to individuals and empower people to act on their own behalf. You bypass the public health system in favor of individuals. (DS I)

Showdown. On April 5, 2020, Apple agreed to invest in the DP-3T solution, and in a public announcement on April 10 it joined forces with the other major owner of mobile communications critical digital infrastructure, Google (Apple Newsroom, 2020; Wuerthele, 2020). In a bizarre twist, this move cast both corporations as privacy champions defending Europe from the most comprehensive privacy protections on Earth. Bugnion and Paterson each celebrated their impact on the giants and, through them, the world (Owen, 2020; VIScon, 2020, 25:00–26:41).

The German, French, and other teams still laboring to deliver the PEPP-PT application cycled from shock to anxiety over whether their systems would even work with Apple and Google’s “decentralized” adaptations. Apple was intransigent, refusing all negotiations with the EU teams (Abboud et al., 2020; Lomas, 2020a).

The European Commission published a “toolbox” of “essential requirements” for digital tracing applications on April 16, still hopeful that “digital technology ... could contribute ...” (European Commission, 2020; Lomas, 2020b). In the same spirit of defusing the conflict, the European Data Protection Supervisor concluded that “data protection rules currently in force in Europe are flexible enough to allow for various measures taken in the fight against pandemics” (Wiewiórowski, 2020, para. 2). But the conflict was not to be so easily resolved.

One day later, on April 17, the European Parliament called for coordinated action to combat the pandemic. The members channeled Clinton and Gore’s leadership in democratic self-evisceration when they endorsed the “decentralised” approach and insisted, “The generated data are not to be stored in centralised databases.” The Parliament stated its “demands that all storage of data be decentralised” (European Parliament, 2020, art. 52).

Ministers from Italy, Spain, Germany, and France decided to team up to convince Apple to adapt its operating system to the member states’ needs. “Surely they did not have the temerity to reject the requirements of four sovereign European countries,” Public Official II told me. But in the end, the official recounted, “Apple categorically refused our requirements, insisting that only their so-called ‘decentralized’ structure offered a privacy solution. It was impossible to argue with Apple.”

On April 19, 2020, 300 data scientists, primarily from Europe, the United States and the UK, published a joint statement demanding that any COVID-related system design solutions “which allow reconstructing invasive information about the populations should be rejected without further discussion” (303 Scientists and Researchers, 2020, p. 1; emphasis mine). This was followed by an open letter to the German government from six civil liberties organizations that criticized the PEPP-PT approach. Germany’s Chancellor Angela Merkel, unsettled by the criticisms of privacy experts and driven by pragmatic concerns over network effects, cross-border interoperability, and a speedy rollout, abruptly abandoned the
to ... containing and reversing” the contagion. four-country effort to win Apple’s support.
The tools were intended to support either “cen- Switzerland and Austria had already commit-
tralized” or “decentralized” designs (European ted to Apple and Google. Once Germany
capitulated, Ireland, Italy, Spain, and the other EU member states quickly followed.

Only the French moved forward with their own application, forced to sacrifice interoperability and full alignment with smartphone operating systems and never achieving the uptake they had hoped for (Abboud & Miller, 2020; De Vynck et al., 2020; Hern, 2020b; M. Johnson et al., 2020). France’s Minister for Digital Affairs, Cédric O, told Politico, “I don’t want to be constrained by the internal policy choices of any company on a matter of public health” (Scott et al., 2020, para. 6). Later, he reflected, “The problem with a centralized protocol is that you have to be confident and to trust your state ... We’re in a democratic state, we have checks and balances” (Corbet & Chan, 2020, para. 11). The Norwegians chose a similar path, only to dismantle their system in June, dogged by technical problems and criticism of data overreach and lack of legal safeguards (Singer, 2020). The UK continued to negotiate with Apple to no avail (Hern, 2020a) and by late April NHS officials announced they would move forward without the Apple-Google protocol (Kelion, 2020). By mid-June, technology glitches and criticism from privacy advocates forced the NHS to revert to the Apple-Google system (Warrell et al., 2020).

On the evening of April 22, European Commissioner for the Internal Market, Thierry Breton, held talks with Apple’s Tim Cook. With the Commissioner seated at his desk and Mr Cook’s face looming larger than life on a massive video monitor, Breton urged Cook “to work constructively with the national authorities so that the tracking apps developed in Europe run on the iPhone” (Purcher, 2020, para. 2). Even this high-level intercession yielded no accommodation from Apple.

Many researchers rejected the privacy advocates’ unusually unscientific demand to proceed “without further discussion.” Instead, they undertook the system testing and analysis that typically would have preceded a rollout of such significance. Not surprisingly, the new studies revealed a complex case, debunking many of the dissenters’ claims (Bradford et al., 2020; Dehaye & Reardon, 2020; Gvili, 2020; Reardon et al., 2020; Vaudenay, 2020).

In a study of SwissCovid, the first Apple-Google Exposure Notification Protocol (ENP) in Europe, two EPFL scientists, Serge Vaudenay and Martin Vuagnoux, observed that Apple and Google had been exempted from Swiss transparency requirements, and Swiss law was left “powerless to protect people from using a non-transparent contact tracing system.” The result was the giants’ absolute control without accountability. In addition to identifying many privacy weaknesses in the Apple-Google protocol, they noted the political obfuscation in the language of centralization and decentralization. “Arguments against centralized systems have been overly exaggerated, and the ones in favor of decentralized systems have been oversold to a level that we found unethical ... In the former case, trust is based on a democratic system. In the latter case, trust is based on a commercial system.” Without the benefits of epidemiological data for Swiss health authorities, Vaudenay and Vuagnoux (2022) questioned “whether SwissCovid is useful at all.” They asked if it should be “shut down,” and concluded, “Citizens have no longer anything to say about it except taking it as a whole or refusing it. This is a major loss of government’s digital sovereignty” (sec. 2, para. 4; sec. 3, para. 5–6; see also, Creţu et al., 2022; Ng, 2021; Singer, 2020).

Apple-Google Contribute to US Failures. The Apple-Google US rollout was even more chaotic. Unlike Europe, the United States entered the pandemic era with record-low trust in government institutions. The Trump administration intensified these conditions, boasting of plans for high-tech COVID surveillance while actively demeaning US public health professionals, health institutions, and state officials. COVID-related disinformation, much of it originating with then President Trump, further battered institutional trust (Evanega et al., 2020). In many communities, these conditions undermined contact tracing and responsible epidemiological information gathering (Raskin, 2020).
The Apple-Google juggernaut exacerbated these problems. As in Europe, US public health authorities found themselves pleading unsuccessfully with Apple for a more flexible design. By June 2020, only three states had committed to using the Apple-Google technology. The rest either remained uncertain or decided to pursue manual contact tracing and follow-up (Barry, 2020; Holmes & Langley, 2020; Yan, 2020). By July, a global analysis would ask, “What ever happened to digital contact tracing?” (Kissick et al., 2020).

One year later, a study by the US Government Accountability Office (GAO) found that only 26 of 56 US states and territories had deployed the Apple-Google app, with download levels ranging from 200,000 to 2 million. Worse still, only a small fraction of the people who downloaded the app actually used it. Of six states where the GAO collected systematic data, the number of times that app users received exposure notifications ranged from a high of 31,000 and 42,000 respectively in two states, to a low of 900 to 3800 in the other four. The report concluded: “We found limited evidence that exposure notification apps are effective at enhancing the speed or reach of manual contact tracing or at reducing the spread of disease” (US Accountability Office, 2021, p. 33; see also, De Vynck & Zakrzewski, 2021). The ENP’s “decentralization” secured the app as a black box. State health officials could not measure its effectiveness because there were no data to do so, and they could not improve its effectiveness because they had no measurements (US Accountability Office, 2021, pp. 34–35).

Here too the revenge of the void played a key role. On one side, US lawmakers had left citizens almost entirely unprotected from government and corporations. On the other, Americans harbored a growing sense of outrage toward the tech giants’ relentless extraction and targeting. A stream of US survey data shows a complete rupture of faith in the tech companies among decisive majorities (Accountable Tech & GQR, 2021; Future of Tech Commission, 2021; Knight Foundation, 2020).

The GAO cites this “lack of trust” as the major obstacle to Americans’ adoption of the Apple-Google ENP and the apps built upon it: “Mistrust of governmental health authorities and technology companies can lead people to forgo using apps ... The public may lack confidence that its privacy is being protected, in part, due to ... a lack of federal legal protections” (US Accountability Office, 2021, pp. 28, 31).

Apple-Google did not create but rather expertly exploited the rapidly escalating sense of vulnerability and disorientation engulfing whole populations. These conditions enabled them to intervene in the relationship between individuals, their societies, and governments.

The Governance Vector: The Governance of Governance

As pandemic death tolls mounted, the companies’ steadfast intransigence suggested that saving lives was never their primary goal. Indeed, if their aim was to fight the pandemic effectively, then their efforts were a categorical failure. The victory Apple-Google scored is better understood in terms of the developmental urge toward systemic dominance, which now aims the great “taking” of the Robin Hood illusion at the democratic order itself. The governance threat draws on the giants’ absolute control over the information and communication spaces gifted to them by the very democracies to which they laid siege. In this case, their avatars on the invisible battlefield are mobile operating systems. Disruption targets are the working elements of democratic institutions, including elected and appointed officials, their authority and power, roles, responsibilities, purpose, and public mandate. The spoils of war are measured in opportunities to alienate individuals from society, its political institutions and leaders, turning them instead toward the system and the propaganda that conceals its facts of private control and genuinely centralized unaccountable power.

The data scientists I interviewed each emphasized the corporations’ governance
ambitions and the economies of domination upon which they depend. “This was always a governance play,” Data Scientist I told me. “Everyone knew it, and everyone was on board with that. If they weren’t, it would have been hard to speak up.”

Data Scientist II explained critical digital infrastructure control as key to economies of domination. “The companies configure the operating system (OS) any way they want. They can make this protocol privacy-preserving, just as they can update the OS to make data available to them. This is the larger truth: They can do this for any data on your phone, email, messages, photos ... The owner of the OS is the Emperor, because there is no regulation that makes it illegal. You see this in the nonnegotiable requirements they set for public health apps in every country.”

Half a world away, the European public officials with whom I spoke felt the effects of the “governance play” and its economies of domination, just as intended. Public Official III spontaneously described the situation as “sad ... and grotesque,” adding, “This means that even in the face of so much death and disease these companies currently have the power to constrain the choices of democratically elected sovereign states. They are the gatekeepers of society.”

Public Official II echoed this analysis:

Apple can refuse to negotiate with officials of democratic governments because they are the sole gatekeeper of the iOS ... While we struggle with the notion of how to structure the platforms, the platforms are structuring our democracies. Under these conditions, what is the relevance of public power? If we are not strong enough to apply our own laws in the real world to the internet, then what is the use of the state?

From the point of view of institutional development, the governance of governance is the next necessary achievement required to protect, nourish, reproduce, and extend already conquered terrain. The possibilities of democratic contradiction are difficult to predict. More predictable is that in the absence of such contradiction, the surveillance capitalist order is likely to develop more extensive governance powers that further dissolve the distinctions between economic, political and social power in much the same way that it eliminates boundaries between sectors now reborn as information science. The words of John Donne should echo in the thoughts of every citizen and lawmaker, elected and appointed official, secretary, minister, president, prime minister and civil servant:

Therefore, send not to know
For whom the bell tolls,
It tolls for thee.

The Social Harm Vector: The Desocialization of Society

The DP-3T data scientists framed “decentralization” versus “centralization” as proxies for a new contest between individuals and social solidarity. Apple and Google, sphinxlike, exploited this ideological opportunity to present themselves with mind-boggling audacity as guarantors of individual privacy, while they “take from the institution” of democracy.

In the development of systemic dominance, the taking is no longer confined to data, knowledge, or even the raw power of behavioral modification. Here the taking extends to the living bonds of trust. Society is sacrificed for the sake of individuals, this story goes, as rhetoric and action aim to transfer trust from society to “the system.”

As one public official told me:

The real problem is the decline of trust in the state. This is partially self-inflicted, but heightened and exploited by the tech companies, their false rhetoric, and their disinterest or inability to curb disinformation ... In a democracy you can effect change. You can fight authoritarianism. But with Apple, you can’t change anything. It’s a single massive highly centralized corporation with absolute power. So, what will remain for the people? Only the “user experience” remains. (PO I)
What is the user experience that Apple and Google’s systemic dominance imposes? Trust without hope of verification. Taxation, paid in the sacrifice of society, without hope of representation. In this shadowscape, any violation of privacy is anyway invisible, unknowable, and without hope of redress. All that is sacrificed for this empty promise of empowerment is gambled on an inscrutable operating system owned and operated by vast empires and their silent emperors. The known unknown is a tweak of the OS today, and a different tweak tomorrow.

Goodbye, network. In this shadow world, economies of domination employ radical connection, now paradoxically centralized as hub and spoke, center, and node. All nodes are anonymous to all other nodes, each a single nameless atom drifting through an obscurity where everything is suspect. Connection to “the system” produces the isolation that nourishes absolute power. This isolation is mistaken for privacy. Society, or what is left of it, is only tolerable to the extent that it is drained of the social. It is not difficult to imagine the long-game calculation of a new era that begins with systemic dominance and eventually, quietly, engineers the transformation to computational governance in the name of more efficiency, less conflict, and frictionless economies of domination.

Like Clay Christensen, the senior statesmen of the surveillance capitalist institutional order and its disciplines of “disruption” do not cry for the open societies of liberal democracy. For example, in April 2020, as the Apple-Google showdown in Europe unfolded, former Google executive and all-around tech alpha Eric Schmidt was interviewed on stage at the Economic Club of New York. Speaking about the effects of the pandemic on Big Tech, the moderator posited, “We talked a lot about surveillance capitalism before this crisis, but now it seems that tracking is one of the very important elements in dealing with this” (Schmidt & Kravis, 2020, p. 7).

Schmidt’s response conveys his admiration for the “simplicity” and effectiveness of authoritarian governments like China. Unencumbered by democracy, he finds them better suited to meet the crisis of the pandemic and all potential crises:

So if you look at the state of the art in the countries that have done this well ... they’ve all been countries with simpler governmental systems. Ours is too complicated. We can’t even decide if the President can shut down the states ... We can’t decide if it’s the New York Mayor or the New York Governor, who gets to decide what’s going on in the schools. And because you have a problem of decision making within our country, you get confusion which leads to delay which leads to lack of action. (Schmidt & Kravis, 2020, pp. 7–8)

Schmidt has no patience for the substance of the democratic process and yearns instead for a desocialized society. All the implications of “taxation with representation” – voice, participation, open discussion, conflict, rights to be claimed, enacted, and protected – mean that democracy is messy, frustrating and fitful. At its best, the principles that attend to the rule of law in a liberal democracy slow things down and open things up. There is no supreme decider, no Apple or Google to tweak the OS this way or that. This slowness and mess are what protect us and ensure that democracy endures despite its perennial challenges and failures.

Economies of domination obscure the central insight upon which the very idea of democracy stands: only society can guarantee individual rights. Only a democratic society can guarantee the rights that enable self-governance to endure and sometimes to flourish, beginning with Hannah Arendt’s “right to have rights” (Arendt, 2004, pp. 369–384; Ingram, 2008).

Arendt observes the Nazi machinery that first deprived Jews and other unwanted humans of their legal status as the precondition for depriving them of their humanity and then their lives. The loss of legal existence meant absolute exclusion from “the world of the living,” forced into ghettos and concentration camps. She writes:
The point is, a condition of complete rightlessness was created before the right to live was challenged ... Not the specific rights, then, but the loss of a community willing and able to guarantee any rights whatsoever, has been the calamity which has befallen ever-increasing numbers of people ... Only the loss of a polity itself expels Man [sic] from humanity. (Arendt, 2004, pp. 375, 377)

As in the case of epistemic rights discussed earlier, the existential primacy of the right to have rights is only discovered at the moment in history when it is threatened. “We became aware of the existence of a right to have rights ... and a right to belong to some kind of organized community, only when millions of people emerged who had lost and could not regain these rights” (Arendt, 2004, p. 376).

The implications of Arendt’s insight should rivet our attention on the existential threat of the desocialized society into which we are propelled by surveillance capitalism. Under such conditions, rights can neither be granted nor defended. “The system” that was elevated above democratic society and institutions by the DP-3T data scientists cannot grant the right to have rights nor the juridical rights that are the privilege of that most basic condition of the social. There are no rights of any kind in a desocialized society because all rights issue from society. The rest is gift, as easily rescinded as given. Returning one last time to Durkheim and the conclusion to his first great opus: “Society is not, then, as has often been thought, a stranger to the moral world ... It is, on the contrary, the necessary condition of its existence” (Durkheim, 1964, p. 399).

Conclusion: The Golden Sword

Surveillance capitalism is what happened when US democracy stood down. It was always a windfall, born of an antidemocratic economic ideology and gifted by democracy-negating democratic leaders. It was always the covert quid pro quo of a fearful democratic state more inclined to control the future from the top down than to build it with trust from the bottom up. Two decades later, surveillance capitalism has failed any reasonable test of responsible global stewardship of digital information and communications.

The abdication of these information and communication spaces to surveillance capitalism has become the meta-crisis of every republic because it obstructs solutions to all other crises. It is astonishing to consider that our emergent information civilization is wholly dependent upon these “spaces,” yet they remain for sale or rent by any individual, corporation, politician, billionaire, megalomaniac, or billionaire megalomaniac, with no law to constrain their action, unlike almost any other form of property. The people are left to observe, shout, or cower on the sidelines, bystanders to their own pillage and its consequences in the uniquely abstract forms of subjugation described in these pages.

While the liberal democracies have begun to engage with the challenges of regulating today’s privately owned information spaces, the sober truth is that the regulation of institutionalized processes that are innately catastrophic for democracy cannot produce desired outcomes. Our societies have faced other institutions that imposed catastrophic consequences on people and society, such as human slavery and child labor. It was understood eventually that there is no bargaining with that which is categorically catastrophic, and movements arose to abolish those institutions. In today’s death match of institutional orders, the challenge again shifts from regulation to abolition as the only realistic path to reinvention.

The democratic order will not survive the contest over the politics of knowledge unless there is a reckoning with fundamental questions, starting with this: How do we organize and govern the global information and communication infrastructures of an information civilization in ways that sustain and advance democratic values, principles, aspirations, and governance? What institutions, rights, and laws are required for responsible stewardship and a free and flourishing information civilization?
The unified field analysis reveals the conflict that underlies this meta-crisis. The commodification of human behavior is the foundation of surveillance capitalism’s two-decade developmental arc expressed in its rapidly evolving complexity and institutional power over four already visible stages of development. We have seen that each stage is characterized by novel economic operations, governance takeovers, and social harms. Later-stage phenomena are effects of the foundational capabilities required for commodification of the human. The operations and harms at each developmental stage are, both individually and in aggregate, categorically incompatible with democratic society. It bears repeating that this conflict produces the zero-sum dynamic in which the deepening order of surveillance capitalism propagates democratic disorder and deinstitutionalization.

The unified field analysis suggests that democratic stewardship and reinvention require contradiction strategies that first freeze surveillance capitalism’s institutional development, then inhibit its reproduction and shift the global trajectory from dystopia to hope. The results must include surveillance capitalism’s deinstitutionalization, but they should also clear the way for the birth of new institutional forms that draw people, law, ideas, capital, technologies, and capabilities into democracy’s house. This means new zones of public governance aligned with the values, principles and aspirations of democratic societies and empowered to hold accountable both market and state to the rule of public law.

The mission here recalls Hercules’ death match with the Hydra of legend and its eight monstrous heads, each able to attack swiftly from every direction. As soon as Hercules brought down his club on one head, another sprouted in its place. Eventually he perceived the ninth “Immortal Head,” obscured and protected by the others. He realized that slaying the beast required hacking his way past the Hydra’s most visible action to reach its hidden source of power. Hercules severed the Immortal Head with the Golden Sword, given to him by Athena for that unique purpose. The fable endures because of what it teaches: with a clear grasp of an opponent’s source of power and a fit-for-purpose weapon, it is possible to succeed against a ferocious enemy that even gods believe invincible.

If the commodification of human behavior is both the foundation of surveillance capitalism and incompatible with democracy, then surveillance capitalism’s hidden source of power – its Immortal Head – is the secret massive-scale extraction of human-generated data. Secret extraction operationalizes behavior commodification and turns it toward prediction, profit, and concentrations of economic, governance and social powers. It follows, then, that the lawful abolition of secret massive-scale extraction is democracy’s Golden Sword that can interrupt the power source upon which all surveillance capitalism’s destructive economic operations, governance takeovers, and social harms depend.

The abolition of the primary human extraction that I have called theft is thus the single most effective strategy of democratic contradiction. It is the most likely to inaugurate a new chapter of institutional invention drawn from all the brilliance now clamoring at the gates held shut by the surveillance dividend. Abolition of these already illegitimate operations means no annexation of epistemic rights, no wholesale destruction of privacy, and no industrialized tons of behavioral signals flowing through the blind-by-design systems required to accommodate their scale and speed. Abolition eliminates the structural causes of epistemic chaos associated with the commodification of the human. It means no antidemocratic concentrations of knowledge about people, extreme epistemic inequality, or powerful microtargeting algorithms. The absence of massive-scale extraction enhances cybersecurity by reducing the data-rich attack surface of individuals, groups, and societies, substantially eliminating vulnerabilities to remote actuation, illicit behavioral governance, the fusion of state and market powers, and the artificial construction of the public square. The abolition of extraction resets the capabilities of computational behavioral prediction and its human futures markets, eradicating
the financial incentives for surveillance capitalism, summarized as the surveillance dividend, that fund its bid for systemic dominance, governance hegemony, and the desocialized mediation of society by proprietary systems controlled by absolute instrumentarian power.

Today’s discussions of content moderation and other active measures are hopeless a priori because they willfully ignore the scale and complexity of blind-by-design information flows, while naively insisting that the giants should voluntarily handicap themselves in the death match. Doomed from the start because action is weaker than institutionalization, these contradiction strategies inadvertently create the conditions for a cynical theater of self-regulation, deflecting the emergence of genuinely effective contradiction strategies.

Active measures directed at what is most visible, such as content moderation in all its forms, including labeling, warning, fact checking, slowdowns, takedowns, or suspensions, are each post-catastrophe, and therefore no match for the reproductive mechanisms identified in this discussion. These include automaticity and the systemic principles of blindness by design; democratic self-evisceration and the sustained absence of effective law; concealment and engineering for user ignorance; the surveillance dividend and the investment capabilities it both attracts and enables; concentrations of epistemic rights and the privacy they afford; Orwellian rhetorics of inevitability, misdirection, and disorientation; unprecedented concentrations of human data, knowledge, artificial intelligence capabilities, the epistemic inequality they produce, and the remote actuation they enable; new forms of political appeasement, declaration, colonization, and performative legitimacy; and, above all, the aura of inevitability emitted by this rogues’ gallery of self-reproduction. The abolition of secret extraction would land a decisive blow on each of these mechanisms and cripple institutional reproduction as currently construed.

The abolition of secret massive-scale extraction is also a more effective form of contradiction than post-factum active measures because it is content-neutral and does not threaten genuine freedom of expression. Instead, abolition liberates social discourse and information flows from the unnatural selection of profiteering commercial operations that breed digital violence by artificially interceding to favor lucrative information corruption over integrity. The abolition of secret extraction can produce the conditions in which genuine freedom of expression, social solidarity, common sense, and the integrity of social communications are restored. Deprived of algorithmic oxygen, digital violence slithers back into the shadows at the fringe.

Yes, the abolition of secret extraction of the human promises systemic change, but there is something more here, more subtle and more powerful. Responsible democratic stewardship is drawn into being as the democratic order stands up to surveillance capitalism’s antidemocratic counterrevolution – a Hercules for a new time. This standing up is a dedicated effort of comprehension and confrontation that speaks to every people’s yearning to escape the gravitational pull of the accidental dystopia toward which we hurtle.

The work of standing up is already in motion, evidenced by recent expressions of democratic contradiction that were unimaginable only a few years ago. As this power builds, the abolition of secret massive-scale extraction, once a distant thought experiment, can and should become the subject of urgent discussion. In the dialectic of the death match, abolition draws closer to inevitability as the threats to democratic societies escalate.

Highlights of this new wave include the European Union’s game-changing legislative developments. In 2022 the European Parliament’s historic passage of the Digital Services Act and the Digital Markets Act broke the sound barrier of surveillance capitalism’s aura of inevitability and began the work of asserting democratic governance over the tech giants and their ecosystems. The EU’s proposed regulatory regime for artificial intelligence and the European Declaration on Digital Rights and Principles for the Digital Decade are powerful
new expressions of democratic contradiction that, in concert with the new legislative Acts, create the conditions for the next step function leap forward in genuinely effective contradiction: the abolition of primary extraction and its redefinition as theft (European Commission, 2022a, 2022b; European Parliament, 2022).

The global operations of human data extraction are already contested among a vanguard of lawmakers and policymakers both in the EU and the United States who have rallied to the prospect of outlawing surveillance advertising (Bryant, 2021; European Parliament, 2021; Kaye, 2021). In Europe, civil society institutions delivered their “People’s Declaration” to the EU Parliament demanding an end to surveillance advertising (The People’s Declaration, n.d.). In the United States, Accountable Tech, a civil society institution focused on the nexus of technology and democracy, submitted a petition to the US Federal Trade Commission for rulemaking that would prohibit surveillance advertising (“Accountable Tech Petitions FTC to Ban Surveillance Advertising as an ‘Unfair Method of Competition’ (Press Release),” 2021). Significant legislation to ban surveillance advertising and microtargeting in political advertising has been introduced in the US Congress (Davis, 2021; Eshoo, 2020, 2022), followed by historic data protection and antitrust bills in the US House and Senate (Competition and Antitrust Law Enforcement Reform Act, 2021; American Data Privacy and Protection Act, 2022).

Meanwhile, the public conversation is moving fast. In 2022, the Nobel Peace Prize Committee, led by its 2021 recipients, journalists Maria Ressa and Dmitry Muratov, published “A 10-Point Plan to Address Our Information Crisis” beginning with its demand to “Bring an end to the surveillance-for-profit business model.”

	The vast machinery of corporate surveillance not only abuses our right to privacy, but allows our data to be used against us, undermining our freedoms and enabling discrimination. This unethical business model must be reined in globally, including by bringing an end to surveillance advertising that people never asked for and of which they are often unaware. (Ressa & Muratov, 2022)

In 2022, the US Federal Trade Commission solicited public comment on the prospect of “Commercial Surveillance and Data Security Rulemaking” (Federal Trade Commission, 2022). This too may augur a watershed.

These and other current expressions of democratic contradiction have the potential to ground the next critical phase of institutional reinvention. After two decades of Cook’s and Kalanick’s and Christensen’s and Schmidt’s veneration of institutional destruction, the unified field analysis suggests a different lesson. I call it Zuboff’s Fourth Law: Information is only as useful to society as the institutions, rights, and laws that govern its production and use.7 This is the reckoning we face, swept up in a new civilization where digital information and communications systems are owned and controlled by an economic institution that can neither value nor detect truth.

We struggle in this milieu of desocialized connection without institutional capabilities developed to failsafe rather than exploit the distance between sentience and world, a fissure that in other eras was healed by varied institutionalizations of “truth,” “trust,” “witness,” “accountability,” “responsibility,” “fact,” “fidelity,” and “meaning.” In some cases, these capabilities have been actively damaged or weakened, as in the destruction of the news industry and the democratic role of the Fourth Estate. In other cases, such capabilities, and the institutions to enact them, have not yet been developed, as illustrated in the many varieties of epistemic rights violations from location-data trafficking to the international shipping crisis of fake GPS coordinates that facilitates criminality at sea (Kurmanaev, 2022). New rights that are essential but still uncodified are mirrored by new and still nameless crimes.

The abolition of secret human extraction is a critical bridge to the reckoning with reinvention.

For example, the giants’ domination of artificial intelligence and its global market structure wholly depends upon their oligopolistic advantages in the secret massive-scale extraction of human data. This dominance, as we have seen, translates into global control over knowledge production and consumption. In contrast, the abolition of secret extraction would reverse the antidemocratic capture of the division of learning that elevates the few over the many–surveillance capitalism’s institutional interests over those of all people. It would tip the scales in the death match over the politics of knowledge, clearing the path for computational knowledge production that advances humanity in millions of new ways while tethered to the expansion and protection of democracy’s house and its inhabitants. It frees us to begin again in the spaces sold cheap by Clinton and Gore in 1997 and sold again in 2001. We begin again and reclaim the void, finally knowing what we have lost and what is at stake.

This paper concludes on the first page of a new chapter. History is a relay race, not a sprint. The baton passes to the democratic order, not as an ideal but as lived reality, because each citizen bears responsibility for mobilization, transformation, and stewardship. It passes to a new generation of students, artists and scholars charged with the urgent demand for creative thought and vision that pulls us back from the brink of dystopia and illuminates a new direction. It passes to journalists, now under siege, but critical to the reinvention of a Fourth Estate for our information civilization. It passes to citizens and lawmakers, bent on clawing back the future. It passes to all who reject dystopia and unaccountable power.

Democracy is not a condition to be taken lightly, discarded from impatience or inconvenience. Too many have sacrificed for the sake of it. Too many have perished for the lack of it. It is not programmable. It is under siege, because at its best, democracy negates absolute power and remains a dangerous obstacle to ambitions of systemic dominance, including those nursed by a new information oligarchy sozzled on computation. Without democracy, as Wendy Brown writes, “we lose the language and frame by which we are accountable to the present and entitled to make our own future, the language and frame with which we might contest the forces otherwise claiming that future” (W. Brown, 2015, p. 210).

Hold tight to this promise in the zero-sum clash of institutional orders and a still indeterminate fate.

Declaration of conflicting interests
The author declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding
The author received no financial support for the research, authorship, and/or publication of this article.

Note on Method
A note on my source materials: In this paper I occasionally draw upon a series of six extended interviews conducted in the months following the Apple-Google intervention in the EU. Among the six, two interviewees were seasoned Silicon Valley data scientists each employed by one of the two companies. Each had worked on various aspects of the design and launch of the Apple-Google Exposure Notification Protocol. Each was already known to me from prior interviews. I refer to them below as Data Scientist I and Data Scientist II (DSI, DSII). The other four interviewees were high-ranking public officials either in EU member states or the European Commission. I refer to these interviewees below as Public Official I, II, III, or IV (PO I-IV). Each of the four was directly involved in emergency decision-making during the course of the events described here.

The six interviewees each agreed to two extensive interview sessions, providing background and specific insights. On the promise of anonymity, each granted permission for their comments to be used at my discretion in future writing. No interviewee had knowledge of the other individuals with whom I spoke.

Most striking to me after so many decades of field interviews was the concurrence of perspectives across these many different roles and vantage points. Instead of the usual “blind man and the elephant,” their unique insider experience led all six to similar conclusions, especially as regards the larger significance of the Apple-Google intervention. I include their reflections here when they offer a useful means of triangulating, contextualizing, or deepening this discussion.

Notes
1. You instruct ads not to track you, but they persist—even on an iPhone (Fowler, 2021). There’s a US$16bn market for your location data (Location Intelligence Market Size & Share Report, 2022–2030, 2022). Cyber-mercenaries are the rage; most of their data is scraped from the giants (Bradshaw et al., 2021). Surveillance cameras are about to become truly cheap and ubiquitous, and anyone with the Bosch app can install, operate, and analyze the video data (Campbell & Jones, 2022). A “smart” light bulb tracks your heart rate (Tuohy, 2022). The new TV watches you (Fowler, 2019c). “Alexa” and all the other “assistants” retain all recordings to feed their artificial intelligence (Fowler, 2019a; Suliman, 2022). Facebook is developing brain-reading tech (Samuel, 2019). “Student Surveillance Services” “keep kids safe” by monitoring everything from biometrics to chats (Haskins, 2019). Cars are surveillance platforms (M. Anderson, 2019). General Motors has launched “behavior-based” driver insurance, with monitoring systems that track eye and head movements, and more (Bellon, 2022). Google’s Chrome browser operates as surveillance software (Fowler, 2019b). The “Ring Doorbell” sees it all (Grauer, 2022). Your face travels from that graduation photo, posted in gratitude and hope, to train Chinese facial recognition systems that stand watch over Uighur families in concentration camps (Murgia, 2019b) ...
2. “Knowledge in this sense is more than what is usually described as skill, and the division of knowledge of which we here speak more than is meant by the division of labor. To put it shortly, ‘skill’ refers only to the knowledge of which a person makes use in his trade, while the further knowledge about which we must know something in order to be able to say anything about the processes in society is the knowledge of alternative possibilities of action of which he makes no direct use. It may be added that knowledge, in the sense in which the term is here used, is identical with foresight only in the sense in which all knowledge is capacity to predict” (Hayek, 1980, p. 273).
3. For relevant commentary and insights see W. Brown, 2015; Mirowski, 2013; Wacquant, 2012.
4. “First they came for the Socialists, and I did not speak out—Because I was not a Socialist.
Then they came for the Trade Unionists, and I did not speak out—Because I was not a Trade Unionist.
Then they came for the Jews, and I did not speak out—Because I was not a Jew.
Then they came for me—and there was no one left to speak for me.” (Niemöller, 1950)
5. Alphabet/Google endeavored to develop a “Google City” on the Toronto waterfront, including its own forms of governance (Cardoso & O’Kane, 2019; Goodman & Powles, 2019). In 2021 Facebook blacked out its pages in Australia rather than negotiate with Parliament over a new legislative code that would require remunerating publishers for news content (Easton, 2021; Smyth et al., 2021; Smyth, 2021b). Whistleblower documents later revealed the takedown as a highly orchestrated extortion operation planned over seven months, overseen by Zuckerberg and Sandberg, and aimed at the Australian people and their government (K. Horwitz et al., 2022; Reset Australia, 2022; Smyth, 2021a; Whistleblower Aid, 2022). Facebook established an “Oversight Board” in 2020, a private governance body that aimed to protect principles of industry self-regulation (Klonick, 2020; Lapowsky, 2020; Lewin, 2021; M. Roberts, 2019).
6. The Swiss Federal Institute of Technology in Zürich.
7. Zuboff’s Fourth Law is a corollary of Zuboff’s Three Laws. Formulated in the mid-1990s and grounded in 20 years of clinical observations of “workplace computerization,” the three laws summarized and predicted behavior in the economic domain. Zuboff’s Three Laws: Assuming the dominant economic paradigm, (1) everything that can be automated will be automated; (2) everything that can be informated will be informated; (3) all digitally produced data that can be used for surveillance and control will be used for surveillance and control in the absence of countervailing rights, laws, contracts, rules, or sanctions (Zuboff, 2013, sec. 3).

References
303 Scientists and Researchers. (2020). Joint Statement on Contact Tracing. https://drive.google.com/file/d/1OQg2dxPu-x-RZzETlpV-3lFa259Nrpk1J/view
Abboud, L., & Miller, J. (2020, June 23). French give cool reception to Covid-19 contact-tracing app. Financial Times. https://www.ft.com/content/255567d5-b7ec-4fbe-b8a9-833b3a23f665
Abboud, L., Miller, J., & Espinoza, J. (2020, May 10). How Europe splintered over contact tracing apps. Financial Times. https://www.ft.com/content/7416269b-0477-4a29-815d-7e4ee8100c10
Abrams et al. v. United States, 250 U.S. 616 (1919). https://www.law.cornell.edu/supremecourt/text/250/616
Accountable Tech & GQR. (2021). America’s Views on Surveillance Advertising. Accountable Tech. https://accountabletech.org/research/surveillance-advertising/
Accountable Tech Petitions FTC to Ban Surveillance Advertising as an ‘Unfair Method of Competition’ (Press Release). (2021, September 28). Accountable Tech. https://accountabletech.org/media/accountable-tech-petitions-ftc-to-ban-surveillance-advertising-as-an-unfair-method-of-competition/
Acemoglu, D. (2021). Harms of AI (Working Paper No. 29247; Working Paper Series). National Bureau of Economic Research. https://doi.org/10.3386/w29247
Acemoglu, D., & Restrepo, P. (2021). Tasks, Automation, and the Rise in US Wage Inequality (Working Paper No. 28920; Working Paper Series). National Bureau of Economic Research. https://doi.org/10.3386/w28920
Agbehadji, I. E., Awuzie, B. O., Ngowi, A. B., & Millham, R. C. (2020). Review of Big Data Analytics, Artificial Intelligence and Nature-Inspired Computing Models towards Accurate Detection of COVID-19 Pandemic Cases and Contact Tracing. International Journal of Environmental Research and Public Health, 17(15), 5330. https://doi.org/10.3390/ijerph17155330
Ahmad, N., & Salvadori, K. (2020, May 13). Building a transformative subsea cable to better connect Africa. Engineering at Meta. https://engineering.fb.com/2020/05/13/connectivity/2africa/
Aiello, A. E., Renson, A., & Zivich, P. (2020). Social media- and internet-based disease surveillance for public health. Annual Review of Public Health, 41, 101–118. https://doi.org/10.1146/annurev-publhealth-040119-094402
Akinwotu, E. (2021, October 7). Facebook’s role in Myanmar and Ethiopia under new scrutiny. The Guardian. https://www.theguardian.com/technology/2021/oct/07/facebooks-role-in-myanmar-and-ethiopia-under-new-scrutiny
Alegre, S. (2022). Freedom to Think: The Long Struggle to Liberate Our Minds. Atlantic Books Limited.
Alizada, N., Cole, R., Gastaldi, L., Grahn, S., Hellmeier, S., Kolvani, P., Lachapelle, J., Lührmann, A., Maerz, S. F., Pillai, S., & Lindberg, S. I. (2021). Autocratization Turns Viral: Democracy Report 2021. V-Dem Institute. https://www.v-dem.net/static/website/files/dr/dr_2021.pdf
Allam, H. (2020, December 15). Right-Wing Embrace Of Conspiracy Is “Mass Radicalization,” Experts Warn. NPR. https://www.npr.org/2020/12/15/946381523/right-wing-embrace-of-conspiracy-is-mass-radicalization-experts-warn
Allcott, H., & Gentzkow, M. (2017). Social Media and Fake News in the 2016 Election. Journal of Economic Perspectives, 31(2), 211–235.
Allen, A. L., & Rotenberg, M. (2016). Privacy Law and Society (3rd ed.). West Academic Publishing.
Allington, D., Duffy, B., Wessely, S., Dhavan, N., & Rubin, J. (2021). Health-protective behaviour, social media usage and conspiracy belief during the COVID-19 public health emergency. Psychological Medicine, 51(10), 1763–1769. https://doi.org/10.1017/S003329172000224X
Amazon. (2022, July 21). Amazon and One Medical Sign an Agreement for Amazon to Acquire One Medical. Amazon Press Center. https://press.aboutamazon.com/news-releases/news-release-details/amazon-and-one-medical-sign-agreement-amazon-acquire-one-medical/
Amazon Web Services. (2019, August 12). Amazon Rekognition improves Face Analysis. Amazon Web Services, Inc. https://aws.amazon.com/about-aws/whats-new/2019/08/amazon-rekognition-improves-face-analysis/
American Data Privacy and Protection Act. (2022). H.R.8152, United States House of Representatives, 117. http://www.congress.gov/

Amnesty International. (2020, April 3). COVID-19, digital surveillance and the threat to your rights. Amnesty International. https://www.amnesty.org/en/latest/news/2020/04/covid-19-surveillance-threat-to-your-rights/
Anderson, J., & Rainie, L. (2020, February 21). Many Tech Experts Say Digital Disruption Will Hurt Democracy. Pew Research Center. https://www.pewresearch.org/internet/2020/02/21/many-tech-experts-say-digital-disruption-will-hurt-democracy/
Anderson, M. (2019, June 24). The Self-Driving Car Is a Surveillance Tool. IEEE Spectrum. https://spectrum.ieee.org/surveillance-and-the-selfdriving-car
Apple Newsroom. (2020, April 10). Apple and Google partner on COVID-19 contact tracing technology. Apple Newsroom. https://www.apple.com/newsroom/2020/04/apple-and-google-partner-on-covid-19-contact-tracing-technology/
Archer, K., & Levey, I. R. (2020, March 20). Trust in Government Lacking on COVID-19’s Frontlines. Gallup Blog. https://news.gallup.com/opinion/gallup/296594/trust-government-lacking-frontlines-covid.aspx
Arendt, H. (2004). The Origins of Totalitarianism. Schocken.
Atlantic Council. (2021). Atlantic Council’s DFRLab publishes new report in Just Security: #StopTheSteal: A timeline of social media and extremist activities leading up to 1/6 insurrection. Atlantic Council. https://www.atlanticcouncil.org/in-depth-research-reports/report/atlantic-councils-dfrlab-publishes-new-report-in-just-security-stopthesteal-a-timeline-of-social-media-and-extremist-activities-leading-up-to-1-6-insurrection/
Avaaz. (2020). Facebook’s Algorithm: A Major Threat to Public Health. Avaaz. https://secure.avaaz.org/campaign/en/facebook_threat_health/
Bak-Coleman, J. B., Alfano, M., Barfuss, W., Bergstrom, C. T., Centeno, M. A., Couzin, I. D., Donges, J. F., Galesic, M., Gersick, A. S., Jacquet, J., Kao, A. B., Moran, R. E., Romanczuk, P., Rubenstein, D. I., Tombak, K. J., Bavel, J. J. V., & Weber, E. U. (2021). Stewardship of global collective behavior. Proceedings of the National Academy of Sciences, 118(27). https://doi.org/10.1073/pnas.2025764118
Balkin, J. (2008). The Constitution in the National Surveillance State. Minnesota Law Review, 93(1). https://openyls.law.yale.edu/handle/20.500.13051/1545
Balkin, J. M. (2017). Free Speech in the Algorithmic Society: Big Data, Private Governance, and New School Speech Regulation (SSRN Scholarly Paper No. 3038939). https://doi.org/10.2139/ssrn.3038939
Banco, E., & Ackerman, S. (2020, April 21). Team Trump Turns to Peter Thiel’s Palantir to Track Virus. The Daily Beast. https://www.thedailybeast.com/trump-administration-turns-to-peter-thiels-palantir-to-track-coronavirus
Barraud, E. (2020, May 25). First pilot for the Google and Apple-based decentralised tracing app. EPFL News. https://actu.epfl.ch/news/first-pilot-for-the-google-and-apple-based-decentr/
Barry, E. (2020, April 16). An Army of Virus Tracers Takes Shape in Massachusetts. The New York Times. https://www.nytimes.com/2020/04/16/us/coronavirus-massachusetts-contact-tracing.html
Bass, D., & Brustein, J. (2020, March 16). Big Tech Swallows Most of the Hot AI Startups. Bloomberg.Com. https://www.bloomberg.com/news/articles/2020-03-16/big-tech-swallows-most-of-the-hot-ai-startups
Bay, J. (2020, April 14). Automated contact tracing is not a coronavirus panacea. Medium. https://web.archive.org/web/20200419012441/https:/blog.gds-gov.tech/automated-contact-tracing-is-not-a-coronavirus-panacea-57fb3ce61d98?gi=89919a6fa2e
Bellon, T. (2022, January 24). GM set to launch behavior-based U.S. driver insurance in Q1—Executive. Reuters. https://www.reuters.com/business/autos-transportation/gm-set-launch-behavior-based-us-driver-insurance-q1-executive-2022-01-24/
Benesch, S. (2021, October 30). Nobody Can See Into Facebook. The Atlantic. https://www.theatlantic.com/ideas/archive/2021/10/facebook-oversight-data-independent-research/620557/
Berger, P. L., & Luckmann, T. (1966). The Social Construction of Reality: A Treatise in the Sociology of Knowledge. Random House.
Bharat, K., Lawrence, S., & Sahami, M. (2016). Generating user information for use in targeted advertising (United States Patent No.

US9235849B2). https://patents.google.com/patent/US9235849/en
Bhattacharyya, S. (2022, January 24). Meta Unveils New AI Supercomputer. Wall Street Journal. https://www.wsj.com/articles/meta-unveils-new-ai-supercomputer-11643043601
Biddle, S. (2018a, March 14). Facebook Quietly Hid Webpages Bragging of Ability to Influence Elections. The Intercept. https://theintercept.com/2018/03/14/facebook-election-meddling/
Biddle, S. (2018b, April 13). Facebook Uses Artificial Intelligence to Predict Your Future Actions for Advertisers, Says Confidential Document. The Intercept. https://theintercept.com/2018/04/13/facebook-advertising-data-artificial-intelligence-ai/
Biddle, S. (2022, July 13). Amazon Admits Giving Ring Camera Footage to Police Without a Warrant or Consent. The Intercept. https://theintercept.com/2022/07/13/amazon-ring-camera-footage-police-ed-markey/
Binns, R. (2022). Tracking on the Web, Mobile and the Internet of Things. Foundations and Trends® in Web Science, 8(1–2), 1–113. https://doi.org/10.1561/1800000029
Birhane, A. (2020). Algorithmic Colonization of Africa. SCRIPTed, 17(2), 389–409. https://doi.org/10.2966/scrip.170220.389
Bobrowsky, M. (2021, August 4). Facebook Disables Access for NYU Research Into Political-Ad Targeting. Wall Street Journal. https://www.wsj.com/articles/facebook-cuts-off-access-for-nyu-research-into-political-ad-targeting-11628052204
Bond, R. M., Fariss, C. J., Jones, J. J., Kramer, A. D. I., Marlow, C., Settle, J. E., & Fowler, J. H. (2012). A 61-million-person experiment in social influence and political mobilization. Nature, 489(7415), 295–298. https://doi.org/10.1038/nature11421
Bradford, L. R., Aboy, M., & Liddell, K. (2020). COVID-19 Contact Tracing Apps: A Stress Test for Privacy, the GDPR and Data Protection Regimes (SSRN Scholarly Paper No. 3617578). https://papers.ssrn.com/abstract=3617578
Bradshaw, S., Bailey, H., & Howard, P. N. (2021). Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation. Project on Computational Propaganda. https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/127/2021/01/CyberTroop-Report-2020-v.2.pdf
Brown, I. (2012). Government access to private-sector data in the United Kingdom. International Data Privacy Law, 2(4), 230–238. https://doi.org/10.1093/idpl/ips018
Brown, W. (2015). Undoing the Demos: Neoliberalism’s Stealth Revolution. Zone Books.
Browning, K., & Mac, R. (2022, May 16). After Buffalo Shooting Video Spreads, Social Platforms Face Questions. The New York Times. https://www.nytimes.com/2022/05/15/business/buffalo-shooting-social-media.html
Bruns, A. (2018). Facebook shuts the gate after the horse has bolted, and hurts real research in the process. Internet Policy Review. https://policyreview.info/articles/news/facebook-shuts-gate-after-horse-has-bolted-and-hurts-real-research-process/786
Bruns, A. (2019). After the ‘APIcalypse’: Social media platforms and their fight against critical scholarly research. Information, Communication & Society, 22(11), 1544–1566. https://doi.org/10.1080/1369118X.2019.1637447
Bryant, J. (2021, June 15). A trans-Atlantic discussion on “surveillance-based advertising.” IAPP - The Privacy Advisor. https://iapp.org/news/a/a-transatlantic-discussion-on-surveillance-based-advertising/
Burgin, A. (2012). The Great Persuasion: Reinventing Free Markets since the Depression. Harvard University Press.
Burke, G., & Dearen, J. (2022, September 2). Tech tool offers police ‘mass surveillance on a budget.’ Associated Press. https://apnews.com/article/technology-police-government-surveillance-d395409ef5a8c6c3f6cdab5b1d0e27ef
Califf, R. (2022, May 7). FDA chief explains why misinformation is leading cause of death in US - CNN (Brown, P., Interviewer) [Interview]. https://www.cnn.com/videos/health/2022/05/07/fda-robert-califf-intv-misinformation-death-sot-vpx.cnn
Campbell, Z., & Jones, C. (2022, February 11). Kitchen Appliance Maker Wants to Revolutionize Video Surveillance. The Intercept. https://theintercept.com/2022/02/11/surveillance-video-ai-bosch-azena/
Cancryn, A. (2020, April 7). Kushner’s team seeks national coronavirus surveillance system. https://www.politico.com/news/2020/04/07/kushner-coronavirus-surveillance-174165
Cardoso, T., & O’Kane, J. (2019, October 30). Sidewalk Labs document reveals company’s

early vision for data collection, tax powers, criminal justice. The Globe and Mail. https://www.theglobeandmail.com/business/article-sidewalk-labs-document-reveals-companys-early-plans-for-data/
Cate, F. H., & Dempsey, J. X. (Eds.). (2017). Bulk Collection: Systematic Government Access to Private-Sector Data. Oxford University Press. https://doi.org/10.1093/oso/9780190685515.001.0001
CCDH & Anti-Vax Watch. (2021). The Disinformation Dozen: The Sequel. Centre for Countering Digital Hate. https://counterhate.com/research/disinformation-dozen-the-sequel/
Chafkin, M. (2021). The Contrarian: Peter Thiel and Silicon Valley’s Pursuit of Power. Penguin Press.
Chander, A. (2014). How Law Made Silicon Valley. Emory Law Journal, 63(3), 639.
Chander, A., & Le, U. P. (2014). Free Speech. Iowa Law Review, 100(501). https://doi.org/10.2139/ssrn.2320124
Chen, G. J., Wiener, J. L., Iyer, S., Jaiswal, A., Lei, R., Simha, N., Wang, W., Wilfong, K., Williamson, T., & Yilmaz, S. (2016). Realtime Data Processing at Facebook. Proceedings of the 2016 International Conference on Management of Data, 1087–1098. https://doi.org/10.1145/2882903.2904441
Christensen, C. M., Skok, D., & Allworth, J. (2012, September 15). Breaking News: Mastering the art of disruptive innovation in journalism. Nieman Reports. https://niemanreports.org/articles/breaking-news/
Chung, J. (2021). Big Tech, Big Cash: Washington’s New Power Players (p. 34). Public Citizen.
Cinelli, M., De Francisci Morales, G., Galeazzi, A., Quattrociocchi, W., & Starnini, M. (2021). The echo chamber effect on social media. Proceedings of the National Academy of Sciences, 118(9), e2023301118. https://doi.org/10.1073/pnas.2023301118
Citron, D., & Franks, M. (2020). The Internet as a Speech Machine and Other Myths Confounding Section 230 Reform. University of Chicago Legal Forum, 2020(1), 45–76.
Citron, D. K. (2022). The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age. W. W. Norton & Company. https://www.daniellecitron.com/the-fight-for-privacy-protecting-dignity-identity-and-love-in-our-digital-age/
Citron, D. K., & Solove, D. J. (2022). Privacy Harms. Boston University Law Review, 102. https://doi.org/10.2139/ssrn.3782222
Clammer, J. R. (1976). Literacy and Social Change: A Case Study of Fiji. Brill.
Clanchy, M. T. (1979). From Memory to Written Record, England, 1066-1307. Edward Arnold.
Clinton, P. W. J., & Gore, V. P. A., Jr. (1997). A Framework For Global Electronic Commerce. The White House. https://clintonwhitehouse4.archives.gov/WH/New/Commerce/read.html
Cohen, N. (2009, October 11). Is It a Day to Be Happy? Check the Index. The New York Times. https://www.nytimes.com/2009/10/12/technology/internet/12link.html
Collier, K., & Burke, M. (2022, August 9). Facebook turned over chat messages between mother and daughter now charged over abortion. NBC News. https://www.nbcnews.com/tech/tech-news/facebook-turned-chat-messages-mother-daughter-now-charged-abortion-rcna42185
Competition and Antitrust Law Enforcement Reform Act. (2021). S.225, United States Senate, 117. http://www.congress.gov/
Constella. (2021). Polarization as an Emerging Source of Digital Risk (Polarization and Threat Intelligence Analysis, p. 11). Constella Intelligence. https://info.constellaintelligence.com/hubfs/PDFs/polarization-digital-risk-analysis-2109.pdf?_hsmi=159025854&_hsenc=p2ANqtz-8LUZ1uoodqx5WDbX2PXgGfgX-4tJi0zytPQiHcmCHWF9bqWdhIPXJJV3G-GM23qAWt2ACkimr1i160t5MmEAnLHTo-bzywA
Corbet, S., & Chan, K. (2020, June 2). French virus tracing app goes live amid debate over privacy. AP NEWS. https://apnews.com/article/understanding-the-outbreak-virus-outbreak-germany-paris-france-d67ef2b17e287677e538fe3876a9df41
Corn, G. P., & Taylor, R. (2017). Sovereignty in the Age of Cyber. AJIL Unbound, 111, 207–212. https://doi.org/10.1017/aju.2017.57
Creţu, A.-M., Monti, F., Marrone, S., Dong, X., Bronstein, M., & de Montjoye, Y.-A. (2022). Interaction data are identifiable even across long periods of time. Nature Communications, 13(1), 313. https://doi.org/10.1038/s41467-021-27714-6
Cunningham-Cook, M. (2021, January 14). Arizona GOP Chair Urged Violence at the Capitol. The Mercers Spent $1.5 Million Supporting Her. The

Intercept. https://theintercept.com/2021/01/14/ Donovan, J. (2020). Social-media companies must


capitol-riot-mercers-election-unrest/ flatten the curve of misinformation. Nature.
Davis, W. (2021, March 25). Lawmakers Ready Bill https://doi.org/10.1038/d41586-020-01107-z
To Ban “Surveillance Advertising.” Media Post Dunn, J. (2016, May 9). Introducing FBLearner
- Policy Blog. https://www.mediapost.com/publications/article/361757/lawmakers-ready-bill-to-ban-surveillance-advertis.html
Dawson, J. (2021). Microtargeting as Information Warfare. The Cyber Defense Review. https://doi.org/10.31235/osf.io/5wzuq
De Vynck, G., Drozdiak, N., & Fouquet, H. (2020, May 13). Apple-Google Virus-Tracking Rules Put Apps in a Privacy Bind. Bloomberg. https://www.bloomberg.com/news/articles/2020-05-13/apple-google-virus-tracking-rules-put-apps-in-a-privacy-bind
De Vynck, G., & Zakrzewski, C. (2021, December 29). As omicron washes over America, much of the country still isn't using exposure notification apps. Washington Post. https://www.washingtonpost.com/technology/2021/12/29/omicron-exposure-notification-apple/
Dehaye, P.-O., & Reardon, J. (2020). Proximity Tracing in an Ecosystem of Surveillance Capitalism. In Proceedings of the 19th Workshop on Privacy in the Electronic Society (pp. 191–203). Association for Computing Machinery. https://doi.org/10.1145/3411497.3420219
Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G., Stanley, H. E., & Quattrociocchi, W. (2016). The spreading of misinformation online. Proceedings of the National Academy of Sciences, 113(3), 554–559. https://doi.org/10.1073/pnas.1517441113
Díaz, Á. (2020, December 21). Law Enforcement Access to Smart Devices. Brennan Center for Justice. https://www.brennancenter.org/our-work/research-reports/law-enforcement-access-smart-devices
Diaz, A., & Birnbaum, E. (2022, July 21). Amazon Breaks Lobbying Record Amid Antitrust Fight. Bloomberg.com. https://www.bloomberg.com/news/articles/2022-07-21/amazon-breaks-lobbying-record-amid-antitrust-fight
Doctor Radio NYU. (2022, May 6). Commissioner of Food and Drugs at the FDA Dr. Robert M. Califf says misinformation is the leading cause of death [Recorded by R. M. Califf]. https://www.siriusxm.com/clips/clip/e7adfb79-ca09-4825-b8ca-1aa0c124dea0/d33de290-bbe4-48a7-8b90-a5f34f3b976b
Flow: Facebook's AI backbone. Engineering at Meta. https://engineering.fb.com/2016/05/09/core-data/introducing-fblearner-flow-facebook-s-ai-backbone/
Durkheim, É. (1964). The Division of Labor in Society (G. Simpson, Trans.). The Free Press.
Dwoskin, E., Timberg, C., & Romm, T. (2020, June 28). Zuckerberg once wanted to sanction Trump. Then Facebook wrote rules that accommodated him. Washington Post. https://www.washingtonpost.com/technology/2020/06/28/facebook-zuckerberg-trump-hate/
Easton, W. (2021, February 17). Changes to Sharing and Viewing News on Facebook in Australia. Meta. https://about.fb.com/news/2021/02/changes-to-sharing-and-viewing-news-on-facebook-in-australia/
Ebenstein, L. (2012). The Indispensable Milton Friedman: Essays on Politics and Economics. Regnery Publishing.
Edelson, L., Nguyen, M.-K., Goldstein, I., Goga, O., Lauinger, T., & McCoy, D. (2021). Understanding engagement with U.S. (mis)information news sources on Facebook. Proceedings of the 21st ACM Internet Measurement Conference, 444–463. https://doi.org/10.1145/3487552.3487859
Edwards, D. (2011). I'm Feeling Lucky: The Confessions of Google Employee Number 59. Houghton Mifflin Harcourt.
Eshoo, A. G. (2020, May 26). Rep. Eshoo Introduces Bill to Ban Microtargeted Political Ads (Press Release). Congresswoman Anna G. Eshoo. https://eshoo.house.gov/media/press-releases/rep-eshoo-introduces-bill-ban-microtargeted-political-ads
Eshoo, A. G. (2022, January 18). Eshoo, Schakowsky, Booker Introduce Bill to Ban Surveillance Advertising (Press Release). Congresswoman Anna G. Eshoo. https://eshoo.house.gov/media/press-releases/eshoo-schakowsky-booker-introduce-bill-ban-surveillance-advertising
European Commission. (2020, April 16). Coronavirus: An EU approach for efficient contact tracing [Text]. European Commission. https://ec.europa.eu/commission/presscorner/detail/en/ip_20_670
European Commission. (2022a, May 4). European Digital Rights and Principles. European
Commission. https://digital-strategy.ec.europa.eu/en/policies/digital-principles
European Commission. (2022b, August 11). A European approach to artificial intelligence. European Commission. https://digital-strategy.ec.europa.eu/en/policies/european-approach-artificial-intelligence
European Parliament. (2020, April 17). Texts Adopted: EU coordinated action to combat the COVID-19 pandemic and its consequences. European Parliament. https://www.europarl.europa.eu/doceo/document/TA-9-2020-0054_EN.pdf
European Parliament. (2021). Opinion of the Committee on Civil Liberties, Justice and Home Affairs for the Committee on the Internal Market and Consumer Protection on the proposal for a regulation of the European Parliament and of the Council on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC. Committee on Civil Liberties, Justice and Home Affairs. https://www.europarl.europa.eu/doceo/document/LIBE-AD-692898_EN.pdf
European Parliament. (2022, May 7). Digital Services: Landmark rules adopted for a safer, open online environment. European Parliament News. https://www.europarl.europa.eu/news/en/press-room/20220701IPR34364/digital-services-landmark-rules-adopted-for-a-safer-open-online-environment
Evanega, S., Lynas, M., Adams, J., & Smolenyak, K. (2020). Coronavirus misinformation: Quantifying sources and themes in the COVID-19 "infodemic." JMIR Preprints, 19(10), 13.
Facebook Ad and Business Product Team. (2021). Facebook Data Lineage Internal Document: ABP Privacy Infra, Long Range Investments [A/C Priv]. Facebook Internal Document. https://s3.documentcloud.org/documents/21716382/facebook-data-lineage-internal-document.pdf
Faife, C., & Ng, A. (2021, June 24). After Repeatedly Promising Not to, Facebook Keeps Recommending Political Groups to Its Users. The Markup. https://themarkup.org/citizen-browser/2021/06/24/after-repeatedly-promising-not-to-facebook-keeps-recommending-political-groups-to-its-users
Feathers, T. (2021, October 29). Leaked Facebook Documents Reveal How Company Failed on Election Promise. The Markup. https://themarkup.org/citizen-browser/2021/10/29/leaked-facebook-documents-reveal-how-company-failed-on-election-promise
Feathers, T., Fondrie-Teitler, S., Waller, A., & Mattu, S. (2022, June 16). Facebook Is Receiving Sensitive Medical Information from Hospital Websites. The Markup. https://themarkup.org/pixel-hunt/2022/06/16/facebook-is-receiving-sensitive-medical-information-from-hospital-websites
Federal Trade Commission. (2022, August 11). Federal Register Notice: Commercial Surveillance and Data Security Rulemaking. https://www.ftc.gov/legal-library/browse/federal-register-notices/commercial-surveillance-data-security-rulemaking
Feiner, L. (2019, January 8). Apple CEO Tim Cook speaks with CNBC's Jim Cramer: Full transcript. CNBC. https://www.cnbc.com/2019/01/08/apple-ceo-tim-cook-interview-cnbc-jim-cramer-transcript.html
Feuer, W. (2021, December 13). Facebook exec blames company's users for spreading misinformation. New York Post. https://nypost.com/2021/12/13/facebook-exec-blames-users-for-spreading-misinformation/
Fidler, M. (2021). The New Editors: Refining First Amendment Protections For Internet Platforms. Journal on Emerging Technologies, 2(2). https://ndlsjet.com/fidler-the-new-editors-refining-first-amendment-protections-for-internet-platforms/
Fioretti, J. (2018, April 25). EU targets 20 billion euro investment in AI to catch up with US, Asia. Reuters. https://www.reuters.com/article/eu-artificialintelligence-idUSL8N1S26LG
Fitzgerald, M., & Crider, C. (2020, May 7). We need urgent answers about the massive NHS COVID data deal. OpenDemocracy. https://www.opendemocracy.net/en/ournhs/we-need-urgent-answers-about-massive-nhs-covid-data-deal/
Fitzpatrick, A. (2018, December 6). An Inside Look at Apple's Biggest Step Yet in Health Care. Time. https://time.com/5472329/apple-watch-ecg/
Fleischer, V. (2010). Regulatory Arbitrage (No. 1567212). SSRN Scholarly Paper. https://doi.org/10.2139/ssrn.1567212
Flichy, P. (2007). The Internet Imaginaire. MIT Press.
Flyverbom, M. (2022). Overlit: Digital Architectures of Visibility. Organization Theory, 3(3), 26317877221090310. https://doi.org/10.1177/26317877221090314
Fowler, G. A. (2019a, May 6). Alexa has been eavesdropping on you this whole time. The
Washington Post. https://www.washingtonpost.com/technology/2019/05/06/alexa-has-been-eavesdropping-you-this-whole-time/
Fowler, G. A. (2019b, June 21). Goodbye, Chrome: Google's Web browser has become spy software. The Washington Post. https://www.washingtonpost.com/technology/2019/06/21/google-chrome-has-become-surveillance-software-its-time-switch/
Fowler, G. A. (2019c, September 18). You watch TV. Your TV watches back. The Washington Post. https://www.washingtonpost.com/technology/2019/09/18/you-watch-tv-your-tv-watches-back/
Fowler, G. A. (2021, September 23). When you "Ask app not to track," some iPhone apps keep snooping anyway. The Washington Post. https://www.washingtonpost.com/technology/2021/09/23/iphone-tracking/
Fraga, B. L., McElwee, S., Rhodes, J., & Schaffner, B. F. (2017, May 8). Why did Trump win? More whites—and fewer blacks—actually voted. Washington Post. https://www.washingtonpost.com/news/monkey-cage/wp/2017/05/08/why-did-trump-win-more-whites-and-fewer-blacks-than-normal-actually-voted/
Franceschi-Bicchierai, L. (2022, April 26). Facebook Doesn't Know What It Does With Your Data, Or Where It Goes: Leaked Document. Motherboard. https://web.archive.org/web/20220709113629/https://www.vice.com/en/article/akvmke/facebook-doesnt-know-what-it-does-with-your-data-or-where-it-goes
Frantz, E., Kendall-Taylor, A., & Wright, J. (2020). Digital Repression in Autocracies (Users Working Paper 2020:27; p. 54). The Varieties of Democracy Institute.
Frenkel, S. (2021, July 19). White House Dispute Exposes Facebook Blind Spot on Misinformation. The New York Times. https://www.nytimes.com/2021/07/19/technology/facebook-misinformation-blind-spot.html
Frenkel, S., Alba, D., & Zhong, R. (2020, March 8). Surge of Virus Misinformation Stumps Facebook and Twitter. The New York Times. https://www.nytimes.com/2020/03/08/technology/coronavirus-misinformation-social-media.html
Friedman, M. (1970, September 13). A Friedman doctrine—The Social Responsibility Of Business Is to Increase Its Profits. The New York Times. https://www.nytimes.com/1970/09/13/archives/a-friedman-doctrine-the-social-responsibility-of-business-is-to.html
Friedman, M. (1976). The Fragility of Freedom. BYU Studies Quarterly, 16(4). https://scholarsarchive.byu.edu/byusq/vol16/iss4/12
Friedman, M. (2002). Capitalism and Freedom (40th anniversary edition, with a new foreword). University of Chicago Press.
Friedman, M. (2017a). Milton Friedman on Freedom: Selections from The Collected Works of Milton Friedman (Kindle). Hoover Institute Press.
Friedman, M. (2017b). The Future of Capitalism. Hoover Institution. https://www.hoover.org/research/future-capitalism
Friedman, M., Piñera, J., de Castro, S., Kaiser, A., & Bellolio, J. (2012). Un legado de libertad: Milton Friedman en Chile [A legacy of liberty: Milton Friedman in Chile] (A. Soto, Ed.). Fundación para el Progreso / Atlas Economic Research Foundation / Fundación Jaime Guzmán / Instituto Democracia y Mercado. https://www.fppchile.cl/wp-content/uploads/2014/09/Libro-Friedman-version-completa.pdf
Future of Tech Commission. (2021, September 23). Poll: Nine out of Ten Voters Support Strong Online Privacy Protections While Eight Out of Ten Strongly Support Holding Social Media Companies More Accountable for "Illegal and Harmful Content" Posted on Their Sites. Future of Tech Commission. https://www.futureoftechcommission.org/press-release-launch-poll
Gao, J., Zhang, Y.-C., & Zhou, T. (2019). Computational socioeconomics. Physics Reports, 817, 1–104. https://doi.org/10.1016/j.physrep.2019.05.002
Gellman, B. (2020). Dark Mirror: Edward Snowden and the American Surveillance State. Penguin Press.
Germani, F., & Biller-Andorno, N. (2020). The anti-vaccination infodemic on social media: A behavioral analysis. PLoS One, 16. https://doi.org/10.1101/2020.12.07.20223370
Ghaffary, S. (2021, August 6). People do not trust that Facebook is a healthy ecosystem. Vox. https://www.vox.com/recode/22612151/laura-edelson-facebook-nyu-ad-observatory-social-media-researcher
Gibney, E. (2019). Privacy hurdles thwart Facebook democracy research. Nature, 574(7777), 158–160.
Gilbert, B. (2020, January 8). Facebook is the reason Trump got elected, says Facebook exec who ran advertising during the 2016 election, "but not
for the reasons anyone thinks." Business Insider. https://www.businessinsider.com/facebook-ads-elected-trump-andrew-bosworth-says-2020-1
Gilbert, D. (2021, January 15). Steve Bannon Urged Facebook Followers to "Take Action" on Eve of Capitol Riot. Vice. https://www.vice.com/en/article/n7vqgb/steve-bannon-urged-facebook-followers-to-take-action-on-eve-of-capitol-riot
Gingerich, J. (2021). Is Spotify Bad for Democracy? Artificial Intelligence, Cultural Democracy, and Law. Yale Journal of Law and Technology, 24(227), 90.
Global Witness. (2022a, July 28). Facebook approves ads calling for ethnic violence in the lead up to a tense Kenyan election. Global Witness. https:///en/press-releases/facebook-approves-ads-calling-ethnic-violence-lead-tense-kenyan-election/
Global Witness. (2022b, August 17). Rehashing existing policies by Facebook not enough to combat election disinformation in Brazil. Global Witness. https:///en/press-releases/rehashing-existing-policies-facebook-not-enough-combat-election-disinformation-brazil/
Gofman, M. (2021, June 23). AI Brain Drain. Simon Business School, University of Rochester - Simon Blog: Dean's Corner. https://simon.rochester.edu/blog/deans-corner/brain-drain
Gofman, M., & Jin, Z. (2022). Artificial Intelligence, Education, and Entrepreneurship. https://doi.org/10.2139/ssrn.3449440
Goldstein, A. (2021, January 27). Social Media Engagement with Deceptive Sites Reached Record Highs in 2020. German Marshall Fund. https://www.gmfus.org/news/social-media-engagement-deceptive-sites-reached-record-highs-2020
Goodman, E. P., & Powles, J. (2019). Urbanism under Google: Lessons from Sidewalk Toronto. Fordham Law Review, 88, 457.
Goody, J. (1986). The Logic of Writing and the Organization of Society. Cambridge University Press. http://hdl.handle.net/2027/heb.05702
Google Inc. (2004). Form 10-K 2004. https://content.edgar-online.com/ExternalLink/EDGAR/0001193125-05-065298.html?hash=52891c0ad117912f1ca1079cbd630a820a2f65e9db0cfcd50e8d855a989fc2fb&dest=DEX100801_HTM#D10K_HTM_TOC10062_9
Grafton, A., Blair, A., Duguid, P., & Goeing, A.-S. (Eds.). (2021). Communication, Computation and Information. In Information: A Historical Companion. Princeton University Press.
Grand View Research. (2022). Location Intelligence Market Size, Share & Trends Analysis Report By Application (Sales & Marketing Optimization, Remote Monitoring), By Service, By Vertical, By Region, And Segment Forecasts, 2022–2030. Grand View Research. https://www.grandviewresearch.com/industry-analysis/location-intelligence-market
Grauer, Y. (2022, April 15). Video Doorbell Cameras Record Audio, Too. Consumer Reports. https://www.consumerreports.org/video-doorbells/video-doorbell-cameras-record-audio-too-a4636115889/
Green, J., & Issenberg, S. (2016, October 27). Inside the Trump Bunker, With Days to Go. Bloomberg.com. https://www.bloomberg.com/news/articles/2016-10-27/inside-the-trump-bunker-with-12-days-to-go
Grind, K., McMillan, R., & Mathews, A. W. (2020, March 17). To Track Virus, Governments Weigh Surveillance Tools That Push Privacy Limits. The Wall Street Journal. https://www.wsj.com/articles/to-track-virus-governments-weigh-surveillance-tools-that-push-privacy-limits-11584479841
Grunwald, M. (2020, April 25). Biden wants a new stimulus "a hell of a lot bigger" than $2 trillion. POLITICO. https://www.politico.com/news/2020/04/25/joe-biden-green-stimulus-207848
Gursky, J., & Woolley, S. (2020, June 21). The Trump 2020 app is a voter surveillance tool of extraordinary power. MIT Technology Review. https://www.technologyreview.com/2020/06/21/1004228/trumps-data-hungry-invasive-app-is-a-voter-surveillance-tool-of-extraordinary-scope/
Guthrie, S., & Benjamin, M. (2022, March 4). Microsoft + Nuance: Better together to transform business and healthcare outcomes with AI. Official Microsoft Blog. https://blogs.microsoft.com/blog/2022/03/04/microsoft-nuance-better-together-to-transform-business-and-healthcare-outcomes-with-ai/
Gvili, Y. (2020). Security Analysis of the COVID-19 Contact Tracing Specifications by Apple Inc. and Google Inc. Cryptology ePrint Archive. https://eprint.iacr.org/2020/428
Hagey, K., & Horwitz, J. (2021, September 15). Facebook Tried to Make Its Platform a Healthier Place. It Got Angrier Instead. Wall Street Journal. https://www.wsj.com
/articles/facebook-algorithm-change-zuckerberg-11631654215
Hao, K. (2021a, March 11). How Facebook got addicted to spreading misinformation. MIT Technology Review. https://www.technologyreview.com/2021/03/11/1020600/facebook-responsible-ai-misinformation/
Hao, K. (2021b, November 20). How Facebook and Google fund global misinformation. MIT Technology Review. https://www.technologyreview.com/2021/11/20/1039076/facebook-google-disinformation-clickbait/
Hao, K. (2022, April 19). Artificial intelligence is creating a new colonial world order. MIT Technology Review. https://www.technologyreview.com/2022/04/19/1049592/artificial-intelligence-colonialism/
Hao, K., & Hernández, A. P. (2022, April 20). How the AI industry profits from catastrophe. MIT Technology Review. https://www.technologyreview.com/2022/04/20/1050392/ai-industry-appen-scale-data-labels/
Harwell, D., & Oremus, W. (2022, May 16). Only 22 saw the Buffalo shooting live. Millions have seen it since. The Washington Post. https://www.washingtonpost.com/technology/2022/05/16/buffalo-shooting-live-stream/
Haskins, C. (2019, November 1). Gaggle Knows Everything About Teens And Kids In School. BuzzFeed News. https://www.buzzfeednews.com/article/carolinehaskins1/gaggle-school-surveillance-technology-education
Hayek, F. A. (1945). The Use of Knowledge in Society. In F. A. Hayek (Ed.), Individualism and Economic Order. University of Chicago Press.
Hayek, F. A. (1980). Individualism and Economic Order. University of Chicago Press.
Hayek, F. A. (1988). The Fatal Conceit: The Errors of Socialism (W. W. Bartley III, Ed.). The University of Chicago Press. https://press.uchicago.edu/ucp/books/book/chicago/F/bo3643985.html
Hayek, F. A. (2007). The Road to Serfdom: Text and Documents–The Definitive Edition (B. Caldwell, Ed.; Kindle). The University of Chicago Press. https://press.uchicago.edu/ucp/books/book/chicago/R/bo4138549.html
Hazelwood, K., Bird, S., Brooks, D., Chintala, S., Diril, U., Dzhulgakov, D., Fawzy, M., Jia, B., Jia, Y., Kalro, A., Law, J., Lee, K., Lu, J., Noordhuis, P., Smelyanskiy, M., Xiong, L., & Wang, X. (2018). Applied Machine Learning at Facebook: A Datacenter Infrastructure Perspective. 2018 IEEE International Symposium on High Performance Computer Architecture (HPCA), 620–629. https://doi.org/10.1109/HPCA.2018.00059
Heller, B. (2021). Watching Androids Dream of Electric Sheep: Immersive Technology, Biometric Psychography, and the Law. Vanderbilt Journal of Entertainment & Technology Law, 23(1). https://www.vanderbilt.edu/jetlaw/2021/04/09/watching-androids-dream-of-electric-sheep-immersive-technology-biometric-psychography-and-the-law/
Hellewell, J., Abbott, S., Gimma, A., Bosse, N. I., Jarvis, C. I., Russell, T. W., Munday, J. D., Kucharski, A. J., Edmunds, W. J., Sun, F., Flasche, S., Quilty, B. J., Davies, N., Liu, Y., Clifford, S., Klepac, P., Jit, M., Diamond, C., Gibbs, H., ... Eggo, R. M. (2020). Feasibility of controlling COVID-19 outbreaks by isolation of cases and contacts. The Lancet Global Health, 8(4), e488–e496. https://doi.org/10.1016/S2214-109X(20)30074-7
Hempel, J. (2008, April 11). Sheryl Sandberg: Facebook's new number two. CNN Money. https://money.cnn.com/2008/04/11/technology/facebook_sandberg.fortune/
Hern, A. (2020a, April 16). NHS in standoff with Apple and Google over coronavirus tracing. The Guardian. https://www.theguardian.com/technology/2020/apr/16/nhs-in-standoff-with-apple-and-google-over-coronavirus-tracing
Hern, A. (2020b, April 21). France urges Apple and Google to ease privacy rules on contact tracing. The Guardian. https://www.theguardian.com/world/2020/apr/21/france-apple-google-privacy-contact-tracing-coronavirus
Hilbert, M. (2012). How much information is there in the "information society"? Significance, 9(4), 8–12. https://doi.org/10.1111/j.1740-9713.2012.00584.x
Hilbert, M., & López, P. (2011). The World's Technological Capacity to Store, Communicate, and Compute Information. Science, 332(6025), 60–65. https://doi.org/10.1126/science.1200970
Hinds, J., & Joinson, A. (2019). Human and Computer Personality Prediction From Digital Footprints. Current Directions in Psychological Science, 28(2), 204–211. https://doi.org/10.1177/0963721419827849
Hoffman, S. (2022). China's Tech-Enhanced Authoritarianism. Journal of Democracy, 33(2), 76–89. https://doi.org/10.1353/jod.2022.0019
Hollis, D. (2018). The Influence of War; The War for Influence. Temple International & Comparative Law Journal, 32(1), 31–46.
Holmes, A., & Langley, H. (2020, June 10). Apple and Google's ambitious COVID-19 contact-tracing tech can help contain the pandemic if used widely. But so far only 3 states have agreed—And none has started to use it. Business Insider. https://www.businessinsider.com/apple-google-coronavirus-contact-tracing-tech-states-dont-plan-using-2020-6
Holt, J. (2021). #StopTheSteal: Timeline of Social Media and Extremist Activities Leading to 1/6 Insurrection. Atlantic Council - DFRLab. https://www.justsecurity.org/74622/stopthesteal-timeline-of-social-media-and-extremist-activities-leading-to-1-6-insurrection/
Horder, J. (2021). Online Free Speech and the Suppression of False Political Claims. ILSA Journal of International and Comparative Law. https://doi.org/10.2139/ssrn.3827192
Horwitz, J. (2020, October 23). Facebook Seeks Shutdown of NYU Research Project Into Political Ad Targeting. Wall Street Journal. https://www.wsj.com/articles/facebook-seeks-shutdown-of-nyu-research-project-into-political-ad-targeting-11603488533
Horwitz, J., & Seetharaman, D. (2020, May 26). Facebook Executives Shut Down Efforts to Make the Site Less Divisive. The Wall Street Journal. https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499
Horwitz, K., Cherney, M., & Horwitz, J. (2022, May 5). Facebook Deliberately Caused Havoc in Australia to Influence New Law, Whistleblowers Say. The Wall Street Journal. https://www.wsj.com/articles/facebook-deliberately-caused-havoc-in-australia-to-influence-new-law-whistleblowers-say-11651768302
Hunt, G. (Director). (2013, April 18). Gus Hunt (CTO, CIA) Mentions Actitracker @ GigaOm's Structure:Data 2013. https://www.youtube.com/watch?v=edP95iJWVBI
IAB Tech Lab. (2016). OpenRTB API Specification Version 2.5. https://www.iab.com/wp-content/uploads/2016/03/OpenRTB-API-Specification-Version-2-5-FINAL.pdf
Ingram, J. D. (2008). What Is a "Right to Have Rights"? Three Images of the Politics of Human Rights. The American Political Science Review, 102(4), 401–416.
Iqbal, U., Bahrami, P. N., Trimananda, R., Cui, H., Gamero-Garrido, A., Dubois, D., Choffnes, D., Markopoulou, A., Roesner, F., & Shafiq, Z. (2022). Your Echos are Heard: Tracking, Profiling, and Ad Targeting in the Amazon Smart Speaker Ecosystem (arXiv:2204.10920). arXiv. https://doi.org/10.48550/arXiv.2204.10920
Isaacson, W. (2014). The Innovators. Simon & Schuster.
ISD. (2020). Far-right Exploitation of Covid-19: ISD and BBC Click Investigation (No. 3; Covid Disinformation Briefing, pp. 1–8). Institute for Strategic Dialogue. https://www.isdglobal.org/wp-content/uploads/2020/06/COVID-19-Briefing-03-Institute-for-Strategic-Dialogue-12th-May-2020.pdf#page=1&zoom=auto,-251,848
Jacobides, M. G., Brusoni, S., & Candelon, F. (2021). The Evolutionary Dynamics of the Artificial Intelligence Ecosystem. Strategy Science, 6(4), 412–435. https://doi.org/10.1287/stsc.2021.0148
Jepperson, R. L. (2021). Institutions, Institutional Effects, and Institutionalism (1991). In J. W. Meyer & R. L. Jepperson (Eds.), Institutional Theory: The Cultural Construction of Organizations, States, and Identities (pp. 37–66). Cambridge University Press. https://doi.org/10.1017/9781139939744.004
Jepperson, R. L., & Meyer, J. W. (2021). Institutional Theory: The Cultural Construction of Organizations, States, and Identities (1st ed.). Cambridge University Press. https://doi.org/10.1017/9781139939744
Johnson, B. (2010, January 10). Privacy no longer a social norm, says Facebook founder. Guardian. https://www.theguardian.com/technology/2010/jan/11/facebook-privacy
Johnson, M., Abboud, L., Warrell, H., & Bradshaw, T. (2020, May 1). Europe split over approach to virus contact tracing apps. Financial Times. https://www.ft.com/content/10f87eb3-87f9-46ea-88ab-8706adefe72d
Karanicolas, M. (2021). A FOIA for Facebook: Meaningful Transparency for Online Platforms. St. Louis University Law Journal, 66(49). https://doi.org/10.2139/ssrn.3964235
Kaye, K. (2021, July 20). When the White House invoked the s-word, it gave new legitimacy to "surveillance" advertising. Digiday. https://digiday.com/marketing/when-the-white-house-invoked-the-s-word-it-gave-new-legitimacy-to-surveillance-advertising/
Kaziukėnas, J. (2021, March 24). Amazon Tops Six Million Third-Party Sellers. Marketplace Pulse. https://www.marketplacepulse.com/articles/amazon-reaches-six-million-third-party-sellers
Kelion, L. (2020, April 27). NHS rejects Apple-Google coronavirus app plan. BBC News. https://www.bbc.com/news/technology-52441428
Kelly, H., Hunter, T., & Abril, D. (2022, August 12). Seeking an abortion? Here's how to avoid leaving a digital trail. Washington Post. https://www.washingtonpost.com/technology/2022/06/26/abortion-online-privacy/
Kerner, C., & Risse, M. (2021). Beyond Porn and Discreditation: Epistemic Promises and Perils of Deepfake Technology in Digital Lifeworlds. Moral Philosophy and Politics, 8(1), 81–108. https://doi.org/10.1515/mopp-2020-0024
Kim, N. S. (2013). Wrap Contracts: Foundations and Ramifications. Oxford University Press. https://papers.ssrn.com/abstract=2322255
Kincaid, J. (2009, December 9). The Facebook Privacy Fiasco Begins. TechCrunch. https://techcrunch.com/2009/12/09/facebook-privacy/
Kirkpatrick, D. (2011). The Facebook effect: The inside story of the company that is connecting the world. Simon & Schuster Paperbacks.
Kissick, C., Setzer, E., & Schulz, J. (2020, July 21). What Ever Happened to Digital Contact Tracing? Lawfare. https://www.lawfareblog.com/what-ever-happened-digital-contact-tracing
Klonick, K. (2020). The Facebook Oversight Board: Creating an Independent Institution to Adjudicate Online Free Expression. Yale Law Journal, 129(8), 2418.
Knight Foundation. (2020). American Views 2020: Trust, Media and Democracy. Gallup and Knight Foundation. https://knightfoundation.org/reports/american-views-2020-trust-media-and-democracy/
Korb, L. J., & Evans, C. (2017). The Third Offset Strategy: A misleading slogan. Bulletin of the Atomic Scientists, 73(2), 92–95. https://doi.org/10.1080/00963402.2017.1288443
Kouzy, R., Abi Jaoude, J., Kraitem, A., El Alam, M., Karam, B., Adib, E., Zarka, J., Traboulsi, C., Akl, E., & Baddour, K. (2020). Coronavirus Goes Viral: Quantifying the COVID-19 Misinformation Epidemic on Twitter. Cureus, 12. https://doi.org/10.7759/cureus.7255
Kramer, A. D. I., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24), 8788–8790. https://doi.org/10.1073/pnas.1320040111
Kreiss, D., & McGregor, S. C. (2018). Technology Firms Shape Political Communication: The Work of Microsoft, Facebook, Twitter, and Google With Campaigns During the 2016 U.S. Presidential Cycle. Political Communication, 35(2), 155–177. https://doi.org/10.1080/10584609.2017.1364814
Kröger, J. L., Lutz, O. H.-M., & Müller, F. (2020). What Does Your Gaze Reveal About You? On the Privacy Implications of Eye Tracking. In M. Friedewald et al. (Eds.), Privacy and Identity Management. Data for Better Living: AI and Privacy (pp. 226–241). Springer, Cham. https://doi.org/10.1007/978-3-030-42504-3_15
Krogstad, J. M., & Lopez, M. H. (2017, May 12). Black voter turnout fell in 2016, even as a record number of Americans cast ballots. Pew Research Center. https://www.pewresearch.org/fact-tank/2017/05/12/black-voter-turnout-fell-in-2016-even-as-a-record-number-of-americans-cast-ballots/
Kurmanaev, A. (2022, September 3). How fake GPS coordinates are leading to lawlessness on the high seas. The New York Times. https://www.nytimes.com/2022/09/03/world/americas/ships-gps-international-law.html
Lapowsky, I. (2020, May 6). How Facebook's oversight board could rewrite the rules of the entire internet. Protocol. https://www.protocol.com/facebook-oversight-board-rules-of-the-internet
Leibowitz, J. (2022, February 14). How Congress Can Protect Your Data Privacy. Wall Street Journal. https://www.wsj.com/articles/congress-can-protect-our-data-consumer-privacy-bipartisan-personal-information-regulations-online-security-ftc-cybersecurity-11644871937
Levy, R. (2021). Social Media, News Consumption, and Polarization: Evidence from a Field Experiment. American Economic Review, 111(3), 831–870. https://doi.org/10.1257/aer.20191777
Levy, S. (2011). In The Plex: How Google Thinks, Works, and Shapes Our Lives. Simon & Schuster.
Lewin, J. (2021, March 17). Facebook's long-awaited content "supreme court" has arrived. It's a clever sham. The Guardian. https://www.theguardian.com/commentisfree/2021/mar/17/facebook-content-supreme-court-network
Li, T. C. (2022, June 26). Why you should delete your period-tracking app right now. MSNBC.
https://www.msnbc.com/opinion/msnbc Nature Human Behaviour, 5(3), 337–348.


-opinion/states-abortion-bans-can-weaponize- https://doi.org/10.1038/s41562-021-01056-1
your-own-data-against-you-n1296591 Lyon, D. (2022). Pandemic Surveillance. Polity
Libicki, M. C. (2017). The Convergence of Press. https://www.wiley.com/en-us/Pandemic+
Information Warfare. Strategic Studies Surveillance-p-9781509550319
Quarterly, 49–65. Ma, L., & Sun, B. (2020). Machine learning and AI
Lima, C. (2021, December 14). Facebook’s in marketing – Connecting computing power
latest defense: Social media doesn’t hurt to human insights. International Journal of
people. People hurt people. The Washington Research in Marketing, 37(3), 481–504. https://
Post. https://www.washingtonpost.com/poli- doi.org/10.1016/j.ijresmar.2020.04.005
tics/2021/12/14/facebooks-latest-defense-social- Mac, R., & Silverman, C. (2020, November 5).
media-doesnt-hurt-people-people-hurt-people/ Facebook Has A Metric For “Violence And
Lima, J. M. (2017, April 11). Hyperscalers tak- Incitement Trends.” It’s Rising. BuzzFeed
ing over the world at an unprecedented scale. News. https://www.buzzfeednews.com/article/
Data Economy. https://web.archive.org/ ryanmac/facebook-internal-metric-violence-
web/20170615104605/https://data-economy. incitement-rising-vote
com/hyperscalers-taking-world-unprecedented- Mac, R., Warzel, C., & Kantrowitz, A. (2018, March
scale/ 29). Growth At Any Cost: Top Facebook
Lin, H. (2019). The existential threat from cyber- Executive Defended Data Collection In 2016
enabled information warfare. Bulletin of the Atomic Scientists, 75(4), 187–196. https://doi.org/10.1080/00963402.2019.1629574
Linebaugh, C. D. (2022). Abortion, Data Privacy, and Law Enforcement Access: A Legal Overview (No. LSB10786). Congressional Research Service. https://crsreports.congress.gov/product/pdf/LSB/LSB10786
Location Intelligence Market Size Share Report, 2022–2030 (GVR-2-68038-401-7; p. 153). (2022). Grand View Research. https://web.archive.org/web/20220525210808/https://www.grandviewresearch.com/industry-analysis/location-intelligence-market
Lohr, S. (2019, September 26). At Tech's Leading Edge, Worry About a Concentration of Power. The New York Times. https://www.nytimes.com/2019/09/26/technology/ai-computer-expense.html
Lomas, N. (2020a, April 6). EU privacy experts push a decentralized approach to COVID-19 contacts tracing. TechCrunch. https://techcrunch.com/2020/04/06/eu-privacy-experts-push-a-decentralized-approach-to-covid-19-contacts-tracing/
Lomas, N. (2020b, April 16). EU lawmakers set out guidance for coronavirus contacts tracing apps. TechCrunch. https://techcrunch.com/2020/04/16/eu-lawmakers-set-out-guidance-for-coronavirus-contacts-tracing-apps/
Loomba, S., de Figueiredo, A., Piatek, S. J., de Graaf, K., & Larson, H. J. (2021). Measuring the impact of COVID-19 vaccine misinformation on vaccination intent in the UK and USA.
Memo—And Warned That Facebook Could Get People Killed. BuzzFeed News. https://www.buzzfeednews.com/article/ryanmac/growth-at-any-cost-top-facebook-executive-defended-data
MacMillan, D., Kelly, H., Dwoskin, E., & Dawsey, J. (2020, March 16). Trump announced Google was building a virus screening tool. Then someone had to build it. Washington Post. https://www.washingtonpost.com/technology/2020/03/16/google-verily-coronavirus-website-trump/
Manns, J. (2020). The Case for Preemptive Oligopoly Regulation (SSRN Scholarly Paper No. 3665651). https://papers.ssrn.com/abstract=3665651
Markoff, J. (2005). What the Dormouse Said. Viking Penguin.
Martin, N. (2021). Noelle Martin—Fighting for Online Safety, Justice, and Accountability. Noelle Martin. https://www.noellemartin.org/
McBride, S., & Vance, A. (2019, June 18). Apple, Google, and Facebook Are Raiding Animal Research Labs. Bloomberg Businessweek. https://www.bloomberg.com/news/features/2019-06-18/apple-google-and-facebook-are-raiding-animal-research-labs
McConnell, M. (2010, February 28). Mike McConnell on how to win the cyber-war we're losing. The Washington Post. http://www.washingtonpost.com/wp-dyn/content/article/2010/02/25/AR2010022502493.html
McCorkindale, T., & Henry, A. (2022). 2022 Institute for Public Relations Disinformation in Society
Report. Institute for Public Relations. https://instituteforpr.org/2022-disinformation-report/
McGuinness, T. (2022, March 15). Microsoft Cloud for Healthcare: Reshaping the future of healthcare. Microsoft Industry Blogs. https://cloudblogs.microsoft.com/industry-blog/health/2022/03/15/microsoft-cloud-for-healthcare-reshaping-the-future-of-healthcare/
Menendez, R. (2020). The new big brother–China and digital authoritarianism: A minority staff report. U.S. Government Publishing Office.
Merrill, J. B., & Oremus, W. (2021, October 26). Five points for anger, one for a 'like': How Facebook's formula fostered rage and misinformation. Washington Post. https://www.washingtonpost.com/technology/2021/10/26/facebook-angry-emoji-algorithm/
Metz, C. (2017a, April 5). Building an AI Chip Saved Google From Building a Dozen New Data Centers. Wired. https://www.wired.com/2017/04/building-ai-chip-saved-google-building-dozen-new-data-centers/
Metz, C. (2017b, October 22). Tech Giants Are Paying Huge Salaries for Scarce A.I. Talent. The New York Times. https://www.nytimes.com/2017/10/22/technology/artificial-intelligence-experts-salaries.html
Metz, R. (2021, October 27). Likes, anger emojis and RSVPs: The math behind Facebook's News Feed—and how it backfired. CNN Business. https://www.cnn.com/2021/10/27/tech/facebook-papers-meaningful-social-interaction-news-feed-math/index.html
Mirowski, P. (2013). Never let a serious crisis go to waste: How neoliberalism survived the financial meltdown. Verso.
MIT Sloan School of Management. (2013, November 6). Uber CEO talks regulatory disruption, maintaining startup culture. MIT Management Sloan School Newsroom. https://perma.cc/NG3C-XWC7
Mosco, V. (2004). The Digital Sublime: Myth, Power, and Cyberspace. MIT Press.
Moses, D. A., Leonard, M. K., Makin, J. G., & Chang, E. F. (2019). Real-time decoding of question-and-answer speech dialogue using human cortical activity. Nature Communications, 10(1), 3096. https://doi.org/10.1038/s41467-019-10994-4
Mosseri, A. (2018, January 11). News Feed FYI: Bringing People Closer Together. Facebook Newsroom. https://newsroom.fb.com/news/2018/01/news-feed-fyi-bringing-people-closer-together/
Mozur, P., Kessel, J. M., & Chan, M. (2019, April 24). Made in China, Exported to the World: The Surveillance State. The New York Times. https://www.nytimes.com/2019/04/24/technology/ecuador-surveillance-cameras-police-government.html
Mozur, P., & Scott, M. (2016, November 17). Fake News in U.S. Election? Elsewhere, That's Nothing New. The New York Times. https://www.nytimes.com/2016/11/18/technology/fake-news-on-facebook-in-foreign-elections-thats-not-new.html
Murgia, M. (2019a, March 13). AI academics under pressure to do commercial research. Financial Times. https://www.ft.com/content/94e86cd0-44b6-11e9-a965-23d669740bfb
Murgia, M. (2019b, June 6). Microsoft quietly deletes largest public face recognition data set. Financial Times.
Murgia, M., Criddle, C., & Murphy, H. (2021, December 6). Investigating Facebook: A fractious relationship with academia. Financial Times.
Murgia, M., & Gross, A. (2020, March 27). Inside China's controversial mission to reinvent the internet. Financial Times. https://www.ft.com/content/ba94c2bc-6e27-11ea-9bca-bf503995cd6f
Murphy, H. (2022, January 18). Facebook patents reveal how it intends to cash in on metaverse. Financial Times. https://www.ft.com/content/76d40aac-034e-4e0b-95eb-c5d34146f647
Murphy, L. W., & Cacace, M. (2020). Facebook's Civil Rights Audit—Final Report. Relman Colfax. https://www.relmanlaw.com/cases-377
Murphy, M. (2019, March 10). Dr Google will see you now: Search giant wants to cash in on your medical queries. The Telegraph. https://www.telegraph.co.uk/technology/2019/03/10/google-sifting-one-billion-health-questions-day/
Narayanan, A., & Reisman, D. (2017). The Princeton Web Transparency and Accountability Project. In T. Cerquitelli, D. Quercia, & F. Pasquale (Eds.), Transparent Data Mining for Big and Small Data (Vol. 32, pp. 45–67). Springer, Cham. https://doi.org/10.1007/978-3-319-54024-5_3
National Academies of Sciences, Engineering, Medicine. (2018). Assessing and Responding to the Growth of Computer Science Undergraduate Enrollments (p. 24926). The National Academies Press. https://doi.org/10.17226/24926
Naughton, J. (1999). A Brief History of the Future: The Origins of the Internet. Phoenix.
NewsWhip. (2019). 2019 Guide to Publishing on Facebook. NewsWhip. https://web.archive.org/web/20190319160555if_/http://go.newswhip.com/rs/647-QQK-704/images/Facebook%20Publishing%202019_Final.pdf
Ng, A. (2021, April 27). Google Promised Its Contact Tracing App Was Completely Private—But It Wasn't. https://themarkup.org/privacy/2021/04/27/google-promised-its-contact-tracing-app-was-completely-private-but-it-wasnt
Niemöller, M. (1950). First They Came.... CommonLit. https://www.commonlit.org/en/texts/first-they-came
Nix, N., & Dwoskin, E. (2022, August 12). Search warrants for abortion data leave tech companies few options. Washington Post. https://www.washingtonpost.com/technology/2022/08/12/nebraska-abortion-case-facebook/
Ohlheiser, A., & Kiros, H. (2022, June 28). Big Tech remains silent on questions about data privacy in a post-Roe US. MIT Technology Review. https://www.technologyreview.com/2022/06/28/1055044/big-tech-data-privacy-supreme-court-dobbs-abortion/
Ong, W. J. (1982). Orality and Literacy: The Technologizing of the Word. Methuen. http://www.gbv.de/dms/bowker/toc/9780416713701.pdf
Oremus, W. (2021, November 13). Why Facebook won't let you control your own news feed. Washington Post. https://www.washingtonpost.com/technology/2021/11/13/facebook-news-feed-algorithm-how-to-turn-it-off/
Owen, M. (2020, June 7). Profile of Apple-Google contact tracing API reveals how project started. AppleInsider. https://appleinsider.com/articles/20/06/07/profile-of-apple-google-contact-tracing-api-reveals-how-project-started
Pasquale, F. (2013). Privacy, Antitrust, and Power. George Mason Law Review, 20(4), 1009–1024.
Pasquale, F. (2017a). The Automated Public Sphere. U of Maryland Legal Studies Research Paper No. 2017-31. https://papers.ssrn.com/abstract=3067552
Pasquale, F. (2017b, December 6). From Territorial to Functional Sovereignty: The Case of Amazon. Law and Political Economy Project. https://lpeproject.org/blog/from-territorial-to-functional-sovereignty-the-case-of-amazon/
PEPP-PT. (2020). Pan European Privacy Protecting Proximity Tracing: Context and Mission. https://404a7c52-a26b-421d-a6c6-96c63f2a159a.filesusr.com/ugd/159fc3_878909ad0691448695346b128c6c9302.pdf
Perigo, B. (2022, February 17). Inside Facebook's African Sweatshop. Time. https://time.com/6147458/facebook-africa-content-moderation-employee-treatment/
Perri et al. v. Robinhood Markets, Inc. et al., 8:21-cv-00234 (Florida Southern District April 6, 2021). https://unicourt.com/case/pc-db5-perri-et-al-v-robinhood-markets-inc-et-al-867511
Peterson, B. (2020, July 30). Now we know exactly what Jeff Bezos, Mark Zuckerberg, Sundar Pichai and Tim Cook think about before they make a giant startup acquisition. Business Insider. https://www.businessinsider.com/amazon-facebook-google-ceos-mergers-antitrust-documents-acquisitions-2020-7
Pifer, R. (2022, July 25). Amazon will see you now: Reading between the lines of the One Medical acquisition. Healthcare Dive. https://www.healthcaredive.com/news/amazon-why-one-medical-acquisition-primary-care/627822/
Piketty, T. (2014). Capital in the Twenty-First Century. Belknap Press of Harvard University Press. https://doi.org/10.4159/9780674982918
Pitofsky, R., Anthony, S. F., Thompson, M. W., Swindle, O., & Leary, T. B. (2000). Privacy Online: Fair Information Practices in the Electronic Marketplace: A Federal Trade Commission Report to Congress. Federal Trade Commission, 35.
Power, M. (2022). Theorizing the Economy of Traces: From Audit Society to Surveillance Capitalism. Organization Theory, 3(3), 26317877211052296. https://doi.org/10.1177/26317877211052296
Prodhan, G., & Nienaber, M. (2015, June 9). Merkel urges Germans to put aside fear of big data. Reuters. https://www.reuters.com/article/us-germany-technology-merkel-idUSKBN0OP2EM20150609
Purcher, J. (2020, April 23). European Commissioner urges Apple's CEO on a Video Conference call to support Europe's "StopCovid" App on the iPhone. Patently Apple. https://www.patentlyapple.com/2020/04/european-commissioner-urges-apples-ceo-on-a-video-conference-call-to-support-europes-stopcovid-app-on-the-iphone.html
Puschmann, C. (2019). An end to the wild west of social media research: A response to Axel
Bruns. Information, Communication & Society, 22(11), 1582–1589. https://doi.org/10.1080/1369118X.2019.1646300
Quaker, D. (2022, March 31). Amazon Stats. Amazon Selling Partner Blog. https://sell.amazon.com/blog/grow-your-business/amazon-stats-growth-and-sales
Rabkin, J., Basnett, G., Howker, E., Eastham, J., & Pett, H. (2020, September 28). Revealed: Trump campaign strategy to deter millions of Black Americans from voting in 2016. Channel 4 News. https://www.channel4.com/news/revealed-trump-campaign-strategy-to-deter-millions-of-black-americans-from-voting-in-2016
Radin, M. J. (1987). Market-Inalienability. Harvard Law Review, 100(8), 1849–1937. https://doi.org/10.2307/1341192
Radin, M. J. (2012). Boilerplate: The Fine Print, Vanishing Rights, and the Rule of Law. Princeton University Press. https://doi.org/10.23943/princeton/9780691155333.001.0001
Radjenovic, A., Mańko, R., & Eckert, G. (2020). Coronavirus and elections in selected Member States. European Parliamentary Research Service, 11.
Rahbar, D. H. (2022). Laundering Data: How the Government's Purchase of Commercial Location Data Violates Carpenter and Evades the Fourth Amendment. Columbia Law Review, 122(3), 713–754.
Rahman-Shepherd, A., Clift, C., Ross, E., Hollman, L., Van Der Mark, N., Wakefield, B., Patel, C., & Yates, R. (2021). Solidarity in Response to the COVID-19 pandemic (pp. 1–10). Chatham House. https://www.chathamhouse.org/sites/default/files/2021-07/2021-07-14-solidarity-response-covid-19-pandemic-summary-rahman-shepherd-et-al_0_0.pdf
Ramjee, D., Sanderson, P., & Malek, I. (2021). COVID-19 and Digital Contact Tracing: Regulating the Future of Public Health Surveillance (SSRN Scholarly Paper No. 3733071). https://doi.org/10.2139/ssrn.3733071
Raskin, R. (2020, July 2). Why Americans Hate Contact Tracing. Techonomy. https://techonomy.com/why-americans-hate-contact-tracing/
Rathje, S., Van Bavel, J. J., & van der Linden, S. (2021). Out-group animosity drives engagement on social media. Proceedings of the National Academy of Sciences, 118(26), e2024292118. https://doi.org/10.1073/pnas.2024292118
Reardon, J., Dehaye, P.-O., & Richter, B. (2020, December 4). Proximity Tracing in an Ecosystem of Surveillance Capitalism. AppCensus Blog. https://blog.appcensus.io/2020/12/04/proximity-tracing-in-an-ecosystem-of-surveillance-capitalism/
Redlener, I., Sachs, J. D., Hansen, S., & Hupert, N. (2020). 130,000—210,000 Avoidable COVID-19 Deaths—And Counting—In the U.S. National Center for Disaster Preparedness - Columbia University Earth Institute. https://ncdp.columbia.edu/custom-content/uploads/2020/10/Avoidable-COVID-19-Deaths-US-NCDP.pdf
Reed, J. (2019, August 9). Maria Ressa: 'It would be great if we didn't have to fight our government.' Financial Times.
Reed, J. (2022, May 10). Marcos myths lift dictator's son to power in Philippines. Financial Times. https://www.ft.com/content/adc60586-9267-43b5-be3b-f44ad4506d2d
Reset Australia. (2022, May 11). Briefing: How Meta Extorted Australia. Reset Australia. https://au.reset.tech/news/how-meta-extorted-australia/
Ressa, M., & Muratov, D. (2022). A 10-point plan to address our information crisis. People vs. Big Tech. https://peoplevsbig.tech/10-point-plan
Richter, F. (2020, July 9). Infographic: Apple Leads the Race for AI Domination. Statista Infographics. https://www.statista.com/chart/9443/ai-acquisitions/
Riemer, K., & Peter, S. (2021). Algorithmic audiencing: Why we need to rethink free speech on social media. Journal of Information Technology, 36(4), 409–426. https://doi.org/10.1177/02683962211013358
Riles, A. (2014). Managing Regulatory Arbitrage: A Conflict of Laws Approach. Cornell International Law Journal, 47, 63–117.
Risse, M. (2021). The Fourth Generation of Human Rights: Epistemic Rights in Digital Lifeworlds (HKS Working Paper No. RWP21-027). https://doi.org/10.2139/ssrn.3973946
Roberts, M. (2019, January 31). Opinion | Facebook has declared sovereignty. Washington Post. https://www.washingtonpost.com/opinions/2019/01/31/facebook-has-declared-sovereignty/
Roberts, P. (2012, October 4). FTC Releases Google Privacy Report—Minus The Juicy Details. The Security Ledger. https://securityledger.com/2012/10/ftc-releases-google-privacy-report-minus-the-juicy-details/
Romm, T. (2020, March 11). White House asks Silicon Valley for help to combat coronavirus, track its spread and stop misinformation. Washington Post. https://www.washingtonpost.com/technology/2020/03/11/white-house-tech-meeting-coronavirus/
Romm, T., Dwoskin, E., & Timberg, C. (2020, March 17). U.S. government, tech industry discussing ways to use smartphone location data to combat coronavirus. Washington Post. https://www.washingtonpost.com/technology/2020/03/17/white-house-location-data-coronavirus/
Roose, K., & Isaac, M. (2021, February 10). Facebook Dials Down the Politics for Users. The New York Times. https://www.nytimes.com/2021/02/10/technology/facebook-reduces-politics-feeds.html
Rosen, J. (2012). The Right to Be Forgotten. Stanford Law Review Online, 64, 88.
Rosenbush, S. (2022, March 8). Big Tech Is Spending Billions on AI Research. Investors Should Keep an Eye Out. Wall Street Journal. https://www.wsj.com/articles/big-tech-is-spending-billions-on-ai-research-investors-should-keep-an-eye-out-11646740800
Rozenshtein, A. Z. (2021). Silicon Valley's Speech: Technology Giants and the Deregulatory First Amendment. Journal of Free Speech Law, 337. https://papers.ssrn.com/abstract=3911460
Rubin, A. J. (2015, May 5). Lawmakers in France Move to Vastly Expand Surveillance. The New York Times. https://www.nytimes.com/2015/05/06/world/europe/french-legislators-approve-sweeping-intelligence-bill.html
Ryan, J. (2022). The Biggest Data Breach: ICCL report on scale of Real-Time Bidding data broadcasts in the U.S. and Europe. Irish Council for Civil Liberties. https://www.iccl.ie/wp-content/uploads/2022/05/Mass-data-breach-of-Europe-and-US-data-1.pdf
Sample, I. (2017, November 2). Big tech firms' AI hiring frenzy leads to brain drain at UK universities. The Guardian. https://www.theguardian.com/science/2017/nov/02/big-tech-firms-google-ai-hiring-frenzy-brain-drain-uk-universities
Samuel, S. (2019, August 5). Facebook is building tech to read your mind. The ethical implications are staggering. Vox. https://www.vox.com/future-perfect/2019/8/5/20750259/facebook-ai-mind-reading-brain-computer-interface
Santos, F. P., Lelkes, Y., & Levin, S. A. (2021). Link recommendation algorithms and dynamics of polarization in online social networks. Proceedings of the National Academy of Sciences, 118(50). https://doi.org/10.1073/pnas.2102141118
Scheck, J., Purnell, N., & Horwitz, J. (2021, September 16). Facebook Employees Flag Drug Cartels and Human Traffickers. The Company's Response Is Weak, Documents Show. Wall Street Journal. https://www.wsj.com/articles/facebook-drug-cartels-human-traffickers-response-is-weak-documents-11631812953
Schmidt, E. (Director). (2018, March 16). Eric Schmidt HIMSS18 Opening Keynote. Healthcare Information and Management Systems Society, Inc. https://www.youtube.com/watch?v=ACQes9erfsw
Schmidt, E., & Cohen, J. (2014). The New Digital Age: Transforming Nations, Businesses, and Our Lives. Vintage Books.
Schmidt, E., & Kravis, M.-J. (2020). The Technological Response to COVID-19—Video Conference Transcript. The Economic Club of New York. https://www.econclubny.org/documents/10184/109144/2020SchmidtTranscript.pdf
Schmidt, E., Work, R. O., Catz, S., Chien, S., Clyburn, M., Darby, C., Ford, K., Griffiths, J.-M., Horvitz, E., Jassy, A., Louie, G., Mark, W., Matheny, J., McFarland, K., & Moore, A. (2019). NSCAI Interim Report for Congress. National Security Commission on Artificial Intelligence. https://www.nscai.gov/wp-content/uploads/2021/01/NSCAI-Interim-Report-for-Congress_201911.pdf
Schwartz, P. M. (2012). Systematic government access to private-sector data in Germany. International Data Privacy Law, 2(4), 289–301. https://doi.org/10.1093/idpl/ips026
Scola, N. (2017, October 26). How Facebook, Google and Twitter "embeds" helped Trump in 2016. Politico. https://www.politico.com/story/2017/10/26/facebook-google-twitter-trump-244191
Scola, N. (2019, October 17). Zuckerberg defends Facebook's "free expression" in face of Washington hostility. POLITICO. https://www.politico.com/news/2019/10/17/mark-zuckerberg-facebook-georgetown-address-050181
Scott, M. (2015, November 18). Europe, Shaken by Paris Attacks, Weighs Security With
Privacy Rights—The New York Times. New York Times - Bits Blog. https://web.archive.org/web/20151119015821/https://bits.blogs.nytimes.com/2015/11/18/europe-shaken-by-paris-attacks-weighs-security-with-privacy-rights/
Scott, M., Braun, E., Delcker, J., & Manancourt, V. (2020, May 15). How Google and Apple outflanked governments in the race to build coronavirus apps. Politico. https://www.politico.eu/article/google-apple-coronavirus-app-privacy-uk-france-germany/
Searle, J. R. (2010). Making the Social World: The Structure of Human Civilization. Oxford University Press.
Sekalala, S., Dagron, S., Forman, L., & Meier, B. M. (2020). Analyzing the Human Rights Impact of Increased Digital Public Health Surveillance during the COVID-19 Crisis. Health and Human Rights Journal, 22(2), 7–20.
Selinger, E., & Durant, D. (2022). Amazon's Ring: Surveillance as a Slippery Slope Service. Science as Culture, 31(1), 92–106. https://doi.org/10.1080/09505431.2021.1983797
Shannon, C. E., & Weaver, W. (1963). The Mathematical Theory of Communication (first published in 1949). University of Illinois Press.
Shapiro, J. (2021, January 27). Free Speech Fundamentalism. Inside Higher Ed. https://www.insidehighered.com/views/2021/01/27/academics-should-put-freedom-speech-context-other-values-opinion
Shattuck, J., & Risse, M. (2021). Reimagining Rights & Responsibilities in the United States: Freedom of Speech and Media (No. RWP21-004). https://doi.org/10.2139/ssrn.3801268
Shenkman, C., Franklin, S. B., Nojeim, G., & Thakur, D. (2021). Legal Loopholes and Data for Dollars: How Law Enforcement and Intelligence Agencies Are Buying Your Data from Brokers. Center for Democracy & Technology. https://cdt.org/wp-content/uploads/2021/12/2021-12-08-Legal-Loopholes-and-Data-for-Dollars-Report-final.pdf
Sherman, J., & Morgus, R. (2018, December 5). Authoritarians Are Exporting Surveillance Tech, And With it Their Vision for the Internet. Council on Foreign Relations - Net Politics. https://www.cfr.org/blog/authoritarians-are-exporting-surveillance-tech-and-it-their-vision-internet
Siganos, A., Vagenas-Nanos, E., & Verwijmeren, P. (2014). Facebook's daily sentiment and international stock markets. Journal of Economic Behavior & Organization, 107, 730–743. https://doi.org/10.1016/j.jebo.2014.06.004
Silverman, C., & Mac, R. (2020, November 3). Facebook Reduced Traffic To Leading Liberal Pages Just Before The Election. BuzzFeed News. https://www.buzzfeednews.com/article/craigsilverman/facebook-cut-traffic-liberal-pages-before-election
Silverman, C., Timberg, C., Kao, J., & Merrill, J. B. (2022, January 4). Facebook groups topped 10,000 daily attacks on election before Jan. 6, analysis shows. Washington Post. https://www.washingtonpost.com/technology/2022/01/04/facebook-election-misinformation-capitol-riot/
Simon, F., Howard, P. N., & Nielsen, R. K. (2020, April 7). Types, sources, and claims of COVID-19 misinformation. Reuters Institute for the Study of Journalism. https://reutersinstitute.politics.ox.ac.uk/types-sources-and-claims-covid-19-misinformation
Singer, N. (2020, July 8). Virus-Tracing Apps Are Rife With Problems. Governments Are Rushing to Fix Them. The New York Times. https://www.nytimes.com/2020/07/08/technology/virus-tracing-apps-privacy.html
Smyth, J. (2021a, February 14). Nine Entertainment urges Australian MPs to ignore Big Tech threats over news code. Financial Times. https://www.ft.com/content/7505a222-30d9-4ba3-b2fb-48ff4daafe80
Smyth, J. (2021b, February 19). Facebook 'behaving like North Korea' as Australia wakes up to news ban. Financial Times. https://www.ft.com/content/9e519b57-4e48-4221-9622-facd83cd0e42
Smyth, J., Murphy, H., & Barker, A. (2021, February 18). Facebook ban on news in Australia provokes fierce backlash. Financial Times. https://www.ft.com/content/cac1ff54-b976-4ae4-b810-46c29ab26096
Snowden, E. (2019). Permanent Record. Henry Holt and Company.
Soni, J., & Goodman, R. (2018). A Mind at Play: How Claude Shannon Invented the Information Age. Simon & Schuster. https://www.simonandschuster.com/books/A-Mind-at-Play/Jimmy-Soni/9781476766690
Soltani, A., Calo, R., & Bergstrom, C. (2020, April 27). Contact-tracing apps are not a solution to the COVID-19 crisis. Brookings. https://www.brookings.edu/techstream/inaccurate-and-insecure-why-contact-tracing-apps-could-be-a-disaster/
Srinivasan, D. (2019). The Antitrust Case against Facebook: A Monopolist's Journey towards Pervasive Surveillance in Spite of Consumers' Preference for Privacy. Berkeley Business Law Journal, 16(1), 39–101.
Statt, N. (2019, April 30). Facebook CEO Mark Zuckerberg says the "future is private." The Verge. https://www.theverge.com/2019/4/30/18524188/facebook-f8-keynote-mark-zuckerberg-privacy-future-2019
Stock, B. (1983). The Implications of Literacy: Written Language and Models of Interpretation in the Eleventh and Twelfth Centuries. Princeton University Press. http://hdl.handle.net/2027/heb.01528
Stoto, M. A. (2008). Public Health Surveillance in the Twenty-First Century: Achieving Population Health Goals While Protecting Individuals' Privacy and Confidentiality. Georgetown Law Journal, 96(703). https://heinonline.org/HOL/Page?handle=hein.journals/glj96&id=706&div=&collection=
Stowell, J., & Ramos, C. (2019, November 15). Curie subsea cable set to transmit to Chile, with a pit stop to Panama. Google Cloud [blog]. https://cloud.google.com/blog/products/infrastructure/curie-subsea-cable-set-to-transmit-to-chile-with-a-pit-stop-to-panama
Stroud, N. J., Scacco, J., & Curry, A. (2014). Analysis of News Sites. Center for Media Engagement. https://mediaengagement.org/research/news-site-analysis/
Suliman, A. (2022, January 29). Joni Mitchell pulls music from Spotify in stand with Neil Young against covid misinformation. The Washington Post. https://www.washingtonpost.com/arts-entertainment/2022/01/29/joni-mitchell-spotify-covid-rogan/
Summers, L. H. (2006, November 19). The Great Liberator. The New York Times. https://www.nytimes.com/2006/11/19/opinion/19summers.html
Swire, P. (2013). The Second Wave of Global Privacy Protection: Symposium Introduction. Ohio State Law Journal, 74(6). https://kb.osu.edu/handle/1811/71601
Tech Transparency Project. (2022, May 25). Eric Schmidt's Hidden Influence Over US Defense Spending. Tech Transparency Project. https://www.techtransparencyproject.org/articles/eric-schmidts-unseen-influence-over-us-defense-spending
Terry, N. P. (2016). Big Data and Regulatory Arbitrage in Health Care (No. 2821964). SSRN Scholarly Paper. https://papers.ssrn.com/abstract=2821964
Terry, N. P. (2017). Regulatory Disruption and Arbitrage in Health-Care Data Protection. Yale Journal of Health Policy, Law and Ethics, 17, 143.
The People's Declaration. (n.d.). The People's Declaration. https://www.peoplesdeclaration.net
The White House. (2019, February 11). Artificial Intelligence for the American People. The White House. https://web.archive.org/web/20190321000750/https://www.whitehouse.gov/ai/executive-order-ai/
Thompson, N. (2018, February 12). Inside Facebook's Two Years of Hell. Wired. https://www.wired.com/story/inside-facebook-mark-zuckerberg-2-years-of-hell/
Timberg, C. (2020, February 20). How conservatives learned to wield power inside Facebook. Washington Post. https://www.washingtonpost.com/technology/2020/02/20/facebook-republican-shift/
Timberg, C., & Harwell, D. (2020, March 19). Government efforts to track virus through phone location data complicated by privacy concerns. Washington Post. https://www.washingtonpost.com/technology/2020/03/19/privacy-coronavirus-phone-data/
Tobin, B. (2022, June 9). A top Walmart exec shares why it's outperforming Amazon in selling customer data while turning the business into a key revenue stream for the retailer. Business Insider. https://www.businessinsider.com/walmart-thinks-its-beating-amazon-in-monetizing-customer-data-2022-6
Tokita, C. K., Guess, A. M., & Tarnita, C. E. (2021). Polarized information ecosystems can reorganize social networks via information cascades. Proceedings of the National Academy of Sciences, 118(50), e2102147118. https://doi.org/10.1073/pnas.2102147118
Tribe, L. (2021, May 11). First Amendment fantasies in the social media debate. The Hill. https://thehill.com/opinion/technology/552735
-laurence-tribe-first-amendment-fantasies-in-the-social-media-debate
Tromble, R. (2021). Where Have All the Data Gone? A Critical Reflection on Academic Digital Research in the Post-API Age. Social Media + Society, 7(1), 2056305121988929. https://doi.org/10.1177/2056305121988929
Troncoso, C. (2021). Contact Tracing Apps: Engineering Privacy in Quicksand. EPFL. https://www.usenix.org/sites/default/files/conference/protected-files/enigma2021_slides_troncoso.pdf
Troncoso, C., Payer, M., Hubaux, J.-P., Salathé, M., Larus, J., Bugnion, E., Lueks, W., Stadler, T., Pyrgelis, A., Antonioli, D., Barman, L., Chatel, S., Paterson, K., Capkun, S., Basin, D., Beutel, J., Jackson, D., Preneel, B., Smart, N., ... Boneh, D. (2020a). DP3T - Decentralized Privacy-Preserving Proximity Tracing: Overview of Data Protection and Security. Github. https://github.com/DP-3T/documents/blob/08e9c145dabfe26907afd66e0973aceb4e4b44f7/DP3T%20-%20Data%20Protection%20and%20Security.pdf
Troncoso, C., Payer, M., Hubaux, J.-P., Salathé, M., Larus, J., Bugnion, E., Lueks, W., Stadler, T., Pyrgelis, A., Antonioli, D., Barman, L., Chatel, S., Paterson, K., Capkun, S., Basin, D., Beutel, J., Jackson, D., Preneel, B., Smart, N., ... Boneh, D. (2020b). DP3T - Decentralized Privacy-Preserving Proximity Tracing: README. Github. https://github.com/DP-3T/documents/blob/08e9c145dabfe26907afd66e0973aceb4e4b44f7/README.md
Troncoso, C., Payer, M., Hubaux, J.-P., Salathé, M., Larus, J., Bugnion, E., Lueks, W., Stadler, T., Pyrgelis, A., Antonioli, D., Barman, L., Chatel, S., Paterson, K., Capkun, S., Basin, D., Beutel, J., Jackson, D., Preneel, B., Smart, N., ... Boneh, D. (2020c). DP3T - Decentralized Privacy-Preserving Proximity Tracing: Simplified Overview. Github. https://raw.githubusercontent.com/DP-3T/documents/master/DP3T%20-%20Simplified%20Three%20Page%20Brief.pdf
Troncoso, C., Payer, M., Hubaux, J.-P., Salathé, M., Larus, J., Bugnion, E., Lueks, W., Stadler, T., Pyrgelis, A., Antonioli, D., Barman, L., Chatel, S., Paterson, K., Capkun, S., Basin, D., Beutel, J., Jackson, D., Preneel, B., Smart, N., ... Boneh, D. (2020d). DP3T - Decentralized Privacy-Preserving Proximity Tracing—Overview of Data Protection and Security. Github. https://github.com/DP-3T/documents/blob/master/DP3T%20-%20Data%20Protection%20and%20Security.pdf
Tuohy, J. P. (2022, January 3). This new light bulb from Sengled can tell if you are sleeping. The Verge. https://www.theverge.com/2022/1/3/22864783/sengled-smart-health-monitoring-smart-bulb-ces2022
Turner, F. (2006). From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism. University of Chicago Press.
Tutt, A. (2014). The Revisability Principle. Hastings Law Journal, 66(1113). http://dx.doi.org/10.2139/ssrn.2489718
US Accountability Office. (2021, September 9). Exposure Notification: Benefits and Challenges of Smartphone Applications to Augment Contact Tracing - Full Report. US Government Accountability Office. https://www.gao.gov/assets/gao-21-104622.pdf
US District Court—San Francisco. (2019). Facebook, Inc Consumer Privacy Litigation NO. 18-MD-02843 VC - Transcript of Proceedings. https://www.documentcloud.org/documents/6153329-05-29-2019-Facebook-Inc-Consumer-Privacy.html
US Senate Committee on Commerce, Science, & Transportation. (2016, May 10). Thune Seeks Answers from Facebook on Political Manipulation Allegations. US Senate Committee on Commerce, Science, & Transportation. https://www.commerce.senate.gov/2016/5/thune-seeks-answers-from-facebook-on-political-manipulation-allegations
US Supreme Court. (1967, May 29). WARDEN, MARYLAND PENITENTIARY, Petitioner, v. Bennie Joe HAYDEN. Cornell Law School - Legal Information Institute. https://www.law.cornell.edu/supremecourt/text/387/294
Vasconcelos, V. V., Constantino, S. M., Dannenberg, A., Lumkowsky, M., Weber, E., & Levin, S. (2021). Segregation and clustering of preferences erode socially beneficial coordination. Proceedings of the National Academy of Sciences, 118(50), e2102153118. https://doi.org/10.1073/pnas.2102153118
Vaudenay, S. (2020). Centralized or Decentralized? The Contact Tracing Dilemma (No. 2020/531; Cryptology EPrint Archive). EPFL. https://infoscience.epfl.ch/record/277809
Vaudenay, S., & Vuagnoux, M. (2022, February 22). The Dark Side of SwissCovid. EPFL - Serge Vaudenay’s Staff Page. https://lasec.epfl.ch/people/vaudenay/swisscovid.html
Vincent, E. M., Théro, H., & Shabayek, S. (2022). Measuring the effect of Facebook’s downranking interventions against groups and websites that repeatedly share misinformation. Harvard Kennedy School Misinformation Review. https://doi.org/10.37016/mr-2020-100
VIScon (Director). (2020, October 9). Building a Contact Tracing App for Switzerland | Kenny Paterson | ETH Zürich. VIScon. https://www.youtube.com/watch?v=20kHvbsej-w
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151. https://doi.org/10.1126/science.aap9559
Voss, W. G. (2016). After Google Spain and Charlie Hebdo: The Continuing Evolution of European Union Data Privacy Law in a Time of Change. Business Lawyer, 71(1). https://papers.ssrn.com/abstract=2711996
Vought, R. T. (2020). Guidance for Regulation of Artificial Intelligence Applications. The White House. https://trumpwhitehouse.archives.gov/wp-content/uploads/2020/11/M-21-06.pdf
Wacquant, L. (2012). Three steps to a historical anthropology of actually existing neoliberalism. Social Anthropology/Anthropologie Sociale, 20(1), 66–79. https://doi.org/10.1111/j.1469-8676.2011.00189.x
Waddell, K. (2018, September 6). A feud atop AI’s commanding heights. Axios. https://www.axios.com/academia-corporate-research-ai-9d525070-303d-47fd-b822-0fbffcac6740.html
Wall Street Journal (Director). (2021a, September 18). The Facebook Files, Part 4: The Outrage Algorithm. In The Journal. WSJ Podcasts. https://www.wsj.com/podcasts/the-journal/the-facebook-files-part-4-the-outrage-algorithm/e619fbb7-43b0-485b-877f-18a98ffa773f
Wall Street Journal. (2021b, October 1). The Facebook Files. Wall Street Journal. https://www.wsj.com/articles/the-facebook-files-11631713039
Waller, A., & Lecher, C. (2022, May 12). Facebook Promised to Remove “Sensitive” Ads. Here’s What It Left Behind. The Markup. https://themarkup.org/citizen-browser/2022/05/12/facebook-promised-to-remove-sensitive-ads-heres-what-it-left-behind
Ward, P. A. (2022). When the Soapbox Talks: Platforms as Public Utilities. Wisconsin Law Review, 2022(1). https://wlr.law.wisc.edu/volume-2022-no-1/
Warrell, H., Neville, S., & Bradshaw, T. (2020, June 18). UK to replace contact-tracing app with Apple and Google model. Financial Times. https://www.ft.com/content/819ff491-0ae9-4359-b3db-ed8de48330d7
Waters, R. (2022, May 19). Microsoft woos Brussels as battle over the cloud intensifies. Financial Times.
Wellcome. (2019). Wellcome Global Monitor—First Wave Findings 2018. Gallup. https://wellcome.org/sites/default/files/wellcome-global-monitor-2018.pdf
Whistleblower Aid. (2022, May 5). New Disclosure Exposes Facebook’s Dangerous Campaign to Exert Leverage over Lawmaking Process. Whistleblower Aid. https://whistlebloweraid.org/blog/australia-facebook-press-release/
Wiewiórowski, W. R. (2020, March 25). Monitoring spread of COVID-19 [Letter to Roberto Viola]. https://edps.europa.eu/sites/edp/files/publication/20-03-25_edps_comments_concerning_covid-19_monitoring_of_spread_en.pdf
Wilson, S. L., & Wiysonge, C. (2020). Social media and vaccine hesitancy. BMJ Global Health, 5(10), e004206. https://doi.org/10.1136/bmjgh-2020-004206
Wong, J. C. (2021, April 12). How Facebook let fake engagement distort global politics: A whistleblower’s account. The Guardian. https://www.theguardian.com/technology/2021/apr/12/facebook-fake-engagement-whistleblower-sophie-zhang
Work, B. (2015, January 28). The Third U.S. Offset Strategy and its Implications for Partners and Allies. https://www.defense.gov/News/Speeches/Speech/Article/606641/the-third-us-offset-strategy-and-its-implications-for-partners-and-allies/
World Health Organization. (2020). Novel Coronavirus (2019-nCoV) (Situation Report-13). World Health Organization. https://www.who.int/docs/default-source/coronaviruse/situation-reports/20200202-sitrep-13-ncov-v3.pdf?sfvrsn=195f4010_6
Wu, T. (2017, September 1). Is the First Amendment Obsolete? Knight First Amendment Institute at Columbia University. http://knightcolumbia.org/content/tim-wu-first-amendment-obsolete
Wuerthele, M. (2020, April 10). Apple, Google team on “contact tracing” smartphone software to combat spread of COVID-19. AppleInsider. https://appleinsider.com/articles/20/04/10/apple-google-partner-on-contact-tracing-to-combat-covid-19-spread
Yan, W. (2020, May 17). The U.S. Is Building A Contact-Tracer Army. HuffPost. https://www.huffpost.com/entry/coronavirus-contact-tracers_n_5ebd9dc1c5b66e2790db1035
Zakrzewski, C., Verma, P., & Parker, C. (2022, July 3). Texts, web searches about abortion have been used to prosecute women. Washington Post. https://www.washingtonpost.com/technology/2022/07/03/abortion-data-privacy-prosecution/
Zekavat, S. (Reza), Buehrer, R. M., Durgin, G. D., Lovisolo, L., Wang, Z., Goh, S. T., & Ghasemi, A. (2021). An Overview on Position Location: Past, Present, Future. International Journal of Wireless Information Networks, 28, 45–76. https://doi.org/10.1007/s10776-021-00504-z
Zhang, D., Maslej, N., Brynjolfsson, E., Etchemendy, J., Lyons, T., Manyika, J., Ngo, H., Niebles, J. C., Sellitto, M., Sakhaee, E., Shoham, Y., Clark, J., & Perrault, R. (2022). The AI Index 2022 Annual Report. AI Index Steering Committee, Stanford Institute for Human-Centered AI, Stanford University. https://aiindex.stanford.edu/wp-content/uploads/2022/03/2022-AI-Index-Report_Master.pdf
Zoltán, K. (2020, March 24). Hungary’s Coronavirus Bill—Orbán’s bid for absolute power? Index. https://index.hu/english/2020/03/24/hungary_coronavirus_bill_viktor_orban_fidesz_sweeping_powers_indefinite_term/
Zuboff, S. (1988). In the age of the smart machine: The future of work and power. Basic Books.
Zuboff, S. (2013, June 25). The Surveillance Paradigm: Be the friction - Our Response to the New Lords of the Ring. FAZ.NET. https://www.faz.net/aktuell/feuilleton/the-surveillance-paradigm-be-the-friction-our-response-to-the-new-lords-of-the-ring-12241996.html
Zuboff, S. (2015). Big other: Surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30(1), 75–89. https://doi.org/10.1057/jit.2015.5
Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs.

Author biography

Shoshana Zuboff’s most recent book is The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. She is the Charles Edward Wilson Professor Emeritus at the Harvard Business School, a Faculty Associate at the Harvard Kennedy School’s Carr Center for Human Rights, and the Co-chair of the Prefiguration Committee of the International Observatory on Information and Democracy.
