
Data Governance
Value Orders and Jurisdictional Conflicts

ANKE SOPHIA OBENDIEK
Great Clarendon Street, Oxford, OX2 6DP,
United Kingdom
Oxford University Press is a department of the University of Oxford.
It furthers the University’s objective of excellence in research, scholarship,
and education by publishing worldwide. Oxford is a registered trade mark of
Oxford University Press in the UK and in certain other countries
© Anke Sophia Obendiek 2023
The moral rights of the author have been asserted
Impression: 1
All rights reserved. No part of this publication may be reproduced, stored in
a retrieval system, or transmitted, in any form or by any means, without the
prior permission in writing of Oxford University Press, or as expressly permitted
by law, by licence or under terms agreed with the appropriate reprographics
rights organization. Enquiries concerning reproduction outside the scope of the
above should be sent to the Rights Department, Oxford University Press, at the
address above
You must not circulate this work in any other form
and you must impose this same condition on any acquirer
Published in the United States of America by Oxford University Press
198 Madison Avenue, New York, NY 10016, United States of America
British Library Cataloguing in Publication Data
Data available
Library of Congress Control Number: 2022943693
ISBN 978–0–19–287019–3
DOI: 10.1093/oso/9780192870193.001.0001
Printed and bound by
CPI Group (UK) Ltd, Croydon, CR0 4YY
Links to third party websites are provided by Oxford in good faith and
for information only. Oxford disclaims any responsibility for the materials
contained in any third party website referenced in this work.
Preface and Acknowledgements

We produce data every day. As learning, working, and socializing have moved
online, the significance of these data has only increased. And yet many of the frameworks
that regulate how exactly data are governed are rarely discussed by the broader
public. While the European overhaul of data protection legislation has left many
annoyed by cookie notifications and regular email updates about long-forgotten
subscriptions, the extent to which our data are used, by whom, and for what pur-
poses is often considered an enigma. This book tracks how data governance has
developed in recent decades. It focuses on key controversies between the main
players of the regulatory landscape—the European Union and the United States—
and the major companies that hold significant amounts of data (and power). It
shows that many of the conflicts we see today are not that different from the ones
we had two decades ago. Rather, they touch upon foundational normative
questions about society: What societal goals matter most? And how do we
evaluate justifications and assess evidence provided by different parties?
This book began as a PhD dissertation at the Hertie School during my time at
the Berlin Graduate School for Global and Transregional Studies. As per usual,
there are many people to thank, so this is by no means an exhaustive list. First and
foremost, I would like to thank my thesis adviser Markus Jachtenfuchs for being
an exceptional supervisor and constant mentor. Thank you for always giving me
the freedom and encouragement to conduct my own research while providing
helpful guidance whenever I needed it. I owe you so much. I also wish to thank my
second thesis adviser Frédéric Mérand, who always cheers me up with his kind
and thought-provoking comments, and my third thesis adviser Tine Hanrieder,
whose excellent research inspired a significant part of this book. Both have pro-
vided invaluable feedback and support. Further thanks to Adina Maricut-Akbik
for her smart and thoughtful comments as a doctoral committee member. I also
had the great pleasure of working with my project partner Daniëlle Flonk, who
is not only an exceptional colleague and scholar but also a great friend. I am
extremely grateful for the efforts of Oxford University Press to improve this book,
particularly those of Dominic Byatt, Ryan Morris, the editorial team, and my kind
and constructive anonymous reviewers. My anonymous interviewees have also
kindly provided great insights. I gratefully acknowledge funding by
the Deutsche Forschungsgemeinschaft (DFG), FOR 2409, JA 772/8-1 and thank
the OSAIC research group for engaging discussions.
Participants and discussants at several conferences and workshops have offered
great help. I would like to thank the participants of GigaNet 2018, the ISA
2019, GIG-ARTS 2019, the ECPR 2019, and the EISA 2019, particularly Meryem
Marzouki, Andrea Calderaro, Abel Reiberg, and Felix Berenskötter. I also thank
participants in the HIIG Interdisciplinary Workshop ‘Privacy, Data Protection
and Surveillance’ and the ‘Kolloquium Digitalpolitik’ in Berlin as well as partici-
pants in the joint workshops of the BGTS and the Hebrew University, particularly
Nicole Deitelhoff and Tobias Berger. I further thank Timo Seidl for extensive feed-
back, Jan Aart Scholte and Wolf Schünemann for their words of encouragement,
and Gerda Falkner for giving me the incentive to stick to my deadline.
Many thanks to the Hertie School’s Jacques Delors Centre and the faculty of the
BGTS, particularly Tanja Börzel, Andrea Liese, and Lora Viola for their help. I fur-
ther thank Franziska Loschert and the members of the Success Group who shared
their doubts, happy moments, and (non-)academic insights: Julien Bois, Julia Fuß,
Jörg Haas, Oleksandra Keudel, Cédric Koch, and Ann-Kathrin Rothermel. Of
course, I thank Amelie Harbisch for always being there.
Finally, for being some of my favourite people and for their constant support, I
thank Bella, Kai, Marie, Ronja, and especially my partner, Sebastian. I also thank
my family, particularly my parents, who supported me in so many ways, and my
sister, who—somehow—manages to make everything feel a little less daunting.
Vienna, 27 February 2022
Contents

List of Illustrations viii
List of Tables ix
List of Abbreviations x

1. Introduction: Jurisdictional Overlap in Data Governance 1
2. Theorizing the Resolution of Jurisdictional Conflicts 17
3. Value Orders and the Genesis of the Field of Data Governance 51
4. Safe Harbour and Its Discontents: The Empowerment of Individuals in the EU 77
5. Passenger Data in Air Travel: Establishing Data as a Security ‘Tool’ 104
6. Financial Data Sharing: The Extended Arm of the US Treasury 129
7. Access Denied? Struggles over Electronic Evidence 152
8. The Right to Be Forgotten: Moral Hierarchies of Fairness 174
9. Conclusion: Normative Visions across Jurisdictional Conflicts 196

Appendix 1: List of Interviews 221
Appendix 2: MAXQDA Additional Material 222

References 229
Index 278
List of Illustrations

2.1. Jurisdictional conflict resolution processes 31
9.1. Visualization of code co-occurrence across cases 198
A.2.1. Visualization of code co-occurrence (excluding general safeguards) in the empirical case studies 227
A.2.2. Visualization of code co-occurrence (within 10 paragraphs) in the empirical case studies 228
List of Tables

2.1. Key texts for the constitution of the field 42
2.2. Selected cases 46
3.1. Substantive value orders in data governance 59
3.2. Procedural value orders in data governance 60
9.1. Visions of data governance (adapted from Obendiek 2022) 200
9.2. Jurisdictional conflict resolution processes across cases 207
List of Abbreviations

29WP Article 29 Data Protection Working Party
AEPD Agencia Española de Protección de Datos
AFSJ Area of Freedom, Security and Justice (European Union)
AG Advocate General (Court of Justice of the European Union)
AML anti-money laundering
APEC Asia Pacific Economic Cooperation
API advance passenger information
ASEAN Association of Southeast Asian Nations
ATSA Aviation and Transportation Security Act (United States)
AU African Union
BMI Bundesministerium des Innern und für Heimat (Germany)
CAPPS Computer Assisted Passenger Prescreening System (United States)
CBP Bureau of Customs and Border Protection (United States)
CEO chief executive officer
CFT combating the financing of terrorism
CJEU Court of Justice of the European Union
CLOUD Act Clarifying Lawful Overseas Use of Data Act (United States)
CNIL Commission Nationale de l’Informatique et des Libertés (France)
CoE Council of Europe
CSR corporate social responsibility
DHS Department of Homeland Security (United States)
DMCA Digital Millennium Copyright Act (United States)
DOT Department of Transportation (United States)
DPA Data Protection Authority
DPC Data Protection Commission (Ireland)
EC European Commission
ECHR European Court of Human Rights
ECOWAS Economic Community of West African States
ECTC European Counter Terrorism Centre
EDAP European Defence Action Plan
EDPB European Data Protection Board
EDPS European Data Protection Supervisor
EP European Parliament
EU European Union
FATF Financial Action Task Force
FIP Fair Information Principle
FISA Foreign Intelligence Surveillance Act (United States)
FTC Federal Trade Commission (United States)
G7 Group of Seven
G8 Group of Eight
G10 Group of Ten
G20 Group of Twenty
GCHQ Government Communications Headquarters (United Kingdom)
GDP gross domestic product
GDPR General Data Protection Regulation (European Union)
GNI Global Network Initiative
IA Internet Association
IATA International Air Transport Association
ICANN Internet Corporation for Assigned Names and Numbers
ICAO International Civil Aviation Organization
ICDPPC International Conference of Data Protection and Privacy Commissioners
ICT information and communications technology
IGF Internet Governance Forum
ILC International Law Commission
IO international organization
IP Internet Protocol
IR international relations
ITU International Telecommunication Union
JHA Justice and Home Affairs (European Union)
JSB Joint Supervisory Body (Europol)
LIBE Committee on Civil Liberties, Justice, and Home Affairs (European
Union)
MEP Member of the European Parliament
MLAT Mutual Legal Assistance Treaty
NGO non-governmental organization
NSA National Security Agency (United States)
NSCT National Strategy for Counterterrorism (United States)
OAS Organization of American States
OECD Organization for Economic Cooperation and Development
PNR Passenger Name Record
PPD Presidential Policy Directive
SCC standard contractual clause
SME small to medium-sized enterprise
STS Science and Technology Studies
SWIFT Society for Worldwide Interbank Financial Telecommunication
T-CY Cybercrime Convention Committee (Council of Europe)
TFEU Treaty on the Functioning of the European Union
TFTP Terrorist Finance Tracking Program (United States)
TFTS Terrorist Finance Tracking System (European Union)
UK United Kingdom
UN United Nations
UNCTAD United Nations Conference on Trade and Development
UNGA United Nations General Assembly
UNHRC United Nations Human Rights Council
UNSC United Nations Security Council
URL uniform resource locator
US United States
UST United States Treasury
WIPO World Intellectual Property Organization
1
Introduction
Jurisdictional Overlap in Data Governance

In May 2018, the European Union’s (EU) General Data Protection Regulation
(GDPR) came into effect. And clearly, something significant was going on. The
BBC asked, ‘GDPR: Are you ready for the EU’s huge data privacy shake-up?’
(Coleman 2018), while an article in the New York Times suggested that ‘G.D.P.R.,
a New Privacy Law, Makes Europe World’s Leading Tech Watchdog’ (Satariano
2018). For the EU, the GDPR represented a significant step towards a digital sin-
gle market (Reicherts 2014, 3). It also offered an opportunity to assert European
interests and values in light of various privacy scandals. In 2013, the publication of
classified documents by the former United States (US) National Security Agency
(NSA) contractor Edward Snowden revealed widespread surveillance by intelli-
gence services (Greenwald 2013, 2015). In March 2018, just two months before
the GDPR came into effect, the Cambridge Analytica data scandal exposed poten-
tial voter manipulation in the US elections and the United Kingdom (UK) Brexit
referendum in 2016 (Cadwalladr and Graham-Harrison 2018). At the height of
the revelations, great faith was placed in the GDPR. The law arguably represents
the strictest and most advanced data protection legislation in the world. It signif-
icantly shapes the corporate conduct of multinational companies. And it delimits
how public agencies collect data about Europeans. Věra Jourová, then European
Commissioner for Justice, described the GDPR as a ‘loaded gun’ (cited in Khan
2019) in the hand of regulators. Indeed, data protection authorities have since hit
various companies with exceptional penalties for their lack of compliance. Fines
may reach up to 4 per cent of companies’ annual global turnover or €20 million,
whichever is higher. In 2021, the Luxembourg Data Protection Commission issued
a record penalty of €746 million to the online retailer Amazon, while the French
data protection authority claimed €90 million from Google Ireland, both cases
concerning cookie consent. In the same year, the Irish Data Protection Commis-
sion imposed a €225 million fine on the messaging service WhatsApp due to a lack
of transparency. While some charges are still under dispute, the total amount of
GDPR fines exceeded €1.5 billion in early 2022—and it is rising quickly.
The high level of fines is just one of the reasons why the GDPR was not wel-
comed unanimously. The then US Secretary of Commerce Wilbur Ross argued
in the Financial Times that ‘EU data privacy laws are likely to create barriers to

trade—GDPR creates serious, unclear legal obligations for both private and pub-
lic entities’ (Ross 2018). An opinion piece in the New York Times unambiguously
stated that ‘Europe’s Data Protection Law Is a Big, Confusing Mess’ (Cool 2018),
while CNN suggested that companies were ‘getting killed by GDPR’ (Kottasová
2018). Some feared ‘Europe’s data grab’ (Manancourt 2020), while others articu-
lated concerns about collective security and safety, arguing that ‘EU Privacy Laws
May Be Hampering Pursuit of Terrorists’ (Drozdiak 2019).
Whatever the verdict, the GDPR’s data protection standards are a striking
example of how regulatory standards in one jurisdiction may have de jure and de
facto effects beyond their territory. The new legislation has both a deeper reach
and a wider scope than the 1995 Data Protection Directive it replaced. Prin-
ciples such as the right to erasure (Art. 17), the demand for privacy by design
and by default (Art. 25), or the right to data portability (Art. 20) (GDPR 2016),
deepen the regulatory impact of the GDPR. Furthermore, the legislation sets strict
restrictions for cross-border data transfers (Art. 44, 45). Should the standards of
a third country’s data protection not be adequate, that is, ‘essentially equivalent’
(Schrems 2015, para. 73) to protection in the EU, the GDPR severely limits the
transfer of data. On the basis of the marketplace principle, the GDPR determines
the applicability of EU law for all businesses and organizations that target cus-
tomers residing in the EU, irrespective of the location of the service, data, or
processing practices. Due to the physical disconnection between data collection,
storage, and processing, this has implications for organizations across the globe.
Through this widening and deepening of data protection standards, the GDPR
shapes regulatory conduct in unprecedented ways and far beyond European
territory.
In response, some countries such as Brazil and Japan have adopted legislation
very similar to the GDPR (Goddard 2017). Indeed, Bradford has suggested that
through the imposition of high restrictions, the EU may elevate global regulatory
standards through a regulatory race to the top via the so-called ‘Brussels effect’
(2020, 167). By early 2022, 137 countries, or about 71 per cent, had enacted
domestic legislation on data protection (UNCTAD 2022), compared with merely
17 countries with such legislation in the 1980s, 36 in the 1990s, and 68 in the
2000s, a pattern of accelerating growth (Greenleaf 2011, 2).¹ As the EU is
increasingly pushing for ‘digital sovereignty’ (Obendiek 2021; Pohle and Thiel
2020), its ‘“gold standard” for data protection’ is often considered to be a sign of
increasing assertiveness in the digital space (Schünemann and Windwehr 2020).
Yet while the Brussels effect (Bradford 2020) may sketch a rosy future for data
governance, concerns persist.
Critics have suggested that the exertion of unilateral influence tends to exac-
erbate regulatory conflicts (Kuner 2015, 2017; Newman 2008). In the absence of

coherent multilateral cooperation on data, the EU and the US in particular, to date
the most significant regulatory powers in the governance of digital data, frequently
engage in policymaking with extraterritorial effects (Bignami and Resta 2015).

¹ For a comprehensive overview, see Bennett and Raab (2006); Greenleaf (2014a).
Extraterritoriality runs against established principles of public international law,
which specifies restrictions on the ability of states and organizations to conduct
regulatory actions outside their borders. While international law does not con-
cretely define the limitations of jurisdictions, there is general consensus that such
limitations exist (International Law Commission 2006).² In data governance, this
has contributed to the pervasiveness of overlapping jurisdictional claims. When
actors perceive these overlapping claims about the legitimate control over data
as problematic, ‘jurisdictional conflicts’ (Abbott 1986, 192) emerge. Such con-
flicts have been frequent and impactful, turning data governance into a critical
site of inter- and transnational struggles over competing governance objectives.
They constitute the starting point of this book. Conflicts in diverse areas includ-
ing transatlantic commercial data transfers (Chapter 4), counterterrorist initiatives
in air travel (Chapter 5) and finance (Chapter 6) as well as law enforcement
(Chapter 7) and the right to be forgotten (Chapter 8) have shaped data gover-
nance and its future trajectory globally. Not only the prevalence of conflicts but
also their continued re-emergence shows that the regulatory race to the top may
not be as smooth as one might hope for.
Through five case studies set at the intersection of jurisdictional
claims involving actors in the EU, the US, and private companies, I analyse clash-
ing perceptions of the right conduct of data governance. I show that the rapid
nature of technological progress, the rise of new global players, and unforeseen
challenges in the digital space all tend to obscure remarkable continuities in data
governance. They play out in different contexts, with varying issues and across
fora, but conflicts actually tend to follow relatively constant patterns. Arguments
and actor constellations are similar across issue areas and over time, as deeply
normative questions about societal goals and values persist at the heart of central
conflicts in data governance. In resolving jurisdictional claims, actors attempt to
impose their specific vision of data governance and data on others to justify their
call to action. In Chapter 9, I develop an empirically grounded typology of four
distinct and competing visions of data governance that suggest different answers
to central questions of data governance today. What principles should we apply
when judging the legitimacy of data use, processing, or protection? Are data ‘the

new oil’ to be consumed or ‘a force of nature’ to be controlled (Puschmann and
Burgess 2014)? And how do actors respond when these understandings clash? This
is what this book seeks to uncover.

² The International Law Commission (ILC) has defined extraterritoriality as follows: ‘An attempt to
regulate by means of national legislation, adjudication or enforcement the conduct of persons, property
or acts beyond its borders which affect the interests of the State in the absence of such regulation under
international law’ (International Law Commission 2006 Annex E, para. 2). For a discussion of the
different understandings of extraterritoriality concerning the EU data protection regime, see Kuner
(2015); for the US, see Putnam (2009); and for a general discussion, see Bignami and Resta (2015).

1.1 Conceptualizing Data Governance

Questions of data use, processing, and governance have become a major topic of
concern in the digital age. The recent revelations of widespread surveillance of
journalists, activists, and dissidents by the private NSO Group (Guardian 2021)
have again highlighted how data not only are becoming ever more central in our
daily lives but are also at the core of global power asymmetries and vulnerabilities.
Data governance plays out in various arenas and areas, touching upon important
challenges to the liberal world order (Farrell and Newman 2021). Tech compa-
nies that hold massive amounts of data have emerged as uniquely powerful actors
on the global stage (Culpepper and Thelen 2019), impacting issues from hous-
ing to health to relationships. While they have enabled global connections and
friendships as well as economic growth, data-driven technologies such as facial
recognition or targeted advertising may also contribute to discrimination, pro-
filing, or election manipulation. Furthermore, data create significant regulatory
challenges, mainly because they are at odds with the territorial character of most
legal systems. Data may move from one jurisdiction to another in an instant, mul-
tiply indefinitely, and be in several places at once. They are protected by human
rights law but also constitute a key resource in today’s economy, while simulta-
neously holding significant value for law enforcement authorities and intelligence
agencies.
Efforts to govern digital data are embedded in the broader governance of the
internet, which comprises physical infrastructure (e.g. cables or exchange points),
virtual infrastructure (e.g. protocols or technical standards), content (e.g. postings
on social media or images on news websites), and data (Haggart et al. 2021, 2).
Data governance denotes ordering processes that include but are not limited to
the actions of states, international organizations, local authorities, and private
actors in relation to the processing, transfer, sharing, and general use practices
of digital data.³ Mechanisms may include legislation, common standards, terms
of service agreements, or looser recommendations. Data governance crucially
extends beyond privacy or data protection, including measures that aim to steer
rather than restrict data flows, such as mandatory processing or transfer practices
for security purposes.

³ Terminology and differences in understanding depend on societal and legal traditions. The under-
standing of data governance put forward here differs from an alternative use of the term to describe
data management within businesses. For a comprehensive overview of conceptual debates on privacy
and data protection, see, e.g., Solove (2008); Jörg Pohle (2017).
This distinguishes data governance from informational privacy and data pro-
tection, which, historically, have constituted important starting points for debates
about the role of data in society and politics. Conceptual thinking on privacy
dates back to historical figures, such as Aristotle (2012) or Mill, who distinguished
spheres of governmental and private authority (Mill 1859; see also Arendt 1958,
ch. 2; Habermas 1962). In 1890, the US-American legal scholars and practition-
ers Warren and Brandeis more specifically linked threats to individual privacy to
technological progress. Their seminal article, which defined privacy as the ‘right
to be let alone’ (Warren and Brandeis 1890), influenced legal and philosophical
debates for decades.⁴ More recently, the scholarly debate has shifted towards a soci-
etal understanding of privacy (e.g. Nissenbaum 2009; Regan 1995; Solove 2002)
considering its public and collective dimensions (Regan 1995, 213), particularly
concerning the exercise of human dignity and autonomy (Rössler 2005).
Due to its more recent significance based on technological innovations, the
related concept of data protection has received less conceptual attention. Some
have questioned the relevance of data protection in view of surveillance and social
sorting (Lyon 1994); others argue that data, in contrast to information, are only
worthy of protection if they specifically relate to natural persons (Floridi 2005).
This idea is increasingly problematized in the context of big data and deep learn-
ing (Jörg Pohle 2017), as these technologies may make inferences about personal
information possible from seemingly anonymous or non-personal data. Concep-
tual discussions have increasingly outlined the normative value of the right to
data protection as distinct from privacy (Cannataci and Mifsud-Bonnici 2005;
Lynskey 2015a, 2019). For instance, Tzanou (2017) suggests that, in contrast to
privacy, which works as a defensive shield, data protection enables openness and
transparency.
In sum, data governance is related to but different from data protection in that
it comprises processes of ordering and steering rather than only the restriction of
data flows. This often contributes to competing governance objectives, which may
result in jurisdictional conflicts, as I outline in Section 1.2.

1.2 Jurisdictional Conflicts in Data Governance and Beyond

Jurisdictional conflicts are not specific to data governance but represent a general
phenomenon of contemporary global governance. Particularly in light of increas-
ing institutional density (Raustiala 2013), the global realm is frequently the site of
clashing interpretations of norms and rules. International relations (IR) literature

on ‘regime complexes’ (Keohane and Victor 2011) has conceptualized how legal
inconsistencies and possibilities for forum shopping may negatively affect global
order (Alter and Meunier 2009; Drezner 2013). The literature on fragmentation in
international law (Biermann et al. 2009) has identified ways of establishing norma-
tive compatibility in the absence of global government, including through varying
notions of global constitutionalism (Dunoff and Trachtman 2009; Grimm 2016;
N. Walker 2008). However, these scholars rarely examine how actors try to manage
conflicts that emerge at the interfaces of legal, institutional, or normative overlap
(Kreuder-Sonnen and Zürn 2020), particularly in the absence of institutionalized
dispute resolution mechanisms in areas such as internet governance (Flonk et al.
2020). Legal pluralists have questioned the normative desirability and practicality
of a coherent or even constitutionalized international order (Halberstam 2012;
Krisch 2010), also with regard to data governance (Kuner 2017). They suggest
that the continuous re-emergence and resolution of conflicts might contribute to
political equilibrium through mutual openness. Conflicts may thereby prevent the
unilateral imposition of specific norms and rules. Yet the empirical and conceptual
relevance of such conflicts warrants further scrutiny.

⁴ Scholars have since problematized the depoliticizing character of a clearly demarcated private
sphere, for example in relation to domestic abuse (Allen 1988; MacKinnon 1987; Nissenbaum 1998)
but also by questioning its value from a communitarian perspective (Etzioni 1999).
In data governance, legal scholars have highlighted the challenges of diverging
data governance regimes (Bignami and Resta 2015; Greenleaf 2012; Kuner 2013;
Mitsilegas 2014; Zhao and Mifsud-Bonnici 2016). They have outlined the pol-
icy challenges of self- and co-regulation in the fragmented regulatory structure of
data governance (Bennett 1992; Bennett and Raab 2006; Flaherty 1989; Greenleaf
2014a), in particular focusing on the EU (Busch and Levi-Faur 2011; Mayer-
Schönberger 2010; Ochs et al. 2016) and contested jurisdictions (Ryngaert 2015).
However, this literature has rarely systematically investigated how conflicts play
out in practice (for an exception, see Bignami and Resta 2015; Brkan 2016; Farrell
2003, 2006; Kuner 2013; Mitsilegas 2014).
The legal perspective has shortcomings because it fails to explain why juris-
dictional conflicts, despite their seeming resolution through bilateral agreements
or legislation, frequently and unexpectedly re-emerge. Some scholars have sug-
gested that both the diversity of existing norms and rules (Bennett and Raab 2020)
and their clashes (Farrell and Newman 2019) are rooted in distinct ‘regulatory
visions’ (Kuner 2017, 918) and ‘values’ (Mitsilegas 2014, 290) and characterized
by a lack of normative ‘fit’ (MacKenzie and Ripoll Servent 2012, 74). Yet how juris-
dictional claims are rendered (il)legitimate according to these visions and values
is not yet sufficiently understood through theoretical concepts like norm diffusion
(Newman 2008) or regulatory regimes (Bennett and Raab 2006; Bessette and Hau-
fler 2001). Thus, while the continuous re-emergence of conflict appears to indicate
prevailing differences, it is not clear how these differences play out.
Yet understanding these conflicts is important, as they may have serious con-
sequences. For instance, the threat of transatlantic data deadlock has prompted
US companies, including Meta and Alphabet, to consider their options to mitigate
the high economic costs, including leaving the European market (Bodoni
2022). But it is not only private companies that stand to lose. According to a
recent study (European Centre for International Political Economy and Kearney
2021), a ban on transatlantic data flows would create losses of up to €420 billion
in the EU. Thus, actors have significant incentives to resolve conflicts in data gov-
ernance, which makes their prevalence a puzzling phenomenon. This speaks to
the deep nature of underlying divisions that persist despite existing institutional
solutions. As Farrell and Newman (2019, 95–101) illustrate, jurisdictional overlap
is not merely a system clash but involves the construction of preferences across
jurisdictions. Such divisions not only concern the transatlantic context but char-
acterize ongoing struggles to define legitimate principles of data governance across
the globe.

1.3 Research Question

This book aims to fill this conceptual and empirical gap in the literature by
investigating the following overarching research question:

How do actors resolve jurisdictional conflicts in the field of data governance?

This question is based on a procedural understanding of conflict resolution and
explicitly understands jurisdictional conflicts not only as specific instances of con-
flict but also as being embedded in broader and shifting normative dynamics. The
question comprises three sub-questions that speak to the different dimensions of
conflict resolution processes. First, I focus on how actors respond to overlapping
jurisdictional claims. The analysis focuses on justificatory practices that include
but also go beyond the legal and institutional aspects of jurisdictional claims. Sec-
ond, I analyse how, once actors have agreed on a resolution strategy through their
justificatory practices, power dynamics stabilize and mediate such agreements. Third, I aim
to understand how specific instances of jurisdictional conflict are interlinked with
the broader trajectory of data governance, investigating how data governance is
embedded in underlying institutional and normative processes of change and con-
tinuity. In short, this book aims to answer the following three ancillary questions:

1. How do actors deal with conflicting demands about legitimate control over
data?
2. How are agreements (temporarily) stabilized?
3. How are conflicts interlinked with the evolution of the field of data gover-
nance?
8 DATA GOVERNANCE

The book offers a conceptual and empirical framework to understand the uneven
resolution processes of jurisdictional conflicts in data governance: in Chapters 1–
3, I develop a theoretical framework based on international political sociology.
I combine a Bourdieusian perspective on fields (Bourdieu 1993, 1996) with the
framework of value orders, drawing on Boltanski and Thévenot (2006) to anal-
yse ‘jurisdictional conflicts’ (Abbott 1986). I set up the contours of the field of
data governance as a social space of interaction. Through inductive reconstruc-
tion, I sketch the normative dynamics and the relations between actors which, I
claim, are rooted in five distinct value orders. In Chapters 4–8, I explore these
dynamics with regard to more specific instances of conflict and illustrate how
they connect to the broader trajectory of the field of data governance. The analysis of five case studies set at the intersection of jurisdictional claims by actors in
the EU, the US, and private companies focuses on normative ordering processes
and conflict resolution strategies. The analysis extends beyond the mere applica-
tion of the theoretical and methodological framework and uses the case studies
as a medium for continued conceptual discussion. In Chapter 9, I develop an
empirically informed typology of four distinct visions of data governance based on
cross-case comparison before discussing the implications and outlook for future
research. In short, this book argues that jurisdictional conflicts have a so far under-
estimated normative quality that creates significant opportunities for disruption,
while constituting challenges for durable conflict resolution. In the context of
increasing interdependence, actors are united by the common objective of govern-
ing data but divided by deep normative convictions concerning the right conduct
and goals of data governance. Actors perceive differently not only the societal goals
of data governance but also what data are in the first place. Some perceive data as a neutral tool for security agencies, others prioritize their character as an economic resource, and still others see them as closely interlinked with individuals. When claims of legitimate control over data overlap, actors attempt to
impose their normative visions or socially constructed realities of data governance
(see also Obendiek 2022) on others to shape the criteria for evaluation. Mediated by their positions in the field, they may find recognition only if they are able to
create normative fit with the underlying principles in the field. Frequently, actors
resolve conflicts through institutional remedies that fail to address these underly-
ing divisions. Therefore, the field is characterized by the continuous re-emergence
of conflict.
In Sections 1.4–1.6 of this chapter, I first relate the book to relevant strands
of legal and political science research. Second, I outline the main theoretical and
empirical arguments of the book before, third, providing a detailed outline of the
following chapters.

1.4 Drawing on Existing Literature: Conflicts in Data Governance

Having outlined the conceptual understanding of data governance, in this section I relate the book to the existing literature. It is situated
at the intersection of legal and political science research on data governance as
well as sociological approaches that offer theoretical insights into the negotiation
of jurisdictional claims, emphasizing the interrelatedness of power and normative
claims.
The literature on data governance from a political science perspective can be
loosely divided into two different strands. First, scholars have focused on the
broader evolution of data governance and transatlantic relations, particularly from
an institutional perspective (see, e.g., de Goede 2012b; de Goede and Wessel-
ing 2017; Bellanova and de Goede 2022; Farrell and Newman 2016, 2018, 2019;
Kaunert et al. 2012; MacKenzie and Ripoll Servent 2012; Pawlak 2009; Suda
2013, 2017). Some scholars have analysed the contested and uneven rise of EU
(data-based) counterterrorism measures, focusing on securitization (Kaunert and
Léonard 2019) and responsibility discourses (Kaunert et al. 2012, 2015), par-
ticularly with regard to the European Parliament (Ripoll Servent 2013, 2017;
Ripoll Servent and MacKenzie 2011). Others have discussed the implications
of the Snowden revelations for data governance, highlighting the effect on the
GDPR (Kalyanpur and Newman 2019b; Laurer and Seidl 2021; Schünemann
and Windwehr 2020). A recent volume edited by Fabbrini, Celeste, and Quinn
(2021) analyses instances of tension and cooperation in transatlantic data sharing
in law enforcement, commercial data, and the right to be forgotten, particularly
from a policy perspective. The most notable contribution is a recent book by Far-
rell and Newman (2019, along with other work, e.g. 2016, 2018) which traces
transatlantic data governance disputes in the area of security cooperation. They
highlight the relevance of transnational coalitions of high-ranking public officials
which considerably shape policymaking and contribute to the proliferation and
entrenchment of bilateral agreements and either insulate domestic institutions or
use ‘cross-national layering’ to instigate domestic change (Farrell and Newman
2019, 19–20; see Pawlak 2009 for a similar argument on networks). However,
scholars rarely illustrate why these actors aim to protect or change institutions or
policies in the first place. From a conceptual perspective, the normative character
of these struggles has not been sufficiently addressed.
Second, in contrast, a considerable number of academic articles highlight the
normative nature and implications of current data governance practices. Some
scholars allude to normative divides between actors (Kuner 2017; Mitsilegas
2014; Newman 2008) and highlight the normative significance of data governance
(Lynskey 2015a, 2019; Tzanou 2017). This includes criticism of ‘surveillance cap-
italism’ (Zuboff 2018), state surveillance (Bauman et al. 2014; Bigo et al. 2013;
Parkin et al. 2013), and ‘dataveillance’ (Amicelle 2013; Amoore 2020; Amoore and
de Goede 2005; Aradau 2020), particularly concerning the preventive character
of data-based security policies (Mitsilegas 2020), or the demand for an internet
constitution (Fischer-Lescano 2016). An emerging literature particularly in Sci-
ence and Technology Studies (STS) outlines private responsibilities (Chenou and
Radu 2019) and imaginaries (Mager and Katzenbach 2021) but focuses mainly
on the effects rather than the foundation of such visions. While data protection
principles, including the GDPR, explicitly articulate deeper normative aspirations
(see Chapter 3), the sociological relevance of normative arguments in data gover-
nance is insufficiently theorized. An emerging branch of literature that considers
the normative order of the internet (Kettemann 2020), internet constitutionalism
(Redeker et al. 2018), or value judgements in algorithmic regulation (Bellanova
and de Goede 2022; Johns and Compton 2022; Ulbricht 2018) is just begin-
ning to understand the underlying normative principles in the field of internet
governance.
In sum, jurisdictional conflicts, and particularly the processes of resolving
them, remain undertheorized and understudied (for the most significant excep-
tion, see Farrell and Newman 2019; Obendiek 2022). While the literature
illustrates the existence of regulatory differences and the prevalence and nor-
mative implications of transatlantic and intra-EU conflicts in data governance,
the sociological relevance of normative arguments is still underestimated. Of
course, data governance is not only a question of values and the pursuit of
the common good, but ignoring the normative dimension of data governance
in favour of legal or power dynamics risks obscuring the deep divisions
that contribute to the prevalent re-emergence of conflicts. To address this lack
of understanding, this book adopts a bifocal theoretical framework inspired
by international political sociology, uniting a focus on power asymmetries
with the investigation of justificatory practices that draw on higher normative
principles.
In Chapter 2, I develop a theoretical framework selectively drawing on Bour-
dieusian field theory (Bourdieu 1996) and Boltanski and Thévenot’s (2006) soci-
ology of critique. Field theoretical approaches are particularly well suited to the
study of complex relational dynamics in contested policy areas (Mérand 2010;
Pouponneau and Mérand 2017; Vauchez and de Witte 2013). The sociology of
critique by Boltanski and Thévenot contributes a moral perspective to the analy-
sis of contested subject areas through the concept of value orders. The developed
framework draws on these approaches as theoretical building blocks to bridge
conceptual divisions. It thereby contributes to approaches that stress the reflexive
capacities of actors and strategic action in a social space that is already hierar-
chized (Fligstein and McAdam 2012; Sending 2015). While some scholars have
identified benefits from theoretical cross-fertilization (e.g. Leander 2011), this spe-
cific framework is new. It adds, I argue, reflexivity and agency to the Bourdieusian
perspective while making the value orders framework more receptive to dynamics
of power and hierarchy. To analyse conflict resolution processes in data gover-
nance, I focus on how field dynamics structure the ‘space of possibles’ (espace de
possibles, Bourdieu 1996, 234–9) for justifications and practices. More specifically,
I argue that the field of data governance is structured by distinct value orders that
each define a common good, invoke different reference communities, and out-
line valued objects as well as contrasting dystopian scenarios. Together with calls
for action, they form comprehensive and competing visions of data governance.
This book shows that diverging conceptualizations of societal goals constitute an
important factor in the emergence and resolution of conflicts between the EU, the
US, and private companies.

1.5 Main Argument and Contribution

The contribution of the book is threefold: first, it reconstructs data governance as a field that is built around the idea that control over data is a key component in the
pursuit of societal goals. At the same time, it shows how actors perceive these goals
differently. It links these normative differences to broader debates on the substan-
tive and procedural goals of global governance. Second, the framework integrates
positional asymmetries with the formulation of normative arguments and demon-
strates how actors are constrained in their resolution of normative disputes. Third,
following from that, it outlines the ways in which actors link abstract normative
ideas with concrete calls for action. In the following, I briefly outline the main
arguments of the book while also highlighting the contributions of the different
chapters.
The book puts forward three general propositions concerning how actors
resolve jurisdictional conflicts in data governance: first, I argue that the field of data
governance is characterized and structured by multiple and shifting understand-
ings of worth. There is a plurality of normative claims regarding the goals of data
use and regulation, for example to prevent terrorism and crime, foster economic
progress, or secure human rights. These understandings are bound to clash. Yet the
normative nature of such claims is not sufficiently theorized even in recent studies
on the topic. The literature stresses predominantly the tensions between security-
driven surveillance practices and fundamental rights protections (e.g. Farrell
and Newman 2019). Justificatory practices highlight the foundational nature of
diverging conceptualizations of territoriality, sovereignty, and globalism in the
context of increased interconnectedness. The book links justificatory practices in
data governance to current debates in IR regarding the contested objectives of
global governance (e.g. de Wilde 2019; Zürn and de Wilde 2016). It shows data
governance as a crucial site of conflicts regarding the foundations and goals of societal, economic, and political life.
Second, I demonstrate how relations of power shape these conflicts over mean-
ing. I argue that actors reflect on their justifications but are constrained by a ‘space
of possibles’ (Bourdieu 1996, 234–9). More specifically, I conceptualize transna-
tional data governance as a distinct field of struggle (Bourdieu and Wacquant
1992, 122). Interaction is structured by field-specific logics, subject to continu-
ous meaning-making processes, and focused on the control over data as a specific
type of capital. The increasing salience of data has contributed to a self-reinforcing
logic that suggests that data governance is central in the pursuit of societal goals.
While actors share an understanding of the high significance of control over
data per se, deep normative divisions give rise to conflicting policy responses.
Actors attempt to gain recognition for their claims and struggle to ‘produce and
to impose the legitimate vision of the world’ (Bourdieu 1989, 20). Through its
relational approach to processes of ordering, the field approach transcends typical
distinctions, such as public and private, global and national. Actors’ success largely
depends on their capacity to reflect on the possibilities within a specific field (Flig-
stein 2001, 114). The normative character of the field creates space for disruption.
In particular, challengers in more peripheral positions may disrupt the status quo
by formulating an alternative vision of the field. As I outline in Chapter 4, a Euro-
pean citizen complaint led to the invalidation of the transatlantic data transfer
agreement against the interest of powerful actors such as the European Commis-
sion, the US government, and major companies. Yet entrenched structures may
foster convergence in the long term.
Third, in Chapter 9, I develop an empirically grounded typology of four dis-
tinct and competing visions of data governance. These ‘vision[s] of the social
world’ (Bourdieu 1985, 731) can be understood as patterns of interpretation in
data governance. They combine specific calls for action and the simultaneous ref-
erence to specific common goods into coherent visions of the current and future
character of internet governance. When actors draw on these visions, they concep-
tualize data as a governable object with specific characteristics. This demonstrates
how data take form as an ‘epistemic object’ (Knorr Cetina 2005, 185) that tem-
porarily stabilizes and materializes through the production of knowledge. They
also outline constituencies, that is, the claimants’ actorness in relation to others.
Justifications indicate (a) a deeply entrenched vision of data governance as local
liberalism. Proponents emphasize the protection of individual and citizen rights,
in spite of potential extraterritorial impact. Others emphasize (b) the significance
of data for economic progress and innovation to envision data governance as the
digital economy. This vision draws heavily on the construction of data as an economic resource but also fosters the idea of limited private responsibility
to reinforce trust in the digital economy. Through the vision of data governance as
(c) a security partnership, underlying processes of securitization link global data
INTRODUCTION 13

governance and cooperation with the search for safety and security. Proponents
attempt to establish data as a crucial tool in the fight against terrorism and crime.
Finally, (d) a vision of global cooperation emphasizes worldwide collaboration in the pursuit of compatible data protection principles.
In short, I suggest that the normative plurality in data governance is undertheo-
rized but has significant implications for the continuous re-emergence of conflict.
I argue that while challenging actors are particularly capable of formulating an
alternative vision of the field, positional differences and the pervasiveness of the
field’s structuring effects foster convergence in the long term. In resolving jurisdic-
tional claims, actors loosely try to impose one of four competing visions of data
governance that conceptualize the field and data according to normative objec-
tives and link these to specific calls for action. While the analysis is focused on the
transatlantic context, which has been the main site of both regulation and conflict
since the 1990s, the findings of this book are globally relevant. As countries across
the globe, including major economic players such as China and India, have been in
the process of adopting new data protection measures, it is crucial to understand
the normative questions at the heart of these legal and technical decisions.

1.6 Chapter Outline: Analytical Framework and Empirical Context

The book is structured as follows: in Chapter 2, I outline my theoretical approach focusing on the field as a social space of interaction and the space of possibles that
constrains actors in their appeal to higher moral principles. The chapter discusses
the contributions and limitations of the selected theoretical building blocks. I
argue that the nature of jurisdictional conflicts is more foundational than currently
established in the literature and is related to fundamentally different conceptions
of the object, goals, and community of governance. Actors draw on a combination
of material and discursive resources to impose their specific criteria for evalua-
tion. I suggest that actors holding an ‘“oppositional” perspective’ (Fligstein and
McAdam 2011, 4), because they are less embedded in the field, are more likely to
successfully articulate an alternative vision of the field. The chapter also outlines
the research design of the book. It discusses case selection, data collection, and the
interpretative text analysis in the empirical case studies as well as the comparison
across cases.
Chapter 3 sets the stage by establishing the context and logics of jurisdictional
conflicts. It establishes data governance as a field, with the EU as a ‘global regu-
latory hegemon’ (Bradford 2020, xvii) in data protection and the US in a central
position relating to security-related data governance (Suda 2017). Drawing on the
analysis of key texts that represent constitutive moments for the field, I inductively
identify five value orders that express moral worth in two dimensions. On the one
hand, three substantive value orders are based on the common goods of security,
fairness, and production, while, on the other hand, two institutional orders are
based on sovereignty and globalism. I trace the emergence and institutional mani-
festations of these orders. To embed their normative underpinnings conceptually,
I outline similarities to broader debates in political philosophy. The value orders
structure processes of meaning-making and define morally worthy and deficient
behaviour. They define a common good, a reference community, and dystopian
and valued principles as well as criteria of evaluation.
Chapters 4–8 are empirical. Here, I analyse how actors respond to and deal with
jurisdictional conflicts and how their resolution is interlinked with the broader tra-
jectory of the field of data governance. I draw on a discourse-analytical reading of
policy documents, court proceedings, and press releases as well as semi-structured
expert interviews and observations. Using the value orders as my analytical frame-
work, I analyse five jurisdictional conflicts that involve public and private actors
loosely associated with the EU and the US, the most significant regulatory pow-
ers in data governance. Each case makes a specific contribution to understanding
conflict resolution processes and key dynamics, such as the fragility (Chapter 4) or
stabilization of agreements (Chapters 5, 6, and 7), the disruptive role of non-state
actors (Chapters 4, 7, and 8), or the political conduct of courts (Chapters 4 and 8).
Chapter 4 focuses on the disruption of transatlantic commercial data flows
after the Snowden revelations. Despite intense criticism of the lack of protection
through the framework which had governed commercial data transfers since 2000,
the European Commission seemed extremely reluctant to address how it facili-
tated US intelligence data access. A complaint by the Austrian lawyer Maximilian
Schrems resulted in the invalidation of the agreement by the Court of Justice of
the European Union (CJEU) in 2015 and imposed principles of the order of fairness, particularly fundamental rights considerations, on the negotiations. Hence,
this empirical chapter highlights the vulnerability of highly institutionalized and
power-laden transatlantic relations, demonstrating the disruptive effects of indi-
vidual actors. The fragile normative balance of institutional agreements creates
opportunity structures particularly for actors holding more peripheral positions
in the field.
Chapter 5 outlines the emergence of a global regime of Passenger Name Records
(PNRs) sharing. In 2001, the US passed a law that required airlines to share pas-
senger data with US authorities, contributing to significant tensions in the EU–US
relationship. Despite several disruptions in the legalization of the jurisdictional
claim, the discourse significantly shifted from its initial focus on the order of fair-
ness. I explore the increasing imposition of security as a criterion of evaluation
and discuss the difficult and uneven rise of the EU as an international countert-
errorism actor in the field. This chapter also illustrates how shared concepts of
responsibility constrain the space of possibles of challengers such as the European
Parliament.

In Chapter 6, I examine a jurisdictional conflict between the US, the EU, and the
Belgian Society for Worldwide Interbank Financial Telecommunication (SWIFT)
in financial data sharing. In 2005, widespread access by US authorities to SWIFT
data on financial flows strained the transatlantic relationship. While EU actors
initially strongly contested the jurisdictional claim of the US and the European
Parliament even rejected a high-level bilateral agreement after the Lisbon Treaty,
EU institutions increasingly supported financial data sharing for security pur-
poses. As in Chapter 5, the successful imposition of security as the criterion of
evaluation has entrenched financial data sharing as a necessary security measure.
The chapter shows the importance of ‘composite objects’ (Boltanski and Thévenot
2006, 279) that unite characteristics of different orders. Review and oversight
mechanisms contribute to a lasting compromise between principles of the order
of fairness and the order of security, despite limited impacts on data practices.
In Chapter 7, I explore a conflict between the US and the private company
Microsoft in online law enforcement. Access to electronic evidence such as email
account information by law enforcement authorities abroad raises questions of ter-
ritoriality and public–private cooperation. In the conflict, Microsoft refused US
authorities access to data stored in Ireland. The conflict demonstrates how the
significant role of private actors in data governance makes public actors reliant on their goodwill and cooperation. The emergence of diverse
regional regimes on e-evidence shows how actors draw on different normative
and institutional resources to sustain their jurisdictional claims.
Chapter 8 discusses the contested right to be forgotten and the more recent
debates that not only stress the balance between privacy, freedom of information,
and business interests but also address the question of extraterritorial applica-
bility. In the latest instance of conflict, Google has become a focal point for the
fight to preserve freedom of information online, while the French data protection
authority (DPA) emphasizes the necessity of protecting European citizens’ privacy
also beyond European borders. The case demonstrates the varying capabilities of
actors to establish compelling jurisdictional claims, depending on field-inherent
restrictions and the normative quality of arguments.
In Chapter 9, I compare justificatory practices across conflicts and outline how
they are embedded in broader struggles between value orders. I conceptualize
these struggles through four distinct and competing visions of data governance
that link the principles articulated in the value orders with concrete calls for
actions. I suggest that, subject to existing hierarchies, actors struggle to improve
their position and invoke both normative principles and other resources to claim
jurisdiction. They point to hierarchical contradictions and inconsistencies or rely
on a set of alternative principles of worth to contest the status quo. Their suc-
cess varies depending on their capacity to establish an alternative vision of the
field. I outline the contributions of the book in the context of global data governance. Pointing to future trajectories, the chapter discusses the implications of the findings beyond the transatlantic context. The chapter points
to the significance of conflicts over meaning involving data localization practices,
global infrastructure, and sovereigntist assertions, particularly considering the rise
of China as a central actor in data governance.
In sum, the book offers a fresh perspective on the processes of meaning-making
that accompany the constitution, stabilization, and contestation of jurisdictional
claims in the field of data governance. I suggest that these conflicts are symp-
tomatic of larger normative and institutional struggles that play out not only
between the EU and the US but also globally. In the coming years, data gov-
ernance frameworks across the globe will be (re)negotiated. As an increasing
number of actors extend their claim over data, this book shows that a clear idea of
what data mean for us and what is at stake when governing them interlinks with
fundamental questions of collective values and goals. Data governance unites fea-
tures of increasing importance in contemporary politics, including the changing
relations between citizens, states, and private companies. Thus, it provides indi-
cations of how struggles about the global order may evolve under conditions of
ever-increasing inter- and transnational interlinkages.
2 Theorizing the Resolution of Jurisdictional Conflicts

In Chapter 1, I have outlined data governance as a site of frequent conflicts. I presented the challenges their resolution poses to a perspective narrowly focused on
institutions, legal questions, or the exercise of power. I have suggested that the pat-
terns of conflict and convergence in data governance can be understood through
greater recognition of the distinct normative visions that the field actors articulate
in conflicts. In this chapter, I disentangle theoretically how actors resolve jurisdic-
tional conflicts. I address the three ancillary questions, that is, how actors respond
to overlapping jurisdictional claims, how agreements are stabilized, and how the
resolution processes of jurisdictional conflicts are interlinked with the evolution
of the field and its normative structure. I outline a theoretical framework based
on a sociological approach to jurisdictional conflicts, defined as both immediate
situations of justifications and power struggle and instances of a more profound
competition between distinct conceptualizations of the common good. The frame-
work establishes a bifocal approach considering both specific events, such as
instances of conflict, and their interlinkages with the ‘longue durée’ (Madsen 2013,
199) of data governance.
In short, I argue that jurisdictional conflicts in this area have a so far underes-
timated normative quality. They are multidimensional in the sense that they not
only comprise institutional or power dynamics but are rooted in deep normative
divisions. Conflict resolution processes cannot be fully understood without the
recognition of these divisions. The absence of a ‘consensual “taken for granted”
reality’ (Fligstein and McAdam 2011, 4–5) gives in particular non-state and more
peripheral actors without formal authority positions the opportunity to challenge the status quo by
generating an imperative of justification. The necessity of interaction in the face
of problematized overlapping jurisdictional claims opens space for reflection on
diverging governance objectives. This creates possibilities for contestation. In their
responses and challenges, actors are to some extent constrained by the principles
of the field that delineate the ‘space of possibles’ (Bourdieu 1996, 234–9). Over
time and with deeper engagement, actors increasingly tend to replicate the domi-
nant normative structure rather than articulate profound normative challenges. As
they fail to sustain an alternative vision of the field, even contesting actors are fre-
quently complicit in the creation or institutionalization of structures and policies
that undermine their wider normative agenda. In the beginning or at the margins
of a field, actors are less socialized into a specific context and develop, as Harbisch
terms it, ‘change agency’ (2021, 42). Over time, they lose their ‘“oppositional” per-
spective’ (Fligstein and McAdam 2011, 4). This explains why institutionally weak
actors are often able to severely disrupt even long-standing institutionalized agree-
ments against the interests of more powerful actors but then fail to transform these
challenges into change in the long term. Thus, the resolution of conflicts in data
governance and its consequences depend on the character of the normative chal-
lenge, the actors’ capacity to engage in justificatory practices that create shared
normative understandings, and the existing hierarchization in the field.
The chapter will proceed as follows: first, I outline the theoretical approach in
relation to the existing literature and point to the specific contributions of the
relationalist approach taken here. Second, I sketch the concept of jurisdictional
conflicts as a particular type of conflict. Third, I establish the contributions of
Bourdieu’s critical sociology and the sociology of critique based on Boltanski and
Thévenot. Regarding critical sociology, I establish the field as a space of interac-
tion that is structured around the common ‘illusio’ (Bourdieu 1996, 331–6; see
also 1990, 1998) of control over data. I explore the constitution and specific characteristics of this field in more detail in Chapter 3 but outline the mechanisms
that shape actor behaviour in principle. With regard to the sociology of critique, I
specify the categorization of justifications in situations of normative dispute, draw-
ing on the concept of value orders. Fourth, I outline the theoretical framework
that aims to conceptualize the resolution of conflicts in data governance. In three
steps, I explore the resolution of the conflict through its emergence, responses, and
outcome. Fifth, I outline the research design and method, including case selection.

2.1 Contributions to Existing Literature

The argument presented in this chapter draws on selected concepts from interna-
tional political sociology to highlight the constructed and historical nature of the
space, objects, and categories of governance that significantly shape the evolution
and resolution of these conflicts. It draws on a relationalist approach that fore-
grounds processes of mutual construction and stabilization between actors and
objects (McCourt 2016, 479). First, the theoretical framework establishes the soci-
ological nature of jurisdictional conflicts. It suggests that these conflicts involve
claims of legitimate control over an issue (Abbott 1986, 191). These are themselves
constitutive of the contents of this issue as an object of governance as well as the
form and goals of its governance. Second, the book draws on field theory. The the-
oretical framework establishes the hierarchized space of interaction that organizes
the site of jurisdictional conflicts as a Bourdieusian field (Bourdieu 1996, 1998).
As I outline in Chapter 3, actors engaged in the field are united by the percep-
tion that data governance is meaningful in the pursuit of the common good, the
THEORIZING THE RESOLUTION OF JURISDICTIONAL CONFLICTS 19

illusio of the field. The field as a concept offers a particularly useful perspective on
the co-constitutive relationship between specific instances of conflict and under-
lying structures. Third, to unpack and categorize the deep normative divisions in
jurisdictional claims and conflicts that are formed and expressed in justificatory
practices, I combine this approach with the concept of value orders, drawing on
Boltanski and Thévenot (2006). Value orders stipulate distinct conceptualizations
of the common good, reference communities, dystopian scenarios, and evaluative
mechanisms. I rely on the concept as a powerful heuristic tool for analysing how
actors define jurisdictional claims as morally worthy or deficient, depending on
the perceived contribution to common societal values.
Through the understanding of the multidimensional nature of jurisdictional
conflicts, the approach seeks to make three contributions to existing debates on
data governance and jurisdictional conflicts. First, the bifocal approach illuminates
both short- and long-term dynamics. This helps understand which factors in dif-
ferent issue areas and contexts shape the likelihood of disruption and conflict or
stabilization and convergence. Existing approaches have illustrated the relevance
of (inter)dependence (Argomaniz 2009; Kaunert et al. 2012, 2015) and access to
transnational fora (Farrell and Newman 2019), the difficulty of reconciling differ-
ent legal approaches (Bignami and Resta 2015; Kuner 2015; Reidenberg 2000), or
the role of norm entrepreneurs (Newman 2008). These approaches tend to illus-
trate either short- or long-term dynamics but rarely both. The emphasis on social
learning processes based on the asymmetrical transatlantic security relationship
(Argomaniz 2009) does not explain the continual re-emergence of disruptions in
similar issue areas. In contrast, the literature on legal pluralism has outlined not
only the empirical but also the normative relevance of the coexistence and reg-
ular clashes between legal systems in a field (Krisch 2010). Yet the macro-level
approach rarely considers how the political, legal, and normative claims play out in
practice and are mediated by positional asymmetries. By emphasizing the norma-
tive nature of conflicts as well as their embeddedness in a hierarchically structured
field, this book outlines how short-term disruption strategies often fail to instigate
change in the long term.
Second, the approach allows the problematization of data as an object of gov-
ernance and conceptualizes the plurality of normative underpinnings of such
constructions. This helps understand how more peripheral actors contribute to
disruptions and how even strongly institutionalized compromises fail. While most
scholars perceive data as fixed, this book contributes to existing interdisciplinary
work on data governance that analyses how data are constructed as govern-
able rather than simply how they are framed. This includes, for example, work
in media studies (van Dijck 2014; Gitelman 2013) or critical security studies
(Amoore 2011; Aradau 2020; de Goede 2018b; de Goede and Wesseling 2017).
Drawing on Science and Technology Studies (STS) or ‘algorithmic regulation’
(Yeung 2018), scholars have problematized data-based social ordering through
technological and infrastructural choices (Bellanova and de Goede 2022; Ulbricht
2018). Unpacking the various value judgements that construct data as governable
further highlights the normative character of governance and policy decisions that
assign or reject responsibility (Kjaer and Vetterlein 2018). In conceptualizing how
these claims are embedded in comparatively stable value orders, the theoretical
framework provides a more profound account of stability and change in an area
usually represented as fast-paced. Value orders highlight the diversity of princi-
ples and evaluative criteria that actors draw on in their justifications. When even
powerful actors fail to create a normative fit between their jurisdictional claims
and the normative principles in the field, their claim of legitimate control over
data is likely to be rejected. This also contributes to a better understanding of the
underestimated role of moral arguments in contemporary international politics
(Hanrieder 2016; Niemann 2019).
Third, the field approach focuses on the interrelationships between actors
rather than specific actor characteristics. This helps us understand how jurisdic-
tional conflicts are resolved through interactive negotiation processes and across
jurisdictions (Suda 2013, 773) rather than through institutions or the unilateral
assertion of power. Existing scholarship has effectively illustrated the relevance of
access to transnational fora and informal networks (Farrell and Newman 2019;
Pawlak 2009) or the changing role of the European Parliament in view of gain-
ing formal authority (Ripoll Servent 2013; Ripoll Servent and MacKenzie 2011).
The ‘New Interdependence Approach’ outlined by Farrell and Newman (2016),
the literature on ‘regime complexes’ (Alter and Meunier
2009), and emerging research on overlapping spheres of authority in the global
realm (Kreuder-Sonnen and Zürn 2020) have specified how the contestation of
governance objectives takes place within but also across institutional boundaries.
Yet in these accounts, the role of non-state actors as active participants in these
structures, particularly private companies, is still undertheorized. Most existing
analyses tend to focus on one specific type of actor (de Goede 2018b, 26; Hijmans
2016a; Raab and Szekely 2017). The problematization of analytical categories in
sociological approaches illustrates how the objects of and participants in these
negotiation processes are themselves actively constructed. This contributes to a
procedural understanding of conflict resolution processes that does not focus on
specific actor characteristics but on their role in meaning-making processes. In
sum, through the focus on both long- and short-term dynamics, by unpacking con-
ceptualizations and categorizations of principles of worth, and by emphasizing a
relational approach rather than actor categories, the framework is able to illumi-
nate the variation in resolution strategies and short- and long-term outcomes and
consequences.
The sociological approaches share a deeper recognition of the continuous strug-
gle to define and conceptualize the social world through knowledge, power, or
justification. The main aim of this chapter is to present a framework that enables
fruitful empirical analysis of jurisdictional conflicts (Abbott 1986) in the field of
data governance (Bourdieu 1996, 1998) through the lens of justifications that link
to various value orders (Boltanski and Thévenot 2006). Nevertheless, it should
be noted that this theoretical account bridges conceptual divisions between soci-
ological perspectives that are subject to intense debate. While some scholars
emphasize the differences between Bourdieu and Boltanski (e.g. Emirbayer and
Johnson 2008), I follow those scholars (Bénatouïl 1999; Fowler 2014; Leander
2011; Susen 2015) who have demonstrated that some of the enunciated differences
may actually benefit from increased dialogue (for a conceptual discussion, see the
contributions in Susen and Turner 2014). I argue that productive exchange par-
ticularly concerns the articulated conceptions of agency and normative elements,
which I outline later. Therefore, while aiming for transparency regarding the con-
ceptual baggage these approaches carry, I mainly try to illustrate the benefits of
their application.

2.2 Three Elements of a Jurisdictional Claim

As I established in Chapter 1, due to the prevalence of cross-border data flows,
data governance frequently features some form of institutional or norm-based
overlap. To conceptualize conflicts that emerge in the context of such overlap,
I employ Abbott’s concept of ‘jurisdictional conflicts’ (1986, 192). Rather than
being narrowly understood as legal jurisdictions, as in the conflict of laws, jurisdic-
tional claims comprise elements of normativity and knowledge. Conflicts emerge
when these claims overlap and at least one actor problematizes this overlap (see
also Holzscheiter et al. 2020). I suggest that jurisdictional claims comprise three
elements:

1. the object of governance, i.e., what they attempt to control;
2. their reference community, i.e., the relations between them and the affected
stakeholders;
3. the justification, i.e., the legitimacy of their claims.

First, when actors claim ‘jurisdiction—that is, legitimate control—of a problem’
(Abbott 1986, 191), they also define an issue and legitimate forms of its governance.
The diverging conceptualizations of objects and objectives of governance high-
light the nature of jurisdictional claims as both knowledge and normative claims.
This approach resonates with current IR scholarship on the creation of knowledge
through science, expertise, and professionals (e.g. Allan 2017; Hanrieder 2019;
Sending 2015), the sociology of knowledge, and the role of epistemic practices
in IR (Bueger 2014). For instance, Allan (2017) has highlighted how processes of
co-production contribute to the constitution of the climate as an epistemic object
and shape regulation. Boettcher (2020, 7) similarly illustrates the co-constitutive
relationship between how technologies are ‘discursively “formed as objects”’ and
their governance. Drawing on Bourdieu, Sending (2015) highlights how exper-
tise and governance claims were intricately interlinked in the construction of ‘the
international’ as a governable object. Focusing on practices, Bigo (2014) explores
the social universes that constitute the reality of different professional practices
in EU border control and their implications for the logic of control. Hence, juris-
dictional claims are constitutive of specific governance objects, which in turn also
represent a (normative) claim to act on behalf of others (Abbott 2005).
This bridges to the second element of jurisdictional claims—the reference com-
munity. Jurisdiction describes the link between a specific group and its work
(Abbott 1988, 20). Abbott defines such groups as professions. Yet this categoriza-
tion does not easily translate to data governance. In contrast to other fields, groups
in data governance organize around institutional or sectoral membership, such
as ministry affiliations, non-governmental organization (NGO) membership, or
private company status, but even these groups constitute rather loose formations.¹
Therefore, I consider jurisdictional conflicts as emerging between any individuals
or groups that articulate overlapping claims based on some system of knowledge
or control (Abbott 1988, 8).
Third, I investigate these claims according to the elements of justifications
actors employ, that is, their reasoning why their claim rather than others’ is legit-
imate. Lasting conflict resolution processes require a shared understanding of
the governable object, the relevant reference community, and some potential for
compromise between the perceived reasons for engaging in the field.
Abbott highlights the improbability of dominant control both in and through
jurisdictional conflicts. This puts emphasis on ‘how jurisdictions are opened,
contested, and closed’ (Abbott 1986, 192) and foregrounds a procedural under-
standing of conflict resolution. In an area characterized by ‘polycentrism’ (Scholte
2017), the resolution of conflicting overlapping claims entails political processes
of negotiation and normative ordering. Through increasingly hybrid governance
arrangements, private companies assume key roles and responsibilities in such
processes (Chenou and Radu 2019). Therefore, I pay attention to the space of inter-
action in which these conflicts take place, the actors involved, and the evolution
of the conflicts themselves. The outcomes of negotiations are often more tangi-
ble and represent the institutionalization or deinstitutionalization of normative
changes via binding legislation or bilateral or international agreements. Nonethe-
less, compromises, even in these seemingly stable contexts, seem to be relatively
fragile, which highlights the intrinsically conflictive nature of politics (Bueger
and Gadinger 2014). Therefore, I am interested in both short- and long-term
normative and institutional consequences of conflict resolution processes.

¹ Due to the lack of internal organization and external representation as professions, apart from,
potentially, Data Protection Authorities (DPAs), the professional character of such a group is of less
relevance in the field of data governance (see also Raab and Szekely 2017). The diverse expertise of a
legal, economic, and technical character required to enter the field of data governance has prevented
the domination of a particular profession.

2.2.1 Three Phases of Conflict

In the framework, I broadly distinguish between three phases of the conflict: emer-
gence, responses, and outcome. The emergence of the conflict defines the point
in time when at least one actor perceives overlapping claims to be conflictual
and voices dissent. Since I focus on conflicts that have already emerged, I do not
analyse extensively under what conditions actors perceive overlapping jurisdic-
tional claims to be problematic and activate the conflict (see, e.g., Holzscheiter
et al. 2020). Yet the justifications actors use in their initial jurisdictional claims
form the starting point and shape the further evolution of the conflict, including
its resolution. The responses to the conflict include, for example, explicit strate-
gies actors pursue as well as less strategic reactions they display throughout the
conflict. Here, I particularly focus on the actors’ justifications and their reliance
on varying conceptions of worth, as I outline later. The outcome may involve the
institutional or normative resolution of the conflict, for example through an agree-
ment based on compromise or through one side prevailing. This may encompass
different levels of formality, for example a bilateral or international agreement, a
court judgment, the publication of guidelines, or less institutionalized forms of
intersubjective agreement. However, the conflict may also be ongoing.
In conclusion, the concept of jurisdictional conflicts highlights how overlapping
jurisdictional claims that aim to assert control over a specific issue put forward a
specific definition of this issue and how these clashing definitions may result in
conflict. By claiming jurisdictions, actors put forward normative goals. Therefore,
I specifically aim to investigate the production of data governance in its different
dimensions through jurisdictional claims. The conditions, constraints, and conse-
quences of such processes will be examined in more detail later. In Section 2.3, I
outline the analytical tools that illustrate how jurisdictional conflicts play out in
practice.

2.3 A Relational Approach to Conflicts

The analytical approach theorizes the resolution of overlapping jurisdictional
claims and potential consequences of this process. Data governance is character-
ized by strong interlinkages between actors, levels, and issue areas. This makes
a relational approach particularly suitable for analysing how conflicts unfold.
It facilitates investigating how actors construct and interpret legitimate and
illegitimate jurisdictional claims. This ‘turns the focus away from the idea that
objects or structures have assumed a fixed, stable identity and that closure is
achieved at some point’ (Bueger and Gadinger 2014, 456). In the context of con-
flicts, I show how jurisdictional claims construct fragile structures of meaning in
specific situations and at the same time appeal to and constitute relatively sta-
ble normative visions. The analytical recognition of normative plurality does not
imply the absence of power. On the one hand, actors simply do not have equal
access to the discursive sites of conflicts, such as courtrooms, bilateral meetings,
or media outlets. On the other hand, and due to asymmetry in the distribution of
material or symbolic resources, they also have unequal abilities to engage in jus-
tificatory practices. For example, major private tech companies provide backing
to their claims by drawing on significant financial resources, fostering allegedly
independent expertise, for example by funding academic research or conferences
(Bodó et al. 2020) or setting up expert boards (Chenou and Radu 2019). Juris-
dictional claims, therefore, take place in a space that is already hierarchically
structured.
The recognition of the potential for justificatory practices and normative plu-
rality within an already structured space requires a conceptualization of agency
and normativity that bridges insights from Bourdieusian critical sociology and
the Boltanskian sociology of critique with insights from related IR and sociolog-
ical literature (e.g. Fligstein and McAdam 2011, 2012; Sending 2015; Zürn and
de Wilde 2016).² I specifically emphasize both critical actor reflexivity (Boltanski
and Thévenot 2006, 146) and the situatedness of actors in hierarchical struc-
tures (Bourdieu 1996; Bourdieu and Wacquant 1992, 123) in the jurisdictional
claims. The investigation of how actors draw on and enact normative principles
while recognizing the possibility of domination enables a richer account of the
multidimensional character of jurisdictional conflicts.
In summary, critical sociology highlights the hierarchies and rules of inter-
action, while the sociology of critique illustrates the specific justifications and
broader shifting orders of value. The sociology of critique approach adds a more
nuanced perspective on the categorizations actors employ to justify inequality.
The field approach emphasizes the dynamics of power that constrain them. Both
show the co-constitutive relationship of the conflict resolution processes with the
specific logics of the field. In Sections 2.3.1 and 2.3.2, I describe each in more depth.

² Drawing on the concept of ‘habitus’ (Bourdieu and Wacquant 1992, 123), Bourdieu puts emphasis
on how different dispositions shape present and future practices. Space for change is mostly restricted
to external shocks that provoke a crisis in the field (Mérand 2010, 352). This approach, according to
Boltanski, unfairly limits the reflexivity of actors and their ability to address and change their social
surroundings (2011, 19). Boltanski (2011, 21) specifically objects that Bourdieu reserves this reflexivity
for scholars. Thus, the assumption that scientists are able to reflect on social structures, while ordinary
actors rarely escape them, he argues, undermines the postulated reflexive character of the approach.
Boltanski and Thévenot instead highlight the ‘ability to detach oneself from the immediate environment’
(2006, 146), stressing actors’ capacity to reflect on their position. The recognition of actor reflexivity
also has implications for normativity. In field theory, the invocation of normative questions is often
conceptually limited to the legitimization of hierarchy (see Pellandini-Simányi 2014), while Boltanski’s
sociology of critique emphasizes the normative and moral dimensions of action.

2.3.1 The Field as a Space of Social Interaction

Drawing on Bourdieu, I use the concept of a ‘field’ (1996, 1998; Bourdieu and Wac-
quant 1992) to conceptualize the relationship between specific conflicts and the
framework of endogenous logics that constitute the structure of the field. My main
argument is that the common stake at the centre of the field of data governance,
that is, that data governance is meaningful in the pursuit of common societal goals,
leads to self-reinforcing dynamics that make control over data seem valuable. This
contributes to the pervasiveness of competing jurisdictional claims on data. These
dynamics seem to prevail even in cases in which data governance is not effective
with regard to the achievement of the particular societal goal that motivated actors
to join the field in the first place.³ The concept of fields has been used in a diver-
sity of areas, including international diplomacy (Adler-Nissen and Pouliot 2014;
Pouliot 2016), security (Mérand 2010), or sustainability (Dingwerth and Pattberg
2009), but recently also in relation to internet policy (Pohle et al. 2016; Radu
2019; Reiberg 2018). It is particularly suitable for recognizing long-term institu-
tional and social processes in evolving issue areas (see also Hamm et al. 2021).
In the following, I demonstrate how it can also contribute to a better understand-
ing of the specific ‘logic of practice’ (Bourdieu 1990) at play in data governance.
I briefly present the concept and illustrate the implications for interaction in the
field and actor responses in conflicts.
While Wacquant and Akçaoğlu (2017, 62–3) criticize the preponderance of
identified fields in the social sciences, I use this term to describe a space of inter-
action and social relations that is united by actors’ belief in the significance of
common stakes that constitute the centre of the field. In contrast to a community
based on values or common goals, actors share a much thinner form of mate-
rial and ideational interest (Kauppi and Madsen 2014). Bourdieu describes this
unity through the concept of a common illusio. The illusio depicts actors’ notions
that their engagement in the field and the central issue at the field’s centre is
meaningful (Neveu 2018, 364). In the field, actors follow specific logics (Bour-
dieu 1996, 227) that emerge from the illusio and engage in common struggles over
meaning-making (Bourdieu and Wacquant 1992, 102).

³ Scholars have described this belief in the power of data or the perceived pressure to obtain data.
For example, van Dijck captures this idea through the concept of ‘dataism’ (2014), while Fourcade and
Healy (2017) and Schildt (2020) use the term ‘data imperative’.

While the common stakes may appeal to outsiders, fields tend to have some
barriers to entry. For example, even if actors subscribe to the arguably more
widespread illusio that data governance is important, outside actors are likely to
experience difficulties in successfully navigating the field. Actors that have been
engaged in the field for a longer time are likely to have gained material or symbolic
resources, such as accumulated data or specific legal expertise. These resources
can help navigate or structure existing hierarchies. This is well illustrated by the
monopolist position of large tech companies that prevent smaller companies from
becoming more meaningful participants in the field by buying them up (Culpep-
per and Thelen 2019). Nevertheless, fields are always surrounded by, nested in, and
interconnected with other fields (Fligstein and McAdam 2011, 3), shaped by both
external and internal principles of legitimation (Bourdieu 1996, 217; A. Cohen
2018, 203).
While the field establishes the social space, the framework for political action
is formed by the distribution of resources and the perceived social reality of the
field. On the one hand, resources delineate how actors are positioned in a spe-
cific field. Bourdieu describes these resources by the concept of ‘capital’ (Bourdieu
1991), which may manifest in diverse forms, for example through social, eco-
nomic, political, or symbolic resources. The positioning of actors only becomes
meaningful in the context of the specific principles or truths of the field. Thus,
on the other hand, the framework accounts for the subjectivist perceptions and
categories actors use to understand their surroundings (Bourdieu 1985, 727–8).
These truths set the limitations to what actors perceive as thinkable and sayable
under specific circumstances. The truths and struggles of the field are contained
in what Bourdieu describes as the ‘space of possibles’ (1996, 234–9). Actors tend
to relate to these truths (Bigo 2011, 232), which typically results in a reproduction
of existing power dynamics (Susen 2014, 326). For the resolution of jurisdictional
conflicts, this means that actors tend to articulate visions of the field that con-
form to the existing principles of the field. To some extent beyond Bourdieusian
assumptions, I consider actors as making ‘articulate, and more or less strategic,
attempts at gaining recognition in the field’ (Sending 2015, 29). Rather than just
navigating the field in a practical sense, this actor reflexivity also creates the space
for strategic action (see also Susen 2014, 335). Therefore, I argue against an overly
strong emphasis on actor attributes as primary defining characteristics, as these
sources do not just pre-exist but must themselves be constructed and made oper-
ational. In the effort to define the criteria they and others use in their judgement
(Sending 2015, 19; 2017, 318–19), actors’ success largely depends on their capacity
to reflect on the possibilities within a specific field (Fligstein 2001, 114).
In sum, actors engage in constant struggles to ‘produce and to impose the legit-
imate vision of the world’ (Bourdieu 1989, 20). These processes create meaning
and develop into endogenous logics that structure how actors perceive their posi-
tion and the specific truths of the field. In line with the Bourdieusian approach,
I highlight how actors’ reflexive horizon and their practice are shaped by a per-
ceived space of possibles. However, against the backdrop of the Boltanskian
framework, I emphasize the situatedness of action rather than specific predispo-
sitions (Boltanski 2011, 63). The field of data governance is formed around the
common illusio that specifies control over data as a common stake as well as
meaning-making processes that involve the construction of legitimate governance
responses in the pursuit of the common good. To analytically grasp the concep-
tualizations of this common good, I draw on the concept of value orders, which I
outline in Section 2.3.2.

2.3.2 Justificatory Practices and Value Orders

Section 2.3.1 established the field as a space of interaction that shapes (the per-
ception of ) the positions from which actors articulate jurisdictional claims. In
this section, I outline the contributions of the sociology of critique to the theo-
retical framework, which enables the analysis of justificatory practices. As they
establish jurisdictional claims of control over data as morally worthy or deficient,
justifications lie at the very core of jurisdictional conflicts.
Drawing on Boltanski, I assume that justifications aim ‘to reconcile a require-
ment of common humanity, which presupposes the equality of actors, with their
ordering in a hierarchy’ (Boltanski 2011, 112). They unite these two aspects
through references to higher common principles. Kornprobst describes this pro-
cess as ‘subsuming particulars under universals’ (2014, 193), pointing to the link
between the general and the specific. Justifications are, therefore, not only rele-
vant as ex post clarifications for action, but also provide orientation to political
decision-making ex ante.⁴ In other words, in invoking justifications, actors, while
implicitly or explicitly referring to specific imagined communities, draw on higher
principles to legitimize inequality between different actors. In making a jurisdic-
tional claim, actors, in reference to moral worth, justify why their claim rather than
others’ is legitimate. Therefore, in situations of conflict, ‘each of the protagonists
presents not only a different interpretation of what has “really” occurred, but also
different facts in support of her truth claim’ (Boltanski 2011, 61).
Value orders constitute an analytical tool for capturing the grammar actors
employ under these circumstances. Boltanski and Thévenot stipulate that under
conditions of normative uncertainty or dispute, individuals follow an ‘imperative
of justification’ (2006, 346). In feeling compelled to justify their claims, they draw
on distinctive orders of worth, polities (cités), or, as I refer to them, value orders.⁵
Hanrieder defines them as ‘repertoires of evaluation consisting of moral narratives
and objects that enable tests of worth’ (2016, 391). The concept has been used in
empirical analyses of, for example, global health (Hanrieder 2016), the UN Security
Council (Niemann 2019), or organizational structures (Jagd 2011). It should be
noted that value orders do not presuppose strong intersubjective agreement or
shared values but are also present in heterogeneous settings. This is particularly
compatible with the conceptualization of data governance as a field which is
characterized by unity only in the sense that actors share a belief in common
stakes.⁶

⁴ The significance of public justifications for state and non-state actors to take or guide political
actions is firmly established (e.g. Deitelhoff 2009; Bially Mattern 2001; Hansen 2000a). For a more
comprehensive overview of justification, see, e.g., Kornprobst (2014) or Forst (2015).
Boltanski and Thévenot reconstruct six orders from seminal works in Western
political philosophy⁷ but acknowledge the need to account for different periods or
contexts—including those in which justification is not as central (Boltanski and
Thévenot 2006, 347).⁸ Later modifications also include new orders (e.g. Boltanski
and Chiapello 2005; Thévenot et al. 2000). Likewise, I use value orders as a pre-
dominantly heuristic concept adjusted to the field of data governance. As I outline
in Section 2.5, I reconstruct five distinct orders, which I present in Chapter 3. While
the orders are irreducible to each other, I do not perceive them to be fundamentally
incompatible, as a common good may be understood as the orientation towards a
specific goal or the orientation towards a specific community. More specifically, I
suggest that there is not only a plurality of substantive common goods but also a
plurality of procedural common goods.⁹ For example, while the pursuit of safety
and security constitutes a substantive normative goal, the pursuit of sovereignty
to guard specific communities from excessive influence in the global realm con-
stitutes a procedural normative goal. This distinction between procedural and
substantive does not speak to the normative or political quality of the higher

⁵ The terms ‘polity’, ‘cité’, and ‘order of worth’ are used interchangeably in the original work. This
book uses the term ‘value orders’. The concept of ‘world’ describes the justificatory language, object,
and practices that are available to actors in a specific situation rather than the grammar of principles
of the polities.
⁶ This aspect also distinguishes value orders from similar concepts that have been prominent in
social theory. For example, Walzer’s ‘spheres of justice’ (2008) assume that certain principles govern a
particular domain, while value orders explicitly account for the simultaneous plurality of principles of
worth. In contrast, Forst problematizes the limitations of narratives of justifications or normative orders
both concerning their level of specificity and their numerical occurrence (Forst 2015, 31). However, I
suggest that focusing on a limited number of more or less dominant value orders provides analytical
specificity. Clustering the plurality of conceptions of worth and comparing them on a limited number
of dimensions helps understand the main normative reference points that inform political action.
⁷ Boltanski and Thévenot (2006) define six different orders: market, industrial, civic, domestic,
inspired, and fame. Hanrieder (2016) defines four orders: fairness, production, security, and spirit.
⁸ Naturally, the importance of justification presupposes a society in which justification is possible
and to some extent, socially and politically meaningful; see Blokker and Brighenti (2011) for a more
detailed discussion.
⁹ Kornprobst (2014, 8) similarly specifies a distinction between substantive and procedural based
on the specification of the right action or the specification of the applicable community, respectively.
THEORIZING THE RESOLUTION OF JURISDICTIONAL CONFLICTS 29

common principle but merely points to different dimensions of worth. In contrast
to variation on the same dimension, these may arguably be combined more
intuitively. While one might object to the notion of such procedural principles as
common goods, I employ a sociological approach that focuses on justifications in
policy discourse rather than their normative desirability. For enhanced concep-
tual clarity, I further draw on the condensed version of the concept as proposed
by Hanrieder (2016) but differ in my reconstruction, as illustrated in Section 2.5.
Accordingly, each order defines a higher common good, a reference community,
objects of worth, and virtues standing in contrast to potential threats or dystopian
scenarios.
First, involved actors define legitimate and illegitimate justifications according
to their contributions to a specific conceptualization of the common good. As Hanrieder
puts it, 'each conception of a global good such as development, security, or
democracy also allocates blame and praise in specific ways' (Hanrieder 2016, 415).
Therefore, in one polity, private goods satisfy private pleasures, while the pur-
suit of collective values through legitimate means represents a common good. This
justifies societal inequalities, as the invocation of higher common principles ‘nor-
matively determines the place that members of society, who are in principle equal,
should occupy in the hierarchy of social status’ (Honneth 2010, 379). The dis-
tinction between public and private goals is obviously not always clear-cut. While
many would quibble, for example, with the assertion that private companies are
guided by the common good, they frequently justify their actions in reference to
collective values or goals, emphasizing the compatibility of private profit and pub-
lic gains (Nachtwey and Seidl 2020). Thus, companies are not necessarily morally
deficient, just as public officials are not necessarily morally worthy. Limiting the
notion of the common good a priori to specific actors or arguments curtails our
understanding of the recognition of justifications in practice.
Second, if an actor appeals to a higher normative principle, she not only speaks
to the principle of social order that should be applied but also creates an implicit
claim to speak on behalf of others. This may represent an existing or future
(imagined) community, a collective that is united by a specific conviction of the
common good, often based on standards of exclusion and inclusion. What bodies,
for example, are considered to be part of an existentially threatened community is
often defined by racist, classist, or sexist standards (Boltanski and Thévenot 2006,
80; Hansen 2000b). The attempt to speak on behalf of others may fail if the belief
in the common good is not (yet) shared by a group that is considered sufficient to
be taken seriously in a particular context (Boltanski 2011, 100).
Third, a dystopian scenario invokes the potential dangers and threats that may
result from abandoning the principles and virtues associated with a value order.
As also highlighted by the literature on securitization, by referring to an existential
threat to the referent community, whether this threat is constituted by, for example,
bodily harm or by an abandonment of core norms, the relevance of the common
good is re-emphasized (Hanrieder 2016, 400).
Fourth, besides abstract principles, the critical repertoire of actors also com-
prises both objects and other specific strategies for proving worth, as ‘the quality
of things must have been determined in a way that is consistent with the prin-
ciples of worth invoked’ (Boltanski and Thévenot 2006, 130). Proving worth is
not restricted to narratives but may include material objects or symbols, such as
flags or statues but also images, personal anecdotes, or technical expertise (e.g.
Mac Ginty 2017). In data governance, actors may refer to, for example, the exper-
tise of data protection authorities, the technical capacities of companies, impact
assessments, or binding domestic or international law.
The construction of legitimate or illegitimate ways of proving worth or using
evidence is intricately linked to the specific interpretation of a problem. In con-
trast to, for example, framing (Goffman 1974), the concept of value orders accepts
the inherent malleability and fragility of the reference concept. The conceptual-
ization of data as a governable object not only constitutes a different perspective
on a fixed object but points to its constructed nature. Digital data are in some
ways less materially grounded than, for example, flags or tanks. They can be mul-
tiplied, shared, and moved without significant efforts, but they similarly relate to
material structures for storage and transport, such as servers and cables. As I have
outlined, they may be compared with ‘epistemic objects’ (Knorr Cetina 2005, 183)
that temporarily stabilize and materialize through the production of knowledge.
By tracing these construction processes on different levels, that is, the underly-
ing common good, the reference community, and valuable objects, the concept
of value orders provides a more systematic conceptualization of these reference
concepts. In their resolution of overlapping jurisdictional claims, actors need to
negotiate the compatibility of these principles or attempt to impose their preferred
vision of the field.
After outlining the field and value orders as frameworks of interaction and jus-
tification, Section 2.4 will focus on how the combination of these analytical tools
contributes to a unique perspective on jurisdictional conflicts in this area.

2.4 The Resolution of Jurisdictional Conflicts in the Field of Data Governance

Against the theoretical backdrop, this section formulates the concrete theoret-
ical framework for understanding how actors resolve conflicts emerging from
overlapping jurisdictional claims. To understand the resolution of jurisdictional
conflicts in data governance, the book emphasizes two related aspects. On the
one hand, the theoretical framework establishes the space of interaction in which
conflicts take place, that is, the field. The field works as a structuring mechanism
Fig. 2.1 Jurisdictional conflict resolution processes

1. Emergence (Actor P1): jurisdictional claim by P1, either a contestation of
hierarchization within the prevailing order(s) or an attempt to impose principles
of alternative order(s).

2. Responses (Actor Px): Px does not follow the imperative of justification and
draws exclusively on resources, like position in the field; or Px follows the
imperative of justification, either contesting P1's hierarchization within the
prevailing order(s) or attempting to impose principles from alternative order(s).

3. Outcome: on the institutional dimension, status quo, minor reform or revision
of the institutional structure, major reform or revision of the institutional
structure, or emergence of a new institutional structure; on the normative
dimension, the conflict continues, revised hierarchization within the prevailing
order(s), compromise between orders, or a shift from the prevailing order(s) to
alternative order(s).

but also establishes a structured space, because the principles of the field and
jurisdictional conflicts are assumed to be mutually constitutive. On the other hand,
to understand the evolution of conflicts in more detail, I conceptualize the justi-
fications actors bring forward in their jurisdictional claims. This analysis draws
on the concept of value orders. Therefore, the framework outlines how jurisdic-
tional conflicts emerge in an already structured space and clarifies how actors draw
on evaluative principles and tools in their resolution. In Section 2.4.1, I outline
the three phases of the conflict, that is, emergence, responses, and outcome, as
illustrated in Figure 2.1. I also illustrate how these relate to the questions of actor
responses, the stabilization of agreements, and the interlinkages with the broader
principles of the field.

2.4.1 The Emergence of Jurisdictional Conflicts

As outlined above, the specific character of the conflicts I analyse in this book
presupposes that actors have articulated ‘jurisdictional claims’ (Abbott 1986)
that aim to establish control over data. Jurisdictional claims demarcate a space
of control and, at the same time, are constitutive of this space. In claiming
legitimate control of a problem (Abbott 1986, 191), actors also make norma-
tive claims about the character of data and implicitly or explicitly outline how
to govern them. Jurisdictional conflicts arise when at least one actor (P1) per-
ceives overlapping jurisdictional claims to be problematic (see also Holzscheiter
et al. 2020). The act of questioning the current moral or normative hierarchy
may result from perceived injustice or inconsistency. In this case, actors may
decide to ‘test’ the existing hierarchies to confirm or challenge the existing order
(Boltanski and Thévenot 2006, 40; Boltanski 2011, 103). In tests, actors must
invoke their critical capacities in coordinating the plurality of orders and their
intersubjective interpretations. While all tests ‘exploit contradictions’ (Boltan-
ski 2011, 110), they can be ranked according to their potential for normative
change. I use these tests to conceptualize the normative quality of the jurisdictional
claim.
Boltanski identifies three types of tests: truth tests, reality tests, and existen-
tial tests. Truth tests mainly rhetorically confirm (and only rarely disconfirm) the
hierarchization within the existing order. They tend to prevent rather than enable
critique (Boltanski 2011, 62). Due to this largely non-conflictual character, I focus
on the other forms of tests.¹⁰ Reality tests more strongly question the status quo
by explicitly pointing to inconsistencies or injustices (Boltanski 2011, 104). Yet
the underlying order still finds acceptance. For example, a reality test in reference
to the common good of economic progress and innovation might point to prac-
tices that undermine consumer trust, such as privacy violations. Yet the reality test
would not question whether data practices should be subjected to a commodifying
logic. Existential tests are most radical in their challenge, bringing forward funda-
mentally antagonistic positions. While reality tests problematize that a particular
practice contradicts the overall pursuit of a specific common good, existential tests
draw on completely different standards, that is, they question the common good
as such. In our example, data access might be considered no longer in relation to
its contribution to consumer trust and economic progress but as necessary to the
prevention of terrorism. While data protection rules initially were mainly evalu-
ated with regard to their conduciveness to economic progress and the enjoyment
of human rights, the consideration of security as an evaluative criterion experi-
enced a strong rise in reaction to the 9/11 terrorist attacks (Etzioni 2018, 112–13;
Kaunert et al. 2012). Thus, a test that formerly might have involved economic
impact assessments and human rights law may now stress the ease of data avail-
ability or sharing mechanisms in the fight against terrorism to prove worth and/or
contest the status quo. In sum, the theoretical framework assumes jurisdictional
conflicts as arising from tests that question the existing moral hierarchization in
the context of competing jurisdictional claims. Conflicts may emerge regarding
the appropriate interpretation of hierarchy within the existing order (within-order
conflict) or regarding the conceptualization of an issue based on principles from
alternative value orders (across-order conflict). Both instigate an ‘imperative of jus-
tification’ (Boltanski and Thévenot 2006, 346). Most of the conflicts I observe fall

¹⁰ While I do not investigate these cases, the successful prevention of conflicts may be a sign of insti-
tutionalized coordination mechanisms, such as regular truth tests, or the particularly strong position
in the field. While they are, therefore, of great relevance, they are epistemologically challenging, as their
identification is based purely on (counterfactual) assumptions by the researcher.
into the latter category, that is, the rejection of the applicability of a specific regime
of justification (Chapters 4–7) rather than the hierarchy within one order (but see
Chapter 8).

2.4.2 Responses to Jurisdictional Conflicts

In Section 2.4.1, I illustrated how actors invoke specific normative principles after
reflecting on their present and future situation in a given field. In this section, I
outline potential actor responses. While jurisdictional overlap needs to be per-
ceived as problematic by at least one actor (P1) for a conflict to emerge, there are
fundamentally distinct ways in which other actors (Px) may respond to the emer-
gence of conflict or a challenge to the status quo. I discuss three distinct responses
as components of the conflict resolution process. First, actors may not respond to
the imperative of justification at all; second, they may contest the hierarchization
within the prevailing orders; and third, they may attempt to impose principles
from an alternative order. While actors might formulate an immediate compro-
mise, I discuss this in Section 2.4.3, as it constitutes an outcome rather than a
response.
First, actors may decide not to follow the imperative of justification or simply
lack the opportunity to do so. Reasons for this may vary; actors might be aware
of (material or ideational) dependency, for example on the basis of their domi-
nant position in the field. This could prevent a further escalation of the conflict,
as actors either assert their dominance or submit to their marginal position in the
field. Actors may also draw on ‘morally unjustified social norms’ (Möllers 2020,
22) such as the status quo of the existing social order. Some actors may also be
unable to engage meaningfully in justificatory practices because they lack access
to relevant fora or capacities for justification. In short, while the most powerful
actors may decide to resort to their position in the field to evade the imperative
of justification, weak actors might not have the option or capacity to participate
in justificatory practices in the first place. Therefore, actors may simply (be forced
to) remain silent, intentionally ignore the conflict or its normative character, or
fall back on coercive power, such as through infrastructural or technical means.
Second, if actors respond to the imperative of justification, I expect that they
draw on value orders that have already manifested in some form in the broader
field. Particularly as a response to a reality test, actors may emphasize the appli-
cability of and hierarchy within the existing order or the balance of orders that is
dominant in the situation. As illustrated above, the framework of justification is
not necessarily tied to a specific call for action (Boltanski 2011, 111). Therefore,
it may be easier to contest the specific conclusions derived from the predominant
principles and evaluative criteria rather than drawing on a completely distinct set
of principles.
Third, actors may also rely on principles or objects associated with an alterna-
tive value order. The reference to an alternative common good also enlarges the
space of possibles and opens the space for legitimate political action. Therefore,
actors might draw on an alternative justification order if their preferred outcome
is so far removed from their interests or normative convictions that they feel the
need to break with the status quo. I suggest that in contrast to actors holding a posi-
tion of power in the field, what Fligstein and McAdam call ‘incumbents’ (2011, 2),
this type of challenge may be particularly likely to come from actors that hold a
more peripheral position in the field. Due to their outside perspective, those ‘chal-
lengers’ (Fligstein and McAdam 2011, 2) are able to formulate a fundamentally
distinct vision of the field. This bears significant disruptive potential. While value
orders are considered to remain relatively stable, any agreements or compromises
remain open to challenge (Boltanski and Thévenot 2006, 177).
There is not necessarily a clear link between the articulated challenge or test
(by P1) and the response of other involved actors (Px). For example, if actor P1
articulates a reality test, Px may decide to deny the addressed inconsistencies or
justify them in reference to alternative principles. In turn, if actor P1 articulates
an existential test that draws on principles of an alternative order, actor Px may
decide to deny the applicability of this order or similarly draw on order(s) that
are neglected in the status quo. Compared with the contestation or affirmation of
moral hierarchization within the existing order(s), the attempt to apply different
evaluative criteria is usually more difficult and costly. An established justification
order ‘constitutes an entire lifeworld and hence leads to stable habits of action
and perception’ (Honneth 2010, 386). It also presupposes a specific community
that recognizes the moral claim. If the appeal to an alternative order is consid-
ered a sufficient strategy and succeeds, this readiness to accept suggests a deeper
entrenchment of the order in the field.
A substantial alteration of the universe of possible and legitimate policy options
is captured particularly well by the concept of securitization. It outlines
how actors conceptualize a referent object as existentially threatened through a
‘securitizing move’ (Buzan et al. 1998, 25) and justify extreme measures. In and
beyond data governance, IR scholars have employed Bourdieusian frameworks to
outline securitization processes (e.g. Bigo 2014; Huysmans 2006). Securitization
may be based on strategic action or incremental processes of structural or nor-
mative change (Williams 2003, 521). However, securitization does not capture the
plurality of common goods in data governance. The exclusive focus on governance
efforts to respond to existential threats risks ignoring other underlying reference
goods and evaluative mechanisms. For example, in data governance, the norma-
tive discourse is to a significant extent shaped by the importance of the free flow
of information, innovation, and economic progress through the data economy
or the importance of protecting the privacy norms of a specific reference com-
munity. I suggest that securitization is one specific mechanism through which a
situation or issue is tied to particular evaluative principles. This, however, does
not preclude the possibility of similar dynamics with regard to alternative
governance objectives. Therefore, I pose the plurality of conceptions of the common
good as an empirical question. The choice of responses may be based on strate-
gic or normative reasons. The effect, in contrast, depends on the capabilities of
the actors involved to create normative fit. Fligstein and McAdam have outlined
this capacity as ‘social skill’, that is, the ‘ability to induce cooperation by appealing
to and helping to create shared meanings and collective identities’ (2012, 46; see
also Adler-Nissen and Pouliot 2014). Thus, actors need to draw on higher com-
mon principles that most likely are already recognized in the field and attempt to
formulate their jurisdictional claim on the basis of these principles and in rela-
tion to the specific audience.¹¹ Even if actors do not exclusively rely on capital or
their position in the field, the available resources mediate the likelihood of actors’
success.
In sum, actors may depend completely on their position in the field or engage
in justificatory practices. The differences between these response types will not
always be clear-cut, but they offer an analytical distinction according to the
potential for normative change, which increases with the radicality of the
normative challenge. The absence of justifications has a very low potential to
produce normative change, while justifications based on alternative principles have
the highest potential. However, these scenarios may result in a diversity of out-
comes, which I outline in Section 2.4.3. Since actors attempt to appeal to multiple
audiences and/or diversify their normative resonance, it is also unlikely that they
invoke justifications that are limited to only one value order (Hanrieder 2016,
412). Instead, actors draw on constellations of different orders. In Section 2.4.3,
I examine this process in more detail.

2.4.3 Outcome: Cooperating for the Common Good?—Or beside It

In Section 2.4.2, I outlined the different ways in which actors may respond to the
emergence of a jurisdictional conflict. This section more explicitly focuses on the
outcome of the conflict through the attempt to find a temporary stabilization or
agreement, distinguishing between three types of outcomes. These outcomes may
be institutionalized to varying degrees. An outcome may consist of a minor or
major reform of the existing institutional order, for example through the revi-
sion or abolishment of a bilateral agreement, or it may lead to the emergence of
new structures such as laws or treaties. However, if the underlying fundamental

¹¹ In contrast to Habermasian accounts, justification in my understanding does not foreground the
'force of the better argument' (Habermas 1984) but stresses the sociological relevance, i.e. the various
justificatory strategies actors employ when considering the audience’s characteristics.
normative principles remain unstable in their combination, a challenge through
a renewed test is likely to emerge at some point. Below, I outline three types of
normative outcomes.
First, actors may strive to satisfy their private interests rather than the com-
mon good. Thus, under specific conditions, there is the possibility of a ‘private
arrangement’ (Boltanski and Thévenot 2006, 336). This describes ‘a contingent
agreement between two parties that refers to their mutual satisfaction rather than
to a general good’ (Boltanski and Thévenot 2006, 336). Such an arrangement
allows, potentially also in the context of secrecy, the discounting of universal
goods and an agreement that benefits both parties without reference to the
common good. In that sense, actors recognize an interest that is not generalizable
or at least not legitimately sustainable. In consequence, the potential for
normative change is low. The same may happen if one side does not follow the
imperative of justification, which may be based on a relativistic agreement that
the dispute or the pursuit of the common good does not matter or on the exercise
of arbitrary power (Boltanski and Thévenot 2006, 339).¹²
Second, one side may normatively prevail. This may take two forms. Either
actors fully recognize the attempt to establish a different moral hierarchy within
the prevalent orders or they accept the applicability of a new justification order,
for example through securitization. This implies that either the challenge by actor
P1 or the responses by another actor Px successfully appealed to shared under-
standings and normative principles and were able to either impose their preferred
hierarchization within the existing order or alternative principles on the situation.
Third, actors may compromise ‘for the common good’ (Boltanski and Thévenot
2006, 277). In the resolution of the conflict, a compromise may yield particularly
balanced and sustainable results, as multiple concerns are satisfied. This process
requires ‘that actors mobilize narratives and devices in a way that creates a fit
between evaluative repertoires and contested situations’ (Hanrieder 2016, 414).
Therefore, rather than a natural outcome of a conflict based on predispositions,
compromises constitute processes of active construction. While this interactive
dimension is most explicit in the creation of new compromises, even if one side
prevails, this may involve the continual re-enactment of existing compromises.
The construction of compromise, as Boltanski and Thévenot put it, is premised
on an implicit ‘equivalence’ (2006, 277) between arguments from different orders.
It presupposes a ‘principle that can take judgments based on objects stemming
from different worlds and make them compatible’ (2006, 278). They suggest that
tensions between different orders may be stabilized by drawing on ‘composite
objects’ (2006, 279), that is, objects, beings, or principles that contain features

¹² Boltanski (2012) has also identified other grammars of dispute resolution, such as love and vio-
lence. Nevertheless, as the book’s focus is on conflicts between globally prominent entities where such
an arrangement is less likely to be considered legitimate in the long run, I do not consider them here.
of these compromises. For example, a ‘user’ may unite traits of a ‘customer’ and
an ‘individual’ and thus represents a worthy reference community in the pur-
suit of economic progress and individual rights. The more ambiguous and open
these principles and objects are, the more likely it is that they sustain compro-
mises, as they may enable a variety of interpretations that can satisfy multiple
concerns at once. As soon as compromises are translated into tangible political
action, disagreement may re-emerge. Thus, any unexpected tilt towards a spe-
cific notion of the common good may lead to the dissolution of the compromise,
because ‘an exploration of the grounds for agreement shows the compromise up as
a simple assemblage without any foundation at all’ (Boltanski and Thévenot 2006,
336). By recognizing the implicit malleability of the reference points, the analyt-
ical framework takes into account that the process of compromising opens new
indeterminacies but may also entrench certain principles in the field.
In summary, the potential for normative change is low when actors settle on private
arrangements to satisfy private rather than public goods. If one side normatively prevails,
the normative quality of the challenge or response is decisive. A revision of the
existing hierarchy will have a lesser potential for normative change than the suc-
cessful imposition of an alternative value order. Compromises may have varying
implications for normative change, depending on the normative proximity of the
justificatory principles. The outcomes of conflicts may also have consequences
beyond the specific instance of conflict. For instance, if the outcome of the con-
flict consists in a renewed emphasis on the order of fairness, this may inspire other
actors to challenge the dominant order in a related area. The outcomes of con-
flict resolution processes may be of a more permanent or temporary character,
depending on whether they are able to create a normative fit with the underlying
institutional and normative logics in the field, that is, the constellations of value
orders and the distribution of resources and positions in the field.

2.4.4 Normative Visions in Data Governance

The central claim of this book is that conflicts in data governance must be under-
stood in their multidimensional quality as normative, institutional, power-related,
and legal conflicts. I have argued that in particular the normative principles actors
draw on in their resolution processes to justify their jurisdictional claims are
central but understudied. When actors fail to address normative differences in
the resolution of the conflict, agreements are particularly vulnerable to disrup-
tion. Sections 2.4.1–2.4.3 outlined how these value orders are organized around
the field’s common illusio and link higher common principles, specific reference
communities, and evaluative mechanisms. In this section, I establish the part of
the framework that zooms out of the specific conflict situations to reconstruct
how justificatory practices are used across conflicts. As I have illustrated above,
the process of conflict resolution may involve the attempt to refine or impose
a specific value order or may be based on newly created compromises between
orders. The reference to dissimilar principles requires trade-offs, which may foster
the instability of compromises. However, if these trade-offs become normalized,
the justificatory space is likely to change significantly. This potential normaliza-
tion of emerging, restructured, or combined orders also again demonstrates that,
far from being fixed and clearly demarcated, value orders are in flux. They are con-
sistently constructed, combined, and revised throughout the entire conflict. While
conceptualized as sticky and fluid, different value orders may emerge over time,
either because of compromises that become normalized or because of new prefer-
ences or grievances. Thus, the evolution and combination of different orders may
contribute to the constitution of new ordering principles (see also, e.g., Boltan-
ski and Chiapello 2005). When actors engage in justificatory practices, they not
only provide reasons for their jurisdictional claims in the sense of higher common
principles or constituency communities but also propose specific actions. There-
fore, while the value orders framework outlines general guiding principles, this
part of the theoretical framework tries to illustrate how they are applied in juris-
dictional conflicts. More specifically, I aim to outline how they are combined and
linked to specific calls for action, for example whether an emphasis on the order
of sovereignty is frequently linked to a call for stricter data protection rules. These
links produce more comprehensive visions of data governance that consist of a
justificatory basis and reference community based on the value orders and a cor-
responding conceptualization of the object of governance, as well as specific policy
preferences. These visions are relevant for the conflict resolution process because
they have the potential to redesign the normative structure of the field through
the normalization of trade-offs. The normative structure demarcates the space of
possibles for further contestation or dissent, which may delimit and restrict the
opportunities for disruption.
These visions offer the opportunity to sketch how the field of data governance
relates to its surroundings. As I outlined above, fields are always interlinked with
other fields. While even fully settled fields are rarely completely autonomous (Flig-
stein and McAdam 2011, 8), the field of data governance, due to the pervasiveness
of data in several sectors, has even stronger interlinkages. These interlinkages are
likely to create constraints and opportunities for actor justifications. Established
normative foundations in neighbouring fields or in global governance negotia-
tions are likely to provide reference points. Thus, for the final part of the analysis, I
compare how distinct visions of data governance relate to more general normative
underpinnings and objectives of governance. For contextualization, I draw selec-
tively on the literature on globalization ideologies (de Wilde 2019; Steger 2013;
Zürn and de Wilde 2016). The analysis of political ideologies in IR has so far largely
disregarded data or internet governance in favour of a focus on, for example, bor-
ders and migration (de Wilde et al. 2019), climate change (de Wilde 2019), or the
THEORIZING THE RESOLUTION OF JURISDICTIONAL CONFLICTS 39

general allocation of authority (Steger 2013). I compare the identified visions of
data governance with the globalization ideologies identified by de Wilde’s (2019)
analysis across issue areas. I provide limited contextualization on how this com-
pares with the bones of contention and community conceptions in global political
ideologies in Chapter 9.
In other words, I compare how actors use distinct value orders in their justifi-
catory practices of conflict resolution processes across cases. I focus particularly
on how higher common principles and value orders are employed across juris-
dictional conflicts. By comparing the extent to which actors draw on or depart from
globalization ideologies, it is possible not only to engage with the debate on
the uniqueness of the internet as a space of social scientific inquiry (Choucri
2012) but also to point to potential ways of enriching these categories in further
research.
In conclusion, I investigate the resolution of jurisdictional conflicts to address
the puzzling re-emergence and stabilization patterns that contribute to the uneven
trajectory of the field of data governance. The bifocal approach of the theoret-
ical framework links specific justificatory practices in jurisdictional conflicts to
the field as a space of interaction. In their attempt to shape evaluative criteria,
actors are expected to define morally acceptable and deficient jurisdictional claims
according to their field-constrained perception of value orders. However, the field
constrains not only the exercise of justifications per se but also shapes what actors
perceive as possible responses. The ‘space of possibles’ (Bourdieu 1996, 234–9)
determines how and to what extent actors are constrained in the formulation of
their justifications, that is, what value orders are thinkable and sayable. When there
is a shift in the constellations of orders, this space of possibles may change accord-
ingly. As conceptualizations of worth change, the goals of political action and
the reference community shift as well. When normative challenges are profound,
even subordinate actors are often able to severely disrupt agreements against the
interests of more powerful actors. However, actors often fail to transform these
challenges into manifest normative or institutional change in the long term. There-
fore, the success of actors’ claims depends on a diversity of factors, including
their position in the field, competences, and social skills as well as resonance
with the imagined reference community. Distinct normative visions represent last-
ing compromises between value orders and express calls for actions. Together,
these accounts show how data governance as a field is shaped by conflict and
continuity.
I have so far illustrated the what (jurisdictional conflicts), where (the field of
data governance), and how (justifications drawing on value orders) of the theo-
retical framework I use to analyse the conflicts in the empirical Chapters 4–8. In
the remainder of the chapter, I illustrate how this approach is translated into an
empirical examination of conflicts in data governance and present the research
design of this project.

2.5 Research Design and Method

As outlined above, this book aims to understand how actors resolve jurisdictional
conflicts in data governance. I focus on the responses of specific actors, the stabi-
lization of agreement in conflict resolution processes, and existing interlinkages
with the broader evolution of normative and institutional logics of the field. The
understudied character of jurisdictional conflicts in data governance demands a
thorough understanding of the phenomenon and in-depth analysis. Hence, this
book aims to answer ‘how possible’ questions that focus on the production of
meanings that create and restrict the space of possibilities (Doty 1993, 279–99)
rather than assuming specific causal or unidirectional mechanisms. The under-
lying methodological foundation of the project is interpretivist. Interpretivism
highlights the significance of meaning-making processes. Language is perceived
not only as representing the social world but as actively shaping it (Wittgen-
stein 2001). I additionally draw on methodological insights from pragmatism.
Pragmatist-inspired approaches offer a reflexive understanding of research that
recognizes its character as a social practice
(Friedrichs and Kratochwil 2009, 711). By addressing inherent bias and unpack-
ing fixed concepts, I try to incorporate this reflexivity into my own research. The
reconstructive impetus of the project is similarly guided by pragmatist and prac-
tice theoretical research that highlights the historical situatedness of meaning. The
project starts reasoning at an intermediate level (abduction), which aims ‘to enable
orientation in a relevant field. It consists of mapping a class of phenomena to
increase cognitive understanding and/or practical manipulability’ (Friedrichs and
Kratochwil 2009, 716). A significant part of the empirical research is inductive, but
I consistently draw on existing literature for orientation and aim to provide such
orientation for further studies.
The book analyses the broader trajectory of data governance through the key
controversies and their resolution. The research links long-term institutional and
normative processes with specific instances of conflicts and justificatory practices.
This bifocal approach requires a research design that can capture both dynamics.
While neither field theory nor the value orders framework constitutes a distinct
method, they provide guidance for the three-step research design of the book:
First, I reconstruct the emergence of the field of data governance and the value
orders that are prominent in the field. Second, drawing on this analytical frame-
work, I analyse conflict resolution processes in five cases which represent examples
of jurisdictional conflicts as specific empirical phenomena (Gerring 2004, 342).
Third, I identify how actors draw on value orders across conflicts. I construct a
typology of distinct visions of data governance that can provide orientation for
further (deductive) research.
The analytical focus adopted in this book transcends common binaries, such
as local/global or public/private, to highlight interlinkages between them. When I
distinguish actors along these lines, I refer mainly to the sociological relevance
of such distinctions rather than to their conceptual distinctiveness. In addition,
when I speak of the preferences of specific institutions or actors, this is not meant to
reduce the entire political system of the respective states or institutions to a unitary
position.¹³ This also applies to companies. While I highlight intra-institutional
dynamics, for example in the EU, I treat multinational companies as unitary
actors, despite their complex intra-organizational dynamics. I refer to Google, rather
than Google Inc. or the more recent Google LLC, which is part of the holding com-
pany Alphabet, and distinguish only in cases where the distinction is relevant for
the understanding of the conflict, for example regarding Google Spain and Google
Inc. in Chapter 8.

2.5.1 The Field of Global Data Governance and Value Orders

The first step in the project is the analysis of the genesis of the field of data gov-
ernance, focusing particularly on the construction of a common 'illusio' (Bourdieu
1996, 376), and the ‘orders of worth’ (Boltanski and Thévenot 2006) in the field.
The reconstruction of the long-term development of the field illuminates how
principles of vision and division become entrenched and institutionalized over
time (Mérand 2010). I start with a brief historical account of the emergence of the
field of data governance and then reconstruct the value orders. While Boltanski
and Thévenot use canonical texts of Western philosophy in the reconstruction,
they acknowledge that the orders are likely to vary in different contexts (2006,
347). Thus, I rely on a largely inductive approach and embed the identified orders
in political philosophy debates ex post. This also addresses a potential lack of atten-
tion given to the contextual nature of these orders. Using the software MAXQDA
(maxqda.com), I conducted a discourse-analytical reading of 25 key documents
(see Table 2.1) with 1,136 individual codings. The approach underlines the con-
stitutive functions of discourse as well as the importance of context (Milliken
1999), outlining how actors ‘are engaged in the politics of knowledge and know-
ing, that is, in meaning/world making’ (Keller 2018, 17). By tracing how actors
make data knowable, it also outlines adequate and inadequate approaches to their
governance.
I triangulated the results of the analysis with information from qualitative
expert interviews.¹⁴ The selected key texts represent crucial moments for the

¹³ In this book, I have recourse to simplifications, like EU and US, when actors are authorized to
speak on behalf of these entities. While, particularly for the EU, I demonstrate that this does not exclude
the possibility of inter-institutional differences, this level of nuance for other parties extends beyond
the scope of this analysis.
¹⁴ The MAXQDA file, including all coded documents and segments, is available digitally for ref-
erence purposes. I chose MAXQDA because it is well established among qualitative researchers and
offers a free reader for the data and coding structure, which enhances transparency.

Table 2.1 Key texts for the constitution of the field

Stakeholder Group | Organization | Title | Year | Region
International Organizations | OECD | OECD Guidelines | 1980 | Global
International Organizations | OECD | Updated OECD Guidelines | 2013 | Global
International Organizations | Council of Europe (CoE) | Convention 108 | 1981 | Europe, but open for global accession
International Organizations | Council of Europe (CoE) | Convention 108+ | 2013 | Europe, but open for global accession
International Organizations | UN | UN Guidelines for the Regulation of Personal Data Files | 1990 | Global
International Organizations | UN | UNGA Resolution 68/167 The right to privacy in the digital age | 2014 | Global
International Organizations | UN | UNSC Resolution 2396 | 2017 | Global
International Organizations | EU | 1995 Directive | 1995 | Europe
International Organizations | EU | GDPR | 2018 | Europe
International Organizations | EU, US | Safe Harbour Privacy Principles | 2001 | Europe, North America
International Organizations | EU, US | Privacy Shield | 2016 | Europe, North America
International Organizations | EU, US | EU–US agreement on personal data protection (Umbrella Agreement) | 2016 | Europe, North America
International Organizations | OAS | Comprehensive Inter-American Cybersecurity Strategy | 2004 | North and South America
International Organizations | APEC | APEC Privacy Framework | 2005 | Asia, Pacific, Oceania
International Organizations | ECOWAS | Supplementary Act A/SA.1/01/10 on Personal Data Protection within ECOWAS | 2010 | Africa
International Organizations | AU | African Union Convention on Cybersecurity and Personal Data Protection | 2014 | Africa
International Organizations | ASEAN | Framework on Personal Data Protection | 2016 | Asia
International Organizations | G7 | Joint Declaration by G7 ICT Ministers | 2016 | Global
International Organizations | G20 | Digital Economy Development and Cooperation Initiative | 2016 | Global
Private sector | Internet Association | Internet Association Principles | 2017 | Global
Private sector | Private sector coalitionᵃ | Cybersecurity Tech Accord | 2018 | Global
Civil Society | ICDPPC | Montreux Declaration | 2005 | Global
Civil Society | The Public Voice Coalition | Madrid Declaration | 2009 | Global
Civil Society | Article 19 | The Global Principles on Protection of Freedom of Expression and Privacy | 2017 | Global
Multistakeholder | Global Network Initiativeᵇ | GNI Principles on Freedom of Expression and Privacy | 2017 | Global

ᵃ The Cybersecurity Tech Accord (2018) is a declaration of principles by more than sixty global companies.
ᵇ The GNI is one of the most important multistakeholder initiatives in internet governance. For more information, see https://globalnetworkinitiative.org, accessed 26 June 2022.

constitution and further formation of the field of data governance and cover dif-
ferent regions, sectors, and time periods.¹⁵ I expect these foundational documents
to refer to their governance objectives in a general fashion, which facilitates the
reconstruction of normative dimensions. I assume that organizations draw on
understandings of worth eclectically. The document selection was mainly based
on secondary literature (Greenleaf 2011, 2014b; Kuner 2013, 189) and policy doc-
uments (UNCTAD 2016) that list important principles of data governance. Where
no specific guidelines or agreements existed, I included principles on related sub-
jects such as cybersecurity which often have data governance elements. Europe
and the industrialized West are over-represented. This reflects the position of
policy entrepreneurs and most influential actors (e.g. Suda 2013, 2017) and thus
existing power asymmetries in the field. To avoid over-representation of specific
subject areas, the selected documents address data governance as a broader issue.
Therefore, international Passenger Name Record sharing agreements, for example,
were excluded, while more general agreements on cooperation in law enforcement
and counterterrorism, such as the UN Security Council (UNSC) Resolution on
Counterterrorism (UNSC 2017), were included.
I coded largely inductively and developed and adapted the coding scheme (see
Appendix 2) through multiple iterations, clustering codes and coded segments
according to underlying conceptions of worth. On the basis of the codings, I
distinguished each order according to the different characteristics and outlined
their historical emergence and institutionalization. I also investigated parallels

¹⁵ The documents do not represent an equal distribution over time periods but were selected
according to their significance for the formation of the field.
with principles articulated in political philosophy. This approach allows for a
sociological understanding of orders that may evolve over time or change accord-
ing to different contexts rather than assuming pre-existing fixed structures. The
analysis identified five value orders (security, fairness, production, sovereignty,
and globalism), as I demonstrate in Chapter 3 (see Tables 3.1 and 3.2). For a
statement like ‘RECOGNISING the importance in strengthening personal data
protection with a view to contributing to the promotion and growth of trade
and flow of information within and among ASEAN [Association of Southeast
Asian Nations] Member States in the digital economy’ (ASEAN 2016, 1; empha-
sis in original), I coded the highlighted principles (promotion of trade, free flow
of data/information, harmonized standards) and the underlying common goods
(production and globalism). Subsequently, I assigned or added the codes to the
relevant value order. There was no limit to the number of codes given to one state-
ment. To validate my findings, I consulted secondary literature, such as Epstein
et al. (2014), who analyse framings of privacy at the Internet Governance Forum
(IGF) between 2006 and 2011, or Smith et al. (2011), who review secondary liter-
ature on privacy. I also included more conceptual debates, including the work by
Solove (2002) or the edited volume by Floridi (2016), that discuss different per-
spectives on data protection and privacy. For validation, I compared the principles
with existing reconstructions of orders by Boltanski and Thévenot (2006; see also
Boltanski and Chiapello 2005; Thévenot et al. 2000) and Hanrieder (2016). As the
first three orders showed significant overlap with three of the four orders Han-
rieder identified in her reconstruction of global health, I adapted her terms for
easier cross-comparison.¹⁶
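The multi-coding logic described here can be sketched in a few lines. The code-to-order mapping below is a toy reconstruction for illustration only; it does not reproduce the book's actual coding scheme, which is documented in Appendix 2 and the MAXQDA file.

```python
# Toy sketch of the multi-coding logic: one statement may carry several
# codes, and each code is clustered under one or more value orders.
# The mapping below is invented for illustration, loosely following the
# ASEAN (2016) example quoted in the text.
CODE_TO_ORDERS = {
    "promotion of trade": ["production"],
    "free flow of data/information": ["globalism"],
    "harmonized standards": ["globalism"],
}

def orders_for_statement(codes):
    """Return the set of value orders a coded statement appeals to."""
    orders = set()
    for code in codes:
        orders.update(CODE_TO_ORDERS.get(code, []))
    return orders

# The ASEAN statement quoted above carries three codes:
codes = ["promotion of trade", "free flow of data/information",
         "harmonized standards"]
```

For the quoted ASEAN statement, `orders_for_statement(codes)` yields the two underlying common goods named in the text, production and globalism; because there is no limit to the number of codes per statement, a single statement can appeal to several orders at once.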
After the reconstruction, I linked the identified orders to debates in politi-
cal philosophy. Value orders are assumed to illustrate how actors relate to the
broader moral objectives of their action. Exploring these normative underpin-
nings through more general debates on conceptions of justice, humanity, and
the purposes of governance highlights the various ways in which these debates
shape categories and underlying assumptions (Herzog 2013, 6). This allows a
more reflexive and precise articulation of principles and moral hierarchies. There
are limitations to this approach, because I focus on predominantly Western male
political philosophers. This, however, reflects their empirical relevance as a ref-
erence point in the still Western-dominated field of data governance. In addition,
I illustrate extremes on dimensions that many non-Western contributions try to
problematize, such as the global–local distinction. While the description of stable

¹⁶ Boltanski and Thévenot (2006) define six orders: market, industrial, civic, domestic, inspired, and
fame, while Hanrieder (2016) identifies the order of survival, the order of fairness, the order of produc-
tion, and the order of spirit. I changed the term survival to security because data governance offers less
immediate links to death and survival but has strong security dimensions. The term fairness reaches
beyond requests to, for example, address bias in algorithmic governance and speaks to questions of
social justice (see Hoffmann 2019 for a more comprehensive discussion).
principles of worth might contrast with the focus on the continuous negotiation
of such orders, I argue that there is significant analytical value in the systematic
clustering of justifications, which I will outline in Section 2.5.2.

2.5.2 Zooming in on Conflicts

In the second step of the research, I apply this coding scheme in the analysis
of five case studies. The analysis aims to identify processes of meaning-making
in conflictual situations which involve different actors that draw on background
knowledge, evaluative schemes, and their moral sense of justice to resolve these
situations. I analyse these conflicts through qualitative interviews, interpretive
document analysis, and a contextual embedding.

2.5.3 Case Selection

Both formally and normatively, the interaction of European institutions, the US,
and private companies has significantly shaped the field of data governance (Far-
rell and Newman 2019). The EU in particular has been described as a norm
entrepreneur (Newman 2008) and as a ‘global regulatory hegemon’ (Bradford
2020, xvii) in data protection, while the US is home to many of the most significant
tech companies. Together, they have entrenched the use of data in security and
counterterrorism practices (de Goede 2008). Both entities can arguably be con-
sidered global authorities that significantly shape the evolution of the field. While
other actors, most significantly China (Erie and Streinz 2022), are important,
access to reliable information about them is notably more difficult because of
language barriers and limited data accessibility. The highly intergovernmental and
often informal character of data governance, outlined in more detail in Section 2.5.4,
restricts access to reliable data. Academic research on less liberal countries like
China is even more difficult, as restrictions on free speech limit access to public
justifications. Other countries, such as India (Burman 2020) and Brazil (Doneda
and Mendes 2014), have lively debates about data protection but were excluded
due to their current lack of formal and public inter- or transnational activities and
conflicts (but see Chapter 9).
This book analyses five transatlantic jurisdictional conflicts that have a formal
legal dimension (i.e., a court case or negotiations about a potential conflict of laws)
and involve private companies to varying degrees (Table 2.2). The universe of
jurisdictional conflicts in data governance is small, which also narrows the options
for case selection. There is another conflict concerning the publication of personal
data on the WHOIS database by the Internet Corporation for Assigned Names
and Numbers (ICANN 2017), which is comparatively well researched due to its

Table 2.2 Selected cases

Case | Conflict parties | Issue area | Outcome
Safe Harbour | EU citizen, US | Cross-border commercial data transfers | Revision of institutional structure, ongoing
Passenger Name Record | EU, US | Counterterrorism data sharing of air travel passenger data | Formalization and expansion of existing practices, limited revisions
Terrorist Finance Tracking Program | EU, US | Counterterrorism data sharing of financial data | Formalization of existing practices, limited revisions
Electronic evidence | Microsoft, US, EC | Law enforcement access to private company data | Formalization of existing practices, ongoing
Right to be forgotten | EU, Google | Delisting/removal of data from search engine results | New institutional structure, limited changes in second phase

embedding in the broader internet governance context (Kulesza 2018; Mueller and
Chango 2008). Together with this dispute, these cases, to the best of my knowledge,
represent the only instances of jurisdictional conflict that have received signifi-
cant public attention (see Chapter 9 for potential conflicts). I selected these cases
because they bring to the fore underlying substantive or procedural differences
but at the same time show the temporary stabilization of agreement. In addition,
the chosen examples exhibit important typical features, such as the multifari-
ous character of public–private relationships. These conflicts touch upon the most
important areas of data governance, including commercial, counterterrorism, and
law enforcement data processing and regulation, as well as mixed forms. They
cover different time spans and periods, various substantial topics, and several
types of challenges. Showing how these cases relate to each other and are subject to
similar normative patterns offers significant insight into the evolution of the field
at large. Therefore, the insights generated from this book provide a framework for
helping to understand conflicts in other jurisdictions or related areas, or conflicts
that are yet to come.

2.5.4 Data and Method

For the analysis of justification strategies and interrelations in the field, I used
a combination of textual analysis and qualitative interviews. There is limited
direct access to documents produced by primary actors, such as autobiographies,
handbooks, or other artefacts,¹⁷ and limited opportunities for extensive field
research prevented participant observation. I attended several policy
conferences, conducted qualitative interviews, and performed an interpretive
document analysis of 157 documents. Secondary literature provided the entry
points for the in-depth analysis of cases. Such a combination of document analysis
and interviews combines official justifications and more informal insights (Pouliot
2007).
The policy conferences I attended included local, regional, and international
conferences on different issues. Some had a narrower focus, such as the Octo-
pus Conference organized by the Council of Europe on electronic evidence; some
were related to data governance, such as the Computers, Privacy and Data Protec-
tion Conference; and some covered internet governance more generally, such as
the Internet Governance Forum or the World Summit on the Information Society.
These conferences are usually multistakeholder conferences, that is, in addition to
government and international organization (IO) representatives, they also allow
participation by academics, private actors, or NGOs. They provide an opportu-
nity to observe how actors engage with each other and how they attempt to prove
worth, and to witness who engages with whom in formal and informal settings. In
contrast, the intergovernmental meetings of officials from the EU and the US that
are most significant for many of these conflicts are highly exclusive. Thus, I rely
on interviews to gain insights on these meetings.
I conducted twenty-one semi-structured, problem-centred qualitative inter-
views with representatives from the public and private sectors between December
2017 and November 2019. Qualitative interviews provide only ex post rational-
izations of actor behaviour (Bueger 2014, 400), but they make available accounts
of meaning-making processes and stratification that are not visible from textual
sources. I conducted interviews with experts who either directly participated in
negotiations, such as DPAs, governmental, or EU representatives, or who have sig-
nificant insights or knowledge about these conflicts, such as private sector or NGO
representatives. I mainly tried to gain insights regarding potential power and insti-
tutional dynamics behind justificatory practices but also specifically asked about
motivations to engage in the field. Interviews focused on interviewees’ percep-
tion of conflicts and interrelations between actors, particularly in negotiations,
and their observations concerning the (normative) trajectory of the field of data
governance. It is important to note the strict non-disclosure agreements that
bind, for example, private company employees but are also prevalent in public
review procedures; violating them exposes individuals to criminal and/or civil
sanctions (EC 2012b, 3–4). Therefore, most interviewees preferred to stay anonymous.
A list of interviews can be found in Appendix 1. I recorded interviews based on the

¹⁷ Some examples include Kirby (2011) who describes the negotiation of the 1980 OECD Guidelines
as well as a recent autobiography by Edward Snowden (2019).
preferences of individual interviewees. Most interviews lasted between forty-five
and sixty minutes; the shortest interview lasted thirty minutes and the longest
two hours and forty-five minutes. Since the focus of the book is on justificatory
practices that capture how actors make representative claims, these interviews
mainly provided background information to triangulate the results of the textual
analysis.
Publicly available documents constituted the main data source for this book. To
identify and categorize justificatory practices in conflicts, I conducted a discourse-
analytical reading of different types of documents, including legal proceedings
and submissions, press releases, policy documents, and company blog entries.
I triangulated the information with documents leaked to the press, particularly
emails by the European Commission and internal ministry reports, for example
via Wikileaks (wikileaks.org). In addition, I focused on media reports where data
was scarce. Overall, the sample included over 400 documents. I did not code all
documents but relied on a sample of important documents based on intertextual
references until I reached saturation in each of the case studies. For the case stud-
ies, I coded 157 documents.¹⁸ Again, I used the software MAXQDA for coding. The
coding focused on the inductively derived notions of (a) the reference community,
(b) the conceptualization of data, (c) virtues and principles associated with the
common good in question, (d) the perception of threat and dystopian scenarios,
and (e) objects and principles that prove worth or deficiency. These distinctions
are not always clear-cut but served as helpful analytical categories for clustering.
Only statements that corresponded to an expression of worth were coded, while
restatements of facts or laws were not coded unless they were employed to prove
worth or deficiency. For example, in connection to a specific call for action or
claim, a reference to domestic case law was coded as proving worth within the
order of sovereignty, while a restatement of the legal situation was not coded. I
repeated the analysis with the final coding scheme where necessary. This process
also enhanced intra-coder reliability. Differences in the codings were marginal
and contributed to further streamlining of the coding scheme, but the categoriza-
tions according to the specific value orders remained constant. As jurisdictional
struggles are not reducible to language conflicts but involve the relations between
objects and actors (Boltanski and Thévenot 2006, 131), I specifically situated these
justifications in the broader context of the conflict and in relation to the actor
positions in the field.
It is important to note shortcomings of this approach regarding the chosen
method, the sample size, and inherent bias. Regarding the method, the analysis
of justificatory practices is naturally limited to the representations given by the
involved actors themselves. Justificatory practices are carefully curated,

¹⁸ As mentioned above, the MAXQDA file, including all coded documents and segments, is available
digitally for reference purposes.
particularly considering their potential contribution to the common good. While
I try to address these limitations by triangulating the information derived from
the interpretative analysis through interviews and leaked documents, this is unlikely
to disclose the full extent of power dynamics. However, since I mainly focus
on the interactive dimension of negotiating the common good in conflict reso-
lution processes, references to such higher principles presuppose some kind of
intersubjective understanding of their existence.
The material does not capture justificatory practices beyond a narrow commu-
nity of experts, which highlights the need for further research that draws
more extensively on, for example, news sources and opinion polls. Regarding
sample size and bias, privacy advocates and data protection officials granted
interview requests more often than private sector representatives or public sector
officials did. In addition, the European perspective is more strongly represented,
mostly due to data availability but also due to my personal background and
location. The EU publishes more extensively on data governance-related issues.
Interview partners from Europe were, despite short field trips to Washington, DC,
and California, more readily available. This creates selection bias. In addition,
the analysis is naturally impacted by my perspective on the topic. My personal
and cultural background has shaped my view on data protection and privacy,
which I consider fundamental rights. To address potential bias, I continually
re-evaluated available data and included diverse perspectives. In interviews, I tried
to avoid leading questions or any wording that could introduce bias, for example
by using the term 'data governance' rather than 'data protection' or 'privacy' and
by excluding references to normatively loaded terms like 'surveillance'.

2.5.5 Analysing Normative Visions of Data Governance

In the third step of the research, I make limited comparisons between the cases and
analyse how actors use justificatory practices across cases. In Chapter 9, I create a
typology of ‘visions’ of data governance that express specific social realities actors
draw on in their justifications.
For the analysis of broader patterns of justification, I examine how concepts
from different orders potentially form groupings that illustrate along what lines
worth is articulated in situations of jurisdictional conflict. Some codes already
encompass principles from different value orders as ‘composite objects’ (Boltan-
ski and Thévenot 2006, 279). For example, ‘cross-border security cooperation’ or
‘protection of liberal values through security measures’ appeals to both the order
of security and the order of globalism. These codes were added to both orders
to indicate their complex normative nature. To describe and analyse normative
ordering processes more systematically, I also investigated code co-occurrences
in the empirical case studies. Code co-occurrence in MAXQDA describes the
simultaneous occurrence of two or more codes for a particular segment and thus
demonstrates to what extent actors simultaneously appeal to distinct orders. The
analysis is based on 6,381 codings from the analysis of the case studies. Coded
expressions vary in length and may comprise single sentences but also entire paragraphs,
depending on what is interpreted as one justificatory expression. As actors
are expected to make general references to all or most orders within a document,
I chose to focus solely on intersections between codes, that is, coded segments
needed to directly overlap, to illustrate what trade-offs are considered legitimate
within one justificatory expression. The varying length of statements and the
qualitative nature of the inquiry point to the limitations of this approach.¹⁹
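The overlap criterion described above can be sketched in a few lines of code. This is an illustrative reconstruction only, not MAXQDA's actual computation; the segment representation as (code, start, end) character offsets and the function name `co_occurrences` are assumptions made for the sake of the example.

```python
from itertools import combinations
from collections import Counter

def co_occurrences(segments):
    """Count pairs of value-order codes whose coded segments directly overlap.

    `segments` is a list of (code, start, end) tuples with character offsets.
    Two codes co-occur only if their spans intersect, mirroring the direct-
    overlap criterion described in the text (mere presence of two codes in
    the same document is not enough).
    """
    pairs = Counter()
    for (c1, s1, e1), (c2, s2, e2) in combinations(segments, 2):
        # Intervals [s1, e1) and [s2, e2) intersect iff each starts
        # before the other ends; identical codes are ignored.
        if c1 != c2 and s1 < e2 and s2 < e1:
            pairs[frozenset((c1, c2))] += 1
    return pairs

segments = [
    ("security", 0, 120),
    ("globalism", 80, 200),   # overlaps the 'security' segment
    ("fairness", 250, 300),   # overlaps neither
]
print(co_occurrences(segments))
```

On this toy input, only the security/globalism pair is counted, illustrating why co-occurrence counts depend heavily on how long the coded segments are.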
The visualization of co-occurrence offers only a superficial insight into potential
linkages between different value orders. It indicates how actors link specific value
orders in their justificatory practices. I combined this analysis with insights from
the qualitative case studies to create a typology of distinct visions of data gover-
nance. Due to the small number of cases, the visions of data governance represent
preliminary ideal types in need of further scrutiny.

2.6 Conclusion

In this chapter, I have highlighted the mutually constitutive relationship between
the field of data governance and jurisdictional conflicts. These conflicts are
constituted by clashing jurisdictional claims, that is, claims of legitimate control over a
certain issue. In the articulation of such claims, actors attempt to justify their par-
ticular jurisdictional claims through references to higher common principles or
value orders. When faced with specific conflict situations, they draw on and create
distinct visions of data and data governance. I have suggested that jurisdictional
conflicts must be understood as comprising multiple dimensions to account for
the significant variation in their occurrence and resolution. In essence, actors try
to reach a shared understanding of the governable object, the relevant goals, and
the scope of data governance. I argue that the bifocal approach of the theoretical
framework offers a fresh understanding of jurisdictional conflict resolution pro-
cesses. It focuses on interlinkages between short- and long-term dynamics, allows
a problematization of data as an object of governance, and conceptualizes the plu-
rality of normative underpinnings. By considering the interrelationships between
actors rather than formal actor characteristics, the framework enables the analy-
sis of non-state and individual actors. In Chapter 3, I reconstruct the genesis of
the field through an analysis of the manifestation of common stakes and the value
orders to contextualize these theoretical articulations.

¹⁹ In Appendix 2.2, I provide two additional visualizations for cross-checking. One visualization
excludes superficial references to human rights, and the other is based on a co-occurrence analysis
that measures proximity. Neither changed the results significantly.
3
Value Orders and the Genesis of the Field of Data Governance

From today’s perspective, the relevance of data governance seems rather intu-
itive. In 2018, the Facebook CEO Mark Zuckerberg was famously called to the US
Congress to testify on how a data analytics firm called Cambridge Analytica
purchased the data of tens of millions of users to build software designed to influence voters
(Cadwalladr and Graham-Harrison 2018). The potential role of election interfer-
ence and voter manipulation in the success of the Brexit and the Trump campaigns
in 2016 produced arguably the biggest international outcry on data processing
practices since the Snowden revelations. A vast number of people began to perceive
the potential consequences of data (mis)use. Considering the implications for the
enactment of free democratic elections, requests for regulation, even from compa-
nies, grew louder (Kang 2018; Tynan 2018). Yet data governance dates back much
further. Indeed, nearly forty years before the Cambridge Analytica scandal, the
Organization for Economic Cooperation and Development (OECD) adopted the
Guidelines on the Protection of Privacy and Transborder Flows of Personal Data
(OECD 1980). This also predated the dominance of today’s ubiquitous tech com-
panies. Nonetheless, a group of actors was united by the perception of common
inter- and transnational stakes in data governance. This chapter aims to investigate
how data governance emerged as a matter of concern in the global context long
before it became salient widely and how it has transformed into an increasingly
central element in global governance.
As outlined in Chapter 2, I reconstruct the emergence of data governance as
a Bourdieusian field (Bourdieu 1996, 1998) in which actors organize around the
common idea that their engagement is meaningful (Neveu 2018, 364). I argue that
the field emerged around the idea that data governance is meaningful in the pur-
suit of the common good. While actors in the field recognize this common ‘illusio’
(Bourdieu 1996, 331–6), there are critically distinct notions of the common good.
Others have investigated competing interpretations as different framings (Epstein
et al. 2016). These differences represent deeper normative divisions that produce
and conceptualize data as a governable object in relation to broader societal goals.
This chapter aims to disentangle and categorize these notions through the con-
cept of value orders, which I introduced in Chapter 2 as justificatory grammars
based on higher common principles (Boltanski and Thévenot 2006). Through an
inductive analysis of key data governance agreements and principles that represent

constitutive moments in the field (see Table 2.1), the chapter identifies five orders,
links them to debates in political philosophy, and situates them in the context of
the emerging field of data governance. These orders include, on the one hand,
the order of fairness, the order of production, and the order of security, which
define more substantive goals, and, on the other hand, the order of sovereignty
and the order of globalism, which define more procedural goals. By embedding
this reconstruction of evaluative repertoires in the analysis of the genesis of the
field, I illustrate the orders’ contextual nature, their institutionalization, and the
field’s settling logics.
In Section 3.1, I first provide a brief overview of the historical genesis of the
field. Second, in Sections 3.3–3.7, I reconstruct the five orders. I start with a gen-
eral description of their virtues and characteristics and embed them in debates in
political philosophy. After that, I explore a dystopian scenario for each order,
pointing to justifications that evoke fear or danger, illustrate different ways to prove
worth through evaluative repertoires or valued objects, and finally demonstrate
the specific conceptualization of data and data governance in this order. Third, I
briefly summarize the main dynamics in the field and give a short overview of the
application of the framework.

3.1 The Genesis of an Emerging Field: Data Governance

In this section, I briefly outline the historical emergence of the field of data gover-
nance. As the case studies provide a deeper reconstruction of key conflicts and
events, I only sketch the most important steps in the genesis of an incomplete
global field. This section particularly focuses on the formation of transnational
stakes that originated in the EU’s Data Protection Directive (1995).
As outlined in Chapter 1, the origins of data governance reach back to the
end of the nineteenth century. In 1890, the legal scholars Warren and Brandeis
explicitly warned of the dangers of intrusion through technological innovations
and articulated a legal right to privacy. Their definition of privacy as the ‘right
to be let alone’ (Warren and Brandeis 1890) long shaped the legal and philo-
sophical debate, as did Brandeis’s dissenting opinion in a case concerning the
wiretapping of telephone conversations (Olmstead v. United States 1928). The
experiences of twentieth-century totalitarianism contributed to the institutional-
ization of the right to privacy, which was first enshrined in the 1948 Universal
Declaration of Human Rights and later in the 1976 International Covenant on
Civil and Political Rights. Yet data governance was perceived predominantly as a
matter of domestic regulation rather than a challenge in the international context
(Bessette and Haufler 2001, 74). In view of increasing automated data collec-
tion and information technology after the Second World War, both academics
(such as Flaherty 1989; Sieghart 1976; Westin 1967) and governments began to
address potential implications for privacy more directly.¹ The German state of
Hesse introduced the first data protection law in 1970, and Germany adopted a
data protection law in 1977, reinforced by the seminal judgment of the Constitu-
tional Court that established the right to informational self-determination in 1983.
Other industrialized states followed shortly after, such as Sweden in 1973 or
the US in 1974 (Bennett 1992; Newman 2008, 65, 102). Domestic legislative ini-
tiatives aimed to increase the protection of privacy against both economic and
public security invasions, but there are also indications of economic incentives,
for example in the United Kingdom (UK) (Dowd 2019) or the US (Solove and
Hartzog 2014, 587).

3.1.1 Limited Global Cooperation

The developments in the domestic context also fostered a political debate about
privacy and data protection on a global level. In 1968, at the twentieth anniver-
sary of the adoption of the Universal Declaration of Human Rights, the UN was
the first international organization to reflect explicitly on data privacy (Heisen-
berg 2005, 52). In 1979, the first International Conference of Data Protection and
Privacy Commissioners (ICDPPC) marked the emergence of a transnational net-
work of professionals. Finally, in the late 1970s, the OECD and the Council of
Europe (CoE) both established expert groups to draft regulations (Kirby 2011, 8).
Both instruments constituted major steps towards a distinct field beyond domes-
tic boundaries, as data governance came to be recognized as a relevant inter-
and transnational concern. When the CoE Convention 108 entered into force
in 1985, it became the first legally binding international data protection instru-
ment (Cannataci and Mifsud-Bonnici 2005, 6). Yet, due to its European character
and lack of enforcement, it was not significant enough to make a truly global
impact. OECD member states were ‘[d]etermined to advance the free flow of infor-
mation’ (OECD 1980). Therefore, the influential OECD guidelines were mainly
designed to overcome barriers to free data flows and prevent further fragmentation
(Kirby 2011).² Apart from some efforts under the auspices of the UN (e.g. UNGA
1990), global cooperation was limited to these initiatives for the decades that
followed.
Besides international institutions, non-state actors were also increasingly rel-
evant. Microsoft and Apple were founded in the mid-1970s but the rise of
information and communications technologies (ICT) and particularly the com-
mercialization of the internet in the early 1990s catalysed the perception of data

¹ For an overview, see Bennett (1992), Bennett and Raab (2006).


² The guidelines comprised eight basic principles: collection limitation, data quality, purpose spec-
ification, use limitation, security safeguards, openness, individual rights, and accountability—the Fair
Information Principles (FIPs). Self-regulation was preferred (OECD 1980, Art. 19b).
governance as an economic concern. The idea of the ‘digital economy’ (Tapscott
1996) highlighted the beginning of a shift towards informational capitalism (Saco
1999, 260). In view of their growing economic relevance, data increasingly
transformed into a form of widely convertible capital. Data-based companies
such as Google and Facebook were founded around the turn of the millennium.
NGOs, such as Privacy International and Statewatch in Europe as well as the Elec-
tronic Frontier Foundation and the Electronic Privacy Information Center in the
US, emerged in the 1990s. In the 2000s, notable additions to this group included
the pan-European network of privacy organizations European Digital Rights and
Access Now (for a comprehensive overview, see Bennett 2008).

3.1.2 The Emergence of Common Stakes

The key impetus for the formation of a multilevel field with common stakes man-
ifested at the intergovernmental level with the implementation of the EU’s 1995
Data Protection Directive. Through domestic law, the EU unilaterally asserted
strict data protection rules not only for the EU market but also for cross-border
data transfers, notably against powerful US interests (Heisenberg 2005). Swire
and Litan consider the 1995 Directive as a constitutive moment that ‘repre-
sents a dramatic increase of the reach and importance of data protection laws’
(1998a, 24; see also UNCTAD 2016, 32). As the EU implicitly claimed control
over data beyond its territorial borders, a multiplicity of rules from different
jurisdictions overlapped, which, in turn, required meaning-making processes
about common concepts, goals, and institutional solutions. Most significantly, this
included negotiations to bridge regulatory differences between the EU and the US
(Farrell 2003).
The 1995 Directive not only marked an increase in scope but also entrenched
the normative character of data governance. While the OECD Privacy Princi-
ples had pragmatically stated that ‘Restrictions on these [data] flows could cause
serious disruption in important sectors of the economy, such as banking and
insurance’ (OECD 1980 Preface), the 1995 Directive established the premise that
‘data-processing systems are designed to serve man’ (Data Protection Directive
1995, para. 2). This significantly raised the stakes in international cooperation
and thus contributed to the formation of a common inter- and transnational field.
By linking data governance to the common good of mankind, the directive speci-
fied the ‘illusio’ of the field. Based on the premise that data processing can ‘serve
man’, the directive suggested that engagement in data governance contributes
to the pursuit of this common good. Due to limited EU competences in other
areas, the 1995 Directive was officially based on internal market considerations
(Lynskey 2015a, 3). Yet privacy and data protection rights became increasingly
entrenched and constitutionalized in the EU, particularly with the inclusion of
data protection as a fundamental right in the EU Charter (Charter of Fundamental
Rights of the European Union 2000, Art. 8) in 2000.³ The EU assumed the role
of a global standard-setter and privacy policy entrepreneur (Newman 2008), while
the US seemed passive or resistant to both national and transnational debates
(Regan 2003, 280).

3.1.3 Global Institutionalizations

After the 1995 Directive had fostered meaning-making processes about com-
mon stakes, institutionalization increased globally. The Asia Pacific Economic
Cooperation (APEC) Privacy framework entered into force in 2005 (APEC 2005;
Greenleaf 2006), the Economic Community of West African States (ECOWAS)
(2010) adopted a Supplementary Act on Personal Data Protection in 2010, and
the African Union (AU) adopted a Convention on Cyber Security and Personal
Data Protection in 2014. In the early 2010s, the OECD and the Council of Europe
updated their original instruments, as did the EU with the adoption of the GDPR
(2016), which came into effect in 2018. In these instruments, data are
conceptualized as ‘a valuable asset’ (OECD 2013, ch. 1), and data governance is directly
linked to the pursuit of the common good. For example, the ASEAN framework
recognizes ‘the importance in strengthening personal data protection with a view
to contributing to the promotion and growth of trade and flow of information
within and among ASEAN Member States in the digital economy’ (ASEAN 2016,
1; see also APEC 2005, para. 20).
For a long time, China and the US were significant outliers to this global trend
(Newman 2008). In contrast to efforts to promote content control norms glob-
ally (Flonk 2021), China does not yet strongly engage in debates on international
data protection standards. However, China has recently developed a comprehen-
sive regulatory framework on cybersecurity and data protection (Erie and Streinz
2022). While there is potential for conflict, particularly considering the GDPR
(Zhao and Chen 2019), no major public clashes have emerged. The US still does
not have a comprehensive data protection law but a fragmented regime of sectoral
laws. Data protection is restricted to cases in which the individual has ‘a legit-
imate expectation of privacy’ (Katz v. United States 1967, 360). This is not the
case, for example, if an individual voluntarily turns over private data to a finan-
cial institution (United States v. Miller 1976) or dials a phone number, thereby
transmitting data to a phone company (Smith v. Maryland 1979). The US Pri-
vacy Act of 1974 restricts only government activities. In the US, the use of data
for economic purposes by private companies is often perceived more favourably

³ The full legal effect of the EU Charter of Fundamental Rights only came about with the Lisbon
Treaty in 2009.
than public surveillance practices (Schwartz and Reidenberg 1996). The debate
on a federal data protection law has accelerated due to the adoption of the
California Consumer Privacy Act in 2018, which offers levels of protection similar to the GDPR
(Harding et al. 2019).
Despite these limited efforts in privacy and data protection, the US is a signifi-
cant player in the field of data governance. On the one hand, US-based companies
such as Facebook, Google, or Microsoft hold significant market power and have
shaped the practices of data governance considerably. On the other hand, the US
has contributed to the emergence and institutionalization of data governance for
security purposes. In particular, US unilateral jurisdictional claims in
counterterrorism, often with extraterritorial effects, have shaped the view of data and the
objectives of their governance (Kaunert et al. 2012). Tensions between security
and privacy have been discussed before the internet age (e.g. Klass and Others
v. Federal Republic of Germany 1978; Westin 1967), but digital data gained their
security dimension mainly in the last twenty years (see Saco 1999 for exceptions).
This dimension was increasingly ‘internalized’ (Farrell and Newman 2019, 34)
in view of the increasing use of the internet for criminal and terrorist purposes.
External shocks, particularly 9/11, worked as a catalyst for the institutionaliza-
tion of security measures (Etzioni 2018, 112–13; Kaunert et al. 2012) such as the
systematic gathering of airline passenger and financial data (Council of the EU
2016). Yet the salience of data protection was limited until the revelations of mass
surveillance in 2013 (Kalyanpur and Newman 2019b).

3.1.4 The Snowden Effects

The revelations of extensive public surveillance by former intelligence contractor
Edward Snowden triggered wide-ranging public debates and institutional responses
(see Chapter 4). In line with the widespread international criticism, the UN Gen-
eral Assembly (UNGA 2013) adopted Resolution 68/167 ‘The Right to Privacy in
the Digital Age’, which calls on states to respect the right to privacy in digital com-
munications and review their practices. The first UN Special Rapporteur on the
right to privacy was appointed in 2015. However, a global framework or treaty
still does not exist. Calls for the global promotion of the recently updated
Convention 108+ (e.g. UNHRC 2018, para. 117e), which aims to attract signatures
from non-member states, are impeded by the fact that the US refuses to sign the
convention.
The Snowden revelations also increased the scrutiny of private tech compa-
nies. An increasing number of data-related headlines, such as those about the
Cambridge Analytica scandal (Cadwalladr and Graham-Harrison 2018), the data
breach by the credit reporting agency Equifax that involved the data of 143 million
US customers (Gressin 2017), or the attempt to create a Facebook cryptocurrency
through the Diem project,⁴ made the calls for regulation more urgent. On the one
hand, there is a growing reliance on data in the security sector, as law enforcement
and intelligence agencies use data in the fight against terrorism and crime, as well
as in the context of the ‘data economy’ (EC 2020a). On the other hand, govern-
ments face increasing incentives to exercise (sovereign) control over the internet
(Drezner 2007; Goldsmith and Wu 2006) when dealing with the growing power
of platforms and other private actors. As already outlined, this has contributed
to an increase particularly in domestic data protection legislation. Yet tech compa-
nies exercise unique forms of power. Culpepper and Thelen (2019) argue that tech
companies’ capacity to combine consumer dependence and consumer attachment
constitutes a key source of their power. Nachtwey and Seidl characterize Silicon
Valley elite attitudes as a ‘solutionist ethic’ (2020) which unites the concern for the
common good with a flexible, risk-taking embrace of the capitalist ideal of profit
and thus promises technological solutions to humanity’s most pressing problems.
Therefore, tech companies promote the normative character of their engagement
in the field, for example through Google’s articulated goal to ‘organize the world’s
information and make it universally accessible and useful’ (Google 2020) or Face-
book’s mission ‘to give people the power to build community and bring the world
closer together’ (Facebook 2020).

3.1.5 Interlinkages with Internet Governance

The infrastructure that governs transnational data flows is complex and inter-
governmental, which creates significant barriers to entry, particularly for NGOs
(NGO representatives, personal communication 2018, 2019) but also for actors
that lack legal expertise. The current intergovernmental character of data gov-
ernance might seem surprising in view of the internet’s inherently transnational
nature. While data governance is part of internet governance, the two are empirically
disconnected (Pohle et al. 2016; Radu 2019; Reiberg 2018). While a subject
of discussion, for example at the Internet Governance Forum (Epstein et al. 2014),
data governance is negotiated in intergovernmental rather than multistakeholder
fora that have emerged before or in parallel to central internet governance bodies
(Murray 2007). Indeed, other areas of internet governance have been character-
ized by a significant scepticism about intergovernmentalism and instead promote
multistakeholderism (Hofmann et al. 2016). While, for example, the governance of
the domain name system is characterized by a deliberate emphasis on private and
non-state actors (Mueller 2010), the origins of data governance are firmly tied to
national and subnational actors. This also shows how transnational fields tend to
retain strong linkages to the domestic level (Sapiro 2018). In internet governance,

⁴ For more information, see https://diem.com/, accessed 26 June 2022.


scholars have identified a plurality of internet governance visions—market-based,
self-regulatory measures (Raymond and DeNardis 2015; Spar 1999) and informal,
bottom-up ordering to ensure a free internet (Johnson and Post 1996; Zittrain
2008), as opposed to an emphasis on sovereignty or the development of international
principles (Deibert and Crete-Nishihata 2012). This has contributed to the loose
formation of ‘liberal’ and ‘sovereigntist’ spheres in internet governance (Flonk
et al. 2020). However, conflicts over data privacy are currently mostly restricted
to the liberal sphere (but see Chapter 9).
In sum, data governance first loosely formed as a field through the constitu-
tive moment created by the 1995 Data Protection Directive. The law prompted
common processes of meaning-making and institutionalization and reinforced the
notion that data could serve the common good. Therefore, actors in the field orga-
nized around the common assumption that data governance was meaningful. Yet
data governance is not a fully consolidated field, despite the rising significance
of data in a diversity of sectors. Also, due to this rising significance as well as its
global character, the field is not autonomous but interlinked with other fields such
as internet governance or internet policy. It penetrates local and regional bound-
aries and includes diverse actor types. In Section 3.2, I conceptualize the distinct
notions of what actors perceive as the goal of the field through the concept of
value orders, zooming in on different principles, institutionalizations, values, and
dystopian scenarios.

3.2 Value Orders in the Field of Data Governance

This section outlines five value orders in data governance; an overview is pro-
vided in Tables 3.1 and 3.2. I identify two dimensions that specify the common
good of data governance: a substantive and a procedural dimension, which, in
contrast to variation on a single dimension, might be compatible beyond the for-
mation of a (temporary) compromise. The substantive orders are the order of
fairness, the order of production, and the order of security. The procedural orders
are the order of sovereignty and the order of globalism. Naturally, this is not an
exhaustive list of existing conceptions of common societal goals but outlines the
most important or dominant orders. As outlined in Chapter 2, I reconstructed
the orders through the inductive analysis of key data governance agreements and
principles that form constitutive moments of the field (see Table 2.1). I then went
back to the literature and linked the orders to philosophical underpinnings. They
provide a theoretical basis for the diversity of claims in justificatory practices. In
Sections 3.3–3.7, I discuss the general principles of each order and their philosophical
underpinnings, outline how these link to data governance, including institutional
frameworks, and describe dystopian scenarios of each order as well as objects
of value.
Table 3.1 Substantive value orders in data governance

Fairness
  Community virtues: Freedom, individual and collective rights
  Historical context: Early debates, stronger emergence in 19th century, more explicit with automated data collection in 1960s
  Relevant political philosophers: Appiah, Beitz, Rawls
  Dystopian scenario: Power asymmetry, repression, surveillance capitalism
  Data governance: Data as a human rights and autonomy concern
  Valued objects: Human rights conventions, e.g. UNGA Resolution Privacy in the Digital Age

Production
  Community virtues: Innovation, economic rationality
  Historical context: Early 1990s, stronger with commercialization of the internet in 1996
  Relevant political philosophers: Hegel, Rousseau, Smith
  Dystopian scenario: Economic downturn, inefficiency
  Data governance: Data as an economic resource
  Valued objects: Consumer trust, free trade agreements, free flow of data, e.g. OECD Privacy Guidelines

Security
  Community virtues: Vigilance, safety
  Historical context: Emergence in 1980s and 1990s, strong salience after 9/11
  Relevant political philosophers: Locke, Hobbes
  Dystopian scenario: Terrorism, crime, death, terrorist attacks
  Data governance: Data as crucial information
  Valued objects: Law enforcement cooperation, information sharing, police, e.g. UNSC Resolution 2396
Table 3.2 Procedural value orders in data governance

Globalism
  Community virtues: Global cooperation, multilateralism
  Historical context: Emergence with increasingly global character of data flows since 1980s
  Relevant political philosophers: Held, Ohmae, Sassen
  Dystopian scenario: Fragmentation, international discord
  Data governance: Data as inter-/transnational
  Valued objects: Inter-/transnational frameworks, IOs, e.g. CoE Convention 108+

Sovereignty
  Community virtues: Territory, collective self-determination
  Historical context: Early emergence in 1960s and 1970s with national data protection legislation
  Relevant political philosophers: Bell, Nagel, Sandel
  Dystopian scenario: Intervention, interference by IOs or states
  Data governance: Data as territorial or community-based
  Valued objects: Domestic law, hierarchies, e.g. EU 1995 Data Protection Directive
3.3 The Order of Fairness

The first order I identify is the order of fairness, which is perhaps most deeply
entrenched in the institutional and scholarly debate on data governance (Clif-
ford and Ausloos 2018). It is characterized by a strong emphasis on the human
rights implications of data governance, particularly regarding power and infor-
mation asymmetries (Rössler 2005). Due to the effects on personal autonomy
(CoE 2018a, 16), this particularly includes privacy (UNGA 2013, 1), but also the
right of data protection is considered to protect informational self-determination
(Rodotà 2009, 80) and human dignity (CoE 2018a, 16). Yet there is no unidirec-
tional assessment of data governance. Privacy is rarely considered as a value in
itself but evaluated in light of the implications for individual liberty and autonomy
(Rössler 2005). In consequence, the order of fairness demands a balance between
individual rights, such as freedom of expression (G7 2016, paras. ii, a, b; G20 2016,
3) and freedom of speech (Volokh 1999), the right to access (US White House
1997), and other goods that potentially benefit from data sharing, such as health
(Pasquale and Ragone 2013). The NGO Article 19’s Principles most explicitly
state that:

data protection legislation can be misused or abused to prevent, end or restrict
the legitimate public dissemination of accurate personal information in order
to enable individuals to control their reputation at the expense of freedom of
information, the right to truth and the wider public interest.
(Article 19 2017, 7)

This also resonates with feminist critiques concerning the separation of public and
domestic spheres, which enables the perpetuation of power asymmetries, for example
through abuse (MacKinnon 1987).⁵

3.3.1 Philosophical Foundations

The principles characterizing the order of fairness are to a significant degree
embedded in a classic liberal tradition that emphasizes the promotion of
freedom from (governmental) domination (e.g. Pettit 1997) and the promotion of
basic liberties (Rawls 1971). These basic liberties⁶ cannot be traded off but
protect individual rights (Beitz 2009, 109). While some emphasize procedural
fairness as a means to attain just outcomes, such as the ‘original position’ in
which participants are ignorant of their identity or social circumstances (Rawls
1971), others demand the inclusion of ‘the equity and efficiency of the sub-
stantive opportunities that people can enjoy’ (Sen 2005). Risse defines fairness
as ‘the proportionate satisfaction’ (2012, 276, emphasis in original) of certain
demands. The proponents of the order of fairness are united by the substantive
goal of protecting fairness and the basic rights of the most vulnerable, but they
diverge regarding the conception of the desirable level of governance. Some fol-
low a conception of (Kantian) cosmopolitanism, proposing global principles or
global public reasoning (Beitz 2001; Pogge 1989; Sen 2005), while others (Nagel
2005; Rawls 2001) argue that principles of justice only hold in the context of
the state.

⁵ For a more detailed discussion, see Rössler (2005, 29–46).
⁶ The scope of these liberties is contested. Rawls (2001) argues for minimal rights to accommodate
cultural differences, while Beitz (2001) explicitly warns of these limitations.

3.3.2 The Order of Fairness in Data Governance

In this order, data are considered in light of the implications for individual rights.
This conceptualization may be based on concerns for privacy, which has been
subject to various debates in political philosophy (e.g. Nagel 2005), where it is
often coupled with other values, such as human dignity and personhood
(Rotenberg 2001). Data may be owned in a sense that is different from that of
property assumptions but expresses ‘a sense of constitutive belonging, not of exter-
nal ownership’ (Floridi 2005, 195). Data are considered more broadly in their
relationship to the right to access, freedom of expression, and power asymme-
tries. In more recent academic debate, scholars discuss specific questions of data
justice. They tie inequalities, which often asymmetrically hit vulnerable or socio-
economically disadvantaged groups (Masiero and Das 2019), to the information
society (J. J. Britz 2008) or data specifically (Dencik et al. 2016). Similarly, con-
cerns for redistributive justice are increasingly gaining traction in the debate on
competition (Lynskey 2019). This is consistent with the emphasis on privacy and
data protection as collective rights (Solove 2008) moving beyond the liberal focus
on individual rights.
The conception of community in this order is linked to vulnerable people
(Epstein et al. 2014, 161). In contrast to the order of security, vulnerability is not
perceived as bodily vulnerability but in terms of threats to a community of individ-
uals and collectives in their pursuit of justice, the enactment of human rights, and
the avoidance of repression and power asymmetries. The emphasis on autonomy
assumes an inherent worth of people as individuals with dignity (Appiah 2010, 61)
that forms the basis of this community construction.
There is significant variation in the level of institutionalization of the order
of fairness. Even though there is, for example, a significant global diffusion of
data protection principles from Europe (Bradford 2020; Greenleaf 2012), other
instruments are less precise. For instance, the ASEAN Framework on Personal
Data Protection states: ‘An organisation may collect, use or disclose personal data
about an individual only for purposes that a reasonable person would consider
appropriate in the circumstances’ (ASEAN 2016, para. 6b, emphasis added), which
allows considerable room for interpretation.

3.3.3 Dystopian Scenario

The dystopian scenario of the order of fairness sees individual rights bargained
away to satisfy other interests which may diminish individual liberty and auton-
omy. A society based on commercial or public surveillance with inadequate
privacy protection is, therefore, considered ‘oppressive’ (Solove 2008, ch. 5), par-
ticularly in the absence of democratic oversight and control. It creates significant
power asymmetries and injustices (Zuboff 2018), such as voter manipulation in
the Cambridge Analytica scandal. This is, inter alia, problematic due to ‘chilling
effect[s] on the exercise of the right to freedom of expression and the right to
hold and form an opinion by searching and accessing and disseminating informa-
tion online’ (Article 19 2017, sec. 4.1). While there is significant variation in the
perception of threat (Epstein et al. 2014), the intricate relationship between pub-
lic and private surveillance in particular poses new challenges (Fischer-Lescano
2016; Lyon 1994, 180).

3.3.4 Valued Objects and Evaluative Principles

This order emphasizes the value of objects that institutionalize and protect fair-
ness legally and politically, such as regulations or other legally binding human
rights frameworks. For example, the AU explicitly refers to the African Charter
of Human and Peoples’ Rights (AU 2014, 25), while the UN links its goals to the
International Bill of Human Rights (UNGA 1990, Art. 6). All frameworks contain
some reference to safeguards. Greenleaf identifies ten elements, including stan-
dards of collection, data quality, purpose specification, or notice, that form the
basis of most data agreements and domestic legislation (Greenleaf 2012, 73–5). In
addition, value might be assigned to frameworks that highlight the responsibility
of businesses, such as the UN Guiding Principles on Business and Human Rights
(UN 2011). Strong enforcement capacities for independent oversight or monitor-
ing bodies (GDPR 2016) or the establishment of a UN Special Rapporteur (Article
19 2017) indicate the manifestation of the order. The order of fairness is sometimes
associated with civil liberties organizations and moral authority (Avant et al. 2010),
particularly in international institutions or fora (Epstein et al. 2014, 160). Actors
use different resources, including statements from whistle-blowers or leaked docu-
ments (Gros et al. 2017), to prove conformity to this order. Proponents tend to rely
on reports and expertise by data protection authorities, such as the European Data
Protection Supervisor (EDPS) or the Article 29 Data Protection Working Party
(29WP), now the European Data Protection Board (EDPB) (NGO representative,
personal communication 2018).

3.4 The Order of Production

The second order I identify, the order of production, is tied to the promotion of
economic progress and a strong logic of efficiency. Legislation, as well as general
conduct, should be ‘practical’ (APEC 2005, 3), ‘flexible’ (IA 2018, 2), and ‘reason-
able’ (IA 2018, 3). In the academic debate, this approach finds resonance in the
privacy economics literature, which focuses on the economic potential of data
sharing. Proponents weigh the trade-offs between the protection of privacy and
the (economic) efficiency benefits resulting from the collection and processing of
data (Acquisti et al. 2016). A prominent proponent is Posner (1977, 1981), with
his vision of a ‘legal right to privacy based on economic efficiency’ (1977, 404).
While some scholars reject public regulatory measures due to their ineffectiveness
(Stigler 1980), regulation finds support for addressing market failures (Veljanovski
2010, 18), such as monopolies.

3.4.1 Philosophical Foundations

Basic property rights are widely argued to be part of the pursuit of justice and
the common good (Rawls 1971; Rousseau 1998).⁷ However, the normative and
philosophical underpinnings of the order of production are most clearly artic-
ulated in the seminal works of Adam Smith (see also Boltanski and Thévenot
2006). Smith famously argued that behaviour focused on self-love would result
in aggregate social benefits due to the ‘invisible hand’ of market competition (A.
Smith 1999). While proponents of libertarian philosophy have echoed the empha-
sis on the minimalist state (e.g. Nozick 1974), many scholars acknowledge that a
free market creates problems and demand limited restrictions or state involvement
(Hegel 2015; see also Satz 2012). For example, Friedman proposed that ‘govern-
ment is essential both as a forum for determining the “rules of the game” and as
an umpire to interpret and enforce the rules decided on’ (M. Friedman 2009, 15).
Even Smith allowed for protective measures for defence purposes or through tariffs
(A. Smith 1999, bk. 4, chs 4–5).
While most proponents have an explicitly international outlook, economic ben-
efits are often limited to a restricted group, particularly Europe and the Americas
(A. Smith 1999, bk. 5, ch. 3). While principles of the order of production are
generally portrayed as going hand in hand with globalism, scholars have demon-
strated how sovereigntist or even nationalist tendencies have found expression in
neoliberal policies (Harmes 2012).

⁷ For a more extensive debate on this issue, see Herzog (2013).

3.4.2 The Order of Production in Data Governance

In the order of production, data embody the ‘new oil’ (Economist 2017), ‘the
fuel that drives much commercial activity online’ (UNCTAD 2016, iv), ‘a valu-
able asset’ (OECD 2013, ch. 1), or even ‘the lifeblood of the global economy’
(Kirkhope 2016). To foster innovation and economic growth, the free flow of
data is crucial (e.g. OECD 2013, 12). This conceptualization makes data trade-
able: users exchange their data for (free) services. Privacy protection is often a
means to secure a trust relationship with customers (see Fischer-Lescano 2016)
or to avoid legally binding regulation. Data governance aims to harmonize and
improve efficiency (Reicherts 2014). For instance, in the G7 ICT Ministers’ Decla-
ration, privacy protection is an action item to support the economically beneficial
free flow of information through trust and confidence building (G7 2016, para.
ii, a, b; see also G20 2016, 5; ASEAN 2016, 1). However, the commodification of
data also offers opportunities to strengthen individual control. A property-based
understanding of data may increase agency for individuals regarding the current
and future use of their data (Schwartz 2004, 2029–116), for example through fines
or sanctions (Janger 2002, 914).
Efficiency-maximizing individual and collective actors form the order’s refer-
ence community. Consumers and entrepreneurs interact to strengthen economic
progress. Actors may include private companies but also public actors that aim
at improved economic performance, ‘permissionless innovation’ (Thierer 2016),
and harmonization.
The order of production has found expression in numerous institutional frame-
works, particularly in treaties designed to reduce barriers to data flows (Kirby 2011,
8). For instance, the OECD defines privacy and the free flow of information as ‘fun-
damental but competing values’ (OECD 1980; CoE 1981, Preamble; UNGA 1990,
Art. 9), and even the GDPR refers to ‘the free movement of data’ in its title. There
is also a strong emphasis on business interests and public–private initiatives. The
APEC framework specifically proposes a policy that ‘balances information pri-
vacy with business needs and commercial interests’ (AU 2014, 13; APEC 2005, 3).
The Internet Association’s Privacy Principles point out that while their company
members welcome efforts to protect privacy, ‘laws and regulations should avoid
a prescriptive approach to doing so, as such an approach may not be appropri-
ate for all companies’ (IA 2018, 4). The order of production puts emphasis on
soft solutions rather than strict enforceable legislative approaches. Governance
efforts aim to harmonize standards, foster trust in the digital economy, and reduce
barriers to the free flow of information (APEC 2005, foreword). Trust is ‘funda-
mental to their [IA companies] relationship with individuals’ (IA 2018, 2), as is
consent (ECOWAS 2010, 10).

3.4.3 Dystopian Scenario

The dystopian scenario in the order of production expresses concerns about the
inefficiency of barriers to trade and data sharing, often associated with strong gov-
ernmental interference (GNI 2017, 4), which may even be considered a pretext
for economic protectionism (Obama cited in Farrell 2015; Lancieri 2018). Restric-
tive effects on innovation are perceived as particularly problematic (e.g. Kottasová
2018). In view of legal uncertainty, which poses ‘major obstacles to the develop-
ment of electronic commerce’ (AU 2014, 1), and inequality of competition, global
players have more recently become proponents of common standards (Hern
2019). As lack of trust constitutes a potential interference with the digital economy
(EC 2013e), private companies in particular aim to distance themselves from, for
example, ‘[m]alicious actors, with motives ranging from criminal to geopolitical,
[that] have inflicted economic harm, put human lives at risk, and undermined the
trust that is essential to an open, free, and secure internet’ (Cybersecurity Tech
Accord 2018).

3.4.4 Valued Objects and Evaluative Principles

In the order of production, most actors value soft standards to avoid overregula-
tion and the undermining of trust. Under the governmental ‘shadow of hierarchy’
(Héritier and Lehmkuhl 2008, 5), self-regulatory measures, especially in the US
(Newman and Bach 2004), aim to ensure the free flow of data. Therefore, refer-
ences to agreements, codes of conduct, and terms of service agreements provide
conformity with this order (OECD 2013, Art. 19; US Department of Com-
merce 2000a, 2016). The ‘legitimate need’ (IA 2018, 3) of companies is strongly
emphasized but rarely defined. In addition, membership of private or multistake-
holder initiatives such as the GNI Principles, the IA Privacy Principles, or the
Internet and Jurisdiction Policy Network (GNI 2017, 5; IA 2018) provides evi-
dence of compliant behaviour. Transparency reports issued by tech companies
(see Parsons 2017) have a similar function. Economic impact assessments (e.g.
US Chamber of Commerce 2013) or statistics highlight the immense value of
the digital economy. They emphasize the ‘unprecedented personal, social, pro-
fessional, educational, and financial benefits’ (IA 2018, 2; see also APEC 2005,
2) stemming from the sector’s contribution to GDP and the creation of jobs
(IA 2018, 2).

3.5 The Order of Security

Third, the order of security is the most recent substantive addition to concep-
tions of worth in data governance. While tensions between privacy and security
have been discussed extensively in the literature (e.g. Westin 1967), the specifi-
cation of data governance as a security concern has emerged only in the last two
decades. The order underlines safety and the prevention of physical harm, particu-
larly for vulnerable actors, such as minors or law-abiding citizens. It unites a variety
of implications of harm, such as the threat of terrorism, serious crime, or cyber-
attacks. Proponents juxtapose an inconvenient but harmless invasion of privacy
with an existential threat to the security of humans or contrast security as a com-
mon good with privacy as a liberal individual right (Etzioni 2018, 124; see Solove
2007). While not all proponents use this zero-sum logic, a general sense of urgency
is attached to justifications that rely on the principles of the order of security.
This dynamic is illustrated by the literature on ‘securitization’ (Buzan et al. 1998),
which refers to the active transformation of an issue through security-embedded
speech acts.

3.5.1 Philosophical Foundations

Security has always constituted a key concern in political philosophy. Arendt
describes how political thinkers of the seventeenth and eighteenth centuries set up
security as ‘the highest purpose of politics, “the end of government”’ (Arendt 2017,
61). This, they suggested, enabled rather than restricted freedom, for example
through the notion of strengthened property rights as suggested by Locke (Himma
2016). In liberal philosophy, restrictions of individual rights are considered accept-
able if they increase community security (Himma 2016, 157–8), for example to
‘prevent harm to others’ (Mill 1859, ch. 1). The philosopher most clearly asso-
ciated with the articulation of security as the main goal of political action and
community is Hobbes, who argues for ‘absolute submission’ (Sorell 2016, 42) to
a higher authority to increase physical security through a social contract (Hobbes
and Curley 1994, ch. xviii). Hobbes contrasts the civilized rule of the sovereign
with a fictitious ‘state of nature’ in which life is famously ‘nasty, brutish and
short’ (Hobbes and Curley 1994, ch. 1.xiii: 9). On the basis of this contrast,
security explicitly legitimizes inequality in diverse sectors (Aradau 2008, 302).
While proponents of the order of security are united by the recognition of secu-
rity as a relevant higher common principle, the focal point of the Hobbesian
realist tradition is the process of ‘securing’ (Dillon 2002) by a sovereign. In con-
trast, in the liberal tradition associated with Locke, security aims to mitigate
the vulnerability of the individual through social institutions on different levels
(Locke 1967).

3.5.2 The Order of Security in Data Governance

In the order of security, data are conceptualized as a significant resource to ensure
security pre-emptively and provide evidence of existing or potential insecurity.
The insistence on privacy may become stigmatizing and dogmatic, as liberties
such as privacy may be conceptualized as ‘a fetish’ (Aradau 2008, 294) or ‘single-
minded’ (Baker 1994). This highlights tensions between national and personal
security that emerged because of the internet (Saco 1999, 286). While all stake-
holders acknowledge the significance of security, there are only very few scholars
who focus on data exclusively as a security concern. Himma juxtaposes the instru-
mental character of privacy with the intrinsic value of security to argue that ‘it
is reasonable to conclude that privacy is less important, from the standpoint of
morality, than security’ (2016, 156). Most scholars argue for some form of balance
(e.g. Etzioni 2018).
The idea of community is strongly interlinked with the construction of threats.
Proponents constitute their common identity, interests, and rationales for action
in reference to a threatening other (Saco 1999, 264; see also Campbell 1992;
Weldes 1999), particularly criminals and terrorists. While the focus on secu-
rity puts enhanced emphasis on public authority, justifications in this order
often call for international cooperation, including that with private actors. For
instance, UNSC Resolution 2396 ‘[e]ncourages enhancing Member State coop-
eration with the private sector … in gathering digital data and evidence in
cases related to terrorism and foreign terrorist fighters’ (UNSC 2017, 4). In
contrast to realist assumptions about global order (e.g. Mearsheimer 2003;
Waltz 2010), the order of security oscillates between the protection of domestic
sovereignty and safeguarding international security via international cooperation
(e.g. Adler 1997).
The order of security is articulated in most policy frameworks, albeit through
exemptions rather than specific positive prescriptions. For instance, despite its
comprehensive character, the GDPR (2016) excludes data processing for law
enforcement or counterterrorist purposes. While the AU (2014) focuses pri-
marily on cybercrime and cybersecurity, the APEC framework ‘is not intended
to impede governmental activities authorized by law when taken to protect
national security, public safety, national sovereignty or other public policy’
(APEC 2005, 8). The ECOWAS framework specifies separate rules on mat-
ters of ‘national security, defence or public security’ (2010, 5; see also ASEAN
2016, Art. 4). While these exemptions are common in most frameworks, the
order of security emerged as particularly salient after the 9/11 terrorist attacks
and in the context of the war on terror, which is also highlighted by the
spread of specific security conventions such as the UNSC Resolution 2396
(UNSC 2017).

3.5.3 Dystopian Scenario

The appeal to the order of security already invokes an implicit assumption of inse-
curity (Dillon 2002, 120–2). Any reference to the order of security is, therefore,
closely tied to the dystopian scenario of terrorism, crime, and death, evoking an
existential threat to a community of humans. Particularly vulnerable groups such
as children and other innocent actors are frequently mentioned to emphasize the
importance of the claim. Extreme measures are justified by the perception of an
existential threat (Buzan et al. 1998, 25). These measures may include the tempo-
rary suspension of rules, such as sovereignty in the case of intervention (Buzan
et al. 1998, 158), but also the creation of new rules, such as the resort to war
(de Goede 2008). Thus, references to a security threat can enlarge the space of
possibles for action.

3.5.4 Valued Objects and Evaluative Principles

In this order, proponents ascribe value to vigilance (AU 2014, 25) in the fight
against impunity. There are also a significant number of references to informal
or formal cooperation agreements, for example in law enforcement or in the fight
against terrorism (OAS 2004, Appendix A), joint databases, and watch lists of
potential terrorists (Ryngaert and van Eijk 2019; UNSC 2017, para. 13), as well
as more active measures such as the promotion of ‘counter-terrorist narratives’
(UNSC 2017, 4). While there is a general recognition that interference with pri-
vate life needs to be constrained (CoE 2002, v), this interference is conceptualized
as a necessity in the protection of liberal values. For example, in 2014, President
Obama stated that ‘Throughout American history, intelligence has helped secure
our country and our freedoms’ (Obama 2014). Proponents may use statistics
and evidence for successfully prevented terrorist attacks or major crime, includ-
ing review reports (EU and US 2016, Art. 23), to prove worth in this order.
The demonstration of an existential threat may rely on speech acts or shocking
images (Williams 2003). By referring to pre-emptive security, proponents empha-
size secrecy (de Goede 2018b; de Goede and Wesseling 2017) and the restriction
of transparency as a way of proving worth rather than a deficiency.

3.6 The Order of Sovereignty

In addition to the three substantive orders, I also identify two orders that relate to a
more procedural dimension of the common good, fourth, the order of sovereignty,
and fifth, the order of globalism. Sovereignty is one of the foundational principles,
even a ‘Grundnorm’ (Reus-Smit 2001), of the international order but at the same
time heavily contested. The order of sovereignty is centred on the interrelations
between community members and between communities. In consequence, the
order underlines the importance of community decision-making and legitima-
tion processes.⁸ Any obligations or restrictions stemming from institutions or
entities outside the community seem problematic, as they interfere with domes-
tic authority (Hutter et al. 2016). This may also include considerable scepticism
about the exercise of power by private or multinational companies, which states
often aim to resist (Kalyanpur and Newman 2019a). Differences between norms
and rules in the international context are perceived as resulting from a necessary
and normatively desirable differentiation of political objectives (Reidenberg 2000;
FTC employee, pers. comm. 2019; former FTC employee, pers. comm. 2019). For
example, scholars and policymakers emphasize the potential effect of totalitarian-
ism that may have entrenched a perception of worth of data in European societies
(Kirby 2011, 8).

3.6.1 Philosophical Foundations

In political philosophy, sovereignty is rarely justified as a normative goal in itself.
Nevertheless, there is a principled conflict regarding the role of sovereign entities,
particularly nation states, and their status as the units of institutionalized political
order in the context of increasing globalization and interdependence (Zürn and
de Wilde 2016, 289). While the scholarship on sovereignty is theoretically diverse,
the promotion of the rights of sovereign entities is often tied to the conception
of a ‘constitutive community’ (Bell 1993) that shapes understandings of worth
and identity. For instance, Nagel argues that ‘We are required to accord equal
status to anyone with whom we are joined in a strong and coercively imposed
political community’ (Nagel 2005, 133). This points to the order’s roots in com-
munitarian political philosophy, which defines the moral focal point as a fixed
group (Sandel 1998). This approach problematizes the assumption of universalist
values, for example in relation to the cultural determination of the prioritization,
justification, or moral foundation of liberal rights (Bell 2005; Sandel 1998).
Criteria of community membership may vary, including, for example, state,
citizenship, or nationality (e.g. Miller 2017) but also local communities, families,
or schools (Etzioni 2014; MacIntyre 1999, 142). While proponents of the order
of sovereignty are united by the focus on communities as central elements in the
constitution of order, a commitment to constitutive communities or sovereignty
is compatible with the order of fairness (Rawls 1971), production (Walzer 2008),
and security (Hobbes 2016).

⁸ The book uses the term sovereigntism rather than statism, despite conceptual similarities (Forst
2001; Zürn and de Wilde 2016). In the context of international negotiations in data governance, the
EU acts as a sovereign community in the sense that the authority of the EU institutions, particularly
the EC, is well established.

3.6.2 The Order of Sovereignty in Data Governance

By defining data governance as an issue best dealt with at the level of sovereign
communities, proponents of the order of sovereignty conceptualize data either as
a primarily territorial or domestic issue or as a potential threat to the exercise of
territorial sovereignty (Ruggie 1993). Proponents are united by the prioritization
of the interests and values of the reference community. While outside interfer-
ence and extraterritoriality challenge the principle of external sovereignty, the full
enactment of community interests may have ‘spillover’ effects (Kobrin 2004, 111)
into other jurisdictions, which results in an imposition of domestic norms on other
communities (see Chapter 7).
Sovereign states and their citizens constitute the order’s reference community.
Building on the idea of ‘insiders’ and ‘outsiders’ (Rawls 2001), most frameworks
offer significantly less or no protection for outsiders, mostly non-citizens (e.g. AU
2014; APEC 2005; OECD 1980).⁹ The absence of international fora shows the
entrenchment of this order in data governance, particularly compared with other
areas such as intellectual property or internet protocols (DeNardis 2013).
Beside the existence of numerous domestic frameworks, many international
frameworks explicitly highlight community values or norms, such as the AU,
which argues for ‘an appropriate normative framework consistent with the African
legal, cultural, economic and social environment’ (AU 2014, 2). APEC similarly
‘accords due recognition to cultural and other diversities that exist within mem-
ber economies’ (APEC 2005, paras. 6, preamble). While the diffusion of common
norms to other communities might be considered desirable (EC 2010a), the pro-
tection of sovereign norms constitutes the highest priority. The responsibility for
implementing decisions resides with governments and the national legislature
(APEC 2005, 32–3; CoE 1981, Art. 4; ECOWAS 2010, 7), but international agree-
ments may be valued if they emphasize the competences and responsibilities of
states, for example, when public instruments contain specific exceptions for the
provision of public goods, such as national security or monetary interests (CoE
1981, Art. 9), the ‘ordre public’ as well as the ‘public interest’ (APEC 2005, 8;
ASEAN 2016, Art. 4; ECOWAS 2010, Art. 4), or ‘public health or morality’ (UNGA
1990, Art. 6).

⁹ The Netherlands is one of the few countries where restrictions on intelligence gathering apply to
non-citizens. However, a recent judgment of the German Federal Constitutional Court also highlighted
that intelligence service data collection is bound to German constitutional law, even when this occurs
outside German territory and exclusively on non-German citizens (1 BvR 2835/17 -, Rn. 1-332 2020).

3.6.3 Dystopian Scenario

The dominant dystopian scenario in this order is interference by outsiders, in par-
ticular violations of territoriality and sovereignty (OECD 2013). The prospect of
conflict is perceived as being less problematic than the idea of intrusion (see New-
man 2008). Fear of interference may be based on the delegation of authority to the
supranational or international level but may also include the absence of regulatory
authority. For example, ECOWAS perceives a ‘legal vacuum’ (ECOWAS 2010, 2)
but explicitly establishes the framework’s applicability ‘without prejudice to the
general interest of the State’ (ECOWAS 2010, Art. 2). As the order of sovereignty
is deeply interwoven with public authority, this fear also relates to the global nature
of tech companies (Kalyanpur and Newman 2019a, 449). Concepts such as ‘data
imperialism’ (Svantesson 2015b) or ‘data colonialism’ (Couldry and Mejias 2019)
express this fear most clearly by criticizing the impact on the conduct of sovereign
states.

3.6.4 Valued Objects and Evaluative Principles

In the order of sovereignty, the idea of territory constitutes a significant sym-
bolic resource, often entrenched through clauses such as in Convention 108’s
Territorial Clause (CoE 1981, Art. 24; 2018a, Art. 28). Proponents emphasize the
self-determination of communities and therefore value opt-outs (in all frame-
works) or opt-ins (e.g. EU and US 2016, Art. 27) rather than specific prescription.
They also put a high value on the importance of and the respect for domestic law.
This comprises the explicit inclusion of statements that highlight the evolution of
diverging cultures or traditions (US Department of Commerce 2016, 1). Com-
munity values as distinguishing characteristics are similarly perceived as being
desirable, and this is also expressed by the tendency of policymakers in Brussels to
highlight the significant scope of the GDPR, particularly through the marketplace
principle (GDPR 2016, Art. 3). Data localization practices that restrict data storage
to the host jurisdiction are sometimes referred to as a solution to enhance control
for local authorities. In addition, they emphasize the authority of the law as well
as the role of domestic legislative, judicial, and enforcement authorities.

3.7 The Order of Globalism

The fifth order I identify is the order of globalism. In view of the increasingly
transnational character of contemporary problems, the call for globalism through
the delegation of authority to international institutions has found considerable
resonance in recent decades, which is in line with the proliferation of regional
and international organizations. The order of globalism emphasizes the bene-
fits of global or regional cooperation and common problem-solving, highlighting
‘harmonization’ (AU 2014, Art. 28), ‘greater unity’ (CoE 1981, Preamble), and
‘interoperability’ (OECD 2013, Art. 21). It also highlights the importance of
sharing expertise (APEC 2005, 40–3) and the need for cooperation between super-
visory authorities (e.g. AU 2014, Art. 32). As Fischer-Lescano puts it, ‘The problem
of protecting fundamental rights and democracy in the transnational constellation
requires answers that transcend statist and legal-subjectivist reductionism’ (2016,
159–60).

3.7.1 Philosophical Foundations

Like the order of sovereignty, the order of globalism comprises a theoretically
diverse set of perspectives united by their common perception of the necessity for
action beyond the nation state to regulate transnational interaction (Sassen 1998).
Globalists have criticized the narrow conceptualization of community (Benhabib
2004), which is perceived as being discursively dominated by the state (e.g. Beitz
1979). However, there is considerable variation in the level of institutional
organization that is considered appropriate, ranging from global government to looser
conceptions of international law as a strong structuring mechanism (Zürn and
de Wilde 2016, 290) to the emphasis on ‘stateless’ corporations (Ohmae 1995)
to transnational networks (Castells 2011). The order of globalism does not nec-
essarily express thick cosmopolitan principles of global fairness or justice (e.g.
Nussbaum 1994; O’Neill 2000). Instead, global cooperation and transfer of author-
ity to the global level may also be tied to the pursuit of security (Held 2002, 34)
or economic progress (e.g. T. L. Friedman 2000; Held and McGrew 1998; Strange
1996). The order unites those arguing for universal principles, such as Beitz (1979),
those promoting a specific (institutional) vision of cosmopolitan democracy (e.g.
Archibugi et al. 1998; Linklater 1998), and those promoting efficiency and
effectiveness gains from cooperation (T. L. Friedman 2000). All suggest that the character
of contemporary problems demands global cooperation.

3.7.2 The Order of Globalism in Data Governance

In the order of globalism, data governance is set up as a distinctly inter- and
transnational challenge that can only be addressed by overcoming the narrow
focus on domestic regulation. This may simply criticize self-centred domestic
action (ECOWAS 2010) or even provide support for independent oversight bod-
ies and the transfer of authority to independent organizations (Hijmans 2016a;
ICDPPC 2005, 3). While technological advances and the increasing opportunities
for businesses to collect and use information early demonstrated the inter- and
transnational dimensions of data transfers (Bessette and Haufler 2001, 74), the
specific conception of the global character of data governance emerged in view of
the potential dangers of fragmentation. The unilateral and extraterritorial impetus
of the EU’s 1995 Data Protection Directive provided salience to this order, because
it heightened the stakes of control over data. The possibility of fragmentation (US
Department of Commerce 2000a, 1) and diverging standards emerged as serious
threats in view of multiple and overlapping jurisdictional claims.
Proponents of the order of globalism consider the relevant community as being
located at the global level. However, the extent to which this community is based
on a global citizenry is contested, as is the degree to which this community struc-
ture implies a duty for the provision of, for example, justice (Buchanan 2007, 83;
Pogge 2008; Zürn et al. 2007). Nevertheless, a connection between institutions,
individuals, and states is based on their common membership of the global order
(Risse 2012, 8), which also evokes ideas of the existence of an international society
in which actors ‘conceive themselves to be bound by a common set of rules in their
relations with one another’ (Bull 2002, 13).
There are institutional expressions of the order of globalism, including calls for a
global framework. The Montreux Declaration by the International Conference for
Data Protection Commissioners emphasizes the ‘universal character’ (ICDPPC
2005) of data protection that requires a global framework. In 2009, the NGO-based
Madrid Resolution aimed to ‘define a set of principles and rights guaranteeing
the effective and internationally uniform protection of privacy with regard to the
processing of personal data’ (Madrid Privacy Declaration 2009). Yet while there
are common data governance elements (Greenleaf 2012, 73–5), there is no global
framework for data governance, and its likelihood beyond a superficial agree-
ment is limited. Beyond these more cosmopolitan expressions for the protection
of human rights, increasing cooperation also constitutes a stated goal in economic
cooperation and the area of counterterrorism and law enforcement. For instance,
the EU and US (2016) Umbrella Agreement was specifically designed to set up the
terms of transatlantic cooperation and law enforcement. The OAS emphasizes that
‘threats to our citizens, economies, and essential services, such as electricity net-
works, airports, or water supplies, cannot be addressed by a single government or
combated using a solitary discipline or practice’ (OAS 2004, Appendix A), which
suggests a more efficiency-based, instrumental perspective.

3.7.3 Dystopian Scenario

In the order of globalism, the ideal of global harmonization and convergence is
contrasted with a dystopian scenario of fragmentation and international discord.
Most frameworks highlight problematic ‘disparities in legislation’ (ICDPPC 2005,
Art. 6; OECD 1980) and the ‘creation of unjustified obstacles to the development
of economic and social relations’ (OECD 2013, 12; GDPR 2016, para. 6). Conflicts
of laws are highlighted as a significant problem by both policymakers and pri-
vate companies (Apple et al. 2018), because they require adherence to conflicting
standards. The prioritization of unilateral decision-making processes over coordi-
nation and interoperability is perceived as morally deficient (e.g. Newman 2008).
The OECD framework similarly emphasizes the dangers arising from unilateral
legislation and states that the problem ‘cannot be solved exclusively at the national
level’ (OECD 2013, 7).

3.7.4 Valued Objects and Evaluative Principles

Proponents of the order of globalism put significant value on global
standards and international cooperation mechanisms. APEC specifically commits to
‘[a]dvancing international mechanisms to promote and enforce information pri-
vacy’ (APEC 2005, 4; see also CoE 2018a, para. 6 preamble). This emphasis is
also reflected by the value of multistakeholder cooperation. Tech companies have
similarly argued for ‘global solutions to protect our customers and Internet users
around the world’ (Apple et al. 2018; see also GNI 2017, 2). In 2005, the ICDPPC
specifically appealed to the UN to develop a legally binding instrument to provide
for ‘effective global data protection’ and ‘universal rights’ (ICDPPC 2005, 2–6),
also for non-citizens (e.g. CoE 2018a). Yet in view of the limited likelihood of a
global framework, public actors and private companies often emphasize the value
of interoperability (OECD 2013, 11). Some frameworks explicitly refer to alterna-
tive standards to emphasize their responsiveness (ICDPPC 2005, Art. 15, 16), such
as the OECD Guidelines (APEC 2005, Art. 5), the Council of Europe Convention
(GDPR 2016, para. 105), or the APEC Principles (ASEAN 2016, 1). In addition,
many actors value transnational activism, for example through a ‘Global Privacy
Movement’ (Privacy International 2018; Bennett 2008) or common principles
(Article 19 2017).

3.8 Conclusion

The historical struggles over the right conduct of data governance have con-
tributed to the formation of a field that is shaped by institutional legacies and
normative divisions. This chapter has illustrated the emerging field of data gov-
ernance through the evolution of endogenous logics that are centred on the
assumption that data governance is meaningful in the pursuit of the common
good. The intergovernmental roots of the field, most notably its genesis around
the extraterritorial impact of the EU’s 1995 Data Protection Directive, have shaped
institutional responses to jurisdictional conflicts. I argue that the fragmented
nature of the field of data governance has not only prevented the rise of a
comprehensive global framework but also entrenched reliance on extraterritorial practices
to shape policy, which increases the potential for conflict and, at least in its cur-
rent form, tends to prevent an open and transparent debate. Through institutional
solutions to normative differences, data governance has thus far failed to recon-
cile the plurality of distinct normative visions to form a common purpose for the
field. While one might question the normative desirability of such convergence
(Krisch 2010), this helps to understand the continued re-emergence of jurisdictional
conflicts.
The chapter has disentangled the plurality of purposes through a reconstruc-
tion of distinct value orders that define the common good of data governance (see
Table 3.1). These value orders have emerged as salient at different points in time
and have found expression in discourse and institutions to a varying extent. As
should be visible from the discussion, these orders do not generate clear-cut calls
for action but provide broader justificatory principles or grammars that may be
employed in the justification of dissimilar practices. Value orders are contested, on
the one hand, externally, by competing orders that offer opposing evaluative reper-
toires, for example, whether data governance should pursue fairness or production
as a common good, and on the other hand, internally, by competing hierarchiza-
tions within the order, for example, whether the common good of fairness is
realized through the prioritization of privacy or freedom of speech, respectively.
Hence, a call for opposing actions may be perceived as morally worthy in refer-
ence to different orders or the same order. In addition, and as demonstrated in
Chapter 2, it is unlikely that actors limit their justificatory practices to one order.
Instead, to appeal to a wider audience, they may draw on multiple orders. In the
empirical Chapters 4–8 and building on the analytical framework of field logics
and value orders I developed here, I explore how the plurality of notions of soci-
etal goals has contributed to the continued re-emergence of jurisdictional conflicts
and the superficial quality of institutional resolution processes in the field.
4
Safe Harbour and Its Discontents
The Empowerment of Individuals in the EU

In July 2020, the Court of Justice of the European Union (CJEU) made interna-
tional headlines when it invalidated the so-called Privacy Shield, an agreement
for commercial transatlantic data transfers. The CJEU argued that data transfers
under the agreement did not satisfy the requirements of European fundamental
rights protection (Data Protection Commissioner 2020, para. 185). Observers were
concerned about the negative consequences for the $7.1 trillion transatlantic eco-
nomic relationship but not entirely surprised by the Court’s reasoning: in 2015,
the CJEU had similarly caused diplomatic uproar by invalidating Safe Harbour,
the predecessor of Privacy Shield. Both instances followed complaints regarding
the lack of data protection by Maximilian Schrems, an EU private citizen, in light
of the revelations of mass surveillance by intelligence agencies. Shortly after the
EU’s General Data Protection Regulation (GDPR) had sparked controversy, the
high-stakes transatlantic relationship was again in disarray due to data politics.
What happened?
The roots of this conflict reach back much further and started with the EU’s
first adoption of comprehensive data protection legislation in 1995. Through
an analysis of global commercial data flows as the earliest site of transatlantic
data governance conflicts, this chapter investigates a mismatch between the
strongly normative jurisdictional claims and their largely superficial resolution.
The chapter adds to literature that has discussed the conflicts between the EU and
the US in the context of the adoption of the 1995 Directive (Newman 2008) and
the emergence of Safe Harbour (Farrell 2003; Farrell and Newman 2019, ch. 5;
Kobrin 2004; Long and Quek 2002) as instances of regulatory divide and con-
vergence. In addition, a broader strand of literature has focused on the societal
and political implications of the NSA scandal (Bauman et al. 2014; Gros et al.
2017; Lyon 2015) and described the multifaceted reactions of the EU regard-
ing commercial data protection and surveillance (Schneider 2017; Schünemann
and Windwehr 2020). In contrast, the invalidation of the Safe Harbour agree-
ment has found only limited attention outside the legal literature (Kuner 2017;
Ni Loideain 2016; Weiss and Archick 2016; for an exception, see Farrell and
Newman 2018, 2019, ch. 5). Due to its recent nature, even fewer scholars have
investigated the invalidation of Privacy Shield yet (for exceptions, see Churches
and Zalnieriute 2020; Fahey and Terpan 2021). My analysis will, therefore, focus

Data Governance. Anke Sophia Obendiek, Oxford University Press. © Anke Sophia Obendiek (2023). DOI: 10.1093/oso/9780192870193.003.0004
primarily on the post-Snowden developments of Safe Harbour as a still neglected
side of the transatlantic regulatory dispute. In the analysis of the conflict resolution
process, I outline three arguments: first, I argue that conflicts about data gover-
nance cannot be narrowly construed as conflicts about privacy versus security,
even though the Snowden revelations gave salience to this binary (Kalyanpur and
Newman 2019b). The case particularly shows the moral relevance of economic
justifications as well as the centrality of sovereignty. Second, this chapter illus-
trates how the persistence of normative divisions provides windows of opportunity
for disruption. Although the Safe Harbour agreement constituted one of the most
foundational institutions in the field of data governance, the action of an indi-
vidual citizen led to its invalidation. I argue that Schrems’s successful disruption
was possible mainly due to the lack of public justifications. The unwillingness of
the US administration and the European Commission to engage in targeted jus-
tificatory practices enabled privacy-leaning actors to impose criteria of the order
of fairness on the case. This lastingly established the dystopian scenario of mass
surveillance rather than the collection of crucial security-relevant information.
Third, this point has implications on a conceptual level. It shows how a focus
on actor attributes fundamentally underestimates the potential of disruption by
individual and more peripheral participants in the field. Challenging actors may
successfully create an alternative vision of the field based on their ‘“oppositional”
perspective’ (Fligstein and McAdam 2011, 4). Some (e.g. Mérand 2010) have high-
lighted the important roles of individuals and small groups of experts for policy
developments. Yet the evolution of the governance of commercial data transfers
highlights that significant disruptive potential may be situated with such challeng-
ing actors, even if they do not possess significant amounts of political or diplomatic
capital.
The chapter will proceed as follows: I first outline the background of Safe Har-
bour and the origin of the jurisdictional conflict. Second, I investigate the Snowden
revelations as a disruptive event. Third, I conceptualize the aftermath of the first
Schrems case as a re-emergence of the institutional conflict resolution strategy
from the 1990s and point to its renewed failures by illustrating the emergence and
contestations of Privacy Shield. Fourth, I discuss these specific conflicts in the con-
text of the broader development of the field, also in relation to the introduction of
the EU General Data Protection Regulation (GDPR) in 2016.

4.1 The Emergence and Continuation of Safe Harbour

Before we can understand the recent invalidations of transatlantic data sharing
agreements, it is essential to take a step back and outline the main features of the
data transfer regime. As illustrated in Chapter 3, the EU is often portrayed as a
vanguard on the subject of privacy and data protection, with many states using
European legislation as a blueprint for domestic or regional regulation (Greenleaf 2012; Newman 2008). It should be noted that the EU's strong position does
not originate purely from its normative appeal but also stems from its significant
market power (Drezner 2007), which is essential for raising legislative standards
via ‘the Brussels effect’ (Bradford 2020), as outlined in Chapter 1. Yet the Brus-
sels effect is further conditional on stringent rules issued by a regulator with high
capacities. The EU’s Data Protection Directive (1995) represents one of the most
significant historical disruptions in the institutional and normative structure of
the field of data governance. When the EU in 1995 adopted the first enforceable
regional data protection legislation, this provoked debates not only among data
protection professionals but also in transatlantic relations. Due to take effect in
1998, the directive’s restrictions on data transfers to third countries constituted
a potential threat for US companies engaged in the growing e-commerce sector,
which prompted a jurisdictional conflict about the right conduct and scope of
data governance. The US market-based approach was suddenly in conflict with
the rights-based EU approach (Reidenberg 2000). The Data Protection Directive
(1995) had an extraterritorial effect to the extent that it imposed strict condi-
tions on data transfers to third countries, which required the existence of legal
safeguards. More specifically, these transfers presupposed an ‘adequate’ (Data Pro-
tection Directive 1995, Art. 25, 26) standard of data protection in the destination
country, to be determined by the European Commission. Adequacy decisions
require a substantial and extensive assessment, and so far only fourteen juris-
dictions have been recognized, including Argentina, Korea, Japan, New Zealand,
Switzerland, the UK, and—with limitations—Canada (EC 2022). Until their inval-
idation, EU–US agreements were also recognized under the adequacy framework.
Through these legislative demands upon foreign jurisdictions, the EU articulated a
jurisdictional claim over data.
The US government seemed largely unwilling to abandon the existing sector-
specific regulations, and US businesses were initially openly hostile towards the
directive (Farrell 2005, 124). In a policy brief, Swire and Litan accordingly asked,
‘Is there any way to avoid the coming showdown over privacy policy?’ (1998b).
In 2000, when the adoption of the Safe Harbour Privacy Principles allowed the
continuation of transatlantic commercial data transfers by US companies and
organizations, it seemed as if the EU and the US had established a unique arrange-
ment to successfully bridge regulatory differences (Farrell and Newman 2019,
134). The debate was largely conceived as a challenge to trade. Jurisdictional over-
lap was evaluated through standards rooted in the order of production, that is,
economic impact, rather than the order of fairness, that is, human rights. The Safe
Harbour agreement appeared to be a potential model agreement for transatlantic
regulatory disputes (Farrell 2005). The principles implemented a self-regulatory
scheme through which US firms could voluntarily subscribe to adequate data pro-
tection standards, which would authorize them to carry out transatlantic data
transfers. The US Federal Trade Commission (FTC) and the Department of Trans-
portation (DOT) were tasked with monitoring compliance (US Department of
Commerce 2000b). In sum, the agreement constituted an institutional solution to
bridge normative differences but had the potential to foster convergence in the
long term.
Soon after its implementation, however, significant doubts concerning the effec-
tiveness of the framework emerged (Dhont and Asinari 2004; EC 2002; Kobrin
2004, 122). A report conducted for the European Commission openly questioned
the FTC’s interest in enforcing and implementing Safe Harbour (Dhont and Asi-
nari 2004), and a 2008 study found that of 1,597 organizations in the Safe Harbour
list, 206 were false entries (Connolly 2008), which rose to 427 in 2013 (Connolly
2013). In response, FTC staff pointed out that ‘While the Framework contemplated
that EU data protection and other authorities would provide us with such referrals,
we received none for the first ten years of the program, and only a few over the past
three years’ (FTC 2013, 3). In view of ten total enforcement cases between 2000
and 2013, the promoted ‘vigilant Safe Harbor enforcement’ (FTC 2013, 3) seems
somewhat overstated. Yet the fact that the European Commission never articulated
any profound challenges to the framework highlights that, as one FTC employee
put it, Safe Harbour ‘was just not a priority for most people’ (FTC official, pers.
comm. 2019).
In summary, the Safe Harbour framework represented the solution to a juris-
dictional conflict that emerged due to regulatory and normative divisions between
market-focused US and rights-based EU regulation. While the framework facili-
tated data exchanges and contributed to the entrenchment of economic principles
in line with the order of production, the rights-based approach in line with the
order of fairness was undermined by a lack of enforcement.

4.2 After Snowden: A Changing Field—An Emerging Conflict

In the early 2000s, the expansion of US data access for security purposes con-
tributed to major jurisdictional conflicts in transatlantic data governance (see
Chapters 5–6). In contrast, the Safe Harbour framework was largely unaffected
by these conflicts. In 2012, the European Commissioner for Justice, Fundamen-
tal Rights, and Citizenship, Viviane Reding, publicly and unambiguously stated
that ‘Safe Harbour will stay’ (Reding, cited in Kerry 2012). Not even one year
later, the Snowden revelations challenged this position. The Snowden revela-
tions can arguably be seen as a key disruptive event, inter alia, concerning
the evolution of data protection standards in the EU (Laurer and Seidl 2021;
Schünemann and Windwehr 2020). Existing institutional structures between the
US and the EU were re-evaluated, such as the agreement on the sharing of finan-
cial data (see Chapter 6). Yet the overturn of the Safe Harbour agreement by
the CJEU in 2015 constitutes the most direct and prominent effect. To illustrate
the re-emergence of jurisdictional conflict between the EU and the US in light
of these revelations, I briefly outline the responses towards the Snowden rev-
elations in the field before zooming in on the challenges to the Safe Harbour
framework.

4.2.1 The Impact of the Snowden Revelations

In 2013, the revelations of widespread surveillance by intelligence services of the
US and its allies resulted in significant global tensions. Through access to classified
information, the former intelligence contractor Edward Snowden had obtained
an estimated 1.7 million pages of material in order to expose widespread global
surveillance practices. The surveillance practices can arguably be conceived as an
implicit jurisdictional claim by the US and the Five Eyes network (which consists
of Australia, Canada, New Zealand, the UK, and the US) of legitimate control over
global data flows. The extent of these secret practices contributed to a ‘salience
shock’ (Kalyanpur and Newman 2019b, 7). It provoked a global public discussion
on privacy protection on the internet, not least because the NSA had apparently
spied on high-ranking politicians of friendly states, such as Angela Merkel or
Dilma Rousseff (Smale 2013). Even though they were in line with the expectations
of surveillance experts (Lyon 2015), the broad nature of surveillance programmes
prompted widespread public discussions. The debates that followed the disclo-
sure included questions on the right balance between privacy and security but
also expressed concerns about the relationship between states, citizens, and pri-
vate companies (Bauman et al. 2014). The UN General Assembly (2013) adopted
Resolution 68/167 ‘The Right to Privacy in the Digital Age’, calling on states to
respect the right to privacy in digital communications. In 2015, the UNGA further
appointed the first Special Rapporteur on the right to privacy, Professor Joseph
Cannataci (UNGA 2013, para. 5). In particular, the so-called PRISM programme
caused concerns about mass surveillance. The programme monitored the internet
traffic of major private tech companies and facilitated the upstream collection of
data through interceptions at the internet backbone, that is, the nodes of major
data routes such as fibre-optic cables (Greenwald 2015).
The extraterritorial nature of the surveillance measures developed into a signif-
icant source of conflict in EU–US relations. A report by the EU–US working group
controversially discussed the legal basis of the relevant surveillance programmes
in the US. It highlighted the problematic consequences of surveillance for a com-
munity of EU citizens, such as the minimal protection of non-US citizens under
the framework and the broad derogations for counterterrorism measures (EU–
US Working Group 2013, 6–7). The report also problematized the ambiguity of
evaluative practices in the order of security. While the EU requested additional
information and transparency on surveillance measures and their justification, the
US refused on the basis of potential implications for national security:

The EU side asked for further specification of what is covered under ‘foreign intel-
ligence information’, within the meaning of FISA 50, U.S.C. §1801(e), such as
references to legal authorities or internal guidelines substantiating the scope of
foreign intelligence information and any limitations on its interpretation, but the
US explained that they could not provide this as to do so would reveal specific
operational aspects of intelligence collection programmes.¹
(EU–US Working Group 2013, 3)

The report juxtaposes the overt exercise of power with the more consistent, rule-
following approach of the EU in its claim to jurisdiction based on sovereignty.
The EU stressed the need for legal references and internal guidelines as eval-
uative objects in line with the order of sovereignty. The European Parliament
similarly tried to assert the applicability of EU law to ensure that ‘the legal force of
the EU Treaties is not undermined by a dismissive acceptance of extraterritorial
effects of third countries’ standards or actions’ (EP 2014a, para. K). Actors in the
EU attempted to prove their morally superior status by referencing existing legal
standards and problematizing the absence of such standards in the US. The EU
used the absence of such objects as evidence that the US failed to prove worth
within this order. In his statement before the European Parliament, Snowden
problematized the same ambiguity in wording associated with the order of secu-
rity, highlighting the lack of bindingness of US commitments (Snowden 2014).
The European Parliament initiated an extensive investigation of the Snowden rev-
elations, specifically through a series of hearings by the European Parliament’s
Committee on Civil Liberties, Justice, and Home Affairs (LIBE Committee) and
reports. According to Gros et al. (2017), these hearings transformed the files dis-
closed by Snowden and reports into compelling evidence. Some experts tried to
question the validity of the allegations and shift the debate away from more general
societal or political questions (Gros et al. 2017, 79). Yet, more importantly, a sig-
nificant number of invited representatives from the US and intelligence services
of member states declined the invitation to participate in these public hearings
(LIBE 2014a, 125–6), deciding not to follow the imperative of justification. Due
to the lack of explicit justifications attempting to outline legitimate control over
data, the LIBE committee and the participants who highlighted the dystopian sce-
narios of mass surveillance in reference to the order of fairness shaped the criteria
of evaluation (Gros et al. 2017). The report ‘strongly rejects the notion that these

¹ According to the report, '[f]oreign intelligence information' includes not only specific categories of
information such as international terrorism but also ‘information relating to the conduct of the foreign
affairs of the US’, which may also include intelligence on government agencies (EU–US Working Group
2013, 3–4).
issues are purely a matter of national security and therefore the sole competence
of Member States’ (LIBE 2014b, para. 16). This not only denied the dominance of
the order of security but also shifted the reference community away from sovereign
security actors.
The European Parliament also explicitly highlighted the dystopian scenario
of increasing interlinkages between public and private surveillance and the eco-
nomic incentives of the process. For example, the European Parliament ‘[d]eplores
the fact that many mass and large-scale intelligence programmes seem to be
also driven by the economic interests of the companies that develop and run
those programmes’ (EP 2015, 39). Juxtaposed with a reference community of
right-bearing individuals, this problematizes the perpetuation of potentially illegal
practices removed from parliamentary scrutiny. Nonetheless, the Snowden revela-
tions created significant tensions not only between public actors but also between
public and private actors. As the revelations increasingly disclosed the interlink-
ages between public and private surveillance, many companies tried to distance
themselves from public surveillance. For instance, in 2015, major companies such
as Apple, Google, Facebook, and Microsoft requested reforms in government
surveillance (Reform Government Surveillance 2015).
In turn, the continued denial demonstrated that US actors neglected the appli-
cability of evaluative criteria of the order of sovereignty and fairness and relied on
ambiguous references to the higher common principle of security. The US publicly
justified the revelations only in a very general way, emphasizing, for example, the
need for balance. For instance, President Obama argued, ‘But I think it’s impor-
tant to recognize that you can’t have 100% security and also then have 100%
privacy and zero inconvenience. We’re going to have to make some choices as a
society’ (Obama 2013). This statement makes a connection between security and
the need for sacrifices, while establishing privacy as a convenience. Therefore, sac-
rifices to protect the greater good of society through security rather than binding
or transparent standards express worthiness in the order of security. While the
US government had been slow to address the allegations, in January 2014, Pres-
ident Obama offered justifications in the form of a Presidential Policy Directive
(PPD-28) (US White House 2014) and a more extensive speech that announced
reform measures and provided a general sense of direction for US intelligence in
the future. The policy directive lays out the framework and principles of signals
intelligence by the US and specifies limited protections for non-US citizens, which
represents a significant and ‘unprecedented’ (Farrell and Newman 2019, 144) step.
Despite a general willingness to engage with European data protection princi-
ples (Suda 2017), both the speech and the directive demonstrated the intention
to maintain surveillance practices:

The United States must preserve and continue to develop a robust and technologically
advanced signals intelligence capability to protect our security and that
of our partners and allies. Our signals intelligence capabilities must also be agile
enough to enable us to focus on fleeting opportunities or emerging crises and to
address not only the issues of today, but also the issues of tomorrow, which we
may not be able to foresee.
(US White House 2014)

This statement demonstrates a strong commitment to flexible solutions that
address security pre-emptively and establishes signals intelligence as a necessity.
In the speech, Obama further emphasized that ‘Throughout American history,
intelligence has helped secure our country and our freedoms’ (2014), thereby
attempting to alleviate potential tension between freedom and surveillance and
instead stressing positive interlinkages. By describing the hard-working intelli-
gence officers as ‘patriots’ (2014), Obama also established a reference community
of US Americans who believe in the realization of freedom through security, thus
creating a compromise between the orders of security and fairness.
In sum, with regard to the broader revelations of mass surveillance, the EU
institutions tried to justify their claim over data by emphasizing sovereign rights
and legal principles. Through various hearings, actors transformed the corpus of
files published by a whistle-blower into valid evidence that disclosed the implicit
nature of jurisdictional claims of the US and attempted to compel justifications
for these claims. Yet they failed to create a unitary position on these violations.
The approach of the US administration and a wide range of intelligence agencies
speaks to a tendency in post-9/11 security practices to emphasize their situatedness
beyond the law in the face of exceptionalism.² Jurisdictional claims that invoke
justifications of the order of security therefore do not seem to require grounding
in concrete legal principles. Instead, proving worth appears self-referential:
the invocation of security already implies the proof of its worth.

4.2.2 The Emergence of Jurisdictional Conflict over Safe Harbour

In comparison with the broader reactions to the Snowden revelations in the EU,
the discourse on the Safe Harbour agreement seemed initially modest in its cri-
tique, particularly from the European Commission. Even Commissioner Reding,
who articulated strong criticism of bulk surveillance practices, declined to partic-
ipate in the LIBE hearing that focused on Safe Harbour. This prompted the pre-
siding LIBE Chair, Juan Fernando López Aguilar, to emphasize ‘disappointment
that the Commission could not attend this hearing’ (LIBE 2013, pts. 19:12–19:13).

² For a more detailed discussion of the justification of exceptionalism or emergency, see Huysmans
(2008); for a more conceptual discussion, see Amoore (2013) or Kreuder-Sonnen (2019) for its effects
on global politics.
SAFE HARBOUR AND ITS DISCONTENTS 85

Right after the disclosures, the European Commission decided to conduct a Safe
Harbour framework review. Notably, this was only the third review since the
framework’s adoption in 2000.
The review focused primarily on the economic implications for Safe Harbour,
and only the last of four points referred to ‘the information recently released on
US surveillance programmes’ (EC 2013d, 3). It highlighted the framework’s eco-
nomic relevance, thus establishing the order of production as the main evaluative
principle. The European Commission specified Safe Harbour as ‘one of the con-
duits through which access is given to US intelligence authorities’ (EC 2013d,
16). While avoiding explicit references to mass surveillance, it argued for limit-
ing ‘the national security exception foreseen by the Safe Harbour Decision’ (EC
2013d, 19) to necessary and proportionate circumstances. While the Commis-
sion acknowledged that Safe Harbour facilitated access to commercial data, it did
not explicitly problematize the actions of intelligence agencies. The main point
of contention seemed to be broken trust. Throughout the report, the Commis-
sion underlined broken trust as an impediment to future cooperation and barely
referred to potential human rights violations (Rettman 2013). The report even
reconceptualized human rights violations according to criteria consistent with the
order of production:

If citizens are concerned about the large-scale processing of their personal data
by private companies or by the surveillance of their data by intelligence agencies
when using Internet services, this may affect their trust in the digital economy,
with potential negative consequences on growth.
(EC 2013a, 3)

This element was also emphasized by former officials and think tanks (e.g.
Kerry 2014; Singer and Wallace 2014). The idea of broken trust further established
a dystopian scenario in the order of globalism, particularly concerning the
‘credibility’ (e.g. EC 2013d, 7–11) of cooperation. The Commission rejected the
idea of a new framework. Instead, it envisioned ‘strengthening’ the Safe Harbour
framework, emphasizing how a potential institutional change could be ‘adversely
affecting the interests of member companies’ (EC 2013a, 7). Reports highlighted
the risk of fragmentation (EC 2013d, 5) in light of diverging enforcement practices
by data protection authorities. While Věra Jourová, the then European Commis-
sioner for Justice, emphasized the protection of data as the first priority, she called
the digital economy ‘the backbone of our economy’ (EC 2015a), establishing its
necessity. The European Commission seemed confident that a reform of Safe
Harbour was the right step and emphasized that ‘it is the competence of the Com-
mission’ (EC 2013a, 4) to adapt any decision on the adequacy of data protection,
thereby establishing its jurisdictional claim.
In contrast, the European Parliament, drawing on principles of the order of
fairness, called for an immediate suspension of the Commission’s 2000 adequacy

decision (EP 2014a, 40). Members of the European Parliament (MEPs) questioned
the legitimacy of the jurisdictional claims by the Commission and the US and
pointed to the persistent lack of oversight (EP 2014a, AN). In its written
observations to the subsequent court case, the Parliament asked ‘who will guard
the guardians?’ (EP 2014b, para. 80) to emphasize the need for independent supervisory
authorities (EP 2014b, paras. 74–80) as proof of worth within the order of
fairness.
The Commission’s reluctance to address the significant shortcomings of the
framework may have had several reasons, including a status quo bias that favours
legal consistency (Hartlapp et al. 2014, 302) but also the potential for a significant
economic fallout, given the dependency of European businesses on American-run
digital services. The strong focus on principles of the order of production is also
in line with assumptions in the literature that EU policymaking favours deregula-
tory and market-friendly policies (e.g. Schmidt and Thatcher 2014). Even though
recent approaches have argued that the Commission’s regulatory ambition pits it
against business interests (Dür et al. 2019), the statements by the European Com-
mission seem to correspond to the prioritization of economic progress. It is likely
that there was additional pressure from member states, such as Germany, that
supported the reform of Safe Harbour. This pressure notably extended beyond
policy decisions and more generally concerned the position of the Commission
in the field: in a leaked document, the Bundesministerium des Innern und für
Heimat (BMI, the German Ministry of the Interior) criticized the Commission’s
decision to comment on any action in relation to national intelligence agencies.
The BMI instead emphasized that any reports should be published in the name
of the member states (BMI 2013, 14 [71]). While this illustrates to what extent
states aim to shield their ‘core state powers’ (Genschel and Jachtenfuchs 2018),
the emphasis on the sovereign rights of member states seemed to be largely instru-
mental. Emphasizing member state competences not only limited the evaluation
of US intelligence but also insulated the surveillance activities of national intel-
ligence agencies from scrutiny by EU institutions (Bigo et al. 2013). Intelligence
activities are to a significant extent based on collaboration and security-based data
sharing between domestic intelligence services. This cooperation includes the Five
Eyes network but also involves the French, German, or Swedish intelligence ser-
vices (Bauman et al. 2014; Fischer-Lescano 2016). This indicates that justifications
referring to the order of sovereignty were mainly strategically deployed to retain
sufficient room for manoeuvre.

4.2.3 A Private Complaint Changing the Game

The general reluctance to overthrow the Safe Harbour framework shaped initial
talks about reform. The European Parliament tried to challenge the existing system

but did not find sufficient support. This makes the invalidation of the framework
in 2015 even more surprising. An Austrian citizen was able to catalyse a process
that translated a justificatory challenge into institutional disruption. He articulated
a challenge to the implicit jurisdictional claim of the US that attempted not only to
undermine its legitimacy but also to change practices in the field, thereby express-
ing an alternative vision of the field. How exactly did this unfold? Shortly after the
Snowden revelations, in June 2013, Maximilian Schrems, at the time a law student
in Austria, filed a complaint with the Irish Data Protection Commission (DPC).
He specified that the PRISM programme granted ‘mass access’ (Schrems 2013,
7) to commercial data, including data held by the US private company Facebook,
where he was registered as a user. Through its membership of the Safe Harbour
self-certification scheme, Facebook had guaranteed to maintain an adequate level
of data protection. Schrems requested an investigation regarding the existing level
of protection, asking the data protection authority to stop data transfers if the Safe
Harbour standards had been undermined.
Here, it is important to note the central position of Ireland in the European
digital space. Ireland hosts the European headquarters of several major tech
companies, including Google, Facebook, and Microsoft. The corporate-friendly
approach to regulation in Ireland has been criticized as relatively ‘cosy’ (Cadwal-
ladr and Campbell 2019) due to low taxes and lax enforcement. In contrast to the
vast material and ideational resources of tech companies, the Irish DPC occupies
an at best marginal position in the field. Until parts of its operations moved to
Dublin in 2015, the DPC had gained prominence mainly due to its lack of fund-
ing and the fact that it shared space with a local supermarket outside Portarlington,
a small town in the Irish countryside (Scally and Bittner 2013).
Whether for lack of resources or political will, the Irish DPC dismissed the
complaint and argued that it had an ‘obligation to accept “adequacy” decisions’
(Irish DPC 2013, 1) by the Commission. The justification referred to Article
25(6) of the 1995 Data Protection Directive (Data Protection Directive 1995),
which authorizes only the European Commission to issue such adequacy deci-
sions (Schrems 2015). The DPC decided not to investigate the complaint, which
started a heated exchange of letters and eventually led to a court case. The DPC
spoke out against a formal obligation to investigate, while Schrems argued that the
refusal to investigate could legally only be justified if the complaint was considered
‘frivolous and vexatious’ (Irish DPC 2013).³ In correspondence with the High
Court, the DPC pointed out that this included any claims unsubstantiated by law
(Schrems 2014, para. viii).
In contrast to the DPC, the Irish High Court strongly argued against the
legitimacy and legality of US surveillance practices. The Court largely took the
factual accuracy of the surveillance measures as given, arguing that ‘denials from

³ According to 10(1)(b)(i) of the 1988 Irish Data Protection Act.



official sources, such as they have been, were feeble and largely formulaic, often
couched in carefully crafted and suitably ambiguous language designed to avoid
giving diplomatic offence’ (Schrems 2014, para. iii). The Court also emphasized the
necessity of considering changes since the original adequacy decision, including
a new security environment but also ‘disclosures regarding mass and undiffer-
entiated surveillance of personal data by the US security authorities’ (Schrems
2014, para. xii) and the elevated status of data protection in the EU as a fun-
damental right since the 2009 Lisbon Treaty. Due to the potential implications
for the EU, the Irish High Court decided to refer the case to the CJEU. This
case was exceptional to the extent that, on the basis of the Treaty on the Func-
tioning of the European Union (TFEU), the CJEU does not have jurisdiction
over cases that involve the internal security of member states or surveillance conducted
by national authorities for these purposes (2016, Art. 4, para. 2). However,
due to significant reliance on cooperation with private service providers, data
collection and transfer practices could be considered commercial. This made
them subject to EU law and thus the jurisdiction of the CJEU. This ‘issue link-
age’ (Farrell and Newman 2018) provided a possibility for Schrems to do what
the institutions had failed to achieve: contest the Safe Harbour framework as the
basis for the collection of data for both surveillance and commercial purposes
(Schrems 2013).

4.2.4 The Court Case Unfolds

The referral to the CJEU marked a shift from the political to the legal arena.
The extent to which actors recognized this jurisdictional challenge as a
serious threat, however, varied significantly. In contrast to the more recent pro-
ceedings against Privacy Shield, the US authorities did not engage in public
justifications in the court cases. While Safe Harbour permitted derogations for rea-
sons of national security, even statements outside the courtroom failed to justify
surveillance in reference to such derogations. The US Presidential Policy Directive
generally emphasized the relevance of commercial data for surveillance purposes,
stating: ‘The evolution of technology has created a world where communications
important to our national security and the communications all of us make as
part of our daily lives are transmitted through the same channels’ (US White
House 2014). US actors simply did not address the Safe Harbour framework, thus
providing no justifications for its continuation.
EU member states and institutions submitted opposing statements. While the
European Commission and the UK argued for the maintenance and reform of
Safe Harbour, other member states such as Belgium, Austria, or Poland argued
against the acceptance of the adequacy decision. While the US might have antic-
ipated a more comprehensive consideration of the exception for ‘necessary’ (US

Department of Commerce 2000b, 10) national security derogations enshrined in
the Safe Harbour agreement, none of the actors was willing to explicitly justify US
bulk surveillance practices. According to an interviewee from the European Commission
(EC official, pers. comm. 2019), this was not considered feasible from a
political perspective. Particularly in light of public justifications that attempted to
strengthen privacy, for example in the context of the UNGA resolution, and due
to high issue salience, the attempt to justify surveillance measures in reference to
counterterrorism and law enforcement efforts was perceived as problematic.
This is further evident, for example, from leaked discussions concerning the draft
of a joint EU–US statement. A German ministry official from the BMI stated that
the department ‘considers the reference to law enforcement activities in relation
to the strengthening of individual privacy as inappropriate’ (BMI 2014, 396–7).⁴
The European Commission’s perceived ‘space of possibles’ (Bourdieu 1996,
234–9) was immensely reduced at the outset of the court case. One interviewee
from the European Commission said that it ‘felt like we saw the train coming
towards us and could do nothing to avoid the crash’ (EC official, pers. comm.
2019). Changing its position in light of a legal challenge and officially revoking
its adequacy decision would have seemed inconsistent. Instead, the Commission
tried to dismiss Schrems’s complaint on the basis of the lack of imminent risk to
Schrems personally:

due to their generality and abstractness, Mr Schrems’ concerns about the surveil-
lance programmes of the US national security agencies are exactly the same as
those which have led the Commission to start the review of the Safe Harbour
Decision.
(EC 2014, 19)

The Commission emphasized its position in the field as the competent authority
and decided against significantly engaging with substantive issues. It made claims
on behalf of the EU as a constituent community, emphasizing its position vis-à-vis
the member states and the US. The submission suggested that even if the case con-
cerned national security exemptions, these exemptions benefited the US rather
than the member states, thereby re-emphasizing its competence (EC 2014, 15).
Even though the Commission recognized the violation of Safe Harbour principles,
both in their written observation (EC 2014) and in the oral hearing (Gibbs 2015a),
it concluded that the Irish DPC was bound by the existing adequacy decision. In an
interview with The Guardian newspaper, Schrems described his surprise at what
he perceived as a weakness of the Commission’s justifications:

⁴ My translation; the original version reads: ‘Darüber hinaus halten wir den Verweis auf die Tätigkeit
der Strafverfolgungsbehörden im Zusammenhang mit der Stärkung der Privatsphäre des Einzelnen
verfehlt’ (BMI 2014, 396–7).

I even had the feeling that the European Commission was not too interested in
winning the case. Their representation was really bad. […] I think they wanted
to get rid of it anyway. And now they can say, ‘The court is taking a decision, we
can’t be blamed for it anymore.’
(cited in Powles 2015c)

While the European Commission’s justificatory strategy seemed to be limited
to general claims of competence rather than normative claims, the company
in question, Facebook, tried to avoid the need for public justifications alto-
gether. The company emphasized that ‘this case is not about Facebook’ (cited
in Price 2015) and deferred responsibility to the public actors, arguing that ‘it
is imperative that EU and US governments ensure that they continue to pro-
vide reliable methods for lawful data transfers and resolve any issues relating
to national security’ (Lee 2015). This stands in marked contrast to the more
proactive role that companies have embraced more recently, for example in the
disputes about law enforcement access to electronic evidence (Elligsen 2016; see also
Chapter 7).
The absence of normative arguments created significant space for actors
such as the European Parliament and Schrems, who actively attempted to shape
evaluative criteria with regard to their moral worthiness. For example, by estab-
lishing the ‘indiscriminate practices of mass surveillance’ (EP 2014a, K), they
pointed to a dystopian scenario in the order of fairness. In its written observations
to the case, the European Parliament stated that:

‘Mass’ and ‘indiscriminate’ interference with fundamental rights, encompassing
all innocent persons without limitation, is rather the antithesis of the fundamen-
tal principles enshrined in Articles 7, 8 and 52 of the Charter, not to mention
Article 8 of the Convention for the Protection of Human Rights and Fundamental
Freedoms.
(EP 2014b, para. 32)

This statement draws on binding human rights conventions to prove worth with
regard to the EU as a liberal community of individuals. It also emphasizes the
responsibility of strong independent data protection authorities as ‘“guardians”’
(EP 2014b, para. 80) of this community.
I argue that, because the European Commission and the US followed the
‘imperative of justification’ (Boltanski and Thévenot 2006, 346) only to a very
limited extent, the issue came to be evaluated through criteria embedded
by those actors actively engaged in meaning-making processes. As I outline below
and as also highlighted by the Irish High Court, both courts accepted the fac-
tual record of surveillance as presented by the only parties willing to comment
(see also Farrell and Newman 2019, 147). The meaning-making processes were
significantly shaped by criteria and reference communities relating to the order of

fairness rather than principles embedded in the order of security. There was no
account that specifically linked the Safe Harbour framework to the necessity for
surveillance measures.

4.2.5 The 2015 CJEU Judgement

In September 2015, the opinion of the CJEU’s Advocate General (AG) Yves Bot
received widespread attention. He emphasized that, since ‘the law
and practice of the United States allow the large-scale collection of the personal
data of citizens of the EU which is transferred, without those citizens benefiting
from effective judicial protection’ (Bot 2015, para. 158), the European Commission
should have invalidated the adequacy decision. In the landmark ruling that
followed shortly after and was widely discussed both academically (Kuner 2017;
Ni Loideain 2016) and publicly (Gibbs 2015b; Scott 2015), the Court similarly
emphasized the inadequacy of protection. More specifically, it held that the
broad derogations in the adequacy decision ‘enabled interference founded on
national security and public interest requirements or on domestic legislation of
the United States’ (Schrems 2015, paras. 87–9) with the fundamental rights of
persons. On the basis of the cooperation requirements for companies specified
persons. On the basis of the cooperation requirements for companies specified
by law, the CJEU considered that the Safe Harbour principles were systemati-
cally undermined. In the judgment, the Court frequently referred to the Digital
Rights Ireland v. Commission (2014) case, which affirmed the importance of data
protection in the digital context as well as to the TFEU and the EU Charter of
Fundamental Rights. This highlights again the importance of case law and human
rights conventions as significant objects of value. The CJEU also problematized
that the European Commission in its initial adequacy decision had never stated
‘that the United States in fact “ensures” an adequate level of protection by reason
of its domestic law or its international commitments’ (Schrems 2015, para. 98).
Therefore, the Court considered the adequacy decision to be invalid, which also
invalidated the framework as a whole.
The judges further specified the understanding of adequate as ‘essentially equiv-
alent’ (Schrems 2015, para. 73) to EU protection, which reduced the Commission’s
discretion in the adequacy findings and has implications for other data shar-
ing frameworks, such as the Passenger Name Records agreement (Kuner 2015,
2017). Schrems considered the judgment a ‘puzzle piece in the fight against mass
surveillance, and a huge blow to tech companies who think they can act in total
ignorance of the law’ (cited in Powles 2015c). It is also evidence of increasing
judicialization in data governance (Fahey and Terpan 2021).
It is difficult to say to what extent stronger engagement in the meaning-making
process would have changed that particular situation. The lack of effectiveness
of the Safe Harbour framework, in light of minimal enforcement and review
procedures as well as continued sovereignty-infringing bulk surveillance practices,
created significant pressure. Yet references to principles embedded in the order of
production seemed sufficient to evaluate Safe Harbour in terms of its economic
relevance not only to the European Commission but also to a significant number
of member states. This demonstrates the entrenched character of this order.
Significant public voices emphasizing the legitimacy of the framework from within
the order of security could potentially have created an alternative vision of the field,
one compatible with the existing institutional situation, including the existing
exceptions for national security in the framework. Instead, the European Parliament
(2014b, 11, para. 41) and the CJEU based their assessment of US surveillance
practices and law on the Irish High Court opinion (2014) and the existing media
revelations (Guardian, 2020). Therefore, Schrems and the European Parliament
were able to evaluate the surveillance activities of the US in reference to data gov-
ernance as a fundamental rights issue. On the one hand, the CJEU organized and
resolved the EU-internal jurisdictional conflicts and emphasized the primacy of
human rights according to core values of the EU. On the other hand, the tempo-
rary enforcement of a unitary EU position also significantly shaped the space of
possibles for the EU on an international level. Both the Irish High Court and the
CJEU were outspoken in their criticism of the perceived fundamental rights violations
by the US. While the CJEU judgment found the 2000 adequacy decision
to be invalid on the basis of the European Commission’s failure to assess the
broader legal framework in the US in 2000, both courts highlighted that neces-
sity and proportionality are preconditions for any infringements of fundamental
rights. Thus, they remained firmly committed to principles of the order of fairness.
In a newspaper article, Powles described the CJEU as ‘an emboldened court in full
flight—confident in its role, willing to challenge and test executive and legislative
authority, and determined to respect fundamental rights both to privacy and data
protection’ (2015b). I elaborate on the role of the CJEU with regard to the recent
invalidation of Privacy Shield in Section 4.2.8.

4.2.6 Outcome: Transatlantic Discord

Through the invalidation of the European Commission’s adequacy decision,
the CJEU had eliminated most options for incremental change and, while to
some extent limiting and undermining the role of the European Commission,
strengthened the position of the EU in the global field. The European Commis-
sion, through the sense of urgency that followed the CJEU decision, was able
to both enlarge and constrain the space of possibles vis-à-vis the US and other
EU actors.
Particularly for the US, the conflict in many ways had just started. After the pub-
lication of the opinion of AG Bot, actors in the US had attempted to undermine the
jurisdictional claim of the Court, arguing that it was based on a false account of US

surveillance practices and ignored reform measures (IA 2015, 2). For example, in a
congressional hearing on the issue, the US Department of Commerce representa-
tive argued that Safe Harbour had become ‘a target for continued criticism largely
based on misunderstanding and false assumptions about its purpose and opera-
tion and the important privacy benefits it provided’ (E. M. Dean 2015, 2). The US
mission stated that ‘The United States does not and has not engaged in indiscrimi-
nate surveillance of anyone, including ordinary European citizens’ (US Mission to
the EU 2015). The US Secretary of Commerce Penny Pritzker stressed that the US
was ‘deeply disappointed’ (Pritzker 2015; see also, e.g., Beckerman 2015, 2) and
suggested that the invalidation ‘puts at risk the thriving transatlantic digital econ-
omy’ (Pritzker 2015). The congressional hearing dedicated to the invalidation of
the Safe Harbour agreement revolved around the detrimental consequences for
the digital economy: threats to the free flow of information and the transatlantic
commercial relationship, legal uncertainty, and the increased burden on and
damage to job-generating companies. While the hearing mainly evaluated the situation in
reference to the common good of production and economic progress, there was also
a recognition of the need for future cooperation and the urgency of agreement
following the CJEU decision, thus emphasizing the order of globalism (US
Congress 2015).
Within the EU, both the European Parliament and the European Commission
attempted to gain recognition for their central positions in the field. The European
Parliament emphasized its role as a strong protector of human rights, arguing ‘this
ruling has confirmed the long-standing position of Parliament regarding the lack
of an adequate level of protection under this instrument’ (EP 2015, 2014a, K). Even
the European Commission framed the judgment as a confirmation of its mandate:

we have made important progress that we can now build on in light of the judg-
ment. Our aim is to step up discussions with the US towards a renewed and safe
framework for the transfer of personal data across the Atlantic.
(EC 2015a)

After the judgment, there was a general sense of urgency for a new agreement but
also a heightened sensibility towards the fundamental rights status of privacy and
data protection. The Article 29 Working Party, the group of European data protection
authorities (29WP 2015), threatened to stop data transfers if no agreement were found soon.

4.2.7 On the Way to Privacy Shield

In light of intense pressure also from the private sector, which operated in legal limbo
and emphasized the adverse effects on businesses, in particular small
and medium-sized enterprises (IA 2015, 2), negotiations for a new framework began.

Underlying differences remained, but there was general agreement that data
flows were essential. As a representative of the FTC put it in an interview,
‘nobody wanted to say: “no data should flow”, well almost nobody’ (FTC
official, pers. comm. 2019). Concerns regarding the extent of data access by secu-
rity agencies had remained a significant part of the negotiations. However, the
US had agreed to a clarification of the term ‘bulk collection’ and maintained that
collection needs to be ‘as tailored as feasible’ and ‘reasonable’ (EC 2016d, 2). Nev-
ertheless, concerns persisted, particularly with regard to the requirements set out
by the CJEU (EC 2016e, 1). The European Parliament also criticized the fact that
the restrictions on bulk collection did not meet ‘the stricter criteria of necessity and
proportionality as laid down in the Charter’ (EP 2016b, 4). The EP largely stayed
committed to principles of the order of fairness, arguing on behalf of a reference
community of individuals that ‘“protecting data” means protecting the people to
whom the information being processed relates’ (EP 2016b, B).
However, some have argued that the strong juxtaposition of EU and US surveillance
practices neglected the significant extent to which EU member states are
engaged in similar practices (Farrell and Newman 2019, 1–3; EU FRA 2018). The
strong emphasis on dystopian scenarios of mass surveillance created the impression
that the EU was attempting to impose standards externally that it was not willing
to uphold internally (see also Lamla 2016). While this points to an emphasis on
sovereign community norms rather than the prioritization of the order of fairness
and human rights principles, some actors also problematized abusive powers in
the EU. In 2015, a European Parliament resolution argued that ‘the lack of a clear
definition [of national security] allows for arbitrariness and abuses of fundamen-
tal rights and the rule of law by executives and intelligence communities in the
EU’ (EP 2015, para. 24). The criticism of the significant ambiguity and uncertainty
concerning both the actual wording and the level of bindingness delegitimized
jurisdictional claims based on the order of security.
In February 2016, renewed negotiations replaced the Safe Harbour framework
with the EU–US Privacy Shield. Highly similar in structure and content, Pri-
vacy Shield offered voluntary self-certification for companies in the US under
the scrutiny of the Federal Trade Commission and the Department of Transportation. The framework represented a compromise and was as much an institutional
solution to bridge the existing normative divisions as Safe Harbour had been.
However, it included some stronger provisions compared with its predecessor. On
the one hand, Privacy Shield introduced stricter commitments as well as trans-
parency, overview, and enforcement measures. The FTC seems to have become
more assertive in its enforcement practices under Privacy Shield (FTC official,
pers. comm. 2019). The European Commission also highlighted constraints and
obligations in relation to its own work and referred to the Schrems decision as a
way to ‘breathe life into the framework’ (2016c, 10). Similarly, the contrast with
a ‘formalistic exercise without consequences’ (10) seems to ‘mark a significant
SAFE HARBOUR AND ITS DISCONTENTS 95

departure from the previous static situation’ (11). On the other hand, Privacy
Shield created new opportunities for (judicial) redress as well as an ombudsman
position. The agreement was part of a package deal with the US Judicial Redress
Act and the so-called Umbrella Agreement. The US Judicial Redress Act of 2015
extends some data and privacy protection measures to non-US citizens or resi-
dents. It offers EU citizens the possibility of contesting both US companies’ data
disclosures and government practices, a right that had been granted to US citizens
in the EU with the Data Protection Directive (1995). While there had been con-
siderable disagreements about the scope of judicial redress for EU citizens in the
area of law enforcement since it was first addressed in the 2007 High Level Contact
Group (Farrell and Newman 2019, 138), the necessity of a transatlantic agreement
forced negotiators to find a common position. The Umbrella Agreement (EU and
US 2016) came into force in February 2017. Under the agreement, data transfers
for law enforcement purposes between the EU and the US, also when carried out
by private entities, are subject to a common standard of data protection, with the
exception of Ireland and Denmark.
Thus, Privacy Shield offered a stronger focus on rights protection, and it more
thoroughly and explicitly addressed fundamental rights concerns. The Euro-
pean Commission was confident about the improved safeguards, for example
with the then Commission Vice President Andrus Ansip stating that Privacy
Shield ‘will protect the personal data of our people’ (Ansip, cited in EC 2016f),
which again establishes a reference community of Europeans. Nevertheless, the
changes compared with Safe Harbour remained limited, particularly regarding
bulk surveillance (Weiss and Archick 2016, 8–11), which contributed to con-
siderable uncertainty (Voss 2016). As in other data-sharing agreements, the US
administration chose to provide written assurances to confirm restrictions for
public surveillance measures. The EP criticized this, pointing to uncertainty about its
lack of legal bindingness (EP 2016b, 7). In the 2019 review of Privacy Shield,
the European Commission noted that only one complaint, which was considered
inadmissible, had been filed with the Ombudsperson since the framework had
become operational (EC 2019d, 27). This is a common feature of compromises in
data-sharing frameworks, as I also illustrate with regard to the PNR (see Chapter 5)
and the TFTP (see Chapter 6) agreements. While review and transparency mechanisms prove their worth in the order of fairness, they work as implicit legitimation of
the order of security. While, formally, US authority was curtailed, practices did
not seem to change substantially (Suda 2017).
In sum, Privacy Shield constituted a temporary compromise between differ-
ent orders. Bulk surveillance practices did not stop, but they should have become
more transparent and challengeable. While the vague specification of principles
in compromises is frequently a major factor in maintaining their stability (Boltanski
and Thévenot 2006, 279–80), this proved insufficient to withstand judicial scrutiny
by the CJEU, as I demonstrate in Section 4.2.8.
4.2.8 The Re-Emergence of Conflict: The Invalidation of Privacy Shield

Significant challenges to the framework illustrate the continuous character of the conflicts as well as the fact that the underlying normative divisions have not been
reconciled. On the one hand, the European Parliament called for the invalidation
of Privacy Shield in July 2018. It cited a lack of compliance, the largely vacant and
consequently gridlocked Privacy and Civil Liberties Oversight Board in the US,
and legislative changes prompted by the Trump administration, particularly with
regard to the protection of non-US citizens and intelligence procedures (EP 2018;
see also Butler 2017). For example, the reauthorizations of the Foreign Intelligence
Surveillance Act (FISA) did not include PPD-28. The European Commission
largely ignored these calls. In the yearly review procedures of Privacy Shield, it
pointed to weaknesses in the implementation but concluded that ‘the United States
continues to ensure an adequate level of protection for personal data transferred
under the Privacy Shield’ (EC 2018d, 5). The European Commission also ‘encour-
age[d] the U.S. to adopt a comprehensive system of privacy and data protection
and to become a Party to the Council of Europe’s Convention 108’ (EC 2018d, 6)
to highlight its commitment to principles of globalism such as cooperation and
interoperability.
On the other hand, and more urgently, this concerns the continuation of
the Schrems case. The Irish DPC, which the CJEU had officially declared responsible for
addressing the complaint, did not produce a decision on the
lawfulness of Facebook’s data transfers. In the proceedings of the case, it became
apparent that data were often transferred via standard contractual clauses (SCCs)
rather than via the Safe Harbour framework. Data controllers can incorporate
these clauses—issued by the European Commission—in their contracts to specify
conditions for data transfers. They are considered as granting an adequate level of
data protection, despite significant criticism (Kuner 2017, 908). The DPC argued
that Facebook's reliance on SCCs was sufficient to assume the adequacy of
protection, which had been established by the Commission, and again decided not
to investigate. Schrems reformulated the complaint to the Irish DPC to propose
that Facebook’s reliance on SCCs did not guarantee an adequate level of protec-
tion either (Schrems 2017), prompting the DPC to refer the case to a court. The
DPC controversially chose to bring the case in a commercial court, which could
have exposed Schrems to high costs and was argued to have potential chilling effects for future complaints to the Irish DPC (Lillington 2019). The case was
brought before the Irish High Court and later referred to the CJEU. In contrast
to the first case, the US government joined the procedure as an amicus curiae,
while NGOs were denied this status (Moody 2016). The parties presented their arguments before the CJEU in July 2019 (Data Protection
Commissioner 2020, also referred to as Schrems II). The AG found that SCCs
offered adequate protection, and the CJEU confirmed this opinion in its judgment.
The Court emphasized that the use of SCCs may only be considered adequate if
strong safeguards are in place and emphasized the responsibility of the transfer-
ring actors, particularly controllers, for ensuring these safeguards (Data Protection
Commissioner 2020, paras. 131–46). While the case constituted only an indirect
challenge to Privacy Shield (EP 2017a), the CJEU again invalidated the agree-
ment. A direct challenge by the French NGO La Quadrature du Net was scheduled
to be heard after the Schrems II judgment (Joined Cases C-511/18, C-512/18 and
C-520/18 2016), but is now moot. Schrems argued:

The judgment makes it clear that companies cannot just sign the SCCs, but also
have to check if they can be complied with in practice. In cases such as Facebook,
where they don’t take action, the DPC had the solution to this case in her own
hands all along. She could have ordered Facebook to stop transfers years ago …
Instead, she turned to the CJEU to invalidate the SCCs, which are valid. It’s like
screaming for the European fire brigade, because you don’t feel like blowing out
a candle yourself.
(noyb 2020)

Schrems' jurisdictional claim confirmed the applicability of the order of fairness and pointed to failures of the DPC with regard to its responsibilities according to
the GDPR. The DPC interpreted its actions quite differently. The DPC Commis-
sioner Helen Dixon welcomed the judgment, arguing that ‘the DPC brought these
proceedings—and resisted objections from both Facebook and Mr Schrems—
specifically in order to secure a decisive statement of position from the CJEU’
(Irish DPC 2020). This indicates that the DPC seems to have understood the
role of data protection authorities more broadly, pushing the Court to assess the
broader legal situation rather than restricting itself to its role as a supervisory authority.
The Court unequivocally argued that data protection authorities must investi-
gate data protection complaints. It specified the responsibilities of data protection
authorities, arguing that ‘the supervisory authority is nevertheless required to
execute its responsibility for ensuring that the GDPR is fully enforced with all
due diligence’ (Data Protection Commissioner 2020, para. 112). While it agreed
with the DPC’s argument that data protection authorities cannot stop data flows
under an existing adequacy decision, it highlighted the supervisory authorities’
obligation to investigate and bring complaints to Court (Data Protection Com-
missioner 2020, para. 126). Furthermore, it argued that the adequacy decision of
the Commission was invalid.
The CJEU based its invalidation on the insufficient safeguards under Privacy
Shield. The Court argued that ‘the Ombudsperson mechanism referred to in
that decision does not provide data subjects with any cause of action before a
body which offers guarantees substantially equivalent to those required by EU law’
(CJEU 2020, 3, emphasis in original). The Court highlighted the limited powers
of the Ombudsperson particularly concerning US intelligence. It emphasized the
broad nature of US surveillance, highlighting a dystopian scenario in the order of
fairness:

In the view of the Court, the limitations on the protection of personal data arising
from the domestic law of the United States on the access and use by US pub-
lic authorities of such data transferred from the European Union to that third
country, which the Commission assessed in Decision 2016/1250, are not cir-
cumscribed in a way that satisfies requirements that are essentially equivalent to
those required under EU law, by the principle of proportionality, in so far as the
surveillance programmes based on those provisions are not limited to what is strictly
necessary.
(CJEU 2020, 3, original emphasis)

It is notable that the Court made an explicit assessment of the necessity and
proportionality of surveillance practices by US intelligence services, which are
clearly outside the Court’s jurisdiction. Thus, while the US joined the conflict as
a party and attempted to influence meaning-making processes, the evaluative cri-
teria already entrenched in the 2015 judgment prevailed. The Court’s conclusion,
under the same presiding judge, similarly remained unchanged. Privacy Shield, as
already mentioned, constituted essentially an updated version of Safe Harbour. In
2015, the Court had invalidated Safe Harbour on the basis of the Commission’s
lack of an assessment of the legal situation in 2000. The judgment already con-
tained strong indicators that the Court did not deem US intelligence compatible
with fundamental rights in the EU. Therefore, this judgment should, like the case
of data retention with the invalidation of the Data Retention Directive in 2014
(Digital Rights Ireland 2014) and the invalidation of member states’ general obli-
gation to data retention just two years later (Joined Cases C-203/15 and C-698/15
2016), be understood as a way by which the Court stressed the seriousness of its
initial assessment.
The case thus highlights the role of judicial actors as part of political conflicts.
Some (Brkan and Psychogiopoulou 2017) have explored the role of courts in
political controversies from a legal perspective, including the CJEU (Brkan 2017).
However, in the conflict resolution process, the courts play an important political
role, particularly in cases of less stringent institutionalization (see also Fahey and
Terpan 2021). The CJEU's case law shifted from a more member states-oriented approach that left
significant leeway for national authorities and courts (Bagger Tranberg 2011, 242)
to a stricter interpretation of necessity and proportionality, for instance in Schecke
(Joined Cases C-92-93/09 2010). This emphasis was upheld even compared with
other fundamental rights such as freedom of expression in Satamedia (2008). This
trend seems to have intensified in recent years. In 2014, the CJEU had surprisingly, and
against the recommendations of the AG Yves Bot, backed the ‘right to be forgot-
ten’ and thereby increased obligations particularly for search engines. In the same
year, the Court also invalidated the European Data Retention Directive, which had
been in place since 2006 (Data Retention Directive 2006). In member states, sev-
eral constitutional courts had formerly declared laws based on the directive to be
unconstitutional. However, this, as Schneider highlights, was a first opportunity
for the Court to ‘establish itself as a fundamental rights court, even in the field
of security law’ (2017, 549). Thus, compared with the restrictive interpretation of
security-relevant data in 2006 (see also Chapter 5), the requirement of ‘essentially
equivalent’ standards for data transfers generally indicates a shift to and firmer
entrenchment of the order of fairness. As in the prominent Kadi case of 2009, the
Court used human rights law to refer to extraterritorial rights (Schrems 2015).
It remains to be seen whether the Court’s recent decisions will contribute to a
backlash (Burchardt 2020) or appreciation (Powles 2015b).

4.3. Changes in the Field: The General Data Protection Regulation

As I showed in Section 4.2, the Snowden revelations had a significant catalysing effect on the jurisdictional conflict over transatlantic commercial data sharing. In
this section, I outline broader changes that unfolded in the field during and after
the renegotiation of Safe Harbour and the Snowden revelations more generally.
The conflict is embedded in a significant overhaul of commercial data protection
in the EU, more specifically the introduction of the 2016 EU GDPR, the most com-
prehensive data protection legislation globally to date. The adoption of the GDPR
in its current form not only represents to some extent a result of the jurisdictional
conflict that resurfaced after Snowden (Christou and Rashid 2021; Kalyanpur and
Newman 2019b; Laurer and Seidl 2021; Schünemann and Windwehr 2020) but
also has implications for potential future conflicts in commercial data governance.
For instance, the European Commission (2016c) explicitly emphasized that the
planned reform of the Data Protection Directive (1995) would also constitute a
key element in rebuilding trust in transatlantic data flows and the digital economy.
Moreover, the GDPR had enormous impact across the globe and demonstrates the
global reach of the EU in data governance (Bradford 2020, ch. 5). The GDPR was
also adopted alongside legislation on data transfers in the law enforcement sec-
tor, which shows the increasing recognition of interlinkages between public and
private data processing practices.
To address the outdated features of the Data Protection Directive (1995), the
European Commission had presented a legislative proposal for an updated data
and privacy protection framework in 2012. The proposal was designed to instigate
further harmonization of data protection definitions and rules in different sec-
tors, especially in the economic and security sectors, as well as in different member
states (Ripoll Servent 2017, 119). Following intense—even unprecedented (Laurer and Seidl 2021, 259)—lobbying efforts, some were concerned that the regulation
would weaken rather than strengthen the existing data protection regime (Kalyan-
pur and Newman 2019b). The negotiation process proved difficult, as member
states had reservations about enhanced control over implementation and monitor-
ing processes by the European Commission and the newly established European
Data Protection Board (EDPB) (Ripoll Servent 2017, 123–4). The 2016 GDPR can
clearly be described as ambitious. The legislation represents an attempt by the EU
to institutionalize its growing jurisdictional claims, also in view of the case law of
recent years. The marketplace principle unequivocally establishes the applicability
of EU law for all companies that target customers residing in the EU, irrespec-
tive of the location of the service, data, or processing practices, which reinforces
the extraterritorial effect compared with the Data Protection Directive (1995). In
addition, other instruments, such as the right to be forgotten, which became insti-
tutionalized as a right to erasure (Art. 17), the privacy by design and by default
approach (Art. 25), or the right to data portability (Art. 20) (GDPR 2016), broaden
and deepen the scope of the GDPR. The strengthened competences of DPAs, in
particular the issuance of administrative fines that reach up to 4 per cent of com-
panies’ annual global turnover, backed up by a specific and relatively tight legal
framework, arguably make the GDPR the most progressive data protection legis-
lation globally (Burri and Schär 2016). While the European Commission was eager
to emphasize that the GDPR efforts ‘started long before Edward Snowden and a
flood of spying revelations made data protection fashionable’ (Reicherts 2014, 2),
the literature has identified the Snowden revelations as a key element in the devel-
opment of the assertiveness of the GDPR (Kalyanpur and Newman 2019b; Laurer
and Seidl 2021).
These developments are also interlinked with the Schrems case. On the one
hand, the GDPR reflects the judgment’s high standards of data protection, includ-
ing those for third countries, that draw on the EU Charter (Kuner 2017, 895–6).
It also provides backing for the responsibility of data protection authorities to be
strong and independent. On the other hand, the Schrems case is also a significant
assertion of the power to enforce the EU data governance regime and the willing-
ness of the CJEU to address the onward transfer of data for intelligence purposes.
In its 2020 judgment, the CJEU confirmed that data transferred for commercial
purposes are protected by the GDPR, even if they are subsequently used for secu-
rity purposes (Data Protection Commissioner 2020, paras. 86–9). Both public and
private data controllers will have to evaluate their practices against this back-
drop. The Schrems case and the GDPR are evidence of the increased emphasis
on the sovereign status of the EU as a reference community, particularly regard-
ing shared community norms and the export of specific principles. An interviewee
mentioned the use of the phrase ‘GDPR first’ (EP official, pers. comm. 2019), a ref-
erence to the ‘America First’ policy articulated by the US President Trump, which
emphasizes the European vision of data governance as an assertion of sovereign rights and community norms. The GDPR increasingly seems to be considered an
object of value in reference not only to the order of fairness but also to the order
of sovereignty. Interviews confirmed that there generally is a broad acceptance
by public and private policymakers of the EU’s capacity to enforce its laws. Inter-
viewees emphasized the need to ‘build bridges’ and ‘educate’ (FTC official, pers.
comm. 2019; former FTC official, pers. comm. 2019). However, there is some scep-
ticism by US actors regarding the current approach of treating privacy and data
protection as fundamental rights. For instance, one official argued it would be
‘unfortunate if people were to look at it as a sacred text’ (FTC official, pers. comm.
2019). This shows that the jurisdictional claims based on a combination of fairness
and sovereignty (or the liberal vision of data governance outlined in Chapter 1)
are perceived as a potential impediment to globalism and cooperation.
In addition, the Schrems case highlighted the significant interlinkages between
public and private surveillance practices, which provided additional impetus to
two further frameworks that were adopted and implemented alongside the
GDPR: the 2016 Law Enforcement Directive that governs data transfers in police
and law enforcement (Data Protection Directive for Police and Criminal Justice
Authorities 2016) and the EU PNR Directive (2016) (discussed in Chapter 5)
were adopted together as a ‘package deal’ (EC official, pers. comm. 2019), in a
similar way to the adoption of the Umbrella Agreement, the US Judicial Redress
Act, and Privacy Shield. This approach is—beyond a strategic attempt to find
a compromise—also evidence of the increasing recognition of the interlinkages
between public/private and commercial/security transfers. The European Com-
mission specifically argued that the GDPR and the 2016 Law Enforcement Direc-
tive will ‘cover all forms of international transfers, be they for commercial or law
enforcement purposes, between private parties or public authorities, or between
private entities and public authorities’ (2016c, 6). The 2016 Law Enforcement
Directive received relatively little attention compared with the GDPR. However, in
contrast to the EU–US Umbrella Agreement, which represents ‘a safety net below
which the level of protection cannot fall’ (EC 2016c, 11), the directive is a more
comprehensive instrument that specifies rules for increasingly extensive national
and cross-border processing activities by law enforcement authorities (Marque-
nie 2017; see also Chapter 7), for example in criminal investigations. While this
is a significant step towards a more comprehensive data governance regime in the
EU, the directive has been subject to significant criticism (EDPS 2015a; Leiser and
Custers 2019).
In sum, in light of the ever-increasing public and private interlinkages of data
collection and processing practices, the field has experienced a significant increase
in salience and to some extent polarization concerning commercial data pro-
tection. The most recent expressions of this salience are the adoption of the
GDPR—the most comprehensive data protection framework globally—and the
regulation of EU-internal and transatlantic law enforcement data transfers. In the EU, the vision of data protection seems to be strongly tied to a more local refer-
ence community, including an explicit hope of other jurisdictions adopting rather
than engaging with these community standards, in keeping with a local liberal
vision of data governance. The Schrems case has illustrated both in the
EU and internationally that the legal order of the EU facilitates, even for individ-
ual actors, the possibility of disrupting and imposing politically controversial or
challenging decisions in light of fundamental rights violations. While the recent
legislative changes have provided additional support for challenging such viola-
tions, there are persistent exceptions for security purposes as well as continuously
limited consequences of such protections. While the CJEU seems to be willing to
address these consequences assertively, even against political interests, it is bound
to reactive rather than proactive assessments.

4.4. Conclusion

This chapter has illustrated the evolution of the transatlantic commercial data
transfer regime as a space that has been characterized by remarkable continuity
compared with that of other areas of data governance for a significant time but
that has experienced significant changes and conflicts in the last years.
The chapter demonstrates how distinct normative visions across and within
institutions play out in and shape jurisdictional conflicts. Jurisdictional claims by
intelligence services formed data as an object of governance through linking its
public/private and commercial/security character. This contributed to unprece-
dented bulk access to data transferred from the EU to the US for commercial
reasons. While the jurisdictional claim by the US intelligence services was subject
to significant contestations in other areas of data governance, such as airline travel
or financial data, it remained largely implicit due to its secret character. When the
mass data access was disclosed by the Snowden revelations, this brought up not
only the underlying jurisdictional conflict between the regulatory regimes but also
the inconsistencies and contradictions in the EU’s approach to data governance.
The broad concessions to the order of production that had characterized the
transatlantic relationship under Safe Harbour notably and to the frustration of the
European Parliament persisted in the European Commission’s approach after the
Snowden revelations. The entrenched character of the order of production is also
highlighted by the fact that the US administration at no point felt the need to jus-
tify the explicit facilitation of surveillance through a framework that was designed
to provide safeguards.
The further evolution of the conflict also demonstrates the importance of rec-
ognizing and addressing the profound character of normative challenges. The fact
that the European Commission and the US decided to follow the imperative of
justification only to such limited extents left a normative vacuum that created
the opportunity for a profound normative challenge. Rather than by the Euro-
pean Parliament, the successful challenge was articulated by an outsider. Schrems,
backed by the European Parliament, NGOs in both the EU and the US, and the
EU’s legal order was able to gain recognition for the criteria of evaluation pre-
sented, demonstrating a dystopian scenario in the order of fairness highlighting
indiscriminate mass surveillance.
Importantly, this case is also a demonstration of the importance of individual
actors, particularly as proponents of the order of fairness. Even if they do not
have immense political or other forms of capital from the outset, this case demon-
strates how individuals can have particularly disruptive qualities. The revelations
by Edward Snowden catalysed significant discursive and institutional develop-
ments and created a ‘salience shock’ (Kalyanpur and Newman 2019b, 7) that
reached a broader audience. Building on the Snowden revelations, Maximilian
Schrems, through persistent complaints regarding the violation of data protection
principles and privacy rights in the EU, disrupted the commercial data sharing
regime.
The multilevel character of the jurisdictional conflict in Schrems reshuffled the
institutional and discursive dynamics within the EU and contributed to a strength-
ened EU position in the global field. After the Snowden revelations, the public
outcry had failed to translate into a transformation of practices, as the US denied breaking
any international agreements or laws. The CJEU judgment strengthened the posi-
tion of NGOs and privacy activists vis-à-vis the European Commission (Farrell
and Newman 2019, 158). The actions of Schrems established better access for
non-state actors and oversight bodies in the long term. This is also in line with
stronger protection for whistle-blowers in the EU, following calls by the Council
of Europe to implement stronger protections (CoE 2014). In addition, consider-
ing the Court’s binding interpretation that any adequate protection would need
to be ‘essentially equivalent’ (Schrems 2015, para. 73), any transatlantic agree-
ment required more comprehensive privacy protection principles, which shifted
the space of possibles towards a stronger recognition of the order of fairness.
While Privacy Shield represented essentially an updated version of Safe Har-
bour, the invalidation provided additional leeway for the European Commission
to demand stronger safeguards, also in the review processes, and thus poten-
tially restrict potential US encroachment. The invalidation of Privacy Shield in
Schrems II is more difficult to assess at this point. The strict and extraterrito-
rial standards set by the CJEU create a strong jurisdictional claim which will
be difficult to reconcile with the existing intelligence practices in the US. In
view of the continued dependence on US services and the largely unchanged
position of US governments toward foreign intelligence collection, substan-
tial progress seems unlikely, as ongoing negotiations indicate a potential Safe
Harbour 3.0.
5
Passenger Data in Air Travel
Establishing Data as a Security ‘Tool’

The Checkpoint of the Future ends the one-size-fits-all concept for security. Passengers approaching the checkpoint will be directed to
one of three lanes: ‘known traveler’, ‘normal’, and ‘enhanced security’.
(IATA 2011)

In 2011, the International Air Transport Association (IATA), a trade association for 290 airlines, introduced a new paradigm that entails scanning for ‘bad
people’, rather than ‘bad objects’ (2011) and is built on differentiation and risk-
based pre-screening programmes. IATA still aims to replace the comprehensive
screening of all travellers with the close scrutiny of selected passengers. This is
embedded in a broader shift towards pre-emptive security measures (de Goede
2008) to identify criminals and terrorists on the basis of risk patterns in large
datasets.
This chapter aims to analyse the extensive and contested collection of air trav-
eller data as the basis of such screening processes. In what follows, I provide an
analysis of the evolution of passenger data sharing as a key security instrument in
the post-9/11 landscape as well as an area of conflict, increasing convergence, and,
ultimately, expansion. Extensive international data sharing began with the intro-
duction of the US Aviation and Transportation Security Act (ATSA) in November
2001, which obligated airlines flying to or from the US to share data on their
passengers, such as personal information, travel itineraries, and financial details,
with the US authorities. In light of the more general increase in surveillance for
security purposes (Bigo et al. 2013) in parallel to this specific law, it should not
be surprising that now, more than twenty years later, an emerging regime of
so-called Passenger Name Record (PNR) sharing includes a diversity of instru-
ments on the regional and international level. In 2016, the EU PNR Directive
established an intra-European system of passenger data sharing that requires data
processing for all flights to or from the EU. In 2017, the UN Security Council intro-
duced Resolution 2396 (UNSC 2017), co-sponsored by sixty-six states, on ‘threats
to international peace and security caused by terrorist acts’, which expands this
requirement to all UN member states.

Data Governance. Anke Sophia Obendiek, Oxford University Press.


© Anke Sophia Obendiek (2023). DOI: 10.1093/oso/9780192870193.003.0005
PASSENGER DATA IN AIR TRAVEL 105

While PNR sharing seems to follow a linear path of expansion, the adoption
of PNR sharing was far from smooth, particularly in the context of transatlantic
relations. In the early 2000s, when the system was introduced by the US, the 9/11
attacks were insufficient to convince the EU of the necessity and effectiveness of
PNR sharing, and heavy pushback resulted in significant tensions. This involved a
referral to the Court of Justice of the European Union (CJEU) in 2006 and a signif-
icant challenge to a bilateral agreement in 2010, both instigated by the European
Parliament (see also Kaunert et al. 2012, 484). Thus, the conflict resolution pro-
cess not only stabilized but also had effects beyond the resolution of overlapping
jurisdictional claims in the domestic context. How is it possible that, despite suc-
cessful disruptions and strong resistance, PNR is now part of an emerging global
regime?
Compared with the other cases I examine, the earlier phases of PNR sharing
(de Goede 2008; de Hert and de Schutter 2008; Kaunert et al. 2012) in particular
have received more attention in the literature. Scholars offer different explanations
for the persistence of the regime and particularly for the transatlantic agreement.
On the one hand, Argomaniz has argued that ‘border security cooperation is
far from being a “partnership”, resembling instead an asymmetrical relationship’
(2009, 120). This explanation of the stabilization of PNR sharing conceptualizes
the EU as a ‘norm-taker’ (Argomaniz 2009) and establishes the influence of the
US as the key driver behind the continuation of the regime. A less stark version of
this argument simply points to normative convergence (Porter and Bendiek 2012).
Others have argued that this interpretation ignores important aspects of the evo-
lution of PNR or EU security policies more generally (Bellanova and Duez 2012,
123; Monar 2015). For instance, de Goede and Wesseling argue that actors in the
EU seem to have ‘appropriated’ (2017) security norms by adjusting them to the
local context, which also has significant benefits for EU counterterrorism efforts
(Monar 2015, 351). Farrell and Newman (2019, ch. 3) have highlighted the role
of influential transnational coalitions that successfully insulated security policy-
making from contesting actors such as the European Parliament (see also Pawlak
2009). Bellanova and de Goede (2022), as well as Ulbricht (2018), have identified
technological choices and the infrastructural setting as significant factors in the
regulatory process.
This chapter adds a more detailed analysis of how PNR was transformed in insti-
tutional fora, not despite but crucially also through contestations, and how these
processes are interlinked with private security actors. As the earlier phases of the
conflict have been examined in more detail, this chapter offers the opportunity to
focus more on the consequences of conflict resolution strategies. In this chapter, I
make three broad arguments: first, in line with existing literature, I argue that PNR
should be understood as not only institutionally but also normatively embedded in
a more comprehensive strategy of post-9/11 counterterrorist activities (Kaunert
and Léonard 2019). By providing a detailed analysis of justificatory practices in
the context of an emerging field, I aim to outline the routinization of transatlantic
cooperation by shaping the understandings of responsible behaviour for a
community of vulnerable people.
Second, I point to two influential drivers of the shifting normative develop-
ment. On the one hand, an institutional shift in 2006 excluded challenging actors
from meaning-making processes and entrenched the order of security as an eval-
uative standard, as is also pointed out in the literature (Farrell and Newman
2019, 81). On the other hand, I argue that this process was not only steered by
the actors in favour of broad PNR sharing. The decreasing normative quality
of contestations by challenging actors contributed to the inclusion of safeguards
but prevented a shock akin to that of the Schrems case (see Chapter 4). While
the jurisdictional conflicts constrained the existing sharing practices via legal
provisions, the institutionalization of passenger data sharing has contributed to
a normalization and broader acceptability of PNR sharing as a global security
standard.
Third, others have emphasized the role of complicit security actors such as
banks or airlines (de Goede and Wesseling 2017; Farrell and Newman 2019, 71).
I stress in particular the broader embedding of this conflict in a transformation of
the security-industrial or surveillance-industrial complex (Ball and Snider 2013;
Hayes 2012). More specifically, I point to the promotion of technological solu-
tions by the private sector that foster endogenous logics that contribute to the
continuation and expansion of the regime.
The chapter is structured as follows: first, I give a brief overview of the emergence
of a conflict between the EU and the US after the implementation of the
US ATSA. I highlight the contestations by, in particular, the European Parliament
and their unintended consequences. Second, I focus on how, through the institu-
tional shift that resulted from the court case, PNR was transformed into a ‘tool’ and
increasingly normalized. Third, I examine the expansion of PNR globally before
pointing to re-emerging contestations.

5.1 Counterterrorist Measures and the Introduction of PNR Sharing

Before zooming in on the unfolding and evolution of jurisdictional conflicts over
PNR, I first illuminate the context in which the rise and contestation of PNR is
embedded. The significant expansion of counterterrorist action in the EU, the US,
and globally after the 9/11 terrorist attacks has entrenched security as a signifi-
cant evaluative criterion in data governance and demarcated the space for policy
options. Thus, when on 11 September 2001 two planes crashed into the World
Trade Center in New York and further planes targeted the Pentagon and Wash-
ington DC, this not only marked the beginning of a new era that would come to
be significantly shaped by the war on terror but also fundamentally changed the
character of commercial air travel.

5.1.1 Counterterrorism after 9/11

The 9/11 terrorist attacks killed nearly 3,000 people and contributed to a signifi-
cant overhaul of counterterrorist policies. Within a year after the attacks, the US
had introduced more than 130 pieces of legislation. The US Department of Home-
land Security (DHS), established in 2002, is now one of the largest ministries in
terms of both budget and personnel. Legislation broadened information sharing
competences, for example through the US Patriot Act (2001), and intelligence and
surveillance capacities, for example through the Intelligence Reform and Terror-
ism Prevention Act or the classified Terrorist Surveillance Program. The Foreign
Intelligence Surveillance Act (FISA), passed in 1978, was amended in 2008—
through section 702—to authorize the targeting of non-US persons abroad while
restricting the surveillance of US citizens (Kaczmarek and Lazarou 2018).¹
In the EU, many actors used the 9/11 attacks as a ‘window of opportunity’
(Kaunert et al. 2015) to similarly widen and deepen the competences of the EU
in counterterrorism. Terrorism featured prominently in diverse strategies, such as
the Counterterrorism Strategy, the European Security Strategy in 2003, and the
Internal Security Strategy (Tocci 2017). Until 2009, EU policy had been divided
into three pillars, that is, common market activities, foreign and security policy,
and justice and home affairs, which specified rules and responsibilities for the EU
institutions in these areas. The Lisbon Treaty significantly increased EU compe-
tences in security, particularly for the Commission and the Parliament (Kaunert
and Léonard 2012).
Besides the national and regional overhaul of counterterrorist policies, the 9/11
terrorist attacks also contributed to the proliferation of international efforts, par-
ticularly UNSC activity (Kreuder-Sonnen 2019). It also reinforced cooperation
between intelligence services and law enforcement agencies in the transatlantic
context (Farrell and Newman 2019, 45), including an operational agreement
between Europol and the US, Justice and Home Affairs (JHA) ministerial meet-
ings, and the European Counter Terrorism Centre (ECTC), as well as cooperation
with social media or tech companies (Bartlett and Reynolds 2015). Coopera-
tion efforts did not always run smoothly, for example concerning extraordinary
rendition and detention facilities (Porter and Bendiek 2012) but also regarding
data processing and protection standards. The conflicts in this area prompted
negotiations about bilateral agreements, such as the Terrorist Finance Tracking
Program (TFTP), which regulates access to financial data (see Chapter 6), the
Umbrella Agreement, which establishes data protection rules in law enforcement
cooperation, and the Passenger Name Records agreement, which regulates access
to traveller data and which I outline in Section 5.1.2. In sum, while the increase in EU
competences in counterterrorism (e.g. Argomaniz 2015; Monar 2015; Occhipinti
2015) and the transatlantic partnership in counterterrorism (Anagnostakis 2017;
Rees 2007) have been discussed more extensively, PNR has been less prominent
in the debate (exceptions include Argomaniz et al. 2017; de Goede 2008; Kaunert
et al. 2012, 2015; Porter and Bendiek 2012). Moreover, the literature also currently
does not include a long-term assessment that considers the recent global
expansion (for a limited discussion, see Farrell and Newman 2019, ch. 3).

¹ For a more comprehensive overview of counterterrorism in the US, see Cook (2020); for the EU, see Argomaniz (2011); Argomaniz et al. (2017); and for the transatlantic community, see Anagnostakis (2017); Porter and Bendiek (2012); for a specific focus on data, see Amoore and de Goede (2005); Farrell and Newman (2019, ch. 2).

5.1.2 Passenger Name Record Sharing

In view of the key role of air travel in the attacks, aviation security in the US
and elsewhere immediately expanded. Security measures at airports, such as secu-
rity checks before check-in, or the prohibition on carry-on liquids, have been
buttressed by less visible safeguards, such as cooperation between airlines and
security authorities, the introduction of no-fly lists, or the increase in the number
of Federal Air Marshals (Salter 2007; Schouten 2014). The adoption of the US Avi-
ation and Transportation Security Act (ATSA 2001) in November 2001 expanded
the surveillance of passengers travelling to or from the US.
The ATSA introduced an obligation to provide passenger data to the US Cus-
toms and Border Protection Bureau for all airlines passing through the US (Guild
and Brouwer 2006, 1). There are different types of passenger data that correspond
to varying sensitivity levels. For instance, advanced passenger information (API)
includes basic data such as passenger name, document identification number, or
address. These data were shared between the US Customs and Border Protection
Bureau and air carriers on a voluntary basis prior to the ATSA. In the EU, the 2004
Carrier Directive 2004/82/EC also introduced API sharing on request to improve
border controls and fight illegal immigration. The ATSA demanded sharing of the
more comprehensive PNR data. PNRs consist of up to seventy-six elements of
personal information, depending on the airline and the legal requirements of the
jurisdictions in which it operates. PNR data are generated and transferred as part
of the booking process and stored in a computer system, often one of the global
distribution systems or the airline reservation system. They facilitate processing
of customers on flights with cooperating airlines, for example under a code-share
system. PNR data include, for example, personal and travel details of the passen-
ger and co-travellers, baggage information, IP addresses, flight meal preferences,
payment details or discount codes, and other services booked in connection with
the flight, such as hotel or hire car bookings (EU and US 2012, Annex). While the
data were originally collected for commercial purposes, PNR transfers to security
authorities aim to identify passengers linked to terrorism and serious crime on the
basis of certain travel or intelligence patterns (Brouwer 2009). While API data may
identify known terrorists, criminals, or suspects, PNR data aim to flag persons who
were previously unsuspected of involvement in such activities for further inves-
tigation. They also reveal potentially highly sensitive information. For example,
specific service requests, such as particular meal preferences for kosher or dia-
betic food, or special assistance requests indicate religious affiliations or health
concerns, while emergency contact or co-traveller details reveal information on
close personal contacts.
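To make this data structure concrete, the following Python sketch models a stripped-down PNR record. The field names and the inference rule are hypothetical simplifications for illustration only: real PNRs hold up to seventy-six elements whose exact layout varies by airline and reservation system, and this is not an actual reservation-system schema.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PassengerNameRecord:
    """Illustrative, simplified PNR; field names are hypothetical."""
    record_locator: str                      # booking reference in the reservation system
    passenger_names: list[str]               # passenger and co-travellers
    itinerary: list[str]                     # flight segments
    payment_details: str                     # e.g. masked card number
    ip_address: Optional[str] = None         # captured during online booking
    meal_preference: Optional[str] = None    # may indicate religion or health status
    special_assistance: Optional[str] = None # may indicate health concerns
    emergency_contact: Optional[str] = None  # reveals close personal contacts
    linked_bookings: list[str] = field(default_factory=list)  # hotel, hire car

    def sensitive_inferences(self) -> list[str]:
        """List the protected characteristics that might be inferred
        from seemingly mundane booking elements."""
        hints = []
        if self.meal_preference in {"kosher", "halal", "diabetic"}:
            hints.append("religion/health (meal preference)")
        if self.special_assistance:
            hints.append("health (special assistance)")
        if self.emergency_contact:
            hints.append("close personal contacts")
        return hints

pnr = PassengerNameRecord(
    record_locator="X1Y2Z3",
    passenger_names=["DOE/JANE"],
    itinerary=["BRU-JFK 2012-06-01"],
    payment_details="VISA ****1234",
    meal_preference="kosher",
    emergency_contact="DOE/JOHN +32 000 0000",
)
print(pnr.sensitive_inferences())
# → ['religion/health (meal preference)', 'close personal contacts']
```

The point of the sketch is that none of these fields is collected as 'sensitive data' in the legal sense, yet several permit inferences about religion, health, or personal relationships once the record is repurposed for security screening.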

5.1.3 Tensions without Conflict?

Despite the high salience of terrorism at the time of adoption, the introduction
of the ATSA provoked significant tensions between the EU and the US. The US
claim of legitimate control over passenger data via the ATSA conflicted with provi-
sions of the EU data protection regime. In a 2002 letter, the European Commission
informed the US authorities about potential conflicts of laws regarding the 1995
Data Protection Directive (Joined Cases C-317/04 and C-318/04 2006, para. 33).
The conflict concerned not only the potential violation of passenger rights but also
the violation of established principles of sovereignty as specified in international
law (29WP 2004; EP 2004), as US authorities directly accessed data stored on EU
soil. In particular, the so-called pull system, which enables direct access to airline
databases for security agencies, was strongly contested. In contrast to the push sys-
tem, in which airlines transfer or ‘push’ the data to these agencies themselves, data
access is potentially more intrusive. The normative dimension of the conflict thus
concerned incompatible demands based on justifications rooted in the order of
security (PNR as necessary in the fight against terrorism) and the order of fairness
(PNR as a privacy concern). In addition, there were competing concerns based on
the order of sovereignty (data access as a territorial intrusion).
In the months that followed, the US successfully stabilized the jurisdictional
claim of control over data. Consistent enforcement as well as strict punishments
for non-compliance established the use of PNR as ‘necessary for purposes of
ensuring aviation safety and protecting national security’ (US Department of the
Treasury 2002, 42,711). The strict enforcement secured support of the airlines that
wanted to avoid heavy fines of up to $5,000 per passenger or the loss of landing
rights. While the US postponed enforcement for EU airlines, US authorities argued
that compliance was necessary after March 2003 (Kaunert et al. 2012, 484). The
European Commission, faced with the unilateral jurisdictional claim of a major
power, referred to ‘competing, even to some extent irreconcilable concerns, all of
which are legitimate in themselves’ (Bolkestein 2003, 2). In 2003, the European
Commission announced that compliance with the ATSA would not be punished
(EC and US Customs 2003, para. 4). Yet legal uncertainty for airlines persisted
due to the potential enforcement of data protection law by member state DPAs
(Beatty 2003). The European Commission attempted to terminate this uncertainty
(Patten 2004, 3) and reified the issue as an international cooperation problem
that needed to be addressed via a bilateral or global agreement. This manifested
the convergence of the goals of global security cooperation and private company
interests.
The US strengthened its claim through security references. Already in 2003, the
EU–US joint declaration explicitly established a reference community of vulnera-
ble citizens using PNR data to ‘identify and interdict potential terrorists and other
threats to national and public security, and to focus Customs resources on high
risk concerns, thereby facilitating and safeguarding bona fide traveller’ (EC and
US Customs 2003, Annex, emphasis added). The reference to bona fide travel (see
also, e.g., EU and US 2007, 8 Annex II) establishes an imaginary reference com-
munity of innocent and helpless potential victims that is existentially threatened
and in need of protection. This move provided legitimacy for stricter preventative
surveillance by demarcating the space of possible options through the evaluative
criterion of security.
The EU echoed this shift. Already in 2003, the European Commission had
argued that ‘the EU’s approach cannot be limited to responding to the initiatives of
others’ (EC 2003, 3). It emphasized the EU’s aspiration to improve its position in
the field and establish its own PNR directive. This interpretation resonates with the
analysis of EU counterterrorism by Monar (2015, 355–6), who argues that the EU
established an increasing international presence and capability in this area, despite
the firm entrenchment of operational capacities in the member states. Both the
early articulations of an EU interest in developing its own PNR directive and the
securitization attempts in other documents highlight that, at least for some actors,
there was a pre-existing interest in strengthening security goals (de Goede 2008;
Farrell and Newman 2019) rather than a unidirectional construction or norm-
taking process. Nevertheless, terrorist attacks, particularly on EU soil, increased
public pressure for action. The United States and the EU aimed to mitigate this
pressure through cross-border cooperation.
The pre-existence of interests in broadening security-related policies did not
equal an uncritical acceptance of such instruments on either side of the Atlantic.
Farrell and Newman highlight how strong resistance from civil liberties orga-
nizations in the US challenged the set-up of domestic databases and surveil-
lance programmes such as CAPPS II.² The domestic failure of security measures
contributed to the necessity of transnational cooperation for US actors (Farrell
and Newman 2019, 70–3). Due to existing security provisions for cross-border
travel as well as the emphasis on foreign travellers, this required less justificatory
effort for a domestic audience. Shifting the conflict away from the domestic
community as a reference point externalized the conflict to the transatlantic context.
This unveiled divergences in jurisdictional claims within the EU.

² The Computer Assisted Passenger Prescreening System II assessed the risk of individual passengers by comparing PNR information with a number of governmental databases. It was terminated in 2004 and replaced by the more limited programme Secure Flight; see https://www.tsa.gov/travel/security-screening, accessed 30 June 2022.

5.1.4 EU Struggles over Competences and Norms

In March 2004, the European Council asked the European Parliament for an
opinion on the conclusion of a bilateral agreement with the US. The Council
emphasized its urgency and stated that:

The fight against terrorism, which justifies the proposed measures, is a key prior-
ity of the European Union. Air carriers and passengers are at present in a situation
of uncertainty which urgently needs to be remedied. In addition, it is essential to
protect the financial interests of the parties concerned.
(Council of the EU, cited in Joined Cases C-317/04 and C-318/04 2006, para. 37)

This statement not only highlights the importance of the fight against terrorism
and its interlinkages with private interests but also emphasizes the EU’s position.
MEPs were largely critical of this approach. In a 2003 resolution, the Parliament
emphasized:

the doubts and concerns that have been expressed by the national authorities
concerning the legitimacy of this demand, including its legitimacy under US
law, and in particular doubts about its compliance with EU data protection leg-
islation given the risk that reservation system databases may become de facto
‘data-mining’ territory for the US Administration.
(EP 2003)

By pointing to the violation of community values, the reference to the poten-


tial undermining of EU law through a joint declaration by EU and US officials
also appealed to the order of sovereignty. The explicit reference to a poten-
tial data mining territory highlighted a fear of foreign domination and poten-
tial territorial intrusions. The 29WP reinforced the emphasis on the order of
sovereignty by releasing an opinion highlighting the conflict of laws between
the ATSA and the 1995 Directive. The 2002 opinion stated that ‘it does not
seem acceptable that a unilateral decision taken by a third country for reasons
of its own public interest should lead to the routine and wholesale transfer of
data protected under the directive’ (29WP 2002, 6). In 2003, the head of the
29WP strongly contested the Commission's approach, arguing that 'it has not yet
been clarified how the Joint Statement might provide a sound legal base to jus-
tify an exception to that rule’ (Rodotà 2003, 2). In the domestic context, DPAs
began to issue opinions and fines, stating that EU airlines were not allowed to
transfer data (Farrell and Newman 2019, 77). The fact that the Commission
largely ignored this pressure from the designated expert authorities reinforced
tensions.
In the spring of 2004, the discord between the European Commission and the
European Parliament manifested as a conflict over the actors’ positions in the field.
In a 2004 resolution, the Parliament ‘reminds the Commission of the require-
ment for cooperation between institutions which is laid down in Article 10 of
the Treaty’ (EP 2004, para. 7) and highlighted the lack of enforcement of EU law
(EP 2004, para. B). In April 2004, the European Parliament decided to request a
CJEU opinion on the basis of doubts about the potential overreach of Commis-
sion competences as well as the compatibility of the agreement with the right to
privacy granted under Article 8 of the ECHR. This also constituted an implicit
rejection of the Council’s request for urgent consideration and its jurisdictional
claim.
Despite the resistance of the European Parliament and the fact that the CJEU
had not articulated an opinion on the matter, the Commission decided to adopt
an adequacy decision in May 2004. The adequacy decision constituted a neces-
sary legal basis for any further agreement, as it established that EU passengers’
data would be protected sufficiently according to EU law. The decision was
tied to the adoption of a set of ‘Undertakings’ by the US Homeland Security
Bureau of Customs and Border Protection (CBP 2004), which specified condi-
tions for the collection and transfer of PNR. Like the Safe Harbour principles
(see Chapter 4), these undertakings reflected basic data protection elements, such
as purpose limitations, sensitive data restrictions, and a significantly reduced
retention time from fifty to three and a half years. The EU was not success-
ful in its demand for a push system, but the CBP committed to switching from
the pull system once airlines were able to provide the necessary data. Onward
transfers were allowed for other US agencies as well as other governmental author-
ities in the framework of counterterrorism and law enforcement (CBP 2004).
In sum, the conditions constituted an effort to strengthen privacy safeguards
and at the same time appealed to the order of sovereignty, conforming to legal
requirements.
In early June 2004, the Council informed the European Parliament that it had
approved the decision and stressed the fight against terrorism and the legal uncer-
tainty for the airlines. Shortly after, the US and the EU concluded a first bilateral
agreement that set standards for passenger data sharing and transfers to third
authorities. At the beginning, the agreement states:

The European Parliament has not given an Opinion within the time-limit which,
pursuant to the first subparagraph of Article 300(3) of the Treaty, the Council
laid down in view of the urgent need to remedy the situation of uncertainty in
which airlines and passengers found themselves, as well as to protect the financial
interests of those concerned.
(EU and US 2004, 1)

As in the earlier proposal, the emphasis on legal certainty for companies was
strongly tied to the fight against terrorism, highlighting the convergence of eco-
nomic and security goals. The Council used this to justify the lack of inter-
institutional coordination with the Parliament.
Due to the treaty provisions at the time, the Parliament’s opinion was not bind-
ing. The Parliament decided to file a lawsuit with the CJEU, even though it was not
completely clear whether the case was actionable (Farrell and Newman 2019, 80).
As in the request for the court opinion, MEPs voiced concerns about the legality of
the adequacy decision, contesting the provisions based on justifications rooted in
the order of fairness, as well as the legal basis of the agreement, contesting partic-
ularly the Commission’s position in the field. In contrast, the Council emphasized
that the agreement was based on the first pillar, as it concerned commercial data
collection practices by airlines. Due to the harmonized nature of the data protec-
tion regime, unequal treatment for airlines could negatively affect competition.
Under Article 95 of the EU Treaty, the Council of Ministers would be authorized
to secure the conditions for the internal market.³
In 2006, the Court annulled both the Council decision on the agreement and
the European Commission’s decision on adequacy (Joined Cases C-317/04 and
C-318/04 2006) due to an inadequate legal basis. The Court found that both the
Council of Ministers and the European Commission had acted outside the legal
framework:

While the view may rightly be taken that PNR data are initially collected by air-
lines in the course of an activity which falls within the scope of Community law,
namely sale of an aeroplane ticket … the data processing which is taken into
account in the decision on adequacy is, however, quite different in nature. As
pointed out in paragraph 55 of the present judgment, that decision concerns not
data processing necessary for a supply of services, but data processing regarded
as necessary for safeguarding public security and for law enforcement purposes.
(Joined Cases C-317/04 and C-318/04 2006, para. 57)

While this echoed some of the concerns voiced by the European Parliament, the
Court's reasoning had unintended consequences and the effect of the judgment
was contrary to the intention of the Parliament. By establishing PNR data as a matter
of public security, the Court shifted institutional responsibility to the Council
and particularly the interior ministers, who tend to focus more on security concerns
due to their professional mandate. Rather than limiting the influence of the
European Commission, the judgment altered the space of possibles and enabled
further shifts towards the order of security, entrenching PNR as a necessity.

³ In July 2005, the LIBE Committee asked for full co-decision rights on PNR issues, which would have been possible under the 'passerelle' clause of the EU Treaty whereby the Council would share decision-making authority, but the Council declined.
This institutional shift towards the JHA Council resulted in an exclusion of
strong proponents of the order of fairness from the negotiations, particularly
the European Parliament and DPAs (de Hert and de Schutter 2008; Kaunert
et al. 2012; Monar 2015). The change of institutional forum also put the Coun-
cil presidency instead of the European Commission in charge, which required
the unanimous approval of the deal by the Council of Ministers (Kaunert et al.
2012, 485). In addition, the Court set a time limit for a new agreement of the end
of September 2006, thus imposing a sense of urgency, which was reinforced by
the legal uncertainty for European airlines. Therefore, these institutional changes
provided improved conditions to subject meaning-making processes to princi-
ples of the order of security. The reasoning of the judgment allowed the Court
to avoid any substantial balancing act between fundamental rights or goals, which
implicitly provided the Council with a legitimation not to push for stronger data
protection safeguards.
In summary, the institutional situation at the time provided significant advan-
tages to the Council and enlarged the space of possibles for jurisdictional claims
based on a pursuit of security, while it weakened the position of the Parliament
and DPAs in their attempts to strengthen principles of the orders of fairness and
sovereignty. While the European Commission’s reasoning to push for a bilateral
agreement seemed to be grounded in the orders of globalism and production,
emphasizing cooperation and the potential negative impacts of legal uncertainty
for airlines, the European Parliament highlighted a violation of principles of the
order of sovereignty. While the Parliament’s claim for invalidation of the PNR
agreement was successful, the judgment also changed the conditions for justifi-
catory practices. It altered the space of possibles to entrench evaluative criteria of
the order of security, which I outline in Section 5.2.

5.2 Normalizing Data as a Tool

As Hansen and Nissenbaum (2009, 1168) have argued, in a technological context,
actors who attempt to impose security as the main criterion of evaluation
frequently rely on the depoliticization of their actions. Rather than drawing pub-
lic attention to an issue through the creation of an existential threat (Balzacq
2007), the references to supposedly neutral technologies remove the issue from
the political debate. In the case of PNR, an institutional shift facilitated this move.

In Sections 5.2.1–5.2.4, I aim to demonstrate how the discursive and practical fixa-
tion of meaning demarcated the space of possible claims to jurisdiction. I highlight
the transformation of PNR into a key ‘tool’ in the fight against terrorism, which
entrenched a broader normative shift to the order of security.

5.2.1 After the Invalidation of the 2004 Agreement

The invalidation of the first bilateral PNR agreement by the CJEU in 2006 did not
attract significant attention in the US, as domestic authorities just continued to
enforce the ATSA. Yet a new agreement was mandated by the CJEU. The increas-
ingly powerful DHS (Farrell and Newman 2019, 84) explicitly tried to distance
itself from former commitments regarding data protection safeguards. Michael
Chertoff, the then Secretary of Homeland Security, argued that ‘we still remain
handcuffed in our ability to use all available resources to identify threats and stop
terrorists’ (Chertoff 2006), pointing to the importance of transnational coopera-
tion and referring to a potential dystopian scenario in the order of security. This
emphasis on strong security cooperation was reinforced in informal settings. For
instance, the German Federal Minister of the Interior Wolfgang Schäuble allegedly
promised an informal solution if the PNR agreement failed. According to a leaked
US document, Schäuble:

instructed his staff, for example, to find a way to bilaterally share airline Passenger
Name Records (PNR) data with the U.S., in case the EU is unable to re-establish
a legal basis for PNR data sharing before the September 30 EU court-imposed
deadline. The German data privacy commissioner opposes the move and claims it
cannot be done legally, but Schaeuble told his ministry to find a legal way to do it.
(US Embassy Berlin 2006a, para. 4)

This statement highlights the importance of transnational security cooperation
and the perceived obligation to find common solutions for national security issues.
Contrary to the potential expectation that security would be restricted to the
emphasis of sovereign principles (as, e.g., in a (neo)realist paradigm; see Mearsheimer
2003; Waltz 2010), the order of security is highly compatible with principles of
the order of globalism (see also Bauman et al. 2014; Farrell and Newman 2019).
Informal cooperation in this area was established as early as the 1970s with the
introduction of the TREVI Group in 1975, an informal meeting of EU interior
ministers designed to deal with domestic terrorism. The High-Level Political Dia-
logue on Border and Transportation Security, established in 2004, contributed to
the increasing compensation and substitution of formal safeguards by informal
networks (Pawlak 2009, 14–15). This also points to a vision of data governance as
a security partnership.

While the emphasis on cooperation paved the way for a global regulation of
PNR-sharing practices, there is still a tendency to leave room for manoeuvre to
domestic actors. For instance, the US practice of relying on letter exchanges has
been criticized for avoiding legally binding principles (EDPS 2007, 2). According
to leaked documents, ‘The U.S. seeks a different kind of PNR agreement, based
on general principles, not a list of detailed “dos and don’ts”’ (US Embassy Berlin
2006b, para. 7). The emphasis on vague principles conforms to the expectations in
the literature that states demand a considerable amount of leeway concerning core
state powers such as security (Genschel and Jachtenfuchs 2018). While considered
deficient in the order of fairness, informal and flexible guidelines may prove their
worth in the order of security.
The attempts to evaluate the situation according to criteria based in the order
of security were underlined by references to concrete threats. Actors frequently
pointed to the threat of terrorism by referring to, for example, the 9/11 attacks,
the terrorist attacks in Madrid in 2004 and in London in 2005, or the failed
attacks on Christmas Day aboard a Northwest Airlines flight headed to the US
in 2009 (EC 2010b; Patten 2004). In 2007, Chertoff again addressed the Euro-
pean Parliament, emphasizing he believed that ‘we are at war’ and ‘life is the
liberty on which all others depend’ (Chertoff, cited in EP 2007a). This not only
highlights the immediate threat but also explicitly links the common good, includ-
ing the exercise of civil liberties, to the higher common principle of security.
While the literature has emphasized the role of ‘external shocks’ (Mahoney and
Thelen 2010, 4) in institutional change (Busch 2013), the terrorist attacks did
not contribute to increased politicization but to increased convergence around
security interests (Maricut 2016). This is in line with the expected depoliticizing
effect of security references in tech-heavy sectors (Hansen and Nissenbaum 2009,
1168).
In consequence and despite a lack of evidence regarding the effect on pub-
lic safety (29WP 2013a, 1), PNR became increasingly normalized as a significant
instrument in the fight against terrorism. A new agreement was adopted in 2007.
In line with the court judgement, PNR was increasingly referred to as a ‘tool’
(EU and US 2007, 2). For instance, the Council proposal characterized PNR data
as ‘a very important tool for carrying out risk assessments of the persons, for
obtaining intelligence and for making associations between known and unknown
people’ (Council of the EU 2007, 3). In contrast to a conception of data as a mat-
ter of individual rights and personal autonomy, the conceptualization of data as
a tool entrenches the fixed and apolitical nature of data as a governable object.
In this understanding, data may be employed for a certain purpose and then
be disregarded without any implications for the passenger’s identity or dignity.
This conceptualization altered the space of possibles regarding the necessary
safeguards.
The new interim PNR agreement in several respects offered a lower level of data
protection. For example, the 2004 agreement emphasized in its first paragraph:

the importance of respecting fundamental rights and freedoms, notably privacy,
and the importance of respecting these values, while preventing and combating
terrorism and related crimes and other serious crimes that are transnational in
nature, including organised crime.
(EU and US 2004, para. 1)

In contrast, the first paragraph of the 2007 PNR agreement specifically stated that
the purpose of the agreement was to ‘prevent and combat terrorism and serious
transnational crime effectively as a means of protecting their respective democratic
societies and common values’ (EU and US 2007, para. 1). The fourth paragraph
referred first to the fight against terrorism and crime and then introduced the con-
dition ‘while respecting fundamental rights and freedoms, notably privacy’ (EU
and US 2007, para. 4). This emphasis was, however, again limited in the fifth
paragraph, which stated that ‘U.S. and European privacy law and policy share a
common basis and that any differences in the implementation of these principles
should not present an obstacle to cooperation between the U.S. and the European
Union’ (EU and US 2007, para. 5). While these are seemingly small differences and
details, the position of provisions in such agreements generally provides an esti-
mation of the hierarchization of priorities. The emphasis on the order of fairness,
which featured prominently in the 2004 agreement, was replaced by an emphasis
on security cooperation and the fight against terrorism. The Parliament and the
European Data Protection Supervisor heavily contested the 2007 agreement due
to the lack of data protection. Both problematized that the safeguards specified in
the letters could be unilaterally withdrawn at any point (EDPS 2007, 1). Yet this
criticism had no significant effects.
In summary, the increasing shift towards the order of security as the evaluative
criterion of PNR data was based on the transformation of data into a neutral ‘tool’
in transnational security cooperation. This not only shaped the space of possible
policy options but also had an impact on the space for contestations by defining the
relevant ‘rules of acceptability’ (Boltanski and Thévenot 1999, 360). The changing
space of possibles also significantly shaped the conditions for further cooperation
and contestation, as I highlight in Section 5.2.2.

5.2.2 Institutional Changes after Lisbon

The entrenchment of the conceptualization of data as a tool contributed to an
increasing discursive linkage between responsible behaviour in the fight against
terrorism and the call for PNR sharing. The notion of responsibility in particular
shaped the perceived space of possibles for the European Parliament (see also
Kaunert et al. 2012). At the beginning of the conflict, the European Parliament
tried to establish itself as a guardian of human rights and the order of fairness,
referring explicitly to ‘the failure of the Commission’ (EP 2004) to safeguard fun-
damental rights. This prompted, for example, MacKenzie and Ripoll Servent to
highlight that the European Parliament had been ‘most resistant to norm taking’
(2012, 72, emphasis in original). Yet the normative character of the challenges to
the status quo in the field decreased considerably over time. This suggests that
its increased institutional competences and the corresponding responsibility dis-
course ‘have led the EP to abandon, or at least compromise, on its principles’
(Ripoll Servent and MacKenzie 2011, 401).
Until the Lisbon Treaty came into force in 2009, and due to the institutional shift
towards the JHA Council in 2006, the institutional setup had largely contributed
to the entrenchment of PNR. However, the Lisbon Treaty significantly changed
the institutional competences of the European Parliament.⁴ In the Area of Free-
dom, Security and Justice (AFSJ), often also referred to under the former label of
JHA, the European Parliament has risen to the role of a co-legislator which estab-
lishes increased parliamentary accountability in a particularly contested policy
area (Rittberger 2012). While the Parliament had been able to provide opinions
on so-called ‘third pillar’ decisions, these were not binding. This made it possible
for member states to adopt the 2004 PNR agreement despite the determination
of the European Parliament that transfers were illegal (US Mission to the EU
2009, para. 2). As is evident from leaked reports (US Mission to the EU 2009,
para. 4), there were increasing concerns due to the looming implementation of
the treaty. This concerned particularly the requirement for parliamentary ratifi-
cation of international agreements. The European Parliament does not have the
formal right to initiate legislation. Yet the LIBE Committee was now able to draw
up reports and issue resolutions on its own initiative. These authority gains of
the Parliament constituted a potential threat to the stability of the PNR agree-
ment. The agreement was formally still only implemented as an interim agreement
because of an insufficient number of approvals by the national parliaments. This
meant it now required approval by the European Parliament (Farrell and Newman
2019, 86).

5.2.3 Changing Conceptions of Responsibility

In May 2010, MEPs rejected a vote on the interim agreement. This veto initially
seemed to represent a more foundational normative challenge based on the order
of fairness. The European Parliament had just decided to reject an agreement on
financial data sharing for counterterrorist purposes (see Chapter 6) on the basis of
a lack of normative fit (MacKenzie and Ripoll Servent 2012, 80). The PNR agree-
ment seemed to constitute a similar assertion of the Parliament’s newfound power
(de Goede 2012b). Yet while the European Commission was forced to negotiate
a new PNR agreement, the assessment of the European Parliament as a signifi-
cant challenger was put in doubt when MEPs accepted a largely unchanged PNR

⁴ For a comprehensive overview, see, e.g., Rittberger (2012).


agreement proposal in 2012. This prompted the NGO EDRi to call it a ‘bad day for
civil liberties in Europe' (EDRi 2012). The Liberal MEP Sophie in 't Veld, who acted
as rapporteur, even withdrew her name in protest because she felt that the agreement
failed to respect the Parliament's demands on fundamental rights concerning
the retention period, necessity and proportionality concerns, and provisions for
sensitive data (Carrera et al. 2013, 13).
The 2011 agreement was different mainly in the specifics of the retention period,
which was changed to five years (compared to three in the 2004 agreement and fif-
teen in the 2007 agreement), and the fact that data were stored in a depersonalized
and then anonymized form. In the agreement, the conceptualization of data as a
tool was largely upheld. PNR was referred to as ‘necessary’ (EU and US 2011) or
even a ‘tool we cannot do without’ (Meehan 2011, 3), while reports of the EU
Counter-Terrorism Coordinator repeatedly stated that ‘PNRs are a key element
in the fight against international terrorism. It has been and remains a valuable
tool to detect terrorist networks and movements’ (Council of the EU 2010b, 22,
2011, 28).
It seems reasonable to conclude that the European Parliament failed to con-
vert its improved position in the field into the policy change it had previ-
ously demanded. In view of the early challenges, it seems surprising that an
increase in the European Parliament’s formal authority did not lead to signifi-
cant changes concerning demands to respect principles of the order of fairness,
particularly data protection and privacy. The literature seems to converge around
the increasing manifestation of the European Parliament’s role as a responsi-
ble legislator (Ripoll Servent 2013, 982) or partner in global counterterrorism
(Kaunert et al. 2012, 490). The Parliament struggled to be recognized as a
global actor. For instance, members of the US Congress did not necessarily
make time to meet MEP delegations (EP official, pers. comm. 2018) and thus
excluded the European Parliament from transnational processes of meaning-
making (Farrell and Newman 2019, 87). In addition, it is likely that the past
failure to gain recognition for a more fairness-based approach constrained the
perceived space of thinkable and sayable options, including those for civil society
actors.
Moreover, the institutional structure of the field had further converged around
common security interests. For instance, the Stockholm Programme emphasized
the role of ‘necessary technological tools’ (Council of the EU 2010a), including
PNR, in the protection of European citizens, highlighting how data-driven secu-
rity policies have persistently shaped EU counterterrorism efforts. This was in
tune with a more comprehensive strategic shift towards prevention and risk in
the understanding of security (see also Amoore 2013).
Furthermore, and in contrast to other cases, the US was closely involved in the
meaning-making process. Already in 2005, the DHS emphasized the important
role of common liberal values, including privacy as ‘an essential right and fun-
damental value’ (DHS 2005, 3). In his 2007 address to the European Parliament,
the US Homeland Security Secretary Michael Chertoff emphasized that 'civilised
countries should respect each other's privacy laws' (cited in EP 2007a), thus linking
sovereign equality and privacy as community values. In the early 2010s, the
US even more actively pursued this strategy. The 2011 US National Strategy for
Counterterrorism (NSCT) constituted a ‘sea change’ (Porter and Bendiek 2012,
500) because of its strong commitment to fundamental rights, the rule of law,
and privacy (US White House 2011, 4). US officials also largely embraced innova-
tions in privacy and data protection as contributing to legal certainty. The DHS
responded comprehensively to European concerns by implementing review processes
and inviting data protection experts for consultations and information sessions.
In an interview (DPA representative, pers. comm. 2017), a member of a review
committee described the review process, recounting how US intelligence members
were very accommodating and willing and able to answer questions comprehen-
sively. They also demonstrated how PNR was used in security-relevant situations.
However, the interviewee also described how manifest concerns repeatedly voiced
by two members of the review committee were not included in the report. Again,
evaluative objects and practices perceived as valuable in the order of fairness
were employed but did not prompt significant changes. Thus, in contrast to, for
example, the dispute about commercial data transfers discussed in Chapter 4, the
US involvement created a transatlantic link in the meaning-making process which
established the legitimacy of the US jurisdictional claim.

5.2.4 Limited Challenges after Snowden

The transatlantic link based on the promotion of data governance as a security
partnership also persisted after the Snowden revelations, despite improved
transnational connections between the European Parliament and the US Congress
after the introduction of a parliamentary liaison office in Washington (EP official,
pers. comm. 2018). The European Parliament asked EU institutions and mem-
ber states to consider the suspension of the PNR agreement (EP 2013a, para. 4)
and criticized the ‘lack of judicial or administrative rights [that] nullifies the pro-
tections for EU citizens laid down in the existing PNR agreement’ (EP 2014a,
para. BG). Yet challenges remained limited. The Snowden revelations did not cre-
ate the same shock that catalysed meaning-making processes in other areas (see
Chapter 4). Despite the increased salience of the issue, the revelations did not
expose particularly new information. This made it more difficult to impose evalu-
ation criteria linked to the order of fairness, such as a dystopian scenario of mass
surveillance. The continued challenges to the PNR agreement had contributed to
transparency and review mechanisms that, despite potentially limited capabilities,
contributed to a stabilization of the compromise. The European Commission
and the DHS conducted a joint review of the PNR agreement, which
discussed limited but lawful transfers to the NSA. In consequence, PNR was not
more politicized after the Snowden revelations but became increasingly normal-
ized. While there are still some MEPs pushing against PNR, such as in ’t Veld, one
interviewee from the NGO Privacy International even stated ‘PNR has not been
on our agenda for a while; we are not engaged with it’ (NGO representative, pers.
comm. 2019).
In sum, PNR sharing has become an increasingly accepted security practice that
has stabilized through continued contestations of a decreasing normative qual-
ity. By subscribing to the ‘illusio’ (Bourdieu 1996, 331–6) that data governance
can contribute to the common good, actors accepted PNR as a necessary tool in
the fight against terrorism. The space of possible policy options was limited to the
extent of safeguards rather than the question of abolishing data sharing altogether.
As I demonstrate in Section 5.3, this has affected domestic and international
developments.

5.3 The EU PNR Directive and Global Expansion

In Sections 5.3.1–5.3.3, I illustrate how the resolution of the jurisdictional conflict
over PNR affected the broader policy arena. The section shows the consequences
of meaning-making processes in the context of passenger data sharing within the
EU and the adoption of PNR sharing obligations by the UNSC. These measures
have entrenched PNR not only as a transatlantic but as a global security practice.

5.3.1 EU PNR Directive

Despite significant criticism regarding PNR, discussions about the establishment
of a framework to share passenger data within the EU had started early on. In a
2003 statement, the European Commission was optimistic that within the next
three and a half years 'the EU will have developed its own policy on the use
of PNR for transportation and border security purposes, and the US debate on
data privacy may also have evolved’ (EC 2003, 7). While overstated in hindsight,
this statement demonstrates that the evolution of an EU PNR directive not only
was the result of a norm convergence over time but addressed previously exist-
ing interests and convictions. Yet the fact that over ten years passed between this
statement and the adoption of the EU PNR Directive in 2016 also demonstrates its
contested quality. The development of an EU-internal framework had been on the
agenda since shortly after the introduction of the ATSA, not only for the increas-
ingly assertive European Commission (Farrell and Newman 2019, 89) but also for
the Council members who aimed to improve the EU’s position in the field. For
instance, the Portuguese Minister of Internal Administration Rui Carlos Pereira
and the German Minister of the Interior Schäuble suggested that ‘a PNR system
would allow the EU to negotiate with the U.S. on an equal footing and would allow
for balanced cooperation’ (Guardian 2010). A 2008 proposal of the European
Commission failed to be adopted before the Lisbon Treaty came into force, which
made substantial revisions necessary to secure the support of the European Par-
liament. After prolonged debates (Brouwer 2009), a new proposal was presented
in 2011. Shortly before the Snowden revelations, in April 2013, the LIBE Com-
mittee rejected the proposed PNR Directive by thirty votes to twenty-five because
of concerns regarding inadequate safeguards. The EDPS warned of installing ‘the
first large-scale and indiscriminate collection of personal data in the history of the
Union’ (2015b, 1), pointing to a dystopian scenario in the order of fairness.
In 2016 and as part of a ‘package deal’ (EC official, pers. comm. 2019) with the
GDPR, the European Parliament adopted the EU PNR Directive with 461 votes in
favour, 179 against, and 9 abstentions. As the Parliament pushed for the inclusion
of human rights references, the agreement included some concessions. Yet the first
mention of privacy is in paragraph 15 (PNR Directive 2016, para. 15). The PNR
Directive largely focuses on the necessity of PNR, emphasizing that:

Effective use of PNR data, for example by comparing PNR data against various
databases on persons and objects sought, is necessary to prevent, detect, investi-
gate and prosecute terrorist offences and serious crime and thus enhance internal
security, to gather evidence and, where relevant, to find associates of criminals
and unravel criminal networks.
(PNR Directive 2016, para. 6)

The necessity of PNR is presented as a factual statement, despite existing doubts
about its effectiveness (EDPS 2015b). The original proposal did not even include
the reference to ‘effective use’ but just established that PNR data were neces-
sary (EP 2016a). This again highlights an almost automatic assumption that the
expansion of data collection and processing is necessary for achieving security,
as stipulated by the concept of the 'illusio' (Bourdieu 1991, 179). While
MacKenzie and Ripoll Servent (2012) juxtaposed the 'critical outsider' behaviour
in PNR with the ‘insider’ position of the European Parliament in the TFTP
agreement (see Chapter 6), this assessment seems to be no longer valid.

5.3.2 Reasons for Adoption

Under the PNR Directive, data transfers for intra-European travel are not com-
pulsory, but nearly all member states decided to include such processing in their
legislation. During the negotiation phase, the Conservative LIBE Committee Rapporteur,
Timothy Kirkhope, explicitly recommended the mandatory inclusion of
intra-EU flights due to the benefits of a ‘uniform set up and strong security
advantages’ (LIBE 2015, 59). The report stated that:

Given the importance of the purpose for which the PNR data is collected and pro-
cessed, as well as the varied, sophisticated and international nature of the threat
posed, it is necessary to have a system which operates on a 100% collection basis
both within the EU and with third countries in order for the system to be fully
effective. The collection of 100% data also reduces the risk of profiling.
(LIBE 2015, 100)

Justifying a wider scope of surveillance measures with a decreased risk of
profiling is unique in European Parliament communication, which again highlights
the significance of rapporteurs (former MEP adviser, pers. comm.
2019). While in ’t Veld had actively taken a stance against the PNR agreement,
Kirkhope further entrenched the necessity of PNR. It also links liberal values and
counterterrorism as promoted by US actors.
Besides the role of key individuals, a diversity of reasons contributed to the
adoption of the EU PNR Directive. First, the status quo created pressure. PNR on
the domestic level was already in place in a significant number of member states
(Bąkowski and Voronova 2015), which made the introduction of community rules
to limit access practices more appealing. For instance, the UK had adopted a PNR
sharing system in 2012 to prevent any attacks in the context of the Olympic Games.
This pressure increased when, in a heavily criticized decision in 2013, the Euro-
pean Commission provided a total of €50 million to fourteen EU countries for
developing domestic PNR proposals. This move was perceived as ‘a Trojan horse’
(Farrell and Newman 2019, 89), as it aggravated the risk of fragmented national
systems, which in turn strengthened the pressure for legal harmonization.
Second, throughout the jurisdictional conflicts over PNR, security became
increasingly entrenched. In contrast to the Schrems case analysed in Chapter 4,
in which European DPAs established a credible threat to undercut US business
practices in the EU, the security relationship between the EU and the US was and
is in many ways unequal. The US has better access to intelligence information, on
which European intelligence services often depend. Data flows under the bilat-
eral PNR agreement exacerbated rather than mitigated this dependency, which
also reinforced the call for a European PNR system (EC 2010b). As the EU was
working on additional agreements with other countries, not using an intra-EU
system became even less attractive. The terrorist attacks in Paris in 2015 and Brus-
sels in 2016 provided additional salience to the order of security. The statement of
JHA ministers following the terrorist attacks in Brussels referred to the adoption
of the PNR Directive ‘as a matter of urgency’ (Council of the EU 2016), extensively
emphasizing the priority of increased information sharing. The prime minister of
France Manuel Valls decided to directly address the European Parliament, asking
it to vote for the PNR Directive and stated, ‘We’re at war’ (cited in Davis 2016), thus
repeating the statement the US DHS Secretary Michael Chertoff had made in 2007.
The concept of war re-emphasizes the idea of an existential threat as well as notions
of responsibility. Valls added, ‘I say in particular to the socialist and environmental
groups in the European Parliament: Everyone should assume their responsibilities'
(cited in Davis 2016). The French Minister Bernard Cazeneuve echoed this
narrative, arguing that it was ‘irresponsible of the European parliament to delay
the vote on the adoption of PNR’ (cited in Rankin 2016). The establishment of
an existential threat also invalidated other forms of proving worth. Several actors
tried to contest the effectiveness of PNR. The EDPS stated that 'the Impact Assessment
includes extensive explanations and statistics to justify the Proposal. These
elements are however not convincing’ (EDPS 2011a, 3). According to an intervie-
wee, in an attempt to appeal to standards consistent with the order of production,
the European Greens tried to refer to cost-based arguments to emphasize the lack
of effectiveness and efficiency but failed (former MEP adviser, pers. comm. 2019).
Third, it is important to take note of private sector interests. On the one hand,
airlines called for a blanket collection of PNR in all forms of transport, includ-
ing trains, ‘so as to avoid discrimination and competitive distortions’ (Association
of European Airlines 2007, 1), thereby fostering an expansion of the regime.
On the other hand, private security companies designing security solutions con-
tributed to their perceived necessity. As information sharing has become an
integral part of security and defence policies, demand for private sector solutions
is increasing. In 2003, the European Commission convened a ‘Group of Person-
alities’ that included several industry representatives, including some from arms
production companies. The defence industry assumed an important role in shap-
ing future security policies (M. Britz and Eriksson 2005). Since then, different
funding programmes have promoted the use of technological tools in securing
both internal security and external borders. In 2015, the European Commis-
sion convened another group of personalities, which comprised sixteen members,
seven of whom held significant positions in private security or defence companies,
such as Indra, MBDA, or BAE Systems. The EU Ombudsperson noted that 'the
Group of Personalities seems to have played a significant role in preparing the
Commission’s EDAP [European Defence Action Plan], especially as regards the
European Defence Fund’ (European Ombudsman 2018, para. 21). The European
Defence Fund, reduced from the proposed €13 billion to €8 billion between 2021 and
2027, is intended to include significant investments in disruptive technologies and
data-based security practices. This demonstrates the broader ongoing entrench-
ment of security concerns in the field, which are interlinked with the expansion of
PNR sharing.
There are also indicators of a more substantial involvement of private actors.
Press reports (Lévy 2016) and a publication by the NGO network EDRi
(McNamee 2016) point to a connection between the French PNR proponent
Manuel Valls and the company Safran. Safran is based in Évry, where Valls used to
be mayor until 2012. The Safran Group designs and implements data-based secu-
rity measures, including the biometric Aadhaar system in India that aims to be
the largest identity database in the world, a facial recognition program at Changi
Airport in Singapore, and the French army’s drone system (Safran 2016). The
Safran Group’s division Morpho was contracted to supply an API-PNR system by
France in 2014 and by Estonia in 2015.⁵ There is no evidence of any direct involve-
ment, but reports have claimed that Valls intervened when a contract was given to
a competitor (Lévy 2016). While private military contractors are relatively well
researched (Leander 2005), research on what is referred to as the ‘surveillance-
industrial complex’ (Ball and Snider 2013; Hayes 2012) is still in embryo. It is
evident that profits from systems like PNR often benefit the private sector. Compa-
nies such as US software developer SAS, which ‘offers a broad PNR data system for
border management and security that supports continuous management and eval-
uation of high-risk passengers based on their pattern of activities, watch lists and
other data while helping to expedite the smooth movement of legitimate travellers’
(SAS 2017, 1), have an interest in expanding data processing.
In sum, these examples highlight the stabilization of compromises between
the order of security, the order of globalism, and the order of production: Pri-
vate companies have increasing interests in promoting their technological security
solutions based on data availability, mandated by public legislation. To avoid
inequalities between countries, these measures are implemented broadly. This
contributes to a self-reinforcing logic whereby both private companies and public
officials aim to increase the use of data-based security measures, stabilizing juris-
dictional claims based on a vision of data governance as a security partnership.

5.3.3 PNR Going Global

The self-reinforcing dynamic of data-based security measures is also visible at the
global level. As the EU is exploring a variety of bilateral PNR agreements, several
countries, including the Five Eyes countries, have adopted domestic legislation.
Russia, Mexico, the United Arab Emirates, South Korea, Brazil, Argentina, and
Saudi Arabia have implemented measures and have similarly requested PNR shar-
ing with the EU (Spanish Delegation to the Council of the EU 2015). In 2020, the
Council formally authorized the opening of negotiations regarding a PNR agree-
ment with Japan (Council of the EU 2020). In 2017, the UN Security Council
introduced Resolution 2396 on ‘threats to international peace and security caused

⁵ The former Morpho Division has since merged with Oberthur Technologies to form the new company
Idemia, which concluded both PNR projects. For more information, see https://www.idemia.com/,
accessed 30 June 2022.
by terrorist acts’ (UNSC 2017), which requires all UN member states to develop
systems for processing and analysing PNR. The UNSC also explicitly referred to
International Civil Aviation Organization (ICAO) standards and ‘urges ICAO to
work with its Member States to establish a standard for the collection, use, pro-
cessing and protection of PNR data’ (UNSC 2017, para. 12, emphasis in original).
In 2019, a joint EU–US declaration ‘reaffirmed our shared interest in establishing
ICAO standards to encourage rapid and effective implementation of UNSCR 2396
for the use of PNR to combat terrorist travel, with full respect for human rights
and fundamental freedoms’ (EU and US 2019, 2). This also reflects a commit-
ment to strengthening (institutionalized) global cooperation in this area as well as
the increasing role of the UNSC in counterterrorism after 9/11 (Kreuder-Sonnen
2018). This again highlights the important interlinkages to broader debates in
global governance.
Despite its strong entrenchment, there have been some signs of resistance that
highlight the precariousness of these agreements, including both direct and indi-
rect challenges. As is evident from the legal challenges to the Canadian PNR
agreement (CJEU 2017), challenges against PNR continue. The 29WP emphasized
that it ‘is convinced that the reasoning of the Court against the PNR agreement
with Canada is relevant for all PNR instruments' (29WP 2018, 1–2). Indeed, following
strategic litigation against the 2016 EU PNR Directive, a judgment in June 2022
(Ligue des droits humains 2022) set further restrictions on the collection
and use of PNR. The Court broadly followed the recommendation of the
Advocate General Giovanni Pitruzzella, who, in January 2022, found the mass
collection and processing of data to be compatible with fundamental rights, con-
sidering necessity and proportionality as well as existing safeguards (CJEU 2022).
However, the Court set further restrictions on the scope and retention of data
as well as the purposes for their use, the application of the law in the context of
both intra-EU flights and alternative means of transportation, and the use of
further databases and automated profiling (Ligue des droits humains 2022, para.
299). Thus, while the Court upheld the legality of the PNR Directive, the long-term
consequences of the judgment are unclear. The restrictions set by the Court would
require a significant overhaul of the adopted laws in the member states, and lawsuits
are already pending (Gesellschaft für Freiheitsrechte 2020). Moreover, further
legal challenges are likely to occur, as the restrictions require further interpreta-
tion regarding specific cases, specific types of data and risk, and concerning the
design of review and supervisory mechanisms. While the Court apparently tried
to balance its approach, this judgment may be subject to criticism in the future, as
it not only sets at times impracticable restrictions but also upholds a directive that
‘entails undeniably serious interferences with the rights guaranteed in Articles 7
and 8 of the Charter, in so far, inter alia, as it seeks to introduce a surveillance
regime that is continuous, untargeted and systematic’ (Ligue des droits humains
2022, para. 111).
PASSENGER DATA IN AIR TRAVEL 127

5.4 Conclusion

To sum up the PNR case, successful efforts to conceptualize data as a ‘tool’ in the
fight against terrorism shifted the emphasis away from data as a protectable part
of human dignity, thereby entrenching the order of security in the field of data
governance. Interests of airlines that favoured legal certainty, private companies
that sell data-based security products, and US and EU officials who emphasized
the importance of security as a fundamental goal of governance converged with
endogenous logics that favoured an ever-increasing expansion of the regime. Con-
testation had significant but unintended consequences in this case. On the one
hand, the unwilling self-exclusion of the European Parliament through the 2006
CJEU case contributed to further entrenchment. On the other hand, incremen-
tal contestations proved stabilizing in the long term. While contestations have
constrained existing sharing practices in some ways via legal specifications, the
institutionalization of PNR has contributed to its normalization as a global secu-
rity standard. The jurisdictional conflict was, therefore, resolved by increasing
processes of normative and interest-based convergence, based on the delimitation
of the space of possibles through the conceptualization of data as a neutral tool in
fighting an existential threat.
From a normative perspective, it should be noted that the entrenchment of PNR
increases despite the absence of publicly available evidence of the effectiveness of
the systems. While the review reports mention concrete numbers (e.g. EC 2017b,
8), they do not show to what extent PNR was necessary for identifying these per-
sons, to what extent less intrusive measures such as API could have been used, or
whether any of these cases have resulted in an arrest. Even if more details are
given, information is not necessarily more concrete. In answer to a parliamentary
question concerning the effectiveness of the German PNR system, the German
government states that
‘No information can be provided on the extent to which the measures men-
tioned or the data transmitted have contributed or will contribute to the preven-
tion/prosecution of crime. There is no systematic or standardized recording in the
sense of the question.’ (Federal Government (Germany) 2019, 5, my translation).⁶
This means not only that there is no effort to assess the effectiveness, neces-
sity, and proportionality of the system but also that it is not possible to do so.
The redress mechanisms that provide legitimacy to the jurisdictional claim over
data are also rarely used. The EU–US report notes that the DHS Traveller Redress
Inquiry Program has not received any enquiries or requests for judicial review.
This might be due not only to a lack of knowledge about the system but also to

⁶ The original reads: ‘Zu der Frage, inwieweit die genannten Maßnahmen oder die übermittelten
Daten in der weiteren Folge zur Straftatenverhütung/-verfolgung beigetragen haben oder noch beitra-
gen werden, kann keine Angabe gemacht werden. Eine systematische bzw. standardisierte Erfassung
im Sinne der Fragestellung erfolgt nicht.’

the lack of transparency of security measures. For example, the MEP Sophie in 't
Veld filed a lawsuit against the DHS after she suspected that a freedom of information
request on PNR and other data had led to her being blacklisted, as she needed to
undergo additional security checks when entering the US (Kanter and Minder 2010).
Data-based security measures have been strongly criticized due to their discriminatory
potential, also with regard to, for example, racial profiling and other practices of
‘social sorting’ (Lyon 2007). Resulting watch lists may be used as a justification for
screening measures and for denying individuals air travel, entry to a country, or
government benefits. These lists have been found to infringe the constitutional rights
of US citizens (Savage 2019). Considering that in Germany one lead generates
about 400 false leads (Endt 2019), there is significant potential for false positives.
In Chapter 6, I further outline how such practices of selective transparency and
secrecy contribute to the stabilization of agreement.
6
Financial Data Sharing
The Extended Arm of the US Treasury

In 2006, the use of data for security purposes was on the rise. The Europol Infor-
mation System, a central criminal and intelligence information database, had been
introduced in 2005 (Europol 2020), and the adoption of the EU Data Retention
Directive (Data Retention Directive 2006) just a year later specified mandatory
retention for telecommunication data. As discussed in Chapter 5, the CJEU had
found sharing passenger data in air travel a ‘necessary’ (Joined Cases C-317/04
and C-318/04 2006, para. 57) instrument in the fight against terrorism. In this
chapter, I explore a conflict over financial data between the EU, the US, and the
Belgian Society for Worldwide Interbank Financial Telecommunication (SWIFT).
When a 2006 article in the New York Times (Lichtblau and Risen 2006) revealed
that US authorities had over several years secretly accessed details of international
financial transactions under the so-called Terrorist Finance Tracking Program
(TFTP), significant transatlantic tensions arose. The TFTP is a measure for combating
the financing of terrorism (CFT) that was designed to track financial flows
to and through terrorist networks, which depend on money for purposes includ-
ing training, travel, bribes, or weapons. After the revelation, European actors
initially heavily contested this jurisdictional claim by the US. In 2010, the Euro-
pean Parliament famously rejected a high-level bilateral agreement supposed to
institutionalize these transfers (Monar 2010). However, since 2012, the sharing of
financial data for security purposes has become entrenched in the fight against ter-
rorism, despite ongoing controversies. The conflict shows important parallels to
the Passenger Name Records (PNR) case discussed in Chapter 5. The impetus for
financial data sharing similarly emerged as part of anti-terrorist measures imple-
mented after 9/11. The jurisdictional conflict likewise concerns the clash of these
security measures with EU data protection regulations due to the secondary use of
commercially collected data for security purposes. There was strong contestation
regarding whether financial data sharing would fall under the first (commercial)
or the third (policing and internal security) pillar of the pre-Lisbon EU (de Goede
2012b, 219–20). Yet while access to PNR was based on domestic law, the data
from SWIFT were initially requested in secret. Due to this undisclosed nature of
the initial programme and the continued lack of oversight and transparency, the
TFTP seemed to be even more controversial than the PNR programme. However,
much as with PNR, sharing of financial data has persisted, despite concerns about

Data Governance. Anke Sophia Obendiek, Oxford University Press.
© Anke Sophia Obendiek (2023). DOI: 10.1093/oso/9780192870193.003.0006

its effectiveness (EDPS 2010a, 2010b). Indeed, it has become part of a broader
financial security assemblage that also includes anti-money laundering (de Goede
2012a, 43–5).
In this chapter, I aim to investigate how, despite heavy contestations in the
beginning and throughout the unfolding of the jurisdictional conflict, the con-
flict resolution process has stabilized. Financial data governance has received more
extensive attention in the literature than other issues of data governance. Existing
studies have mostly foregrounded either legal challenges (Fuster et al. 2008; King
et al. 2018) or the respective roles of the EU or the US in security cooperation
(Amicelle 2011; de Goede 2012a, 2012b; Wesseling et al. 2012). As with the PNR
agreement, scholars find the fate of the TFTP to be interlinked with the role of
the Parliament within the EU institutional system (Ripoll Servent and MacKen-
zie 2011) as well as its role as a global actor (Kaunert et al. 2012). De Goede
(2012b) points to pre-existing similarities in risk management and security con-
cerns. While Farrell and Newman rightly stress the access to transnational fora as
a key factor in the persistence of the TFTP (Farrell and Newman 2019; see also
Pawlak 2009), they underestimate the necessity of creating a lasting normative ‘fit’
(MacKenzie and Ripoll Servent 2012, 74).
This chapter aims to add to more recent scholarship that illustrates how dis-
tinct values become inscribed in such data infrastructures and shape regulatory
decision-making (Bellanova and de Goede 2022) and particularly how selective
practices of secrecy and transparency entrench surveillance practices (Amicelle
2013; Curtin 2014; de Goede and Wesseling 2017). I argue that the introduc-
tion of selective transparency measures such as review and oversight mechanisms
strengthened the idea of TFTP effectiveness and legitimacy. They specify trans-
parency requirements but leave significant leeway for actors under scrutiny. This
re-establishes the character of such mechanisms as ‘composite objects’ (Boltanski
and Thévenot 2006, 279) that represent compromises between the order of fair-
ness and the order of security. As expected by the theoretical framework, this has a
significant stabilizing effect on the conflict resolution process and outcome. It has
contributed to the entrenchment of evaluative criteria that tie jurisdictional claims
to a specific social reality. In this case, the vision of data governance as a secu-
rity partnership requires rules to stabilize the regime but works mainly through
informal transnational cooperation in the fight against terrorism and crime. This
chapter thereby discusses the underlying mechanisms that have perpetuated the
‘illusio’ (Bourdieu 1991, 179) of the field of data governance, that is, that data
governance is meaningful in the pursuit of the common good. Even formerly chal-
lenging actors such as the European Parliament are complicit in upholding an
illusio of effectiveness concerning the pursuit of the common good of security.
The chapter is structured as follows: first, I give a brief overview of the emer-
gence of the jurisdictional conflict after the revelation of US access to financial data
by the New York Times and point to responses. Second, I illustrate the increasing
entrenchment of financial surveillance for security purposes through transparency
and review mechanisms and the stabilization of the jurisdictional claim over finan-
cial data. Third, I embed the TFTP in a broader assemblage of financial data
governance.

6.1 The Surveillance of Financial Transactions

To understand the emergence of jurisdictional conflict over counterterrorism
financing, it is important to examine the specific context in which these policies
arose. As the financial sector is one of the primary areas of globalization and
increasing interlinkages, the likelihood for jurisdictional overlap is high (Krisch
et al. 2020). From the mid-nineteenth century, stock exchanges relied on
telecommunication for inter-market communication. The introduction of the electric telegraph and
the first transatlantic submarine cable in 1866 entrenched the dependence on
quick communication and borderless transfers. The question of terrorism financ-
ing became more urgent in the 1990s, and US authorities unsuccessfully attempted
to gain access to SWIFT data (Wesseling et al. 2012, 54). Terrorism financing was
addressed by the UN International Convention on the Suppression of the Financ-
ing of Terrorism initiated in 1999, the UN asset-freezing regime (King and Walker
2015), and sometimes also in the context of anti-money laundering (Pieth 2006).
As terrorist financing may involve the use of licit funds, for example through dona-
tions to NGOs, funds supplied by humanitarian organizations in the framework
of disaster relief, or income from businesses, the 1999 UN Convention repre-
sented a major step forwards in the criminalization of terrorism financing (Pieth
2006). Yet as I outlined in Chapter 5, the changing security environment after 9/11
contributed to extensive reforms of counterterrorism policies globally. After the
attacks, surveillance efforts increased significantly, also because of the changing
perception of terrorism (C. Walker 2018), which provided a ‘window of opportu-
nity’ (Weiss 2008) to intensify public security measures. The analysis of financial
data promised insights into international terrorist networks, particularly Al Qaeda.

6.1.1 The Terrorist Finance Tracking Program

In October 2001, US authorities began requesting data on financial transactions
from the Belgian cooperative society SWIFT (Lichtblau and Risen 2006; Vanbever
2006). Through the TFTP, the US Treasury Department issued administrative
subpoenas for terrorism investigations. Administrative subpoenas do not require
judicial oversight but are issued directly by the responsible government agency.
Data included details such as name, address, account number, and national iden-
tification number and were restricted to terrorism-related data. Per the agreement

with the US Treasury, SWIFT could not disclose the requests (SWIFT 2006). It is
important to note that the TFTP was and is not designed to interdict any transfers
but is restricted to the identification and tracking of terrorist funding networks (US
Department of the Treasury 2022). In contrast to mutual legal assistance treaties,
which are used in the prosecution of crimes (see Chapter 7), the TFTP is based
on a ‘politics of pre-emption’ (de Goede 2018a, 759–60) which tries to identify
potential terrorists before they strike.
SWIFT serves over 11,000 financial institutions in more than 200 countries
and handles more than 7.8 billion financial messages, which makes up the vast
majority of electronic financial transactions in the world (SWIFT 2022). It devel-
ops international standards and facilitates international financial transactions.
SWIFT saves data on financial transactions in case of disputes between banks and
clients (Commission de la Protection de la Vie Privée 2006, 3). Through its quasi-
monopoly position, SWIFT has a key role in international trade and is subject to
oversight by the central banks of the Group of Ten (G10). SWIFT data contains
details about transactions, such as currency, amount, bank and account details,
date, addresses, and in some cases national identification numbers or additional
personal data.
While SWIFT is an organization operating under Belgian law, the data were also
accessible on an identical server from its processing centre in the US. European
data protection authorities were not aware that SWIFT mirrored the data. As the
data were technically stored in the US, this facilitated direct data access by
the US authorities. The data comprised information about financial transactions to
and from the EU. In principle, this required compliance with both member state
and EU law, including judicial authorization and an adequate standard of data
protection (EP 2006, note 2). SWIFT argued that ‘since the subpoenaed subset of
data already were on US soil, at no point did any crossborder data transfer exist’
(SWIFT 2006, 9). SWIFT, therefore, stated that any cross-border data transfers
had occurred purely for commercial purposes in line with required safeguards.
As a processor, SWIFT did not consider that it had any specific responsibility
for the data that were now stored on US soil. It suggested that any onward trans-
fers were out of its control and consequently complied with subpoenas requiring
data access (Vanbever 2006). Nonetheless, executives at SWIFT seemed to have
been aware of potential legal implications. They asked for additional oversight
and considered withdrawal from the agreement in 2003 (Commission de la Pro-
tection de la Vie Privée 2006, 6–8; Lichtblau and Risen 2006). In response, the US
Treasury introduced additional safeguards, and SWIFT subsequently informed
the National Bank of Belgium and members of the G10 group of central banks
that it was subject to US subpoenas (Commission de la Protection de la Vie Privée
2006, 7–8). While the jurisdictional claim by the US Treasury had been extended
in secret, it now included a broader group of actors that were aware of some form
of data access.

6.1.2 The Disclosure and Instigation of Jurisdictional Conflict

While some actors in the EU had been informed about the overlapping claims,
most were not, and the 2006 article in the New York Times (Lichtblau and Risen
2006) that revealed the extent of international financial transactions data accessed
by the US caused significant tensions. In particular, data protection authorities
and the European Parliament felt blindsided (DPA official, pers. comm. 2017).
Through subpoenas, the US had forced a Belgian firm acting under EU law to
submit information through extraterritorial rule and thereby potentially break
EU data protection rules. Again, the secondary use of bulk commercial data for
law enforcement and security purposes was potentially at odds with the purpose
limitation principle as specified by the 1995 Data Protection Directive (Data Pro-
tection Directive 1995, Art. 6(1)(b)) (see also 29WP 2006; Commission de la
Protection de la Vie Privée 2006). The extent of the data transfers was not clear,
particularly because a single subpoena might grant access to a significant amount
of data (Bruguière 2010). Access practices not only concerned the data of European
and US citizens but—due to the international character of the SWIFT network—
impacted financial transactions all over the world. Complaints to data protection
authorities were filed not only in all EU member states but also in Australia,
Canada, Iceland, Liechtenstein, New Zealand, Norway, Switzerland, and Hong
Kong (Bilefsky 2006a).
When the transfers became public, SWIFT rejected allegations of legal mis-
conduct, arguing that it was acting in accordance with US law and Belgian
data protection law (SWIFT 2006). The insistence on legality is, however, to
some extent invalidated by the statement of Leonard Schrank, the Chief Exec-
utive Officer of SWIFT, who said in the same year, ‘We are caught between
complying with the U.S. and European rules, and it’s a train wreck’ (cited in
Bilefsky 2006b), thus hinting at conflicting obligations under different regu-
latory systems. SWIFT established itself as a target of US power, emphasiz-
ing that it ‘had no choice but to comply with compulsory subpoenas lawfully
issued by the UST [United States Treasury] to its US branch’ (SWIFT 2006,
6). SWIFT also emphasized that it had ‘obtained unique and effective data
protection guarantees from the UST, in circumstances where the Act [Belgian
data protection law] was not applicable’ (SWIFT 2006, 9). One of these guar-
antees was that SWIFT was not to be named as the original source of any
intelligence derived from the data (Commission de la Protection de la Vie
Privée 2006, 7). Thus, while establishing safeguards, SWIFT had also made sure
that the US authorities would not disclose the role of SWIFT in intelligence
collection.
Initial EU responses to the revelations were very critical, especially since the
US had kept the programme secret. The European Parliament adopted various
resolutions (EP 2006, 2007b, 2009) expressing deep concern and disappointment
regarding the TFTP. The Parliament justified these concerns in reference to the
order of sovereignty, problematizing particularly the violations of EU territorial
integrity and citizens’ rights. For instance, the European Parliament argued that
it ‘[s]trongly disapproves of any secret operations on EU territory that affect the
privacy of EU citizens; is deeply concerned that such operations should be taking
place without the citizens of Europe and their parliamentary representation hav-
ing been informed’ (EP 2006, para. 13). The Parliament highlighted the possibility
of ‘large-scale forms of industrial and economic espionage’ (EP 2006, para. D) and
expressed ‘its serious concern at the fact that a climate of deteriorating respect for
privacy and data protection is being created’, thus pointing to a dystopian scenario
in the order of fairness (EP 2006, para. 1).
The Parliament also criticized EU institutions and member states for not clearly
delegitimizing the jurisdictional claim by the US. In its 2007 resolution, the Par-
liament stressed that it ‘[d]eeply regrets the fact that, several months after these
matters came to light, the Council has not yet taken a stance on this subject affect-
ing so many citizens, consumers and enterprises’ (EP 2007b, para. 16). Similarly,
the Article 29 WP strongly criticized the ‘currently illegal state of affairs’ (29WP
2006, 27) and argued that it constituted ‘a violation of fundamental European
principles as regards data protection and is not in accordance with Belgian and
European law’ (29WP 2006, 26). Both the Belgian DPA and the Article 29 WP con-
sidered SWIFT and in part also the financial institutions, SWIFT’s shareholders,
to be responsible for undermining privacy and data protection principles (29WP
2006; Commission de la Protection de la Vie Privée 2006).
In consequence, a conflict emerged between SWIFT, Belgium, and the EU
regarding the application of data protection law. There were several questions that
had to be addressed, as neither SWIFT nor the European central banks had made
efforts to inform their customers or the DPA of the transfers (Bilefsky 2006b). It
was also unclear whether SWIFT would have needed to notify DPAs or banks of
the change in the purpose of the data—or to what extent processing for security
purposes constituted a change in purpose in the first place, as financial institutions
are always obligated to screen transactions for violation of anti-money laundering
(AML) and counterterrorism policies (US Mission to the EU 2007). This ques-
tion had implications for whether financial data sharing would fall under the first
(commercial) or the third (policing and internal security) pillar of the EU, with the
former being subject to EU competences and the latter being mostly restricted to
the member states and the Council. The security character prevailed, and until the
Lisbon Treaty came into force in late 2009, the European Parliament in particular
had no final decision-making power, only an advisory role.

6.1.3 Towards Compromise

The member states in particular were in favour of a bilateral agreement, also
due to interest in the information for their counterterrorism investigations
(Monar 2010, 144). Yet due to the significant attention to the secret nature of the
TFTP, widespread open support for US practices was politically sensitive. The
Belgian prime minister Guy Verhofstadt publicly called the programme ‘an abso-
lute necessity’ (cited in Anderson 2006), but most member state representatives,
including those from Germany and France, expressed their support in more infor-
mal or private settings. Only the UK seemed to be hesitant about an EU–US
agreement, because it was likely that the Commission’s leading role in negotiations
would imply more constraints on data access and sharing (Farrell and Newman
2019, 104–5). Furthermore, according to a leaked document by the US embassy in
London, the UK seemed:

extremely concerned that the Terrorist Finance Tracking Program, and SWIFT’s
involvement in it, is being treated by the European Commission purely as a pri-
vacy issue when it should be ‘rooted in a national security debate’. While HM
Treasury officials are currently preparing advice for ministers on UK next steps,
they are planning to argue that the European Council, not the Commission, has
competency over this issue.
(US Embassy London 2006)

Hence, UK officials seemed to have been concerned about a potential shift in the
debate towards evaluative mechanisms rooted in the order of fairness. Instead,
they pointed to principles rooted in the order of security and emphasized informal
cooperation mechanisms and transnational dialogue, which is more consistent
with a vision of data governance as a security partnership. The European Com-
mission was generally in support of the programme. In 2005, the unit responsible
for data protection had been transferred from the Internal Market Directorate
General to the Directorate General for Justice, Freedom, and Security. This unex-
pected move was strongly criticized, due to the potentially stronger influence of
security-favouring officials (Farrell and Newman 2019, 109). A leaked document
describes meetings between, inter alios, the Treasury Deputy Assistant General
Counsel James Freis and various representatives from the European Commission,
the Council, and member states, including Jonathan Faull, the Director General
for Justice, Freedom, and Security. The report notes the commitment of both EU
and US officials to the programme and states that:

Faull asked if it would be possible for Europeans to be added to the oversight
mechanism … Freis made clear that the US would have problems if the
participants were European data protection authorities; this was a ‘non-starter’.
(US Mission to the EU 2007)

There were opportunities for exchange between high-level officials which, how-
ever, crucially excluded data protection and European Parliament representa-
tives. In this regard Pawlak (2009) stresses in particular the role of various
transatlantic fora for informal, mostly security-centric discussions that emerged
in the early 2000s, such as the High Level Political Dialogue on Border and
Transportation Security, the High Level Contact Group, regular JHA High Level
Meetings, and the introduction of the High-Level Working Group on Infor-
mation Sharing and Privacy and Personal Data Protection. The Parliament’s
leverage was limited. Months earlier, in May 2006, the CJEU had excluded
the European Parliament from negotiations of the PNR agreements (Joined
Cases C-317/04 and C-318/04 2006). In its 2007 resolution, the European
Parliament attempted to draw on transnational cooperation between parlia-
ments (EP official, pers. comm. 2018) through multiple references to the US
Congress and parliamentary oversight (EP 2007b, paras. 1, 3–4), with limited
effect.

6.1.4 A Bilateral Agreement

The revelations contributed to two interrelated jurisdictional conflicts. First, they
exposed overlapping jurisdictional claims by the EU and the US.
Second, the European Parliament and data protection authorities in particular
outlined a conflict with SWIFT regarding the legitimacy and legality of data trans-
fers. Yet in view of the Parliament’s more peripheral position in the field and the
support by the Commission and the member states, a transatlantic agreement with
at least limited scrutiny of the TFTP constituted the most viable medium-term
conflict management strategy.
In June 2007, the parties concluded the agreement. It was based on letter
exchanges which were carefully set up to avoid the impression of legal bind-
ingness, which would require domestic ratification (Farrell and Newman 2019,
106). The US Treasury sent a letter with unilateral representations and asser-
tions concerning the collection and processing of the data, which included some
restrictions, notably on the purpose of counterterrorism, and oversight mecha-
nisms (US Department of the Treasury 2007a, 2007b). Independent auditors were
to be supported by an ‘eminent European person’ (US Department of the Trea-
sury 2007b, 25) to confirm compliance. The majority of the agreement referred
to principles based in the order of security and established the need for the pro-
gramme. The US Treasury referred to money as ‘the lifeblood of terrorism’ (US
Department of the Treasury 2007b, 19), suggesting that if funds are cut off, ter-
rorism is unable to survive. The representations also highlighted conformity with
the order of fairness, underlining that ‘The TFTP is grounded in law, carefully
targeted, powerful and successful, and bounded by privacy safeguards. It repre-
sents exactly what citizens expect and hope their governments are doing to protect
them from terrorist threats’ (US Department of the Treasury 2007b, Preamble).
Besides the concrete reference to citizens’ expectations and the performance of

the programme, this statement also points to valued objects and principles, such
as law, and privacy safeguards in line with the order of sovereignty and the order of
fairness.

6.1.5 The Changing Perception of SWIFT Responsibility

In 2006, both the 29WP and the Belgian DPA had problematized the role of
SWIFT, suggesting that it was acting as a co-controller rather than a mere pro-
cessor of data (29WP 2006; Commission de la Protection de la Vie Privée 2006).
This distinction is central, because it assigns different levels of responsibility. A ‘processor’ is an entity ‘which processes personal data on behalf of the controller’ and carries limited data protection responsibilities, whereas a ‘controller’ ‘determines the purposes and means of the processing of personal data’ (Data Protection Directive 1995, Art. 2(d)–(e)). For example, on the basis of the
definition of Google as a data controller rather than a data processor, the company
must limit the availability of search results under the EU ‘right to be forgotten’ (see
Chapter 8).
In 2008, the Belgian DPA largely changed its approach to the nature of the finan-
cial surveillance measures. It came to the new conclusion that SWIFT was to be
considered a ‘de facto delegate [délégué de fait]’ (Commission de la Protection
de la Vie Privée 2008, 59) for the financial community. This concept was introduced by the report and, in contrast to the earlier report, stipulated that SWIFT
had not violated Belgian law. In the 2006 decision, the Belgian Privacy Commis-
sion had criticized the vague and unconventional definition of terrorism and had
emphasized the non-individualized and massive nature of the requests, compar-
ing it to the measures of ‘Rasterfahndung’ and ‘carpetsweeping’ (Commission de
la Protection de la Vie Privée 2006, 5). In contrast, in its 2008 report, the DPA
referred to several UNSC Resolutions (Commission de la Protection de la Vie
Privée 2008, 11–12) which had not been considered in the initial report but were
emphasized in the US undertakings (US Department of the Treasury 2007b, 19).
The changed assessment resolved the jurisdictional conflict, as the DPA evalu-
ated the claim according to principles and valued objects of the order of security
such as UNSC resolutions rather than private responsibility or privacy principles.
Nevertheless, this should be considered as a conflict management and contain-
ment strategy rather than as an indication that the conflict did not exist in the first
place, as is indicated by the initial opinion of the Belgian DPA and the 29WP state-
ments. At that point, the Belgian DPA had severely limited enforcement capacities
(Pawlak 2009, 572). Without the political support of the member states, it is likely
that it would have failed to assert its jurisdiction on the basis of alternative norma-
tive principles. This points to ignorance of a jurisdictional conflict as a potential
management strategy. For SWIFT, admitting that it was in conflict with European
law would have meant risking a penalty, as SWIFT did not notify the Belgian DPA
about the subpoenas. For the DPA, it meant not exposing its weak position in the
field. This shift in position by the DPA had crucial implications for the space of
possibles for future policy options. On the one hand, it limited the legal actions
that could be taken against SWIFT and the banks. On the other hand, it provided
legitimacy to the agreement, addressing concerns about potential ramifications of
openly contradicting DPA opinions, for example by Ireland (US Embassy Dublin
2006).
While willingness to pursue SWIFT’s practices legally was limited, DPA enforce-
ment was still possible, and SWIFT tried to address existing criticisms. On the one
hand, in 2007, SWIFT adopted Safe Harbour, a set of privacy principles that allowed transatlantic data transfers for commercial purposes (see Chapter 4). As
a European firm, SWIFT was not obligated to join the Safe Harbour framework—
and in 2006 had argued that it was not even possible (SWIFT 2006, 8). The
decision to join must, therefore, be understood as a symbolic rather than substan-
tive move. On the other hand, and more importantly, in October 2007 SWIFT
announced the opening of an operating centre in Switzerland to handle European
data and stop data mirroring in the US. This had wide-ranging implications, as it
meant that the data were out of reach for US subpoenas (SWIFT 2007). The exclu-
sive storage of European data outside US territory made formal actions necessary if
the transfers were to continue after the envisaged moving date of 1 January 2010.
In sum, the secret and sovereignty-intrusive nature of the TFTP had caused an
initial furore in the EU. However, those concerns were increasingly marginalized.
While the jurisdictional conflict seemed to be resolved through letter exchanges,
the change of SWIFT’s technical architecture demanded a legally binding
agreement.

6.1.6 The Threat of Lisbon

Two factors contributed to a potential destabilization of the conflict resolution process. On the one hand, a legally binding agreement became necessary due to the
relocation of SWIFT’s data processing centre. On the other hand, the upcoming
Lisbon Treaty would increase the European Parliament’s competences by the end
of 2009. This meant that such a legally binding agreement could then be adopted
only with support of the Parliament.
Negotiations for a formal agreement involved the Parliament and the EDPS
to a more significant extent (Farrell and Newman 2019, 113). The draft agree-
ment included data protection measures and requirements for reciprocity such
as purpose limitation, review mechanisms, and access for EU law enforcement
and security officials (LIBE 2010). Yet European Parliament support was still
low (Farrell and Newman 2019, 113–14). In September 2009, the Parliament
problematized the restricted nature of the negotiating directive and the lack of
information on the suggested legal basis. MEPs demanded the negotiation of a
permanent agreement with increased participation and scrutiny (EP 2009, D, 7(i)).
On 30 November 2009, exactly one day before the Lisbon Treaty was set to come
into force, the Council decided to abandon any attempts to include the Parliament
and unilaterally adopted an interim agreement. However, the Council failed to
fully adopt the agreement before the Lisbon Treaty came into force (Farrell and
Newman 2019, 114). The agreement now required the consent of the European
Parliament according to Article 218(6) of the TFEU. It foresaw a starting date
for the agreement of 1 February 2010 (Council of the EU 2009), a short dead-
line with limited options for review (EDPS 2010b; LIBE 2010). The Council’s
delayed request for consent therefore seemed to represent an attempt to present
the Parliament with a done deal (Monar 2010, 144–6). In consequence, MEPs were
largely critical of the solution, as the last-minute agreement seemed to undermine
the increased institutional powers of the Parliament in EU external relations. The
adoption or rejection of the agreement was perceived as an important test case
for the new powers of the Parliament (de Goede 2012b; de Goede and Wesseling
2017, 254–5).

6.1.7 A Test Case for the Parliament

As a response to this perceived challenge to its position in the field, the Parliament rejected the agreement by 378 votes to 196, with 31 abstentions (EP 2010a;
LIBE 2010). This assertion of power was remarkable, particularly in light of what
Monar describes as extraordinary lobbying efforts, which included a phone call
from the US Secretary of State Hillary Clinton to the European Parliament Pres-
ident Jerzy Buzek as well as internal appeals from the EU Presidency and the
European Commission (Monar 2010, 143). In the debate, the Home Affairs Com-
missioner Cecilia Malmström had warned that the Parliament’s rejection would
constitute ‘a serious threat to the security’ (cited in EP 2010a) of EU citizens,
thus attempting to impose criteria of the order of security. MEPs voiced concerns
regarding the retention period, the extent of bulk data transfers, and the shar-
ing of policies with other authorities and third countries (EP 2010a). The fact
that the agreement had never been critically evaluated with regard to its effec-
tiveness constituted a further concern (LIBE 2010). Most significantly, even for
strong supporters of data sharing for security purposes, the attempt to undermine
the European Parliament’s strengthened position in the field constituted a signif-
icant problem. For example, the Conservative MEP Timothy Kirkhope, a strong
supporter of the PNR agreement (see Chapter 5), expressed his anger at the Coun-
cil’s procedural decisions and argued that ‘Parliament’s right to consent should
not be used as a retrospective tool’ (cited in EP 2010a). Thus, while there were
concerns about the potential moral worthiness of the agreement, a significant part
of the rejection was based on the Council’s attempt to undercut the Parliament’s
strengthened position.
Despite this extraordinary display of power, the European Parliament accepted
a revised version of the agreement only four months later, in July 2010 (Kaunert
et al. 2012, 489), shortly after the then US Vice President Joe Biden had addressed
MEPs, arguing that ‘not less than privacy, physical safety is also an inalienable
right’ (cited in EP 2010b). The revised agreement contained concessions to the
Parliament’s concerns regarding bulk data transfer, oversight, blocking possibil-
ities, and retention time. It guaranteed that data were ‘pushed’ from the EU to
the US rather than ‘pulled’ directly by the US authorities (EU and US 2010). The
agreement also required the evaluation of transfer requests by Europol. Some still considered the concessions too limited (EDPS 2010b).
As in the PNR case, the literature converges around the assumption that the
Parliament mostly wanted to cement its role as a strict but responsible and coop-
erative partner also in ‘high politics’ (Kaunert et al. 2012, 489; Ripoll Servent
and MacKenzie 2011). Some additional factors contributed to the acceptance.
First, the institutionalization of reciprocity in Art. 9 and 10 of the agreement,
which the European Parliament (2009) had stressed, proved important. The
possibility of reciprocal data access stripped away the opposition by the social
democrats in particular (Farrell and Newman 2019, 114–15). Second, the agree-
ment recognized the European Parliament’s request for increased parliamentary
and EU scrutiny, which increased normative fit between the jurisdictional claims
(MacKenzie and Ripoll Servent 2012, 87) and further entrenched the Parliament’s
role in transatlantic security policymaking. Third, the revised agreement proposed
the implementation of a European Terrorist Finance Tracking System (TFTS)
(EU and US 2010, Art. 11), which would render the transfer of raw data to the US unnecessary. This constituted one of the conditions under which an increasing number of MEPs were willing to accept the agreement, as they hoped for additional protection once data processing was restricted to EU soil (Farrell and
Newman 2019, 119). This is evidence of an increasing acceptance of the value
of TFTP data in the fight against terrorism. The Parliament had recognized the
field’s illusio regarding the effectiveness of financial surveillance in the pursuit of security.
In contrast to the similar PNR case (see Chapter 5), the initial plan of adopt-
ing a European TFTS has not been implemented (Wesseling 2016). In 2013, the
European Commission concluded that there was no demonstrated need for set-
ting up a European system. The Commission argued that TFTP reciprocity was
working well and doubted the added value of a European TFTS, from both data
protection and security perspectives (EC 2013b). Thus, the idea of potential intru-
sions on sovereignty and the ‘asymmetric relationship between the United States
and other sovereign entities in relation to the growing area of financial intelli-
gence’ (Amicelle 2013, 5) seemed to be outweighed by potential security benefits.
The question was revisited after terrorist attacks on European soil (EC 2017a). As
with the revival of the PNR system after its rejection in 2013, new developments in
finance might prompt the expansion of the regime. The existing agreement to a
considerable degree depends on the monopoly position of SWIFT, and the TFTP
programme does not cover Single European Payments Area transactions within
the EU or transactions in cryptocurrency. In 2016, the Commission presented
plans to investigate potential additional coverage of intra-EU financial transactions
(EC 2016b), while a recent consultation does not yet discuss this option specifi-
cally (EC 2020d). Terrorism financing has also been addressed by the recent AML
Directive and in the 2017 Directive on Combating Terrorism (Haffke et al. 2019).
In sum, the Council attempted to avoid a more comprehensive inclusion of
MEP demands for safeguards in the negotiation of a bilateral EU–US agreement
by adopting an interim agreement shortly before the Lisbon Treaty strengthened
the European Parliament’s competences. When asked to consent to an agreement that was already provisionally in force, the Parliament asserted its new position in the field but shortly afterwards accepted a revised version. The agreement was based
on enhanced review and oversight mechanisms, reciprocity agreements, and a
potential future shift of data processing to the EU.

6.2 The Entrenchment of Financial Surveillance

Over time, the TFTP became entrenched, particularly as member states increas-
ingly relied on the submission of requests to the US Treasury (EC 2013c, 9).
While the 2013 revelations by the former intelligence contractor Edward Snowden of widespread surveillance by the US National Security Agency (NSA) (for an overview, see Chapter 4) constituted a challenge in the field, they did not fundamentally change the TFTP. In Section 6.2.1, I describe how practices of secrecy,
selective transparency, and review have contributed to the entrenchment of the
order of security as evaluative standards.

6.2.1 The Limited Effects of the Snowden Revelations

In 2013, after the Snowden revelations, the European Parliament called for a
suspension of the TFTP agreement on the basis of indications:

that the US National Security Agency (NSA) has had direct access to the IT
systems of several private companies and gained direct access to financial pay-
ment messages referring to financial transfers and related data by a provider
of international financial payment messaging services currently covered by the
Agreement.
(EP 2013, para. B)
In response to allegations that the US accessed data beyond the TFTP agree-
ment, the European Commission initiated talks but eventually decided not to
engage in formal consultations (EP 2014a, para. BC). In a report in late 2013,
the European Commission emphasized that the informal dialogue ‘did not reveal
any elements proving a breach of the TFTP Agreement, and they led the US
to provide written assurance that no direct data collection has taken place con-
trary to the provisions of the Agreement’ (EC 2013e, 5). Yet the lack of trans-
parency regarding US compliance with the agreement (EP 2014a, para. BB)
and the fact that member states did not request a technical investigation that
could have addressed concerns about direct data access (EP 2013b, para. 4) wor-
ried MEPs. Notably, in the LIBE hearings following the Snowden revelations
(see Chapter 4), the Commission’s Director of Home Affairs, Reinhard Priebe,
refused to comment on whether direct access to SWIFT by intelligence services
would actually constitute a derogation of the agreement (LIBE 2013, third hear-
ing, first session, 24 September). In a letter, the US stated that ‘the US Government
is using the TFTP to obtain SWIFT data that we do not obtain from other
sources’ (letter from the US authorities, dated 18 September, cited in EP 2014a,
note 1).
The letter’s ambiguity with regard to the collection of SWIFT data implied that
direct data collection could have potentially taken place in compliance with the
agreement. Press reports about financial surveillance of credit card data and direct
access to SWIFT networks prompted the Parliament to repeat its call for a suspen-
sion of the agreement (EP 2014a, para. 54). The Parliament further referred to the
testimony of the journalist Glenn Greenwald, who argued in a LIBE committee
meeting that both the NSA and the UK’s Government Communications Head-
quarters (GCHQ) had directly obtained data from SWIFT (EP 2014a, para. BD).
In 2017, the hacker group Shadow Brokers published documents that suggested
the NSA might have had access to two SWIFT service bureaux to monitor trans-
actions in the Middle East and Latin America. SWIFT rejected these allegations
(Lee 2017).
None of the references to dystopian scenarios in the order of fairness, such as
the lack of compliance with existing safeguards and transparency, had significant
effects on the persistence of the TFTP agreement. Despite the calls for its suspen-
sion, the European Parliament was acutely aware of the limitations of its position
in the field, being restricted to commenting on the existing agreement. However,
MEPs argued that they would ‘take account of the responses of the Commission
and the Council in relation to this Agreement’ (EP 2013b, sec. 11).
Thus, while the European Parliament seemed willing to discursively con-
test the jurisdictional claim of the US, it seemed to accept the lack of realistic
options to translate this challenge into tangible institutional change. The con-
flict resolution stabilized, on the one hand, through sheer necessity. On the other
hand, the Parliament’s perception of possible policy options was also shaped
by the narrowing normative space of possibles. As I outline in Section 6.2.2, selective practices of transparency and review procedures stabilized the resolution
process.

6.2.2 The Role of Secrecy and Transparency

Secrecy is well established as a legitimizing mechanism in international politics (Kreuder-Sonnen 2018) and data governance (Curtin 2018; de Goede 2012a). De
Goede and Wesseling (2017) outline how secrecy evokes the perception of value.
Data turn into valuable information not despite but because of their secret nature.
For instance, a European Commission report on ‘the value of TFTP data’ (EC
2013c) does not give concrete examples of the use of TFTP data due to their clas-
sified nature. This ultimately entrenches the illusio of the field, that is, that data
governance is meaningful in the pursuit of the common good, even in the absence
of tangible evidence. Even challenging actors reify data as a ‘valuable’ (EP 2007b,
para. A) or ‘essential tool in the fight against terrorism financing and serious crime’
(EP 2014a, para. BA). By entrenching the value of data through secrecy, surveil-
lance practices move out of reach of contestation. Review reports to some extent
fulfil the function of truth tests that foster confirmatory rather than justificatory
practices and thereby prevent rather than enable critique (Boltanski 2011, 62).
By providing the possibility for tests but limiting their scope, the worthiness of
the jurisdictional claim is entrenched. This process is visible with regard to two
TFTP-related controversies: on the one hand, the debate on the disclosures by
the New York Times and, on the other hand, that concerning access to classified
documents.
After publishing the article that had revealed the TFTP, the New York Times
experienced significant pushback, including calls to investigate the newspaper for
treason. In a widely cited article on 22 October 2006, the New York Times public editor published ‘Banking Data: A Mea Culpa’, stipulating that the article revealing the programme should not have been published, as it was neither illegal under US law nor a violation of people’s privacy (Calame 2006b). Just months
before, in July 2006, Calame had defended publication, arguing that the reve-
lation did not constitute a threat to national security (2006a). The article had
not identified any specific mechanisms or technicalities that would prevent the
programme’s continuation. Calame had also pointed out that the article merely
confirmed what experts had suspected, particularly since data access had oper-
ated for several years involving various officials at the Treasury as well as a
private organization (Calame 2006a; Lichtblau and Risen 2006). However, in
the US, the debate shifted away from the effectiveness or the legitimacy of the
programme towards criticism of its disclosure. Questions about implications for
individual rights were replaced by the potential hazards of transparency. This
contributed to stabilizing the programme. As de Goede and Wesseling put it, ‘the secrecy of the TFTP is not just a “stumbling block” to be overcome on the
road to transparency, but an active participant in regulating and focusing the way
in which public discussion of the program has unfolded’ (2017, 259–60). From
the outset, debates about the (valued) secrecy and (potentially dangerous) trans-
parency of the TFTP constrained the space of possibles for the evaluation of the
programme.
Transparency and secrecy also constituted important factors in the resolution
of the jurisdictional conflict. In particular, the European Parliament had previ-
ously voiced its disapproval ‘of any secret operations on EU territory that affect
the privacy of EU citizens; is deeply concerned that such operations should be tak-
ing place without the citizens of Europe and their parliamentary representation
having been informed’ (EP 2006, para. 13). While the 2010 agreement included
oversight and review procedures to address the Parliament’s concerns, controver-
sies about the continued practices of selective transparency and secrecy continued.
On the one hand, the MEP Sophie in ’t Veld attempted to gain access to the legal
service opinion of the negotiating directive of the 2010 agreement that the Euro-
pean Parliament had problematized due to its classified character (EP 2009, para.
D, 7(i)). The legal service opinion contained information on the legal basis of the
agreement. Rumours suggested that it could reveal doubts about the exclusion of
the Parliament from the negotiations, a potential source of inter-institutional con-
flict (Curtin 2013; Fahey 2014). The Council and the Commission fought hard
against access by the Parliament, and the case was eventually brought to the CJEU.
The Court argued that the Council had failed to adequately weigh the public interest in disclosure against the strategic objectives of the EU in negotiating international agreements (Curtin 2013, 449–52).
On the other hand, there were concerns about the review procedures of the
TFTP. The review procedures constitute a key mechanism for monitoring com-
pliance with safeguards, reciprocity requirements, and the overall effectiveness of
the framework (EC 2011). They include representatives from the Commission, the
US Treasury, and data protection as well as counterterrorism experts. The review
mechanism became conflictual when a parallel review process by the Europol Joint
Supervisory Body (JSB) scrutinized Europol’s oversight of the TFTP.¹ In 2011, the
JSB issued a critical report highlighting shortcomings in Europol’s supervision, the broad nature of the requests granted, and a lack of transparency (JSB 2011).
Europol had agreed to every request despite concerns about their vague nature
and broad scope. The review was further hampered by the fact that some informa-
tion was exclusively provided over the phone and not available for ex post scrutiny
(JSB 2011). The report was classified, but the JSB issued a summary of the review

¹ The JSB, which was composed of DPAs from different member states (JSB 2011), has since 2017
been replaced by the EDPS (EDPS 2017b).
for the LIBE committee. This sparked a major controversy. The European Com-
mission was clearly dissatisfied with the JSB’s decision to investigate the TFTP.
In the joint review, it highlighted that ‘parallel or uncoordinated initiatives or
inquiries should be avoided because they undermine the article 13 review pro-
cess and have caused considerable workload on the Treasury in particular’ (EC
2012b, 15–16). The Commission further considered the JSB’s decision to pro-
vide access to the report to be ‘a clear violation of applicable security rules and
a breach of mutual trust’ and asked for future coordination between supervisory
bodies ‘in order to avoid overlapping activities and misleading public statements’
(EC 2012b, 16). The Commission’s statement clearly highlights the selective char-
acter of transparency and review procedures. While the Commission review is
conceptualized as an effort to prove worth in the order of fairness, citing data
protection principles and the importance of review procedures, the Commission
criticizes the JSB’s efforts due to their negative impact on transatlantic relations
and trust, referring to evaluative criteria based in the order of globalism. The
Commission report can be interpreted as a clear attempt to appease US concerns
and demonstrate strong alignment of the Commission with the US government,
which had articulated concerns about the JSB’s investigation (Mitsilegas 2014,
303). More specifically, according to a letter cited in an Ombudsperson report,
the US authorities considered that:

the decision by the JSB to draft, without US knowledge or consent, a classified report and then circulate a public [summary] of that report without prior
written authorization from the information owner (in this case the Treasury
Department) breached the security protocols associated with the TFTP.
(Letter from the US authorities, cited in European Ombudsman 2014, para. 6)

The US authorities considered that such a breach ‘may potentially undermine the
relationship of trust’ (letter from the US authorities, cited in European Ombuds-
man 2014, para. 6) between the parties, thus also emphasizing a lack of compliance
and reliability. The jurisdictional overlap became even more conflictual. When
LIBE committee members asked for privileged access to the full JSB report,
Europol refused access. Europol argued that ‘technical modalities’ which were
agreed upon in the TFTP agreement but never shared with the Parliament or
the Council prevented it from sharing the report as such (European Ombudsman
2014, para. 17). The MEP Sophie in ’t Veld again decided to lodge a complaint
with the EU ombudsperson to gain access to the report, which the Ombudsper-
son considered as the reflection of ‘a democratic deficit at the level of the EU which
must be addressed’ (European Ombudsman 2014, para. 17). The Ombudsper-
son was, despite initiating a dialogue with the US embassy, not able to gain
access to the review either. She strongly problematized her constrained ability to
access a document produced by an EU oversight body on the data of EU citizens.
She concluded that the Parliament should consider ‘whether it is acceptable for
arrangements to be agreed with a foreign government which has the consequence
of undermining mechanisms established by or under the EU Treaties for the
control of EU executive action’ (European Ombudsman 2014, para. 20), thus
highlighting concerns of internal and external sovereignty. The Ombudsperson’s
decision shows contempt for the selective practices of secrecy and transparency,
highlighting principles based in the order of fairness, such as democratic princi-
ples and parliamentary scrutiny, as well as principles of sovereignty referring to a
dystopian scenario of foreign encroachment.
This example demonstrates how the conflict resolution process is stabilized
by selective practices of secrecy and transparency. The possibility of demanding
transparency, even for MEPs, who have high political status, is severely con-
strained. The US jurisdictional claim over data notably extended beyond the
data as such and also covered the review procedures. As with the revelations,
transparency rather than the practices of secrecy were considered to be morally
deficient. This isolated the TFTP from comprehensive scrutiny, which contributed
to its stabilization. Regarding the book’s main question of conflict resolution, the
example points to the confirmatory character of review and scrutiny procedures,
which tend to circumvent rather than represent challenges to the stability of an
agreement.

6.2.3 Constrained Review Practices

The examples presented in Section 6.2.2 concerned the legitimacy of disclosing secrecy and access to secret documents. Practices of selective transparency
are even more extensive with regard to the content of review reports outlin-
ing the implementation and effectiveness of surveillance measures. Scholars have
problematized that evidence-based evaluation of the effectiveness of data-based
security measures tends to fall short (Brzoska 2011; Curtin 2013; Pieth 2006;
C. Walker 2018). For example, regarding financial surveillance, the requirement
for banks to flag large transfers rarely identifies terrorists or criminals, who are
aware of such reporting thresholds (Wensink et al. 2017, 59). Furthermore, ter-
rorist attacks can be committed without significant costs. For instance, the 9/11
attacks were estimated to cost no more than $500,000 (Lowe 2006, 255), includ-
ing training and travel costs, and many more recent terrorist attacks rely on
suicide bombers, readily available weapons, or vehicles, which further decreases
costs. This has increased the surveillance of mundane and everyday money flows
(de Goede 2018a, 759–60), thus expanding the number of potential terrorists
significantly. Review reports may be understood as ‘composite objects’ (Boltanski and Thévenot 2006, 279) that represent compromises between the order of
fairness and the order of security by uniting transparency requirements with a
largely unrestricted pursuit of principles of the order of security. With regard to the TFTP, this composite character was sustained mainly by the deliberately
limited consequences of review reports and the vague character of the evidence
provided. The second joint review of the European Commission stressed that
it was:

based on the understanding that it was not its task to provide a political judgement
on the Agreement, this being considered outside the scope and mandate under
Article 13 … Where recommendations are presented, these are aimed at further
increasing the effectiveness of the application of the Agreement, in particular its
safeguards.
(EC 2012b, 4)

This statement, which can also be found in subsequent reviews, clearly highlights
how the European Commission constrained its space of possibles voluntarily. In
submission to the illusio of the field, particularly the interpretation that the TFTP
is meaningful in the pursuit of the common good of security, the European Com-
mission limited its role to informing about the programme while refraining from
value judgements or demanding consequences.
The agreement requires only minimal evidence for the effectiveness of the pro-
gramme. In its first review, the Commission noted the need to provide more
publicly available statistical information to demonstrate the benefits of the TFTP
(EC 2011, 1). However, the TFTP agreement does not require the US to give
detailed information on the total volume of financial messages. In consequence, no
information on the total volume of data is given, but the US Treasury provides
information on ‘leads’ or ‘reports’ that may vary in data volume (EC 2012b, Annex
II, Q4). For example, in the 2019 review period, 1,115 searches were conducted
on average per month and the leads shared with the EU increased significantly
(EC 2019c, 8–9). However, ‘The Treasury maintains its view that disclosure of
overly detailed information on data volumes would, in fact, provide indications as
to the message types and geographical regions sought … and would have the effect
that terrorists would try to avoid such message types in those regions’ (EC 2019c,
9). Thus, the volume of data transfers was ‘reconfigured into sensitive security
information’ (de Goede and Wesseling 2017, 263), with any disclosure poten-
tially threatening the integrity of the programme. Therefore, any re-emergence
of conflict was avoided due to the classification of any potentially threatening
data as secret. Nonetheless, in 2015, the JSB emphasized that ‘There is a clear
tension between the idea of limiting the amount of data to be transmitted by tai-
loring and narrowing the requests and the nature of the TFTP’ (JSB 2015, 3).
While there are statements that question the broad nature of the requests, even
in the European Commission joint review (EC 2019c, 19), these are mainly char-
acterized by a lack of consequences, thus again stabilizing the agreement. For
instance, in the 2018–19 period, overseers queried 645 cases, ‘the overwhelm-
ing majority of which [queries] were selected for routine auditing purposes’ (EC
2019c, para. 37). They stopped seventy-five and blocked fifty-three cases retroactively. This indicates that from a sample of largely routine requests, more than 19 per cent of searches were considered too broad. This is not problematized in the
report. The evidence provided for the evaluation of the effectiveness of the TFTP
also seems to contradict its preventative aims. On its website, the US Treasury
states:

These U.S. Treasury Department efforts have not only disrupted terrorist net-
works, they have saved lives. Since the start of the program, the TFTP has
provided thousands of valuable leads to U.S. Government agencies and other gov-
ernments that have aided in the prevention or investigation of many of the most
visible and violent terrorist attacks and attempted attacks of the past decade.
(US Department of the Treasury 2022)

While the emphasis in this statement is on the preventative quality of the TFTP, the
examples provided in the annex of the 2017 review highlight cases in which indi-
viduals or groups were already being investigated rather than identified through
the TFTP. The report refers to the Charlie Hebdo or the Paris attacks (EC 2017b,
41–5), which are examples of the usefulness for the investigation and collection
of evidence rather than the TFTP’s ex ante success (see also Amicelle 2013). The
emphasis on pre-emption and prevention in post 9/11 counterterrorism measures
(McCulloch and Pickering 2009) thus seems to be at odds with the evidence provided for their effectiveness. As the EDPS pointed out in 2011 regarding a European
Commission document, ‘The Communication mentions on several occasions the
“added value” or the “interest” of the US TFTP, without referring to any analysis
of the efficiency of existing tools’ (EDPS 2011b, 2). By consistently avoiding a clear
assessment of the value of data, even in dialogue with its oversight committee, the
TFTP’s effectiveness is constructed around a certain aura of importance that is
created by, not despite, its secrecy (see de Goede and Wesseling 2017). Thus, again,
the notion of secrecy prevents the emergence of the imperative of justification. It
is based on the active removal of both the data and the discussion about its gover-
nance from public discourse. In conclusion, in the resolution of the conflict and the
stabilization of the agreement, actors draw on practices of selective transparency
to sustain and legitimize their jurisdictional claims. Continued practices of secrecy
entrench a vision of mundane data as highly classified and sensitive, which averts
critical scrutiny. As highlighted above, review reports seem to have a confirmatory
role for the persistence of the agreement by remaining vague, which is consistent
with the expectations articulated by Boltanski and Thévenot (2006, 336) regarding
the stabilization of compromise. Potential infringements of the agreement rarely
become visible. If actors manage to highlight a lack of transparency, the debate is
constrained to the rectification of transparency principles, while the illusio of the effectiveness and necessity of the agreement as such is not questioned.

6.3 Interlinkages with the Evolution of the Field: Anti-Money Laundering

Conflict resolution processes in the context of the TFTP have a co-constitutive relationship with meaning-making processes in financial data sharing more
broadly. Scholars have pointed to the embeddedness of this conflict in a broader
financial security assemblage involving blacklists, reporting obligations, and sanc-
tions (Curtin 2018; de Goede 2012a, 2018a).
Indeed, jurisdictional overlap in counterterrorism financing has contributed to
the emergence of one of the most widely recognized and controversial conflicts in international law in recent years. In the 2008 Kadi case, a Saudi businessperson became the target of UNSC-mandated sanctions, in particular an assets freeze,
when he was identified as a possible al-Qaeda supporter. When Kadi challenged
these measures on EU human rights principles, the CJEU famously decided that a
UNSC resolution did not enjoy primacy over core human rights norms in EU law
(Kadi 2008), thus refusing the jurisdictional claim over EU residents. Indeed, in response to such challenges to its wide-ranging counterterrorism policies (Kreuder-Sonnen 2019), the UNSC subsequently strengthened its review mechanisms to avoid future conflicts. The Kadi case exemplifies existing normative and institutional differences
and highlights the broader relevance of the resolution of jurisdictional conflicts in
global governance.
Another area of co-constitutive meaning-making processes is the fight against
money laundering. AML efforts have been a central area of law enforcement coop-
eration, most prominently in international drugs control (Herschinger et al. 2011,
459). In 1990, the Council of Europe adopted the Convention on Laundering,
Search, Seizure, and Confiscation of the Proceeds from Crime (CoE 1990), while
the EU has developed strengthened competences since 1991 (e.g. Directive
(EU) 2018/1673). Money laundering and the financing of terrorism have signif-
icant parallels, as both involve efforts to hide funds from public scrutiny and
both regimes target the underlying criminal activities, such as organized crime
and terrorism, rather than the financial activities themselves. In contrast to AML,
countering terrorism financing is largely pre-emptive. AML and CFT have to some
extent developed as different or even ‘fragmented’ (King and Walker 2015) regimes
with different standards, mainly because of the more recent manifestations of CFT
after 9/11. However, for both areas, regulation has significantly been impacted
by the international Financial Action Task Force (FATF) (G7 1989), which since
2001 includes both AML and CFT. The FATF publishes influential but non-
binding recommendations (Nance 2018) and monitors compliance (FATF 2020).
The growing impact of the FATF on the area of AML and CFT demonstrates an
increased emphasis on data sharing in financial matters for security purposes. The
29WP has criticized this as a departure from data protection standards, pointing to
their framing as obstacles needing to be overcome or circumvented (29WP 2011).
AML measures usually require banks to install systematic scanning and tracking
measures for customers and identify suspicious transactions, which the EDPS crit-
icizes as ‘blanket measures’ (EDPS 2017a, 12), raising concerns about invasions of
privacy and proportionality.
FATF standards have also increased reliance on private actors as actively inter-
preting security actors (de Goede 2018b). While the TFTP largely involved SWIFT
as a passive supporter, potential changes stemming from the diversification of
financial channels make it likely that, as in the case of AML, reliance on ‘pri-
vate corporations for policing purposes’ (Herschinger et al. 2011, 465) will further
increase in CFT. This is despite apparent failures of banks to fulfil their responsi-
bilities as private enforcers. In 2012, the US authorities issued a fine of $1.9 billion
to HSBC, because the bank had failed to detect Mexican drug cartels laundering $881
million as well as ongoing terrorism financing in Saudi Arabia and Bangladesh
(Homeland Security and Governmental Affairs Committee 2012). In 2019, the
Australian bank Westpac was accused of ‘serious and systemic’ breaches in more
than 23 million cases (Janda and Ryan 2019). Danske Bank was at the centre of a money laundering scandal that involved over €200 billion in suspicious
transactions through branches in Estonia (Bjerregaard and Kirchmaier 2019).
In sum, the jurisdictional conflict over the TFTP is embedded in a financial secu-
rity assemblage that is characterized by increasing scope, particularly concerning
the role of international actors, but also with regard to targeted transactions, the
increasing role of private actors, and a shift towards principles embedded in the
order of security. This might contribute to the emergence of jurisdictional con-
flicts in the future, as EU DPAs have voiced concerns about the blanket character
of financial surveillance.

6.4 Conclusion

This chapter has traced the evolution of financial data sharing in the transat-
lantic context. By analysing the unfolding and resolution of a jurisdictional conflict
involving the EU, the US, and the Belgian cooperative society SWIFT, I have inves-
tigated how financial data sharing for counterterrorism purposes has stabilized
despite significant disruptions. The jurisdictional claim by the US, while contested,
is codified through a bilateral agreement that has withstood the test of time. I
have argued that the resolution of this jurisdictional conflict is based on the suc-
cessful imposition of security as the main criterion of evaluation. In view of this
extensive inter- and transnational cooperation, data governance is most strongly
envisioned as a security partnership. Yet we can also observe selective allusions to transparency and oversight. These review and transparency mechanisms highlight
the importance of ‘composite objects’ (Boltanski and Thévenot 2006, 279), which
unite principles of the order of fairness and principles of the order of security.
From a normative perspective, these mechanisms can have problematic implica-
tions, because their scope and potential effects remain strictly circumscribed. In
addition, the EU ‘continues to outsource its financial intelligence service to the
US’ (LIBE 2010, para. 2). This enables a continuation of data sharing practices
that, due to the limitation of EU oversight competences, cannot be evaluated
regarding their specific effectiveness, even though, as an anonymous security offi-
cial put it, ‘the potential for abuse is enormous’ (cited in Lichtblau and Risen 2006).
Both CFT and AML have moved increasingly towards the surveillance of every-
day data, as the perceived necessity of financial data sharing has contributed to an
expansion of competences (DPA representative A, pers. comm. 2017). In addition,
the data processing systems at the centre of this chapter and Chapter 5 are embed-
ded in a more general proliferation of data-based security measures. Data sharing
agreements with private companies also extend into other areas (Bigo et al. 2020).
For instance, the data management company Palantir supplies software for immi-
gration surveillance in the US for over $49 million (Simon 2020), while recent
investigations also reveal cooperation with Europol (in ’t Veld 2020). In Chapter 7,
I will outline what happens when private companies are less willing to cooperate
with such publicly mandated security measures.
7
Access Denied?
Struggles over Electronic Evidence

As the number of online interactions increases, the digital traces we create when sending an email, conducting a bank transfer, or posting social media content are becoming increasingly relevant for criminal investigations, where they may serve as electronic evidence. And most of the time
they are held by private companies. In a significant number of cases, these data
are stored in the cloud, a remote location that can be accessed through the pub-
lic internet or a dedicated private network. As internet service providers, data
storage centres, law enforcement authorities, victims, and perpetrators are likely
to be located in different jurisdictions, data access often requires cross-border
cooperation. For example, in the EU, this concerns more than half of crimi-
nal investigations (EC 2020b). To avoid lengthy and administratively complex
processes of international judicial cooperation through Mutual Legal Assistance
Treaties (MLATs), law enforcement authorities increasingly rely on direct infor-
mal cooperation with internet service providers and tech companies (Europol
2019), often without judicial review. Such cross-border cooperation complicates
the design of effective regulatory and legislative solutions due to tensions between
due process rights, security objectives, criminal justice claims, and sovereignty
concerns. As law enforcement authorities rely on informal requests, companies
rather than judges have to decide on the legitimacy of evidence collection, bearing
some risks of ‘privatising international cooperation’ (Aguinaldo and de Hert 2021,
172). Therefore, informal access practices by law enforcement authorities have the
potential to profoundly affect the relationship between states and citizens in the
area of criminal justice.
In 2013, the US company Microsoft for the first time challenged the practice
of direct informal law enforcement requests by resisting a data access request,
citing concerns about extraterritoriality and a potential conflict of laws. US law
enforcement authorities had requested data related to an email account in the
investigation of a drugs-related offence. However, the data were stored in Ireland,
which made them subject to EU data protection law and potentially made it illegal for Microsoft to hand over the data. In 2017, the case reached the US
Supreme Court (United States v. Microsoft Corp. 2018). This reinforced public
attention and contributed to the increasingly international character of the case.
In March 2018, before a judgment was issued, the case became moot, because the

US adopted the contested Clarifying Lawful Overseas Use of Data Act (CLOUD
Act 2018), a law that explicitly legalized access to data stored beyond US territory
under specific conditions. While this unilateral assertion of US law could have pro-
voked an international dispute, reactions were limited, most likely because similar
legislative proposals were already under discussion in the Council of Europe and
the EU. Yet as the EU is increasingly attempting to impose data protection stan-
dards for cloud data storage, particularly concerning public sector and sensitive
data, and in light of the deficiencies of data protection in the US context identified
by the Schrems II judgment (see Chapter 4), the issue is likely to be the subject of conflicts in
the future.
Although electronic evidence has become a key concern not only in transat-
lantic law enforcement cooperation but also in the Council of Europe (Daskal
2018), it is subject to surprisingly little attention in IR (for a limited exception,
see Farrell and Newman 2019, 41–6). The rise of cross-border cooperation in law
enforcement, particularly in light of the widening role of supranational authority
in the EU (Herschinger et al. 2011; Trauner and Carrapiço 2012) but also more
specifically in relation to data protection (Blasi Casagran 2016; de Busser 2009; de
Busser et al. 2014) and EU–US cooperation (Anagnostakis 2017, ch. 3), has been
well illustrated. Yet few contributions have examined the challenges of electronic
evidence collection and sharing, and then mostly from a legal perspective (Autolitano et al. 2016; Biasiotti et al. 2018; Christou 2018; Daskal and Swire 2018; Ligeti and Robinson 2021). Bigo et al. (2012) concentrate on the implications for
data protection rights in the context of cloud computing, including transfers to
third countries and jurisdictional disputes.
This chapter aims to address the existing gap in the literature concerning recent
evolutions of electronic evidence, focusing on two aspects of the conflict resolution
process. First, I investigate the role of Microsoft in this jurisdictional conflict. The
company profoundly challenged the US Department of Justice, acting differently
from other companies such as banks and airlines (see Chapters 5–6) that tend to
support rather than challenge informal access practices for security purposes. Sec-
ond, I analyse how it is possible that the legalization of the US jurisdictional claim
contributed to the resolution rather than the exacerbation of the conflict. The
chapter discusses in particular the international dynamics of this jurisdictional
conflict while also highlighting the complex public–private relationships that con-
tribute to both its emergence and its comparatively quick legislative resolution.
First, I argue that private companies such as Microsoft have increasingly adopted
a more long-term strategy that is based on legal challenges, political advertising,
and the discursive construction of private tech as responsible actors. In answer to
the second question, I argue that the field is subject to incremental processes that
entrench data governance as a security partnership. The circumvention of due
process norms, data protection concerns, and sovereignty considerations implies
a hierarchization of security compared with other principles. While shifts in the
order of security fail in the context of the Microsoft court case (In re: A Warrant
2016; United States v. Microsoft Corp. 2018), parallel meaning-making processes
create the backdrop against which the legalization of existing and contested US
law enforcement practices is considered legitimate.
In this chapter, I first offer a brief introduction to the technical and legislative
challenges of electronic evidence before, secondly, demonstrating the emergence
of a jurisdictional conflict due to the resistance of Microsoft in complying with a
US search warrant. I illustrate the increasingly international character of the con-
flict and focus on the justifications brought forward by the US, Microsoft, and a
diversity of international actors in the US Supreme Court before outlining the leg-
islative response of the US via the CLOUD Act. Thirdly, I highlight interlinkages
with broader developments in the field, in particular meaning-making processes
that entrench the moral value of security cooperation before reflecting on the
implications in the conclusion.

7.1 Cross-Border Access to Electronic Evidence

Before getting into the conflict, it is important to briefly illustrate the rele-
vance and details of electronic evidence, particularly how the technical nature
of data storage, transfer, and access constitutes challenges for law enforcement
agencies.
Discussions about access to electronic evidence are embedded in a broader
attempt to tackle crime on the internet. Several global and regional institutions
such as the OECD, the G7/8, or the International Telecommunication Union
(ITU) have mainly focused on the issue of cybercrime (Christou 2018). The Coun-
cil of Europe’s Budapest Convention (2001), with more than sixty parties, is the
only legally binding instrument. However, electronic evidence is no longer signif-
icant only in cybercrime investigations. In a 2019 survey, EU judicial authorities
mentioned the relevance of e-evidence in the investigation of fraud, the sexual
exploitation of children, human trafficking, and terrorism (Europol 2019, 9–13).
Both online and offline criminal activities frequently involve information and
communications technology (ICT), for example through the use of messaging
apps or email programs. In consequence, the significance of electronic evidence
or e-evidence is on the rise, with an 84 per cent increase in request numbers
between 2013 and 2018 in the EU (EC 2019a). Legislative frameworks tend to
distinguish between different types of data, such as subscriber information, traffic
data, and content data. Subscriber information comprises information that iden-
tifies the holder of a certain account, such as name, username, address, email, or
financial information. Traffic data includes metadata, such as the time, date, and
length of access. Content data refers to the substantive content of the messages or
files, which is generally considered to be the most sensitive (Europol 2019, 10).
In the Microsoft case, the Department of Justice requested information regarding subscriber, traffic, and content data (Microsoft 2018).
Due to the varying locations of providers and data, requests for electronic evi-
dence frequently require cross-border cooperation. This mostly takes place within
the framework of mutual legal assistance designed to facilitate evidence exchange
between countries. The Council of Europe concluded the Convention on Mutual Assistance in Criminal Matters in 1959, which has since been ratified by fifty states, while the 2000 United Nations Convention against Transnational Organized Crime provides less formalized mutual legal assistance and currently has 190 state parties
(UNGA 2000). The conclusion of an MLAT between the EU and the US in 2003
formed part of an increasing number of such treaties globally (Farrell and New-
man 2019, 43–4). The European Investigation Order (2014), which facilitates the
investigation of criminal offences between EU member states, is the most advanced
regional cooperation mechanism. In the digital age, MLAT procedures have come
under increasing pressure, as they are administratively complex and take on aver-
age six to ten months, but can take up to twenty-four months (Clarke et al. 2013, 227; T-CY
2016b, 9), which involves risks, such as the destruction or removal of electronic
evidence. To avoid these highly formalized procedures, law enforcement agencies
have increasingly relied on direct informal cooperation with providers, often with-
out involving the judicial authorities. Requests for the disclosure of data may be
based on a court order from the originating country, but this is not always the
case. The 29WP has articulated concerns about both the human rights implica-
tions and the lawfulness of cooperation, as companies lack the ‘lawful authority to
disclose the data’ (29WP 2013b, 3). While the EU–US Umbrella Agreement (EU
and US 2016) specifies minimal legal safeguards for law enforcement data, it does
not constitute a legal basis for access to electronic evidence from private compa-
nies (EDPS 2019, 8–9). Yet before the challenge by Microsoft, beyond DPAs, few
problematized the legality of these requests.
The number of requests also varies considerably between countries. According
to transparency reports in the EU, 38 per cent (or 67,991) of requests to major com-
panies are from German law enforcement authorities, and the six biggest countries account for 90 per cent of all EU requests, despite representing only half of the EU population
(Europol 2019, 12). According to a report of the Cybercrime Convention Commit-
tee’s (T-CY ) Cloud Evidence Group, transparency reports indicate that US service
providers grant around 60 per cent of access requests ‘on a voluntary basis’ (T-CY
2016a, 5). Europol more recently found a similar share of positive responses in the EU, with an average approval rate of 66 per cent (2019, 13). The T-CY report,
furthermore, indicates that positive response rates vary greatly between request-
ing countries and providers. For example, while Microsoft granted 78 per cent of
requests for parties other than the US, Twitter granted only 21 per cent (T-CY
2016a, 5–6). Data storage solutions vary across providers on the basis of concerns
regarding data security, ease of access, or costs of storage. Due to the informality
and voluntary nature of these access requests, law enforcement authorities have
little leverage in forcing providers to release the data.
The increasing interest from law enforcement also generates additional burdens
for companies. On the one hand, companies need to become relatively familiar
with the legal requirements for warrants, wiretaps, or other data access requests in
different jurisdictions. On the other hand, there is the added challenge of forged or faulty requests in a diversity of languages (T-CY 2016a). Companies
have developed different ways to deal with these administrative burdens. In early
2020, Google announced its intention to charge law enforcement agencies for pro-
cessing subpoenas, wiretaps, and search warrants, with charges ranging from $45
for a subpoena to $245 for a search warrant (Dance and Valentino-DeVries 2020).
This has the potential to reinforce the heavy dependency of a public sector with
limited resources on tech companies that have accumulated data. However, it may
also deter law enforcement authorities from requesting data in cases where they
are not strictly necessary.
In sum, challenges to the access of electronic evidence arise not only because
data are moved easily across borders but also because of the variation in the loca-
tion of service providers, data, crime, or investigations, as well as the suspect’s
nationality, which demands complex administrative procedures that complicate
and delay criminal investigations. In light of increasing request numbers and a
growing reliance on cloud storage, the stakes of cooperation increase, which also
raises the potential for jurisdictional conflict. Section 7.2 outlines how those chal-
lenges contributed to a conflict between Microsoft and the US Department of
Justice.

7.2 The Microsoft Warrant Case

Informal law enforcement requests usually involve a high level of legal uncertainty
for internet service providers. Companies need to assess several factors, such as the
applicability of the requesting state’s jurisdiction as well as the authenticity and
lawfulness of the request (T-CY 2016a). Thus, there are few benefits to coopera-
tion with law enforcement agencies, particularly because this cooperation is only
in very few cases publicly recognized. One exception is the Charlie Hebdo shoot-
ings in Paris in 2015, during which Microsoft provided the requested data within forty-five minutes (Lien 2015). In light of the 44,655 requests Microsoft received in 2018, concerning nearly twice that number of users or accounts (Microsoft 2020), cooperation constitutes a burden on companies. Yet Microsoft's decision
in 2013 to challenge the relatively common practice of direct informal cooperation was largely unexpected by many public officials (Tech company employee
A, pers. comm. 2019). The challenge created an imperative of justification for the so
far largely overlooked extraterritorial practices of law enforcement agencies. The
holder of the relevant account had specified Dublin as the location of residence;
therefore, most of the data were stored in Microsoft’s data centre in Ireland. While
Microsoft handed over data stored on US servers, such as the address book, it did
not provide access to the more sensitive content data stored in Ireland. In the let-
ter refusing the warrant, the company cited concerns about legal certainty, arguing
that the request had extraterritorial implications which went beyond the scope of
the 1986 US Stored Communications Act, the legal basis of the request (Microsoft
2018). Several companies followed suit. Hence, Microsoft and other tech com-
panies used their unique position in the field, sustained by control over data of
high relevance to law enforcement, and challenged the jurisdictional claim by
the US.
Two main potential reasons may be behind this challenge. First, in Microsoft’s
2018 fiscal year, the company for the first time surpassed $100 billion in rev-
enue, which is largely accredited to its strong reliance on cloud services (Wong
2018). The mitigation of legal uncertainty, particularly in light of the company’s
increasing reliance on cloud storage, constituted a significant incentive. Through
the Supreme Court, Microsoft was able to increase the visibility of the issue,
as indicated by several media reports in major international newspapers (Die
Zeit 2016; Rushe 2014; New York Times 2015). Informal cooperation between
law enforcement agencies and internet service providers was neither particularly
widely discussed nor known at the time but suddenly became subject to public
debate (Tech company employee A, pers. comm. 2019). This enforced public jus-
tifications for the agencies’ implicit jurisdictional claim over data and with it the
potential for greater legal certainty in a key sector.
Second, Microsoft strongly emphasized the embeddedness of the conflict in a
broader effort to protect consumers from harm (B. Smith 2018a). This included
references to several initiatives promoted by Microsoft, for example, a ‘Dig-
ital Geneva Convention’ (B. Smith 2017a) or a public–private ‘Tech Accord’
(B. Smith 2018b). Microsoft, which has been described as a 'norm entrepreneur'
(Gorwa and Peez 2020) in global internet governance, seems to display character-
istics of what Eichensehr has conceptualized as ‘Digital Switzerlands’ (2019). The
concept suggests that companies are on a par with the governments that regulate them, scrutinizing and sometimes resisting public power. For instance, Microsoft President Brad Smith suggested that 'Cloud providers act as a critical check to ensure
that governments’ use of their investigative powers strictly adhere to the rule
of law’ (B. Smith 2018c, 2). This highlights that Microsoft perceives its role as
involving oversight functions and considers itself as a bulwark against intrusive
practices. The broad scope of the request (US District Court Southern District of
New York 2014, 3–4) provided additional legitimacy from a consumer protection
perspective.
Microsoft relied on a twofold justificatory strategy in its jurisdictional claim.
In contrast to expectations formulated in the literature on private governance
(Cutler et al. 1999; Green 2013) and to other areas of internet governance such as the critical infrastructure of protocols (Mueller 2010),
Microsoft did not refer to its expertise to prove the rightfulness of its jurisdictional
claim. The company relied significantly on its large market share, emphasizing
the number of affected customers that were treated unlawfully (Supreme Court
of the US 2018, 47). Microsoft argued that US practices violated privacy stan-
dards (B. Smith 2017b), which highlights a rights-based conceptualization of data
access. While criteria embedded in the order of fairness contributed to salience in
public discourse, the attempt to impose them on the conflict found only limited
acceptance by the conflict parties. The Department of Justice representative even
explicitly argued that ‘It’s not a case about privacy’ (Dreeben, cited in Supreme
Court of the US 2018, 21). Yet NGOs have pointed to privacy implications,
recognizing Microsoft’s interpretation (EDRi 2017; EDRi et al. 2019; EPIC 2018).
Microsoft’s justifications further juxtaposed US practices that violated estab-
lished principles of sovereignty with its own responsible behaviour as a global
political actor. This emulated justifications normally associated with public actors,
such as references to sovereignty and the international order. In the Supreme
Court hearing, the representative frequently referred to the extraterritorial-
ity of the US approach and pointed to the exercise of ‘extraordinary power’
(Rosenkranz, cited in Supreme Court of the US 2018, 60) not granted by law.
The counsel for Microsoft, Joshua Rosenkranz, explicitly highlighted the violent,
intrusive nature of data requests, stating:

we all agree that the Stored Communications Act is limited to the United States.
The government wants to use the act to unilaterally reach into a foreign land to
search for, copy, and import private customer correspondence physically stored
in a digital lockbox, any foreign computer where it’s protected by foreign law.
(Supreme Court of the US 2018, 32)

The references to ‘reach’ and ‘physically stored in a digital lockbox’ reinforced the
physical nature of the intrusion and established data as having implications for
territoriality. Rosenkranz also referred to a violation of European law (cited in
Supreme Court of the US 2018, 42). The company’s justificatory strategy had sim-
ilarities to justifications articulated by Google regarding the right to be forgotten
(see Chapter 8). The emulation of public justifications entrenched the position
of Microsoft as a political actor scrutinizing the exercise of public power and
demanding justifications for potential wrongdoings akin to the idea of ‘Digital
Switzerlands’ (Eichensehr 2019). While Microsoft was not able to establish cri-
teria of evaluation embedded in the order of fairness such as compliance with
human rights law, the construction of data as a territorial sovereignty concern
had far-reaching implications. Extraterritoriality both violates principles of inter-
national law and interferes with the sovereignty of states (see Svantesson 2016).
The territorial conceptualization of data posed problems for the justificatory strat-
egy of the US Department of Justice. To claim legitimate control over data, the US
Department of Justice needed to illustrate why their jurisdictional claim was not an
intrusion on sovereignty in the sense that Microsoft had proposed, while simul-
taneously making sure that data access continued. Arguing that their claim was
extraterritorial but legitimate would have meant a lack of compliance with estab-
lished principles of international law. I elaborate on the justifications articulated
by US Department of Justice officials in Section 7.2.1.

7.2.1 Justificatory Responses in the Court Case

In response to Microsoft's justificatory narrative, the US representatives relied on a threefold strategy to re-establish the position of public actors. First, they emphasized the significance of sovereignty as a bulwark against interference from the
outside; second, they attempted to subject the problem to the order of security;
and third, they tried to establish private companies as threatening both of these
visions.
First, the US representatives attempted to contest the internal validity of
the claim by questioning Microsoft’s moral hierarchization in the pursuit of
sovereignty. The US legal representative Michael Dreeben focused particularly on
the position of domestic public actors in the field. While the strategy confirmed
the construction of data as a territorial concern, he attempted to highlight that the
territorial and sovereign integrity of the US rather than the EU was at stake. This
established the territorial nation state as the relevant reference community, sug-
gesting that ‘the SCA [Stored Communications Act] involves domestic conduct,
and its terms remain domestically enforceable notwithstanding changes in the
business model of providers’ (US 2017, 21). This also reverberated with the deeply
engrained importance of domestic conduct in the area of data governance. The
Department of Justice representative also attempted to prove worth by drawing on
the specification of sovereignty according to the Budapest Convention (Dreeben,
cited in Supreme Court of the US 2018, 14).
The justifications further established the physical nature of data. For example,
the search and seizure warrant requested ‘the search of the following person or
property located in the Western District of Washington’ (US District Court South-
ern District of New York 2014, 3), which equated the search of an email account
to the search of a physical person or property. This implicitly recognized the con-
ceptualization of data as a concern of domestic sovereignty. Yet this required an
answer to the question whether the location of the data, the location of the service
provider, or the location of the crime and investigation would set the bound-
aries for the exercise and/or infringement of sovereignty. Participants largely
agreed that the legislature rather than the executive should provide this answer
(Supreme Court of the US 2018). Dreeben also emphasized the lack of resistance
by public international actors, suggesting that ‘we have heard no protests from
foreign governments’ (cited in Supreme Court of the US 2018, 23) concerning the
status quo.
Second, on the basis of the important position and significant material and
symbolic resources in the area of military capacities and intelligence, the US repre-
sentative tried to shift the focus away from human rights towards a security-based
criterion of evaluation:

hundreds if not thousands of investigations of crimes—ranging from terrorism, to child pornography, to fraud—are being or will be hampered by the government's inability to obtain electronic evidence … The decision protects only
criminals whose communications are placed out of reach of law enforcement
officials because of the business decisions of private providers.
(US 2017, 12–13)

This juxtaposition of an existentially threatened community with vague privacy interests shifted any balancing acts or reconciliation with other conceptions of
worth into the unsayable. The justification aimed to remove a strong protection of
privacy from ‘the space of possibles’ (Bourdieu 1996, 234–9). It implicitly stig-
matized any opposition as a dismissal of the security of the public, as is also
demonstrated by the reference to the irresponsible and profit-oriented behaviour
of private actors. In addition, a dissenting opinion referred to electronic evidence
‘as an essential investigative tool used thousands of times a year’ (Cabranes 2017,
125a). This also explicitly links to the conceptualization of data as a tool, as in the
Passenger Name Records case discussed in Chapter 5. Yet most reactions to US
justifications show that the imposition of security as the relevant criterion of eval-
uation was not initially recognized, as other actors at best marginally referred to
potential consequences for security and instead emphasized the implications for
sovereignty (Supreme Court of the US 2018).
The third justificatory narrative of the US emphasized the higher common
good of protecting the sovereign (nation) state from security threats and implicitly
set up private companies as a threat to both, which reinforced a hierarchical
relationship between public and private actors. For instance, the US representa-
tive Michael Dreeben specifically attempted to undermine Microsoft’s position
pointing out that:

Microsoft’s theory is that if it moves information abroad, since storage is the only
thing that counts, it’s then free to disclose that information to the world, to sell
it, to do anything it wants free from U.S. law … the only person who can’t get it is
the United States under lawful process.
(cited in Supreme Court of the US 2018, 31–2)
This suggested that any undermining of law enforcement access would constitute support for potentially irresponsible, undisclosed third actors, which stigmatized private actors as illegitimate. This was reinforced by the devaluation of concerns
located in the order of production, as expressed in the following statement:

Economic concerns cannot override the text of the statute or the interests in pub-
lic safety and national security that are at stake in this case—particularly when the
claimed economic benefit is derived directly from a provider’s ability to market
itself as capable of shielding subscribers’ activity, including their criminal activity,
from discovery by the authorities.
(US 2017, 32)

The brief contains an explicit hierarchization of economic concerns as secondary to public safety and national security. The pursuit of the common good of production rather than security is perceived as particularly morally deficient for private
companies, which potentially profit from their decision not to share data and may even use it as a marketing strategy. Chief Justice Roberts suggested to Microsoft that 'you
might gain customers if you can assure them, no matter what happens, the gov-
ernment won’t be able to get access to their e-mails’ (Supreme Court of the US
2018, 49), thereby acknowledging the validity of the US representative’s claim.
The reference to national security in a criminal investigation into drug trafficking reverberated with further justifications referring to the order of security. More
specifically, the US argued in its petition to the Supreme Court that the case represented a cascading threat due to the potential reactions of other service
providers, as ‘[t]he harm caused by the panel’s decision is not theoretical, nor is it
limited to the Second Circuit or Microsoft’ (US 2017, 26). The reference to con-
crete harm evoked a dystopian scenario of impunity, strengthened by references to
the potential assistance to criminals: 'As to Microsoft email services, the decision provides a roadmap for terrorists and criminals in the United States to insulate electronic communications from U.S. investigators' (US 2017, 27).
Variation in data storage practices by different companies exacerbated these
fears. Important service providers such as Google or Yahoo! store data for the same
accounts in different places and move the data frequently to optimize data storage
capacities. As companies often cannot state reliably where specific parts of the data
are located at a certain point in time, service providers tend to limit data access to
data stored in the requesting jurisdiction. As a reaction to the Second Circuit decision, other service providers had stopped complying with similar requests; Google, for example, pointed to the Microsoft case in arguing that it was not required to hand over data stored abroad (US District Court, E.D. Pennsylvania 2017, 3–4) and used the decision to prove its conformity with the order of fairness.
Throughout the Microsoft warrant case, references to the order of security were
overall limited. Only a few actors recognized the references to exceptionality,
the threatened community, or national security. In particular, Microsoft avoided responding to or even recognizing any objects or arguments embedded in the
order of security, thus avoiding any potential evaluation of its behaviour accord-
ing to criteria embedded in the order of security. Instead, references to alternative
value orders (fairness, sovereignty) enabled Microsoft to strengthen its position
as a protector of consumers through privacy protection as well as juxtaposing
its respect for the legal and political order with the extraterritorial practices of
the US.

7.2.2 International Involvement

The increasing establishment of data access as a matter of territorial sovereignty contributed to a stronger imperative of justification. If data access had implications
for sovereignty, why had neither the EU nor Ireland protested, given the frequency
of data access practices? The European Commission, like the Department of Jus-
tice, perceived that privacy arguments were used purely strategically by Microsoft
to appeal to a broader audience. In an interview, an EU official suggested that
the case was ‘not about privacy at all; it is a case about sovereignty’ (EC official,
pers. comm. 2019). In a community of sovereign entities, access to data became
a question of potential intrusion on sovereignty. The case gained a public/public
dimension, compared to its formerly public/private nature. Hence, the resulting
attention to evidence sharing practices not only created a strong incentive for the
US to enhance the legitimacy of its practices but also established an imperative of
justification for other actors.
In response, various international actors submitted amicus curiae briefs to the
US Supreme Court, including the European Commission, Ireland, several NGOs,
and the UN Special Rapporteur on the Right to Privacy. The fact that international
actors, particularly the EU and Ireland, became involved in a domestic court case
was unusual, which was even highlighted by one of the judges (Supreme Court of
the US 2018, 13). The question of whether and to what extent the EU should intervene in the proceedings was strongly contested. The minutes of a 2015 meeting of the Article 31 Committee, which is made up of representatives from the EU Member States and can issue binding decisions (Data Protection Directive 1995, Art. 31), record that '[redacted] also regretted that the Commission had not intervened on the Microsoft case in the U.S. courts. The Commission explained that it
rarely intervenes in courts cases and even when it sometimes does, this would be
only in the last instance, even in the Courts of the Member states’ (EC 2015b,
2). The Commission also explicitly asked the member states to push their bilat-
eral channels with the US ‘to support the Commission’s ongoing talks with the
US’ (EC 2015b, 2). Yet the potential implications of the judgment compelled some
form of involvement by international actors. For instance, while explicitly stating
that the case involved ‘a question of domestic law on which the Special Rapporteur
expresses no view’ (UN Special Rapporteur on the Right to Privacy 2017, 2), the
UN Special Rapporteur Joseph Cannataci urged the Court to interpret the case
narrowly, because it is:

in the interest of all those who care about privacy in the United States and around
the world, for only diplomatic processes of negotiation can accommodate and
balance all of the very significant interests that are in tension in this case.
(UN Special Rapporteur on the Right to Privacy 2017, 2)

The statement invoked the broad relevance of the case and referred to the implications for privacy rights everywhere. This ultimately compelled the Commission
to submit a statement. The amicus briefs by the European Commission and Ire-
land followed similar strategies in confirming the applicability of sovereignty
as an evaluative criterion while at the same time fundamentally contesting the
US interpretation of how to pursue sovereignty as a higher common principle.
Both problematized that the jurisdictional claim of the US implicitly privileged internal sovereignty over sovereign equality. This represents a contestation of the moral hierarchization within the order of sovereignty rather than the
imposition of alternative criteria. The arguments pointed to potential extraterri-
toriality. For example, the Commission argued ‘that it would be appropriate for
the Court to consider EU domestic law as it pertains to searches of data stored
in the European Union’ (EC 2018a, 14). Ireland even more clearly stated that it
‘does not accept any implication that it is required to intervene into foreign court
proceedings to protect its sovereign rights in respect of its jurisdiction, or that Ire-
land not intervening is evidence of consent to a potential infringement thereof'
(Ireland 2018, 2–3). They pointed to inconsistencies between the US approach
and the value of the principles of territorial jurisdiction and sovereign equal-
ity in the order of sovereignty, which was also echoed by the judges (Supreme
Court of the US 2018, 7–14). While it was clear that sovereignty played a role,
the Department of Justice emphasized the importance of authority over com-
panies based in the jurisdiction as well as the prosecution of crimes in the US,
while the judges and the international participants emphasized the sovereignty
implications of access to data stored abroad. Both the Commission and Ireland
formally refrained from an explicit endorsement of either party but highlighted
potentially unlawful consequences of a ruling in favour of the Department of Jus-
tice, thus asserting their jurisdictional claim of legitimate control. In sum, while
the European Commission and Ireland acknowledged sovereignty as an evalu-
ative criterion, they delegitimized the hierarchization of principles within this
order as suggested by the US representative. Yet there seemed to be a general
reservation about publicly delegitimizing US practices as a strong violation of
sovereignty.
7.2.3 Legislative Action: The CLOUD Act

The US administration recognized that the Court seemed hesitant to rubber-stamp US practices and probably feared setting a legal precedent in the event of a judgment.
The judges specifically discussed the resolution of conflict through a legislative
proposal and emphasized the responsibility of the US Congress (Supreme Court of
the US 2018, 6, 15, 24, 39). Therefore, rather than relying on its justificatory strate-
gies, the US government drew on an alternative form of capital in the field and put
forward a legislative proposal to address the competing jurisdictional claims. The
resulting CLOUD Act was signed into law in March 2018, less than a month after
the US Supreme Court hearing, and it introduced two novelties. The first part of
the CLOUD Act authorizes the US to implement executive agreements with other
countries which grant reciprocal data access if these countries meet certain cri-
teria such as a commitment to the rule of law. The second part of the CLOUD
Act specifies that US law enforcement agencies have the authority to access data
stored by service providers irrespective of data location. As the Department of
Justice put it, ‘the CLOUD Act makes explicit in U.S. law the long-established
U.S. and international principle that a company subject to a country’s jurisdic-
tion can be required to produce data the company controls, regardless of where it
is stored at any point in time’ (US Department of Justice 2019, 3). If, for example,
a US company stores data related to Moroccan citizens in France, the US is now
authorized to access data without formal notification. However, the CLOUD Act
preserves the right to a comity analysis if data requests are considered to create a conflict of laws (CLOUD Act 2018, para. 10I), and companies may object on
such grounds. Yet a court order can require the production of data even against
other countries’ domestic law. Individuals cannot challenge law enforcement
access.
In the Senate meeting during which the bill was discussed, the CLOUD Act was
referred to as ‘a win for law enforcement, for the tech community, and for the
Trump administration as well’ (Hatch 2018, 1923). In the debate as well as the bill,
there were strong references to the order of security. For instance, conceptualizing
data as a tool for law enforcement efforts, one participant stated that ‘The CLOUD
Act gives law enforcement the tools they need to keep us safe’ (Hatch 2018, 1923).
In the CLOUD Act, data access is specifically referred to as 'an essential component of government efforts to protect public safety and combat serious crime, including terrorism' (CLOUD Act 2018, sec. 2(1)). Thus, both statements establish the
necessity of data access for public safety and thus demarcate a space of possibles
based on the order of security. The CLOUD Act has received a mixed reaction. Its procedural implementation was criticized because the act was passed on page 2,212 of a 2,232-page omnibus spending bill that was required to avert a government shutdown (Ruiz 2018), which in turn prevented a more extensive debate in Congress. Yet Daskal argues that it represents
‘a new form of international lawmaking via domestic regulation … [which] will be adopted by a growing list of foreign governments, thereby raising the privacy
standards that apply’ (2018, emphasis in original).
She compares this form of law-making with the GDPR's reach beyond EU territory, pointing to a potential parallel with the 'Brussels effect' (Bradford 2020), whereby the CLOUD Act raises privacy standards. In addition to the civil liberties safeguards, Daskal and Swire (2018) point to the possibility that foreign
governments unable to obtain data held by US companies will either unilater-
ally apply their laws extraterritorially or enact data localization laws that force
providers to store relevant data within a specific jurisdiction, which, they argue,
will undermine a global open internet. Other academics and NGOs have been
more critical of the substantive content of the law (Fischer 2018; Ruiz 2018).
The human rights safeguards expected from other countries are to some extent
limited, as the law requires that the other state or entity ‘adheres to applicable
international human rights obligations and commitments or demonstrates respect
for international human rights’ (CLOUD Act 2018, sec. 2(1), emphasis added).
At present, the CLOUD Act does not specify redress mechanisms for US ser-
vice providers under bilateral agreements, which limits the potential to challenge
decisions.
In sum, by disclosing overlapping jurisdictional claims and the limited scope of
applicable law, Microsoft prompted the US to create a legislative solution to replace
informal and legally uncertain cooperation. Even though it imposes standards on
third countries, the CLOUD Act was largely met with silence by officials in the EU
or the Council of Europe. In Section 7.3, I aim to demonstrate how the conflict
resolution process is interlinked with evaluative criteria that had been established
through deliberations outside the US Supreme Court.

7.3 Underlying Consensus in the Field

After the adoption of the CLOUD Act, the Supreme Court case was declared moot,
but the legislative solution did not address the underlying tensions that arose due
to the international dimension of the conflict, as the extraterritorial nature of the
CLOUD Act merely legalized the existing jurisdictional claim. Yet the introduc-
tion of the CLOUD Act did not spark any significant reactions from the conflict
parties. In this section, I trace how the US administration successfully imposed
a legislative solution on a political conflict, arguing that two related factors con-
tributed to its prevalence. On the one hand, the legal character of the CLOUD Act
was successfully established as a way to prove worth in the order of sovereignty. On
the other hand, I argue that incremental shifts towards the order of security con-
tributed to the increasing recognition of international cooperation as a necessary
step to fight impunity.
7.3.1 Public Legal Capital

While the increasing manifestation of private actors in the field of data governance
brings challenges to the position of public actors, the latter may draw on resources to sustain their position in the field that are not available to private actors. The
introduction of a legislative solution changed the relationship between public and
private actors by providing additional resources to public actors, as it moved the
issue from the political back to the legal arena. While private actors articulated
strong jurisdictional claims in the court case, this changed with the introduction
of the CLOUD Act. In a letter to the senators sponsoring the bill, Microsoft and
other tech firms stated, ‘We appreciate your leadership championing an effective
legislative solution, and we support this compromise proposal’ (Apple et al. 2018).
Microsoft seemed to be willing to fight for consumer rights as long as it found
itself in a situation of legal uncertainty, even though this created a significant cost.
However, Microsoft was less willing to continue pushing for consumer rights and
sovereignty principles when it had mitigated this uncertainty. In 2018, Microsoft
published ‘Six Principles for International Agreements Governing Law Enforce-
ment Access to Data’ (B. Smith 2018c). These principles demanded stronger
transparency and judicial review but also emphasized Microsoft’s sacrifices for
customers. For instance, the principles highlight that 'Microsoft has fought hard to secure
these rights and protections. Three times we filed lawsuits against the U.S. gov-
ernment to increase transparency, and all three successfully prompted significant
new protections for our customers’ (B. Smith 2018c, 1).
On the one hand, the acceptance of the CLOUD Act by tech companies points
to the limits of responsibility that private actors tend to accept. While it seems
that companies increasingly feel the need to position themselves as embracing a more
proactive role in the field, they seem more likely to deflect responsibility in con-
texts where the estimated benefit is limited (Eichensehr 2019; Gorwa and Peez
2018). For example, a significant number of companies cooperated under the
PRISM program for years (see Chapter 4), but since widespread public criticism
emerged with regard to these practices, companies have tried to voice their objec-
tions publicly (Reform Government Surveillance 2015). While even companies
like Facebook that have long attempted to avoid responsibility (Haggart 2020)
have embraced calls for regulation more recently, it is important to further exam-
ine the role of private companies and platforms in jurisdictional conflicts as either
active challengers or passive bystanders. By emulating public justifications empha-
sizing the importance of sovereignty and pointing to the inconsistencies inherent
in law enforcement data access practices, Microsoft was able to compel justifications from public actors. For Microsoft, this initial challenge was sufficient to
achieve more significant inclusion in the field, despite significant barriers to access.
One interviewee described how those engaged in electronic evidence have formed
somewhat of a ‘community where everybody knows each other’ (Tech company
employee A, pers. comm. 2019), describing close interaction between Microsoft
and various officials from EU institutions. In contrast, NGOs and data protec-
tion authorities in particular have complained about the exclusive character of this
community (NGO representative A, pers. comm. 2019; DPA representative, pers.
comm. 2019).
On the other hand, this statement points to the valued status of domestic law
even in international contexts. According to a common definition of the distinc-
tiveness of legal rules, law is distinct because of its form and source rather than
the threat of sanctions or the content’s morality (Hart 1958). The justification of
extraterritoriality for security purposes had failed in the context of the Supreme
Court case, but the creation of a legal basis through the CLOUD Act altered the
space of possibles for the US to assert its jurisdictional claim. While its norma-
tive nature did not change, its formal nature did. All actors frequently referred
to the importance of respect for domestic law (e.g. EC 2018a, 5) to prove worth.
A Department of Justice letter explicitly outlined that ‘the legislative proposal is
necessary to reinstate the pre-Microsoft status quo when providers routinely com-
plied’ (Ramer 2017, 1), including for data stored abroad. This is also reminiscent of
the legalization of contested surveillance practices in the context of counterterror-
ism measures. As Viola and Laidler (2022) highlight, increased transparency about
practices has frequently resulted in the explicit legalization of certain surveillance
measures rather than their elimination. Hence, the legalization of US access prac-
tices contributed to an increase in the perceived legitimacy of the jurisdictional
claim by the US. Nonetheless, in light of the persistence of overlapping juris-
dictional claims, it seems surprising that, particularly in the EU, the CLOUD
Act found acceptance as a resolution of the conflict, apart from isolated critical
voices (in’t Veld 2019; Jourová 2018). I examine this question in more detail in
Section 7.3.2.

7.3.2 Legislative Proposals in the Council of Europe

While now backed by domestic law, US access practices were still in potential
violation of EU law and sovereignty. It is likely that one factor that contributed
to the recognition and acceptance of this solution stems from the fact that the EU
faced an asymmetric distribution of resources. As the majority of internet service
providers relevant for EU law enforcement authorities are located in the US, the
country holds significant control over data (EC official, pers. comm. 2019). Yet I argue that shifts in the normative character of the field constituted the main reason for
the acceptance of the US CLOUD Act. In what follows, I highlight how legislative
developments in the Council of Europe and the EU have contributed, via incre-
mental processes that emphasized the supremacy of security cooperation over
sovereignty concerns, to the entrenchment of alternative criteria of evaluation
in the field. This increased the availability of policy options in the Microsoft
case.
As mentioned above, the Council of Europe's Budapest Convention is currently
the most important international treaty on cybercrime. It emphasizes cooperation
between jurisdictions in the fight against cybercrime and, through provisions on mutual transborder access to stored computer data, contains elements that circumscribe state
sovereignty (CoE 2001). More specifically, there is a clash between the require-
ment to ensure territorial integrity online (Markoff and Kramer 2009) and the
necessity of promoting international cooperation. This tension arises in particular
from the ambiguity of Article 32b, which provides that a party to Convention 185:

may, without the authorization of another Party … access or receive, through a computer system in its territory, stored computer data located in another Party,
if the Party obtains the lawful and voluntary consent of the person who has the
lawful authority to disclose the data to the Party through that computer system.
(CoE 2001, Art. 32b)

The Cybercrime Convention Committee (T-CY) acknowledges that 'Article 32b is an exception to the principle of territoriality' (T-CY 2014, 3) but emphasizes that 'the Parties to the Convention form a
community of trust and that rule of law and human rights principles are respected
in line with Article 15 Budapest Convention’ (T-CY 2014, 5). The circumvention
of sovereignty in the pursuit of the common good of security has become more
urgent in the context of the development of a second additional protocol to the
Budapest Convention on ‘Enhanced international cooperation on cybercrime and
electronic evidence’ (CoE 2020). The protocol is designed to improve cross-border
cooperation related to cybercrime and electronic evidence investigations and is
similar to the US CLOUD Act. The protocol has been under discussion for an
extended period and was approved by the T-CY in May 2021.
Russia is the only member state that has refused to sign the Budapest Con-
vention on the grounds of sovereignty violations. Yet in a 2017 resolution, the
European Parliament similarly stressed concerns about the circumvention of data
protection and due process rights (EP 2017b, para. 78). The majority of debates
about the additional protocol have focused on a combination of appeals to the
rule of law and the order of security. Justifications highlight the threat to both
the physical safety of a reference community and its core values. For example,
conference proceedings in the Council of Europe on the matter highlight that
‘During the past two years, cybercrime has reached even more threatening pro-
portions affecting the security of individuals and core values of societies’ (CoE
2018b, 1). In several international fora, Alexander Seger, the head of the Cybercrime Division at the Council of Europe, presented the slogan 'No data → no evidence → no
justice → what rule of law?’ (see, e.g., Seger 2019). This justification similarly
ties principles of the order of security to principles of the order of fairness and
thus echoes justifications brought forward in the context of counterterrorism
(see, e.g., Obama 2014). Council of Europe communications have also specifi-
cally appealed to a dystopian scenario, highlighting the dangers of the absence
ACCESS DENIED? 169

of rules. For example, ‘Failure to reach agreement on effective means to investigate
cybercrime and secure electronic evidence carries the risk that competencies
will further shift from the criminal justice arena (with strong safeguards) to the
national security arena’ (CoE 2018b, 1). This echoes the point of Daskal and Swire
(2018) that the proposed safeguards are significantly better than the absence of
any safeguards. Proponents of law enforcement access argue that criminal justice
is strongly tied to the pursuit of fairness and can only be achieved by data access
with lower thresholds for other fundamental rights. The underlying assumption
that broad law enforcement access actually enables the pursuit of justice or security
is rarely questioned, which highlights again the strong ‘illusio’ (Bourdieu 1993) of
the field.
The negotiation of the second protocol has contributed to tensions with NGOs
and DPAs regarding both the substance of the protocol and the procedural imple-
mentation. NGO representatives in particular have complained about insufficient
safeguards in the proposal (EDRi et al. 2019) and a lack of access to the Council of
Europe negotiations (NGO representative A, pers. comm. 2019), while DPAs have
described tensions with the responsible cybercrime unit at the Council of Europe
(DPA representative A, pers. comm. 2017). The Council of Europe conference that
aims to address these issues, entitled ‘Octopus Conference’, is dominated by coun-
try representatives from law enforcement or justice ministries, with only a much
smaller number of non-governmental stakeholders, including academia,
the private sector, NGOs and DPAs—as is also illustrated by the conference’s
list of participants (CoE 2019). Criticism has also emerged in the private sector.
For example, Facebook has encouraged increased fundamental rights protection,
explicitly highlighting that the text ‘makes no mention of service providers and
Parties’ obligations to protect human rights more generally’ (Facebook 2019, 4).
Facebook emphasized that it takes its ‘responsibility to ensure public safety both
on and off of our platform seriously’ and is ‘equally committed to protecting our
users’ privacy and human rights’ (Facebook 2019, 4). This again reinforces the idea
of companies as ‘Digital Switzerlands’ (Eichensehr 2019) that attempt to challenge
public authority. In sum, the strategy of the Council of Europe draws on a combi-
nation of the order of security and the order of fairness, arguing that only through
security can important liberal norms be realized. This demonstrates an effort to
create normative fit and appeal to a broad audience which has nonetheless faced
some resistance.

7.3.4 Legislative Proposals in the EU

Parallel negotiations have played out in the EU. Law enforcement and judicial
cooperation using ICT, designed to strengthen law enforcement control of
cybercrime, has already been implemented within the framework of the e-Justice
strategy (Anagnostakis 2017). Legislative action increased significantly
170 DATA GOVERNANCE

after the Microsoft warrant case began in 2013; the case was
granted review by the US Supreme Court in October 2017. In July 2016, the
European Commission began requesting submissions on electronic evidence from
member states and found a ‘large variety of approaches adopted by the Member
States’ law enforcement and judicial authorities as well as by the service providers’
(EC 2016a, 1). In late 2017, the European Commission conducted a public consul-
tation on the improvement of cross-border cooperation in law enforcement that
already foresaw the publication of a legislative proposal in early 2018. In April
2018, only a month after the CLOUD Act had been signed into law, the European
Commission presented a twofold proposal to harmonize regulation of e-evidence
in the EU, with provisions in many ways similar to those of the CLOUD Act and
the CoE proposal.
The proposal includes a directive on the appointment of legal representatives
and a proposal for a regulation introducing European Production and Preser-
vation Orders (EC 2018b, c). While the European Investigation Order (2014)
facilitates cooperation between law enforcement agencies regarding investigative
measures, such as hearing remote witnesses or covert investigations, the Euro-
pean Production Order would enable judicial authorities in one member state
to directly obtain e-evidence from a service provider based in another member
state. In turn, the European Preservation Order authorizes judicial authorities to
request the preservation of data from service providers for future access. In the
proposal, data is conceptualized as fluid, which highlights the ‘volatile nature of
electronic evidence and its international dimension’ (EC 2018b, 2). The regula-
tion also ‘moves away from data location as a determining connecting factor, as
data storage normally does not result in any control by the state on whose terri-
tory data is stored. Such storage is determined in most cases by the provider alone,
on the basis of business considerations’ (EC 2018b, 13). This also emphasizes
the relevance of international cooperation. As in the European Arrest Warrant,
the European Commission emphasizes the principles of double criminality and
mutual recognition (EC 2018b, 4). Mutual recognition means that investigations
for offences recognized in both jurisdictions can be executed in another mem-
ber state without judicial review. The Commission carefully tried to avoid overly
broad sovereignty implications, as the proposal ‘clarifies the procedural rules and
safeguards applicable to cross-border access to electronic evidence but does not
go as far as harmonising domestic measures’ (EC 2018b, 6). However, the Com-
mission proposal also explicitly articulated the intention to provide ‘a model for
foreign legislation’ (EC 2018b, 10), thus highlighting the intention to promote EU
norms to third countries.
The proposal has been subject to criticism, including some from the mem-
ber states (Federal Ministry of Justice and Consumer Protection (Germany)
2019, 2). Concerns about the European Commission’s lack of transparency and
cooperation resulted in a total of 841 amendment proposals by the Parliament
(Christakis 2020). The absence of safeguards has contributed to criticism by the
29WP, which states that ‘in the absence of such guarantees, any envisioned instru-
ment for direct access to electronic evidence would fail to comply with the
requirements of EU law’ (29WP 2017, 8). The proposal also shifts due process
functions from judicial authorities to companies (DPA representative, pers. comm.
2018). In addition, the ‘unduly short’ (EDRi 2017, 2) deadline exacerbated chal-
lenges for some actors (Former MEP advisor, pers. comm. 2019). In December
2018, the Council published a widely criticized draft that removed safeguards
such as the potential to object to requests (Christakis 2020). In November 2019,
the rapporteur of the LIBE Committee, Birgit Sippel, published a critical report
(LIBE 2019) which inter alia highlighted that the framing of data as evidence is
problematic, as only part of the data is likely to be admissible evidence in court.
In sum, the legislative proposal put forward by the European Commission and
the ongoing discussions in the Council of Europe constitute steps in the direc-
tion of closer cross-border judicial and police cooperation. Both frameworks are
designed to function on an interoperable basis and avoid fragmentation.

7.3.5 Future Trajectories and Interlinkages

As Herschinger, Jachtenfuchs, and Kraft-Kasack have pointed out concerning
other semi-automatic cooperation mechanisms in the area of criminal justice
cooperation, these constitute ‘a true revolution’ (2011, 456), as a suprana-
tional authority increasingly shapes the conditions and applicability of the state
monopoly of violence. The fact that in the area of electronic evidence, such tenden-
cies now seem to diffuse to contexts beyond the EU is noteworthy, as EU member
states are generally considered unique in their readiness to accept supranational
authority (Herschinger et al. 2011, 465). This constitutes a significant recognition
of principles of the order of globalism. While it is less radical than the conception
of sovereignty articulated with regard to the responsibility to protect (Slaughter
2005), there seems to be an increasingly international dimension to the monopoly
of force with a conditional interpretation of sovereignty. This may have poten-
tially problematic consequences from a normative perspective. While the Council
of Europe is an institution specifically focused on human rights, the broader mem-
bership of the Budapest Convention means that not all signatories are strictly
bound by liberal standards. If states accept mutual access through the Budapest
Convention, this brings the risk that they share data with states that potentially use
data for illiberal purposes. Furthermore, in light of illiberal policies in Hungary or
Poland, there are also worries about EU member states.
In 2019, the UK and the US concluded the first bilateral agreement for recipro-
cal access under the CLOUD Act. There have been concerns about mission creep,
particularly concerning intelligence cooperation (EC 2020c). While the EU has
started negotiations with the US regarding the exchange of electronic evidence,
the Commission seems to avoid tying these negotiations to the framework of the
CLOUD Act (EC 2019a, 2019b). Nevertheless, the conclusion of a bilateral agree-
ment between the EU and the US is very likely, because the European Commission
aims to prevent bilateral agreements between member states and the US. These
would not only lead to fragmentation but, due to the cooperation mechanisms
between EU member states, may also risk undermining common safeguards.
These legal proposals, therefore, highlight the increasing convergence between
the order of globalism and the order of security, indicating a shift towards data gov-
ernance as a vision of security partnership. Thus, while in the Supreme Court case
the US, partly owing to the involvement of the EU, could not gain recognition for
arguments based in the order of security given the extraterritorial character of law
enforcement practices, developments in the field have highlighted the increasing
entrenchment of such principles.
mutual recognition as key to fostering trust (Autolitano et al. 2016, 61) seems to
mitigate the potential violations of sovereignty. I argue that this entrenchment of
the order of security had significant implications for the resolution of the con-
flict, as the broader principles shaped the space of possible policy options and
thereby transformed the evaluation of data access in light of the threat of impunity.
This established the necessity of global cooperation in view of potential impunity
and undermining effects on the rule of law. Thus, the conflict shows strong inter-
linkages with the evolution of the field. While electronic evidence has not yet
received considerable public attention, despite significant effects for users and
citizens, the Supreme Court case and Microsoft’s visibility certainly contributed
to a broader awareness. The case clearly generated the impetus for a legislative
solution. The resulting sense of urgency is also evident from the somewhat hidden
nature of the legislative proposal in a spending bill.

7.4 Conclusion

This chapter has discussed the contested area of access to electronic evidence
by law enforcement agencies in varying jurisdictions. I have demonstrated how
Microsoft used its position in the field to challenge practices by US law enforce-
ment authorities. Microsoft challenged the established and barely problematized
informal cooperation between law enforcement authorities and private companies
on the basis of their extraterritorial effects. This catalysed processes to legalize
and formalize these practices. As a result, sovereignty-intrusive jurisdictional
claims in the area of electronic evidence sharing seem to have stabilized. The
conflict represents, on the one hand, one of the first major
challenges to a formerly largely disregarded system of public–private cooperation.
On the other hand, while formally restricted to the national level, the conflict
resonated with an inter- and transnational audience and thus initiated broader
meaning-making and normative ordering processes. While the conflict became
salient internationally due to the underlying conception of data governance as a
potential intrusion on sovereignty, the US implemented a legislative solution that
was largely based on the conceptualization of data as a security concern. Thus,
the chapter has illustrated an important conflict resolution strategy: legalization.
By creating a legal basis for existing practices, the US CLOUD Act has continued
strategies that, in the area of counterterrorism, have contributed to bilateral agree-
ments, for example with regard to Passenger Name Records (see Chapter 5) and
the Terrorist Finance Tracking Program (see Chapter 6), or several legislative acts
that enable data access by intelligence services. Law has an enabling as much as a
constraining dimension here (J. Cohen 2019).
I have argued that the conflict resolution process was based on incremental
processes that had altered the space of possibles due to a shift towards the order
of security. The interests of private companies and law enforcement authorities
converged around a vision of data governance as a security partnership. While
both sides benefit from legal certainty, the formalization of informal practices
also brings risks. Public actors attempt to uphold the benefits of informal direct
cooperation with providers, such as ease and speed of access, while making their
access requests mandatory, which potentially undermines due process norms. In
light of stringent data protection standards and a renewed emphasis on digital
sovereignty, particularly in cloud computing, the current regime might face legal
challenges in the EU.
The question of access to electronic data by law enforcement agencies has also
manifested in conflicts about the availability of the so-called WHOIS database in
domain name registration (Kulesza 2018). In Brazil, a similar conflict over judicial
orders to disclose user data escalated dramatically. The refusal to disclose data
that Facebook insisted were not accessible due to encryption prompted Brazil-
ian judges not only to shut down WhatsApp temporarily on multiple occasions
but also to detain Facebook’s vice president for Latin America, Diego Dzodan
(Freedom House 2016). To avoid such disputes, companies face incentives to wel-
come legal harmonization, because it enables them to maintain their image as
responsible protectors of customer rights, while simultaneously creating legal cer-
tainty. As Microsoft has introduced its first underwater data centre off the coast
of the Orkney Islands (Cellan-Jones 2018), new jurisdictional challenges and
conflicts are likely to arise.
8
The Right to Be Forgotten
Moral Hierarchies of Fairness

The increasing wealth of data has contributed to the transformation of the inter-
net into a durable collective human memory. Search engines such as Google
aim to make this collection of information available by crawling, indexing, and
ranking content. According to current estimates, Google, which has a search
engine market share of over 90 per cent worldwide, processes at least 40,000 search
queries on average per second, which amounts to 3.5 billion searches per day and
1.2 trillion searches per year worldwide (Google Search Statistics 2020). As Mayer-
Schönberger put it, on the internet, the availability of data ‘will forever tether us to
all our past actions, making it impossible, in practice, to escape them’ (2011, 125),
as the internet transforms time into a ‘perpetual present’ (2011, 92). This transfor-
mation raises fundamental questions about how identity is represented and shared
online, how it can evolve, and to what extent digital memory is tied to the technical
specificities of search engines. Amid Zuboff’s (2018) warnings about surveillance
capitalism’s impact on human futures, the idea of a ‘right to be forgotten’ has attracted
widespread attention (McGoldrick 2013). Also known as the right to be delisted or
the right to erasure, the right to be forgotten specifies circumstances under which
private individuals may request the removal or delisting of personal information
from search engine results.
This chapter aims to analyse the evolution of the debate about the right to be for-
gotten in the context of two major instances of jurisdictional conflict that emerged
between the US-based search engine Google and EU data protection authorities.
First, I focus on the introduction of the right to be forgotten through a landmark
ruling by the CJEU in 2014. Following a legal dispute with a Spanish citizen,
Google was forced to allow EU residents to demand the delisting of websites from
search results if data are ‘inadequate, irrelevant or no longer relevant or exces-
sive’ (Google Spain 2014, para. 94). The second jurisdictional conflict unfolded
between Google and the French data protection authority regarding its scope of
applicability. While the Commission Nationale de l’Informatique et des Libertés
(CNIL) argued for global applicability, Google refused. The CJEU (Google v.
CNIL 2019) restricted the scope of applicability to the European context, argu-
ing against the French data protection authority’s request. In this second instance
of conflict, the debate shifted from the question of the existence of the right to be
forgotten to the permissible imposition of norms or rules on other communities.

Data Governance. Anke Sophia Obendiek, Oxford University Press.
© Anke Sophia Obendiek (2023). DOI: 10.1093/oso/9780192870193.003.0008
In this chapter, I explore the differences between these instances of jurisdictional
conflict both in strategy and in outcome, focusing inter alia on the role
of Google in the conflict resolution process as an advocate of human rights.
The analysis of this more recent jurisdictional conflict constitutes an important
contribution to the literature. While the societal and legal implications of the
right to be forgotten have been discussed extensively (e.g. Azurmendi 2017; Jones
2018; Mayer-Schönberger 2011; Rosen 2011), also in the context of the ruling
(Svantesson 2015a, 2016) and other legal frameworks (Garstka and Erdos 2017),
the more recent manifestation of a jurisdictional conflict has not received suffi-
cient attention yet (exceptions include Burchardt 2020; Globocnik 2020; Samonte
2020). Most of these accounts at best discuss in passing the political dimension of
these conflicts. Exceptions include Chenou and Radu (2019) on the hybrid gover-
nance arrangement that follows from the right to be forgotten and Powles (2015a)
on the role of Google and data protection authorities as active shapers of evalua-
tive criteria in jurisdictional conflicts. An emerging body of literature attempts to
explain the unique sources of power of tech companies and platforms more gen-
erally (Cammaerts and Mansell 2020; Culpepper and Thelen 2019; Eichensehr
2019; Haggart 2020). Zuboff (2018) also focuses extensively on Google. The rela-
tionship between companies and data protection authorities has been the subject
of comparatively little research. The legally dominated literature tends to focus on
the framework rather than the enforcement of the law (exceptions include Bennett
and Raab 2006, 133–43; Hijmans 2016b; Solove and Hartzog 2014). Yet the role of
data protection authorities in ‘arbitrating the degree of information privacy that
we enjoy as a fundamental right’ (Raab and Szekely 2017, 421) has been subject to
closer scrutiny. More specifically, the role of data protection authorities has been
described as both shaping and applying the law (Jóri 2015) through deliberation,
advocacy, monitoring, coordinating, or enforcement (see, e.g., GDPR 2016, Art.
57(1)).
I argue that, first, the scarce recognition for Google’s justifications in the 2014
court case, which contributed to the entrenchment of the right to be forgotten,
was based on the lack of appeal to higher normative principles. Google underesti-
mated discursive shifts in the EU that favoured a stronger notion of responsibility
for private companies, including companies based abroad. In contrast, Google’s
increasing appeal to the higher moral principle of fairness in the second case was
supported and recognized by different actors, particularly NGOs and media com-
panies, which supported the company’s identity construction as a responsible
actor and contributed to Google’s success in the second instance of conflict.
Second, I argue that the perceived identity of data protection authorities as
enforcers or shapers of data protection legislation plays a significant role in conflict
resolution processes. The CNIL’s justifications in the order of sovereignty reveal
strong tensions between principles of territorial jurisdiction and field-inherent
restrictions on the exercise of sovereignty. I suggest that national data protection
authorities in particular may feel inclined to protect their reference communities
by imposing solutions that draw on a combination of justifications based on the
order of sovereignty and the order of fairness.
Third, while most of the other examples point to normative tensions between
different value orders, this case demonstrates the rise of jurisdictional conflicts
based on a challenge to the in-order moral hierarchization. More specifically,
this concerned the hierarchization, on the one hand, of principles rooted in the
order of fairness, such as privacy, freedom of speech, and freedom of informa-
tion, and, on the other hand, the right conduct of governance in the pursuit of the
higher common principle of sovereignty. I argue that this character of the conflict
enabled Google to establish itself as an active shaper rather than passive taker of
rules.
The chapter is structured as follows: first, I briefly illustrate the historical and
legal background of the right to be forgotten before focusing, secondly, on the ini-
tial complaint and court case in 2014, highlighting particularly actors’ diverging
justifications. Thirdly, I contrast these justifications with the normative appeals in
the dispute between the French data protection authority and Google. Fourthly, I
embed this conflict in broader normative shifts that represent a changing percep-
tion of private responsibility in relation to big tech companies which shapes the
resolution of jurisdictional conflicts in the long term.

8.1 Legal and Normative Background to the Right to be Forgotten

As search engines enable finding publicly available information and content for
participating in democratic processes and societal debates, they are considered
to ‘play a pivotal role in the information society’ (CoE 2012, 1). Due to this
enabling function vis-à-vis the exercise of human rights, they are considered to
have specific public value. However, it is important to note that access to informa-
tion is not independent on the user. The order and visibility of search results is
determined by algorithms, which may be biased and, due to concerns about com-
petition, are not transparent. In 2009, Google began offering personalized search
results depending on user data, such as location, search history, IP addresses, date
and time of the request, cookies, and information from other Google services
(Google 2009). Therefore, individual searches vary significantly in their results.
As crawling and indexing technologies improve, searches are more likely to dis-
close content that was not originally intended to be widely available, in particular
sensitive information. While access to the internet allows the exercise of human
rights such as freedom of speech, the accessibility of this content to anyone may,
in turn, infringe upon the individual rights of others. There are several reasons
why people might aim to have past actions removed from the public record. These
include fears of reputational damage but also attempts to reduce emotional dis-
tress resulting from cyberbullying or image-based sexual abuse. The right to be
forgotten seeks to restrict such infringements, particularly concerning identity
fraud, risks of the abuse of data or incorrect conclusions on the basis of data, and
limits to autonomy based on a lack of control (Tjong Tjin Tai 2016). The per-
ceived lack of control in relation to data disseminated on the internet may have
serious consequences. For example, in 2012, a 15-year-old Canadian committed
suicide after experiencing significant cyberbullying because of a topless photo she
had sent to a stranger in 2009 (M. Dean 2012). As search results connected to
personal identifiers such as names significantly shape the possibility of moving
beyond past experiences (Mayer-Schönberger 2011), the enactment of individ-
ual and collective digital identity is to an unprecedented extent shaped by the
design of search engines that determine the composition and structure of such
results.

8.1.1 The Right to be Forgotten

While the right to be forgotten became the subject of intense debate only more
recently, both the societal and legal implications have already been discussed in
the context of offline data, for example in relation to the Data Protection Directive
(1995). The right to be forgotten was also employed in a decentralized manner by
national data protection authorities, for example in Spain (e.g. Azurmendi 2017),
or with regard to certain provisions regarding criminal records, for example in
France, Germany, or Italy. Many domestic legal systems specify general require-
ments that support rehabilitation after criminal convictions, for example, to clear
offences from criminal records or to remove debt histories. In contrast, in the US,
the right to information, freedom of speech, and freedom of the press often take
precedence over the protection of privacy based on the First Amendment to the
US Constitution (Barendt 2005, 232–46).
In the EU, debates about the specification of a right to be forgotten resurfaced
in the context of data protection reform. In 2010, the Justice Commissioner and
Vice President Viviane Reding emphasized that ‘Internet users must have effective
control of what they put online and be able to correct, withdraw or delete it at will’,
also specifically referring to a proposed ‘right to be forgotten’ (Reding 2010). At the
EU level, the first explicit legal expression and clarification of this right manifested
in 2014 through a judgment of the CJEU, which I explore in Section 8.2. While
the court considered a legal basis in Article 12(b) and Article 14(a) of the Data
Protection Directive (1995), which granted rights to rectify non-compliant data
and to object to their processing, the right has been strengthened with the GDPR,
which stipulates that data subjects have, in the first instance, a ‘right to erasure’
vis-à-vis the data controller (GDPR 2016, Art. 17).

8.2 Diverging Conceptions of Responsibility in the Costeja Case

In 2014, a CJEU landmark judgment specified that data subjects have the right to
request the delisting of search results from search engines if the data are outdated
or irrelevant. The lawsuit was the result of a complaint to the Spanish data pro-
tection authority which sparked a jurisdictional conflict between Google and the
public data protection authority over the legitimate conduct of data and content
displayed by the search engine. In what follows, I first give a brief summary and
then examine the justifications in more detail.

8.2.1 A Complaint Goes to Court

In 2009, after a Spanish citizen, Mario Costeja González, had unsuccessfully tried
to get information about a home repossession order removed from the internet, he
filed a complaint with the Agencia Española de Protección de Datos (AEPD). The
relevant information had been published in a newspaper in 1998 and was featured
as one of the top listings when searching on the search engine Google. Accord-
ing to Costeja González, this negatively affected his professional life. He had first
asked the newspaper to remove the article and, after that failed, requested a delist-
ing from Google Spain SL, which would remove the article from search results for
his name. He argued that the information was no longer relevant, as any financial
issues had been resolved long ago (Google Spain 2014, paras. 14–15). While the
AEPD dismissed the complaint against the newspaper, as the publication had been
legal and for the legitimate purpose of attracting buyers (Google Spain 2014, para.
17), the complaint against Google was upheld. The AEPD argued that the delist-
ing of such information from search engine results, rather than the removal of the
information sufficiently decreased the likelihood of accidental access while pre-
serving freedom of information. When Google Spain challenged this decision, the
Audiencia Nacional, the Spanish High Court, referred the case to the CJEU. The
Spanish High Court asked questions regarding, first, the applicability of the Data
Protection Directive to search engines due to their potential status as data ‘pro-
cessors’ or ‘controllers’; secondly, the applicability of EU law to Google
Spain, and thirdly, whether an individual had the right to request the erasure of
data from search results (Google Spain 2014, para. 3). Thus, the CJEU had to weigh
competing claims about the legitimate conduct of data governance with regard to
search engine results.

8.2.2 The Costeja Case

In the 2014 CJEU case, justifications spoke to the question of responsibility of
both Google and the CJEU. A significant portion of the case was concerned with
whether Google as a US-based company could be held responsible for practices
in the EU. Google Inc., now Alphabet, is a US company. However, it acts in
Europe through its subsidiaries and targets Europeans as customers. In its first
submission, Google therefore attempted to contest the authority of the CJEU,
arguing that the company was outside CJEU jurisdiction, as Google Inc. rather
than the EU-based Google Spain conducted the data processing. The company
explicitly highlighted its compliance with US law but argued against any extended
responsibilities as a data controller (Google Spain 2014, paras. 2–4). In the 1995
Data Protection Directive, a controller of data is defined as ‘the natural or legal
person, public authority, agency or any other body which alone or jointly with
others determines the purposes and means of the processing of personal data’
(Data Protection Directive 1995, Art. 2(d)). In contrast to processors, data con-
trollers have more extensive responsibilities. Apart from a reference to potential
censorship at the very beginning (Google Spain and Google, Inc. 2012, para. 1),
Google’s justificatory strategy did not appeal to higher normative principles but
relied on these legal aspects. This approach is not without precedent. While Brazil
referred to Facebook’s attempt to prove it was not in the country’s jurisdiction as
an ‘outrageous disregard’ (cited in Frosio 2013) of its sovereignty, others have rec-
ognized this principle as legitimate.¹ The European Commission countered and
emphasized that as Google’s business model relied on advertising to customers
in the EU, claims and injunctions sent to them by the persons concerned and
the competent authorities are only the logical consequence of the fact that the
processing is carried out ‘within the framework’ of the activities of Google Spain
(EC 2012a, 27, my translation). The European Commission thereby emphasized
a variation of the market-place principle, which stipulates the applicability of EU
law for all activities targeting EU customers and came to be institutionalized with
the GDPR.
The CJEU had to weigh arguments of individual rights protection against
jurisdictional applicability and corporate responsibility. The CJEU found Google
Spain to fall within the territorial scope of the Data Protection Directive (1995) due
to its activities and stable arrangements in Spain as well as its status as a separate
legal personality (Google Spain 2014, para. 49). In addition, the Court classified
Google as a data controller. It argued that the curation of search results also implies
legal responsibilities. The judgment emphasized the protection of privacy of the
individual, arguing that these rights ‘override, as a rule, not only the economic
interest of the operator of the search engine but also the interest of the general pub-
lic in finding that information upon a search relating to the data subject’s name’
(Google Spain 2014, 17). The CJEU found in favour of Costeja González, granting a right to delist search results from search engines if the data are 'inadequate, irrelevant or no longer relevant, excessive in relation to the purposes of the processing'

¹ For an overview, see Svantesson (2016).

(Google Spain 2014, 19). Like the AEPD, it argued against the removal of the data
from the newspaper website.
In its judgment, the Court created two moral hierarchies. On the one hand, it
determined that the relevant reference community was constituted by individuals
rather than businesses, referring also to the qualitative difference between individ-
ual rights and business interests. Some have criticized the fact that the judgment
extensively referred to Articles 7 and 8 of the Charter of Fundamental Rights but
omitted that there is no specification of the role of private actors in the enforcement
of such standards (Frantziou 2014, 768). By emphasizing the private responsibil-
ity to protect, the CJEU judgment not only established the jurisdictional claim of
individuals over their data but also transferred the enforcement responsibility to
private companies.
On the other hand, the Court also created a hierarchization of fundamen-
tal rights. The judgment prioritized the rights of data protection and privacy
of the individual. However, the CJEU strongly argued for balancing acts on a
case-by-case basis, emphasizing the importance of the public interest (Google
Spain 2014, para. 81). Some, nonetheless, suggested that the court neglected the
fundamental rights of freedom of expression and information and freedom of
the press (Frantziou 2014, 770). The contested character of the judgment is also
indicated by the fact that the CJEU—atypically—did not follow the opinion of
the Court’s Advocate General. The Advocate General Niilo Jääskinen had argued
against the right to be forgotten on the basis of the fact that ‘the fundamental
right to information merits particular protection in EU law, especially given the
ever‑growing tendency of authoritarian regimes elsewhere to limit access to the
internet or to censor content made accessible by it' (2013, para. 121). He pointed
to a dystopian scenario in the order of fairness based on authoritarian censorship
and the limitations of freedom of speech and concluded that a right to be forgotten
‘would entail sacrificing pivotal rights such as freedom of expression and informa-
tion’ (Jääskinen 2013, para. 132). In contrast, others argued that the CJEU ‘builds
a defence of privacy against a new dimension of risk generated by the Internet
and search engines’ (Azurmendi 2017). While both draw on a dystopian scenario
based in the order of fairness, they arrive at different conclusions. While the AG
argued against sacrificing freedom of expression and information, Azurmendi, fol-
lowing the Court’s emphasis on privacy rights, argues in favour of the protection
of privacy against private interference.

8.2.3 Responses to the 2014 Judgment

Responses to the CJEU judgment varied greatly. The European Commission in particular embraced the right to be forgotten as a measure to reinforce individual rights in relation to data. For instance, the EU Justice Commissioner Martine
Reicherts emphasized the right to be forgotten as a way to ‘put citizens back in con-
trol of their data’ (2014, 4), which also evokes the notion of individual ownership
of data (see also Reding 2014). This idea is set up against the idea of data ownership
by companies (B. Smith 2018c) as well as the conception of data as some form of
a public good which became more important in the subsequent dispute. Thus, the
relevant reference community consists of individuals who need to be empowered
rather than businesses and customers or ‘the public’.
In contrast, Google initially aimed to construct a reference community based
on (existential) threats. More specifically, in the immediate aftermath of the judg-
ment, the company tried to delegitimize the right to be forgotten by applying
evaluative criteria based in the order of security. Google attempted to discur-
sively link the right to be forgotten to the cover-up of crime (J. Halliday 2014).
For instance, Google stated that about 30 per cent of the 41,000 requests it received within two weeks of the ruling were related to fraud, 20 per cent concerned serious crimes, and more than 10 per cent were related to child pornography, thus
referring to a dystopian scenario of abuse to cover up morally deficient behaviour.
European actors tried to counter this strategy. With regard to criticism of data
protection reform proposals, which included the right to be forgotten, the EU
Justice Commissioner Martine Reicherts warned that ‘Those who try to use dis-
torted notions of the right to be forgotten to discredit the reform proposals are
playing false. We must not fall for this’ (Reicherts 2014, 2). Her explicit warn-
ings of distortion, discredit, and foul play pointed to morally deficient attempts
at interference.
In contrast to the initial reports by Google, the majority of requests under
the right to be forgotten—between 84 and 98 per cent in 2015—were found to
involve private personal information rather than criminal activities (Tippmann
and Powles 2015). After initial attempts to delegitimize the right to be forgotten,
Google rapidly changed its position and adopted a more comprehensive strat-
egy to adapt to the conflict resolution process initiated by the CJEU judgment.
Google implemented an online form for requests under the ruling. The company
pointed out that it ‘moved rapidly to comply with the ruling from the Court.
Within weeks we made it possible for people to submit removal requests, and soon
after that began delisting search results’ (Google 2015), highlighting its coopera-
tive compliance. In addition, Google created an Advisory Council and developed
removal guidelines for the EU in cooperation with academics and other experts.
The membership of the council, which included respected scholars and authori-
ties, established Google’s valuation of independent expertise and impartiality, thus
proving conformity with principles of the order of fairness. The Advisory Coun-
cil’s mission was ‘to advise it [Google] on performing the balancing act between
an individual’s right to privacy and the public’s interest in access to information’
(Advisory Council to Google 2015, 1). This established the company's increasingly assertive engagement with the imperative of justification following the ruling and completed, as Chenou and Radu put it, the company's 'full transformation from a messenger into an editor of the world's information' (2019, 97).
In sum, the challenge of Costeja and the AEDP presented an alternative vision
of the field that highlighted private responsibility and principles embedded in
the order of fairness. Google failed to engage comprehensively with the imper-
ative of justification. The CJEU, in keeping with parallel discursive developments
in the EU legal order, emphasized the primacy of the human rights of privacy
and data protection in relation to business interests. While initial reactions were
hesitant, Google proactively and comprehensively complied with the ruling. It
was obvious that the right to be forgotten might have effects beyond the EU,
but the specific global relevance of the ruling only manifested in 2015. A new
jurisdictional conflict emerged when Google and the French data protection
authority CNIL voiced diverging opinions regarding the geographical scope of
the law.

8.3 Territoriality and the Internet: Google v. CNIL

Neither the 2014 judgment nor the GDPR specified the geographical scope of
delisting or removal under the right to be forgotten. In 2015, the French data
protection authority contested Google’s interpretation that requests should only
be removed from a specific country domain, such as google.fr or google.de
(CNIL 2015b). Instead, the CNIL argued that the full enforcement of the law
required global delisting (i.e., also from google.com), thus extending its juris-
dictional claim over data globally. Google agreed to apply the right to be for-
gotten to domains outside Europe if the IP address indicated a user’s location
in the EU (Fleischer 2016), but the company appealed against the CNIL’s fine
of €100,000. The case was eventually referred to the CJEU in July 2017 (Google
v. CNIL 2019).
It is important to note that the legal context changed during the dispute. The
GDPR, which codifies the right to be forgotten (GDPR 2016, Art. 17), came into force in 2016 and has been applicable since May 2018. Thus, the Court examined the overlapping jurisdictional claims, considering the 1995 Directive and the
2014 ruling as well as the GDPR. Due to the borderless character of the internet,
the demarcation of geographical limitations for the removal or delisting of content
presented the Court with multiple normative and legal contradictions and chal-
lenges. In another landmark ruling in late 2019 (Google v. CNIL 2019), the CJEU
found no current obligation under EU law to apply the right to be forgotten glob-
ally but maintained EU-wide enforcement. In Section 8.3.1, I briefly illustrate the
ambiguous roles of data protection authorities as active shapers of or bystanders
in jurisdictional conflicts before illustrating Google’s increased emphasis on the
imperative of (moral) justification.

8.3.1 Public Regulators: CNIL

In its guidelines published shortly after the judgment, the European data protection body 29WP argued that 'de-listing decisions must be implemented in a
way that guarantees the effective and complete protection of these rights and that
EU law cannot be easily circumvented’ (29WP 2014, 9). This opinion constituted
sufficient grounds for the CNIL to contest Google’s limited delisting practices and
request the global removal of links (CNIL 2015a), thereby articulating an exten-
sive jurisdictional claim. The CNIL’s interpretation of the adequate role of data
protection authorities in conflict resolution processes diverges significantly from
the interpretation by the Irish data protection authority outlined in Chapter 4.
While the Irish DPA was criticized for a lack of regulatory momentum by largely
assuming the role of a passive bystander, the French data protection authority was
accused of overreach by assuming the role of an (over)active shaper.
The position of data protection authorities in the field of data governance dif-
fers depending on the legal and historical context of the applicable jurisdiction. For
example, the US Federal Trade Commission (FTC) has strong enforcement rights
in comparatively few areas of general consumer protection, while most DPAs in
the EU have a broader mandate (Dowd 2019). In interviews and public statements,
private actor representatives have problematized ‘populist rhetoric’ (Tech com-
pany employee B, pers. comm. 2019) and ‘harmful protectionism’ (Private sector
representative, pers. comm. 2019) by data protection authorities vis-à-vis US tech
companies. The monopoly position of Google, which holds a more than 90 per
cent market share in Europe, thus makes it a particularly likely ‘target’ of such
efforts of data protection authorities to ‘raise their profile’ (EC official, pers. comm.
2019).
The CNIL and its president since 2011, Isabelle Falque-Pierrotin, have been
active challengers of company practices. For example, in 2019, the data protec-
tion authority made headlines when it imposed a financial penalty of €50 million
against Google due to a lack of compliance with the GDPR, despite the company
headquarters being in Ireland. While the company should, according to the so-
called one-stop-shop mechanism, fall under the jurisdiction of the Irish DPA, the
CNIL argued that ‘In this case, the discussions with the other authorities, in par-
ticular with the Irish DPA, where GOOGLE’s European headquarters are situated,
did not allow to consider that GOOGLE had a main establishment in the Euro-
pean Union’ (CNIL 2019). The data protection authority, therefore, argued that
the one-stop-shop mechanism, which is designed to prevent multiple data pro-
tection authorities from taking deviating decisions, did not apply and claimed
jurisdiction to unilaterally investigate the complaints. A recent judgment by the
CJEU confirmed this approach (Facebook Ireland Ltd 2021).
In the right to be forgotten dispute, the CNIL attempted to assert the legiti-
macy of its jurisdictional claim by pointing to the value of domestic law and the
importance of comprehensive protection. The CNIL suggested the delisting 'must
be effective without restriction for all processing, even if it conflicts with foreign
rights’ (CNIL 2016, 7). It referred to the opinion of the 29WP and, supported by
the Italian, French, and Austrian public authorities, highlighted the importance of
full enforcement based on the marketplace principle (CNIL 2015a). The marketplace principle, which has been codified in the GDPR, had been established in the 2014 case, when the AG, the European Commission, and the CJEU argued
that EU law applies ‘if the establishment is intended to promote and sell, in the
Member State in question, advertising space offered by the search engine’ (CJEU
2014, 2). According to the CNIL, the enforcement of global compliance with the
right to be forgotten therefore does not represent an extraterritorial application
of law but merely constitutes full enforcement of European legislation for firms
acting in the European market (CNIL 2015a). In a blog post, the CNIL president
Isabelle Falque-Pierrotin aimed to respond to significant criticism that emerged
in a response to this jurisdictional claim. She argued that ‘there is no imperialism
in subjecting a foreign company to our rules when it joins our market' (Falque-Pierrotin 2017, my translation).² The juxtaposition of 'our' market and 'foreign'
firms establishes the relevant reference community as European citizens, while the
reference to domestic law links to the order of sovereignty. This justification again
combined principles of the order of sovereignty and the order of fairness, which
reinforces the identity of data protection authorities as domestic bodies that pro-
mote a vision of data governance as local liberalism with strong community norms.
The perception of reference communities is shaped by their mandate to implement
and interpret the law for a certain community rather than the world. The Interna-
tional Conference of Data Protection and Privacy Commissioners (ICDPPC), for
example, has so far refrained from issuing a statement on the right to be forgotten.
To summarize, the data protection authority substantiated justifications by
principles rooted in the order of sovereignty and fairness, which established EU
citizens as the relevant reference community and prioritized its own community's interests over those of other communities.

8.3.2 Google: Following the Imperative of Justification

While the CNIL’s argument had backup from the data protection community,
Google responded to the imperative of justification posed by the CNIL in a
manifestly different manner. In comparison with the first court case, Google
changed its strategy to rely on normative arguments and closer interaction with
local stakeholders (Tech company employee B, pers. comm. 2019). This had

² The original reads: ‘Mais il n’y a pas d’impérialisme à soumettre à nos règles une entreprise
étrangère venue sur notre marché.’

three components: Google more strongly emphasized the fundamental right sta-
tus of freedom of speech (cited in Gibbs 2014; Bodoni 2019); it aimed to establish
its position as a responsible actor in the field; and it relied on justifications in line
with the order of sovereignty. One could argue that the case is different in that it
more immediately affects communities beyond Europe. Yet as strong normative
concerns also existed concerning the right to be forgotten as such, the different
justifications are likely to reflect a more foundational change of strategy.
First, the company tried to rectify what it perceived to be a tilted balance
between freedom of information and privacy in Google Spain (2014). Its Chief
Legal Officer, Kent Walker, said that the proceedings ‘represent a serious assault
on the public’s right to access lawful information’ (K. Walker 2017). The company
also highlighted ‘serious chilling effects’ and pointed to a ‘race to the bottom’ (K.
Walker 2016) regarding freedom of speech as potential consequences. Google’s
chief executive Larry Page was quoted highlighting the problematic consequences
of authoritarian states using the right to be forgotten to legitimize widespread
censorship (Gibbs 2014). In particular, the removal of lawfully published content
was expected to encourage abuse (Google 2014, 12). The justifications cautioned
against a dystopian scenario in the order of fairness. Second, the strategy estab-
lished Google as an actor in conformity with the field’s ordering principles. In
2016, the company emphasized its sacrifices as a human rights defender, arguing
that ‘We have received demands from governments to remove content globally
on various grounds—and we have resisted, even if that has sometimes led to the
blocking of our services’ (K. Walker 2016). Third, Google’s strategy also high-
lighted normative justifications rooted in the order of sovereignty, albeit with an
explicit emphasis on external sovereignty and sovereign equality. Google specif-
ically highlighted the inconsistencies of the CNIL’s approach, which, on the one
hand, strongly emphasized protection of shared norms within constitutive com-
munities but, on the other hand, imposed rules on other communities, thereby
inhibiting other communities’ capacity for self-determination. In a 2016 blog post,
Google’s Kent Walker states, ‘For hundreds of years, it has been an accepted rule of
law that one country should not have the right to impose its rules on the citizens
of other countries … As a company that operates globally, we work hard to respect
these differences’ (K. Walker 2016). The post entitled ‘A Principle That Should Not
Be Forgotten’ established extraterritorial practices as morally deficient and against
customary principles.
Google’s new emphasis on moral and norm-based arguments was successful in
so far as not only a broad coalition of businesses such as Microsoft and media
companies but also NGOs such as the Fondation pour la liberté de la presse,
the Wikimedia Foundation Inc., or the Reporters Committee for Freedom of the
Press backed the company. Being at the centre of the struggle, Google was sud-
denly in the position of a human rights defender. For instance, the amicus brief
by Article 19 and others described a global scope of the right to be forgotten as ‘a
sweeping, extraterritorial de-referencing obligation adopted without regard to the
fundamental right of freedom of expression’ (Article 19 and Others 2017, 20). The
amicus briefs warned of a dystopian scenario in the order of fairness, including
a ‘race to the bottom’ (RCFP and Others 2017, 10) based on censorship and the
restriction of journalism (Article 19 and Others 2017, 4–15), pointing to China
and Russia potentially using similar arguments to restrict free speech. The amicus
briefs thereby similarly challenged the moral hierarchization of freedom of speech
and privacy. In contrast to the interpretation by the European Commission, the
amicus briefs tied the ownership of data to the public rather than the individ-
ual. For example, Article 19 and others suggested that ‘personal information may
equally “belong” to the public, that the public should be able to access it’ (Article 19
and Others 2017, 19, emphasis in original).
The RCFP and others also argued against extraterritoriality and potential violations of territorial sovereignty (RCFP and Others 2017, 4–5). The NGOs pointed
to interpretations by other international or regional courts, such as the Inter-
American Court of Human Rights or the African Court of Human and People’s
Rights (Article 19 and Others 2017, 12–16). This proved worth within the order
of sovereignty but also contested the inward-looking approach of the CNIL, high-
lighting other communities’ interests. While the CNIL’s reference community was
restricted to EU citizens, the NGO briefs emphasized a global reference commu-
nity. Some have argued that a right to be forgotten is, in fact, compatible with other
international standards, as similar proposals have manifested in other jurisdictions
such as Argentina, Brazil, Mexico, Japan, or South Korea (Frosio 2016, 310).
While these briefs echoed and supplemented the justifications brought forward
by Google, it should be noted that there was no explicit acknowledgement of
Google’s position as a human rights defender. Rather, the briefs contributed to
a recognition of the same principles and similar wording. While there was no
explicit support for Google, there was also no significant criticism or problemati-
zation of Google’s role in these accounts. In contrast, some academic publications
on the subject have criticized Google’s self-portrayal as impartial and neutral
(Powles 2015a).
In sum, both Google and the amicus briefs pointed to inconsistencies in the
moral order established through the right to be forgotten and articulated in
the CNIL’s jurisdictional claims. They highlighted the problems arising from
the application of domestic legal principles beyond territorial jurisdictions and
pointed to the potential undermining effect on human rights, particularly freedom
of speech.

8.3.3 Outcome: The Judgment and the Way Forward

Google’s strategy has at least partly succeeded. While the right to be forgotten
has been strengthened with the GDPR (2016, Art. 17), the CJEU in its 2019
judgment generally followed the opinions by Google and the supporting briefs.
The judgment stated that global enforcement of the right to be forgotten is not
currently prescribed by EU law. The CJEU pointed to domestic variation in bal-
ancing acts between fundamental rights and implicitly raised the possibility of
national regulation. More specifically, the Court noted that ‘[while] EU law does
not currently require that the de-referencing granted concern all versions of the
search engine in question, it also does not prohibit such a practice’ (Google v.
CNIL 2019, para. 72). This statement implicitly suggested that future domestic or
EU-level legislation would be in line with EU law, leaving significant wiggle room
for legislators. Besides, while both argued for EU-wide applicability of the right to
be forgotten, the AG considered geo-blocking as sufficient (Szpunar 2019, paras.
70–73), while the CJEU did not specify how delisting should be implemented.
strengthened a vision of data governance that is clustered around the sovereign
rights of communities. It has strengthened the idea that, rather than a globally
uniform approach to data governance, regulation should allow for normative
differences. The Court specifically emphasized that:

the balance between the right to privacy and the protection of personal data, on
the one hand, and the freedom of information of internet users, on the other,
is likely to vary significantly around the world … the interest of the public in
accessing information may, even within the Union, vary from one Member State
to another.
(Google v. CNIL 2019, paras. 61–7)

The Court emphasized the responsibilities and rights of the member states ‘to pro-
vide for the exemptions and derogations necessary to reconcile those rights with,
inter alia, the freedom of information’ (Google v. CNIL 2019, para. 67). There-
fore, the judgment might provide legitimation to strengthen privacy or freedom
of access rights depending on the specific balancing acts of domestic legislators or
data protection authorities (Samonte 2020). The judgment establishes that moral
hierarchizations in the order of fairness must be negotiated in the context of spe-
cific constitutive communities. It also re-emphasizes that both economic interest
and the public interest are secondary to the fundamental rights of privacy and data
protection (Google v. CNIL 2019, para. 45).
In conclusion, while the court case was largely perceived as a victory for Google
in the media (e.g. Kelion 2019), it might challenge the transnational claims of
jurisdiction for companies in the long term. The Court did not follow Google in
establishing a global scope of the right to be forgotten as morally deficient. The
Court thus upheld the emphasis on ‘effective and complete protection’ (Google
Spain 2014, para. 58) articulated in 2014, which some have criticized for the
construction of data protection as an ‘almost super-right’ (Powles 2015a, 592).
Rather than prohibiting the global scope of the right to be forgotten, the judg-
ment established the normative desirability of varying balancing acts which might
favour an increasing fragmentation of rules in the long term. This may also have
implications for other areas of content governance such as hate speech, thereby
broadening the administrative load for Google.

8.4 Changing Public–Private Relations

Sections 8.1–8.3 have illustrated the increasingly normative character of Google’s jurisdictional claims. In this section, I situate these justificatory strategies in the
broader context of the field and illustrate the implications for the conflict res-
olution processes between public and private actors. The book suggests that
private actors are increasingly and proactively trying to impose recognition of
their vision of the world and to establish themselves as central actors in the field who actively shape political and legal processes. They thereby strengthen their credibility in making jurisdictional claims of legitimate control over data, as was also
emphasized in Chapter 7 with regard to Microsoft.

8.4.1 Private Responsibilities to Protect

Google’s strategy for establishing itself as an increasingly responsible actor in the field by acknowledging the implications of its practices for human rights can be
observed with regard to two aspects. First, there is a shifting emphasis towards
increasingly normative arguments by private tech companies. The decision to
make the 2014 case about the legal rather than the normative dimensions of the
conflict was not for a lack of recognition of its implications per se. Google’s Chief
Privacy Counsel Peter Fleischer had warned about the right to be forgotten in
a private blog post in 2012 (Fleischer 2012). Yet Google’s argument in 2014 was
largely based on the absence of jurisdiction and limited responsibility due to its
position as a search engine that only processes data created by others. When this
strategy failed, Google increasingly tried to depict the search results as a pub-
lic space in need of protection from public encroachment. Although algorithms
heavily shape and restrict search results, particularly in line with the interests of
advertisers, this debate established Google as a defender of this space. This aims
to depoliticize the activities of search engines as curators of content, while at the
same time politicizing removal requests. There are clear information asymmetries inherent in (the business model of) search engines. Notably, in an
academic paper that presented the design of the search engine in 1998, Google
co-founders Sergey Brin and Larry Page themselves were wary of the prospects
of advertisement-driven search engines. They pointed to their expectation that
‘advertising-funded search engines will be inherently biased towards the advertis-
ers and away from the needs of the consumers’ (Brin and Page 1998, Appendix I).
However, by emphasizing its commitment to complying quickly and collaboratively with the ruling, Google largely avoided a broader underlying debate on
the internet as a public space of dialogue. By pushing a ‘digital binary’ (Powles
2015a, 590) of privacy versus freedom of information, Google entrenched its
image as a gatekeeper for collective memory. This obscures the contingency and
interdependency of different rights.
It is also important to note that this strategy stands in contrast to how Google
acts in the case of copyright: requests concerning copyright infringements have
risen to 5.6 billion URLs (Google 2022b). In contrast, the number of take-down
requests concerning the right to be forgotten amounts to over 1.2 million requests
concerning about 4.8 million URLs, with about 48.6 per cent of URLs actually
delisted as of February 2022 (Google 2022a). Removals of copyright-infringing
content are based on the US Digital Millennium Copyright Act (DMCA), which
applies to domains worldwide. The DMCA is based on the World Intellectual
Property Organization (WIPO) Copyright Treaty but expands beyond its policy
requirements, for example regarding controversial anti-circumvention provisions
(Besek 2004, 467–70). Google applies these rules globally, even though some coun-
tries have weaker intellectual property and copyright laws than the US (Office
of the United States Trade Representative 2017). Thus, tech companies make
increasingly normative arguments but only in selected policy areas.
Second, and as in the Microsoft case, companies not only appeal to corpo-
rate social responsibility (CSR) standards but also aim to establish themselves as
global political players by coupling normative arguments with fundamental polit-
ical principles such as the rules of the international order, including sovereign
equality or territoriality. While Microsoft explicitly promoted a ‘Digital Geneva
Convention’ (B. Smith 2017a), both Google and Microsoft heavily problema-
tize extraterritorial practices. This contrasts with the visions normally invoked
by private actors, which typically emphasize global interconnectedness and the
borderless nature of the internet (IA 2018). As expected by the literature (Han-
rieder 2016), actors seem to adapt to specific audiences. For example, companies
may adapt to domestic and international law when dealing with public actors
such as data protection authorities, thereby responding to the intergovernmen-
tal character of the field. As in the Microsoft case, the implicit and sometimes
explicit challenge of public actors again represents the idea of companies as ‘Digital
Switzerlands’ (Eichensehr 2019) that scrutinize and challenge state practices.
This discursive construction is embedded in a broader strategy to improve the
company’s position in the international community, also on the basis of mate-
rial resources. For instance, Google’s lobbying spending increased by 240 per cent
between 2014 and 2017 (Kergueno 2017), with reports stating over 280 meetings
with high-level EU officials since 2014, which follows a general trend of increased
lobbying activity by tech companies in Brussels (Lobbyfacts 2022). In a media
interview, Google’s chief executive Larry Page argued, ‘That’s one of the things
we’ve taken from this, that we’re starting the process of really going and talking to
people’ (cited in Gibbs 2014). The creation of an Advisory Council under the pre-
text of seeking advice also illustrates the complex nature of Google’s strategy in the
conflict resolution process. While the council included reputable academics from
the field likely to act impartially, the members were selected by Google and the
information they received was limited to publicly available information (Advisory
Council to Google 2015). It is notable that Google did not engage in coordi-
nated efforts with other search engines but chose to unilaterally shape the system
and thus set a precedent for others. In the dispute with the CNIL, Google was
well equipped to respond to challenges to its implementation decision and again,
through collaborative compliance, succeeded in establishing itself as a respon-
sible international actor in contrast to the CNIL’s overbroad claim. Powles has
highlighted that the response in the first case ‘co-opt[ed] significant elements of
the media, civil society, governments, and institutions in promulgating its own
agenda’ (2015a, 583–4), particularly as publishers and media companies are also
less likely to be interested in a mechanism that might contribute to the delisting
or removal of articles. Other tech companies have adopted similar strategies to
resolve conflicts in the field. To improve removal processes of harmful or illegal
content, Facebook has established the Facebook Oversight Board, which some
have criticized as a move largely for the sake of appearance that nevertheless forces
governments to be ‘passive rule-takers’ (Haggart 2020, 323). Companies seem
increasingly well equipped to respond to rising demands of responsibility by steer-
ing them in specific directions and exercising control over their implementation.
As the Advisory Council mainly seemed to aim at legitimizing its practices rather
than controlling them (Chenou and Radu 2019, 89), the capacity to shape the rules
of implementing the judgment thus ultimately contributed to a reinforced position
for Google in the field.
In sum, the examples outlined above are evidence of how normative arguments
and an emphasis on compliance as well as material resources formed part of a
more comprehensive strategy to establish Google’s identity as a responsible actor.
As I demonstrate in Chapters 7 and 9, this seems to speak to a more general phe-
nomenon regarding private tech companies that tend to act as norm entrepreneurs
to enter the political and diplomatic stage (Gorwa and Peez 2018).

8.4.2 Friendly Takeover?

The responsibility of tech companies has been the subject of discussion in recent
years. It is important to note that this strategy also responds to rising demands
for companies to recognize their influence, as pointed out by the EU Justice Com-
missioner Martine Reicherts, who suggested that ‘handling citizens’ personal data
brings huge economic benefits to them [companies]. It also brings responsibility’
(2014). Indeed, a lack of responsibility may facilitate hate speech and abuse,
as demonstrated by the data abuse in the Cambridge Analytica scandal or the
violence incited on Facebook in Myanmar (Stevenson 2018). Nevertheless, the
unique capacities of tech companies in drawing on both material and emotional
dependency (Culpepper and Thelen 2019), particularly compared with the public
legal system, may have adverse consequences. Frosio proposes that this process
‘might be pushing an amorphous notion of responsibility that incentivizes inter-
mediaries’ self-intervention to police allegedly infringing activities on the Internet’
(2018, 3). The asymmetry of financial resources and administrative constraints
reinforces a dynamic whereby public actors accept tech companies’ implementa-
tion and rule shaping. Google’s quick implementation after the ruling confronted
data protection authorities with a functioning system that was largely effective but
designed according to the company’s preferences.
Thus, Google not only shaped the conflict resolution process but now also weighs
individuals' privacy against freedom of information rights. A publication by Google-
affiliated researchers highlighted ‘the challenges of implementing privacy regula-
tions in practice, both in terms of the thousands of hours of human review required
and the fundamentally challenging process of weighing individual privacy against
access to information’ (Bertram et al. 2019, 971). In another recent case (GC and
Others v. CNIL and Others 2019), the outsourcing of fundamental rights bal-
ancing to private companies is even more explicitly articulated. With regard to
sensitive data, including political opinions, religious beliefs, or past convictions,
the CJEU submitted that Google needs to weigh the ‘public interest’ of particu-
larly sensitive data against the rights of individuals who might be strongly affected
by the decision. This sparked concerns about the empowering effect on private
companies (Lynskey 2015b, 532), with Google possibly becoming a private
fundamental rights 'adjudicator' (Pirkova and Massé 2019). A 2015 Guardian article
stated, ‘Google has acted as judge, jury and executioner in the wake of Europe’s
right to be forgotten ruling’ (Powles and Chaparro 2015). In the past, Google has
been explicitly criticized for a lack of transparency concerning its assessment of
fundamental rights under the right to be forgotten, for example by a group of over
eighty academics in 2015 criticizing a ‘jurisprudence built in the dark’ (Kiss 2015).
The debate about the power of platforms to decide on human rights is rele-
vant beyond data governance and relates to illegal or harmful content
according to recent legislation, such as the German Network Enforcement Act
or the Digital Services Act. A recent case regarding hate speech on Facebook
(Glawischnig-Piesczek v. Facebook Ireland 2019), or the UK Online Harms White
Paper (UK Home Office 2019) have demonstrated more comprehensive respon-
sibilities of providers. Yet under current legislation, content removals would not
be feasible for public regulators due to a lack of financial resources. With regard to
law enforcement cooperation concerning electronic evidence, Google has decided
to charge fees for data requests (see Chapter 7).
In conclusion, Google has adopted a multifaceted strategy in dealing with juris-
dictional conflicts over the right to be forgotten. The company has increasingly
emphasized a moral agenda of safeguarding human rights such as freedom of
expression and access to information and highlighted its role as a responsible actor
and collaborative complier. This has enabled the company to significantly shape
the interpretation of the CJEU ruling and the law to its own advantage. While this
strategy has to some extent reduced the potential for conflict, it has, at the same
time, contributed to the entrenchment of public–private asymmetries in the field,
as Google occupies the role of an impartial adjudicator and guardian of public
space online.

8.4.3 Challenges from Within

I argue that the successful construction of private actors as responsible actors in
the field—and thus their dominance in conflict resolution processes—is facili-
tated in within-order rather than across-order conflicts. By referring to the same
higher common principles as public actors, such as fairness or sovereignty, but
contesting the current moral hierarchization, that is, the prioritization of certain
principles, objects, or values over others in the pursuit of this common princi-
ple, private companies expand their normative repertoires. They add legitimacy to
their claims by adopting the reference to such principles and, like ‘Digital Switzer-
lands’ (Eichensehr 2019), entrench their position as impartial and responsible
rather than profit-seeking. To a more significant extent than other examples, this
conflict is characterized by a strong within-order conflict, that is, jurisdictional
claims are based on the same value order.
This within-order character of the conflict is most obvious with regard to the
clashing demands of privacy and data protection, on the one hand, and freedom of
information, expression, and access and free media, on the other hand. These all
represent principles linked to the order of fairness. Nevertheless, the conflict also
revolves around the normative meaning and interpretation of sovereignty in an
increasingly interconnected world. Competing justifications demonstrate a shared
understanding of the importance of higher normative principles. Yet they expose
different understandings of the appropriate interpretation of how to achieve them.
The jurisdictional claim acknowledges the prevailing constellation of orders in the
field.
The contestation, as Boltanski and Thévenot put it, constitutes a ‘demand for a
readjustment of worths’ (2006, 133). Thus, actors appeal to a shared understanding
of the worth of fairness and/or sovereignty. However, they disagree on whether
the promotion of privacy or freedom of speech is more important in the pursuit
of fairness or whether the pursuit of one's own community's interests and values trumps
respect for other communities’ interests and values in the pursuit of sovereignty.
Theoretically, this also demonstrates that, far from being static, the hierarchization
within value orders is dynamic.
To prevail in conflict resolution processes, the successful imposition of a specific
conceptualization of governance objects and principles is important. For example,
the amicus brief by the Reporters Committee for Freedom of the Press (RCFP)
and others criticized the fact that ‘by consistently framing freedoms and rights
negatively affected by the right of delisting as “interests” … the balancing con-
ducted by the Court of Justice would be tilted towards the right to data protection
from the very beginning’ (RCFP and Others 2017, 13). The amicus brief by Arti-
cle 19 and others reverses this logic by speaking about ‘data protection interests’
(Article 19 and Others 2017, 12) that need to be weighed against the fundamental
rights of users in other jurisdictions. In the order of fairness, human rights and
binding human rights agreements are considered objects of worth. Thus, par-
ties to conflicts aim to reduce the principles referred to by the other party to
an ‘interest’ which corresponds to a lower position within the established moral
hierarchy.
In contrast, with regard to jurisdictional claims drawing on the order of
sovereignty, actors highlight international law and the interpretation by regional
and international courts as well as domestic law to prove how the imposition
of norms on other communities is morally deficient or worthy. By problematiz-
ing extraterritoriality and constructing it as a shared problem, both the CNIL
and Google refer to similar principles, but the interpretation of what constitutes
extraterritorial action differs. The CNIL perceives a responsibility to protect Euro-
pean citizens’ fundamental rights and aims to fully enforce the 2014 judgment.
However, the borderless nature of the internet complicates the demanded ‘effective
and complete’ (Google Spain 2014, para. 58) protection of citizens. Thus, Google
and others perceive the extraterritorial effects of the CNIL’s jurisdictional claim as
a problematic imposition of foreign norms in violation of principles of sovereignty
and therefore conceptualize the CNIL’s demand as an extraterritorial practice with
problematic consequences for the internet as a global network.
While Boltanski (2011, 40) considers such conflicts as ‘reformist’ compared with
the more ‘radical’ conflicts that challenge the existing order based on alternative
principles of worth, the example illustrates that a critique of the current balancing
between different principles may contribute to normative change. It might enable
private actors to emulate public justifications (and potentially vice versa) and beat
public actors at their own game, thus prevailing in conflict resolution processes
by assuming the positively connotated characteristics of the identity of the other
party to the conflict, in this case, responsibility and impartiality.

8.5 Conclusion

In this chapter, I have discussed the introduction and implementation of the ‘right
to be forgotten’ through a jurisdictional conflict initially between Google and the
Spanish data protection authority AEPD and later the French CNIL. The first
part of the conflict culminated in a 2014 landmark judgment by the CJEU, which
established both the responsibility of search engines as data controllers and a hier-
archy of business interests and fundamental rights protections. The second part
of the conflict focused on diverging interpretations of the geographical scope of
these protections and the specific balance between different fundamental rights.
The conflict between Google and the CNIL was again resolved by the CJEU,
which emphasized meaning-making processes at the community level. However,
as in the first case, this conflict resolution strategy opened up a new space for
jurisdictional conflicts by shifting the negotiation processes to the community
level.
The chapter has explored the division of public and private responsibilities in
the enactment and enforcement of fundamental rights, arguing that actors use dif-
ferent resources and strategies to prevail. The normative dimension of the conflict
became particularly prominent in the second phase of the jurisdictional conflict.
Google’s initial strategy of deflecting responsibility, however, proved unsuccess-
ful. In contrast, the emphasis on, first, the normative binary of either freedom of
speech or privacy that the company pushed after the ruling and in the CNIL case
turned out to be a more successful strategy. The normative fit with the agenda of
NGOs and media companies was reinforced by the emphasis on, secondly, the
balance between sovereignty and self-determination and the imposition of norms
on others. The chapter has suggested that Google’s strategy of drawing on moral
arguments while emphasizing its identity as a responsible global actor has been
similarly employed by other tech companies, such as Microsoft (see Chapter 7)
and, more recently, Facebook (Haggart 2020). By drawing on moral arguments,
tech companies seem to attempt to shape evaluative criteria and entrench their
positions in the field as political players. In particular, the fact that Google was able
to emulate public justification strategies and draw on NGO and media support in
its justifications contributed to its prevailing in the second phase of the conflict,
as the company could prove worth within the order of fairness. This should not
obscure company interests in shaping policies according to their preferences, par-
ticularly with regard to the construction of Google as a guardian of public space
online. Zuboff (2018, 8) suggests that companies possess a unique ability to shape
human behaviour and future conduct. Further debate is needed about the priva-
tized, biased, and opaque nature of the decision-making that constructs what is
visible for users (see also Chenou and Radu 2019; Powles 2015a).

In light of intersecting dynamics that may contribute to mobilization across
contexts, for example concerning anti-feminist content (Rothermel 2020) or con-
servative radicalization (Kydd 2021), questions of responsibility become increas-
ingly urgent. Two central European judgments show that both users and platforms
are increasingly held accountable for rights violations that occur on their sites. In
Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v. Wirtschaft-
sakademie Schleswig-Holstein GmbH (Unabhängiges Landeszentrum für Daten-
schutz 2018), the CJEU found the Wirtschaftsakademie and Facebook to be joint
controllers with regard to the processing of personal data. The Court argued for
the lawfulness of a data protection fine issued to the Wirtschaftsakademie, as it did
not inform visitors about data processing through cookies on the Facebook fan
page. Relying on this shared notion of responsibility, in Sanchez v. France (2021),
the European Court of Human Rights (ECHR) similarly upheld a fine issued to a
politician who failed to remove unlawful and discriminatory comments from his
public Facebook page.
At the same time, public actors should similarly recognize their responsibil-
ity when enforcing territorial rules in a space with extremely fluid borders. The
CNIL is unlikely to succeed at the unilateral enforcement of local legislation glob-
ally without coordination with other data protection bodies. However, on the
basis of a survey of data protection authorities in Europe regarding ICT, Raab
and Szekely conclude that ‘the present level of expertise available in the DPAs’
offices is either not satisfactory or at least needs to be further enhanced’ (2017,
426), thus also highlighting broader underlying challenges. The CNIL’s jurisdic-
tional claim to control over data globally failed, as the data protection authority
did not recognize the interests of stakeholders in reference communities beyond
the EU and narrowly interpreted individual rights. Unless there is a more com-
prehensive debate about the function of information and access to information in
the information society, conflicts will likely increase as diverse community norms
clash. While the CJEU judgment has contributed to a stricter focus on local com-
munities, this brings the danger of widely diverging interpretations, as data
protection authorities have shown systematically different approaches in weighing
similar decisions. This lack of normative clarity is matched by a lack of insti-
tutional certainty, as the increased responsibility has also been accompanied by
an increase in delegated public tasks, which may undermine democratic public
processes. Thus, the CJEU’s appeal for stronger inter-European coordination is
a step in the right direction but insufficient to strengthen the underlying norma-
tive debate that should more specifically engage with its global implications. The
attempt by the Libyan Telecom and Technology to make vb.ly, a URL shortening
service, comply with its national Sharia-based law (B. Johnson et al. 2010) is just
one example that demonstrates the broader relevance of this debate globally.
9
Conclusion
Normative Visions across Jurisdictional Conflicts

After analysing five cases of jurisdictional conflict in data governance, this chapter
provides a comparative assessment embedding the empirical findings in a broader
debate on the normative trajectories of digital governance. The chapter draws on
the insights developed in the case studies to come back to the central question of
the book: how do actors respond to or resolve jurisdictional conflicts in data gov-
ernance? To provide insights into responses and strategies across cases, I examine
how value orders are discursively combined and linked to specific calls for action.
Do actors tend to endorse a cosmopolitan vision of data governance by relating
an emphasis on fairness to the global nature of problems? Or do they emphasize
neoliberal market globalism that connects the order of production with global-
ist claims for efficiency? And what implications does that have for transnational
data governance at large? In this chapter, I look beyond the transatlantic context
to discuss the central findings and implications in light of rising regulatory ambi-
tions, particularly those of China, as well as developments in other contexts such
as Brazil or India.
Chapters 1–8 demonstrated the multidimensional character of jurisdictional
conflicts, which are characterized by the interplay of legal, political, institutional,
and normative dimensions. Throughout the case studies, I focused particularly
on so far understudied normative ordering processes that draw on distinct value
orders. These orders express diverging underlying conceptualizations of data
which are linked to the pursuit of substantive common goods such as security and
safety, human rights, or economic progress as well as particular reference com-
munities and evaluative principles. In addition, there are references to procedural
common goods that specify whether data governance should be exercised to pro-
tect specific constitutive communities or whether it demands coordinated global
responses. Mediated by positional differences and perceived thinkable and sayable
options, these ordering processes have contributed to different conflict resolution
processes. While some jurisdictional claims have been stabilized through insti-
tutional agreements and the introduction of safeguards, many have been subject
to continuous contestation. In other words, I have argued that conflict resolution
processes are shaped by the continuous struggle to impose normative and eval-
uative principles that comprise specific conceptualizations of the common good.

Data Governance. Anke Sophia Obendiek, Oxford University Press.


© Anke Sophia Obendiek (2023). DOI: 10.1093/oso/9780192870193.003.0009
The book has illustrated that while actors share the notion that data governance is
meaningful in pursuit of common societal goals, they have different conceptions
of both what data actually are and what common societal goals should be achieved.
In the remainder of the chapter, I draw on the insights from the case studies
to examine how conflicts have been resolved. First, I discuss to what extent we
may observe specific patterns in justifications. I identify four distinct visions that
actors loosely follow in their justifications. Second, I assess the results compara-
tively and discuss the implications for the resolution of jurisdictional conflicts.
Third, Section 9.3 discusses potential normative and policy implications globally
and highlights avenues for further research.

9.1 Linking Justificatory Practices across Conflicts

Throughout this book, I have argued that conflict resolution processes must be
understood as multidimensional. Actors’ capability to engage with the impera-
tive of justification, their position, and their subscription to the illusio of the field
(Bourdieu 1991, 179) all shape their ability to create normative fit between their
jurisdictional claims and the underlying principles in the field. This, in turn, cre-
ates winners and losers. In Chapter 2, I argued that, as actors ‘seek to satisfy several
moral concerns’ (Hanrieder 2016, 412), they are expected to draw on multiple
orders when interacting with their audience(s). In this section, I discuss the inter-
pretative patterns in data governance as compromises between different orders.
I zoom out of specific conflict situations to analyse how actors use justifications
across conflicts and combine orders. The discussion of very different cases through
a common conceptual lens makes it possible to identify the stable character of
conflicts in the field. I argue that these arguments may be understood through
four distinct visions of data governance. To assess how actors draw eclectically
on distinct value orders, I combine insights from the qualitative analysis with a
more quantitative assessment of code co-occurrences to develop an empirically
grounded typology of these four distinct visions.
These visions outline both the meaning of data and data governance and pre-
scriptions for its adequate conduct that follow from this assessment. These visions
may find expression when actors link different values in their justifications. For
example, actors frequently rely on composite principles or objects which are
expected to stabilize agreement (Boltanski and Thévenot 2006, 279). They create
compatibility between principles and are broadly formulated to satisfy diverging
demands. For example, the idea of cross-border cooperation between supervisory
and enforcement authorities appeals to both the order of fairness and the order of
globalism. Actors widely employ such composite principles as ‘cross-border secu-
rity cooperation’, ‘free flow of data/information’, or the ‘protection of domestic
citizens’.
As outlined in Chapter 2, I draw on a code co-occurrence analysis of the jus-
tifications employed in jurisdictional conflicts to find out how actors combine
different orders. The co-occurrence of codes assigned to these justifications indi-
cates that actors normalize trade-offs and stabilize compromises between these
orders. The visualization in Figure 9.1 is based on the analysis of 4,385 individual
codings.
The visualization of code co-occurrences offers a first impression of how justifi-
cations for distinct orders intersect across conflicts. It shows strong links between
the order of fairness and the order of sovereignty and similarly robust links
between the order of fairness and the order of globalism. In addition, there are
noteworthy linkages between the order of security and the order of globalism.
The order of production most strongly links to the order of globalism but also
has links to the order of fairness. The high number of references to the order of
fairness is not surprising, since actors frequently invoke references to human rights
to justify political conduct, such as restrictions on sovereignty, or security policies
(Buchanan 2008). This demands caution in the interpretation of the high absolute

[Figure: network diagram of co-occurrences among the five value orders (Fairness,
1,761 codings; Globalism, 796; Sovereignty, 777; Security, 531; Production, 520),
with edge weights indicating the number of code co-occurrences.]
Fig. 9.1 Visualization of code co-occurrence across cases


numbers, while the relative strength of linkages between orders can still give an
indication of discursive connections.¹ Combining the code co-occurrences with
insights into the qualitative case studies, I argue that we can distinguish between
four distinct visions of data governance (see also Obendiek 2022).
These visions not only represent compromises between orders but indicate the
orders’ meaning in use. In invoking distinct orders, actors link their appeals to
higher common principles with specific calls for action. These create comprehen-
sive and distinct ‘vision[s] of the world’ (Bourdieu 1991, 159) that define what data
governance is and what it should be. They designate community conceptions and
conceptualize data as a governable object. I describe these stabilizations and mate-
rializations of data as ‘epistemic objects’ (Knorr Cetina 2005, 183) through four
metaphors. For example, a local liberal vision of data governance puts forward a
conceptualization of data as a ‘body part’ attached to the individual and in need
of significant protection. These metaphors represent only a reductive understand-
ing of the characteristics of data but have significant evocative power. They show
us how actors think about data and to what extent they foreground some charac-
teristics rather than others. This gives us important insights into what actors are
likely to consider as acceptable and unacceptable policy choices. These visions also
comprise calls for action. They not only specify the abstract goals of data gover-
nance but also define its means. These means encompass high degrees of protective
measures by domestic actors as well as looser governance structures at the global
level. In sum, these visions offer a comprehensive understanding of the current and
future status of data governance. They resemble preliminary ideal types in need of
further (quantitative) scrutiny and can form the starting point of future research.
The visions give a general indication of how actors draw on specific combinations
of principles of worth and thereby create specific visions of the social world. How
do these visions of the social world link to imaginaries of the global order more
comprehensively?
As outlined in Chapter 2, I contextualize these visions through selective com-
parison with existing accounts of globalization ideologies (e.g. de Wilde 2019; de
Wilde et al. 2019; Steger 2013; Steger and Wilson 2012; Zürn and de Wilde 2016).
As reference points, I use the ideologies identified by de Wilde (2019). Through
a quantitative analysis of debates in contested policy areas in domestic and inter-
national fora between 2004 and 2011, de Wilde (2019) found empirical support
for liberalism, cosmopolitanism, communitarianism, and statism. He identifies
four categories: (i) the permeability of borders, (ii) the global or local outlook
concerning the allocation of authority, the perceived scope of challenges, and the
collective identity of the reference community, (iii) justifications, and (iv) moral
foundations. Element (ii), the global or local nature of community, as well as the
conceptualization of the problem, addresses the procedural dimension of worth.
The identification of justifications (iii) and moral foundations (iv) resonates with
the abovementioned substantive dimension of worth. While the former speaks to
the community whose interests or values the claim (supposedly) represents, the
latter concerns the realization and reasoning of a specific goal.² I outline these
visions in Section 9.1.1; an overview is provided in Table 9.1.

¹ In Appendix 2.2, I exclude general references to safeguards. Overall, linkages decrease considerably in absolute numbers, but the analysis offers similar results, with the exception of the connection between security and globalism, which becomes relatively stronger.

Table 9.1 Visions of data governance

Local liberalism:
  Object of governance: data as a body part
  Community: local; public
  Restrictions on the use of data: high
  Value orders: fairness, sovereignty
  Justifications: individual rights, community rights/norms, justice, sovereignty
  Actors: data protection authorities, NGOs, CJEU

Digital economy:
  Object of governance: data as oil
  Community: global; private
  Restrictions on the use of data: low
  Value orders: production, globalism, fairness
  Justifications: economic progress, open internet, efficiency, innovation, trust
  Actors: private companies, EC

Security partnership:
  Object of governance: data as a tool
  Community: global; public and private
  Restrictions on the use of data: low
  Value orders: security, globalism
  Justifications: safety, fight against impunity, counterterrorism
  Actors: law enforcement, interior ministries

Global cooperation:
  Object of governance: data as water
  Community: global; public
  Restrictions on the use of data: medium
  Value orders: fairness, globalism
  Justifications: universal human rights, solidarity, freedom, strong international institutions, independent enforcement
  Actors: UN Special Rapporteur, multistakeholder initiatives, US Department of Commerce

Adapted from Obendiek 2022.

² The permeability of borders is less relevant in the context of the internet as a global structure with
a high given permeability and was not considered. In view of rising control over the domestic conduct
of the internet by countries such as China, however, this dimension will benefit from further research.

9.1.1 Local Liberalism

First, I suggest a vision of local liberalism in data governance characterized by
a focus on constitutive communities that foregrounds human and fundamental
rights, equality, proportionality, and necessity. The vision is based on moral con-
ceptions of justice and fairness that link to cultural and political articulations
of liberal values as conveyed by the European Parliament (2014a). It links to
the order of fairness and the order of sovereignty. The vision represents strong
public authority with precise and enforceable rules, as embodied by data pro-
tection authorities (29WP and Working Party on Police and Justice 2009; EDPS
2007). Proponents employ references to domestic law and case law to prove
worth (e.g. noyb 2020). As in de Wilde’s (2019) account of liberalism, this is
associated with a local rather than a global outlook and is linked to justifica-
tions related to sovereignty. This may be rooted in an ideational legacy from
the predominantly domestic or local character of laws and enforcement bod-
ies such as data protection authorities. While challenged by a ‘global internet’
discourse (Mansell 2011), sovereignty concerns still seem to constitute an impor-
tant justificatory strategy. The idea of a constitutive community, particularly of
EU citizens, alludes to shared characteristics with the globalization ideology of
communitarianism.
This manifests in two ways. On the one hand, this vision of data governance
articulates a fear of the imposition of norms or rules by others. The analysis
indicates in particular conceptions of worth that antagonize big private tech
companies, which representatives describe as ‘discrimination’ (Private company
employee B, pers. comm. 2019) or ‘harmful protectionism’ (Private sector repre-
sentative, pers. comm. 2019). The current discourse about potential encroachment
by China in the context of 5G and Huawei (Cartwright 2020) further demon-
strates how this notion is closely entangled with the status of these companies
as ‘foreign’, sometimes even evoking fears of private colonialism (Couldry and
Mejias 2019).
On the other hand, public authorities attempt to reproduce their specific local
norms on a global scale. The strong focus on the local community justifies the
extraterritorial exercise of authority to protect the community. Yet this approach
creates tensions with field-inherent restrictions on sovereignty. In light of the fre-
quency and ease of transnational data transfers, the invocation of a distinctly
local liberalism seems to exacerbate conflicts (see Kuner 2015). For instance, the
attempt by the French data protection authority to introduce a global scope of
the right to be forgotten explicitly disregarded competing local norms to protect
the full enforcement of European rights (Falque-Pierrotin 2017). Somewhat
surprisingly, NGOs frequently invoke this vision. Interviewees pointed to
the importance of local or European networks rather than transnational advo-
cacy strategies (Tarrow 2005; NGO representative A, pers. comm. 2018, NGO
representative B, pers. comm. 2019). This indicates the importance of the (domes-
tic) legal system for proponents of this vision, particularly regarding strategies such
as strategic litigation.

9.1.2 Digital Economy

Second, another vision of data governance I call digital economy is based on a con-
stellation of principles of the orders of fairness, globalism, and, predominantly,
production. It constructs data governance as a transnational space of commerce
beneficial for prosperity. This vision does not resonate immediately with any of
the identified globalization ideologies. While Zürn and de Wilde (2016) identify
a potential connection between cosmopolitan principles and neoliberalism, de
Wilde’s (2019) empirical investigation does not support this link. In data gover-
nance, this constellation of orders draws on principles such as trust to strengthen
a community feeling between customers and private companies. For instance, the
European Commission argues for strengthening legal frameworks on the assump-
tion that ‘Consumer trust in EU and third country operators will fuel and thus
benefit the European and global digital economy’ (EC 2016c, 7). This particular
compromise emphasizes the 'free flow of data/information' as inherently valuable
in the pursuit of the common good of mankind, and it values innovation. Data
are perceived as ‘raw material’ (Zuboff 2018, 8) that can be processed and mone-
tized. Freedom of enterprise and legal certainty for companies form cornerstones.
As I discussed in Chapters 7 and 8 with regard to Microsoft and Google and as
indicated by the references to fairness, there is an increased demand for private
responsibility. For example, the European Commission emphasizes the linkage
between data protection and strengthening the digital economy (Reicherts 2014),
particularly through notions of trust. The notion of responsibility is increasingly
relevant for private companies in view of the recent ‘techlash’ (Scott 2019). For a
long time, tech companies’ significant position in the field was largely secured by
their accumulation of capital in the form of control over vast amounts of data. This
was embedded in a discourse of ‘solutionism’, that is, the suggestion that this accu-
mulation of data contributes to solving major problems of humanity (Nachtwey
and Seidl 2020). Now, companies are increasingly trying to support this status
by creating narratives about the legitimacy of their practices, explicitly calling
for regulation (Facebook 2020) and criticizing the exploitative data practices of
competitors (Hern 2019). This strengthens their capacity to create alliances with
customers in fighting unwanted regulatory interventions (Culpepper and Thelen
2019). Tech companies seem to construct their identities according to the idea
of ‘Digital Switzerlands’ (Eichensehr 2019). As demonstrated in Chapters 1–8,
they not only emulate public justification strategies but also scrutinize the exercise
of public authority. They actively shape policies by promoting a ‘Digital Geneva
Convention’ (B. Smith 2017a) or a public–private Cybersecurity Tech Accord
(2018). In terms of concrete means of data governance, such measures, however,
demand looser controls at the global level rather than truly restrictive policies, as
demonstrated by the heavy lobbying against the GDPR (Laurer and Seidl 2021).

9.1.3 Security Partnership

Third, I observe a shift towards loose global structures in security-related data
governance in which the order of security and the order of globalism converge
to form a security partnership. This vision is based on ‘global security threats’
(Council of the EU 2018, 2) that require global cooperation and common solu-
tions to ensure safety. An existentially threatened community of vulnerable people
needs protection against (foreign) terrorist fighters or criminals. Law enforce-
ment and intelligence agencies unite with interior ministry officials and security-
focused departments in international organizations. They attempt to fight against
impunity, which represents a challenge both on the internet and in transnational
crime and terrorism (Seger 2019). In this vision, data governance must address
the global challenge of transnational terrorism and crime through similarly global
measures. Increased law enforcement cooperation, data access for public bodies,
and information sharing through common databases (Council of the EU 2019)
provide crucial answers to this challenge. Appeals to a common ‘responsibility to
protect’ entrench the necessity of security measures. The case of financial data
sharing discussed in Chapter 6 is a relevant example. Even though the jurisdic-
tional claim of the US was strongly contested due to its secret and intrusive nature,
a global scenario of existential threat shaped the perceived space of possibles.
Morally worthy and responsible behaviour was strongly tied to global cooper-
ation in security matters (Ripoll Servent and MacKenzie 2011; MacKenzie and
Ripoll Servent 2012). This vision to some extent contrasts with the ‘statist’ cat-
egory de Wilde (2019) identifies. The statist ideology has a strong emphasis on
security and safety but is tied to the pursuit of sovereignty. In data governance,
actors seem willing to make concessions with regard to their sovereignty to fos-
ter transnational cooperation to strengthen security objectives. To be sure, there
are restrictions on the bindingness of agreements and derogations for states, as
expected by the literature that remains doubtful of the integration of national secu-
rity competences (Genschel and Jachtenfuchs 2018). The differences might result
from the more intangible nature of data. States might have a greater willingness
to share information for common security purposes in comparison with making
concessions to the monopoly of force in a narrower materialist sense. This vision
seems to be embedded in a broader trend of globalizing security practices, as is
also seen in the collaboration between domestic intelligence services (Bauman
et al. 2014; Fischer-Lescano 2016). The principles articulated in this vision are
also consistent with the emphasis on transnational networks of security officials
established in the empirical literature (Farrell and Newman 2019; Pawlak 2009).
Through informal cooperation between public authorities as well as with the
private sector, data governance is subject to a system of looser global structures
designed to enable seamless access to potentially security-relevant data. Yet
as was demonstrated by the introduction of the CLOUD Act in Chapter 8, domes-
tic laws continue to act as an important resource to institutionalize security-related
data access.

9.1.4 Global Cooperation

Fourth, the order of globalism and the order of fairness converge in a vision
of global cooperation. This vision focuses on the universal character of privacy
and data protection and demands stronger cooperation and legal harmonization
to avoid fragmentation and international discord. For example, with regard to
the Umbrella Agreement, the European Commission emphasizes the significant
improvement compared with ‘the present day situation which is characterised by
fragmented, non-harmonised and often weak data protection rules in a patchwork
of multilateral, bilateral, national and sectorial instruments’ (EC 2016c, 14). This
vision emphasizes a global reference community and demands respect for other
communities’ needs (e.g. ICDPPC 2005). There is some overlap with the ideology
of cosmopolitanism identified by de Wilde (2019). For example, the UN Special
Rapporteur emphasizes the universal human rights character of privacy (UN Spe-
cial Rapporteur on the Right to Privacy 2017; UNHRC 2018). This is consistent
with approaches that highlight the importance of political accountability levels:
global actors advocate for global values (de Wilde et al. 2019, 32). In a recent
statement, the EDPS also articulated a more cosmopolitan vision of data gover-
nance, arguing that ‘The protection of personal data requires actionable rights
for everyone, including before independent courts. It is more than a “European”
fundamental right—it is a fundamental right widely recognised around the globe’
(EDPS 2020, emphasis in original). References also relate to multilateral cooper-
ation through institutions or cross-border enforcement networks. For example,
the US Federal Trade Commission argues that ‘Meaningful protection for such
data requires convergence on core principles, an ability of legal regimes to work
together, and enhanced cross-border enforcement cooperation’ (FTC 2013, 7).
While there is a demand for common principles and enforcement, there seems to
be a general recognition of the need for the interoperability of frameworks, even
if that implies a lower level of protection compared with that of the local liberal
vision (ICDPPC 2005; UNHRC 2018). The qualitative case studies indicate that
the vision of global cooperation seems to be less decisive in the resolution of juris-
dictional conflicts, as none of the outcomes or challenges is based on this vision.
However, the relative strength of the connection between fairness and globalism
as illustrated in Figure 9.1 and the number of references indicate that actors draw
on similar principles simultaneously. The limited relevance in outcomes might
speak to bias in the selection of cases, which are mostly limited to the transatlantic
context.
In conclusion, I argue that in their attempt to justify jurisdictional claims, actors
draw on four distinct visions or social realities of data governance. There seem to
be some similarities between the globalization ideologies observed by IR scholars
and the order constellations I suggest in the area of data governance, particularly
with regard to the local vision of liberalism. However, there are also important
differences. For example, the applicability of the central category of permeability
of borders is difficult to conceptualize and useful only to a limited extent in the
context of the internet as an inherently global structure. The analysis also points
to a potential conceptual shortcoming in current globalization debates: their
neglect of the prominence of private actors and particularly of the normativity that
is assigned to the degree of 'publicness'. While the distinction between private
and public is fluid, particularly in this area but also in other globalized contexts,
actors' justifications strongly invoke such binaries. Considering the relevance of
private actors in other globalized arenas, it seems surprising that this distinction
has not been more comprehensively included. This short comparison highlights
the need to consider the specific context of the internet more comprehensively
in the conceptualization of globalization ideologies. The plurality of visions also
suggests that the view of competing privacy- and security-focused coalitions is in
many ways too simple. At the very least, membership varies as actors shift between
these coalitions. I elaborate on the implications for conflict resolution processes in
Section 9.2.

9.2 The Resolution of Jurisdictional Conflicts: Key Findings

Each of the chapters has highlighted different elements of the process of conflict
resolution and the book’s central questions regarding the responses, the stabi-
lization of agreement, and the interlinkages with the field of data governance. I
have illustrated that major conflicts in data governance share a common char-
acteristic: they emerge when actors articulate overlapping ‘jurisdictional claims’,
that is, claims of legitimate control (Abbott 1986, 191). I have argued that in addi-
tion to the well-explored institutional and power dimensions of these conflicts,
there are deep but undertheorized normative divisions. While this adds complexity
to already complex inter- and transnational dynamics, I argue that the analysis
of such normative divisions in jurisdictional conflicts can contribute to a better
understanding of the broader trajectory of the field of data governance at large.
Through a bifocal approach focusing on specific conflicts and their interlinkages
with broader developments in the field, I analysed conflict resolution processes in
five cases, focusing on the justifications brought forward by actors in their jurisdic-
tional claims through the analytical lens of value orders (Boltanski and Thévenot
2006). I outlined how these claims are constitutive of four distinct social realities
of data governance (Table 9.1). I highlighted how actors in their justifications are
constrained by a perceived ‘space of possibles’ (Bourdieu 1996, 234–9) organized
around the central ‘illusio’ (Bourdieu 1991, 179) of the field. In Chapter 3, I argued
that this illusio establishes that data governance is meaningful in the pursuit of the
common good. The results of the case studies and their comparative assessment
illustrated in Table 9.2 highlight three points in answer to the overarching ques-
tion of how actors resolve jurisdictional conflicts, which motivated the book. First,
actor responses are shaped by their position in the field as challengers or incum-
bents. Second, agreements are stabilized through the introduction of confirmatory
truth tests such as review or transparency mechanisms with limited impact. Third,
actors draw on four distinct normative visions of the field. In short, the resolution
of conflicts in the field of data governance depends on the character of the nor-
mative challenge, the capacity of actors to create normative fit, and the existing
hierarchization in the field.
First, the comparative assessment of the cases suggests that the response to juris-
dictional overlap depends on the actor’s position in the field, with more peripheral
actors mostly appearing as challengers. Most of the conflicts I analysed draw on
the rejection of the applicability of a specific regime of justification (across-order
conflict) rather than the contestation of hierarchy within one order (within-order
conflict). Actors who are more peripheral in the field seem more likely to appeal
to alternative principles and create a different vision of the field: they are to a
lesser extent integrated into the current order as outside challengers and hold an
‘“oppositional” perspective’ (Fligstein and McAdam 2011, 4) that strengthens their
‘change agency’ (Harbisch 2021, 42). Thus, their challenge might contribute to
more foundational normative change as the field is re-evaluated in light of such
alternative principles. Only in Chapter 6, the rejection of the TFTP agreement
constituted a challenge by a (newly) central actor, as the European Parliament
had just assumed additional institutional power under the Lisbon Treaty. Sub- or
non-state actors instigated the challenges in the remaining cases.
The peripheral position of challengers might prompt more established actors
in the field to respond only minimally, if at all, to the imperative of justification, because the
nature of the challenge is perceived as minor. For example, this dynamic played
out with regard to the invalidation of the Safe Harbour agreement in 2015 (see
Chapter 4) and with regard to Google in the 2014 right to be forgotten case (see
Chapter 8). This shows that not only is access to transnational networks and coali-
tions important (Farrell and Newman 2019; Pawlak 2009) but also the capacity to
envision an alternative reality of the field. For example, the invalidation of Privacy
Shield to some extent seems to suggest limits on the influence of transnational
Table 9.2 Jurisdictional conflict resolution processes across cases

| Case | Safe Harbour | PNR | TFTP | Electronic evidence | Right to be forgotten |
|---|---|---|---|---|---|
| Type of conflict | Principles of alternative order(s) | Principles of alternative order(s) | Principles of alternative order(s) | Mixed | Hierarchization within the existing order |
| Jurisdictional claim | EU | US | US | US | EU |
| Jurisdictional overlap | US | EU | EU | EU | Google/US |
| Response to imperative of justification by Px | No | Yes | Yes | Limited | Google, in second phase |
| Successful challenge (P1) | Yes, Maximilian Schrems | Limited due to unintended consequences; EP before Lisbon | Yes, EP after Lisbon | Yes, Microsoft | Yes, Spanish AEPD |
| DPA | Passive bystander to active challenger | Silenced challenger | Silenced challenger | Silenced challenger | Active challenger |
| Court | Active challenger | From passive bystander to active challenger | — | Passive bystander | Active challenger |
| Company involvement | Passive bystander | Passive bystander | Passive bystander | From active challenger to passive bystander | From passive bystander to active challenger |
| Disruption of the articulated interests of | US, EC | EP, DPAs | US, EC, member states | US | Google |
| Disruption based on | Local liberal | Local liberal | Local liberal | Local/global liberal | Local liberal |
| Outcome | Continued disruption | Formalization and expansion of existing practices, limited revisions | Formalization of existing practices, limited revisions | Formalization of existing practices, ongoing | New institutional structure, limited changes in second phase |
| Outcome based on | Uncertain | Security partnership | Security partnership | Security partnership | Local liberal |
| Convergence with | Uncertain | Global economic | — | Global economic | — |
| Normative consequences | Uncertain | Entrenchment of alternative principles | Entrenchment of alternative principles | Entrenchment of alternative principles | Shift to alternative principles |

security coalitions, as it runs against the interests of the vast majority of the pow-
erful actors in the field, including the US government, intelligence agencies, the
European Commission, many member states, and major private companies. How-
ever, the peripheral position of actors in the field is notably not necessarily tied to
their power more generally. For example, Microsoft is clearly a powerful actor but
assumes a more peripheral position on law enforcement data, which prompted
the company to issue a foundational challenge (see Chapter 7). As the relationship
between the company and law enforcement agencies becomes formalized, how-
ever, the company is increasingly integrated into the field. The analytical focus
on dissimilar social realities offers an alternative account of how such disruptions
of powerful interests are possible when peripheral actors through legal or other
means gain opportunities to demand justifications from dominant actors in the
field.
This high potential for disruption also means that conflicts are rarely fully
resolved. In all cases but the electronic evidence example (see Chapter 8), which
is likely to become subject to challenge in the future, conflict outcomes have been
contested. This seems to confirm the continuous character of conflicts that has
been emphasized in both the empirical (Farrell and Newman 2019) and theoreti-
cal literature (Abbott 1986; Boltanski and Thévenot 2006; Fligstein and McAdam
2012). Thus, there is rarely a clear-cut end, as underlying normative differences
remain. Compromises may dissolve in response to external shocks or new infor-
mation, such as surveillance revelations or scandals involving private companies,
shifting notions of identity, such as increased perceptions of company responsibil-
ity, or institutional shifts in formal power, such as after the Lisbon Treaty. As Faude
and Große-Kreul (2020) outline, overlap between institutions may enable actors
that face limited recognition in one institution to draw on norms or rules embod-
ied by an alternative institution to justify their claims. The book has outlined the
empowerment of actors in the EU to contest transatlantic or even US data gov-
ernance practices. However, this empowerment stands in contrast to the limited
effects on domestic activities, particularly surveillance practices by EU member
state intelligence services (see also Bigo et al. 2013; Farrell and Newman 2019).
Second, and notwithstanding the potential for disruption, jurisdictional con-
flict resolution outcomes may be stabilized over an extended period when actors
succeed in entrenching a specific conceptualization of data. Examples such as the
expansion of Passenger Name Record (PNR) sharing and the persistence of the
TFTP agreement in the face of contestations (Chapters 5–6) are often related to the
conceptualization of data as a security tool. While data access by public authorities
contributed to the emergence of significant disputes in the immediate post-9/11
period, surveillance practices are increasing (Fischer-Lescano 2016; Viola and
Laidler 2022). While the securitization literature prioritizes the intentionality of
selected actors, the findings emphasize processes of incremental change (Williams
2003, 521). For example, through the concept of a ‘chain of security’ (de Goede
2018b), de Goede demonstrates how financial transactions or social media data
are transformed into an indicator of suspicion, risky patterns, or admissible evi-
dence, depending on the context and transforming actor. A similar process plays
out in the case study demonstrating the transformation of PNR data into a ‘tool’
in the fight against terrorism. By establishing PNR as an indicator of suspicion,
data are depoliticized and removed from the individual. Once data are stabilized
as ‘epistemic objects’ (Knorr Cetina 2005, 183) within one social reality, the space
of possibles is likely to form around the characteristics of this object.
Claims are further stabilized through the introduction of continual truth tests
that have a confirmatory rather than critical character (Boltanski 2011, 62); they
do not question the existing moral order but strengthen it. The introduction of
safeguards such as access and redress mechanisms or oversight and review pro-
cedures provides evidence of conformity with the order of fairness. Because the
potential consequences of such safeguards are severely limited, they simultaneously
find acceptance among proponents of the order of security and the order of production.
Examples of such stabilizing mechanisms can be found in the PNR and the TFTP
programmes, which introduced oversight and review mechanisms with limited
actual constraints. The case study in Chapter 7 regarding the CLOUD Act shows
that the formalization or legalization of claims through domestic law or bilateral
agreements had a similar effect. This highlights the relevance of creating norma-
tive compatibility of diverging demands through composite objects (Boltanski and
Thévenot 2006, 279) such as transparency reports and review mechanisms. Even
if, as in the TFTP example, the limited nature of such mechanisms is revealed, the
agreement persists, because the existence of the safeguards as such may be sufficient to
satisfy normative demands. This argument also speaks to the relevance of recent
arguments in the literature that highlight an entrenching rather than destabilizing
relationship of transparency in relation to surveillance (Viola and Laidler 2022).
The stabilization of claims of legitimate control over data can similarly be
observed with regard to private companies. The increasing number of transparency
reports issued by tech companies shows how they face increasing
incentives to demonstrate responsibility. Companies disclose to what extent they
are subject to public control in the area of law enforcement and national security
and actively criticize a lack of transparency (e.g. B. Smith 2018c). Yet transparency
reports are carefully curated efforts that show only a specific fraction of how com-
panies actually handle their users’ data (Parsons 2017). Companies may further
attempt to establish a strong position by engaging in academic and policy research
activities (Bodó et al. 2020), for example by co-sponsoring events such as the
Amsterdam Privacy Conference or the Computers, Privacy, and Data Protection
Conference, donating to universities (e.g. Huawei donated $7.5m to the Univer-
sity of Surrey’s 5G Innovation Centre), or setting up expert boards (Chenou and
Radu 2019). Companies are, therefore, actively constructing the very standards of
‘transparency’ and ‘responsibility’ against which they are measured.

Third, with regard to the interlinkages between jurisdictional conflicts and
broader developments in the field, I have already illustrated the connection with
the normative structure of the field through the four distinct visions. I argue that
actors’ capacity to create a shared understanding based on these normative visions
is crucial for the conflict resolution process. For instance, in light of the inher-
ently transnational nature of global data flows, it is not surprising that justifications
fail if they limit data governance to the domestic context. In contrast, in connec-
tion with more substantive arguments, for example emphasizing the protection
of the individual, sovereignty justifications resonate with the roots of data gover-
nance in domestic data protection legislation. This also points again to the role of
data protection authorities. Perceived notions of the right conduct of data gover-
nance have differed considerably between actors, which highlights the need for
further comparative research. It is also notable that, with regard to data-based
security measures, data protection authorities have been largely reduced to the
role of silenced challengers. Despite extensive criticism, calls by the EDPS (2010a)
to provide evidence of their effectiveness have been largely ignored. This silencing
also has a more active component when data protection authorities are explic-
itly excluded from negotiations (DPA representatives A, B, pers. comm. 2017). For
instance, in a leaked document, the participation of data protection authorities was
described as ‘a “non-starter”’ (US Mission to the EU 2007). Thus, it seems hardly
surprising that data protection authorities assert their claims where they can, even
if this contributes to extraterritorial consequences.
Data governance seems to follow a more general trend that sees an increase in
claims of extraterritorial jurisdiction in both the EU and the US (Putnam 2009;
Raustiala 2011). This also particularly applies to the judicial arena. The literature
has identified an increase in legal activism since the early 2000s (T. C. Halliday
et al. 2007). This seems to be consistent with observations in the area of data
governance (see also Fahey and Terpan 2021). The ‘legal route’ seems to be an
increasingly successful strategy to contest jurisdictional overlap in this field. There
are indications of a judicialization of politics (Alter et al. 2019) in data gover-
nance, that is, ‘the reliance on courts and judicial means for addressing core moral
predicaments, public policy questions, and political controversies’ (Hirschl 2009).
In four out of five cases, courts have been asked to answer questions of jurisdic-
tional overlap that, as I have argued, have a foundational normative quality. While,
for example, the US Supreme Court judges explicitly referred to the responsibility
of the legislature for addressing such conflicts (Supreme Court of the US 2018),
the CJEU has, particularly more recently, become increasingly assertive in the area
of data protection, as was discussed in Chapter 4. While some (Brkan and Psy-
chogiopoulou 2017) have explored the role of courts, including the CJEU (Brkan
2017), from a legal perspective, there is the need for more comprehensive and
interdisciplinary scrutiny of the more recent court decisions. The decisions of the
CJEU have been examined comparatively (Carrubba and Gabel 2015; Larsson and
Naurin 2016), but there is little systematic qualitative research on the evolution of
decisions of the Court in more recent years, particularly from a political science
perspective (but see Bois 2021).
In the early 2000s, the Court seemed to be ‘disinclined to back a privacy rights-
based approach’ (Farrell and Newman 2019, 35), as was illustrated in the PNR
case in Chapter 5. However, the recent invalidation of Privacy Shield seems to
confirm an increasing willingness to entrench data protection measures and a
more general expansion of fundamental rights jurisprudence by the CJEU (Bur-
chardt 2020, 2). In 2014, the CJEU surprisingly, and against the recommendations
of the AG Yves Bot, backed the ‘right to be forgotten’ and thus increased obli-
gations on private companies, and in the Digital Rights Ireland case (2014), the
Court invalidated the European Data Retention Directive. In addition, it con-
firmed the lack of lawfulness of member states’ general obligation to data retention
just two years later (Joined Cases C-203/15 and C-698/15 2016). As in the recent
Canada PNR case (CJEU 2017), in the Schrems cases (Schrems 2015; Data
Protection Commissioner 2020), the Court entrenched the evaluation of data gov-
ernance according to principles based in the order of fairness. This also entrenched
the local liberal vision of data governance, despite serious political consequences.
In a recent judgment, the Court further attempted to strengthen its claim over
data processing by member states for security purposes (Joined Cases C-511/18,
C-512/18 and C-520/18 2020; Privacy International 2020b). Yet assertions of
extraterritorial jurisdictions or the increased fundamental rights jurisprudence
might contribute to political backlash or pushback, including that from domes-
tic constitutional courts (Burchardt 2020). Therefore, while recent literature has
suggested that the Court enjoys broad support with its main audience of legal pro-
fessionals (Bois 2021), it will be crucial to examine the interlinkages between the
legal, political, and normative dimensions that play out in such seemingly legal
conflicts.
At its core, this book has attempted to outline how actors deal with the significant uncertainties about the governance of digital data. Data have unique features
that create challenges for successful governance, particularly in a world shaped by
territorial rules and domestic conduct. Data can be transferred across the globe in
an instant; they can be multiplied with very limited efforts but are also linked to
physical structures such as fibre-optic cables and storage centres. Empirical case
studies have illustrated the complexities and tensions that actors need to resolve
when addressing fundamental questions: who, if anybody, owns data? What are
the objectives of data use, processing, and protection practices? Who is and should
be affected by those practices? And, most importantly, who may answer such ques-
tions in light of ever-increasing inter- and transnational linkages? The book has
illustrated that, as much as actors share an opinion of what is at stake in the field
(Bourdieu and Wacquant 1992), there is no ‘consensual “taken for granted” real-
ity’ (Fligstein and McAdam 2011, 4–5). Actors express references to the shared
understanding that data governance is meaningful in the pursuit of the common
good but articulate different conceptions of what they perceive as shared societal
goals. The book provides a novel account of what these realities might look like
in data governance and establishes the conceptual groundwork for comparative
perspectives into other geographical contexts as well as areas of global gover-
nance that share tendencies of jurisdictional overlap, such as internet governance,
international financial regulation, or environmental governance.

9.2.1 A Transatlantic Future?

What do these findings mean for the future of transatlantic data governance?
The 2020 judgment in Schrems II in particular makes clear that the challenge
in overcoming existing legal, political, and normative divisions is serious. The
CJEU seems unwilling to rubber-stamp institutional solutions that do not ade-
quately protect EU citizens’ rights. The protracted negotiations show that Safe
Harbour 3.0 entails significant political and economic risks. The judgment also
highlights that existing transparency, oversight, review, and redress mechanisms
must depart from their largely superficial character to include specific safeguards
and more effective constraints on bulk surveillance. Yet at the time of writing in
early 2022, negotiations still seem to be headed towards a revision of the current
framework rather than a complete overhaul. Considering the current situation,
this is hardly surprising. Despite calls to reform US government surveillance,
change has neither been significant nor seems very likely (Viola and Laidler 2022),
while the EU suffers from a lack of credibility. Somewhat paradoxically, the EU’s
legal order provides increased legislative scrutiny for surveillance measures of
third countries, while surveillance by member states’ intelligence services remains
largely unrestricted. Domestic courts such as the German constitutional court
have found surveillance by intelligence services to be too far-reaching (1 BvR
2835/17 -, Rn. 1-332 2020), but, due to national security exceptions, neither the
Commission nor the CJEU has much of a say. While two judgments by the CJEU
seem to indicate that the Court at least attempts to expand its competences in this
area (Joined Cases C-511/18, C-512/18 and C-520/18 2020; Privacy International
2020b), ongoing struggles to re-establish comprehensive data retention provisions
in various member states show that governments are hesitant to back down. In
light of these derogations by member states, it seems unlikely that US authorities
would be willing to seriously reform intelligence practices. To increase its credibil-
ity vis-à-vis the US, the EU needs to address these concerns more comprehensively
and problematize surveillance by its member states. Otherwise, it is likely that the
most stable solution, a comprehensive transatlantic data agreement, will stay out
of reach.
Yet the EU’s cooperation with the UK, which is also known for its extensive
surveillance activities, may foreshadow a potential path for cooperation. The Euro-
pean Commission was relatively quickly able to grant an adequacy decision, which
was also based on the existence of a comprehensive data protection law. The deci-
sion has a four-year sunset clause and, while including law enforcement data, has
some restrictions, for example exempting data transfers for immigration control,
which were considered to be too far-reaching. The fact that the Commission was
able to quickly grant an adequacy decision to the UK might indicate how the US,
once it has a federal privacy law in place, could more easily circumvent a com-
prehensive discussion due to the formal existence of legal safeguards. Yet, at least
for the moment, Congress does not seem to be making significant progress in this
regard. If both sides fail to move, transatlantic commercial data sharing might be
headed towards an increasingly conflictual future, perhaps even deadlock, as frus-
tration increases. As the European Commission (2020e) has introduced, with the Digital Services Act package, a comprehensive and ambitious set of rules on competition, market structure, content, and services that will shape the digital space in
Europe and beyond, the transatlantic partnership needs to work on cooperative
solutions.

9.3 Outlook: Global Implications and Further Research

From the tentative answers to the questions and problems that motivated this
book, inevitably more questions arise. The book has focused on transatlantic con-
flicts as the most influential context and main arena of data governance to date.
Yet these insights provide important starting points for thinking about data gov-
ernance in diverse geographical contexts. A significant portion of data governance
frameworks is still to be negotiated. With further jurisdictional conflicts likely to
emerge, this section focuses on the global implications of the book’s findings while
sketching avenues for further research.

9.3.1 The Rise of China

The first major game changer in data governance is the rise of China as a regulatory power. In recent years, China has significantly expanded its regulatory efforts to establish a comprehensive data governance regime. After the incremental adoption of data protection provisions in sectors such as finance, telecommunications, education, and healthcare, and regulatory efforts targeting online content, critical infrastructure, and encryption, these efforts culminated in the adoption of the Data Security Law and the Personal Information Protection
Law in 2021. Investigating more closely the visions that guide Chinese data gov-
ernance policies should, therefore, be a priority for further research. While it is
often difficult to get access to documents that outline such guiding principles and
collective values, particularly in the inter- and transnational realm, the Chinese
approach stands out because of its exceptionally strong focus on state control.
First, while many of their provisions resemble frameworks in the EU or the US, Chinese regulations constrain data collection and use practices by tech companies much more significantly than state activities. As Creemers (2021, 2) outlines, the motives that drive data protection frameworks in China are much more anchored
in national security and public interest perspectives. Second, Chinese regulatory
frameworks establish extensive data localization obligations requiring personal
data to be stored in China. Any cross-border transfers are subject to the control of
the Cyberspace Administration of China, which thereby gains significant insights
into data flows. Third, Chinese influence on transnational data governance only
marginally relies on regulation. Instead, Erie and Streinz (2022) argue, the ‘Beijing
effect’ shapes data governance through extensive Chinese influence on standard-
setting and the provision of infrastructure. The vast infrastructure investments
through the Chinese Belt and Road Initiative in particular have sparked major dis-
cussions about Chinese influence in Africa, Asia, and Europe (Shen 2018). While
perhaps unique in its extensiveness, the Chinese approach to data governance is
indicative of broader ongoing trends in data and internet governance, as tensions
between self-determination, democracy, and hyperglobalization (Haggart 2020)
have contributed to sovereigntist assertions across the globe. Furthermore, the
Chinese approach is likely to impact regulatory and infrastructural initiatives, par-
ticularly in developing countries. This may result in an increasing regionalization
of regulatory approaches with distinct and competing EU, US, and Chinese or
sovereigntist models (see also Flonk et al. 2020). I examine the likely impacts of
a sovereign perspective on data, data localization policies, and influence through
infrastructure in turn.

9.3.2 Strategic Sovereignty

The increasing political relevance of digitization has strengthened a geopolitical
perspective on data across the globe. The Chinese ‘twin objectives of protect-
ing national security first and foremost, while also fostering economic growth’
(Erie and Streinz 2022, 24) may appeal to developing countries concerned about
the political, social, and economic implications of global interconnectedness. The
expansion rates of companies such as Alibaba, Baidu, Huawei, or Tencent have
contributed to a fast-growing e-commerce sector. Yet, whereas US tech firms pose immense political challenges to governments, Chinese party officials exercise significant influence over domestic companies, which are closely entangled with the state, for example through subsidies, staff, and leadership links (Feldstein 2021, 240).
Concerns about illegal and harmful online content, economic concentration,
and the undermining of data protection standards have also increased the calls for
a stronger role for public authorities. Across the globe, governments have been
involved in various conflicts with tech companies. For instance, in Brazil, the
decision by the messaging platform WhatsApp to require users to agree to data
sharing with Facebook has proven controversial, provoking both data protection
and competition cases (Mari 2021). India engaged in a conflict with WhatsApp
and Twitter over extensive content regulations that would undermine privacy pro-
tections by preventing the use of encryption, as WhatsApp has claimed in a lawsuit
filed with the Delhi High Court (Ellis-Petersen 2021). In the EU, the attempt
to increase control over data and infrastructure has spurred calls for ‘digital
sovereignty’ (Obendiek 2021). As public officials attempt to rebalance public–
private asymmetries, limiting the influence of foreign tech companies through
appeals to national security promises to bring both geopolitical and geoeconomic
advantages. An interviewee stressed that the security relevance of data was increas-
ingly emphasized in OECD countries (IO official, pers. comm. 2017). In 2020, the
Trump administration declared that the Chinese video platform TikTok consti-
tuted a national security threat (Inman 2020). This might indicate that the security
nature of data increasingly extends beyond immediate security challenges (as in
crime) and security threats (as in the prevention of terrorism) to also include more
indirect security considerations (such as geopolitics).

9.3.3 Data Localization

Another popular measure to strengthen public control has been the adoption
of data localization policies. Data localization policies demand data storage in
a specific jurisdiction. Countries may adopt such policies to increase control
over sensitive data such as health data or biometric information about their
population, prevent foreign access to sensitive data, or simply restrict company
conduct to give domestic businesses an advantage. Data localization policies
have been one factor driving concerns about internet fragmentation along ter-
ritorial borders (Haggart et al. 2021). The US in particular tries to counteract
these tendencies in trade agreements and international fora such as the WTO,
emphasizing data localization policies as a trade problem (Selby 2017). While
it is important to recognize that various actors have used this justification to
lobby against data protection regulations such as the GDPR, data localization
policies have frequently been found to fail to address the challenges of data
governance, including privacy protection (Brehmer 2018; Selby 2017). Through
provisions such as those established in the CLOUD Act that authorize access by
public bodies or law enforcement irrespective of data location, any additional
protections provided by local data storage are easily circumvented. Nonethe-
less, data localization policies have been frequently used to provide evidence of
data protection- or privacy-compliant policies. Yet this particularly concerns nar-
row data localization policies, for example those of Germany or France. While
the increasing adoption of data governance measures has also resulted in an
increase in data localization provisions, some emerging economies have begun
to re-emphasize free data flows. Former champions of data localization, including
India, Indonesia, and Vietnam, have limited their data localization requirements,
for example to ‘sensitive data’ in India or to public organizations in Indonesia,
establishing the potential consequences as an important area for further empirical
research.

9.3.4 Infrastructure

Another aspect that warrants further research is the role of other layers of inter-
net governance such as technical standards or infrastructure. Empirical cases,
particularly in the security context, have demonstrated the immense difficulty of
contesting a system once it has been put in place, but physical infrastructures
and technical decision-making processes make this insight even more pressing.
As Bellanova and de Goede illustrate, public values are ‘built into transatlantic
data infrastructures’ (2022, 103). Further research could explore how distinct
visions of data governance are represented in diverse infrastructural and technical
settings. A crucial starting point would be to scrutinize the design of data infras-
tructures. Again, China is the primary example of exercising exceptional control
through its infrastructure domestically. The comprehensive data control infras-
tructure known as the Great Firewall (Erie and Streinz 2022, 43–4) controls traffic
to, from, and within China. While this requires extremely high levels of capacity,
test runs of a sovereign internet in Russia (Sherman 2022) show that the Chi-
nese model is seen as a potential blueprint for other countries. As Russia is facing
sanctions for its invasion of Ukraine, the relevance of such a sovereign internet
further increases. Furthermore, China has been engaged in efforts to proactively
export its model through massive infrastructure investment and the export of
surveillance technologies. This has caused major concerns about the potential
rise of digital repression, as some countries, for example in sub-Saharan Africa,
would be unlikely to gain access to sophisticated surveillance equipment without
China (Feldstein 2021, 240–1). Yet case studies analysing reliance on digital tools
and infrastructures in Pakistan (Erie and Streinz 2022, 66–83), Thailand, or the
Philippines (Feldstein 2021) show that the influence of China is less clear than
often propagated. The availability of surveillance tools and a potentially increased
capacity for digital repression do not mean that countries employ them. Instead, their use depends significantly on pre-existing resources and political interests in the host
country. In addition, both democracies and authoritarian countries are engaged in
buying and selling digital surveillance technologies. The recent disclosures about
the Israeli NSO Group are a case in point. Without detection, the firm’s Pega-
sus spy software gathers data such as text messages from mobile phones and may
also activate the phone’s microphone or video without users noticing (Guardian
2021). The software has been employed by both authoritarian and democratic
governments such as Hungary, Rwanda, or the United Arab Emirates to target
journalists and political activists.

9.3.5 Whose Model Wins?

What does this mean for global data governance? This book has tried to bring
nuance to typical binaries: EU versus US, privacy versus security, power versus
values. It has demonstrated that data governance is characterized by a plurality of
conceptions of worth—there are several value orders, not one. Further research
could, therefore, compare value orders as promoted by different actors and their
success rates. Will the Chinese influence prevail via the Beijing effect or will the
EU shape data governance via its regulatory reach? Or will the hegemonic position
of the US in internet governance institutions and discourse prevail? While the US
has long enjoyed a hegemonic position in internet governance, its position in data
governance has long been characterized by a piecemeal approach to data protec-
tion and comprehensive data access for law enforcement and intelligence services.
Further research could examine tech companies’ notions of responsibility more
closely. What role do the highly qualified employees play in negotiating the prior-
itization of values within companies? With the GDPR, the EU has strengthened
its position as a data protection vanguard, potentially spurring a regulatory race
to the top via the Brussels effect. Countries like Japan and Korea have upgraded
their data protection provisions, while others are still in the process of changing
their legislative frameworks. In Brazil, the Lei Geral de Proteção de Dados came
into effect in 2020, including measures similar to those of the GDPR. In India, the
2017 judgment of the Constitutional Court declaring privacy to be a fundamental
right has given renewed emphasis to the adoption of a comprehensive data protec-
tion law currently including, for example, provisions comparable to the right to be
forgotten (Dhavate and Mohapatra 2022). Following a comprehensive review
of the proposed legislation, the Data Protection Bill is currently under debate in
the Indian Parliament. With India, Indonesia, and Pakistan overhauling their data
protection measures, roughly 2 billion people will be presented with new mea-
sures and rights in the near future. At the same time, the simple imposition of,
for example, liberal norms or rules is likely to increase ‘normative friction’ (Kette-
mann 2020, 152). When actors try to impose community-specific norms globally,
they risk undermining other communities’ collective self-determination. It is also
important to point out that even the EU’s approach has been far from uniform.
Besides weakening transparency and data protection standards for counterterror-
ism measures or law enforcement, the EU has also been involved in exporting
surveillance tools. For example, the EU is funding the development of biomet-
ric population databases in Côte d’Ivoire and Senegal by the company Civipol,
which is partly owned by the French government, with €30 million and €28
million, respectively. While Senegal has signed the CoE’s Convention 108, nei-
ther country fulfils the adequacy standards of the European Commission, which
makes the centralized databases containing sensitive information potentially vul-
nerable to abuse. In addition, EU documents suggest that a key objective of setting
up these databases is to facilitate the deportation of foreign nationals from the EU
(Privacy International 2020a).
Therefore, further research should move beyond justificatory practices to inves-
tigate more closely the implementation of policies both externally and internally.
It could also analyse, for example, the influence of individual data protection bod-
ies. How are data governance measures implemented in Argentina, Ireland, or
Senegal? How do access and correction requests work? While I have focused on
policymakers and major companies, bringing attention to the perception of the
collective values at stake in data governance by the broader public through survey
research or newspaper analyses is crucial. Do people conceptualize data differently
in China from how they do in the US? Under what conditions would people decide to
restrict or allow data transfers? In addition, further research should aim for a more
normative assessment of data governance, evaluating the ethical implications of
trends, including the rise of sovereignty, the simultaneous diversification and cen-
tralization of power globally, and the growing importance of data governance for
areas such as security, development, and democracy.

9.3.6 What Follows?

For policymakers and scholars alike, implicit conceptualizations of data should be made more explicit. There is a need for a clear understanding of what data can and
cannot do. Public and private officials should prioritize the formulation of clear
and tangible goals as well as their effective scrutiny and oversight. The Covid-19
crisis has made apparent the willingness of public officials in both democracies
and authoritarian countries to expand data use. But data should not be conceptu-
alized as a silver bullet when addressing social and political challenges. This book’s
focus on bringing implicit assumptions about data to the fore, urging us to think
about whether data is conceptualized as a neutral tool or something that is attached
to individuals, enables a better evaluation of the necessity and proportionality of
data use to achieve specific governance objectives. It also enables the distinct prob-
lematization of intrusive measures. This not only relates to the assumption that
the accumulation and extensive processing of data makes the world a safer place
but also has implications for a local or physical conceptualization of data that dis-
regards consequences of data governance on other jurisdictions. With this book,
I hope to have demonstrated that the conceptualization of data matters for the
resolution of jurisdictional conflicts but also for policymaking more generally. It
is tempting to conceptualize data as a tool or resource that offers solutions for a
diversity of societal problems in the spirit of tech ‘solutionism’ (Nachtwey and
Seidl 2020). Basing data governance decisions on the assumption of data-based
technologies’ effectiveness rather than on evidence for their effectiveness may
implicitly legitimize data access in authoritarian contexts. Therefore, conceptu-
alizations of data should also include an assessment of potential implications for
power asymmetries, social exclusion, and autonomy.
There is, therefore, an urgent need for a more comprehensive debate but also
more open institutional settings. In its current form, the institutionalization of
normative differences in an explicit detachment from normative debate through
formalized language or legal expression has made transatlantic data governance
vulnerable to contestation and disruption. The current lack of open fora to make
meaningful policy decisions has entrenched informal transatlantic cooperation
and a lack of public and parliamentary scrutiny. By opening up to affected stake-
holders rather than further restricting debate and shifting it into the realm of
informality, some of these vulnerabilities might be mitigated.
APPENDIX 1

List of Interviews

No.  Organizational affiliation/function                      Location            Date

01   Interview with DPA representative A                      Geneva              December 2017
02   Interview with DPA representative B                      Geneva              December 2017
03   Interview with IO official                               Geneva              December 2017
04   Interview with multistakeholder representative A         Geneva              December 2017
05   Interview with multistakeholder representative B         Geneva              December 2017
06   Interview with EP official                               Paris               November 2018
07   Interview with NGO representative                        Paris               November 2018
08   Written interview with former DPA representative         Remote              November 2018
09   Interview with European Commission official              Brussels            January 2019
10   Interview with European Parliament official              Brussels            January 2019
11   Interview with former MEP adviser                        Brussels            January 2019
12   Interview with NGO representative                        Brussels            January 2019
13   Video interview with tech company employee A             Remote              February 2019
14   Interview with tech company employee B                   Berlin              February 2019
15   Phone interview with former US data protection expert    Remote              March 2019
16   Interview with former FTC official                       Washington DC       March 2019
17   Interview with FTC official                              Washington DC       March 2019
18   Interview with tech company employee C                   Silicon Valley, CA  April 2019
19   Interview with NGO representative A                      San Francisco       April 2019
20   Interview with NGO representative B                      San Francisco       April 2019
21   Interview with private sector representative             Berlin              November 2019
APPENDIX 2

MAXQDA Additional Material

A.2.1 Codebook
1 Fairness
1.1 Data as a (Fundamental) Rights Concern
1.2 Reference Community: Individuals
1.2.1 Multistakeholderism
1.2.2 Whistle-blowers/Human Rights Defender
1.2.3 Vulnerable Groups
1.2.4 Protection of Individuals/Users
1.3 Dystopian Scenario: Repression
1.3.1 Data Sharing with Other Authorities
1.3.2 Abusive Practices
1.3.3 Lack of Oversight/Control/Review
1.3.4 Lack of Democratic Process
1.3.5 Vagueness/Ambiguity
1.3.6 Lack of Bindingness
1.3.7 Lack of Enforcement
1.3.8 Lowering Standards
1.3.9 Lack of Transparency
1.3.10 Lack of Compliance
1.3.11 Undermining Principles/Liberal Values
1.3.11.1 By Private Actors
1.3.11.2 By Democratic States
1.3.11.3 Media/Press Restrictions/Censorship
1.3.11.4 Large-Scale Collection/Bulk Collection
1.3.11.5 Mass Surveillance
1.4 Virtues/Principles: Liberal Partnership, Human Rights
1.4.1 Public Responsibility to Protect
1.4.2 Human Rights/Fundamental Freedoms
1.4.2.1 Right to Access/Access to Information
1.4.2.2 Freedom of Expression/Speech
1.4.3 Information Society for the Public Good
1.4.4 Free, Open Cyberspace
1.4.5 Protection of Rights/Values
1.4.5.1 Rule of Law
1.4.5.2 Public Interest, incl. Democratic Society
1.4.5.3 Private Responsibility
1.4.5.4 Robust Safeguards—General
1.4.5.4.1 Data Finality
1.4.5.4.2 Precise/Clear Rules/Agreement
1.4.5.4.3 Binding Agreements
1.4.5.4.4 Privacy by Design
1.4.5.4.5 Depersonalization/Anonymization
1.4.5.4.6 Limitations on Automated Profiling
1.4.5.4.7 Data Security
1.4.5.4.8 Data Minimization
1.4.5.4.9 Enforcement
1.4.5.4.10 Adequacy
1.4.5.4.11 Consent/Choice
1.4.5.4.12 Purpose Limitation/Specification
1.4.5.4.13 Access/Redress Mechanisms
1.4.5.4.14 Necessity
1.4.5.4.15 Proportionality
1.4.5.4.16 Fairness
1.4.5.4.17 Data Quality
1.4.5.4.18 Sensitive Data
1.4.5.4.19 Accountability
1.4.5.5 Oversight, Review, and Monitoring
1.4.5.5.1 Judicial Oversight
1.4.5.5.2 Best Practices/Education/Awareness
1.4.5.5.3 Transparency/Notice
1.4.5.5.4 Independent Enforcement Authorities
1.4.5.5.5 Parliamentary Oversight/Scrutiny
1.4.5.5.6 DPAs as Advisory/Oversight Bodies
1.4.6 Liberal Partnership
1.4.7 Protection of Privacy
1.4.7.1 As a Fundamental Right
1.4.7.2 Universality/Global Level
1.4.7.3 International Promotion
1.5 Proving Worth/Deficiency
1.5.1 Binding Human Rights Conventions/Law
1.5.2 Privacy Scandal
1.5.3 Cross-Referencing Common Standards
2 Security
2.1 Conceptualization of Data
2.1.1 Data as Valuable
2.1.2 Data as a Tool
2.1.3 Data as Evidence
2.2 Reference Community: Vulnerable People
2.2.1 Cross-Border Security Cooperation
2.2.1.1 Security Partnership
2.2.2 Intra-EU Cooperation
2.3 Dystopian Scenario: Terrorism, Crime, Death
2.3.1 Threat/Vulnerability
2.3.2 Terrorism and Crime
2.3.2.1 Terrorism
2.3.2.1.1 Terrorist Financing
2.3.2.2 Crime (Transnational, Serious, Organized)
2.3.3 Changing Security Environment
2.3.4 Law Enforcement/Intelligence Constraints
2.3.5 Insufficient Necessity/Effectiveness
2.4 Virtues/Principles: Information Sharing, Police, Intelligence
2.4.1 Security
2.4.1.1 Suspicion/Prevention
2.4.1.2 Public Security/Safety
2.4.1.2.1 National Security
2.4.1.3 Cybersecurity
2.4.1.4 Infrastructure
2.4.1.5 Protecting Liberal Values
2.4.2 Broad Collection/Retention
2.4.3 Law Enforcement Cooperation
2.4.4 Law Enforcement/Intelligence Access
2.4.5 Information Sharing/Databases
2.4.5.1 Electronic Evidence
2.4.5.2 PNR/API
2.4.6 Exception
2.4.7 Secrecy
2.4.8 Vigilance/Fighting Impunity
2.5 Proving Worth
2.5.1 Terrorist Attack
2.5.2 Counterterrorism Experts/Investigators
3 Production
3.1 Conceptualization of Data as Resource/Currency
3.2 Reference Community (Customers, Businesses)
3.2.1 Customer Protection
3.2.1.1 Strengthening Trust
3.2.1.2 Reasonable/Appropriate
3.2.2 Companies as Digital Switzerlands
3.2.3 Limited Private Responsibility
3.3 Dystopian Scenario: Economic Downturn/Inferiority
3.3.1 Economic Espionage
3.3.2 Protectionism
3.3.3 Damage to Businesses
3.3.3.1 Damage to SMEs
3.3.3.2 Damage to Domestic Corporations
3.3.3.3 Damage to Non-Domestic Corporations
3.3.4 Loss of Trust in the Digital Economy
3.3.5 Company Burden
3.3.5.1 Legal Uncertainty for Companies
3.3.5.2 Discrimination
3.3.5.3 Companies Caught in the Middle
3.3.5.4 Intermediary Liability
3.4 Virtues/Principles: Consumer Trust, Free Trade
3.4.1 Innovation/Progress
3.4.2 Efficiency
3.4.3 Competitiveness
3.4.4 Economic Growth
3.4.5 Trade/International Commerce
3.4.6 Jobs
3.4.7 Legal Certainty
3.4.7.1 Point of Contact
3.4.8 Digital/Data Economy
3.4.8.1 Freedom of Enterprise
3.4.8.2 Business Needs/Interests
3.4.9 Economic Transatlantic Partnership
3.4.9.1 US Leading Role in Digital Economy
3.4.10 Economic Value of Data
3.4.11 Soft Solutions
3.4.11.1 Collaborative Compliance
3.4.11.2 Flexibility
3.4.11.3 Informal Solutions
3.4.12 Public–Private Cooperation
4 Sovereignty
4.1 Conceptualization of Data as Local/Physical
4.2 Reference Community: Local/Limited
4.2.1 Domestic/ State/National Responsibility
4.2.1.1 Judicial Responsibility
4.2.1.2 Governmental Responsibility
4.2.1.3 Parliamentary Responsibility
4.2.2 Constitutive Community
4.2.2.1 Community Interests/Action
4.2.2.2 Protection of Domestic Citizens
4.2.2.2.1 Enforcement of Domestic Rights
4.2.2.2.2 Protection of Citizens Abroad
4.2.2.2.3 EU Citizens
4.2.2.2.3.1 Marketplace Principle
4.2.3 Outside the Reference Community
4.3 Dystopian Scenario: Interference, Intrusion
4.3.1 Lack of Legal Basis/Legality
4.3.2 Extraterritoriality
4.3.3 Imposition of Domestic Norms
4.3.4 Private Interference
4.3.5 Public Interference/Violation of Sovereignty
4.4 Legal Uncertainty
4.5 Violation of Legal Standards/Agreements
4.6 Virtues/Principles: Domestic Law, Hierarchies
4.6.1 Territorial/Jurisdiction
4.6.1.1 Common EU Approach
4.6.2 Domestic Interests
4.6.2.1 Public Order
4.6.2.2 Public Interest
4.6.3 Respect Domestic Values/Process
4.6.3.1 Legal Basis
4.6.3.2 Respect Domestic Law
4.7 Proving Worth
4.7.1 Domestic Law
4.7.1.1 US Law
4.7.1.2 EU Law
4.7.1.3 Reference to Case Law
5 Globalism
5.1 Conceptualization of Data as Fluid
5.1.1 Borderless Nature of the Internet
5.1.2 Free Flow of Data/Information
5.2 Reference Community: Global/Other Communities
5.3 Dystopian Scenario: Fragmentation, Discord
5.3.1 Misunderstanding of Legal Framework
5.3.2 Undermining of Trust
5.3.3 Fragmentation
5.3.3.1 Conflicting Rules/Norms
5.3.3.2 Negative Consequences of Unilateral Action
5.3.4 Outdated Law
5.4 Virtues/Principles: International Institutions, Expertise
5.4.1 Urgency of Agreement
5.4.2 Reform Measures
5.4.3 Cross-Border Framework/Standards
5.4.3.1 Legal Harmonization
5.4.3.2 Global Framework
5.4.4 Strong Institutions
5.4.5 Interoperability
5.4.6 Conflict of Laws Avoidance
5.4.7 Credibility
5.4.8 Multilateral Agreement/Institution
5.4.9 Bilateral Agreement
5.4.10 International Law
5.4.11 Compliance
5.4.12 Reciprocity
5.4.13 Cooperation
5.4.13.1 Compromise
5.4.13.2 Transatlantic Partnership
5.4.13.3 Dialogue/Exchange
5.4.13.4 Assurances
5.4.13.5 Sharing Expertise
5.5 Proving Worth/Deficiency
5.5.1 Fast Solution
5.5.2 Reiteration of Commitment
5.5.3 Reference to International Case Law
5.5.3.1 Reference to Other Communities’ Practices
6 Balanced Interests
7 Call for Action

A.2.2 Code Co-Occurrences—Cross-Check


The analysis of qualitative evidence through quantitative means brings some conceptual
challenges. In this section, I provide evidence of the robustness of the co-occurrence
analysis presented in Chapter 9 (see Figure 9.1). First, I limit the influence of general
references to safeguards, which might suggest a stronger connection between distinct orders
than is warranted; second, I provide a proximity analysis in addition to the intersection
analysis. Both results are similar
Figure A.2.1 Visualization of code co-occurrence (excluding general safeguards) in the empirical case studies. [Network diagram; node code frequencies: Fairness (excl. safeguards) (1511), Globalism (796), Sovereignty (777), Security (531), Production (404).]

to the analysis visualized in Chapter 9; differences, albeit not unidirectional, are limited to
the connection between security and globalism.
The visualization and analysis showed relatively strong relationships between the order
of fairness and the other orders. A more detailed analysis of code co-occurrences, that
is, checking for individual intersections between codes, indicated that some but not all of
these co-occurrences rest on strong normative underpinnings. As highlighted in Chapter
9, references to the order of fairness are commonly used in attempts to increase the
legitimacy of policies. To minimize the influence of vague principles, I repeated the analysis
without the code 'robust safeguards', which comprises general references to safeguards that
do not specify their nature. References to 'necessity' or 'proportionality' or the protection of
'privacy as a fundamental/human right' were retained, while general mentions of 'safeguards'
were excluded. The results are displayed in Figure A.2.1. Compared with the original
analysis, the links between security and the order of fairness and between production and
the order of fairness decreased considerably, which suggests that many of these references
were general and unspecific, with little substantive connection.
In the main analysis, I counted only co-occurrences of intersecting codes, that is, direct
overlaps between coded segments. Figure A.2.2 displays a visualization of co-occurrences
within ten paragraphs (proximity). This measure captures not only direct overlaps but also
justifications from different orders that are put forward in close proximity. This analysis
yields similar results, with the exception of the connection between security and globalism,
which is less clear.
Figure A.2.2 Visualization of code co-occurrence (within 10 paragraphs) in the empirical case studies. [Network diagram; node code frequencies: Fairness (1761), Globalism (796), Sovereignty (777), Security (531), Production (520).]
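The two counting rules described above can be illustrated with a minimal Python sketch. This is an illustrative reconstruction only, not the qualitative data analysis software actually used: the segment format (code label, start paragraph, end paragraph) and all function and variable names are assumptions introduced for exposition.

```python
# Illustrative sketch of the two co-occurrence counting rules:
# intersection (coded segments overlap directly) and proximity
# (segments fall within a given number of paragraphs of each other).
from collections import Counter
from itertools import combinations

def co_occurrences(segments, max_distance=0):
    """Count pairwise co-occurrences of distinct codes.

    max_distance=0  -> intersection rule: segments must overlap directly.
    max_distance=10 -> proximity rule: segments within ten paragraphs
                       of each other also count.
    """
    counts = Counter()
    for (code_a, s_a, e_a), (code_b, s_b, e_b) in combinations(segments, 2):
        if code_a == code_b:
            continue
        # gap <= 0 means the paragraph spans overlap directly;
        # a positive gap is the distance between the two segments.
        gap = max(s_a, s_b) - min(e_a, e_b)
        if gap <= max_distance:
            counts[frozenset((code_a, code_b))] += 1
    return counts

# Hypothetical coded segments: (code, start paragraph, end paragraph).
segments = [
    ("fairness", 1, 3),
    ("security", 3, 4),     # overlaps 'fairness' in paragraph 3
    ("globalism", 12, 13),  # near 'security' but without direct overlap
]

intersection = co_occurrences(segments, max_distance=0)
proximity = co_occurrences(segments, max_distance=10)
```

Under the intersection rule only the fairness/security pair counts; under the proximity rule the globalism segment, eight paragraphs away from the security segment, is counted as well, which is why the proximity analysis generally yields higher co-occurrence figures.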
References

1 BvR 2835/17, Rn. 1–332. 2020. Federal Constitutional Court (Germany).


29WP. 2002. ‘Opinion 6/2002 on Transmission of Passenger Manifest Information and
Other Data from Airlines to the United States’. WP 66. Brussels.
29WP. 2004. ‘Opinion 02/2004 Adequate Protection of Personal Data Contained in the PNR
of Air Passengers to Be Transferred to the United States’ Bureau of Customs and Border
Protection (US CBP)’. WP 87. Brussels.
29WP. 2006. ‘Opinion 10/2006 on the Processing of Personal Data by the Society for
Worldwide Interbank Financial Telecommunication (SWIFT)’. WP 128. Brussels.
29WP. 2011. ‘Opinion 14/2011 on Data Protection Issues Related to the Prevention of
Money Laundering and Terrorist Financing’. WP 186. Brussels.
29WP. 2013a. ‘Letter to the LIBE Committee’. Ref. Ares(2013)3212708. Brussels.
29WP. 2013b. ‘Letter from the Chairman of the 29WP to Alexander Seger, Data Protection
and Cybercrime Division, Council of Europe’. Ref. Ares(2013)3645289. https://ec.
europa.eu/justice/article-29/documentation/other-document/files/2013/20131205_
wp29_letter_to_cybercrime_committee.pdf, accessed 3 July 2022.
29WP. 2014. ‘Guidelines on the Implementation of the Court of Justice of the European
Union Judgement on “Google Spain and Inc v. Agencia Espanola de Proteccion de Datos
(AEPD) and Mario Costeja Gonzales” C-131/12’. WP 225. Brussels.
29WP. 2015. ‘Statement of the Article 29 Working Party on the Implementation of the Judge-
ment of the Court of Justice of the European Union of 6 October 2015 in the Maximilian
Schrems v Data Protection Commissioner Case (C-362-14)’. Brussels.
29WP. 2017. ‘Statement of the Article 29 Working Party. Data Protection and Privacy
Aspects of Cross-Border Access to Electronic Evidence’. Brussels.
29WP. 2018. ‘Letter of the Chair of the 29WP to the European Commission’. https://ec.
europa.eu/newsroom/article29/document.cfm?action=display&doc_id=51023.
29WP and Working Party on Police and Justice. 2009. ‘The Future of Privacy: Joint Con-
tribution to the Consultation of the European Commission on the Legal Framework for
the Fundamental Right to Protection of Personal Data’. WP 168. Brussels.
Abbott, Andrew. 1986. ‘Jurisdictional Conflicts: A New Approach to the Develop-
ment of the Legal Professions’. American Bar Foundation Research Journal 11 (2):
187–224.
Abbott, Andrew. 1988. The System of Professions: An Essay on the Division of Expert Labor.
Chicago: University of Chicago Press.
Abbott, Andrew. 2005. ‘Linked Ecologies: States and Universities as Environments for
Professions’. Sociological Theory 23 (3): 245–74.
Acquisti, Alessandro, Curtis Taylor, and Liad Wagman. 2016. ‘The Economics of Privacy’.
Journal of Economic Literature 54 (2): 442–92.
Adler, Emanuel. 1997. ‘Imagined (Security) Communities: Cognitive Regions in Interna-
tional Relations’. Millennium 26 (2): 249–77.
Adler-Nissen, Rebecca, and Vincent Pouliot. 2014. ‘Power in Practice: Negotiating the
International Intervention in Libya’. European Journal of International Relations 20 (4):
889–911.
230 REFERENCES

Advisory Council to Google. 2015. ‘The Advisory Council to Google on the Right
to Be Forgotten’. https://static.googleusercontent.com/media/archive.google.com/de//
advisorycouncil/advisement/advisory-report.pdf, accessed 3 July 2022.
Aguinaldo, Angela, and Paul de Hert. 2021. ‘European Law Enforcement and US Data
Companies: A Decade of Cooperation Free from Law’. In Data Protection beyond Bor-
ders: Transatlantic Perspectives on Extraterritoriality and Sovereignty, edited by Federico
Fabbrini, Edoardo Celeste, and John Quinn, 157–72. London: Bloomsbury.
Allan, Bentley B. 2017. ‘Producing the Climate: States, Scientists, and the Constitution of
Global Governance Objects’. International Organization 71 (1): 131–62.
Allen, Anita L. 1988. Uneasy Access: Privacy for Women in a Free Society. Totowa, NJ:
Rowman & Littlefield.
Alter, Karen J., Emilie M. Hafner-Burton, and Laurence R. Helfer. 2019. ‘Theorizing the Judi-
cialization of International Relations’. International Studies Quarterly 63 (3): 449–63. doi:
10.1093/isq/sqz019.
Alter, Karen J., and Sophie Meunier. 2009. ‘The Politics of International Regime Complex-
ity’. Perspectives on Politics 7 (1): 13–24. doi: 10.1017/S1537592709090033.
Amicelle, Anthony. 2011. ‘The Great (Data) Bank Robbery: Terrorist Finance Track-
ing Program and the “SWIFT Affair”’. Research Questions 36. Centre d’Études
et de Recherches Internationales Sciences Po. https://www.sciencespo.fr/ceri/sites/
sciencespo.fr.ceri/files/qdr36.pdf, accessed 3 July 2022.
Amicelle, Anthony. 2013. ‘The EU’s Paradoxical Efforts at Tracking the Financing of Terror-
ism: From Criticism to Imitation of Dataveillance’. CEPS Liberty and Security and Europe
56.
Amoore, Louise. 2011. ‘Data Derivatives: On the Emergence of a Security Risk Calculus for
Our Times’. Theory, Culture & Society 28 (6): 24–43. doi: 10.1177/0263276411417430.
Amoore, Louise. 2013. The Politics of Possibility: Risk and Security beyond Probability.
Durham: Duke University Press.
Amoore, Louise. 2020. Cloud Ethics. New York: Duke University Press. doi:
10.1515/9781478009276.
Amoore, Louise, and Marieke de Goede. 2005. ‘Governance, Risk and Dataveillance in the
War on Terror’. Crime, Law and Social Change 43 (2–3): 149–73.
Anagnostakis, Dimitrios. 2017. EU–US Cooperation on Internal Security: Building a
Transatlantic Regime. Abingdon and New York: Routledge.
Anderson, John Ward. 2006. ‘Belgium Rules Sifting of Bank Data Illegal’. Washington Post,
29 September 2006. http://www.washingtonpost.com/wp-dyn/content/article/2006/09/
28/AR2006092801846.html, accessed 3 July 2022.
APEC. 2005. ‘APEC Privacy Framework’. APEC#205-SO-01.2. CTI Sub-Fora & Industry
Dialogues Groups, Digital Economy Steering Group (DESG).
Appiah, Kwame Anthony. 2010. The Ethics of Identity. Princeton, NJ: Princeton University
Press.
Apple, Facebook, Google, Microsoft, and Oath. 2018. ‘Joint Letter to Senator Orrin Hatch,
Senator Christopher Coons, Senator Lindsey Graham, Senator Sheldon Whitehouse’.
https://blogs.microsoft.com/datalaw/wp-content/uploads/sites/149/2018/02/Tech-
Companies-Letter-of-Support-for-Senate-CLOUD-Act-020618.pdf, accessed 3 July
2022.
Aradau, Claudia. 2008. ‘Forget Equality? Security and Liberty in the “War on Terror”’.
Alternatives: Global, Local, Political 33 (3): 293–314.
Aradau, Claudia. 2020. ‘Experimentality, Surplus Data and the Politics of Debilitation in
Borderzones’. Geopolitics, December, 1–21. doi: 10.1080/14650045.2020.1853103.
Archibugi, D., D. Held, and M. Köhler. 1998. Re-Imagining Political Community: Studies in
Cosmopolitan Democracy. Stanford, CA: Stanford University Press.
Arendt, Hannah. 1958. The Human Condition. 2nd edn. Chicago: University of Chicago
Press.
Arendt, Hannah. 2017. ‘Freedom and Politics’. In Liberty Reader, edited by David Miller,
58–79. London: Routledge.
Argomaniz, Javier. 2009. ‘When the EU Is the “Norm-Taker”: The Passenger Name Records
Agreement and the EU’s Internalization of US Border Security Norms’. European Integra-
tion 31 (1): 119–36.
Argomaniz, Javier. 2011. The EU and Counter-Terrorism: Politics, Polity and Policies after
9/11. New York: Routledge.
Argomaniz, Javier. 2015. ‘European Union Responses to Terrorist Use of the Internet’.
Cooperation and Conflict 50 (2): 250–68.
Argomaniz, Javier, Oldrich Bures, and Christian Kaunert. 2017. EU Counter-Terrorism and
Intelligence: A Critical Assessment. London: Routledge.
Aristotle. 2012. Aristotle’s Nicomachean Ethics. Translated by R. C. Bartlett and S. D. Collins.
Chicago: University of Chicago Press.
Article 19. 2017. ‘The Global Principles on Protection of Freedom of Expression and
Privacy’. http://article19.shorthand.com/, accessed 3 July 2022.
Article 19 and Others. 2017. ‘In the CJEU Case C-507/17: Written Observations of Article 19
and Others’. Court of Justice of the European Union.
ASEAN. 2016. ‘ASEAN Telecommunications and Information Technology Ministers Meet-
ing (TELMIN): Framework on Personal Data Protection’. Brunei Darussalam.
Association of European Airlines. 2007. ‘European Airline Body Dismayed at Pro-
posal for EU-PNR System’. https://www.statewatch.org/news/2007/nov/eu-pnr-airlines-
reactions.pdf, accessed 3 July 2022.
AU. 2014. African Union Convention on Cyber Security and Personal Data Protection.
EX.CL/846(XXV ).
Autolitano, Simona, Anja Dahlmann, Tommaso De Zan, and Vincent Joubert. 2016.
‘EUnited against Crime: Improving Criminal Justice in European Union Cyberspace’. IAI
Istituto Affari Internazionali, November. https://www.iai.it/en/pubblicazioni/eunited-
against-crime-improving-criminal-justice-european-union-cyberspace, accessed 4 July
2022.
Avant, Deborah D., Martha Finnemore, and Susan K. Sell. 2010. Who Governs the Globe?
Cambridge and New York: Cambridge University Press.
Aviation and Transportation Security Act. 2001. Vol. S. 1447.
Azurmendi, Ana. 2017. ‘Spain: The Right to Be Forgotten; the Right to Privacy and
the Initiative Facing the New Challenges of the Information Society’. In Privacy, Data
Protection and Cybersecurity in Europe, edited by Wolf J Schünemann and Max-Otto
Baumann,17–30. Cham: Springer.
Bagger Tranberg, Charlotte. 2011. ‘Proportionality and Data Protection in the Case Law
of the European Court of Justice’. International Data Privacy Law 1 (4): 239–48. doi:
10.1093/idpl/ipr015.
Baker, Steven. 1994. ‘Don’t Worry Be Happy: Why Clipper Is Good for You’. http://groups.
csail.mit.edu/mac/classes/6.805/articles/baker-clipper.txt, accessed 4 July 2022.
Bąkowski, Piotr, and Sofija Voronova. 2015. ‘The Proposed EU Passenger Name
Records (PNR) Directive: Revived in the New Security Context’. European Parliamen-
tary Research Service Blog, May. https://epthinktank.eu/2015/05/04/the-proposed-eu-
passenger-name-records-pnr-directive-revived-in-the-new-security-context/, accessed
4 July 2022.
Ball, Kirstie, and Laureen Snider. 2013. The Surveillance-Industrial Complex: A Political
Economy of Surveillance. Abingdon and New York: Routledge.
Balzacq, Thierry. 2007. ‘The Policy Tools of Securitization: Information Exchange, EU
Foreign and Interior Policies’. Journal of Common Market Studies 46 (1): 75–100. doi:
10.1111/j.1468-5965.2007.00768.x.
Barendt, Eric. 2005. Freedom of Speech. Oxford: Oxford University Press.
Bartlett, Jamie, and Louis Reynolds. 2015. The State of the Art 2015: A Literature Review
of Social Media Intelligence Capabilities for Counter-Terrorism. London: Demos.
Bauman, Zygmunt, Didier Bigo, Paulo Esteves, Elspeth Guild, Vivienne Jabri, David Lyon,
and R. B. J. Walker. 2014. ‘After Snowden: Rethinking the Impact of Surveillance’.
International Political Sociology 8 (2): 121–44. doi: 10.1111/ips.12048.
Beatty, Andrew. 2003. ‘Decimated Airlines Seek Solution to Passenger Data Feud’. EUob-
server, 26 March 2003. https://euobserver.com/justice/10701, accessed 4 July 2022.
Beckerman, Michael. 2015. ‘Examining the EU Safe Harbor Decision and Impacts for
Transatlantic Data Flows’. Washington DC.
Beitz, Charles R. 1979. ‘Bounded Morality: Justice and the State in World Politics’. Interna-
tional Organization 33 (3): 405–24.
Beitz, Charles R. 2001. ‘Human Rights as a Common Concern’. American Political Science
Review 95 (2): 269–82.
Bell, Daniel A. 1993. Communitarianism and Its Critics. Oxford: Clarendon Press.
Bell, Daniel A. 2005. ‘A Communitarian Critique of Liberalism’. Analyse & Kritik 27 (2):
215–238.
Bellanova, Rocco, and Marieke de Goede. 2022. ‘The Algorithmic Regulation of Security:
An Infrastructural Perspective’. Regulation & Governance 16 (1): 102–18.
Bellanova, Rocco, and Denis Duez. 2012. ‘A Different View on the “Making” of European
Security: The EU Passenger Name Record System as a Socio-Technical Assemblage’.
European Foreign Affairs Review 17 (2): 109–24.
Bénatouïl, Thomas. 1999. ‘A Tale of Two Sociologies: The Critical and the Pragmatic Stance
in Contemporary French Sociology’. European Journal of Social Theory 2 (3): 379–96.
doi: 10.1177/136843199002003011.
Benhabib, Seyla. 2004. The Rights of Others: Aliens, Residents, and Citizens. Cambridge and
New York: Cambridge University Press.
Bennett, Colin John. 1992. Regulating Privacy: Data Protection and Public Policy in Europe
and the United States. Ithaca: Cornell University Press.
Bennett, Colin John. 2008. The Privacy Advocates: Resisting the Spread of Surveillance.
Cambridge, MA: MIT Press.
Bennett, Colin John, and Charles D. Raab. 2006. The Governance of Privacy: Policy
Instruments in Global Perspective. Cambridge, MA: MIT Press.
Bennett, Colin John, and Charles D. Raab. 2020. ‘Revisiting the Governance of Privacy:
Contemporary Policy Instruments in Global Perspective’. Regulation & Governance 14
(3): 447–64. doi: 10.1111/rego.12222.
Bertram, Theo, Elie Bursztein, Stephanie Caro, Hubert Chao, Rutledge Chin Feman, Peter
Fleischer, Albin Gustafsson, Jess Hemerly, Chris Hibbert, and Luca Invernizzi. 2019. ‘Five
Years of the Right to Be Forgotten’. In Proceedings of the Conference on Computer and
Communications Security, 11–15 November, London, 959–72. New York: Association
for Computing Machinery.
Besek, June M. 2004. ‘Anti-Circumvention Laws and Copyright: A Report from the Ker-
nochan Center for Law, Media and the Arts’. Columbia Journal of Law & Arts 27 (4):
385–519.
Bessette, Rändi, and Virginia Haufler. 2001. ‘Against All Odds: Why There Is No Inter-
national Information Regime’. International Studies Perspectives 2 (1): 69–92. doi:
10.1111/1528-3577.00038.
Bially Mattern, Janice. 2001. ‘The Power Politics of Identity’. European Journal of Interna-
tional Relations 7 (3): 349–97. doi: 10.1177/1354066101007003003.
Biasiotti, M. A., J. P. M. Bonnici, J. Cannataci, and F. Turchi. 2018. Handling and Exchang-
ing Electronic Evidence across Europe. Law, Governance and Technology Series. Cham:
Springer.
Biermann, Frank, Philipp Pattberg, Harro Van Asselt, and Fariborz Zelli. 2009. ‘The
Fragmentation of Global Governance Architectures: A Framework for Analysis’. Global
Environmental Politics 9 (4): 14–40.
Bignami, Francesca, and Giorgio Resta. 2015. ‘Transatlantic Privacy Regulation: Conflict
and Cooperation’. Law & Contemporary Problems 78 (4): 231–66.
Bigo, Didier. 2011. ‘Pierre Bourdieu and International Relations: Power of Practices,
Practices of Power’. International Political Sociology 5 (3): 225–58. doi: 10.1111/j.1749-
5687.2011.00132.x.
Bigo, Didier. 2014. ‘The (In)Securitization Practices of the Three Universes of EU Border
Control: Military/Navy—Border Guards/Police—Database Analysts’. Security Dialogue
45 (3): 209–25.
Bigo, Didier, Gertjan Boulet, Caspar Bowden, Sergio Carrera, Julien Jeandesboz, and
Amandine Scherrer. 2012. ‘Fighting Cyber Crime and Protecting Privacy in the Cloud’.
Study for the European Parliament’s Committee on Civil Liberties, Justice and Home
Affairs (LIBE), PE 462: 35–49.
Bigo, Didier, Sergio Carrera, Nicholas Hernanz, Julien Jeandesboz, Joanna Parkin,
Francesco Ragazzi, and Amandine Scherrer. 2013. ‘Mass Surveillance of Personal Data
by EU Member States and Its Compatibility with EU Law’. Liberty and Security in Europe
Papers, CEPS, no. 61.
Bigo, Didier, Lina Ewert, and Elif Mendos Kuşkonmaz. 2020. ‘The Interoperability Con-
troversy or How to Fail Successfully: Lessons from Europe’. International Journal of
Migration and Border Studies 6 (1–2): 93–114.
Bilefsky, Dan. 2006a. ‘Bank Consortium Faces Outcry on Data Transfer: Europe: Interna-
tional Herald Tribune’. The New York Times, 28 June 2006. http://www.nytimes.com/
2006/06/28/world/europe/28iht-suit.2071000.html, accessed 4 July 2022.
Bilefsky, Dan. 2006b. ‘Belgians Say Banking Group Broke European Rules in Giving Data
to U.S.’. The New York Times, 29 September 2006. https://www.nytimes.com/2006/09/29/
world/europe/29swift.html, accessed 4 July 2022.
Bjerregaard, Elisabetta, and Tom Kirchmaier. 2019. ‘The Danske Bank Money Laundering
Scandal: A Case Study’. Copenhagen Business School, CBS 2019, Working paper. doi:
10.2139/ssrn.3446636.
Blasi Casagran, Christina. 2016. Global Data Protection in the Field of Law Enforce-
ment: An EU Perspective. Routledge Research in EU Law. London and New York:
Routledge.
Blokker, Paul, and Andrea Brighenti. 2011. ‘Politics between Justification and Defiance’.
European Journal of Social Theory 14 (3): 283–300. doi: 10.1177/1368431011412346.
BMI. 2013. ‘Abteilungsleiterrunde zur Koordinierung der Europapolitik am Donner-
stag, dem 12. Dezember 2013 um 8.30 Uhr im BMWi. TOP 6 Datenschutz’. MAT
A BMI-1-8d_8. https://wikileaks.org/bnd-inquiry/docs/BMI/MAT%20A%20BMI-1-8/
MAT%20A%20BMI-1-8d_8.pdf, accessed 4 July 2022.
BMI. 2014. ‘Aktenvorlage an den 1. Untersuchungsausschuss des Deutschen Bundestages
in der 18. WP’. MAT A BMI 6b. Berlin. https://wikileaks.org/bnd-inquiry/docs/BMI/
MAT%20A%20BMI-6b.pdf, accessed 4 July 2022.
Bodó, Balázs, Helene von Schwichow, and Naomi Appelman. 2020. ‘Money Talks? Report
on the One-Day Symposium on the Impact of Corporate Funding on Information Law
Research’. Institute for Information Law Research Paper; Amsterdam Law School Research
Paper No. 2020–16.
Bodoni, Stephanie. 2019. ‘Google Clash over Global Right to Be Forgotten Returns to
Court’. Bloomberg, 9 January 2019. https://www.bloomberg.com/news/articles/2019-
01-09/google-clash-over-global-right-to-be-forgotten-returns-to-court, accessed 4 July
2022.
Bodoni, Stephanie. 2022. ‘Meta, Google Face Data Doomsday as Key EU Decision Looms’.
Bloomberg, 18 February 2022. https://www.bloomberg.com/news/articles/2022-02-18/
meta-google-face-eu-data-blackout-as-ruling-on-contracts-looms, accessed 4 July 2022.
Boettcher, Miranda. 2020. ‘Cracking the Code: How Discursive Structures Shape Climate
Engineering Research Governance’. Environmental Politics 29 (5): 890–916.
Bois, Julien Raymond Florent. 2021. ‘The Uncertain World of the Court of Justice of the
European Union: A Multidisciplinary Approach of the Legitimacy of the EU Judiciary in
the 21st Century’. PhD Dissertation. Berlin: Hertie School.
Bolkestein, Frits. 2003. ‘EU/US Talks on Transfers of Airline Passengers’ Personal Data’.
Brussels, September 9. europa.eu/rapid/press-release_SPEECH-03-613_en.pdf.
Boltanski, Luc, and Laurent Thévenot. 1999. ‘The Sociology of Critical Capacity’. European
Journal of Social Theory 2 (3): 359–77. doi: 10.1177/136843199002003010.
Boltanski, Luc, and Ève Chiapello. 2005. The New Spirit of Capitalism. London and New
York: Verso.
Boltanski, Luc, and Laurent Thévenot. 2006. On Justification: Economies of Worth. Prince-
ton Studies in Cultural Sociology. Princeton, NJ: Princeton University Press.
Boltanski, Luc. 2011. On Critique: A Sociology of Emancipation. Cambridge and Malden,
MA: Polity.
Boltanski, Luc. 2012. Love and Justice as Competences. Cambridge and Malden, MA: Polity.
Bot, Yves. 2015. ‘Opinion of Advocate General Bot Delivered on 23 September 2015 Case
C-362/14 Maximilian Schrems v Data Protection Commissioner’. ECLI:EU:C:2015:627.
Bourdieu, Pierre. 1985. ‘The Social Space and the Genesis of Groups’. Information (Inter-
national Social Science Council) 24 (2): 195–220.
Bourdieu, Pierre. 1989. ‘Social Space and Symbolic Power’. Sociological Theory 7 (1): 14–25.
Bourdieu, Pierre. 1990. The Logic of Practice. Stanford, CA: Stanford University Press.
Bourdieu, Pierre. 1991. Language and Symbolic Power. Cambridge, MA: Harvard Univer-
sity Press.
Bourdieu, Pierre. 1993. The Field of Cultural Production: Essays on Art and Literature. New
York: Columbia University Press.
Bourdieu, Pierre. 1996. The Rules of Art: Genesis and Structure of the Literary Field.
Stanford, CA: Stanford University Press.
Bourdieu, Pierre. 1998. Practical Reason: On the Theory of Action. Stanford, CA: Stanford
University Press.
Bourdieu, Pierre, and Loïc Wacquant. 1992. An Invitation to Reflexive Sociology. Chicago
and London: The University of Chicago Press.
Bradford, Anu. 2020. The Brussels Effect: How the European Union Rules the World. New
York: Oxford University Press.
Brehmer, Jacqueline H. 2018. ‘Data Localization: The Unintended Consequences of Privacy
Litigation’. American University Law Review 67 (3): 927–69.
Brin, Sergey, and Lawrence Page. 1998. ‘The Anatomy of a Large-Scale Hypertext Web
Search Engine’. http://infolab.stanford.edu/%7Ebackrub/google.html, accessed 4 July
2022.
Britz, Johannes J. 2008. ‘Making the Global Information Society Good: A Social Justice
Perspective on the Ethical Dimensions of the Global Information Society’. Journal of the
American Society for Information Science and Technology 59 (7): 1171–83.
Britz, Malena, and Arita Eriksson. 2005. ‘The European Security and Defence Policy: A
Fourth System of European Foreign Policy?’ Politique Européenne 17 (3): 35–62. doi:
10.3917/poeu.017.0035.
Brkan, Maja. 2016. ‘Data Protection and Conflict-of-Laws: A Challenging Relationship’.
European Data Protection Law Review 2: 324–41.
Brkan, Maja. 2017. ‘The Court of Justice of the EU, Privacy and Data Protection: Judge-
Made Law as a Leitmotif in Fundamental Rights Protection’. In Courts, Privacy and
Data Protection in the Digital Environment, edited by Maja Brkan and Evangelia Psy-
chogiopoulou, 10–31. Cheltenham: Edward Elgar Publishing.
Brkan, Maja, and Evangelia Psychogiopoulou. 2017. Courts, Privacy and Data Protection in
the Digital Environment. Cheltenham: Edward Elgar Publishing.
Brouwer, Evelien. 2009. ‘Towards a European PNR System? Questions on the Added Value
and the Protection of Fundamental Rights’. Study Requested by the European Parliament’s
Committee on Civil Liberties, Justice and Home Affairs (LIBE).
Bruguière, Jean-Louis. 2010. ‘Second Report on the Processing of EU-Originating Personal
Data by the United States Treasury Department for Counter Terrorism Purposes’. http://
www.statewatch.org/news/2010/aug/eu-usa-swift-2nd-bruguiere-report.pdf, accessed 4
July 2022.
Brzoska, Michael. 2011. ‘The Role of Effectiveness and Efficiency in the European Union’s
Counterterrorism Policy: The Case of Terrorist Financing’. Economics of Security Work-
ing Paper No. 51.
Buchanan, Allen. 2007. Justice, Legitimacy, and Self-Determination: Moral Foundations for
International Law. Oxford: Oxford University Press.
Buchanan, Allen. 2008. ‘Human Rights and the Legitimacy of the International Order’. Legal
Theory 14 (1): 39–70. doi: 10.1017/S1352325208080038.
Bueger, Christian. 2014. ‘Pathways to Practice: Praxiography and International Politics’.
European Political Science Review 6 (3): 383–406.
Bueger, Christian, and Frank Gadinger. 2014. International Practice Theory: New Perspec-
tives. Basingstoke: Palgrave Macmillan.
Bull, Hedley. 2002. The Anarchical Society. New York: Columbia University Press.
Burchardt, Dana. 2020. ‘Backlash against the Court of Justice of the EU? The Recent
Jurisprudence of the German Constitutional Court on EU Fundamental Rights as a
Standard of Review’. German Law Journal 21 (S1): 1–18.
Burman, Anirudh. 2020. ‘Will India’s Proposed Data Protection Law Protect Privacy and
Promote Growth?’ Working Paper. Carnegie India. https://carnegieendowment.org/
files/Burman_Data_Privacy.pdf.
Burri, Mira, and Rahel Schär. 2016. ‘The Reform of the EU Data Protection Framework:
Outlining Key Changes and Assessing Their Fitness for a Data-Driven Economy’. Journal
of Information Policy 6 (1): 479–511.
Busch, Andreas. 2013. ‘The Regulation of Transborder Data Traffic: Disputes across the
Atlantic’. Security & Human Rights 23 (4): 313–30.
Busch, Andreas, and David Levi-Faur. 2011. ‘The Regulation of Privacy’. In Handbook on
the Politics of Regulation, edited by David Levi-Faur, 227–41. Cheltenham: Edward Elgar
Publishing.
Butler, Alan. 2017. ‘Whither Privacy Shield in the Trump Era’. European Data Protection
Law Review 3: 111–13.
Buzan, Barry, Ole Wæver, and Jaap De Wilde. 1998. Security: A New Framework for
Analysis. Boulder, CO, and London: Lynne Rienner Publishers.
Cabranes, José A. 2017. ‘José A. Cabranes, Circuit Judge, joined by Dennis Jacobs, Reena
Raggi and Christopher F. Droney, Circuit Judges, Dissenting from the Order Deny-
ing Rehearing En Banc’. http://online.wsj.com/public/resources/documents/Cabranes_
dissental01242017.pdf, accessed 4 July 2022.
Cadwalladr, Carole, and Duncan Campbell. 2019. ‘Revealed: Facebook’s Global Lobbying
against Data Privacy Laws’. The Observer, 2 March 2019. https://www.theguardian.com/
technology/2019/mar/02/facebook-global-lobbying-campaign-against-data-privacy-
laws-investment, accessed 4 July 2022.
Cadwalladr, Carole, and Emma Graham-Harrison. 2018. ‘Revealed: 50 Million Facebook
Profiles Harvested for Cambridge Analytica in Major Data Breach’. The Guardian, 17
March 2018. https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-
facebook-influence-us-election, accessed 4 July 2022.
Calame, Byron. 2006a. ‘Secrecy, Security, the President and the Press’. The New York Times,
2 July 2006. https://www.nytimes.com/2006/07/02/opinion/02pub-ed.html, accessed 4
July 2022.
Calame, Byron. 2006b. ‘Can “Magazines” of The Times Subsidize News Coverage?’ The
New York Times, 22 October 2006. https://www.nytimes.com/2006/10/22/opinion/
22pubed.html, accessed 4 July 2022.
Cammaerts, Bart, and Robin Mansell. 2020. ‘Digital Platform Policy and Regulation:
Toward a Radical Democratic Turn’. International Journal of Communication 14: 135–
54.
Campbell, David. 1992. Writing Security: United States Foreign Policy and the Politics of
Identity. Minneapolis, MN: University of Minnesota Press.
Cannataci, Joseph A., and Jeanne Pia Mifsud-Bonnici. 2005. ‘Data Protection Comes of
Age: The Data Protection Clauses in the European Constitutional Treaty’. Information
& Communications Technology Law 14 (1): 5–15. doi: 10.1080/1360083042000325274.
Carrera, Sergio, Nicholas Hernanz, and Joanna Parkin. 2013. ‘The “Lisbonisation” of the
European Parliament: Assessing Progress, Shortcomings and Challenges for Democratic
Accountability in the Area of Freedom, Security and Justice’. Centre for European Policy
Studies Liberty and Security in Europe (CEPS), no. 58.
Carrier Directive. Council Directive 2004/82/EC of 29 April 2004 on the Obligation of
Carriers to Communicate Passenger Data. 2004. OJ L 261.
Carrubba, Clifford J., and Matthew J. Gabel. 2015. International Courts and the Performance
of International Agreements: A General Theory with Evidence from the European
Union. New York: Cambridge University Press.
Cartwright, Madison. 2020. ‘Internationalising State Power through the Internet:
Google, Huawei and Geopolitical Struggle.’ Internet Policy Review 9 (3). doi:
10.14763/2020.3.1494.
Castells, Manuel. 2011. The Rise of the Network Society. Vol. I. 2nd edn. Malden, MA:
John Wiley & Sons.
Cellan-Jones, Rory. 2018. ‘Microsoft Sinks Data Centre off Orkney’. BBC News, 6 June 2018.
https://www.bbc.com/news/technology-44368813, accessed 4 July 2022.
Charter of Fundamental Rights of the European Union. 2000. OJ 2000/C 364.
Chenou, Jean-Marie, and Roxana Radu. 2019. ‘The “Right to Be Forgotten”: Negotiating
Public and Private Ordering in the European Union’. Business & Society 58 (1): 74–102.
doi: 10.1177/0007650317717720.
Chertoff, Michael. 2006. ‘A Tool We Need to Stop the Next Airliner Plot’. Washington Post,
29 August 2006. https://www.washingtonpost.com/archive/opinions/2006/08/29/a-
tool-we-need-to-stop-the-next-airliner-plot/bcd240b8-8d61-4664-8f8f-f45d5b3cfaf7/,
accessed 4 July 2022.
Choucri, Nazli. 2012. Cyberpolitics in International Relations. Cambridge, MA, and Lon-
don: MIT Press.
Christakis, Theodore. 2020. ‘E-Evidence in the EU Parliament: Basic Features of Birgit
Sippel’s Draft Report’. European Law Blog, January. https://europeanlawblog.eu/2020/
01/21/e-evidence-in-the-eu-parliament-basic-features-of-birgit-sippels-draft-report/,
accessed 4 July 2022.
Christou, George. 2018. ‘The Challenges of Cybercrime Governance in the European
Union’. European Politics and Society 19 (3): 355–75.
Christou, George, and Imir Rashid. 2021. ‘Interest Group Lobbying in the European Union:
Privacy, Data Protection and the Right to Be Forgotten’. Comparative European Politics
19 (3): 380–400. doi: 10.1057/s41295-021-00238-5.
Churches, Genna, and Monika Zalnieriute. 2020. ‘A Groundhog Day in Brussels’. Verfassungsblog. 16 July 2020. https://verfassungsblog.de/a-groundhog-day-in-bruessels/, accessed 4 July 2022.
CJEU. 2014. ‘Press Release No 70/14: Judgment in Case C-131/12 Google Spain SL, Google
Inc. v Agencia Española de Protección de Datos, Mario Costeja González’. https://curia.
europa.eu/jcms/upload/docs/application/pdf/2014-05/cp140070en.pdf, accessed 4 July
2022.
CJEU. 2017. Opinion 1/15. The Court Declares That the Agreement Envisaged between the
European Union and Canada on the Transfer of Passenger Name Record Data May Not
Be Concluded in its Current Form. Court of Justice of the EU: General Court. 2017.
CJEU. 2020. ‘Press Release No 91/20: The Court of Justice Invalidates Decision 2016/1250
on the Adequacy of the Protection Provided by the EU-US Data Protection Shield’.
Luxembourg.
CJEU. 2022. ‘Press Release No 19/22. Advocate General’s Opinion in Case C-817/19 Ligue
des Droits Humains’. Luxembourg.
Clarke, Richard A., Michael J. Morell, Geoffrey R. Stone, Cass R. Sunstein, and Peter
Swire. 2013. ‘Liberty and Security in a Changing World’. The President’s Review Group
on Intelligence and Communications Technologies. https://obamawhitehouse.archives.
gov/blog/2013/12/18/liberty-and-security-changing-world, accessed 4 July 2022.
Clifford, Damian, and Jef Ausloos. 2018. ‘Data Protection and the Role of Fairness’. Yearbook
of European Law 37: 130–87.
CLOUD Act. H.R.4943 - Clarifying Lawful Overseas Use of Data Act. 2018.
CNIL. 2015a. ‘CNIL Orders Google to Apply Delisting on All Domain Names of the
Search Engine’. June. https://web.archive.org/web/20180201043309/https://www.cnil.
fr/fr/node/15790.
CNIL. 2015b. ‘Right to Delisting: Google Informal Appeal Rejected’. September. https://
web.archive.org/web/20181119132900/https://www.cnil.fr/en/right-delisting-google-
informal-appeal-rejected-0.
CNIL. 2016. ‘Délibération N°2016-054 du 10 Mars 2016’. https://www.legifrance.gouv.fr/affichCnil.do?id=CNILTEXT000032291946, accessed 4 July 2022.
CNIL. 2019. ‘Deliberation of the Restricted Committee SAN-2019-001 of 21 January 2019
Pronouncing a Financial Sanction against GOOGLE LLC’. https://www.cnil.fr/sites/
default/files/atoms/files/san-2019-001.pdf.
CoE. 1981. ‘Convention for the Protection of Individuals with Regard to Automatic Process-
ing of Personal Data’. ETS No.108. https://www.coe.int/en/web/conventions/full-list/-/
conventions/treaty/108, accessed 4 July 2022.
CoE. 1990. ‘Convention on Laundering, Search, Seizure and Confiscation of the Proceeds
from Crime’. ETS No.141. Strasbourg. https://rm.coe.int/168007bd23.
CoE. 2001. ‘Convention on Cybercrime’. ETS No 185. Budapest. http://conventions.coe.
int/Treaty/EN/Treaties/Html/185.htm, accessed 4 July 2022.
CoE. 2002. ‘Guidelines on Human Rights and the Fight Against Terrorism’. H (2002) 4.
Strasbourg.
CoE. 2012. ‘Recommendation CM/Rec(2012)3 of the Committee of Ministers to Mem-
ber States on the Protection of Human Rights with Regard to Search Engines’.
Strasbourg.
CoE. 2014. ‘Recommendation CM/Rec(2014)7 Adopted by the Committee of Ministers of
the Council of Europe on 30 April 2014 and Explanatory Memorandum’. Strasbourg.
CoE. 2018a. ‘Modernised Convention for the Protection of Individuals with Regard to the
Processing of Personal Data’. No 108+. Elsinore. https://search.coe.int/cm/Pages/result_
details.aspx?ObjectId=09000016807c65bf, accessed 4 July 2022.
CoE. 2018b. ‘Octopus Conference 2018: Key Messages’. Strasbourg. http://rm.coe.int/3021-
90-octo18-keymessages/16808c67bb, accessed 4 July 2022.
CoE. 2019. ‘Octopus Conference 2019: Draft List of Participants’. Strasbourg. https://rm.
coe.int/list-of-participants-octopus2019/168098d302, accessed 4 July 2022.
CoE. 2020. ‘Protocol Negotiations’. Cybercrime. 2020. https://www.coe.int/en/web/
cybercrime/t-cy-drafting-group, accessed 4 July 2022.
Cohen, Antonin. 2018. ‘Pierre Bourdieu and International Relations’. In The Oxford Hand-
book of Pierre Bourdieu, edited by Thomas Medvetz and Jeffrey J. Sallaz, 200–46. New
York: Oxford University Press.
Cohen, Julie E. 2019. Between Truth and Power: The Legal Constructions of Informational
Capitalism. New York: Oxford University Press.
Coleman, Clive. 2018. ‘Are You Ready for a Data Privacy Shake-Up?’ BBC News, 20 April
2018. https://www.bbc.com/news/technology-43657546, accessed 4 July 2022.
Commission de la Protection de la Vie Privée. 2006. ‘Avis relatif à la transmission de données
à caractère personnel par la SCRL SWIFT suite aux sommations de l’UST (OFAC)’. 37.
Brussels.
Commission de la Protection de la Vie Privée. 2008. ‘Décision du 9 décembre 2008’.
Brussels.
Connolly, Chris. 2008. ‘The US Safe Harbor: Fact or Fiction?’ Galexia. http://www.galexia.
com/public/research/assets/safe_harbor_fact_or_fiction_2008/safe_harbor_fact_or_
fiction.pdf, accessed 4 July 2022.
Connolly, Chris. 2013. ‘EU/US Safe Harbor: Effectiveness of the Framework in Relation to
National Security Surveillance’. Speaking/background notes for an appearance before the
Committee on Civil Liberties, Justice and Home Affairs (the LIBE Committee) Inquiry
on “Electronic Mass Surveillance of EU Citizens”. Strasbourg.
Cook, Joana. 2020. A Woman’s Place: US Counterterrorism Since 9/11. Oxford: Oxford
University Press.
Cool, Allison. 2018. ‘Europe’s Data Protection Law Is a Big, Confusing Mess’. The New York
Times, 15 May 2018. https://www.nytimes.com/2018/05/15/opinion/gdpr-europe-data-
protection.html, accessed 4 July 2022.
Couldry, Nick, and Ulises A. Mejias. 2019. ‘Data Colonialism: Rethinking Big Data’s Relation to the Contemporary Subject’. Television & New Media 20 (4): 336–49.
Council of the EU. 2007. ‘Proposal for a Council Framework Decision on the Use of Pas-
senger Name Records (PNR) for Law Enforcement Purposes’. MEMO/07/449. Brussels.
Council of the EU. 2009. ‘Council Decision 2010/16/CFSP/JHA of 30 November 2009 on
the Signing, on Behalf of the European Union, of the Agreement between the European
Union and the United States of America on the Processing and Transfer of Financial Mes-
saging Data from the European Union to the United States for Purposes of the Terrorist
Finance Tracking Program’. OJ L 8, 13.1.2010, 9–10.
Council of the EU. 2010. ‘EU Action Plan on Combating Terrorism’. Brussels. 15
November 2010 https://register.consilium.europa.eu/doc/srv?l=EN&f=ST%2015893%
202010%20INIT.
Council of the EU. 2010. ‘The Stockholm Programme: An Open and Secure Europe Serving
and Protecting Citizens’. 17024/09. Brussels.
Council of the EU. 2011. ‘EU Counter-Terrorism Coordinator (CTC) Report on EU Action
Plan on Combating Terrorism’. 17594/1/11 REV 1. Brussels.
Council of the EU. 2016. ‘Joint Statement of EU Ministers for Justice and Home Affairs and
Representatives of EU Institutions on the Terrorist Attacks in Brussels on 22 March 2016’.
158/16. Brussels.
Council of the EU. 2018. ‘Joint EU–U.S. Statement Following the EU–U.S. Justice and Home
Affairs Ministerial Meeting’. https://ec.europa.eu/commission/presscorner/detail/en/
STATEMENT_18_3906, accessed 4 July 2022.
Council of the EU. 2019. ‘Interoperability between EU Information Systems: Council
Adopts Regulations’. http://www.consilium.europa.eu/en/press/press-releases/2019/
05/14/interoperability-between-eu-information-systems-council-adopts-regulations/,
accessed 4 July 2022.
Council of the EU. 2020. ‘EU-Japan PNR Agreement: Council Authorises Opening of Nego-
tiations’. February. http://www.consilium.europa.eu/en/press/press-releases/2020/02/
18/eu-japan-pnr-agreement-council-authorises-opening-of-negotiations/.
CPB. 2004. ‘Undertakings of the Department of Homeland Security Bureau of Customs
and Border Protection (CBP)’. https://www.statewatch.org/news/2006/jun/eu-usa-pnr-
undertakings-may-2004.pdf, accessed 4 July 2022.
Creemers, Rogier. 2021. ‘China’s Emerging Data Protection Framework’. SSRN Scholarly
Paper. Rochester, NY: Social Science Research Network. doi: 10.2139/ssrn.3964684.
Culpepper, Pepper D., and Kathleen Thelen. 2019. ‘Are We All Amazon Primed? Consumers and the Politics of Platform Power’. Comparative Political Studies 53 (2): 288–318.
Curtin, Deirdre. 2013. ‘Official Secrets and the Negotiation of International Agreements: Is the EU Executive Unbound?’ Common Market Law Review 50 (2): 423–57.
Curtin, Deirdre. 2014. ‘Overseeing Secrets in the EU: A Democratic Perspective’. Journal of
Common Market Studies 52 (3): 684–700.
Curtin, Deirdre. 2018. ‘Second Order Secrecy and Europe’s Legality Mosaics’. West Euro-
pean Politics 41 (4): 846–68.
Cutler, A. Claire, Virginia Haufler, and Tony Porter. 1999. Private Authority and Interna-
tional Affairs. SUNY Series in Global Politics. Albany, NY: State University of New York
Press.
Cybersecurity Tech Accord. 2018. https://cybertechaccord.org/accord/, accessed 4 July 2022.
Dance, Gabriel J. X., and Jennifer Valentino-DeVries. 2020. ‘Have a Search Warrant for
Data? Google Wants You to Pay’. The New York Times, 24 January 2020. https://www.
nytimes.com/2020/01/24/technology/google-search-warrants-legal-fees.html, accessed
4 July 2022.
Daskal, Jennifer. 2018. ‘Microsoft Ireland, the CLOUD Act, and International Lawmaking
2.0’. Stanford Law Review 71: 9–16.
Daskal, Jennifer, and Peter P. Swire. 2018. ‘Privacy and Civil Liberties under the
CLOUD Act: A Response’. Lawfare, March. https://www.lawfareblog.com/privacy-and-
civil-liberties-under-cloud-act-response, accessed 4 July 2022.
Data Protection Commissioner. 2020. Data Protection Commissioner v. Facebook Ireland
Limited, Maximillian Schrems. 2020. Court of Justice of the EU: General Court. Case
C-311/18.
Data Protection Directive. 1995. Directive 95/46/EC of the European Parliament and of
the Council of 24 October 1995 on the Protection of Individuals with Regard to the
Processing of Personal Data and on the Free Movement of Such Data. OJ L281/31.
Data Retention Directive. 2006. Directive 2006/24/EC of the European Parliament and of
the Council of 15 March 2006 on the Retention of Data Generated or Processed in Con-
nection with the Provision of Publicly Available Electronic Communications Services or
of Public Communications Networks and Amending Directive 2002/58/EC. OJ L 105.
Davis, Avi. 2016. ‘After Brussels Attacks, France’s Valls Renews Push for Europe-Wide
Flight Database’. France 24, 23 March 2016. https://www.france24.com/en/20160323-
after-brussels-attacks-france-valls-renews-push-europe-wide-flight-database, accessed 4
July 2022.
Dean, Edward M. 2015. ‘Testimony of Edward M. Dean, Deputy Assistant Secretary for
Services, International Trade Administration, U.S. Department of Commerce. U.S.–EU
Safe Harbor Framework’. Washington DC.
Dean, Michelle. 2012. ‘The Story of Amanda Todd’. The New Yorker, 18 October 2012.
https://www.newyorker.com/culture/culture-desk/the-story-of-amanda-todd, accessed
4 July 2022.
de Busser, Els. 2009. Data Protection in EU and US Criminal Cooperation: A Substantive Law
Approach to the EU Internal and Transatlantic Cooperation in Criminal Matters between
Judicial and Law Enforcement Authorities. Antwerp and Apeldoorn: Maklu.
de Busser, Els, Serge Gutwirth, Ronald Leenes, and Paul de Hert. 2014. ‘Privatization of
Information and the Data Protection Reform’. In Reloading Data Protection: Multidisci-
plinary Insights and Contemporary Challenges, edited by Serge Gutwirth, Ronald Leenes,
and Paul de Hert, 129–49. Dordrecht: Springer.
de Goede, Marieke. 2008. ‘The Politics of Preemption and the War on Terror
in Europe’. European Journal of International Relations 14 (1): 161–85. doi:
10.1177/1354066107087764.
de Goede, Marieke. 2012a. Speculative Security: The Politics of Pursuing Terrorist Monies.
Minneapolis, MN, and London: University of Minnesota Press.
de Goede, Marieke. 2012b. ‘The SWIFT Affair and the Global Politics of European Security’.
Journal of Common Market Studies 50 (2): 214–30.
de Goede, Marieke. 2018a. ‘Counter-Terrorism Financing Assemblages after 9/11’. In The Palgrave Handbook of Criminal and Terrorism Financing Law, edited by Colin King, Clive Walker, and Jimmy Gurulé, 755–79. Cham: Palgrave Macmillan.
de Goede, Marieke. 2018b. ‘The Chain of Security’. Review of International Studies 44 (1):
24–42.
de Goede, Marieke, and Mara Wesseling. 2017. ‘Secrecy and Security in Transatlantic
Terrorism Finance Tracking’. Journal of European Integration 39 (3): 253–69. doi:
10.1080/07036337.2016.1263624.
de Hert, Paul, and Martin de Schutter. 2008. ‘International Transfers of Data in the Field
of JHA: The Lessons of Europol, PNR and Swift’. In Justice, Liberty, Security: New Chal-
lenges for EU External Relations, edited by S Thiel and B Martenczuk, 303–40. Brussels:
VUBPRESS.
Deibert, Ronald J., and Masashi Crete-Nishihata. 2012. ‘Global Governance and the Spread
of Cyberspace Controls’. Global Governance 18 (3): 339–61.
Deitelhoff, Nicole. 2009. ‘The Discursive Process of Legalization: Charting Islands
of Persuasion in the ICC Case’. International Organization 63 (1): 33–65. doi:
10.1017/S002081830909002X.
DeNardis, Laura. 2013. ‘The Emerging Field of Internet Governance’. In The Oxford Hand-
book of Internet Studies, edited by William H. Dutton, 555–76. Oxford: Oxford University
Press.
Dencik, Lina, Arne Hintz, and Jonathan Cable. 2016. ‘Towards Data Justice? The Ambiguity
of Anti-Surveillance Resistance in Political Activism’. Big Data & Society 3 (2): 1–12. doi:
10.1177/2053951716679678.
de Wilde, Pieter. 2019. ‘The Making of Four Ideologies of Globalization’. European Political
Science Review 11 (1): 1–18.
de Wilde, Pieter, Ruud Koopmans, and Wolfgang Kleinwächter. 2019. The Struggle over Bor-
ders: Cosmopolitanism and Communitarianism. Cambridge and New York: Cambridge
University Press.
Dhavate, Nitin, and Ramakant Mohapatra. 2022. ‘A Look at Proposed Changes to India’s (Personal) Data Protection Bill’. IAPP, 5 January 2022. https://iapp.org/news/a/a-look-at-proposed-changes-to-indias-personal-data-protection-bill/, accessed 4 July 2022.
Dhont, Jan, and María Verónica Pérez Asinari. 2004. ‘Safe Harbour Decision Implementation Study: With the Assistance of Prof. Reidenberg and Dr. Lee Bygrave, at the Request of the European Commission’. CRID. https://www.europarl.europa.eu/meetdocs/2009_2014/documents/libe/dv/07_etude_safe-harbour-2004_/07_etude_safe-harbour-2004_en.pdf.
DHS. 2005. ‘A Report Concerning Passenger Name Record Information Derived from Flights between the U.S. and the European Union’. Privacy Office. https://www.dhs.gov/xlibrary/assets/privacy/privacy_pnr_rpt_09-2005.pdf, accessed 4 July 2022.
Digital Rights Ireland v. Commission. 2014. Digital Rights Ireland v. Commission. Court of
Justice of the EU: General Court. Case T-670/16.
Dijck, José van. 2014. ‘Datafication, Dataism and Dataveillance: Big Data between Scientific
Paradigm and Ideology’. Surveillance & Society 12 (2): 197–208.
Dillon, Michael. 2002. Politics of Security: Towards a Political Philosophy of Continental Thought. London: Routledge.
Dingwerth, Klaus, and Philipp Pattberg. 2009. ‘World Politics and Organizational Fields:
The Case of Transnational Sustainability Governance’. European Journal of International
Relations 15 (4): 707–43.
Directive (EU) 2018/1673 of the European Parliament and of the Council of 23
October 2018 on Combating Money Laundering by Criminal Law. 2018. OJ
L 284.
Doneda, Danilo, and Laura Schertel Mendes. 2014. ‘Data Protection in Brazil: New
Developments and Current Challenges’. In Reloading Data Protection: Multidisciplinary
Insights and Contemporary Challenges, edited by Serge Gutwirth, Ronald Leenes, and
Paul de Hert, 3–20. Dordrecht: Springer.
Doty, Roxanne Lynn. 1993. ‘Foreign Policy as Social Construction: A Post-Positivist Anal-
ysis of US Counterinsurgency Policy in the Philippines’. International Studies Quarterly
37 (3): 297–320.
Dowd, Rebekah. 2019. ‘The Development of Digital Human Rights in the European Union:
How Key Interests Shape National and Regional Data Governance’. PhD Dissertation.
Atlanta: Georgia State University. doi: 10.57709/14937195.
Drezner, Daniel W. 2007. All Politics Is Global: Explaining International Regulatory Regimes.
Princeton, NJ: Princeton University Press.
Drezner, Daniel W. 2013. ‘The Tragedy of the Global Institutional Commons’. In Back to
Basics: State Power in a Contemporary World, edited by Martha Finnemore and Judith L. Goldstein, 280–312. Oxford and New York: Oxford University Press.
Drozdiak, Natalia. 2019. ‘EU Privacy Laws May Be Hampering Pursuit of Ter-
rorists’. Bloomberg, 8 July 2019. https://www.bloomberg.com/news/articles/2019-07-
08/european-privacy-laws-may-be-hampering-those-catching-terrorists, accessed 4 July
2022.
Dunoff, Jeffrey L., and Joel P. Trachtman. 2009. Ruling the World? Constitutionalism, Inter-
national Law, and Global Governance. Cambridge and New York: Cambridge University
Press.
Dür, Andreas, David Marshall, and Patrick Bernhagen. 2019. The Political Influence of Busi-
ness in the European Union. New Comparative Politics. Ann Arbor, MI: University of
Michigan Press.
EC. 2002. ‘Commission Staff Working Paper: The Application of Commission Decision
520/2000/EC of 26 July 2000 Pursuant to Directive 95/46 of the European Parliament
and of the Council on the Adequate Protection of Personal Data Provided by the Safe
Harbour Privacy Principles and Related Frequently Asked Questions Issued by the US
Department of Commerce’. SEC(2002) 196. Brussels.
EC. 2003. ‘Transfer of Air Passenger Name Record (PNR) Data: A Global EU Approach’.
COM(2003)826 final. Communication from the Commission to the Council and the
Parliament. Brussels.
EC. 2010a. ‘A Comprehensive Approach on Personal Data Protection in the European Union’. https://eur-lex.europa.eu/legal-content/en/txt/pdf/?uri=celex:52010dc0609&from=en, accessed 4 July 2022.
EC. 2010b. ‘On the Global Approach to Transfers of Passenger Name Record (PNR) Data
to Third Countries’. COM/2010/0492 final. Communication from the Commission.
Brussels.
EC. 2011. ‘Commission Report on the Joint Review of the Implementation of the Agreement
between the European Union and the United States of America on the Processing and
Transfer of Financial Messaging Data from the European Union to the United States for
the Purposes of the Terrorist Finance Tracking Program 17–18 February 2011’. Brussels.
EC. 2012a. ‘Observations Écrites’. sj.f(2012)819079. Brussels.
EC. 2012b. ‘Commission Staff Working Document. Report on the Second Joint Review
of the Implementation of the Agreement between the European Union and the United
States of America on the Processing and Transfer of Financial Messaging Data from the
European Union to the United States for the Purposes of the Terrorist Finance Tracking
Program October 2012’. SWD(2012) 454 final. Brussels.
EC. 2013a. ‘Report from the Commission to the European Parliament and the Council on
the Joint Review of the Implementation of the Agreement between the European Union
and the United States of America on the Processing and Transfer of Passenger Name
Records to the United States Department of Homeland Security’. COM/2013/0844 final.
Brussels.
EC. 2013b. ‘Communication from the Commission to the European Parliament and the
Council: A European Terrorist Finance Tracking System (EU TFTS)’. COM(2013) 842 final. Brussels.
EC. 2013c. ‘Annex: Joint Report from the Commission and the U.S. Treasury Depart-
ment Regarding the Value of TFTP Provided Data Pursuant to Article 6 (6) of
the Agreement between the European Union and the United States of America on
the Processing and Transfer of Financial Messaging Data from the European Union
to the United States for the Purposes of the Terrorist Finance Tracking Program’.
Brussels.
EC. 2013d. ‘Communication from the Commission to the European Parliament
and the Council on the Functioning of the Safe Harbour from the Perspective
of EU Citizens and Companies Established in the EU’. COM(2013) 847 final.
Brussels.
EC. 2013e. ‘Rebuilding Trust in EU–US Data Flows’. COM(2013) 846 final. Brussels.
EC. 2014. ‘Written Observations Submitted by the European Commission in Case
C-362/14’. sj.f(2014)4003332. Brussels.
EC. 2015a. ‘First Vice-President Timmermans and Commissioner Jourová ’s Press Confer-
ence on Safe Harbour Following the Court Ruling in Case C-362/14 (Schrems)’. Stras-
bourg. http://europa.eu/rapid/press-release_STATEMENT-15-5782_en.htm, accessed
4 July 2022.
EC. 2015b. ‘Minutes of the 62nd Meeting’. Committee on the Protection of Individuals with
regards to the processing of Personal Data (Article 31 Committee). Brussels.
EC. 2016a. ‘Questionnaire on Improving Criminal Justice in Cyberspace: Summary of Responses’. https://home-affairs.ec.europa.eu/system/files/2020-09/summary_of_replies_to_e-evidence_questionnaire_en.pdf.
EC. 2016b. ‘Commission Presents Action Plan to Strengthen the Fight against Terrorist
Financing’. 2 February 2016. https://ec.europa.eu/commission/presscorner/detail/en/
IP_16_202, accessed 4 July 2022.
EC. 2016c. ‘Transatlantic Data Flows: Restoring Trust through Strong Safeguards’.
COM(2016) 117 final. Brussels.
EC. 2016d. ‘Minutes of the 65th Meeting’. Committee on the Protection of Individuals with
Regards to the Processing of Personal Data (Article 31 Committee). Brussels.
EC. 2016e. ‘Minutes of the 68th Meeting’. Committee on the Protection of Individuals with
Regards to the Processing of Personal Data (Article 31 Committee). Brussels.
EC. 2016f. ‘European Commission Launches EU–U.S. Privacy Shield: Stronger Protec-
tion for Transatlantic Data Flows’. https://europa.eu/rapid/press-release_IP-16-2461_
en.htm, accessed 4 July 2022.
EC. 2017a. ‘Terrorist Finance Tracking Programme’. https://ec.europa.eu/home-affairs/
what-we-do/policies/crisis-and-terrorism/tftp_en, accessed 4 July 2022.
EC. 2017b. ‘Commission Staff Working Document: Joint Review of the Implementation
of the Agreement between the European Union and the United States of America on
the Processing and Transfer of Passenger Name Records (PNR) to the United States
Department of Homeland Security’. SWD/2017/014 final.
EC. 2018a. ‘Brief of the European Commission on Behalf of the European Union as Amicus
Curiae in Support of Neither Party, United States of America v. Microsoft Corporation’.
US Supreme Court.
EC. 2018b. ‘Proposal for a Regulation of the European Parliament and of the Council
on European Production and Preservation Orders for Electronic Evidence in Criminal
Matters’. COM(2018) 225 final.
EC. 2018c. ‘Proposal for a Directive of the European Parliament and of the Council Laying
Down Harmonised Rules on the Appointment of Legal Representatives for the Purpose
of Gathering Evidence in Criminal Proceedings’. COM/2018/226 final.
EC. 2018d. ‘Report from the Commission to the European Parliament and the Coun-
cil on the Second Annual Review of the Functioning of the EU–U.S. Privacy Shield
{SWD(2018) 497 Final}’. COM(2018) 860 final.
EC. 2019a. ‘Questions and Answers: Mandate for the EU–U.S. Cooperation on Elec-
tronic Evidence’. 5 February 2019. https://ec.europa.eu/commission/presscorner/detail/
en/memo_19_863.
EC. 2019b. ‘Commission Staff Working Document. Joint Review Report of the Imple-
mentation of the Agreement between the European Union and the United States of
America on the Processing and Transfer of Financial Messaging Data from the Euro-
pean Union to the United States for the Purposes of the Terrorist Finance Track-
ing Program’. SWD(2019) 301 final. https://eur-lex.europa.eu/legal-content/EN/TXT/
?uri=SWD:2019:301:FIN, accessed 4 July 2022.
EC. 2019c. ‘Commission Staff Working Document Accompanying the Document “Report from the Commission to the European Parliament and the Council on the Third Annual Review of the Functioning of the EU–U.S. Privacy Shield {COM(2019) 495 Final}”’. Brussels.
EC. 2020a. ‘A European Strategy for Data’. https://ec.europa.eu/digital-single-market/en/
policies/building-european-data-economy, accessed 4 July 2022.
EC. 2020b. ‘E-Evidence: Cross-Border Access to Electronic Evidence’. European Commis-
sion. 2020. https://ec.europa.eu/info/policies/justice-and-fundamental-rights/criminal-
justice/e-evidence-cross-border-access-electronic-evidence_en, accessed 4 July 2022.
EC. 2020c. ‘Parliamentary Questions: Answer Given by Mr Reynders on Behalf of the Euro-
pean Commission. Question Reference: E-003136/2019’. Brussels. http://www.europarl.
europa.eu/doceo/document/E-9-2019-003136-ASW_EN.html, accessed 4 July 2022.
EC. 2020d. ‘Commission Steps Up Fight against Money Laundering’. 7 May 2020. https://
ec.europa.eu/commission/presscorner/detail/en/ip_20_800, accessed 4 July 2022.
EC. 2020e. ‘Europe Fit for the Digital Age: Commission Proposes New Rules for Digital
Platforms’. 15 December 2020. https://ec.europa.eu/commission/presscorner/detail/en/
ip_20_2347, accessed 4 July 2022.
EC. 2022. ‘Adequacy Decisions’. European Commission. 2022. https://ec.europa.eu/info/
law/law-topic/data-protection/international-dimension-data-protection/adequacy-
decisions_en, accessed 4 July 2022.
EC and US Customs. 2003. ‘Joint Statement: European Commission/US Customs Talks
on PNR Transmission’. Brussels. 17-18 February. https://www.statewatch.org/news/
2003/february/european-commission-us-customs-talks-on-pnr-transmission-brussels-
17-18-february-joint-statement/.
The Economist. 2017. ‘The World’s Most Valuable Resource Is No Longer Oil, but Data’. The
Economist, 6 May 2017. https://www.economist.com/leaders/2017/05/06/the-worlds-
most-valuable-resource-is-no-longer-oil-but-data, accessed 4 July 2022.
ECOWAS. 2010. ‘Supplementary Act A/SA.1/01/10 on Personal Data Protection within
ECOWAS, 37th Session of the Authority of Heads of State and Government’. Abuja.
EDPS. 2007. ‘New PNR Agreement with the United States of America: Letter from Peter
Hustinx, EDPS, to Dr. Wolfgang Schäuble, Minister for the Interior, Berlin’. PH/SM/ab D(2007)1367 C2007-0351. https://epic.org/privacy/pdf/hustinx-letter.pdf, accessed 4 July 2022.
EDPS. 2010a. ‘Comments of the EDPS on Different International Agreements, Notably
the EU–US and EU–AUS PNR Agreements, the EU–US TFTP Agreement, and
the Need of a Comprehensive Approach to International Data Exchange Agree-
ments’. Brussels. https://edps.europa.eu/sites/edp/files/publication/10-01-25_eu_us_
data_exchange_en.pdf, accessed 4 July 2022.
EDPS. 2010b. ‘Opinion of the European Data Protection Supervisor: the Proposal for a
Council Decision on the Conclusion of the Agreement between the European Union
and the United States of America on the Processing and Transfer of Financial Messaging
Data from the European Union to the United States for the Purposes of the Terrorist
Finance Tracking Program (TFTP-II)’. Brussels. https://edps.europa.eu/sites/edp/files/
publication/10-06-22_opinion_tftp_en.pdf, accessed 4 July 2022.
EDPS. 2011a. ‘Opinion of the European Data Protection Supervisor on the Proposal for
a Directive of the European Parliament and of the Council on the Use of Passenger
Name Record Data for the Prevention, Detection, Investigation and Prosecution of
Terrorist Offences and Serious Crime’. Brussels. https://edps.europa.eu/sites/edp/files/
publication/11-03-25_pnr_en.pdf, accessed 4 July 2022.
EDPS. 2011b. ‘Comments on the Communication from the Commission to the European
Parliament, the Council, the European Economic and Social Committee and the Com-
mittee of the Regions of 13 July 2011: “A European Terrorist Finance Tracking System:
Available Options”’. Brussels. https://edps.europa.eu/sites/edp/files/publication/11-10-
25_comments_tfts_en.pdf, accessed 4 July 2022.
EDPS. 2015a. ‘A Further Step towards Comprehensive EU Data Protection: EDPS Rec-
ommendations on the Directive for Data Protection in the Police and Justice Sector’.
Opinion 6/2015. Brussels. https://edps.europa.eu/sites/default/files/publication/15-10-
28_directive_recommendations_en.pdf.
EDPS. 2015b. ‘EDPS Supports EU Legislator on Security but Recommends Re-Thinking on
EU PNR’. EDPS/2015/12. https://edps.europa.eu/press-publications/press-news/press-
releases/2015/statement-edps-supports-eu-legislator-security_en, accessed 4 July 2022.
EDPS. 2017a. ‘EDPS Opinion on a Commission Proposal Amending Directive (EU)
2015/849 and Directive 2009/101/EC Access to Beneficial Ownership Information and
Data Protection Implications’. Opinion 1/2017. Brussels.
EDPS. 2017b. ‘Supervising Europol: The EDPS Is Ready!’ April. https://edps.europa.
eu/press-publications/press-news/blog/supervising-europol-edps-ready_en, accessed 4
July 2022.
EDPS. 2019. ‘EU–US Agreement on Electronic Evidence’. 2/2019. https://edps.europa.
eu/data-protection/our-work/publications/opinions/eu-us-agreement-electronic-
evidence_en, accessed 4 July 2022.
EDPS. 2020. ‘EDPS Statement Following the Court of Justice Ruling in Case C-
311/18 Data Protection Commissioner v Facebook Ireland Ltd and Maximilian
Schrems (“Schrems II”)’. July. https://edps.europa.eu/press-publications/press-news/
press-releases/2020/edps-statement-following-court-justice-ruling-case_en, accessed 4
July 2022.
EDRi. 2012. ‘EU-US PNR Agreement: A Bad Day for Civil Liberties in Europe’, March.
https://edri.org/edrigramnumber10-6us-eu-pnr-libe-decision/, accessed 4 July 2022.
EDRi. 2017. ‘Annex to EDRi’s Response to the Public Consultation on Improving Cross-
Border Access to Electronic Evidence in Criminal Matters Organised by the European
Commission’. Brussels.
EDRi, EFF, et al. 2019. ‘Letter to Alexander Seger, Head of the Cybercrime Unit
of the Council of Europe’. https://edri.org/wp-content/uploads/2019/11/20191108_
CivilSocietyLetter_TCYSecondProtocol.pdf, accessed 4 July 2022.
Eichensehr, Kristen. 2019. ‘Digital Switzerlands’. University of Pennsylvania Law Review
167: 665–732.
Elligsen, Nora. 2016. ‘The Microsoft Ireland Case: A Brief Summary’. Lawfare, July.
https://www.lawfareblog.com/microsoft-ireland-case-brief-summary, accessed 4 July
2022.
Ellis-Petersen, Hannah. 2021. ‘WhatsApp Sues Indian Government over “Mass Surveil-
lance” Internet Laws’. The Guardian, 26 May 2021. https://www.theguardian.com/
world/2021/may/26/whatsapp-sues-indian-government-over-mass-surveillance-
internet-laws, accessed 4 July 2022.
Emirbayer, Mustafa, and Victoria Johnson. 2008. ‘Bourdieu and Organizational Analysis’.
Theory and Society 37 (1): 1–44.
Endt, Christian. 2019. ‘Überwachung von Flugpassagieren liefert Fehler über
Fehler’. Süddeutsche Zeitung, 24 April 2019. https://www.sueddeutsche.de/digital/
fluggastdaten-bka-falschtreffer-1.4419760, accessed 4 July 2022.
EP. 2003. ‘European Parliament Resolution on Transfer of Personal Data by Airlines in the
Case of Transatlantic Flights’. P5_TA(2003)0097. Strasbourg.
EP. 2004. ‘Resolution on the Draft Commission Decision Noting the Adequate Level of Pro-
tection Provided for Personal Data Contained in the Passenger Name Records (PNRs)
Transferred to the US Bureau of Customs and Border Protection’. P5_TA(2004)0245.
Brussels.
EP. 2006. ‘Interception of Bank Transfer Data from the SWIFT System by the US Secret
Services’. P6_TA(2006)0317. Strasbourg.
EP. 2007a. ‘US Homeland Security Secretary Michael Chertoff and MEPs Debate
Data Protection’. https://www.europarl.europa.eu/sides/getDoc.do?type=IM-PRESS&
reference=20070514IPR06625&language=EN, accessed 4 July 2022.
EP. 2007b. ‘European Parliament Resolution on SWIFT, the PNR Agreement and the
Transatlantic Dialogue on These Issues’. P6_TA(2007)0039. Strasbourg.
EP. 2009. ‘European Parliament Resolution of 17 September 2009 on the Envisaged Interna-
tional Agreement to Make Available to the United States Treasury Department Financial
Payment Messaging Data to Prevent and Combat Terrorism and Terrorist Financing’.
P7_TA(2009)0016. Strasbourg.
EP. 2010a. ‘SWIFT: European Parliament Votes Down Agreement with the US’.
11 February 2010. https://www.europarl.europa.eu/sides/getDoc.do?type=IM-
PRESS&reference=20100209IPR68674&language=GA, accessed 4 July 2022.
EP. 2010b. ‘Brussels Plenary Session: 5–6 May, 2010’. 6 May 2010. https://www.
europarl.europa.eu/sides/getDoc.do?language=en&type=IM-
PRESS&reference=20100430FCS73854#title9, accessed 4 July 2022.
EP. 2013. ‘European Parliament Resolution of 23 October 2013 on the Suspension
of the TFTP Agreement as a Result of US National Security Agency Surveillance’.
P7_TA(2013)0449. Strasbourg.
EP. 2014a. ‘Resolution on the US NSA Surveillance Programme, Surveillance Bodies
in Various Member States and Their Impact on EU Citizens’ Fundamental Rights
and on Transatlantic Cooperation in Justice and Home Affairs (2013/2188(INI))’.
P7_TA(2014)0230. Strasbourg.
EP. 2014b. ‘Written Observations Submitted by the European Parliament in Case C-362/14’.
SJ-0686/14. Brussels: Court of Justice of the European Union.
EP. 2015. ‘Resolution of 29 October 2015 on the Follow-Up to the European Parliament Resolution of 12 March 2014 on the Electronic Mass Surveillance of EU Citizens (2015/2635(RSP))’.
EP. 2016a. ‘Position of the European Parliament Adopted at First Reading on 14 April 2016
with a View to the Adoption of Directive (EU) 2016/ … of the European Parliament and
of the Council on the Use of Passenger Name Record (PNR) Data for the Prevention,
Detection, Investigation and Prosecution of Terrorist Offences and Serious Crime (EP-
PE_TC1-COD(2011)0023)’. Brussels.
EP. 2016b. ‘European Parliament Resolution of 26 May 2016 on Transatlantic Data Flows
(2016/2727(RSP))’. P8_TA(2016)0233. Brussels.
EP. 2017a. ‘European Parliament Resolution of 6 April 2017 on the Adequacy of the Pro-
tection Afforded by the EU–US Privacy Shield (2016/3018(RSP))’. P8_TA(2017)0131.
Strasbourg.
EP. 2017b. ‘European Parliament Resolution on the Fight against Cybercrime’.
(2017/2068(INI)). Strasbourg.
EP. 2018. ‘European Parliament Resolution of 5 July 2018 on the Adequacy of the Pro-
tection Afforded by the EU–US Privacy Shield (2018/2645(RSP))’. P8_TA(2018)0315.
Strasbourg.
EPIC. 2018. ‘Brief of Amici Curiae Electronic Privacy Information Center (EPIC) and
Thirty-Seven Technological Experts and Legal Scholars in Support of Respondent,
United States of America v. Microsoft Corporation’. https://www.epic.org/amicus/ecpa/
microsoft/US-v-Microsoft-amicus-EPIC.pdf, accessed 4 July 2022.
Epstein, Dmitry, Christian Katzenbach, and Francesca Musiani. 2016. ‘Doing Internet
Governance: Practices, Controversies, Infrastructures, and Institutions’. Internet Policy
Review 5 (3): 1–14.
Epstein, Dmitry, Merrill C. Roth, and Eric P.S. Baumer. 2014. ‘It’s the Definition, Stupid!
Framing of Online Privacy in the Internet Governance Forum Debates’. Journal of
Information Policy 4: 144–72. doi: 10.5325/jinfopoli.4.2014.0144.
Erie, Matthew S., and Thomas Streinz. 2022. ‘The Beijing Effect: China’s “Digital Silk Road”
as Transnational Data Governance’. SSRN Scholarly Paper ID 3810256. Rochester, NY:
Social Science Research Network. https://papers.ssrn.com/abstract=3810256, accessed
4 July 2022.
Etzioni, Amitai. 1999. ‘A Communitarian Perspective on Privacy’. Connecticut Law Review
32: 897–905.
Etzioni, Amitai. 2014. ‘Communitarianism Revisited’. Journal of Political Ideologies 19 (3):
241–60.
Etzioni, Amitai. 2018. Law and Society in a Populist Age: Balancing Individual Rights and
the Common Good. Bristol: Bristol University Press.
EU and US. 2004. ‘Agreement between the European Community and the United States
of America on the Processing and Transfer of PNR Data by Air Carriers to the United
States Department of Homeland Security, Bureau of Customs and Border Protection’.
22004A0520(01). OJ L 183.
EU and US. 2007. ‘Agreement between the European Union and the United States of Amer-
ica on the Processing and Transfer of Passenger Name Record (PNR) Data by Air Carriers
to the United States Department of Homeland Security (DHS)’. OJ L 204.
EU and US. 2010. ‘Agreement between the European Union and the United States of Amer-
ica on the Processing and Transfer of Financial Messaging Data from the European
Union to the United States for the Purposes of the Terrorist Finance Tracking Program’.
OJ L195/5.
EU and US. 2011. ‘Agreement between the United States of America and the Euro-
pean Union on the Use and Transfer of Passenger Name Records to the United States
Department of Homeland Security’. OJ L 215, 11.8.2012.
EU and US. 2012. ‘EU–US Agreement on the Use and Transfer of PNR to the US Depart-
ment of Homeland Security’. OJ C 258E. http://eur-lex.europa.eu/legal-content/EN/
TXT/?uri=celex:52012AP0134, accessed 4 July 2022.
EU and US. 2016. ‘Agreement between the United States of America and the European
Union on the Protection of Personal Information Relating to the Prevention, Investiga-
tion, Detection, and Prosecution of Criminal Offences’. OJ L336.
EU and US. 2019. ‘Joint EU–U.S. Statement Following the EU–U.S. Justice and Home Affairs Ministerial Meeting’. 11 December 2019.
EU FRA. 2018. ‘Surveillance by Intelligence Services: Fundamental Rights Safeguards
and Remedies in the European Union: Volume II: Summary’. https://fra.europa.eu/
en/publication/2018/surveillance-intelligence-services-fundamental-rights-safeguards-
and-remedies#TabPubOverview, accessed 4 July 2022.
European Centre for International Political Economy and Kearney. 2021. ‘The Eco-
nomic Costs of Restricting the Cross-Border Flow of Data’. https://www.kearney.
com/documents/3677458/161343923/The+economic+costs+of+restricting+the+cross-
border+flow+of+data.pdf/82370205-fa6b-b135-3f2b-b406c4d6159e?t=1625067571000,
accessed 4 July 2022.
European Investigation Order. 2014. Directive 2014/41/EU of the European Parliament and
of the Council of 3 April 2014 Regarding the European Investigation Order in Criminal
Matters. OJ L 130 1–36.
European Ombudsman. 2014. ‘Decision of the European Ombudsman Closing the Inquiry
into Complaint 1148/2013/TN against the European Police Office (Europol)’. https://
www.ombudsman.europa.eu/en/decision/en/54678#_ftnref2, accessed 4 July 2022.
European Ombudsman. 2018. ‘Decision in Case 811/2017/EA on the Transparency of
“Advisory Bodies” That Influence the Development of EU Policy’. 19 September 2018.
https://europa.eu/!UQ46By, accessed 4 July 2022.
Europol. 2019. ‘SIRIUS EU Digital Evidence Situation Report 2019’. 1073582. The Hague.
https://www.europol.europa.eu/sites/default/files/documents/sirius_eu_digital_
evidence_report.pdf, accessed 4 July 2022.
Europol. 2020. ‘Europol Information System (EIS)’. https://www.europol.europa.eu/
activities-services/services-support/information-exchange/europol-information-
system, accessed 4 July 2022.
EU–US Working Group. 2013. ‘Report on the Findings by the EU Co-Chairs of the Ad Hoc
EU–US Working Group on Data Protection’. http://www.consilium.europa.eu/uedocs/
cms_data/docs/pressdata/en/jha/139745.pdf, accessed 4 July 2022.
Fabbrini, Federico, Edoardo Celeste, and John Quinn. 2021. Data Protection beyond
Borders. Oxford and New York: Hart Publishing.
Facebook. 2019. ‘Public Consultation Response: 2nd Additional Protocol to the Budapest
Convention on Cybercrime: 4th Round of Consultation’. Strasbourg: Council of Europe.
https://rm.coe.int/facebook-comments-2nd-additional-protocol/168098c93f, accessed
4 July 2022.
Facebook. 2020. ‘Meta Investor Relations’. https://investor.fb.com/resources/default.aspx,
accessed 4 July 2022.
Facebook Ireland Ltd. 2021. Facebook Ireland Ltd, Facebook Inc., Facebook Belgium BVBA,
v Gegevensbeschermingsautoriteit. Court of Justice of the EU: General Court. Case
C-645/19.
Fahey, Elaine. 2014. ‘EU Foreign Relations Law: Litigating to Incite Openness in EU
Negotiations’. European Journal of Risk Regulation 5 (4): 553–6.
Fahey, Elaine. 2020. Framing Convergence with the Global Legal Order: The EU and the
World. Oxford and London: Bloomsbury Publishing.
Fahey, Elaine, and Fabien Terpan. 2021. ‘Torn between Institutionalisation and Judicialisa-
tion: The Demise of the EU–US Privacy Shield’. Indiana Journal of Global Legal Studies
28 (2): 205–44.
Falque-Pierrotin, Isabelle. 2017. ‘Pour un droit au déréférencement mondial’. https://www.
cnil.fr/fr/pour-un-droit-au-dereferencement-mondial, accessed 4 July 2022.
Farrell, Henry. 2003. ‘Constructing the International Foundations of E-Commerce: The
EU–U.S. Safe Harbor Arrangement’. International Organization 57 (2): 277–306.
Farrell, Henry. 2005. ‘New Issue-Areas in the Transatlantic Relationship: E-Commerce
and the Safe Harbor Arrangement’. In Creating a Transatlantic Marketplace: Govern-
ment Policies and Business Strategies, edited by Michelle Egan, 112–33. European Policy
Research Unit. Manchester: Manchester University Press.
Farrell, Henry. 2006. ‘Regulating Information Flows. States, Private Actors, and E-Commerce’. Annual Review of Political Science 9 (1): 353–74.
Farrell, Henry. 2015. ‘Obama Says That Europeans Are Using Privacy Rules to Protect Their Firms against U.S. Competition. Is He Right?’ Washington Post, 17 February 2015. https://www.washingtonpost.com/news/monkey-cage/wp/2015/02/17/obama-says-that-europeans-are-using-privacy-rules-to-protect-their-firms-against-u-s-competition-is-he-right/.
Farrell, Henry, and Abraham L. Newman. 2016. ‘The New Interdependence Approach:
Theoretical Development and Empirical Demonstration’. Review of International Politi-
cal Economy 23 (5): 713–36.
Farrell, Henry, and Abraham L. Newman. 2018. ‘Linkage Politics and Complex
Governance in Transatlantic Surveillance’. World Politics 70 (4): 515–54. doi:
10.1017/S0043887118000114.
Farrell, Henry, and Abraham L. Newman. 2019. Of Privacy and Power: The Transatlantic
Struggle over Freedom and Security. Princeton, NJ: Princeton University Press.
Farrell, Henry, and Abraham L. Newman. 2021. ‘The Janus Face of the Liberal International
Information Order: When Global Institutions Are Self-Undermining’. International
Organization 75 (2): 333–58.
FATF. 2020. ‘Home - Financial Action Task Force (FATF)’. http://www.fatf-gafi.org/home/, accessed 4 July 2022.
Faude, Benjamin, and Felix Große-Kreul. 2020. ‘Let’s Justify! How Regime Complexes
Enhance the Normative Legitimacy of Global Governance’. International Studies Quar-
terly 64 (2): 431–9.
Federal Government (Germany). 2019. ‘Antwort der Bundesregierung auf die Kleine
Anfrage der Abgeordneten Roman Müller-Böhm, Michael Theurer, Dr. Marcel Klinge,
Weiterer Abgeordneter und der Fraktion der FDP: Drucksache 19/12402: Fluggast-
datenspeicherung und Übermittlung an das Bundeskriminalamt’. 19/12858. http://dipbt.
bundestag.de/dip21/btd/19/128/1912858.pdf, accessed 4 July 2022.
Federal Ministry of Justice and Consumer Protection (Germany). 2019. ‘E-Evidence
(Grenzüberschreitende Gewinnung elektronischer Beweismittel in Strafverfahren)’.
https://cdn.netzpolitik.org/wp-upload/2019/07/Hintergrundpapier-e-Evidence-cl.pdf.
pdf, accessed 4 July 2022.
Feldstein, Steven. 2021. The Rise of Digital Repression: How Technology Is Reshaping Power,
Politics, and Resistance. Oxford and New York: Oxford University Press.
Fischer, Camille. 2018. ‘The CLOUD Act: A Dangerous Expansion of Police Snooping
on Cross-Border Data’. Electronic Frontier Foundation, February. https://www.eff.
org/deeplinks/2018/02/cloud-act-dangerous-expansion-police-snooping-cross-border-
data, accessed 4 July 2022.
Fischer-Lescano, Andreas. 2016. ‘Struggles for a Global Internet Constitution: Protecting
Global Communication Structures against Surveillance Measures’. Global Constitution-
alism 5 (2): 145–72. doi: 10.1017/S204538171600006X.
Flaherty, David H. 1989. Protecting Privacy in Surveillance Societies: The Federal Republic of
Germany, Sweden, France, Canada, and the United States. Chapel Hill, NC, and London:
University of North Carolina Press.
Fleischer, Peter. 2012. ‘Peter Fleischer: Privacy … ?: The Right to Be Forgotten, or How to
Edit Your History’. Peter Fleischer, January. http://peterfleischer.blogspot.com/2012/01/
right-to-be-forgotten-or-how-to-edit.html, accessed 4 July 2022.
Fleischer, Peter. 2016. ‘Adapting Our Approach to the European Right to Be Forgotten’.
Google, March. https://www.blog.google/around-the-globe/google-europe/adapting-
our-approach-to-european-rig/, accessed 4 July 2022.
Fligstein, Neil. 2001. ‘Social Skill and the Theory of Fields’. Sociological Theory 19 (2): 105–
25. doi: 10.1111/0735-2751.00132.
Fligstein, Neil, and Doug McAdam. 2011. ‘Toward a General Theory of Strategic Action
Fields’. Sociological Theory 29 (1): 1–26.
Fligstein, Neil, and Doug McAdam. 2012. A Theory of Fields. Oxford and New York: Oxford
University Press.
Flonk, Daniëlle. 2021. ‘Emerging Illiberal Norms: Russia and China as Promoters of
Internet Content Control’. International Affairs 97 (6): 1925–44. doi: 10.1093/ia/iiab146.
Flonk, Daniëlle, Markus Jachtenfuchs, and Anke S. Obendiek. 2020. ‘Authority Conflicts in
Internet Governance: Liberals vs. Sovereigntists?’ Global Constitutionalism 9 (2): 364–
86.
Floridi, Luciano. 2005. ‘The Ontological Interpretation of Informational Privacy’. Ethics and Information Technology 7 (4): 185–200.
Floridi, Luciano, ed. 2016. The Routledge Handbook of Philosophy of Information. Routledge
Handbooks in Philosophy. London: Routledge.
Forst, Rainer. 2001. ‘Towards a Critical Theory of Transnational Justice’. Metaphilosophy 32 (1–2): 160–79. doi: 10.1111/1467-9973.00180.
Forst, Rainer. 2015. Normativität und Macht: Zur Analyse sozialer Rechtfertigungsordnun-
gen. Berlin: Suhrkamp.
Fourcade, Marion, and Kieran Healy. 2017. ‘Seeing like a Market’. Socio-Economic Review
15 (1): 9–29.
Fowler, Bridget. 2014. ‘Figures of Descent from Classical Sociological Theory: Luc Boltan-
ski’. In The Spirit of Luc Boltanski: Essays on the ‘Pragmatic Sociology of Critique’, edited
by Bryan S. Turner and Simon Susen, 67–88. London: Anthem Press.
Frantziou, Eleni. 2014. ‘Further Developments in the Right to Be Forgotten: The European Court of Justice’s Judgment in Case C-131/12, Google Spain, SL, Google Inc v Agencia Española de Protección de Datos’. Human Rights Law Review 14 (4): 761–77. doi: 10.1093/hrlr/ngu033.
Freedom House. 2016. ‘Brazil: Judicial Over-Reach in Data Privacy Case | Press Release’. 1
March 2016. https://freedomhouse.org/article/brazil-judicial-over-reach-data-privacy-
case.
Friedman, Milton. 2009. Capitalism and Freedom. Chicago: University of Chicago Press.
Friedman, Thomas L. 2000. The Lexus and the Olive Tree: Understanding Globalization.
New York: Farrar, Straus, and Giroux.
Friedrichs, Jörg, and Friedrich Kratochwil. 2009. ‘On Acting and Knowing: How Prag-
matism Can Advance International Relations Research and Methodology’. International
Organization 63 (4): 701–31.
Frosio, Giancarlo F. 2013. ‘A Brazilian Judge Orders Facebook off Air If It Fails to Remove
a Defamatory Discussion’. Stanford Law School: Center for Internet and Society.
Frosio, Giancarlo F. 2016. ‘The Right to Be Forgotten: Much Ado about Nothing’. Colorado
Technology Law Journal 15 (2): 307–36.
Frosio, Giancarlo F. 2018. ‘Why Keep a Dog and Bark Yourself ? From Intermediary Liability
to Responsibility’. Oxford International Journal of Law and Information Technology 26
(1): 1–33.
FTC. 2013. ‘Privacy Enforcement and Safe Harbor: Comments of FTC Staff to European
Commission. Review of the U.S.–EU Safe Harbor Framework’. Washington DC.
Fuster, Gloria González, Paul de Hert, and Serge Gutwirth. 2008. ‘SWIFT and the Vul-
nerability of Transatlantic Data Transfers’. International Review of Law, Computers &
Technology 22 (1–2): 191–202. doi: 10.1080/13600860801925185.
G7. 1989. ‘Economic Declaration Paris, July 16, 1989’. http://www.g8.utoronto.ca/summit/
1989paris/communique/index.html, accessed 4 July 2022.
G7. 2016. ‘Joint Declaration by G7 ICT Ministers’. Takamatsu, Kagawa: G7 ICT Ministers’
Meeting. http://www.g8.utoronto.ca/ict/2016-ict-declaration.html, accessed 4 July 2022.
G20. 2016. ‘G20 Digital Economy Development and Cooperation Initiative’. http://www.
g20chn.org/English/Documents/Current/201609/t20160908_3411.html, accessed 4
July 2022.
Garstka, Krzysztof, and David Erdos. 2017. ‘Hiding in Plain Sight: The Right to Be Forgot-
ten and Search Engines in the Context of International Data Protection Frameworks’. In
Platform Regulations: How Platforms Are Regulated and How They Regulate Us, edited
by Luca Belli and Nicolo Zingales, 127–47. Rio de Janeiro: FGV Direito Rio.
GC and Others v. CNIL and Others. 2019. GC, AF, BH, and ED v Commission Nationale de l’Informatique et des Libertés (CNIL), Premier ministre, and Google LLC. Court of Justice of the EU: General Court. Case C-136/17.
Genschel, Philipp, and Markus Jachtenfuchs. 2018. ‘From Market Integration to Core State
Powers: The Eurozone Crisis, the Refugee Crisis and Integration Theory’. Journal of
Common Market Studies 56 (1): 178–96. doi: 10.1111/jcms.12654.
General Data Protection Regulation (GDPR). 2016. Regulation (EU) 2016/679 of the Euro-
pean Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons
with Regard to the Processing of Personal Data and on the Free Movement of Such Data,
and Repealing Directive 95/46/EC.
Gerring, John. 2004. ‘What Is a Case Study and What Is It Good for?’ The American Political
Science Review 98 (2): 341–54.
Gesellschaft für Freiheitsrechte. 2020. ‘CJEU to Decide on Processing of Passenger Data
under PNR Directive’. EDRi, January. https://edri.org/cjeu-to-decide-on-processing-of-
passenger-data-under-pnr-directive/, accessed 4 July 2022.
Gibbs, Samuel. 2014. ‘Larry Page: “Right to Be Forgotten” Could Empower Government
Repression’. The Guardian, 30 May 2014. http://www.theguardian.com/technology/
2014/may/30/larry-page-right-to-be-forgotten-government-repression, accessed 4 July
2022.
Gibbs, Samuel. 2015a. ‘Leave Facebook If You Don’t Want to Be Spied on, Warns EU’. The Guardian, 26 March 2015. https://www.theguardian.com/technology/2015/mar/26/leave-facebook-snooped-on-warns-eu-safe-harbour-privacy-us, accessed 4 July 2022.
Gibbs, Samuel. 2015b. ‘What Is “Safe Harbour” and Why Did the EUCJ Just Declare
It Invalid?’ The Guardian, 6 October 2015. http://www.theguardian.com/technology/
2015/oct/06/safe-harbour-european-court-declare-invalid-data-protection, accessed 4
July 2022.
Gitelman, Lisa. 2013. Raw Data Is an Oxymoron. Cambridge, MA: MIT Press.
Glawischnig-Piesczek v. Facebook Ireland. 2019. Eva Glawischnig-Piesczek v.
Facebook Ireland Limited. Court of Justice of the EU: General Court. Case
C-18/18.
Globocnik, Jure. 2020. ‘The Right to Be Forgotten Is Taking Shape: CJEU Judgments in
GC and Others (C-136/17) and Google v CNIL (C-507/17)’. GRUR International 69 (4):
380–8.
GNI. 2017. ‘The GNI Principles on Freedom of Expression and Privacy’. https://
globalnetworkinitiative.org/gni-principles/, accessed 4 July 2022.
Goddard, Michelle. 2017. ‘The EU General Data Protection Regulation (GDPR): European
Regulation That Has a Global Impact’. International Journal of Market Research 59 (6):
703–5.
Goffman, Erving. 1974. Frame Analysis: An Essay on the Organization of Experience. New
York: Harper & Row.
Goldsmith, Jack, and Tim Wu. 2006. Who Controls the Internet? Illusions of a Borderless
World. Oxford and New York: Oxford University Press.
Google. 2009. ‘Personalized Search for Everyone’. Official Google Blog. 4 Decem-
ber 2009. https://googleblog.blogspot.com/2009/12/personalized-search-for-everyone.
html, accessed 4 July 2022.
Google. 2014. ‘Letter to Isabelle Falque-Pierrotin, Chair, Article 29 Working Party’, 31 July
2014. http://online.wsj.com/public/resources/documents/google.pdf.
Google. 2015. ‘Implementing a European, Not Global, Right to Be Forgotten’. Google Europe Blog, 30 July 2015. https://europe.googleblog.com/2015/07/implementing-european-not-global-right.html, accessed 4 July 2022.
Google. 2020. ‘How Google Search Works’. https://www.google.com/intl/en_us/search/
howsearchworks/how-search-works/, accessed 4 July 2022.
Google. 2022a. ‘Content Delistings Due to Copyright’. https://transparencyreport.google.
com/copyright/overview?hl=en, accessed 4 July 2022.
Google. 2022b. ‘Requests to Delist Content under European Privacy Law’. Transparency Report. https://transparencyreport.google.com/eu-privacy/overview?hl=en, accessed 4 July 2022.
Google Search Statistics. 2020. Internet Live Stats. https://www.internetlivestats.com/
google-search-statistics/#sources, accessed 4 July 2022.
Google Spain. 2014. Google Spain SL and Google Inc. v Agencia Española de Protección de
Datos (AEPD) and Mario Costeja González. Court of Justice of the EU: General Court.
Case C‑131/12.
Google v. CNIL. 2019. Google LLC, successor in law to Google Inc., v Commission nationale
de l’informatique et des libertés (CNIL). Court of Justice of the EU: General Court. Case
C-507/17.
Green, Jessica F. 2013. Rethinking Private Authority: Agents and Entrepreneurs in Global
Environmental Governance. Princeton, NJ: Princeton University Press.
Greenleaf, Graham. 2006. ‘APEC’s Privacy Framework Sets a New Low Standard for the Asia-Pacific’. In New Dimensions in Privacy Law: International and Comparative Perspectives, edited by Andrew T. Kenyon and Megan Richardson, 91–120. Cambridge: Cambridge University Press.
Greenleaf, Graham. 2011. ‘Global Data Privacy Laws: Forty Years of Acceleration’. Privacy
Laws and Business International Report, 112: 11–17.
Greenleaf, Graham. 2012. ‘The Influence of European Data Privacy Standards outside
Europe: Implications for Globalization of Convention 108’. International Data Privacy
Law 2 (2): 68–92.
Greenleaf, Graham. 2014a. Asian Data Privacy Laws: Trade & Human Rights Perspectives.
Oxford: Oxford University Press.
Greenleaf, Graham. 2014b. ‘Sheherezade and the 101 Data Privacy Laws: Origins, Signifi-
cance and Global Trajectories’. Journal of Law, Information & Science 23 (1): 4–49.
Greenwald, Glenn. 2013. ‘NSA Collecting Phone Records of Millions of Verizon Customers
Daily’. The Guardian, 6 June 2013. https://www.theguardian.com/world/2013/jun/06/
nsa-phone-records-verizon-court-order, accessed 4 July 2022.
Greenwald, Glenn. 2015. No Place to Hide. New York: Picador.
Gressin, Seena. 2017. ‘The Equifax Data Breach: What to Do’. FTC Consumer Infor-
mation, September. https://amerifirstbank.com/wp-content/uploads/2017/09/Equifax-
Data-Breach-FTC.pdf.
Grimm, Dieter. 2016. Constitutionalism: Past, Present, and Future. Oxford: Oxford Univer-
sity Press.
Gros, Valentin, Marieke de Goede, and Beste İşleyen. 2017. ‘The Snowden Files Made Pub-
lic: A Material Politics of Contesting Surveillance’. International Political Sociology 11 (1):
73–89. doi: 10.1093/ips/olw031.
The Guardian. 2010. ‘US Embassy Cables: Madeleine McCann Case Pushes EU to Act on
Child Abductions’. The Guardian, 13 December 2010. https://www.theguardian.com/
world/us-embassy-cables-documents/125480, accessed 4 July 2022.
The Guardian. 2020. ‘The NSA Files’. https://www.theguardian.com/us-news/the-nsa-files, accessed 4 July 2022.
The Guardian. 2021. ‘The Pegasus Project’. https://www.theguardian.com/news/series/pegasus-project, accessed 4 July 2022.
Guild, Elspeth, and Evelien Brouwer. 2006. ‘The Political Life of Data: The ECJ Decision
on the PNR Agreement between the EU and the US’. Policy Paper No. 110. CEPS Policy
Brief. http://aei.pitt.edu/7354/, accessed 4 July 2022.
Habermas, Jürgen. 1962. The Structural Transformation of the Public Sphere. Cambridge:
Polity.
Habermas, Jürgen. 1984. The Theory of Communicative Action. Vol. 1. Reason and the Rationalization of Society. London: Heinemann.
Haffke, Lars, Mathias Fromberger, and Patrick Zimmermann. 2019. ‘Cryptocurrencies
and Anti-Money Laundering: The Shortcomings of the Fifth AML Directive (EU) and
How to Address Them’. Journal of Banking Regulation, April. doi: 10.1057/s41261-019-
00101-4.
Haggart, Blayne. 2020. ‘Global Platform Governance and the Internet-Governance Impos-
sibility Theorem’. Journal of Digital Media and Policy 11 (3): 321–39.
Haggart, Blayne, Natasha Tusikov, and Jan Aart Scholte, eds. 2021. Power and Authority in
Internet Governance: Return of the State? 1st edn. Abingdon and New York: Routledge.
Halberstam, Daniel. 2012. ‘Local, Global and Plural Constitutionalism: Europe Meets the
World’. In The Worlds of European Constitutionalism, edited by Gráinne de Búrca and
J. H. H. Weiler, 150–202. Cambridge: Cambridge University Press.
Halliday, Josh. 2014. ‘Google Search Results May Indicate “Right to Be Forgotten” Cen-
sorship’. The Guardian, 8 June 2014. http://www.theguardian.com/technology/2014/
jun/08/google-search-results-indicate-right-to-be-forgotten-censorship, accessed 4 July
2022.
Halliday, Terence C., Lucien Karpik, and Malcolm Feeley. 2007. Fighting for Political Freedom: Comparative Studies of the Legal Complex and Political Liberalism. Oxford and Portland, OR: Hart Publishing.
Hamm, Marylou, Frédéric Mérand, and Anke S. Obendiek. 2021. ‘Field Theory: From
Social Interaction to the Analysis of Power’. Unpublished Manuscript.
Hanrieder, Tine. 2016. ‘Orders of Worth and the Moral Conceptions of Health in Global
Politics’. International Theory 8 (3): 390–421.
Hanrieder, Tine. 2019. ‘How Do Professions Globalize? Lessons from the Global South in
US Medical Education’. International Political Sociology 13 (3): 296–314.
Hansen, Lene. 2000a. ‘Gender, Nation, Rape: Bosnia and the Construction of Security’.
International Feminist Journal of Politics 3 (1): 55–75.
Hansen, Lene. 2000b. ‘The Little Mermaid’s Silent Security Dilemma and the Absence of
Gender in the Copenhagen School’. Millennium 29 (2): 285–306.
Hansen, Lene, and Helen Nissenbaum. 2009. ‘Digital Disaster, Cyber Security, and
the Copenhagen School’. International Studies Quarterly 53 (4): 1155–75. doi:
10.1111/j.1468-2478.2009.00572.x.
Harbisch, Amelie. 2021. ‘Practices of Constructing the Ideal Refugee: Subjectivation and Performance Approaches to Discourse Combined’. PhD Dissertation. Berlin: Freie Universität Berlin.
Harding, Elizabeth (Liz), Jarno J. Vanto, Reece Clark, L Hannah Ji, and Sara C Ainsworth.
2019. ‘Understanding the Scope and Impact of the California Consumer Privacy Act of
2018’. Journal of Data Protection & Privacy 2 (3): 234–53.
Harmes, Adam. 2012. ‘The Rise of Neoliberal Nationalism’. Review of International Political
Economy 19 (1): 59–86. doi: 10.1080/09692290.2010.507132.
Hart, Herbert Lionel Adolphus. 1958. ‘Positivism and the Separation of Law and Morals’.
Harvard Law Review 71 (4): 593–629.
Hartlapp, Miriam, Julia Metz, and Christian Rauh. 2014. Which Policy for Europe? Power
and Conflict inside the European Commission. Oxford: Oxford University Press.
Hatch, Orrin. 2018. ‘Congressional Record: Senate’. S1885-1984. Washington DC.
Hayes, Ben. 2012. ‘The Surveillance-Industrial Complex’. In Routledge Handbook of Surveil-
lance Studies, edited by Kirstie Ball, Kevin Haggerty, and David Lyon, 167–75. Abingdon
and New York: Routledge.
Hegel, Georg Wilhelm Friedrich. 2015. The Philosophy of Right. Mineola, NY: Dover
Publications.
Heisenberg, Dorothee. 2005. Negotiating Privacy: The European Union, the United States,
and Personal Data Protection. Ipolitics: Global Challenges in the Information Age.
Boulder, CO: Lynne Rienner Publishers.
Held, David. 2002. ‘Cosmopolitanism: Ideas, Realities and Deficits’. In Governing Global-
ization: Power, Authority and Global Governance, edited by David Held and Anthony
McGrew, 305–24. Malden, MA: Polity Press.
Held, David, and Anthony McGrew. 1998. ‘The End of the Old Order? Globalization and
the Prospects for World Order’. Review of International Studies 24 (5): 219–45.
Héritier, Adrienne, and Dirk Lehmkuhl. 2008. ‘The Shadow of Hierarchy and New Modes
of Governance’. Journal of Public Policy 28 (1): 1–17.
Hern, Alex. 2019. ‘Apple Chief Calls for Laws to Tackle “Shadow Economy” of Data
Firms’. The Guardian, 17 January 2019. https://www.theguardian.com/technology/
2019/jan/17/apple-chief-tim-cook-calls-for-laws-to-tackle-shadow-economy-of-data-
firms, accessed 4 July 2022.
Herschinger, Eva, Markus Jachtenfuchs, and Christiane Kraft-Kasack. 2011. ‘Scratching
the Heart of the Artichoke? How International Institutions and the European Union
Constrain the State Monopoly of Force’. European Political Science Review 3 (3): 445–68.
Herzog, Lisa. 2013. Inventing the Market: Smith, Hegel, and Political Theory. Oxford: Oxford
University Press.
Hijmans, Hielke. 2016a. ‘The DPAs and Their Cooperation: How Far Are We in Making
Enforcement of Data Protection Law More European’. European Data Protection Law
Review 2: 362–72.
Hijmans, Hielke. 2016b. The European Union as Guardian of Internet Privacy. Cham:
Springer.
Himma, Kenneth Einar. 2016. ‘Why Security Trumps Privacy’. In Privacy, Security and
Accountability: Ethics, Law and Policy, edited by A. D. Moore, 145–70. London: Rowman
& Littlefield International.
Hirschl, Ran. 2009. ‘The Judicialization of Politics’. In The Oxford Handbook of Political Science, edited by Robert E. Goodin, 253–74. Oxford and New York: Oxford University Press.
Hobbes, Thomas. 2016. Thomas Hobbes: Leviathan. Longman Library of Primary Sources
in Philosophy. New York: Routledge.
Hobbes, Thomas, and Edwin Curley. 1994. Leviathan: With Selected Variants from the Latin
Edition of 1668. Indianapolis, IN: Hackett Publishing.
Hoffmann, Anna Lauren. 2019. ‘Where Fairness Fails: Data, Algorithms, and the Limits of
Antidiscrimination Discourse’. Information, Communication & Society 22 (7): 900–15.
Hofmann, Jeanette, Christian Katzenbach, and Kirsten Gollatz. 2016. ‘Between Coordina-
tion and Regulation: Finding the Governance in Internet Governance’. New Media &
Society 19 (9): 1406–23. doi: 10.1177/1461444816639975.
Holzscheiter, Anna, Sassan Gholiagha, and Andrea Liese. 2020. ‘Activating Norm Colli-
sions: Interface Conflicts in International Drug Control’. Global Constitutionalism 9 (2):
290–317.
Homeland Security & Governmental Affairs Committee. 2012. ‘HSBC Exposed U.S.
Financial System to Money Laundering, Drug, Terrorist Financing Risks’. https://
www.hsgac.senate.gov/subcommittees/investigations/media/hsbc-exposed-us-finacial-
system-to-money-laundering-drug-terrorist-financing-risks.
Honneth, Axel. 2010. ‘Dissolutions of the Social: On the Social Theory of Luc
Boltanski and Laurent Thévenot’. Constellations 17 (3): 376–89. doi: 10.1111/j.1467-
8675.2010.00606.x.
H.R.1428-114th Congress (2015–2016): Judicial Redress Act of 2015. 2016.
Hutter, Swen, Edgar Grande, and Hanspeter Kriesi. 2016. Politicising Europe. Cambridge:
Cambridge University Press.
Huysmans, Jef. 2006. The Politics of Insecurity: Fear, Migration and Asylum in the EU.
London and New York: Routledge.
Huysmans, Jef. 2008. ‘The Jargon of Exception: On Schmitt, Agamben and the Absence of
Political Society’. International Political Sociology 2 (2): 165–83.
IA. 2015. ‘Examining the EU Safe Harbor Decision and Impacts for Transatlantic Data
Flows’. Congressional Hearing: House Energy and Commerce, Subcommittees on Com-
merce, Manufacturing and Trade and Communications & Technology.
IA. 2018. ‘IA Privacy Principles for a Modern National Regulatory Framework’. https://
web.archive.org/web/20190530160257/https://internetassociation.org/files/ia_privacy-
principles-for-a-modern-national-regulatory-framework_full-doc/.
IATA. 2011. ‘IATA Reveals Checkpoint of the Future’, June. https://www.iata.org/en/
pressroom/2011-press-releases/2011-06-07-01/.
ICANN. 2017. ‘Updated ICANN Procedure for Handling WHOIS Conflicts with Pri-
vacy Laws Now Available: ICANN’. https://www.icann.org/news/announcement-2017-
04-18-en, accessed 4 July 2022.
ICDPPC. 2005. ‘The Protection of Personal Data and Privacy in a Globalised World: A
Universal Right Respecting Diversities’. 27th International Conference of Data Protection
and Privacy Commissioners. Montreux.
Inman, Phillip. 2020. ‘TikTok Halts Talks on London HQ amid UK–China Tensions’. The
Guardian, 19 July 2020. https://www.theguardian.com/technology/2020/jul/19/tiktok-
halts-talks-on-london-headquarters-amid-uk-china-tensions, accessed 4 July 2022.
In re: A Warrant. 2016. In re: A Warrant to Search a Certain E-Mail Account Controlled and
Maintained by Microsoft Corporation, Appellant, v. United States of America, Appellee.
United States Court of Appeals, Second Circuit.
In Re Search Warrant No. 16-1061-M to Google, 232 F. Supp. 3d 708. US District Court,
E.D. Pennsylvania, 2017.
International Law Commission. 2006. ‘Report on the Work of Its Fifty-Eighth Session’. UN
Doc. A/61/10. 1 May–9 June and 3 July–11 August.
in ’t Veld, Sophie. 2019. ‘Long Arm of American Law? Not in Europe!’. February. https://
sophieintveld.eu/long-arm-of-american-law-not-in-europe/, accessed 4 July 2022.
in ’t Veld, Sophie. 2020. ‘Letter to the European Commission’, 10 June 2020. https://twitter.
com/SophieintVeld/status/1270972850367877121/photo/1.
Ireland. 2018. ‘Brief of the Republic of Ireland as Amicus Curiae in Support of Neither Party,
United States of America v. Microsoft Corporation’. US Supreme Court.
Irish DPC. 2013. ‘Letter from the Irish Data Protection Commissioner’, 26 July 2013. http://
www.europe-v-facebook.org/DPC_PRISM_all.pdf (Letters from the DPC, compiled by
Maximilian Schrems, 2–3).
Irish DPC. 2020. ‘DPC Statement on CJEU Decision’. Data Protection Commis-
sion. https://www.dataprotection.ie/news-media/press-releases/dpc-statement-cjeu-
decision, accessed 4 July 2022.
Jääskinen, Niilo. 2013. ‘Opinion of Advocate General Jääskinen Delivered on 25 June 2013.
Case C-131/12. Google Spain SL, Google Inc. v Agencia Española de Protección de Datos
(AEPD), Mario Costeja González’. ECLI:EU:C:2013:424.
Jagd, Søren. 2011. ‘Pragmatic Sociology and Competing Orders of Worth in Organizations’.
European Journal of Social Theory 14 (3): 343–59. doi: 10.1177/1368431011412349.
Janda, Michael, and Peter Ryan. 2019. ‘Westpac Faces Fines over “Serious and
Systemic” Anti-Money Laundering Breaches, AUSTRAC Says’. ABC News, 19 Novem-
ber 2019. https://www.abc.net.au/news/2019-11-20/westpac-to-face-fines-anti-money-
laundering-terrorism-breaches/11720474, accessed 4 July 2022.
Janger, Edward J. 2002. ‘Privacy Property, Information Costs, and the Anticommons’.
Hastings Law Journal 54: 899–929.
Johns, Fleur, and Caroline Compton. 2022. ‘Data Jurisdictions and Rival Regimes of Algo-
rithmic Regulation’. Regulation & Governance 16 (1): 63–84. doi: 10.1111/rego.12296.
Johnson, Bobbie, Charles Arthur, and Josh Halliday. 2010. ‘Libyan Domain Shutdown
No Threat, Insists Bit.Ly’. The Guardian, 9 October 2010. http://www.theguardian.com/
technology/2010/oct/08/bitly-libya, accessed 4 July 2022.
Johnson, David R, and David Post. 1996. ‘Law and Borders: The Rise of Law in Cyberspace’.
Stanford Law Review 48: 1367–402.
Joined Cases C-92/09 and C-93/09. 2010. Volker und Markus Schecke GbR and Hartmut
Eifert v Land Hessen. Court of Justice: General Court.
Joined Cases C-203/15 and C-698/15. 2016. Tele2 Sverige AB v. Post- och Telestyrelsen
and Secretary of State for the Home Department v Tom Watson and Others. Court of
Justice of the EU: General Court.
Joined Cases C-317/04 and C-318/04. 2006. European Parliament v. Council of the Euro-
pean Union and Commission of the European Communities. Court of Justice of the EU:
General Court.
Joined Cases C-511/18, C-512/18 and C-520/18. 2016. La Quadrature du Net and Others v
Commission. Court of Justice of the EU: General Court.
Joined Cases C-511/18, C-512/18 and C-520/18. 2020. La Quadrature du Net et. al v. Premier
ministre et al. and Order des barreaux francophones et germanophones et al. v. Conseil
des ministres. Court of Justice of the EU: Grand Chamber.
Jones, Meg Leta. 2018. Ctrl + Z: The Right to Be Forgotten. New York and London: NYU
Press.
Jóri, András. 2015. ‘Shaping vs Applying Data Protection Law: Two Core Functions of Data
Protection Authorities’. International Data Privacy Law 5 (2): 133–43.
Jourová, Věra. 2018. ‘Tweet’, 26 March 2018. https://twitter.com/VeraJourova/status/
978256311480709120, accessed 4 July 2022.
JSB. 2011. ‘Report on the Inspection of Europol’s Implementation of the TFTP Agreement,
Conducted in November 2010 by the Europol Joint Supervisory Body’. JSB/Ins.
11–07. Brussels. https://www.ip-rs.si/fileadmin/user_upload/Pdf/novice/Terrorist_
Finance_Tracking_Program__TFTP__inspection_report_-_public_version.pdf.
JSB. 2015. ‘Europol JSB Inspection of the Implementation of the TFTP Agreement’. Letter
from Ms Vanna Palumbo, Chair of the Europol Joint Supervisory Board to Mr Étienne
Schneider, President of the Council 15/28. Brussels.
Kaczmarek, Michael, and Elena Lazarou. 2018. ‘US Counter-Terrorism since 9/11. Trends
under the Trump Administration’. Washington DC: European Parliamentary Research
Service.
Kalyanpur, Nikhil, and Abraham L. Newman. 2019a. ‘Mobilizing Market Power: Jurisdic-
tional Expansion as Economic Statecraft’. International Organization 73 (1): 1–34. doi:
10.1017/S0020818318000334.
Kalyanpur, Nikhil, and Abraham L Newman. 2019b. ‘The MNC-Coalition Paradox: Issue
Salience, Foreign Firms and the General Data Protection Regulation’. Journal of Common
Market Studies 57 (3): 448–67.
Kang, Cecilia. 2018. ‘Tech Industry Pursues a Federal Privacy Law, on Its Own Terms’. The
New York Times, 27 August 2018. https://www.nytimes.com/2018/08/26/technology/
tech-industry-federal-privacy-law.html, accessed 4 July 2022.
Kanter, James, and Raphael Minder. 2010. ‘Air Travelers Lead European Privacy Con-
cerns’. The New York Times, 28 April 2010. https://www.nytimes.com/2010/04/28/
world/europe/28iht-privacy.html, accessed 4 July 2022.
Katz v. United States. 1967. 389 U.S. 347. Supreme Court of the US.
Kaunert, Christian, and Sarah Léonard. 2012. ‘Introduction: Supranational Governance
and European Union Security after the Lisbon Treaty: Exogenous Shocks, Policy
Entrepreneurs and 11 September 2001’. Cooperation and Conflict 47 (4): 417–32.
Kaunert, Christian, Sarah Léonard, and Alex MacKenzie. 2012. ‘The Social Con-
struction of an EU Interest in Counter-Terrorism: US Influence and Internal
Struggles in the Cases of PNR and SWIFT’. European Security 21 (4): 474–96. doi:
10.1080/09662839.2012.688812.
Kaunert, Christian, Sarah Léonard, and Alex MacKenzie. 2015. ‘The European Parliament
in the External Dimension of EU Counter-Terrorism: More Actorness, Accountability
and Oversight 10 Years On?’ Intelligence and National Security 30 (2–3): 357–76. doi:
10.1080/02684527.2014.988446.
Kaunert, Christian, and Sarah Léonard. 2019. ‘The Collective Securitisation of Ter-
rorism in the European Union’. West European Politics 42 (2): 261–77. doi:
10.1080/01402382.2018.1510194.
Kauppi, Niilo, and Mikael R. Madsen. 2014. ‘Fields of Global Governance: How Transna-
tional Power Elites Can Make Global Governance Intelligible’. International Political
Sociology 8 (3): 324–30.
Kelion, Leo. 2019. ‘Google Wins Landmark Right to Be Forgotten Case’. BBC News,
24 September 2019. https://www.bbc.com/news/technology-49808208, accessed 4 July
2022.
Keller, Reiner. 2018. ‘The Sociology of Knowledge Approach to Discourse’. In The Sociology
of Knowledge Approach to Discourse: Investigating the Politics of Knowledge and Meaning-
Making, edited by Reiner Keller, Anna-Katharina Hornidge, and Wolf J. Schünemann,
16–47. Routledge Advances in Sociology. London and New York: Routledge. doi:
10.4324/9781315170008.
Keohane, Robert O., and David G. Victor. 2011. ‘The Regime Complex for Climate Change’.
Perspectives on Politics 9 (1): 7–23. doi: 10.1017/S1537592710004068.
Kergueno, Raphaël. 2017. ‘The Über-Lobbyists: How Silicon Valley Is Changing Brussels
Lobbying’. Transparency International EU, May. https://transparency.eu/uber-lobbyists,
accessed 4 July 2022.
Kerry, Cameron. 2012. ‘Avoiding a Data Divide between the US and the EU’. POLITICO,
November. https://www.politico.eu/article/avoiding-a-data-divide-between-the-us-
and-the-eu/, accessed 4 July 2022.
Kerry, Cameron. 2014. ‘Missed Connections: Talking with Europe about Data, Privacy,
and Surveillance’. Brookings. https://www.brookings.edu/wp-content/uploads/2016/06/
Kerry_EuropeFreeTradePrivacy.pdf, accessed 4 July 2022.
Kettemann, Matthias C. 2020. The Normative Order of the Internet: A Theory of Rule and
Regulation Online. Oxford: Oxford University Press.
Khan, Mehreen. 2019. ‘Brussels Defends Lack of Blockbuster Fines for Big Tech
Groups’. Financial Times, July. https://www.ft.com/content/3629b0fc-ad73-11e9-8030-
530adfa879c2, accessed 4 July 2022.
King, Colin, and Clive Walker. 2015. ‘Counter Terrorism Financing: Redundant Fragmen-
tation?’ New Journal of European Criminal Law 6 (3): 372–95.
King, Colin, Clive Walker, and Jimmy Gurulé, eds. 2018. The Palgrave Handbook of
Criminal and Terrorism Financing Law. Cham: Palgrave Macmillan.
Kirby, Michael. 2011. ‘The History, Achievement and Future of the 1980 OECD Guidelines
on Privacy’. International Data Privacy Law 1 (1): 6–14. doi: 10.1093/idpl/ipq002.
Kirkhope, Timothy. 2016. Protection of Individuals with Regard to the Processing of
Personal Data: Processing of Personal Data for the Purposes of Crime Prevention
(Debate). 13 April 2016. Strasbourg: European Parliament. http://www.europarl.europa.
eu/doceo/document/CRE-8-2016-04-13-INT-3-521-0000_NL.html?redirect, accessed
4 July 2022.
Kiss, Jemima. 2015. ‘Dear Google: Open Letter from 80 Academics on “Right to Be For-
gotten”’. The Guardian, 14 May 2015. https://www.theguardian.com/technology/2015/
may/14/dear-google-open-letter-from-80-academics-on-right-to-be-forgotten, accessed
4 July 2022.
Kjaer, Poul F., and Antje Vetterlein. 2018. ‘Regulatory Governance: Rules,
Resistance and Responsibility’. Contemporary Politics 24 (5): 497–506. doi:
10.1080/13569775.2018.1452527.
Klass and Others v. Federal Republic of Germany. 1978. European Court of Human Rights.
App no. 5029/71.
Knorr Cetina, Karin. 2005. ‘Objectual Practice’. In The Practice Turn in Contemporary The-
ory, edited by Karin Knorr Cetina, Theodore R Schatzki, and Eike von Savigny, 184–97.
London: Routledge.
Kobrin, Stephen J. 2004. ‘Safe Harbours Are Hard to Find: The Trans-Atlantic Data Privacy
Dispute, Territorial Jurisdiction and Global Governance’. Review of International Studies
30 (1): 111–31.
Kornprobst, Markus. 2014. ‘From Political Judgements to Public Justifications (and Vice
Versa): How Communities Generate Reasons upon Which to Act’. European Journal of
International Relations 20 (1): 192–216.
Kottasová, Ivana. 2018. ‘These Companies Are Getting Killed by GDPR’. CNN Money,
May. https://money.cnn.com/2018/05/11/technology/gdpr-tech-companies-losers/
index.html.
Kreuder-Sonnen, Christian. 2018. ‘Political Secrecy in Europe: Crisis Manage-
ment and Crisis Exploitation’. West European Politics 41 (4): 958–80. doi:
10.1080/01402382.2017.1404813.
Kreuder-Sonnen, Christian. 2019. Emergency Powers of International Organizations:
Between Normalization and Containment. Oxford: Oxford University Press.
Kreuder-Sonnen, Christian, and Michael Zürn. 2020. ‘After Fragmentation: Norm Col-
lisions, Interface Conflicts, and Conflict Management’. Global Constitutionalism 9 (2):
241–67.
Krisch, Nico. 2010. Beyond Constitutionalism: The Pluralist Structure of Postnational Law.
Oxford: Oxford University Press.
Krisch, Nico, Francesco Corradini, and Lucy Lu Reimers. 2020. ‘Order at the Margins:
The Legal Construction of Norm Collisions Over Time’. Global Constitutionalism 9 (2):
343–63.
Kulesza, Joanna. 2018. ‘Balancing Privacy and Security in a Multistakeholder Environment:
ICANN, WHOIS and GDPR’. The Visio Journal 3: 49–58.
Kuner, Christopher. 2013. Transborder Data Flows and Data Privacy Law. Oxford: Oxford
University Press.
Kuner, Christopher. 2015. ‘Extraterritoriality and Regulation of International Data Trans-
fers in EU Data Protection Law’. International Data Privacy Law 5 (4): 235–45. doi:
10.1093/idpl/ipv019.
Kuner, Christopher. 2017. ‘Reality and Illusion in EU Data Transfer Regulation Post
Schrems’. German Law Journal 18 (4): 881–918.
Kydd, Andrew H. 2021. ‘Decline, Radicalization and the Attack on the US Capitol’. Violence:
An International Journal 2 (1): 3–23.
Lamla, Jörn. 2016. ‘Schluss: Demokratische Alternativen der Reterritorialisierung’. In Die
Reterritorialisierung des Digitalen: Zur Reaktion nationaler Demokratie auf die Krise
der Privatheit nach Snowden, edited by Barbara Büttner, Simon Ledder, Carsten Ochs,
Fabian Pittroff, Christian L. Geminn, Thilo Hagendorff, and Jörn Lamla, 153–9. Kassel:
Kassel University Press.
Lancieri, Filippo Maria. 2018. ‘Digital Protectionism? Antitrust, Data Protection, and
the EU/US Transatlantic Rift’. Journal of Antitrust Enforcement 7 (1): 27–53. doi:
10.1093/jaenfo/jny012.
Larsson, Olof, and Daniel Naurin. 2016. ‘Judicial Independence and Political Uncer-
tainty: How the Risk of Override Affects the Court of Justice of the EU’. International
Organization 70 (2): 377–408. doi: 10.1017/S0020818316000047.
Laurer, Moritz, and Timo Seidl. 2021. ‘Regulating the European Data-Driven Econ-
omy: A Case Study on the General Data Protection Regulation’. Policy & Internet 13:
257–77.
Law Enforcement Directive (EU) 2016/680 of the European Parliament and of the Coun-
cil of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing
of Personal Data by Competent Authorities for the Purposes of the Prevention, Inves-
tigation, Detection or Prosecution of Criminal Offences or the Execution of Criminal
Penalties, and on the Free Movement of Such Data, and Repealing Council Framework
Decision 2008/977/JHA. 2016. OJ L 119.
Leander, Anna. 2005. ‘The Power to Construct International Security: On the Significance
of Private Military Companies’. Millennium 33 (3): 803–25.
Leander, Anna. 2011. ‘The Promises, Problems, and Potentials of a Bourdieu-Inspired
Staging of International Relations’. International Political Sociology 5 (3): 294–313.
Lee, Dave. 2015. ‘How Worried Is Silicon Valley about Safe Harbour?’ BBC News, 7 October
2015. https://www.bbc.com/news/technology-34461682, accessed 4 July 2022.
Lee, Dave. 2017. ‘US Government “Monitored Bank Transfers”’. BBC News, 16 April 2017.
https://www.bbc.com/news/technology-39606575, accessed 4 July 2022.
Leiser, Mark, and Bart Custers. 2019. ‘The Law Enforcement Directive: Conceptual Chal-
lenges of EU Directive 2016/680’. European Data Protection Law Review 5 (3): 367–78.
Lévy, Emmanuel. 2016. ‘Valls a-t-il voulu la peau d’un préfet?’ Marianne, 31 March 2016.
https://www.marianne.net/politique/valls-t-il-voulu-la-peau-dun-prefet, accessed 4 July
2022.
LIBE. 2010. ‘Report on the Proposal for a Council Decision on the Conclusion of the
Agreement between the European Union and the United States of America on the Pro-
cessing and Transfer of Financial Messaging Data from the European Union to the United
States for Purposes of the Terrorist Finance Tracking Program’. 05305/1/2010REV –
C7-0004/2010 – 2009/0190(NLE). Brussels.
LIBE. 2013. Hearing. LIBE Committee Inquiry on Electronic Mass Surveillance of EU
Citizens. https://multimedia.europarl.europa.eu/en/committee-on-civil-liberties-
justice-and-home-affairs_20131007-1900-COMMITTEE-LIBE_vd. (07/10/2013 video
recording).
LIBE. 2014a. ‘LIBE Committee Inquiry: Electronic Mass Surveillance of EU Citizens’.
https://europarl.europa.eu/document/activities/cont/201410/20141016ATT91322/
20141016ATT91322EN.pdf, accessed 4 July 2022.
LIBE. 2014b. ‘Report on the US NSA Surveillance Programme: Surveillance Bodies in
Various Member States and Their Impact on EU Citizens’ Fundamental Rights and on
Transatlantic Cooperation in Justice and Home Affairs’. 2013/2188(INI). Brussels.
LIBE. 2015. ‘Second Report on the Proposal for a Directive of the European Parliament
and of the Council Second Report on the Use of Passenger Name Record Data for the
Prevention, Detection, Investigation and Prosecution of Terrorist Offences and Serious
Crime’. (COM(2011)0032 – C7-0039/2011 – 2011/0023(COD)). Rapporteur: Timothy
Kirkhope. Brussels.
LIBE. 2019. ‘Draft Report on the Proposal for a Regulation of the European Parliament
and of the Council on European Production and Preservation Orders for Electronic
Evidence in Criminal Matters (COM(2018)0225 – C8-0155/2018 – 2018/0108(COD)).
Rapporteur: Birgit Sippel’. Brussels.
Lichtblau, Eric, and James Risen. 2006. ‘Bank Data Is Sifted by U.S. in Secret to Block Terror’.
The New York Times, 23 June 2006. https://www.nytimes.com/2006/06/23/washington/
23intel.html, accessed 4 July 2022.
Lien, Tracy. 2015. ‘Microsoft Handed FBI Data on Charlie Hebdo Probe in 45 Minutes’.
Los Angeles Times, 20 January 2015. https://www.latimes.com/business/technology/la-
fi-tn-microsoft-charlie-hebdo-20150120-story.html, accessed 4 July 2022.
Ligeti, Katalin, and Gavin Robinson. 2021. ‘Sword, Shield and Cloud: Toward a Euro-
pean System of Public–Private Orders for Electronic Evidence in Criminal Matters?’
In Surveillance and Privacy in the Digital Age: European, Transatlantic and Global Per-
spectives, edited by Valsamis Mitsilegas and Niovi Vavoula, 1st edn, 27–70. Oxford: Hart
Publishing.
Ligue des droits humains. 2022. Ligue des droits humains v. Conseil des ministres. Court
of Justice: Grand Chamber. Case C-817/19.
Lillington, Karlin. 2019. ‘Schrems II Will Seriously Stress Test EU’s Data Privacy Rules’. The
Irish Times, 11 July 2019. https://www.irishtimes.com/business/technology/schrems-ii-
will-seriously-stress-test-eu-s-data-privacy-rules-1.3952925, accessed 4 July 2022.
Linklater, Andrew. 1998. The Transformation of Political Community: Ethical Foundations
of the Post-Westphalian Era. Studies in International Relations. Cambridge: Polity.
Lobbyfacts. 2022. ‘LobbyFacts Database’. https://lobbyfacts.eu/, accessed 4 July 2022.
Locke, John. 1967. Locke: Two Treatises of Government. Cambridge: Cambridge University
Press.
Long, William J., and Marc Pang Quek. 2002. ‘Personal Data Privacy Protection in an Age of
Globalization: The US–EU Safe Harbor Compromise’. Journal of European Public Policy
9 (3): 325–44. doi: 10.1080/13501760210138778.
Lowe, Peter. 2006. ‘Counterfeiting: Links to Organised Crime and Terrorist Funding’.
Journal of Financial Crime 13 (2): 255–7.
Lynskey, Orla. 2015a. The Foundations of EU Data Protection Law. Oxford: Oxford Univer-
sity Press.
Lynskey, Orla. 2015b. ‘Control over Personal Data in a Digital Age: Google Spain v AEPD
and Mario Costeja Gonzalez’. The Modern Law Review 78 (3): 522–34. doi: 10.1111/1468-
2230.12126.
Lynskey, Orla. 2019. ‘Grappling with “Data Power”: Normative Nudges from Data Protec-
tion and Privacy’. Theoretical Inquiries in Law 20 (1): 189–220.
Lyon, David. 1994. The Electronic Eye: The Rise of Surveillance Society. Minneapolis, MN:
University of Minnesota Press.
Lyon, David. 2007. Surveillance Studies: An Overview. Cambridge: Polity.
Lyon, David. 2015. Surveillance after Snowden. Cambridge and Malden, MA: Polity.
McCourt, David M. 2016. ‘Practice Theory and Relationalism as the New Constructivism.’
International Studies Quarterly 60 (3): 475–85.
McCulloch, Jude, and Sharon Pickering. 2009. ‘Pre-Crime and Counter-Terrorism: Imag-
ining Future Crime in the “War on Terror”’. The British Journal of Criminology 49 (5):
628–45.
Mac Ginty, Roger. 2017. ‘A Material Turn in International Relations: The 4x4, Intervention
and Resistance’. Review of International Studies 43 (5): 855–74.
McGoldrick, Dominic. 2013. ‘Developments in the Right to Be Forgotten’. Human Rights
Law Review 13 (4): 761–76. doi: 10.1093/hrlr/ngt035.
MacIntyre, Alasdair C. 1999. Dependent Rational Animals: Why Human Beings Need the
Virtues. Vol. 20. London: Duckworth.
MacKenzie, Alex, and Ariadna Ripoll Servent. 2012. ‘The European Parliament as a “Norm
Taker”? EU–US Relations after the SWIFT Agreement’. European Foreign Affairs Review
17 (2/1): 71–86.
MacKinnon, Catharine A. 1987. Feminism Unmodified: Discourses on Life and Law. Cam-
bridge, MA, and London: Harvard University Press.
McNamee, Joe. 2016. ‘The Curious Tale of the French Prime Minister, PNR and Peculiar
Patterns’. EDRi, October. https://edri.org/curious-tale-french-prime-minister-pnr-and-
peculiar-patterns/, accessed 4 July 2022.
The Madrid Privacy Declaration: Global Privacy Standards for a Global World. 2009.
https://thepublicvoice.org/madrid-declaration/, accessed 4 July 2022.
Madsen, Mikael Rask. 2013. ‘The Power of Legal Knowledge in the Reform of Fundamental
Law’. In Lawyering Europe: European Law As a Transnational Social Field. Modern Stud-
ies in European Law, edited by Antoine Vauchez and Bruno de Witte, 197–220. Oxford:
Hart Publishing.
Mager, Astrid, and Christian Katzenbach. 2021. ‘Future Imaginaries in the Making and
Governing of Digital Technology: Multiple, Contested, Commodified’. New Media &
Society 23 (2): 223–36. doi: 10.1177/1461444820929321.
Mahoney, James, and Kathleen Thelen. 2010. ‘A Theory of Gradual Institutional Change’.
Explaining Institutional Change: Ambiguity, Agency, and Power 1: 1–37.
Manancourt, Vincent. 2020. ‘Europe’s Data Grab’. POLITICO, 19 February 2020. https://
www.politico.eu/article/europe-data-grab-protection-privacy/, accessed 4 July 2022.
Mansell, Robin. 2011. ‘New Visions, Old Practices: Policy and Regulation in the Internet
Era’. Continuum 25 (1): 19–32.
Mari, Angelica. 2021. ‘WhatsApp to Adjust Privacy Rules in Brazil’. ZDNet. https://
www.zdnet.com/article/whatsapp-to-adjust-privacy-rules-in-brazil/, accessed 4 July
2022.
Maricut, Adina. 2016. ‘The Institutional Development of the EU’s Area of Freedom, Security
and Justice: Roles, Behaviors, and the Logic of Justification’. PhD Dissertation. Budapest:
CEU.
Markoff, John, and Andrew E. Kramer. 2009. ‘In Shift, U.S. Talks to Russia on Internet Secu-
rity’. The New York Times, 12 December 2009. https://www.nytimes.com/2009/12/13/
science/13cyber.html, accessed 4 July 2022.
Marquenie, Thomas. 2017. ‘The Police and Criminal Justice Authorities Directive: Data
Protection Standards and Impact on the Legal Framework’. Computer Law & Security
Review 33 (3): 324–40.
Masiero, Silvia, and Soumyo Das. 2019. ‘Datafying Anti-Poverty Programmes: Implica-
tions for Data Justice’. Information, Communication & Society 22 (7): 916–33. doi:
10.1080/1369118X.2019.1575448.
Mayer-Schönberger, Viktor. 2010. ‘Beyond Privacy, Beyond Rights: Toward a “Systems”
Theory of Information Governance’. California Law Review 98 (6): 1853–85.
Mayer-Schönberger, Viktor. 2011. Delete: The Virtue of Forgetting in the Digital Age.
Princeton, NJ, and Oxford: Princeton University Press.
Mearsheimer, John J. 2003. The Tragedy of Great Power Politics. New York and London:
WW Norton & Company.
Meehan, Patrick. 2011. ‘Congressional Hearing: Intelligence Sharing and Terrorist Travel:
How DHS Addresses the Mission of Providing Security, Facilitating Commerce and
Protecting Privacy for Passengers Engaged in International Travel’. Washington DC.
Mérand, Frédéric. 2010. ‘Pierre Bourdieu and the Birth of European Defense’. Security
Studies 19 (2): 342–74.
Microsoft. 2018. ‘Brief in Opposition. In Re Search Warrant. USA v. Microsoft’. 584 U.S. __
(2018). US Supreme Court.
Microsoft. 2020. ‘Law Enforcement Requests Report: Microsoft CSR’. Microsoft. https://
www.microsoft.com/en-us/corporate-responsibility/lerr, accessed 4 July 2022.
Mill, John Stuart. 1859. On Liberty. 4th edn. London: Longman, Roberts, & Green Co.
http://www.econlib.org/library/Mill/mlLbty3.html, accessed 4 July 2022.
Miller, David. 2017. Liberty Reader. London: Routledge.
Milliken, Jennifer. 1999. ‘The Study of Discourse in International Relations: A Critique of
Research and Methods’. European Journal of International Relations 5 (2): 225–54.
Mitsilegas, Valsamis. 2014. ‘Transatlantic Counterterrorism Cooperation and European
Values: The Elusive Quest for Coherence’. In A Transatlantic Community of Law: Legal
Perspectives on the Relationship between the EU and US Legal Orders, edited by Elaine
Fahey and Deirdre Curtin, 289–315. Cambridge: Cambridge University Press.
Mitsilegas, Valsamis. 2020. ‘The Preventive Turn in European Security Policy: Towards
a Rule of Law Crisis?’ In EU Law in Populist Times: Crises and Prospects,
edited by Francesca Bignami, 301–18. Cambridge: Cambridge University Press. doi:
10.1017/9781108755641.011.
Möllers, Christoph. 2020. The Possibility of Norms. Oxford: Oxford University Press.
Monar, Jörg. 2010. ‘The Rejection of the EU–US SWIFT Interim Agreement by the Euro-
pean Parliament: A Historic Vote and Its Implications’. European Foreign Affairs Review
15: 143–51.
Monar, Jörg. 2015. ‘The EU as an International Counter-Terrorism Actor: Progress
and Constraints’. Intelligence and National Security 30 (2–3): 333–56. doi:
10.1080/02684527.2014.988448.
Moody, Glyn. 2016. ‘In “an Unusual Move,” US Government Asks to Join Key EU Face-
book Privacy Case’. Ars Technica, June. https://arstechnica.com/tech-policy/2016/06/
eu-facebook-schrems-case-us-government-amicus-curiae/, accessed 4 July 2022.
Mueller, Milton. 2010. Networks and States: The Global Politics of Internet Governance.
Information Revolution and Global Politics. Cambridge, MA: MIT Press.
Mueller, Milton, and Mawaki Chango. 2008. ‘Disrupting Global Governance: The Internet
Whois Service, ICANN, and Privacy’. Journal of Information Technology & Politics 5 (3):
303–25.
Murray, Andrew. 2007. The Regulation of Cyberspace: Control in the Online Environment.
Abingdon and New York: Routledge.
Nachtwey, Oliver, and Timo Seidl. 2020. ‘The Solutionist Ethic and the Spirit of Digital
Capitalism’. SocArXiv Working Paper, March. https://osf.io/preprints/socarxiv/sgjzq/,
accessed 4 July 2022.
Nagel, Thomas. 2005. ‘The Problem of Global Justice’. Philosophy & Public Affairs 33 (2):
113–47.
Nance, Mark T. 2018. ‘Re-Thinking FATF: An Experimentalist Interpretation of the Finan-
cial Action Task Force’. Crime, Law and Social Change 69 (2): 131–52.
Neveu, Erik. 2018. ‘Bourdieu’s Capital(s): Sociologizing an Economic Concept’. In The
Oxford Handbook of Pierre Bourdieu, edited by Thomas Medvetz and Jeffrey J. Sallaz,
347–74. New York: Oxford University Press.
Newman, Abraham L. 2008. Protectors of Privacy: Regulating Personal Data in the Global
Economy. Ithaca, NY: Cornell University Press.
Newman, Abraham L., and David Bach. 2004. ‘Self-Regulatory Trajectories in the Shadow
of Public Power: Resolving Digital Dilemmas in Europe and the United States’. Gover-
nance 17 (3): 387–413.
The New York Times. 2015. ‘Microsoft Challenges Warrant for Emails Stored in Ireland’. The
New York Times, 9 September 2015. https://www.nytimes.com/2015/09/10/technology/
microsoft-challenges-warrant-for-emails-stored-in-ireland.html, accessed 4 July 2022.
Niemann, Holger. 2019. The Justification of Responsibility in the UN Security Council: Prac-
tices of Normative Ordering in International Relations. Routledge Global Cooperation
Series. London: Routledge.
Ni Loideain, Nora. 2016. ‘The End of Safe Harbor: Implications for EU Digital Privacy and
Data Protection Law’. Internet Law 19 (8): 7–14.
Nissenbaum, Helen. 1998. ‘Protecting Privacy in an Information Age: The Problem of
Privacy in Public’. Law and Philosophy 17 (5): 559–96.
Nissenbaum, Helen. 2009. Privacy in Context: Technology, Policy, and the Integrity of Social
Life. Stanford Law Books. Stanford, CA: Stanford University Press.
noyb. 2020. ‘CJEU Judgment: First Statement’. Noyb.Eu. 16 July 2020. https://noyb.eu/en/
cjeu, accessed 4 July 2022.
Nozick, Robert. 1974. Anarchy, State, and Utopia. Vol. 5038. New York: Basic Books.
Nussbaum, Martha. 1994. ‘Patriotism and Cosmopolitanism’. Boston Review XIX (5): 3–16.
OAS. 2004. ‘A Comprehensive Inter-American Cybersecurity Strategy: A Multidimensional
and Multidisciplinary Approach to Creating a Culture of Cybersecurity’. AG/RES. 1939
(XXXIII-O/03).
Obama, Barack. 2013. ‘Statement by the President’. San Jose, California. https://
obamawhitehouse.archives.gov/the-press-office/2013/06/07/statement-president,
accessed 4 July 2022.
Obama, Barack. 2014. ‘Text of the President’s Remarks on NSA and Surveillance’. Lawfare,
January. https://www.lawfareblog.com/text-presidents-remarks-nsa-and-surveillance,
accessed 4 July 2022.
Obendiek, Anke S. 2021. ‘Take Back Control? Digital Sovereignty and a Vision for Europe’.
Policy Paper. Jacques Delors Centre. May.
Obendiek, Anke S. 2022. ‘What Are We Actually Talking about? Conceptualizing Data as a
Governable Object in Overlapping Jurisdictions’. International Studies Quarterly 66 (1).
doi: 10.1093/isq/sqab080.
Occhipinti, John D. 2015. ‘Still Moving toward a European FBI? Re-Examining the Politics
of EU Police Cooperation’. Intelligence and National Security 30 (2–3): 234–58.
Ochs, Carsten, Fabian Pittroff, Barbara Büttner, and Jörn Lamla. 2016. ‘Governing the
Internet in the Privacy Arena’. Internet Policy Review 5 (3). doi: 10.14763/2016.3.426.
OECD. 1980. ‘OECD Guidelines on the Protection of Privacy and Transborder Flows
of Personal Data’. http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotection
ofprivacyandtransborderflowsofpersonaldata.htm, accessed 4 July 2022.
OECD. 2013. Recommendation of the Council concerning Guidelines Governing the
Protection of Privacy and Transborder Flows of Personal Data. OECD/LEGAL/0188
https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0188.
Office of the United States Trade Representative. 2017. ‘2017 Special 301 Report’. https://
ustr.gov/sites/default/files/301/2017%20Special%20301%20Report%20FINAL.PDF,
accessed 4 July 2022.
Ohmae, Kenichi. 1995. The End of the Nation State: The Rise of Regional Economies. Free
Press Paperback. London and New York: Free Press.
Olmstead v. United States. 1928. 277 U.S. 438. Supreme Court of the US.
O’Neill, Onora. 2000. Bounds of Justice. Cambridge: Cambridge University Press.
Parkin, Joanna, Didier Bigo, Sergio Carrera, Amandine Scherrer, Julien Jeandesboz,
Nicholas Hernanz, and Francesco Ragazzi, European Parliament, and Directorate-
General for Internal Policies of the Union. 2013. National Programmes for Mass Surveil-
lance of Personal Data in EU Member States and Their Compatibility with EU Law.
Luxembourg: Publications Office.
Parsons, Christopher. 2017. ‘The (In)Effectiveness of Voluntarily Produced Transparency
Reports’. Business & Society 58 (1): 103–31. doi: 10.1177/0007650317717957.
Pasquale, Frank, and Tara Adams Ragone. 2013. ‘Protecting Health Privacy in an Era of Big
Data Processing and Cloud Computing’. Stanford Technology Law Review 17: 595–654.
Patten, Chris. 2004. ‘Speech by the Rt. Hon Chris Patten, Commissioner for External Rela-
tions: Passenger Name Records (PNR) EC /US Agreement’. SPEECH/04/189 Presented
at the European Parliament Plenary Session, Strasbourg, April 21.
Pawlak, Patryk. 2009. ‘Network Politics in Transatlantic Homeland Security Cooperation’.
Perspectives on European Politics & Society 10 (4): 560–81.
Pellandini-Simányi, Léna. 2014. ‘Bourdieu, Ethics and Symbolic Power’. The Sociological
Review 62 (4): 651–74. doi: 10.1111/1467-954X.12210.
Pettit, Philip. 1997. Republicanism: A Theory of Freedom and Government. Oxford: Claren-
don Press.
Pieth, Mark. 2006. ‘Criminalizing the Financing of Terrorism’. Journal of International
Criminal Justice 4 (5): 1074–86.
Pirkova, Eliska, and Estelle Massé. 2019. ‘EU Court Decides on Two Major “Right to Be
Forgotten” Cases: There Are No Winners Here’. Access Now, October. https://www.
accessnow.org/eu-court-decides-on-two-major-right-to-be-forgotten-cases-there-are-
no-winners-here/, accessed 4 July 2022.
PNR Directive. 2016. Directive (EU) 2016/681 of the European Parliament and of the
Council of 27 April 2016 on the Use of Passenger Name Record (PNR) Data for the
Prevention, Detection, Investigation and Prosecution of Terrorist Offences and Serious
Crime. OJ L 119.
Pogge, Thomas W. 1989. Realizing Rawls. Ithaca, NY, and London: Cornell University
Press.
Pogge, Thomas W. 2008. World Poverty and Human Rights. 2nd edn. World Poverty and
Human Rights: Cosmopolitan Responsibilities and Reforms. Cambridge and Malden,
MA: Polity.
Pohle, Jörg. 2017. ‘Datenschutz und Technikgestaltung: Geschichte und Theorie des Daten-
schutzes aus informatischer Sicht und Folgerungen für die Technikgestaltung’. PhD
Dissertation. Humboldt-Universität zu Berlin. doi: 10.18452/19136.
Pohle, Julia, M. Hösl, and Ronja Kniep. 2016. ‘Analysing Internet Policy as a Field of
Struggle’. Internet Policy Review 5 (3). doi: 10.14763/2016.3.412.
Pohle, Julia, and Thorsten Thiel. 2020. ‘Digital Sovereignty’. Internet Policy Review 9 (4).
doi: 10.14763/2020.4.1532.
Porter, Andrew L., and Annegret Bendiek. 2012. ‘Counterterrorism Cooperation in
the Transatlantic Security Community’. European Security 21 (4): 497–517. doi:
10.1080/09662839.2012.688811.
Posner, Richard A. 1977. ‘The Right of Privacy’. Georgia Law Review 12 (3): 393–422.
Posner, Richard A. 1981. ‘The Economics of Privacy’. The American Economic Review 71
(2): 405–9.
Pouliot, Vincent. 2007. ‘“Sobjectivism”: Toward a Constructivist Methodology’. Interna-
tional Studies Quarterly 51 (2): 359–84.
Pouliot, Vincent. 2016. ‘Hierarchy in Practice: Multilateral Diplomacy and the Governance
of International Security’. European Journal of International Security 1 (1): 5–26. doi:
10.1017/eis.2015.4.
Pouponneau, Florent, and Frédéric Mérand. 2017. ‘Diplomatic Practices, Domestic Fields,
and the International System: Explaining France’s Shift on Nuclear Nonproliferation’.
International Studies Quarterly 61 (1): 123–35. doi: 10.1093/isq/sqw046.
Powles, Julia. 2015a. ‘The Case That Won’t Be Forgotten’. Loyola University Chicago Law
Journal 47: 583–615.
Powles, Julia. 2015b. ‘Data Privacy: The Tide Is Turning in Europe: But Is It Too Little, Too
Late?’ The Guardian, 6 April 2015. https://www.theguardian.com/technology/2015/apr/
06/data-privacy-europe-facebook, accessed 4 July 2022.
Powles, Julia. 2015c. ‘Tech Companies like Facebook Not above the Law, Says Max Schrems’.
The Guardian, 9 October 2015. http://www.theguardian.com/technology/2015/oct/09/
facebook-data-privacy-max-schrems-european-court-of-justice, accessed 4 July 2022.
Powles, Julia, and Enrique Chaparro. 2015. ‘How Google Determined Our Right to Be
Forgotten’. The Guardian, 18 February 2015. https://www.theguardian.com/technology/
2015/feb/18/the-right-be-forgotten-google-search, accessed 4 July 2022.
Pritzker, Penny. 2015. ‘Statement from U.S. Secretary of Commerce Penny Pritzker on Euro-
pean Court of Justice Safe Harbor Framework Decision’. Department of Commerce,
October. https://2014-2017.commerce.gov/news/press-releases/2015/10/statement-us-
secretary-commerce-penny-pritzker-european-court-justice.html.
Privacy International. 2018. ‘Privacy Protection Principles’. https://privacyinternational.
org/sites/default/files/2018-09/Part%203%20-%20Data%20Protection%20Principles.
pdf, accessed 4 July 2022.
Privacy International. 2020a. ‘Here’s How a Well-Connected Security Company Is Quietly
Building Mass Biometric Databases in West Africa with EU Aid Funds’. Privacy Inter-
national. 10 November 2020. http://www.privacyinternational.org/news-analysis/4290/
heres-how-well-connected-security-company-quietly-building-mass-biometric.
Privacy International. 2020b. Privacy International v. Secretary of State for Foreign and
Commonwealth Affairs, Secretary of State for the Home Department, Government Com-
munications Headquarters, Security Service, Secret Intelligence Service. Court of Justice
of the EU: Grand Chamber. Case C-623/17.
Price, Rob. 2015. ‘Facebook Responds to Europe’s Top Court Striking down a Vital US-EU
Data Sharing Agreement’. Business Insider. http://uk.businessinsider.com/facebook-safe-
harbor-response-ecj-max-schrems-2015-10.
‘Protection of Individuals with Regard to the Processing of Personal Data - Processing
of Personal Data for the Purposes of Crime Prevention (Debate)’. 2016. European Par-
liament, Strasbourg, April 13. http://www.europarl.europa.eu/doceo/document/CRE-8-
2016-04-13-INT-3-521-0000_NL.html?redirect.
Puschmann, Cornelius, and Jean Burgess. 2014. ‘Big Data, Big Questions: Metaphors of Big
Data’. International Journal of Communication 8 (June). https://ijoc.org/index.php/ijoc/
article/view/2169, accessed 4 July 2022.
Putnam, Tonya L. 2009. ‘Courts without Borders: Domestic Sources of US Extraterritori-
ality in the Regulatory Sphere’. International Organization 63 (3): 459–90.
Raab, Charles, and Ivan Szekely. 2017. ‘Data Protection Authorities and Information Tech-
nology’. Computer Law & Security Review 33 (4): 421–33. doi: 10.1016/j.clsr.2017.05.002.
Radu, Roxana. 2019. Negotiating Internet Governance. Oxford: Oxford University
Press.
Ramer, Samuel R. 2017. ‘Letter from Samuel R. Ramer, Acting Assistant Attorney General,
Office of Legislative Affairs, U.S. Department of Justice, to Hon. Paul Ryan, Speaker, U.S
House of Representatives’. [Cited in Statement of Richard W. Downing at a Hearing Enti-
tled ‘Data Stored Abroad: Ensuring Lawful Access and Privacy Protection in the Digital
Era’, 15 June 2017, Appendix A].
Rankin, Jennifer. 2016. ‘Brussels Bombings: EU Ministers to Meet’. The Guardian,
24 March 2016. https://www.theguardian.com/world/2016/mar/24/brussels-bombings-
eu-ministers-meet-as-border-controls-criticised, accessed 4 July 2022.
Raustiala, Kal. 2011. Does the Constitution Follow the Flag? The Evolution of Territoriality
in American Law. Oxford and New York: Oxford University Press.
Raustiala, Kal. 2013. ‘Institutional Proliferation and the International Legal Order’. In Inter-
disciplinary Perspectives on International Law and International Relations: The State of
the Art, edited by J. L. Dunoff and M. A. Pollack, 293–320. Cambridge and New York:
Cambridge University Press.
Rawls, John. 1971. A Theory of Justice. Cambridge, MA, and London: Belknap Press of
Harvard University Press.
Rawls, John. 2001. The Law of Peoples: With ‘The Idea of Public Reason Revisited’. Cam-
bridge, MA, and London: Harvard University Press.
Raymond, Mark, and Laura DeNardis. 2015. ‘Multistakeholderism: Anatomy of an Inchoate
Global Institution’. International Theory 7 (3): 572–616.
RCFP and Others. 2017. ‘Statement in CJEU Case C-507/17’. Court of Justice of
the European Union. https://www.rcfp.org/wp-content/uploads/imported/2017-11-29-
Statement-CJEU-in-Googe-v.-CNIL-RCFP-et-al.-as-filed.pdf, accessed 4 July 2022.
Redeker, Dennis, Lex Gill, and Urs Gasser. 2018. ‘Towards Digital Constitutionalism? Map-
ping Attempts to Craft an Internet Bill of Rights’. International Communication Gazette
80 (4): 302–19.
Reding, Viviane. 2010. ‘Building Trust in Europe's Online Single Market’. Speech by the
Vice-President of the European Commission responsible for Justice, Fundamental Rights
and Citizenship at the American Chamber of Commerce to the EU, Brussels, 22 June 2010.
Reding, Viviane. 2014. ‘Facebook Post’. Facebook. 13 May 2014. https://www.facebook.com/
VivianeRedingEU/posts/304206613078842.
Rees, Wyn. 2007. Transatlantic Counter-Terrorism Cooperation: The New Imperative. Lon-
don and New York: Routledge.
Reform Government Surveillance. 2015. ‘Reform Government Surveillance’. Reform Gov-
ernment Surveillance. http://reformgovernmentsurveillance.com/, accessed 4 July 2022.
Regan, Priscilla M. 1995. Legislating Privacy: Technology, Social Values, and Public Policy.
Chapel Hill, NC: University of North Carolina Press.
Regan, Priscilla M. 2003. ‘Safe Harbors or Free Frontiers? Privacy and Transborder Data
Flows’. Journal of Social Issues 59 (2): 263–82.
Reiberg, A. 2018. Netzpolitik: Genese eines Politikfeldes. Policy Analyse. Baden-Baden:
Nomos.
Reicherts, Martine. 2014. ‘The Right to Be Forgotten and the EU Data Protection Reform’.
Lyon. http://europa.eu/rapid/press-release_SPEECH-14-568_en.htm, accessed 4 July
2022.
Reidenberg, Joel R. 2000. ‘Resolving Conflicting International Data Privacy Rules in
Cyberspace’. Stanford Law Review 52 (5): 1315–71. doi: 10.2307/1229516.
Rettman, Andrew. 2013. ‘EU Should Create Own Spy Agency, Reding Says’. EUobserver,
4 November 2013. https://euobserver.com/justice/121979, accessed 4 July 2022.
Reus-Smit, Christian. 2001. ‘Human Rights and the Social Construction of Sovereignty’.
Review of International Studies 27 (4): 519–38.
Ripoll Servent, Ariadna. 2013. ‘Holding the European Parliament Responsible: Policy Shift
in the Data Retention Directive from Consultation to Codecision’. Journal of European
Public Policy 20 (7): 972–87. doi: 10.1080/13501763.2013.795380.
Ripoll Servent, Ariadna. 2017. ‘Protecting or Processing?’ In Privacy, Data Protection and
Cybersecurity in Europe, edited by Wolf J. Schünemann and Max-Otto Baumann, 115–30.
Cham: Springer.
Ripoll Servent, Ariadna, and Alex MacKenzie. 2011. ‘Is the EP Still a Data Protection Cham-
pion? The Case of SWIFT’. Perspectives on European Politics and Society 12 (4): 390–406.
doi: 10.1080/15705854.2011.622957.
Risse, Mathias. 2012. On Global Justice. Princeton, NJ: Princeton University Press.
Rittberger, Berthold. 2012. ‘Institutionalizing Representative Democracy in the European
Union: The Case of the European Parliament’. Journal of Common Market Studies 50
(s1): 18–37. doi: 10.1111/j.1468-5965.2011.02225.x.
Rodotà, Stefano. 2003. ‘Letter to the Committee on Citizens’ Freedoms and Rights, Jus-
tice and Home Affairs, Brussels’. https://www.statewatch.org/news/2003/mar/art29ch.
pdf, accessed 4 July 2022.
Rodotà, Stefano. 2009. ‘Data Protection as a Fundamental Right’. In Reinventing Data Pro-
tection?, edited by Serge Guthwirth, Yves Poullet, Paul de Hert, Cecile de Terwangne, and
Sjaak Nouwt, 77–82. Amsterdam: Springer.
Rosen, Jeffrey. 2011. ‘The Right to Be Forgotten’. Stanford Law Review Online 64: 88–92.
Ross, Wilbur. 2018. ‘EU Data Privacy Laws Are Likely to Create Barriers to Trade’.
Financial Times, 30 May 2018. https://www.ft.com/content/9d261f44-6255-11e8-bdd1-
cc0534df682c, accessed 4 July 2022.
Rössler, Beate. 2005. The Value of Privacy. Cambridge and Malden, MA: Polity.
Rotenberg, Marc. 2001. ‘Fair Information Practices and the Architecture of Privacy: (What
Larry Doesn’t Get)’. Leland Stanford Junior University, Stanford Technology Law Review
2001 (January): 1–34.
Rothermel, Ann-Kathrin. 2020. ‘Global–Local Dynamics in Anti-Feminist Discourses: An
Analysis of Indian, Russian and US Online Communities’. International Affairs 96 (5):
1367–85. doi: 10.1093/ia/iiaa130.
Rousseau, Jean-Jacques. 1998. The Social Contract. Classics of World Literature. Ware:
Wordsworth Editions Limited.
Ruggie, John Gerard. 1993. ‘Territoriality and Beyond: Problematizing Modernity in Inter-
national Relations’. International Organization 47 (1): 139–74.
Ruiz, David. 2018. ‘Responsibility Deflected, the CLOUD Act Passes’. Electronic
Frontier Foundation. March. https://www.eff.org/de/deeplinks/2018/03/responsibility-
deflected-cloud-act-passes, accessed 4 July 2022.
Rushe, Dominic. 2014. ‘Tech and Media Companies Join Microsoft in Privacy Case against
US Government’. The Guardian, 15 December 2014. http://www.theguardian.com/
technology/2014/dec/15/microsoft-email-warrant-lawsuit-tech-media-companies-join,
accessed 4 July 2022.
Ryngaert, Cedric. 2015. ‘Symposium Issue on Extraterritoriality and EU Data Protection’.
International Data Privacy Law 5 (4): 221–25. doi: 10.1093/idpl/ipv025.
Ryngaert, Cedric M. J., and Nico A. N. M. van Eijk. 2019. ‘International Cooperation
by (European) Security and Intelligence Services: Reviewing the Creation of a Joint
Database in Light of Data Protection Guarantees’. International Data Privacy Law 9 (1):
61–73. doi: 10.1093/idpl/ipz001.
Saco, Diana. 1999. ‘Colonizing Cyberspace: “National Security” and the Internet’. In Cul-
tures of Insecurity: States, Communities, and the Production of Danger, edited by Jutta
Weldes, Mark Laffey, H Gusterson, and Raymond Duvall, 261–92. Minneapolis, MN,
and London: University of Minnesota Press.
Safran. 2016. ‘Safran: Very Strong Revenue Growth for First-Quarter 2016’. Safran, April.
https://www.safran-group.com/pressroom/safran-very-strong-revenue-growth-first-
quarter-2016-2016-04-26.
Salter, Mark B. 2007. ‘Governmentalities of an Airport: Heterotopia and Confession’.
International Political Sociology 1 (1): 49–66. doi: 10.1111/j.1749-5687.2007.00004.x.
Samonte, Mary. 2020. ‘Google v. CNIL: The Territorial Scope of the Right to Be Forgotten
under EU Law’. European Papers. http://www.europeanpapers.eu/en/europeanforum/
google-v-cnil-territorial-scope-of-right-to-be-forgotten-under-eu-law, accessed 4 July
2022.
Sanchez v. France. 2021. European Court of Human Rights. App no. 45581/15.
Sandel, Michael J. 1998. Liberalism and the Limits of Justice. Cambridge and New York:
Cambridge University Press.
Sapiro, Gisèle. 2018. ‘Field Theory from a Transnational Perspective’. In The Oxford Hand-
book of Pierre Bourdieu, edited by Thomas Medvetz and Jeffrey J. Sallaz, 161–82. Oxford
and London: Oxford University Press.
SAS. 2017. ‘Identify High-Risk Travelers Faster and More Accurately through Better Anal-
ysis of PNR Data’. Solution Brief. https://www.sas.com/content/dam/SAS/en_us/doc/
solutionbrief/identify-high-risk-travelers-pnr-data-109223.pdf, accessed 4 July 2022.
Sassen, Saskia. 1998. Globalization and Its Discontents. Political Science/Economics. New
York: New Press.
Satamedia. 2008. Tietosuojavaltuutettu v. Satakunnan Markkinapörssi Oy, Satamedia Oy.
Court of Justice of the EU: General Court. Case C-73/07.
Satariano, Adam. 2018. ‘G.D.P.R., a New Privacy Law, Makes Europe World’s Leading Tech
Watchdog’. The New York Times, 24 May 2018. https://www.nytimes.com/2018/05/24/
technology/europe-gdpr-privacy.html, accessed 4 July 2022.
Satz, Debra. 2012. Why Some Things Should Not Be for Sale: The Moral Limits of Markets.
Oxford Political Philosophy. Oxford and New York: Oxford University Press.
Savage, Charlie. 2019. ‘Judge Rules Terrorism Watchlist Violates Constitutional Rights’. The
New York Times, 4 September 2019. https://www.nytimes.com/2019/09/04/us/politics/
terrorism-watchlist-constitution.html, accessed 4 July 2022.
Scally, Derek, and Jochen Bittner. 2013. ‘The Man in Portarlington Who Protects People’s
Data’. The Irish Times, 14 August 2013. https://www.irishtimes.com/news/technology/
the-man-in-portarlington-who-protects-people-s-data-1.1493167, accessed 4 July 2022.
Schildt, Henri. 2020. The Data Imperative: How Digitalization Is Reshaping Management,
Organizing, and Work. Oxford: Oxford University Press.
Schmidt, Vivien A., and Mark Thatcher. 2014. ‘Why Are Neoliberal Ideas so
Resilient in Europe’s Political Economy?’ Critical Policy Studies 8 (3): 340–7. doi:
10.1080/19460171.2014.926826.
Schneider, Jens-Peter. 2017. ‘Developments in European Data Protection Law in the
Shadow of the NSA-Affair’. In Privacy and Power: A Transatlantic Dialogue in the Shadow
of the NSA-Affair, edited by Russell A. Miller, 539–63. Cambridge: Cambridge University
Press.
Scholte, Jan Aart. 2017. ‘Polycentrism and Democracy in Internet Governance’. In
The Net and the Nation State: Multidisciplinary Perspectives on Internet Governance,
edited by Uta Kohl, 165–84. Cambridge and New York: Cambridge University
Press.
Schouten, Peer. 2014. ‘Security as Controversy: Reassembling Security at Amsterdam
Airport’. Security Dialogue 45 (1): 23–42.
Schrems. 2015. Schrems v. Data Protection Commissioner. Court of Justice of the EU:
Grand Chamber. Case C-362/14.
Schrems, Maximilian. 2013. ‘To the Data Protection Commissioner, Canal House, Sta-
tion Road, Portarlington, Co. Laois, Ireland. Complaint against Facebook Ireland Ltd
– 23 “PRISM”’, 25 June 2013. http://www.europe-v-facebook.org/prism/facebook.pdf,
accessed 4 July 2022.
Schrems v. Data Protection Commissioner. 2014. Irish High Court.
Schrems, Maximilian. 2017. ‘Legal Procedure against “Facebook Ireland Limited”’.
http://europe-v-facebook.org/EN/Complaints/complaints.html.
Schünemann, Wolf Jürgen, and Jana Windwehr. 2020. ‘Towards a “Gold Standard for the
World”? The European General Data Protection Regulation between Supranational and
National Norm Entrepreneurship’. Journal of European Integration 43 (7): 859–74. doi:
10.1080/07036337.2020.1846032.
Schwartz, Paul M. 2004. ‘Property, Privacy, and Personal Data’. Harvard Law Review 117
(7): 2056–128. doi: 10.2307/4093335.
Schwartz, Paul M., and Joel R. Reidenberg. 1996. Data Privacy Law: A Study of United States
Data Protection. Charlottesville, VA: Michie.
Scott, Mark. 2015. ‘Data Transfer Pact between U.S. and Europe Is Ruled Invalid’. The
New York Times, 6 October 2015. https://www.nytimes.com/2015/10/07/technology/
european-union-us-data-collection.html, accessed 4 July 2022.
Scott, Mark. 2019. ‘In 2020, Global “Techlash” Will Move from Words to Action’.
POLITICO, 31 December 2019. https://www.politico.eu/article/tech-policy-
competition-privacy-facebook-europe-techlash/, accessed 4 July 2022.
Search and Seizure Warrant for the Southern District of New York. 13 MAG 2814. 4 Dec
2013. Doc #91. In the Matter of a Warrant to Search a Certain Email Account Controlled
and Maintained by Microsoft Corporation. US District Court Southern District of New
York, 2014.
Seger, Alexander. 2019. ‘Towards a Protocol to the Budapest Convention’. Presented at the
Conference on Criminal Justice in Cyberspace, Bucharest, February 25.
Selby, John. 2017. ‘Data Localization Laws: Trade Barriers or Legitimate Responses to
Cybersecurity Risks, or Both?’ International Journal of Law and Information Technology
25 (3): 213–32.
Sen, Amartya. 2005. ‘Human Rights and Capabilities’. Journal of Human Development 6 (2):
151–66.
Sending, Ole Jacob. 2015. The Politics of Expertise: Competing for Authority in Global
Governance. Ann Arbor, MI: University of Michigan Press.
Sending, Ole Jacob. 2017. ‘Recognition and Liquid Authority’. International Theory 9 (2):
311–28. doi: 10.1017/S1752971916000282.
Shen, Hong. 2018. ‘Building a Digital Silk Road? Situating the Internet in China’s Belt and
Road Initiative’. International Journal of Communication 12: 2683–701.
Sherman, Justin. 2022. ‘This Year, Russia’s Internet Crackdown Will Be Even Worse’. Atlantic
Council. 13 January 2022. https://www.atlanticcouncil.org/blogs/new-atlanticist/this-
year-russias-internet-crackdown-will-be-even-worse/, accessed 4 July 2022.
Sieghart, Paul. 1976. Privacy and Computers. London: Latimer New Dimensions.
Simon, Morgan. 2020. ‘Investing in Immigrant Surveillance: Palantir and the #NoTech-
ForICE Campaign’. Forbes, 15 January 2020. https://www.forbes.com/sites/
morgansimon/2020/01/15/investing-in-immigrant-surveillance-palantir-and-the-
notechforice-campaign/, accessed 4 July 2022.
Singer, Peter W., and Ian Wallace. 2014. ‘Secure the Future of the Internet’. Brookings. Jan-
uary. https://www.brookings.edu/research/secure-the-future-of-the-internet/, accessed
4 July 2022.
Slaughter, Anne-Marie. 2005. ‘Security, Solidarity, and Sovereignty: The Grand Themes of
UN Reform’. American Journal of International Law 99 (3): 619–31.
Smale, Alison. 2013. ‘Anger Growing among Allies on U.S. Spying’. The New York Times,
23 October 2013. https://www.nytimes.com/2013/10/24/world/europe/united-states-
disputes-reports-of-wiretapping-in-Europe.html, accessed 4 July 2022.
Smith, Adam. 1999. The Wealth of Nations: Books IV–V. Edited by Andrew S. Skinner.
Harmondsworth: Penguin.
Smith, Brad. 2017a. ‘The Need for a Digital Geneva Convention’. Microsoft on the
Issues, February. https://blogs.microsoft.com/on-the-issues/2017/02/14/need-digital-
geneva-convention/, accessed 4 July 2022.
Smith, Brad. 2017b. ‘US Supreme Court Will Hear Petition to Review Microsoft Search
Warrant Case While Momentum to Modernize the Law Continues in Congress’.
Microsoft on the Issues. 16 October 2017. https://blogs.microsoft.com/on-the-issues/
2017/10/16/us-supreme-court-will-hear-petition-to-review-microsoft-search-warrant-
case-while-momentum-to-modernize-the-law-continues-in-congress/, accessed 4 July
2022.
Smith, Brad. 2018a. ‘The CLOUD Act Is an Important Step Forward, but Now More Steps
Need to Follow’. Microsoft on the Issues, April. https://blogs.microsoft.com/on-the-
issues/2018/04/03/the-cloud-act-is-an-important-step-forward-but-now-more-steps-
need-to-follow/, accessed 4 July 2022.
Smith, Brad. 2018b. ‘34 Companies Stand up for Cybersecurity with a Tech Accord’.
Microsoft on the Issues, April. https://blogs.microsoft.com/on-the-issues/2018/04/17/
34-companies-stand-up-for-cybersecurity-with-a-tech-accord/, accessed 4 July 2022.
Smith, Brad. 2018c. ‘A Call for Principle-Based International Agreements to Govern Law
Enforcement Access to Data’. Microsoft on the Issues, September. https://blogs.microsoft.
com/on-the-issues/2018/09/11/a-call-for-principle-based-international-agreements-to-
govern-law-enforcement-access-to-data/, accessed 4 July 2022.
Smith, H. Jeff, Tamara Dinev, and Heng Xu. 2011. ‘Information Privacy Research: An
Interdisciplinary Review’. MIS Quarterly 35 (4): 989–1016.
Smith v. Maryland. 1979. 442 U.S. 735. Supreme Court of the US.
Snowden, Edward. 2014. ‘Testimony to the European Parliament’. https://www.techdirt.
com/2014/03/07/snowden-gives-testimony-to-european-parliament-inquiry-into-
mass-surveillance-asks-eu-asylum/.
Snowden, Edward. 2019. Permanent Record. London: Macmillan.
Solove, Daniel J. 2002. ‘Conceptualizing Privacy’. California Law Review 90 (4): 1087–155.
doi: 10.2307/3481326.
Solove, Daniel J. 2007. ‘I’ve Got Nothing to Hide and Other Misunderstandings of Privacy’.
San Diego Law Review 44: 745–72.
Solove, Daniel J. 2008. Understanding Privacy. Cambridge, MA: Harvard University Press.
Solove, Daniel J., and Woodrow Hartzog. 2014. ‘The FTC and the New Common Law of
Privacy’. Columbia Law Review 114: 583–676.
Sorell, Tom. 2016. ‘Law and Equity in Hobbes’. Critical Review of International Social and
Political Philosophy 19 (1): 29–46.
Spanish Delegation to the Council of the EU. 2015. ‘Information by the Commission on
the PNR Legislation Adopted by Mexico and the Republic of Argentina Requesting the
Transfer of PNR Data from the EU’. 6857/15. Brussels. http://data.consilium.europa.eu/
doc/document/ST-6857-2015-INIT/en/pdf.
Spar, Debora L. 1999. ‘Lost in (Cyber)Space: The Private Rules of Online Commerce’. In
Private Authority and International Affairs, edited by A. Claire Cutler, Virginia Haufler,
and Tony Porter, 31–51. SUNY Series in Global Politics. Albany, NY: State University of
New York Press.
Steger, Manfred B. 2013. ‘Political Ideologies in the Age of Globalization’. In The Oxford
Handbook of Political Ideologies, edited by Michael Freeden and Marc Stears, 214–30.
Oxford: Oxford University Press.
Steger, Manfred B., and Erin K. Wilson. 2012. ‘Anti-Globalization or Alter-Globalization?
Mapping the Political Ideology of the Global Justice Movement’. International Studies
Quarterly 56 (3): 439–54.
Stevenson, Alexandra. 2018. ‘Facebook Admits It Was Used to Incite Violence in Myan-
mar’. The New York Times, 6 November 2018. https://www.nytimes.com/2018/11/06/
technology/myanmar-facebook.html.
Stigler, George J. 1980. ‘An Introduction to Privacy in Economics and Politics’. The Journal
of Legal Studies 9 (4): 623–44.
Strange, Susan. 1996. The Retreat of the State: The Diffusion of Power in the World Econ-
omy. Cambridge Studies in International Relations. Cambridge: Cambridge University
Press.
Suda, Yuko. 2013. ‘Transatlantic Politics of Data Transfer: Extraterritoriality, Counter-
Extraterritoriality and Counter-Terrorism’. Journal of Common Market Studies 51 (4):
772–88.
Suda, Yuko. 2017. The Politics of Data Transfer: Transatlantic Conflict and Cooperation over
Data Privacy. Routledge Studies in Global Information, Politics and Society. New York
and Abingdon: Taylor & Francis.
Supreme Court of the US. 2018. ‘Oral Hearing in the Supreme Court of the United
States United States, Petitioner, v. Microsoft Corporation, Respondent’. No 17–2.
Washington DC.
Susen, Simon. 2014. ‘Towards a Dialogue between Pierre Bourdieu’s “Critical Sociology”
and Luc Boltanski’s “Pragmatic Sociology of Critique”’. In The Spirit of Luc Boltanski:
Essays on the ‘Pragmatic Sociology of Critique’, edited by Bryan S. Turner and Simon
Susen, 313–48. London: Anthem Press.
Susen, Simon. 2015. ‘Towards a Critical Sociology of Dominant Ideologies: An Unexpected
Reunion between Pierre Bourdieu and Luc Boltanski’. Cultural Sociology 10 (2): 195–246.
doi: 10.1177/1749975515593098.
Susen, Simon, and Bryan S. Turner, eds. 2014. The Spirit of Luc Boltanski: Essays on the
‘Pragmatic Sociology of Critique’. London: Anthem Press.
Svantesson, Dan. 2016. ‘Enforcing Privacy across Different Jurisdictions’. In Enforcing
Privacy: Regulatory, Legal and Technological Approaches, edited by David Wright
and Paul de Hert, 195–222. Law, Governance and Technology Series. Dordrecht:
Springer.
Svantesson, Dan Jerker B. 2015a. ‘Limitless Borderless Forgetfulness? Limiting the Geo-
graphical Reach of the “Right to Be Forgotten”’. Oslo Law Review 2 (2): 116–38. doi:
10.5617/oslaw2567.
Svantesson, Dan Jerker B. 2015b. ‘Extraterritoriality and Targeting in EU Data Privacy Law:
The Weak Spot Undermining the Regulation’. International Data Privacy Law 5 (4): 226–
34. doi: 10.1093/idpl/ipv024.
SWIFT. 2006. ‘Executive Summary of SWIFT’s Response to the Belgian Privacy Commis-
sion’s Advisory Opinion 37/2006 of 27 September 2006’. La Hulpe, Belgium. https://www.
swift.com/node/14941, accessed 4 July 2022.
SWIFT. 2007. ‘SWIFT Board Approves Messaging Re-Architecture’. SWIFT, October.
https://www.swift.com/insights/press-releases/swift-board-approves-messaging-re-
architecture, accessed 4 July 2022.
SWIFT. 2022. ‘SWIFT’. https://www.swift.com/about-us, accessed 4 July 2022.
Swire, Peter P., and Robert E. Litan. 1998a. None of Your Business: World Data Flows,
Electronic Commerce, and the European Privacy Directive. Washington DC: Brookings
Institution Press.
Swire, Peter P., and Robert E. Litan. 1998b. ‘Avoiding a Showdown over EU Privacy Laws’.
Brookings Policy Brief 29. https://www.brookings.edu/research/avoiding-a-showdown-
over-eu-privacy-laws/, accessed 4 July 2022.
Szpunar, Maciej M. 2019. ‘Conclusions de l’Avocat Général: Affaire C-507/17’. Court
of Justice of the European Union. https://eur-lex.europa.eu/legal-content/FR/TXT/
?uri=CELEX:62017CC0507.
Tapscott, Don. 1996. The Digital Economy: Promise and Peril in the Age of Networked
Intelligence. Vol. 1. New York: McGraw-Hill.
Tarrow, Sidney. 2005. The New Transnational Activism. Cambridge Studies in Contentious
Politics. Cambridge: Cambridge University Press.
T-CY. 2014. ‘T-CY Guidance Note # 3 Transborder Access to Data (Article 32) Adopted
by the 12th Plenary of the T-CY (2–3 December 2014)’. T-CY (2013)7 E. Strasbourg:
Council of Europe.
T-CY. 2016a. ‘Criminal Justice Access to Data in the Cloud: Cooperation with “Foreign”
Service Providers. Background Paper Prepared by the T-CY Cloud Evidence Group’. T-
CY (2016)2. Strasbourg: Council of Europe.
T-CY. 2016b. ‘Criminal Justice Access to Electronic Evidence in the Cloud: Recommenda-
tions for Consideration by the T-CY. Final Report of the T-CY Cloud Evidence Group’.
Strasbourg.
Thévenot, Laurent, Michael Moody, and Claudette Lafaye. 2000. ‘Forms of Valuing Nature:
Arguments and Modes of Justification in French and American Environmental Disputes’.
In Rethinking Comparative Cultural Sociology: Repertoires of Evaluation in France and
the United States, edited by Michèle Lamont, 229–72. Cambridge: Cambridge University
Press.
Thierer, Adam. 2016. Permissionless Innovation: The Continuing Case for Compre-
hensive Technological Freedom. Washington DC: Mercatus Center, George Mason
University.
Tippmann, Sylvia, and Julia Powles. 2015. ‘Google Accidentally Reveals Data on “right
to Be Forgotten” Requests’. The Guardian, 14 July 2015. http://www.theguardian.com/
technology/2015/jul/14/google-accidentally-reveals-right-to-be-forgotten-requests,
accessed 4 July 2022.
Tjong Tjin Tai, TFE. 2016. ‘The Right to Be Forgotten: Private Law Enforcement’. Interna-
tional Review of Law, Computers & Technology 30 (1–2): 76–83.
Tocci, Nathalie. 2017. ‘From the European Security Strategy to the EU Global Strategy:
Explaining the Journey’. International Politics 54 (4): 487–502. doi: 10.1057/s41311-017-
0045-9.
Trauner, Florian, and Helena Carrapiço, eds. 2012. ‘The External Dimension of EU Justice
and Home Affairs: Post-Lisbon Governance Dynamics’. European Foreign Affairs Review,
Special Issue, 17 (5): 1–18.
Tynan, Dan. 2018. ‘Silicon Valley Finally Pushes for Data Privacy Laws at Senate Hearing’.
The Guardian, 26 September 2018. https://www.theguardian.com/technology/2018/
sep/26/silicon-valley-senate-commerce-committee-data-privacy-regulation, accessed 4
July 2022.
Tzanou, Maria. 2017. The Fundamental Right to Data Protection: Normative Value in the
Context of Counter-Terrorism Surveillance. London: Bloomsbury Publishing.
UK Home Office. 2019. ‘Online Harms White Paper’. https://www.gov.uk/government/
consultations/online-harms-white-paper/online-harms-white-paper, accessed 4 July
2022.
Ulbricht, Lena. 2018. ‘When Big Data Meet Securitization: Algorithmic Regulation with
Passenger Name Records’. European Journal for Security Research 3 (2): 139–61.
UN. 2011. ‘UN Guiding Principles on Business and Human Rights’. HR/PUB/11/04.
Unabhängiges Landeszentrum für Datenschutz. 2018. Unabhängiges Landeszentrum für
Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH.
Court of Justice of the EU: Grand Chamber. Case C-210/16.
UNCTAD. 2016. ‘Data Protection Regulations and International Data Flows: Implications
for Trade and Development’. New York, Geneva.
UNCTAD. 2022. ‘Data Protection and Privacy Legislation Worldwide’. 2022. https://
unctad.org/en/Pages/DTL/STI_and_ICTs/ICT4D-Legislation/eCom-Data-
Protection-Laws.aspx, accessed 4 July 2022.
UNGA. 1990. ‘Guidelines for the Regulation of Computerized Personal Data Files’.
A/RES/45/95.
UNGA. 2000. ‘United Nations Convention against Transnational Organized Crime and the
Protocols Thereto’. https://www.unodc.org/unodc/en/organized-crime/intro/UNTOC.
html, accessed 4 July 2022.
UNGA. 2013. ‘The Right to Privacy in the Digital Age’. A/RES/68/167. https://documents-
dds-ny.un.org/doc/UNDOC/GEN/N13/449/47/PDF/N1344947.pdf ?OpenElement,
accessed 4 July 2022.
UNHRC. 2018. ‘Report of the Special Rapporteur on the Right to Privacy’. A/73/45712.
https://digitallibrary.un.org/record/1656178.
United States v. Miller. 1976. 425 U.S. 435. Supreme Court of the US.
United States v. Microsoft Corp. 2018. Supreme Court of the US.
UNSC. 2017. ‘Resolution 2396 (2017)’. https://documents-dds-ny.un.org/doc/UNDOC/
GEN/N17/460/25/PDF/N1746025.pdf ?OpenElement, accessed 4 July 2022.
UN Special Rapporteur on the Right to Privacy. 2017. ‘Brief Amicus Curiae of UN Special
Rapporteur on the Right to Privacy Joseph Cannataci in Support of Neither Party, United
States of America v. Microsoft Corporation’. 17–2. US Supreme Court.
US. 2017. ‘In the Matter of United States v. Microsoft Corp.: Petition for a Writ of Certiorari’.
17–2. United States Court of Appeals for the Second Circuit.
US Chamber of Commerce. 2013. ‘The Economic Importance of Getting Data Protec-
tion Right: Protecting Privacy, Transmitting Data, Moving Commerce’. https://www.
uschamber.com/sites/default/files/documents/files/020508_EconomicImportance_
Final_Revised_lr.pdf, accessed 4 July 2022.
US Congress. 2015. ‘The House Energy and Commerce Subcommittees on Commerce,
Manufacturing and Trade and Communications & Technology: U.S.-EU Safe Harbor
Framework’. Washington DC.
US Department of Commerce. 2000a. ‘Safe Harbor Workbook’. https://2016.export.gov/
safeharbor/eg_main_018238.asp.
US Department of Commerce. 2000b. ‘Safe Harbor Privacy Principles’. OJ L25/10.
US Department of Commerce. 2016. ‘EU–U.S. Privacy Shield Framework Principles Issued
by the U.S. Department of Commerce’. https://www.privacyshield.gov/Privacy-Shield-
Principles-Full-Text, accessed 4 July 2022.
US Department of Justice. 2019. ‘Promoting Public Safety, Privacy, and the Rule of Law
around the World: The Purpose and Impact of the CLOUD Act: White Paper’. https://
www.justice.gov/opa/press-release/file/1153446/download, accessed 4 July 2022.
US Department of the Treasury. 2002. ‘Passenger Name Record Information Required for
Passengers on Flights in Foreign Air Transportation to or From the United States’. Federal
Register Vol. 67, No. 122. https://www.govinfo.gov/content/pkg/FR-2002-06-25/pdf/02-
15935.pdf, accessed 4 July 2022.
US Department of the Treasury. 2007a. ‘Letter from United States Department of the
Treasury Regarding SWIFT/Terrorist Finance Tracking Programme’. OJ 2007/C 166/08.
US Department of the Treasury. 2007b. ‘Processing of EU Originating Personal Data by
United States Treasury Department for Counter Terrorism Purposes: “SWIFT” (2007/C
166/09) Terrorist Finance Tracking Program: Representations of the United States
Department of the Treasury’. OJ C 166/18.
US Department of the Treasury. 2022. ‘Terrorist Finance Tracking Program (TFTP)’. https://home.treasury.gov/policy-issues/terrorism-and-illicit-finance/terrorist-finance-tracking-program-tftp, accessed 4 July 2022.
US Embassy Berlin. 2006a. ‘Scenesetter: FBI Director Mueller’s Berlin Visit’. Wikileaks
Public Library of US Diplomacy, 06BERLIN2303_a. https://wikileaks.org/plusd/cables/
06BERLIN2303_a.html, accessed 4 July 2022.
US Embassy Berlin. 2006b. ‘DHS A/S Baker Engages on PNR, Seeks Greater CT Info Shar-
ing’. Wikileaks Public Library of US Diplomacy, 06BERLIN3173_a. https://wikileaks.org/
plusd/cables/06BERLIN3173_a.html, accessed 4 July 2022.
US Embassy Dublin. 2006. ‘Next Steps on TFTP: Ireland Takes Hard Line on SWIFT’. Wik-
ileaks Public Library of US Diplomacy, 06DUBLIN1396_a. https://wikileaks.org/plusd/
cables/06DUBLIN1396_a.html, accessed 4 July 2022.
US Embassy London. 2006. ‘Terrorism Finance: Next Steps on the TFTP’. Wikileaks Pub-
lic Library of US Diplomacy, 06LONDON8247_a. https://wikileaks.org/plusd/cables/
06LONDON8247_a.html, accessed 4 July 2022.
US Mission to the EU. 2007. ‘European Commission Outlines PNR-Style SWIFT Issue’.
07BRUSSELS253_a. Brussels.
US Mission to the EU. 2009. ‘European Parliament: Introducing the New Civil Liberties,
Justice, and Home Affairs Committee’. Wikileaks Public Library of US Diplomacy, Brus-
sels 001105. https://wikileaks.org/plusd/cables/09BRUSSELS1105_a.html, accessed 4
July 2022.
US Mission to the EU. 2015. ‘Data Privacy and Protection’, December. https://web.archive.
org/web/20151231220650/http://useu.usmission.gov/st-09282015.html, accessed 4 July
2022.
US Patriot Act. 2001. ‘Uniting and Strengthening America by Providing Appropriate Tools
Required to Intercept and Obstruct Terrorism (USA PATRIOT ACT) Act of 2001’.
US White House. 1997. A Framework for Global Electronic Commerce. https://
clintonwhitehouse4.archives.gov/WH/New/Commerce/read.html.
US White House. 2011. ‘National Strategy for Counterterrorism’. https://obamawhitehouse.
archives.gov/blog/2011/06/29/national-strategy-counterterrorism, accessed 4 July
2022.
US White House. 2014. ‘Presidential Policy Directive 28: Signals Intelligence Activ-
ities’. PPD-28. https://obamawhitehouse.archives.gov/the-press-office/2014/01/17/
presidential-policy-directive-signals-intelligence-activities, accessed 4 July 2022.
Vanbever, Francis. 2006. ‘SWIFT Statement: Francis Vanbever, Chief Financial Officer,
Member of the Executive Committee, SWIFT’. European Parliament Hearing, 4 October
2006.
Vauchez, Antoine, and Bruno de Witte. 2013. Lawyering Europe: European Law as a
Transnational Social Field. Oxford and Portland, OR: Hart Publishing.
Veljanovski, Cento. 2010. ‘Economic Approaches to Regulation’. In The Oxford Handbook of
Regulation, edited by Robert Baldwin, Martin Cave, and Martin Lodge, 17–38. Oxford:
Oxford University Press.
Viola, Lora Anne, and Paweł Laidler. 2022. Trust and Transparency in an Age of Surveillance.
Abingdon, New York: Routledge.
Volokh, Eugene. 1999. ‘Freedom of Speech and Information Privacy: The Troubling Impli-
cations of a Right to Stop People from Speaking about You’. Stanford Law Review 52:
1–65.
Voss, W. Gregory. 2016. ‘The Future of Transatlantic Data Flows: Privacy Shield or Bust?’
Journal of Internet Law 19 (11): 1, 9–18.
Wacquant, Loïc, and Aksu Akçaoğlu. 2017. ‘Practice and Symbolic Power in Bour-
dieu: The View from Berkeley’. Journal of Classical Sociology 17 (1): 55–69. doi:
10.1177/1468795X16682145.
Walker, Clive. 2018. ‘Counter-Terrorism Financing: An Overview’. In The Palgrave Hand-
book of Criminal and Terrorism Financing Law, edited by Colin King, Clive Walker, and
Jimmy Gurulé, 737–53. Cham: Palgrave Macmillan.
Walker, Kent. 2016. ‘A Principle That Should Not Be Forgotten’. Google, May. https://blog.
google/around-the-globe/google-europe/a-principle-that-should-not-be-forgotten/,
accessed 4 July 2022.
Walker, Kent. 2017. ‘Defending Access to Lawful Information at Europe’s Highest
Court’. Google, November. https://www.blog.google/around-the-globe/google-europe/
defending-access-lawful-information-europes-highest-court/, accessed 4 July 2022.
Walker, Neil. 2008. ‘Taking Constitutionalism beyond the State’. Political Studies 56 (3):
519–43.
Waltz, Kenneth N. 2010. Theory of International Politics. Long Grove, IL: Waveland Press.
Walzer, Michael. 2008. Spheres of Justice: A Defense of Pluralism and Equality. New York:
Basic Books.
Warren, Samuel D., and Louis D. Brandeis. 1890. ‘The Right to Privacy’. Harvard Law
Review 4 (5): 193–220.
Weiss, Martin A., and Kristin Archick. 2016. ‘US–EU Data Privacy: From Safe Harbor to
Privacy Shield’. Congressional Research Service.
Weiss, Moritz. 2008. ‘The “Political Economy of Conflicts”: A Window of Opportunity for
CFSP?’ Journal of Contemporary European Research 4 (1): 1–17.
Weldes, Jutta. 1999. Cultures of Insecurity: States, Communities, and the Production of
Danger. Vol. 14. Minneapolis, MN: University of Minnesota Press.
Wensink, Wim, Bas Warmenhoven, Roos Haasnoot, Rob Wesselink, Bibi van Ginkel,
Stef Wittendorp, Christophe Paulussen, et al. 2017. ‘The European Union’s Policies
on Counter-Terrorism: Relevance, Coherence and Effectiveness’. Policy Department C:
Citizens’ Rights and Constitutional Affairs, European Parliament.
Wesseling, Mara. 2016. ‘An EU Terrorist Finance Tracking System’. Occasional Paper. Royal
United Services Institute for Defence and Security Studies.
Wesseling, Mara, Marieke de Goede, and Louise Amoore. 2012. ‘Data Wars beyond Surveil-
lance: Opening the Black Box of SWIFT’. Journal of Cultural Economy 5 (1): 49–66.
Westin, Alan F. 1967. Privacy and Freedom. New York: Atheneum.
Williams, Michael C. 2003. ‘Words, Images, Enemies: Securitization and Interna-
tional Politics’. International Studies Quarterly 47 (4): 511–31. doi: 10.1046/j.0020-
8833.2003.00277.x.
Wittgenstein, Ludwig. 2001. Philosophische Untersuchungen. Edited by Joachim Schulte,
Heikki Nyman, Eike von Savigny, and Georg Henrik von Wright. Frankfurt am Main:
Suhrkamp.
Wong, Julia Carrie. 2018. ‘Microsoft Revenue Exceeds $100bn Boosted by Cloud Ser-
vices’. The Guardian, 20 July 2018. https://www.theguardian.com/technology/2018/jul/
19/microsoft-earnings-cloud-services-revenue, accessed 4 July 2022.
Yeung, Karen. 2018. ‘Algorithmic Regulation: A Critical Interrogation’. Regulation & Governance 12 (4): 505–23. doi: 10.1111/rego.12158.
Die Zeit. 2016. ‘Microsoft: US-Regierung darf nicht auf Kundendaten im Ausland zugreifen’.
Die Zeit, 14 July 2016. https://www.zeit.de/digital/datenschutz/2016-07/microsoft-
datenschutz-us-regierung, accessed 4 July 2022.
Zhao, Bo, and Weiquan Chen. 2019. ‘Data Protection as a Fundamental Right: The European General Data Protection Regulation and Its Extraterritorial Application in China’. US–China Law Review 16 (3): 97–113. doi: 10.17265/1548-6605/2019.03.002.
Zhao, Bo, and Jeanne Mifsud-Bonnici. 2016. ‘Protecting EU Citizens? Personal Data in
China: A Reality or a Fantasy?’ International Journal of Law and Information Technology
24 (2): 128–50. doi: 10.1093/ijlit/eaw001.
Zittrain, Jonathan. 2008. The Future of the Internet and How to Stop It. New Haven, CT:
Yale University Press.
Zuboff, Shoshana. 2018. The Age of Surveillance Capitalism: The Fight for a Human Future
at the New Frontier of Power. New York: Public Affairs.
Zürn, Michael, Martin Binder, Matthias Ecker-Ehrhardt, and Katrin Radtke. 2007. ‘Poli-
tische Ordnungsbildung wider Willen’. Zeitschrift für Internationale Beziehungen 14 (1):
129–64.
Zürn, Michael, and Pieter de Wilde. 2016. ‘Debating Globalization: Cosmopolitanism and
Communitarianism as Political Ideologies’. Journal of Political Ideologies 21 (3): 280–301.
doi: 10.1080/13569317.2016.1207741.
Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April
2016 on the protection of natural persons with regard to the processing of personal data
and on the free movement of such data, and repealing Directive 95/46/EC, Pub. L. No.
OJ L119/1 (2016).
Index

9/11 terrorist attacks 32, 56, 59t, 68, 84, 104–8, 116, 126, 129, 131, 146, 148–9, 209
29WP, see Article 29 Working Party
1995 Data Protection Directive 2, 42t, 52, 54–5, 58, 60t, 74–5, 77, 79, 87, 95, 99–100, 109, 111, 133, 137, 162, 177, 179, 182
  reform 99–102, see also General Data Protection Regulation
  and Safe Harbour 54–5
2004/82/EC Carrier Directive 108
Abbott, Andrew 3, 8, 18, 21–2, 31, 205, 209
adequacy 2, 79, 85–92, 96–103, 108–9, 112–13, 122, 213–14, 219
  CJEU definition 2, 91, 97–103
  decision 79, 85–92, 96–7, 112–13, 213–14, 219
  invalidation based on 91–2, 96–9, 103
Advanced Passenger Information 108–9
African Union 55
  Convention on Cybersecurity and Personal Data Protection 42, 55, 63, 65–6, 68–9, 71, 73
AFSJ, see Area of Freedom, Security and Justice
Agencia Española de Protección de Datos 178, 208
agency 10–11, 18, 21, 24, 65, 206
  reflexivity 24–7, 40, 44
airlines 14, 56, 102, 104, 106, 108–10, 112–16, 124, 127, 153
algorithm 10, 176, 188
  regulation 10, 19, 44
  Science and Technology Studies 10, 19
Alphabet, see Google
AML, see anti-money laundering
anonymization (of data) 5, 119
anti-money laundering 130–4, 149–50
  AML Directive 141
  FATF 149–50
APEC, see Asia-Pacific Economic Cooperation Privacy Framework
API, see Advanced Passenger Information
Aradau, Claudia 10, 19, 67–8
Area of Freedom, Security and Justice 118
Article 19 43
Article 29 Working Party 63–4, 83, 109, 111–12, 116, 126–7, 133–4, 137, 150, 155, 171, 183–4, 201
ASEAN, see Association of Southeast Asian Nations Framework on Personal Data Protection
Asia-Pacific Economic Cooperation Privacy Framework 42t, 55, 64, 65, 66, 68, 71, 73, 75
Association of Southeast Asian Nations Framework on Personal Data Protection 42, 44, 55, 62, 63, 65, 68, 71, 75
ATSA, see Aviation and Transportation Security Act
Aviation and Transportation Security Act 104, 106, 108–11, 115, 121
aviation security 104, 108–9
banks 106, 132, 134, 138, 146, 150, 153
Belgian Society for Worldwide Interbank Financial Telecommunications, see SWIFT
Bennett, Colin 2, 6, 53–4, 75, 175
Bigo, Didier 10, 22, 26, 34, 86, 104, 151, 153, 209
BMI, see Federal Ministry of the Interior (Germany)
Boltanski, Luc 21, 24, 27–9, 32–3, 36, 44, 143, 193, 210
  and Bourdieu, Pierre 10, 18, 21–4, 34–5, 197–8
  pragmatism 40, 54
  sociology of critique 18, 24, 27–30
  and Thévenot, Laurent 8, 10, 15, 18–19, 21, 24, 27–30, 32, 34, 36–7, 41, 44, 48, 51, 64, 90, 95, 117, 130, 148, 151, 192, 197, 206, 209–10
Bourdieu, Pierre
  and Boltanski, Luc 10, 18, 21–4, 34–5, 197–8
  fields 8, 10–12, 17–18, 21–2, 24–6, 34, 39, 51, 89, 121, 130, 169, 197, 206, 212
Bradford, Anu 2, 13, 45, 62, 79, 99, 165, see also Brussels effect
Brussels effect 2, 13, 45, 62, 79, 99, 165, 218
Budapest Convention 154, 159, 168–71
Bureau of Customs and Border Protection (CBP) 112
California Privacy Act (US) 56
Cambridge Analytica 1, 52, 56, 63, 191
Cannataci, Joseph A. 5, 53, 81, 163, see also United Nations
Charter of Fundamental Rights of the European Union 55
Chertoff, Michael 115–16, 120, 124, see also Department of Homeland Security
China 13, 16, 45, 186, 196, 201–2, 214–19
  global actor 13, 16, 45, 55, 214–19
CJEU, see Court of Justice of the European Union
Clarifying Lawful Overseas Use of Data Act 153–4, 164–8, 170–3, 204, 210, 216
CLOUD Act, see Clarifying Lawful Overseas Use of Data Act
CNIL, see Commission Nationale de l’Informatique et des Libertés
Combating the financing of terrorism (CFT) 129–31, 149–51
Commission de la Protection de la Vie Privée 132–4, 137
Commission Nationale de l’Informatique et des Libertés 174–5, 182–90, 193–5
common good 10–12, 14, 17–19, 27–30, 32, 34–7, 44, 48–51, 54–5, 57–8, 64, 69, 75–6, 93, 116, 121, 130–1, 143, 147, 161, 168, 196, 202, 206
  private goods 29–30
communitarianism 5 n.4, 199–201
composite objects 15, 36–7, 49, 131, 146–7, 151, 197, 210
content 188, 190–2
  hate speech 188, 191
Convention on cybercrime, see Budapest Convention
copyright 189
cosmopolitanism 62, 73–4, 196, 199, 202, 204
Costeja case 178–82
Council of Europe (CoE) 53, 69, 103, 149, 168–9, 176, 219
  Convention 108 42, 53, 65, 71–3, 219
  Convention 108+ 42, 60, 61, 75
  Convention on Cybercrime, see Budapest Convention
  Cybercrime Convention Committee (T-CY) 155–7, 168
  and electronic evidence 167–70
  Octopus Conference 47, 169
Council of the European Union (Council) 56, 111–123, 125, 134–5, 139–45, 171, 203
counterterrorism, see terrorism
counterterrorist financing, see combating the financing of terrorism (CFT)
Court of Justice of the European Union (CJEU) 14, 77, 81, 88, 91–103, 105, 112–13, 115, 126–7, 129, 136, 144, 149, 174, 177–92, 194–5, 200t, 211–13
  Advocate General 91–3, 97, 99, 126, 180, 184, 187, 212
  judicialization 91–2, 211–12
  Kadi case 99, 149
cybercrime 68, 154, 155, 168–9, see also Budapest Convention
Cybersecurity Tech Accord 43t, 66, 157, 203
data governance
  definition of 4
  global framework 55–7, see also globalism
data localization 16, 72, 165, 215–17
data protection
  and data governance 49
  as a fundamental right 54–5, 91–4, 126, 180
  legislation, historical emergence of 52–6
  and privacy 4–5, 52–3
Data Protection Authority (DPA) 15, 22, 47, 100, 110, 112, 114, 120, 123, 133–4, 137–8, 144, 150–1, 155, 157, 169, 171, 183, 207t, 211
  as challengers 112, 155, 157, 169, 171, 183, 207t
  expertise 22, 120, 195
Data Protection Directive 2016, see Law Enforcement Directive 2016
data retention
  directive 98–9, 129–30, 139–40, 212, see also Digital Rights Ireland case
  period 112, 119, 126, 140, 213
de Goede, Marieke 9–10, 19–20, 45, 69, 104–8, 110, 118, 129–30, 132, 139, 143–4, 146–50, 209, 217
Department of Commerce 74, 80, 89, 93
Department of Homeland Security 107, 112, 115, 119–21, 124, 127–8, 149
Department of Justice 153, 155–67
Department of the Treasury 109, 129–39, 141, 143–8
de Wilde 11, 24, 38–9, 70, 73, 199–204
digital economy 12–13, 44, 54, 55, 65–6, 85, 93, 99, 200t
Digital Geneva Convention 157, 189, 202–3, see also Microsoft
Digital Rights Ireland case 91, 98, 212
Digital Services Package 191–2, 214
digital sovereignty 2, 173, 216, see also sovereignty
Digital Switzerlands 157–8, 169, 189, 202
DPA, see Data Protection Authority
dystopian scenario 29, 52, 60t, 78, 83, 85, 90, 98, 103, 115, 120–2, 146, 161, 168, 180–6
Economic Community of West African States
  Supplementary Act A/SA.1/01/10 On Personal Data Protection Within ECOWAS 42t, 55, 66, 68, 71–3
EDPB, see European Data Protection Board
EDPS, see European Data Protection Supervisor
EDRi, see European Digital Rights
electronic evidence (e-evidence)
  Council of Europe 167–9
  definition of 152–6
  legislative frameworks 164–72
  Mutual Legal Assistance Treaties (MLAT) 152–5
Electronic Privacy Information Center (EPIC) 54, 158, see also non-governmental organizations
European Commission 14, 48, 78–88, 91–103, 107, 109–24, 135–42, 144–8, 162–77, 179–86, 202–4, 209, 213–14, 219
  DG JUST (Justice) 1, 80, 85, 135, 177, 180–1, 190–1
  internal market 54, 135
  as negotiator 94, 172
European Commission and DHS Joint Statement 56, 111, 123
European Data Protection Board 64, 100
European Data Protection Supervisor 64, 101, 116–17, 122, 124, 130, 138–40, 144, 148, 151, 155, 201, 204, 211
European Digital Rights Initiative 119, 124, 158, 169, 171
European Parliament 9, 14–15, 20, 82–3, 85–6, 90, 92–6, 102–7, 111–28, 130–46, 168–70, 201, 206
  Committee on Civil Liberties, Justice, and Home Affairs (LIBE) 82–4, 113, 118, 122, 123, 138–9, 142, 145, 151, 171
  LIBE hearing 82–4, 142, 145
  Lisbon Treaty 107–8, 117–22, 129, 138–41, 206–7
  MEPs 119, 122–8, 139, 141, 144–5
  resolutions 9, 111–12, 118, 130, 133–6, 138–9, 144, 168–9
  responsibility 9, 14, 117–24, 140
Europol 107, 140, 144–5, 151, 153–5
  Information System (EIS) 129
  Joint Supervisory Body (JSB) 144–5, 147
EU-US agreement on personal data protection, see Umbrella Agreement
EU-US PNR Agreement 114, 116–23, 126, 130, 139, see also Passenger Name Records
extraterritoriality 12, 15, 56, 71, 74–6, 79, 81–2, 99, 100, 103, 133, 152, 156–62, 163, 165, 167, 172, 184–9, 193, 201, 211–12
  definition of 3
Facebook 51, 54, 56–7, 83, 87, 90, 96–7, 166, 169, 173, 179, 183, 190, 191, 194, 195, 202, 216
  Oversight Board 190
fairness (order of) 14–15, 28, 37, 44, 52, 58–9t, 61–3, 71, 76, 78–80, 82–6, 90–2, 94–5, 97–9, 101, 103, 109, 113–14, 116–20, 122, 134–7, 142, 145–6, 151, 158, 161–2, 168–9, 174–6, 180–2, 184–7, 192–4, 196–8f, 200–2, 204–5, 210, 212, 227
Farrell, Henry 6, 54–5
  and Newman, Abraham 4, 6–7, 9–11, 19–20, 45, 56, 77, 79, 83, 88, 90, 94–5, 103, 105–8, 110–13, 115, 118–19, 121, 123, 130, 135–6, 138–40, 153, 204, 206, 209, 212
Federal Ministry of the Interior (Germany) 86, 114–15, 122
Federal Trade Commission (FTC) 70, 80, 94, 101, 183, 204
field
  Bourdieu 8, 10–12, 17–18, 21–2, 24–6, 34, 39, 51, 89, 121, 130, 169, 197, 206, 212
  genesis of the 41–4, 51–8, 75–6
  theory 24–6, 34
Financial Action Task Force (FATF) 149–50
FISA, see Foreign Intelligence Surveillance Act
Five Eyes network 81, 86, 125
Fligstein, Neil 26
  and McAdam, Doug 10, 13, 17–18, 24, 26, 34–5, 38, 78, 206, 209, 212
Foreign Intelligence Surveillance Act (FISA) 82, 96, 107
fragmentation
  of data governance regimes 55, 74, 76, 123, 171–2, 187–8, 204, see also data localization
  of financial security 149
  in international law 6
  of the internet 216–20
  of law enforcement 171–2
  threat of 53, 60t, 74, 85, 123, 204, see also harmonization
G7 149, 154
  Joint Declaration by G7 ICT Ministers 42t, 61, 65
G20
  Digital Economy Development and Cooperation Initiative 42t, 61, 65
GDPR, see General Data Protection Regulation
General Data Protection Regulation 1–2, 9–10, 42t, 55–6, 65, 69, 72, 97, 99–101, 122, 165, 175, 177–9, 182–3, 216
  fines 1–2, 182, 195
  Ireland 87
  scope 99–101, 182–3
global cooperation, see globalism
globalism (order of) 11, 14, 44, 49, 52, 58, 60t, 65, 69, 72–5, 85, 93, 96, 101, 114–15, 125, 145, 171–2, 196–8, 200t, 202–5, 227
globalization ideologies 11, 24, 38–9, 70, 73, 199–204
Global Network Initiative 43
  The Global Principles on Protection of Freedom of Expression and Privacy 43
GNI, see Global Network Initiative
Google 1, 15, 46t, 54, 56–7, 83, 87, 137, 156, 158, 161, 174–6, 182–94, 202, 206–7
  Google Spain 41, 174, 178–82
  Walker, Kent 185
Greenwald, Glenn 1, 81, 142–3
Guardian, the (newspaper) 4, 89–90, 92, 122, 191, 217
Hanrieder, Tine 20–1, 28, 29–30, 36, 44, 189, 197
harmonization (legal) 44, 65, 73–4, 99, 113, 123, 170, 173, 204
  electronic evidence 170
  Passenger Name Records 123
hierarchy 11, 15, 18–19, 24, 26–7, 29, 31–4, 36–7, 44, 60t, 66, 76, 117, 153, 159–61, 163, 174, 176, 180, 186–7, 192–3, 206–7t
  in-order hierarchization 76, 163, 176, 186, 192–3, 206–7t
High Level Contact Group 136
High Level Political Dialogue on Border and Transportation Security 136
High-Level Working Group on Information Sharing and Privacy and Personal Data Protection 136
IATA, see International Air Transport Association
ICDPPC, see International Conference of Data Protection and Privacy Commissioners
ideology 38–9, 199–205
illusio 18, 19, 25–7, 37, 41, 51, 54, 121–2, 130, 140, 143, 147, 149, 169, 197, 206
impact assessment 30, 32, 66, 79, 206
India 13, 45, 125, 196, 216–18
information sharing 107, 123–4, 203
infrastructure 4, 16, 20, 33, 57, 105, 130, 158, 214–17
International Air Transport Association 104
International Conference of Data Protection and Privacy Commissioners 53, 184
  Montreux Declaration 43t, 73, 74–5, 204
international organizations (in data governance) 55–8
Internet Association
  Internet Association Privacy Principles 42t, 64, 65, 66, 93, 189
internet governance 6, 10, 12, 38–9, 44, 46–7, 57–8, 157, 213, 215, 218
interoperability 73, 75, 96, 204
in ’t Veld, Sophie 119, 121, 123, 128, 144–5, 151, 167, see also European Parliament
Ireland 15, 87, 95, 138, 152, 157, 162–3, 183, 219
Irish Data Protection Commission (DPC) 87, 89, 96–7
Irish High Court 87–8, 90, 92, 96
Japan 2, 79, 125, 186, 218
Jourová, Věra 1, 85, 167
judicial review 127, 152, 166, 170
Justice and Home Affairs 107, 114, 118, 123, 136, see also Area of Freedom, Security and Justice
  EU pillars 107
  High Level Meetings 95, 136
justification 11–12, 17–18, 20–4, 27–9, 31f, 33–6, 38–9, 45–9, 52, 67–8, 70, 76, 78, 82–4, 86–90, 103, 109, 113, 128, 148, 154, 156, 158–62, 166–8, 175–6, 178, 181–2, 184–6, 192–4, 197–200, 202, 205–7, 209, 211, 216, 227
Kaunert, Christian 9, 19, 32, 56, 105, 107–9, 114, 117, 119, 130, 140
Kuner, Christopher 19, 43, 77, 91, 96, 100, 201
law enforcement agencies 4, 15, 43, 46, 57, 59t, 69, 74, 89–90, 95, 99, 101–2, 107–8, 112–13, 133, 138, 149, 152–5, 156–7, 160–1, 164–73, 191, 204, 220
  informal cooperation 69, 135–6, 152–5, 156–7, 165, 172–3, 191, 204, 220
  judicial review 152, 166, 170
  MLATs 152–5
Law Enforcement Directive 2016 101
leaked material 48–9, 63–4, 86, 89, 115, 116, 118, 135, 211
legal (un)certainty 66, 93, 95, 110, 113–14, 120, 127, 156–7, 166, 173, 202
liberalism 61–2, 65, 67, 199–200, 202
  liberal order 4, 58, 215
  liberal values 49, 62–3, 67, 69–70, 123, 201
  local liberalism 12, 184, 200t–2, 205
Lisbon Treaty 15, 55, 88, 107, 118, 122, 134, 139, 141, 206, 209
  empowerment of the European Parliament 118, 122, 134, 141, 206, 209
lobbying 100, 139, 189–90, 203, 216
Lyon, David 5, 63, 77, 81, 128
MAXQDA 41, 48–9, 222–8
McAdam, Doug
  and Fligstein, Neil 10, 13, 17–18, 24, 26, 34–5, 38, 78, 206, 209, 212
Mérand, Frédéric 10, 24, 25, 41, 54, 78
Meta, see Facebook
Microsoft 15, 46t, 53, 56, 83, 87, 152–73, 185, 188–90, 194, 202–3, 207t, 209
  as norm entrepreneur 157, 190
  Smith, Brad 157–8, 166, 181, 189, 203, 210
Mitsilegas, Valsamis 6, 9, 10, 145
money laundering, see anti-money laundering
Mutual Legal Assistance Treaties (MLAT) 152–5
Newman, Abraham L. 2, 6, 9, 19, 45, 53, 55–6, 66, 70, 72, 75, 77–9, 81, 99–100, 103
  and Farrell, Henry 4, 6–7, 9–11, 19–20, 45, 56, 77, 79, 83, 88, 90, 94–5, 103, 105–8, 110–13, 115, 118–19, 121, 123, 130, 135–6, 138–40, 153, 204, 206, 209, 212
New York Times, the 1, 2, 129–30, 133, 143, 157
Nissenbaum, Helen 5, 114, 116
non-governmental organizations 54
  Access Now 54
  Electronic Frontier Foundation 54
  Electronic Privacy Information Center 54, 158
  European Digital Rights (EDRi) 119, 124, 158, 169, 171
  noyb 97, 201, see also Schrems, Maximilian
  privacy activists 49, 54, 103
  Statewatch 54
NSO Group 4, 217–18
Obama, Barack 66, 83–4, 168
  administration of 83–4
objects 11, 18, 21–2, 52, 82, 91, 102, 104, 122, 130, 162, 192–3, 197
  composite 36–7, 49, 130, 146, 151, 197, 210
  construction of 12, 18–19, 21–3, 102
  epistemic 12, 199, 201
  of governance 21–3, 51, 102, 116, 193, 200
  valued 59t–60t, 82, 91, 101, 120, 137, 192–3, 197
OECD, see Organization for Economic Co-Operation and Development
ombudsperson 95, 97–8, 124, 145–6
  Office of the Ombudsperson (EU) 124, 145–6
  Privacy Shield 95, 97–8
open internet 66, 165, 200t
orders of worth, see value orders
Organization for Economic Co-Operation and Development 154
  Guidelines on the Protection of Privacy and Transborder Flows of Personal Data
    updated guidelines 42t, 51, 53–6, 59t, 65–7, 71–5
Organization of American States
  Comprehensive Inter-American Cybersecurity Strategy 42t, 69, 74
oversight mechanisms 15, 63, 73, 86, 96, 103, 129–32, 136, 140–1, 144–6, 148, 151, 157, 190, 210, 213, 219
  hearing 82–4, 89, 93, 142
  review 15, 47, 69, 85, 89–91, 95–6, 103, 120, 126–7, 130–1, 138–9, 141, 143–9, 151–2, 166, 170, 191, 206, 210, 213, 218
Palantir 151
passenger name records (PNR) 14, 95, 101, 104–28, 129–30, 136, 139–41, 207t–10, 212
  Canada-EU agreement 126, 212
  European PNR Directive 101–2, 104, 108, 110, 121–6
  review 120, 126–7, 130
Patriot Act 107–8
Pegasus, see NSO Group
PNR, see passenger name records
political philosophy 14, 41, 44, 52, 59t, 60, 62, 67, 70
pragmatism 40, 54
Presidential Policy Directive 83–4, 88–9, 96
PRISM Program 81–2, 87, 166, see also surveillance
Pritzker, Penny 93
privacy
  activists 49, 54, 103, see also Bennett, Colin John
  and data governance 49
  and data protection 4–5, 52–3
  history of 4–5, 52–3
Privacy and Civil Liberties Oversight Board 96
Privacy International 54, 75, 121, 212, 213, 219
Privacy Shield
  invalidation 77, 92–9, 205, 212–13
  provisions 93–5
  review 95–8, 103
private security companies 83, 105–6, 124–6, 151
Public Voice Coalition, the
  Madrid Declaration 43t, 74
Rawls, John 59t, 61–2, 64, 71
reciprocity (data sharing) 138–44
Reding, Viviane 80, 84, 177, 181
reference community 14, 21–2, 29–30, 34, 37–9, 48, 68–9, 71, 83–4, 89, 90, 94–5, 100–1, 110–11, 159–60, 162, 168, 180–1, 184, 186, 195, 199–204
relationalism 12, 18, 20, 23–5
responsibility 106, 107, 118, 171, 178–95, 203, 209–11
  private 10, 12–13, 20, 63, 97, 132–3, 137–8, 202
  of tech companies 90, 153–4, 158, 160–1, 166–9, 173, 175–6, 178–95, 202, 209–10, 218
review 15, 47, 69, 84–6, 89, 91–2, 95–6, 103, 120, 126–7, 130–1, 138–9, 141, 143–9, 151, 152, 166, 170, 191, 206, 210, 213, 218, see also judicial review
right to be delisted, see right to be forgotten
right to be forgotten 3, 9, 15, 46t, 100, 137, 175–95, 208t, 212, 218
  Costeja case 178–82
  definition of 176–7
  GDPR 100, 177–8, 182–8
Ripoll Servent, Ariadna 6, 9, 20, 100, 117–19, 122, 130, 140, 203
Russia 125, 168, 186, 217
safeguards 53, 63, 68, 79, 95, 97, 102, 103, 106, 108, 112, 113–17, 121–2, 126, 132–4, 136–7, 141–2, 144, 147, 155, 165, 169–72, 192, 196, 199, 210, 213–14
Safe Harbour
  enforcement 78–80
  framework review 84–6
  invalidation 12, 14, 77–8, 87, 91–3, 103, 205
  provisions of 78–80
Schäuble, Wolfgang 115, 122
Schrems, Maximilian 77
  Schrems case I 2, 14, 77–8, 87–96, 100–3, 106, 123, 207t, 212
  Schrems case II 96–9, 100–3, 153, 207t, 212–13
secrecy 36, 69, 128, 130, 141, 143–6, 148
securitization 9, 12–13, 29, 34–6, 67, 110, 209
  existential threat 29, 69, 110, 114, 124, 127, 160, 181, 203
security (order of) 14–15, 28, 44, 45, 49, 52, 58, 59t, 62, 67–71, 82–4, 90–5, 106, 108–21, 125, 127, 130, 135, 139, 141, 146–7, 150–1, 153, 161–2, 171–2, 198, 200t, 203–4, 227
Seidl, Timo
  and Laurer, Moritz 9, 80, 99–100, 203
  and Nachtwey, Oliver 29, 57, 202, 220
Single Euro Payments Area (SEPA) 141
Snowden, Edward 2, 47 n.17
  impact of 9, 14, 51, 56–7, 77–8, 80–4, 87, 99–103, 120–2, 141–2
Solove, Daniel J. 4–5, 44, 53, 62–3, 67, 175
solutionism 57, 202, 220
sovereignty
  digital sovereignty 2, 173, 215–16
  order of 14, 28, 44, 49, 52, 58, 60t, 68, 69–72, 78, 82–3, 86, 101, 109, 112, 114, 133–4, 137, 146, 158–63, 165, 168, 172–6, 184–6, 192–5, 198, 200t–2, 203, 227
Stockholm Programme 119
surveillance
  criticism of 78, 81–91, 94–5, 102–3, 120–1, 133, 139–40, 213
  by EU member states 86, 88, 94, 98, 104, 110, 123, 209, 213
  global nature 1, 4, 56, 77, 81–5, 93, 96, 101, 103, 106, 110, 126, 131
  NSA 1, 77, 81–5, 121, 141–2
surveillance-industrial complex 106, 125
SWIFT 15, 129–43, 150
  responsibility 132–3, 137–8
T-CY, see Council of Europe (CoE), Cybercrime Convention Committee (T-CY)
tech companies
  history of 53–4
  lobbying 100, 139, 189–90, 203, 216
  power 175
  responsibility 12, 90, 153–4, 158, 160–1, 166–9, 173, 175–6, 178–95, 202, 209–10, 218
terrorism, see also United Nations
  anti-money laundering 130–4, 149–50
  combating the financing of terrorism 129–31, 149–51
  in Europe 116, 123, 141, 148
  foreign fighters 68, 203
  terrorist attacks 32, 59t, 67–9, 105–8, 110–11, 116, 131, 146, 148, see also 9/11 terrorist attacks
Terrorist Finance Tracking Program 46t, 95, 108, 122, 129–150, 206, 207t–8t, 209–10
  review procedure 146–9
Terrorist Finance Tracking System 140
TFTP, see Terrorist Finance Tracking Program
TFTS, see Terrorist Finance Tracking System
Thévenot, Laurent 28, 44
  and Boltanski, Luc 8, 10, 15, 18–19, 21, 24, 27–30, 32, 34, 36–7, 41, 44, 48, 51, 64, 90, 95, 117, 130, 148, 151, 192, 197, 206, 209–10
  sociology of critique 18, 24, 27–30
transparency 5, 81–2, 95, 130, 141, 167, 206, 209–10, 213
  of private companies 1, 66, 155–6, 166, 210
  of public actors 69, 120, 127–31, 142, 143–6, 148–9, 151, 170–1, 213, 218
  transparency reports 66, 155–6, 210
Trump, Donald 51
  administration of 96, 100, 164, 216
trust 12, 32, 59t, 65–6, 85, 99, 145, 168, 172, 200t, 202
Umbrella Agreement 42t, 69, 72, 74, 95, 101, 108, 155, 204
United Kingdom 1, 53, 79, 81, 88, 116, 123, 135, 142, 171, 191, 213–14
United Nations
  counterterrorism 42, 43, 59t, 68, 104, 107, 121, 126, 137, 149–50, see also security (order of)
  General Assembly Resolution 68/167 The Right to Privacy in the Digital Age 42t, 56, 59t, 61, 81, see also Special Rapporteur on the Right to Privacy
  Guidelines for the Regulation of Personal Data Files 42t, 53, 63, 65, 71
  Kadi case 99, 149
  Security Council 28, 43, 104, 126, 137, 149
  Security Council Resolution 2396 42, 43, 59t, 68, 104, 126
  Special Rapporteur on the Right to Privacy 56, 63, 81, 162–3, 200, 204
United States
  Congress 51, 93, 119–20, 136, 164, 214
  congressional hearing 51, 93
  data protection 213–14
  US Supreme Court 52–3
    Microsoft case 152, 154, 157–66, 170–2, 211–12
value orders, see also Boltanski, Luc and Thévenot, Laurent
  definition of 28, 32
  reconstruction 8, 28–9, 40–1, 44, 51–2, 58, 76
  tests 28, 32–4, 36, 143, 206, 210
visions of data governance 3, 6, 8, 10–13, 15, 17, 24, 37–41, 49–50, 58, 76, 101–2, 115, 125, 130, 135, 159, 172, 173, 184, 187, 196–206, 217
  metaphors of data 199
  typology of 200t–205
vulnerable groups 62, 67, 69, 106, 203
  children 69, 154, 160, 181
Whatsapp 1, 173, 216
whistle-blowers 63, 84, 103
Zuboff, Shoshana 10, 63, 174–5, 194, 202
Zürn, Michael 11, 24, 38, 70, 73, 199–200, 202
