Data Governance
Value Orders and Jurisdictional Conflicts
ANKE SOPHIA OBENDIEK
Great Clarendon Street, Oxford, OX2 6DP,
United Kingdom
Oxford University Press is a department of the University of Oxford.
It furthers the University’s objective of excellence in research, scholarship,
and education by publishing worldwide. Oxford is a registered trade mark of
Oxford University Press in the UK and in certain other countries
© Anke Sophia Obendiek 2023
The moral rights of the author have been asserted
Impression: 1
All rights reserved. No part of this publication may be reproduced, stored in
a retrieval system, or transmitted, in any form or by any means, without the
prior permission in writing of Oxford University Press, or as expressly permitted
by law, by licence or under terms agreed with the appropriate reprographics
rights organization. Enquiries concerning reproduction outside the scope of the
above should be sent to the Rights Department, Oxford University Press, at the
address above
You must not circulate this work in any other form
and you must impose this same condition on any acquirer
Published in the United States of America by Oxford University Press
198 Madison Avenue, New York, NY 10016, United States of America
British Library Cataloguing in Publication Data
Data available
Library of Congress Control Number: 2022943693
ISBN 978–0–19–287019–3
DOI: 10.1093/oso/9780192870193.001.0001
Printed and bound by
CPI Group (UK) Ltd, Croydon, CR0 4YY
Links to third party websites are provided by Oxford in good faith and
for information only. Oxford disclaims any responsibility for the materials
contained in any third party website referenced in this work.
Preface and Acknowledgements
We produce data every day. As learning, working, and socializing have moved
online, their significance has only increased. And yet many of the frameworks
that regulate how exactly data are governed are rarely discussed by the broader
public. While the European overhaul of data protection legislation has left many
annoyed by cookie notifications and regular email updates about long-forgotten
subscriptions, the extent to which our data are used, by whom, and for what pur-
poses is often considered an enigma. This book tracks how data governance has
developed in recent decades. It focuses on key controversies between the main
players of the regulatory landscape—the European Union and the United States—
and the major companies that hold significant amounts of data (and power). It
shows that many of the conflicts we see today are not that different from the ones
we had two decades ago. Instead, we see that they touch upon foundational nor-
mative questions about society: what societal goals matter most? And how do we
evaluate justifications and assess evidence provided by different parties?
This book began as a PhD dissertation at the Hertie School during my time at
the Berlin Graduate School for Global and Transregional Studies. As per usual,
there are many people to thank, so this is by no means an exhaustive list. First and
foremost, I would like to thank my thesis adviser Markus Jachtenfuchs for being
an exceptional supervisor and constant mentor. Thank you for always giving me
the freedom and encouragement to conduct my own research while providing helpful
guidance whenever I needed it. I owe you so much. I also wish to thank my
second thesis adviser Frédéric Mérand, who always cheers me up with his kind
and thought-provoking comments, and my third thesis adviser Tine Hanrieder,
whose excellent research inspired a significant part of this book. Both have pro-
vided invaluable feedback and support. Further thanks to Adina Maricut-Akbik
for her smart and thoughtful comments as a doctoral committee member. I also
had the great pleasure of working with my project partner Daniëlle Flonk, who
is not only an exceptional colleague and scholar but also a great friend. I am
extremely grateful for the encouraging efforts by Oxford University Press, par-
ticularly by Dominic Byatt, Ryan Morris, the editorial team, and my kind and
constructive anonymous reviewers, to improve this book. My anonymous intervie-
wees have also kindly provided great insights. I gratefully acknowledge funding by
the Deutsche Forschungsgemeinschaft (DFG), FOR 2409, JA 772/8-1 and thank
the OSAIC research group for engaging discussions.
List of Abbreviations
G7 Group of Seven
G8 Group of Eight
G10 Group of Ten
G20 Group of Twenty
GCHQ Government Communications Headquarters (United Kingdom)
GDP gross domestic product
GDPR General Data Protection Regulation (European Union)
GNI Global Network Initiative
IA Internet Association
IATA International Air Transport Association
ICANN Internet Corporation for Assigned Names and Numbers
ICAO International Civil Aviation Organization
ICDPPC International Conference of Data Protection and Privacy Commissioners
ICT information and communications technology
IGF Internet Governance Forum
ILC International Law Commission
IO international organization
IP Internet Protocol
IR international relations
ITU International Telecommunication Union
JHA Justice and Home Affairs (European Union)
JSB Joint Supervisory Body (Europol)
LIBE Committee on Civil Liberties, Justice, and Home Affairs (European
Union)
MEP Member of the European Parliament
MLAT Mutual Legal Assistance Treaty
NGO non-governmental organization
NSA National Security Agency (United States)
NSCT National Strategy for Counterterrorism (United States)
OAS Organization of American States
OECD Organisation for Economic Co-operation and Development
PNR Passenger Name Record
PPD Presidential Policy Directive
SCC standard contractual clause
SME small to medium-sized enterprise
STS Science and Technology Studies
SWIFT Society for Worldwide Interbank Financial Telecommunication
T-CY Cybercrime Convention Committee (Council of Europe)
TFEU Treaty on the Functioning of the European Union
TFTP Terrorist Finance Tracking Program (United States)
TFTS Terrorist Finance Tracking System (European Union)
UK United Kingdom
UN United Nations
UNCTAD United Nations Conference on Trade and Development
UNGA United Nations General Assembly
In May 2018, the European Union’s (EU) General Data Protection Regulation
(GDPR) came into effect. And clearly, something significant was going on. The
BBC asked, ‘GDPR: Are you ready for the EU’s huge data privacy shake-up?’
(Coleman 2018), while an article in the New York Times suggested that ‘G.D.P.R.,
a New Privacy Law, Makes Europe World’s Leading Tech Watchdog’ (Satariano
2018). For the EU, the GDPR represented a significant step towards a digital sin-
gle market (Reicherts 2014, 3). It also offered an opportunity to assert European
interests and values in light of various privacy scandals. In 2013, the publication of
classified documents by the former United States (US) National Security Agency
(NSA) contractor Edward Snowden revealed widespread surveillance by intelli-
gence services (Greenwald 2013, 2015). In March 2018, just two months before
the GDPR came into effect, the Cambridge Analytica data scandal exposed poten-
tial voter manipulation in the US elections and the United Kingdom (UK) Brexit
referendum in 2016 (Cadwalladr and Graham-Harrison 2018). At the height of
the revelations, great faith was placed in the GDPR. The law arguably represents
the strictest and most advanced data protection legislation in the world. It signif-
icantly shapes the corporate conduct of multinational companies. And it delimits
how public agencies collect data about Europeans. Věra Jourová, then European
Commissioner for Justice, described the GDPR as a ‘loaded gun’ (cited in Khan
2019) in the hand of regulators. Indeed, data protection authorities have since hit
various companies with exceptional penalties for their lack of compliance. Fines
may reach up to 4 per cent of companies’ annual global turnover or €20 million,
whichever is higher. In 2021, the Luxembourg Data Protection Commission issued
a record penalty of €746 million to the online retailer Amazon, while the French
data protection authority claimed €90 million from Google Ireland, both cases
concerning cookie consent. In the same year, the Irish Data Protection Commis-
sion imposed a €225 million fine on the messaging service WhatsApp due to a lack
of transparency. While some charges are still under dispute, the total amount of
GDPR fines exceeded €1.5 billion in early 2022—and it is rising quickly.
The high level of fines is just one of the reasons why the GDPR was not wel-
comed unanimously. The then US Secretary of Commerce Wilbur Ross argued
in the Financial Times that ‘EU data privacy laws are likely to create barriers to
trade—GDPR creates serious, unclear legal obligations for both private and pub-
lic entities’ (Ross 2018). An opinion piece in the New York Times unambiguously
stated that ‘Europe’s Data Protection Law Is a Big, Confusing Mess’ (Cool 2018),
while CNN suggested that companies were ‘getting killed by GDPR’ (Kottasová
2018). Some feared ‘Europe’s data grab’ (Manancourt 2020), while others articu-
lated concerns about collective security and safety, arguing that ‘EU Privacy Laws
May Be Hampering Pursuit of Terrorists’ (Drozdiak 2019).
Whatever the verdict, the GDPR’s data protection standards are a striking
example of how regulatory standards in one jurisdiction may have de jure and de
facto effects beyond their territory. The new legislation has both a deeper reach
and a wider scope than the 1995 Data Protection Directive it replaced. Prin-
ciples such as the right to erasure (Art. 17), the demand for privacy by design
and by default (Art. 25), or the right to data portability (Art. 20) (GDPR 2016),
deepen the regulatory impact of the GDPR. Furthermore, the legislation sets strict
restrictions for cross-border data transfers (Art. 44, 45). Should the standards of
a third country’s data protection not be adequate, that is, ‘essentially equivalent’
(Schrems 2015, para. 73) to protection in the EU, the GDPR severely limits the
transfer of data. On the basis of the marketplace principle, the GDPR determines
the applicability of EU law for all businesses and organizations that target cus-
tomers residing in the EU, irrespective of the location of the service, data, or
processing practices. Due to the physical disconnection between data collection,
storage, and processing, this has implications for organizations across the globe.
Through this widening and deepening of data protection standards, the GDPR
shapes regulatory conduct in unprecedented ways and far beyond European
territory.
In response, some countries such as Brazil and Japan have adopted legislation
very similar to the GDPR (Goddard 2017). Indeed, Bradford has suggested that,
through the imposition of high restrictions, the EU may elevate global regulatory
standards in a race to the top via the so-called ‘Brussels effect’ (2020, 167).
By early 2022, 137 countries, or about 71 per cent, had enacted
domestic legislation on data protection (UNCTAD 2022), compared with merely
17 countries with such legislation in the 1980s, 36 in the 1990s, and 68 in the 2000s,
a pattern of accelerating growth (Greenleaf 2011, 2).¹ As the EU is
increasingly pushing for ‘digital sovereignty’ (Obendiek 2021; Pohle and Thiel
2020), its ‘“gold standard” for data protection’ is often considered to be a sign of
increasing assertiveness in the digital space (Schünemann and Windwehr 2020).
Yet while the Brussels effect (Bradford 2020) may sketch a rosy future for data
governance, concerns persist.
Critics have suggested that the exertion of unilateral influence tends to exac-
erbate regulatory conflicts (Kuner 2015, 2017; Newman 2008). In the absence of
¹ For a comprehensive overview, see Bennett and Raab (2006); Greenleaf (2014a).
INTRODUCTION 3
² The International Law Commission (ILC) has defined extraterritoriality as follows: ‘An attempt to
regulate by means of national legislation, adjudication or enforcement the conduct of persons, property
or acts beyond its borders which affect the interests of the State in the absence of such regulation under
international law’ (International Law Commission 2006 Annex E, para. 2). For a discussion of the
different understandings of extraterritoriality concerning the EU data protection regime, see Kuner
(2015); for the US, see Putnam (2009); and for a general discussion, see Bignami and Resta (2015).
Questions of data use, processing, and governance have become a major topic of
concern in the digital age. The recent revelations of widespread surveillance of
journalists, activists, and dissidents by the private NSO Group (Guardian 2021)
have again highlighted how data not only are becoming ever more central in our
daily lives but are also at the core of global power asymmetries and vulnerabilities.
Data governance plays out in various arenas and areas, touching upon important
challenges to the liberal world order (Farrell and Newman 2021). Tech compa-
nies that hold massive amounts of data have emerged as uniquely powerful actors
on the global stage (Culpepper and Thelen 2019), impacting issues from hous-
ing to health to relationships. While they have enabled global connections and
friendships as well as economic growth, data-driven technologies such as facial
recognition or targeted advertising may also contribute to discrimination, pro-
filing, or election manipulation. Furthermore, data create significant regulatory
challenges, mainly because they are at odds with the territorial character of most
legal systems. Data may move from one jurisdiction to another in an instant, mul-
tiply indefinitely, and be in several places at once. They are protected by human
rights law but also constitute a key resource in today’s economy, while simulta-
neously holding significant value for law enforcement authorities and intelligence
agencies.
Efforts to govern digital data are embedded in the broader governance of the
internet, which comprises physical infrastructure (e.g. cables or exchange points),
virtual infrastructure (e.g. protocols or technical standards), content (e.g. postings
on social media or images on news websites), and data (Haggart et al. 2021, 2).
Data governance denotes ordering processes that include but are not limited to
the actions of states, international organizations, local authorities, and private
actors in relation to the processing, transfer, sharing, and general use practices
of digital data.³ Mechanisms may include legislation, common standards, terms
of service agreements, or looser recommendations. Data governance crucially
extends beyond privacy or data protection, including measures that aim to steer
rather than restrict data flows, such as mandatory processing or transfer practices
for security purposes.
³ Terminology and differences in understanding depend on societal and legal traditions. The under-
standing of data governance put forward here differs from an alternative use of the term to describe
data management within businesses. For a comprehensive overview of conceptual debates on privacy
and data protection, see, e.g., Solove (2008); Jörg Pohle (2017).
This distinguishes data governance from informational privacy and data pro-
tection, which, historically, have constituted important starting points for debates
about the role of data in society and politics. Conceptual thinking on privacy
dates back to historical figures, such as Aristotle (2012) or Mill, who distinguished
spheres of governmental and private authority (Mill 1859; see also Arendt 1958,
ch. 2; Habermas 1962). In 1890, the US-American legal scholars and practition-
ers Warren and Brandeis more specifically linked threats to individual privacy to
technological progress. Their seminal article, which defined privacy as the ‘right
to be let alone’ (Warren and Brandeis 1890), influenced legal and philosophical
debates for decades.⁴ More recently, the scholarly debate has shifted towards a soci-
etal understanding of privacy (e.g. Nissenbaum 2009; Regan 1995; Solove 2002)
considering its public and collective dimensions (Regan 1995, 213), particularly
concerning the exercise of human dignity and autonomy (Rössler 2005).
Due to its more recent significance based on technological innovations, the
related concept of data protection has received less conceptual attention. Some
have questioned the relevance of data protection in view of surveillance and social
sorting (Lyon 1994); others argue that data, in contrast to information, are only
worthy of protection if they specifically relate to natural persons (Floridi 2005).
This idea is increasingly problematized in the context of big data and deep learn-
ing (Jörg Pohle 2017), as these technologies may make inferences about personal
information possible from seemingly anonymous or non-personal data. Concep-
tual discussions have increasingly outlined the normative value of the right to
data protection as distinct from privacy (Cannataci and Mifsud-Bonnici 2005;
Lynskey 2015a, 2019). For instance, Tzanou (2017) suggests that, in contrast to
privacy, which works as a defensive shield, data protection enables openness and
transparency.
In sum, data governance is related to but different from data protection in that
it comprises processes of ordering and steering rather than only the restriction of
data flows. This often produces competing governance objectives, which may
result in the jurisdictional conflicts I outline in Section 1.2.
Jurisdictional conflicts are not specific to data governance but represent a general
phenomenon of contemporary global governance. Particularly in light of increas-
ing institutional density (Raustiala 2013), the global realm is frequently the site of
clashing interpretations of norms and rules. International relations (IR) literature
⁴ Scholars have since problematized the depoliticizing character of a clearly demarcated private
sphere, for example in relation to domestic abuse (Allen 1988; MacKinnon 1987; Nissenbaum 1998)
but also by questioning its value from a communitarian perspective (Etzioni 1999).
on ‘regime complexes’ (Keohane and Victor 2011) has conceptualized how legal
inconsistencies and possibilities for forum shopping may negatively affect global
order (Alter and Meunier 2009; Drezner 2013). The literature on fragmentation in
international law (Biermann et al. 2009) has identified ways of establishing norma-
tive compatibility in the absence of global government, including through varying
notions of global constitutionalism (Dunoff and Trachtman 2009; Grimm 2016;
N. Walker 2008). However, these scholars rarely examine how actors try to manage
conflicts that emerge at the interfaces of legal, institutional, or normative overlap
(Kreuder-Sonnen and Zürn 2020), particularly in the absence of institutionalized
dispute resolution mechanisms in areas such as internet governance (Flonk et al.
2020). Legal pluralists have questioned the normative desirability and practicality
of a coherent or even constitutionalized international order (Halberstam 2012;
Krisch 2010), also with regard to data governance (Kuner 2017). They suggest
that the continuous re-emergence and resolution of conflicts might contribute to
political equilibrium through mutual openness. Conflicts may thereby prevent the
unilateral imposition of specific norms and rules. Yet the empirical and conceptual
relevance of such conflicts warrants further scrutiny.
In data governance, legal scholars have highlighted the challenges of diverging
data governance regimes (Bignami and Resta 2015; Greenleaf 2012; Kuner 2013;
Mitsilegas 2014; Zhao and Mifsud-Bonnici 2016). They have outlined the pol-
icy challenges of self- and co-regulation in the fragmented regulatory structure of
data governance (Bennett 1992; Bennett and Raab 2006; Flaherty 1989; Greenleaf
2014a), in particular focusing on the EU (Busch and Levi-Faur 2011; Mayer-
Schönberger 2010; Ochs et al. 2016) and contested jurisdictions (Ryngaert 2015).
However, this literature has rarely systematically investigated how conflicts play
out in practice (for an exception, see Bignami and Resta 2015; Brkan 2016; Farrell
2003, 2006; Kuner 2013; Mitsilegas 2014).
The legal perspective has shortcomings because it fails to explain why jurisdictional
conflicts frequently and unexpectedly re-emerge despite their seeming resolution
through bilateral agreements or legislation. Some scholars have suggested
that both the diversity of existing norms and rules (Bennett and Raab 2020)
and their clashes (Farrell and Newman 2019) are rooted in distinct ‘regulatory
visions’ (Kuner 2017, 918) and ‘values’ (Mitsilegas 2014, 290) and characterized
by a lack of normative ‘fit’ (MacKenzie and Ripoll Servent 2012, 74). Yet how juris-
dictional claims are rendered (il)legitimate according to these visions and values
is not yet sufficiently understood through theoretical concepts like norm diffusion
(Newman 2008) or regulatory regimes (Bennett and Raab 2006; Bessette and Hau-
fler 2001). Thus, while the continuous re-emergence of conflict appears to indicate
prevailing differences, it is not clear how these differences play out.
Yet understanding these conflicts is important, as they may have serious con-
sequences. For instance, the threat of transatlantic data deadlock has prompted
This book aims to fill this conceptual and empirical gap in the literature by
investigating the following overarching research questions:
1. How do actors deal with conflicting demands about legitimate control over
data?
2. How are agreements (temporarily) stabilized?
3. How are conflicts interlinked with the evolution of the field of data gover-
nance?
The book offers a conceptual and empirical framework to understand the uneven
resolution processes of jurisdictional conflicts in data governance: in Chapters 1–
3, I develop a theoretical framework based on international political sociology.
I combine a Bourdieusian perspective on fields (Bourdieu 1993, 1996) with the
framework of value orders, drawing on Boltanski and Thévenot (2006) to anal-
yse ‘jurisdictional conflicts’ (Abbott 1986). I set up the contours of the field of
data governance as a social space of interaction. Through inductive reconstruc-
tion, I sketch the normative dynamics and the relations between actors which, I
claim, are rooted in five distinct value orders. In Chapters 4–8, I explore these
dynamics with regard to more specific instances of conflict and illustrate how
they connect to the broader trajectory of the field of data governance. The analysis
of five case studies set at the intersection of jurisdictional claims by actors in
the EU, the US, and private companies focuses on normative ordering processes
and conflict resolution strategies. The analysis extends beyond the mere applica-
tion of the theoretical and methodological framework and uses the case studies
as a medium for continued conceptual discussion. In Chapter 9, I develop an
empirically informed typology of four distinct visions of data governance based on
cross-case comparison before discussing the implications and outlook for future
research. In short, this book argues that jurisdictional conflicts have a so far under-
estimated normative quality that creates significant opportunities for disruption,
while constituting challenges for durable conflict resolution. In the context of
increasing interdependence, actors are united by the common objective of govern-
ing data but divided by deep normative convictions concerning the right conduct
and goals of data governance. Actors perceive differently not only the societal goals
of data governance but also what data are in the first place. Some perceive
data as a neutral tool for security agencies, others prioritize their character as an
economic resource, and still others see them as closely interlinked with
individuals. When claims of legitimate control over data overlap, actors attempt to
impose their normative visions, or socially constructed realities, of data governance
(see also Obendiek 2022) on others to shape the criteria for evaluation. Mitigated
by actor positions in the field, they may find recognition only if they are able to
create normative fit with the underlying principles in the field. Frequently, actors
resolve conflicts through institutional remedies that fail to address these underly-
ing divisions. Therefore, the field is characterized by the continuous re-emergence
of conflict.
In Sections 1.4–1.6 of this chapter, I first relate the book to relevant strands
of legal and political science research. Second, I outline the main theoretical and
empirical arguments of the book before, third, providing a detailed outline of the
following chapters.
(Lynskey 2015a, 2019; Tzanou 2017). This includes criticism of ‘surveillance cap-
italism’ (Zuboff 2018), state surveillance (Bauman et al. 2014; Bigo et al. 2013;
Parkin et al. 2013), and ‘dataveillance’ (Amicelle 2013; Amoore 2020; Amoore and
de Goede 2005; Aradau 2020), particularly concerning the preventive character
of data-based security policies (Mitsilegas 2020), or the demand for an internet
constitution (Fischer-Lescano 2016). An emerging literature particularly in Sci-
ence and Technology Studies (STS) outlines private responsibilities (Chenou and
Radu 2019) and imaginaries (Mager and Katzenbach 2021) but focuses mainly
on the effects rather than the foundation of such visions. While data protection
principles, including the GDPR, explicitly articulate deeper normative aspirations
(see Chapter 3), the sociological relevance of normative arguments in data gover-
nance is insufficiently theorized. An emerging branch of literature that considers
the normative order of the internet (Kettemann 2020), internet constitutionalism
(Redeker et al. 2018), or value judgements in algorithmic regulation (Bellanova
and de Goede 2022; Johns and Compton 2022; Ulbricht 2018) is just begin-
ning to understand the underlying normative principles in the field of internet
governance.
In sum, jurisdictional conflicts, and particularly the processes of resolving
them, remain undertheorized and understudied (for the most significant excep-
tion, see Farrell and Newman 2019; Obendiek 2022). While the literature
illustrates the existence of regulatory differences and the prevalence and nor-
mative implications of transatlantic and intra-EU conflicts in data governance,
the sociological relevance of normative arguments is still underestimated. Of
course, data governance is not only a question of values and the pursuit of
the common good, but ignoring the normative dimension of data governance
to the benefit of legal or power dynamics risks obscuring the deep divisions
that contribute to the prevalent re-emergence of conflicts. To address this lack
of understanding, this book adopts a bifocal theoretical framework inspired
by international political sociology, uniting a focus on power asymmetries
with the investigation of justificatory practices that draw on higher normative
principles.
In Chapter 2, I develop a theoretical framework selectively drawing on Bour-
dieusian field theory (Bourdieu 1996) and Boltanski and Thévenot’s (2006) soci-
ology of critique. Field theoretical approaches are particularly well suited to the
study of complex relational dynamics in contested policy areas (Mérand 2010;
Pouponneau and Mérand 2017; Vauchez and de Witte 2013). The sociology of
critique by Boltanski and Thévenot contributes a moral perspective to the analy-
sis of contested subject areas through the concept of value orders. The developed
framework draws on these approaches as theoretical building blocks to bridge
conceptual divisions. It thereby contributes to approaches that stress the reflexive
capacities of actors and strategic action in a social space that is already hierar-
chized (Fligstein and McAdam 2012; Sending 2015). While some scholars have
identified benefits from theoretical cross-fertilization (e.g. Leander 2011), this spe-
cific framework is new. It adds, I argue, reflexivity and agency to the Bourdieusian
perspective while making the value orders framework more receptive to dynamics
of power and hierarchy. To analyse conflict resolution processes in data gover-
nance, I focus on how field dynamics structure the ‘space of possibles’ (espace de
possibles, Bourdieu 1996, 234–9) for justifications and practices. More specifically,
I argue that the field of data governance is structured by distinct value orders that
each define a common good, invoke different reference communities, and out-
line valued objects as well as contrasting dystopian scenarios. Together with calls
for action, they form comprehensive and competing visions of data governance.
This book shows that diverging conceptualizations of societal goals constitute an
important factor in the emergence and resolution of conflicts between the EU, the
US, and private companies.
governance and cooperation with the search for safety and security. Proponents
attempt to establish data as a crucial tool in the fight against terrorism and crime.
Finally, (d) a vision of global cooperation emphasizes the pursuit of compatible
data protection principles through worldwide collaboration.
In short, I suggest that the normative plurality in data governance is undertheo-
rized but has significant implications for the continuous re-emergence of conflict.
I argue that while challenging actors are particularly capable of formulating an
alternative vision of the field, positional differences and the pervasiveness of the
field’s structuring effects foster convergence in the long term. In resolving jurisdic-
tional claims, actors loosely try to impose one of four competing visions of data
governance that conceptualize the field and data according to normative objec-
tives and link these to specific calls for action. While the analysis is focused on the
transatlantic context, which has been the main site of both regulation and conflict
since the 1990s, the findings of this book are globally relevant. As countries across
the globe, including major economic players such as China and India, have been in
the process of adopting new data protection measures, it is crucial to understand
the normative questions at the heart of these legal and technical decisions.
On the one hand, three substantive value orders are based on the common goods of security,
fairness, and production, while, on the other hand, two institutional orders are
based on sovereignty and globalism. I trace the emergence and institutional mani-
festations of these orders. To embed their normative underpinnings conceptually,
I outline similarities to broader debates in political philosophy. The value orders
structure processes of meaning-making and define morally worthy and deficient
behaviour. They define a common good, a reference community, and dystopian
and valued principles as well as criteria of evaluation.
Chapters 4–8 are empirical. Here, I analyse how actors respond to and deal with
jurisdictional conflicts and how their resolution is interlinked with the broader tra-
jectory of the field of data governance. I draw on a discourse-analytical reading of
policy documents, court proceedings, and press releases as well as semi-structured
expert interviews and observations. Using the value orders as my analytical frame-
work, I analyse five jurisdictional conflicts that involve public and private actors
loosely associated with the EU and the US, the most significant regulatory pow-
ers in data governance. Each case makes a specific contribution to understanding
conflict resolution processes and key dynamics, such as the fragility (Chapter 4) or
stabilization of agreements (Chapters 5, 6, and 7), the disruptive role of non-state
actors (Chapters 4, 7, 8), or the political conduct of courts (Chapters 4 and 8).
Chapter 4 focuses on the disruption of transatlantic commercial data flows
after the Snowden revelations. Despite intense criticism of the lack of protection
through the framework which had governed commercial data transfers since 2000,
the European Commission seemed extremely reluctant to address how it facili-
tated US intelligence data access. A complaint by the Austrian lawyer Maximilian
Schrems resulted in the invalidation of the agreement by the Court of Justice of
the European Union (CJEU) in 2015 and imposed principles of the order of fairness, particularly fundamental rights considerations, on the negotiations. Hence,
this empirical chapter highlights the vulnerability of highly institutionalized and
power-laden transatlantic relations, demonstrating the disruptive effects of indi-
vidual actors. The fragile normative balance of institutional agreements creates
opportunity structures particularly for actors holding more peripheral positions
in the field.
Chapter 5 outlines the emergence of a global regime of Passenger Name Record (PNR) sharing. In 2001, the US passed a law that required airlines to share passenger data with US authorities, contributing to significant tensions in the EU–US
relationship. Despite several disruptions in the legalization of the jurisdictional
claim, the discourse significantly shifted from its initial focus on the order of fair-
ness. I explore the increasing imposition of security as a criterion of evaluation
and discuss the difficult and uneven rise of the EU as an international countert-
errorism actor in the field. This chapter also illustrates how shared concepts of
responsibility constrain the space of possibles of challengers such as the European
Parliament.
INTRODUCTION 15
In Chapter 6, I examine a jurisdictional conflict between the US, the EU, and the
Belgian Society for Worldwide Interbank Financial Telecommunication (SWIFT)
in financial data sharing. In 2005, widespread access by US authorities to SWIFT
data on financial data flows strained the transatlantic relationship. While EU actors
initially strongly contested the jurisdictional claim of the US and the European
Parliament even rejected a high-level bilateral agreement after the Lisbon Treaty,
EU institutions increasingly supported financial data sharing for security pur-
poses. As in Chapter 5, the successful imposition of security as the criterion of
evaluation has entrenched financial data sharing as a necessary security measure.
The chapter shows the importance of ‘composite objects’ (Boltanski and Thévenot
2006, 279) that unite characteristics of different orders. Review and oversight
mechanisms contribute to a lasting compromise between principles of the order
of fairness and the order of security, despite limited impacts on data practices.
In Chapter 7, I explore a conflict between the US and the private company
Microsoft in online law enforcement. Access to electronic evidence such as email
account information by law enforcement authorities abroad raises questions of ter-
ritoriality and public–private cooperation. In the conflict, Microsoft refused US
authorities access to data stored in Ireland. The conflict demonstrates how the
significant role of private actors in data governance makes public actors reliant
on the goodwill and cooperation of private actors. The emergence of diverse
regional regimes on e-evidence shows how actors draw on different normative
and institutional resources to sustain their jurisdictional claims.
Chapter 8 discusses the contested right to be forgotten and the more recent
debates that not only stress the balance between privacy, freedom of information,
and business interests but also address the question of extraterritorial applica-
bility. In the latest instance of conflict, Google has become a focal point for the
fight to preserve freedom of information online, while the French data protection
authority (DPA) emphasizes the necessity of protecting European citizens’ privacy
also beyond European borders. The case demonstrates the varying capabilities of
actors to establish compelling jurisdictional claims, depending on field-inherent
restrictions and the normative quality of arguments.
In Chapter 9, I compare justificatory practices across conflicts and outline how
they are embedded in broader struggles between value orders. I conceptualize
these struggles through four distinct and competing visions of data governance
that link the principles articulated in the value orders with concrete calls for
actions. Subject to existing hierarchies, I suggest that actors struggle to improve
their position and invoke both normative principles and other resources to claim
jurisdiction. They point to hierarchical contradictions and inconsistencies or rely
on a set of alternative principles of worth to contest the status quo. Their suc-
cess varies depending on their capacity to establish an alternative vision of the
field. I outline the contributions of the book in the context of global data gover-
nance. Pointing to implications and future trajectories, the chapter discusses the
implications of the findings beyond the transatlantic context. The chapter points
to the significance of conflicts over meaning involving data localization practices,
global infrastructure, and sovereigntist assertions, particularly considering the rise
of China as a central actor in data governance.
In sum, the book offers a fresh perspective on the processes of meaning-making
that accompany the constitution, stabilization, and contestation of jurisdictional
claims in the field of data governance. I suggest that these conflicts are symp-
tomatic of larger normative and institutional struggles that play out not only
between the EU and the US but also globally. In the coming years, data gov-
ernance frameworks across the globe will be (re)negotiated. As an increasing
number of actors extend their claim over data, this book shows that a clear idea of
what data mean for us and what is at stake when governing them interlinks with
fundamental questions of collective values and goals. Data governance unites fea-
tures of increasing importance in contemporary politics, including the changing
relations between citizens, states, and private companies. Thus, it provides indi-
cations of how struggles about the global order may evolve under conditions of
ever-increasing inter- and transnational interlinkages.
2
Theorizing the Resolution of Jurisdictional
Conflicts
of a field, actors are less socialized into a specific context and develop, as Harbisch
terms it, ‘change agency’ (2021, 42). Over time, they lose their ‘“oppositional” per-
spective’ (Fligstein and McAdam 2011, 4). This explains why institutionally weak
actors are often able to severely disrupt even long-standing institutionalized agree-
ments against the interests of more powerful actors but then fail to transform these
challenges into change in the long term. Thus, the resolution of conflicts in data
governance and its consequences depend on the character of the normative chal-
lenge, the actors’ capacity to engage in justificatory practices that create shared
normative understandings, and the existing hierarchization in the field.
The chapter will proceed as follows: first, I outline the theoretical approach in
relation to the existing literature and point to the specific contributions of the
relationalist approach taken here. Second, I sketch the concept of jurisdictional
conflicts as a particular type of conflict. Third, I establish the contributions of
Bourdieu’s critical sociology and the sociology of critique based on Boltanski and
Thévenot. Regarding critical sociology, I establish the field as a space of interac-
tion that is structured around the common ‘illusio’ (Bourdieu 1996, 331–6; see
also 1990, 1998) of control over data. I explore the constitution and specific characteristics of this field in more detail in Chapter 3 but outline the mechanisms
that shape actor behaviour in principle. With regard to the sociology of critique, I
specify the categorization of justifications in situations of normative dispute, draw-
ing on the concept of value orders. Fourth, I outline the theoretical framework
that aims to conceptualize the resolution of conflicts in data governance. In three
steps, I explore the resolution of the conflict through its emergence, responses, and
outcome. Fifth, I outline the research design and method, including case selection.
The argument presented in this chapter draws on selected concepts from interna-
tional political sociology to highlight the constructed and historical nature of the
space, objects, and categories of governance that significantly shape the evolution
and resolution of these conflicts. It draws on a relationalist approach that fore-
grounds processes of mutual construction and stabilization between actors and
objects (McCourt 2016, 479). First, the theoretical framework establishes the soci-
ological nature of jurisdictional conflicts. It suggests that these conflicts involve
claims of legitimate control over an issue (Abbott 1986, 191). These are themselves
constitutive of the contents of this issue as an object of governance as well as the
form and goals of its governance. Second, the book draws on field theory. The the-
oretical framework establishes the hierarchized space of interaction that organizes
the site of jurisdictional conflicts as a Bourdieusian field (Bourdieu 1996, 1998).
As I outline in Chapter 3, actors engaged in the field are united by the percep-
tion that data governance is meaningful in the pursuit of the common good, the
illusio of the field. The field as a concept offers a particularly useful perspective on
the co-constitutive relationship between specific instances of conflict and under-
lying structures. Third, to unpack and categorize the deep normative divisions in
jurisdictional claims and conflicts that are formed and expressed in justificatory
practices, I combine this approach with the concept of value orders, drawing on
Boltanski and Thévenot (2006). Value orders stipulate distinct conceptualizations
of the common good, reference communities, dystopian scenarios, and evaluative
mechanisms. I rely on the concept as a powerful heuristic tool for analysing how
actors define jurisdictional claims as morally worthy or deficient, depending on
the perceived contribution to common societal values.
Through the understanding of the multidimensional nature of jurisdictional
conflicts, the approach seeks to make three contributions to existing debates on
data governance and jurisdictional conflicts. First, the bifocal approach illuminates
both short- and long-term dynamics. This helps understand which factors in dif-
ferent issue areas and contexts shape the likelihood of disruption and conflict or
stabilization and convergence. Existing approaches have illustrated the relevance
of (inter)dependence (Argomaniz 2009; Kaunert et al. 2012, 2015) and access to
transnational fora (Farrell and Newman 2019), the difficulty of reconciling differ-
ent legal approaches (Bignami and Resta 2015; Kuner 2015; Reidenberg 2000), or
the role of norm entrepreneurs (Newman 2008). These approaches tend to illus-
trate either short- or long-term dynamics but rarely both. The emphasis on social
learning processes based on the asymmetrical transatlantic security relationship
(Argomaniz 2009) does not explain the continual re-emergence of disruptions in
similar issue areas. In contrast, the literature on legal pluralism has outlined not
only the empirical but also the normative relevance of the coexistence and reg-
ular clashes between legal systems in a field (Krisch 2010). Yet the macro-level
approach rarely considers how the political, legal, and normative claims play out in
practice and are mediated by positional asymmetries. By emphasizing the norma-
tive nature of conflicts as well as their embeddedness in a hierarchically structured
field, this book outlines how short-term disruption strategies often fail to instigate
change in the long term.
Second, the approach allows the problematization of data as an object of gov-
ernance and conceptualizes the plurality of normative underpinnings of such
constructions. This helps understand how more peripheral actors contribute to
disruptions and how even strongly institutionalized compromises fail. While most
scholars perceive data as fixed, this book contributes to existing interdisciplinary
work on data governance that analyses how data are constructed as govern-
able rather than simply how they are framed. This includes, for example, work
in media studies (van Dijck 2014; Gitelman 2013) or critical security studies
(Amoore 2011; Aradau 2020; de Goede 2018b; de Goede and Wesseling 2017).
Drawing on Science and Technology Studies (STS) or ‘algorithmic regulation’
(Yeung 2018), scholars have problematized data-based social ordering through
¹ Due to the lack of internal organization and external representation as professions, apart from,
potentially, Data Protection Authorities (DPAs), the professional character of such a group is of less
relevance in the field of data governance (see also Raab and Szekely 2017). The diverse expertise of a
legal, economic, and technical character required to enter the field of data governance has prevented
the domination of a particular profession.
In the framework, I broadly distinguish between three phases of the conflict: emer-
gence, responses, and outcome. The emergence of the conflict defines the point
in time when at least one actor perceives overlapping claims to be conflictual
and voices dissent. Since I focus on conflicts that have already emerged, I do not
analyse extensively under what conditions actors perceive overlapping jurisdic-
tional claims to be problematic and activate the conflict (see, e.g., Holzscheiter
et al. 2020). Yet the justifications actors use in their initial jurisdictional claims
form the starting point and shape the further evolution of the conflict, including
its resolution. The responses to the conflict include, for example, explicit strate-
gies actors pursue as well as less strategic reactions they display throughout the
conflict. Here, I particularly focus on the actors’ justifications and their reliance
on varying conceptions of worth, as I outline later. The outcome may involve the
institutional or normative resolution of the conflict, for example through an agree-
ment based on compromise or through one side prevailing. This may encompass
different levels of formality, for example a bilateral or international agreement, a
court judgment, the publication of guidelines, or less institutionalized forms of
intersubjective agreement. However, the conflict may also be ongoing.
In conclusion, the concept of jurisdictional conflicts highlights how overlapping
jurisdictional claims that aim to assert control over a specific issue put forward a
specific definition of this issue and how these clashing definitions may result in
conflict. By claiming jurisdictions, actors put forward normative goals. Therefore,
I specifically aim to investigate the production of data governance in its different
dimensions through jurisdictional claims. The conditions, constraints, and conse-
quences of such processes will be examined in more detail later. In Section 2.3, I
outline the analytical tools that illustrate how jurisdictional conflicts play out in
practice.
illegitimate jurisdictional claims. This ‘turns the focus away from the idea that
objects or structures have assumed a fixed, stable identity and that closure is
achieved at some point’ (Bueger and Gadinger 2014, 456). In the context of con-
flicts, I show how jurisdictional claims construct fragile structures of meaning in
specific situations and at the same time appeal to and constitute relatively sta-
ble normative visions. The analytical recognition of normative plurality does not
imply the absence of power. On the one hand, actors simply do not have equal
access to the discursive sites of conflicts, such as courtrooms, bilateral meetings,
or media outlets. On the other hand, and due to asymmetry in the distribution of
material or symbolic resources, they also have unequal abilities to engage in jus-
tificatory practices. For example, major private tech companies provide backing
to their claims by drawing on significant financial resources, fostering allegedly
independent expertise, for example by funding academic research or conferences
(Bodó et al. 2020) or setting up expert boards (Chenou and Radu 2019). Juris-
dictional claims, therefore, take place in a space that is already hierarchically
structured.
The recognition of the potential for justificatory practices and normative plu-
rality within an already structured space requires a conceptualization of agency
and normativity that bridges insights from Bourdieusian critical sociology and
the Boltanskian sociology of critique with insights from related IR and sociolog-
ical literature (e.g. Fligstein and McAdam 2011, 2012; Sending 2015; Zürn and
de Wilde 2016).² I specifically emphasize both critical actor reflexivity (Boltanski
and Thévenot 2006, 146) and the situatedness of actors in hierarchical struc-
tures (Bourdieu 1996; Bourdieu and Wacquant 1992, 123) in the jurisdictional
claims. The investigation of how actors draw on and enact normative principles
while recognizing the possibility of domination enables a richer account of the
multidimensional character of jurisdictional conflicts.
In summary, critical sociology highlights the hierarchies and rules of inter-
action, while the sociology of critique illustrates the specific justifications and
broader shifting orders of value. The sociology of critique approach adds a more
nuanced perspective on the categorizations actors employ to justify inequality.
² Drawing on the concept of ‘habitus’ (Bourdieu and Wacquant 1992, 123), Bourdieu puts emphasis
on how different dispositions shape present and future practices. Space for change is mostly restricted
to external shocks that provoke a crisis in the field (Mérand 2010, 352). This approach, according to Boltanski, unfairly limits the reflexivity of actors and their ability to address and change their social surroundings (2011, 19). Boltanski (2011, 21) specifically problematizes that Bourdieu assumes this reflexivity for scholars. Thus, the assumption that scientists are able to reflect on social
structures, while ordinary actors rarely escape them, he argues, undermines the postulated reflexive
character of the approach. Boltanski and Thévenot instead highlight the ‘ability to detach oneself from
the immediate environment’ (2006, 146), stressing actors’ capacity to reflect on their position. The
recognition of actor reflexivity also has implications for normativity. In field theory, the invocation
of normative questions is often conceptually limited to the legitimization of hierarchy (see Pellandini-
Simányi 2014), while Boltanski’s sociology of critique emphasizes the normative and moral dimensions
of action.
The field approach emphasizes the dynamics of power that constrain them. Both
show the co-constitutive relationship of the conflict resolution processes with the
specific logics of the field. In Sections 2.3.1 and 2.3.2, I describe each in more depth.
Drawing on Bourdieu, I use the concept of a ‘field’ (1996, 1998; Bourdieu and Wac-
quant 1992) to conceptualize the relationship between specific conflicts and the
framework of endogenous logics that constitute the structure of the field. My main
argument is that the common stake at the centre of the field of data governance,
that is, that data governance is meaningful in the pursuit of common societal goals,
leads to self-reinforcing dynamics that make control over data seem valuable. This
contributes to the pervasiveness of competing jurisdictional claims on data. These
dynamics seem to prevail even in cases in which data governance is not effective
with regard to the achievement of the particular societal goal that motivated actors
to join the field in the first place.³ The concept of fields has been used in a diver-
sity of areas, including international diplomacy (Adler-Nissen and Pouliot 2014;
Pouliot 2016), security (Mérand 2010), or sustainability (Dingwerth and Pattberg
2009), but recently also in relation to internet policy (Julia Pohle et al. 2016; Radu
2019; Reiberg 2018). It is particularly suitable for recognizing long-term institu-
tional and social processes in evolving issue areas (see also Hamm et al. 2021).
In the following, I demonstrate how it can also contribute to a better understand-
ing of the specific ‘logic of practice’ (Bourdieu 1990) at play in data governance.
I briefly present the concept and illustrate the implications for interaction in the
field and actor responses in conflicts.
While Wacquant and Akçaoğlu (2017, 62–3) criticize the preponderance of
identified fields in the social sciences, I use this term to describe a space of inter-
action and social relations that is united by actors’ belief in the significance of
common stakes that constitute the centre of the field. In contrast to a community
based on values or common goals, actors share a much thinner form of mate-
rial and ideational interest (Kauppi and Madsen 2014). Bourdieu describes this
unity through the concept of a common illusio. The illusio depicts actors’ notions
that their engagement in the field and the central issue at the field’s centre is
meaningful (Neveu 2018, 364). In the field, actors follow specific logics (Bour-
dieu 1996, 227) that emerge from the illusio and engage in common struggles over
meaning-making (Bourdieu and Wacquant 1992, 102).
³ Scholars have described this belief in the power of data or the perceived pressure to obtain data.
For example, van Dijck captures this idea through the concept of ‘dataism’ (2014), while Fourcade and
Healy (2017) and Schildt (2020) use the term ‘data imperative’.
While the common stakes may appeal to outsiders, fields tend to have some
barriers to entry. For example, even if actors subscribe to the arguably more
widespread illusio that data governance is important, outside actors are likely to
experience difficulties in successfully navigating the field. Actors that have been
engaged in the field for a longer time are likely to have gained material or symbolic
resources, such as accumulated data or specific legal expertise. These resources
can help navigate or structure existing hierarchies. This is well illustrated by the
monopolist position of large tech companies that prevent smaller companies from
becoming more meaningful participants in the field by buying them up (Culpepper and Thelen 2019). Nevertheless, fields are always surrounded by, nested in, and
interconnected with other fields (Fligstein and McAdam 2011, 3), shaped by both
external and internal principles of legitimation (Bourdieu 1996, 217; A. Cohen
2018, 203).
While the field establishes the social space, the framework for political action
is formed by the distribution of resources and the perceived social reality of the
field. On the one hand, resources delineate how actors are positioned in a spe-
cific field. Bourdieu describes these resources by the concept of ‘capital’ (Bourdieu
1991), which may manifest in diverse forms, for example through social, eco-
nomic, political, or symbolic resources. The positioning of actors only becomes
meaningful in the context of the specific principles or truths of the field. Thus,
on the other hand, the framework accounts for the subjectivist perceptions and
categories actors use to understand their surroundings (Bourdieu 1985, 727–8).
These truths set the limitations to what actors perceive as thinkable and sayable
under specific circumstances. The truths and struggles of the field are contained
in what Bourdieu describes as the ‘space of possibles’ (1996, 234–9). Actors tend
to relate to these truths (Bigo 2011, 232), which typically results in a reproduction
of existing power dynamics (Susen 2014, 326). For the resolution of jurisdictional
conflicts, this means that actors tend to articulate visions of the field that con-
form to the existing principles of the field. To some extent beyond Bourdieusian
assumptions, I consider actors as making ‘articulate, and more or less strategic,
attempts at gaining recognition in the field’ (Sending 2015, 29). Rather than just
navigating the field in a practical sense, this actor reflexivity also creates the space
for strategic action (see also Susen 2014, 335). Therefore, I argue against an overly
strong emphasis on actor attributes as primary defining characteristics, as these
sources do not just pre-exist but must themselves be constructed and made oper-
ational. In the effort to define the criteria they and others use in their judgement
(Sending 2015, 19; 2017, 318–19), actors’ success largely depends on their capacity
to reflect on the possibilities within a specific field (Fligstein 2001, 114).
In sum, actors engage in constant struggles to ‘produce and to impose the legit-
imate vision of the world’ (Bourdieu 1989, 20). These processes create meaning
and develop into endogenous logics that structure how actors perceive their posi-
tion and the specific truths of the field. In line with the Bourdieusian approach,
I highlight how actors’ reflexive horizon and their practice are shaped by a per-
ceived space of possibles. However, against the backdrop of the Boltanskian
framework, I emphasize the situatedness of action rather than specific predispo-
sitions (Boltanski 2011, 63). The field of data governance is formed around the
common illusio that specifies control over data as a common stake as well as
meaning-making processes that involve the construction of legitimate governance
responses in the pursuit of the common good. To analytically grasp the concep-
tualizations of this common good, I draw on the concept of value orders, which I
outline in Section 2.3.2.
Section 2.3.1 established the field as a space of interaction that shapes (the perception of) the positions from which actors articulate jurisdictional claims. In
this section, I outline the contributions of the sociology of critique to the theo-
retical framework, which enables the analysis of justificatory practices. As they
establish jurisdictional claims of control over data as morally worthy or deficient,
justifications lie at the very core of jurisdictional conflicts.
Drawing on Boltanski, I assume that justifications aim ‘to reconcile a require-
ment of common humanity, which presupposes the equality of actors, with their
ordering in a hierarchy’ (Boltanski 2011, 112). They unite these two aspects
through references to higher common principles. Kornprobst describes this pro-
cess as ‘subsuming particulars under universals’ (2014, 193), pointing to the link
between the general and the specific. Justifications are, therefore, not only rele-
vant as ex post clarifications for action, but also provide orientation to political
decision-making ex ante.⁴ In other words, in invoking justifications, actors, while
implicitly or explicitly referring to specific imagined communities, draw on higher
principles to legitimize inequality between different actors. In making a jurisdic-
tional claim, actors, in reference to moral worth, justify why their claim rather than
others’ is legitimate. Therefore, in situations of conflict, ‘each of the protagonists
presents not only a different interpretation of what has “really” occurred, but also
different facts in support of her truth claim’ (Boltanski 2011, 61).
Value orders constitute an analytical tool for capturing the grammar actors
employ under these circumstances. Boltanski and Thévenot stipulate that under
conditions of normative uncertainty or dispute, individuals follow an ‘imperative
of justification’ (2006, 346). In feeling compelled to justify their claims, they draw
⁴ The significance of public justifications for state and non-state actors to take or guide political
actions is firmly established (e.g. Deitelhoff 2009; Bially Mattern 2001; Hansen 2000a). For a more
comprehensive overview of justification, see, e.g., Kornprobst (2014) or Forst (2015).
on distinctive orders of worth, polities (cités), or, as I refer to them, value orders.⁵
Hanrieder defines them as ‘repertoires of evaluation consisting of moral narra-
tives and objects that enable tests of worth’ (2016, 391). The concept has been
used in empirical analyses of, for example, global health (Hanrieder 2016), the
UN Security Council (Niemann 2019), or organizational structures (Jagd 2011).
It should be noted that value orders do not presuppose strong intersubjective agreement or shared values but are also present in heterogeneous settings. This is particularly compatible with the conceptualization of data governance as a field which
is characterized by unity only in the sense that actors share a belief in common
stakes.⁶
Boltanski and Thévenot reconstruct six orders from seminal works in Western
political philosophy⁷ but acknowledge the need to account for different periods or
contexts—including those in which justification is not as central (Boltanski and
Thévenot 2006, 347).⁸ Later modifications also include new orders (e.g. Boltanski
and Chiapello 2005; Thévenot et al. 2000). Likewise, I use value orders as a pre-
dominantly heuristic concept adjusted to the field of data governance. As I outline
in Section 2.5, I reconstruct five distinct orders, which I outline in Chapter 3. While
the orders are irreducible to each other, I do not perceive them to be fundamentally
incompatible, as a common good may be understood as the orientation towards a
specific goal or the orientation towards a specific community. More specifically, I
suggest that there is not only a plurality of substantive common goods but also a
plurality of procedural common goods.⁹ For example, while the pursuit of safety
and security constitutes a substantive normative goal, the pursuit of sovereignty
to guard specific communities from excessive influence in the global realm con-
stitutes a procedural normative goal. This distinction between procedural and
substantive does not speak to the normative or political quality of the higher
⁵ The terms ‘polity’, ‘cité’, and ‘order of worth’ are used interchangeably in the original work. This
book uses the term ‘value orders’. The concept of ‘world’ describes the justificatory language, object,
and practices that are available to actors in a specific situation rather than the grammar of principles
of the polities.
⁶ This aspect also distinguishes value orders from similar concepts that have been prominent in
social theory. For example, Walzer’s ‘spheres of justice’ (2008) assume that certain principles govern a
particular domain, while value orders explicitly account for the simultaneous plurality of principles of
worth. In contrast, Forst problematizes the limitations of narratives of justifications or normative orders
both concerning their level of specificity and their numerical occurrence (Forst 2015, 31). However, I
suggest that focusing on a limited number of more or less dominant value orders provides analytical
specificity. Clustering the plurality of conceptions of worth and comparing them on a limited number
of dimensions helps understand the main normative reference points that inform political action.
⁷ Boltanski and Thévenot (2006) define six different orders: market, industrial, civic, domestic,
inspired, and fame. Hanrieder (2016) defines four orders: fairness, production, security, and spirit.
⁸ Naturally, the importance of justification presupposes a society in which justification is possible
and to some extent, socially and politically meaningful; see Blokker and Brighenti (2011) for a more
detailed discussion.
⁹ Kornprobst (2014, 8) similarly specifies a distinction between substantive and procedural based
on the specification of the right action or the specification of the applicable community, respectively.
THEORIZING THE RESOLUTION OF JURISDICTIONAL CONFLICTS 29
Against this theoretical backdrop, this section formulates the concrete theoret-
ical framework for understanding how actors resolve conflicts emerging from
overlapping jurisdictional claims. To understand the resolution of jurisdictional
conflicts in data governance, the book emphasizes two related aspects. On the
one hand, the theoretical framework establishes the space of interaction in which
conflicts take place, that is, the field. The field works as a structuring mechanism
but also establishes a structured space, because the principles of the field and
jurisdictional conflicts are assumed to be mutually constitutive. On the other hand,
to understand the evolution of conflicts in more detail, I conceptualize the justi-
fications actors bring forward in their jurisdictional claims. This analysis draws
on the concept of value orders. Therefore, the framework outlines how jurisdic-
tional conflicts emerge in an already structured space and clarifies how actors draw
on evaluative principles and tools in their resolution. In Section 2.4.1, I outline
the three phases of the conflict, that is, emergence, responses, and outcome, as
illustrated in Figure 2.1. I also illustrate how these relate to the questions of actor
responses, the stabilization of agreements, and the interlinkages with the broader
principles of the field.
As outlined above, the specific character of the conflicts I analyse in this book
presupposes that actors have articulated ‘jurisdictional claims’ (Abbott 1986)
that aim to establish control over data. Jurisdictional claims demarcate a space
of control and, at the same time, are constitutive of this space. In claiming
legitimate control of a problem (Abbott 1986, 191), actors also make norma-
tive claims about the character of data and implicitly or explicitly outline how
to govern them. Jurisdictional conflicts arise when at least one actor (P1) per-
ceives overlapping jurisdictional claims to be problematic (see also Holzscheiter
et al. 2020). The act of questioning the current moral or normative hierarchy
may result from perceived injustice or inconsistency. In this case, actors may
decide to ‘test’ the existing hierarchies to confirm or challenge the existing order
(Boltanski and Thévenot 2006, 40; Boltanski 2011, 103). In tests, actors must
invoke their critical capacities in coordinating the plurality of orders and their
intersubjective interpretations. While all tests ‘exploit contradictions’ (Boltan-
ski 2011, 110), they can be ranked according to their potential for normative
change. I use these tests to conceptualize the normative quality of the jurisdictional
claim.
Boltanski identifies three types of tests: truth tests, reality tests, and existen-
tial tests. Truth tests mainly rhetorically confirm (and only rarely disconfirm) the
hierarchization within the existing order. They tend to prevent rather than enable
critique (Boltanski 2011, 62). Due to this largely non-conflictual character, I focus
on the other forms of tests.¹⁰ Reality tests more strongly question the status quo
by explicitly pointing to inconsistencies or injustices (Boltanski 2011, 104). Yet
the underlying order still finds acceptance. For example, a reality test in reference
to the common good of economic progress and innovation might point to prac-
tices that undermine consumer trust, such as privacy violations. Yet the reality test
would not question whether data practices should be subjected to a commodifying
logic. Existential tests are most radical in their challenge, bringing forward funda-
mentally antagonistic positions. While reality tests problematize that a particular
practice contradicts the overall pursuit of a specific common good, existential tests
draw on completely different standards, that is, they question the common good
as such. In our example, data access might no longer be evaluated in relation to
its contribution to consumer trust and economic progress but as necessary to the
prevention of terrorism. While data protection rules initially were mainly evalu-
ated with regard to their conduciveness to economic progress and the enjoyment
of human rights, the consideration of security as an evaluative criterion experi-
enced a strong rise in reaction to the 9/11 terrorist attacks (Etzioni 2018, 112–13;
Kaunert et al. 2012). Thus, a test that formerly might have involved economic
impact assessments and human rights law may now stress the ease of data avail-
ability or sharing mechanisms in the fight against terrorism to prove worth and/or
contest the status quo. In sum, the theoretical framework assumes jurisdictional
conflicts as arising from tests that question the existing moral hierarchization in
the context of competing jurisdictional claims. Conflicts may emerge regarding
the appropriate interpretation of hierarchy within the existing order (within-order
conflict) or regarding the conceptualization of an issue based on principles from
alternative value orders (across-order conflict). Both instigate an ‘imperative of jus-
tification’ (Boltanski and Thévenot 2006, 346). Most of the conflicts I observe fall
¹⁰ While I do not investigate these cases, the successful prevention of conflicts may be a sign of institutionalized coordination mechanisms, such as regular truth tests, or of a particularly strong position
in the field. While they are, therefore, of great relevance, they are epistemologically challenging, as their
identification is based purely on (counterfactual) assumptions by the researcher.
into the latter category, that is, the rejection of the applicability of a specific regime
of justification (Chapters 4–7) rather than the hierarchy within one order (but see
Chapter 8).
In Section 2.4.1, I illustrated how actors invoke specific normative principles after
reflecting on their present and future situation in a given field. In this section, I
outline potential actor responses. While jurisdictional overlap needs to be per-
ceived as problematic by at least one actor (P1) for a conflict to emerge, there are
fundamentally distinct ways in which other actors (Px) may respond to the emer-
gence of conflict or a challenge to the status quo. I discuss three distinct responses
as components of the conflict resolution process. First, actors may not respond to
the imperative of justification at all; second, they may contest the hierarchization
within the prevailing orders; and third, they may attempt to impose principles
from an alternative order. While actors might formulate an immediate compro-
mise, I discuss this in Section 2.4.3, as it constitutes an outcome rather than a
response.
First, actors may decide not to follow the imperative of justification or simply
lack the opportunity to do so. Reasons for this may vary; actors might be aware
of (material or ideational) dependency, for example on the basis of their domi-
nant position in the field. This could prevent a further escalation of the conflict,
as actors either assert their dominance or submit to their marginal position in the
field. Actors may also draw on ‘morally unjustified social norms’ (Möllers 2020,
22) such as the status quo of the existing social order. Some actors may also be
unable to engage meaningfully in justificatory practices because they lack access
to relevant fora or capacities for justification. In short, while the most powerful
actors may decide to resort to their position in the field to evade the imperative
of justification, weak actors might not have the option or capacity to participate
in justificatory practices in the first place. Therefore, actors may simply (be forced
to) remain silent, intentionally ignore the conflict or its normative character, or
fall back on coercive power, such as through infrastructural or technical means.
Second, if actors respond to the imperative of justification, I expect that they
draw on value orders that have already manifested in some form in the broader
field. Particularly as a response to a reality test, actors may emphasize the appli-
cability of and hierarchy within the existing order or the balance of orders that is
dominant in the situation. As illustrated above, the framework of justification is
not necessarily tied to a specific call for action (Boltanski 2011, 111). Therefore,
it may be easier to contest the specific conclusions derived from the predominant
principles and evaluative criteria rather than drawing on a completely distinct set
of principles.
Third, actors may also rely on principles or objects associated with an alterna-
tive value order. The reference to an alternative common good also enlarges the
space of possibles and opens the space for legitimate political action. Therefore,
actors might draw on an alternative justification order if their preferred outcome
is so far removed from their interests or normative convictions that they feel the
need to break with the status quo. I suggest that in contrast to actors holding a posi-
tion of power in the field, what Fligstein and McAdam call ‘incumbents’ (2011, 2),
this type of challenge may be particularly likely to come from actors that hold a
more peripheral position in the field. Due to their outside perspective, those ‘chal-
lengers’ (Fligstein and McAdam 2011, 2) are able to formulate a fundamentally
distinct vision of the field. This bears significant disruptive potential. While value
orders are considered to remain relatively stable, any agreements or compromises
remain open to challenge (Boltanski and Thévenot 2006, 177).
There is not necessarily a clear link between the articulated challenge or test
(by P1) and the response of other involved actors (Px). For example, if actor P1
articulates a reality test, Px may decide to deny the addressed inconsistencies or
justify them in reference to alternative principles. In turn, if actor P1 articulates
an existential test that draws on principles of an alternative order, actor Px may
decide to deny the applicability of this order or similarly draw on order(s) that
are neglected in the status quo. Compared with the contestation or affirmation of
moral hierarchization within the existing order(s), the attempt to apply different
evaluative criteria is usually more difficult and more costly. An established justification
order ‘constitutes an entire lifeworld and hence leads to stable habits of action
and perception’ (Honneth 2010, 386). It also presupposes a specific community
that recognizes the moral claim. If the appeal to an alternative order is consid-
ered a sufficient strategy and succeeds, this readiness to accept suggests a deeper
entrenchment of the order in the field.
A substantial alteration of the universe of possible and legitimate policy options
is particularly well captured by the concept of securitization. It outlines
how actors conceptualize a referent object as existentially threatened through a
‘securitizing move’ (Buzan et al. 1998, 25) and justify extreme measures. In and
beyond data governance, IR scholars have employed Bourdieusian frameworks to
outline securitization processes (e.g. Bigo 2014; Huysmans 2006). Securitization
may be based on strategic action or incremental processes of structural or nor-
mative change (Williams 2003, 521). However, securitization does not capture the
plurality of common goods in data governance. The exclusive focus on governance
efforts to respond to existential threats risks ignoring other underlying reference
goods and evaluative mechanisms. For example, in data governance, the norma-
tive discourse is to a significant extent shaped by the importance of the free flow
of information, innovation, and economic progress through the data economy
or the importance of protecting the privacy norms of a specific reference com-
munity. I suggest that securitization is one specific mechanism through which a
In Section 2.4.2, I outlined the different ways in which actors may respond to the
emergence of a jurisdictional conflict. This section more explicitly focuses on the
outcome of the conflict through the attempt to find a temporary stabilization or
agreement, distinguishing between three types of outcomes. These outcomes may
be institutionalized to varying degrees. An outcome may consist of a minor or
major reform of the existing institutional order, for example through the revi-
sion or abolishment of a bilateral agreement, or it may lead to the emergence of
new structures such as laws or treaties. However, if the underlying fundamental
¹² Boltanski (2012) has also identified other grammars of dispute resolution, such as love and vio-
lence. Nevertheless, as the book’s focus is on conflicts between globally prominent entities where such
an arrangement is less likely to be considered legitimate in the long run, I do not consider them here.
of these compromises. For example, a ‘user’ may unite traits of a ‘customer’ and an ‘individual’ and thus represent a worthy reference community in the pursuit of economic progress and individual rights. The more ambiguous and open
these principles and objects are, the more likely it is that they sustain compro-
mises, as they may enable a variety of interpretations that can satisfy multiple
concerns at once. As soon as compromises are translated into tangible political
action, disagreement may re-emerge. Thus, any unexpected tilt towards a spe-
cific notion of the common good may lead to the dissolution of the compromise,
because ‘an exploration of the grounds for agreement shows the compromise up as
a simple assemblage without any foundation at all’ (Boltanski and Thévenot 2006,
336). By recognizing the implicit malleability of the reference points, the analyt-
ical framework takes into account that the process of compromising opens new
indeterminacies but may also entrench certain principles in the field.
In summary, the potential for normative change is low when actors compro-
mise to satisfy private rather than public goods. If one side normatively prevails,
the normative quality of the challenge or response is decisive. A revision of the
existing hierarchy will have a lesser potential for normative change than the suc-
cessful imposition of an alternative value order. Compromises may have varying
implications for normative change, depending on the normative proximity of the
justificatory principles. The outcomes of conflicts may also have consequences
beyond the specific instance of conflict. For instance, if the outcome of the con-
flict consists in a renewed emphasis on the order of fairness, this may inspire other
actors to challenge the dominant order in a related area. The outcomes of con-
flict resolution processes may be of a more permanent or temporary character,
depending on whether they are able to create a normative fit with the underlying
institutional and normative logics in the field, that is, the constellations of value
orders and the distribution of resources and positions in the field.
The central claim of this book is that conflicts in data governance must be under-
stood in their multidimensional quality as normative, institutional, power-related,
and legal conflicts. I have argued that in particular the normative principles actors
draw on in their resolution processes to justify their jurisdictional claims are
central but understudied. When actors fail to address normative differences in
the resolution of the conflict, agreements are particularly vulnerable to disrup-
tion. Sections 2.4.1–2.4.3 outlined how these value orders are organized around
the field’s common illusio and link higher common principles, specific reference
communities, and evaluative mechanisms. In this section, I establish the part of
the framework that zooms out of the specific conflict situations to reconstruct
how justificatory practices are used across conflicts. As I have illustrated above,
the process of conflict resolution may involve the attempt to refine or impose
a specific value order or may be based on newly created compromises between
orders. The reference to dissimilar principles requires trade-offs, which may foster
the instability of compromises. However, if these trade-offs become normalized,
the justificatory space is likely to change significantly. This potential normaliza-
tion of emerging, restructured, or combined orders also again demonstrates that,
far from being fixed and clearly demarcated, value orders are in flux. They are con-
sistently constructed, combined, and revised throughout the entire conflict. While
conceptualized as sticky and fluid, different value orders may emerge over time,
either because of compromises that become normalized or because of new prefer-
ences or grievances. Thus, the evolution and combination of different orders may
contribute to the constitution of new ordering principles (see also, e.g., Boltan-
ski and Chiapello 2005). When actors engage in justificatory practices, they not
only provide reasons for their jurisdictional claims in the sense of higher common
principles or constituency communities but also propose specific actions. There-
fore, while the value orders framework outlines general guiding principles, this
part of the theoretical framework tries to illustrate how they are applied in juris-
dictional conflicts. More specifically, I aim to outline how they are combined and
linked to specific calls for action, for example whether an emphasis on the order
of sovereignty is frequently linked to a call for stricter data protection rules. These
links produce more comprehensive visions of data governance that consist of a
justificatory basis and reference community based on the value orders and a cor-
responding conceptualization of the object of governance, as well as specific policy
preferences. These visions are relevant for the conflict resolution process because
they have the potential to redesign the normative structure of the field through
the normalization of trade-offs. The normative structure demarcates the space of
possibles for further contestation or dissent, which may delimit and restrict the
opportunities for disruption.
These visions offer the opportunity to sketch how the field of data governance
relates to its surroundings. As I outlined above, fields are always interlinked with
other fields. While even fully settled fields are rarely completely autonomous (Flig-
stein and McAdam 2011, 8), the field of data governance, due to the pervasiveness
of data in several sectors, has even stronger interlinkages. These interlinkages are
likely to create constraints and opportunities for actor justifications. Established
normative foundations in neighbouring fields or in global governance negotia-
tions are likely to provide reference points. Thus, for the final part of the analysis, I
compare how distinct visions of data governance relate to more general normative
underpinnings and objectives of governance. For contextualization, I draw selec-
tively on the literature on globalization ideologies (de Wilde 2019; Steger 2013;
Zürn and de Wilde 2016). The analysis of political ideologies in IR has so far largely
disregarded data or internet governance in favour of a focus on, for example, bor-
ders and migration (de Wilde et al. 2019), climate change (de Wilde 2019), or the
As outlined above, this book aims to understand how actors resolve jurisdictional
conflicts in data governance. I focus on the responses of specific actors, the stabi-
lization of agreement in conflict resolution processes, and existing interlinkages
with the broader evolution of normative and institutional logics of the field. The
understudied character of jurisdictional conflicts in data governance demands a
thorough understanding of the phenomenon and in-depth analysis. Hence, this
book aims to answer ‘how possible’ questions that focus on the production of
meanings that create and restrict the space of possibilities (Doty 1993, 279–99)
rather than assuming specific causal or unidirectional mechanisms. The under-
lying methodological foundation of the project is interpretivist. Interpretivism
highlights the significance of meaning-making processes. Language is perceived
not only as representing the social world but as actively shaping it (Wittgen-
stein 2001). I additionally draw on methodological insights from pragmatism.
Approaches that are close to or draw on pragmatism offer a reflexive understanding of research that recognizes its character as a social practice
(Friedrichs and Kratochwil 2009, 711). By addressing inherent bias and unpack-
ing fixed concepts, I try to incorporate this reflexivity into my own research. The
reconstructive impetus of the project is similarly guided by pragmatist and prac-
tice theoretical research that highlights the historical situatedness of meaning. The
project starts reasoning at an intermediate level (abduction), which aims ‘to enable
orientation in a relevant field. It consists of mapping a class of phenomena to
increase cognitive understanding and/or practical manipulability’ (Friedrichs and
Kratochwil 2009, 716). A significant part of the empirical research is inductive, but
I consistently draw on existing literature for orientation and aim to provide such
orientation for further studies.
The book analyses the broader trajectory of data governance through the key
controversies and their resolution. The research links long-term institutional and
normative processes with specific instances of conflicts and justificatory practices.
This bifocal approach requires a research design that can capture both dynamics.
While neither field theory nor the value orders framework constitutes a distinct
method, they provide guidance for the three-step research design of the book:
First, I reconstruct the emergence of the field of data governance and the value
orders that are prominent in the field. Second, drawing on this analytical frame-
work, I analyse conflict resolution processes in five cases which represent examples
of jurisdictional conflicts as specific empirical phenomena (Gerring 2004, 342).
Third, I identify how actors draw on value orders across conflicts. I construct a
typology of distinct visions of data governance that can provide orientation for
further (deductive) research.
The analytical focus adopted in this book transcends common binaries, such
as local/global or public/private, to highlight interlinkages between them. When I
distinguish actors on the basis of those lines, I mainly refer to the sociological rele-
vance of such distinctions rather than their conceptual distinctiveness. In addition,
when I speak of the preferences of specific institutions or actors, this is not meant to
reduce the entire political system of the respective states or institutions to a unitary
position.¹³ This is also relevant for companies. While I highlight intra-institutional
dynamics, for example in the EU, I treat multinational companies as holding a unitary position, despite their complex intra-organizational dynamics. I refer to Google, rather
than Google Inc. or the more recent Google LLC, which is part of the holding com-
pany Alphabet, and distinguish only in cases where the distinction is relevant for
the understanding of the conflict, for example regarding Google Spain and Google
Inc. in Chapter 8.
The first step in the project is the analysis of the genesis of the field of data gov-
ernance focused particularly on the construction of a common ‘illusio’ (Bourdieu
1996, 376), and the ‘orders of worth’ (Boltanski and Thévenot 2006) in the field.
The reconstruction of the long-term development of the field illuminates how
principles of vision and division become entrenched and institutionalized over
time (Mérand 2010). I start with a brief historical account of the emergence of the
field of data governance and then reconstruct the value orders. While Boltanski
and Thévenot use canonical texts of Western philosophy in the reconstruction,
they acknowledge that the orders are likely to vary in different contexts (2006,
347). Thus, I rely on a largely inductive approach and embed the identified orders
in political philosophy debates ex post. This also addresses a potential lack of atten-
tion given to the contextual nature of these orders. Using the software MAXQDA
(maxqda.com), I conducted a discourse analytical reading of 25 key documents
(see Table 2.1) with 1,136 individual codings. The approach underlines the con-
stitutive functions of discourse as well as the importance of context (Milliken
1999), outlining how actors ‘are engaged in the politics of knowledge and know-
ing, that is, in meaning/world making’ (Keller 2018, 17). By tracing how actors
make data knowable, it also outlines adequate and inadequate approaches to their
governance.
I triangulated the results of the analysis with information from qualitative
expert interviews.¹⁴ The selected key texts represent crucial moments for the
¹³ In this book, I have recourse to simplifications, like EU and US, when actors are authorized to
speak on behalf of these entities. While, particularly for the EU, I demonstrate that this does not exclude
the possibility of inter-institutional differences, this level of nuance for other parties extends beyond
the scope of this analysis.
¹⁴ The MAXQDA file, including all coded documents and segments, is available digitally for ref-
erence purposes. I chose MAXQDA because it is well established among qualitative researchers and
offers a free reader for the data and coding structure, which enhances transparency.
Notes to Table 2.1:
ᵃ The Cybersecurity Tech Accord (2018) is a declaration of principles by more than sixty global companies.
ᵇ The GNI is one of the most important multistakeholder initiatives in internet governance. For more information, see https://globalnetworkinitiative.org, accessed 26 June 2022.
constitution and further formation of the field of data governance and cover dif-
ferent regions, sectors, and time periods.¹⁵ I expect these foundational documents
to refer to their governance objectives in a general fashion, which facilitates the
reconstruction of normative dimensions. I assume that organizations draw on
understandings of worth eclectically. The document selection was mainly based
on secondary literature (Greenleaf 2011, 2014b; Kuner 2013, 189) and policy doc-
uments (UNCTAD 2016) that list important principles of data governance. Where
no specific guidelines or agreements existed, I included principles on related subjects such as cybersecurity, which often have data governance elements. Europe
and the industrialized West are over-represented. This reflects the position of
policy entrepreneurs and most influential actors (e.g. Suda 2013, 2017) and thus
existing power asymmetries in the field. To avoid over-representation of specific
subject areas, the selected documents address data governance as a broader issue.
Therefore, for example, international Passenger Name Record sharing agreements
were excluded, but more general agreements on cooperation in law enforcement
and counterterrorism, such as the UN Security Council (UNSC) Resolution on
Counterterrorism (UNSC 2017), were included.
I coded largely inductively and developed and adapted the coding scheme (see
Appendix 2) through multiple iterations, clustering codes and coded segments
according to underlying conceptions of worth. On the basis of the codings, I
distinguished each order according to the different characteristics and outlined
their historical emergence and institutionalization. I also investigated parallels
¹⁵ The documents do not represent an equal distribution over time periods but were selected
according to their significance for the formation of the field.
¹⁶ Boltanski and Thévenot (2006) define six orders: market, industrial, civic, domestic, inspired, and
fame, while Hanrieder (2016) identifies the order of survival, the order of fairness, the order of produc-
tion, and the order of spirit. I changed the term survival to security because data governance offers less
immediate links to death and survival but has strong security dimensions. The term fairness reaches
beyond requests to, for example, address bias in algorithmic governance and speaks to questions of
social justice (see Hoffmann 2019 for a more comprehensive discussion).
principles of worth might contrast with the focus on the continuous negotiation
of such orders, I argue that there is significant analytical value in the systematic
clustering of justifications, which I will outline in Section 2.5.2.
In the second step of the research, I apply this coding scheme in the analysis
of five case studies. The analysis aims to identify processes of meaning-making
in conflictual situations which involve different actors that draw on background
knowledge, evaluative schemes, and their moral sense of justice to resolve these
situations. I analyse these conflicts through qualitative interviews, interpretive
document analysis, and a contextual embedding.
Both formally and normatively, the interaction of European institutions, the US,
and private companies has significantly shaped the field of data governance (Far-
rell and Newman 2019). The EU in particular has been described as a norm
entrepreneur (Newman 2008) and as a ‘global regulatory hegemon’ (Bradford
2020, xvii) in data protection, while the US is home to many of the most significant
tech companies. Together, they have entrenched the use of data in security and
counterterrorism practices (de Goede 2008). Both entities can arguably be con-
sidered global authorities that significantly shape the evolution of the field. While
other actors, most significantly China (Erie and Streinz 2022), are important,
access to reliable information is notably more difficult because of language barriers
and data accessibility. The highly intergovernmental and often informal charac-
ter of data governance, outlined in more detail in Section 2.5.4, restricts access
to reliable data. This makes academic research on less liberal countries like China even more difficult, as restrictions on free speech limit access to public justifications. Other countries, such as India (Burman 2020) and Brazil (Doneda and Mendes 2014), have lively debates about data protection but were excluded due to their current lack of formal and public inter- or transnational activities and conflicts (but see Chapter 9).
This book analyses five transatlantic jurisdictional conflicts that have a formal
legal dimension (i.e., a court case or negotiations about a potential conflict of laws)
and involve private companies to varying degrees (Table 2.2). The universe of
jurisdictional conflicts in data governance is small, which also limits the rationale
for case selection. There is another conflict concerning the publication of personal
data on the WHOIS database by the Internet Corporation for Assigned Names
and Numbers (ICANN 2017), which is comparatively well researched due to its
embedding in the broader internet governance context (Kulesza 2018; Mueller and
Chango 2008). Together with this dispute, these cases, to the best of my knowledge,
represent the only instances of jurisdictional conflict that have received signifi-
cant public attention (see Chapter 9 for potential conflicts). I selected these cases
because they bring to the fore underlying substantive or procedural differences
but at the same time show the temporary stabilization of agreement. In addition,
the chosen examples represent important typical features, such as the multifari-
ous character of public–private relationships. These conflicts touch upon the most
important areas of data governance, including commercial, counterterrorism, and
law enforcement data processing as well as mixed forms of processing and regulation. They
cover different time spans and periods, various substantial topics, and several
types of challenges. Showing how these cases relate to each other and are subject to
similar normative patterns offers significant insight into the evolution of the field
at large. Therefore, the insights generated from this book provide a framework for
helping to understand conflicts in other jurisdictions or related areas, or conflicts
that are yet to come.
For the analysis of justification strategies and interrelations in the field, I used
a combination of textual analysis and qualitative interviews. There is limited
direct access to documents produced by primary actors, such as autobiographies.¹⁷
¹⁷ Some examples include Kirby (2011) who describes the negotiation of the 1980 OECD Guidelines
as well as a recent autobiography by Edward Snowden (2019).
¹⁸ As mentioned above, the MAXQDA file, including all coded documents and segments, is available
digitally for reference purposes.
In the third step of the research, I make limited comparisons between the cases and
analyse how actors use justificatory practices across cases. In Chapter 9, I create a
typology of ‘visions’ of data governance that express specific social realities actors
draw on in their justifications.
For the analysis of broader patterns of justification, I examine how concepts
from different orders potentially form groupings that illustrate along what lines
worth is articulated in situations of jurisdictional conflict. Some codes already
encompass principles from different value orders as ‘composite objects’ (Boltan-
ski and Thévenot 2006, 279). For example, ‘cross-border security cooperation’ or
‘protection of liberal values through security measures’ appeals to both the order
of security and the order of globalism. These codes were added to both orders
to indicate their complex normative nature. To describe and analyse normative
ordering processes more systematically, I also investigated code co-occurrences
in the empirical case studies. Code co-occurrence in MAXQDA describes the
simultaneous occurrence of two or more codes for a particular segment and thus
demonstrates to what extent actors simultaneously appeal to distinct orders. The
analysis is based on 6,381 codings from the case studies. Coded
expressions vary in length and may comprise single sentences or entire paragraphs,
depending on what is interpreted as one justificatory expression. As actors
are expected to have general references to all or most orders within a document,
I chose to focus solely on intersections between codes, that is, coded segments
needed to directly overlap, to illustrate what trade-offs are considered legitimate
within one justificatory expression. The varying length of statements and the
qualitative nature of the inquiry point to the limitations of this approach.¹⁹
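The strict intersection criterion described above can be sketched in code. This is an illustrative reconstruction, not MAXQDA's actual implementation; the code names, segment offsets, and the `co_occurrences` helper are invented for the example.

```python
from itertools import combinations
from collections import Counter

def co_occurrences(segments):
    """Count pairwise code co-occurrences among coded segments.

    Each segment is (code, start, end), with start/end as character
    offsets in the same document. Two codes co-occur only if their
    segments directly overlap (strict intersection), not merely
    because both appear somewhere in the document.
    """
    counts = Counter()
    for (code_a, s1, e1), (code_b, s2, e2) in combinations(segments, 2):
        if code_a != code_b and max(s1, s2) < min(e1, e2):  # overlap test
            counts[frozenset((code_a, code_b))] += 1
    return counts

# Hypothetical coded segments from one document
segments = [
    ("security", 0, 120),
    ("globalism", 80, 200),   # overlaps 'security'
    ("fairness", 300, 380),   # overlaps neither
]

pairs = co_occurrences(segments)
print(pairs[frozenset(("security", "globalism"))])  # 1
print(pairs[frozenset(("security", "fairness"))])   # 0
```

Under this criterion, 'security' and 'globalism' co-occur because their segments intersect, while 'fairness' registers no co-occurrence despite appearing in the same document, which captures the distinction drawn above between intersection and mere co-presence.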
The visualization of co-occurrence offers only a superficial insight into potential
linkages between different value orders. It indicates how actors link specific value
orders in their justificatory practices. I combined this analysis with insights from
the qualitative case studies to create a typology of distinct visions of data gover-
nance. Due to the small number of cases, the visions of data governance represent
preliminary ideal types in need of further scrutiny.
2.6 Conclusion
¹⁹ In Appendix 2.2, I provide two additional visualizations for cross-checking. One visualization
excludes superficial references to human rights, and the other is based on a co-occurrence analysis
that measures proximity. Neither changed the results significantly.
3
Value Orders and the Genesis of the Field of Data Governance
From today’s perspective, the relevance of data governance seems rather intu-
itive. In 2018, the Facebook CEO Mark Zuckerberg was famously called to the US
Congress to testify on how a data analytics firm called Cambridge Analytica
purchased the data of tens of millions of users to build software designed to influence voters
(Cadwalladr and Graham-Harrison 2018). The potential role of election interfer-
ence and voter manipulation in the success of the Brexit and the Trump campaigns
in 2016 produced arguably the biggest international outcry on data processing
practices since the Snowden revelations. A vast number of people began to perceive
the potential consequences of data (mis)use. Considering the implications for the
enactment of free democratic elections, requests for regulation, even from compa-
nies, grew louder (Kang 2018; Tynan 2018). Yet data governance dates back much
further. Indeed, nearly forty years before the Cambridge Analytica scandal, the
Organization for Economic Cooperation and Development (OECD) adopted the
Guidelines on the Protection of Privacy and Transborder Flows of Personal Data
(OECD 1980). This also predated the dominance of today’s ubiquitous tech com-
panies. Nonetheless, a group of actors was united by the perception of common
inter- and transnational stakes in data governance. This chapter aims to investigate
how data governance emerged as a matter of concern in the global context long
before it became widely salient and how it has transformed into an increasingly
central element in global governance.
As outlined in Chapter 2, I reconstruct the emergence of data governance as
a Bourdieusian field (Bourdieu 1996, 1998) in which actors organize around the
common idea that their engagement is meaningful (Neveu 2018, 364). I argue that
the field emerged around the idea that data governance is meaningful in the pur-
suit of the common good. While actors in the field recognize this common ‘illusio’
(Bourdieu 1996, 331–6), there are critically distinct notions of the common good.
Others have investigated competing interpretations as different framings (Epstein
et al. 2016). These differences represent deeper normative divisions that produce
and conceptualize data as a governable object in relation to broader societal goals.
This chapter aims to disentangle and categorize these notions through the con-
cept of value orders, which I introduced in Chapter 2 as justificatory grammars
based on higher common principles (Boltanski and Thévenot 2006). Through an
inductive analysis of key data governance agreements and principles that represent
constitutive moments in the field (see Table 2.1), the chapter identifies five orders,
links them to debates in political philosophy, and situates them in the context of
the emerging field of data governance. These orders include, on the one hand,
the order of fairness, the order of production, and the order of security, which
define more substantive goals, and, on the other hand, the order of sovereignty
and the order of globalism, which define more procedural goals. By embedding
this reconstruction of evaluative repertoires in the analysis of the genesis of the
field, I illustrate the orders’ contextual nature, their institutionalization, and the
field’s settling logics.
In Section 3.2, I first provide a brief overview of the historical genesis of the
field. Second, in Sections 3.3–3.7, I reconstruct the five orders. I start with a gen-
eral description of their virtues and characteristics and embed them in debates in
political philosophy. After that, I explore a dystopian scenario for each order,
pointing to justifications that evoke fear or danger; illustrate different ways to prove
worth through evaluative repertoires or valued objects; and finally demonstrate
the specific conceptualization of data and data governance in each order. Third, I
briefly summarize the main dynamics in the field and give a short overview of the
application of the framework.
In this section, I briefly outline the historical emergence of the field of data gover-
nance. As the case studies provide a deeper reconstruction of key conflicts and
events, I only sketch the most important steps in the genesis of an incomplete
global field. This section particularly focuses on the formation of transnational
stakes that originated in the EU’s Data Protection Directive (1995).
As outlined in Chapter 1, the origins of data governance reach back to the
end of the nineteenth century. In 1890, the legal scholars Warren and Brandeis
explicitly warned of the dangers of intrusion through technological innovations
and articulated a legal right to privacy. Their definition of privacy as the ‘right
to be let alone’ (Warren and Brandeis 1890) long shaped the legal and philo-
sophical debate, as did Brandeis’s dissenting opinion in a case concerning the
wiretapping of telephone conversations (Olmstead v. United States 1928). The
experiences of twentieth-century totalitarianism contributed to the institutional-
ization of the right to privacy, which was first enshrined in the 1948 Universal
Declaration of Human Rights and later in the 1976 International Covenant on
Civil and Political Rights. Yet data governance was perceived predominantly as a
matter of domestic regulation rather than a challenge in the international context
(Bessette and Haufler 2001, 74). In view of increasing automated data collec-
tion and information technology after the Second World War, both academics
(such as Flaherty 1989; Sieghart 1976; Westin 1967) and governments began to
address potential implications for privacy more directly.¹ The German state of
Hesse introduced the first data protection law in 1970, and Germany adopted a
data protection law in 1977, reinforced by the seminal judgment of the Constitu-
tional Court that established the right to informational self-determination in 1983.
Other industrialized states followed shortly after, such as Sweden in 1973 or
the US in 1974 (Bennett 1992; Newman 2008, 65, 102). Domestic legislative ini-
tiatives aimed to increase the protection of privacy against both economic and
public security invasions, but there are also indications of economic incentives,
for example in the United Kingdom (UK) (Dowd 2019) or the US (Solove and
Hartzog 2014, 587).
The developments in the domestic context also fostered a political debate about
privacy and data protection on a global level. In 1968, on the twentieth anniversary
of the adoption of the Universal Declaration of Human Rights, the UN was
the first international organization to reflect explicitly on data privacy (Heisen-
berg 2005, 52). In 1979, the first International Conference of Data Protection and
Privacy Commissioners (ICDPPC) marked the emergence of a transnational net-
work of professionals. Finally, in the late 1970s, the OECD and the Council of
Europe (CoE) both established expert groups to draft regulations (Kirby 2011, 8).
Both instruments constituted major steps towards a distinct field beyond domes-
tic boundaries, as data governance came to be recognized as a relevant inter-
and transnational concern. When the CoE Convention 108 entered into force
in 1985, it became the first legally binding international data protection instru-
ment (Cannataci and Mifsud-Bonnici 2005, 6). Yet, due to its European character
and lack of enforcement, it was not significant enough to make a truly global
impact. OECD member states were ‘[d]etermined to advance the free flow of infor-
mation’ (OECD 1980). Therefore, the influential OECD guidelines were mainly
designed to overcome barriers to free data flows and prevent further fragmentation
(Kirby 2011).² Apart from some efforts under the auspices of the UN (e.g. UNGA
1990), global cooperation was limited to these initiatives for the decades that
followed.
Besides international institutions, non-state actors were also increasingly rel-
evant. Microsoft and Apple were founded in the mid-1970s but the rise of
information and communications technologies (ICT) and particularly the com-
mercialization of the internet in the early 1990s catalysed the perception of data
The key impetus for the formation of a multilevel field with common stakes man-
ifested at the intergovernmental level with the implementation of the EU’s 1995
Data Protection Directive. Through domestic law, the EU unilaterally asserted
strict data protection rules not only for the EU market but also for cross-border
data transfers, notably against powerful US interests (Heisenberg 2005). Swire
and Litan consider the 1995 Directive as a constitutive moment that ‘repre-
sents a dramatic increase of the reach and importance of data protection laws’
(1998a, 24; see also UNCTAD 2016, 32). As the EU implicitly claimed control
over data beyond its territorial borders, a multiplicity of rules from different
jurisdictions overlapped, which, in turn, required meaning-making processes
about common concepts, goals, and institutional solutions. Most significantly, this
included negotiations to bridge regulatory differences between the EU and the US
(Farrell 2003).
The 1995 Directive not only marked an increase in scope but also entrenched
the normative character of data governance. While the OECD Privacy Princi-
ples had pragmatically stated that ‘Restrictions on these [data] flows could cause
serious disruption in important sectors of the economy, such as banking and
insurance’ (OECD 1980 Preface), the 1995 Directive established the premise that
‘data-processing systems are designed to serve man’ (Data Protection Directive
1995, para. 2). This significantly raised the stakes in international cooperation
and thus contributed to the formation of a common inter- and transnational field.
By linking data governance to the common good of mankind, the directive speci-
fied the ‘illusio’ of the field. Based on the premise that data processing can ‘serve
man’, the directive suggested that engagement in data governance contributes
to the pursuit of this common good. Due to limited EU competences in other
areas, the 1995 Directive was officially based on internal market considerations
(Lynskey 2015a, 3). Yet privacy and data protection rights became increasingly
entrenched and constitutionalized in the EU, particularly with the inclusion of data protection in the EU Charter of Fundamental Rights.³
After the 1995 Directive had fostered meaning-making processes about com-
mon stakes, institutionalization increased globally. The Asia Pacific Economic
Cooperation (APEC) Privacy framework entered into force in 2005 (APEC 2005;
Greenleaf 2006), the Economic Community of West African States (ECOWAS)
(2010) adopted a Supplementary Act on Personal Data Protection in 2010, and
the African Union (AU) adopted a Convention on Cyber Security and Personal
Data Protection in 2014. In the early 2010s, the OECD and the Council of Europe
updated their original instruments, as did the EU with the adoption of the GDPR
(2016), which came into effect in 2018. In these instruments, data is conceptu-
alized as ‘a valuable asset’ (OECD 2013, ch. 1), and data governance is directly
linked to the pursuit of the common good. For example, the ASEAN framework
recognizes ‘the importance in strengthening personal data protection with a view
to contributing to the promotion and growth of trade and flow of information
within and among ASEAN Member States in the digital economy’ (ASEAN 2016,
1; see also APEC 2005, para. 20).
For a long time, China and the US were significant outliers to this global trend
(Newman 2008). In contrast to efforts to promote content control norms glob-
ally (Flonk 2021), China does not yet strongly engage in debates on international
data protection standards. However, China has recently developed a comprehen-
sive regulatory framework on cybersecurity and data protection (Erie and Streinz
2022). While there is potential for conflict, particularly considering the GDPR
(Zhao and Chen 2019), no major public clashes have emerged. The US still does
not have a comprehensive data protection law but a fragmented regime of sectoral
laws. Data protection is restricted to cases in which the individual has ‘a legit-
imate expectation of privacy’ (Katz v. United States 1967, 360). This is not the
case, for example, if an individual voluntarily turns over private data to a finan-
cial institution (United States v. Miller 1976) or dials a phone number, thereby
transmitting data to a phone company (Smith v. Maryland 1979). The US Pri-
vacy Act of 1974 restricts only government activities. In the US, the use of data
for economic purposes by private companies is often perceived more favourably
³ The full legal effect of the EU Charter of Fundamental Rights only came about with the Lisbon
Treaty in 2009.
than public surveillance practices (Schwartz and Reidenberg 1996). The debate
on a federal data protection law has accelerated due to the adoption of the
California Consumer Privacy Act in 2018, which offers levels of protection similar to the GDPR
(Harding et al. 2019).
Despite these limited efforts in privacy and data protection, the US is a signifi-
cant player in the field of data governance. On the one hand, US-based companies
such as Facebook, Google, or Microsoft hold significant market power and have
shaped the practices of data governance considerably. On the other hand, the US
has contributed to the emergence and institutionalization of data governance for
security purposes. In particular, US unilateral jurisdictional claims in countert-
errorism, often with extraterritorial effects, have shaped the view of data and the
objectives of their governance (Kaunert et al. 2012). Tensions between security
and privacy had been discussed before the internet age (e.g. Klass and Others
v. Federal Republic of Germany 1978; Westin 1967), but digital data gained their
security dimension mainly in the last twenty years (see Saco 1999 for exceptions).
This dimension was increasingly ‘internalized’ (Farrell and Newman 2019, 34)
in view of the increasing use of the internet for criminal and terrorist purposes.
External shocks, particularly 9/11, worked as a catalyst for the institutionaliza-
tion of security measures (Etzioni 2018, 112–13; Kaunert et al. 2012) such as the
systematic gathering of airline passenger and financial data (Council of the EU
2016). Yet the salience of data protection was limited until the revelations of mass
surveillance in 2013 (Kalyanpur and Newman 2019b).
through the Diem project,⁴ made the calls for regulation more urgent. On the one
hand, there is a growing reliance on data in the security sector, as law enforcement
and intelligence agencies use data in the fight against terrorism and crime, as well
as in the context of the ‘data economy’ (EC 2020a). On the other hand, govern-
ments face increasing incentives to exercise (sovereign) control over the internet
(Drezner 2007; Goldsmith and Wu 2006) when dealing with the growing power
of platforms and other private actors. As already outlined, this has contributed
to an increase particularly in domestic data protection legislation. Yet tech compa-
nies exercise unique forms of power. Culpepper and Thelen (2019) argue that tech
companies’ capacity to combine consumer dependence and consumer attachment
constitutes a key source of their power. Nachtwey and Seidl characterize Silicon
Valley elite attitudes as a ‘solutionist ethic’ (2020) which unites the concern for the
common good with a flexible, risk-taking embrace of the capitalist ideal of profit
and thus promises technological solutions to humanity’s most pressing problems.
Therefore, tech companies promote the normative character of their engagement
in the field, for example through Google’s articulated goal to ‘organize the world’s
information and make it universally accessible and useful’ (Google 2020) or Face-
book’s mission ‘to give people the power to build community and bring the world
closer together’ (Facebook 2020).
The infrastructure that governs transnational data flows is complex and inter-
governmental, which creates significant barriers to entry, particularly for NGOs
(NGO representatives, personal communication 2018, 2019) but also for actors
that lack legal expertise. The current intergovernmental character of data gov-
ernance might seem surprising in view of the internet’s inherently transnational
nature. While data governance is part of internet governance, they are empirically
disconnected (Pohle et al. 2016; Radu 2019; Reiberg 2018). While a subject
of discussion, for example at the Internet Governance Forum (Epstein et al. 2014),
data governance is negotiated in intergovernmental rather than multistakeholder
fora that have emerged before or in parallel to central internet governance bodies
(Murray 2007). Indeed, other areas of internet governance have been character-
ized by a significant scepticism about intergovernmentalism and instead promote
multistakeholderism (Hofmann et al. 2016). While, for example, the governance of
the domain name system is characterized by a deliberate emphasis on private and
non-state actors (Mueller 2010), the origins of data governance are firmly tied to
national and subnational actors. This also shows how transnational fields tend to
retain strong linkages to the domestic level (Sapiro 2018). In internet governance,
This section outlines five value orders in data governance; an overview is pro-
vided in Tables 3.1 and 3.2. I identify two dimensions that specify the common
good of data governance: a substantive and a procedural dimension, which, in
contrast to variation on a single dimension, might be compatible beyond the for-
mation of a (temporary) compromise. The substantive orders are the order of
fairness, the order of production, and the order of security. The procedural orders
are the order of sovereignty and the order of globalism. Naturally, this is not an
exhaustive list of existing conceptions of common societal goals but outlines the
most important or dominant orders. As outlined in Chapter 2, I reconstructed
the orders through the inductive analysis of key data governance agreements and
principles that form constitutive moments of the field (see Table 2.1). I then went
back to the literature and linked the orders to philosophical underpinnings. They
provide a theoretical basis for the diversity of claims in justificatory practices. In
Sections 3.3–3.7, I discuss the general principles of each order, their philosophical
underpinnings, and outline how this links to data governance, including institu-
tional frameworks, and outline dystopian scenarios of the order as well as objects
of value.
Table 3.1 Substantive value orders in data governance

Value order: Fairness
Community virtues: Freedom, individual and collective rights
Historical context: Early debates, stronger emergence in 19th century, more explicit with automated data collection in 1960s
Relevant political philosophers: Appiah, Beitz, Rawls
Dystopian scenario: Power asymmetry, repression, surveillance capitalism
Data governance: Data as a human rights and autonomy concern
Valued objects: Human rights conventions, e.g. UNGA Resolution Privacy in the Digital Age

Value order: Production
Community virtues: Innovation, economic rationality
Historical context: Early 1990s, stronger with commercialization of the internet in the 1990s
Relevant political philosophers: Hegel, Rousseau, Smith
Dystopian scenario: Economic downturn, inefficiency
Data governance: Data as an economic resource
Valued objects: Consumer trust, free trade agreements, free flow of data, e.g. OECD Privacy Guidelines

Value order: Security
Community virtues: Vigilance, safety
Historical context: Emergence in 1980s and 1990s, strong salience after 9/11 terrorist attacks
Relevant political philosophers: Locke, Hobbes
Dystopian scenario: Terrorism, crime, death
Data governance: Data as crucial information
Valued objects: Law enforcement cooperation, information sharing, police, e.g. UNSC Resolution 2396
Table 3.2 Procedural value orders in data governance

Value order: Globalism
Community virtues: Global cooperation, multilateralism
Historical context: Emergence with increasingly global character of data flows since 1980s
Relevant political philosophers: Held, Ohmae, Sassen
Dystopian scenario: Fragmentation, international discord
Data governance: Data as inter-/transnational
Valued objects: Inter-/transnational frameworks, IOs, e.g. CoE Convention 108+

Value order: Sovereignty
Community virtues: Territory, collective self-determination
Historical context: Early emergence in 1960s and 1970s with national data protection legislation
Relevant political philosophers: Bell, Nagel, Sandel
Dystopian scenario: Intervention, interference by IOs or states
Data governance: Data as territorial or community-based
Valued objects: Domestic law, hierarchies, e.g. EU 1995 Data Protection Directive
The first order I identify is the order of fairness, which is perhaps most deeply
entrenched in the institutional and scholarly debate on data governance (Clif-
ford and Ausloos 2018). It is characterized by a strong emphasis on the human
rights implications of data governance, particularly regarding power and infor-
mation asymmetries (Rössler 2005). Due to the effects on personal autonomy
(CoE 2018a, 16), this particularly includes privacy (UNGA 2013, 1), but the
right to data protection is also considered to protect informational self-determination
(Rodotà 2009, 80) and human dignity (CoE 2018a, 16). Yet there is no unidirectional
assessment of data governance. Privacy is rarely considered a value in
itself but evaluated in light of the implications for individual liberty and autonomy
(Rössler 2005). In consequence, the order of fairness demands a balance between
individual rights, such as freedom of expression (G7 2016, paras. ii, a, b; G20 2016,
3) and freedom of speech (Volokh 1999), the right to access (US White House
1997), and other goods that potentially benefit from data sharing, such as health
(Pasquale and Ragone 2013). The NGO Article 19’s Principles most explicitly
state that:
This also resonates with feminist critiques concerning the separation of public and
domestic spheres that enable the perpetuation of power asymmetries, for example
through abuse (MacKinnon 1987).⁵
1971), others demand the inclusion of ‘the equity and efficiency of the sub-
stantive opportunities that people can enjoy’ (Sen 2005). Risse defines fairness
as ‘the proportionate satisfaction’ (2012, 276, emphasis in original) of certain
demands. The proponents of the order of fairness are united by the substantive
goal of protecting fairness and the basic rights of the most vulnerable, but they
diverge regarding the conception of the desirable level of governance. Some fol-
low a conception of (Kantian) cosmopolitanism, proposing global principles or
global public reasoning (Beitz 2001; Pogge 1989; Sen 2005), while others (Nagel
2005; Rawls 2001) argue that principles of justice only hold in the context of
the state.
In this order, data are considered in light of the implications for individual rights.
This conceptualization may be based on concerns for privacy, which has been
subject to various debates in political philosophy (e.g. Nagel 2005) and in which
it is often coupled with other values, such as human dignity and personhood
(Rotenberg 2001). Data may be owned in a sense that is different from that of
property assumptions but expresses ‘a sense of constitutive belonging, not of exter-
nal ownership’ (Floridi 2005, 195). Data are considered more broadly in their
relationship to the right to access, freedom of expression, and power asymme-
tries. In more recent academic debate, scholars discuss specific questions of data
justice. They tie inequalities, which often asymmetrically hit vulnerable or socio-
economically disadvantaged groups (Masiero and Das 2019), to the information
society (Britz 2008) or data specifically (Dencik et al. 2016). Similarly, concerns
for redistributive justice are increasingly gaining traction in the debate on
competition (Lynskey 2019). This is consistent with the emphasis on privacy and
data protection as collective rights (Solove 2008) moving beyond the liberal focus
on individual rights.
The conception of community in this order is linked to vulnerable people
(Epstein et al. 2014, 161). In contrast to the order of security, vulnerability is not
perceived as bodily vulnerability but in terms of threats to a community of individ-
uals and collectives in their pursuit of justice, the enactment of human rights, and
the avoidance of repression and power asymmetries. The emphasis on autonomy
assumes an inherent worth of people as individuals with dignity (Appiah 2010, 61)
that forms the basis of this community construction.
There is significant variation in the level of institutionalization of the order
of fairness. Even though there is, for example, a significant global diffusion of
data protection principles from Europe (Bradford 2020; Greenleaf 2012), other
instruments are less precise. For instance, the ASEAN Framework on Personal
Data Protection states: ‘An organisation may collect, use or disclose personal data
about an individual only for purposes that a reasonable person would consider
appropriate in the circumstances’ (ASEAN 2016, para. 6b, emphasis added), which
allows considerable room for interpretation.
The dystopian scenario of the order of fairness sees individual rights bargained
away to satisfy other interests which may diminish individual liberty and auton-
omy. A society based on commercial or public surveillance with inadequate
privacy protection is, therefore, considered ‘oppressive’ (Solove 2008, ch. 5), par-
ticularly in the absence of democratic oversight and control. It creates significant
power asymmetries and injustices (Zuboff 2018), such as voter manipulation in
the Cambridge Analytica scandal. This is, inter alia, problematic due to ‘chilling
effect[s] on the exercise of the right to freedom of expression and the right to
hold and form an opinion by searching and accessing and disseminating informa-
tion online’ (Article 19 2017, sec. 4.1). While there is a significant variation in the
perception of threat (Epstein et al. 2014), the intricate relationship between pub-
lic and private surveillance in particular poses new challenges (Fischer-Lescano
2016; Lyon 1994, 180).
This order emphasizes the value of objects that institutionalize and protect fair-
ness legally and politically, such as regulations or other legally binding human
rights frameworks. For example, the AU explicitly refers to the African Charter
of Human and Peoples’ Rights (AU 2014, 25), while the UN links its goals to the
International Bill of Human Rights (UNGA 1990, Art. 6). All frameworks contain
some reference to safeguards. Greenleaf identifies ten elements, including stan-
dards of collection, data quality, purpose specification, or notice, that form the
basis of most data agreements and domestic legislation (Greenleaf 2012, 73–5). In
addition, value might be assigned to frameworks that highlight the responsibility
of businesses, such as the UN Guiding Principles on Business and Human Rights
(UN 2011). Strong enforcement capacities for independent oversight or monitor-
ing bodies (GDPR 2016) or the establishment of a UN Special Rapporteur (Article
19 2017) indicate the manifestation of the order. The order of fairness is sometimes
associated with civil liberties organizations and moral authority (Avant et al. 2010),
particularly in international institutions or fora (Epstein et al. 2014, 160). Actors
use different resources, including statements from whistle-blowers or leaked docu-
ments (Gros et al. 2017), to prove conformity to this order. Proponents tend to rely
on reports and expertise by data protection authorities, such as the European Data
64 DATA GOVERNANCE
The second order I identify, the order of production, is tied to the promotion of
economic progress and a strong logic of efficiency. Legislation, as well as general
conduct, should be ‘practical’ (APEC 2005, 3), ‘flexible’ (IA 2018, 2), and ‘reason-
able’ (IA 2018, 3). In the academic debate, this approach finds resonance in the
privacy economics literature, which focuses on the economic potentials of data
sharing. Proponents weigh the trade-offs between the protection of privacy and
the (economic) efficiency benefits resulting from the collection and processing of
data (Acquisti et al. 2016). A prominent proponent is Posner (1977, 1981), with
his vision of a ‘legal right to privacy based on economic efficiency’ (1977, 404).
While some scholars reject public regulatory measures due to their ineffectiveness
(Stigler 1980), regulation finds support for addressing market failures (Veljanovski
2010, 18), such as monopolies.
Basic property rights are widely argued to be part of the pursuit of justice and
the common good (Rawls 1971; Rousseau 1998).⁷ However, the normative and
philosophical underpinnings of the order of production are most clearly artic-
ulated in the seminal works of Adam Smith (see also Boltanski and Thévenot
2006). Smith famously argued that behaviour focused on self-love would result
in aggregate social benefits due to the ‘invisible hand’ of market competition (A.
Smith 1999). While proponents of libertarian philosophy have echoed the empha-
sis on the minimalist state (e.g. Nozick 1974), many scholars acknowledge that a
free market creates problems and demand limited restrictions or state involvement
(Hegel 2015; see also Satz 2012). For example, Friedman proposed that ‘govern-
ment is essential both as a forum for determining the “rules of the game” and as
an umpire to interpret and enforce the rules decided on’ (M. Friedman 2009, 15).
Even Smith allowed for protective measures for defence purposes or through tariffs
(A. Smith 1999, bk. 4, chs 4–5).
While most proponents have an explicitly international outlook, economic ben-
efits are often limited to a restricted group, particularly Europe and the Americas
(A. Smith 1999, bk. 5, ch. 3). While principles of the order of production are
generally portrayed as going hand in hand with globalism, scholars have demon-
strated how sovereigntist or even nationalist tendencies have found expression in
neoliberal policies (Harmes 2012).
In the order of production, data embodies the ‘new oil’ (Economist 2017), ‘the
fuel that drives much commercial activity online’ (UNCTAD 2016, iv), ‘a valu-
able asset’ (OECD 2013, ch. 1), or even ‘the lifeblood of the global economy’
(Kirkhope 2016). To foster innovation and economic growth, the free flow of
data is crucial (e.g. OECD 2013, 12). This conceptualization makes data trade-
able: users exchange their data for (free) services. Privacy protection is often a
means to secure a trust relationship with customers (see Fischer-Lescano 2016)
or to avoid legally binding regulation. Data governance aims to harmonize and
improve efficiency (Reicherts 2014). For instance, in the G7 ICT Ministers’ Decla-
ration, privacy protection is an action item to support the economically beneficial
free flow of information through trust and confidence building (G7 2016, para.
ii, a, b; see also G20 2016, 5; ASEAN 2016, 1). However, the commodification of
data also offers opportunities to strengthen individual control. A property-based
understanding of data may increase agency for individuals regarding the current
and future use of their data (Schwartz 2004, 2029–116), for example through fines
or sanctions (Janger 2002, 914).
Efficiency-maximizing individual and collective actors form the order’s refer-
ence community. Consumers and entrepreneurs interact to strengthen economic
progress. Actors may include private companies but also public actors that aim
at improved economic performance, ‘permissionless innovation’ (Thierer 2016),
and harmonization.
The order of production has found expression in numerous institutional frame-
works, particularly in treaties designed to reduce barriers to data flows (Kirby 2011,
8). For instance, the OECD defines privacy and the free flow of information as ‘fun-
damental but competing values’ (OECD 1980; CoE 1981, Preamble; UNGA 1990,
Art. 9), and even the GDPR refers to ‘the free movement of data’ in its title. There
is also a strong emphasis on business interests and public–private initiatives. The
APEC framework specifically proposes a policy that ‘balances information pri-
vacy with business needs and commercial interests’ (AU 2014, 13; APEC 2005, 3).
The Internet Association’s Privacy Principles point out that while their company
members welcome efforts to protect privacy, ‘laws and regulations should avoid
a prescriptive approach to doing so, as such an approach may not be appropriate for all companies' (IA 2018, 4). The order of production emphasizes soft solutions rather than strict, enforceable legislative approaches. Governance
efforts aim to harmonize standards, foster trust in the digital economy, and reduce
barriers to the free flow of information (APEC 2005, foreword). Trust is ‘funda-
mental to their [IA companies] relationship with individuals’ (IA 2018, 2), as is
consent (ECOWAS 2010, 10).
The dystopian scenario in the order of production expresses concerns about the
inefficiency of barriers to trade and data sharing, often associated with strong gov-
ernmental interference (GNI 2017, 4), which may even be considered a pretext
for economic protectionism (Obama cited in Farrell 2015; Lancieri 2018). Restric-
tive effects on innovation are perceived as particularly problematic (e.g. Kottasová
2018). In view of legal uncertainty, which poses ‘major obstacles to the develop-
ment of electronic commerce’ (AU 2014, 1), and inequality of competition, global
players have more recently become proponents of common standards (Hern
2019). As lack of trust constitutes a potential interference with the digital economy
(EC 2013e), private companies in particular aim to distance themselves from, for
example, ‘[m]alicious actors, with motives ranging from criminal to geopolitical,
[that] have inflicted economic harm, put human lives at risk, and undermined the
trust that is essential to an open, free, and secure internet’ (Cybersecurity Tech
Accord 2018).
In the order of production, most actors value soft standards to avoid overregula-
tion and the undermining of trust. Under the governmental ‘shadow of hierarchy’
(Héritier and Lehmkuhl 2008, 5), self-regulatory measures, especially in the US
(Newman and Bach 2004), aim to ensure the free flow of data. References to agreements, codes of conduct, and terms of service therefore demonstrate conformity with this order (OECD 2013, Art. 19; US Department of Commerce 2000a, 2016). The 'legitimate need' (IA 2018, 3) of companies is strongly
emphasized but rarely defined. In addition, membership of private or multistake-
holder initiatives such as the GNI Principles, the IA Privacy Principles, or the
Internet and Jurisdiction Policy Network (GNI 2017, 5; IA 2018) provides evi-
dence of compliant behaviour. Transparency reports issued by tech companies
(see Parsons 2017) have a similar function. Economic impact assessments (e.g.
US Chamber of Commerce 2013) or statistics highlight the immense value of
the digital economy. They emphasize the ‘unprecedented personal, social, pro-
fessional, educational, and financial benefits’ (IA 2018, 2; see also APEC 2005,
2) stemming from the sector’s contribution to GDP and the creation of jobs
(IA 2018, 2).
VALUE ORDERS AND THE GENESIS OF THE FIELD OF DATA GOVERNANCE 67
Third, the order of security is the most recent substantive addition to concep-
tions of worth in data governance. While tensions between privacy and security
have been discussed extensively in the literature (e.g. Westin 1967), the specifi-
cation of data governance as a security concern has emerged only in the last two
decades. The order underlines safety and the prevention of physical harm, particularly for vulnerable actors, such as minors or law-abiding citizens. It unites a variety of potential harms, such as the threat of terrorism, serious crime, or cyberattacks. Proponents juxtapose an inconvenient but harmless invasion of privacy
with an existential threat to the security of humans or contrast security as a com-
mon good with privacy as a liberal individual right (Etzioni 2018, 124; see Solove
2007). While not all proponents use this zero-sum logic, a general sense of urgency
is attached to justifications that rely on the principles of the order of security.
This dynamic is illustrated by the literature on ‘securitization’ (Buzan et al. 1998),
which refers to the active transformation of an issue through security-embedded
speech acts.
The appeal to the order of security already invokes an implicit assumption of inse-
curity (Dillon 2002, 120–2). Any reference to the order of security is, therefore,
closely tied to the dystopian scenario of terrorism, crime, and death, evoking an
existential threat to a community of humans. Particularly vulnerable groups such
as children and other innocent actors are frequently mentioned to emphasize the
importance of the claim. Extreme measures are justified by the perception of an
existential threat (Buzan et al. 1998, 25). These measures may include the tempo-
rary suspension of rules, such as sovereignty in the case of intervention (Buzan
et al. 1998, 158), but also the creation of new rules, such as the resort to war
(de Goede 2008). Thus, references to a security threat can enlarge the space of
possibles for action.
In this order, proponents ascribe value to vigilance (AU 2014, 25) in the fight
against impunity. There are also a significant number of references to informal
or formal cooperation agreements, for example in law enforcement or in the fight
against terrorism (OAS 2004, Appendix A), joint databases, and watch lists of
potential terrorists (Ryngaert and van Eijk 2019; UNSC 2017, para. 13), as well
as more active measures such as the promotion of ‘counter-terrorist narratives’
(UNSC 2017, 4). While there is a general recognition that interference with pri-
vate life needs to be constrained (CoE 2002, v), this interference is conceptualized
as a necessity in the protection of liberal values. For example, in 2014, President
Obama stated that ‘Throughout American history, intelligence has helped secure
our country and our freedoms’ (Obama 2014). Proponents may use statistics
and evidence for successfully prevented terrorist attacks or major crime, includ-
ing review reports (EU and US 2016, Art. 23), to prove worth in this order.
The demonstration of an existential threat may rely on speech acts or shocking
images (Williams 2003). By referring to pre-emptive security, proponents empha-
size secrecy (de Goede 2018b; de Goede and Wesseling 2017) and the restriction
of transparency as a way of proving worth rather than a deficiency.
In addition to the three substantive orders, I also identify two orders that relate to a
more procedural dimension of the common good, fourth, the order of sovereignty,
and fifth, the order of globalism. Sovereignty is one of the foundational principles,
even a ‘Grundnorm’ (Reus-Smit 2001), of the international order but at the same
time heavily contested. The order of sovereignty is centred on the interrelations
between community members and between communities. In consequence, the
order underlines the importance of community decision-making and legitima-
tion processes.⁸ Any obligations or restrictions stemming from institutions or
entities outside the community seem problematic, as they interfere with domes-
tic authority (Hutter et al. 2016). This may also include considerable scepticism
about the exercise of power by private or multinational companies, which states
often aim to resist (Kalyanpur and Newman 2019a). Differences between norms
and rules in the international context are perceived as resulting from a necessary
and normatively desirable differentiation of political objectives (Reidenberg 2000;
FTC employee, pers. comm. 2019; former FTC employee, pers. comm. 2019). For
example, scholars and policymakers emphasize how the experience of totalitarianism may have entrenched a perception of the worth of data in European societies (Kirby 2011, 8).
⁸ The book uses the term sovereigntism rather than statism, despite conceptual similarities (Forst
2001; Zürn and de Wilde 2016). In the context of international negotiations in data governance, the
EU acts as a sovereign community in the sense that the authority of the EU institutions, particularly
the EC, is well established.
is compatible with the order of fairness (Rawls 1971), production (Walzer 2008),
and security (Hobbes 2016).
By defining data governance as an issue best dealt with at the level of sovereign
communities, proponents of the order of sovereignty conceptualize data either as
a primarily territorial or domestic issue or as a potential threat to the exercise of
territorial sovereignty (Ruggie 1993). Proponents are united by the prioritization
of the interests and values of the reference community. While outside interfer-
ence and extraterritoriality challenge the principle of external sovereignty, the full
enactment of community interests may have ‘spillover’ effects (Kobrin 2004, 111)
into other jurisdictions, which results in an imposition of domestic norms on other
communities (see Chapter 7).
Sovereign states and their citizens constitute the order’s reference community.
Building on the idea of ‘insiders’ and ‘outsiders’ (Rawls 2001), most frameworks
offer significantly less or no protection for outsiders, mostly non-citizens (AU
2014; e.g. APEC 2005; OECD 1980).⁹ The absence of international fora shows the
entrenchment of this order in data governance, particularly compared with other
areas such as intellectual property or internet protocols (DeNardis 2013).
Beside the existence of numerous domestic frameworks, many international
frameworks explicitly highlight community values or norms, such as the AU,
which argues for ‘an appropriate normative framework consistent with the African
legal, cultural, economic and social environment’ (AU 2014, 2). APEC similarly
‘accords due recognition to cultural and other diversities that exist within mem-
ber economies’ (APEC 2005, paras. 6, preamble). While the diffusion of common
norms to other communities might be considered desirable (EC 2010a), the pro-
tection of sovereign norms constitutes the highest priority. The responsibility for
implementing decisions resides with governments and the national legislature
(APEC 2005, 32–3; CoE 1981, Art. 4; ECOWAS 2010, 7), but international agree-
ments may be valued if they emphasize the competences and responsibilities of
states, for example, when public instruments contain specific exceptions for the
provision of public goods, such as national security or monetary interests (CoE
1981, Art. 9), the ‘ordre public’ as well as the ‘public interest’ (APEC 2005, 8;
ASEAN 2016, Art. 4; ECOWAS 2010, Art. 4), or ‘public health or morality’ (UNGA
1990, Art. 6).
⁹ The Netherlands is one of the few countries where restrictions on intelligence gathering apply to
non-citizens. However, a recent judgment of the German Federal Constitutional Court also highlighted
that intelligence service data collection is bound to German constitutional law, even when this occurs
outside German territory and exclusively on non-German citizens (1 BvR 2835/17 -, Rn. 1-332 2020).
The fifth order I identify is the order of globalism. In view of the increasingly
transnational character of contemporary problems, the call for globalism through
the delegation of authority to international institutions has found considerable
resonance in recent decades, which is in line with the proliferation of regional
for businesses to collect and use information early demonstrated the inter- and
transnational dimensions of data transfers (Bessette and Haufler 2001, 74), the
specific conception of the global character of data governance emerged in view of
the potential dangers of fragmentation. The unilateral and extraterritorial impetus
of the EU’s 1995 Data Protection Directive provided salience to this order, because
it heightened the stakes of control over data. The possibility of fragmentation (US
Department of Commerce 2000a, 1) and diverging standards emerged as serious
threats in view of multiple and overlapping jurisdictional claims.
Proponents of the order of globalism consider the relevant community as being
located at the global level. However, the extent to which this community is based
on a global citizenry is contested, as is the degree to which this community struc-
ture implies a duty for the provision of, for example, justice (Buchanan 2007, 83;
Pogge 2008; Zürn et al. 2007). Nevertheless, a connection between institutions,
individuals, and states is based on their common membership of the global order
(Risse 2012, 8), which also evokes ideas of the existence of an international society
in which actors ‘conceive themselves to be bound by a common set of rules in their
relations with one another’ (Bull 2002, 13).
There are institutional expressions of the order of globalism, including calls for a
global framework. The Montreux Declaration by the International Conference of Data Protection and Privacy Commissioners emphasizes the 'universal character' (ICDPPC
2005) of data protection that requires a global framework. In 2009, the NGO-based
Madrid Resolution aimed to ‘define a set of principles and rights guaranteeing
the effective and internationally uniform protection of privacy with regard to the
processing of personal data’ (Madrid Privacy Declaration 2009). Yet while there
are common data governance elements (Greenleaf 2012, 73–5), there is no global
framework for data governance, and its likelihood beyond a superficial agree-
ment is limited. Beyond these more cosmopolitan expressions for the protection
of human rights, increasing cooperation also constitutes a stated goal in economic
cooperation and the area of counterterrorism and law enforcement. For instance,
the EU and US (2016) Umbrella Agreement was specifically designed to set out the terms of transatlantic cooperation in law enforcement. The OAS emphasizes that
‘threats to our citizens, economies, and essential services, such as electricity net-
works, airports, or water supplies, cannot be addressed by a single government or
combated using a solitary discipline or practice’ (OAS 2004, Appendix A), which
suggests a more efficiency-based, instrumental perspective.
Art. 6; OECD 1980) and the ‘creation of unjustified obstacles to the development
of economic and social relations’ (OECD 2013, 12; GDPR 2016, para. 6). Conflicts
of laws are highlighted as a significant problem by both policymakers and pri-
vate companies (Apple et al. 2018), because they require adherence to conflicting
standards. The prioritization of unilateral decision-making processes over coordi-
nation and interoperability is perceived as morally deficient (e.g. Newman 2008).
The OECD framework similarly emphasizes the dangers arising from unilateral
legislation and states that the problem ‘cannot be solved exclusively at the national
level’ (OECD 2013, 7).
3.8 Conclusion
The historical struggles over the right conduct of data governance have con-
tributed to the formation of a field that is shaped by institutional legacies and
normative divisions. This chapter has illustrated the emerging field of data gov-
ernance through the evolution of endogenous logics that are centred on the
assumption that data governance is meaningful in the pursuit of the common
good. The intergovernmental roots of the field, most notably its genesis around
the extraterritorial impact of the EU’s 1995 Data Protection Directive, have shaped
In July 2020, the Court of Justice of the European Union (CJEU) made interna-
tional headlines when it invalidated the so-called Privacy Shield, an agreement
for commercial transatlantic data transfers. The CJEU argued that data transfers
under the agreement did not satisfy the requirements of European fundamental
rights protection (Data Protection Commissioner 2020, para. 185). Observers were
concerned about the negative consequences for the $7.1 trillion transatlantic eco-
nomic relationship but not entirely surprised by the Court’s reasoning: in 2015,
the CJEU had similarly caused diplomatic uproar by invalidating Safe Harbour,
the predecessor of Privacy Shield. Both instances followed complaints by Maximilian Schrems, a private EU citizen, regarding the lack of data protection in light of the revelations of mass surveillance by intelligence agencies. Shortly after the
EU’s General Data Protection Regulation (GDPR) had sparked controversy, the
high-stakes transatlantic relationship was again in disarray due to data politics.
What happened?
The roots of this conflict reach back much further and started with the EU’s
first adoption of comprehensive data protection legislation in 1995. Through
an analysis of global commercial data flows as the earliest site of transatlantic
data governance conflicts, this chapter investigates a mismatch between the
strongly normative jurisdictional claims and their largely superficial resolution.
The chapter adds to literature that has discussed the conflicts between the EU and
the US in the context of the adoption of the 1995 Directive (Newman 2008) and
the emergence of Safe Harbour (Farrell 2003; Farrell and Newman 2019, ch. 5;
Kobrin 2004; Long and Quek 2002) as instances of regulatory divide and con-
vergence. In addition, a broader strand of literature has focused on the societal
and political implications of the NSA scandal (Bauman et al. 2014; Gros et al.
2017; Lyon 2015) and described the multifaceted reactions of the EU regard-
ing commercial data protection and surveillance (Schneider 2017; Schünemann
and Windwehr 2020). In contrast, the invalidation of the Safe Harbour agree-
ment has found only limited attention outside the legal literature (Kuner 2017;
Ni Loideain 2016; Weiss and Archick 2016; for an exception, see Farrell and
Newman 2018, 2019, ch. 5). Due to its recent nature, even fewer scholars have yet investigated the invalidation of Privacy Shield (for exceptions, see Churches
and Zalnieriute 2020; Fahey and Terpan 2021). My analysis will, therefore, focus
transfers. The US Federal Trade Commission (FTC) and the Department of Trans-
portation (DOT) were tasked with monitoring compliance (US Department of
Commerce 2000b). In sum, the agreement constituted an institutional solution to
bridge normative differences but had the potential to foster convergence in the
long term.
Soon after its implementation, however, significant doubts concerning the effec-
tiveness of the framework emerged (Dhont and Asinari 2004; EC 2002; Kobrin
2004, 122). A report conducted for the European Commission openly questioned
the FTC’s interest in enforcing and implementing Safe Harbour (Dhont and Asi-
nari 2004), and a 2008 study found that of 1,597 organizations in the Safe Harbour
list, 206 were false entries (Connolly 2008), which rose to 427 in 2013 (Connolly
2013). In response, FTC staff pointed out that ‘While the Framework contemplated
that EU data protection and other authorities would provide us with such referrals,
we received none for the first ten years of the program, and only a few over the past
three years’ (FTC 2013, 3). In view of ten total enforcement cases between 2000
and 2013, the promoted ‘vigilant Safe Harbor enforcement’ (FTC 2013, 3) seems
somewhat overstated. Yet the fact that the European Commission never articulated
any profound challenges to the framework highlights that, as one FTC employee
put it, Safe Harbour ‘was just not a priority for most people’ (FTC official, pers.
comm. 2019).
In summary, the Safe Harbour framework represented the solution to a juris-
dictional conflict that emerged due to regulatory and normative divisions between
market-focused US and rights-based EU regulation. While the framework facili-
tated data exchanges and contributed to the entrenchment of economic principles
in line with the order of production, the rights-based approach in line with the
order of fairness was undermined by a lack of enforcement.
In the early 2000s, the expansion of US data access for security purposes con-
tributed to major jurisdictional conflicts in transatlantic data governance (see
Chapters 5–6). In contrast, the Safe Harbour framework was largely unaffected
by these conflicts. In 2012, the European Commissioner for Justice, Fundamen-
tal Rights, and Citizenship, Viviane Reding, publicly and unambiguously stated
that ‘Safe Harbour will stay’ (Reding, cited in Kerry 2012). Not even one year
later, the Snowden revelations challenged this position. The Snowden revela-
tions can arguably be seen as a key disruptive event, inter alia, concerning
the evolution of data protection standards in the EU (Laurer and Seidl 2021;
Schünemann and Windwehr 2020). Existing institutional structures between the
US and the EU were re-evaluated, such as the agreement on the sharing of finan-
cial data (see Chapter 6). Yet the overturn of the Safe Harbour agreement by
SAFE HARBOUR AND ITS DISCONTENTS 81
the CJEU in 2015 constitutes the most direct and prominent effect. To illustrate
the re-emergence of jurisdictional conflict between the EU and the US in light
of these revelations, I briefly outline the responses towards the Snowden rev-
elations in the field before zooming in on the challenges to the Safe Harbour
framework.
The EU side asked for further specification of what is covered under ‘foreign intel-
ligence information’, within the meaning of FISA 50, U.S.C. §1801(e), such as
references to legal authorities or internal guidelines substantiating the scope of
foreign intelligence information and any limitations on its interpretation, but the
US explained that they could not provide this as to do so would reveal specific
operational aspects of intelligence collection programmes.¹
(EU–US Working Group 2013, 3)
The report juxtaposes the overt exercise of power with the more consistent, rule-
following approach of the EU in its claim to jurisdiction based on sovereignty.
The EU stressed the need for legal references and internal guidelines as eval-
uative objects in line with the order of sovereignty. The European Parliament
similarly tried to assert the applicability of EU law to ensure that ‘the legal force of
the EU Treaties is not undermined by a dismissive acceptance of extraterritorial
effects of third countries’ standards or actions’ (EP 2014a, para. K). Actors in the
EU attempted to prove their morally superior status by referencing existing legal
standards and problematizing the absence of such standards in the US. The EU
used the absence of such objects as evidence that the US failed to prove worth
within this order. In his statement before the European Parliament, Snowden
problematized the same ambiguity in wording associated with the order of secu-
rity, highlighting the lack of bindingness of US commitments (Snowden 2014).
The European Parliament initiated an extensive investigation of the Snowden rev-
elations, specifically through a series of hearings by the European Parliament’s
Committee on Civil Liberties, Justice, and Home Affairs (LIBE Committee) and
reports. According to Gros et al. (2017), these hearings transformed the files disclosed by Snowden, as well as related reports, into compelling evidence. Some experts tried to
question the validity of the allegations and shift the debate away from more general
societal or political questions (Gros et al. 2017, 79). Yet, more importantly, a sig-
nificant number of invited representatives from the US and intelligence services
of member states declined the invitation to participate in these public hearings
(LIBE 2014a, 125–6), deciding not to follow the imperative of justification. In the absence of explicit justifications outlining legitimate control over data, the criteria of evaluation were shaped by the LIBE committee and by those participants who highlighted the dystopian scenarios of mass surveillance in reference to the order of fairness (Gros et al. 2017). The report 'strongly rejects the notion that these
¹ According to the report, '[f]oreign intelligence information' includes not only specific categories of
information such as international terrorism but also ‘information relating to the conduct of the foreign
affairs of the US’, which may also include intelligence on government agencies (EU–US Working Group
2013, 3–4).
issues are purely a matter of national security and therefore the sole competence
of Member States’ (LIBE 2014b, para. 16). This not only denied the dominance of
the order of security but also shifted the reference community away from sovereign
security actors.
The European Parliament also explicitly highlighted the dystopian scenario
of increasing interlinkages between public and private surveillance and the eco-
nomic incentives of the process. For example, the European Parliament ‘[d]eplores
the fact that many mass and large-scale intelligence programmes seem to be
also driven by the economic interests of the companies that develop and run
those programmes’ (EP 2015, 39). Juxtaposed with a reference community of
rights-bearing individuals, this problematizes the perpetuation of potentially illegal
practices removed from parliamentary scrutiny. Nonetheless, the Snowden revela-
tions created significant tensions not only between public actors but also between
public and private actors. As the revelations increasingly disclosed the interlink-
ages between public and private surveillance, many companies tried to distance
themselves from public surveillance. For instance, in 2015, major companies such
as Apple, Google, Facebook, and Microsoft requested reforms in government
surveillance (Reform Government Surveillance 2015).
In turn, the continued denial demonstrated that US actors rejected the applicability of the evaluative criteria of the orders of sovereignty and fairness and relied on
ambiguous references to the higher common principle of security. The US publicly justified the revealed practices only in very general terms, emphasizing, for example, the
need for balance. For instance, President Obama argued, ‘But I think it’s impor-
tant to recognize that you can’t have 100% security and also then have 100%
privacy and zero inconvenience. We’re going to have to make some choices as a
society’ (Obama 2013). This statement makes a connection between security and
the need for sacrifices, while framing privacy as a mere convenience. Sacrifices made to protect the greater good of society through security, rather than binding or transparent standards, therefore express worthiness in the order of security. While the
US government had been slow to address the allegations, in January 2014, Pres-
ident Obama offered justifications in the form of a Presidential Policy Directive
(PPD-28) (US White House 2014) and a more extensive speech that announced
reform measures and provided a general sense of direction for US intelligence in
the future. The policy directive lays out the framework and principles of signals
intelligence by the US and specifies limited protections for non-US citizens, which
represents a significant and ‘unprecedented’ (Farrell and Newman 2019, 144) step.
Despite a general willingness to engage with European data protection princi-
ples (Suda 2017), both the speech and the directive demonstrated the intention
to maintain surveillance practices:
The United States must preserve and continue to develop a robust and techno-
logically advanced signals intelligence capability to protect our security and that
of our partners and allies. Our signals intelligence capabilities must also be agile
enough to enable us to focus on fleeting opportunities or emerging crises and to
address not only the issues of today, but also the issues of tomorrow, which we
may not be able to foresee.
(US White House 2014)
In comparison with the broader reactions to the Snowden revelations in the EU,
the discourse on the Safe Harbour agreement seemed initially modest in its cri-
tique, particularly from the European Commission. Even Commissioner Reding,
who articulated strong criticism of bulk surveillance practices, declined to partic-
ipate in the LIBE hearing that focused on Safe Harbour. This prompted the pre-
siding LIBE Chair, Juan Fernando López Aguilar, to emphasize ‘disappointment
that the Commission could not attend this hearing’ (LIBE 2013, pts. 19:12–19:13).
² For a more detailed discussion of the justification of exceptionalism or emergency, see Huysmans
(2008); for a more conceptual discussion, see Amoore (2013) or Kreuder-Sonnen (2019) for its effects
on global politics.
SAFE HARBOUR AND ITS DISCONTENTS 85
Right after the disclosures, the European Commission decided to conduct a Safe
Harbour framework review. Notably, this was only the third review since the
framework’s adoption in 2000.
The review focused primarily on the economic implications for Safe Harbour,
and only the last of four points referred to ‘the information recently released on
US surveillance programmes’ (EC 2013d, 3). It highlighted the framework’s eco-
nomic relevance, thus establishing the order of production as the main evaluative
principle. The European Commission specified Safe Harbour as ‘one of the con-
duits through which access is given to US intelligence authorities’ (EC 2013d,
16). While avoiding explicit references to mass surveillance, it argued for limit-
ing ‘the national security exception foreseen by the Safe Harbour Decision’ (EC
2013d, 19) to necessary and proportionate circumstances. While the Commis-
sion acknowledged that Safe Harbour facilitated access to commercial data, it did
not explicitly problematize the actions of intelligence agencies. The main point
of contention seemed to be broken trust. Throughout the report, the Commis-
sion underlined broken trust as an impediment to future cooperation and barely
referred to potential human rights violations (Rettman 2013). The report even
reconceptualized human rights violations according to criteria consistent with the
order of production:
If citizens are concerned about the large-scale processing of their personal data
by private companies or by the surveillance of their data by intelligence agencies
when using Internet services, this may affect their trust in the digital economy,
with potential negative consequences on growth.
(EC 2013a, 3)
This element was also emphasized by other former officials and think tanks (e.g.
Kerry 2014; Singer and Wallace 2014). The idea of broken trust further estab-
lished a dystopian scenario in the order of globalism, particularly concerning the
‘credibility’ (e.g. EC 2013d, 7–11) of cooperation. The Commission rejected the
idea of a new framework. Instead, it envisioned ‘strengthening’ the Safe Harbour
framework, emphasizing how a potential institutional change could be ‘adversely
affecting the interests of member companies’ (EC 2013a, 7). Reports highlighted
the risk of fragmentation (EC 2013d, 5) in light of diverging enforcement practices
by data protection authorities. While Věra Jourová, the then European Commis-
sioner for Justice, emphasized the protection of data as the first priority, she called
the digital economy ‘the backbone of our economy’ (EC 2015a), establishing its
necessity. The European Commission seemed confident that a reform of Safe
Harbour was the right step and emphasized that ‘it is the competence of the Com-
mission’ (EC 2013a, 4) to adapt any decision on the adequacy of data protection,
thereby establishing its jurisdictional claim.
In contrast, the European Parliament, drawing on principles of the order of
fairness, called for an immediate suspension of the Commission’s 2000 adequacy
decision (EP 2014a, 40). Members of the European Parliament (MEPs) questioned
the legitimacy of the jurisdictional claims by the Commission and the US and
pointed to the persistent lack of oversight (EP 2014a, AN). In its written
observations to the subsequent court case, the Parliament asked ‘who will guard
the guardians?’ (EP 2014b, para. 80), emphasizing the need for independent
supervisory authorities (EP 2014b, paras. 74–80) as proof of worth within the order of
fairness.
The Commission’s reluctance to address the significant shortcomings of the
framework may have different reasons, including a status quo bias that favours
legal consistency (Hartlapp et al. 2014, 302) but also the potential for a significant
economic fallout, given the dependency of European businesses on American-run
digital services. The strong focus on principles of the order of production is also
in line with assumptions in the literature that EU policymaking favours deregula-
tory and market-friendly policies (e.g. Schmidt and Thatcher 2014). Even though
recent approaches have argued that the Commission’s regulatory ambition pits it
against business interests (Dür et al. 2019), the statements by the European Com-
mission seem to correspond to the prioritization of economic progress. It is likely
that there was additional pressure from member states, such as Germany, that
supported the reform of Safe Harbour. This pressure notably extended beyond
policy decisions and more generally concerned the position of the Commission
in the field: in a leaked document, the Bundesministerium des Innern und für
Heimat (BMI, the German Ministry of the Interior) criticized the Commission’s
decision to comment on any action in relation to national intelligence agencies.
The BMI instead emphasized that any reports should be published in the name
of the member states (BMI 2013, 14 [71]). While this illustrates to what extent
states aim to shield their ‘core state powers’ (Genschel and Jachtenfuchs 2018),
the emphasis on the sovereign rights of member states seemed to be largely instru-
mental. Emphasizing member state competences not only limited the evaluation
of US intelligence but also insulated the surveillance activities of national intel-
ligence agencies from scrutiny by EU institutions (Bigo et al. 2013). Intelligence
activities are to a significant extent based on collaboration and security-based data
sharing between domestic intelligence services. This cooperation includes the Five
Eyes network but also involves the French, German, or Swedish intelligence ser-
vices (Bauman et al. 2014; Fischer-Lescano 2016). This indicates that justifications
referring to the order of sovereignty were mainly strategically deployed to retain
sufficient room for manoeuvre.
The general reluctance to overthrow the Safe Harbour framework shaped initial
talks about reform. The European Parliament tried to challenge the existing system
but did not find sufficient support. This makes the invalidation of the framework
in 2015 even more surprising. An Austrian citizen was able to catalyse a process
that translated a justificatory challenge into institutional disruption. He articulated
a challenge to the implicit jurisdictional claim of the US that attempted not only to
undermine its legitimacy but also to change practices in the field, thereby express-
ing an alternative vision of the field. How exactly did this unfold? Shortly after the
Snowden revelations, in June 2013, Maximilian Schrems, at the time a law student
in Austria, filed a complaint with the Irish Data Protection Commission (DPC).
He specified that the PRISM programme granted ‘mass access’ (Schrems 2013,
7) to commercial data, including data held by the US private company Facebook,
where he was registered as a user. Through its membership of the Safe Harbour
self-certification scheme, Facebook had guaranteed to maintain an adequate level
of data protection. Schrems requested an investigation regarding the existing level
of protection, asking the data protection authority to stop data transfers if the Safe
Harbour standards had been undermined.
Here, it is important to note the central position of Ireland in the European
digital space. Ireland hosts the European headquarters of several major tech
companies, including Google, Facebook, and Microsoft. The corporate-friendly
approach to regulation in Ireland has been criticized as relatively ‘cosy’ (Cadwal-
ladr and Campbell 2019) due to low taxes and lax enforcement. In contrast to the
vast material and ideational resources of tech companies, the Irish DPC occupies
an at best marginal position in the field. Until parts of its operations moved to
Dublin in 2015, the DPC had gained prominence mainly due to its lack of fund-
ing and the fact that it shared space with a local supermarket outside Portarlington,
a small town in the Irish countryside (Scally and Bittner 2013).
Whether for a lack of resources or political will, the Irish DPC dismissed the
complaint and argued that it had an ‘obligation to accept “adequacy” decisions’
(Irish DPC 2013, 1) by the Commission. The justification referred to Article
25(6) of the 1995 Data Protection Directive (Data Protection Directive 1995),
which authorizes only the European Commission to issue such adequacy deci-
sions (Schrems 2015). The DPC decided not to investigate the complaint, which
started a heated exchange of letters and eventually led to a court case. The DPC
denied any formal obligation to investigate, while Schrems contended that the
refusal to investigate could legally only be justified if the complaint was consid-
ered ‘frivolous and vexatious’ (Irish DPC 2013).³ In correspondence with the High
Court, the DPC pointed out that this included any claims unsubstantiated by law
(Schrems 2014, para. viii).
In contrast to the DPC, the Irish High Court strongly argued against the
legitimacy and legality of US surveillance practices. The Court took the factual cor-
rectness of the surveillance measures largely as given, arguing that ‘denials from
official sources, such as they have been, were feeble and largely formulaic, often
couched in carefully crafted and suitably ambiguous language designed to avoid
giving diplomatic offence’ (Schrems 2014, para. iii). The Court also emphasized the
necessity of considering changes since the original adequacy decision, including
a new security environment but also ‘disclosures regarding mass and undiffer-
entiated surveillance of personal data by the US security authorities’ (Schrems
2014, para. xii) and the elevated status of data protection in the EU as a fun-
damental right since the 2009 Lisbon Treaty. Due to the potential implications
for the EU, the Irish High Court decided to refer the case to the CJEU. This
case was exceptional to the extent that, on the basis of the Treaty on the Func-
tioning of the European Union (TFEU), the CJEU does not have jurisdiction
over cases that involve the internal security of member states or surveillance con-
ducted by national authorities for these purposes (2016, Art. 4, para. 2). However,
due to significant reliance on cooperation with private service providers, data
collection and transfer practices could be considered as commercial. This made
them subject to EU law and thus the jurisdiction of the CJEU. This ‘issue link-
age’ (Farrell and Newman 2018) provided a possibility for Schrems to do what
the institutions had failed to achieve: contest the Safe Harbour framework as the
basis for the collection of data for both surveillance and commercial purposes
(Schrems 2013).
The referral to the CJEU marked a shift from the political to the legal
arena. The extent to which actors recognized this jurisdictional challenge as a
serious threat, however, varied significantly. In contrast to the more recent pro-
ceedings against Privacy Shield, the US authorities did not engage in public
justifications in the court cases. While Safe Harbour permitted derogations for rea-
sons of national security, even statements outside the courtroom failed to justify
surveillance in reference to such derogations. The US Presidential Policy Directive
generally emphasized the relevance of commercial data for surveillance purposes,
stating: ‘The evolution of technology has created a world where communications
important to our national security and the communications all of us make as
part of our daily lives are transmitted through the same channels’ (US White
House 2014). US actors simply did not address the Safe Harbour framework, thus
providing no justifications for its continuation.
EU member states and institutions submitted opposing statements. While the
European Commission and the UK argued for the maintenance and reform of
Safe Harbour, other member states such as Belgium, Austria, or Poland argued
against the acceptance of the adequacy decision. While the US might have antic-
ipated a more comprehensive consideration of the exception for ‘necessary’ (US
due to their generality and abstractness, Mr Schrems’ concerns about the surveil-
lance programmes of the US national security agencies are exactly the same as
those which have led the Commission to start the review of the Safe Harbour
Decision.
(EC 2014, 19)
The Commission emphasized its position in the field as the competent authority
and decided against significantly engaging with substantive issues. It made claims
on behalf of the EU as a constituent community, emphasizing its position vis-à-vis
the member states and the US. The submission suggested that even if the case con-
cerned national security exemptions, these exemptions benefited the US rather
than the member states, thereby re-emphasizing its competence (EC 2014, 15).
Even though the Commission recognized the violation of Safe Harbour principles,
both in their written observation (EC 2014) and in the oral hearing (Gibbs 2015a),
it concluded that the Irish DPC was bound by the existing adequacy decision. In an
interview with The Guardian newspaper, Schrems described his surprise at what
he perceived as a weakness of the Commission’s justifications:
⁴ My translation; the original version reads: ‘Darüber hinaus halten wir den Verweis auf die Tätigkeit
der Strafverfolgungsbehörden im Zusammenhang mit der Stärkung der Privatsphäre des Einzelnen
verfehlt’ (BMI 2014, 396–7).
I even had the feeling that the European Commission was not too interested in
winning the case. Their representation was really bad. […] I think they wanted
to get rid of it anyway. And now they can say, ‘The court is taking a decision, we
can’t be blamed for it anymore.’
(cited in Powles 2015c)
This statement draws on binding human rights conventions to prove worth with
regard to the EU as a liberal community of individuals. It also emphasizes the
responsibility of strong independent data protection authorities as ‘“guardians”’
(EP 2014b, para. 80) of this community.
I argue that the decision of the European Commission and the US to follow
the ‘imperative of justification’ (Boltanski and Thévenot 2006, 346) only to a very
limited extent meant that the issue came to be evaluated through criteria embedded
by those actors who did actively engage in meaning-making processes. As I outline below
and as also highlighted by the Irish High Court, both courts accepted the fac-
tual record of surveillance as presented by the only parties willing to comment
(see also Farrell and Newman 2019, 147). The meaning-making processes were
significantly shaped by criteria and reference communities relating to the order of
fairness rather than principles embedded in the order of security. There was no
account that specifically linked the Safe Harbour framework to the necessity for
surveillance measures.
In September 2015, the opinion of the CJEU’s Advocate General (AG) Yves Bot
received widespread attention. He emphasized that in light of the fact that ‘the law
and practice of the United States allow the large-scale collection of the personal
data of citizens of the EU which is transferred, without those citizens benefiting
from effective judicial protection’ (Bot 2015, para. 158), the European Commis-
sion should have invalidated the adequacy decision. In the landmark ruling that
followed shortly after and was widely discussed both academically (Kuner 2017;
Ni Loideain 2016) and publicly (Gibbs 2015b; Scott 2015), the Court similarly
emphasized the inadequacy of protection. More specifically, it proposed that the
broad derogations in the adequacy decision ‘enabled interference founded on
national security and public interest requirements or on domestic legislation of
the United States’ (Schrems 2015, paras. 87–9) with the fundamental rights of
persons. On the basis of the cooperation requirements for companies specified
by law, the CJEU considered that the Safe Harbour principles were systemati-
cally undermined. In the judgment, the Court frequently referred to the Digital
Rights Ireland v. Commission (2014) case, which affirmed the importance of data
protection in the digital context as well as to the TFEU and the EU Charter of
Fundamental Rights. This highlights again the importance of case law and human
rights conventions as significant objects of value. The CJEU also problematized
that the European Commission in its initial adequacy decision had never stated
‘that the United States in fact “ensures” an adequate level of protection by reason
of its domestic law or its international commitments’ (Schrems 2015, para. 98).
Therefore, the Court considered the adequacy decision to be invalid, which also
invalidated the framework as a whole.
The judges further specified the understanding of adequate as ‘essentially equiv-
alent’ (Schrems 2015, para. 73) to EU protection, which reduced the Commission’s
discretion in the adequacy findings and has implications for other data shar-
ing frameworks, such as the Passenger Name Records agreement (Kuner 2015,
2017). Schrems considered the judgment a ‘puzzle piece in the fight against mass
surveillance, and a huge blow to tech companies who think they can act in total
ignorance of the law’ (cited in Powles 2015c). It is also evidence of increasing
judicialization in data governance (Fahey and Terpan 2021).
It is difficult to say how much stronger engagements in the meaning-making
process would have changed that particular situation. The lack of effectiveness
of the Safe Harbour framework in light of minimal enforcement and review
procedures as well as continued sovereignty-infringing bulk surveillance practices
surveillance practices and ignored reform measures (IA 2015, 2). For example, in a
congressional hearing on the issue, the US Department of Commerce representa-
tive argued that Safe Harbour had become ‘a target for continued criticism largely
based on misunderstanding and false assumptions about its purpose and opera-
tion and the important privacy benefits it provided’ (E. M. Dean 2015, 2). The US
mission stated that ‘The United States does not and has not engaged in indiscrimi-
nate surveillance of anyone, including ordinary European citizens’ (US Mission to
the EU 2015). The US Secretary of Commerce Penny Pritzker stressed that the US
was ‘deeply disappointed’ (Pritzker 2015; see also, e.g., Beckerman 2015, 2) and
suggested that the invalidation ‘puts at risk the thriving transatlantic digital econ-
omy’ (Pritzker 2015). The congressional hearing dedicated to the invalidation of
the Safe Harbour agreement revolved around the detrimental consequences for
the digital economy, such as the free flow of information, the transatlantic com-
mercial relationship, and legal uncertainty, as well as the increased burden and
damage for job-generating companies. While the hearing mainly evaluated the situation in
reference to the common good of production and economic progress, there was also
a recognition that the CJEU decision necessitated a strong focus on future cooperation
and the urgency of a new agreement, thus invoking the order of globalism (US
Congress 2015).
Within the EU, both the European Parliament and the European Commission
attempted to gain recognition for their central positions in the field. The European
Parliament emphasized its role as a strong protector of human rights, arguing ‘this
ruling has confirmed the long-standing position of Parliament regarding the lack
of an adequate level of protection under this instrument’ (EP 2015, 2014a, K). Even
the European Commission framed the judgment as a confirmation of its mandate:
we have made important progress that we can now build on in light of the judg-
ment. Our aim is to step up discussions with the US towards a renewed and safe
framework for the transfer of personal data across the Atlantic.
(EC 2015a)
After the judgment, there was a general sense of urgency for a new agreement but
also a heightened sensibility towards the fundamental right status of privacy and
data protection. The Article 29 Working Party, the EU’s advisory body of data
protection authorities, threatened to stop data transfers if no agreement were found
soon (29WP 2015).
In light of intense pressure, not least from the private sector, which found itself in legal limbo
and emphasized the adverse effects on businesses, particularly small
and medium-sized enterprises (IA 2015, 2), negotiations for a new framework began.
Underlying differences remained, but there was general agreement on the fact
that data flows were essential. As a representative of the FTC put it in an inter-
view, ‘nobody wanted to say: “no data should flow”, well almost nobody’ (FTC
official, pers. comm. 2019). Concerns regarding the extent of data access by secu-
rity agencies had remained a significant part of the negotiations. However, the
US had agreed to a clarification of the term ‘bulk collection’ and maintained that
collection needs to be ‘as tailored as feasible’ and ‘reasonable’ (EC 2016d, 2). Nev-
ertheless, concerns persisted, particularly with regard to the requirements set out
by the CJEU (EC 2016e, 1). The European Parliament also criticized the fact that
the restrictions on bulk collection did not meet ‘the stricter criteria of necessity and
proportionality as laid down in the Charter’ (EP 2016b, 4). The EP largely stayed
committed to principles of the order of fairness, arguing on behalf of a reference
community of individuals that ‘“protecting data” means protecting the people to
whom the information being processed relates’ (EP 2016b, B).
However, some observers argued that the strong juxtaposition of EU and US surveillance
practices ignored the significant extent to which EU member states
engage in similar practices (Farrell and Newman 2019, 1–3; EU FRA 2018). The
strong emphasis on dystopian scenarios of mass surveillance created the impres-
sion that the EU attempts to impose standards externally that it is not willing
to uphold internally (see also Lamla 2016). While this points to an emphasis on
sovereign community norms rather than the prioritization of the order of fairness
and human rights principles, some actors also problematized abusive powers in
the EU. In 2015, a European Parliament resolution argued that ‘the lack of a clear
definition [of national security] allows for arbitrariness and abuses of fundamen-
tal rights and the rule of law by executives and intelligence communities in the
EU’ (EP 2015, para. 24). The criticism of significant ambiguity and uncertainty
concerning both the actual wording and the level of bindingness delegitimizes
jurisdictional claims based on the order of security.
In February 2016, renewed negotiations produced a replacement for the Safe Harbour
framework: the EU–US Privacy Shield. Highly similar in structure and content,
Privacy Shield offered voluntary self-certification for companies in the US under
the scrutiny of the Federal Trade Commission and the Department of
Transportation. The framework represented a compromise and was as much an institutional
solution to bridge the existing normative divisions as Safe Harbour had been.
However, it included some stronger provisions than its predecessor.
Privacy Shield introduced stricter commitments as well as transparency,
oversight, and enforcement measures. The FTC seems to have become
more assertive in the enforcement practices under Privacy Shield (FTC official,
pers. comm. 2019). The European Commission also highlighted constraints and
obligations in relation to its own work and referred to the Schrems decision as a
way to ‘breathe life into the framework’ (2016c, 10). Similarly, the contrast with
a ‘formalistic exercise without consequences’ (10) seems to ‘mark a significant
departure from the previous static situation’ (11). In addition, Privacy
Shield created new opportunities for (judicial) redress as well as an ombudsperson
position. The agreement was part of a package deal with the US Judicial Redress
Act and the so-called Umbrella Agreement. The US Judicial Redress Act of 2015
extends some data and privacy protection measures to non-US citizens or resi-
dents. It offers EU citizens the possibility of contesting both US companies’ data
disclosures and government practices, a right that had been granted to US citizens
in the EU with the Data Protection Directive (1995). While there had been con-
siderable disagreements about the scope of judicial redress for EU citizens in the
area of law enforcement since it was first addressed in the 2007 High Level Contact
Group (Farrell and Newman 2019, 138), the necessity of a transatlantic agreement
forced negotiators to find a common position. The Umbrella Agreement (EU and
US 2016) came into force in February 2017. Under the agreement, data transfers
for law enforcement purposes between the EU and the US, also when carried out
by private entities, are subject to a common standard of data protection, with the
exception of Ireland and Denmark.
Thus, Privacy Shield offered a stronger focus on rights protection, and it more
thoroughly and explicitly addressed fundamental rights concerns. The Euro-
pean Commission was confident about the improved safeguards, for example
with the then Commission Vice President Andrus Ansip stating that Privacy
Shield ‘will protect the personal data of our people’ (Ansip, cited in EC 2016f),
which again establishes a reference community of Europeans. Nevertheless, many
changes compared with Safe Harbour remained limited, particularly regarding
bulk surveillance (Weiss and Archick 2016, 8–11), which contributed to con-
siderable uncertainty (Voss 2016). As in other data-sharing agreements, the US
administration chose to provide written assurances to confirm restrictions for
public surveillance measures. The EP criticized this approach, pointing to uncertainty about
the assurances’ legal bindingness (EP 2016b, 7). In the 2019 review of Privacy Shield,
the European Commission noted that only one complaint, which was considered
inadmissible, had been filed with the Ombudsperson since the framework had
become operational (EC 2019d, 27). This is a common feature of compromises in
data-sharing frameworks, as I also illustrate with regard to the PNR (see Chapter 5)
and the TFTP (see Chapter 6) agreements. While review and transparency mech-
anisms prove worth in the order of fairness, they work as implicit legitimation of
the order of security. While, formally, US authority was curtailed, practices did
not seem to change substantially (Suda 2017).
In sum, Privacy Shield constituted a temporary compromise between differ-
ent orders. Bulk surveillance practices did not stop, but they were supposed to become
more transparent and contestable. While the vague specification of principles
is frequently a major factor in maintaining the stability of compromises (Boltanski
and Thévenot 2006, 279–80), this proved insufficient to withstand judicial scrutiny
by the CJEU, as I demonstrate in Section 4.2.8.
offered adequate protection, and the CJEU confirmed this opinion in its judgment.
The Court held that the use of SCCs may only be considered adequate if
strong safeguards are in place and emphasized the responsibility of the transferring
actors, particularly controllers, for ensuring these safeguards (Data Protection
Commissioner 2020, paras. 131–46). While the case constituted only an indirect
challenge to Privacy Shield (EP 2017a), the CJEU again invalidated the agree-
ment. A direct challenge by the French NGO La Quadrature du Net was scheduled
to be heard after the Schrems II judgment (Joined Cases C-511/18, C-512/18 and
C-520/18 2016), but is now moot. Schrems argued:
The judgment makes it clear that companies cannot just sign the SCCs, but also
have to check if they can be complied with in practice. In cases such as Facebook,
where they don’t take action, the DPC had the solution to this case in her own
hands all along. She could have ordered Facebook to stop transfers years ago …
Instead, she turned to the CJEU to invalidate the SCCs, which are valid. It’s like
screaming for the European fire brigade, because you don’t feel like blowing out
a candle yourself.
(noyb 2020)
The Court highlighted the limited powers
of the Ombudsperson particularly concerning US intelligence. It emphasized the
broad nature of US surveillance, highlighting a dystopian scenario in the order of
fairness:
In the view of the Court, the limitations on the protection of personal data arising
from the domestic law of the United States on the access and use by US pub-
lic authorities of such data transferred from the European Union to that third
country, which the Commission assessed in Decision 2016/1250, are not cir-
cumscribed in a way that satisfies requirements that are essentially equivalent to
those required under EU law, by the principle of proportionality, in so far as the
surveillance programmes based on those provisions are not limited to what is strictly
necessary.
(CJEU 2020, 3, original emphasis)
It is notable that the Court made an explicit assessment of the necessity and
proportionality of surveillance practices by US intelligence services, which are
clearly outside the Court’s jurisdiction. Thus, while the US joined the conflict as
a party and attempted to influence meaning-making processes, the evaluative cri-
teria already entrenched in the 2015 judgment prevailed. The Court’s conclusion,
under the same presiding judge, similarly remained unchanged. Privacy Shield, as
already mentioned, constituted essentially an updated version of Safe Harbour. In
2015, the Court had invalidated Safe Harbour on the basis that the Commission's 2000 decision had not adequately assessed the legal situation in the US. That judgment already contained strong indications that the Court did not deem US intelligence practices compatible with fundamental rights in the EU. The 2020 judgment should therefore be understood, like the invalidation of the Data Retention Directive in 2014 (Digital Rights Ireland 2014) and of member states' general obligation to retain data just two years later (Joined Cases C-203/15 and C-698/15 2016), as a way for the Court to stress the seriousness of its initial assessment.
The case thus highlights the role of judicial actors as part of political conflicts.
Some (Brkan and Psychogiopoulou 2017) have explored the role of courts in
political controversies from a legal perspective, including the CJEU (Brkan 2017).
However, in the conflict resolution process, the courts play an important political
role, particularly in cases of less stringent institutionalization (see also Fahey and
Terpan 2021). The Court's jurisprudence shifted from a more member-state-oriented approach that left
significant leeway for national authorities and courts (Bagger Tranberg 2011, 242)
to a stricter interpretation of necessity and proportionality, for instance in Schecke
(Joined Cases C-92-93/09 2010). This emphasis was upheld even compared with
other fundamental rights such as freedom of expression in Satamedia (2008). This
seems to have increased in recent years. In 2014, the CJEU had surprisingly, and
against the recommendations of the Advocate General, backed the 'right to be forgot-
ten’ and thereby increased obligations particularly for search engines. In the same
year, the Court also invalidated the European Data Retention Directive, which had
been in place since 2006 (Data Retention Directive 2006). In member states, sev-
eral constitutional courts had previously declared laws based on the directive to be
unconstitutional. However, as Schneider highlights, this was a first opportunity
for the Court to ‘establish itself as a fundamental rights court, even in the field
of security law’ (2017, 549). Thus, compared with the restrictive interpretation of
security-relevant data in 2006 (see also Chapter 5), the requirement of ‘essentially
equivalent’ standards for data transfers generally indicates a shift to and firmer
entrenchment of the order of fairness. As in the prominent Kadi case of 2009, the
Court used human rights law to refer to extraterritorial rights (Schrems 2015).
It remains to be seen whether the Court’s recent decisions will contribute to a
backlash (Burchardt 2020) or appreciation (Powles 2015b).
4.4. Conclusion
This chapter has illustrated the evolution of the transatlantic commercial data
transfer regime, a space that was long characterized by remarkable continuity
compared with other areas of data governance but that has experienced significant
changes and conflicts in recent years.
The chapter demonstrates how distinct normative visions across and within
institutions play out in and shape jurisdictional conflicts. Jurisdictional claims by
intelligence services formed data as an object of governance through linking its
public/private and commercial/security character. This contributed to unprece-
dented bulk access to data transferred from the EU to the US for commercial
reasons. While the jurisdictional claim by the US intelligence services was subject
to significant contestations in other areas of data governance, such as airline travel
or financial data, it remained largely implicit due to its secret character. When the
mass data access was disclosed by the Snowden revelations, this brought up not
only the underlying jurisdictional conflict between the regulatory regimes but also
the inconsistencies and contradictions in the EU’s approach to data governance.
The broad concessions to the order of production that had characterized the
transatlantic relationship under Safe Harbour notably persisted, to the frustration of the
European Parliament, in the European Commission's approach after the
Snowden revelations. The entrenched character of the order of production is also
highlighted by the fact that the US administration at no point felt the need to jus-
tify the explicit facilitation of surveillance through a framework that was designed
to provide safeguards.
The further evolution of the conflict also demonstrates the importance of rec-
ognizing and addressing the profound character of normative challenges. The fact
that the European Commission and the US decided to follow the imperative of
justification only to such limited extents left a normative vacuum that created
the opportunity for a profound normative challenge. The successful challenge was articulated not by the European Parliament but by an outsider. Backed by the European Parliament, NGOs in both the EU and the US, and the EU's legal order, Schrems was able to gain recognition for the evaluative criteria presented, demonstrating a dystopian scenario in the order of fairness that highlighted indiscriminate mass surveillance.
This case is also a demonstration of the importance of individual actors, particularly as proponents of the order of fairness. Even without immense political or other forms of capital from the outset, individuals can have particularly disruptive qualities. The revelations
by Edward Snowden catalysed significant discursive and institutional develop-
ments and created a ‘salience shock’ (Kalyanpur and Newman 2019b, 7) that
reached a broader audience. Building on the Snowden revelations, Maximilian Schrems disrupted the commercial data sharing regime through persistent complaints regarding the violation of data protection principles and privacy rights in the EU.
The multilevel character of the jurisdictional conflict in Schrems reshuffled the
institutional and discursive dynamics within the EU and contributed to a strength-
ened EU position in the global field. After the Snowden revelations, the public
outcry had failed to produce a transformation in practices, as the US denied breaking
any international agreements or laws. The CJEU judgment strengthened the posi-
tion of NGOs and privacy activists vis-à-vis the European Commission (Farrell
and Newman 2019, 158). The actions of Schrems established better access for
non-state actors and oversight bodies in the long term. This is also in line with
stronger protection for whistle-blowers in the EU, following calls by the Council
of Europe to implement stronger protections (CoE 2014). In addition, consider-
ing the Court’s binding interpretation that any adequate protection would need
to be ‘essentially equivalent’ (Schrems 2015, para. 73), any transatlantic agree-
ment required more comprehensive privacy protection principles, which shifted
the space of possibles towards a stronger recognition of the order of fairness.
While Privacy Shield represented essentially an updated version of Safe Har-
bour, the invalidation provided additional leeway for the European Commission
to demand stronger safeguards, also in the review processes, and thus potentially
restrict US encroachment. The invalidation of Privacy Shield in
Schrems II is more difficult to assess at this point. The strict and extraterrito-
rial standards set by the CJEU create a strong jurisdictional claim which will
be difficult to reconcile with the existing intelligence practices in the US. In
view of the continued dependence on US services and the largely unchanged
position of US governments toward foreign intelligence collection, substan-
tial progress seems unlikely, as ongoing negotiations indicate a potential Safe
Harbour 3.0.
5
Passenger Data in Air Travel
Establishing Data as a Security ‘Tool’
While passenger name record (PNR) sharing seems to follow a linear path of expansion,
its adoption was far from smooth, particularly in the context of transatlantic
relations. In the early 2000s, when the system was introduced by the US, the 9/11
attacks were insufficient to convince the EU of the necessity and effectiveness of
PNR sharing, and heavy pushback resulted in significant tensions. This involved a
referral to the Court of Justice of the European Union (CJEU) in 2006 and a signif-
icant challenge to a bilateral agreement in 2010, both instigated by the European
Parliament (see also Kaunert et al. 2012, 484). Thus, the conflict resolution process not only stabilized the regime but also had effects beyond the resolution of overlapping jurisdictional claims in the domestic context. How is it possible that, despite suc-
cessful disruptions and strong resistance, PNR is now part of an emerging global
regime?
Compared with the other cases I examine, the earlier phases of PNR sharing
(de Goede 2008; de Hert and de Schutter 2008; Kaunert et al. 2012) in particular
have received more attention in the literature. Scholars offer different explanations
for the persistence of the regime and particularly for the transatlantic agreement.
On the one hand, Argomaniz has argued that ‘border security cooperation is
far from being a “partnership”, resembling instead an asymmetrical relationship’
(2009, 120). This explanation of the stabilization of PNR sharing conceptualizes
the EU as a ‘norm-taker’ (Argomaniz 2009) and establishes the influence of the
US as the key driver behind the continuation of the regime. A less stark version of
this argument simply points to normative convergence (Porter and Bendiek 2012).
Others have argued that this interpretation ignores important aspects of the evo-
lution of PNR or EU security policies more generally (Bellanova and Duez 2012,
123; Monar 2015). For instance, de Goede and Wesseling argue that actors in the
EU seem to have ‘appropriated’ (2017) security norms by adjusting them to the
local context, which also has significant benefits for EU counterterrorism efforts
(Monar 2015, 351). Farrell and Newman (2019, ch. 3) have highlighted the role
of influential transnational coalitions that successfully insulated security policy-
making from contesting actors such as the European Parliament (see also Pawlak
2009). Bellanova and de Goede (2022), as well as Ulbricht (2018), have identified
technological choices and the infrastructural setting as significant factors in the
regulatory process.
This chapter adds a more detailed analysis of how PNR was transformed in insti-
tutional fora, not despite but crucially also through contestations, and how these
processes are interlinked with private security actors. As the earlier phases of the
conflict have been examined in more detail, this chapter offers the opportunity to
focus more on the consequences of conflict resolution strategies. In this chapter, I
make three broad arguments: first, in line with existing literature, I argue that PNR
should be understood as embedded not only institutionally but also normatively in
a more comprehensive strategy of post-9/11 counterterrorist activities (Kaunert
and Léonard 2019). By providing a detailed analysis of justificatory practices in
be significantly shaped by the war on terror but also fundamentally changed the
character of commercial air travel.
The 9/11 terrorist attacks killed nearly 3,000 people and contributed to a signifi-
cant overhaul of counterterrorist policies. Within a year after the attacks, the US
had introduced more than 130 pieces of legislation. The US Department of Home-
land Security (DHS), established in 2002, is now one of the largest US federal departments in
terms of both budget and personnel. Legislation broadened information sharing
competences, for example through the US Patriot Act (2001), and intelligence and
surveillance capacities, for example through the Intelligence Reform and Terror-
ism Prevention Act or the classified Terrorist Surveillance Program. The Foreign
Intelligence Surveillance Act (FISA), passed in 1978, was amended in 2008 through the addition of Section 702, which authorizes the surveillance of non-US persons located abroad while prohibiting the intentional targeting of US citizens (Kaczmarek and Lazarou 2018).¹
In the EU, many actors used the 9/11 attacks as a ‘window of opportunity’
(Kaunert et al. 2015) to similarly widen and deepen the competences of the EU
in counterterrorism. Terrorism featured prominently in diverse strategies, such as
the Counterterrorism Strategy, the European Security Strategy in 2003, and the
Internal Security Strategy (Tocci 2017). Until 2009, EU policy had been divided
into three pillars, that is, common market activities, foreign and security policy,
and justice and home affairs, which specified rules and responsibilities for the EU
institutions in these areas. The Lisbon Treaty significantly increased EU compe-
tences in security, particularly for the Commission and the Parliament (Kaunert
and Léonard 2012).
Besides the national and regional overhaul of counterterrorist policies, the 9/11
terrorist attacks also contributed to the proliferation of international efforts, par-
ticularly UNSC activity (Kreuder-Sonnen 2019). It also reinforced cooperation
between intelligence services and law enforcement agencies in the transatlantic
context (Farrell and Newman 2019, 45), including an operational agreement
between Europol and the US, Justice and Home Affairs (JHA) ministerial meet-
ings, and the European Counter Terrorism Centre (ECTC), as well as cooperation
with social media or tech companies (Bartlett and Reynolds 2015). Coopera-
tion efforts did not always run smoothly, for example concerning extraordinary
rendition and detention facilities (Porter and Bendiek 2012) but also regarding
data processing and protection standards. The conflicts in this area prompted
¹ For a more comprehensive overview of counterterrorism in the US, see Cook (2020); for the EU,
see Argomaniz (2011); Argomaniz et al. (2017); and for the transatlantic community, see Anagnostakis
(2017); Porter and Bendiek (2012); for a specific focus on data, see Amoore and de Goede (2005);
Farrell and Newman (2019, ch. 2).
In view of the key role of air travel in the attacks, aviation security in the US
and elsewhere immediately expanded. Security measures at airports, such as secu-
rity checks before check-in, or the prohibition on carry-on liquids, have been
buttressed by less visible safeguards, such as cooperation between airlines and
security authorities, the introduction of no-fly lists, or the increase in the number
of Federal Air Marshals (Salter 2007; Schouten 2014). The adoption of the US Avi-
ation and Transportation Security Act (ATSA 2001) in December 2001 expanded
the surveillance of passengers travelling to or from the US.
The ATSA introduced an obligation to provide passenger data to the US Cus-
toms and Border Protection Bureau for all airlines passing through the US (Guild
and Brouwer 2006, 1). There are different types of passenger data that correspond
to varying sensitivity levels. For instance, advanced passenger information (API)
includes basic data such as passenger name, document identification number, or
address. These data were shared between the US Customs and Border Protection
Bureau and air carriers on a voluntary basis prior to the ATSA. In the EU, the 2004
Carrier Directive 2004/82/EC also introduced API sharing on request to improve
border controls and fight illegal immigration. The ATSA demanded sharing of the
more comprehensive PNR data. PNRs consist of up to seventy-six elements of
personal information, depending on the airline and the legal requirements of the
jurisdictions in which it operates. PNR data are generated and transferred as part
of the booking process and stored in a computer system, often one of the global
distribution systems or the airline reservation system. They facilitate processing
of customers on flights with cooperating airlines, for example under a code-share
system. PNR data include, for example, personal and travel details of the passen-
ger and co-travellers, baggage information, IP addresses, flight meal preferences,
payment details or discount codes, and other services booked in connection with
the flight, such as hotel or hire car bookings (EU and US 2012, Annex). While the
data were originally collected for commercial purposes, PNR transfers to security
authorities aim to identify passengers linked to terrorism and serious crime on the
basis of certain travel or intelligence patterns (Brouwer 2009). While API data may
identify known terrorists, criminals, or suspects, PNR data aim to flag persons who
were previously unsuspected of involvement in such activities for further inves-
tigation. They also reveal potentially highly sensitive information. For example,
specific service requests, such as particular meal preferences for kosher or dia-
betic food, or special assistance requests indicate religious affiliations or health
concerns, while emergency contact or co-traveller details reveal information on
close personal contacts.
Despite the high salience of terrorism at the time of adoption, the introduction
of the ATSA provoked significant tensions between the EU and the US. The US
claim of legitimate control over passenger data via the ATSA conflicted with provi-
sions of the EU data protection regime. In a 2002 letter, the European Commission
informed the US authorities about potential conflicts of laws regarding the 1995
Data Protection Directive (Joined Cases C-317/04 and C-318/04 2006, para. 33).
The conflict concerned not only the potential violation of passenger rights but also
the violation of established principles of sovereignty as specified in international
law (29WP 2004; EP 2004), as US authorities directly accessed data stored on EU
soil. In particular, the so-called pull system, which enables direct access to airline
databases for security agencies, was strongly contested. Compared with the push system, in which airlines themselves transfer or 'push' the data to these agencies, direct access is potentially more intrusive. The normative dimension of the conflict thus
concerned incompatible demands based on justifications rooted in the order of
security (PNR as necessary in the fight against terrorism) and the order of fairness
(PNR as a privacy concern). In addition, there were competing concerns based on
the order of sovereignty (data access as a territorial intrusion).
In the months that followed, the US successfully stabilized the jurisdictional
claim of control over data. Consistent enforcement as well as strict punishments
for non-compliance established the use of PNR as ‘necessary for purposes of
ensuring aviation safety and protecting national security’ (US Department of the
Treasury 2002, 42,711). The strict enforcement secured the support of airlines, which
wanted to avoid heavy fines of up to $5,000 per passenger or the loss of landing
rights. While the US postponed enforcement for EU airlines, US authorities argued
that compliance was necessary after March 2003 (Kaunert et al. 2012, 484). The
European Commission, faced with the unilateral jurisdictional claim of a major
power, referred to ‘competing, even to some extent irreconcilable concerns, all of
which are legitimate in themselves’ (Bolkestein 2003, 2). In 2003, the European
Commission announced that compliance with the ATSA would not be punished
(EC and US Customs 2003, para. 4). Yet legal uncertainty for airlines persisted
due to the potential enforcement of data protection law by member state DPAs
(Beatty 2003). The European Commission attempted to end this uncertainty
(Patten 2004, 3) and reified the issue as an international cooperation problem
that needed to be addressed via a bilateral or global agreement. This manifested
the convergence of the goals of global security cooperation and private company
interests.
The US strengthened its claim through security references. Already in 2003, the
EU–US joint declaration explicitly established a reference community of vulnera-
ble citizens using PNR data to ‘identify and interdict potential terrorists and other
threats to national and public security, and to focus Customs resources on high
risk concerns, thereby facilitating and safeguarding bona fide traveller’ (EC and
US Customs 2003, Annex, emphasis added). The reference to bona fide travel (see
also, e.g., EU and US 2007, 8 Annex II) establishes an imaginary reference com-
munity of innocent and helpless potential victims that is existentially threatened
and in need of protection. This move provided legitimacy for stricter preventative
surveillance by demarcating the space of possible options through the evaluative
criterion of security.
The EU echoed this shift. Already in 2003, the European Commission had
argued that ‘the EU’s approach cannot be limited to responding to the initiatives of
others’ (EC 2003, 3). It emphasized the EU’s aspiration to improve its position in
the field and establish its own PNR directive. This interpretation resonates with the
analysis of EU counterterrorism by Monar (2015, 355–6), who argues that the EU
established an increasing international presence and capability in this area, despite
the firm entrenchment of operational capacities in the member states. Both the
early articulations of an EU interest in developing its own PNR directive and the
securitization attempts in other documents highlight that, at least for some actors,
there was a pre-existing interest in strengthening security goals (de Goede 2008;
Farrell and Newman 2019) rather than a unidirectional construction or norm-
taking process. Nevertheless, terrorist attacks, particularly on EU soil, increased
public pressure for action. The United States and the EU aimed to mitigate this
pressure through cross-border cooperation.
The pre-existence of interests in broadening security-related policies did not
equal an uncritical acceptance of such instruments on either side of the Atlantic.
Farrell and Newman highlight how strong resistance from civil liberties orga-
nizations in the US challenged the set-up of domestic databases and surveil-
lance programmes such as CAPPS II.² The domestic failure of security measures
² The Computer Assisted Passenger Prescreening System II assessed the risk of individual passengers by comparing PNR information with a number of governmental databases. It was terminated in 2004 and replaced by the more limited programme Secure Flight; see https://www.tsa.gov/travel/security-screening, accessed 30 June 2022.
In March 2004, the Council asked the European Parliament for an opinion on the conclusion of a bilateral agreement with the US. The Council emphasized its urgency and stated that:
The fight against terrorism, which justifies the proposed measures, is a key priority of the European Union. Air carriers and passengers are at present in a situation of uncertainty which urgently needs to be remedied. In addition, it is essential to protect the financial interests of the parties concerned.
(Council of the EU, cited in Joined Cases C-317/04 and C-318/04 2006, para. 37)
This statement not only highlights the importance of the fight against terrorism and its interlinkages with private interests but also emphasizes the EU's position. MEPs were largely critical of this approach. In a 2003 resolution, the Parliament emphasized:
the doubts and concerns that have been expressed by the national authorities concerning the legitimacy of this demand, including its legitimacy under US law, and in particular doubts about its compliance with EU data protection legislation given the risk that reservation system databases may become de facto 'data-mining' territory for the US Administration.
(EP 2003)
of its own public interest should lead to the routine and wholesale transfer of
data protected under the directive’ (29WP 2002, 6). In 2003, the head of the
29WP strongly contested the Commission's approach, arguing that 'it has not yet
been clarified how the Joint Statement might provide a sound legal base to jus-
tify an exception to that rule’ (Rodotà 2003, 2). In the domestic context, DPAs
began to issue opinions and fines, stating that EU airlines were not allowed to
transfer data (Farrell and Newman 2019, 77). The fact that the Commission
largely ignored this pressure from the designated expert authorities reinforced
tensions.
In the spring of 2004, the discord between the European Commission and the
European Parliament manifested as a conflict over the actors’ positions in the field.
In a 2004 resolution, the Parliament ‘reminds the Commission of the require-
ment for cooperation between institutions which is laid down in Article 10 of
the Treaty’ (EP 2004, para. 7) and highlighted the lack of enforcement of EU law
(EP 2004, para. B). In April 2004, the European Parliament decided to request a
CJEU opinion on the basis of doubts about the potential overreach of Commis-
sion competences as well as the compatibility of the agreement with the right to
privacy granted under Article 8 of the ECHR. This also constituted an implicit
rejection of the Council’s request for urgent consideration and its jurisdictional
claim.
Despite the resistance of the European Parliament and the fact that the CJEU
had not articulated an opinion on the matter, the Commission decided to adopt
an adequacy decision in May 2004. The adequacy decision constituted a neces-
sary legal basis for any further agreement, as it established that EU passengers’
data would be protected sufficiently according to EU law. The decision was
tied to the adoption of a set of ‘Undertakings’ by the US Homeland Security
Bureau of Customs and Border Protection (CPB 2004), which specified condi-
tions for the collection and transfer of PNR. Like the Safe Harbour principles
(see Chapter 4), these undertakings reflected basic data protection elements, such
as purpose limitations, sensitive data restrictions, and a significantly reduced
retention time from fifty to three and a half years. The EU was not success-
ful in its demand for a push system, but the CBP committed to switching away from
the pull system once airlines were able to provide the necessary data. Onward
transfers were allowed for other US agencies as well as other governmental author-
ities in the framework of counterterrorism and law enforcement (CPB 2004).
In sum, the conditions constituted an effort to strengthen privacy safeguards
and at the same time appealed to the order of sovereignty, conforming to legal
requirements.
In early June 2004, the Council informed the European Parliament that it had
approved the decision and stressed the fight against terrorism and the legal uncer-
tainty for the airlines. Shortly after, the US and the EU concluded a first bilateral
agreement that set standards for passenger data sharing and transfers to third
authorities. At the beginning, the agreement states:
The European Parliament has not given an Opinion within the time-limit which,
pursuant to the first subparagraph of Article 300(3) of the Treaty, the Council
laid down in view of the urgent need to remedy the situation of uncertainty in
which airlines and passengers found themselves, as well as to protect the financial
interests of those concerned.
(EU and US 2004, 1)
As in the earlier proposal, the emphasis on legal certainty for companies was
strongly tied to the fight against terrorism, highlighting the convergence of eco-
nomic and security goals. The Council used this to justify the lack of inter-
institutional coordination with the Parliament.
Due to the treaty provisions at the time, the Parliament’s opinion was not bind-
ing. The Parliament decided to file a lawsuit with the CJEU, even though it was not
completely clear whether the case was actionable (Farrell and Newman 2019, 80).
As in the request for the court opinion, MEPs voiced concerns about the legality of
the adequacy decision, contesting the provisions based on justifications rooted in
the order of fairness, as well as the legal basis of the agreement, contesting partic-
ularly the Commission’s position in the field. In contrast, the Council emphasized
that the agreement was based on the first pillar, as it concerned commercial data
collection practices by airlines. Due to the harmonized nature of the data protec-
tion regime, unequal treatment for airlines could negatively affect competition.
Under Article 95 of the EU Treaty, the Council of Ministers would be authorized
to secure the conditions for the internal market.³
In 2006, the Court annulled both the Council decision on the agreement and
the European Commission’s decision on adequacy (Joined Cases C-317/04 and
C-318/04 2006) due to an inadequate legal basis. The Court found that both the
Council of Ministers and the European Commission had acted outside the legal
framework:
While the view may rightly be taken that PNR data are initially collected by air-
lines in the course of an activity which falls within the scope of Community law,
namely sale of an aeroplane ticket … the data processing which is taken into
account in the decision on adequacy is, however, quite different in nature. As
pointed out in paragraph 55 of the present judgment, that decision concerns not
data processing necessary for a supply of services, but data processing regarded
as necessary for safeguarding public security and for law enforcement purposes.
(Joined Cases C-317/04 and C-318/04 2006, para. 57)
While this echoed some of the concerns voiced by the European Parliament, the
Court’s reasoning had unintended consequences and the effect of the judgment
was contrary to the intention of the Parliament. By establishing PNR data as a matter of public security, the Court shifted institutional responsibility to the Council and particularly the interior ministers, who tend to focus more on security concerns due to their professional mandate. Rather than limiting the influence of the European Commission, the judgment altered the space of possibles and enabled further shifts towards the order of security, entrenching PNR as a necessity.
³ In July 2005, the LIBE Committee asked for full co-decision rights on PNR issues, which would have been possible under the 'passerelle' clause of the EU Treaty whereby the Council would share decision-making authority, but the Council declined.
This institutional shift towards the JHA Council resulted in an exclusion of
strong proponents of the order of fairness from the negotiations, particularly
the European Parliament and DPAs (de Hert and de Schutter 2008; Kaunert
et al. 2012; Monar 2015). The change of institutional forum also put the Coun-
cil presidency instead of the European Commission in charge, which required
the unanimous approval of the deal by the Council of Ministers (Kaunert et al.
2012, 485). In addition, the Court set a time limit for a new agreement of the end
of September 2006, thus imposing a sense of urgency, which was reinforced by
the legal uncertainty for European airlines. Therefore, these institutional changes
provided improved conditions to subject meaning-making processes to princi-
ples of the order of security. The reasoning of the judgment allowed the Court
to avoid any substantial balancing act between fundamental rights or goals, which
implicitly provided the Council with a legitimation not to push for stronger data
protection safeguards.
In summary, the institutional situation at the time provided significant advan-
tages to the Council and enlarged the space of possibles for jurisdictional claims
based on a pursuit of security, while it weakened the position of the Parliament
and DPAs in their attempts to strengthen principles of the orders of fairness and
sovereignty. While the European Commission’s reasoning to push for a bilateral
agreement seemed to be grounded in the orders of globalism and production,
emphasizing cooperation and the potential negative impacts of legal uncertainty
for airlines, the European Parliament highlighted a violation of principles of the
order of sovereignty. While the Parliament’s claim for invalidation of the PNR
agreement was successful, the judgment also changed the conditions for justifi-
catory practices. It altered the space of possibles to entrench evaluative criteria of
the order of security, which I outline in Section 5.2.
In Sections 5.2.1–5.2.4, I aim to demonstrate how the discursive and practical fixa-
tion of meaning demarcated the space of possible claims to jurisdiction. I highlight
the transformation of PNR into a key ‘tool’ in the fight against terrorism, which
entrenched a broader normative shift to the order of security.
The invalidation of the first bilateral PNR agreement by the CJEU in 2006 did not
attract significant attention in the US, as domestic authorities just continued to
enforce the ATSA. Yet a new agreement was mandated by the CJEU. The increas-
ingly powerful DHS (Farrell and Newman 2019, 84) explicitly tried to distance
itself from former commitments regarding data protection safeguards. Michael
Chertoff, the then Secretary of Homeland Security, argued that ‘we still remain
handcuffed in our ability to use all available resources to identify threats and stop
terrorists’ (Chertoff 2006), pointing to the importance of transnational coopera-
tion and referring to a potential dystopian scenario in the order of security. This
emphasis on strong security cooperation was reinforced in informal settings. For
instance, the German Federal Minister of the Interior Wolfgang Schäuble allegedly
promised an informal solution if the PNR agreement failed. According to a leaked
US document, Schäuble:
instructed his staff, for example, to find a way to bilaterally share airline Passenger
Name Records (PNR) data with the U.S., in case the EU is unable to re-establish
a legal basis for PNR data sharing before the September 30 EU court-imposed
deadline. The German data privacy commissioner opposes the move and claims it
cannot be done legally, but Schaeuble told his ministry to find a legal way to do it.
(US Embassy Berlin 2006a, para. 4)
While the emphasis on cooperation paved the way for a global regulation of
PNR-sharing practices, there is still a tendency to leave room for manoeuvre to
domestic actors. For instance, the US practice of relying on letter exchanges has
been criticized for avoiding legally binding principles (EDPS 2007, 2). According
to leaked documents, ‘The U.S. seeks a different kind of PNR agreement, based
on general principles, not a list of detailed “dos and don’ts”’ (US Embassy Berlin
2006b, para. 7). The emphasis on vague principles conforms to the expectations in
the literature that states demand a considerable amount of leeway concerning core
state powers such as security (Genschel and Jachtenfuchs 2018). While considered
deficient in the order of fairness, informal and flexible guidelines may prove their
worth in the order of security.
The attempts to evaluate the situation according to criteria based in the order
of security were underlined by references to concrete threats. Actors frequently
pointed to the threat of terrorism by referring to, for example, the 9/11 attacks,
the terrorist attacks in Madrid in 2004 and in London in 2005, or the failed
attacks on Christmas Day aboard a Northwest Airlines flight headed to the US
in 2009 (EC 2010b; Patten 2004). In 2007, Chertoff again addressed the Euro-
pean Parliament, emphasizing he believed that ‘we are at war’ and ‘life is the
liberty on which all others depend’ (Chertoff, cited in EP 2007a). This not only
highlights the immediate threat but also explicitly links the common good, includ-
ing the exercise of civil liberties, to the higher common principle of security.
While the literature has emphasized the role of ‘external shocks’ (Mahoney and
Thelen 2010, 4) in institutional change (Busch 2013), the terrorist attacks did
not contribute to increased politicization but to increased convergence around
security interests (Maricut 2016). This is in line with the expected depoliticizing
effect of security references in tech-heavy sectors (Hansen and Nissenbaum 2009,
1168).
In consequence and despite a lack of evidence regarding the effect on pub-
lic safety (29WP 2013a, 1), PNR became increasingly normalized as a significant
instrument in the fight against terrorism. A new agreement was adopted in 2007.
In line with the court judgment, PNR was increasingly referred to as a ‘tool’
(EU and US 2007, 2). For instance, the Council proposal characterized PNR data
as ‘a very important tool for carrying out risk assessments of the persons, for
obtaining intelligence and for making associations between known and unknown
people’ (Council of the EU 2007, 3). In contrast to a conception of data as a mat-
ter of individual rights and personal autonomy, the conceptualization of data as
a tool entrenches the fixed and apolitical nature of data as a governable object.
In this understanding, data may be employed for a certain purpose and then
be disregarded without any implications for the passenger’s identity or dignity.
This conceptualization altered the space of possibles regarding the necessary
safeguards.
The new interim PNR agreement sometimes offered a lower level of data
protection. Whereas the 2004 agreement had emphasized data protection in its
first paragraph, the first paragraph of the 2007 PNR agreement specifically stated that
the purpose of the agreement was to ‘prevent and combat terrorism and serious
transnational crime effectively as a means of protecting their respective democratic
societies and common values’ (EU and US 2007, para. 1). The fourth paragraph
referred first to the fight against terrorism and crime and then introduced the con-
dition ‘while respecting fundamental rights and freedoms, notably privacy’ (EU
and US 2007, para. 4). This emphasis was, however, again limited in the fifth
paragraph, which stated that ‘U.S. and European privacy law and policy share a
common basis and that any differences in the implementation of these principles
should not present an obstacle to cooperation between the U.S. and the European
Union’ (EU and US 2007, para. 5). While these are seemingly small differences and
details, the position of provisions in such agreements generally provides an
indication of the hierarchization of priorities. The emphasis on the order of fairness,
which featured prominently in the 2004 agreement, was replaced by an emphasis
on security cooperation and the fight against terrorism. The Parliament and the
European Data Protection Supervisor heavily contested the 2007 agreement due
to the lack of data protection. Both problematized that the safeguards specified in
the letters could be unilaterally withdrawn at any point (EDPS 2007, 1). Yet this
criticism had no significant effects.
In summary, the increasing shift towards the order of security as the evaluative
criterion of PNR data was based on the transformation of data into a neutral ‘tool’
in transnational security cooperation. This not only shaped the space of possible
policy options but also had an impact on the space for contestations by defining the
relevant ‘rules of acceptability’ (Boltanski and Thévenot 1999, 360). The changing
space of possibles also significantly shaped the conditions for further cooperation
and contestation, as I highlight in Section 5.2.2.
MacKenzie and Ripoll Servent highlight that the European Parliament had been ‘most resistant to norm taking’
(2012, 72, emphasis in original). Yet the normative character of the challenges to
the status quo in the field decreased considerably over time. This suggests that
its increased institutional competences and the corresponding responsibility dis-
course ‘have led the EP to abandon, or at least compromise, on its principles’
(Ripoll Servent and MacKenzie 2011, 401).
Until the Lisbon Treaty came into force in 2009, and due to the institutional shift
towards the JHA Council in 2006, the institutional setup had largely contributed
to the entrenchment of PNR. However, the Lisbon Treaty significantly changed
the institutional competences of the European Parliament.⁴ In the Area of Free-
dom, Security and Justice (AFSJ), often also referred to under the former label of
JHA, the European Parliament has risen to the role of a co-legislator which estab-
lishes increased parliamentary accountability in a particularly contested policy
area (Rittberger 2012). While the Parliament had been able to provide opinions
on so-called ‘third pillar’ decisions, these were not binding. This made it possible
for member states to adopt the 2004 PNR agreement despite the determination
of the European Parliament that transfers were illegal (US Mission to the EU
2009, para. 2). As is evident from leaked reports (US Mission to the EU 2009,
para. 4), there were increasing concerns due to the looming implementation of
the treaty. This concerned particularly the requirement for parliamentary ratifi-
cation of international agreements. The European Parliament does not have the
formal right to initiate legislation. Yet the LIBE Committee was now able to con-
duct reports and issue resolutions on its own initiative. These authority gains of
the Parliament constituted a potential threat to the stability of the PNR agree-
ment. The agreement was formally still only implemented as an interim agreement
because of an insufficient number of approvals by the national parliaments. This
meant it now required approval by the European Parliament (Farrell and Newman
2019, 86).
In May 2010, MEPs rejected a vote on the interim agreement. This veto initially
seemed to represent a more foundational normative challenge based on the order
of fairness. The European Parliament had just decided to reject an agreement on
financial data sharing for counterterrorist purposes (see Chapter 6) on the basis
of a lack of normative fit (MacKenzie and Ripoll Servent 2012, 80). The PNR
agreement seemed to constitute a similar assertion of the Parliament’s newfound power
(de Goede 2012b). Yet while the European Commission was forced to negotiate
a new PNR agreement, the assessment of the European Parliament as a signifi-
cant challenger was put in doubt when MEPs accepted a largely unchanged PNR
agreement proposal in 2012. This prompted the NGO EDRi to call it a ‘bad day for
civil liberties in Europe’ (EDRi 2012). The Liberal MEP Sophie in ’t Veld, who
acted as rapporteur, even withdrew her name in protest because she felt that the
agreement failed to respect the Parliament’s demands on fundamental rights concerning
the retention period, necessity and proportionality concerns, and provisions for
sensitive data (Carrera et al. 2013, 13).
The 2011 agreement was different mainly in the specifics of the retention period,
which was changed to five years (compared to three in the 2004 agreement and fif-
teen in the 2007 agreement), and the fact that data were stored in a depersonalized
and then anonymized form. In the agreement, the conceptualization of data as a
tool was largely upheld. PNR was referred to as ‘necessary’ (EU and US 2011) or
even a ‘tool we cannot do without’ (Meehan 2011, 3), while reports of the EU
Counter-Terrorism Coordinator repeatedly stated that ‘PNRs are a key element
in the fight against international terrorism. It has been and remains a valuable
tool to detect terrorist networks and movements’ (Council of the EU 2010b, 22,
2011, 28).
It seems reasonable to conclude that the European Parliament failed to con-
vert its improved position in the field into the policy change it had previ-
ously demanded. In view of the early challenges, it seems surprising that an
increase in the European Parliament’s formal authority did not lead to signifi-
cant changes concerning demands to respect principles of the order of fairness,
particularly data protection and privacy. The literature seems to converge around
the increasing manifestation of the European Parliament’s role as a responsi-
ble legislator (Ripoll Servent 2013, 982) or partner in global counterterrorism
(Kaunert et al. 2012, 490). The Parliament struggled to be recognized as a
global actor. For instance, members of the US Congress did not necessarily
make time to meet MEP delegations (EP official, pers. comm. 2018) and thus
excluded the European Parliament from transnational processes of meaning-
making (Farrell and Newman 2019, 87). In addition, it is likely that the past
failure to gain recognition for a more fairness-based approach constrained the
perceived space of thinkable and sayable options, including those for civil society
actors.
Moreover, the institutional structure of the field had further converged around
common security interests. For instance, the Stockholm Programme emphasized
the role of ‘necessary technological tools’ (Council of the EU 2010a), including
PNR, in the protection of European citizens. It highlights how data-driven secu-
rity policies have persistently shaped EU counterterrorism efforts. This was in
tune with a more comprehensive strategic shift towards prevention and risk in
the understanding of security (see also Amoore 2013).
Furthermore, and in contrast to other cases, the US was closely involved in the
meaning-making process. Already in 2005, the DHS emphasized the important
role of common liberal values, including privacy as ‘an essential right and fun-
damental value’ (DHS 2005, 3). In his 2007 address to the European Parliament,
Chertoff discussed limited but lawful transfers to the NSA. In consequence, PNR was not
more politicized after the Snowden revelations but became increasingly normal-
ized. While there are still some MEPs pushing against PNR, such as in ’t Veld, one
interviewee from the NGO Privacy International even stated ‘PNR has not been
on our agenda for a while; we are not engaged with it’ (NGO representative, pers.
comm. 2019).
In sum, PNR sharing has become an increasingly accepted security practice that
has stabilized through continued contestations of a decreasing normative qual-
ity. By subscribing to the ‘illusio’ (Bourdieu 1996, 331–6) that data governance
can contribute to the common good, actors accepted PNR as a necessary tool in
the fight against terrorism. The space of possible policy options was limited to the
extent of safeguards rather than the question of abolishing data sharing altogether.
As I demonstrate in Section 5.3, this has affected domestic and international
developments.
The German Minister of the Interior Schäuble suggested that ‘a PNR system
would allow the EU to negotiate with the U.S. on an equal footing and would allow
for balanced cooperation’ (Guardian 2010). A 2008 proposal of the European
Commission failed to be adopted before the Lisbon Treaty came into force, which
made substantial revisions necessary to secure the support of the European Par-
liament. After prolonged debates (Brouwer 2009), a new proposal was presented
in 2011. Shortly before the Snowden revelations, in April 2013, the LIBE Com-
mittee rejected the proposed PNR Directive by thirty votes to twenty-five because
of concerns regarding inadequate safeguards. The EDPS warned of installing ‘the
first large-scale and indiscriminate collection of personal data in the history of the
Union’ (2015b, 1), pointing to a dystopian scenario in the order of fairness.
In 2016 and as part of a ‘package deal’ (EC official, pers. comm. 2019) with the
GDPR, the European Parliament adopted the EU PNR Directive with 461 votes in
favour, 179 against, and 9 abstentions. As the Parliament pushed for the inclusion
of human rights references, the agreement included some concessions. Yet the first
mention of privacy is in paragraph 15 (PNR Directive 2016, para. 15). The PNR
Directive largely focuses on the necessity of PNR, emphasizing that:
Effective use of PNR data, for example by comparing PNR data against various
databases on persons and objects sought, is necessary to prevent, detect, investi-
gate and prosecute terrorist offences and serious crime and thus enhance internal
security, to gather evidence and, where relevant, to find associates of criminals
and unravel criminal networks.
(PNR Directive 2016, para. 6)
Under the PNR Directive, data transfers for intra-European travel are not com-
pulsory, but nearly all member states decided to include such processing in their
legislation. During the negotiation phase, the Conservative LIBE Committee
Rapporteur, Timothy Kirkhope, explicitly recommended the mandatory inclusion of
intra-EU flights due to the benefits of a ‘uniform set up and strong security
advantages’ (LIBE 2015, 59). The report stated that:
Given the importance of the purpose for which the PNR data is collected and pro-
cessed, as well as the varied, sophisticated and international nature of the threat
posed, it is necessary to have a system which operates on a 100% collection basis
both within the EU and with third countries in order for the system to be fully
effective. The collection of 100% data also reduces the risk of profiling.
(LIBE 2015, 100)
The French Prime Minister Manuel Valls urged the European Parliament to vote
for the PNR Directive and stated, ‘We’re at war’ (cited in Davis 2016), thus
repeating the statement the US DHS Secretary Michael Chertoff had made in 2007.
The concept of war re-emphasizes the idea of an existential threat as well as notions
of responsibility. Valls added, ‘I say in particular to the socialist and environmental
groups in the European Parliament: Everyone should assume their responsibilities’
(cited in Davis 2016). The French Interior Minister Bernard Cazeneuve echoed this
narrative, arguing that it was ‘irresponsible of the European parliament to delay
the vote on the adoption of PNR’ (cited in Rankin 2016). The establishment of
an existential threat also invalidated other forms of proving worth. Several actors
tried to contest the effectiveness of PNR. The EDPS said ‘that the Impact Assess-
ment includes extensive explanations and statistics to justify the Proposal. These
elements are however not convincing’ (EDPS 2011a, 3). According to an intervie-
wee, in an attempt to appeal to standards consistent with the order of production,
the European Greens tried to refer to cost-based arguments to emphasize the lack
of effectiveness and efficiency but failed (former MEP adviser, pers. comm. 2019).
Third, it is important to take note of private sector interests. On the one hand,
airlines called for a blanket collection of PNR in all forms of transport, includ-
ing trains, ‘so as to avoid discrimination and competitive distortions’ (Association
of European Airlines 2007, 1), thereby fostering an expansion of the regime.
On the other hand, private security companies designing security solutions con-
tributed to their perceived necessity. As information sharing has become an
integral part of security and defence policies, demand for private sector solutions
is increasing. In 2003, the European Commission convened a ‘Group of Person-
alities’ that included several industry representatives, including some from arms
production companies. The defence industry assumed an important role in shap-
ing future security policies (M. Britz and Eriksson 2005). Since then, different
funding programmes have promoted the use of technological tools in securing
both internal security and external borders. In 2015, the European Commis-
sion convened another group of personalities, which comprised sixteen members,
seven of whom held significant positions in private security or defence companies,
such as Indra, MBDA, or BAE Systems. The EU Ombudsperson noted that ‘the
Group of Personalities seems to have played a significant role in preparing the
Commission’s EDAP [European Defence Action Plan], especially as regards the
European Defence Fund’ (European Ombudsman 2018, para. 21). The European
Defence Fund, reduced from the proposed €13 billion to €8 billion for the period 2021 to
2027, is intended to include significant investments in disruptive technologies and
data-based security practices. This demonstrates the broader ongoing entrench-
ment of security concerns in the field, which are interlinked with the expansion of
PNR sharing.
There are also indicators of a more substantial involvement of private actors.
Press reports (Lévy 2016) and a publication by the NGO network EDRi
(McNamee 2016) point to a connection between the French PNR proponent
Manuel Valls and the company Safran. Safran is based in Évry, where Valls used to
be mayor until 2012. The Safran Group designs and implements data-based secu-
rity measures, including the biometric Aadhaar system in India that aims to be
the largest identity database in the world, a facial recognition program at Changi
Airport in Singapore, and the French army’s drone system (Safran 2016). The
Safran Group’s division Morpho was contracted to supply an API-PNR system by
France in 2014 and by Estonia in 2015.⁵ There is no evidence of any direct involve-
ment, but reports have claimed that Valls intervened when a contract was given to
a competitor (Lévy 2016). While private military contractors are relatively well
researched (Leander 2005), research on what is referred to as the ‘surveillance-
industrial complex’ (Ball and Snider 2013; Hayes 2012) is still in its infancy. It is
evident that profits from systems like PNR often benefit the private sector. Compa-
nies such as US software developer SAS, which ‘offers a broad PNR data system for
border management and security that supports continuous management and eval-
uation of high-risk passengers based on their pattern of activities, watch lists and
other data while helping to expedite the smooth movement of legitimate travellers’
(SAS 2017, 1), have an interest in expanding data processing.
In sum, these examples highlight the stabilization of compromises between
the order of security, the order of globalism, and the order of production: Pri-
vate companies have increasing interests in promoting their technological security
solutions based on data availability, mandated by public legislation. To avoid
inequalities between countries, these measures are implemented broadly. This
contributes to a self-reinforcing logic whereby both private companies and public
officials aim to increase the use of data-based security measures, stabilizing juris-
dictional claims based on a vision of data governance as a security partnership.
⁵ The former Morpho Division has since merged with Oberthur Technologies to form the new company
Idemia, which concluded both PNR projects. For more information, see https://www.idemia.com/,
accessed 30 June 2022.
In 2017, the UN Security Council adopted Resolution 2396 on ‘threats to international
peace and security caused by terrorist acts’ (UNSC 2017), which requires all UN member states to develop
systems for processing and analysing PNR. The UNSC also explicitly referred to
International Civil Aviation Organization (ICAO) standards and ‘urges ICAO to
work with its Member States to establish a standard for the collection, use, pro-
cessing and protection of PNR data’ (UNSC 2017, para. 12, emphasis in original).
In 2019, a joint EU–US declaration ‘reaffirmed our shared interest in establishing
ICAO standards to encourage rapid and effective implementation of UNSCR 2396
for the use of PNR to combat terrorist travel, with full respect for human rights
and fundamental freedoms’ (EU and US 2019, 2). This also reflects a commit-
ment to strengthening (institutionalized) global cooperation in this area as well as
the increasing role of the UNSC in counterterrorism after 9/11 (Kreuder-Sonnen
2018). This again highlights the important interlinkages to broader debates in
global governance.
Despite its strong entrenchment, there have been some signs of resistance that
highlight the precariousness of these agreements, including both direct and indi-
rect challenges. As is evident from the legal challenges to the Canadian PNR
agreement (CJEU 2017), challenges against PNR continue. The 29WP emphasized
that it ‘is convinced that the reasoning of the Court against the PNR agreement
with Canada is relevant for all PNR instruments’ (29WP 2018, 1–2). Indeed, fol-
lowing strategic litigation against the 2016 EU PNR Directive, a judgment in June
2022 (Ligue des droits humains 2022) has set further restrictions on the collection
and use of PNR. The Court broadly followed the recommendation of the
Advocate General Giovanni Pitruzzella, who, in January 2022, found the mass
collection and processing of data to be compatible with fundamental rights, con-
sidering necessity and proportionality as well as existing safeguards (CJEU 2022).
However, the Court set further restrictions on the scope and retention of data
as well as the purposes for their use, the application of the law in the context of
both intra-EU flights and alternative means of transportation, and the use of
further databases and automated profiling (Ligue des droits humains 2022, para.
299). Thus, while the Court upheld the legality of the PNR Directive, the long-term
consequences of the judgment are unclear. The restrictions set by the Court would
require a significant overhaul of the adopted laws in the member states, and lawsuits
are already pending (Gesellschaft für Freiheitsrechte 2020). However, further
legal challenges are likely to occur, as the restrictions require further interpreta-
tion regarding specific cases, specific types of data and risk, and concerning the
design of review and supervisory mechanisms. While the Court apparently tried
to balance its approach, this judgment may be subject to criticism in the future, as
it not only sets at times impracticable restrictions but also upholds a directive that
‘entails undeniably serious interferences with the rights guaranteed in Articles 7
and 8 of the Charter, in so far, inter alia, as it seeks to introduce a surveillance
regime that is continuous, untargeted and systematic’ (Ligue des droits humains
2022, para. 111).
5.4 Conclusion
To sum up the PNR case, successful efforts to conceptualize data as a ‘tool’ in the
fight against terrorism shifted the emphasis away from data as a protectable part
of human dignity, thereby entrenching the order of security in the field of data
governance. Interests of airlines that favoured legal certainty, private companies
that sell data-based security products, and US and EU officials who emphasized
the importance of security as a fundamental goal of governance converged with
endogenous logics that favoured an ever-increasing expansion of the regime. Con-
testation had significant but unintended consequences in this case. On the one
hand, the unwilling self-exclusion of the European Parliament through the 2006
CJEU case contributed to further entrenchment. On the other hand, incremen-
tal contestations proved stabilizing in the long term. While contestations have
constrained existing sharing practices in some ways via legal specifications, the
institutionalization of PNR has contributed to its normalization as a global secu-
rity standard. The jurisdictional conflict was, therefore, resolved by increasing
processes of normative and interest-based convergence, based on the delimitation
of the space of possibles through the conceptualization of data as a neutral tool in
fighting an existential threat.
From a normative perspective, it should be noted that the entrenchment of PNR
increases despite the absence of publicly available evidence of the effectiveness of
the systems. While the review reports mention concrete numbers (e.g. EC 2017b,
8), they do not show to what extent PNR was necessary for identifying these per-
sons, to what extent less intrusive measures such as API could have been used, or
whether any of these concerns have resulted in an arrest. Even if more details are
given, information is not necessarily more concrete. In answer to a parliamentary
question concerning the effectiveness of the German PNR system, the German
government states that
‘No information can be provided on the extent to which the measures men-
tioned or the data transmitted have contributed or will contribute to the preven-
tion/prosecution of crime. There is no systematic or standardized recording in the
sense of the question.’ (Federal Government (Germany) 2019, 5, my translation).⁶
This means not only that there is no effort to assess the effectiveness, neces-
sity, and proportionality of the system but that it is also not possible to do so.
The redress mechanisms, which provide legitimacy to the jurisdictional claim over
data, are also rarely used. The EU–US report notes that the DHS Traveller Redress
Inquiry Program has not received any enquiries or requests for judicial review.
This might be due not only to a lack of knowledge about the system but also to
⁶ The original reads: ‘Zu der Frage, inwieweit die genannten Maßnahmen oder die übermittelten
Daten in der weiteren Folge zur Straftatenverhütung/-verfolgung beigetragen haben oder noch beitra-
gen werden, kann keine Angabe gemacht werden. Eine systematische bzw. standardisierte Erfassung
im Sinne der Fragestellung erfolgt nicht.’
the lack of transparency of security measures. For example, the MEP Sophie in ’t
Veld filed a lawsuit against the DHS after she suspected that a freedom of information
request on PNR and other data had led to her being blacklisted, as she had to undergo
additional security checks when entering the US (Kanter and Minder 2010). Data-
based security measures have been strongly criticized due to their discriminatory
potential, also with regard to, for example, racial profiling and other practices of
‘social sorting’ (Lyon 2007). Resulting watch lists may be used as a justification for
screening measures and for being denied air travel, entering a country, or govern-
ment benefits. These lists have been found to infringe the constitutional rights
of US citizens (Savage 2019). Considering that in Germany every genuine lead is
accompanied by about 400 false leads (Endt 2019), there is significant potential for false positives.
In Chapter 6, I further outline how such practices of selective transparency and
secrecy contribute to the stabilization of agreement.
6
Financial Data Sharing
The Extended Arm of the US Treasury
In 2006, the use of data for security purposes was on the rise. The Europol Infor-
mation System, a central criminal and intelligence information database, had been
introduced in 2005 (Europol 2020), and the adoption of the EU Data Retention
Directive (Data Retention Directive 2006) just a year later specified mandatory
retention for telecommunication data. As discussed in Chapter 5, the CJEU had
found sharing passenger data in air travel a ‘necessary’ (Joined Cases C-317/04
and C-318/04 2006, para. 57) instrument in the fight against terrorism. In this
chapter, I explore a conflict over financial data between the EU, the US, and the
Belgian Society for Worldwide Interbank Financial Telecommunication (SWIFT).
When a 2006 article in the New York Times (Lichtblau and Risen 2006) revealed
that US authorities had over several years secretly accessed details of international
financial transactions under the so-called Terrorist Finance Tracking Program
(TFTP), significant transatlantic tensions arose. The TFTP is a measure for combating the
financing of terrorism (CFT) that was designed to track financial flows
to and through terrorist networks, which depend on money for purposes includ-
ing training, travel, bribes, or weapons. After the revelation, European actors
initially heavily contested this jurisdictional claim by the US. In 2010, the Euro-
pean Parliament famously rejected a high-level bilateral agreement supposed to
institutionalize these transfers (Monar 2010). However, since 2012, the sharing of
financial data for security purposes has become entrenched in the fight against ter-
rorism, despite ongoing controversies. The conflict shows important parallels to
the Passenger Name Records (PNR) case discussed in Chapter 5. The impetus for
financial data sharing similarly emerged as part of anti-terrorist measures imple-
mented after 9/11. The jurisdictional conflict likewise concerns the clash of these
security measures with EU data protection regulations due to the secondary use of
commercially collected data for security purposes. There was strong contestation
regarding whether financial data sharing would fall under the first (commercial)
or the third (policing and internal security) pillar of the pre-Lisbon EU (de Goede
2012b, 219–20). Yet while access to PNR was based on domestic law, the data
from SWIFT were initially requested in secret. Due to this undisclosed nature of
the initial programme and the continued lack of oversight and transparency, the
TFTP seemed to be even more controversial than the PNR programme. However,
much as with PNR, sharing of financial data has persisted, despite concerns about
its effectiveness (EDPS 2010a, 2010b). Indeed, it has become part of a broader
financial security assemblage that also includes anti-money laundering (de Goede
2012a, 43–5).
In this chapter, I aim to investigate how the conflict resolution process has stabilized despite heavy contestation at the outset and throughout the unfolding of the jurisdictional conflict. Financial data governance has received more
extensive attention in the literature than other issues of data governance. Existing
studies have mostly foregrounded either legal challenges (Fuster et al. 2008; King
et al. 2018) or the respective roles of the EU or the US in security cooperation
(Amicelle 2011; de Goede 2012a, 2012b; Wesseling et al. 2012). As with the PNR
agreement, scholars find the fate of the TFTP to be interlinked with the role of
the Parliament within the EU institutional system (Ripoll Servent and MacKen-
zie 2011) as well as its role as a global actor (Kaunert et al. 2012). De Goede
(2012b) points to pre-existing similarities in risk management and security con-
cerns. While Farrell and Newman rightly stress the access to transnational fora as
a key factor in the persistence of the TFTP (Farrell and Newman 2019; see also
Pawlak 2009), they underestimate the necessity of creating a lasting normative ‘fit’
(MacKenzie and Ripoll Servent 2012, 74).
This chapter aims to add to more recent scholarship that illustrates how dis-
tinct values become inscribed in such data infrastructures and shape regulatory
decision-making (Bellanova and de Goede 2022) and particularly how selective
practices of secrecy and transparency entrench surveillance practices (Amicelle
2013; Curtin 2014; de Goede and Wesseling 2017). I argue that the introduc-
tion of selective transparency measures such as review and oversight mechanisms
strengthened the idea of TFTP effectiveness and legitimacy. They specify trans-
parency requirements but leave significant leeway for actors under scrutiny. This
re-establishes the character of such mechanisms as ‘composite objects’ (Boltanski
and Thévenot 2006, 279) that represent compromises between the order of fair-
ness and the order of security. As expected by the theoretical framework, this has a
significant stabilizing effect on the conflict resolution process and outcome. It has
contributed to the entrenchment of evaluative criteria that tie jurisdictional claims
to a specific social reality. In this case, the vision of data governance as a secu-
rity partnership requires rules to stabilize the regime but works mainly through
informal transnational cooperation in the fight against terrorism and crime. This
chapter thereby discusses the underlying mechanisms that have perpetuated the
‘illusio’ (Bourdieu 1991, 179) of the field of data governance, that is, that data
governance is meaningful in the pursuit of the common good. Even formerly chal-
lenging actors such as the European Parliament are complicit in upholding an
illusio of effectiveness concerning the pursuit of the common good of security.
The chapter is structured as follows: first, I give a brief overview of the emer-
gence of the jurisdictional conflict after the revelation of US access to financial data
by the New York Times and point to responses. Second, I illustrate the increasing
FINANCIAL DATA SHARING 131
with the US Treasury, SWIFT could not disclose the requests (SWIFT 2006). It is
important to note that the TFTP was and is not designed to interdict any transfers
but is restricted to the identification and tracking of terrorist funding networks (US
Department of the Treasury 2022). In contrast to mutual legal assistance treaties,
which are used in the prosecution of crimes (see Chapter 7), the TFTP is based
on a ‘politics of pre-emption’ (de Goede 2018a, 759–60) which tries to identify
potential terrorists before they strike.
SWIFT serves over 11,000 financial institutions in more than 200 countries
and handles more than 7.8 billion financial messages, which makes up the vast
majority of electronic financial transactions in the world (SWIFT 2022). It devel-
ops international standards and facilitates international financial transactions.
SWIFT saves data on financial transactions in case of disputes between banks and
clients (Commission de la Protection de la Vie Privée 2006, 3). Through its quasi-
monopoly position, SWIFT has a key role in international trade and is subject to
oversight by the central banks of the Group of Ten (G10). SWIFT data contain
details about transactions, such as currency, amount, bank and account details,
date, addresses, and in some cases national identification numbers or additional
personal data.
While SWIFT is an organization operating under Belgian law, the data were also
accessible on an identical server from its processing centre in the US. European
data protection authorities were not aware that SWIFT mirrored the data. How-
ever, as they were technically stored in the US, this facilitated direct data access by
the US authorities. The data comprised information about financial transactions to
and from the EU. In principle, this required compliance with both member state
and EU law, including judicial authorization and an adequate standard of data
protection (EP 2006, note 2). SWIFT argued that ‘since the subpoenaed subset of
data already were on US soil, at no point did any crossborder data transfer exist’
(SWIFT 2006, 9). SWIFT, therefore, stated that any cross-border data transfers
had occurred purely for commercial purposes in line with required safeguards.
As a processor, SWIFT did not consider that it had any specific responsibility
for the data that were now stored on US soil. It suggested that any onward trans-
fers were out of its control and consequently complied with subpoenas requiring
data access (Vanbever 2006). Nonetheless, executives at SWIFT seemed to have
been aware of potential legal implications. They asked for additional oversight
and considered withdrawal from the agreement in 2003 (Commission de la Pro-
tection de la Vie Privée 2006, 6–8; Lichtblau and Risen 2006). In response, the US
Treasury introduced additional safeguards, and SWIFT subsequently informed
the National Bank of Belgium and members of the G10 group of central banks
that it was subject to US subpoenas (Commission de la Protection de la Vie Privée
2006, 7–8). While the jurisdictional claim by the US Treasury had been extended
in secret, it now included a broader group of actors that were aware of some form
of data access.
While some actors in the EU had been informed about the overlapping claims,
most were not, and the 2006 article in the New York Times (Lichtblau and Risen
2006) that revealed the extent of international financial transactions data accessed
by the US caused significant tensions. In particular, data protection authorities
and the European Parliament felt blindsided (DPA official, pers. comm. 2017).
Through subpoenas, the US had forced a Belgian firm acting under EU law to
submit information through extraterritorial rule and thereby potentially break
EU data protection rules. Again, the secondary use of bulk commercial data for
law enforcement and security purposes was potentially at odds with the purpose
limitation principle as specified by the 1995 Data Protection Directive (Data Pro-
tection Directive 1995, Art. 6(1)(b)) (see also 29WP 2006; Commission de la
Protection de la Vie Privée 2006). The extent of the data transfers was not clear,
particularly because a single subpoena might grant access to a significant amount
of data (Bruguière 2010). Access practices not only concerned the data of European
and US citizens but—due to the international character of the SWIFT network—
impacted financial transactions all over the world. Complaints to data protection
authorities were filed not only in all EU member states but also in Australia,
Canada, Iceland, Liechtenstein, New Zealand, Norway, Switzerland, and Hong
Kong (Bilefsky 2006a).
When the transfers became public, SWIFT rejected allegations of legal mis-
conduct, arguing that it was acting in accordance with US law and Belgian
data protection law (SWIFT 2006). The insistence on legality is, however, to
some extent invalidated by the statement of Leonard Schrank, the Chief Exec-
utive Officer of SWIFT, who said in the same year, ‘We are caught between
complying with the U.S. and European rules, and it’s a train wreck’ (cited in
Bilefsky 2006b), thus hinting at conflicting obligations under different regu-
latory systems. SWIFT established itself as a target of US power, emphasiz-
ing that it ‘had no choice but to comply with compulsory subpoenas lawfully
issued by the UST [United States Treasury] to its US branch’ (SWIFT 2006,
6). SWIFT also emphasized that it had ‘obtained unique and effective data
protection guarantees from the UST, in circumstances where the Act [Belgian
data protection law] was not applicable’ (SWIFT 2006, 9). One of these guar-
antees was that SWIFT was not to be named as the original source of any
intelligence derived from the data (Commission de la Protection de la Vie
Privée 2006, 7). Thus, while establishing safeguards, SWIFT had also made sure
that the US authorities would not disclose the role of SWIFT in intelligence
collection.
Initial EU responses to the revelations were very critical, especially since the
US had kept the programme secret. The European Parliament adopted various
resolutions (EP 2006, 2007b, 2009) expressing deep concern and disappointment
regarding the TFTP. The Parliament justified these concerns in reference to the
(Monar 2010, 144). Yet due to the significant attention to the secret nature of the
TFTP, widespread open support for US practices was politically sensitive. The
Belgian prime minister Guy Verhofstadt publicly called the programme ‘an abso-
lute necessity’ (cited in Anderson 2006), but most member state representatives,
including those from Germany and France, expressed their support in more infor-
mal or private settings. Only the UK seemed to be hesitant about an EU–US
agreement, because it was likely that the Commission’s leading role in negotiations
would imply more constraints on data access and sharing (Farrell and Newman
2019, 104–5). Furthermore, according to a leaked document by the US embassy in
London, the UK seemed:
extremely concerned that the Terrorist Finance Tracking Program, and SWIFT’s
involvement in it, is being treated by the European Commission purely as a pri-
vacy issue when it should be ‘rooted in a national security debate’. While HM
Treasury officials are currently preparing advice for ministers on UK next steps,
they are planning to argue that the European Council, not the Commission, has
competency over this issue.
(US Embassy London 2006)
Hence, UK officials seemed to have been concerned about a potential shift in the
debate towards evaluative mechanisms rooted in the order of fairness. Instead,
they pointed to principles rooted in the order of security and emphasized informal
cooperation mechanisms and transnational dialogue, which is more consistent
with a vision of data governance as a security partnership. The European Com-
mission was generally in support of the programme. In 2005, the unit responsible
for data protection had been transferred from the Internal Market Directorate
General to the Directorate General for Justice, Freedom, and Security. This unexpected move was strongly criticized due to the potentially stronger influence of security-favouring officials (Farrell and Newman 2019, 109). A leaked document
describes meetings between, inter alios, the Treasury Deputy Assistant General
Counsel James Freis and various representatives from the European Commission,
the Council, and member states, including Jonathan Faull, the Director General
for Justice, Freedom, and Security. The report notes the commitment of both EU
and US officials to the programme and states that:
There were opportunities for exchange between high-level officials which, how-
ever, crucially excluded data protection and European Parliament representa-
tives. In this regard Pawlak (2009) stresses in particular the role of various
the programme, this statement also points to valued objects and principles, such
as law, and privacy safeguards in line with the order of sovereignty and the order of
fairness.
In 2006, both the 29WP and the Belgian DPA had problematized the role of
SWIFT, suggesting that it was acting as a co-controller rather than a mere pro-
cessor of data (29WP 2006; Commission de la Protection de la Vie Privée 2006).
This distinction is central, because it assigns different levels of responsibility. A
‘processor’ is considered an entity ‘which processes personal data on behalf of the controller’ and has limited data protection responsibilities, in contrast to a ‘controller’, which ‘determines the purposes and means of the processing of personal data’ (Data Protection Directive 1995, Art. 2(d)–(e)). For example, on the basis of the
definition of Google as a data controller rather than a data processor, the company
must limit the availability of search results under the EU ‘right to be forgotten’ (see
Chapter 8).
In 2008, the Belgian DPA largely changed its approach to the nature of the finan-
cial surveillance measures. It came to the new conclusion that SWIFT was to be
considered a ‘de facto delegate [délégué de fait]’ (Commission de la Protection
de la Vie Privée 2008, 59) for the financial community. This concept was introduced by the report and, in contrast to the earlier report, stipulated that SWIFT
had not violated Belgian law. In the 2006 decision, the Belgian Privacy Commis-
sion had criticized the vague and unconventional definition of terrorism and had
emphasized the non-individualized and massive nature of the requests, comparing them to the measures of ‘Rasterfahndung’ (dragnet screening) and ‘carpetsweeping’ (Commission de
la Protection de la Vie Privée 2006, 5). In contrast, in its 2008 report, the DPA
referred to several UNSC Resolutions (Commission de la Protection de la Vie
Privée 2008, 11–12) which had not been considered in the initial report but were
emphasized in the US undertakings (US Department of the Treasury 2007b, 19).
The changed assessment resolved the jurisdictional conflict, as the DPA evalu-
ated the claim according to principles and valued objects of the order of security
such as UNSC resolutions rather than private responsibility or privacy principles.
Nevertheless, this should be considered as a conflict management and contain-
ment strategy rather than as an indication that the conflict did not exist in the first
place, as is indicated by the initial opinion of the Belgian DPA and the 29WP state-
ments. At that point, the Belgian DPA had severely limited enforcement capacities
(Pawlak 2009, 572). Without the political support of the member states, it is likely
that it would have failed to assert its jurisdiction on the basis of alternative normative principles. This points to ignoring a jurisdictional conflict as a potential
management strategy. For SWIFT, admitting that it was in conflict with European
law would have meant risking a penalty, as SWIFT did not notify the Belgian DPA
about the subpoenas. For the DPA, it meant not exposing its weak position in the
field. This shift in position by the DPA had crucial implications for the space of
possibles for future policy options. On the one hand, it limited the legal actions
that could be taken against SWIFT and the banks. On the other hand, it provided
legitimacy to the agreement, addressing concerns about potential ramifications of
openly contradicting DPA opinions, for example by Ireland (US Embassy Dublin
2006).
While willingness to pursue SWIFT’s practices legally was limited, DPA enforce-
ment was still possible, and SWIFT tried to address existing criticisms. On the one
hand, in 2007, SWIFT adopted the Safe Harbour principles, privacy principles that
allowed transatlantic data transfers for commercial purposes (see Chapter 4). As
a European firm, SWIFT was not obligated to join the Safe Harbour framework—
and in 2006 had argued that it was not even possible (SWIFT 2006, 8). The
decision to join must, therefore, be understood as a symbolic rather than substan-
tive move. On the other hand, and more importantly, in October 2007 SWIFT
announced the opening of an operating centre in Switzerland to handle European
data and stop data mirroring in the US. This had wide-ranging implications, as it
meant that the data were out of reach for US subpoenas (SWIFT 2007). The exclu-
sive storage of European data outside US territory made formal actions necessary if
the transfers were to continue after the envisaged moving date of 1 January 2010.
In sum, the secret and sovereignty-intrusive nature of the TFTP had caused an
initial furore in the EU. However, those concerns were increasingly marginalized.
While the jurisdictional conflict seemed to be resolved through letter exchanges,
the change of SWIFT’s technical architecture demanded a legally binding
agreement.
problematized the restricted nature of the negotiating directive and the lack of
information on the suggested legal basis. MEPs demanded the negotiation of a
permanent agreement with increased participation and scrutiny (EP 2009, D, 7(i)).
On 30 November 2009, exactly one day before the Lisbon Treaty was set to come
into force, the Council decided to abandon any attempts to include the Parliament
and unilaterally adopted an interim agreement. However, the Council failed to
fully adopt the agreement before the Lisbon Treaty came into force (Farrell and
Newman 2019, 114). The agreement now required the consent of the European
Parliament according to Article 218(6) of the TFEU. It foresaw a starting date
for the agreement of 1 February 2010 (Council of the EU 2009), a short dead-
line with limited options for review (EDPS 2010b; LIBE 2010). The Council’s
delayed request for consent therefore seemed to represent an attempt to present
the Parliament with a done deal (Monar 2010, 144–6). In consequence, MEPs were
largely critical of the solution, as the last-minute agreement seemed to undermine
the increased institutional powers of the Parliament in EU external relations. The
adoption or rejection of the agreement was perceived as an important test case
for the new powers of the Parliament (de Goede 2012b; de Goede and Wesseling
2017, 254–5).
In response to this perceived challenge to its position in the field, the Parliament rejected the agreement by 378 votes to 196, with 31 abstentions (EP 2010a;
LIBE 2010). This assertion of power was remarkable, particularly in light of what
Monar describes as extraordinary lobbying efforts, which included a phone call
from the US Secretary of State Hillary Clinton to the European Parliament Pres-
ident Jerzy Buzek as well as internal appeals from the EU Presidency and the
European Commission (Monar 2010, 143). In the debate, the Home Affairs Com-
missioner Cecilia Malmström had warned that the Parliament’s rejection would
constitute ‘a serious threat to the security’ (cited in EP 2010a) of EU citizens,
thus attempting to impose criteria of the order of security. MEPs voiced concerns
regarding the retention period, the extent of bulk data transfers, and the policies on sharing with other authorities and third countries (EP 2010a). The fact
that the agreement had never been critically evaluated with regard to its effec-
tiveness constituted a further concern (LIBE 2010). Most significantly, even for
strong supporters of data sharing for security purposes, the attempt to undermine the European Parliament’s strengthened position in the field constituted a significant problem. For example, the Conservative MEP Timothy Kirkhope, a strong
supporter of the PNR agreement (see Chapter 5), expressed his anger at the Coun-
cil’s procedural decisions and argued that ‘Parliament’s right to consent should
not be used as a retrospective tool’ (cited in EP 2010a). Thus, while there were
concerns about the potential moral worthiness of the agreement, a significant part
of the rejection was based on the Council’s attempt to undercut the Parliament’s
strengthened position.
Despite this extraordinary display of power, the European Parliament accepted
a revised version of the agreement only four months later, in July 2010 (Kaunert
et al. 2012, 489), shortly after the then US Vice President Joe Biden had addressed
MEPs, arguing that ‘not less than privacy, physical safety is also an inalienable
right’ (cited in EP 2010b). The revised agreement contained concessions to the
Parliament’s concerns regarding bulk data transfer, oversight, blocking possibil-
ities, and retention time. It guaranteed that data were ‘pushed’ from the EU to
the US rather than ‘pulled’ directly by the US authorities (EU and US 2010). The
agreement also required the evaluation of transfer requests by Europol. Some still
considered the concessions too limited (EDPS 2010b).
As in the PNR case, the literature converges around the assumption that the
Parliament mostly wanted to cement its role as a strict but responsible and coop-
erative partner also in ‘high politics’ (Kaunert et al. 2012, 489; Ripoll Servent
and MacKenzie 2011). Some additional factors contributed to the acceptance.
First, the institutionalization of reciprocity in Art. 9 and 10 of the agreement,
which the European Parliament (2009) had stressed, proved important. The
possibility of reciprocal data access stripped away the opposition by the social
democrats in particular (Farrell and Newman 2019, 114–15). Second, the agree-
ment recognized the European Parliament’s request for increased parliamentary
and EU scrutiny, which increased normative fit between the jurisdictional claims
(MacKenzie and Ripoll Servent 2012, 87) and further entrenched the Parliament’s
role in transatlantic security policymaking. Third, the revised agreement proposed
the implementation of a European Terrorist Finance Tracking System (TFTS)
(EU and US 2010, Art. 11), which would make the transfer of raw data to the
US dispensable. This constituted one of the conditions under which an increasing
number of MEPs were willing to accept the agreement, as they hoped for additional protection once data processing was restricted to EU soil (Farrell and
Newman 2019, 119). This is evidence of an increasing acceptance of the value
of TFTP data in the fight against terrorism. The Parliament had recognized the
field’s illusio regarding the effectiveness of financial surveillance in the pursuit of security.
In contrast to the similar PNR case (see Chapter 5), the initial plan of adopt-
ing a European TFTS has not been implemented (Wesseling 2016). In 2013, the
European Commission concluded that there was no demonstrated need for set-
ting up a European system. The Commission argued that TFTP reciprocity was
working well and doubted the added value of a European TFTS, from both data
protection and security perspectives (EC 2013b). Thus, the idea of potential intru-
sions on sovereignty and the ‘asymmetric relationship between the United States
and other sovereign entities in relation to the growing area of financial intelli-
gence’ (Amicelle 2013, 5) seemed to be outweighed by potential security benefits.
The question was revisited after terrorist attacks on European soil (EC 2017a). As
with the revival of the PNR system after its rejection in 2013, new developments in
finance might prompt the expansion of the regime. The existing agreement to a
considerable degree depends on the monopoly position of SWIFT, and the TFTP
programme does not cover Single Euro Payments Area (SEPA) transactions within
the EU or transactions in cryptocurrency. In 2016, the Commission presented
plans to investigate potential additional coverage of intra-EU financial transactions
(EC 2016b), while a recent consultation does not yet discuss this option specifi-
cally (EC 2020d). Terrorism financing has also been addressed by the recent AML
Directive and in the 2017 Directive on Combating Terrorism (Haffke et al. 2019).
In sum, the Council attempted to avoid a more comprehensive inclusion of
MEP demands for safeguards in the negotiation of a bilateral EU–US agreement
by adopting an interim agreement shortly before the Lisbon Treaty strengthened
the European Parliament’s competences. When asked to consent to an agreement
that had already been provisionally in force, the Parliament asserted its new position in
the field but shortly after accepted a revised version. The agreement was based
on enhanced review and oversight mechanisms, reciprocity agreements, and a
potential future shift of data processing to the EU.
Over time, the TFTP became entrenched, particularly as member states increas-
ingly relied on the submission of requests to the US Treasury (EC 2013c, 9).
While the 2013 revelations of widespread surveillance by the US National Secu-
rity Agency (NSA) by the former intelligence contractor Edward Snowden (for
an overview, see Chapter 4) constituted a challenge in the field, they did not fundamentally change the TFTP. In Section 6.2.1, I describe how practices of secrecy,
selective transparency, and review have contributed to the entrenchment of the
order of security as the evaluative standard.
In 2013, after the Snowden revelations, the European Parliament called for a
suspension of the TFTP agreement on the basis of indications:
that the US National Security Agency (NSA) has had direct access to the IT
systems of several private companies and gained direct access to financial pay-
ment messages referring to financial transfers and related data by a provider
of international financial payment messaging services currently covered by the
Agreement.
(EP 2013, para. B)
In response to allegations that the US accessed data beyond the TFTP agree-
ment, the European Commission initiated talks but eventually decided not to
engage in formal consultations (EP 2014a, para. BC). In a report in late 2013,
the European Commission emphasized that the informal dialogue ‘did not reveal
any elements proving a breach of the TFTP Agreement, and they led the US
to provide written assurance that no direct data collection has taken place con-
trary to the provisions of the Agreement’ (EC 2013e, 5). Yet the lack of trans-
parency regarding US compliance with the agreement (EP 2014a, para. BB)
and the fact that member states did not request a technical investigation that
could have addressed concerns about direct data access (EP 2013b, para. 4) wor-
ried MEPs. Notably, in the LIBE hearings following the Snowden revelations
(see Chapter 4), the Commission’s Director of Home Affairs, Reinhard Priebe,
refused to comment on whether direct access to SWIFT by intelligence services
would actually constitute a derogation of the agreement (LIBE 2013, third hear-
ing, first session, 24 September). In a letter, the US stated that ‘the US Government
is using the TFTP to obtain SWIFT data that we do not obtain from other
sources’ (letter from the US authorities, dated 18 September, cited in EP 2014a,
note 1).
The letter’s ambiguity with regard to the collection of SWIFT data implied that
direct data collection could have potentially taken place in compliance with the
agreement. Press reports about financial surveillance of credit card data and direct
access to SWIFT networks prompted the Parliament to repeat its call for a suspen-
sion of the agreement (EP 2014a, para. 54). The Parliament further referred to the
testimony of the journalist Glenn Greenwald, who argued in a LIBE committee
meeting that both the NSA and the UK’s Government Communications Head-
quarters (GCHQ) had directly obtained data from SWIFT (EP 2014a, para. BD).
In 2017, the hacker group Shadow Brokers published documents that suggested
the NSA might have had access to two SWIFT service bureaux to monitor trans-
actions in the Middle East and Latin America. SWIFT rejected these allegations
(Lee 2017).
None of the references to dystopian scenarios in the order of fairness, such as
the lack of compliance with existing safeguards and transparency, had significant
effects on the persistence of the TFTP agreement. Despite the calls for its suspen-
sion, the European Parliament was acutely aware of the limitations of its position
in the field, being restricted to commenting on the existing agreement. However,
MEPs argued that they would ‘take account of the responses of the Commission
and the Council in relation to this Agreement’ (EP 2013b, sec. 11).
Thus, while the European Parliament seemed willing to discursively con-
test the jurisdictional claim of the US, it seemed to accept the lack of realistic
options to translate this challenge into tangible institutional change. The con-
flict resolution stabilized, on the one hand, through sheer necessity. On the other
hand, the Parliament’s perception of possible policy options was also shaped
¹ The JSB, which was composed of DPAs from different member states (JSB 2011), has since 2017
been replaced by the EDPS (EDPS 2017b).
for the LIBE committee. This sparked a major controversy. The European Com-
mission was clearly dissatisfied with the JSB’s decision to investigate the TFTP.
In the joint review, it highlighted that ‘parallel or uncoordinated initiatives or
inquiries should be avoided because they undermine the article 13 review pro-
cess and have caused considerable workload on the Treasury in particular’ (EC
2012b, 15–16). The Commission further considered the JSB’s decision to pro-
vide access to the report to be ‘a clear violation of applicable security rules and
a breach of mutual trust’ and asked for future coordination between supervisory
bodies ‘in order to avoid overlapping activities and misleading public statements’
(EC 2012b, 16). The Commission’s statement clearly highlights the selective char-
acter of transparency and review procedures. While the Commission review is
conceptualized as an effort to prove worth in the order of fairness, citing data
protection principles and the importance of review procedures, the Commission
criticizes the JSB’s efforts due to their negative impact on transatlantic relations
and trust, referring to evaluative criteria based in the order of globalism. The
Commission report can be interpreted as a clear attempt to appease US concerns
and demonstrate strong alignment of the Commission with the US government,
which had articulated concerns about the JSB’s investigation (Mitsilegas 2014,
303). More specifically, according to a letter cited in an Ombudsperson report,
the US authorities considered that such a breach ‘may potentially undermine the
relationship of trust’ (letter from the US authorities, cited in European Ombuds-
man 2014, para. 6) between the parties, thus also emphasizing a lack of compliance
and reliability. The jurisdictional overlap became even more conflictual. When
LIBE committee members asked for privileged access to the full JSB report,
Europol refused access. Europol argued that ‘technical modalities’ which were
agreed upon in the TFTP agreement but never shared with the Parliament or
the Council prevented it from sharing the report as such (European Ombudsman
2014, para. 17). The MEP Sophie in ’t Veld again decided to lodge a complaint
with the EU ombudsperson to gain access to the report, which the Ombudsperson considered to reflect ‘a democratic deficit at the level of the EU which must be addressed’ (European Ombudsman 2014, para. 17). The Ombudsper-
son was, despite initiating a dialogue with the US embassy, not able to gain
access to the review either. She strongly problematized her constrained ability to
access a document produced by an EU oversight body on the data of EU citizens.
146 DATA GOVERNANCE
She concluded that the Parliament should consider ‘whether it is acceptable for
arrangements to be agreed with a foreign government which has the consequence
of undermining mechanisms established by or under the EU Treaties for the
control of EU executive action’ (European Ombudsman 2014, para. 20), thus
highlighting concerns of internal and external sovereignty. The Ombudsperson’s
decision shows contempt for the selective practices of secrecy and transparency,
highlighting principles based in the order of fairness, such as democratic princi-
ples and parliamentary scrutiny, as well as principles of sovereignty referring to a
dystopian scenario of foreign encroachment.
This example demonstrates how the conflict resolution process is stabilized
by selective practices of secrecy and transparency. The possibility of demanding
transparency, even for MEPs, who have high political status, is severely con-
strained. The US jurisdictional claim over data notably extended beyond the
data as such and also covered the review procedures. As with the revelations,
transparency rather than the practices of secrecy were considered to be morally
deficient. This isolated the TFTP from comprehensive scrutiny, which contributed
to its stabilization. Regarding the book’s main question of conflict resolution, the
example points to the confirmatory character of review and scrutiny procedures,
which tend to circumvent rather than represent challenges to the stability of an
agreement.
based on the understanding that it was not its task to provide a political judgement
on the Agreement, this being considered outside the scope and mandate under
Article 13 … Where recommendations are presented, these are aimed at further
increasing the effectiveness of the application of the Agreement, in particular its
safeguards.
(EC 2012b, 4)
This statement, which can also be found in subsequent reviews, clearly highlights
how the European Commission voluntarily constrained its space of possibles. In
submission to the illusio of the field, particularly the interpretation that the TFTP
is meaningful in the pursuit of the common good of security, the European Com-
mission limited its role to informing about the programme while refraining from
value judgements or demanding consequences.
The agreement requires only minimal evidence for the effectiveness of the pro-
gramme. In its first review, the Commission noted the need to provide more
publicly available statistical information to demonstrate the benefits of the TFTP
(EC 2011, 1). However, the TFTP agreement does not require the US to give
detailed information on the total volume of financial messages. In consequence, no
information on the total volume of data is given, but the US Treasury provides
information on ‘leads’ or ‘reports’ that may vary in data volume (EC 2012b, Annex
II, Q4). For example, in the 2019 review period, 1,115 searches were conducted
on average per month and the leads shared with the EU increased significantly
(EC 2019c, 8–9). However, ‘The Treasury maintains its view that disclosure of
overly detailed information on data volumes would, in fact, provide indications as
to the message types and geographical regions sought … and would have the effect
that terrorists would try to avoid such message types in those regions’ (EC 2019c,
9). Thus, the volume of data transfers was ‘reconfigured into sensitive security
information’ (de Goede and Wesseling 2017, 263), with any disclosure poten-
tially threatening the integrity of the programme. Any re-emergence of conflict
was thus averted by classifying potentially threatening data as secret.
Nonetheless, in 2015, the JSB emphasized that ‘There is a clear
tension between the idea of limiting the amount of data to be transmitted by tai-
loring and narrowing the requests and the nature of the TFTP’ (JSB 2015, 3).
While there are statements that question the broad nature of the requests, even
in the European Commission joint review (EC 2019c, 19), these are mainly char-
acterized by a lack of consequences, thus again stabilizing the agreement. For
instance, in the 2018–19 period, overseers queried 645 cases, ‘the overwhelm-
ing majority of which [queries] were selected for routine auditing purposes’ (EC
2019c, para. 37). They stopped seventy-five and blocked fifty-three cases retroac-
tively. This indicates that from a sample of largely routine requests, more than 19
per cent of searches were considered too broad. This is not problematized in the
report. The evidence provided for the evaluation of the effectiveness of the TFTP
also seems to contradict its preventative aims. On its website, the US Treasury
states:
These U.S. Treasury Department efforts have not only disrupted terrorist net-
works, they have saved lives. Since the start of the program, the TFTP has
provided thousands of valuable leads to U.S. Government agencies and other gov-
ernments that have aided in the prevention or investigation of many of the most
visible and violent terrorist attacks and attempted attacks of the past decade.
(US Department of the Treasury 2022)
While the emphasis in this statement is on the preventative quality of the TFTP, the
examples provided in the annex of the 2017 review highlight cases in which indi-
viduals or groups were already being investigated rather than identified through
the TFTP. The report refers to the Charlie Hebdo and Paris attacks (EC 2017b,
41–5), which exemplify the TFTP’s usefulness for the investigation and collection
of evidence rather than its ex ante success (see also Amicelle 2013). The
emphasis on pre-emption and prevention in post 9/11 counterterrorism measures
(McCulloch and Pickering 2009) thus seems to be at odds with the evidence pro-
vided for their effectiveness. As the EDPS points out in 2011 regarding a European
Commission document, ‘The Communication mentions on several occasions the
“added value” or the “interest” of the US TFTP, without referring to any analysis
of the efficiency of existing tools’ (EDPS 2011b, 2). By consistently avoiding a clear
assessment of the value of data, even in dialogue with its oversight committee, the
TFTP’s effectiveness is constructed around a certain aura of importance that is
created by, not despite, its secrecy (see de Goede and Wesseling 2017). Thus, again,
the notion of secrecy prevents the emergence of the imperative of justification. It
is based on the active removal of both the data and the discussion about its gover-
nance from public discourse. In conclusion, in the resolution of the conflict and the
stabilization of the agreement, actors draw on practices of selective transparency
to sustain and legitimize their jurisdictional claims. Continued practices of secrecy
entrench a vision of mundane data as highly classified and sensitive, which averts
critical scrutiny. As highlighted above, review reports seem to have a confirmatory
role for the persistence of the agreement by remaining vague, which is consistent
with the expectations articulated by Boltanski and Thévenot (2006, 336) regarding
the stabilization of compromise. Potential infringements of the agreement rarely
become visible. If actors manage to highlight a lack of transparency, the debate is
FINANCIAL DATA SHARING 149
The growing impact of the FATF on the area of AML and CFT demonstrates an
increased emphasis on data sharing in financial matters for security purposes. The
29WP has criticized this as a departure from data protection standards, pointing to
their framing as obstacles needing to be overcome or circumvented (29WP 2011).
AML measures usually require banks to install systematic scanning and tracking
measures for customers and identify suspicious transactions, which the EDPS crit-
icizes as ‘blanket measures’ (EDPS 2017a, 12), raising concerns about invasions of
privacy and proportionality.
FATF standards have also increased reliance on private actors as actively inter-
preting security actors (de Goede 2018b). While the TFTP largely involved SWIFT
as a passive supporter, potential changes stemming from the diversification of
financial channels make it likely that, as in the case of AML, reliance on ‘pri-
vate corporations for policing purposes’ (Herschinger et al. 2011, 465) will further
increase in CFT. This is despite apparent failures of banks to fulfil their responsi-
bilities as private enforcers. In 2012, the US authorities fined HSBC $1.9 billion
because the bank had failed to detect Mexican drug cartels laundering $881
million as well as ongoing terrorism financing in Saudi Arabia and Bangladesh
(Homeland Security and Governmental Affairs Committee 2012). In 2019, the
Australian bank Westpac was accused of ‘serious and systemic’ breaches in more
than 23 million cases (Janda and Ryan 2019). Danske Bank was at the centre
of a money laundering scandal that involved over €200 billion in suspicious
transactions through branches in Estonia (Bjerregaard and Kirchmaier 2019).
In sum, the jurisdictional conflict over TFTP is embedded in a financial secu-
rity assemblage that is characterized by increasing scope, particularly concerning
the role of international actors, but also with regard to targeted transactions, the
increasing role of private actors, and a shift towards principles embedded in the
order of security. This might contribute to the emergence of jurisdictional con-
flicts in the future, as EU DPAs have voiced concerns about the blanket character
of financial surveillance.
6.4 Conclusion
This chapter has traced the evolution of financial data sharing in the transat-
lantic context. By analysing the unfolding and resolution of a jurisdictional conflict
involving the EU, the US, and the Belgian cooperative society SWIFT, I have inves-
tigated how financial data sharing for counterterrorism purposes has stabilized
despite significant disruptions. The jurisdictional claim by the US, while contested,
is codified through a bilateral agreement that has withstood the test of time. I
have argued that the resolution of this jurisdictional conflict is based on the suc-
cessful imposition of security as the main criterion of evaluation. In view of this
extensive inter- and transnational cooperation, data governance is most strongly
US adopted the contested Clarifying Lawful Overseas Use of Data Act (CLOUD
Act 2018), a law that explicitly legalized access to data stored beyond US territory
under specific conditions. While this unilateral assertion of US law could have pro-
voked an international dispute, reactions were limited, most likely because similar
legislative proposals were already under discussion in the Council of Europe and
the EU. Yet as the EU is increasingly attempting to impose data protection stan-
dards for cloud data storage, particularly concerning public sector and sensitive
data, and in light of the deficiencies of data protection in the US context identified
by the Schrems II judgment (see Chapter 4), it is likely to be the subject of conflicts in
the future.
Although electronic evidence has become a key concern not only in transat-
lantic law enforcement cooperation but also in the Council of Europe (Daskal
2018), it has received surprisingly little attention in IR (for a limited exception,
see Farrell and Newman 2019, 41–6). The rise of cross-border cooperation in law
enforcement, particularly in light of the widening role of supranational authority
in the EU (Herschinger et al. 2011; Trauner and Carrapiço 2012) but also more
specifically in relation to data protection (Blasi Casagran 2016; de Busser 2009; de
Busser et al. 2014) and EU–US cooperation (Anagnostakis 2017, ch. 3), has been
well illustrated. Yet few contributions have examined the challenges of electronic
evidence collection and sharing, and then mostly from a legal perspective (Autolitano
et al. 2016; Biasiotti et al. 2018; Christou 2018; Daskal and Swire 2018; Ligeti
and Robinson 2021). Bigo et al. (2012) concentrate on the implications for
data protection rights in the context of cloud computing, including transfers to
third countries and jurisdictional disputes.
This chapter aims to address the existing gap in the literature concerning recent
evolutions of electronic evidence, focusing on two aspects of the conflict resolution
process. First, I investigate the role of Microsoft in this jurisdictional conflict. The
company profoundly challenged the US Department of Justice, acting differently
from other companies such as banks and airlines (see Chapters 5–6) that tend to
support rather than challenge informal access practices for security purposes. Sec-
ond, I analyse how it is possible that the legalization of the US jurisdictional claim
contributed to the resolution rather than the exacerbation of the conflict. The
chapter discusses in particular the international dynamics of this jurisdictional
conflict while also highlighting the complex public–private relationships that con-
tribute to both its emergence and its comparatively quick legislative resolution.
First, I argue that private companies such as Microsoft have increasingly adopted
a more long-term strategy that is based on legal challenges, political advertising,
and the discursive construction of private tech as responsible actors. In answer to
the second question, I argue that the field is subject to incremental processes that
entrench data governance as a security partnership. The circumvention of due
process norms, data protection concerns, and sovereignty considerations implies
a hierarchization of security compared with other principles. While shifts in the
order of security fail in the context of the Microsoft court case (In re: A Warrant
2016; United States v. Microsoft Corp. 2018), parallel meaning-making processes
create the backdrop against which the legalization of existing and contested US
law enforcement practices is considered legitimate.
In this chapter, I first offer a brief introduction to the technical and legislative
challenges of electronic evidence before, secondly, demonstrating the emergence
of a jurisdictional conflict arising from Microsoft’s refusal to comply with a
US search warrant. I illustrate the increasingly international character of the con-
flict and focus on the justifications brought forward by the US, Microsoft, and a
diversity of international actors in the US Supreme Court before outlining the leg-
islative response of the US via the CLOUD Act. Thirdly, I highlight interlinkages
with broader developments in the field, in particular meaning-making processes
that entrench the moral value of security cooperation before reflecting on the
implications in the conclusion.
Before getting into the conflict, it is important to briefly illustrate the rele-
vance and details of electronic evidence, particularly how the technical nature
of data storage, transfer, and access constitutes challenges for law enforcement
agencies.
Discussions about access to electronic evidence are embedded in a broader
attempt to tackle crime on the internet. Several global and regional institutions
such as the OECD, the G7/8, or the International Telecommunication Union
(ITU) have mainly focused on the issue of cybercrime (Christou 2018). The Coun-
cil of Europe’s Budapest Convention (2001), with more than sixty parties, is the
only legally binding instrument. However, electronic evidence is no longer signif-
icant only in cybercrime investigations. In a 2019 survey, EU judicial authorities
mentioned the relevance of e-evidence in the investigation of fraud, the sexual
exploitation of children, human trafficking, and terrorism (Europol 2019, 9–13).
Both online and offline criminal activities frequently involve information and
communications technology (ICT), for example through the use of messaging
apps or email programs. In consequence, the significance of electronic evidence
or e-evidence is on the rise, with an 84 per cent increase in request numbers
between 2013 and 2018 in the EU (EC 2019a). Legislative frameworks tend to
distinguish between different types of data, such as subscriber information, traffic
data, and content data. Subscriber information comprises information that iden-
tifies the holder of a certain account, such as name, username, address, email, or
financial information. Traffic data includes metadata, such as the time, date, and
length of access. Content data refers to the substantive content of the messages or
files, which is generally considered to be the most sensitive (Europol 2019, 10).
ACCESS DENIED? 155
and voluntary nature of these access requests, law enforcement authorities have
little leverage in forcing providers to release the data.
The increasing interest from law enforcement also generates additional burdens
for companies. On the one hand, companies need to become relatively familiar
with the legal requirements for warrants, wiretaps, or other data access requests in
different jurisdictions. On the other hand, there is the added challenge of forged
or faulty requests in a diversity of languages (T-CY 2016a). Companies
have developed different ways to deal with these administrative burdens. In early
2020, Google announced its intention to charge law enforcement agencies for pro-
cessing subpoenas, wiretaps, and search warrants, with charges ranging from $45
for a subpoena to $245 for a search warrant (Dance and Valentino-DeVries 2020).
This has the potential to reinforce the heavy dependency of a public sector with
limited resources on tech companies that have accumulated data. However, it may
also deter law enforcement authorities from requesting data in cases where they
are not strictly necessary.
In sum, challenges to the access of electronic evidence arise not only because
data are moved easily across borders but also because of the variation in the loca-
tion of service providers, data, crime, or investigations, as well as the suspect’s
nationality, which demands complex administrative procedures that complicate
and delay criminal investigations. In light of increasing request numbers and a
growing reliance on cloud storage, the stakes of cooperation increase, which also
raises the potential for jurisdictional conflict. Section 7.2 outlines how those chal-
lenges contributed to a conflict between Microsoft and the US Department of
Justice.
Informal law enforcement requests usually involve a high level of legal uncertainty
for internet service providers. Companies need to assess several factors, such as the
applicability of the requesting state’s jurisdiction as well as the authenticity and
lawfulness of the request (T-CY 2016a). Thus, there are few benefits to cooperation
with law enforcement agencies, particularly because such cooperation is rarely
publicly recognized. One exception is the Charlie Hebdo shootings in Paris
in 2015, during which Microsoft provided the requested data after
forty-five minutes (Lien 2015). In light of the 44,655 requests Microsoft received
in 2018, concerning nearly twice as many users or accounts (Microsoft
2020), cooperation constitutes a burden on companies. Yet Microsoft’s decision
in 2013 to challenge the relatively common practice of direct informal cooperation
came as a surprise to many public officials (Tech company employee
A, pers. comm. 2019). The challenge created an imperative of justification for the so
far largely overlooked extraterritorial practices of law enforcement agencies. The
holder of the relevant account had specified Dublin as the location of residence;
therefore, most of the data were stored in Microsoft’s data centre in Ireland. While
Microsoft handed over data stored on US servers, such as the address book, it did
not provide access to the more sensitive content data stored in Ireland. In the let-
ter refusing the warrant, the company cited concerns about legal certainty, arguing
that the request had extraterritorial implications which went beyond the scope of
the 1986 US Stored Communications Act, the legal basis of the request (Microsoft
2018). Several companies followed suit. Hence, Microsoft and other tech com-
panies used their unique position in the field, sustained by control over data of
high relevance to law enforcement, and challenged the jurisdictional claim by
the US.
Two main potential reasons may be behind this challenge. First, in Microsoft’s
2018 fiscal year, the company for the first time surpassed $100 billion in revenue,
which is largely attributed to its strong reliance on cloud services (Wong
2018). The mitigation of legal uncertainty, particularly in light of the company’s
increasing reliance on cloud storage, constituted a significant incentive. Through
the Supreme Court case, Microsoft was able to increase the visibility of the issue,
as indicated by several media reports in major international newspapers (Die
Zeit 2016; Rushe 2014; New York Times 2015). Informal cooperation between
law enforcement agencies and internet service providers was neither particularly
widely discussed nor known at the time but suddenly became subject to public
debate (Tech company employee A, pers. comm. 2019). This forced public
justifications of the agencies’ implicit jurisdictional claim over data and, with
it, created the potential for greater legal certainty in a key sector.
Second, Microsoft strongly emphasized the embeddedness of the conflict in a
broader effort to protect consumers from harm (B. Smith 2018a). This included
references to several initiatives promoted by Microsoft, for example, a ‘Dig-
ital Geneva Convention’ (B. Smith 2017a) or a public–private ‘Tech Accord’
(B. Smith 2018b). Microsoft, which has been described as a ‘norm entrepreneur’
(Gorwa and Peez 2020) in global internet governance, seems to display character-
istics of what Eichensehr has conceptualized as ‘Digital Switzerlands’ (2019). The
concept suggests that companies are on a par with the governments that regulate
them, scrutinizing and sometimes resisting public power. For instance, the Microsoft
president Brad Smith suggested that ‘Cloud providers act as a critical check to ensure
that governments’ use of their investigative powers strictly adhere to the rule
of law’ (B. Smith 2018c, 2). This highlights that Microsoft perceives its role as
involving oversight functions and considers itself a bulwark against intrusive
practices. The broad scope of the request (US District Court Southern District of
New York 2014, 3–4) provided additional legitimacy from a consumer protection
perspective.
Microsoft relied on a twofold justificatory strategy in its jurisdictional claim.
In contrast to expectations formulated in the literature on private governance
(Cutler et al. 1999; Green 2013) and in contrast to other areas of inter-
net governance such as the critical infrastructure of protocols (Mueller 2010),
Microsoft did not refer to its expertise to prove the rightfulness of its jurisdictional
claim. The company relied significantly on its large market share, emphasizing
the number of affected customers that were treated unlawfully (Supreme Court
of the US 2018, 47). Microsoft argued that US practices violated privacy stan-
dards (B. Smith 2017b), which highlights a rights-based conceptualization of data
access. While criteria embedded in the order of fairness contributed to salience in
public discourse, the attempt to impose them on the conflict found only limited
acceptance by the conflict parties. The Department of Justice representative even
explicitly argued that ‘It’s not a case about privacy’ (Dreeben, cited in Supreme
Court of the US 2018, 21). Yet NGOs have pointed to privacy implications,
recognizing Microsoft’s interpretation (EDRi 2017; EDRi et al. 2019; EPIC 2018).
Microsoft’s justifications further juxtaposed US practices that violated estab-
lished principles of sovereignty with its own responsible behaviour as a global
political actor. This emulated justifications normally associated with public actors,
such as references to sovereignty and the international order. In the Supreme
Court hearing, the representative frequently referred to the extraterritorial-
ity of the US approach and pointed to the exercise of ‘extraordinary power’
(Rosenkranz, cited in Supreme Court of the US 2018, 60) not granted by law.
The counsel for Microsoft, Joshua Rosenkranz, explicitly highlighted the violent,
intrusive nature of data requests, stating:
we all agree that the Stored Communications Act is limited to the United States.
The government wants to use the act to unilaterally reach into a foreign land to
search for, copy, and import private customer correspondence physically stored
in a digital lockbox, any foreign computer where it’s protected by foreign law.
(Supreme Court of the US 2018, 32)
The references to ‘reach’ and ‘physically stored in a digital lockbox’ reinforced the
physical nature of the intrusion and established data as having implications for
territoriality. Rosenkranz also referred to a violation of European law (cited in
Supreme Court of the US 2018, 42). The company’s justificatory strategy had sim-
ilarities to justifications articulated by Google regarding the right to be forgotten
(see Chapter 8). The emulation of public justifications entrenched the position
of Microsoft as a political actor scrutinizing the exercise of public power and
demanding justifications for potential wrongdoings akin to the idea of ‘Digital
Switzerlands’ (Eichensehr 2019). While Microsoft was not able to establish cri-
teria of evaluation embedded in the order of fairness such as compliance with
human rights law, the construction of data as a territorial sovereignty concern
had far-reaching implications. Extraterritoriality both violates principles of inter-
national law and interferes with the sovereignty of states (see Svantesson 2016).
The territorial conceptualization of data posed problems for the justificatory strat-
egy of the US Department of Justice. To claim legitimate control over data, the US
Department of Justice needed to illustrate why their jurisdictional claim was not an
intrusion on sovereignty in the sense that Microsoft had proposed, while simul-
taneously making sure that data access continued. Arguing that their claim was
extraterritorial but legitimate would have meant a lack of compliance with estab-
lished principles of international law. I elaborate on the justifications articulated
by US Department of Justice officials in Section 7.2.1.
(Supreme Court of the US 2018). Dreeben also emphasized the lack of resistance
by public international actors, suggesting that ‘we have heard no protests from
foreign governments’ (cited in Supreme Court of the US 2018, 23) concerning the
status quo.
Second, on the basis of the important position and significant material and
symbolic resources in the area of military capacities and intelligence, the US repre-
sentative tried to shift the focus away from human rights towards a security-based
criterion of evaluation:
Microsoft’s theory is that if it moves information abroad, since storage is the only
thing that counts, it’s then free to disclose that information to the world, to sell
it, to do anything it wants free from U.S. law … the only person who can’t get it is
the United States under lawful process.
(cited in Supreme Court of the US 2018, 31–2)
This suggested that any undermining of law enforcement access would amount
to supporting potentially irresponsible undisclosed third actors, thus stigmatizing
private actors as illegitimate. This was reinforced by the devaluation of concerns
located in the order of production, as expressed in the following statement:
Economic concerns cannot override the text of the statute or the interests in pub-
lic safety and national security that are at stake in this case—particularly when the
claimed economic benefit is derived directly from a provider’s ability to market
itself as capable of shielding subscribers’ activity, including their criminal activity,
from discovery by the authorities.
(US 2017, 32)
that the case involved ‘a question of domestic law on which the Special Rapporteur
expresses no view’ (UN Special Rapporteur on the Right to Privacy 2017, 2), the
UN Special Rapporteur Joseph Cannataci urged the Court to interpret the case
narrowly, because it is:
in the interest of all those who care about privacy in the United States and around
the world, for only diplomatic processes of negotiation can accommodate and
balance all of the very significant interests that are in tension in this case.
(UN Special Rapporteur on the Right to Privacy 2017, 2)
The statement invoked the broad relevance of the case and referred to the implications
for privacy rights everywhere. This ultimately compelled the Commission
to submit a statement. The amicus briefs by the European Commission and Ire-
land followed similar strategies in confirming the applicability of sovereignty
as an evaluative criterion while at the same time fundamentally contesting the
US interpretation of how to pursue sovereignty as a higher common principle.
Both problematized that the jurisdictional claim of the US implicitly gave internal
sovereignty primacy over sovereign equality. This represents a contestation
of the moral hierarchization within the order of sovereignty rather than the
imposition of alternative criteria. The arguments pointed to potential extraterri-
toriality. For example, the Commission argued ‘that it would be appropriate for
the Court to consider EU domestic law as it pertains to searches of data stored
in the European Union’ (EC 2018a, 14). Ireland even more clearly stated that it
‘does not accept any implication that it is required to intervene into foreign court
proceedings to protect its sovereign rights in respect of its jurisdiction, or that Ire-
land not intervening is evidence of consent to a potential infringement thereof’
(Ireland 2018, 2–3). They pointed to inconsistencies between the US approach
and the value of the principles of territorial jurisdiction and sovereign equal-
ity in the order of sovereignty, which was also echoed by the judges (Supreme
Court of the US 2018, 7–14). While it was clear that sovereignty played a role,
the Department of Justice emphasized the importance of authority over com-
panies based in the jurisdiction as well as the prosecution of crimes in the US,
while the judges and the international participants emphasized the sovereignty
implications of access to data stored abroad. Both the Commission and Ireland
formally refrained from an explicit endorsement of either party but highlighted
potentially unlawful consequences of a ruling in favour of the Department of Jus-
tice, thus asserting their jurisdictional claim of legitimate control. In sum, while
the European Commission and Ireland acknowledged sovereignty as an evalu-
ative criterion, they delegitimized the hierarchization of principles within this
order as suggested by the US representative. Yet there seemed to be a general
reservation about publicly delegitimizing US practices as a strong violation of
sovereignty.
After the adoption of the CLOUD Act, the Supreme Court case was declared moot,
but the legislative solution did not address the underlying tensions that arose due
to the international dimension of the conflict, as the extraterritorial nature of the
CLOUD Act merely legalized the existing jurisdictional claim. Yet the introduc-
tion of the CLOUD Act did not spark any significant reactions from the conflict
parties. In this section, I trace how the US administration successfully imposed
a legislative solution on a political conflict, arguing that two related factors
contributed to its success. On the one hand, the legal character of the CLOUD Act
was successfully established as a way to prove worth in the order of sovereignty. On
the other hand, I argue that incremental shifts towards the order of security con-
tributed to the increasing recognition of international cooperation as a necessary
step to fight impunity.
While the increasing prominence of private actors in the field of data governance
brings challenges to the position of public actors, they may draw on resources
to sustain their position in the field that are not available to private actors. The
introduction of a legislative solution changed the relationship between public and
private actors by providing additional resources to public actors, as it moved the
issue from the political back to the legal arena. While private actors articulated
strong jurisdictional claims in the court case, this changed with the introduction
of the CLOUD Act. In a letter to the senators sponsoring the bill, Microsoft and
other tech firms stated, ‘We appreciate your leadership championing an effective
legislative solution, and we support this compromise proposal’ (Apple et al. 2018).
Microsoft seemed to be willing to fight for consumer rights as long as it found
itself in a situation of legal uncertainty, even though this created a significant cost.
However, Microsoft was less willing to continue pushing for consumer rights and
sovereignty principles when it had mitigated this uncertainty. In 2018, Microsoft
published ‘Six Principles for International Agreements Governing Law Enforce-
ment Access to Data’ (B. Smith 2018c). These principles demanded stronger
transparency and judicial review but also emphasized Microsoft’s sacrifices for
customers. For instance, they highlight that ‘Microsoft has fought hard to secure
these rights and protections. Three times we filed lawsuits against the U.S. gov-
ernment to increase transparency, and all three successfully prompted significant
new protections for our customers’ (B. Smith 2018c, 1).
On the one hand, the acceptance of the CLOUD Act by tech companies points
to the limits of responsibility that private actors tend to accept. While it seems
that companies increasingly feel the need to position themselves as embracing a more
proactive role in the field, they seem more likely to deflect responsibility in con-
texts where the estimated benefit is limited (Eichensehr 2019; Gorwa and Peez
2018). For example, a significant number of companies cooperated under the
PRISM program for years (see Chapter 4), but since widespread public criticism
emerged with regard to these practices, companies have tried to voice their objec-
tions publicly (Reform Government Surveillance 2015). While even companies
like Facebook that have long attempted to avoid responsibility (Haggart 2020)
have embraced calls for regulation more recently, it is important to further exam-
ine the role of private companies and platforms in jurisdictional conflicts as either
active challengers or passive bystanders. By emulating public justifications empha-
sizing the importance of sovereignty and pointing to the inconsistencies inherent
in law enforcement data access practices, Microsoft was able to elicit justifications
from public actors. For Microsoft, this initial challenge was sufficient to
achieve more significant inclusion in the field, despite significant barriers to access.
One interviewee described how those engaged in electronic evidence have formed
somewhat of a ‘community where everybody knows each other’ (Tech company
employee A, pers. comm. 2019), describing close interaction between Microsoft
and various officials from EU institutions. In contrast, NGOs and data protec-
tion authorities in particular have complained about the exclusive character of this
community (NGO representative A, pers. comm. 2019; DPA representative, pers.
comm. 2019).
On the other hand, the acceptance of the CLOUD Act also points to the valued
status of domestic law even in international contexts. According to a common
definition of the distinctiveness of legal rules, law is distinct because of its form and source rather
the threat of sanctions or the content’s morality (Hart 1958). The justification of
extraterritoriality for security purposes had failed in the context of the Supreme
Court case, but the creation of a legal basis through the CLOUD Act altered the
space of possibles for the US to assert its jurisdictional claim. While its norma-
tive nature did not change, its formal nature did. All actors frequently referred
to the importance of respect for domestic law (e.g. EC 2018a, 5) to prove worth.
A Department of Justice letter explicitly outlined that ‘the legislative proposal is
necessary to reinstate the pre-Microsoft status quo when providers routinely com-
plied’ (Ramer 2017, 1), including for data stored abroad. This is also reminiscent of
the legalization of contested surveillance practices in the context of counterterror-
ism measures. As Viola and Laidler (2022) highlight, increased transparency about
practices has frequently resulted in the explicit legalization of certain surveillance
measures rather than their elimination. Hence, the legalization of US access prac-
tices contributed to an increase in the perceived legitimacy of the jurisdictional
claim by the US. Nonetheless, in light of the persistence of overlapping juris-
dictional claims, it seems surprising that, particularly in the EU, the CLOUD
Act found acceptance as a resolution of the conflict, apart from isolated critical
voices (in’t Veld 2019; Jourová 2018). I examine this question in more detail in
Section 7.3.2.
While now backed by domestic law, US access practices were still in potential
violation of EU law and sovereignty. It is likely that one factor that contributed
to the recognition and acceptance of this solution stems from the fact that the EU
faced an asymmetric distribution of resources. As the majority of internet service
providers relevant for EU law enforcement authorities are located in the US, the
country holds significant control over data (EC official, pers. comm. 2019). Yet I
argue that shifts in the normative character of the field constituted the main reason for
the acceptance of the US CLOUD Act. In what follows, I highlight how legislative
developments in the Council of Europe and the EU have contributed, via incre-
mental processes that emphasized the supremacy of security cooperation over
sovereignty concerns, to the entrenchment of alternative criteria of evaluation
in the field. This increased the availability of policy options in the Microsoft
case.
The T-CY acknowledges that ‘Article 32b is an exception to the principle of terri-
toriality’ (T-CY 2014, 3) but emphasizes that ‘the Parties to the Convention form a
community of trust and that rule of law and human rights principles are respected
in line with Article 15 Budapest Convention’ (T-CY 2014, 5). The circumvention
of sovereignty in the pursuit of the common good of security has become more
salient in the context of the development of a second additional protocol to the
Budapest Convention on ‘Enhanced international cooperation on cybercrime and
electronic evidence’ (CoE 2020). The protocol is designed to improve cross-border
cooperation related to cybercrime and electronic evidence investigations and is
similar to the US CLOUD Act. The protocol has been under discussion for an
extended period and was approved by the T-CY in May 2021.
Russia is the only member state that has refused to sign the Budapest Con-
vention on the grounds of sovereignty violations. Yet in a 2017 resolution, the
European Parliament similarly stressed concerns about the circumvention of data
protection and due process rights (EP 2017b, para. 78). The majority of debates
about the additional protocol have focused on a combination of appeals to the
rule of law and the order of security. Justifications highlight the threat to both
the physical safety of a reference community and its core values. For example,
conference proceedings in the Council of Europe on the matter highlight that
‘During the past two years, cybercrime has reached even more threatening pro-
portions affecting the security of individuals and core values of societies’ (CoE
2018b, 1). In several international fora, Alexander Seger, the head of cybercrime
at the Council of Europe, presented the slogan ‘No data → no evidence → no
justice → what rule of law?’ (see, e.g., Seger 2019). This justification similarly
ties principles of the order of security to principles of the order of fairness and
thus echoes justifications brought forward in the context of counterterrorism
(see, e.g., Obama 2014). Council of Europe communications have also specifi-
cally appealed to a dystopian scenario, highlighting the dangers of the absence
Parallel negotiations have played out in the EU. Law enforcement and judicial
cooperation using ICT, designed to increase cooperation and control of cybercrime,
had already been implemented within the framework of the e-Justice strategy
(Anagnostakis 2017). Legislative action increased significantly
after the beginning of the Microsoft warrant case, which started in 2013 and was
granted review by the US Supreme Court in October 2017. In July 2016, the
European Commission began requesting submissions on electronic evidence from
member states and found a ‘large variety of approaches adopted by the Member
States’ law enforcement and judicial authorities as well as by the service providers’
(EC 2016a, 1). In late 2017, the European Commission conducted a public consul-
tation on the improvement of cross-border cooperation in law enforcement that
already foresaw the publication of a legislative proposal in early 2018. In April
2018, only a month after the CLOUD Act had been signed into law, the European
Commission presented a twofold proposal to harmonize regulation of e-evidence
in the EU, with provisions in many ways similar to those of the CLOUD Act and
the CoE proposal.
The proposal includes a directive on the appointment of legal representatives
and a proposal for a regulation introducing European Production and Preser-
vation Orders (EC 2018b, c). While the European Investigation Order (2014)
facilitates cooperation between law enforcement agencies regarding investigative
measures, such as hearing remote witnesses or covert investigations, the Euro-
pean Production Order would enable judicial authorities in one member state
to directly obtain e-evidence from a service provider based in another member
state. In turn, the European Preservation Order authorizes judicial authorities to
request the preservation of data from service providers for future access. In the
proposal, data is conceptualized as fluid, which highlights the ‘volatile nature of
electronic evidence and its international dimension’ (EC 2018b, 2). The regula-
tion also ‘moves away from data location as a determining connecting factor, as
data storage normally does not result in any control by the state on whose terri-
tory data is stored. Such storage is determined in most cases by the provider alone,
on the basis of business considerations’ (EC 2018b, 13). This also emphasizes
the relevance of international cooperation. As in the European Arrest Warrant,
the European Commission emphasizes the principles of double criminality and
mutual recognition (EC 2018b, 4). Mutual recognition means that investigations
for offences recognized in both jurisdictions can be executed in another mem-
ber state without judicial review. The Commission carefully tried to avoid overly
broad sovereignty implications, as the proposal ‘clarifies the procedural rules and
safeguards applicable to cross-border access to electronic evidence but does not
go as far as harmonising domestic measures’ (EC 2018b, 6). However, the Com-
mission proposal also explicitly articulated the intention to provide ‘a model for
foreign legislation’ (EC 2018b, 10), thus highlighting the intention to promote EU
norms to third countries.
The proposal has been subject to criticism, including some from the mem-
ber states (Federal Ministry of Justice and Consumer Protection (Germany)
2019, 2). Concerns about the European Commission’s lack of transparency and
cooperation resulted in a total of 841 amendment proposals by the Parliament.
7.4 Conclusion
This chapter has discussed the contested area of access to electronic evidence
by law enforcement agencies in varying jurisdictions. I have demonstrated how
Microsoft used its position in the field to challenge practices by US law enforcement
authorities. Microsoft challenged the established and barely problematized
informal cooperation between law enforcement authorities and private companies
on the basis of its extraterritorial effects. This catalysed processes to legalize
and formalize these practices. This has contributed to the fact that sovereignty-
intrusive jurisdictional claims in the area of electronic evidence sharing seem to
have stabilized. The conflict represents, on the one hand, one of the first major
challenges to a formerly largely disregarded system of public–private cooperation.
On the other hand, while formally restricted to the national level, the conflict
resonated with an inter- and transnational audience and thus initiated broader
meaning-making and normative ordering processes. While the conflict became
salient internationally due to the underlying conception of data governance as a
potential intrusion on sovereignty, the US implemented a legislative solution that
was largely based on the conceptualization of data as a security concern. Thus,
the chapter has illustrated an important conflict resolution strategy: legalization.
By creating a legal basis for existing practices, the US CLOUD Act has continued
strategies that, in the area of counterterrorism, have contributed to bilateral agree-
ments, for example with regard to Passenger Name Records (see Chapter 5) and
the Terrorist Finance Tracking Program (see Chapter 6), or several legislative acts
that enable data access by intelligence services. Law has an enabling as much as a
constraining dimension here (J. Cohen 2019).
I have argued that the conflict resolution process was based on incremental
processes that had altered the space of possibles due to a shift towards the order
of security. The interests of private companies and law enforcement authorities
converged around a vision of data governance as a security partnership. While
both sides benefit from legal certainty, the formalization of informal practices
also brings risks. Public actors attempt to uphold the benefits of informal direct
cooperation with providers, such as ease and speed of access, while making their
access requests mandatory, which potentially undermines due process norms. In
light of stringent data protection standards and a renewed emphasis on digital
sovereignty, particularly in cloud computing, the current regime might face legal
challenges in the EU.
The question of access to electronic data by law enforcement agencies has also
manifested in conflicts about the availability of the so-called WHOIS database in
domain name registration (Kulesza 2018). In Brazil, a similar conflict about
judicial orders to disclose user data escalated sharply. Facebook’s refusal to disclose
data that it insisted were inaccessible due to encryption prompted Brazilian
judges not only to shut down WhatsApp temporarily on multiple occasions
but also to detain Facebook’s vice president for Latin America, Diego Dzodan
(Freedom House 2016). To avoid such disputes, companies face incentives to wel-
come legal harmonization, because it enables them to maintain their image as
responsible protectors of customer rights, while simultaneously creating legal cer-
tainty. As Microsoft has introduced its first underwater data centre off the coast
of the Orkney Islands (Cellan-Jones 2018), new jurisdictional challenges and
conflicts are likely to arise.
8
The Right to Be Forgotten
Moral Hierarchies of Fairness
The increasing wealth of data has contributed to the transformation of the inter-
net into a durable collective human memory. Search engines such as Google
aim to make this collection of information available by crawling, indexing,
and ranking content. According to current estimates, Google, which has a search
engine market share of over 90 per cent worldwide, processes at least 40,000 search
queries on average per second, which amounts to 3.5 billion searches per day and
1.2 trillion searches per year worldwide (Google Search Statistics 2020). As Mayer-
Schönberger put it, on the internet, the availability of data ‘will forever tether us to
all our past actions, making it impossible, in practice, to escape them’ (2011, 125),
as the internet transforms time into a ‘perpetual present’ (2011, 92). This transfor-
mation raises fundamental questions about how identity is represented and shared
online, how it can evolve, and to what extent digital memory is tied to the technical
specificities of search engines. As Zuboff (2018) has warned of surveillance capi-
talism’s impact on human futures, the idea of a ‘right to be forgotten’ has attracted
widespread attention (McGoldrick 2013). Also known as the right to be delisted or
the right to erasure, the right to be forgotten specifies circumstances under which
private individuals may request the removal or delisting of personal information
from search engine results.
This chapter aims to analyse the evolution of the debate about the right to be for-
gotten in the context of two major instances of jurisdictional conflict that emerged
between the US-based search engine Google and EU data protection authorities.
First, I focus on the introduction of the right to be forgotten through a landmark
ruling by the CJEU in 2014. Following a legal dispute with a Spanish citizen,
Google was forced to allow EU residents to demand the delisting of websites from
search results if data are ‘inadequate, irrelevant or no longer relevant or exces-
sive’ (Google Spain 2014, para. 94). The second jurisdictional conflict unfolded
between Google and the French data protection authority regarding its scope of
applicability. While the Commission Nationale de l’Informatique et des Libertés
(CNIL) argued for global applicability, Google refused. The CJEU (Google v.
CNIL 2019) restricted the scope of applicability to the European context, argu-
ing against the French data protection authority’s request. In this second instance
of conflict, the debate shifted from the question of the existence of the right to be
forgotten to the permissible imposition of norms or rules on other communities.
As search engines enable users to find publicly available information and content
for participating in democratic processes and societal debates, they are considered
to ‘play a pivotal role in the information society’ (CoE 2012, 1). Due to this
enabling function vis-à-vis the exercise of human rights, they are considered to
have specific public value. However, it is important to note that access to
information is not independent of the user. The order and visibility of search results
are determined by algorithms, which may be biased and, due to concerns about
competition, are not transparent. In 2009, Google began offering personalized search
results depending on user data, such as location, search history, IP addresses, date
and time of the request, cookies, and information from other Google services
(Google 2009). Therefore, individual searches vary significantly in their results.
As crawling and indexing technologies improve, searches are more likely to dis-
close content that was not originally intended to be widely available, in particular
sensitive information. While access to the internet allows the exercise of human
rights such as freedom of speech, the accessibility of this content to anyone may,
in turn, infringe upon the individual rights of others. There are several reasons
why people might aim to have past actions removed from the public record. These
include fears of reputational damage but also attempts to reduce emotional dis-
tress resulting from cyberbullying or image-based sexual abuse. The right to be
forgotten seeks to restrict such infringements, particularly concerning identity
fraud, risks of the abuse of data or incorrect conclusions on the basis of data, and
limits to autonomy based on a lack of control (Tjong Tjin Tai 2016). The per-
ceived lack of control in relation to data disseminated on the internet may have
serious consequences. For example, in 2012, a 15-year-old Canadian committed
suicide after experiencing significant cyberbullying because of a topless photo she
had sent to a stranger in 2009 (M. Dean 2012). As search results connected to
personal identifiers such as names significantly shape the possibility of moving
beyond past experiences (Mayer-Schönberger 2011), the enactment of individ-
ual and collective digital identity is to an unprecedented extent shaped by the
design of search engines that determine the composition and structure of such
results.
While the right to be forgotten became the subject of intense debate only more
recently, both the societal and legal implications have already been discussed in
the context of offline data, for example in relation to the Data Protection Directive
(1995). The right to be forgotten was also employed in a decentralized manner by
national data protection authorities, for example in Spain (e.g. Azurmendi 2017),
or with regard to certain provisions regarding criminal records, for example in
France, Germany, or Italy. Many domestic legal systems specify general require-
ments that support rehabilitation after criminal convictions, for example, to clear
offences from criminal records or to remove debt histories. In contrast, in the US,
the right to information, freedom of speech, and freedom of the press often take
precedence over the protection of privacy based on the First Amendment to the
US Constitution (Barendt 2005, 232–46).
In the EU, debates about the specification of a right to be forgotten resurfaced
in the context of data protection reform. In 2010, the Justice Commissioner and
Vice President Viviane Reding emphasized that ‘Internet users must have effective
control of what they put online and be able to correct, withdraw or delete it at will’,
also specifically referring to a proposed ‘right to be forgotten’ (Reding 2010). At the
EU level, the first explicit legal expression and clarification of this right manifested
in 2014 through a judgment of the CJEU, which I explore in Section 8.2. While
the court considered a legal basis in Article 12(b) and Article 14(a) of the Data
Protection Directive (1995), which granted rights to rectify non-compliant data
and to object to their processing, the right has been strengthened with the GDPR,
which stipulates that data subjects have, in the first instance, a ‘right to erasure’
vis-à-vis the data controller (GDPR 2016, Art. 17).
In 2014, a CJEU landmark judgment specified that data subjects have the right to
request the delisting of search results from search engines if the data are outdated
or irrelevant. The lawsuit was the result of a complaint to the Spanish data pro-
tection authority which sparked a jurisdictional conflict between Google and the
public data protection authority over the legitimate conduct of data and content
displayed by the search engine. In what follows, I first give a brief summary and
then examine the justifications in more detail.
In 2009, after a Spanish citizen, Mario Costeja González, had unsuccessfully tried
to get information about a home repossession order removed from the internet, he
filed a complaint with the Agencia Española de Protección de Datos (AEPD). The
relevant information had been published in a newspaper in 1998 and was featured
as one of the top listings when searching on the search engine Google. According
to Costeja González, this negatively affected his professional life. He had first
asked the newspaper to remove the article and, after that failed, requested a
delisting from Google Spain SL, which would remove the article from search results
for his name. He argued that the information was no longer relevant, as any financial
issues had been resolved long ago (Google Spain 2014, paras. 14–15). While the
AEPD dismissed the complaint against the newspaper, as the publication had been
legal and for the legitimate purpose of attracting buyers (Google Spain 2014, para.
17), the complaint against Google was upheld. The AEPD argued that the delisting
of such information from search engine results rather than the removal of the
information sufficiently decreased the likelihood of accidental access while pre-
serving freedom of information. When Google Spain challenged this decision, the
Audiencia Nacional, the Spanish High Court, referred the case to the CJEU. The
Spanish High Court asked questions regarding, first, the applicability of the Data
Protection Directive to search engines due to their potential status as data ‘pro-
cessors’ or ‘controllers’; secondly, the applicability of EU law to Google
Spain; and thirdly, whether an individual had the right to request the erasure of
data from search results (Google Spain 2014, para. 3). Thus, the CJEU had to weigh
competing claims about the legitimate conduct of data governance with regard to
search engine results.
(Google Spain 2014, 19). Like the AEPD, it argued against the removal of the data
from the newspaper website.
In its judgment, the Court created two moral hierarchies. On the one hand, it
determined that the relevant reference community was constituted by individuals
rather than businesses, referring also to the qualitative difference between individ-
ual rights and business interests. Some have criticized the fact that the judgment
extensively referred to Articles 7 and 8 of the Charter of Fundamental Rights but
did not address the absence of any specification of the role of private actors in the
enforcement of such standards (Frantziou 2014, 768). By emphasizing the private
responsibility to protect, the CJEU judgment not only established the jurisdictional claim of
individuals over their data but also transferred the enforcement responsibility to
private companies.
On the other hand, the Court also created a hierarchization of fundamen-
tal rights. The judgment prioritized the rights of data protection and privacy
of the individual. However, the CJEU strongly argued for balancing acts on a
case-by-case basis, emphasizing the importance of the public interest (Google
Spain 2014, para. 81). Some, nonetheless, suggested that the court neglected the
fundamental rights of freedom of expression and information and freedom of
the press (Frantziou 2014, 770). The contested character of the judgment is also
indicated by the fact that the CJEU—atypically—did not follow the opinion of
the Court’s Advocate General. The Advocate General Niilo Jääskinen had argued
against the right to be forgotten on the basis of the fact that ‘the fundamental
right to information merits particular protection in EU law, especially given the
ever‑growing tendency of authoritarian regimes elsewhere to limit access to the
internet or to censure content made accessible by it’ (2013, para. 121). He pointed
to a dystopian scenario in the order of fairness based on authoritarian censorship
and the limitations of freedom of speech and concluded that a right to be forgotten
‘would entail sacrificing pivotal rights such as freedom of expression and informa-
tion’ (Jääskinen 2013, para. 132). In contrast, others argued that the CJEU ‘builds
a defence of privacy against a new dimension of risk generated by the Internet
and search engines’ (Azurmendi 2017). While both draw on a dystopian scenario
based in the order of fairness, they arrive at different conclusions. While the AG
argued against sacrificing freedom of expression and information, Azurmendi, fol-
lowing the Court’s emphasis on privacy rights, argues in favour of the protection
of privacy against private interference.
The EU Justice Commissioner Martine Reicherts emphasized the right to be forgotten
as a way to ‘put citizens back in control of their data’ (2014, 4), which also
evokes the notion of individual ownership
of data (see also Reding 2014). This idea is set up against the idea of data ownership
by companies (B. Smith 2018c) as well as the conception of data as some form of
a public good which became more important in the subsequent dispute. Thus, the
relevant reference community consists of individuals who need to be empowered
rather than businesses and customers or ‘the public’.
In contrast, Google initially aimed to construct a reference community based
on (existential) threats. More specifically, in the immediate aftermath of the judg-
ment, the company tried to delegitimize the right to be forgotten by applying
evaluative criteria based in the order of security. Google attempted to discur-
sively link the right to be forgotten to the cover-up of crime (J. Halliday 2014).
For instance, Google stated that about 30 per cent of the 41,000 requests it received
within two weeks of the ruling were related to fraud, 20 per cent concerned serious
crimes, and more than 10 per cent were related to child pornography, thus
referring to a dystopian scenario of abuse to cover up morally deficient behaviour.
European actors tried to counter this strategy. With regard to criticism of data
protection reform proposals, which included the right to be forgotten, the EU
Justice Commissioner Martine Reicherts warned that ‘Those who try to use dis-
torted notions of the right to be forgotten to discredit the reform proposals are
playing false. We must not fall for this’ (Reicherts 2014, 2). Her explicit warn-
ings of distortion, discredit, and foul play pointed to morally deficient attempts
at interference.
In contrast to the initial reports by Google, the majority of requests under
the right to be forgotten—between 84 and 98 per cent in 2015—were found to
involve private personal information rather than criminal activities (Tippmann
and Powles 2015). After initial attempts to delegitimize the right to be forgotten,
Google rapidly changed its position and adopted a more comprehensive strat-
egy to adapt to the conflict resolution process initiated by the CJEU judgment.
Google implemented an online form for requests under the ruling. The company
pointed out that it ‘moved rapidly to comply with the ruling from the Court.
Within weeks we made it possible for people to submit removal requests, and soon
after that began delisting search results’ (Google 2015), highlighting its coopera-
tive compliance. In addition, Google created an Advisory Council and developed
removal guidelines for the EU in cooperation with academics and other experts.
The membership of the council, which included respected scholars and authori-
ties, established Google’s valuation of independent expertise and impartiality, thus
proving conformity with principles of the order of fairness. The Advisory Coun-
cil’s mission was ‘to advise it [Google] on performing the balancing act between
an individual’s right to privacy and the public’s interest in access to information’
(Advisory Council to Google 2015, 1). This marked the company’s increasingly
assertive engagement with the imperative of justification following the ruling and
completed, as Chenou and Radu put it, the company’s ‘full transformation from a
messenger into an editor of the world’s information’ (2019, 97).
In sum, the challenge of Costeja and the AEPD presented an alternative vision
of the field that highlighted private responsibility and principles embedded in
the order of fairness. Google failed to engage comprehensively with the imper-
ative of justification. The CJEU, in keeping with parallel discursive developments
in the EU legal order, emphasized the primacy of the human rights of privacy
and data protection in relation to business interests. While initial reactions were
hesitant, Google proactively and comprehensively complied with the ruling. It
was obvious that the right to be forgotten might have effects beyond the EU,
but the specific global relevance of the ruling only manifested in 2015. A new
jurisdictional conflict emerged when Google and the French data protection
authority CNIL voiced diverging opinions regarding the geographical scope of
the law.
Neither the 2014 judgment nor the GDPR specified the geographical scope of
delisting or removal under the right to be forgotten. In 2015, the French data
protection authority contested Google’s interpretation that requests should only
be removed from a specific country domain, such as google.fr or google.de
(CNIL 2015b). Instead, the CNIL argued that the full enforcement of the law
required global delisting (i.e., also from google.com), thus extending its juris-
dictional claim over data globally. Google agreed to apply the right to be for-
gotten to domains outside Europe if the IP address indicated a user’s location
in the EU (Fleischer 2016), but the company appealed against the CNIL’s fine
of €100,000. The case was eventually referred to the CJEU in July 2017 (Google
v. CNIL 2019).
It is important to note that the legal context changed during the dispute. The
GDPR, which codifies the right to be forgotten (GDPR 2016, Art. 17), came into
force in 2016 and has been applicable since May 2018. Thus, the Court exam-
ined the overlapping jurisdictional claims, considering the 1995 Directive and the
2014 ruling as well as the GDPR. Due to the borderless character of the internet,
the demarcation of geographical limitations for the removal or delisting of content
presented the Court with multiple normative and legal contradictions and chal-
lenges. In another landmark ruling in late 2019 (Google v. CNIL 2019), the CJEU
found no current obligation under EU law to apply the right to be forgotten glob-
ally but maintained EU-wide enforcement. In Section 8.3.1, I briefly illustrate the
ambiguous roles of data protection authorities as active shapers of or bystanders
in jurisdictional conflicts before illustrating Google’s increased emphasis on the
imperative of (moral) justification.
In its guidelines published shortly after the judgment, the Article 29 Working
Party (29WP), the assembly of European data protection authorities, argued that ‘de-listing decisions must be implemented in a
way that guarantees the effective and complete protection of these rights and that
EU law cannot be easily circumvented’ (29WP 2014, 9). This opinion constituted
sufficient grounds for the CNIL to contest Google’s limited delisting practices and
request the global removal of links (CNIL 2015a), thereby articulating an exten-
sive jurisdictional claim. The CNIL’s interpretation of the adequate role of data
protection authorities in conflict resolution processes diverges significantly from
the interpretation by the Irish data protection authority outlined in Chapter 4.
While the Irish DPA was criticized for a lack of regulatory momentum by largely
assuming the role of a passive bystander, the French data protection authority was
accused of overreach by assuming the role of an (over)active shaper.
The position of data protection authorities in the field of data governance dif-
fers depending on the legal and historical context of the applicable jurisdiction. For
example, the US Federal Trade Commission (FTC) has strong enforcement rights
in comparatively few areas of general consumer protection, while most DPAs in
the EU have a broader mandate (Dowd 2019). In interviews and public statements,
private actor representatives have problematized ‘populist rhetoric’ (Tech com-
pany employee B, pers. comm. 2019) and ‘harmful protectionism’ (Private sector
representative, pers. comm. 2019) by data protection authorities vis-à-vis US tech
companies. The monopoly position of Google, which holds a more than 90 per
cent market share in Europe, thus makes it a particularly likely ‘target’ of such
efforts of data protection authorities to ‘raise their profile’ (EC official, pers. comm.
2019).
The CNIL and its president since 2011, Isabelle Falque-Pierrotin, have been
active challengers of company practices. For example, in 2019, the data protec-
tion authority made headlines when it imposed a financial penalty of €50 million
against Google due to a lack of compliance with the GDPR, despite the company
headquarters being in Ireland. While the company should, according to the so-
called one-stop-shop mechanism, fall under the jurisdiction of the Irish DPA, the
CNIL argued that ‘In this case, the discussions with the other authorities, in par-
ticular with the Irish DPA, where GOOGLE’s European headquarters are situated,
did not allow to consider that GOOGLE had a main establishment in the Euro-
pean Union’ (CNIL 2019). The data protection authority, therefore, argued that
the one-stop-shop mechanism, which is designed to prevent multiple data pro-
tection authorities from taking deviating decisions, did not apply and claimed
jurisdiction to unilaterally investigate the complaints. A recent judgment by the
CJEU confirmed this approach (Facebook Ireland Ltd 2021).
In the right to be forgotten dispute, the CNIL attempted to assert the legiti-
macy of its jurisdictional claim by pointing to the value of domestic law and the
While the CNIL’s argument had backup from the data protection community,
Google responded to the imperative of justification posed by the CNIL in a
manifestly different manner. In comparison with the first court case, Google
changed its strategy to rely on normative arguments and closer interaction with
local stakeholders (Tech company employee B, pers. comm. 2019). This had
² The original reads: ‘Mais il n’y a pas d’impérialisme à soumettre à nos règles une entreprise
étrangère venue sur notre marché.’ (‘But there is no imperialism in subjecting a foreign company that has entered our market to our rules.’)
three components: Google more strongly emphasized the fundamental right sta-
tus of freedom of speech (cited in Gibbs 2014; Bodoni 2019); it aimed to establish
its position as a responsible actor in the field; and it relied on justifications in line
with the order of sovereignty. One could argue that the case is different in that it
more immediately affects communities beyond Europe. Yet as strong normative
concerns also existed concerning the right to be forgotten as such, the different
justifications are likely to reflect a more foundational change of strategy.
First, the company tried to rectify what it perceived to be a tilted balance
between freedom of information and privacy in Google Spain (2014). Its Chief
Legal Officer, Kent Walker, said that the proceedings ‘represent a serious assault
on the public’s right to access lawful information’ (K. Walker 2017). The company
also highlighted ‘serious chilling effects’ and pointed to a ‘race to the bottom’ (K.
Walker 2016) regarding freedom of speech as potential consequences. Google’s
chief executive Larry Page was quoted highlighting the problematic consequences
of authoritarian states using the right to be forgotten to legitimize widespread
censorship (Gibbs 2014). In particular, the removal of lawfully published content
was expected to encourage abuse (Google 2014, 12). The justifications cautioned
against a dystopian scenario in the order of fairness. Second, the strategy estab-
lished Google as an actor in conformity with the field’s ordering principles. In
2016, the company emphasized its sacrifices as a human rights defender, arguing
that ‘We have received demands from governments to remove content globally
on various grounds—and we have resisted, even if that has sometimes led to the
blocking of our services’ (K. Walker 2016). Third, Google’s strategy also high-
lighted normative justifications rooted in the order of sovereignty, albeit with an
explicit emphasis on external sovereignty and sovereign equality. Google specif-
ically highlighted the inconsistencies of the CNIL’s approach, which, on the one
hand, strongly emphasized protection of shared norms within constitutive com-
munities but, on the other hand, imposed rules on other communities, thereby
inhibiting other communities’ capacity for self-determination. In a 2016 blog post,
Google’s Kent Walker states, ‘For hundreds of years, it has been an accepted rule of
law that one country should not have the right to impose its rules on the citizens
of other countries … As a company that operates globally, we work hard to respect
these differences’ (K. Walker 2016). The post entitled ‘A Principle That Should Not
Be Forgotten’ established extraterritorial practices as morally deficient and against
customary principles.
Google’s new emphasis on moral and norm-based arguments was successful
insofar as not only a broad coalition of businesses such as Microsoft and media
companies but also NGOs such as the Fondation pour la liberté de la presse,
the Wikimedia Foundation Inc., or the Reporters Committee for Freedom of the
Press backed the company. Being at the centre of the struggle, Google was sud-
denly in the position of a human rights defender. For instance, the amicus brief
by Article 19 and others described a global scope of the right to be forgotten as ‘a
Google’s strategy has at least partly succeeded. While the right to be forgotten
has been strengthened with the GDPR (2016, Art. 17), the CJEU in its 2019
judgment generally followed the opinions by Google and the supporting briefs.
The judgment stated that global enforcement of the right to be forgotten is not
currently prescribed by EU law. The CJEU pointed to domestic variation in bal-
ancing acts between fundamental rights and implicitly raised the possibility of
national regulation. More specifically, the Court noted that ‘[while] EU law does
not currently require that the de-referencing granted concern all versions of the
search engine in question, it also does not prohibit such a practice’ (Google v.
CNIL 2019, para. 72). This statement implicitly suggested that future domestic or
EU-level legislation would be in line with EU law, leaving significant wiggle room
for legislators. Besides, while both the AG and the CJEU argued for EU-wide applicability of the right to
be forgotten, the AG considered geo-blocking sufficient (Szpunar 2019, paras.
70–73), while the CJEU did not specify an implementation. Therefore, this jurisdictional conflict has
strengthened a vision of data governance that is clustered around the sovereign
rights of communities. It has strengthened the idea that, rather than a globally
uniform approach to data governance, regulation should allow for normative
differences. The Court specifically emphasized that:
the balance between the right to privacy and the protection of personal data, on
the one hand, and the freedom of information of internet users, on the other,
is likely to vary significantly around the world … the interest of the public in
accessing information may, even within the Union, vary from one Member State
to another.
(Google v. CNIL 2019, paras. 61–7)
The Court emphasized the responsibilities and rights of the member states ‘to pro-
vide for the exemptions and derogations necessary to reconcile those rights with,
inter alia, the freedom of information’ (Google v. CNIL 2019, para. 67). There-
fore, the judgment might provide legitimation to strengthen privacy or freedom
of access rights depending on the specific balancing acts of domestic legislators or
data protection authorities (Samonte 2020). The judgment establishes that moral
hierarchizations in the order of fairness must be negotiated in the context of spe-
cific constitutive communities. It also re-emphasizes that both economic interest
and the public interest are secondary to the fundamental rights of privacy and data
protection (Google v. CNIL 2019, para. 45).
In conclusion, while the court case was largely perceived as a victory for Google
in the media (e.g. Kelion 2019), it might challenge the transnational claims of
jurisdiction for companies in the long term. The Court did not follow Google in
establishing a global scope of the right to be forgotten as morally deficient. The
Court thus upheld the emphasis on ‘effective and complete protection’ (Google
Spain 2014, para. 58) articulated in 2014, which some have criticized for the
construction of data protection as an ‘almost super-right’ (Powles 2015a, 592).
Rather than prohibiting the global scope of the right to be forgotten, the judg-
ment established the normative desirability of varying balancing acts which might
favour an increasing fragmentation of rules in the long term. This may also have
implications for other areas of content governance such as hate speech, thereby
broadening the administrative load for Google.
we’ve taken from this, that we’re starting the process of really going and talking to
people’ (cited in Gibbs 2014). The creation of an Advisory Council under the pre-
text of seeking advice also illustrates the complex nature of Google’s strategy in the
conflict resolution process. While the council included reputable academics from
the field likely to act impartially, the members were selected by Google and the
information they received was limited to publicly available information (Advisory
Council to Google 2015). It is notable that Google did not engage in coordi-
nated efforts with other search engines but chose to unilaterally shape the system
and thus set a precedent for others. In the dispute with the CNIL, Google was
well equipped to respond to challenges to its implementation decision and again,
through collaborative compliance, succeeded in establishing itself as a respon-
sible international actor in contrast to the CNIL’s overbroad claim. Powles has
highlighted that the response in the first case ‘co-opt[ed] significant elements of
the media, civil society, governments, and institutions in promulgating its own
agenda’ (2015a, 583–4), particularly as publishers and media companies are also
less likely to be interested in a mechanism that might contribute to the delisting
or removal of articles. Other tech companies have adopted similar strategies to
resolve conflicts in the field. To improve removal processes of harmful or illegal
content, Facebook has established the Facebook Oversight Board, which some
have criticized as a move largely for the sake of appearance that nevertheless forces
governments to be ‘passive rule-takers’ (Haggart 2020, 323). Companies seem
increasingly well equipped to respond to rising demands of responsibility by steer-
ing them in specific directions and exercising control over their implementation.
As the Advisory Council mainly seemed to aim at legitimizing Google's practices rather
than controlling them (Chenou and Radu 2019, 89), the capacity to shape the rules
of implementing the judgment thus ultimately contributed to a reinforced position
for Google in the field.
In sum, the examples outlined above are evidence of how normative arguments
and an emphasis on compliance as well as material resources formed part of a
more comprehensive strategy to establish Google’s identity as a responsible actor.
As I demonstrate in Chapters 7 and 9, this seems to speak to a more general phe-
nomenon regarding private tech companies that tend to act as norm entrepreneurs
to enter the political and diplomatic stage (Gorwa and Peez 2018).
The responsibility of tech companies has been the subject of discussion in recent
years. It is important to note that this strategy also responds to rising demands
for companies to recognize their influence, as pointed out by the EU Justice Com-
missioner Martine Reicherts, who suggested that ‘handling citizens’ personal data
brings huge economic benefits to them [companies]. It also brings responsibility’
(2014). Indeed, a lack of responsibility may facilitate hate speech and abuse,
as demonstrated by the data abuse in the Cambridge Analytica scandal or the
violence incited on Facebook in Myanmar (Stevenson 2018). Nevertheless, the
unique capacities of tech companies in drawing on both material and emotional
dependency (Culpepper and Thelen 2019), particularly compared with the public
legal system, may have adverse consequences. Frosio proposes that this process
‘might be pushing an amorphous notion of responsibility that incentivizes inter-
mediaries’ self-intervention to police allegedly infringing activities on the Internet’
(2018, 3). The asymmetry of financial resources and administrative constraints
reinforces a dynamic whereby public actors accept tech companies’ implementa-
tion and rule shaping. Google’s quick implementation after the ruling confronted
data protection authorities with a functioning system that was largely effective but
designed according to the company’s preferences.
Thus, Google not only shaped the conflict resolution process but now also weighs
individuals’ privacy and freedom of information rights against each other. A publication by Google-
affiliated researchers highlighted ‘the challenges of implementing privacy regula-
tions in practice, both in terms of the thousands of hours of human review required
and the fundamentally challenging process of weighing individual privacy against
access to information’ (Bertram et al. 2019, 971). In another recent case (GC and
Others v. CNIL and Others 2019), the outsourcing of fundamental rights bal-
ancing to private companies is even more explicitly articulated. With regard to
sensitive data, including political opinions, religious beliefs, or past convictions,
the CJEU submitted that Google needs to weigh the ‘public interest’ in particularly
sensitive data against the rights of individuals who might be strongly affected
by the decision. This sparked concerns about the empowering effect on private
companies (Lynskey 2015b, 532), with Google possibly becoming a privacy or fundamental
rights ‘adjudicator’ (Pirkova and Massé 2019). A 2015 Guardian article
stated, ‘Google has acted as judge, jury and executioner in the wake of Europe’s
right to be forgotten ruling’ (Powles and Chaparro 2015). In the past, Google has
been explicitly criticized for a lack of transparency concerning its assessment of
fundamental rights under the right to be forgotten, for example by a group of over
eighty academics who in 2015 decried a ‘jurisprudence built in the dark’ (Kiss 2015).
The debate about the power of platforms to decide on human rights is rele-
vant beyond data governance and extends to illegal or harmful content
under recent legislation, such as the German Network Enforcement Act
or the Digital Services Act. A recent case regarding hate speech on Facebook
(Glawischnig-Piesczek v. Facebook Ireland 2019) and the UK Online Harms White
Paper (UK Home Office 2019) have demonstrated more comprehensive respon-
sibilities of providers. Yet under current legislation, content removals would not
be feasible for public regulators due to a lack of financial resources. With regard to
law enforcement cooperation concerning electronic evidence, Google has decided
to charge fees for data requests (see Chapter 7).
of fairness or whether the pursuit of one's own community’s interests and values trumps
respect for other communities’ interests and values in the pursuit of sovereignty.
Theoretically, this also demonstrates that, far from being static, the hierarchization
within value orders is dynamic.
To prevail in conflict resolution processes, the successful imposition of a specific
conceptualization of governance objects and principles is important. For example,
the amicus brief by the Reporters Committee for Freedom of the Press (RCFP)
and others criticized the fact that ‘by consistently framing freedoms and rights
negatively affected by the right of delisting as “interests” … the balancing con-
ducted by the Court of Justice would be tilted towards the right to data protection
from the very beginning’ (RCFP and Others 2017, 13). The amicus brief by Arti-
cle 19 and others reverses this logic by speaking about ‘data protection interests’
(Article 19 and Others 2017, 12) that need to be weighed against the fundamental
rights of users in other jurisdictions. In the order of fairness, human rights and
binding human rights agreements are considered objects of worth. Thus, par-
ties to conflicts aim to reduce the principles referred to by the other party to
an ‘interest’ which corresponds to a lower position within the established moral
hierarchy.
In contrast, with regard to jurisdictional claims drawing on the order of
sovereignty, actors highlight international law and the interpretation by regional
and international courts as well as domestic law to prove how the imposition
of norms on other communities is morally deficient or worthy. By problematiz-
ing extraterritoriality and constructing it as a shared problem, both the CNIL
and Google refer to similar principles, but the interpretation of what constitutes
extraterritorial action differs. The CNIL perceives a responsibility to protect Euro-
pean citizens’ fundamental rights and aims to fully enforce the 2014 judgment.
However, the borderless nature of the internet complicates the demanded ‘effective
and complete’ (Google Spain 2014, para. 58) protection of citizens. Thus, Google
and others perceive the extraterritorial effects of the CNIL’s jurisdictional claim as
a problematic imposition of foreign norms in violation of principles of sovereignty
and therefore conceptualize the CNIL’s demand as an extraterritorial practice with
problematic consequences for the internet as a global network.
While Boltanski (2011, 40) considers such conflicts as ‘reformist’ compared with
the more ‘radical’ conflicts that challenge the existing order based on alternative
principles of worth, the example illustrates that a critique of the current balancing
between different principles may contribute to normative change. It might enable
private actors to emulate public justifications (and potentially vice versa) and beat
public actors at their own game, thus prevailing in conflict resolution processes
by assuming the positively connotated characteristics of the identity of the other
party to the conflict, in this case, responsibility and impartiality.
8.5 Conclusion
In this chapter, I have discussed the introduction and implementation of the ‘right
to be forgotten’ through a jurisdictional conflict initially between Google and the
Spanish data protection authority AEPD and later the French CNIL. The first
part of the conflict culminated in a 2014 landmark judgment by the CJEU, which
established both the responsibility of search engines as data controllers and a hier-
archy of business interests and fundamental rights protections. The second part
of the conflict focused on diverging interpretations of the geographical scope of
these protections and the specific balance between different fundamental rights.
The conflict between Google and the CNIL was again resolved by the CJEU,
which emphasized meaning-making processes at the community level. However,
as in the first case, this conflict resolution strategy opened up a new space for
jurisdictional conflicts by shifting the negotiation processes to the community
level.
The chapter has explored the division of public and private responsibilities in
the enactment and enforcement of fundamental rights, arguing that actors use dif-
ferent resources and strategies to prevail. The normative dimension of the conflict
became particularly prominent in the second phase of the jurisdictional conflict.
Google’s initial strategy of deflecting responsibility, however, proved unsuccess-
ful. In contrast, the strategy the company pushed after the ruling and in the CNIL
case proved more successful: first, an emphasis on the normative binary of either
freedom of speech or privacy. The normative fit with the agenda of
NGOs and media companies was reinforced, secondly, by an emphasis on the
balance between sovereignty and self-determination and the imposition of norms
on others. The chapter has suggested that Google’s strategy of drawing on moral
arguments while emphasizing its identity as a responsible global actor has been
similarly employed by other tech companies, such as Microsoft (see Chapter 7)
and, more recently, Facebook (Haggart 2020). By drawing on moral arguments,
tech companies seem to attempt to shape evaluative criteria and entrench their
positions in the field as political players. In particular, the fact that Google was able
to emulate public justification strategies and draw on NGO and media support in
its justifications contributed to its prevailing in the second phase of the conflict,
as the company could prove worth within the order of fairness. This should not
obscure company interests in shaping policies according to their preferences, par-
ticularly with regard to the construction of Google as a guardian of public space
online. Zuboff (2018, 8) suggests that companies possess a unique ability to shape
human behaviour and future conduct. Further debate is needed about the priva-
tized, biased, and opaque nature of the decision-making that constructs what is
visible for users (see also Chenou and Radu 2019; Powles 2015a).
After analysing five cases of jurisdictional conflict in data governance, this chapter
provides a comparative assessment embedding the empirical findings in a broader
debate on the normative trajectories of digital governance. The chapter draws on
the insights developed in the case studies to come back to the central question of
the book: how do actors respond to or resolve jurisdictional conflicts in data gov-
ernance? To provide insights into responses and strategies across cases, I examine
how value orders are discursively combined and linked to specific calls for action.
Do actors tend to endorse a cosmopolitan vision of data governance by relating
an emphasis on fairness to the global nature of problems? Or do they emphasize
neoliberal market globalism that connects the order of production with global-
ist claims for efficiency? And what implications does that have for transnational
data governance at large? In this chapter, I look beyond the transatlantic context
to discuss the central findings and implications in light of rising regulatory ambi-
tions, particularly those of China, as well as developments in other contexts such
as Brazil or India.
Chapters 1–8 demonstrated the multidimensional character of jurisdictional
conflicts, which are characterized by the interplay of legal, political, institutional,
and normative dimensions. Throughout the case studies, I focused particularly
on hitherto understudied normative ordering processes that draw on distinct value
orders. These orders express diverging underlying conceptualizations of data
which are linked to the pursuit of substantive common goods such as security and
safety, human rights, or economic progress as well as particular reference com-
munities and evaluative principles. In addition, there are references to procedural
common goods that specify whether data governance should be exercised to pro-
tect specific constitutive communities or whether it demands coordinated global
responses. Mediated by positional differences and perceived thinkable and sayable
options, these ordering processes have contributed to different conflict resolution
processes. While some jurisdictional claims have been stabilized through insti-
tutional agreements and the introduction of safeguards, many have been subject
to continuous contestation. In other words, I have argued that conflict resolution
processes are shaped by the continuous struggle to impose normative and eval-
uative principles that comprise specific conceptualizations of the common good.
The book has illustrated that while actors share the notion that data governance is
meaningful in pursuit of common societal goals, they have different conceptions
of both what data actually are and what common societal goals should be achieved.
In the remainder of the chapter, I draw on the insights from the case studies
to examine how conflicts have been resolved. First, I discuss to what extent we
may observe specific patterns in justifications. I identify four distinct visions that
actors loosely follow in their justification. Second, I assess the results compara-
tively and discuss the implications for the resolution of jurisdictional conflicts.
Third, Section 9.3 discusses potential normative and policy implications globally
and highlights avenues for further research.
Throughout this book, I have argued that conflict resolution processes must be
understood as multidimensional. Actors’ capability to engage with the impera-
tive of justification, their position, and their subscription to the illusio of the field
(Bourdieu 1991, 179) all shape their ability to create normative fit between their
jurisdictional claims and the underlying principles in the field. This, in turn, cre-
ates winners and losers. In Chapter 2, I argued that, as actors ‘seek to satisfy several
moral concerns’ (Hanrieder 2016, 412), they are expected to draw on multiple
orders when interacting with their audience(s). In this section, I discuss the inter-
pretative patterns in data governance as compromises between different orders.
I zoom out of specific conflict situations to analyse how actors use justifications
across conflicts and combine orders. The discussion of very different cases through
a common conceptual lens makes it possible to identify the stable character of
conflicts in the field. I argue that these arguments may be understood through
four distinct visions of data governance. To assess how actors draw eclectically
on distinct value orders, I combine insights from the qualitative analysis with a
more quantitative assessment of code co-occurrences to develop an empirically
grounded typology of these four distinct visions.
These visions outline both the meaning of data and data governance and pre-
scriptions for its adequate conduct that follow from this assessment. These visions
may find expression when actors link different values in their justifications. For
example, actors frequently rely on composite principles or objects which are
expected to stabilize agreement (Boltanski and Thévenot 2006, 279). They create
compatibility between principles and are broadly formulated to satisfy diverging
demands. For example, the idea of cross-border cooperation between supervisory
and enforcement authorities appeals to both the order of fairness and the order of
globalism. Actors widely employ such composite principles as ‘cross-border secu-
rity cooperation’, ‘free flow of data/information’, or the ‘protection of domestic
citizens’.
[Figure: code co-occurrences between the four value orders: Globalism (796), Fairness (1761), Production (520), and Security (531); the numbers on the links between orders indicate the frequency of co-occurrence.]
numbers, while the relative strength of linkages between orders can still give an
indication of discursive connections.¹ Combining the code co-occurrences with
insights into the qualitative case studies, I argue that we can distinguish between
four distinct visions of data governance (see also Obendiek 2022).
These visions not only represent compromises between orders but indicate the
orders’ meaning in use. In invoking distinct orders, actors link their appeals to
higher common principles with specific calls for action. These create comprehen-
sive and distinct ‘vision[s] of the world’ (Bourdieu 1991, 159) that define how data
governance is and how it should be. They designate community conceptions and
conceptualize data as a governable object. I describe these stabilizations and mate-
rializations of data as ‘epistemic objects’ (Knorr Cetina 2005, 183) through four
metaphors. For example, a local liberal vision of data governance puts forward a
conceptualization of data as a ‘body part’ attached to the individual and in need
of significant protection. These metaphors represent only a reductive understand-
ing of the characteristics of data but have significant evocative power. They show
us how actors think about data and to what extent they foreground some charac-
teristics rather than others. This gives us important insights into what actors are
likely to consider as acceptable and unacceptable policy choices. These visions also
comprise calls for action. They not only specify the abstract goals of data governance
but also define its means. These means range from far-reaching protective
measures by domestic actors to looser governance structures at the global
level. In sum, these visions offer a comprehensive understanding of the current and
future status of data governance. They resemble preliminary ideal types in need of
further (quantitative) scrutiny and can form the starting point of future research.
The visions give a general indication of how actors draw on specific combinations
of principles of worth and thereby create specific visions of the social world. How
do these visions of the social world link to imaginaries of the global order more
comprehensively?
As outlined in Chapter 2, I contextualize these visions through selective com-
parison with existing accounts of globalization ideologies (e.g. de Wilde 2019; de
Wilde et al. 2019; Steger 2013; Steger and Wilson 2012; Zürn and de Wilde 2016).
As reference points, I use the ideologies identified by de Wilde (2019). Through
a quantitative analysis of debates in contested policy areas in domestic and inter-
national fora between 2004 and 2011, de Wilde (2019) found empirical support
for liberalism, cosmopolitanism, communitarianism, and statism. He identifies
four categories: (i) the permeability of borders, (ii) the global or local outlook
concerning the allocation of authority, the perceived scope of challenges, and the
collective identity of the reference community, (iii) justifications, and (iv) moral
foundations. Element (ii), the global or local nature of community, as well as the
conceptualization of the problem, addresses the procedural dimension of worth.
The identification of justifications (iii) and moral foundations (iv) resonates with
the abovementioned substantive dimension of worth. While the former speaks to
the community whose interests or values the claim (supposedly) represents, the
latter concerns the realization and reasoning of a specific goal.² I outline these
visions in Section 9.1.1; an overview is provided in Table 9.1.

¹ In Appendix 2.2, I exclude general references to safeguards. Overall, linkages decrease considerably in absolute numbers, but the analysis offers similar results, with the exception of the connection between security and globalism, which becomes relatively stronger.
² The permeability of borders is less relevant in the context of the internet as a global structure with inherently high permeability and was therefore not considered. In view of rising control over the domestic conduct of the internet by countries such as China, however, this dimension will benefit from further research.
representative B, pers. comm. 2019). This indicates the importance of the (domes-
tic) legal system for proponents of this vision, particularly regarding strategies such
as strategic litigation.
Second, a vision of data governance I call digital economy is based on a
constellation of principles of the orders of fairness, globalism, and, predominantly,
production. It constructs data governance as a transnational space of commerce
beneficial for prosperity. This vision does not resonate immediately with any of
the identified globalization ideologies. While Zürn and de Wilde (2016) identify
a potential connection between cosmopolitan principles and neoliberalism, de
Wilde’s (2019) empirical investigation does not support this link. In data gover-
nance, this constellation of orders draws on principles such as trust to strengthen
a community feeling between customers and private companies. For instance, the
European Commission argues for strengthening legal frameworks on the assump-
tion that ‘Consumer trust in EU and third country operators will fuel and thus
benefit the European and global digital economy’ (EC 2016c, 7). This particular
compromise emphasizes the ‘free flow of data/information’ as inherently valuable
in the pursuit of the common good of mankind, and it values innovation. Data
are perceived as ‘raw material’ (Zuboff 2018, 8) that can be processed and mone-
tized. Freedom of enterprise and legal certainty for companies form cornerstones.
As I discussed in Chapters 7 and 8 with regard to Microsoft and Google and as
indicated by the references to fairness, there is an increased demand for private
responsibility. For example, the European Commission emphasizes the linkage
between data protection and strengthening the digital economy (Reicherts 2014),
particularly through notions of trust. The notion of responsibility is increasingly
relevant for private companies in view of the recent ‘techlash’ (Scott 2019). For a
long time, tech companies’ significant position in the field was largely secured by
their accumulation of capital in the form of control over vast amounts of data. This
was embedded in a discourse of ‘solutionism’, that is, the suggestion that this accu-
mulation of data contributes to solving major problems of humanity (Nachtwey
and Seidl 2020). Now, companies are increasingly trying to support this status
by creating narratives about the legitimacy of their practices, explicitly calling
for regulation (Facebook 2020) and criticizing the exploitative data practices of
competitors (Hern 2019). This strengthens their capacity to create alliances with
customers in fighting unwanted regulatory interventions (Culpepper and Thelen
2019). Tech companies seem to construct their identities according to the idea
of ‘Digital Switzerlands’ (Eichensehr 2019). As demonstrated in Chapters 1–8,
they not only emulate public justification strategies but also scrutinize the exercise
of public authority. They actively shape policies by promoting a ‘Digital Geneva
CONCLUSION 203
Fourth, the order of globalism and the order of fairness converge in a vision
of global cooperation. This vision focuses on the universal character of privacy
and data protection and demands stronger cooperation and legal harmonization
to avoid fragmentation and international discord. For example, with regard to
the Umbrella Agreement, the European Commission emphasizes the significant
improvement compared with ‘the present day situation which is characterised by
fragmented, non-harmonised and often weak data protection rules in a patchwork
of multilateral, bilateral, national and sectorial instruments’ (EC 2016c, 14). This
vision emphasizes a global reference community and demands respect for other
communities’ needs (e.g. ICDPPC 2005). There is some overlap with the ideology
of cosmopolitanism identified by de Wilde (2019). For example, the UN Special
Rapporteur emphasizes the universal human rights character of privacy (UN Spe-
cial Rapporteur on the Right to Privacy 2017; UNHRC 2018). This is consistent
with approaches that highlight the importance of political accountability levels:
global actors advocate for global values (de Wilde et al. 2019, 32). In a recent
statement, the EDPS also articulated a more cosmopolitan vision of data gover-
nance, arguing that ‘The protection of personal data requires actionable rights
for everyone, including before independent courts. It is more than a “European”
fundamental right—it is a fundamental right widely recognised around the globe’
(EDPS 2020, emphasis in original). References also relate to multilateral cooper-
ation through institutions or cross-border enforcement networks. For example,
the US Federal Trade Commission argues that ‘Meaningful protection for such
data requires convergence on core principles, an ability of legal regimes to work
together, and enhanced cross-border enforcement cooperation’ (FTC 2013, 7).
While there is a demand for common principles and enforcement, there seems to
be a general recognition of the need for the interoperability of frameworks, even
if that implies a lower level of protection compared with that of the local liberal
vision (ICDPPC 2005; UNHRC 2018). The qualitative case studies indicate that
the vision of global cooperation seems to be less decisive in the resolution of juris-
dictional conflicts, as none of the outcomes or challenges is based on this vision.
However, the relative strength of the connection between fairness and globalism
as illustrated in Figure 9.1 and the number of references indicate that actors draw
on similar principles simultaneously. The limited relevance in outcomes might
speak to bias in the selection of cases, which are mostly limited to the transatlantic
context.
In conclusion, I argue that in their attempt to justify jurisdictional claims, actors
draw on four distinct visions or social realities of data governance. There seem to
be some similarities between the globalization ideologies observed by IR scholars
and the order constellations I suggest in the area of data governance, particularly
with regard to the local vision of liberalism. However, there are also important
differences. For example, the central category of the permeability of borders is
difficult to conceptualize and useful only to a limited extent in the
context of the internet as an inherently global structure. The analysis also points
to a potential conceptual shortcoming in current globalization debates: their
neglect of the prominence of private actors and particularly of the normativity that
is assigned to the degree of ‘publicness’. While the distinction between private
and public is fluid, particularly in this area but also in other globalized contexts,
actors’ justifications strongly invoke such binaries. Considering the relevance of
private actors in other globalized arenas, it seems surprising that this distinction
has not been more comprehensively included. This short comparison highlights
the need to consider the specific context of the internet more comprehensively
in the conceptualization of globalization ideologies. The plurality of visions also
suggests that the view of competing privacy- and security-focused coalitions is in
many ways too simple. At the very least, membership varies as actors shift between
these coalitions. I elaborate on the implications for conflict resolution processes in
Section 9.2.
Each of the chapters has highlighted different elements of the process of conflict
resolution and the book’s central questions regarding the responses, the stabi-
lization of agreement, and the interlinkages with the field of data governance. I
have illustrated that major conflicts in data governance share a common char-
acteristic: they emerge when actors articulate overlapping ‘jurisdictional claims’,
that is, claims of legitimate control (Abbott 1986, 191). I have argued that in addi-
tion to the well-explored institutional and power dimensions of these conflicts,
there are deep but undertheorized normative divisions. While it adds complexity
to already complex inter- and transnational dynamics, I argue that the analysis
of such normative divisions in jurisdictional conflicts can contribute to a better
understanding of the broader trajectory of the field of data governance at large.
Through a bifocal approach focusing on specific conflicts and their interlinkages
Table 9.2 Continued

Disruption based on: Local liberal | Local liberal | Local liberal | Local/global liberal | Local liberal
Outcome: Continued disruption | Formalization and expansion of existing practices, limited revisions | Formalization of existing practices, limited revisions | Formalization of existing practices, ongoing | New institutional structure, limited changes in second phase
Outcome based on: Uncertain | Security partnership | Security partnership | Security partnership | Local liberal
Convergence with: Uncertain | Global economic | — | Global economic | —
Normative consequences: Uncertain | Entrenchment of alternative principles | Entrenchment of alternative principles | Entrenchment of alternative principles | Shift to alternative principles
security coalitions, as it runs against the interests of the vast majority of the pow-
erful actors in the field, including the US government, intelligence agencies, the
European Commission, many member states, and major private companies. How-
ever, the peripheral position of actors in the field is notably not necessarily tied to
their power more generally. For example, Microsoft is clearly a powerful actor but
assumes a more peripheral position on law enforcement data, which prompted
the company to issue a foundational challenge (see Chapter 7). As the relationship
between the company and law enforcement agencies becomes formalized, how-
ever, the company is increasingly integrated into the field. The analytical focus
on dissimilar social realities offers an alternative account of how such disruptions
of powerful interests are possible when peripheral actors, through legal or other
means, gain opportunities to demand justifications from dominant actors in the
field.
This high potential for disruption also means that conflicts are rarely fully
resolved. In all cases but the electronic evidence example (see Chapter 8), which
is likely to become subject to challenge in the future, conflict outcomes have been
contested. This seems to confirm the continuous character of conflicts that has
been emphasized in both the empirical (Farrell and Newman 2019) and theoreti-
cal literature (Abbott 1986; Boltanski and Thévenot 2006; Fligstein and McAdam
2012). Thus, there is rarely a clear-cut end, as underlying normative differences
remain. Compromises may dissolve in response to external shocks or new infor-
mation, such as surveillance revelations or scandals involving private companies,
shifting notions of identity, such as increased perceptions of company responsibil-
ity, or institutional shifts in formal power, such as after the Lisbon Treaty. As Faude
and Große-Kreul (2020) outline, overlap between institutions may enable actors
that face limited recognition in one institution to draw on norms or rules embod-
ied by an alternative institution to justify their claims. The book has outlined the
empowerment of actors in the EU to contest transatlantic or even US data gov-
ernance practices. However, this empowerment stands in contrast to the limited
effects on domestic activities, particularly surveillance practices by EU member
state intelligence services (see also Bigo et al. 2013; Farrell and Newman 2019).
Second, and notwithstanding the potential for disruption, jurisdictional con-
flict resolution outcomes may be stabilized over an extended period when actors
succeed in entrenching a specific conceptualization of data. Examples such as the
expansion of Passenger Name Record (PNR) sharing and the persistence of the
TFTP agreement in the face of contestations (Chapters 5–6) are often related to the
conceptualization of data as a security tool. While data access by public authorities
contributed to the emergence of significant disputes in the immediate post-9/11
period, surveillance practices are increasing (Fischer-Lescano 2016; Viola and
Laidler 2022). While the securitization literature prioritizes the intentionality of
selected actors, the findings emphasize processes of incremental change (Williams
2003, 521). For example, through the concept of a ‘chain of security’ (de Goede
Naurin 2016), but there is little systematic qualitative research on the evolution of
decisions of the Court in more recent years, particularly from a political science
perspective (but see Bois 2021).
In the early 2000s, the Court seemed to be ‘disinclined to back a privacy rights-
based approach’ (Farrell and Newman 2019, 35), as was illustrated in the PNR
case in Chapter 5. However, the recent invalidation of Privacy Shield seems to
confirm an increasing willingness to entrench data protection measures and a
more general expansion of fundamental rights jurisprudence by the CJEU (Burchardt
2020, 2). In 2014, the CJEU surprisingly, and against the recommendation of
Advocate General Niilo Jääskinen, backed the ‘right to be forgotten’ and thus increased
obligations on private companies, and in the Digital Rights Ireland case (2014), the
Court invalidated the European Data Retention Directive. In addition, it
confirmed the unlawfulness of member states’ general data retention obligations
just two years later (Joined Cases C-203/15 and C-698/15 2016). As in the recent
Canada PNR case (CJEU 2017), in the Schrems cases (Schrems 2015; Data
Protection Commissioner 2020), the Court entrenched the evaluation of data gov-
ernance according to principles based in the order of fairness. This also entrenched
the local liberal vision of data governance, despite serious political consequences.
In a recent judgment, the Court further attempted to strengthen its claim over
data processing by member states for security purposes (Joined Cases C-511/18,
C-512/18 and C-520/18 2020; Privacy International 2020b). Yet assertions of
extraterritorial jurisdictions or the increased fundamental rights jurisprudence
might contribute to political backlash or pushback, including that from domes-
tic constitutional courts (Burchardt 2020). Therefore, while recent literature has
suggested that the Court enjoys broad support with its main audience of legal pro-
fessionals (Bois 2021), it will be crucial to examine the interlinkages between the
legal, political, and normative dimensions that play out in such seemingly legal
conflicts.
At its core, this book has attempted to outline how actors deal with the significant
uncertainties about the governance of digital data. Data have unique features
that create challenges for successful governance, particularly in a world shaped by
territorial rules and domestic conduct. Data can be transferred across the globe in
an instant; they can be multiplied with very limited efforts but are also linked to
physical structures such as fibre-optic cables and storage centres. Empirical case
studies have illustrated the complexities and tensions that actors need to resolve
when addressing fundamental questions: who, if anybody, owns data? What are
the objectives of data use, processing, and protection practices? Who is and should
be affected by those practices? And, most importantly, who may answer such ques-
tions in light of ever-increasing inter- and transnational linkages? The book has
illustrated that, as much as actors share an opinion of what is at stake in the field
(Bourdieu and Wacquant 1992), there is no ‘consensual “taken for granted” real-
ity’ (Fligstein and McAdam 2011, 4–5). Actors express references to the shared
understanding that data governance is meaningful in the pursuit of the common
good but articulate different conceptions of what they perceive as shared societal
goals. The book provides a novel account of what these realities might look like
in data governance and establishes the conceptual groundwork for comparative
perspectives into other geographical contexts as well as areas of global gover-
nance that share tendencies of jurisdictional overlap, such as internet governance,
international financial regulation, or environmental governance.
What do these findings mean for the future of transatlantic data governance?
The 2020 judgment in Schrems II in particular makes clear that the challenge
in overcoming existing legal, political, and normative divisions is serious. The
CJEU seems unwilling to rubber-stamp institutional solutions that do not ade-
quately protect EU citizens’ rights. The protracted negotiations show that Safe
Harbour 3.0 entails significant political and economic risks. The judgment also
highlights that existing transparency, oversight, review, and redress mechanisms
must depart from their largely superficial character to include specific safeguards
and more effective constraints on bulk surveillance. Yet at the time of writing in
early 2022, negotiations still seem to be headed towards a revision of the current
framework rather than a complete overhaul. Considering the current situation,
this is hardly surprising. Despite calls to reform US government surveillance,
change has not been significant and does not seem very likely (Viola and Laidler 2022),
while the EU suffers from a lack of credibility. Somewhat paradoxically, the EU’s
legal order provides increased legislative scrutiny for surveillance measures of
third countries, while surveillance by member states’ intelligence services remains
largely unrestricted. Domestic courts such as the German constitutional court
have found surveillance by intelligence services to be too far-reaching (1 BvR
2835/17 -, Rn. 1-332 2020), but, due to national security exceptions, neither the
Commission nor the CJEU has much of a say. While two judgments by the CJEU
seem to indicate that the Court at least attempts to expand its competences in this
area (Joined Cases C-511/18, C-512/18 and C-520/18 2020; Privacy International
2020b), ongoing struggles to re-establish comprehensive data retention provisions
in various member states show that governments are hesitant to back down. In
light of these derogations by member states, it seems unlikely that US authorities
would be willing to seriously reform intelligence practices. To increase its credibil-
ity vis-à-vis the US, the EU needs to address these concerns more comprehensively
and problematize surveillance by its member states. Otherwise, it is likely that the
most stable solution, a comprehensive transatlantic data agreement, will stay out
of reach.
Yet the EU’s cooperation with the UK, which is also known for its extensive
surveillance activities, may foreshadow a potential path for cooperation. The Euro-
pean Commission was relatively quickly able to grant an adequacy decision, which
was also based on the existence of a comprehensive data protection law. The deci-
sion has a four-year sunset clause and, while including law enforcement data, has
some restrictions, for example exempting data transfers for immigration control,
which were considered to be too far-reaching. The fact that the Commission was
able to quickly grant an adequacy decision to the UK might indicate how the US,
once it has a federal privacy law in place, could more easily circumvent a com-
prehensive discussion due to the formal existence of legal safeguards. Yet, at least
for the moment, Congress does not seem to be making significant progress in this
regard. If both sides fail to move, transatlantic commercial data sharing might be
headed towards an increasingly conflictual future, perhaps even deadlock, as
frustration increases. As the European Commission (2020e) has, with the Digital
Services Act package, introduced a comprehensive and ambitious set of rules on
competition, market structure, content, and services that will shape the digital space
in Europe and beyond, the transatlantic partnership needs to work on cooperative
solutions.
From the tentative answers to the questions and problems that motivated this
book, inevitably more questions arise. The book has focused on transatlantic con-
flicts as the most influential context and main arena of data governance to date.
Yet these insights provide important starting points for thinking about data gov-
ernance in diverse geographical contexts. A significant portion of data governance
frameworks is still to be negotiated. With further jurisdictional conflicts likely to
emerge, this section focuses on the global implications of the book’s findings while
sketching avenues for further research.
The first and major game changer in data governance is constituted by the rise
of China as a regulatory power. In recent years, China has significantly expanded
its regulatory efforts to establish a comprehensive data governance regime. After
the incremental adoption of data protection provisions in various sectors such as
finance, telecommunications, education, and healthcare, and regulatory efforts
targeting online content, critical infrastructure, and encryption, these efforts culminated in
the adoption of the Data Security Law and the Personal Information Protection
Law in 2021. Investigating more closely the visions that guide Chinese data gov-
ernance policies should, therefore, be a priority for further research. While it is
often difficult to get access to documents that outline such guiding principles and
collective values, particularly in the inter- and transnational realm, the Chinese
approach stands out because of its exceptionally strong focus on state control.
First, while similar in many provisions to frameworks in the EU or the US,
Chinese regulations constrain data collection and use practices by tech companies
much more significantly than state activities. As Creemers (2021, 2) outlines, the
motives that drive data protection frameworks in China are much more anchored
in national security and public interest perspectives. Second, Chinese regulatory
frameworks establish extensive data localization obligations requiring personal
data to be stored in China. Any cross-border transfers are subject to the control of
the Cyberspace Administration of China, which thereby gains significant insights
into data flows. Third, Chinese influence on transnational data governance only
marginally relies on regulation. Instead, Erie and Streinz (2022) argue, the ‘Beijing
effect’ shapes data governance through extensive Chinese influence on standard-
setting and the provision of infrastructure. The vast infrastructure investments
through the Chinese Belt and Road Initiative in particular have sparked major dis-
cussions about Chinese influence in Africa, Asia, and Europe (Shen 2018). While
perhaps unique in its extensiveness, the Chinese approach to data governance is
indicative of broader ongoing trends in data and internet governance, as tensions
between self-determination, democracy, and hyperglobalization (Haggart 2020)
have contributed to sovereigntist assertions across the globe. Furthermore, the
Chinese approach is likely to impact regulatory and infrastructural initiatives, par-
ticularly in developing countries. This may result in an increasing regionalization
of regulatory approaches with distinct and competing EU, US, and Chinese or
sovereigntist models (see also Flonk et al. 2020). I examine the likely impacts of
a sovereign perspective on data, data localization policies, and influence through
infrastructure in turn.
Another popular measure to strengthen public control has been the adoption
of data localization policies, which demand data storage in
a specific jurisdiction. Countries may adopt such policies to increase control
over sensitive data such as health data or biometric information about their
population, prevent foreign access to sensitive data, or simply restrict company
conduct to give domestic businesses an advantage. Data localization policies
have been one factor driving concerns about internet fragmentation along ter-
ritorial borders (Haggart et al. 2021). The US in particular tries to counteract
these tendencies in trade agreements and international fora such as the WTO,
emphasizing data localization policies as a trade problem (Selby 2017). While
it is important to recognize that various actors have used this justification to
lobby against data protection regulations such as the GDPR, data localization
policies have frequently been found to fail to address challenges of data
governance, including privacy protection (Brehmer 2018; Selby 2017). Through
provisions such as those established in the CLOUD Act that authorize access by
public bodies or law enforcement irrespective of data location, any additional
protections provided by local data storage are easily circumvented. Nonethe-
less, data localization policies have been frequently used to provide evidence of
data protection- or privacy-compliant policies. Yet this particularly concerns nar-
row data localization policies, for example those of Germany or France. While
the increasing adoption of data governance measures has also resulted in an
9.3.4 Infrastructure
Another aspect that warrants further research is the role of other layers of inter-
net governance such as technical standards or infrastructure. Empirical cases,
particularly in the security context, have demonstrated the immense difficulty of
contesting a system once it has been put in place, but physical infrastructures
and technical decision-making processes make this insight even more pressing.
As Bellanova and de Goede illustrate, public values are ‘built into transatlantic
data infrastructures’ (2022, 103). Further research could explore how distinct
visions of data governance are represented in diverse infrastructural and technical
settings. A crucial starting point would be to scrutinize the design of data infras-
tructures. Again, China is the primary example of exercising exceptional control
through its infrastructure domestically. The comprehensive data control infras-
tructure known as the Great Firewall (Erie and Streinz 2022, 43–4) controls traffic
to, from, and within China. While this requires extremely high levels of capacity,
test runs of a sovereign internet in Russia (Sherman 2022) show that the Chi-
nese model is seen as a potential blueprint for other countries. As Russia is facing
sanctions for its invasion of Ukraine, the relevance of such a sovereign internet
further increases. Furthermore, China has been engaged in efforts to proactively
export its model through massive infrastructure investment and the export of
surveillance technologies. This has caused major concerns about the potential
rise of digital repression, as some countries, for example in sub-Saharan Africa,
would be unlikely to gain access to sophisticated surveillance equipment without
China (Feldstein 2021, 240–1). Yet case studies analysing reliance on digital tools
and infrastructures in Pakistan (Erie and Streinz 2022, 66–83), Thailand, or the
Philippines (Feldstein 2021) show that the influence of China is less clear than
often propagated. The availability of surveillance tools and a potentially increased
capacity for digital repression do not mean that countries employ them. Instead,
their use depends significantly on pre-existing resources and political interests in the host
country. In addition, both democracies and authoritarian countries are engaged in
buying and selling digital surveillance technologies. The recent disclosures about
the Israeli NSO Group are a case in point. The firm’s Pegasus spyware
covertly gathers data such as text messages from mobile phones and may
also activate the phone’s microphone or camera without users noticing (Guardian
2021). The software has been employed by both authoritarian and democratic
governments such as Hungary, Rwanda, or the United Arab Emirates to target
journalists and political activists.
What does this mean for global data governance? This book has tried to bring
nuance to typical binaries: EU versus US, privacy versus security, power versus
values. It has demonstrated that data governance is characterized by a plurality of
conceptions of worth—there are several value orders, not one. Further research
could, therefore, compare value orders as promoted by different actors and their
success rates. Will the Chinese influence prevail via the Beijing effect or will the
EU shape data governance via its regulatory reach? Or will the US prevail through
its dominance of internet governance institutions and discourse? While the US
has long enjoyed a hegemonic position in internet governance, its approach to data
governance has been characterized by piecemeal data protection
and comprehensive data access for law enforcement and intelligence services.
Further research could examine tech companies’ notions of responsibility more
closely. What role do highly qualified employees play in negotiating the prioritization of values within companies? With the GDPR, the EU has strengthened
its position as a data protection vanguard, potentially spurring a regulatory race
to the top via the Brussels effect. Countries like Japan and Korea have upgraded
their data protection provisions, while others are still in the process of changing
their legislative frameworks. In Brazil, the Lei Geral de Proteção de Dados came
into effect in 2020, including measures similar to those of the GDPR. In India, the
2017 judgment of the Supreme Court declaring privacy to be a fundamental
right has given renewed emphasis to the adoption of a comprehensive data protection
law, which currently includes, for example, provisions comparable to the right to be
forgotten (Dhavate and Mohapatra 2022). Following a comprehensive review
of the proposed legislation, the Data Protection Bill is currently under debate in
the Indian Parliament. With India, Indonesia, and Pakistan overhauling their data
protection measures, roughly 2 billion people will be presented with new mea-
sures and rights in the near future. At the same time, the simple imposition of,
for example, liberal norms or rules is likely to increase ‘normative friction’ (Kette-
mann 2020, 152). When actors try to impose community-specific norms globally,
they risk undermining other communities’ collective self-determination. It is also
important to point out that even the EU’s approach has been far from uniform.
Besides weakening transparency and data protection standards for counterterror-
ism measures or law enforcement, the EU has also been involved in exporting
surveillance tools. For example, the EU is funding the development of biomet-
ric population databases in Côte d’Ivoire and Senegal by the company Civipol,
which is partly owned by the French government, with €30 million and €28
million, respectively. While Senegal has signed the CoE’s Convention 108, nei-
ther country fulfils the adequacy standards of the European Commission, which
makes the centralized databases containing sensitive information potentially vul-
nerable to abuse. In addition, EU documents suggest that a key objective of setting
up these databases is to facilitate the deportation of foreign nationals from the EU
(Privacy International 2020a).
Therefore, further research should move beyond justificatory practices to inves-
tigate more closely the implementation of policies both externally and internally.
It could also analyse, for example, the influence of individual data protection bod-
ies. How are data governance measures implemented in Argentina, Ireland, or
Senegal? How do access and correction requests work? While I have focused on
policymakers and major companies, bringing attention to how the broader public
perceives the collective values at stake in data governance, through survey
research or newspaper analyses, is crucial. Do people conceptualize data differently
in China from how they do in the US? Under what conditions would people decide to
restrict or allow data transfers? In addition, further research should aim for a more
normative assessment of data governance, evaluating the ethical implications of
trends, including the rise of sovereignty, the simultaneous diversification and cen-
tralization of power globally, and the growing importance of data governance for
areas such as security, development, and democracy.
List of Interviews
A.2.1 Codebook
1 Fairness
1.1 Data as a (Fundamental) Rights Concern
1.2 Reference Community: Individuals
1.2.1 Multistakeholderism
1.2.2 Whistle-blowers/Human Rights Defenders
1.2.3 Vulnerable Groups
1.2.4 Protection of Individuals/Users
1.3 Dystopian Scenario: Repression
1.3.1 Data Sharing with Other Authorities
1.3.2 Abusive Practices
1.3.3 Lack of Oversight/Control/Review
1.3.4 Lack of Democratic Process
1.3.5 Vagueness/Ambiguity
1.3.6 Lack of Bindingness
1.3.7 Lack of Enforcement
1.3.8 Lowering Standards
1.3.9 Lack of Transparency
1.3.10 Lack of Compliance
1.3.11 Undermining Principles/Liberal Values
1.3.11.1 By Private Actors
1.3.11.2 By Democratic States
1.3.11.3 Media/Press Restrictions/Censorship
1.3.11.4 Large-Scale Collection/Bulk Collection
1.3.11.5 Mass Surveillance
1.4 Virtues/Principles: Liberal Partnership, Human Rights
1.4.1 Public Responsibility to Protect
1.4.2 Human Rights/Fundamental Freedoms
1.4.2.1 Right to Access/Access to Information
1.4.2.2 Freedom of Expression/Speech
1.4.3 Information Society for the Public Good
1.4.4 Free, Open Cyberspace
1.4.5 Protection of Rights/Values
1.4.5.1 Rule of Law
1.4.5.2 Public Interest, incl. Democratic Society
1.4.5.3 Private Responsibility
1.4.5.4 Robust Safeguards—General
1.4.5.4.1 Data Finality
1.4.5.4.2 Precise/Clear Rules/Agreement
1.4.5.4.3 Binding Agreements
5 Globalism
5.1 Conceptualization of Data as Fluid
5.1.1 Borderless Nature of the Internet
5.1.2 Free Flow of Data/Information
5.2 Reference Community: Global/Other Communities
5.3 Dystopian Scenario: Fragmentation, Discord
5.3.1 Misunderstanding of Legal Framework
5.3.2 Undermining of Trust
5.3.3 Fragmentation
5.3.3.1 Conflicting Rules/Norms
5.3.3.2 Negative Consequences of Unilateral Action
5.3.4 Outdated Law
5.4 Virtues/Principles: International Institutions, Expertise
5.4.1 Urgency of Agreement
5.4.2 Reform Measures
5.4.3 Cross-Border Framework/Standards
5.4.3.1 Legal Harmonization
5.4.3.2 Global Framework
5.4.4 Strong Institutions
5.4.5 Interoperability
5.4.6 Conflict of Laws Avoidance
5.4.7 Credibility
5.4.8 Multilateral Agreement/Institution
5.4.9 Bilateral Agreement
5.4.10 International Law
5.4.11 Compliance
5.4.12 Reciprocity
5.4.13 Cooperation
5.4.13.1 Compromise
5.4.13.2 Transatlantic Partnership
5.4.13.3 Dialogue/Exchange
5.4.13.4 Assurances
5.4.13.5 Sharing Expertise
5.5 Proving Worth/Deficiency
5.5.1 Fast Solution
5.5.2 Reiteration of Commitment
5.5.3 Reference to International Case Law
5.5.3.1 Reference to Other Communities’ Practices
6 Balanced Interests
7 Call for Action
[Figure A.2.1: co-occurrence network between the orders of Globalism (796), Fairness excl. safeguards (1511), Production (404), and Security (531); edge labels indicate co-occurrence counts.]
to the analysis visualized in Chapter 9; differences, albeit not unidirectional, are limited to
the connection between security and globalism.
The visualization and analysis showed relatively strong relationships between the order
of fairness and the other orders. The analysis of code co-occurrences in more detail, that
is, checking for individual intersections between codes, indicated that some but not all of
these co-occurrences relate to strong normative underpinnings. As highlighted in Chapter
9, references to the order of fairness are common attempts to increase the legitimacy of
policies. To minimize references to vague principles, I repeated the analysis without the
code ‘robust safeguards’, which comprises general references to safeguards without speci-
fying their nature. While references to ‘necessity’ or ‘proportionality’ or the protection of
‘privacy as a fundamental/human right’ were included, general mentions of ‘safeguards’
were excluded. The results are displayed in Figure A.2.1. Compared with the original
analysis, the links between security and production and the order of fairness decreased
considerably, potentially indicating that many of these references were general and
unspecific, with little substantive connection.
In the main analysis, I counted only co-occurrences involving intersecting codes,
that is, direct overlap between coded segments. In Figure A.2.2, I display a visualization
that shows co-occurrence within ten paragraphs (proximity). This includes not only direct
overlap but also justifications from different orders that are put forward in close proxim-
ity. This analysis also offers similar results, with the exception of the connection between
security and globalism, which is less clear.
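The two co-occurrence measures used above (direct overlap of coded segments versus co-occurrence within ten paragraphs) can be sketched as follows. This is an illustrative reconstruction only: the segment representation, example codes, and counts are invented for illustration, not taken from the actual coded corpus or the analysis software.

```python
from itertools import combinations

# Hypothetical coded segments: (code, start_paragraph, end_paragraph).
segments = [
    ("fairness", 3, 4),
    ("security", 4, 5),
    ("globalism", 12, 12),
]

def co_occurrences(segments, max_gap=0):
    """Count code pairs whose segments intersect (max_gap=0) or lie
    within max_gap paragraphs of each other (proximity measure)."""
    counts = {}
    for (c1, s1, e1), (c2, s2, e2) in combinations(segments, 2):
        if c1 == c2:
            continue
        # Paragraph gap between the two ranges; 0 means they overlap.
        gap = max(s1 - e2, s2 - e1, 0)
        if gap <= max_gap:
            pair = tuple(sorted((c1, c2)))
            counts[pair] = counts.get(pair, 0) + 1
    return counts

print(co_occurrences(segments))              # intersection only
print(co_occurrences(segments, max_gap=10))  # within ten paragraphs
```

Under the intersection measure only fairness and security co-occur here; the proximity measure additionally links globalism to both, mirroring how the proximity analysis surfaces justifications from different orders advanced near one another.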
[Figure A.2.2: proximity-based co-occurrence network between the orders of Globalism (796), Sovereignty (777), Fairness (1761), Production (520), and Security (531); edge labels indicate co-occurrence counts.]
Advisory Council to Google. 2015. ‘The Advisory Council to Google on the Right
to Be Forgotten’. https://static.googleusercontent.com/media/archive.google.com/de//
advisorycouncil/advisement/advisory-report.pdf, accessed 3 July 2022.
Aguinaldo, Angela, and Paul de Hert. 2021. ‘European Law Enforcement and US Data
Companies: A Decade of Cooperation Free from Law’. In Data Protection beyond Bor-
ders: Transatlantic Perspectives on Extraterritoriality and Sovereignty, edited by Federico
Fabbrini, Edoardo Celeste, and John Quinn, 157–72. London: Bloomsbury.
Allan, Bentley B. 2017. ‘Producing the Climate: States, Scientists, and the Constitution of
Global Governance Objects’. International Organization 71 (1): 131–62.
Allen, Anita L. 1988. Uneasy Access: Privacy for Women in a Free Society. Totowa, NJ:
Rowman & Littlefield.
Alter, Karen J., Emilie M. Hafner-Burton, and Laurence R. Helfer. 2019. ‘Theorizing the Judicialization of International Relations’. International Studies Quarterly 63 (3): 449–63. doi: 10.1093/isq/sqz019.
Alter, Karen J., and Sophie Meunier. 2009. ‘The Politics of International Regime Complex-
ity’. Perspectives on Politics 7 (1): 13–24. doi: 10.1017/S1537592709090033.
Amicelle, Anthony. 2011. ‘The Great (Data) Bank Robbery: Terrorist Finance Track-
ing Program and the “SWIFT Affair”’. Research Questions 36. Centre d’Études
et de Recherches Internationales Sciences Po. https://www.sciencespo.fr/ceri/sites/
sciencespo.fr.ceri/files/qdr36.pdf, accessed 3 July 2022.
Amicelle, Anthony. 2013. ‘The EU’s Paradoxical Efforts at Tracking the Financing of Terror-
ism: From Criticism to Imitation of Dataveillance’. CEPS Liberty and Security and Europe
56.
Amoore, Louise. 2011. ‘Data Derivatives: On the Emergence of a Security Risk Calculus for
Our Times’. Theory, Culture & Society 28 (6): 24–43. doi: 10.1177/0263276411417430.
Amoore, Louise. 2013. The Politics of Possibility: Risk and Security beyond Probability.
Durham: Duke University Press.
Amoore, Louise. 2020. Cloud Ethics. New York: Duke University Press. doi:
10.1515/9781478009276.
Amoore, Louise, and Marieke de Goede. 2005. ‘Governance, Risk and Dataveillance in the
War on Terror’. Crime, Law and Social Change 43 (2–3): 149–73.
Anagnostakis, Dimitrios. 2017. EU–US Cooperation on Internal Security: Building a
Transatlantic Regime. Abingdon and New York: Routledge.
Anderson, John Ward. 2006. ‘Belgium Rules Sifting of Bank Data Illegal’. Washington Post,
29 September 2006. http://www.washingtonpost.com/wp-dyn/content/article/2006/09/
28/AR2006092801846.html, accessed 3 July 2022.
APEC. 2005. ‘APEC Privacy Framework’. APEC#205-SO-01.2. CTI Sub-Fora & Industry
Dialogues Groups, Digital Economy Steering Group (DESG).
Appiah, Kwame Anthony. 2010. The Ethics of Identity. Princeton, NJ: Princeton University
Press.
Apple, Facebook, Google, Microsoft, and Oath. 2018. ‘Joint Letter to Senator Orrin Hatch,
Senator Christopher Coons, Senator Lindsey Graham, Senator Sheldon Whitehouse’.
https://blogs.microsoft.com/datalaw/wp-content/uploads/sites/149/2018/02/Tech-
Companies-Letter-of-Support-for-Senate-CLOUD-Act-020618.pdf, accessed 3 July
2022.
Aradau, Claudia. 2008. ‘Forget Equality? Security and Liberty in the “War on Terror”’.
Alternatives: Global, Local, Political 33 (3): 293–314.
Aradau, Claudia. 2020. ‘Experimentality, Surplus Data and the Politics of Debilitation in
Borderzones’. Geopolitics, December, 1–21. doi: 10.1080/14650045.2020.1853103.
Archibugi, D., D. Held, and M. Köhler. 1998. Re-Imagining Political Community: Studies in
Cosmopolitan Democracy. Stanford, CA: Stanford University Press.
Arendt, Hannah. 1958. The Human Condition. 2nd edn. Chicago: University of Chicago
Press.
Arendt, Hannah. 2017. ‘Freedom and Politics’. In Liberty Reader, edited by David Miller,
58–79. London: Routledge.
Argomaniz, Javier. 2009. ‘When the EU Is the “Norm-Taker”: The Passenger Name Records
Agreement and the EU’s Internalization of US Border Security Norms’. European Integra-
tion 31 (1): 119–36.
Argomaniz, Javier. 2011. The EU and Counter-Terrorism: Politics, Polity and Policies after
9/11. New York: Routledge.
Argomaniz, Javier. 2015. ‘European Union Responses to Terrorist Use of the Internet’.
Cooperation and Conflict 50 (2): 250–68.
Argomaniz, Javier, Oldrich Bures, and Christian Kaunert. 2017. EU Counter-Terrorism and
Intelligence: A Critical Assessment. London: Routledge.
Aristotle. 2012. Aristotle’s Nicomachean Ethics. Translated by R. C. Bartlett and S. D. Collins.
Chicago: University of Chicago Press.
Article 19. 2017. ‘The Global Principles on Protection of Freedom of Expression and
Privacy’. http://article19.shorthand.com/, accessed 3 July 2022.
Article 19 and Others. 2017. ‘In the CJEU Case C-507/17: Written Observations of Article 19
and Others’. Court of Justice of the European Union.
ASEAN. 2016. ‘ASEAN Telecommunications and Information Technology Ministers Meeting (Telmin): Framework on Personal Data Protection’. Brunei Darussalam.
Association of European Airlines. 2007. ‘European Airline Body Dismayed at Pro-
posal for EU-PNR System’. https://www.statewatch.org/news/2007/nov/eu-pnr-airlines-
reactions.pdf, accessed 3 July 2022.
AU. 2014. African Union Convention on Cyber Security and Personal Data Protection.
EX.CL/846(XXV ).
Autolitano, Simona, Anja Dahlmann, Tommaso De Zan, and Vincent Joubert. 2016.
‘EUnited against Crime: Improving Criminal Justice in European Union Cyberspace’. IAI
Istituto Affari Internazionali, November. https://www.iai.it/en/pubblicazioni/eunited-
against-crime-improving-criminal-justice-european-union-cyberspace, accessed 4 July
2022.
Avant, Deborah D., Martha Finnemore, and Susan K. Sell. 2010. Who Governs the Globe?
Cambridge and New York: Cambridge University Press.
Aviation and Transportation Security Act. 2001. Vol. S. 1447.
Azurmendi, Ana. 2017. ‘Spain: The Right to Be Forgotten; the Right to Privacy and
the Initiative Facing the New Challenges of the Information Society’. In Privacy, Data
Protection and Cybersecurity in Europe, edited by Wolf J Schünemann and Max-Otto
Baumann, 17–30. Cham: Springer.
Bagger Tranberg, Charlotte. 2011. ‘Proportionality and Data Protection in the Case Law
of the European Court of Justice’. International Data Privacy Law 1 (4): 239–48. doi:
10.1093/idpl/ipr015.
Baker, Steven. 1994. ‘Don’t Worry Be Happy: Why Clipper Is Good for You’. http://groups.
csail.mit.edu/mac/classes/6.805/articles/baker-clipper.txt, accessed 4 July 2022.
Bąkowski, Piotr, and Sofija Voronova. 2015. ‘The Proposed EU Passenger Name
Records (PNR) Directive: Revived in the New Security Context’. European Parliamen-
tary Research Service Blog, May. https://epthinktank.eu/2015/05/04/the-proposed-eu-
passenger-name-records-pnr-directive-revived-in-the-new-security-context/, accessed
4 July 2022.
Ball, Kirstie, and Laureen Snider. 2013. The Surveillance-Industrial Complex: A Political
Economy of Surveillance. Abingdon and New York: Routledge.
Balzacq, Thierry. 2007. ‘The Policy Tools of Securitization: Information Exchange, EU
Foreign and Interior Policies’. Journal of Common Market Studies 46 (1): 75–100. doi:
10.1111/j.1468-5965.2007.00768.x.
Barendt, Eric. 2005. Freedom of Speech. Oxford: Oxford University Press.
Bartlett, Jamie, and Louis Reynolds. 2015. The State of the Art 2015: A Literature Review
of Social Media Intelligence Capabilities for Counter-Terrorism. London: Demos.
Bauman, Zygmunt, Didier Bigo, Paulo Esteves, Elspeth Guild, Vivienne Jabri, David Lyon,
and R. B. J. Walker. 2014. ‘After Snowden: Rethinking the Impact of Surveillance’.
International Political Sociology 8 (2): 121–44. doi: 10.1111/ips.12048.
Beatty, Andrew. 2003. ‘Decimated Airlines Seek Solution to Passenger Data Feud’. EUob-
server, 26 March 2003. https://euobserver.com/justice/10701, accessed 4 July 2022.
Beckerman, Michael. 2015. ‘Examining the EU Safe Harbor Decision and Impacts for
Transatlantic Data Flows’. Washington DC.
Beitz, Charles R. 1979. ‘Bounded Morality: Justice and the State in World Politics’. Interna-
tional Organization 33 (3): 405–24.
Beitz, Charles R. 2001. ‘Human Rights as a Common Concern’. American Political Science
Review 95 (2): 269–82.
Bell, Daniel A. 1993. Communitarianism and Its Critics. Oxford: Clarendon Press.
Bell, Daniel A. 2005. ‘A Communitarian Critique of Liberalism’. Analyse & Kritik 27 (2):
215–38.
Bellanova, Rocco, and Marieke de Goede. 2022. ‘The Algorithmic Regulation of Security:
An Infrastructural Perspective’. Regulation & Governance 16 (1): 102–18.
Bellanova, Rocco, and Denis Duez. 2012. ‘A Different View on the “Making” of European
Security: The EU Passenger Name Record System as a Socio-Technical Assemblage’.
European Foreign Affairs Review 17 (2): 109–24.
Bénatouïl, Thomas. 1999. ‘A Tale of Two Sociologies: The Critical and the Pragmatic Stance
in Contemporary French Sociology’. European Journal of Social Theory 2 (3): 379–96.
doi: 10.1177/136843199002003011.
Benhabib, Seyla. 2004. The Rights of Others: Aliens, Residents, and Citizens. Cambridge and
New York: Cambridge University Press.
Bennett, Colin John. 1992. Regulating Privacy: Data Protection and Public Policy in Europe
and the United States. Ithaca: Cornell University Press.
Bennett, Colin John. 2008. The Privacy Advocates: Resisting the Spread of Surveillance.
Cambridge, MA: MIT Press.
Bennett, Colin John, and Charles D. Raab. 2006. The Governance of Privacy: Policy
Instruments in Global Perspective. Cambridge, MA: MIT Press.
Bennett, Colin John, and Charles D. Raab. 2020. ‘Revisiting the Governance of Privacy:
Contemporary Policy Instruments in Global Perspective’. Regulation & Governance 14
(3): 447–64. doi: 10.1111/rego.12222.
Bertram, Theo, Elie Bursztein, Stephanie Caro, Hubert Chao, Rutledge Chin Feman, Peter
Fleischer, Albin Gustafsson, Jess Hemerly, Chris Hibbert, and Luca Invernizzi. 2019. ‘Five
Years of the Right to Be Forgotten’. In Proceedings of the Conference on Computer and
Communications Security, 11–15 November, London. 959–72. New York: Association
for Computing Machinery
REFERENCES 233
Besek, June M. 2004. ‘Anti-Circumvention Laws and Copyright: A Report from the Ker-
nochan Center for Law, Media and the Arts’. Columbia Journal of Law & Arts 27 (4):
385–519.
Bessette, Rändi, and Virginia Haufler. 2001. ‘Against All Odds: Why There Is No Inter-
national Information Regime’. International Studies Perspectives 2 (1): 69–92. doi:
10.1111/1528-3577.00038.
Bially Mattern, Janice. 2001. ‘The Power Politics of Identity’. European Journal of Interna-
tional Relations 7 (3): 349–97. doi: 10.1177/1354066101007003003.
Biasiotti, M. A., J. P. M. Bonnici, J. Cannataci, and F. Turchi. 2018. Handling and Exchang-
ing Electronic Evidence across Europe. Law, Governance and Technology Series. Cham:
Springer.
Biermann, Frank, Philipp Pattberg, Harro Van Asselt, and Fariborz Zelli. 2009. ‘The
Fragmentation of Global Governance Architectures: A Framework for Analysis’. Global
Environmental Politics 9 (4): 14–40.
Bignami, Francesca, and Giorgio Resta. 2015. ‘Transatlantic Privacy Regulation: Conflict
and Cooperation’. Law & Contemporary Problems 78 (4): 231–66.
Bigo, Didier. 2011. ‘Pierre Bourdieu and International Relations: Power of Practices,
Practices of Power’. International Political Sociology 5 (3): 225–58. doi: 10.1111/j.1749-
5687.2011.00132.x.
Bigo, Didier. 2014. ‘The (In)Securitization Practices of the Three Universes of EU Border
Control: Military/Navy—Border Guards/Police—Database Analysts’. Security Dialogue
45 (3): 209–25.
Bigo, Didier, Gertjan Boulet, Caspar Bowden, Sergio Carrera, Julien Jeandesboz, and
Amandine Scherrer. 2012. ‘Fighting Cyber Crime and Protecting Privacy in the Cloud’.
Study for the European Parliament’s Committee on Civil Liberties, Justice and Home
Affairs (LIBE), PE 462: 35–49.
Bigo, Didier, Sergio Carrera, Nicholas Hernanz, Julien Jeandesboz, Joanna Parkin,
Francesco Ragazzi, and Amandine Scherrer. 2013. ‘Mass Surveillance of Personal Data
by EU Member States and Its Compatibility with EU Law’. Liberty and Security in Europe
Papers, CEPS, no. 61.
Bigo, Didier, Lina Ewert, and Elif Mendos Kuşkonmaz. 2020. ‘The Interoperability Con-
troversy or How to Fail Successfully: Lessons from Europe’. International Journal of
Migration and Border Studies 6 (1–2): 93–114.
Bilefsky, Dan. 2006a. ‘Bank Consortium Faces Outcry on Data Transfer: Europe: Interna-
tional Herald Tribune’. The New York Times, 28 June 2006. http://www.nytimes.com/
2006/06/28/world/europe/28iht-suit.2071000.html, accessed 4 July 2022.
Bilefsky, Dan. 2006b. ‘Belgians Say Banking Group Broke European Rules in Giving Data
to U.S.’. The New York Times, 29 September 2006. https://www.nytimes.com/2006/09/29/
world/europe/29swift.html, accessed 4 July 2022.
Bjerregaard, Elisabetta, and Tom Kirchmaier. 2019. ‘The Danske Bank Money Laundering
Scandal: A Case Study’. Copenhagen Business School, CBS 2019, Working paper. doi:
10.2139/ssrn.3446636.
Blasi Casagran, Christina. 2016. Global Data Protection in the Field of Law Enforce-
ment: An EU Perspective. Routledge Research in EU Law. London and New York:
Routledge.
Blokker, Paul, and Andrea Brighenti. 2011. ‘Politics between Justification and Defiance’.
European Journal of Social Theory 14 (3): 283–300. doi: 10.1177/1368431011412346.
BMI. 2013. ‘Abteilungsleiterrunde zur Koordinierung der Europapolitik am Donner-
stag, dem 12. Dezember 2013 um 8.30 Uhr im BMWi. TOP 6 Datenschutz’. MAT
A BMI-1-8d_8. https://wikileaks.org/bnd-inquiry/docs/BMI/MAT%20A%20BMI-1-8/
MAT%20A%20BMI-1-8d_8.pdf, accessed 4 July 2022.
BMI. 2014. ‘Aktenvorlage an den 1. Untersuchungsausschuss des Deutschen Bundestages
in der 18. WP’. MAT A BMI 6b. Berlin. https://wikileaks.org/bnd-inquiry/docs/BMI/
MAT%20A%20BMI-6b.pdf, accessed 4 July 2022.
Bodó, Balázs, Helene von Schwichow, and Naomi Appelman. 2020. ‘Money Talks? Report on the One-Day Symposium on the Impact of Corporate Funding on Information Law Research’. Institute for Information Law Research Paper, Amsterdam Law School Research Paper No. 2020–16.
Bodoni, Stephanie. 2019. ‘Google Clash over Global Right to Be Forgotten Returns to
Court’. Bloomberg, 9 January 2019. https://www.bloomberg.com/news/articles/2019-
01-09/google-clash-over-global-right-to-be-forgotten-returns-to-court, accessed 4 July
2022.
Bodoni, Stephanie. 2022. ‘Meta, Google Face Data Doomsday as Key EU Decision Looms’.
Bloomberg, 18 February 2022. https://www.bloomberg.com/news/articles/2022-02-18/
meta-google-face-eu-data-blackout-as-ruling-on-contracts-looms, accessed 4 July 2022.
Boettcher, Miranda. 2020. ‘Cracking the Code: How Discursive Structures Shape Climate
Engineering Research Governance’. Environmental Politics 29 (5): 890–916.
Bois, Julien Raymond Florent. 2021. ‘The Uncertain World of the Court of Justice of the
European Union: A Multidisciplinary Approach of the Legitimacy of the EU Judiciary in
the 21st Century’. PhD Dissertation. Berlin: Hertie School.
Bolkestein, Frits. 2003. ‘EU/US Talks on Transfers of Airline Passengers’ Personal Data’.
Brussels, September 9. europa.eu/rapid/press-release_SPEECH-03-613_en.pdf.
Boltanski, Luc, and Laurent Thévenot. 1999. ‘The Sociology of Critical Capacity’. European Journal of Social Theory 2 (3): 359–77. doi: 10.1177/136843199002003010.
Boltanski, Luc, and Ève Chiapello. 2005. The New Spirit of Capitalism. London and New
York: Verso.
Boltanski, Luc, and Laurent Thévenot. 2006. On Justification: Economies of Worth. Prince-
ton Studies in Cultural Sociology. Princeton, NJ: Princeton University Press.
Boltanski, Luc. 2011. On Critique: A Sociology of Emancipation. Cambridge and Malden,
MA: Polity.
Boltanski, Luc. 2012. Love and Justice as Competences. Cambridge and Malden, MA: Polity.
Bot, Yves. 2015. ‘Opinion of Advocate General Bot Delivered on 23 September 2015 Case
C-362/14 Maximilian Schrems v Data Protection Commissioner’. ECLI:EU:C:2015:627.
Bourdieu, Pierre. 1985. ‘The Social Space and the Genesis of Groups’. Information (Inter-
national Social Science Council) 24 (2): 195–220.
Bourdieu, Pierre. 1989. ‘Social Space and Symbolic Power’. Sociological Theory 7 (1): 14–25.
Bourdieu, Pierre. 1990. The Logic of Practice. Stanford, CA: Stanford University Press.
Bourdieu, Pierre. 1991. Language and Symbolic Power. Cambridge, MA: Harvard Univer-
sity Press.
Bourdieu, Pierre. 1993. The Field of Cultural Production: Essays on Art and Literature. New
York: Columbia University Press.
Bourdieu, Pierre. 1996. The Rules of Art: Genesis and Structure of the Literary Field.
Stanford, CA: Stanford University Press.
Bourdieu, Pierre. 1998. Practical Reason: On the Theory of Action. Stanford, CA: Stanford
University Press.
Bourdieu, Pierre, and Loïc Wacquant. 1992. An Invitation to Reflexive Sociology. Chicago
and London: The University of Chicago Press.
Bradford, Anu. 2020. The Brussels Effect: How the European Union Rules the World. New
York: Oxford University Press.
Brehmer, Jacqueline H. 2018. ‘Data Localization: The Unintended Consequences of Privacy
Litigation’. American University Law Review 67 (3): 927–69.
Brin, Sergey, and Lawrence Page. 1998. ‘The Anatomy of a Large-Scale Hypertext Web
Search Engine’. http://infolab.stanford.edu/%7Ebackrub/google.html, accessed 4 July
2022.
Britz, Johannes J. 2008. ‘Making the Global Information Society Good: A Social Justice
Perspective on the Ethical Dimensions of the Global Information Society’. Journal of the
American Society for Information Science and Technology 59 (7): 1171–83.
Britz, Malena, and Arita Eriksson. 2005. ‘The European Security and Defence Policy: A
Fourth System of European Foreign Policy?’ Politique Européenne 17 (3): 35–62. doi:
10.3917/poeu.017.0035.
Brkan, Maja. 2016. ‘Data Protection and Conflict-of-Laws: A Challenging Relationship’.
European Data Protection Law Review 2: 324–41.
Brkan, Maja. 2017. ‘The Court of Justice of the EU, Privacy and Data Protection: Judge-
Made Law as a Leitmotif in Fundamental Rights Protection’. In Courts, Privacy and
Data Protection in the Digital Environment, edited by Maja Brkan and Evangelia Psy-
chogiopoulou, 10–31. Cheltenham: Edward Elgar Publishing.
Brkan, Maja, and Evangelia Psychogiopoulou. 2017. Courts, Privacy and Data Protection in
the Digital Environment. Cheltenham: Edward Elgar Publishing.
Brouwer, Evelien. 2009. ‘Towards a European PNR System? Questions on the Added Value
and the Protection of Fundamental Rights’. Study Requested by the European Parliament’s
Committee on Civil Liberties, Justice and Home Affairs (LIBE).
Bruguière, Jean-Louis. 2010. ‘Second Report on the Processing of EU-Originating Personal
Data by the United States Treasury Department for Counter Terrorism Purposes’. http://
www.statewatch.org/news/2010/aug/eu-usa-swift-2nd-bruguiere-report.pdf, accessed 4
July 2022.
Brzoska, Michael. 2011. ‘The Role of Effectiveness and Efficiency in the European Union’s
Counterterrorism Policy: The Case of Terrorist Financing’. Economics of Security Work-
ing Paper No. 51.
Buchanan, Allen. 2007. Justice, Legitimacy, and Self-Determination: Moral Foundations for
International Law. Oxford: Oxford University Press.
Buchanan, Allen. 2008. ‘Human Rights and the Legitimacy of the International Order’. Legal
Theory 14 (1): 39–70. doi: 10.1017/S1352325208080038.
Bueger, Christian. 2014. ‘Pathways to Practice: Praxiography and International Politics’.
European Political Science Review 6 (3): 383–406.
Bueger, Christian, and Frank Gadinger. 2014. International Practice Theory: New Perspec-
tives. Basingstoke: Palgrave Macmillan.
Bull, Hedley. 2002. The Anarchical Society. New York: Columbia University Press.
Burchardt, Dana. 2020. ‘Backlash against the Court of Justice of the EU? The Recent
Jurisprudence of the German Constitutional Court on EU Fundamental Rights as a
Standard of Review’. German Law Journal 21 (S1): 1–18.
Burman, Anirudh. 2020. ‘Will India’s Proposed Data Protection Law Protect Privacy and
Promote Growth?’ Working Paper. Carnegie India. https://carnegieendowment.org/
files/Burman_Data_Privacy.pdf
Burri, Mira, and Rahel Schär. 2016. ‘The Reform of the EU Data Protection Framework:
Outlining Key Changes and Assessing Their Fitness for a Data-Driven Economy’. Journal
of Information Policy 6 (1): 479–511.
Busch, Andreas. 2013. ‘The Regulation of Transborder Data Traffic: Disputes across the
Atlantic’. Security & Human Rights 23 (4): 313–30.
Busch, Andreas, and David Levi-Faur. 2011. ‘The Regulation of Privacy’. In Handbook on
the Politics of Regulation, edited by David Levi-Faur, 227–41. Cheltenham: Edward Elgar
Publishing.
Butler, Alan. 2017. ‘Whither Privacy Shield in the Trump Era’. European Data Protection
Law Review 3: 111–13.
Buzan, Barry, Ole Wæver, and Jaap De Wilde. 1998. Security: A New Framework for
Analysis. Boulder, CO, and London: Lynne Rienner Publishers.
Cabranes, José A. 2017. ‘José A. Cabranes, Circuit Judge, joined by Dennis Jacobs, Reena
Raggi and Christopher F. Droney, Circuit Judges, Dissenting from the Order Deny-
ing Rehearing En Banc’. http://online.wsj.com/public/resources/documents/Cabranes_
dissental01242017.pdf, accessed 4 July 2022.
Cadwalladr, Carole, and Duncan Campbell. 2019. ‘Revealed: Facebook’s Global Lobbying
against Data Privacy Laws’. The Observer, 2 March 2019. https://www.theguardian.com/
technology/2019/mar/02/facebook-global-lobbying-campaign-against-data-privacy-
laws-investment, accessed 4 July 2022.
Cadwalladr, Carole, and Emma Graham-Harrison. 2018. ‘Revealed: 50 Million Facebook
Profiles Harvested for Cambridge Analytica in Major Data Breach’. The Guardian, 17
March 2018. https://www.theguardian.com/news/2018/mar/17/cambridge-analytica-
facebook-influence-us-election, accessed 4 July 2022.
Calame, Byron. 2006a. ‘Secrecy, Security, the President and the Press’. The New York Times,
2 July 2006. https://www.nytimes.com/2006/07/02/opinion/02pub-ed.html, accessed 4
July 2022.
Calame, Byron. 2006b. ‘Can “Magazines” of The Times Subsidize News Coverage?’ The
New York Times, 22 October 2006. https://www.nytimes.com/2006/10/22/opinion/
22pubed.html, accessed 4 July 2022.
Cammaerts, Bart, and Robin Mansell. 2020. ‘Digital Platform Policy and Regulation:
Toward a Radical Democratic Turn’. International Journal of Communication 14: 135–
54.
Campbell, David. 1992. Writing Security: United States Foreign Policy and the Politics of
Identity. Minneapolis, MN: University of Minnesota Press.
Cannataci, Joseph A., and Jeanne Pia Mifsud-Bonnici. 2005. ‘Data Protection Comes of
Age: The Data Protection Clauses in the European Constitutional Treaty’. Information
& Communications Technology Law 14 (1): 5–15. doi: 10.1080/1360083042000325274.
Carrera, Sergio, Nicholas Hernanz, and Joanna Parkin. 2013. ‘The “Lisbonisation” of the
European Parliament: Assessing Progress, Shortcomings and Challenges for Democratic
Accountability in the Area of Freedom, Security and Justice’. Centre for European Policy
Studies Liberty and Security in Europe (CEPS), no. 58.
Carrier Directive. Council Directive 2004/82/EC of 29 April 2004 on the Obligation of
Carriers to Communicate Passenger Data. 2004. OJ L 261.
Carrubba, Clifford J., and Matthew J. Gabel. 2015. International Courts and the Perfor-
mance of International Agreements: A General Theory with Evidence from the European
Union. New York: Cambridge University Press.
Cartwright, Madison. 2020. ‘Internationalising State Power through the Internet:
Google, Huawei and Geopolitical Struggle’. Internet Policy Review 9 (3). doi:
10.14763/2020.3.1494.
Castells, Manuel. 2011. The Rise of the Network Society. Vol. I, second edition. Malden, MA:
John Wiley & Sons.
REFERENCES 237
Cellan-Jones, Rory. 2018. ‘Microsoft Sinks Data Centre off Orkney’. BBC News, 6 June 2018.
https://www.bbc.com/news/technology-44368813, accessed 4 July 2022.
Charter of Fundamental Rights of the European Union. 2000. OJ 2000/C 364.
Chenou, Jean-Marie, and Roxana Radu. 2019. ‘The “Right to Be Forgotten”: Negotiating
Public and Private Ordering in the European Union’. Business & Society 58 (1): 74–102.
doi: 10.1177/0007650317717720.
Chertoff, Michael. 2006. ‘A Tool We Need to Stop the Next Airliner Plot’. Washington Post,
29 August 2006. https://www.washingtonpost.com/archive/opinions/2006/08/29/a-
tool-we-need-to-stop-the-next-airliner-plot/bcd240b8-8d61-4664-8f8f-f45d5b3cfaf7/,
accessed 4 July 2022.
Choucri, Nazli. 2012. Cyberpolitics in International Relations. Cambridge, MA, and Lon-
don: MIT Press.
Christakis, Theodore. 2020. ‘E-Evidence in the EU Parliament: Basic Features of Birgit
Sippel’s Draft Report’. European Law Blog, January. https://europeanlawblog.eu/2020/
01/21/e-evidence-in-the-eu-parliament-basic-features-of-birgit-sippels-draft-report/,
accessed 4 July 2022.
Christou, George. 2018. ‘The Challenges of Cybercrime Governance in the European
Union’. European Politics and Society 19 (3): 355–75.
Christou, George, and Imir Rashid. 2021. ‘Interest Group Lobbying in the European Union:
Privacy, Data Protection and the Right to Be Forgotten’. Comparative European Politics
19 (3): 380–400. doi: 10.1057/s41295-021-00238-5.
Churches, Genna, and Monika Zalnieriute. 2020. ‘A Groundhog Day in Brussels’. Ver-
fassungsblog. 16 July 2020. https://verfassungsblog.de/a-groundhog-day-in-bruessels/,
accessed 4 July 2022.
CJEU. 2014. ‘Press Release No 70/14: Judgment in Case C-131/12 Google Spain SL, Google
Inc. v Agencia Española de Protección de Datos, Mario Costeja González’. https://curia.
europa.eu/jcms/upload/docs/application/pdf/2014-05/cp140070en.pdf, accessed 4 July
2022.
CJEU. 2017. Opinion 1/15. The Court Declares That the Agreement Envisaged between the
European Union and Canada on the Transfer of Passenger Name Record Data May Not
Be Concluded in its Current Form. Court of Justice of the EU: General Court.
CJEU. 2020. ‘Press Release No 91/20: The Court of Justice Invalidates Decision 2016/1250
on the Adequacy of the Protection Provided by the EU-US Data Protection Shield’.
Luxembourg.
CJEU. 2022. ‘Press Release No 19/22. Advocate General’s Opinion in Case C-817/19 Ligue
des Droits Humains’. Luxembourg.
Clarke, Richard A., Michael J. Morell, Geoffrey R. Stone, Cass R. Sunstein, and Peter
Swire. 2013. ‘Liberty and Security in a Changing World’. The President’s Review Group
on Intelligence and Communications Technologies. https://obamawhitehouse.archives.
gov/blog/2013/12/18/liberty-and-security-changing-world, accessed 4 July 2022.
Clifford, Damian, and Jef Ausloos. 2018. ‘Data Protection and the Role of Fairness’. Yearbook
of European Law 37: 130–87.
CLOUD Act. H.R.4943 - Clarifying Lawful Overseas Use of Data Act. 2018.
CNIL. 2015a. ‘CNIL Orders Google to Apply Delisting on All Domain Names of the
Search Engine’. June. https://web.archive.org/web/20180201043309/https://www.cnil.
fr/fr/node/15790.
CNIL. 2015b. ‘Right to Delisting: Google Informal Appeal Rejected’. September. https://
web.archive.org/web/20181119132900/https://www.cnil.fr/en/right-delisting-google-
informal-appeal-rejected-0.
Cool, Alison. 2018. ‘Europe’s Data Protection Law Is a Big, Confusing Mess’. The New York
Times, 15 May 2018. https://www.nytimes.com/2018/05/15/opinion/gdpr-europe-data-
protection.html, accessed 4 July 2022.
Couldry, Nick, and Ulises A. Mejias. 2019. ‘Data Colonialism: Rethinking Big Data’s
Relation to the Contemporary Subject’. Television & New Media 20 (4): 336–49.
Council of the EU. 2007. ‘Proposal for a Council Framework Decision on the Use of Pas-
senger Name Records (PNR) for Law Enforcement Purposes’. MEMO/07/449. Brussels.
Council of the EU. 2009. ‘Council Decision 2010/16/CFSP/JHA of 30 November 2009 on
the Signing, on Behalf of the European Union, of the Agreement between the European
Union and the United States of America on the Processing and Transfer of Financial Mes-
saging Data from the European Union to the United States for Purposes of the Terrorist
Finance Tracking Program’. OJ L 8, 13.1.2010, 9–10.
Council of the EU. 2010a. ‘EU Action Plan on Combating Terrorism’. Brussels, 15
November 2010. https://register.consilium.europa.eu/doc/srv?l=EN&f=ST%2015893%
202010%20INIT.
Council of the EU. 2010b. ‘The Stockholm Programme: An Open and Secure Europe Serving
and Protecting Citizens’. 17024/09. Brussels.
Council of the EU. 2011. ‘EU Counter-Terrorism Coordinator (CTC) Report on EU Action
Plan on Combating Terrorism’. 17594/1/11 REV 1. Brussels.
Council of the EU. 2016. ‘Joint Statement of EU Ministers for Justice and Home Affairs and
Representatives of EU Institutions on the Terrorist Attacks in Brussels on 22 March 2016’.
158/16. Brussels.
Council of the EU. 2018. ‘Joint EU–U.S. Statement Following the EU–U.S. Justice and Home
Affairs Ministerial Meeting’. https://ec.europa.eu/commission/presscorner/detail/en/
STATEMENT_18_3906, accessed 4 July 2022.
Council of the EU. 2019. ‘Interoperability between EU Information Systems: Council
Adopts Regulations’. http://www.consilium.europa.eu/en/press/press-releases/2019/
05/14/interoperability-between-eu-information-systems-council-adopts-regulations/,
accessed 4 July 2022.
Council of the EU. 2020. ‘EU-Japan PNR Agreement: Council Authorises Opening of Nego-
tiations’. February. http://www.consilium.europa.eu/en/press/press-releases/2020/02/
18/eu-japan-pnr-agreement-council-authorises-opening-of-negotiations/.
CBP. 2004. ‘Undertakings of the Department of Homeland Security Bureau of Customs
and Border Protection (CBP)’. https://www.statewatch.org/news/2006/jun/eu-usa-pnr-
undertakings-may-2004.pdf, accessed 4 July 2022.
Creemers, Rogier. 2021. ‘China’s Emerging Data Protection Framework’. SSRN Scholarly
Paper. Rochester, NY: Social Science Research Network. doi: 10.2139/ssrn.3964684.
Culpepper, Pepper D., and Kathleen Thelen. 2019. ‘Are We All Amazon Primed? Consumers
and the Politics of Platform Power’. Comparative Political Studies 53 (2): 288–318.
Curtin, Deirdre. 2013. ‘Official Secrets and the Negotiation of International Agreements: Is
the EU Executive Unbound?’. Common Market Law Review 50 (2): 423–57.
Curtin, Deirdre. 2014. ‘Overseeing Secrets in the EU: A Democratic Perspective’. Journal of
Common Market Studies 52 (3): 684–700.
Curtin, Deirdre. 2018. ‘Second Order Secrecy and Europe’s Legality Mosaics’. West Euro-
pean Politics 41 (4): 846–68.
Cutler, A. Claire, Virginia Haufler, and Tony Porter. 1999. Private Authority and Interna-
tional Affairs. SUNY Series in Global Politics. Albany, NY: State University of New York
Press.
de Goede, Marieke. 2018b. ‘The Chain of Security’. Review of International Studies 44 (1):
24–42.
de Goede, Marieke, and Mara Wesseling. 2017. ‘Secrecy and Security in Transatlantic
Terrorism Finance Tracking’. Journal of European Integration 39 (3): 253–69. doi:
10.1080/07036337.2016.1263624.
de Hert, Paul, and Martin de Schutter. 2008. ‘International Transfers of Data in the Field
of JHA: The Lessons of Europol, PNR and Swift’. In Justice, Liberty, Security: New Chal-
lenges for EU External Relations, edited by S. Thiel and B. Martenczuk, 303–40. Brussels:
VUBPRESS.
Deibert, Ronald J., and Masashi Crete-Nishihata. 2012. ‘Global Governance and the Spread
of Cyberspace Controls’. Global Governance 18 (3): 339–61.
Deitelhoff, Nicole. 2009. ‘The Discursive Process of Legalization: Charting Islands
of Persuasion in the ICC Case’. International Organization 63 (1): 33–65. doi:
10.1017/S002081830909002X.
DeNardis, Laura. 2013. ‘The Emerging Field of Internet Governance’. In The Oxford Hand-
book of Internet Studies, edited by William H. Dutton, 555–76. Oxford: Oxford University
Press.
Dencik, Lina, Arne Hintz, and Jonathan Cable. 2016. ‘Towards Data Justice? The Ambiguity
of Anti-Surveillance Resistance in Political Activism’. Big Data & Society 3 (2): 1–12. doi:
10.1177/2053951716679678.
de Wilde, Pieter. 2019. ‘The Making of Four Ideologies of Globalization’. European Political
Science Review 11 (1): 1–18.
de Wilde, Pieter, Ruud Koopmans, and Wolfgang Kleinwächter. 2019. The Struggle over Bor-
ders: Cosmopolitanism and Communitarianism. Cambridge and New York: Cambridge
University Press.
Dhavate, Nitin, and Ramakant Mohapatra. 2022. ‘A Look at Proposed Changes to
India’s (Personal) Data Protection Bill’. IAPP, 5 January 2022. https://iapp.org/news/a/a-look-
at-proposed-changes-to-indias-personal-data-protection-bill/, accessed 4 July 2022.
Dhont, Jan, and María Verónica Pérez Asinari. 2004. ‘Safe Harbour Decision Implementa-
tion Study: With the Assistance of Prof. Reidenberg and Dr. Lee Bygrave, at the Request
of the European Commission’. CRID. https://www.europarl.europa.eu/meetdocs/2009_
2014/documents/libe/dv/07_etude_safe-harbour-2004_/07_etude_safe-harbour-2004_
en.pdf.
DHS. 2005. ‘A Report Concerning Passenger Name Record Information Derived
from Flights between the U.S. and the European Union’. Privacy Office. https://www.
dhs.gov/xlibrary/assets/privacy/privacy_pnr_rpt_09-2005.pdf, accessed 4 July 2022.
Digital Rights Ireland v. Commission. 2014. Digital Rights Ireland v. Commission. Court of
Justice of the EU: General Court. Case T-670/16.
Dijck, José van. 2014. ‘Datafication, Dataism and Dataveillance: Big Data between Scientific
Paradigm and Ideology’. Surveillance & Society 12 (2): 197–208.
Dillon, Michael. 2002. Politics of Security: Towards a Political Philosophy of Continental
Thought. London: Routledge.
Dingwerth, Klaus, and Philipp Pattberg. 2009. ‘World Politics and Organizational Fields:
The Case of Transnational Sustainability Governance’. European Journal of International
Relations 15 (4): 707–43.
Directive (EU) 2018/1673 of the European Parliament and of the Council of 23
October 2018 on Combating Money Laundering by Criminal Law. 2018. OJ
L 284.
Doneda, Danilo, and Laura Schertel Mendes. 2014. ‘Data Protection in Brazil: New
Developments and Current Challenges’. In Reloading Data Protection: Multidisciplinary
Insights and Contemporary Challenges, edited by Serge Gutwirth, Ronald Leenes, and
Paul de Hert, 3–20. Dordrecht: Springer.
Doty, Roxanne Lynn. 1993. ‘Foreign Policy as Social Construction: A Post-Positivist Anal-
ysis of US Counterinsurgency Policy in the Philippines’. International Studies Quarterly
37 (3): 297–320.
Dowd, Rebekah. 2019. ‘The Development of Digital Human Rights in the European Union:
How Key Interests Shape National and Regional Data Governance’. PhD Dissertation.
Atlanta: Georgia State University. doi: 10.57709/14937195.
Drezner, Daniel W. 2007. All Politics Is Global: Explaining International Regulatory Regimes.
Princeton, NJ: Princeton University Press.
Drezner, Daniel W. 2013. ‘The Tragedy of the Global Institutional Commons’. In Back to
Basics: State Power in a Contemporary World, edited by Martha Finnemore and Judith L
Goldstein, 280–312. Oxford and New York: Oxford University Press.
Drozdiak, Natalia. 2019. ‘EU Privacy Laws May Be Hampering Pursuit of Ter-
rorists’. Bloomberg, 8 July 2019. https://www.bloomberg.com/news/articles/2019-07-
08/european-privacy-laws-may-be-hampering-those-catching-terrorists, accessed 4 July
2022.
Dunoff, Jeffrey L., and Joel P. Trachtman. 2009. Ruling the World? Constitutionalism, Inter-
national Law, and Global Governance. Cambridge and New York: Cambridge University
Press.
Dür, Andreas, David Marshall, and Patrick Bernhagen. 2019. The Political Influence of Busi-
ness in the European Union. New Comparative Politics. Ann Arbor, MI: University of
Michigan Press.
EC. 2002. ‘Commission Staff Working Paper: The Application of Commission Decision
520/2000/EC of 26 July 2000 Pursuant to Directive 95/46 of the European Parliament
and of the Council on the Adequate Protection of Personal Data Provided by the Safe
Harbour Privacy Principles and Related Frequently Asked Questions Issued by the US
Department of Commerce’. SEC(2002) 196. Brussels.
EC. 2003. ‘Transfer of Air Passenger Name Record (PNR) Data: A Global EU Approach’.
COM(2003)826 final. Communication from the Commission to the Council and the
Parliament. Brussels.
EC. 2010a. ‘A Comprehensive Approach on Personal Data Protection in the European
Union’. https://eur-lex.europa.eu/legal-content/en/txt/pdf/?uri=celex:52010dc0609&
from=en, accessed 4 July 2022.
EC. 2010b. ‘On the Global Approach to Transfers of Passenger Name Record (PNR) Data
to Third Countries’. COM/2010/0492 final. Communication from the Commission.
Brussels.
EC. 2011. ‘Commission Report on the Joint Review of the Implementation of the Agreement
between the European Union and the United States of America on the Processing and
Transfer of Financial Messaging Data from the European Union to the United States for
the Purposes of the Terrorist Finance Tracking Program 17–18 February 2011’. Brussels.
EC. 2012a. ‘Observations Écrites’ [Written Observations]. sj.f(2012)819079. Brussels.
EC. 2012b. ‘Commission Staff Working Document. Report on the Second Joint Review
of the Implementation of the Agreement between the European Union and the United
States of America on the Processing and Transfer of Financial Messaging Data from the
European Union to the United States for the Purposes of the Terrorist Finance Tracking
Program October 2012’. SWD(2012) 454 final. Brussels.
EC. 2013a. ‘Report from the Commission to the European Parliament and the Council on
the Joint Review of the Implementation of the Agreement between the European Union
and the United States of America on the Processing and Transfer of Passenger Name
Records to the United States Department of Homeland Security’. COM/2013/0844 final.
Brussels.
EC. 2013b. ‘Communication from the Commission to the European Parliament and the
Council: A European Terrorist Finance Tracking System (EU TFTS)’. COM(2013) 842
final. Brussels.
EC. 2013c. ‘Annex: Joint Report from the Commission and the U.S. Treasury Depart-
ment Regarding the Value of TFTP Provided Data Pursuant to Article 6 (6) of
the Agreement between the European Union and the United States of America on
the Processing and Transfer of Financial Messaging Data from the European Union
to the United States for the Purposes of the Terrorist Finance Tracking Program’.
Brussels.
EC. 2013d. ‘Communication from the Commission to the European Parliament
and the Council on the Functioning of the Safe Harbour from the Perspective
of EU Citizens and Companies Established in the EU’. COM(2013) 847 final.
Brussels.
EC. 2013e. ‘Rebuilding Trust in EU–US Data Flows’. COM(2013) 846 final. Brussels.
EC. 2014. ‘Written Observations Submitted by the European Commission in Case
C-362/14’. sj.f(2014)4003332. Brussels.
EC. 2015a. ‘First Vice-President Timmermans and Commissioner Jourová’s Press Confer-
ence on Safe Harbour Following the Court Ruling in Case C-362/14 (Schrems)’. Stras-
bourg. http://europa.eu/rapid/press-release_STATEMENT-15-5782_en.htm, accessed
4 July 2022.
EC. 2015b. ‘Minutes of the 62nd Meeting’. Committee on the Protection of Individuals with
Regards to the Processing of Personal Data (Article 31 Committee). Brussels.
EC. 2016a. ‘Questionnaire on Improving Criminal Justice in Cyberspace Summary
of Responses.’ https://home-affairs.ec.europa.eu/system/files/2020-09/summary_of_
replies_to_e-evidence_questionnaire_en.pdf.
EC. 2016b. ‘Commission Presents Action Plan to Strengthen the Fight against Terrorist
Financing’. 2 February 2016. https://ec.europa.eu/commission/presscorner/detail/en/
IP_16_202, accessed 4 July 2022.
EC. 2016c. ‘Transatlantic Data Flows: Restoring Trust through Strong Safeguards’.
COM(2016) 117 final. Brussels.
EC. 2016d. ‘Minutes of the 65th Meeting’. Committee on the Protection of Individuals with
Regards to the Processing of Personal Data (Article 31 Committee). Brussels.
EC. 2016e. ‘Minutes of the 68th Meeting’. Committee on the Protection of Individuals with
Regards to the Processing of Personal Data (Article 31 Committee). Brussels.
EC. 2016f. ‘European Commission Launches EU–U.S. Privacy Shield: Stronger Protec-
tion for Transatlantic Data Flows’. https://europa.eu/rapid/press-release_IP-16-2461_
en.htm, accessed 4 July 2022.
EC. 2017a. ‘Terrorist Finance Tracking Programme’. https://ec.europa.eu/home-affairs/
what-we-do/policies/crisis-and-terrorism/tftp_en, accessed 4 July 2022.
EC. 2017b. ‘Commission Staff Working Document: Joint Review of the Implementation
of the Agreement between the European Union and the United States of America on
the Processing and Transfer of Passenger Name Records (PNR) to the United States
Department of Homeland Security’. SWD/2017/014 final.
EC. 2018a. ‘Brief of the European Commission on Behalf of the European Union as Amicus
Curiae in Support of Neither Party, United States of America v. Microsoft Corporation’.
US Supreme Court.
EC. 2018b. ‘Proposal for a Regulation of the European Parliament and of the Council
on European Production and Preservation Orders for Electronic Evidence in Criminal
Matters’. COM(2018) 225 final.
EC. 2018c. ‘Proposal for a Directive of the European Parliament and of the Council Laying
Down Harmonised Rules on the Appointment of Legal Representatives for the Purpose
of Gathering Evidence in Criminal Proceedings’. COM/2018/226 final.
EC. 2018d. ‘Report from the Commission to the European Parliament and the Coun-
cil on the Second Annual Review of the Functioning of the EU–U.S. Privacy Shield
{SWD(2018) 497 Final}’. COM(2018) 860 final.
EC. 2019a. ‘Questions and Answers: Mandate for the EU–U.S. Cooperation on Elec-
tronic Evidence’. 5 February 2019. https://ec.europa.eu/commission/presscorner/detail/
en/memo_19_863.
EC. 2019b. ‘Commission Staff Working Document. Joint Review Report of the Imple-
mentation of the Agreement between the European Union and the United States of
America on the Processing and Transfer of Financial Messaging Data from the Euro-
pean Union to the United States for the Purposes of the Terrorist Finance Track-
ing Program’. SWD(2019) 301 final. https://eur-lex.europa.eu/legal-content/EN/TXT/
?uri=SWD:2019:301:FIN, accessed 4 July 2022.
EC. 2019c. ‘Commission Staff Working Document Accompanying the Document “Report
from the Commission to the European Parliament and the Council on the Third Annual
Review of the Functioning of the EU–U.S. Privacy Shield {COM(2019) 495 Final}”’.
Brussels.
EC. 2020a. ‘A European Strategy for Data’. https://ec.europa.eu/digital-single-market/en/
policies/building-european-data-economy, accessed 4 July 2022.
EC. 2020b. ‘E-Evidence: Cross-Border Access to Electronic Evidence’. European Commis-
sion. 2020. https://ec.europa.eu/info/policies/justice-and-fundamental-rights/criminal-
justice/e-evidence-cross-border-access-electronic-evidence_en, accessed 4 July 2022.
EC. 2020c. ‘Parliamentary Questions: Answer Given by Mr Reynders on Behalf of the Euro-
pean Commission. Question Reference: E-003136/2019’. Brussels. http://www.europarl.
europa.eu/doceo/document/E-9-2019-003136-ASW_EN.html, accessed 4 July 2022.
EC. 2020d. ‘Commission Steps Up Fight against Money Laundering’. 7 May 2020. https://
ec.europa.eu/commission/presscorner/detail/en/ip_20_800, accessed 4 July 2022.
EC. 2020e. ‘Europe Fit for the Digital Age: Commission Proposes New Rules for Digital
Platforms’. 15 December 2020. https://ec.europa.eu/commission/presscorner/detail/en/
ip_20_2347, accessed 4 July 2022.
EC. 2022. ‘Adequacy Decisions’. European Commission. 2022. https://ec.europa.eu/info/
law/law-topic/data-protection/international-dimension-data-protection/adequacy-
decisions_en, accessed 4 July 2022.
EC and US Customs. 2003. ‘Joint Statement: European Commission/US Customs Talks
on PNR Transmission’. Brussels, 17–18 February 2003. https://www.statewatch.org/news/
2003/february/european-commission-us-customs-talks-on-pnr-transmission-brussels-
17-18-february-joint-statement/.
The Economist. 2017. ‘The World’s Most Valuable Resource Is No Longer Oil, but Data’. The
Economist, 6 May 2017. https://www.economist.com/leaders/2017/05/06/the-worlds-
most-valuable-resource-is-no-longer-oil-but-data, accessed 4 July 2022.
ECOWAS. 2010. ‘Supplementary Act A/SA.1/01/10 on Personal Data Protection within
ECOWAS, 37th Session of the Authority of Heads of State and Government’. Abuja.
EDPS. 2007. ‘New PNR Agreement with the United States of America: Letter from Peter
Hustinx, EDPS, to Dr. Wolfgang Schäuble, Minister for the Interior, Berlin’. PH/SM/ab
EDRi, EFF, et al. 2019. ‘Letter to Alexander Seger, Head of the Cybercrime Unit
of the Council of Europe’. https://edri.org/wp-content/uploads/2019/11/20191108_
CivilSocietyLetter_TCYSecondProtocol.pdf, accessed 4 July 2022.
Eichensehr, Kristen. 2019. ‘Digital Switzerlands’. University of Pennsylvania Law Review
167: 665–732.
Ellingsen, Nora. 2016. ‘The Microsoft Ireland Case: A Brief Summary’. Lawfare, July.
https://www.lawfareblog.com/microsoft-ireland-case-brief-summary, accessed 4 July
2022.
Ellis-Petersen, Hannah. 2021. ‘WhatsApp Sues Indian Government over “Mass Surveil-
lance” Internet Laws’. The Guardian, 26 May 2021. https://www.theguardian.com/
world/2021/may/26/whatsapp-sues-indian-government-over-mass-surveillance-
internet-laws, accessed 4 July 2022.
Emirbayer, Mustafa, and Victoria Johnson. 2008. ‘Bourdieu and Organizational Analysis’.
Theory and Society 37 (1): 1–44.
Endt, Christian. 2019. ‘Überwachung von Flugpassagieren liefert Fehler über Fehler’
[Surveillance of Air Passengers Yields Error after Error]. Süddeutsche Zeitung, 24 April
2019. https://www.sueddeutsche.de/digital/
fluggastdaten-bka-falschtreffer-1.4419760, accessed 4 July 2022.
EP. 2003. ‘European Parliament Resolution on Transfer of Personal Data by Airlines in the
Case of Transatlantic Flights’. P5_TA(2003)0097. Strasbourg.
EP. 2004. ‘Resolution on the Draft Commission Decision Noting the Adequate Level of Pro-
tection Provided for Personal Data Contained in the Passenger Name Records (PNRs)
Transferred to the US Bureau of Customs and Border Protection’. P5_TA(2004)0245.
Brussels.
EP. 2006. ‘Interception of Bank Transfer Data from the SWIFT System by the US Secret
Services’. P6_TA(2006)0317. Strasbourg.
EP. 2007a. ‘US Homeland Security Secretary Michael Chertoff and MEPs Debate
Data Protection’. https://www.europarl.europa.eu/sides/getDoc.do?type=IM-PRESS&
reference=20070514IPR06625&language=EN, accessed 4 July 2022.
EP. 2007b. ‘European Parliament Resolution on SWIFT, the PNR Agreement and the
Transatlantic Dialogue on These Issues’. P6_TA(2007)0039. Strasbourg.
EP. 2009. ‘European Parliament Resolution of 17 September 2009 on the Envisaged Interna-
tional Agreement to Make Available to the United States Treasury Department Financial
Payment Messaging Data to Prevent and Combat Terrorism and Terrorist Financing’.
P7_TA(2009)0016. Strasbourg.
EP. 2010a. ‘SWIFT: European Parliament Votes Down Agreement with the US’.
11 February 2010. https://www.europarl.europa.eu/sides/getDoc.do?type=IM-
PRESS&reference=20100209IPR68674&language=GA, accessed 4 July 2022.
EP. 2010b. ‘Brussels Plenary Session: 5–6 May, 2010’. 6 May 2010. https://www.
europarl.europa.eu/sides/getDoc.do?language=en&type=IM-
PRESS&reference=20100430FCS73854#title9, accessed 4 July 2022.
EP. 2013. ‘European Parliament Resolution of 23 October 2013 on the Suspension
of the TFTP Agreement as a Result of US National Security Agency Surveillance’.
P7_TA(2013)0449. Strasbourg.
EP. 2014a. ‘Resolution on the US NSA Surveillance Programme, Surveillance Bodies
in Various Member States and Their Impact on EU Citizens’ Fundamental Rights
and on Transatlantic Cooperation in Justice and Home Affairs (2013/2188(INI))’.
P7_TA(2014)0230. Strasbourg.
EP. 2014b. ‘Written Observations Submitted by the European Parliament in Case C-362/14’.
SJ-0686/14. Brussels: Court of Justice of the European Union.
EP. 2015. ‘Resolution of 29 October 2015 on the Follow-Up to the European Parlia-
ment Resolution of 12 March 2014 on the Electronic Mass Surveillance of EU Citizens
(2015/2635(RSP))’.
EP. 2016a. ‘Position of the European Parliament Adopted at First Reading on 14 April 2016
with a View to the Adoption of Directive (EU) 2016/ … of the European Parliament and
of the Council on the Use of Passenger Name Record (PNR) Data for the Prevention,
Detection, Investigation and Prosecution of Terrorist Offences and Serious Crime (EP-
PE_TC1-COD(2011)0023)’. Brussels.
EP. 2016b. ‘European Parliament Resolution of 26 May 2016 on Transatlantic Data Flows
(2016/2727(RSP))’. P8_TA(2016)0233. Brussels.
EP. 2017a. ‘European Parliament Resolution of 6 April 2017 on the Adequacy of the Pro-
tection Afforded by the EU–US Privacy Shield (2016/3018(RSP))’. P8_TA(2017)0131.
Strasbourg.
EP. 2017b. ‘European Parliament Resolution on the Fight against Cybercrime’.
(2017/2068(INI)). Strasbourg.
EP. 2018. ‘European Parliament Resolution of 5 July 2018 on the Adequacy of the Pro-
tection Afforded by the EU–US Privacy Shield (2018/2645(RSP))’. P8_TA(2018)0315.
Strasbourg.
EPIC. 2018. ‘Brief of Amici Curiae Electronic Privacy Information Center (EPIC) and
Thirty-Seven Technological Experts and Legal Scholars in Support of Respondent,
United States of America v. Microsoft Corporation’. https://www.epic.org/amicus/ecpa/
microsoft/US-v-Microsoft-amicus-EPIC.pdf, accessed 4 July 2022.
Epstein, Dmitry, Christian Katzenbach, and Francesca Musiani. 2016. ‘Doing Internet
Governance: Practices, Controversies, Infrastructures, and Institutions’. Internet Policy
Review 5 (3): 1–14.
Epstein, Dmitry, Merrill C. Roth, and Eric P.S. Baumer. 2014. ‘It’s the Definition, Stupid!
Framing of Online Privacy in the Internet Governance Forum Debates’. Journal of
Information Policy 4: 144–72. doi: 10.5325/jinfopoli.4.2014.0144.
Erie, Matthew S., and Thomas Streinz. 2022. ‘The Beijing Effect: China’s “Digital Silk Road”
as Transnational Data Governance’. SSRN Scholarly Paper ID 3810256. Rochester, NY:
Social Science Research Network. https://papers.ssrn.com/abstract=3810256, accessed
4 July 2022.
Etzioni, Amitai. 1999. ‘A Communitarian Perspective on Privacy’. Connecticut Law Review
32: 897–905.
Etzioni, Amitai. 2014. ‘Communitarianism Revisited’. Journal of Political Ideologies 19 (3):
241–60.
Etzioni, Amitai. 2018. Law and Society in a Populist Age: Balancing Individual Rights and
the Common Good. Bristol: Bristol University Press.
EU and US. 2004. ‘Agreement between the European Community and the United States
of America on the Processing and Transfer of PNR Data by Air Carriers to the United
States Department of Homeland Security, Bureau of Customs and Border Protection’.
22004A0520(01). OJ L 183.
EU and US. 2007. ‘Agreement between the European Union and the United States of Amer-
ica on the Processing and Transfer of Passenger Name Record (PNR) Data by Air Carriers
to the United States Department of Homeland Security (DHS)’. OJ L 204.
EU and US. 2010. ‘Agreement between the European Union and the United States of Amer-
ica on the Processing and Transfer of Financial Messaging Data from the European
Union to the United States for the Purposes of the Terrorist Finance Tracking Program’.
OJ L195/5.
EU and US. 2011. ‘Agreement between the United States of America and the Euro-
pean Union on the Use and Transfer of Passenger Name Records to the United States
Department of Homeland Security’. OJ L 215, 11.8.2012.
EU and US. 2012. ‘EU–US Agreement on the Use and Transfer of PNR to the US Depart-
ment of Homeland Security’. OJ C 258E. http://eur-lex.europa.eu/legal-content/EN/
TXT/?uri=celex:52012AP0134, accessed 4 July 2022.
EU and US. 2016. ‘Agreement between the United States of America and the European
Union on the Protection of Personal Information Relating to the Prevention, Investiga-
tion, Detection, and Prosecution of Criminal Offences’. OJ L336.
EU and US. 2019. ‘Joint EU–U.S. Statement Following the EU–U.S. Justice and Home Affairs
Ministerial Meeting’. 11 December 2019.
EU FRA. 2018. ‘Surveillance by Intelligence Services: Fundamental Rights Safeguards
and Remedies in the European Union: Volume II: Summary’. https://fra.europa.eu/
en/publication/2018/surveillance-intelligence-services-fundamental-rights-safeguards-
and-remedies#TabPubOverview, accessed 4 July 2022.
European Centre for International Political Economy and Kearney. 2021. ‘The Eco-
nomic Costs of Restricting the Cross-Border Flow of Data’. https://www.kearney.
com/documents/3677458/161343923/The+economic+costs+of+restricting+the+cross-
border+flow+of+data.pdf/82370205-fa6b-b135-3f2b-b406c4d6159e?t=1625067571000,
accessed 4 July 2022.
European Investigation Order. 2014. Directive 2014/41/EU of the European Parliament and
of the Council of 3 April 2014 Regarding the European Investigation Order in Criminal
Matters. OJ L 130 1–36.
European Ombudsman. 2014. ‘Decision of the European Ombudsman Closing the Inquiry
into Complaint 1148/2013/TN against the European Police Office (Europol)’. https://
www.ombudsman.europa.eu/en/decision/en/54678#_ftnref2, accessed 4 July 2022.
European Ombudsman. 2018. ‘Decision in Case 811/2017/EA on the Transparency of
“Advisory Bodies” That Influence the Development of EU Policy’. 19 September 2018.
https://europa.eu/!UQ46By, accessed 4 July 2022.
Europol. 2019. ‘SIRIUS EU Digital Evidence Situation Report 2019’. 1073582. The Hague.
https://www.europol.europa.eu/sites/default/files/documents/sirius_eu_digital_
evidence_report.pdf, accessed 4 July 2022.
Europol. 2020. ‘Europol Information System (EIS)’. https://www.europol.europa.eu/
activities-services/services-support/information-exchange/europol-information-
system, accessed 4 July 2022.
EU–US Working Group. 2013. ‘Report on the Findings by the EU Co-Chairs of the Ad Hoc
EU–US Working Group on Data Protection’. http://www.consilium.europa.eu/uedocs/
cms_data/docs/pressdata/en/jha/139745.pdf, accessed 4 July 2022.
Fabbrini, Federico, Edoardo Celeste, and John Quinn. 2021. Data Protection beyond
Borders. Oxford and New York: Hart Publishing.
Facebook. 2019. ‘Public Consultation Response: 2nd Additional Protocol to the Budapest
Convention on Cybercrime: 4th Round of Consultation’. Strasbourg: Council of Europe.
https://rm.coe.int/facebook-comments-2nd-additional-protocol/168098c93f, accessed
4 July 2022.
Facebook. 2020. ‘Meta Investor Relations’. https://investor.fb.com/resources/default.aspx,
accessed 4 July 2022.
Facebook Ireland Ltd. 2021. Facebook Ireland Ltd, Facebook Inc., Facebook Belgium BVBA,
v Gegevensbeschermingsautoriteit. Court of Justice of the EU: General Court. Case
C-645/19.
Fahey, Elaine. 2014. ‘EU Foreign Relations Law: Litigating to Incite Openness in EU
Negotiations’. European Journal of Risk Regulation 5 (4): 553–6.
Fahey, Elaine. 2020. Framing Convergence with the Global Legal Order: The EU and the
World. Oxford and London: Bloomsbury Publishing.
Fahey, Elaine, and Fabien Terpan. 2021. ‘Torn between Institutionalisation and Judicialisa-
tion: The Demise of the EU–US Privacy Shield’. Indiana Journal of Global Legal Studies
28 (2): 205–44.
Falque-Pierrotin, Isabelle. 2017. ‘Pour un droit au déréférencement mondial’ [For a Right
to Global Delisting]. https://www.cnil.fr/fr/pour-un-droit-au-dereferencement-mondial,
accessed 4 July 2022.
Farrell, Henry. 2003. ‘Constructing the International Foundations of E-Commerce: The
EU–U.S. Safe Harbor Arrangement’. International Organization 57 (2): 277–306.
Farrell, Henry. 2005. ‘New Issue-Areas in the Transatlantic Relationship: E-Commerce
and the Safe Harbor Arrangement’. In Creating a Transatlantic Marketplace: Govern-
ment Policies and Business Strategies, edited by Michelle Egan, 112–33. European Policy
Research Unit. Manchester: Manchester University Press.
Farrell, Henry. 2006. ‘Regulating Information Flows. States, Private Actors, and
E-Commerce’. Annual Review of Political Science 9 (1): 353–74.
Farrell, Henry. 2015. ‘Obama Says That Europeans Are Using Privacy Rules to Protect
Their Firms against U.S. Competition. Is He Right?’ Washington Post, 2015. https://www.
washingtonpost.com/news/monkey-cage/wp/2015/02/17/obama-says-that-europeans-
are-using-privacy-rules-to-protect-their-firms-against-u-s-competition-is-he-right/.
Farrell, Henry, and Abraham L. Newman. 2016. ‘The New Interdependence Approach:
Theoretical Development and Empirical Demonstration’. Review of International Politi-
cal Economy 23 (5): 713–36.
Farrell, Henry, and Abraham L. Newman. 2018. ‘Linkage Politics and Complex
Governance in Transatlantic Surveillance’. World Politics 70 (4): 515–54. doi:
10.1017/S0043887118000114.
Farrell, Henry, and Abraham L. Newman. 2019. Of Privacy and Power: The Transatlantic
Struggle over Freedom and Security. Princeton, NJ: Princeton University Press.
Farrell, Henry, and Abraham L. Newman. 2021. ‘The Janus Face of the Liberal International
Information Order: When Global Institutions Are Self-Undermining’. International
Organization 75 (2): 333–58.
FATF. 2020. ‘Home - Financial Action Task Force (FATF)’. 2020. http://www.fatf-gafi.org/
home/, accessed 4 July 2022.
Faude, Benjamin, and Felix Große-Kreul. 2020. ‘Let’s Justify! How Regime Complexes
Enhance the Normative Legitimacy of Global Governance’. International Studies Quar-
terly 64 (2): 431–9.
Federal Government (Germany). 2019. ‘Antwort der Bundesregierung auf die Kleine
Anfrage der Abgeordneten Roman Müller-Böhm, Michael Theurer, Dr. Marcel Klinge,
Weiterer Abgeordneter und der Fraktion der FDP: Drucksache 19/12402: Fluggast-
datenspeicherung und Übermittlung an das Bundeskriminalamt’. 19/12858. http://dipbt.
bundestag.de/dip21/btd/19/128/1912858.pdf, accessed 4 July 2022.
Federal Ministry of Justice and Consumer Protection (Germany). 2019. ‘E-Evidence
(Grenzüberschreitende Gewinnung elektronischer Beweismittel in Strafverfahren)’.
https://cdn.netzpolitik.org/wp-upload/2019/07/Hintergrundpapier-e-Evidence-cl.pdf.
pdf, accessed 4 July 2022.
Feldstein, Steven. 2021. The Rise of Digital Repression: How Technology Is Reshaping Power,
Politics, and Resistance. Oxford and New York: Oxford University Press.
Fischer, Camille. 2018. ‘The CLOUD Act: A Dangerous Expansion of Police Snooping
on Cross-Border Data’. Electronic Frontier Foundation, February. https://www.eff.
org/deeplinks/2018/02/cloud-act-dangerous-expansion-police-snooping-cross-border-
data, accessed 4 July 2022.
Fischer-Lescano, Andreas. 2016. ‘Struggles for a Global Internet Constitution: Protecting
Global Communication Structures against Surveillance Measures’. Global Constitution-
alism 5 (2): 145–72. doi: 10.1017/S204538171600006X.
Flaherty, David H. 1989. Protecting Privacy in Surveillance Societies: The Federal Republic of
Germany, Sweden, France, Canada, and the United States. Chapel Hill, NC, and London:
University of North Carolina Press.
Fleischer, Peter. 2012. ‘Peter Fleischer: Privacy … ?: The Right to Be Forgotten, or How to
Edit Your History’. Peter Fleischer, January. http://peterfleischer.blogspot.com/2012/01/
right-to-be-forgotten-or-how-to-edit.html, accessed 4 July 2022.
Fleischer, Peter. 2016. ‘Adapting Our Approach to the European Right to Be Forgotten’.
Google, March. https://www.blog.google/around-the-globe/google-europe/adapting-
our-approach-to-european-rig/, accessed 4 July 2022.
Fligstein, Neil. 2001. ‘Social Skill and the Theory of Fields’. Sociological Theory 19 (2): 105–
25. doi: 10.1111/0735-2751.00132.
Fligstein, Neil, and Doug McAdam. 2011. ‘Toward a General Theory of Strategic Action
Fields’. Sociological Theory 29 (1): 1–26.
Fligstein, Neil, and Doug McAdam. 2012. A Theory of Fields. Oxford and New York: Oxford
University Press.
Flonk, Daniëlle. 2021. ‘Emerging Illiberal Norms: Russia and China as Promoters of
Internet Content Control’. International Affairs 97 (6): 1925–44. doi: 10.1093/ia/iiab146.
Flonk, Daniëlle, Markus Jachtenfuchs, and Anke S. Obendiek. 2020. ‘Authority Conflicts in
Internet Governance: Liberals vs. Sovereigntists?’ Global Constitutionalism 9 (2): 364–
86.
Floridi, Luciano. 2005. ‘The Ontological Interpretation of Informational Privacy’. Ethics and
Information Technology 4: 185–200.
Floridi, Luciano, ed. 2016. The Routledge Handbook of Philosophy of Information. Routledge
Handbooks in Philosophy. London: Routledge.
Forst, Rainer. 2001. ‘Towards a Critical Theory of Transnational Justice’. Metaphilosophy 32
(1–2): 160–79. doi: 10.1111/1467-9973.00180.
Forst, Rainer. 2015. Normativität und Macht: Zur Analyse sozialer Rechtfertigungsordnun-
gen. Berlin: Suhrkamp.
Fourcade, Marion, and Kieran Healy. 2017. ‘Seeing like a Market’. Socio-Economic Review
15 (1): 9–29.
Fowler, Bridget. 2014. ‘Figures of Descent from Classical Sociological Theory: Luc Boltan-
ski’. In The Spirit of Luc Boltanski: Essays on the ‘Pragmatic Sociology of Critique’, edited
by Bryan S. Turner and Simon Susen, 67–88. London: Anthem Press.
Frantziou, Eleni. 2014. ‘Further Developments in the Right to Be Forgotten: The European
Court of Justice’s Judgment in Case C-131/12, Google Spain SL, Google Inc. v Agencia
Española de Protección de Datos’. Human Rights Law Review 14 (4): 761–77. doi:
10.1093/hrlr/ngu033.
Freedom House. 2016. ‘Brazil: Judicial Over-Reach in Data Privacy Case | Press Release’. 1
March 2016. https://freedomhouse.org/article/brazil-judicial-over-reach-data-privacy-
case.
Friedman, Milton. 2009. Capitalism and Freedom. Chicago: University of Chicago Press.
Friedman, Thomas L. 2000. The Lexus and the Olive Tree: Understanding Globalization.
New York: Farrar, Straus, and Giroux.
Friedrichs, Jörg, and Friedrich Kratochwil. 2009. ‘On Acting and Knowing: How Prag-
matism Can Advance International Relations Research and Methodology’. International
Organization 63 (4): 701–31.
Frosio, Giancarlo F. 2013. ‘A Brazilian Judge Orders Facebook off Air If It Fails to Remove
a Defamatory Discussion’. Stanford Law School: Center for Internet and Society.
Frosio, Giancarlo F. 2016. ‘The Right to Be Forgotten: Much Ado about Nothing’. Colorado
Technology Law Journal 15 (2): 307–36.
Frosio, Giancarlo F. 2018. ‘Why Keep a Dog and Bark Yourself? From Intermediary Liability
to Responsibility’. Oxford International Journal of Law and Information Technology 26
(1): 1–33.
FTC. 2013. ‘Privacy Enforcement and Safe Harbor: Comments of FTC Staff to European
Commission. Review of the U.S.–EU Safe Harbor Framework’. Washington DC.
Fuster, Gloria González, Paul de Hert, and Serge Gutwirth. 2008. ‘SWIFT and the Vul-
nerability of Transatlantic Data Transfers’. International Review of Law, Computers &
Technology 22 (1–2): 191–202. doi: 10.1080/13600860801925185.
G7. 1989. ‘Economic Declaration Paris, July 16, 1989’. http://www.g8.utoronto.ca/summit/
1989paris/communique/index.html, accessed 4 July 2022.
G7. 2016. ‘Joint Declaration by G7 ICT Ministers’. Takamatsu, Kagawa: G7 ICT Ministers’
Meeting. http://www.g8.utoronto.ca/ict/2016-ict-declaration.html, accessed 4 July 2022.
G20. 2016. ‘G20 Digital Economy Development and Cooperation Initiative’. http://www.
g20chn.org/English/Documents/Current/201609/t20160908_3411.html, accessed 4
July 2022.
Garstka, Krzysztof, and David Erdos. 2017. ‘Hiding in Plain Sight: The Right to Be Forgot-
ten and Search Engines in the Context of International Data Protection Frameworks’. In
Platform Regulations: How Platforms Are Regulated and How They Regulate Us, edited
by Luca Belli and Nicolo Zingales, 127–47. Rio de Janeiro: FGV Direito Rio.
GC and Others v. CNIL and Others. 2019. GC, AF, BH, and ED v Commission Nationale
de l’Informatique et des Libertés (CNIL), Premier ministre, and Google LLC. Court of
Justice of the EU: General Court. Case C-136/17.
Genschel, Philipp, and Markus Jachtenfuchs. 2018. ‘From Market Integration to Core State
Powers: The Eurozone Crisis, the Refugee Crisis and Integration Theory’. Journal of
Common Market Studies 56 (1): 178–96. doi: 10.1111/jcms.12654.
General Data Protection Regulation (GDPR). 2016. Regulation (EU) 2016/679 of the Euro-
pean Parliament and of the Council of 27 April 2016 on the Protection of Natural Persons
with Regard to the Processing of Personal Data and on the Free Movement of Such Data,
and Repealing Directive 95/46/EC.
Gerring, John. 2004. ‘What Is a Case Study and What Is It Good for?’ The American Political
Science Review 98 (2): 341–54.
Gesellschaft für Freiheitsrechte. 2020. ‘CJEU to Decide on Processing of Passenger Data
under PNR Directive’. EDRi, January. https://edri.org/cjeu-to-decide-on-processing-of-
passenger-data-under-pnr-directive/, accessed 4 July 2022.
Gibbs, Samuel. 2014. ‘Larry Page: “Right to Be Forgotten” Could Empower Government
Repression’. The Guardian, 30 May 2014. http://www.theguardian.com/technology/
2014/may/30/larry-page-right-to-be-forgotten-government-repression, accessed 4 July
2022.
Gibbs, Samuel. 2015a. ‘Leave Facebook If You Don’t Want to Be Spied on, Warns
EU’. The Guardian, 26 March 2015. https://www.theguardian.com/technology/2015/
Halliday, Josh. 2014. ‘Google Search Results May Indicate “Right to Be Forgotten” Cen-
sorship’. The Guardian, 8 June 2014. http://www.theguardian.com/technology/2014/
jun/08/google-search-results-indicate-right-to-be-forgotten-censorship, accessed 4 July
2022.
Halliday, Terence C., Lucien Karpik, and Malcolm Feeley. 2007. Fighting for Political Free-
dom: Comparative Studies of the Legal Complex and Political Liberalism. Oxford and
Portland, OR: Hart Publishing.
Hamm, Marylou, Frédéric Mérand, and Anke S. Obendiek. 2021. ‘Field Theory: From
Social Interaction to the Analysis of Power’. Unpublished Manuscript.
Hanrieder, Tine. 2016. ‘Orders of Worth and the Moral Conceptions of Health in Global
Politics’. International Theory 8 (3): 390–421.
Hanrieder, Tine. 2019. ‘How Do Professions Globalize? Lessons from the Global South in
US Medical Education’. International Political Sociology 13 (3): 296–314.
Hansen, Lene. 2000a. ‘Gender, Nation, Rape: Bosnia and the Construction of Security’.
International Feminist Journal of Politics 3 (1): 55–75.
Hansen, Lene. 2000b. ‘The Little Mermaid’s Silent Security Dilemma and the Absence of
Gender in the Copenhagen School’. Millennium 29 (2): 285–306.
Hansen, Lene, and Helen Nissenbaum. 2009. ‘Digital Disaster, Cyber Security, and
the Copenhagen School’. International Studies Quarterly 53 (4): 1155–75. doi:
10.1111/j.1468-2478.2009.00572.x.
Harbisch, Amelie. 2021. ‘Practices of Constructing the Ideal Refugee: Subjectivation and
Performance Approaches to Discourse Combined’. PhD Dissertation. Berlin: Freie
Universität Berlin.
Harding, Elizabeth (Liz), Jarno J. Vanto, Reece Clark, L. Hannah Ji, and Sara C. Ainsworth.
2019. ‘Understanding the Scope and Impact of the California Consumer Privacy Act of
2018’. Journal of Data Protection & Privacy 2 (3): 234–53.
Harmes, Adam. 2012. ‘The Rise of Neoliberal Nationalism’. Review of International Political
Economy 19 (1): 59–86. doi: 10.1080/09692290.2010.507132.
Hart, Herbert Lionel Adolphus. 1958. ‘Positivism and the Separation of Law and Morals’.
Harvard Law Review 71 (4): 593–629.
Hartlapp, Miriam, Julia Metz, and Christian Rauh. 2014. Which Policy for Europe? Power
and Conflict inside the European Commission. Oxford: Oxford University Press.
Hatch, Orrin. 2018. ‘Congressional Record: Senate’. S1885-1984. Washington DC.
Hayes, Ben. 2012. ‘The Surveillance-Industrial Complex’. In Routledge Handbook of Surveil-
lance Studies, edited by Kirstie Ball, Kevin Haggerty, and David Lyon, 167–75. Abingdon
and New York: Routledge.
Hegel, Georg Wilhelm Friedrich. 2015. The Philosophy of Right. Mineola, NY: Dover
Publications.
Heisenberg, Dorothee. 2005. Negotiating Privacy: The European Union, the United States,
and Personal Data Protection. Ipolitics: Global Challenges in the Information Age.
Boulder, CO: Lynne Rienner Publishers.
Held, David. 2002. ‘Cosmopolitanism: Ideas, Realities and Deficits’. In Governing Global-
ization: Power, Authority and Global Governance, edited by David Held and Anthony
McGrew, 305–24. Malden, MA: Polity Press.
Held, David, and Anthony McGrew. 1998. ‘The End of the Old Order? Globalization and
the Prospects for World Order’. Review of International Studies 24 (5): 219–45.
Héritier, Adrienne, and Dirk Lehmkuhl. 2008. ‘The Shadow of Hierarchy and New Modes
of Governance’. Journal of Public Policy 28 (1): 1–17.
Hern, Alex. 2019. ‘Apple Chief Calls for Laws to Tackle “Shadow Economy” of Data
Firms’. The Guardian, 17 January 2019. https://www.theguardian.com/technology/
2019/jan/17/apple-chief-tim-cook-calls-for-laws-to-tackle-shadow-economy-of-data-
firms, accessed 4 July 2022.
Herschinger, Eva, Markus Jachtenfuchs, and Christiane Kraft-Kasack. 2011. ‘Scratching
the Heart of the Artichoke? How International Institutions and the European Union
Constrain the State Monopoly of Force’. European Political Science Review 3 (3): 445–68.
Herzog, Lisa. 2013. Inventing the Market: Smith, Hegel, and Political Theory. Oxford: Oxford
University Press.
Hijmans, Hielke. 2016a. ‘The DPAs and Their Cooperation: How Far Are We in Making
Enforcement of Data Protection Law More European’. European Data Protection Law
Review 2: 362–72.
Hijmans, Hielke. 2016b. The European Union as Guardian of Internet Privacy. Cham:
Springer.
Himma, Kenneth Einar. 2016. ‘Why Security Trumps Privacy’. In Privacy, Security and
Accountability: Ethics, Law and Policy, edited by A. D. Moore, 145–70. London: Rowman
& Littlefield International.
Hirschl, Ran. 2009. ‘The Judicialization of Politics’. In The Oxford Handbook of Political
Science, edited by Robert E. Goodin, 253–74. Oxford and New York: Oxford University
Press.
Hobbes, Thomas. 2016. Thomas Hobbes: Leviathan. Longman Library of Primary Sources
in Philosophy. New York: Routledge.
Hobbes, Thomas, and Edwin Curley. 1994. Leviathan: With Selected Variants from the Latin
Edition of 1668. Indianapolis, IN: Hackett Publishing.
Hoffmann, Anna Lauren. 2019. ‘Where Fairness Fails: Data, Algorithms, and the Limits of
Antidiscrimination Discourse’. Information, Communication & Society 22 (7): 900–15.
Hofmann, Jeanette, Christian Katzenbach, and Kirsten Gollatz. 2016. ‘Between Coordina-
tion and Regulation: Finding the Governance in Internet Governance’. New Media &
Society 19 (9): 1406–23. doi: 10.1177/1461444816639975.
Holzscheiter, Anna, Sassan Gholiagha, and Andrea Liese. 2020. ‘Activating Norm Colli-
sions: Interface Conflicts in International Drug Control’. Global Constitutionalism 9 (2):
290–317.
Homeland Security & Governmental Affairs Committee. 2012. ‘HSBC Exposed U.S.
Financial System to Money Laundering, Drug, Terrorist Financing Risks’. https://
www.hsgac.senate.gov/subcommittees/investigations/media/hsbc-exposed-us-finacial-
system-to-money-laundering-drug-terrorist-financing-risks.
Honneth, Axel. 2010. ‘Dissolutions of the Social: On the Social Theory of Luc
Boltanski and Laurent Thévenot’. Constellations 17 (3): 376–89. doi: 10.1111/j.1467-
8675.2010.00606.x.
H.R.1428-114th Congress (2015–2016): Judicial Redress Act of 2015. 2016.
Hutter, Swen, Edgar Grande, and Hanspeter Kriesi. 2016. Politicising Europe. Cambridge:
Cambridge University Press.
Huysmans, Jef. 2006. The Politics of Insecurity: Fear, Migration and Asylum in the EU.
London and New York: Routledge.
Huysmans, Jef. 2008. ‘The Jargon of Exception: On Schmitt, Agamben and the Absence of
Political Society’. International Political Sociology 2 (2): 165–83.
IA. 2015. ‘Examining the EU Safe Harbor Decision and Impacts for Transatlantic Data
Flows’. Congressional Hearing: House Energy and Commerce, Subcommittees on Com-
merce, Manufacturing and Trade and Communications & Technology.
IA. 2018. ‘IA Privacy Principles for a Modern National Regulatory Framework’. https://
web.archive.org/web/20190530160257/https://internetassociation.org/files/ia_privacy-
principles-for-a-modern-national-regulatory-framework_full-doc/.
IATA. 2011. ‘IATA Reveals Checkpoint of the Future’, June. https://www.iata.org/en/
pressroom/2011-press-releases/2011-06-07-01/.
ICANN. 2017. ‘Updated ICANN Procedure for Handling WHOIS Conflicts with Pri-
vacy Laws Now Available: ICANN’. https://www.icann.org/news/announcement-2017-
04-18-en, accessed 4 July 2022.
ICDPPC. 2005. ‘The Protection of Personal Data and Privacy in a Globalised World: A
Universal Right Respecting Diversities’. 27th International Conference of Data Protection
and Privacy Commissioners. Montreux.
Inman, Phillip. 2020. ‘TikTok Halts Talks on London HQ amid UK–China Tensions’. The
Guardian, 19 July 2020. https://www.theguardian.com/technology/2020/jul/19/tiktok-
halts-talks-on-london-headquarters-amid-uk-china-tensions, accessed 4 July 2022.
In re: A Warrant. 2016. In re: A Warrant to Search a Certain E-Mail Account Controlled and
Maintained by Microsoft Corporation, Appellant, v. United States of America, Appellee.
United States Court of Appeals, Second Circuit.
In Re Search Warrant No. 16-1061-M to Google, 232 F. Supp. 3d 708. US District Court,
E.D. Pennsylvania, 2017.
International Law Commission. 2006. ‘Report on the Work of Its Fifty-Eighth Session’. UN
Doc. A/61/10. 1 May–9 June and 3 July–11 August.
in ’t Veld, Sophie. 2019. ‘Long Arm of American Law? Not in Europe!’. February. https://
sophieintveld.eu/long-arm-of-american-law-not-in-europe/, accessed 4 July 2022.
in ’t Veld, Sophie. 2020. ‘Letter to the European Commission’, 10 June 2020. https://twitter.
com/SophieintVeld/status/1270972850367877121/photo/1.
Ireland. 2018. ‘Brief of the Republic of Ireland as Amicus Curiae in Support of Neither Party,
United States of America v. Microsoft Corporation’. US Supreme Court.
Irish DPC. 2013. ‘Letter from the Irish Data Protection Commissioner’, 26 July 2013. http://
www.europe-v-facebook.org/DPC_PRISM_all.pdf (Letters from the DPC, compiled by
Maximilian Schrems, 2–3).
Irish DPC. 2020. ‘DPC Statement on CJEU Decision’. Data Protection Commis-
sion. https://www.dataprotection.ie/news-media/press-releases/dpc-statement-cjeu-
decision, accessed 4 July 2022.
Jääskinen, Niilo. 2013. ‘Opinion of Advocate General Jääskinen Delivered on 25 June 2013.
Case C-131/12. Google Spain SL, Google Inc. v Agencia Española de Protección de Datos
(AEPD), Mario Costeja González’. ECLI:EU:C:2013:424.
Jagd, Søren. 2011. ‘Pragmatic Sociology and Competing Orders of Worth in Organizations’.
European Journal of Social Theory 14 (3): 343–59. doi: 10.1177/1368431011412349.
Janda, Michael, and Peter Ryan. 2019. ‘Westpac Faces Fines over “Serious and
Systemic” Anti-Money Laundering Breaches, AUSTRAC Says’. ABC News, 19 Novem-
ber 2019. https://www.abc.net.au/news/2019-11-20/westpac-to-face-fines-anti-money-
laundering-terrorism-breaches/11720474, accessed 4 July 2022.
Janger, Edward J. 2002. ‘Privacy Property, Information Costs, and the Anticommons’.
Hastings Law Journal 54: 899–929.
Johns, Fleur, and Caroline Compton. 2022. ‘Data Jurisdictions and Rival Regimes of Algo-
rithmic Regulation’. Regulation & Governance 16 (1): 63–84. doi: 10.1111/rego.12296.
Johnson, Bobbie, Charles Arthur, and Josh Halliday. 2010. ‘Libyan Domain Shutdown
No Threat, Insists Bit.Ly’. The Guardian, 9 October 2010. http://www.theguardian.com/
technology/2010/oct/08/bitly-libya, accessed 4 July 2022.
Johnson, David R., and David Post. 1996. ‘Law and Borders: The Rise of Law in Cyberspace’.
Stanford Law Review 48: 1367–402.
Joined Cases C-92/09 and C-93/09. 2010. Volker und Markus Schecke GbR and Hartmut
Eifert v Land Hessen. Court of Justice of the EU: General Court.
Joined Cases C-203/15 and C-698/15. Tele2 Sverige AB v. Post- och Telestyrelsen and Sec-
retary of State for the Home Department v Tom Watson and Others. Court of Justice of
the EU: General Court, 2016.
Joined Cases C-317/04 and C-318/04. 2006. European Parliament v. Council of the Euro-
pean Union and Commission of the European Communities. Court of Justice of the EU:
General Court.
Joined Cases C-511/18, C-512/18 and C-520/18. 2016. La Quadrature du Net and Others v
Commission. Court of Justice of the EU: General Court.
Joined Cases C-511/18, C-512/18 and C-520/18. 2020. La Quadrature du Net et al. v. Premier
ministre et al. and Ordre des barreaux francophones et germanophone et al. v. Conseil
des ministres. Court of Justice of the EU: Grand Chamber.
Jones, Meg Leta. 2018. Ctrl + Z: The Right to Be Forgotten. New York and London: NYU
Press.
Jóri, András. 2015. ‘Shaping vs Applying Data Protection Law: Two Core Functions of Data
Protection Authorities’. International Data Privacy Law 5 (2): 133–43.
Jourová, Věra. 2018. ‘Tweet’, 26 March 2018. https://twitter.com/VeraJourova/status/
978256311480709120, accessed 4 July 2022.
JSB. 2011. ‘Report on the Inspection of Europol’s Implementation of the TFTP Agreement,
Conducted in November 2010 by the Europol Joint Supervisory Body’. JSB/Ins.
11–07. Brussels. https://www.ip-rs.si/fileadmin/user_upload/Pdf/novice/Terrorist_
Finance_Tracking_Program__TFTP__inspection_report_-_public_version.pdf.
JSB. 2015. ‘Europol JSB Inspection of the Implementation of the TFTP Agreement’. Letter
from Ms Vanna Palumbo, Chair of the Europol Joint Supervisory Body to Mr Étienne
Schneider, President of the Council 15/28. Brussels.
Kaczmarek, Michael, and Elena Lazarou. 2018. ‘US Counter-Terrorism since 9/11. Trends
under the Trump Administration’. Washington DC: European Parliamentary Research
Service.
Kalyanpur, Nikhil, and Abraham L. Newman. 2019a. ‘Mobilizing Market Power: Jurisdic-
tional Expansion as Economic Statecraft’. International Organization 73 (1): 1–34. doi:
10.1017/S0020818318000334.
Kalyanpur, Nikhil, and Abraham L. Newman. 2019b. ‘The MNC-Coalition Paradox: Issue
Salience, Foreign Firms and the General Data Protection Regulation’. Journal of Common
Market Studies 57 (3): 448–67.
Kang, Cecilia. 2018. ‘Tech Industry Pursues a Federal Privacy Law, on Its Own Terms’. The
New York Times, 27 August 2018. https://www.nytimes.com/2018/08/26/technology/
tech-industry-federal-privacy-law.html, accessed 4 July 2022.
Kanter, James, and Raphael Minder. 2010. ‘Air Travelers Lead European Privacy Con-
cerns’. The New York Times, 28 April 2010. https://www.nytimes.com/2010/04/28/
world/europe/28iht-privacy.html, accessed 4 July 2022.
Katz v. United States. 1967. 389 U.S. 347. Supreme Court of the US.
Kaunert, Christian, and Sarah Léonard. 2012. ‘Introduction: Supranational Governance
and European Union Security after the Lisbon Treaty: Exogenous Shocks, Policy
Entrepreneurs and 11 September 2001’. Cooperation and Conflict 47 (4): 417–32.
Kaunert, Christian, Sarah Léonard, and Alex MacKenzie. 2012. ‘The Social Con-
struction of an EU Interest in Counter-Terrorism: US Influence and Internal
Struggles in the Cases of PNR and SWIFT’. European Security 21 (4): 474–96. doi:
10.1080/09662839.2012.688812.
Kaunert, Christian, Sarah Léonard, and Alex MacKenzie. 2015. ‘The European Parliament
in the External Dimension of EU Counter-Terrorism: More Actorness, Accountability
and Oversight 10 Years On?’ Intelligence and National Security 30 (2–3): 357–76. doi:
10.1080/02684527.2014.988446.
Kaunert, Christian, and Sarah Léonard. 2019. ‘The Collective Securitisation of Ter-
rorism in the European Union’. West European Politics 42 (2): 261–77. doi:
10.1080/01402382.2018.1510194.
Kauppi, Niilo, and Mikael R. Madsen. 2014. ‘Fields of Global Governance: How Transna-
tional Power Elites Can Make Global Governance Intelligible’. International Political
Sociology 8 (3): 324–30.
Kelion, Leo. 2019. ‘Google Wins Landmark Right to Be Forgotten Case’. BBC News,
24 September 2019. https://www.bbc.com/news/technology-49808208, accessed 4 July
2022.
Keller, Reiner. 2018. ‘The Sociology of Knowledge Approach to Discourse’. In The Sociology
of Knowledge Approach to Discourse: Investigating the Politics of Knowledge and Meaning-
Making, edited by Reiner Keller, Anna-Katharina Hornidge, and Wolf J. Schünemann,
16–47. Routledge Advances in Sociology. London and New York: Routledge. doi:
10.4324/9781315170008.
Keohane, Robert O., and David G. Victor. 2011. ‘The Regime Complex for Climate Change’.
Perspectives on Politics 9 (1): 7–23. doi: 10.1017/S1537592710004068.
Kergueno, Raphaël. 2017. ‘The Über-Lobbyists: How Silicon Valley Is Changing Brussels
Lobbying’. Transparency International EU, May. https://transparency.eu/uber-lobbyists,
accessed 4 July 2022.
Kerry, Cameron. 2012. ‘Avoiding a Data Divide between the US and the EU’. POLITICO,
November. https://www.politico.eu/article/avoiding-a-data-divide-between-the-us-
and-the-eu/, accessed 4 July 2022.
Kerry, Cameron. 2014. ‘Missed Connections: Talking with Europe about Data, Privacy,
and Surveillance’. Brookings. https://www.brookings.edu/wp-content/uploads/2016/06/
Kerry_EuropeFreeTradePrivacy.pdf, accessed 4 July 2022.
Kettemann, Matthias C. 2020. The Normative Order of the Internet: A Theory of Rule and
Regulation Online. Oxford: Oxford University Press.
Khan, Mehreen. 2019. ‘Brussels Defends Lack of Blockbuster Fines for Big Tech
Groups’. Financial Times, July. https://www.ft.com/content/3629b0fc-ad73-11e9-8030-
530adfa879c2, accessed 4 July 2022.
King, Colin, and Clive Walker. 2015. ‘Counter Terrorism Financing: Redundant Fragmen-
tation?’ New Journal of European Criminal Law 6 (3): 372–95.
King, Colin, Clive Walker, and Jimmy Gurulé, eds. 2018. The Palgrave Handbook of
Criminal and Terrorism Financing Law. Cham: Palgrave Macmillan.
Kirby, Michael. 2011. ‘The History, Achievement and Future of the 1980 OECD Guidelines
on Privacy’. International Data Privacy Law 1 (1): 6–14. doi: 10.1093/idpl/ipq002.
Kirkhope, Timothy. 2016. Protection of Individuals with Regard to the Processing of
Personal Data: Processing of Personal Data for the Purposes of Crime Prevention
(Debate). 13 April 2016. Strasbourg: European Parliament. http://www.europarl.europa.
eu/doceo/document/CRE-8-2016-04-13-INT-3-521-0000_NL.html?redirect, accessed
4 July 2022.
Kiss, Jemima. 2015. ‘Dear Google: Open Letter from 80 Academics on “Right to Be For-
gotten”’. The Guardian, 14 May 2015. https://www.theguardian.com/technology/2015/
may/14/dear-google-open-letter-from-80-academics-on-right-to-be-forgotten, accessed
4 July 2022.
Kjaer, Poul F., and Antje Vetterlein. 2018. ‘Regulatory Governance: Rules,
Resistance and Responsibility’. Contemporary Politics 24 (5): 497–506. doi:
10.1080/13569775.2018.1452527.
Klass and Others v. Federal Republic of Germany. 1978. European Court of Human Rights.
App no. 5029/71.
Knorr Cetina, Karin. 2005. ‘Objectual Practice’. In The Practice Turn in Contemporary
Theory, edited by Karin Knorr Cetina, Theodore R. Schatzki, and Eike von Savigny, 184–97.
London: Routledge.
Kobrin, Stephen J. 2004. ‘Safe Harbours Are Hard to Find: The Trans-Atlantic Data Privacy
Dispute, Territorial Jurisdiction and Global Governance’. Review of International Studies
30 (1): 111–31.
Kornprobst, Markus. 2014. ‘From Political Judgements to Public Justifications (and Vice
Versa): How Communities Generate Reasons upon Which to Act’. European Journal of
International Relations 20 (1): 192–216.
Kottasová, Ivana. 2018. ‘These Companies Are Getting Killed by GDPR’. CNN Money,
May. https://money.cnn.com/2018/05/11/technology/gdpr-tech-companies-losers/
index.html.
Kreuder-Sonnen, Christian. 2018. ‘Political Secrecy in Europe: Crisis Manage-
ment and Crisis Exploitation’. West European Politics 41 (4): 958–80. doi:
10.1080/01402382.2017.1404813.
Kreuder-Sonnen, Christian. 2019. Emergency Powers of International Organizations:
Between Normalization and Containment. Oxford: Oxford University Press.
Kreuder-Sonnen, Christian, and Michael Zürn. 2020. ‘After Fragmentation: Norm Col-
lisions, Interface Conflicts, and Conflict Management’. Global Constitutionalism 9 (2):
241–67.
Krisch, Nico. 2010. Beyond Constitutionalism: The Pluralist Structure of Postnational Law.
Oxford: Oxford University Press.
Krisch, Nico, Francesco Corradini, and Lucy Lu Reimers. 2020. ‘Order at the Margins:
The Legal Construction of Norm Collisions Over Time’. Global Constitutionalism 9 (2):
343–63.
Kulesza, Joanna. 2018. ‘Balancing Privacy and Security in a Multistakeholder Environment:
ICANN, WHOIS and GDPR’. The Visio Journal 3: 49–58.
Kuner, Christopher. 2013. Transborder Data Flows and Data Privacy Law. Oxford: Oxford
University Press.
Kuner, Christopher. 2015. ‘Extraterritoriality and Regulation of International Data Trans-
fers in EU Data Protection Law’. International Data Privacy Law 5 (4): 235–45. doi:
10.1093/idpl/ipv019.
Kuner, Christopher. 2017. ‘Reality and Illusion in EU Data Transfer Regulation Post
Schrems’. German Law Journal 18 (4): 881–918.
Kydd, Andrew H. 2021. ‘Decline, Radicalization and the Attack on the US Capitol’. Violence:
An International Journal 2 (1): 3–23.
Lamla, Jörn. 2016. ‘Schluss: Demokratische Alternativen der Reterritorialisierung’. In Die
Reterritorialisierung des Digitalen: Zur Reaktion nationaler Demokratie auf die Krise
der Privatheit nach Snowden, edited by Barbara Büttner, Simon Ledder, Carsten Ochs,
Fabian Pittroff, Christian L. Geminn, Thilo Hagendorff, and Jörn Lamla, 153–9. Kassel:
Kassel University Press.
Lancieri, Filippo Maria. 2018. ‘Digital Protectionism? Antitrust, Data Protection, and
the EU/US Transatlantic Rift’. Journal of Antitrust Enforcement 7 (1): 27–53. doi:
10.1093/jaenfo/jny012.
Larsson, Olof, and Daniel Naurin. 2016. ‘Judicial Independence and Political Uncer-
tainty: How the Risk of Override Affects the Court of Justice of the EU’. International
Organization 70 (2): 377–408. doi: 10.1017/S0020818316000047.
Laurer, Moritz, and Timo Seidl. 2021. ‘Regulating the European Data-Driven Econ-
omy: A Case Study on the General Data Protection Regulation’. Policy & Internet 13:
257–77.
Law Enforcement Directive (EU) 2016/680 of the European Parliament and of the Coun-
cil of 27 April 2016 on the Protection of Natural Persons with Regard to the Processing
of Personal Data by Competent Authorities for the Purposes of the Prevention, Inves-
tigation, Detection or Prosecution of Criminal Offences or the Execution of Criminal
Penalties, and on the Free Movement of Such Data, and Repealing Council Framework
Decision 2008/977/JHA. 2016. OJ L 119.
Leander, Anna. 2005. ‘The Power to Construct International Security: On the Significance
of Private Military Companies’. Millennium 33 (3): 803–25.
Leander, Anna. 2011. ‘The Promises, Problems, and Potentials of a Bourdieu-Inspired
Staging of International Relations’. International Political Sociology 5 (3): 294–313.
Lee, Dave. 2015. ‘How Worried Is Silicon Valley about Safe Harbour?’ BBC News, 7 October
2015. https://www.bbc.com/news/technology-34461682, accessed 4 July 2022.
Lee, Dave. 2017. ‘US Government “Monitored Bank Transfers”’. BBC News, 16 April 2017.
https://www.bbc.com/news/technology-39606575, accessed 4 July 2022.
Leiser, Mark, and Bart Custers. 2019. ‘The Law Enforcement Directive: Conceptual Chal-
lenges of EU Directive 2016/680’. European Data Protection Law Review 5 (3): 367–78.
Lévy, Emmanuel. 2016. ‘Valls a-t-il voulu la peau d’un préfet?’ Marianne, 31 March 2016.
https://www.marianne.net/politique/valls-t-il-voulu-la-peau-dun-prefet, accessed 4 July
2022.
LIBE. 2010. ‘Report on the Proposal for a Council Decision on the Conclusion of the
Agreement between the European Union and the United States of America on the Pro-
cessing and Transfer of Financial Messaging Data from the European Union to the United
States for Purposes of the Terrorist Finance Tracking Program’. 05305/1/2010REV –
C7-0004/2010 – 2009/0190(NLE). Brussels.
LIBE. 2013. Hearing. LIBE Committee Inquiry on Electronic Mass Surveillance of EU
Citizens. Video recording, 7 October 2013. https://multimedia.europarl.europa.eu/en/
committee-on-civil-liberties-justice-and-home-affairs_20131007-1900-COMMITTEE-LIBE_vd.
LIBE. 2014a. ‘LIBE Committee Inquiry: Electronic Mass Surveillance of EU Citizens’.
https://europarl.europa.eu/document/activities/cont/201410/20141016ATT91322/
20141016ATT91322EN.pdf, accessed 4 July 2022.
LIBE. 2014b. ‘Report on the US NSA Surveillance Programme: Surveillance Bodies in
Various Member States and Their Impact on EU Citizens’ Fundamental Rights and on
Transatlantic Cooperation in Justice and Home Affairs’. 2013/2188(INI). Brussels.
LIBE. 2015. ‘Second Report on the Proposal for a Directive of the European Parliament
and of the Council Second Report on the Use of Passenger Name Record Data for the
Prevention, Detection, Investigation and Prosecution of Terrorist Offences and Serious
Crime’. (COM(2011)0032 – C7-0039/2011 – 2011/0023(COD)). Rapporteur: Timothy
Kirkhope. Brussels.
LIBE. 2019. ‘Draft Report on the Proposal for a Regulation of the European Parliament
and of the Council on European Production and Preservation Orders for Electronic
Evidence in Criminal Matters (COM(2018)0225 – C8-0155/2018 – 2018/0108(COD)).
Rapporteur: Birgit Sippel’. Brussels.
Lichtblau, Eric, and James Risen. 2006. ‘Bank Data Is Sifted by U.S. in Secret to Block Terror’.
The New York Times, 23 June 2006. https://www.nytimes.com/2006/06/23/washington/
23intel.html, accessed 4 July 2022.
Lien, Tracy. 2015. ‘Microsoft Handed FBI Data on Charlie Hebdo Probe in 45 Minutes’.
Los Angeles Times, 20 January 2015. https://www.latimes.com/business/technology/la-
fi-tn-microsoft-charlie-hebdo-20150120-story.html, accessed 4 July 2022.
Ligeti, Katalin, and Gavin Robinson. 2021. ‘Sword, Shield and Cloud: Toward a Euro-
pean System of Public–Private Orders for Electronic Evidence in Criminal Matters?’
In Surveillance and Privacy in the Digital Age: European, Transatlantic and Global Per-
spectives, edited by Valsamis Mitsilegas and Niovi Vavoula, 1st edn, 27–70. Oxford: Hart
Publishing.
Ligue des droits humains. 2022. Ligue des droits humains v. Conseil des ministres. Court
of Justice of the EU: Grand Chamber. Case C-817/19.
Lillington, Karlin. 2019. ‘Schrems II Will Seriously Stress Test EU’s Data Privacy Rules’. The
Irish Times, 11 July 2019. https://www.irishtimes.com/business/technology/schrems-ii-
will-seriously-stress-test-eu-s-data-privacy-rules-1.3952925, accessed 4 July 2022.
Linklater, Andrew. 1998. The Transformation of Political Community: Ethical Foundations
of the Post-Westphalian Era. Studies in International Relations. Cambridge: Polity.
Lobbyfacts. 2022. ‘LobbyFacts Database’. https://lobbyfacts.eu/, accessed 4 July 2022.
Locke, John. 1967. Locke: Two Treatises of Government. Cambridge: Cambridge University
Press.
Long, William J., and Marc Pang Quek. 2002. ‘Personal Data Privacy Protection in an Age of
Globalization: The US–EU Safe Harbor Compromise’. Journal of European Public Policy
9 (3): 325–44. doi: 10.1080/13501760210138778.
Lowe, Peter. 2006. ‘Counterfeiting: Links to Organised Crime and Terrorist Funding’.
Journal of Financial Crime 13 (2): 255–7.
Lynskey, Orla. 2015a. The Foundations of EU Data Protection Law. Oxford: Oxford Univer-
sity Press.
Lynskey, Orla. 2015b. ‘Control over Personal Data in a Digital Age: Google Spain v AEPD
and Mario Costeja Gonzalez’. The Modern Law Review 78 (3): 522–34. doi: 10.1111/1468-
2230.12126.
Lynskey, Orla. 2019. ‘Grappling with “Data Power”: Normative Nudges from Data Protec-
tion and Privacy’. Theoretical Inquiries in Law 20 (1): 189–220.
Lyon, David. 1994. The Electronic Eye: The Rise of Surveillance Society. Minneapolis, MN:
University of Minnesota Press.
Lyon, David. 2007. Surveillance Studies: An Overview. Cambridge: Polity.
Lyon, David. 2015. Surveillance after Snowden. Cambridge and Malden, MA: Polity.
McCourt, David M. 2016. ‘Practice Theory and Relationalism as the New Constructivism.’
International Studies Quarterly 60 (3): 475–85.
McCulloch, Jude, and Sharon Pickering. 2009. ‘Pre-Crime and Counter-Terrorism: Imag-
ining Future Crime in the “War on Terror”’. The British Journal of Criminology 49 (5):
628–45.
Mac Ginty, Roger. 2017. ‘A Material Turn in International Relations: The 4x4, Intervention
and Resistance’. Review of International Studies 43 (5): 855–74.
Meehan, Patrick. 2011. ‘Congressional Hearing: Intelligence Sharing and Terrorist Travel:
How DHS Addresses the Mission of Providing Security, Facilitating Commerce and
Protecting Privacy for Passengers Engaged in International Travel’. Washington DC.
Mérand, Frédéric. 2010. ‘Pierre Bourdieu and the Birth of European Defense’. Security
Studies 19 (2): 342–74.
Microsoft. 2018. ‘Brief in Opposition. In Re Search Warrant. USA v. Microsoft’. 584 U.S. __
(2018). US Supreme Court.
Microsoft. 2020. ‘Law Enforcement Requests Report: Microsoft CSR’. Microsoft. https://
www.microsoft.com/en-us/corporate-responsibility/lerr, accessed 4 July 2022.
Mill, John Stuart. 1859. On Liberty. 4th edn. London: Longman, Roberts, & Green Co.
http://www.econlib.org/library/Mill/mlLbty3.html, accessed 4 July 2022.
Miller, David. 2017. Liberty Reader. London: Routledge.
Milliken, Jennifer. 1999. ‘The Study of Discourse in International Relations: A Critique of
Research and Methods’. European Journal of International Relations 5 (2): 225–54.
Mitsilegas, Valsamis. 2014. ‘Transatlantic Counterterrorism Cooperation and European
Values: The Elusive Quest for Coherence’. In A Transatlantic Community of Law: Legal
Perspectives on the Relationship between the EU and US Legal Orders, edited by Elaine
Fahey and Deirdre Curtin, 289–315. Cambridge: Cambridge University Press.
Mitsilegas, Valsamis. 2020. ‘The Preventive Turn in European Security Policy: Towards
a Rule of Law Crisis?’ In EU Law in Populist Times: Crises and Prospects,
edited by Francesca Bignami, 301–18. Cambridge: Cambridge University Press. doi:
10.1017/9781108755641.011.
Möllers, Christoph. 2020. The Possibility of Norms. Oxford: Oxford University Press.
Monar, Jörg. 2010. ‘The Rejection of the EU–US SWIFT Interim Agreement by the Euro-
pean Parliament: A Historic Vote and Its Implications’. European Foreign Affairs Review
15: 143–51.
Monar, Jörg. 2015. ‘The EU as an International Counter-Terrorism Actor: Progress
and Constraints’. Intelligence and National Security 30 (2–3): 333–56. doi:
10.1080/02684527.2014.988448.
Moody, Glyn. 2016. ‘In “an Unusual Move,” US Government Asks to Join Key EU Face-
book Privacy Case’. Ars Technica, June. https://arstechnica.com/tech-policy/2016/06/
eu-facebook-schrems-case-us-government-amicus-curiae/, accessed 4 July 2022.
Mueller, Milton. 2010. Networks and States: The Global Politics of Internet Governance.
Information Revolution and Global Politics. Cambridge, MA: MIT Press.
Mueller, Milton, and Mawaki Chango. 2008. ‘Disrupting Global Governance: The Internet
Whois Service, ICANN, and Privacy’. Journal of Information Technology & Politics 5 (3):
303–25.
Murray, Andrew. 2007. The Regulation of Cyberspace: Control in the Online Environment.
Abingdon and New York: Routledge.
Nachtwey, Oliver, and Timo Seidl. 2020. ‘The Solutionist Ethic and the Spirit of Digital
Capitalism’. Soc ArXiv Working Paper, March. https://osf.io/preprints/socarxiv/sgjzq/,
accessed 4 July 2022.
Nagel, Thomas. 2005. ‘The Problem of Global Justice’. Philosophy & Public Affairs 33 (2):
113–47.
Nance, Mark T. 2018. ‘Re-Thinking FATF: An Experimentalist Interpretation of the Finan-
cial Action Task Force’. Crime, Law and Social Change 69 (2): 131–52.
Neveu, Erik. 2018. ‘Bourdieu’s Capital(s): Sociologizing an Economic Concept’. In The
Oxford Handbook of Pierre Bourdieu, edited by Thomas Medvetz and Jeffrey J. Sallaz,
347–74. New York: Oxford University Press.
Newman, Abraham L. 2008. Protectors of Privacy: Regulating Personal Data in the Global
Economy. Ithaca, NY: Cornell University Press.
Newman, Abraham L., and David Bach. 2004. ‘Self-Regulatory Trajectories in the Shadow
of Public Power: Resolving Digital Dilemmas in Europe and the United States’. Governance
17 (3): 387–413.
The New York Times. 2015. ‘Microsoft Challenges Warrant for Emails Stored in Ireland’. The
New York Times, 9 September 2015. https://www.nytimes.com/2015/09/10/technology/
microsoft-challenges-warrant-for-emails-stored-in-ireland.html, accessed 4 July 2022.
Niemann, Holger. 2019. The Justification of Responsibility in the UN Security Council: Prac-
tices of Normative Ordering in International Relations. Routledge Global Cooperation
Series. London: Routledge.
Ni Loideain, Nora. 2016. ‘The End of Safe Harbor: Implications for EU Digital Privacy and
Data Protection Law’. Journal of Internet Law 19 (8): 7–14.
Nissenbaum, Helen. 1998. ‘Protecting Privacy in an Information Age: The Problem of
Privacy in Public’. Law and Philosophy 17 (5): 559–96.
Nissenbaum, Helen. 2009. Privacy in Context: Technology, Policy, and the Integrity of Social
Life. Stanford Law Books. Stanford, CA: Stanford University Press.
noyb. 2020. ‘CJEU Judgment: First Statement’. Noyb.Eu. 16 July 2020. https://noyb.eu/en/
cjeu, accessed 4 July 2022.
Nozick, Robert. 1974. Anarchy, State, and Utopia. Vol. 5038. New York: Basic Books.
Nussbaum, Martha. 1994. ‘Patriotism and Cosmopolitanism’. Boston Review XIX (5): 3–16.
OAS. 2004. ‘A Comprehensive Inter-American Cybersecurity Strategy: A Multidimensional
and Multidisciplinary Approach to Creating a Culture of Cybersecurity’. AG/RES. 1939
(XXXIII-O/03).
Obama, Barack. 2013. ‘Statement by the President’. San Jose, California. https://
obamawhitehouse.archives.gov/the-press-office/2013/06/07/statement-president,
accessed 4 July 2022.
Obama, Barack. 2014. ‘Text of the President’s Remarks on NSA and Surveillance’. Lawfare,
January. https://www.lawfareblog.com/text-presidents-remarks-nsa-and-surveillance,
accessed 4 July 2022.
Obendiek, Anke S. 2021. ‘Take Back Control? Digital Sovereignty and a Vision for Europe’.
Policy Paper. Jacques Delors Centre. May.
Obendiek, Anke S. 2022. ‘What Are We Actually Talking about? Conceptualizing Data as a
Governable Object in Overlapping Jurisdictions’. International Studies Quarterly 66 (1).
doi: 10.1093/isq/sqab080.
Occhipinti, John D. 2015. ‘Still Moving toward a European FBI? Re-Examining the Politics
of EU Police Cooperation’. Intelligence and National Security 30 (2–3): 234–58.
Ochs, Carsten, Fabian Pittroff, Barbara Büttner, and Jörn Lamla. 2016. ‘Governing the
Internet in the Privacy Arena’. Internet Policy Review 5 (3). doi: 10.14763/2016.3.426.
OECD. 1980. ‘OECD Guidelines on the Protection of Privacy and Transborder Flows
of Personal Data’. http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotection
ofprivacyandtransborderflowsofpersonaldata.htm, accessed 4 July 2022.
OECD. 2013. ‘Recommendation of the Council concerning Guidelines Governing the
Protection of Privacy and Transborder Flows of Personal Data’. OECD/LEGAL/0188.
https://legalinstruments.oecd.org/en/instruments/OECD-LEGAL-0188.
Office of the United States Trade Representative. 2017. ‘2017 Special 301 Report’. https://
ustr.gov/sites/default/files/301/2017%20Special%20301%20Report%20FINAL.PDF,
accessed 4 July 2022.
Ohmae, Kenichi. 1995. The End of the Nation State: The Rise of Regional Economies. Free
Press Paperback. London and New York: Free Press.
Olmstead v. United States. 1928. 277 U.S. 438. Supreme Court of the US.
O’Neill, Onora. 2000. Bounds of Justice. Cambridge: Cambridge University Press.
Parkin, Joanna, Didier Bigo, Sergio Carrera, Amandine Scherrer, Julien Jeandesboz,
Nicholas Hernanz, and Francesco Ragazzi, European Parliament, and Directorate-
General for Internal Policies of the Union. 2013. National Programmes for Mass Surveil-
lance of Personal Data in EU Member States and Their Compatibility with EU Law.
Luxembourg: Publications Office.
Parsons, Christopher. 2017. ‘The (In)Effectiveness of Voluntarily Produced Transparency
Reports’. Business & Society 58 (1): 103–31. doi: 10.1177/0007650317717957.
Pasquale, Frank, and Tara Adams Ragone. 2013. ‘Protecting Health Privacy in an Era of Big
Data Processing and Cloud Computing’. Stanford Technology Law Review 17: 595–654.
Patten, Chris. 2004. ‘Speech by the Rt. Hon Chris Patten, Commissioner for External Rela-
tions: Passenger Name Records (PNR) EC /US Agreement’. SPEECH/04/189 Presented
at the European Parliament Plenary Session, Strasbourg, April 21.
Pawlak, Patryk. 2009. ‘Network Politics in Transatlantic Homeland Security Cooperation’.
Perspectives on European Politics & Society 10 (4): 560–81.
Pellandini-Simányi, Léna. 2014. ‘Bourdieu, Ethics and Symbolic Power’. The Sociological
Review 62 (4): 651–74. doi: 10.1111/1467-954X.12210.
Pettit, Philip. 1997. Republicanism: A Theory of Freedom and Government. Oxford: Claren-
don Press.
Pieth, Mark. 2006. ‘Criminalizing the Financing of Terrorism’. Journal of International
Criminal Justice 4 (5): 1074–86.
Pirkova, Eliska, and Estelle Massé. 2019. ‘EU Court Decides on Two Major “Right to Be
Forgotten” Cases: There Are No Winners Here’. Access Now, October. https://www.
accessnow.org/eu-court-decides-on-two-major-right-to-be-forgotten-cases-there-are-
no-winners-here/, accessed 4 July 2022.
PNR Directive. 2016. Directive (EU) 2016/681 of the European Parliament and of the
Council of 27 April 2016 on the Use of Passenger Name Record (PNR) Data for the
Prevention, Detection, Investigation and Prosecution of Terrorist Offences and Serious
Crime. OJ L 119.
Pogge, Thomas W. 1989. Realizing Rawls. Ithaca, NY, and London: Cornell University
Press.
Pogge, Thomas W. 2008. World Poverty and Human Rights: Cosmopolitan Responsibilities
and Reforms. 2nd edn. Cambridge and Malden, MA: Polity.
Pohle, Jörg. 2017. ‘Datenschutz und Technikgestaltung: Geschichte und Theorie des Daten-
schutzes aus informatischer Sicht und Folgerungen für die Technikgestaltung’. PhD
Dissertation. Humboldt-Universität zu Berlin. doi: 10.18452/19136.
Pohle, Julia, Maximilian Hösl, and Ronja Kniep. 2016. ‘Analysing Internet Policy as a Field of
Struggle’. Internet Policy Review 5 (3). doi: 10.14763/2016.3.412.
Pohle, Julia, and Thorsten Thiel. 2020. ‘Digital Sovereignty’. Internet Policy Review 9 (4).
doi: 10.14763/2020.4.1532.
Porter, Andrew L., and Annegret Bendiek. 2012. ‘Counterterrorism Cooperation in
the Transatlantic Security Community’. European Security 21 (4): 497–517. doi:
10.1080/09662839.2012.688811.
Posner, Richard A. 1977. ‘The Right of Privacy’. Georgia Law Review 12 (3): 393–422.
Posner, Richard A. 1981. ‘The Economics of Privacy’. The American Economic Review 71
(2): 405–9.
Pouliot, Vincent. 2007. ‘“Sobjectivism”: Toward a Constructivist Methodology’. Interna-
tional Studies Quarterly 51 (2): 359–84.
Pouliot, Vincent. 2016. ‘Hierarchy in Practice: Multilateral Diplomacy and the Governance
of International Security’. European Journal of International Security 1 (1): 5–26. doi:
10.1017/eis.2015.4.
Pouponneau, Florent, and Frédéric Mérand. 2017. ‘Diplomatic Practices, Domestic Fields,
and the International System: Explaining France’s Shift on Nuclear Nonproliferation’.
International Studies Quarterly 61 (1): 123–35. doi: 10.1093/isq/sqw046.
Powles, Julia. 2015a. ‘The Case That Won’t Be Forgotten’. Loyola University Chicago Law
Journal 47: 583–615.
Powles, Julia. 2015b. ‘Data Privacy: The Tide Is Turning in Europe: But Is It Too Little, Too
Late?’ The Guardian, 6 April 2015. https://www.theguardian.com/technology/2015/apr/
06/data-privacy-europe-facebook, accessed 4 July 2022.
Powles, Julia. 2015c. ‘Tech Companies like Facebook Not above the Law, Says Max Schrems’.
The Guardian, 9 October 2015. http://www.theguardian.com/technology/2015/oct/09/
facebook-data-privacy-max-schrems-european-court-of-justice, accessed 4 July 2022.
Powles, Julia, and Enrique Chaparro. 2015. ‘How Google Determined Our Right to Be
Forgotten’. The Guardian, 18 February 2015. https://www.theguardian.com/technology/
2015/feb/18/the-right-be-forgotten-google-search, accessed 4 July 2022.
Pritzker, Penny. 2015. ‘Statement from U.S. Secretary of Commerce Penny Pritzker on Euro-
pean Court of Justice Safe Harbor Framework Decision’. Department of Commerce,
October. https://2014-2017.commerce.gov/news/press-releases/2015/10/statement-us-
secretary-commerce-penny-pritzker-european-court-justice.html.
Privacy International. 2018. ‘Privacy Protection Principles’. https://privacyinternational.
org/sites/default/files/2018-09/Part%203%20-%20Data%20Protection%20Principles.
pdf, accessed 4 July 2022.
Privacy International. 2020a. ‘Here’s How a Well-Connected Security Company Is Quietly
Building Mass Biometric Databases in West Africa with EU Aid Funds’. Privacy Inter-
national. 10 November 2020. http://www.privacyinternational.org/news-analysis/4290/
heres-how-well-connected-security-company-quietly-building-mass-biometric.
Privacy International. 2020b. Privacy International v. Secretary of State for Foreign and
Commonwealth Affairs, Secretary of State for the Home Department, Government Com-
munications Headquarters, Security Service, Secret Intelligence Service. Court of Justice
of the EU: Grand Chamber. Case C-623/17.
Price, Rob. 2015. ‘Facebook Responds to Europe’s Top Court Striking down a Vital US-EU
Data Sharing Agreement’. Business Insider. http://uk.businessinsider.com/facebook-safe-
harbor-response-ecj-max-schrems-2015-10.
‘Protection of Individuals with Regard to the Processing of Personal Data - Processing
of Personal Data for the Purposes of Crime Prevention (Debate)’. 2016. European Par-
liament, Strasbourg, April 13. http://www.europarl.europa.eu/doceo/document/CRE-8-
2016-04-13-INT-3-521-0000_NL.html?redirect.
Puschmann, Cornelius, and Jean Burgess. 2014. ‘Big Data, Big Questions: Metaphors of Big
Data’. International Journal of Communication 8 (June). https://ijoc.org/index.php/ijoc/
article/view/2169, accessed 4 July 2022.
Putnam, Tonya L. 2009. ‘Courts without Borders: Domestic Sources of US Extraterritori-
ality in the Regulatory Sphere’. International Organization 63 (3): 459–90.
Raab, Charles, and Ivan Szekely. 2017. ‘Data Protection Authorities and Information Tech-
nology’. Computer Law & Security Review 33 (4): 421–33. doi: 10.1016/j.clsr.2017.05.002.
Radu, Roxana. 2019. Negotiating Internet Governance. Oxford: Oxford University
Press.
Ramer, Samuel R. 2017. ‘Letter from Samuel R. Ramer, Acting Assistant Attorney General,
Office of Legislative Affairs, U.S. Department of Justice, to Hon. Paul Ryan, Speaker, U.S.
House of Representatives’. [Cited in Statement of Richard W. Downing at a Hearing Enti-
tled ‘Data Stored Abroad: Ensuring Lawful Access and Privacy Protection in the Digital
Era’, 15 June 2017, Appendix A].
Rankin, Jennifer. 2016. ‘Brussels Bombings: EU Ministers to Meet’. The Guardian,
24 March 2016. https://www.theguardian.com/world/2016/mar/24/brussels-bombings-
eu-ministers-meet-as-border-controls-criticised, accessed 4 July 2022.
Raustiala, Kal. 2011. Does the Constitution Follow the Flag? The Evolution of Territoriality
in American Law. Oxford and New York: Oxford University Press.
Raustiala, Kal. 2013. ‘Institutional Proliferation and the International Legal Order’. In Inter-
disciplinary Perspectives on International Law and International Relations: The State of
the Art, edited by J. L. Dunoff and M. A. Pollack, 293–320. Cambridge and New York:
Cambridge University Press.
Rawls, John. 1971. A Theory of Justice. Cambridge, MA, and London: Belknap Press of
Harvard University Press.
Rawls, John. 2001. The Law of Peoples: With ‘The Idea of Public Reason Revisited’. Cam-
bridge, MA, and London: Harvard University Press.
Raymond, Mark, and Laura DeNardis. 2015. ‘Multistakeholderism: Anatomy of an Inchoate
Global Institution’. International Theory 7 (3): 572–616.
RCFP and Others. 2017. ‘Statement in CJEU Case C-507/17’. Court of Justice of
the European Union. https://www.rcfp.org/wp-content/uploads/imported/2017-11-29-
Statement-CJEU-in-Googe-v.-CNIL-RCFP-et-al.-as-filed.pdf, accessed 4 July 2022.
Redeker, Dennis, Lex Gill, and Urs Gasser. 2018. ‘Towards Digital Constitutionalism? Map-
ping Attempts to Craft an Internet Bill of Rights’. International Communication Gazette
80 (4): 302–19.
Reding, Viviane. 2010. ‘Viviane Reding Vice-President of the European Commission
responsible for Justice, Fundamental Rights and Citizenship Building Trust in Europe's
Online Single Market Speech at the American Chamber of Commerce to the EU’.
Brussels, 22 June 2010.
Reding, Viviane. 2014. ‘Facebook Post’. Facebook. 13 May 2014. https://www.facebook.com/
VivianeRedingEU/posts/304206613078842.
Rees, Wyn. 2007. Transatlantic Counter-Terrorism Cooperation: The New Imperative. Lon-
don and New York: Routledge.
Reform Government Surveillance. 2015. ‘Reform Government Surveillance’. Reform Gov-
ernment Surveillance. http://reformgovernmentsurveillance.com/, accessed 4 July 2022.
Regan, Priscilla M. 1995. Legislating Privacy: Technology, Social Values, and Public Policy.
Chapel Hill, NC: University of North Carolina Press.
Regan, Priscilla M. 2003. ‘Safe Harbors or Free Frontiers? Privacy and Transborder Data
Flows’. Journal of Social Issues 59 (2): 263–82.
Reiberg, Abel. 2018. Netzpolitik: Genese eines Politikfeldes. Policy Analyse. Baden-Baden:
Nomos.
Reicherts, Martine. 2014. ‘The Right to Be Forgotten and the EU Data Protection Reform’.
Lyon. http://europa.eu/rapid/press-release_SPEECH-14-568_en.htm, accessed 4 July
2022.
Ryngaert, Cedric M. J., and Nico A. N. M. van Eijk. 2019. ‘International Cooperation
by (European) Security and Intelligence Services: Reviewing the Creation of a Joint
Database in Light of Data Protection Guarantees’. International Data Privacy Law 9 (1):
61–73. doi: 10.1093/idpl/ipz001.
Saco, Diana. 1999. ‘Colonizing Cyberspace: “National Security” and the Internet’. In Cul-
tures of Insecurity: States, Communities, and the Production of Danger, edited by Jutta
Weldes, Mark Laffey, H. Gusterson, and Raymond Duvall, 261–92. Minneapolis, MN,
and London: University of Minnesota Press.
Safran. 2016. ‘Safran: Very Strong Revenue Growth for First-Quarter 2016’. Safran, April.
https://www.safran-group.com/pressroom/safran-very-strong-revenue-growth-first-
quarter-2016-2016-04-26.
Salter, Mark B. 2007. ‘Governmentalities of an Airport: Heterotopia and Confession’.
International Political Sociology 1 (1): 49–66. doi: 10.1111/j.1749-5687.2007.00004.x.
Samonte, Mary. 2020. ‘Google v. CNIL: The Territorial Scope of the Right to Be Forgotten
under EU Law’. European Papers. http://www.europeanpapers.eu/en/europeanforum/
google-v-cnil-territorial-scope-of-right-to-be-forgotten-under-eu-law, accessed 4 July
2022.
Sanchez v. France. 2021. European Court of Human Rights. App no. 45581/15.
Sandel, Michael J. 1998. Liberalism and the Limits of Justice. Cambridge and New York:
Cambridge University Press.
Sapiro, Gisèle. 2018. ‘Field Theory from a Transnational Perspective’. In The Oxford Hand-
book of Pierre Bourdieu, edited by Thomas Medvetz and Jeffrey J. Sallaz, 161–82. Oxford
and London: Oxford University Press.
SAS. 2017. ‘Identify High-Risk Travelers Faster and More Accurately through Better Anal-
ysis of PNR Data’. Solution Brief. https://www.sas.com/content/dam/SAS/en_us/doc/
solutionbrief/identify-high-risk-travelers-pnr-data-109223.pdf, accessed 4 July 2022.
Sassen, Saskia. 1998. Globalization and Its Discontents. Political Science/Economics. New
York: New Press.
Satamedia. 2008. Tietosuojavaltuutettu v. Satakunnan Markkinapörssi Oy, Satamedia Oy.
Court of Justice of the EU: General Court. Case C-73/07.
Satariano, Adam. 2018. ‘G.D.P.R., a New Privacy Law, Makes Europe World’s Leading Tech
Watchdog’. The New York Times, 24 May 2018. https://www.nytimes.com/2018/05/24/
technology/europe-gdpr-privacy.html, accessed 4 July 2022.
Satz, Debra. 2012. Why Some Things Should Not Be for Sale: The Moral Limits of Markets.
Oxford Political Philosophy. Oxford and New York: Oxford University Press.
Savage, Charlie. 2019. ‘Judge Rules Terrorism Watchlist Violates Constitutional Rights’. The
New York Times, 4 September 2019. https://www.nytimes.com/2019/09/04/us/politics/
terrorism-watchlist-constitution.html, accessed 4 July 2022.
Scally, Derek, and Jochen Bittner. 2013. ‘The Man in Portarlington Who Protects People’s
Data’. The Irish Times, 14 August 2013. https://www.irishtimes.com/news/technology/
the-man-in-portarlington-who-protects-people-s-data-1.1493167, accessed 4 July 2022.
Schildt, Henri. 2020. The Data Imperative: How Digitalization Is Reshaping Management,
Organizing, and Work. Oxford: Oxford University Press.
Schmidt, Vivien A., and Mark Thatcher. 2014. ‘Why Are Neoliberal Ideas so
Resilient in Europe’s Political Economy?’ Critical Policy Studies 8 (3): 340–7. doi:
10.1080/19460171.2014.926826.
Schneider, Jens-Peter. 2017. ‘Developments in European Data Protection Law in the
Shadow of the NSA-Affair’. In Privacy and Power: A Transatlantic Dialogue in the Shadow
Sherman, Justin. 2022. ‘This Year, Russia’s Internet Crackdown Will Be Even Worse’. Atlantic
Council. 13 January 2022. https://www.atlanticcouncil.org/blogs/new-atlanticist/this-
year-russias-internet-crackdown-will-be-even-worse/, accessed 4 July 2022.
Sieghart, Paul. 1976. Privacy and Computers. London: Latimer New Dimensions.
Simon, Morgan. 2020. ‘Investing in Immigrant Surveillance: Palantir and the #NoTech-
ForICE Campaign’. Forbes, 15 January 2020. https://www.forbes.com/sites/
morgansimon/2020/01/15/investing-in-immigrant-surveillance-palantir-and-the-
notechforice-campaign/, accessed 4 July 2022.
Singer, Peter W., and Ian Wallace. 2014. ‘Secure the Future of the Internet’. Brookings. Jan-
uary. https://www.brookings.edu/research/secure-the-future-of-the-internet/, accessed
4 July 2022.
Slaughter, Anne-Marie. 2005. ‘Security, Solidarity, and Sovereignty: The Grand Themes of
UN Reform’. American Journal of International Law 99 (3): 619–31.
Smale, Alison. 2013. ‘Anger Growing among Allies on U.S. Spying’. The New York Times,
23 October 2013. https://www.nytimes.com/2013/10/24/world/europe/united-states-
disputes-reports-of-wiretapping-in-Europe.html, accessed 4 July 2022.
Smith, Adam. 1999. The Wealth of Nations: Books IV–V. Edited by Andrew S. Skinner.
Harmondsworth: Penguin.
Smith, Brad. 2017a. ‘The Need for a Digital Geneva Convention’. Microsoft on the
Issues, February. https://blogs.microsoft.com/on-the-issues/2017/02/14/need-digital-
geneva-convention/, accessed 4 July 2022.
Smith, Brad. 2017b. ‘US Supreme Court Will Hear Petition to Review Microsoft Search
Warrant Case While Momentum to Modernize the Law Continues in Congress’.
Microsoft on the Issues. 16 October 2017. https://blogs.microsoft.com/on-the-issues/
2017/10/16/us-supreme-court-will-hear-petition-to-review-microsoft-search-warrant-
case-while-momentum-to-modernize-the-law-continues-in-congress/, accessed 4 July
2022.
Smith, Brad. 2018a. ‘The CLOUD Act Is an Important Step Forward, but Now More Steps
Need to Follow’. Microsoft on the Issues, April. https://blogs.microsoft.com/on-the-
issues/2018/04/03/the-cloud-act-is-an-important-step-forward-but-now-more-steps-
need-to-follow/, accessed 4 July 2022.
Smith, Brad. 2018b. ‘34 Companies Stand up for Cybersecurity with a Tech Accord’.
Microsoft on the Issues, April. https://blogs.microsoft.com/on-the-issues/2018/04/17/
34-companies-stand-up-for-cybersecurity-with-a-tech-accord/, accessed 4 July 2022.
Smith, Brad. 2018c. ‘A Call for Principle-Based International Agreements to Govern Law
Enforcement Access to Data’. Microsoft on the Issues, September. https://blogs.microsoft.
com/on-the-issues/2018/09/11/a-call-for-principle-based-international-agreements-to-
govern-law-enforcement-access-to-data/, accessed 4 July 2022.
Smith, H. Jeff, Tamara Dinev, and Heng Xu. 2011. ‘Information Privacy Research: An
Interdisciplinary Review’. MIS Quarterly 35 (4): 989–1016.
Smith v. Maryland. 1979. 442 U.S. 735. Supreme Court of the US.
Snowden, Edward. 2014. ‘Testimony to the European Parliament’. https://www.techdirt.
com/2014/03/07/snowden-gives-testimony-to-european-parliament-inquiry-into-
mass-surveillance-asks-eu-asylum/.
Snowden, Edward. 2019. Permanent Record. London: Macmillan.
Solove, Daniel J. 2002. ‘Conceptualizing Privacy’. California Law Review 90 (4): 1087–1155.
doi: 10.2307/3481326.
Solove, Daniel J. 2007. ‘I’ve Got Nothing to Hide and Other Misunderstandings of Privacy’.
San Diego Law Review 44: 745–72.
Solove, Daniel J. 2008. Understanding Privacy. Cambridge, MA: Harvard University Press.
Solove, Daniel J., and Woodrow Hartzog. 2014. ‘The FTC and the New Common Law of
Privacy’. Columbia Law Review 114: 583–676.
Sorell, Tom. 2016. ‘Law and Equity in Hobbes’. Critical Review of International Social and
Political Philosophy 19 (1): 29–46.
Spanish Delegation to the Council of the EU. 2015. ‘Information by the Commission on
the PNR Legislation Adopted by Mexico and the Republic of Argentina Requesting the
Transfer of PNR Data from the EU’. 6857/15. Brussels. http://data.consilium.europa.eu/
doc/document/ST-6857-2015-INIT/en/pdf.
Spar, Debora L. 1999. ‘Lost in (Cyber)Space: The Private Rules of Online Commerce’. In
Private Authority and International Affairs, edited by A. Claire Cutler, Virginia Haufler,
and Tony Porter, 31–51. SUNY Series in Global Politics. Albany, NY: State University of
New York Press.
Steger, Manfred B. 2013. ‘Political Ideologies in the Age of Globalization’. In The Oxford
Handbook of Political Ideologies, edited by Michael Freeden and Marc Stears, 214–30.
Oxford: Oxford University Press.
Steger, Manfred B., and Erin K. Wilson. 2012. ‘Anti-Globalization or Alter-Globalization?
Mapping the Political Ideology of the Global Justice Movement’. International Studies
Quarterly 56 (3): 439–54.
Stevenson, Alexandra. 2018. ‘Facebook Admits It Was Used to Incite Violence in Myan-
mar’. The New York Times, 6 November 2018. https://www.nytimes.com/2018/11/06/
technology/myanmar-facebook.html.
Stigler, George J. 1980. ‘An Introduction to Privacy in Economics and Politics’. The Journal
of Legal Studies 9 (4): 623–44.
Strange, Susan. 1996. The Retreat of the State: The Diffusion of Power in the World Econ-
omy. Cambridge Studies in International Relations. Cambridge: Cambridge University
Press.
Suda, Yuko. 2013. ‘Transatlantic Politics of Data Transfer: Extraterritoriality, Counter-
Extraterritoriality and Counter-Terrorism’. Journal of Common Market Studies 51 (4):
772–88.
Suda, Yuko. 2017. The Politics of Data Transfer: Transatlantic Conflict and Cooperation over
Data Privacy. Routledge Studies in Global Information, Politics and Society. New York
and Abingdon: Taylor & Francis.
Supreme Court of the US. 2018. ‘Oral Hearing in the Supreme Court of the United
States United States, Petitioner, v. Microsoft Corporation, Respondent’. No 17–2.
Washington DC.
Susen, Simon. 2014. ‘Towards a Dialogue between Pierre Bourdieu’s “Critical Sociology”
and Luc Boltanski’s “Pragmatic Sociology of Critique”’. In The Spirit of Luc Boltanski:
Essays on the ‘Pragmatic Sociology of Critique’, edited by Bryan S. Turner and Simon
Susen, 313–48. London: Anthem Press.
Susen, Simon. 2015. ‘Towards a Critical Sociology of Dominant Ideologies: An Unexpected
Reunion between Pierre Bourdieu and Luc Boltanski’. Cultural Sociology 10 (2): 195–246.
doi: 10.1177/1749975515593098.
Susen, Simon, and Bryan S. Turner, eds. 2014. The Spirit of Luc Boltanski: Essays on the
‘Pragmatic Sociology of Critique’. London: Anthem Press.
Svantesson, Dan. 2016. ‘Enforcing Privacy across Different Jurisdictions’. In Enforcing
Privacy: Regulatory, Legal and Technological Approaches, edited by David Wright
and Paul de Hert, 195–222. Law, Governance and Technology Series. Dordrecht:
Springer.
REFERENCES 273
Svantesson, Dan Jerker B. 2015a. ‘Limitless Borderless Forgetfulness? Limiting the Geo-
graphical Reach of the “Right to Be Forgotten”’. Oslo Law Review 2 (2): 116–38. doi:
10.5617/oslaw2567.
Svantesson, Dan Jerker B. 2015b. ‘Extraterritoriality and Targeting in EU Data Privacy Law:
The Weak Spot Undermining the Regulation’. International Data Privacy Law 5 (4): 226–
34. doi: 10.1093/idpl/ipv024.
SWIFT. 2006. ‘Executive Summary of SWIFT’s Response to the Belgian Privacy Commis-
sion’s Advisory Opinion 37/2006 of 27 September 2006’. La Hulpe, Belgium. https://www.
swift.com/node/14941, accessed 4 July 2022.
SWIFT. 2007. ‘SWIFT Board Approves Messaging Re-Architecture’. SWIFT, October.
https://www.swift.com/insights/press-releases/swift-board-approves-messaging-re-
architecture, accessed 4 July 2022.
SWIFT. 2022. ‘SWIFT’. 2022. https://www.swift.com/about-us, accessed 4 July 2022.
Swire, Peter P., and Robert E. Litan. 1998a. None of Your Business: World Data Flows,
Electronic Commerce, and the European Privacy Directive. Washington DC: Brookings
Institution Press.
Swire, Peter P., and Robert E. Litan. 1998b. ‘Avoiding a Showdown over EU Privacy Laws’.
Brookings Policy Brief 29. https://www.brookings.edu/research/avoiding-a-showdown-
over-eu-privacy-laws/, accessed 4 July 2022.
Szpunar, Maciej M. 2019. ‘Conclusions de l’Avocat Général: Affaire C-507/17’. Court
of Justice of the European Union. https://eur-lex.europa.eu/legal-content/FR/TXT/
?uri=CELEX:62017CC0507.
Tapscott, Don. 1996. The Digital Economy: Promise and Peril in the Age of Networked
Intelligence. Vol. 1. New York: McGraw-Hill.
Tarrow, Sidney. 2005. The New Transnational Activism. Cambridge Studies in Contentious
Politics. Cambridge: Cambridge University Press.
T-CY. 2014. ‘T-CY Guidance Note # 3 Transborder Access to Data (Article 32) Adopted
by the 12th Plenary of the T-CY (2–3 December 2014)’. T-CY (2013)7 E. Strasbourg:
Council of Europe.
T-CY. 2016a. ‘Criminal Justice Access to Data in the Cloud: Cooperation with “Foreign”
Service Providers. Background Paper Prepared by the T-CY Cloud Evidence Group’. T-
CY (2016)2. Strasbourg: Council of Europe.
T-CY. 2016b. ‘Criminal Justice Access to Electronic Evidence in the Cloud: Recommenda-
tions for Consideration by the T-CY. Final Report of the T-CY Cloud Evidence Group’.
Strasbourg.
Thévenot, Laurent, Michael Moody, and Claudette Lafaye. 2000. ‘Forms of Valuing Nature:
Arguments and Modes of Justification in French and American Environmental Disputes’.
In Rethinking Comparative Cultural Sociology: Repertoires of Evaluation in France and
the United States, edited by Michèle Lamont, 229–72. Cambridge: Cambridge University
Press.
Thierer, Adam. 2016. Permissionless Innovation: The Continuing Case for Compre-
hensive Technological Freedom. Washington DC: Mercatus Center, George Mason
University.
Tippmann, Sylvia, and Julia Powles. 2015. ‘Google Accidentally Reveals Data on “Right
to Be Forgotten” Requests’. The Guardian, 14 July 2015. http://www.theguardian.com/
technology/2015/jul/14/google-accidentally-reveals-right-to-be-forgotten-requests,
accessed 4 July 2022.
Tjong Tjin Tai, T.F.E. 2016. ‘The Right to Be Forgotten: Private Law Enforcement’. Interna-
tional Review of Law, Computers & Technology 30 (1–2): 76–83.
Tocci, Nathalie. 2017. ‘From the European Security Strategy to the EU Global Strategy:
Explaining the Journey’. International Politics 54 (4): 487–502. doi: 10.1057/s41311-017-
0045-9.
Seger, Alexander. 2019. ‘Towards a Protocol to the Budapest Convention’. Bucharest,
February 25.
Trauner, Florian, and Helena Carrapiço, eds. 2012. ‘The External Dimension of EU Justice
and Home Affairs: Post-Lisbon Governance Dynamics’. European Foreign Affairs Review,
Special Issue, 17 (5): 1–18.
Tynan, Dan. 2018. ‘Silicon Valley Finally Pushes for Data Privacy Laws at Senate Hearing’.
The Guardian, 26 September 2018. https://www.theguardian.com/technology/2018/
sep/26/silicon-valley-senate-commerce-committee-data-privacy-regulation, accessed 4
July 2022.
Tzanou, Maria. 2017. The Fundamental Right to Data Protection: Normative Value in the
Context of Counter-Terrorism Surveillance. London: Bloomsbury Publishing.
UK Home Office. 2019. ‘Online Harms White Paper’. https://www.gov.uk/government/
consultations/online-harms-white-paper/online-harms-white-paper, accessed 4 July
2022.
Ulbricht, Lena. 2018. ‘When Big Data Meet Securitization: Algorithmic Regulation with
Passenger Name Records’. European Journal for Security Research 3 (2): 139–61.
UN. 2011. ‘UN Guiding Principles on Business and Human Rights’. HR/PUB/11/04.
Unabhängiges Landeszentrum für Datenschutz. 2018. Unabhängiges Landeszentrum für
Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH.
Court of Justice of the EU: Grand Chamber. Case C-210/16.
UNCTAD. 2016. ‘Data Protection Regulations and International Data Flows: Implications
for Trade and Development’. New York, Geneva.
UNCTAD. 2022. ‘Data Protection and Privacy Legislation Worldwide’. 2022. https://
unctad.org/en/Pages/DTL/STI_and_ICTs/ICT4D-Legislation/eCom-Data-
Protection-Laws.aspx, accessed 4 July 2022.
UNGA. 1990. ‘Guidelines for the Regulation of Computerized Personal Data Files’.
A/RES/45/95.
UNGA. 2000. ‘United Nations Convention against Transnational Organized Crime and the
Protocols Thereto’. https://www.unodc.org/unodc/en/organized-crime/intro/UNTOC.
html, accessed 4 July 2022.
UNGA. 2013. ‘The Right to Privacy in the Digital Age’. A/RES/68/167. https://documents-
dds-ny.un.org/doc/UNDOC/GEN/N13/449/47/PDF/N1344947.pdf?OpenElement,
accessed 4 July 2022.
UNHRC. 2018. ‘Report of the Special Rapporteur on the Right to Privacy’. A/73/45712.
https://digitallibrary.un.org/record/1656178.
United States v. Miller. 1976. 425 U.S. 435. Supreme Court of the US.
United States v. Microsoft Corp. 2018. Supreme Court of the US.
UNSC. 2017. ‘Resolution 2396 (2017)’. https://documents-dds-ny.un.org/doc/UNDOC/
GEN/N17/460/25/PDF/N1746025.pdf?OpenElement, accessed 4 July 2022.
UN Special Rapporteur on the Right to Privacy. 2017. ‘Brief Amicus Curiae of UN Special
Rapporteur on the Right to Privacy Joseph Cannataci in Support of Neither Party, United
States of America v. Microsoft Corporation’. 17–2. US Supreme Court.
US. 2017. ‘In the Matter of United States v. Microsoft Corp.: Petition for a Writ of Certiorari’.
17–2. United States Court of Appeals for the Second Circuit.
US Chamber of Commerce. 2013. ‘The Economic Importance of Getting Data Protec-
tion Right: Protecting Privacy, Transmitting Data, Moving Commerce’. https://www.
uschamber.com/sites/default/files/documents/files/020508_EconomicImportance_
Final_Revised_lr.pdf, accessed 4 July 2022.
US Congress. 2015. ‘The House Energy and Commerce Subcommittees on Commerce,
Manufacturing and Trade and Communications & Technology: U.S.-EU Safe Harbor
Framework’. Washington DC.
US Department of Commerce. 2000a. ‘Safe Harbor Workbook’. https://2016.export.gov/
safeharbor/eg_main_018238.asp.
US Department of Commerce. 2000b. ‘Safe Harbor Privacy Principles’. OJ L25/10.
US Department of Commerce. 2016. ‘EU–U.S. Privacy Shield Framework Principles Issued
by the U.S. Department of Commerce’. https://www.privacyshield.gov/Privacy-Shield-
Principles-Full-Text, accessed 4 July 2022.
US Department of Justice. 2019. ‘Promoting Public Safety, Privacy, and the Rule of Law
around the World: The Purpose and Impact of the CLOUD Act: White Paper’. https://
www.justice.gov/opa/press-release/file/1153446/download, accessed 4 July 2022.
US Department of the Treasury. 2002. ‘Passenger Name Record Information Required for
Passengers on Flights in Foreign Air Transportation to or From the United States’. Federal
Register Vol. 67, No. 122. https://www.govinfo.gov/content/pkg/FR-2002-06-25/pdf/02-
15935.pdf, accessed 4 July 2022.
US Department of the Treasury. 2007a. ‘Letter from United States Department of the
Treasury Regarding SWIFT/Terrorist Finance Tracking Programme’. OJ 2007/C 166/08.
US Department of the Treasury. 2007b. ‘Processing of EU Originating Personal Data by
United States Treasury Department for Counter Terrorism Purposes: “SWIFT” (2007/C
166/09) Terrorist Finance Tracking Program: Representations of the United States
Department of the Treasury’. OJ C 166/18.
US Department of the Treasury. 2022. ‘Terrorist Finance Tracking Program (TFTP)’. 2022.
https://home.treasury.gov/policy-issues/terrorism-and-illicit-finance/terrorist-finance-
tracking-program-tftp, accessed 4 July 2022.
US Embassy Berlin. 2006a. ‘Scenesetter: FBI Director Mueller’s Berlin Visit’. Wikileaks
Public Library of US Diplomacy, 06BERLIN2303_a. https://wikileaks.org/plusd/cables/
06BERLIN2303_a.html, accessed 4 July 2022.
US Embassy Berlin. 2006b. ‘DHS A/S Baker Engages on PNR, Seeks Greater CT Info Shar-
ing’. Wikileaks Public Library of US Diplomacy, 06BERLIN3173_a. https://wikileaks.org/
plusd/cables/06BERLIN3173_a.html, accessed 4 July 2022.
US Embassy Dublin. 2006. ‘Next Steps on TFTP: Ireland Takes Hard Line on SWIFT’. Wik-
ileaks Public Library of US Diplomacy, 06DUBLIN1396_a. https://wikileaks.org/plusd/
cables/06DUBLIN1396_a.html, accessed 4 July 2022.
US Embassy London. 2006. ‘Terrorism Finance: Next Steps on the TFTP’. Wikileaks Pub-
lic Library of US Diplomacy, 06LONDON8247_a. https://wikileaks.org/plusd/cables/
06LONDON8247_a.html, accessed 4 July 2022.
US Mission to the EU. 2007. ‘European Commission Outlines PNR-Style SWIFT Issue’.
07BRUSSELS253_a. Brussels.
US Mission to the EU. 2009. ‘European Parliament: Introducing the New Civil Liberties,
Justice, and Home Affairs Committee’. Wikileaks Public Library of US Diplomacy, Brus-
sels 001105. https://wikileaks.org/plusd/cables/09BRUSSELS1105_a.html, accessed 4
July 2022.
US Mission to the EU. 2015. ‘Data Privacy and Protection’, December. https://web.archive.
org/web/20151231220650/http://useu.usmission.gov/st-09282015.html, accessed 4 July
2022.
US Patriot Act. 2001. ‘Uniting and Strengthening America by Providing Appropriate Tools
Required to Intercept and Obstruct Terrorism (USA PATRIOT ACT) Act of 2001’.
US White House. 1997. A Framework for Global Electronic Commerce. https://
clintonwhitehouse4.archives.gov/WH/New/Commerce/read.html.
US White House. 2011. ‘National Strategy for Counterterrorism’. https://obamawhitehouse.
archives.gov/blog/2011/06/29/national-strategy-counterterrorism, accessed 4 July
2022.
US White House. 2014. ‘Presidential Policy Directive 28: Signals Intelligence Activ-
ities’. PPD-28. https://obamawhitehouse.archives.gov/the-press-office/2014/01/17/
presidential-policy-directive-signals-intelligence-activities, accessed 4 July 2022.
Vanbever, Francis. 2006. ‘SWIFT Statement: Francis Vanbever, Chief Financial Officer,
Member of the Executive Committee, SWIFT’. European Parliament Hearing, 4 October
2006.
Vauchez, Antoine, and Bruno de Witte. 2013. Lawyering Europe: European Law as a
Transnational Social Field. Oxford and Portland, OR: Hart Publishing.
Veljanovski, Cento. 2010. ‘Economic Approaches to Regulation’. In The Oxford Handbook of
Regulation, edited by Robert Baldwin, Martin Cave, and Martin Lodge, 17–38. Oxford:
Oxford University Press.
Viola, Lora Anne, and Paweł Laidler. 2022. Trust and Transparency in an Age of Surveillance.
Abingdon, New York: Routledge.
Volokh, Eugene. 1999. ‘Freedom of Speech and Information Privacy: The Troubling Impli-
cations of a Right to Stop People from Speaking about You’. Stanford Law Review 52:
1–65.
Voss, W. Gregory. 2016. ‘The Future of Transatlantic Data Flows: Privacy Shield or Bust?’
Journal of Internet Law 19 (11): 1, 9–18.
Wacquant, Loïc, and Aksu Akçaoğlu. 2017. ‘Practice and Symbolic Power in Bour-
dieu: The View from Berkeley’. Journal of Classical Sociology 17 (1): 55–69. doi:
10.1177/1468795X16682145.
Walker, Clive. 2018. ‘Counter-Terrorism Financing: An Overview’. In The Palgrave Hand-
book of Criminal and Terrorism Financing Law, edited by Colin King, Clive Walker, and
Jimmy Gurulé, 737–53. Cham: Palgrave Macmillan.
Walker, Kent. 2016. ‘A Principle That Should Not Be Forgotten’. Google, May. https://blog.
google/around-the-globe/google-europe/a-principle-that-should-not-be-forgotten/,
accessed 4 July 2022.
Walker, Kent. 2017. ‘Defending Access to Lawful Information at Europe’s Highest
Court’. Google, November. https://www.blog.google/around-the-globe/google-europe/
defending-access-lawful-information-europes-highest-court/, accessed 4 July 2022.
Walker, Neil. 2008. ‘Taking Constitutionalism beyond the State’. Political Studies 56 (3):
519–43.
Waltz, Kenneth N. 2010. Theory of International Politics. Long Grove, IL: Waveland Press.
Walzer, Michael. 2008. Spheres of Justice: A Defense of Pluralism and Equality. New York:
Basic Books.
Warren, Samuel D., and Louis D. Brandeis. 1890. ‘The Right to Privacy’. Harvard Law
Review 4 (5): 193–220.
Weiss, Martin A., and Kristin Archick. 2016. ‘US–EU Data Privacy: From Safe Harbor to
Privacy Shield’. Congressional Research Service.
Weiss, Moritz. 2008. ‘The “Political Economy of Conflicts”: A Window of Opportunity for
CFSP?’ Journal of Contemporary European Research 4 (1): 1–17.
Weldes, Jutta. 1999. Cultures of Insecurity: States, Communities, and the Production of
Danger. Vol. 14. Minneapolis, MN: University of Minnesota Press.
Wensink, Wim, Bas Warmenhoven, Roos Haasnoot, Rob Wesselink, Bibi van Ginkel,
Stef Wittendorp, Christophe Paulussen, et al. 2017. ‘The European Union’s Policies
on Counter-Terrorism: Relevance, Coherence and Effectiveness’. Policy Department C:
Citizens’ Rights and Constitutional Affairs, European Parliament.
Wesseling, Mara. 2016. ‘An EU Terrorist Finance Tracking System’. Occasional Paper. Royal
United Services Institute for Defence and Security Studies.
Wesseling, Mara, Marieke de Goede, and Louise Amoore. 2012. ‘Data Wars beyond Surveil-
lance: Opening the Black Box of SWIFT’. Journal of Cultural Economy 5 (1): 49–66.
Westin, Alan F. 1967. Privacy and Freedom. New York: Atheneum.
Williams, Michael C. 2003. ‘Words, Images, Enemies: Securitization and Interna-
tional Politics’. International Studies Quarterly 47 (4): 511–31. doi: 10.1046/j.0020-
8833.2003.00277.x.
Wittgenstein, Ludwig. 2001. Philosophische Untersuchungen. Edited by Joachim Schulte,
Heikki Nyman, Eike von Savigny, and Georg Henrik von Wright. Frankfurt am Main:
Suhrkamp.
Wong, Julia Carrie. 2018. ‘Microsoft Revenue Exceeds $100bn Boosted by Cloud Ser-
vices’. The Guardian, 20 July 2018. https://www.theguardian.com/technology/2018/jul/
19/microsoft-earnings-cloud-services-revenue, accessed 4 July 2022.
Yeung, Karen. 2018. ‘Algorithmic Regulation: A Critical Interrogation’. Regulation & Gov-
ernance 12 (4): 505–23. doi: 10.1111/rego.12158.
Die Zeit. 2016. ‘Microsoft: US-Regierung darf nicht auf Kundendaten im Ausland zugreifen’.
Die Zeit, 14 July 2016. https://www.zeit.de/digital/datenschutz/2016-07/microsoft-
datenschutz-us-regierung, accessed 4 July 2022.
Zhao, Bo, and Weiquan Chen. 2019. ‘Data Protection as a Fundamental Right: The European
General Data Protection Regulation and Its Extraterritorial Application in China’. US–
China Law Review 16 (3): 97–113. doi: 10.17265/1548-6605/2019.03.002.
Zhao, Bo, and Jeanne Mifsud-Bonnici. 2016. ‘Protecting EU Citizens? Personal Data in
China: A Reality or a Fantasy?’ International Journal of Law and Information Technology
24 (2): 128–50. doi: 10.1093/ijlit/eaw001.
Zittrain, Jonathan. 2008. The Future of the Internet and How to Stop It. New Haven, CT:
Yale University Press.
Zuboff, Shoshana. 2018. The Age of Surveillance Capitalism: The Fight for a Human Future
at the New Frontier of Power. New York: Public Affairs.
Zürn, Michael, Martin Binder, Matthias Ecker-Ehrhardt, and Katrin Radtke. 2007. ‘Poli-
tische Ordnungsbildung wider Willen’. Zeitschrift für Internationale Beziehungen 14 (1):
129–64.
Zürn, Michael, and Pieter de Wilde. 2016. ‘Debating Globalization: Cosmopolitanism and
Communitarianism as Political Ideologies’. Journal of Political Ideologies 21 (3): 280–301.
doi: 10.1080/13569317.2016.1207741.
Regulation (EU) 2016/679 of The European Parliament and of The Council of 27 April
2016 on the protection of natural persons with regard to the processing of personal data
and on the free movement of such data, and repealing Directive 95/46/EC, Pub. L. No.
OJ L119/1 (2016).
Index
9/11 terrorist attacks 32, 56, 59t, 68, 84, 104–8, 116, 126, 129, 131, 146, 148–9, 209
29WP, see Article 29 Working Party
1995 Data Protection Directive 2, 42t, 52, 54–5, 58, 60t, 74–5, 77, 79, 87, 95, 99–100, 109, 111, 133, 137, 162, 177, 179, 182
  reform 99–102, see also General Data Protection Regulation
  and Safe Harbour 54–5
2004/82/EC Carrier Directive 108
Abbott, Andrew 3, 8, 18, 21–2, 31, 205, 209
adequacy 2, 79, 85–92, 96–103, 108–9, 112–13, 122, 213–14, 219
  CJEU definition 2, 91, 97–103
  decision 79, 85–92, 96–7, 112–13, 213–14, 219
  invalidation based on 91–2, 96–9, 103
Advanced Passenger Information 108–9
African Union 55
  Convention on Cybersecurity and Personal Data Protection 42, 55, 63, 65–6, 68–9, 71, 73
AFSJ, see Area of Freedom, Security and Justice
Agencia Española de Protección de Datos 178, 208
agency 10–11, 18, 21, 24, 65, 206
  reflexivity 24–7, 40, 44
airlines 14, 56, 102, 104, 106, 108–10, 112–16, 124, 127, 153
algorithm 10, 176, 188
  regulation 10, 19, 44
  Science and Technology Studies 10, 19
Alphabet, see Google
AML, see anti-money laundering
anonymization (of data) 5, 119
anti-money laundering 130–4, 149–50
  AML Directive 141
  FATF 149–50
APEC, see Asia-Pacific Economic Cooperation Privacy Framework
API, see Advanced Passenger Information
Aradau, Claudia 10, 19, 67–8
Area of Freedom, Security and Justice 118
Article 19 43
Article 29 Working Party 63–4, 83, 109, 111–12, 116, 126–7, 133–4, 137, 150, 155, 171, 183–4, 201
ASEAN, see Association of Southeast Asian Nations Framework on Personal Data Protection
Asia-Pacific Economic Cooperation Privacy Framework 42t, 55, 64, 65, 66, 68, 71, 73, 75
Association of Southeast Asian Nations Framework on Personal Data Protection 42, 44, 55, 62, 63, 65, 68, 71, 75
ATSA, see Aviation and Transportation Security Act
Aviation and Transportation Security Act 104, 106, 108–11, 115, 121
aviation security 104, 108–9
banks 106, 132, 134, 138, 146, 150, 153
Belgian Society for Worldwide Interbank Financial Telecommunications, see SWIFT
Bennett, Colin 2, 6, 53–4, 75, 175
Bigo, Didier 10, 22, 26, 34, 86, 104, 151, 153, 209
BMI, see Federal Ministry of the Interior (Germany)
Boltanski, Luc 21, 24, 27–9, 32–3, 36, 44, 143, 193, 210
  and Bourdieu, Pierre 10, 18, 21–4, 34–5, 197–8
  pragmatism 40, 54
  sociology of critique 18, 24, 27–30
  and Thévenot, Laurent 8, 10, 15, 18–19, 21, 24, 27–30, 32, 34, 36–7, 41, 44, 48, 51, 64, 90, 95, 117, 130, 148, 151, 192, 197, 206, 209–10
Bourdieu, Pierre
  and Boltanski, Luc 10, 18, 21–4, 34–5, 197–8
  fields 8, 10–12, 17–18, 21–2, 24–6, 34, 39, 51, 89, 121, 130, 169, 197, 206, 212
Bradford, Anu 2, 13, 45, 62, 79, 99, 165, see also Brussels effect
Brussels effect 2, 13, 45, 62, 79, 99, 165, 218
Budapest Convention 154, 159, 168–71
Bureau of Customs and Border Protection (CBP) 112
digital sovereignty 2, 173, 216, see also sovereignty
Digital Switzerlands 157–8, 169, 189, 202
DPA, see Data Protection Authority
dystopian scenario 29, 52, 60t, 78, 83, 85, 90, 98, 103, 115, 120–2, 146, 161, 168, 180–6
Economic Community of West African States
  Supplementary Act A/SA.1/01/10 On Personal Data Protection Within ECOWAS 42t, 55, 66, 68, 71–3
EDPB, see European Data Protection Board
EDPS, see European Data Protection Supervisor
EDRi, see European Digital Rights
electronic evidence (e-evidence)
  Council of Europe 167–9
  definition of 152–6
  legislative frameworks 164–72
  Mutual Legal Assistance Treaties (MLAT) 152–5
Electronic Privacy Information Center (EPIC) 54, 158, see also non-governmental organizations
European Commission 14, 48, 78–88, 91–103, 107, 109–24, 135–42, 144–8, 162–77, 179–86, 202–4, 209, 213–14, 219
  DG JUST (Justice) 1, 80, 85, 135, 177, 180–1, 190–1
  internal market 54, 135
  as negotiator 94, 172
European Commission and DHS Joint Statement 56, 111, 123
European Data Protection Board 64, 100
European Data Protection Supervisor 64, 101, 116–17, 122, 124, 130, 138–40, 144, 148, 151, 155, 201, 204, 211
European Digital Rights Initiative 119, 124, 158, 169, 171
European Parliament 9, 14–15, 20, 82–3, 85–6, 90, 92–6, 102–7, 111–28, 130–46, 168–70, 201, 206
  Committee on Civil Liberties, Justice, and Home Affairs (LIBE) 82–4, 113, 118, 122, 123, 138–9, 142, 145, 151, 171
  LIBE hearing 82–4, 142, 145
  Lisbon Treaty 107–8, 117–22, 129, 138–41, 206–7
  MEPs 119, 122–8, 139, 141, 144–5
  resolutions 9, 111–12, 118, 130, 133–6, 138–9, 144, 168–9
  responsibility 9, 14, 117–24, 140
Europol 107, 140, 144–5, 151, 153–5
  Information System (EIS) 129
  Joint Supervisory Body (JSB) 144–5, 147
EU-US agreement on personal data protection, see Umbrella Agreement
EU-US PNR Agreement 114, 116–23, 126, 130, 139, see also Passenger Name Records
extraterritoriality 12, 15, 56, 71, 74–6, 79, 81–2, 99, 100, 103, 133, 152, 156–62, 163, 165, 167, 172, 184–9, 193, 201, 211–12
  definition of 3
Facebook 51, 54, 56–7, 83, 87, 90, 96–7, 166, 169, 173, 179, 183, 190, 191, 194, 195, 202, 216
  Oversight Board 190
fairness (order of) 14–15, 28, 37, 44, 52, 58–9t, 61–3, 71, 76, 78–80, 82–6, 90–2, 94–5, 97–9, 101, 103, 109, 113–14, 116–20, 122, 134–7, 142, 145–6, 151, 158, 161–2, 168–9, 174–6, 180–2, 184–7, 192–4, 196–8f, 200–2, 204–5, 210, 212, 227
Farrell, Henry 6, 54–5
  and Newman, Abraham 4, 6–7, 9–11, 19–20, 45, 56, 77, 79, 83, 88, 90, 94–5, 103, 105–8, 110–13, 115, 118–19, 121, 123, 130, 135–6, 138–40, 153, 204, 206, 209, 212
Federal Ministry of the Interior (Germany) 86, 114–15, 122
Federal Trade Commission (FTC) 70, 80, 94, 101, 183, 204
field
  Bourdieu 8, 10–12, 17–18, 21–2, 24–6, 34, 39, 51, 89, 121, 130, 169, 197, 206, 212
  genesis of the 41–4, 51–8, 75–6
  theory 24–6, 34
Financial Action Task Force (FATF) 149–50
FISA, see Foreign Intelligence Surveillance Act
Five Eyes network 81, 86, 125
Fligstein, Neil 26
  and McAdam, Doug 10, 13, 17–18, 24, 26, 34–5, 38, 78, 206, 209, 212
  strategic action fields 26, 34, 86
Foreign Intelligence Surveillance Act (FISA) 82, 96, 107
fragmentation
  of data governance regimes 55, 74, 76, 123, 171–2, 187–8, 204, see also data localization
  of financial security 149
  in international law 6
  of the internet 216–20
  of law enforcement 171–2
  threat of 53, 60t, 74, 85, 123, 204, see also harmonization
G7 149, 154
  Joint Declaration by G7 ICT Ministers 42t, 61, 65
legal (un)certainty 66, 93, 95, 110, 113–14, 120, 127, 156–7, 166, 173, 202
liberalism 61–2, 65, 67, 199–200, 202
  liberal order 4, 58, 215
  liberal values 49, 62–3, 67, 69–70, 123, 201
  local liberalism 12, 184, 200t–2, 205
Lisbon Treaty 15, 55, 88, 107, 118, 122, 134, 139, 141, 206, 209
  empowerment of the European Parliament 118, 122, 134, 141, 206, 209
lobbying 100, 139, 189–90, 203, 216
Lyon, David 5, 63, 77, 81, 128
MAXQDA 41, 48–9, 222–8
McAdam, Doug
  and Fligstein, Neil 10, 13, 17–18, 24, 26, 34–5, 38, 78, 206, 209, 212
Mérand, Frédéric 10, 24, 25, 41, 54, 78
Meta, see Facebook
Microsoft 15, 46t, 53, 56, 83, 87, 152–73, 185, 188–90, 194, 202–3, 207t, 209
  as norm entrepreneur 157, 190
  Smith, Brad 157–8, 166, 181, 189, 203, 210
Mitsilegas, Valsamis 6, 9, 10, 145
money laundering, see anti-money laundering
Mutual Legal Assistance Treaties (MLAT) 152–5
Newman, Abraham L. 2, 6, 9, 19, 45, 53, 55–6, 66, 70, 72, 75, 77–9, 81, 99–100, 103
  and Farrell, Henry 4, 6–7, 9–11, 19–20, 45, 56, 77, 79, 83, 88, 90, 94–5, 103, 105–8, 110–13, 115, 118–19, 121, 123, 130, 135–6, 138–40, 153, 204, 206, 209, 212
New York Times, the 1, 2, 129–30, 133, 143, 157
Nissenbaum, Helen 5, 114, 116
non-governmental organizations 54
  Access Now 54
  Electronic Frontier Foundation 54
  Electronic Privacy Information Center 54, 158
  European Digital Rights (EDRi) 119, 124, 158, 169, 171
  noyb 97, 201, see also Schrems, Maximilian
  privacy activists 49, 54, 103
  Statewatch 54
NSO Group 4, 217–18
Obama, Barack 66, 83–4, 168
  administration of 83–4
objects 11, 18, 21–2, 52, 82, 91, 102, 104, 122, 130, 162, 192–3, 197
  composite 36–7, 49, 130, 146, 151, 197, 210
  construction of 12, 18–19, 21–3, 102
  epistemic 12, 199, 201
  of governance 21–3, 51, 102, 116, 193, 200
  valued 59t–60t, 82, 91, 101, 120, 137, 192–3, 197
OECD, see Organization for Economic Co-Operation and Development
ombudsperson 95, 97–8, 124, 145–6
  Office of the Ombudsperson (EU) 124, 145–6
  Privacy Shield 95, 97–8
open internet 66, 165, 200t
orders of worth, see value orders
Organization for Economic Co-Operation and Development 154
  Guidelines on the Protection of Privacy and Transborder Flows of Personal Data
    updated guidelines 42t, 51, 53–6, 59t, 65–7, 71–5
Organization of American States
  Comprehensive Inter-American Cybersecurity Strategy 42t, 69, 74
oversight mechanisms 15, 63, 73, 86, 96, 103, 129–32, 136, 140–1, 144–6, 148, 151, 157, 190, 210, 213, 219
  hearing 82–4, 89, 93, 142
  review 15, 47, 69, 85, 89–91, 95–6, 103, 120, 126–7, 130–1, 138–9, 141, 143–9, 151–2, 166, 170, 191, 206, 210, 213, 218
Palantir 151
passenger name records (PNR) 14, 95, 101, 104–28, 129–30, 136, 139–41, 207t–10, 212
  Canada-EU agreement 126, 212
  European PNR Directive 101–2, 104, 108, 110, 121–6
  review 120, 126–7, 130
Patriot Act 107–8
Pegasus, see NSO Group
PNR, see passenger name records
political philosophy 14, 41, 44, 52, 59t, 60, 62, 67, 70
pragmatism 40, 54
Presidential Policy Directive 83–4, 88–9, 96
PRISM Program 81–2, 87, 166, see also surveillance
Pritzker, Penny 93
privacy
  activists 49, 54, 103, see also Bennett, Colin John
  and data governance 49
  and data protection 4–5, 52–3
  history of 4–5, 52–3
Privacy and Civil Liberties Oversight Board 96
Privacy International 54, 75, 121, 212, 213, 219
Terrorist Finance Tracking Program 46t, 95, 108, 122, 129–150, 206, 207t–8t, 209–10
  review procedure 146–9
Terrorist Finance Tracking System 140
TFTP, see Terrorist Finance Tracking Program
TFTS, see Terrorist Finance Tracking System
Thévenot, Laurent 28, 44
  and Boltanski, Luc 8, 10, 15, 18–19, 21, 24, 27–30, 32, 34, 36–7, 41, 44, 48, 51, 64, 90, 95, 117, 130, 148, 151, 192, 197, 206, 209–10
  sociology of critique 18, 24, 27–30
transparency 5, 81–2, 95, 130, 141, 167, 206, 209–10, 213
  of private companies 1, 66, 155–6, 166, 210
  of public actors 69, 120, 127–31, 142, 143–6, 148–9, 151, 170–1, 213, 218
  transparency reports 66, 155–6, 210
Trump, Donald 51
  administration of 96, 100, 164, 216
trust 12, 32, 59t, 65–6, 85, 99, 145, 168, 172, 200t, 202
Umbrella Agreement 42t, 69, 72, 74, 95, 101, 108, 155, 204
United Kingdom 1, 53, 79, 81, 88, 116, 123, 135, 142, 171, 191, 213–14
United Nations
  counterterrorism 42, 43, 59t, 68, 104, 107, 121, 126, 137, 149–50, see also security (order of)
  General Assembly Resolution 68/167 The Right to Privacy in the Digital Age 42t, 56, 59t, 61, 81, see also Special Rapporteur on the Right to Privacy
  Guidelines for the Regulation of Personal Data Files 42t, 53, 63, 65, 71
  Kadi case 99, 149
  Security Council 28, 43, 104, 126, 137, 149
  Security Council Resolution 2396 42, 43, 59t, 68, 104, 126
  Special Rapporteur on the Right to Privacy 56, 63, 81, 162–3, 200, 204
United States
  Congress 51, 93, 119–20, 136, 164, 214
  congressional hearing 51, 93
  data protection 213–14
  US Supreme Court 52–3
    Microsoft case 152, 154, 157–66, 170–2, 211–12
value orders, see also Boltanski, Luc and Thévenot, Laurent
  definition of 28, 32
  reconstruction 8, 28–9, 40–1, 44, 51–2, 58, 76
  tests 28, 32–4, 36, 143, 206, 210
visions of data governance 3, 6, 8, 10–13, 15, 17, 24, 37–41, 49–50, 58, 76, 101–2, 115, 125, 130, 135, 159, 172, 173, 184, 187, 196–206, 217
  metaphors of data 199
  typology of 200t–205
vulnerable groups 62, 67, 69, 106, 203
  children 69, 154, 160, 181
WhatsApp 1, 173, 216
whistle-blowers 63, 84, 103
Zuboff, Shoshana 10, 63, 174–5, 194, 202
Zürn, Michael 11, 24, 38, 70, 73, 199–200, 202