
Deleuze and the Digital: On the Materiality of Algorithmic Infrastructures

Dennis Mischke, University of Potsdam

Abstract
In his short and often quoted essay ‘Postscript on the Societies of
Control’, Gilles Deleuze famously describes the structures of power
in the dawning twenty-first century as driven by ‘machines of a third
type, computers’, as novel and predominantly digital infrastructures.
In fact, from a Deleuzian perspective the entire ecosystem of the
digital transformation can be described as a larger shift in modes of
production and the political economy. This essay proposes to read
this ‘technological evolution’ as the power of algorithms and their
material substance – digital infrastructures that entail a different mode
of interaction between humans and technology. In looking at these
infrastructures from a materialist position, my essay reconceptualises the
digital as the unfolding logic of assemblages that have been shaping a
‘long now’ of technological modernity. In bringing a Deleuzian reading
of infrastructures to the study of technology and society, this essay
seeks to shed a new light on the political function – and the increasing
abstraction – of infrastructures in the realm of the digital.

Keywords: digital materialism, critical infrastructure studies, culture and algorithms, societies of control, abstract machines, assemblages

But the machines don’t explain anything, you have to analyze the collective
arrangements of which the machines are just one component.

(Deleuze 1995: 175)

Deleuze and Guattari Studies 15.4 (2021): 593–609


DOI: 10.3366/dlgs.2021.0459
© Edinburgh University Press
www.euppublishing.com/dlgs

I am writing this text at a ‘pedal desk’ at the philological library of the Freie Universität Berlin – a piece of ‘human powered bike furniture’1
that provides the opportunity for physical exercise, while reading and
writing. As I ‘ride’ this desk, I sense a spinning wheel underneath my
seat and see that my movements charge the notebook that I plugged
into the socket. The desk transforms the kinetic energy I produce into
electricity. Is the pedal desk, I ask myself, a serious solution or rather
the caricature of a workplace? In more than one way, it confronts its
user with a question easily forgotten: What is the material basis of our
electronic lives and how does the materiality of our increasingly digital
infrastructures impact the way we read, think and write? We tend to
forget that our apparently immaterial digital spaces and devices come at
a price. The physical side of our telecommunication and information
technologies, in particular, is profoundly structured and determined
by its material and energetic connection to a physical reality of high
costs. Even the ‘green’ email that saves trees if left unprinted requires the production of electricity in considerable quantities. Unless this power is produced entirely from renewable energy, the contemporary slogan that ‘data is the new oil’ is not merely a metaphorical residue of industrial capitalism: the entropy of fossil-fuel extraction remains a decisive part of the make-up of the digital age (Parikka 2015).
The sociologist Armin Nassehi (2019) has recently made the point that
the similarities between data and oil are not only economic. Whereas a
‘carbonized creation of value’ (ibid. 190, my translation) underpinned
virtually any modern invention and innovation, the secret appeal of
the digital is its apparent lack of material boundaries. In the words
of Michael Betancourt: ‘It is an illusion that emerges in fantasies that
digital technology ends scarcity by aspiring to the state of information’
(Betancourt 2015: 37). By insinuating that digital environments, tools
and software products thrive in a domain of self-productivity, infinite
and ‘capable of creating value without expenditure’ (ibid. 37), the
narrative of many Silicon Valley companies often suggests that these
companies operate beyond the material realities of limited resources,
energy, labour time and other externalities of production. Uber, as we
know, owns no cabs, while profoundly disrupting the transportation
business; Airbnb advertises itself as a community network of hospitality,
while being one of the largest hotel businesses of the world. But of
course, despite apparently minimal marginal costs of production and
organisation, the immateriality of the digital remains a fundamental
delusion that is part of its foundational promise.
However, anyone who has run out of battery in a crucial situation
knows that electricity to charge phones and laptops – or, by extension,
energy-thirsty data centres – has to come from somewhere. A single
mobile device that is connected to the Internet and hooked up to a
‘cloud’ service in a data centre consumes four to five times as much
energy ‘remotely’ as it does locally (Hintemann and Fichter 2015).
The global network of data centres that host the algorithmic infrastructures of the digital age is predicted to consume about 20 per cent of global electricity by 2030 (Jones 2018). As long as these steadily rising amounts of energy are produced from burning
fossil fuels, the digital age will also necessarily increase the overall
entropy of its habitat – Earth. Yet, questions regarding the material
foundations of our global digital culture(s) do not stop at the conundrum
of matter/energy relations. As the cases of Uber and Airbnb make
clear: in talking about the material realities of digital ecosystems, we
cannot – and must not – overlook the shifts in labour conditions, modes
of production and forms of togetherness that are fostered by new
technologies. In theorising the overall turn towards the digital and
the wide-ranging processes we call ‘digital transformation’ today, it
is hence mandatory to open the discussion to a decidedly materialist
perspective that combines the questions of ‘old’ historical materialisms
with insights from the discursive field of the ‘new’ materialisms.
Whereas historical materialism – along the lines of Marxist thought – has
traditionally looked at the living conditions of the working class and
the modes of production that undergird social and cultural formations,
it is the recent discussion of a ‘new materialism’ that connects culture
and society to the very material substratum – the object-related, energetic
and molecular layer – of life itself.2 In fact, during the recent surge
of materialist thought, much time and energy has been invested in overcoming anthropocentric thinking and the notion that humanity is the
sole source of intentional and directed agency. Instead, what the work of
recent philosophers and scholars, such as Bruno Latour (2004), Karen
Barad (2007), Jane Bennett (2010), Manuel DeLanda (1997, 2006),
DeLanda and Harman (2017) and others, has revealed is the idea that
the complex interactions between humans and the material assemblages
surrounding us (ranging from food, technologies and animals to climate)
are themselves not devoid of their own kind of agency.
In bringing such a materialist reading to the study of computational
and algorithmic environments, this essay seeks to describe digital
infrastructures as ‘machinic assemblages’ that can best be grasped by
way of a Deleuze-inspired empiricist thinking. Dealing with the complex
embeddedness of humans and our material surroundings requires an
understanding of the relationship between technological environments
and their users. A Deleuze-inspired concept of the digital and its cultural
and social ramifications, therefore, has to begin with an analysis of the
infrastructures that facilitate, influence and – increasingly also – give way
to entirely automated decision-making processes.
In the following, I want to start from the concept of infrastructure
as such, linking it with what Deleuze has famously characterised as
‘societies of control’. In thinking about the impact of infrastructural
assemblages and their power to – quite literally – set the conditions of
possibility of human and technological agency, I turn to the question
of algorithmic agency and the kind of ‘Algocracy’ (Aneesh 2006) they install.
In the third part, I will combine the insights from my reflections with
a Deleuzian reading of infrastructures and the digital by returning once
more to his seminal essay on the societies of control.

I. Infrastructures of Control
Turning to Deleuze to think about the digital may seem odd, given the
fact that he hardly ever commented on new media and computers and
only occasionally made explicit references to the digital (Savat 2009:
1). Alexander Galloway has even classified him as ‘a philosopher of
the analogue paradigm alone’ (Galloway 2012: 520). In his short and
often quoted essay ‘Postscript on the Societies of Control’, however,
Deleuze famously describes the structures of power at the end of the
twentieth century as driven by ‘machines of a third type, computers’
(Deleuze 1992: 6), as novel and predominantly digital infrastructures
that are reflective of a profound ‘mutation’ of global capitalism itself.
For Deleuze, technology is an inherently social structure: a product
of material processes within society, culture and the economy that is
as much an expression of the social as it is capable of transforming it.
As he puts it in his book on Foucault: ‘Technology is therefore social
before it is technical’ (Deleuze 1988: 40). The technical, in other words,
is not only inextricably wedded to the social, but technology has also
always been an essential and an intraspecific part of the social. In fact,
the way Deleuze thinks about technology here is not so much a question
of an object or single gadget per se, but rather a question of a ‘collective
apparatus’ (Deleuze 1995: 175), of an infrastructure. In the ‘Postscript’,
Deleuze diagnoses a transition from the inclusive milieus of surveillance
and punishment in prisons, factories and hospitals – following Michel
Foucault’s reflections on the disciplinary societies of the eighteenth and
nineteenth centuries – to a radical expansion of the logic of surveillance,
which is no longer confined to the realm of these institutions, but now
operates in a more flexible manner in all areas of human coexistence.
According to Deleuze, this transition goes hand in hand with a shift
from the infrastructures of the factory to those of the corporation. ‘The
corporation’, he notes:

works more deeply to impose a modulation of each salary, in states of
perpetual metastability that operate through challenges, contests, and highly
comic group sessions . . . [T]he corporation constantly presents the brashest
rivalry as a healthy form of emulation. (Deleuze 1992: 4)

If one compares the contemporary work world, characterised by short-term employment, constant rivalry and precarious working conditions,
precisely with those ‘metastable’ states described by Deleuze, this
short and rather casual text appears ever more timely. According to
Deleuze, individuals (despite their freedoms) have become ‘dividuals’
in societies of control. The ‘anonymous masses’ of the disciplinary
societies of the eighteenth and nineteenth centuries are slowly but surely
transformed into ‘samples, data, markets or “banks”’ (Deleuze 1992:
5). The transition from the societies of discipline to the societies of
control described by Deleuze can thus also be grasped as a transition
of infrastructures, from the analogue to the digital. In an interview from
1990, he argues:

One can of course see how each kind of society corresponds to a
particular kind of machine – with simple mechanical machines corresponding
to sovereign societies, thermodynamic machines to disciplinary societies,
cybernetic machines and computers to control societies. But the machines
don’t explain anything, you have to analyze the collective arrangements of
which the machines are just one component. (Deleuze 1995: 175)

In applying Deleuzian thought to the digital transformation, I consider
it helpful to understand technology as a manifestation of infrastructure
that has become so ubiquitous that it is increasingly invisible and yet
impossible to avoid. In cultural studies and the emerging field of ‘critical
infrastructure studies (CI)’ (CI Collective 2020), infrastructure has been
defined as meaning ‘“below-structure,” that is, the innards of a structure
that are hidden by the structure’s surface or facade’ (Rubenstein et
al. 2015: 575), ‘[l]ike infrared, the below-red energy just outside of
the reddish portion of the visible light section of the electromagnetic
spectrum’ (Rothstein 2015: 2). In information technology, the term
‘infrastructure’ is used in a surprisingly similar manner, as frameworks
that function ‘below the level of work i.e. without specifying exactly
how work is to be done’ (Edwards et al. 2007: 17). In other words,
infrastructure is not only incredibly important to our modern lives, it
is also designed to disappear and only enters our awareness when it does not work, breaks down or fails. By the same token, infrastructure is an excluded middle, a medium. Without kids riding school buses, people walking on sidewalks, commuters riding trains or web servers answering requests, infrastructure means nothing. Rather, it is
the product of emergent processes, always the co-creation of humans and
technology. Although Deleuze never explicitly mentions infrastructures,
he repeatedly refers to infrastructural elements, especially in his critique
of societies of control. In another interview of the 1990s, he delineates
how infrastructural assemblages – like highways, for example – literally
embody his idea of control:
Control is not discipline. You do not confine people with a highway. But by
making highways, you multiply the means of control. I am not saying this is
the only aim of highways, but people can travel infinitely and ‘freely’ without
being confined while being perfectly controlled. That is our future. Let’s say
that is what information is, the controlled system of the order-words used in
a given society. (Deleuze 2007: 321)

The same logic can also be applied to the early ‘information highways’
of the Internet and contemporary systems of shopping and consumption.
In steering and ‘nudging’ our online consumer behaviour through
means of ‘collaborative filtering’, ‘user tracking’ or ‘dark patterns’ that
measure, predict and eventually control our everyday movements and
choices, Internet users are indeed moving ‘infinitely and freely’, without
confinement, but under perfect control. Deleuze, of course, could not
have anticipated the ‘black box society’ (Pasquale 2015) of the twenty-
first century in which algorithms control the behaviour of individuals
and populations. And he did not. The potential of Deleuzian thinking
in the context of the digital transformation can best be described with
a case of what Paul Edwards et al. have called ‘the long now of
infrastructure’ (Edwards et al. 2007: 3). The technological foundations
of the ‘information revolutions’ and the ultra-rapid acceleration of the
‘24/7 lifestyle’ of the Internet age did not spring up overnight. Instead:
For the development of cyberinfrastructure, the long now is about 200
years . . . When dealing with information infrastructures, we need to look
to the whole array of organizational forms, practices, and institutions that
accompany, make possible, and inflect the development of new technology
. . . Similarly, Manuel Castells argued that the roots of the contemporary
‘network society’ are new organizational forms created in support of large
corporate organizations, which long predate the arrival of computerization
. . . The lesson of all these studies is that organizations are (in part)
information processors. (Edwards et al. 2007: 3)

The ‘long now’ of digital infrastructures, in other words, reverberates nicely with Deleuze’s remarks in his book on Foucault (see the epigraph
at the beginning of this article) that technology in and of itself is never
a sufficient explanation of the social and cultural processes that it is
implemented in. Technology is as much implemented into the ‘collective
apparatus’ as it itself is implementing it. Thus, a Deleuzian perspective
on the digital makes most sense if we conceptualise this epochal
transformation as a complex and emergent amalgam of infrastructures,
algorithmic (e.g. abstract) machines running on them as well as human
beings communicating and increasingly living in them.
In his article ‘Computers and the Superfold’ (2012), Alexander
Galloway ruminates on a similar figure of recursion that a Deleuzian
image of the digital would necessarily entail. Following an especially
attentive recollection of all the scattered text passages in which Deleuze
talks about computers, Galloway takes the concepts of ‘the fold’ and
‘the superfold’ to argue: ‘Just as the fold was Deleuze’s diagram for the
modern subject of the baroque period, the superfold is the new “active
mechanism” for life within computerised control society’ (Galloway
2012: 524). The concept of the fold, which Deleuze initially develops in his book on Leibniz and later refines in his book on Foucault, captures his understanding of the basic components of matter and operates with an interesting notion of recursion: ‘A fold’, Deleuze notes in his text on
Leibniz:
is always folded within a fold, like a cavern within a cavern. The unit of
matter, the smallest element of the labyrinth is the fold, not the point, which
is never a part, but a simple extremity of the line. That is why parts of matter
are masses or aggregates. (Deleuze 2006: 6)

In his later book on Foucault, Deleuze applies the concept of the fold and
its inherent recursion to computation and describes digital environments
as ‘superfolds’:
Dispersed work had to regroup in third-generation machines, cybernetics and
information technology. What would be the forces in play, with which the
forces within man would then enter into a relation? It would no longer involve
raising to infinity or finitude but an unlimited finity, thereby evoking every
situation of force in which a finite number of components yields a practically
unlimited diversity of combinations. It would be neither the fold nor the
unfold that would constitute the active mechanism, but something like the
Superfold, as borne out by the foldings proper to the chains of the genetic
code, and the potential of silicon in third-generation machines, as well as by
the contours of a sentence in modern literature, when literature ‘merely turns
back on itself in an endless reflexivity’. (Deleuze 1988: 131)

Put differently, the iterations of programmed recursion in ‘third-generation machines’ are themselves manifestations and infrastructures
of social, cultural and economic patterns of which they are a part.
Dispersed across these various later texts, Deleuze here develops a
sophisticated image of computation and the digital transformation that
goes beyond the typical Deleuzian themes that are usually applied to the
realm of new media, like the rhizome, the virtual, the abstract machine,
etc.3

II. Deleuzian Assemblages, Critical Infrastructure Studies and Algocratic Rule
What is remarkable about Deleuze’s view of the digital, as it emerges
from his notion of the fold, is the fact that the shift in power he
sees is not a direct consequence of technological innovations alone,
but rather the result of infrastructures as collective apparatuses in
use. From this infrastructural perspective, the question of the digital
and the agency of humans, technology and the ‘collective apparatuses’
of digital infrastructures is best conceptualised with another powerful
concept from the arsenal of Deleuzian materialist philosophy: the
assemblage. In A Thousand Plateaus, Deleuze and Guattari defined an
assemblage as: ‘every constellation of singularities and traits deduced
from the flow – selected, organized, stratified – in such a way as to
converge (consistency), artificially and naturally, into extremely vast
constellations constituting “cultures” or even ages’ (Deleuze and
Guattari 1997: 406). Essentially, assemblages emerge on all levels of material reality, from vast things like weather phenomena, erupting volcanoes, nation states or human languages to the tiniest material or immaterial groupings on the microscopic scale, like viruses, bacteria or DNA molecules, or Wi-Fi connections and radio signals. By the same
token, however, even the vastest assemblages never constitute seamless
wholes. Ultimately, an assemblage will never be fully complete and
actualised. Instead, assemblages create consistency through constantly
changing, constantly becoming different assemblages. They are, as
Manuel DeLanda has explained, born and cease to exist, not unlike
a human body, a biological species or a population (DeLanda
2006). In view of such a Deleuzo-Guattarian assemblage, the digital
infrastructures of the twenty-first century can be best described in
computer programming parlance as a ‘big ball of mud’. In software
engineering:

A BIG BALL OF MUD is a casually, even haphazardly, structured system.
Its organization, if one can call it that, is dictated more by expediency than
design. Yet, its enduring popularity cannot merely be indicative of a general
disregard for architecture. (Foote and Yoder 2000: 661)

Despite the fact that Deleuze surely knew nothing about software
engineering, the Postscript’s second, and for my purposes most relevant,
part starts with the heading ‘Logic’ – logique in the French original,
which, as Alexander Galloway reminds us, is a cognate of logiciel, the
French word for software. Interestingly, in this part he describes the dangers
of the dawning digital age also in terms of entropic and haphazard
developments:

[T]he recent disciplinary societies equipped themselves with machines
involving energy, with the passive danger of entropy [my emphasis] and the
active danger of sabotage; the societies of control operate with machines of
a third type, computers, whose passive danger is jamming and whose active
one is piracy or the introduction of viruses. This technological evolution must
be, even more profoundly, a mutation of capitalism. (Deleuze 1992: 6)

The mutation of capitalism that Deleuze is speaking about here is
the shift from industrial to so-called post-industrial service economies,
a shift that has recently found its most dramatic culmination in the
short-term gig economy of ‘platform capitalism’ (Srnicek 2017). ‘The
operation of markets is now the instrument of social control’ (Deleuze
1992: 6), notes Deleuze. Under the conditions of late capitalism,
marketing not only creates desires that did not exist before, but also
transforms people into machines of desire whose ‘wishes’ can be satisfied according to expected parameters.
Particularly in the age of big data, artificial intelligence and machine
learning, the potential effects of marketing infrastructures must therefore
be thoroughly examined. In a sociological study of the structural
changes of the ‘technological condition’ in the age of the transnational
enterprise, A. Aneesh (2006) coined a term that brings together the
complexity of the contexts discussed so far: algocratic rule – the rule
of algorithms. Aneesh begins the development of this term with
an analysis of a phenomenon he calls ‘virtual migration’. At the
beginning of the 1990s, many transnational companies began to transfer
the production of goods to areas with lower labour costs, and to
outsource technological and infrastructural services to independent
subcontractors. But how can transnational companies make sure that
production sites relocated to other countries still meet the set standards and deliver maximum profits, even with immaterial work? How are
established power structures implemented and maintained in the face
of increasingly flat, network-like hierarchies of subcontractors, some
of which operate in atomised and isolated form? Aneesh’s answer to
these questions is simple: collaborative, digital work infrastructures and
specific software solutions such as Enterprise Resource Planning (ERP),
Customer Relationship Management (CRM) or Supply Chain Management (SCM)
systems produce a seamless global integration of tasks, labour and
systems of control.
What is interesting about these globally implemented infrastructures
is the way they predetermine all work processes in a non-negotiable
manner, while keeping the user under the impression of being ‘free’, like
a driver on the Deleuzian highway mentioned above. The user of such
software systems simply has no choice but to remain within the realm
of the possible determined by the software design. The programmed
code simply does not allow any deviating behaviour. The repercussions
of this become impressively clear from interviews with bank employees
that Aneesh had conducted as early as 2006. A clerk working behind the
counter of a large bank describes the ‘activity management’ that controls
every encounter with his customers:

So it’s pretty basic, it [the banking software] takes you step by step through
the transaction. It says, now give the customer this much money, and asks
you if the amount is correct. And so you fill in numbers for all the sections,
hit enter, it will take you to the next step. You will validate the thing you’re
holding . . . and then it asks if there is anything you want for the customer.
And you say yes or no. (Aneesh 2006: 114)
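
What the clerk describes is, in programming terms, a rigid state machine: each step unlocks only the next permitted step, and no deviating path is encoded at all. A minimal sketch of such a guided transaction (a hypothetical illustration in Python, not based on any actual banking system; the step names and class are invented for the example) can make this point concrete:

# the only permitted order of steps; nothing else is encoded
WORKFLOW = ['enter_amount', 'confirm_amount', 'validate_item', 'offer_product']

class GuidedTransaction:
    """A transaction that only ever allows the next prescribed step."""
    def __init__(self):
        self.position = 0

    def perform(self, step):
        expected = WORKFLOW[self.position]
        if step != expected:
            # deviating behaviour is simply not possible
            raise ValueError(f'step {step!r} not allowed, expected {expected!r}')
        print(f'step completed: {step}')
        self.position += 1

transaction = GuidedTransaction()
transaction.perform('enter_amount')
transaction.perform('confirm_amount')   # any other call here would raise an error

The clerk’s ‘yes or no’ at the end of the quoted passage is, in this sense, the only degree of freedom that the software design leaves open.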

Since, in such systems, independent thinking of employees is neither encouraged nor demanded, many operative procedures are indirectly – as in the case above – or directly – as in the case of automated decision-making systems – more and more delegated to the agency and control of algorithms. The importance of algorithmic
assemblages and their uncanny power to predict, suggest and sort
the behaviour of individuals has been discussed for quite some time
now. In fact, algorithms – once hardly a topic for people outside of
computer science – have become so pervasive and widely implemented
that it has become commonplace to assume that we are dealing with algorithmic cultures (see Galloway 2006; Seyfert and Roberge 2017).
At the same time, especially for non-computer scientists, algorithms
keep an aura of inscrutability and secrecy hidden behind barriers of
‘access and expertise’ (Seaver 2014: 2). For computer science, however,
algorithms are primarily ‘a well-defined computational procedure’ that
takes some values, or a set of values, as input, performs a ‘sequence of
computational steps, and transforms the input into an output’ (Cormen
et al. 2009: 5). ‘Analyzing an algorithm’, Cormen et al. proceed, ‘has
come to mean predicting the resources that the algorithm requires’
(ibid. 15). In computer science, algorithms are a matter of mathematical
proofs and nothing more (or less) than the abstract procedures4 of
computation that operate on specific data structures to either sort,
classify or transform this data in a strictly logical manner. An algorithm
without data to transform input to output would be a purely theoretical
and mathematical exercise. It is no wonder then that the historically
unique ‘advent of the algorithm’ (Berlinski 2000) coincided with the
proliferation of another key technology of the twenty-first century: the
database.
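To make this textbook definition more concrete, the following sketch (a hypothetical illustration in Python, not taken from any of the texts discussed here) shows an algorithm in exactly this minimal sense: a finite sequence of computational steps that transforms an input sequence into a sorted output, and whose resource usage can be analysed in advance:

def insertion_sort(values):
    """A 'well-defined computational procedure': takes a sequence of
    values as input and transforms it, step by step, into a sorted output."""
    result = list(values)              # work on a copy of the input
    for i in range(1, len(result)):
        key = result[i]
        j = i - 1
        # shift larger elements one position to the right
        while j >= 0 and result[j] > key:
            result[j + 1] = result[j]
            j -= 1
        result[j + 1] = key
    return result

# 'Analysing' the algorithm means predicting its resource usage:
# in the worst case it performs on the order of n*n comparisons.
print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]

Such a procedure only becomes meaningful once it is fed with data structures to operate on, which is precisely why the argument above ties the ‘advent of the algorithm’ to the proliferation of the database.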
As the American media theorist Lev Manovich diagnosed in his
book The Language of New Media from 2001, databases are not
only a new medium, but initiators of a completely new genre of
communication and knowledge, perhaps even a new formation of the
symbolic order itself (Manovich 2001: 219). In the course of the
rapid development of database technologies and, above all, the dense
networking of all kinds of databases on the Internet, the possibilities
for using databases have exploded in recent years. With innovations
in artificial intelligence, machine learning and statistical evaluation
procedures, databases can now be automatically queried in order to
recognise connections in the largest possible datasets (hence the term ‘big
data’). These methods of data mining are not only suitable for creating
profiles of individuals in a diversity of contexts; with the help,
for example, of ‘statistical twins’5 or ‘collaborative filtering’,6 complex
machine-learning algorithms can make precise predictions about the
behaviour of individuals, groups and social classes. How precise such
predictions can be will be briefly demonstrated with a case that caused a
considerable media response in the USA and the entire English-speaking
press.
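Note 6 below defines collaborative filtering as the prediction of a user’s possible choices from the recorded choices of similar users. A minimal sketch of this principle (a purely illustrative Python example; the toy rating data and all names are my own assumptions, not taken from any of the systems cited) might look as follows:

import math

# toy purchase/rating data: user -> {item: rating}
ratings = {
    'ann':  {'lotion': 5, 'vitamins': 4, 'coffee': 1},
    'ben':  {'lotion': 4, 'vitamins': 5, 'tea': 2},
    'cara': {'coffee': 5, 'tea': 4},
}

def cosine_similarity(u, v):
    """Similarity of two users over the items both have rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[i] * v[i] for i in shared)
    norm_u = math.sqrt(sum(x * x for x in u.values()))
    norm_v = math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm_u * norm_v)

def predict(user, item):
    """Predict a rating as the similarity-weighted average of other users."""
    num, den = 0.0, 0.0
    for other, their in ratings.items():
        if other == user or item not in their:
            continue
        sim = cosine_similarity(ratings[user], their)
        num += sim * their[item]
        den += sim
    return num / den if den else None

print(predict('ann', 'tea'))   # ann's predicted interest in 'tea'

Real recommender systems work with millions of users and far more sophisticated models, but the principle of steering choices by way of other people’s recorded choices remains the same.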
As the New York Times reported in February of 2012, the Minnesota-
based supermarket group Target knew of the pregnancy of a young
American woman before her own father did. Using complex algorithms, the
department for ‘predictive analytics’ had noticed that pregnant women
in certain phases of their pregnancy buy very specific things and change
their shopping behaviour significantly. For example, Target’s data
analysts had found that a female customer who suddenly starts buying
unperfumed body lotion in combination with zinc and iron supplements
or similar products has a high probability of being pregnant. When such
patterns are detected, Target starts to send out coupons and discounted
offers to its customers automatically.
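Duhigg’s report describes the underlying technique as a score assigned on the basis of a basket of indicative products. The following sketch is a deliberately crude, hypothetical illustration of such pattern-based scoring in Python; the products, weights and threshold are invented for the example and have nothing to do with Target’s actual, proprietary model:

# invented weights for purchases that, taken together, raise the score
indicative_products = {
    'unscented_lotion': 0.3,
    'zinc_supplement':  0.2,
    'iron_supplement':  0.2,
    'cotton_balls':     0.1,
}

def pregnancy_score(purchase_history):
    """Sum the weights of indicative products found in a purchase history."""
    return sum(weight for product, weight in indicative_products.items()
               if product in purchase_history)

def send_coupons(customer_id, purchase_history, threshold=0.5):
    """Trigger automated mailings once the score crosses a threshold."""
    if pregnancy_score(purchase_history) >= threshold:
        print(f'queue baby-product coupons for customer {customer_id}')

send_coupons('C-01', ['unscented_lotion', 'zinc_supplement', 'iron_supplement'])

The decisive point for the argument pursued here is not the statistical sophistication of such models but their automation: once a pattern crosses the threshold, the infrastructure acts without any human decision.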
This practice, which has since been used on a large scale by many
companies, only came to light in the above example because the father
of a young customer, to whom coupons for baby products had been
sent, suspected that Target wanted to encourage his teenage daughter
to get pregnant. The article in the New York Times described the case
vividly:
About a year after [Target] created [its] pregnancy-prediction model, a man
walked into a Target outside Minneapolis and demanded to see the manager.
He was clutching coupons that had been sent to his daughter, and he was
angry, according to an employee who participated in the conversation. ‘My
daughter got this in the mail!’, he said. ‘She’s still in high school, and you’re
sending her coupons for baby clothes and cribs? Are you trying to encourage
her to get pregnant?’ The manager didn’t have any idea what the man was
talking about. He looked at the mailer. Sure enough, it was addressed to the
man’s daughter and contained advertisements for maternity clothing, nursery
furniture and pictures of smiling infants. The manager apologized and then
called a few days later to apologize again. On the phone, though, the father
was somewhat abashed. ‘I had a talk with my daughter,’ he said. ‘It turns out
there’s been some activities in my house I haven’t been completely aware of.
She’s due in August. I owe you an apology.’ (Duhigg 2012)

However, it is not only the supermarket chains that now operate such
profiling. The sector has expanded into an entire industry and will
continue to grow. The main problem is that companies like Target
pass the data collected from their customers on to third parties for
further processing. According to Manzerolle and Smeltzer (2011), actors
operating in the shadows of the usual suspects, like Facebook or
Google, collect a great deal of sensitive data and exploit it economically
and politically. Personal data in large quantities is sold on to third
parties, where it is combined with other data about locations, shopping
habits, sexual preferences, political attitudes and much more. As another
more recent investigative report by the New York Times (Thompson and
Warzel 2019) revealed: almost a decade after the coverage of the
Target case, the business of data brokerage is now especially lucrative
in the realm of location-based services and location data retrieved
from GPS signals of millions of smart phones worldwide. As Deleuze
predicted in the Postscript, we are today living in a world in which
private companies hold, organise and manage complex digital doubles
of individuals, social groups and entire populations.

III. The Materiality of Algorithms – Infrastructure as Code


So, where does this leave us with regard to my initial hypothesis? At
the beginning of this text, I suggested that by reading the digital in
Deleuzian terms we arrive at a vision of infrastructures that determine
and steer the actions of their constituents while imbuing their subjects with
the illusion of freedom of choice. Interestingly, in one of the most
widely accepted definitions of algorithms in computer science, algorithms are indeed closely wedded to the notion of control, a relation that Robert Kowalski has condensed into the formula ‘algorithm = logic + control’ (Kowalski 1979: 427). The logic component, he argues, ‘determines the
meaning of an algorithm whereas the control component only affects
its efficiency. The efficiency of an algorithm can often be enhanced by
improving the control component without changing the logic of the
algorithm’ (ibid. 424). As we can see, efficiency and the most economic
usage of resources7 are not only essential goals of certain software
designs, they are a major part of the very nature of algorithms. Here,
my earlier point about energy consumption, especially with regard to our electronic lives and tools, reappears in a different fashion. One widely proposed solution to the contemporary climate crisis has long been to use IT and smart technology (i.e. more efficient algorithms) to increase overall energy efficiency through means of control. Smart homes are therefore a
further materialisation of infrastructures of control.8
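Kowalski’s decomposition can be illustrated with a standard textbook pair of functions (a hypothetical Python sketch, using the Fibonacci numbers merely as a stand-in example): both functions share the same logic, while a changed control component, here the caching of already computed results, improves efficiency without touching the meaning:

from functools import lru_cache

def fib_naive(n):
    """Logic only: the recursive definition, executed with naive control."""
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_cached(n):
    """Same logic, different control: results are cached and reused,
    so the efficiency changes while the meaning stays identical."""
    return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)

print(fib_naive(20), fib_cached(20))   # both print 6765

Efficiency, in other words, is not an external demand imposed on algorithms from outside; it is built into their very definition as a separable component of control.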
By the same token, the sharing economy (e.g. Uber, Airbnb,
Drivy and Amazon Mechanical Turk) promises more efficiency while
supposedly enhancing freedom and community. What this algorithmic
and digitalised form of the economy – regardless of whether we call
it ‘sharing economy’ or ‘platform capitalism’ – eventually does is
implement more and more layers of control. One of the most recent
trends in this development is the increasing ‘virtualisation’ of digital infrastructures. As more and more ‘bare metal servers’ – that is, the individual digital infrastructures of, say, small local companies – move ‘into the cloud’, the digital infrastructures of control shift to an even higher level. Under the heading ‘infrastructure as code’9
or ‘infrastructure as service’, we are currently witnessing a further
abstraction and ‘virtualisation’ of digitalised infrastructures themselves.
‘The technological evolution’, as Deleuze has so prominently put it in the
Postscript, ‘must be, even more profoundly, a mutation of capitalism’
(Deleuze 1992: 6). What this mutation of capitalism entails, then, is
nothing less than a fold itself. Armin Nassehi has suggested that one
should see the digital only as a solution to problems that the digital
(as the latest instalment of modernity) has created itself (Nassehi 2019:
62–3).
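The phrase ‘infrastructure as code’ can be made more tangible with a schematic sketch: instead of physically configuring a ‘bare metal server’, the desired infrastructure is written down as a declarative description that a provisioning tool then realises in a data centre. The snippet below is a generic, hypothetical illustration in Python; the field names and the provision function are invented for the example, and real tools such as Terraform or Ansible use their own configuration languages:

# a declarative description of infrastructure, kept in version control
# like any other piece of source code
desired_infrastructure = {
    'web_server': {'image': 'debian-12', 'cpus': 2, 'memory_gb': 4},
    'database':   {'image': 'postgres-16', 'cpus': 4, 'memory_gb': 16},
    'network':    {'expose': [443], 'internal': ['web_server', 'database']},
}

def provision(spec):
    """Hypothetical stand-in for a provisioning tool: compares the declared
    state with the running state and creates whatever is missing."""
    for name, resource in spec.items():
        print(f'ensuring {name} exists with configuration {resource}')

provision(desired_infrastructure)

Infrastructure, in this mode, no longer simply precedes the code that runs on it; it is itself generated, versioned and audited as code, which is precisely the further fold of abstraction described above.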
In this light, algorithms running on digital infrastructures do indeed
appear as Deleuzian superfolds that constantly fold back on themselves.
They are abstract machines that work best when being applied to
their own infrastructures and environment. The more digitalised the
world becomes, the more we face the need for efficient algorithms and performant computer systems. The rapidly growing number of available sources of knowledge on the Internet creates an ever-greater need for more powerful and efficient search algorithms. The
more complex our social interactions become – through means of social
networks – the more we need efficient sorting algorithms to curate our
timelines and newsfeeds. The greater the entropy of the ‘big ball of
mud’ of the Internet gets, the more intelligent algorithms we need to
filter, select and suggest suitable information. Efficiency, however, is not
the same as effectiveness. When digital filter bubbles destroy the public
sphere by locking individual citizens into their own echo chambers, and
when intelligent machine learning algorithms enable the perfect micro-
targeting of individual constituents of a populace (as in the case of
the Cambridge Analytica scandal of 2016), we begin to see that the
algorithmic infrastructures of control can easily turn into infrastructures
of manipulation. The primary arena in which we will have to deal with
the cultural, political, social and economic repercussions of artificial
intelligence and ever more efficient algorithms will then be virtualisation
and abstraction. Deleuze might, as Alexander Galloway has put it, be the
prime example of ‘an analogical philosopher’ (Galloway 2012: 521). In
light of the growing infrastructures of control, however, the Deleuze of
the 1990s (see Berry and Galloway 2016: 158) is increasingly becoming
a key philosopher of the digital too.

Notes
1. https://wewatt.com (accessed 28 November 2019).
2. Surprisingly, such a combination of old and new materialism has rarely been
attempted within new-materialist discourses. In fact, most proponents of new
materialist thinking, such as Bennett (2010), Latour (2004) or DeLanda (2006),
categorically distance themselves from Marxism and historical materialism. With
a special focus on literary articulations of bodies and visceral forms of materiality,
Christopher Breu’s work (2014) provides a noteworthy exception.
3. On the various uses of Deleuzian concepts in the context of media and technology
studies, see, for instance, Berry and Galloway 2016, Buchanan 2009, Galloway
2012 and Savat 2009.
4. In Deleuzo-Guattarian terms, algorithms can in fact be understood as abstract
machines or at least as one possible manifestation of these. In A Thousand
Plateaus, Deleuze and Guattari explain: ‘But the principle behind all technology
is to demonstrate that a technical element remains abstract, entirely unrelated,
as long as one does not relate it to an assemblage it presupposes’ (1997: 397–8).
Driving this point home, Félix Guattari elaborates in Chaosmosis: ‘But already
this montage and these finalizations impose the necessity of expanding the
limits of the machine . . . [to] material and energy components [and] semiotic,
diagrammatic and algorithmic components (plans, formulae, equations and
calculations which lead to the fabrication of the machine)’ (1995: 34).
5. Statistical twins are individuals who act almost identically under certain
conditions and share a number of other traits (see Duhigg 2012).
6. Collaborative filtering is a method of predicting possible choices of users based
on similar choices by other users in a similar context. For a general introduction,
see Wang et al. 2016 or Klahold 2009.
7. In computer science, the primary resource is first and foremost time and, by
that token, the energy usage of the central processing unit (CPU) on which the
algorithm is executed.
8. See, for example, the German federal government’s ‘Act on the Digitization
of the Energy Transition’: https://www.bmwi.de/Redaktion/EN/Artikel/Energy/
digitisation-of-the-energy-transition.html (accessed 20 October 2020).
9. On the notion of further virtualised digital infrastructures as cloud-based service
architectures, see, as an introduction, Jourdan and Pomès 2017.

References
Aneesh, A. (2006) Virtual Migration: The Programming of Globalization, Durham,
NC: Duke University Press.
Barad, Karen (2007) Meeting the Universe Halfway: Quantum Physics and the
Entanglement of Matter and Meaning, Durham, NC: Duke University Press.
Bennett, Jane (2010) Vibrant Matter: A Political Ecology of Things, Durham, NC:
Duke University Press.
Berlinski, David (2000) The Advent of the Algorithm, New York: Harcourt.
Berry, David M. and Alexander Galloway (2016) ‘A Network is a Network is a
Network: Reflections on the Computational and the Societies of Control’, Theory,
Culture & Society, 33:4, pp. 151–72.
Betancourt, Michael (2015) The Critique of Digital Capitalism: An Analysis of
the Political Economy of Digital Culture and Technology, New York: Punctum
Books.
Breu, Christopher (2014) Insistence of the Material: Literature in the Age of
Biopolitics, Minneapolis: University of Minnesota Press.
Buchanan, Ian (2009) ‘Deleuze and the Internet’, in Mark Poster and David Savat
(eds), Deleuze and New Technology, Edinburgh: Edinburgh University Press,
pp. 143–61.
CI Collective (2020) ‘Critical Infrastructure Studies.org: Thinking about the Built,
Repaired, and Lived Things of the World – How We Make Them, and How They
Make Us’, available at https://cistudies.org/ci-collective/ (accessed 30 October
2020).
Cormen, Thomas H., Charles E. Leiserson, Ronald L. Rivest and Clifford Stein
(2009) Introduction to Algorithms, 3rd edition, Cambridge, MA: MIT Press.
DeLanda, Manuel (1997) A Thousand Years of Nonlinear History, New York: Zone
Books.
DeLanda, Manuel (2006) A New Philosophy of Society: Assemblage Theory and
Social Complexity, London: Continuum.
DeLanda, Manuel and Graham Harman (2017) The Rise of Realism, Cambridge:
Polity Press.
Deleuze, Gilles (1988) Foucault, trans. Seán Hand, Minneapolis: University of
Minnesota Press.
Deleuze, Gilles (1992) ‘Postscript on the Societies of Control’, October, 59 (Winter),
pp. 3–7.
Deleuze, Gilles (1995) ‘Control and Becoming’, in Gilles Deleuze, Negotiations,
1972–1990, New York: Columbia University Press, pp. 169–76.
Deleuze, Gilles (2006) The Fold: Leibniz and the Baroque, New York: Continuum.
Deleuze, Gilles (2007) Two Regimes of Madness: Texts and Interviews, 1975–1995,
trans. Ames Hodges and Mike Taormina, New York: Semiotext(e).
Deleuze, Gilles and Félix Guattari (1997) A Thousand Plateaus: Capitalism and
Schizophrenia, trans. Brian Massumi, Minneapolis: University of Minnesota Press.
Duhigg, Charles (2012) ‘Psst, You in Aisle 5’, The New York Times Sunday
Magazine (16 February 2012), available at https://perma.cc/VD4U-Y4GB
(accessed 30 October 2020).
Edwards, Paul, Steven Jackson, Geoffrey Bowker and Cory Knobel (2007) Under-
standing Infrastructure: Dynamics, Tension, and Design, NSF Human and
Social Dynamics, University of Michigan, available at http://hdl.handle.net/2027.42/49353 (accessed 30 October 2020).
Foote, Brian and Joseph Yoder (2000) ‘Big Ball of Mud’, in Neil Harrison, Brian
Foote and Hans Rohnert (eds), Pattern Languages of Program Design 4, Reading,
MA: Addison-Wesley.
Galloway, Alexander (2006) Gaming: Essays on Algorithmic Culture, Minneapolis:
University of Minnesota Press.
Galloway, Alexander (2012) ‘Computers and the Superfold’, Deleuze Studies, 6:4,
pp. 513–28. doi: 10.3366/dls.2012.0080.
Guattari, Félix (1995) Chaosmosis: An Ethico-Aesthetic Paradigm, trans. Paul Bains
and Julian Pefanis, Bloomington: Indiana University Press.
Hintemann, Ralph and Klaus Fichter (2015) ‘Energy Demand of Workplace
Computer Solutions – A Comprehensive Assessment Including Both End-User
Devices and the Power Consumption They Induce in Data Centers’, Proceedings
of EnviroInfo and ICT for Sustainability 2015 (1 September 2015). doi:
10.2991/ict4s-env-15.2015.19.
Jones, Nicola (2018) ‘How to Stop Data Centres from Gobbling up the World’s
Electricity’, Nature News (12 September 2018), available at http://www.nature.com/articles/d41586-018-06610-y (accessed 30 October 2020).
Jourdan, Stephane and Pierre Pomès (2017) Infrastructure as Code (IAC) Cookbook:
Over 90 Practical, Actionable Recipes to Automate, Test, and Manage Your
Infrastructure Quickly and Effectively, Birmingham: Packt.
Klahold, André (2009) Empfehlungssysteme. Recommender Systems – Grundlagen,
Konzepte und Lösungen, Wiesbaden: Vieweg + Teubner.
Kowalski, Robert (1979) ‘Algorithm = logic + control’, Communications of the
ACM, 22:7, pp. 424–36. doi: 10.1145/359131.359136.
Latour, Bruno (2004) Politics of Nature: How to Bring the Sciences into Democracy,
Cambridge, MA: Harvard University Press.
Manovich, Lev (2001) The Language of New Media, Cambridge, MA: MIT Press.
Manzerolle, Vincent and Sandra Smeltzer (2011) ‘Consumer Databases and the
Commercial Mediation of Identity’, Surveillance & Society, 8:3, pp. 323–37.
Nassehi, Armin (2019) Muster: Theorie der digitalen Gesellschaft, Munich: C. H.
Beck.
Parikka, Jussi (2015) The Anthrobscene, Minneapolis: University of Minnesota
Press.
Pasquale, Frank (2015) The Black Box Society: The Secret Algorithms that Control
Money and Information, Cambridge, MA: Harvard University Press.
Rothstein, Adam (2015) ‘How to See Infrastructure: A Guide for Seven Billion
Primates’, Rhizome (blog), 20 July 2015, available at https://rhizome.org/editorial/2015/jul/02/how-see-infrastructure-guide-seven-billion-primate/ (accessed 29
November 2019).
Rubenstein, Michael, Bruce Robbins and Sophia Beal (2015) ‘Infrastructuralism: An Introduction’, MFS Modern Fiction Studies, 61:4, pp. 575–86.
Savat, David (2009) ‘Introduction: Deleuze and New Technology’, in Mark Poster
and David Savat (eds), Deleuze and New Technology, Edinburgh: Edinburgh
University Press, pp. 1–15.
Seaver, Nick (2014) ‘Knowing Algorithms’, in Janet Vertesi and David Ribes
(eds), digitalSTS: A Field Guide for Science & Technology Studies, Princeton,
NJ: Princeton University Press, available at https://digitalsts.net/essays/knowing-algorithms/ (accessed 20 October 2020).
Seyfert, Robert and Jonathan Roberge (2017) ‘Was sind Algorithmuskulturen?’,
in Robert Seyfert and Jonathan Roberge (eds), Algorithmuskulturen. Über die
rechnerische Konstruktion der Wirklichkeit, Bielefeld: transcript, pp. 7–40.
Srnicek, Nick (2017) Platform Capitalism, Cambridge: Polity Press.
Thompson, Stuart and Charlie Warzel (2019) ‘Twelve Million Phones, One
Dataset, Zero Privacy’, The New York Times (19 December 2019),
available at https://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html (accessed 20 October 2020).
Wang, Shuaiqiang, Shanshan Huang, Tie-Yan Liu, Jun Ma, Zhumin Chen and
Jari Veijalainen (2016) ‘Ranking-Oriented Collaborative Filtering: A Listwise
Approach’, ACM Transactions on Information Systems, 35:2, pp. 1–28. doi:
10.1145/2960408.
