
Organization 16(2): 183–202
ISSN 1350–5084
DOI: 10.1177/1350508408100474
Copyright © 2009 SAGE Publications (Los Angeles, London, New Delhi, Singapore and Washington DC)
http://org.sagepub.com

On the Computational Rendition of Reality: Artefacts and Human Agency
Jannis Kallinikos
London School of Economics, London, UK

Abstract. The paper seeks to lay open the computational logic by which
reality is rendered as information. Computation is claimed to involve a drift
away from the palpable and extendible character of things, a trend that
both continues and breaks with the prevailing strategies of technological
mediation in industrialism and modernity. Computation entails the
relentless analytic reduction of the composite character and complexion
of the world. Reality is meticulously dissolved and regained after a long
analytic retreat and technological reconstruction. The outcome of this
analytic strategy is that processes taking place at the human-technology
interface are sustained by an elaborate vertical stratification, entailing a
variety of other programmes and systems that reach down from the level
of the interface to machine language and the mechanics of binary parsing.
The deepening involvement of computation in instrumental settings thus
reframes the perceptive and action modalities by which human agents
confront the world. In this way, a coherent set of techniques for building up reality is established, accompanied by a new model of human agency that
increasingly takes the form of a combinatoria of data and information
items, remaking the shape of things out of the digital fragments produced
by computation.

Key words: computation; computational rendition; informatization; human agency; organizational processes

Developments over the last two decades suggest that information's involvement in socio-economic life is acquiring comprehensive dimensions
that enlarge and deepen the impact it had on organizations during the
second half of the 20th century (Borgmann, 1999; Kallinikos, 2006; Lyman
and Varian, 2003). The organizational operations that become the object of
informatization are steadily expanding across settings. At the same time, an
organization’s transactions with the environment are increasingly mediated
by the internet or other information-based networks. These developments
transform the status of information and make it a pervading element of
socio-economic life. Data and information have to be collected, ordered and
transmitted through a range of technologies and organizational practices.
In the information affluence of the contemporary world, data ordering
and information editing have, slowly and imperceptibly, risen from their
humble status of clerical routines to become vital organizational activities
that accommodate the expanding infospace that engulfs organizations and
their dealings. Information management currently forms the core of an
elaborate and collective perceptual apparatus in organizations.
The implications of this pervasive involvement of information in organ-
izations are not well understood. For a variety of reasons, the field of
organization studies has maintained a tangential relationship to these
developments. Its significance notwithstanding, organization scholars have
tended to see the informatization of tasks and operations as primarily a
technological project at the periphery of the concerns that have defined the field. When brought to centre stage, informatization has been studied either as a causal force of organizational change or as a formative
context participating in the constitution of organizational practices (e.g.
Ciborra and Lanzara, 1994; Fulk and DeSanctis, 1995; Lanzara and Patriotta,
2001; Orlikowski, 2000). While useful and insightful, this kind of research
has focused on the investigation of larger functional and structural blocks or
processes occasioned by the deepening involvement of technological infor-
mation in organizations. It has seldom looked at the blackboxed mechanisms
through which technologically driven information processes reassemble
reality and, in doing so, reframe the premises upon which individual and
collective subjects perceive and act upon that reality. While noticed and often
criticized, the distinctively analytic/reductionist predilection of informat-
ization and the minute yet elaborate cognitive models and techniques by
which information is more fundamentally involved in the covert constitu-
tion of socio-economic reality and organizations have seldom been studied
systematically.
Informatization, I claim, marks a distinctive step in the history of the forms
of instrumental involvement. Such distinctiveness is epitomized by the significant drift that informatization implies away from the palpable, extendible
and observable character of things that have traditionally conditioned pro-
duction and administration. To render tasks and services informatized, it
is necessary to construe and technologically reconstruct their foundations;
that is, the underlying factors and processes by which these are sustained.
This is the inevitable outcome of having to spell out for the machine how to
perform its operations. Informatizing administrative tasks, work processes
or other services makes it necessary to lay out in substantial detail the steps
by which these are made, and construct the elaborate computational mech-
anics to support their smooth execution. This is unique in the history of
automation. To get informatized, even relatively simple tasks have to be
supported by an elaborate computational mechanics.
Computational photography and imaging offer an instructive example.
Ever since Susan Sontag’s (1977) path-breaking work, we know that snap-
shots can truthfully map and fake reality at the same time. Digital photo-
graphy has, however, carried this contradictory mix of truths and lies
significantly further in what common parlance refers to as ‘photoshopping’;
that is the construction of an image a posteriori, through the combination
of other already existing components taken from other images. ‘Photo-
shopping’ is made possible by image editing software that moves from
the manipulation of perspective (and lighting) characteristic of traditional
photography to the underlying processes of image making. Shifts in per-
spective and lighting operate at the level of image capturing as a whole.
By contrast manipulation of image through data editing software moves
underneath, at the building blocks by which images are composed. Com-
putational photography, a recent development, further enhances these
characteristics of digital photography by expanding the process of data
editing to the manipulation of the raw material (colour, lighting or shape)
by which culturally recognized objects are made (Raskar and Tumblin,
2007). The raw material of images is as a rule visually unrecognizable by
ordinary people but it can be processed and edited to become a component
of a recognizable image.
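To render the point concrete, the short sketch below (written in Python, using the widely available NumPy and Pillow libraries) illustrates in a deliberately simplified form how such editing operates not on the picture as a perceptual whole but on the numeric raw material (arrays of pixel values) out of which the image is composed; the file names are hypothetical and the example is an illustration rather than a description of any particular imaging product.

    # A minimal sketch, illustrative only: image editing at the level of the numeric
    # raw material rather than at the level of the depicted scene. File names are
    # hypothetical placeholders.
    import numpy as np
    from PIL import Image

    # Existing photographs become arrays of numbers (height x width x RGB values).
    portrait = np.asarray(Image.open("portrait.jpg").convert("RGB"), dtype=np.float32)
    sunset_img = Image.open("sunset.jpg").convert("RGB")
    sunset_img = sunset_img.resize((portrait.shape[1], portrait.shape[0]))
    sunset = np.asarray(sunset_img, dtype=np.float32)

    # 'Photoshopping' reduced to arithmetic on the underlying data: blend components
    # taken from two images and adjust the lighting numerically.
    composite = 0.7 * portrait + 0.3 * sunset
    composite = np.clip(composite * 1.1, 0, 255)

    # Only at the end is the edited data recaptured as a culturally recognizable image.
    Image.fromarray(composite.astype(np.uint8)).save("composite.jpg")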
Computational photography and imaging are far from trivial. Technological
developments in this field have far-reaching implications for medicine, the military, the practice of science in general and other social domains. Cruci-
ally, they are indicative of the paradigmatic shift beyond visual unities in
a field and social practice that predominantly deal with images presented
and consumed by human perception. The computational mechanics that
underlies the informatization of services and operations in and across
organizations epitomizes the same paradigmatic shift away from tangible
and observable reality. For this reason, the effects of the pervasive and
continuously growing instrumental involvement of information reach be-
yond the standard theme of the technological mediation of organizational
tasks and processes and whatever effects such a mediation may have on
the structural morphology and practices of organizations. The growing
saturation of organizational processes by information remakes key premises
upon which social agents frame and act upon the world. I will refer to such
remaking as the Computational Rendition of Reality and identify it with the
relentless analytic reduction of the composite character and complexion of
things and their recapturing as machine-run code (Hayles, 2005).
A variety of other practices of representation and cognitive techniques
have historically been involved in the meticulous parsing and mediation
of organizational operations and the control of organizational outcomes,
e.g. management and financial accounting, methods of archiving, filing and
indexing, other writing and notational schemes (Goody, 1986; Rose, 1991).
Numbers and numerical systems, in particular, have been centrally impli-
cated in the historical shift away from palpable, observable reality that in
this paper I associate with computation. For numbers are abstract cognitive
entities dealing with intangible aspects of the world (Cassirer, 1955; Frege,
1980). Accordingly, their massive involvement in socio-economic life
has had implications (e.g. Cline-Cohen, 1982; Megill, 1994) that in some
respects resemble the ones I seek to point out in this paper with respect to
informatization. Yet, none of the methods or techniques associated with
literacy and, specifically, numeracy have ventured to go thus far, and in
such a comprehensive fashion, in decomposing and reassembling reality,
as software-based coding is currently doing.
In this paper I seek to describe basic traits of computational technology
by deploying a language that can be recognized by social scientists. My
ultimate aim is to bring this description to bear upon the understanding
of the impact the computational rendition of reality may have upon work
and organizational practices. I first summarize the dominant conception
of the informatization of organizational operations in the literature. The
summary is necessarily brief. It does not claim to be representative of the literature in which the relevant issues have been debated, but it does seek to capture the spirit in which these issues have been analysed and, thus, to motivate the distinctive contribution of the analytical path I have chosen. In the two sections that follow, I present the key
argument of the paper. I seek to demonstrate how the pronounced analytic/
reductionist orientation of computation un-bundles and decomposes the compact texture of the operations upon which it is brought to bear. In so doing, it
provides important, technologically embodied principles for segmenting
and reassembling reality that remake the perceptual base and the skills by
means of which social agents frame and act upon the world. In the section
that follows I place the appreciation of these claims within the broader
context provided by the history of work and organizational arrangements.
A few important differences emerge. Computation is shown to drive
human agency away from the larger perceptual and action blocks that
have traditionally constituted the object of work. The paper concludes
with a coda on work and organizational implications of the computational
rendition of reality.

Structural Implications of Informatization


Research on information and the technologies by which it has been sup-
ported has predominantly focused on the investigation of what I shall refer
to as macroscopic phenomena; that is, phenomena relating to larger blocks of organizational reality indicated by such terms as structures, functions, processes, interaction patterns, departments and hierarchical layers.
The focus on macroscopic units certainly entails the drift away from the
perceptual blocks of immediate reality that are associated with situated
action. Concepts like structure and function undeniably invoke an abstract
reality beyond the immediate experience of situated agents. However,
abstract as they may be, phenomena of this sort resonate with human ex-
perience. Functions, structures and processes in social settings can more
easily be comprehended and related to the primary social reality of human
interaction (Knorr-Cetina and Bruegger, 2002) than algorithms and the logic
by which these are composed and manipulated. Computation differs in
having a pronounced microscopic orientation, invoking a link between
immediate experience and its hidden foundations that I hope to show is
counter-intuitive and beyond easy grasp.
Empirical studies conducted within the dominant macroscopic trad-
ition suggest that, over the past few decades, the gradual transformation
of the task infrastructure of organizations induced by informatization has
resulted in the establishment of new organizational processes and proced-
ures and the creation of new services. These, in turn, have brought about
one of several modes of functional and structural accommodation (Forester,
1989; Fulk and DeSanctis, 1995; Lilley et al., 2004; Woolgar, 2002; Yates and
Van Maanen, 2001; Zuboff, 1988). Even though the evidence is far from
conclusive, informatization has been associated with administrative sim-
plification, process streamlining, flatter hierarchies, better cross-functional
and cross-site communication, and, not infrequently, improved responsive-
ness to environmental contingencies. Overall, these studies could be
interpreted to suggest that the cluster of changes that are associated with
informatization carry on the industrial tradition of deploying technology
as an important means for work and administrative rationalization. They
obviously do so in the new forms computer-based technologies have made
possible. Despite the agonizing and frequently failing character of this
project, the understanding of the organizational involvement of technol-
ogical information is predominantly in terms of the operational links that information forges between functions, task modules and pro-
cesses. Alternative structural arrangements or practices emerge to accom-
modate and render these links smooth or effective.
More recently, the research focus has shifted to accommodate a range
of new developments that are associated with the diffusion of the internet
and the growing connectivity across sites, organizational and institutional
boundaries which information and communication technologies have
afforded. These developments have occasioned alternative organizational
arrangements, often referred to as networks. This has been, inter alia, the
outcome of the effort to accommodate distributed forms of work and multi-
valent inter-organizational collaboration enabled by the diffusion and
growing operational convergence of the technologies of computing (e.g.
Castells, 2001; DiMaggio et al., 2001). A key development in this respect
is the rapid diffusion of the practices of subcontracting and service out-
sourcing that are enabled by the sharing of information, significantly lower
communication costs and a global division of labour. Taken together, these
developments have contributed to the establishment of new forms of governance that rest to a considerable degree on the control and distribution of
information (Castells, 1996, 2001; Sassen, 2001).
The growing significance of information and its pervasiveness have, in
addition, changed, rather radically, the status of data ordering and infor-
mation editing, that is operations that have traditionally been humble
enough to be reserved for the back office. On the one hand, the pervasiveness
of information has enabled the comparison of tasks, functions or outcomes
that have previously remained unrelated. The deepening involvement of
information has thus been instrumental in bringing forth an organization-wide visibility in which different activities can be compared and evaluated
against one another, across settings and over time (Lilley et al., 2004;
Zuboff, 1988). In this respect, informatization has silently transformed key
premises upon which organizational action is predicated, making the
management of information the backbone of many vital organizational
operations. On the other hand, the permutability of information, coinciding
with the standardized and interoperable character of many data sources
across functional and institutional boundaries, has enabled the generation
of information out of information on a growing scale. Such a development opens up enormous possibilities for producing a variety of new and
tailor-made services made possible by the comparison and juxtaposition
of previously unrelated information sources (Ciborra, 2006; Shiller, 2003).
The post-dotcom era features companies like Google, eBay and Yahoo that offer a variety of services based on their ability to handle information editing
and data ordering.
Despite the important contributions they have made, none of the studies
mentioned above has investigated in sufficient detail the far-reaching implications of rendering organizational operations and other kinds of services
or activities as information, along the lines suggested in the introduction of
this paper. The routine, blackboxed and specialized character of the pro-
cedures by which organizational operations are modelled as information-
based transactions, and packaged into software modules, has obscured the
premises upon which organizational reality is recaptured as information
(Bloomfield, 1986). Innocent as it may seem at first glance, informatization
represents, nonetheless, an elaborate and institutionally anchored socio-
cognitive project that implies the significant and, in some cases, profound
transformation of what gets informatized. To transpose objects and oper-
ations to information does not imply mapping or describing an anterior
reality, in one or another approximation. Informatization models and, at the
same time, moulds reality. The computer screens through which reality is
mediated by means of data or information tokens are windows, masks and
blindfolds at the same time. It is vital to know in which ways technological
information reveals and hides, discloses, distorts, magnifies or conceals.
The comprehensive informatized rendition of organizational operations
has occasionally been denoted by terms such as ‘virtual’ and ‘virtualization’
(Sotto, 1998; Woolgar, 2002). The dominant understanding of virtualization
invokes the transposition of organizational tasks onto electronic media and
the possibilities (or limitations) that emerge from the dematerialization
and deterritorialization of the exigencies of production and the social pat-
terns underlying them. However, the understanding of informatization as
virtualization stays silent as regards the premises and detailed operations
through which organizational reality is rendered as information. Virtualiza-
tion does bespeak the significance of an important transition but does not
critically analyse the terms by which the social and material conditions of
production are transformed to software code and code-based transactions.
Computation does not simply transpose reality onto an electronic medium. It is
rather involved in the far-reaching decomposition of the unity, coherence
and complexion of things and social processes, as the outcome of its in-
escapable and comprehensive analytic predilection. Thus disaggregated,
reality is reassembled by passing through the bottleneck of coding and the rules and procedures with which the practice and techniques of coding are associated. Computation is involved in reassembling the universe and
reshaping the substratum upon which a substantial part of contemporary
modes of action and communication are based (Flusser, 1999, 2000, 2003;
Hayles, 1999, 2005).
In this respect, the issues I identify with computation differ from the
standard theme of human-machine interaction, a domain that I will describe
as typically macroscopic in the sense that it deals with observable parts of
reality. I am not concerned here, at least not primarily, with how humans interpret
or interact with machines (Nardi and Kallinikos, 2007). Rather, I am seeking
to disclose how computation is involved in removing a significant part of
that interaction from the human-machine interface, wrapping up and
blackboxing it. In doing so, it offers the option of interpreting the end product
at the human-machine interface, but not of easily intervening in the process
by which the end product is produced. In some cases, like computational
photography or mashups, the processes can be manipulated by the user
but only through access to specific software. Let me elaborate.

The Distinctive Character of Computation


Any attempt to codify a selected domain of reality is bound to be analytic.
It requires an initial verbal or formal description of the reference
domain, e.g. the task or service to be codified, and its adequate delimitation
vis-à-vis other related phenomena. Such a process is usually referred to as
modelling. Codification also presupposes the existence or development of a
specialized language and techniques by which the items and relationships
entailed in the original description or model can become further formalized
and ultimately codified (Cowan and Foray, 1997; Foray, 2006).
Computation epitomizes this project well. It presupposes the identifi-
cation and adequate delimitation of a reference domain and its analytic
decomposition through verbal, diagrammatic or other notation-based means.
Analytic decomposition is essential to the task of modelling. The elements
that define a particular domain of reality often occur in bundled, en bloc
forms and must accordingly be disentangled and made available to analysis
and manipulation. The disaggregation of the reference domain proceeds
through its progressive decomposition until the objects or operations have
been described in sufficient detail to allow coding to begin. For
instance the Materials Management function in the most widely diffused
Enterprise System (i.e. SAP R3) divides the totality of operations relevant
to the identification, procurement and internal distribution of inputs into
the following eight data/action categories: purchasing, external services
management, vendor evaluation, inventory management, invoice verifica-
tion, warehouse management, consumption based planning and material
ledger. Each of these categories or steps is further broken down into smaller
subcategories of items. Inventory management, for instance, is composed of
the following data or action subcategories: material master, data inventory
management, goods movement, environment, planning goods receipts, goods
receipts for purchased orders, reservations, goods issues, transfer posting
and stock transfer, print functions and physical inventory. Even these sub-
categories are broken down into smaller units. For instance the subcategory
goods issues identifies the following distinct groups of operations: deliveries
to customers, withdrawal of material for production orders, other internal
staging of material, return deliveries to vendors, scrapping and sampling
(Bancroft et al., 1996). The same logic applies to all organizational tasks or
operations that are rendered computational and to most services offered
online, e.g. banking, hotel and flight booking, item shopping.
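The stepwise character of this decomposition can be conveyed, again in a deliberately simplified form, by the kind of nested data structure with which a developer might represent it; the sketch below (Python) mirrors a few of the categories listed above but is an illustration of the logic of decomposition, not SAP's actual data model.

    # Illustrative only: a nested mapping that mirrors the stepwise decomposition
    # of Materials Management into categories, subcategories and operations.
    # This is not SAP's actual data model.
    materials_management = {
        "inventory_management": {
            "goods_movement": {},
            "goods_issues": [
                "delivery_to_customer",
                "withdrawal_for_production_order",
                "return_delivery_to_vendor",
                "scrapping_and_sampling",
            ],
            "physical_inventory": {},
        },
        "purchasing": {},
        "vendor_evaluation": {},
        "invoice_verification": {},
        "warehouse_management": {},
    }

    # Each leaf names an operation that must, in turn, be specified in enough
    # detail to be coded before the machine can run it.
    print(materials_management["inventory_management"]["goods_issues"][0])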
The categories that result from such meticulous disaggregation of reality
are recaptured in software code and reassembled into wholes or unities (following the logic dictated by the model) by means of an extensive series of automated rules and procedures that computation embodies. At this juncture, computation moves away from culturally and perceptually recognized units of reality. The computational mechanics of information processing now constructs the minute processes by which a category, say goods movement, is composed, and governs how changes in the category are recorded
and monitored. While humans may input data into the system, the link
between data and the category is fully automated and beyond human
discretion. These principles may exhibit a variation, depending on the
particular methodologies used or other contingencies, such as the nature
of the reference domain and its surrounding conditions (Adler, 2005).
However, they are present, in one degree or another, in any computational
object. They apply indiscriminately to computerized administrative tasks,
other work processes or services gone online (Cowan and Foray, 1997;
Zuboff, 1988).
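A toy example may help fix the idea that, once coded, the link between an entered data item and the category it updates runs automatically and beyond the user's discretion; the routine below (Python) is a hypothetical, drastically simplified inventory posting, not a fragment of any actual enterprise system.

    # Hypothetical sketch: a human enters a goods movement; how the record updates
    # the stock category is fixed by code and runs beyond the user's discretion.
    stock = {"material_4711": 120}          # current stock per (hypothetical) material

    def post_goods_movement(material: str, quantity: int, movement_type: str) -> None:
        """Record a goods movement and update the stock category automatically."""
        if movement_type == "goods_receipt":
            stock[material] = stock.get(material, 0) + quantity
        elif movement_type == "goods_issue":
            if stock.get(material, 0) < quantity:
                raise ValueError("insufficient stock")   # a rule enforced by the system
            stock[material] -= quantity
        else:
            raise ValueError("unknown movement type")

    post_goods_movement("material_4711", 20, "goods_issue")
    print(stock["material_4711"])   # 100: the category was updated by automated rules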
The analytic, step-wise orientation of computation to a considerable
degree recounts the segmenting and rationalizing method on which man-
agement and technology in general are predicated (Cowan and Foray, 1997;
Simon, 1969). Analytic reduction has always been key to technology, and
instrumental reason and action. In this respect computation may seem to
be just a specific manifestation of instrumental reason, a particular mode
of rationalizing the execution and the monitoring of tasks and services.
This is perhaps a principal reason why some highly specific
and distinguishing attributes of computation have been overlooked. The
analytic character of computation inaugurates a distinct stage in the project
of constructing technological objects. Computation steps beyond the mere
parsing and recomposition of reality that are intrinsic to technology or
other rationalizing methods and techniques. It does so in two important
respects.
First, in conceiving and constructing its objects, computer-based auto-
mation is led to detailing the link between the functionality of the artefacts
it constructs and the processes by which the functionality of these artefacts
is technologically sustained. Again, computational photography well ex-
emplifies this process. Visual images can be manipulated only by having
access to lower level processes by which they are composed. The same
holds true for mashups, combining the output of different applications
across media (text, voice, image). In this respect, computation represents the
technological embodiment of analytic reductionism. If expert behaviour, to
refer to another key theme with which computation has been tied (Dreyfus
and Dreyfus, 1986), is to be embodied in a machine then the ways by which
experts make decisions must be assumed, analytically reconstructed and
technically instrumented. To be sure, all automation makes assumptions
about the nature of the tasks that it automates. However, computation
differs from industrial automation in being significantly more vertically
stratified and functionally interdependent. This applies equally to stand-alone applications, large systems and information infrastructures. Even
the simplest computational tasks at the level of any application with
which humans interface (e.g. writing text, making arithmetic calculations)
need to be supported by the elaborate automation of the computational rules
and procedures by which these tasks are accomplished. This is often
predicated on:
• the elaborate vertical stratification of the application itself, in which
minute steps are subsumed under broader categories (recall the example
of the enterprise systems given above);
• a series of computational systems and resources that descend from the
application to the operating system and through a variety of intermediate
systems of rules and procedures down to the primary level of machine
language and binary parsing (a descent sketched below).
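Even a trivial operation typed at the interface sets such a descent in motion. The sketch below (Python, standard library only) exposes one intermediate stratum of that descent, the bytecode into which a simple calculation is translated before lower layers turn it into machine instructions and, ultimately, binary states; it is offered as an illustration of the layering, not as a complete account of it.

    # One glimpse of the strata beneath a trivial calculation: the source line is
    # translated into bytecode, which the interpreter and lower layers (operating
    # system, machine code, binary circuitry) carry further down.
    import dis

    def add_prices(net: float, vat: float) -> float:
        return net + vat              # what the user 'does' at the interface

    dis.dis(add_prices)               # prints the intermediate, normally invisible stratum
    print(bin(42))                    # at the very bottom, values are binary: 0b101010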
Computation also differs from pre-computational procedures and tech-
niques of information processing in having disentangled information
processing from the complexities of human perception and communication,
and automated its execution (Kittler, 1997). Traditional methods of infor-
mation processing leave open or unspecified considerable enclaves of
actions that have to be accomplished by relying on other methods such
as formal role systems, training and skill profiles, standard operating pro-
cedures, etc. To some degree this applies to computation as well, for no
technology can be all-inclusive. However, computation is evidently more
comprehensive in its regulative aspirations. Its history shows that it steadily
expands to absorb larger blocks of what has traditionally been described as
human organization and expert knowledge. In so doing, it is led to establish
and technologically embody the operational links between the comput-
ational object and the lower-level computational processes enabling and
sustaining its operation and manipulation (Borgmann, 1999; Kittler, 1997).
The ensuing controversy concerning the role of tacit knowledge in support-
ing sophisticated human skills is ultimately associated with computation’s
attempt to infringe upon and technologically reconstruct the relationship
between expert behaviour and its foundations (see Dreyfus, 2001; Dreyfus
and Dreyfus, 1986; Hayles, 2005).
Second, while the construction of computational objects is accomplished
by a variety of methods and techniques, all these must ultimately obey
the principles of a language of just two signs (Borgmann, 1992, 1999), i.e.
a language that captures the heterogeneous constitution of the world in
terms of a huge series of binary alternations. Such a claim may seem far-fetched. Do we really have to go as far as to take into account the binary
constitution of computation? I would like to suggest that this is necessary
for unravelling the logic of computation and its impact across contexts.
The binary nature of computation is implicated in a highly distinctive
attribute of computation, i.e. its ability to render interoperable aspects
of reality (e.g. different systems or applications, sound, image, text) that
despite the spectacular advances of materials technology in modernity
remained separable and part of different technical landscapes (Borgmann,
1999; Kallinikos 1998, 2006; Kittler, 1987, 1997). The computational (i.e.
binary) nature of computation matters beyond the conspicuous gains in
information processing ability and efficiency it brings about. This is mostly
shown in its capacity to transcend the qualitative differences underlying
the various domains of the natural, social and technical reality.
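A brief sketch may make the claim palpable: text, numbers and image fragments, whatever their cultural differences, are all held and exchanged as sequences of bits. The example below (Python, standard library only) illustrates this common denominator in general terms rather than the workings of any particular system.

    # Heterogeneous items reduced to the same denominator: sequences of bits.
    import struct

    text_bytes = "invoice".encode("utf-8")            # text as bytes
    number_bytes = struct.pack(">d", 1234.56)         # a floating-point amount as bytes
    pixel_bytes = bytes([255, 128, 0])                # an (orange) RGB pixel as bytes

    for label, blob in [("text", text_bytes), ("number", number_bytes), ("pixel", pixel_bytes)]:
        bits = "".join(f"{byte:08b}" for byte in blob)
        print(label, bits[:32], "...")                # nothing but binary alternations

    # Because all three are bit sequences, they can be stored, transmitted and
    # recombined by the same computational machinery, given suitable standards.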
It is reasonable to consider the construction of interoperable systems
and technologies as an issue of standards and protocols by which different
systems and artefacts are brought to bear and communicate (exchange
data) with one another (Bowker, 2005). Standards and protocols are key
devices that negotiate data exchanges between different information sources
and systems. It is, however, important to recognize that standards nego-
tiate exchanges between information sources that are computationally
constituted and thus potentially exchangeable. There is no way to make,
say, sound and text interoperable, nor would it be possible to fuse different
images, as computational photography does, unless one traces them to a
common denominator. Like market prices, computation cuts across differ-
ences. By contrast, spare parts of different industrial devices can seldom be
substituted for one another, for they are discontinuous. Vehicle spare parts
may be useless for television sets. Sewage networks do not interoperate with
electricity or gas networks. It is important in this context to point out that
industrialism never managed to construct other than a fragmentary technical
landscape made of separate artefacts and technical processes that never
intersect. Industrial technology has not up to the present had at its disposal
a unifying system of principles on the basis of which technical artefacts and
processes could be rendered interoperable. The interoperability of different
information systems or artefacts that is so characteristic of contemporary
technologies of computing would have been impossible without the binary
constitution of computation (Kallinikos, 2006).
In thus describing the distinctive character of computation, I do not mean
to imply that differences in computational objects that develop beyond
these core characteristics of the technology of computation are insignificant.
Such differences may reflect the particular software-based methodologies
adopted, the very nature of the reference domain (objects, work practices,
services) as well as a number of contingent characteristics that recount
the specific character of the context (implementation strategies included)
within which computational applications are brought to bear. Digital
imaging technologies, for instance, may be used quite differently in medical
practice and filmmaking. Similarly, the same database can be interrogated
in rather divergent ways by, say, criminal authorities and insurance com-
panies or advertising agencies. Yet, the varying ways by which computa-
tional systems and artefacts are used in particular circumstances or settings
tell only part of the story. Another significant part is recounted by those
generic and institutionally constructed attributes that cut across particular
domains and artefacts.

The Vertical Constitution of Computational Objects


Computation thus entails a systematic retreat from the composite and
extendible status of reality and a correspondingly stronger focus on those
underlying processes by which various domains of reality are sustained.
The world as it is given to perception, bodily manipulation or experience
is relentlessly partitioned into the elementary, microscopic elements by
which it is supposed to be made and reassembled into recognizable entities
by recourse to elaborate systems of automated rules and procedures.
In this respect, computation inaugurates a distinct stage in the history
of automation and other rationalized methods and techniques of work and
administration. Computational technology supports the functionality of
the objects it constructs by providing the infrastructure, the technical links
by which such functionality is made possible. Computational objects are
vertically stratified. They are manipulable on the screen and manipulability
is made possible through technological access to foundations. It is, I suggest,
through the very construal and computational (binary) construction and
control of these invisible, microscopic foundations (imagined or real) that
the recurrent behaviour of the computational object (e.g. a task, a service)
can be sustained at the level of the human-machine interface (Borgmann,
1999). Let me elaborate further.

The distinctively analytic character of computation is closely associated
with the conception and control of the mechanics through which a particu-
lar aspect of reality (an object, a process, a transaction) is reconstructed
and its operation sustained technologically (i.e. through automation). In
order to be able to sustain its operations, computational technology has to
establish the link between higher objects and processes with which
humans interface, via the artefact screen, and the lower level functions
supporting the computational behaviour of the informatized objects and
processes. This is a key principle of computation. For instance, online
services usually entail a front-end (the interface with customers) and a
back-end comprising an elaborate edifice of databases and other systems and
computational resources supporting its smooth delivery. Back-end pro-
cesses are themselves sustained by a variety of other computational
systems that provide the underlying support of automated information
processing. The same applies by and large to most administrative tasks
or work processes that are carried out by computer-based systems. The
smooth operation of the enterprise systems mentioned at the beginning of
the preceding section is made possible through elaborate database solutions
and other computational resources that support the task and function
modules at which the users interface with the system.
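The split can be gestured at with a minimal sketch: a front-end operation that customers see, resting on a back-end layer of data access. The code below (Python) assumes a hypothetical booking service backed by a local SQLite database and stands in for what in practice is an elaborate edifice of interdependent systems.

    # Minimal sketch of a vertically stratified service: a front-end operation resting
    # on a back-end data layer. The booking service and its table are hypothetical.
    import sqlite3

    conn = sqlite3.connect(":memory:")        # stands in for the back-end data store
    conn.execute("CREATE TABLE rooms (id INTEGER PRIMARY KEY, booked INTEGER DEFAULT 0)")
    conn.executemany("INSERT INTO rooms (id) VALUES (?)", [(1,), (2,), (3,)])

    def book_room(room_id: int) -> str:
        """Front-end operation: what the customer experiences as 'booking a room'.

        Beneath it sit the sqlite3 module, the database engine, the operating
        system's file and memory handling and, ultimately, binary parsing.
        """
        row = conn.execute("SELECT booked FROM rooms WHERE id = ?", (room_id,)).fetchone()
        if row is None or row[0]:
            return "not available"
        conn.execute("UPDATE rooms SET booked = 1 WHERE id = ?", (room_id,))
        return "confirmed"

    print(book_room(2))    # 'confirmed' appears at the interface; the strata stay hidden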
The differentiation of front-end and back-end processes provides a good
illustration of the vertical stratification of computational objects. At the
same time, the smooth functioning of these objects is supported by a vertical
depth that reaches from the level of the application, through the operating
system down to the very mechanics of binary parsing (Borgmann, 1999).
Computational objects have vertical depth. This is, probably, what prompted
Kittler (1997) to make his provocative (and evocative) statement, ‘there is no
software’. At the very bottom there is just binary parsing (see also Bowker,
2005). The microscopic orientation of computation and its relentless an-
alytic reduction are therefore driven by the goal of the comprehensive auto-
mation of reality intrinsic to it. Such a project is rendered viable through
the extensive links computational technology is able to forge between
higher-level technological objects at the human interface and the digital
mechanics by which they are sustained. In this respect, computation is
both unique and path breaking.
Some of the issues I am at pains to disclose here are distantly re-echoed
in the debate concerning the relationship between tacit and formal know-
ledge and the effects which computation has been claimed to have on expert
knowledge and human skills in general. The reservation with which
computerization has traditionally been met in theory and practice is closely
associated with the limitations of the analytic methodological procedure
intrinsic to automation, and the drift away from the macroscopic blocks
of reality with which human experts have traditionally dealt. The relevant debate is complex and controversial and there is no reason to deal
with it here. Let me just reiterate that the analytic decomposition and for-
malization of skills have been claimed to lead to the evaporation of people's
tacit knowledge and, ultimately, to the loss of expert skills across a wide
and multi-disciplinary literature (e.g. Dreyfus and Dreyfus, 1986; Zuboff,
1988). A similar criticism has, more recently, been directed against the way
search engines construct information itineraries by blindly (on the basis of
syntactic similarities) linking data items to one another, without concern
for the underlying semantics and the practices by which individuals and
groups assemble data items and evaluate information (Bowker and Star,
1999; Brown and Duguid, 2000; Dreyfus, 2001). Fully justified as these
criticisms are, they have seldom been able to expose in toto the logic of
computation and provide a rationale as to why it is driven away from the
extendible and solid character of things and the practices by which these
are usually addressed.
The far-reaching importance of the binary nature of computation and
its inescapable vertical stratification emerge against the background of the
contrast that computational objects maintain with other cultural or tech-
nical artefacts constructed on the basis of analogue (composite) models of
reality. The latter are dense systems, made of interpenetrating elements or inputs that lack the kind of standardization underlying digital or binary
systems (Goodman, 1976; Kallinikos, 2002). This is the reason why ana-
logue models of reality cannot be decomposed into more elementary or primary standardized elements and reconstructed through rule-based
(and ultimately machine-run) combinations of these elements. If there
is consequently a limitation (in instrumental terms) of analogue-based
artefacts, this consists in them failing to conform to a major instrumental
principle, that is, the adequate separation of the final outcome from the
process of arriving at it (Kallinikos, 2002). The two merge together in ways
that make analogue models of reality very much dependent on the particular
contexts (individual proclivities or other social attributes) in which they
have been brought to being. As a consequence, artefacts constructed by
recourse to analogue models of reality remain distinct in ways that make
them neither commensurable nor compatible with one another. There is no
common set of primary attributes to which they can be reduced. In other
words, they remain vertically non-stratified.
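The contrast may be illustrated by the elementary operation of digitization: a continuous (analogue) variation becomes computable only once it has been sampled and quantized into standardized, discrete values. The sketch below (Python, standard library only) is a generic illustration of that reduction, not a claim about any particular medium.

    # Digitization as the reduction of a continuous variation to standardized,
    # recombinable elements: sample an 'analogue' waveform and quantize it.
    import math

    SAMPLE_POINTS = 8          # how finely the continuous signal is partitioned
    LEVELS = 16                # how coarsely each sample is standardized

    samples = []
    for n in range(SAMPLE_POINTS):
        analogue_value = math.sin(2 * math.pi * n / SAMPLE_POINTS)      # continuous in principle
        quantized = round((analogue_value + 1) / 2 * (LEVELS - 1))      # one of 16 standard levels
        samples.append(quantized)

    print(samples)                                   # e.g. [8, 13, 15, 13, 8, 2, 0, 2]
    print(["{:04b}".format(s) for s in samples])     # each level is just four binary digits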
By contrast, the power of computation lies in its simple binary language
on the basis of which seemingly irreducible realms of the real can be transcended (Borgmann, 1999). By recourse to binary parsing, the tech-
nology of computation is ultimately able to establish the link between
higher computational objects and lower level functions through which
the functionality of the object is sustained. Computation as a technological
paradigm is relentlessly reductionist, assuming reality to be ultimately
reducible to binary coding; higher-level functions are supported by huge
series of digital calculations taking place underneath. The meticulous
decomposition of reality, which I therefore attribute to computation, is
crucially related to this project. I do not wish to, and cannot, enter here into the vexed
question as to whether computation is predicated on reasonable assump-
tions with respect to those aspects of social and natural reality that it
subsumes under its regulative regime. After all, technology is crucially
based on functional simplification of one or another type (Luhmann, 1993).
Technology does not truthfully map the world. It reconstructs it in ways
that make it amenable to mastery and regulation (Kallinikos, 1995).

Addendum on Agency and Artefacts


The claims put forth in the preceding sections have sought to expose the
distinctive nature of the technology of computation by a series of arguments
that disclose its comprehensive analytic/reductionist predilection. The
far-reaching implications of the growing socio-economic involvement of
computation emerge perhaps more clearly against the historic background
of work and organizational practices and the forms they have customarily
assumed. Despite the long-running rationalization of organizations and the
heavy industrial involvement of technology in the manufacturing and
distribution processes, traditional methods of management and work never
moved as far away from the palpable status and complexion of reality as
computation is currently doing.
Traditionally, human involvement with the world has been bound up
with the enduring shape of reality; that is, a reality mostly constituted by
the concrete, tangible, standing-apart and observable character of things
and their inter-relationships (Cassirer, 1955). In particular, the history of
work could be read as a protracted effort to deal with and master the recal-
citrant materiality and tangibility of the world (Braverman, 1974; Marx,
1954/1956). Its heavy cultural embeddedness notwithstanding, work has
always been immersed in the materiality of life. To work has meant to
act upon and transform the physical and resistant character of things, as
they are given to immediate perception and bodily dexterity or sensibility
(Arendt, 1958; Zuboff, 1988). The elaborate segmentation of the world,
brought about by the deepening division of labour in the recent history of
industrial capitalism, seldom moved as drastically beyond the extendibility
and presentability of materials and things as computation does (Flusser,
1999, 2000, 2003).
Technological information produced by means of computation breaks
with this primordial human dependence on the extendible and presentable
character of the world, and the tangibility and corporeality of labour. It does
so by seeking to penetrate beyond the shape of things, into the unobservable
substratum by which they are made or are assumed to be made, and recon-
struct them as operations of information. In this respect, computational
objects or artefacts are constructed and sustained by recourse to cognitive
operations (computations) that ultimately elude the human embeddedness
in the world, and the inexorably tangible and situated character of work
(Flusser, 2003; Kallinikos, 1995; Zuboff, 1988).
Certainly, cognitive processes are always at a remove from reality. Long
before the advent of computation, institutionally anchored modes of cog-
nition have been involved in the distancing from immediate reality and its
considerable partition and itemization. Especially in the form of scientific
work, cognition and knowledge have always sought to penetrate beyond
the observable and given. Even though it may occasionally depart from
observation and experience, science always evokes unobservable correlates
and processes (i.e. scientific abstraction) that are assumed to develop
beneath the enduring shapes of reality; or, at least, beyond the reality as this
is given to perception and experience. Industrial capitalism has to a con-
siderable degree been based on the transformation of scientific findings into technological artefacts and processes. In this way, the abstractions of science
reached down into the material character of work, through the extensive
involvement of industrial technology in the parsing, instrumentation, mon-
itoring and control of work processes (Zuboff, 1988). However, as claimed
earlier, partitioning in the case of computation does not simply entail the
breaking down of the units of the world but a relentless drive to discover
the microscopic foundations of these units and reconstruct them as software
code. Rather than producing tangible bits and pieces, however small and
scattered these may be, computation ends up in a digital elementalism that
reduces the complexion and durability of the world to binary combinations.
Digital dust slowly covers over the shape of things (Kallinikos, 2006). The
granularity of the computational units (binary digits) is so thin as to elude
immediate involvement and observation. For this reason, the implications
of computation are elusive.
The analytic/reductionist predilection of cognition has also been centrally
implicated in the everyday life of organizations through the mundane medium of
administration; as an instance of cognitive work, administration undeniably
signifies a step away from the involvement with the materiality of the
world. Administrative processes are inevitably entangled with analytical
categories, abstractions, numerical systems and technical notation, and
other modes of ordering and calculation by means of which the monitoring
and control of collective human effort takes place (Cline-Cohen, 1982; Rose,
1999; Zuboff, 1988). To a certain degree, computation can be said to derive
from the effort to systematize the methods of information processing cast
in traditional media. A recollection, in particular, of the diffusion of num-
eracy in modernity suggests that numbers and numerical systems have
critically been involved in the establishment of a reality different from
that of immediate experience. Despite their apparent comprehensibility
and semantic simplicity, numbers and numerical relationships are abstract
cognitive entities that describe highly selective aspects of the world,
standardizing and homogenizing it. It is not by accident that numbers
lack referential attribution, i.e. their reference to reality cannot be shown
or demonstrated. Therefore, the substantial involvement of numerical
systems in organizations has produced a reality brought forth by the con-
stitutive power of numeracy and its particular applications, i.e. a reality of
frequencies, ratios, indices and other abstract relationships. However, the
connection between different information items or sources in traditional
media and environments was the product of skilful accomplishment
performed by humans (Dreyfus, 2001; Putnam and Cooren, 2004). The
technological link (automated information processing) to the lower level
functions necessary to support massive calculations at a higher level had
to await the discovery of the digital computer.
Computation and technological information therefore continue this
tradition of cognitive analysability. They give it, though, an interesting shift
and endow it with a momentum that had been unimaginable without two
important and inter-related innovations. The first relates to the conception
and understanding of information processing as an independent domain
of social life, separable from social interaction and communication within
which information exchange and processing traditionally took place. As
Kittler (1997) has shown, the innovation of separating interaction, on the
one hand, from information processing and communication, on the other,
has historically been introduced by writing. The second innovation is asso-
ciated with the technological embodiment of the rules for the automation of
information processing that computation signifies. Information processing
has thus been separated from communication on a massive scale. Data can
be produced out of data without needing to transmit or exchange them. The
formidable processing capacity of contemporary technology of computing
has thus been instrumental to the project of reconstructing the world as a
huge series of binary computations. The digital fragments (i.e. data feeds,
data and code patches) produced by computation are inescapably related to
the institutional and technical separation of automated information process-
ing and all the differences it implies when compared to pre-computational
administration (Kallinikos, 2006).
The implications of these developments for humans and the ways they
perceive and act upon the world are far-reaching. If I am right, then a substantial part of the perceptual inputs of human agents is, and will increasingly come to be, provided by the technology of computation and the variety of
systems and artefacts it sustains. Whether as computer-constructed visual
images and units or in the forms of discrete symbol tokens (alphabetic or
numerical systems or other marks and digits), these perceptual inputs will to a considerable degree be derived from a microscopic reality (nano processes) about
which humans do not and cannot have knowledge by experience and
acquaintance. It is not therefore simply that digital systems and artefacts
mediate between humans and the world. The comprehensive and vertically
stratified character of computation as analysed in this paper suggests that
the procedures of reassembling reality intrinsic to it are bound to shape the
forms human agency assumes in contemporary instrumental settings. As
data and information items proliferate and engulf human operations, agency
increasingly assumes the form of remaking the shape of things out of the data
fragments, the digital dust, as it were, left over by the relentless segmentation
of the world coinciding with the computational rendition of reality.
No doubt, significant parts of this reality still preserve their coherence
and semantic comprehensibility. Computational objects at the human inter-
face often deal with physical or culturally recognized units of the world,
i.e. services, objects, tasks or other human agents in electronically medi-
ated transactions. Nevertheless, an equally large and, most crucially,
growing part of that reality must be reassembled to meaningful patterns
by recombining the data feeds and patches of code amply provided by
the growing involvement of the technology of computation in every walk
of life. Such a task must in addition be pursued in ways that conform to
the multivalent network of constraints computational procedures of com-
bination impose. In this respect, computation is performative (it outlines
and often prescribes what has to be done to accomplish a given task) not
descriptive. At the same time, the links which computation enables between
higher- and lower-level processes open up new avenues for exploring and
constructing reality. Many things, inconceivable a few decades ago, can
be brought into being, while old tasks can be accomplished immensely more
easily and effectively.

Coda
In this paper, I have sought to produce an account of the technology of
computation in terms other than technological, focusing on the forms
through which it reshapes the raw material by which humans perceive
and act upon the world. The vertical stratification of computational objects
represents an important technological innovation that establishes the link
between processes of a higher order that occur at the human interface and
the computational mechanics by which such processes are sustained. A
new, potent and integrated procedure for building up reality is thus estab-
lished, whereby composite units of the world are always made up of digital
fragments produced by the technology of computation. Such an under-
standing of computation departs significantly from the effort to chart the
functional and structural implications of the deepening involvement of
information and communication technologies in organizations, character-
istic of the overwhelming bulk of the literature in the field.
The implications of computation as analysed in this paper provide a
critical set of conditions whose exposition and understanding is crucial
for appreciating the character of wider organizational and economic de-
velopments that take place in the contemporary world. One such key devel-
opment is the growing significance and relatively autonomous character
of the information processes that emerge out of the vast possibilities of
recombining and recycling data and information items. A new instrumental
habitat keeps on forming around these trends, about which I have said almost
nothing here (see e.g. Kallinikos, 2006). Large, growing and interoperable
information infrastructures are taking shape as computational reduction
overcomes the intrinsic heterogeneity and the often-irreconcilable dif-
ferences of the varying domains of the real (Bowker, 2005; Ciborra, 2006;
Shiller, 2003). An increasing variety of information sources are brought
to bear upon one another, and will increasingly do so in the future, as the
outcome of the re-constitution of the heterogeneous character of reality as
permutable information.
Very little research has so far been done on these matters. This is not sur-
prising given the relatively recent character of the relevant developments.
However, the issues identified here as the outcome of the diffusion of the
technologies of computing owe much to a distinctive perspective that derives
from the understanding of the role of media and the way these have shaped
collective forms of perception, cognition and action (e.g. Goodman, 1976;
Hayles, 1999, 2005; Flusser, 2000, 2003; Kittler, 1987, 1997; Manovich, 2001).
The investigation of the implications that computation has for human agency and organizations must therefore be underlain by a historical aware-
ness of the evolution of media. It must accordingly be performed in ways
that draw upon empirical material that covers larger time scales. Even though
such a choice is not irreconcilable with the constructivist contextualism
(see e.g. Kaptelinin and Nardi, 2006; Zuboff, 1988) that has dominated
the field over the last two decades or so, it must seek to surpass the strong
limitations that may derive from a constricted research timescape (Pollock
and Williams, 2009).

References
Adler, P. S. (2005) ‘The Evolving Object of Software Development’, Organization
12(3): 379–99.
Arendt, H. (1958) The Human Condition. Chicago, IL: Chicago University Press.
Bancroft, N. H., Seip, H. and Sprengel, A. (1996) Implementing SAP R/3. Greenwich, CT: Manning.
Bloomfield, B. (1986) Modelling the World: The Social Constructions of Systems
Analysts. Oxford: Blackwell.
Borgmann, A. (1992) Crossing the Postmodern Divide. Chicago, IL: The University
of Chicago Press.
Borgmann, A. (1999) Holding On to Reality: The Nature of Information at the Turn
of the Millennium. Chicago, IL: The University of Chicago Press.
Bowker, G. (2005) Memory Practices in the Sciences. Cambridge, MA: The MIT
Press.
Bowker, G. and Star, S. L. (1999) Sorting Things Out: Classification and Its Con-
sequences. Cambridge, MA: The MIT Press.
Braverman, H. (1974) Labour and Monopoly Capital. New York, NY: The Monthly
Review Press.
Brown, J. S. and Duguid, P. (2000) The Social Life of Information. Boston, MA:
Harvard Business School Press.
Cassirer, E. (1955) The Philosophy of Symbolic Forms, 3 Vols. New Haven, CT:
Yale University Press.
Castells, M. (1996) The Rise of Network Society. Oxford: Blackwell.
Castells, M. (2001) The Internet Galaxy. Oxford: Oxford University Press.
Ciborra, C. (2006) ‘Imbrications of Representations: Risk and Digital Technologies’,
Journal of Management Studies 43(6): 1339–56.
Ciborra, C. and Lanzara, G. F. (1994) ‘Formative Contexts and Information
Technology’, Accounting, Management and Information Technologies 4: 611–26.

200
On the Computational Rendition of Reality
Jannis Kallinikos
Cline-Cohen, P. (1982) A Calculating People: The Spread of Numeracy in Early
America. Chicago, IL: The University of Chicago Press.
Cowan, R. and Foray, D. (1997) ‘The Economics of Codification and the Diffusion
of Knowledge’, Industrial and Corporate Change 6(3): 595–622.
DiMaggio, P. J., Hargittai, E., Russell Newman, W. and Robinson, J. P. (2001) ‘Social
Implications of the Internet’, Annual Review of Sociology 27: 307–36.
Dreyfus, H. I. (2001) On the Internet. London: Routledge.
Dreyfus, H. I. and Dreyfus, S. E. (1986) Mind over Machine. New York, NY: Free
Press.
Flusser, V. (1999) The Shape of Things: A Philosophy of Design. London: Reaktion Books.
Flusser, V. (2000) Towards a Philosophy of Photography. London: Reaktion Books.
Flusser, V. (2003) Die Schrift: hat Schreiben Zukunft? Athens: Potamos (in Greek), originally published in German, 1987.
Foray, D. (2006) The Economics of Knowledge. Cambridge, MA: The MIT Press.
Forester, T., ed. (1989) Computers in the Human Context. Oxford: Blackwell.
Frege, G. (1980) The Foundations of Arithmetic, foreword by J. L. Austin. Oxford: Blackwell.
Fulk, J. and DeSanctis, G. (1995) ‘Electronic Communication and Changing
Organization Forms’, Organization Science 6(4): 337–49.
Goodman, N. (1976) Languages of Art. Indianapolis, IN: Hackett.
Goody, J. (1986) The Logic of Writing and the Organization of Society. Cambridge:
Cambridge University Press.
Hayles, K. (1999) How We Became Posthuman: Virtual Bodies in Cybernetics,
Literature and Informatics. Chicago, IL: The University of Chicago Press.
Hayles, K. (2005) ‘Computing the Human’, Theory, Culture and Society 22(1): 131–51.
Kallinikos, J. (1995) ‘The Architecture of the Invisible: Technology is Representation’,
Organization 2(1): 117–40.
Kallinikos, J. (1998) ‘Organized Complexity: Posthumanist Remarks on the
Technologizing of Intelligence’, Organization 5(3): 371–96.
Kallinikos, J. (2002) ‘Re-opening the Black Box of Technology: Artefacts and Human
Agency’, in L. Appelgate and R. Galliers (eds) 23rd Conference in Information
Systems, pp. 287–94. Barcelona.
Kallinikos, J. (2006) The Consequences of Information: Institutional Implications
of Technological Change. Cheltenham: Edward Elgar.
Kaptelinin, V. and Nardi, B. A. (2006) Acting with Technology. Cambridge, MA:
The MIT Press.
Kittler, F. (1987) Gramophone, Film, Typewriter. Stanford, CA: Stanford University
Press.
Kittler, F. (1997) Literature, Media, Information Systems. Amsterdam: OPA.
Knorr-Cetina, K. and Bruegger, U. (2002) ‘Global Microstructures: The Virtual
Societies of Financial Markets’, American Journal of Sociology 107(4): 905–50
Lanzara, G. F. and Patriotta, G. (2001) 'Technology and the Courtroom', Journal of
Management Studies 38(7): 943–71.
Lilley, S., Lightfoot, G. and Amaral, P. (2004) Representing Organization. Oxford:
Oxford University Press.
Luhmann, N. (1993) Risk: A Sociological Theory. Berlin: De Gruyter.
Lyman, P., Varian, H. R. and Associates (2003) How Much Information, http://www.
sims.berkeley.edu:8000/research/projects/how-much-info-2003/index.htm
Manovich, L. (2001) The Language of New Media. Cambridge, MA: The MIT Press.
Marx, K. (1954 and 1956) Capital, 2 vols. Moscow: Progress Publishers. Originally published in 1867 and 1885, respectively.
Megill, A. (1994) Rethinking Objectivity. London: Duke University Press.
Nardi, B. and Kallinikos, J. (2007) ‘Human-Computer Interaction’, in W. Donsbach
(ed.) International Encyclopedia of Communication. Oxford: Blackwell.
Orlikowski, W. J. (2000) ‘Using Technology and Constituting Structures: A Practice
Lens for Studying Technology in Organizations’, Organization Science 11(4):
404–28.
Putnam, L. and Cooren, F. (2004) ‘Alternative Perspectives on the Role of Text and
Agency in Constituting Organizations’, Organization 11(3): 323–33.
Raskar, R. and Tumblin, J. (2007) Computational Photography. Wellesley, MA:
A. K. Peters.
Rose, N. (1991) ‘Governing by Numbers’, Accounting, Organizations and Society
16(7): 673–92.
Rose, N. (1999) Powers of Freedom: Reframing Political Thought. Cambridge:
Cambridge University Press.
Sassen, S. (2001) The Global City. Princeton, NJ: Princeton University Press. First
edition in 1991.
Shiller, R. J. (2003) The New Financial Order. Princeton, NJ: Princeton University
Press.
Simon, H. A. (1969) The Sciences of the Artificial. Cambridge, MA: The MIT Press.
Sontag, S. (1977) On Photography. London: Penguin.
Sotto, R. (1998) ‘The Virtualization of the Organizational Subject’, in R. Chia (ed.)
Organized Worlds: Explorations in Technology and Organization with Robert
Cooper. London: Routledge.
Woolgar, S., ed. (2002) Virtual Society? Technology, Cyberbole, Reality. Oxford:
Oxford University Press.
Yates, J. and Van Maanen, J., eds (2001) Information Technology and Organizational
Transformation. London: Sage.
Zuboff, S. (1988) In the Age of the Smart Machine. New York, NY: Basic Books.

Jannis Kallinikos is Professor and Research Chair in the Information Systems and
Innovation Group, Department of Management, London School of Economics.
Major research interests entail the study of the practices, technologies and formal
languages by which organizations are rendered predictable and manageable and the
modes by which current institutional and technological developments challenge
the organizational forms that dominated modernity. Some of these themes are
analysed in detail in his recent book The Consequences of Information: Institutional
Implications of Technological Change (Edward Elgar, 2006). Address: London
School of Economics, London, UK. [email: J.Kallinikos@lse.ac.uk]
