CHAPTER 1
Building a Digital Society
It is important that many of the structural possibilities of a digital society were foreseen
in the early days of electronic computing (and even before). Others were either unantici-
pated or came more from the pages of fantasy novels than from the rational projection
of technologists. Parallel processes were in play. This first chapter casts its eye firmly
on the futurism of the past, specifically from the advent of modern computer science
in 1936 to the crisis of undue complexity in 2007. In this brief history, I will recall how
digital technologies, through a virtuous circle of mediation, came to reinforce and
implement particular ways of conceptualizing and ordering society in the late twen-
tieth century. As the application of digital technologies gathered pace from the 1970s
onwards, the various imperatives driving technological change were also reshaping
the dominant political structures in developed and developing societies. Because of
the apparent confluence of technical and social change, it has become commonplace
to argue that the impact of digital technologies is comparable to that of the Industrial
Revolution (Castells 1996). An ‘information revolution’ is seen to mark an irreversible
transition from physical to intangible (untouchable) commodities and actions, and
from embodied to mediated social processes (Leadbeater 2000). More cautious scholars
have argued that it is vitally important to recognize that there is a crucial element of
continuity with the previous ‘industrial’ age in terms of a progressive technical auto-
mation of human functions and social organization (Webster 2006). In that respect, the
digital media have a substantial and fascinating history, and it is worth situating these
arguments against the past evolution of digital society.
Information Society and the Atomic Age
The digital technology of today finds its origins in the mechanical calculating machines
invented during the nineteenth century. The ‘analytical engine’ conceived by Charles
Babbage in the 1830s was the first machine intended to process and store information
with multiple purposes and flexible configurations of usage (Swade 2001). A prototype
of this machine was completed by the beginning of the twentieth century. The storage
of information records on ‘punched’ cards was also an innovation of the nineteenth
century that would later become a key component of computing in the twentieth cen-
tury. In 1936, Alan Turing introduced the concept of the ‘Turing machine’, a device
that would make calculations based upon a large store of printed information which
could be selectively applied for mathematical processing (Petzold 2008). Turing took
this concept further to demonstrate the idea of a “universal machine’ that could read
the description of any computational process (an ‘algorithm’) and then simulate its
operation. Turing was one of the most significant figures in modern mathematics and
as a consequence, following the outbreak of the Second World War, he was recruited to
work at Britain's secret code-breaking centre at Bletchley Park. Turing famously devised
the ‘bombe’ machine in order to decipher the secret codes produced by the German
cryptological machine (the 'enigma') (Copeland 2004).
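The operation Turing described, a machine stepping along a tape of symbols under a fixed table of rules, can be sketched in a few lines of modern code. This is an illustrative aside rather than material from the chapter: the rule-table format, function names and the bit-flipping example are my own assumptions, chosen only to make the concept concrete.

```python
def run_turing_machine(tape, rules, state="start", halt="halt", max_steps=1000):
    """Simulate a simple one-tape Turing machine and return the final tape.

    `rules` maps (state, symbol) -> (symbol_to_write, move 'L'/'R', next_state).
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    for _ in range(max_steps):
        if state == halt:
            break
        symbol = cells.get(pos, "_")  # '_' marks a blank cell
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    # Read the tape back, left to right, trimming blank cells at the edges
    return "".join(cells[i] for i in range(min(cells), max(cells) + 1)).strip("_")

# Example rule table (hypothetical): flip every bit, halt at the first blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
```

The 'universal machine' insight is that the rule table itself is just more data: one machine of this kind can read another machine's table from its tape and simulate it.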
The global conflagration that killed between 50 and 70 million people in the mid-
twentieth century occurred on the cusp of several major scientific breakthroughs,
including not only computational machines, but also modern electronics and nuclear
physics. In that respect, the war years (1939-45) were as much a scientific and techno-
logical contest as they were a military one. The most technologically advanced nations
in the world, Britain, the United States and Germany, effectively conscripted their scien-
tific talents and applied them relentlessly to military applications, culminating in the
advent of computers, missiles and the atomic bomb in the 1940s. It is in that context
that Konrad Zuse developed in 1941 the first programmable machine operated through
information stored in binary code. The United States built the first electronic computer
in 1941 and Britain developed an electronic device with limited programmability (the
'Colossus') in 1944 (Copeland et al. 2006). In 1942, Britain took the momentous decision
to share all of its scientific secrets with the United States, and the collaboration between
the two countries enabled them to surpass Germany in the fields of logical computing
and atomic weaponry. Needless to say, the atomic bomb, and its use against Japan in
1945, was an epochal moment in human history. The significance of the emergence of
modern computer science, however, was kept under tight secrecy, and did not become
fully apparent until a number of years after the war.
The United States Army built the ENIAC device in 1946 to aid in the successful deliv-
ery of missile weapons, whilst Britain built the first programmable electronic comput-
ers (the 'Manchester computers') between 1948 and 1950. Accordingly, the pursuit of
electronic computing - in primitive but strategically important forms - by the major
antagonists during the Second World War in the 1940s is commonly seen as heralding
what has been called the 'information age'. The conflict had brought together large
numbers of scientists, academics and technicians on an unprecedented scale and had
demonstrated how major technical achievements could be made quickly through such
systematic collaboration. It was this experience that underpinned the decision to mas-
sively expand university and technical education in the post-war decades. In making
his assessment of these developments for the future, Vannevar Bush, the director of the
Federal Office of Scientific Research and Development in the United States, wrote an
essay in 1945 in which he reflected on the growing specialization of knowledge and the
new tools for managing information that would become essential in the post-war world.
Famously, Bush projected the imminent arrival of a desktop information management
machine that he called the 'Memex'. The Memex would facilitate the storage, retrieval
and, most critically, the linkage of information customizable to the needs of each user.
A memex is a device in which an individual stores all his books, records and communications,
and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an
enlarged intimate supplement to his memory. It consists of a desk, and while it can presumably be
operated from a distance, it is primarily the piece of furniture at which he works. On the top are
slanting translucent screens, on which material can be projected for convenient reading. There is a
keyboard, and sets of buttons and levers. Otherwise it looks like an ordinary desk. In one end is the
stored material. The matter of bulk is well taken care of by improved microfilm. Only a small part
of the interior of the memex is devoted to storage, the rest to mechanism. Yet if the user inserted
5000 pages of material a day it would take him hundreds of years to fill the repository, so he can
be profligate and enter material freely.

Vannevar Bush (1945) 'As We May Think', The Atlantic Monthly, 176(1): 101-8
The development of ‘mainframe’ computers in the 1950s and 1960s produced rapid
leaps in the application of electronic computing to solving advanced mathematical
problems. These machines were far from the desk-based device envisioned by Vannevar
Bush, commonly taking up the size of an entire room or more. Mainframes required a
massive amount of power and a large team to maintain and operate. Nonetheless, the
energies spent upon the development of these machines stemmed from a widespread
recognition that the concentration of information in forms that could be processed in
any number of ways would open up enormous potentials for scientific development.
Computerization would simultaneously solve the problem of memorizing and manag-
ing all that information. The speed of electronic processing promised to overcome the
time- and scale-based limitations of human thinking. This step-change in efficiency
could obviously be applied to scientific experiments, but also to any number of large
and complex processes employed in military, bureaucratic and manufacturing applica-
tions. 'Information management' would no longer be a technique of making and
maintaining records, but rather a dynamic process of experimentation that employed
digitized records (‘data’) as its raw material.
Cold War and White Heat
The 1950s and 1960s were characterized by the onset of the ‘Cold War’, a period in
which the wartime allies of the capitalist West and communist East were pitted
against each other in an intense scientific and technological contest to master the
new technologies of the age. These (dangerous) rivalries were also expressed in their
respective desire to demonstrate the supremacy of their opposing economic systems.
As such, the potential of computing to improve the efficiency of industrial production
was quickly recognized both by state-owned enterprises in the communist bloc and the
private industrial corporations of the Western world, in which the United States had
now become predominant. The pursuit of ‘information technology’ was intended to
transform the productive process of global industry, with this modernization furnish-
ing a capacity to rapidly develop and commercialize any number of new technologies.
In 1963, the British Prime Minister, Harold Wilson, referred to the ‘white heat’ of a
technological age. The focus of commercial competition was therefore shifting from
territorial expansion to the pursuit of more efficient industries and markets via rapid
automation. Three years before, US President Dwight D. Eisenhower had already spoken
of the new institutional form of scientific research and its co-evolution with what he
called the ‘military-industrial complex’ (1961).
It was the machinery of ‘high technology’ that caught the public imagination in the
1960s, via the 'space race', nuclear power and the domestication of electronics (notably
television). The new centrality of information management, however, subsequently
proved to be an equally profound development in the remaking of the modern world.
By the 1970s we had entered an era in which vast stores of information appeared to