
Technology in Society 22 (2000) 255–272

www.elsevier.com/locate/techsoc

Science fiction and technology scenarios: comparing Asimov's robots and Gibson's cyberspace

Dominic Idier *

Technology Scenario Program, Risø National Laboratory, Building 110, PO Box 49, 4000 Roskilde, Denmark

* Corresponding author. Tel.: +45-4677-5100; fax: +45-4677-5199. E-mail addresses: dominic.idier@risoe.dk; hypris@spray.se (D. Idier).

Abstract

In this paper the relationship between technology scenarios and science fiction is examined.
Science fiction, a form of symbolic creativity, often considers the consequences of technology
for society and mankind—but a science fiction story is not a scenario. The differences come
not only from its form and wide public appeal, but also because science fiction is much less
a normative exercise than a scenario, leaving room to explore the effects of the broad spectrum
of human nature. This study is based on two major science fiction works: the Cycle of Robots
by Isaac Asimov and Neuromancer by William Gibson. Contrasts are drawn between the safe
utopian society depicted by Asimov and the dark but accurate future vision of Gibson, ending
with an assessment of how science fiction can contribute to the technology scenario technique.
© 2000 Elsevier Science Ltd. All rights reserved.

Keywords: Science fiction; Scenarios; Foresight; Cyberpunk

1. Introduction

While the purpose of this article may appear somewhat fanciful to those who are used to econometrics, formalisation, and expert panels for thinking about the future, futurists cannot ignore science fiction, if for no other reason than its
broad public diffusion. In this article, I compare the technology scenario method to
science fiction stories by using two major works, Cycle of Robots by Isaac Asimov
and Neuromancer by William Gibson, and extracting how they can help build scenarios that can be utilised by professional futurists. The underlying question I discuss has been raised by the convincing techno-social foresight depicted by Gibson.

2. Foresight versus forecasting

Although it has become a mantra running through articles written by futurists, technology foresight is not about predicting the future. At best it is a tool for assisting
companies and organisations in the decision-making process. In the context of tech-
nology scenarios, foresight is neither forecasting nor prediction. Foresight does not
deal with precise future events but rather with tendencies extrapolated from what
we have experienced and are now experiencing. Foresight is learning, preparing our-
selves for the unexpected. Forthcoming events result from a long chain of past and
present actions carried out by several interpenetrating networks of influence. Two main questions should be taken into account in foresight work:

1. Who is making decisions? (determining the actors);
2. What is the communication flow? (determining the information channels).

2.1. Technology foresight

Technology foresight is a major concern for economists, corporate executives, political executives, media people and social researchers—and for science fiction writers. It is not about predicting the clock rate of the newest personal computer but is
instead an intricate task that involves the physical and biological sciences, economics,
sociology and psychology. Technology foresight usually makes use of several comp-
lementary methods and techniques based on creativity, expertise, experience, and
professional knowledge.
If we use a triangle to illustrate the set of possible foresight tools, we place at
one vertex econometrics (including simulation models); at the second vertex, expert-
ise (expert panels, Delphi); at the third, symbolic creativity. Near the creativity pole,
we have qualitative and intuitive methods such as scenarios, utopian writing exer-
cises, and science fiction. In this article, I will discuss science fiction projects that
lie close to the symbolic creativity vertex.
Technology and society cannot be easily separated, especially when the conse-
quences of technology are studied. Too many people confuse “technical feasibility”
with actual technology. Technology itself results from socio-political choices and
usually faithfully reflects a society and its culture. The relationship between foresight studies and society (the public and the main actors) is not single-channeled but depends on its position in the above triangle, and it may vary depending on the components of the society in question. Fig. 1 shows the coarse communication channels around
technology foresight. However, this applies to a democratic society where industry
and market lobbies are distinct from political executives. The size of the arrowheads
depicts (somewhat arbitrarily) the intensity of influence.

Fig. 1. A dynamic process for decision/action taking.

The point is that different foresight exercises may not influence various poles of society in the same way. For
example, a corporate executive is unlikely to be influenced by a utopian piece of
literature. In the foresight triangle, science fiction lies near the creativity pole and
is supposed to have more influence on public opinion and the media than economists!

3. Scenario techniques and science fiction

Scenario techniques, as described in the foresight literature, also contain creativity but, unlike pure science fiction pieces, they are rather normative and begin with material formalised by those who are supposed to use the scenarios.

3.1. What is a scenario?

According to the dictionary, a scenario is an outline of a natural or expected course of events. From Fahey and Randall: “Scenarios are descriptive narratives of
plausible alternative projections of a specific part of the future” [1]. Wack says that
scenarios are “Internally consistent descriptions of possible futures” (quoted in [2]),
and Godet states that a scenario is “a description of a future situation and the course
of events which allows one to move forward from the original situation to the future
situation” [3]. Schwartz specifies that “a scenario is a tool for ordering one’s percep-
tions about alternative future environments in which one’s decisions might be played
out” [4]. In their pioneering book, Kahn and Wiener write, “scenarios are hypotheti-
cal sequences of events constructed for the purpose of focusing attention on causal processes and decision-points” [5]. A scenario is used to think about how a particular
situation and its consequences may occur and how they can be handled. Scenarios
have been used to explore or emphasize particular problems within a whole process.
According to Kahn and Wiener, “the scenario is particularly suited to dealing with
events taken together—integrating several aspects of a situation more or less simul-
taneously” [5].
A scenario is therefore a multi-channel, multi-level, or “holistic” description of a
process. A crude way to categorize scenarios might be according to the target level:
personal, organisational, societal, or global. In each category, subclasses can be
found: military, economic/corporate planning, educational planning, etc. An alterna-
tive set of categories, perhaps more general, might be survival, conquest
(commercial, military), and adaptation (restructuring, learning, etc.).
A future scenario is descriptive, consistent, imaginative, structured, and not least,
should lead to action. Large portions of books about scenarios explain how to con-
vince executives of the usefulness of this technique. The key words in scenario meth-
odology are drivers, plausibility, diversity, and norms. Building a normative scenario
always starts by identifying the so-called driving forces (see for example [1,4]).
Scenario technique does not produce one global story but several minor stories,
thus allowing the broadest possible spectrum of uncertain yet plausible events to be
explored. Uncertainty cannot be banished through the scenario approach, but it can
be explored through diverse ideas. As Peter Schwartz writes, “Identifying driving
forces reveals the pressure of deeper, more fundamental forces behind them.... Driv-
ing forces often seem obvious to one person and hidden to another” [4]. It is clear
that the success of such a technique greatly depends on the choice of driving forces,
i.e. a knowledge of the driving mechanisms that lead to action. Pointing out the real
driving forces, which are not always the obvious ones, demands an accurate sense
of might and motivation mechanisms. Purely economic indicators are not driving
forces in themselves but result from more apparent political or social interactions.
Furthermore, the latter often derive from even deeper forces rooted in human nature. In the following, I shall try to show that science fiction can sometimes contribute to exhibiting forces that economics and rationalisation ignore.
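
To make the combinatorial core of the technique concrete, the following is a purely illustrative sketch in Python (not taken from the scenario literature cited above; the driving forces and their outcomes are invented). It enumerates scenario skeletons as combinations of driving-force outcomes, which a scenario writer would then prune and develop into internally consistent narratives.

    from itertools import product

    # Hypothetical driving forces, each with a small set of plausible outcomes.
    drivers = {
        "public acceptance": ["enthusiastic", "indifferent", "hostile"],
        "regulatory regime": ["permissive", "restrictive"],
        "dominant actor": ["multinational corporations", "public institutions"],
    }

    def scenario_skeletons(drivers):
        """Enumerate every combination of driver outcomes.

        Each combination is only a skeleton; implausible combinations are
        pruned by judgement, not by the code, before narratives are written.
        """
        names = list(drivers)
        for outcomes in product(*(drivers[name] for name in names)):
            yield dict(zip(names, outcomes))

    for i, skeleton in enumerate(scenario_skeletons(drivers), start=1):
        print(f"Scenario {i}: {skeleton}")

The three hypothetical drivers above yield twelve skeletons; in practice only a handful survive the plausibility and consistency checks and are developed into full narratives.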

3.2. What is science fiction?

“Modern science fiction is the only form of literature that consistently considers
the nature of the changes that face us, the possible consequences, and the possible
solutions. That branch of literature which is concerned with the impact of scientific
advance upon human beings”. Isaac Asimov (1952)

Scenario technique is a tool for helping one to think about the consequences of
present decisions and to prepare for a changing environment. But doesn’t a serious
science fiction (SF) story have the same purpose? In fact, the SF story can go further,
not only because of its environment or breadth of plot but because an SF author is
not afraid to break norms and established paradigms or cultures in order to explore
human motivations and creations. In addition, characters are used in SF stories so
that readers can project themselves into the stories. Such emotional involvement is
absent from scenarios. It is certainly more difficult to control but probably more
rewarding with respect to the future of actual human organisations and societies.
Therefore, a broader definition of science fiction can be proposed that does not
refer just to science in the usual formal sense. Science fiction could also be a subject-
centered story that includes emotional involvement and depicts future or alternative
technological events where technology does not mean “technical feasibility” but
instead a broader use of artificial items to sustain an activity or attain particular goals.
With this in mind, five general classes can be identified to characterize the many
science fiction works from the late nineteenth century to the present:

1. Anticipation stories. This kind of SF story usually takes place in the near future
and is rather plausible, almost like an extrapolation of our present society. People
like Gregory Benford or David Brin currently write this kind of story. Authors
are usually trained in science and/or engineering, and their work may also contain
both pedagogical and social material. Some anticipation stories could almost be
compared to romance scenarios.
2. Space opera. This class describes technological space stories in a distant future.
Some of these works (like Star Wars) are fantasy tales transposed into a techno-
logical environment. These stories may nevertheless influence technological
decisions by indirect means. In addition, they may help spread the notion that the
future of mankind may lie in distant space. Because of their pictorial nature, SF
films like Star Trek or Star Wars have a strong impact on people, including power-
ful executives like former American President Ronald Reagan, who may have
been influenced by Star Wars when deciding on an integrated space defense pro-
gram. Another example of space opera is Asimov’s Foundation Cycle which
includes philosophical issues.
3. Moral story. In these stories, technology is merely used as a social or psychologi-
cal factor. It does not have to be plausible or even possible because only the
philosophical or moral content is important. One archetype of these stories is the
pioneering work by Herbert G. Wells, The Time Machine, written in 1895.
4. Horror story. Although not usually regarded as hard-core science fiction, horror
stories expose extreme deformations and transformations of the human body and
mind as a direct or indirect result of uncontrolled technology. David Cronenberg's films The Fly and Videodrome address this question rather well.
5. (Pseudo-)scientific visions. A marginal type of science fiction story might gather
different ideas or visions about future technologies and concurrent social changes.
Although the authors, scientists, engineers, or journalists themselves do not intend
to write actual science fiction, their controversial writings are usually regarded by the scientific community not as science but as a sort of “good” science fiction. One
archetypal example is Eric Drexler’s 1992 work, Nanosystems, about the future
of nanotechnology.

The borderlines between these different classes are far from sharp. In addition, antici-
pation stories may contain important moral or socio-psychological components. I
apply a second layer of classification to anticipation stories:

• Technological foresight. The SF story takes place in some fathomable future and makes plausible guesses about technology and society. These stories appear quite similar to romance technology scenarios. A good example is 2001: A Space Odyssey by Arthur C. Clarke from 1968, although the pioneering works of Jules Verne
could also be placed in this category. Although plausible projected technology
plays a central role, philosophical questions always arise, as should be the case
as technology develops. It should be noted that these stories generally assume
continuous technological progress and many of them present technology as a posi-
tive development for mankind.
• Social foresight. Another kind of anticipation story is what we might call “social foresight” or “environmental foresight”. In this kind of novel the author seeks to analyse the consequences of technological progress for mankind and its environment. This genre seems to have developed more since the 1980s as humans have become increasingly conscious of the dual nature of technology. One of the main authors of social foresight is Philip K. Dick (1928–1982). The film Blade Runner was based on his novel Do Androids Dream of Electric Sheep?, and the film Screamers on his short story “Second Variety”. Another major
contributor is William Gibson, author of Neuromancer, discussed further in this
article.

Many social foresight books are quite pessimistic about technological evolution and
the future of mankind in general. Problems like pollution, overpopulation, the dehu-
manisation of mankind, and regression due to a man-provoked catastrophe are often
found in this kind of future exploration.
Here, however, I have restricted myself to SF in its more traditional, passive form
for (his)story construction rather than its modern interactive media version. There-
fore, this article does not directly apply to computer games, role-playing games, or
virtual reality.
Like any other literary genre, SF must obey certain rules that the real dynamics of the future ignore! As John Cramer points out,

“There are basic incompatibilities between good story telling and accurate proph-
ecy. A good story needs conflict and dramatic tension. A fictional technology with
too much power and potential, too much “magic”, can spoil the tension and sus-
pense. The “future” as depicted in a SF story should be recognizably like the
present to maintain contact with the reader. Most SF stories depict straightforward
extrapolations from the present or the past, with relatively few truly radical
changes, so that the reader is not lost in a morass of strangeness. To achieve good
characterization the writer must focus on a small group of people, yet most real
revolutions, technological or otherwise, involve thousands of key players. The
intelligence and personality integration of fictional characters cannot be much
higher than that of the writer.” [6]

For Rosaleen Love, “a scenario is usually seen as suggesting a future possibility, while fantastic fiction plays with wild impossibilities” [7].
“Wild impossibilities” should not hinder the usefulness of a serious science fiction
story. The words “science fiction” can be misleading. The word “science” here
should be taken in its antique form as “craft” or “knowledge” (which is not naturally
given). Frankenstein, by Mary Shelley, is not traditional hard-core SF, but it is still
quite valuable for thinking about the dangers of biotechnology (see for example [8]).
And it is precisely because technology and society are so deeply bound together that
SF is rather interesting for thinking about alternative futures, the consequences of
technology, and forthcoming crises. It is like a lens capable of magnifying details that are invisible from a too-rationalised angle, yet those tiny details may be of dramatic importance in the highly non-linear process of future creation.
Some SF projects, however, are not only stories, good or less good, but also pieces
of history made up of vast cycles including many subplots and complex interactions
between forces over several millennia. Frank Herbert’s ecological and metaphysical
project, Dune, is one example. Such major projects do not necessarily follow only
the rules for telling a good story but also incorporate background elements for coher-
ent society construction. This point is important when comparing with scenarios: by
definition, a foresight exercise builds upon a real-life society with all that this implies.
No one is able to construct a society that matches the real world. It is not even a
matter of intelligence, experience, or knowledge. Societies result from continuous
and parallel interactions and influences between their components.

To have predictive value, an anticipatory scenario should first include the correct
communication and decision channels, i.e. the actual driving forces, at the present time,
and should mentally explore what can happen to the present balance of influence.
If the starting point is already inaccurate, because of the ignorance of the writer or
because he/she wanted to explore a totally alternative world, the fictional project or
study remains a moral story only. Anticipation stories from the 1950s and 1960s
depicting the world in the 1980s and 1990s are quite illustrative [9]; a further devel-
opment is given in the section dealing with Asimov’s robotics.
For many SF stories, however, the purpose is not predicting the future, not even
attempting to describe a future society. For those who write science fiction, the future
itself has major virtues:

• it does not refer to any actual memory, bears no direct comparison with historical events, and easily allows for social and technological tweaking;
• the future always fascinates human beings. It seems, for example, that fictional future catastrophes have a heavier psychological impact than real past calamities. The past is over but in the future no one knows what can happen;
• the future is the perfect place for discussing the benefits or dangers of technology and technological society.

Unlike technology scenarios, SF is almost free of institutional or corporate influences. Anyone can write an SF story and publish it on the Internet; the problem is achieving maximum impact. A foresight study is usually meant for executives sitting close to decision centers. Many such studies are commissioned by the high-level corporate and governmental offices that direct their strategy. As such, it is important for them to obtain the most accurate picture of present society and its actors. But unlike SF, a foresight study has no direct public impact (Table 1).
A popular film showing the potential dangers of gene manipulation may have a
greater influence on the public than technical reports issued by or for corporate
executives or governmental institutions. Since public opinion carries significant
weight in politics, decisions can be indirectly influenced by the fictional stories that
take the form of public-oriented warnings. But some of the “future” events described
in these SF stories will hopefully not occur.
To illustrate my point, I have used two major science fiction works, Isaac Asi-
mov’s Robot Cycle (hereafter referred to as ROB) and William Gibson’s Neur-
omancer (hereafter referred to as NRO) and its sequel Count Zero. Both ROB and
NRO deal with complex pseudo-intelligent system development and man/machine
interaction, and both writers draw pictures of the future a few decades ahead. It is
interesting to note that Neuromancer was written in 1984, only two years after the
fictional founding of the American firm, U.S. Robots and Mechanical Men, Inc., which set the stage for Asimov's Robot Cycle.

Table 1
A comparison of science fiction and scenario technique

Values
  Science fiction story (piece of literature): high symbolic value; irrational components in construction; moral components; Eros in science and technology; expected ideological freedom; may carry a dream.
  Scenario technique (piece of social science): technical value; normative rationality; no moral components; no eroticisation; fulfill a contract; often lobby/industry influenced.

Construction
  Science fiction story: no formal constraints; wild impossibilities allowed; individual level prevails; focus on small groups of people; should involve a dramatic tension; fictional characters' intelligence is lower than or equal to the writer's.
  Scenario technique: constraints (normative scenarios) and formal writing schemes; plausibility required; corporate or social level; several different short formalised stories; collegial construction, with panels of experts and economic indicators.

Time
  Science fiction story: recognisable present or past elements; often a straightforward extrapolation; but atemporal or “achronical” stories are allowed.
  Scenario technique: only “near” future extrapolation.

Public
  Science fiction story: emotional involvement (projection); large public, mostly non-executive; early experience with technology.
  Scenario technique: no emotional involvement; small public, executive; mature experience with technology.

4. Machine becoming man

Isaac Asimov (American, b. 1920, d. 1992) published I, Robot, a collection of his early robot short stories, in 1950. This work examines the early development of humanoid positronic robots created by U.S. Robots and Mechanical Men, Inc. and its chief robopsychologist, Susan Calvin. In his stories, Asimov wrote that the first non-mobile talking robot appeared in 1998. However, the Robot Cycle in fact covers several thousand years of human history and segues into the Foundation Cycle, another important SF series in Asimov's writings.
The positronic robot cycle itself comprises several novels, listed below, written between 1950 and 1985. To keep the coherence and the spirit of the whole series, the later books do not include technological or scientific upgrades.

I, Robot (1950)
The Caves of Steel (1954)
The Naked Sun (1957)
The Robots of Dawn (1983)
Robots and Empire (1985)

4.1. Artificial intelligence

The main technological question addressed by Asimov in ROB is artificial intelligence (AI) and its ethical consequences. Asimov's robots are designed under ethical constraints, the “Three Laws of Robotics”, which make them harmless to mankind. These Laws are:

1. A robot may not injure a human being or, through inaction, allow a human being
to come to harm.
2. A robot must obey orders given it by human beings except where such orders
would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict
with the First or Second Law.
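
As a purely illustrative aside (Asimov's positronic brains are of course not software, and nothing below comes from his fiction), the strict precedence of the Laws can be sketched in Python as a lexicographic ordering over candidate actions, in which avoiding harm to humans always outranks obedience, which in turn outranks self-preservation. All names and attributes are hypothetical.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Candidate:
        """A hypothetical candidate action with its consequences pre-assessed."""
        name: str
        human_comes_to_harm: bool  # a human is harmed, by the action or by the inaction it implies
        violates_order: bool       # disobeys an order given by a human
        endangers_robot: bool      # puts the robot's own existence at risk

    def choose(candidates: List[Candidate]) -> Candidate:
        """Pick an action by the strict precedence of the Three Laws:
        First Law before Second Law before Third Law."""
        return min(
            candidates,
            key=lambda c: (c.human_comes_to_harm, c.violates_order, c.endangers_robot),
        )

    # Obeying an order that would harm a bystander is rejected in favour of a
    # disobedient, self-endangering but harmless alternative.
    options = [
        Candidate("carry out the order as given", True, False, False),
        Candidate("refuse and shield the bystander", False, True, True),
    ]
    print(choose(options).name)  # -> refuse and shield the bystander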

Asimov draws a picture that is quite optimistic, both technologically and ethically.
A context-dependent explanation can be found for this rather ideal picture. The 1950s
and 1960s were technology-friendly decades where large-scale projects in space
exploration and artificial intelligence were devised although they subsequently led
to severe disappointments in both fields. Environmental problems were not taken
into account at this time, as it was thought that mankind was becoming wiser and
that manmade devices could safely overcome nature. Another factor was the apparent
political stability of the Western world in these decades after the Second World War.
It is, in fact, worth remarking that the severe political bipolarism of the world at
this time does not appear in Asimov’s novels although at age four the author and
his family moved from the Soviet Union. Through his numerous books, Asimov
shows his positivism. Trained as a biochemist, he believes in science and sees tech-
nology as its direct offspring leading to the further positive development of mankind.
None of the crises depicted in his cycles really threatens mankind; they appear more like necessary kicks along an almost linear technological development. In these aspects,
Asimov is a typical hard-core SF writer from the “Golden Age”. Doubts and disil-
lusionment will come later. Asimov’s novels should be understood in this perspec-
tive, where major casualties or discontinuities of progress occur because of individual
anomalies [10] rather than global disjunction.
Asimov’s Three Laws of Robotics definitely do not guide our world’s artificial
intelligence research, nor is this research going in the direction of the smooth future
he depicted. As Robert Sawyer points out,

“We already live in a world in which Asimov’s Laws of Robotics have no validity,
a world in which every single computer user is exposed to radiation that is con-
sidered at least potentially harmful, a world in which machines replace people in
the workplace all the time. (Asimov’s First Law would prevent that: taking away
someone’s job absolutely is harm in the Asimovian sense, and therefore a “Three
Laws” robot could never do that, but, of course, real robots do it all the time).” [11]

4.2. Robotics and space technology

Two main technological poles can be found in ROB and the original Foundation
Cycle: robotics and space technology. Except for some references to hydroponics,
no other technologies play an important role. Computers exist but only as heavy
electronic “calculators” used to compute orbits. Biotechnologies and nanotechnolog-
ies are not found in the original work.
It is remarkable that Asimov uses (some would say misuses) opposites: electronics
and positronics. For him, positronic circuitry could achieve what electronics could
not: the development of artificial intelligence or, at least at the beginning, the man-
agement of highly complex functions, such as moving the humanoid body, speaking
a human language, and behaving according to the Three Laws of Robotics. The
Asimovian robots are positronic, not electronic. This is in fact a strange idea. It is
physically strange to endow the positron with the seed for intelligence: positrons
differ from electrons only by their charge. In addition, producing and maintaining
positrons, i.e. anti-electrons, requires heavy equipment. One explanation may be that
at the time when ROB was written, science was more or less dominated by physics
owing to its prestige, its results, and the financial support it received. Cognitive
psychology was in its infant stages, and biochemistry dealt more with headaches
than with brain functions. Positronics, then, was a product of the science and imagin-
ation of the 1950s.
Space travel is the second technological pole in Asimov’s books and confirms his
positivist orientation. Man and his technology have formed the most advanced think-
ing system, one which will leave its cradle and colonize other worlds with the help
of robots. In fact, Asimov makes two strong assumptions: (1) mankind is the only
intelligent species in the galaxy (and probably in the universe), and (2) nature is,
by definition, hostile and should be transformed, or “terraformed”, by the use of
technological devices. Again, this techno-agricultural approach reflects the prevail-
ing, positivist, male-dominated, Western mentality before the first wave of environ-
mental consciousness seriously swelled at the end of the 1960s.
The technology-versus-nature duality, and its premise that man stands outside
(above) nature, marks most of Asimov’s science fiction books. This mental pattern
is often found among engineers, practical scientists, and many corporate executives
who deal with technology. It is likely that Asimov has his diehard public among the
(male) trainees in engineering or hard science.
ROB offers some interesting hints about technological innovation in the Asimov-
ian society. The short story collection, I, Robot, explicitly mentions a (national?) corporation, U.S. Robots and Mechanical Men, where scientists and engineers develop artificial intelligence. This picture of the middle of the twentieth century is retained in all of Asimov's SF books. Innovation in robotics is at first supported by the authorities, but as they become aware of the potential dangers of these powerful robots, between 2003 and 2007 the robots are “banned on Earth except for scientific research”
[9]. This shows once more that Asimov is a true believer in the positive benefits of
science, and he believes that scientists can control technological innovation.
Monetary concerns appear to have no place in Asimov’s world. Robots are used
not only for themselves but also in the service of innovation in space technology.
In 2029 AD, a robot (Nestor) is used in the research on the hyperdrive, a propulsion
device able to move a starship between two stellar systems almost instantly. Again,
innovation in robotics and space technology are entwined and contribute to each
other’s development. Technology is, in fact, a self-consistent process: technology
helps to create technology.
But Asimov goes even further: robots, banned on Earth, help to devise an efficient interstellar transport basis for deep space colonisation. Research and development in robotics are then maintained on the colonised planets, where even more advanced robots again help mankind. According to Asimov, technological innovation eventually leads, or should lead, to a replacement, without any violence, of the initial human order by a new technological order built upon the Three Laws of Robotics and controlled by robots. The later robots then symbolise human achievement: they are stronger, far more enduring, more intelligent and, finally, they are better adapted to the total technological environment to whose creation they contributed.
Asimovian robotics is definitely not cybernetics and does not physically alter
human beings. For the most part robots and humans collaborate peacefully toward
their common realisation within their dual society. Most of the lower robots are
employed in burdensome or boring tasks like housekeeping, manufacturing, or repair,
while most of the “Spacer” humans have reached a high standard of contentment
and education. Social or political crises are rare, and do not endanger the stability
of the society. Robots prevent social conflicts and even attempt to remove individual
frustrations [12,13]. In fact, Asimov points out a somewhat unexpected danger for
positive technology: mankind slowly loses dynamism and self-contemplation
replaces creativity.
The parallel robot society evolves quite differently. The “mechanical men” become
more and more sophisticated and some of them, among the very advanced
“humaniform” robots like R. Daneel Olivaw [12,13], develop a form of mind-reading
and feelings. A machine has become a man.

5. Men becoming machines

“He’s always been close to the spirit of the thing.”

William Gibson (1986)



“I don’t even have a modem.”

William Gibson (1993)

A few years before the supposed founding of the company that manufactures the
Asimovian robots, another American writer, William Gibson (b. 1948) published a
novel that is considered by many to be the originator of a new SF literary genre
called “cyberpunk”. It is said that cyberpunk literature developed out of two major
books written by William Gibson [14,15] in the 1980s: Neuromancer (1984) and
Count Zero (1986). Neuromancer won three awards and is considered a breakthrough
in the literature of the 1980s. Count Zero is a cyberpunk sequel to Neuromancer. A
third book by Gibson, Virtual Light, is a post-cyberpunk novel.
Apart from the style itself, which is quite modern, many points highlight the differ-
ences between Golden Age SF stories and cyberpunk novels from the 1980s. The
biggest difference is in their respective visions of future society, especially the pos-
ition of technology. In the 1950s and 1960s, expanding technology went along with
well-being, progress, and comfort. In the 1980s, SF writers and other thinking people
began to associate technology with pollution, global destruction, individual control, mental regression, and dehumanisation. While rational enlightenment and universities kept control of knowledge creation and innovation as described in
the Asimovian society, in the cyberpunk world the drive for money and multinational
capitalist corporations stood behind information and technology.
The social and technological context of the 1980s, especially in the United States
and the United Kingdom, appears in the particular atmosphere of cyberpunk novels.
Ultra-liberalism and a violent technological push characterised the 1980s. Mankind, at
least in Western countries, was confronted with the rise of a new technological and
social era, the age of information technology—but information is not necessarily
knowledge. Information technology, i.e. the massive computerisation of tasks and
communications, provided a never-before-experienced power over nature and there-
fore over mankind itself. As thousands of people lost their jobs and dignity and
faced the end of an age, the triumph of rationalisation also raised new ethical and
social questions.
Gibson’s society results from the gradual replacement of nature and life by the
Matrix and by information. Information is power. The Matrix is a global network
of information to which humans get connected and become addicted. Duality occurs
at a very deep level. Humans live in the traditional physical world while also being neurally linked to the Matrix. But their addiction to information makes them content,
degenerate, and dehumanised. When humans disconnect from the Matrix, they feel
incomplete, disabled, without power. As in Asimov’s society, technology leads to
contentment and power but this time the price is tremendous.
Again there is a subtle interplay between humans and artificial devices. The Asimovian robots achieve leadership over humans precisely in order to protect
and serve them! The Matrix is created by humans to facilitate communication and
information processing and somehow succeeds beyond all expectations. The Matrix
always demands more human knowledge for achieving control and supremacy in
order to satisfy the increasing artificial needs of the humans. Gibson echoes here the
French philosopher Jean Baudrillard’s point of view: “... reality is shown to be irrel-
evant in contemporary society due to technological advances, the simulated world
of cyberspace is shown to offer individuals greater possibilities and rewards than
the harsh reality ever could” [16].
The biggest difference between the Asimovian robots and the Gibsonian Matrix
is not only the creation of a purely virtual cyberspace but also its ethical implications.
The Asimovian robots have several built-in ethical, almost humanistic, functions,
namely “the Three Laws”, which in fact prevent dehumanisation. The Matrix has
none of that. On the contrary, it is a tool developed by huge multinational technologi-
cal corporations that have taken over power. In Gibsonian society, governments and
universities do not appear and have no influence in the decision process about inno-
vation. It is clear that such a society would not fit the influence scheme I drew
earlier. It is an out-of-balance situation where market-driven organisations and mon-
etary interests have gained all the power. In fact, technology foresight in such a
society is almost equivalent to corporate strategy.
Even though it is quite dark, the picture of our near future drawn by Gibson
appears terribly accurate, both technologically and socially. Although he is neither
a scientist nor an engineer, Gibson’s vision of the globalised virtual society cannot
leave the futurist indifferent. Neuromancer and its sequel, Count Zero, are, of course,
not ordinary foresight exercises but the result of reflection combined with a sort of
clairvoyance. Revealing the motivation for innovation is as important as econo-
metrics and expert consultation in making accurate predictions. Cyberpunk is about
technology combined with human drives and libido.
Michael Cranford, writing about virtual reality, has quoted Mike Saenz, the creator
of an erotic virtual reality project, “I think lust motivates technology” and then adds:

... the economic trajectory of this technology [i.e. VR] and its inherent allure
(which includes sensory immersion, lustful exploration, emotional engagement,
and participatory pleasure) assure an ultimate realization along these lines. To
argue that VR will never be used primarily for an entertainment product, or that
as an entertainment product its users will not seek erotic fulfillment is a modern
somnambulance. [17]

The introduction of Eros in technology should perhaps be considered more seriously by foresight researchers. Innovation is usually perceived as the result of either
a technological push or a market pull. Nevertheless, the motivation for innovation
and technology uses can be interpreted in terms of drives—and the sexual drive is
certainly not the least. In NRO the Matrix is seen as a source of pleasure for those
who are connected. Besides information, cyberspace also needs libido for working
optimally. Industrial success often occurs when products are able to draw enough libido from the consumer. The industry and the addicted consumer are the archetype
of a perverted couple, where the former seeks to gain idealised power in the form of
money by providing new products to the latter who is in search of ever more pleasure.
Another interesting aspect of cyberpunk is its relationship to nature. As I noted,
the Asimovian society sees nature as a hostile environment that can be transformed
by technology. An advanced technology is even able to “terraform” a barren planet
to make it suitable for human beings. This is the old-fashioned way of describing
the purpose of technology: transforming nature to obtain a better life. With the rise
of cyberspace, technology is far more powerful. It is able to extract human beings
from nature and give them a new, virtual environment: the Matrix. Then concepts
like life and death apparently lose their meaning for those who are connected to
the Matrix.
Another consequence of cyber-technology is the reduction of the traditional psy-
chosocial gap between the sexes. For example, one of the main characters in Neur-
omancer is a female cyborg bodyguard. This also seems to be a trend in our societies
from the late 1990s. The main reason is that the traditional functions of males, i.e.
the physical ability to hunt and protect, no longer have the same importance in a computerised society where devices can replace or enhance natural tools such as arms and legs. This gradual disappearance of sexual roles is not apparent in Asimovian society, where robots are more often male-like or are kept separate from human beings.
Even though some would say that cyberpunk as a literary movement has almost
disappeared, the ideas it has produced can now be seen in many social, artistic, and
technological contexts, especially with the rise of the Internet and the development
of micro-technology. For this reason, I think that technology scenarios should recog-
nize the contribution of cyberpunk to the construction of a technological future.

6. What can technology scenarios learn from science fiction?

Science fiction and technology scenarios are clearly two very different forms of
thinking about the future. SF is, first, a type of literature or storytelling with an often
critical view of society. Technology scenarios are a collegial and much more norma-
tive interpretation of the present based on economic indicators and expert rational
knowledge.
Nevertheless, a work like Neuromancer seems to contain some striking truths
about the near future. Neuromancer describes a society that is plausible and that
could be. Although Asimov’s Cycle of Robots addresses interesting techno-social
questions that are still valid, his overall prediction is definitely wrong. This, in spite
of the fact that Asimov is a scientist, while Gibson is not, and both authors wrote their foundational novels at nearly the same age. One explanation might be Gibson's better
comprehension or analysis of human motivation. The future is not created by
thoughts without subsequent actions. It is a matter of fact that those who can make
decisions or those who can influence executives have much greater weight in the
dynamics of the future.
When experts are asked what they think about the fate of a given technology,
their answer usually covers technical feasibility or reliability but not economic feasi-
bility or reliability, i.e. the actual success of the given technology. From an economic
point of view, Chris DeBresson [18] has established paths that describe the socio-economic fate of a given technology. His merit has been to show that a long and uncertain path leads from technical feasibility to standardisation and durable consumer goods. Personal computers and cyberdevices seem to belong
to this last category.
While Asimov only took the techno-optimist’s point of view (even though a bit
twisted!)—technological feasibility leads to successful technology—Gibson has
taken the socio-economist’s point of view, building the Matrix on technologies
emerging in the 1980s (the academic Internet appeared in the mid-1980s). In fact,
to be really successful, a new product must fulfill either a basic need (food), care
(home, medicine, etc.), social contact (communication, games, etc.), or happiness and pleasure (narcotics, sex, etc.). To be successful, technology must always contribute,
both economically and durably, to human fulfillment. Gibson’s Matrix is an arche-
type of this kind of successful technology, providing increasing pleasure until addic-
tion occurs. Today’s PCs and the Internet are just the primitive forefathers of the
Matrix!
Besides the usual panel of experts, powerful actors may be asked not only what
they think but what they wish. Any organisation that has the financial power and
knowledge of products that are capable of fulfilling human desires has tremendous
power and can virtually create the near future as it likes. This point can be summar-
ised by the triangle shown in Fig. 2.

Fig. 2. Input for technology scenarios.



The figure is crude and at best can only suggest possible trends for the near future.
A technology scenario should start by surveying the influence relationships among
the driving forces in order to find the most important actors for this technology, not
just corporations but also public and political executives. Then the technology should
be analysed from a more psychosocial point of view, including possible erotic factors,
to assess how it can contribute to personal fulfillment. Uncertainties occur when
powerful actors have opposing views and wishes about a given technology. In Euro-
pean countries, gene technology lies in the middle of a triangle involving multi-
national corporations, governments, and consumers. Potential realistic sequences of
events may be devised by using more psychosocial insights than technical reports.
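
As a rough illustration of the kind of influence survey meant here, in the spirit of the influence/dependence matrices used in the scenario literature (for example by Godet [3]), the following Python sketch ranks actors by how strongly they drive, and are driven by, the others. The actors and weights are invented for illustration only.

    # Hypothetical actors around a technology (e.g. gene technology in Europe).
    actors = ["multinational corporations", "governments", "consumers", "media"]

    # influence[i][j] = strength of actor i's direct influence on actor j (0-3).
    influence = [
        [0, 2, 3, 2],  # corporations
        [2, 0, 1, 1],  # governments
        [1, 2, 0, 1],  # consumers
        [1, 1, 3, 0],  # media
    ]

    # Row sums: how strongly an actor drives the system.
    # Column sums: how strongly it is driven by the others.
    driving = [sum(row) for row in influence]
    dependence = [sum(row[j] for row in influence) for j in range(len(actors))]

    for name, drv, dep in zip(actors, driving, dependence):
        print(f"{name:28s} drives={drv:2d}  is driven={dep:2d}")

Such a crude ranking only locates the most influential and the most dependent actors; the psychosocial analysis of what each actor actually wishes must still be added on top of it.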
One aspect usually lacking in technical foresight is a good analysis of influence
relationships and the real motivation for innovation. Important technological inno-
vations often occur when strong instincts are unleashed. Many innovations are solely
motivated by money. The dark Gibsonian society is entirely driven by raw power
with a strong, unreasonable component; Asimovian society appears more knowledge-
driven and reasonable and yet unrealistic. Gibson has intuitively painted a realistic
canvas of influence and human motivation which is often lacking in technocratic
scenarios. Motivation indeed creates the future.

Acknowledgements

The author would like to thank Mette Monsted, Kenneth Husted, and the doctoral
students from the Institute for Politics, Philosophy and Management at the Copen-
hagen Business School, for their fruitful discussions. Special thanks are also due to
Prof. Anthony Wiener for a very stimulating e-mail exchange.

References

[1] Fahey L. What is scenario learning? In: Learning from the future: competitive foresight scenarios.
New York: John Wiley and Sons, 1997.
[2] De Geus AP. Eur J Operat Res 1992;59:1–5.
[3] Godet M, Roubelat F. Creating the future: the use and misuse of scenarios. Long Range Planning
1996;29(2):164–71.
[4] Schwartz P. The art of the long view. New York: John Wiley and Sons, 1988.
[5] Kahn H, Wiener A. The year 2000, a framework for speculation on the next thirty-three years, vol.
6. New York: MacMillan, 1967.
[6] Cramer JG. Technology fiction, Parts I and II. Foresight Update 8, 9. Foresight Institute.
<http://foresight.org/Updates/>.
[7] Love R. Fantasy and the future. Futures 1998;30(2/3):175–9.
[8] Damyanov O. Technology and its dangerous effects on nature and human life as perceived in Mary
Shelley’s Frankenstein and William Gibson’s Neuromancer. The American University of Paris,
<http://geocities.com/Paris/5972/gibson.html>.
[9] Asimov I. The complete robot, I. Robot. Amherst, MA: Acacia, 1983.
[10] Asimov I. Foundation and Empire. New York: Bantam Books, 1991.
[11] Gibson W. Interview in Stockholm, Nov. 23, 1994. <http://www.algonet.se/~danj/gibson1.html>.
[12] Asimov I. The caves of steel. New York: Spectra Books, 1991.
[13] Asimov I. The naked sun. New York: Spectra Books, 1991.
[14] Gibson W. Neuromancer. New York: Ace Books, 1995.
[15] Gibson W. Count Zero. New York: Ace Books, 1987.
[16] Martin B. William Gibson’s Neuromancer: the creation of a language. The Queen’s University of
Belfast. <http://www.qub.ac.uk/en/imperial/canada/gibson.htm>.
[17] Cranford M. The social trajectory of virtual reality. Technol in Soc 1996;18(1):79–92.
[18] DeBresson C. Predicting the most likely diffusion sequence of a new technology through the econ-
omy. Res Policy 1995;24:685–705.

Dominic Idier is a Project Researcher in the Technology Scenario program at the Risø National Laboratory
in Denmark. From 1994 to 1997 he held a research position in the field of computational models in physics
at the Niels Bohr Institute in Copenhagen. He completed his Ph.D. in Theoretical Physics at the University
of Nantes, France in 1993 after receiving an M.Sc. in Physics and an M.A. in History and Philosophy of
Sciences and Techniques. He also received a B.Sc. in Psychology in 1994. He is interested in human and
societal mechanisms behind technological innovation and future creation.
