
Of Robots, Empires and Pencils:

The Worlds of Isaac Asimov Reconsidered

Reviewed by Sally Morem

Human society is the most astonishing and perplexing of all the
universe's life-forming, self-organizing processes in its ability to
transform the creative and mundane acts of thinking beings into
systems that span the globe and stretch out into space.

Isaac Asimov, as a writer and a man, was vitally concerned with the
workings of human societies. He dreamed of far-flung interstellar
empires run by fragile and misguided humans, with robots made in
their image, guiding them away from destruction. But, for all their
imaginative world building, Asimov's Foundation and Robots series of
stories and novels must be considered magnificent failures.

"Magnificent" in the sense of the boldness with which Asimov described
galactic civilization without all the hackneyed, Buck Rogers slam-bang
space fighting against bug-eyed aliens. "Failures" in the sense that a
centralized galactic empire run by a planet-bound bureaucracy and a
future Earth wholly controlled by robotic minds stretch believability to
the breaking point and beyond. But, to be fair, let's try to understand
the literary strictures Asimov had to face as a young writer in the SF
genre of the 1940s.

Let's return to the time when pulp fiction and space opera ruled the
magazine and bookstore racks, when daring spacemen didn't hesitate to
reach for their blasters when facing strangers--either aliens or humans--
and never passed up an opportunity to dispense with subtlety, a
time when enormous galactic battles raged on unabated for no apparent
reason.

Doc Smith's Lensmen series was only the most famous of the "thud and
blunder" school of SF writing. Interstellar war was seen as an enlarged
version of pirate battles on the high seas. War as fun and games, and not
as the desperate struggle for the survival of a people and a culture that
it really is.

Instead of the Empire of Force, held together by death rays and
Lensmen, Asimov was attempting to craft an Empire of Reason. And not
just that, but an Empire guided by a Plan--which was, in fact, an elegant
mathematical equation, one which could accurately predict what
trillions of human beings would be doing for centuries. These stories
were later collected in "Foundation," "Foundation and Empire," and
"Second Foundation"--The Foundation Trilogy.

And at the same time, in a different series of stories, Asimov was
bucking the hoary stereotype of the malevolent robot. Susan Calvin,
robopsychologist for U.S. Robots and Mechanical Men, Inc., describes
the development of robots in the 21st century in "I, Robot," a collection
of these stories.

Each robot has its own personality and faces its own rather unusual
challenge. Robbie, devoted nursemaid to a little girl; Speedy, torn
between self-preservation and obedience to a lawful order given by a
human being; Cutie, a would-be theologian with a truly unique view of
the universe and his place in it; Dave, who can't control his "fingers"
during emergencies; and Herbie, the mind-reading robot who makes a
promise he can't keep.

To guide the thoughts and actions of his personable robots, Asimov sets
up the Three Laws of Robotics—

1. A robot may not injure a human being, or through inaction, allow a
human being to come to harm.

2. A robot must obey orders given it by human beings, except where
such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does
not conflict with the First or Second Law.
—and then, gleefully, proceeds to knock them down.

We discover that a full understanding of the morals and ethics
incorporated in the Three Laws is a bit more elusive than a surface
reading would indicate. Asimov's stories demonstrate the "fuzzy logic"
inherent in such words as "harm," "protect," and "obey." These words
seem straightforward enough, but they're actually laden with semantic
land mines where the unwary step at their own peril. Asimov admitted
that his Laws were deliberately designed with not-so-obvious loopholes
in order to create artistic "wiggle room" for conflict needed to create
interesting stories.
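
The Laws themselves form a strict priority cascade, and the cascade is trivial to write down. A minimal sketch of my own (not from Asimov or the review) makes the point: all the difficulty hides in the Boolean inputs, which are exactly the fuzzy words the stories probe.

```python
def permitted(action, harms_human, violates_order, endangers_self):
    """Apply the Three Laws as a strict priority cascade.

    The cascade is easy; deciding what counts as "harm,"
    "obedience," or "protection" is where the loopholes live.
    """
    if harms_human:        # First Law outranks everything
        return False
    if violates_order:     # Second Law yields only to the First
        return False
    if endangers_self:     # Third Law has the lowest priority
        return False
    return True

# Speedy's bind in "Runaround": fetching the selenium endangers him,
# while refusing violates the order -- both options come back forbidden.
print(permitted("fetch selenium", False, False, True))   # False
print(permitted("refuse order", False, True, False))     # False
```

The deadlock on the last two lines mirrors the stories: once "harm" and "obey" become matters of degree rather than clean Booleans, the tidy cascade stops giving answers.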

Asimov's short stories and novels took the science fiction world by
storm, and rightly so. Here we have cute, lovable robots, sometimes
brilliant, sometimes bumbling, but always with a new slant on what it
means to be "human." And here we have a man, not a man of action,
but a mathematician, a thinker, as hero. I speak of none other than Hari
Seldon. Seldon and a small group of psychologists developed a
psychological profile of the galactic masses, a science of statistics called
psychohistory which "deals with the reactions of human conglomerates
to fixed social and economic stimuli..."

Any ordinary person, science fiction fan or not, would find such claims
to be astonishingly bold. How could anyone, no matter his intellectual
achievements, have the audacity to think he could envision the future
history of trillions or quadrillions, let alone assume he knew anything
worth knowing about their present lives? This is enough to do serious
damage to the reader's ability to suspend disbelief, a skill required for
enjoying any kind of fiction, especially science fiction. Asimov attempted
to cover himself (and Seldon) by taking great pains to explain that
psychohistory was never applicable to individuals, that individuals were
so variable, so individual, that they were fundamentally unpredictable.

"It [the Plan] could not handle too many independent variables. He
couldn't work with individuals over any length of time; any more than
you could apply the kinetic theory of gases to single molecules. He
works with mobs, populations of whole planets, and only blind mobs
who do not possess foreknowledge of the results of their own actions."

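
Seldon's analogy to the kinetic theory of gases is the law of large numbers in costume, and it can be simulated in a few lines. This sketch (my illustration, not from the review) uses coin-flip "decisions": one person is a toss-up, a million together are almost perfectly predictable.

```python
import random

random.seed(42)  # fixed seed so the run is repeatable

def decision():
    # One individual's choice: a 50/50 toss, unpredictable in advance.
    return random.random() < 0.5

one_person = decision()           # no telling which way this goes

n = 1_000_000                     # a Seldon-sized "blind mob"
fraction = sum(decision() for _ in range(n)) / n
print(one_person, round(fraction, 3))  # the fraction lands very near 0.5
```

The aggregate is steady precisely because the individuals are not, which is as far as the gas analogy honestly goes.
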
Seldon openly asks the Emperor and the 40 billion bureaucrats who
manage the affairs of the galaxy on the Imperial planet Trantor to
permit the formation of a Foundation, the purported purpose of which
is to collect scientific and technical data from around the galaxy and to
publish it in the form of a "Galactic Encyclopedia" every ten years. In
the meanwhile, he secretly works on his Plan. The professional
historians and other skilled employees of the Foundation, citizens of
Terminus, the Foundation planet, are maneuvered by Seldon's Plan into
tight spots as the old Empire falls apart, in which only one response
would guarantee their security, each such decision designed by Seldon to
cut the projected 30,000-year collapse of the Galactic Empire down to a
mere one thousand years. In this manner, the Foundation society figures
out how to use conflicts in trade, science, religion, and politics to secure
a tenuous foothold in the development of Seldon's Second Empire. And
so, over centuries, the Foundation and, in secret, the Second Foundation
(the enforcer of the Plan), build the Empire of Reason.

But here is where we run into some serious conceptual difficulties. Even
though the Foundation Trilogy was Asimov's fictionalized plea for our
leaders to rely on reason instead of force to carry out their political
agendas, no Empire, Earthbound or Galactic, can ever be held together
for any length of time by force OR reason. Any Empire worthy of the
name is made up of innumerable rules, customs and mores, crafted over
the ages by those who didn't even know they were creating them. Rough
and ready rules of thumb were established by people for short-term
gain, with no intention whatsoever of enshrining them in institutions.

Their descendants followed these rules even though they had no idea
why they existed. Civilizations survive despite themselves. The legally
blind leading the blind. The Lawgiver imagined by Asimov is more myth
than fact. Instead of the image of the Lawgiver applying the stick of
force and offering the carrot of reason to recalcitrant followers,
visualize trade routes, slowly growing in length, complexity and volume.
Visualize the merchants as they carry languages, art, science and
general know-how along with their physical cargo to ever more remote
areas of the world. Visualize trade becoming abstract, an increasingly
complex ebb and flow of ideas, ratcheting cultural evolution upward
into inconceivably dense networks of human interaction and
aggregation. Such spontaneous social orders are never
commanded by any Lawgiver, although monarchs, dictators, presidents,
generals, and captains of industry have presumed the existence of such
control since human civilization began. Here is one aspect of human
society which is rarely considered in political philosophy outside of
ecological concerns: Population. Size really does matter. To coin a
tautology: Numbers Count.

To illustrate this point, let's start with a group of five people. Consider
the novel decision-making procedures, the growth in the potential
number of relationships and conflicts, the continual fissioning of
professions into specialties and sub-specialties, and the varied societal
structures as the number of individuals in our hypothetical group
increases, step after step, by a factor of ten:

A family of 5
A club or association of 50
A corporation of 500
A town of 5,000
A city of 50,000
A metropolis of 500,000
A province of 5,000,000
A nation of 50,000,000
A continent of 500,000,000
A world of 5,000,000,000.
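
The claim that "Numbers Count" can be made concrete: the potential pairwise relationships among n people number n(n-1)/2, so each tenfold jump in the list above multiplies the possible relationships roughly a hundredfold, and the number of possible sub-groups (2^n) grows faster still. A small sketch of my own:

```python
def pairwise_relationships(n):
    # Distinct pairs among n people: "n choose 2".
    return n * (n - 1) // 2

groups = [("family", 5), ("town", 5_000), ("metropolis", 500_000),
          ("world", 5_000_000_000)]
for label, n in groups:
    print(f"{label:10} n={n:>13,}  pairs={pairwise_relationships(n):,}")
```

A family of five holds ten possible pairs; a world of five billion holds more pairs than there are grains of sand on Earth, which is why its institutions cannot simply be a family writ large.
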

Now, we could continue this progression by adding zeroes until we have
the numerical equivalents of a solar system civilization, a federation of
star systems, a galactic region, a galactic empire, and so on. The
differences in social structure increase in intensity as we do so. This
means the social structure of a galactic empire would no more resemble
that of the Roman Empire than a 20th century American city would
resemble a Stone Age village. The potential numbers of relationships
and conflicts increase exponentially, resulting in radical changes in the
nature of mediating institutions in the given society and in the society as
a whole. We can see from this analysis that the societies on our
hypothetical list could not possess the same political organization,
technological capability, culture, or knowledge base. And yet, the
Foundation series reads as if it were the history of an Earth-bound
empire. In fact, Asimov admitted that he based his galactic empire on
ideas borrowed from Gibbon's "The Decline and Fall of the Roman
Empire."

This is why the story seems so overly simple. Each scene is written as if
it were taking place in a different city on Earth, rather than in what
would be deeply complex planetary societies in a galaxy-spanning
civilization of nearly 25 million inhabited worlds.

A key to understanding the fundamental problems in Asimov's world-
building can be found in the final story in "I, Robot." Susan Calvin
realizes that the Machines, as she calls them, have taken over Earth.
Robotic brains have evolved into behemoths which control whole
industries and nations. Not for evil purposes, as in "Colossus: The
Forbin Project," but in order that they may fully conform with the
Three Laws.

And so, in order to protect humans from their own mistakes, the
Machines have developed the ability to detect and anticipate deviations
from the explicit orders of the Machines and correct for these
deviations. "Every action by any executive which does not follow the
exact directions of the Machine he is working with becomes part of the
data for the next problem. The Machine, therefore, knows that the
executive has a certain tendency to disobey. It can incorporate that
tendency into data--even quantitatively, that is, judging exactly how
much and in what direction disobedience would occur. Its next answers
would be just sufficiently biased so that after the executive concerned
disobeyed, it would have automatically corrected those answers to
optimal directions. The Machine knows...."

Can you imagine how much information would have to be collected and
processed every hour, every minute, every second, in order to make such
control possible? I can't. As we move into the future, it seems events
enfold us in a chaotic rush, which only afterward take on meaningful
patterns when we place them in context.

Asimov, however, assumes that the onrushing events in human society
can be understood by the Machines long before they happen. We can
understand better what is wrong with that assumption by studying an
essay which was first published just a few years after the Foundation
Trilogy, a small essay written from the viewpoint of an ordinary pencil.
Leonard Read's "I, Pencil" has since become a classic in the literature
of the science of economics. Read impresses the reader with the singular
fact that even though billions of pencils have been produced, not one
person knows how to make a pencil. Two interrelated questions
immediately spring to the reader's mind:

One: Why are pencils, supposedly such simple things, so difficult to
make? And Two: How do humans manage to make them at all if they
are so difficult to make, let alone in such prodigious numbers?

Read answers the first question by describing something of what goes
into making a pencil: the production of saws, ropes, trucks, and other
gear needed to harvest the cedar tree out of which comes the wood for
the pencil; the fabrication of steel for the tools, growing of hemp for the
rope, construction of the logging camps, the beds, the mess hall, the
cookery, the raising of food, the shipping of logs, the construction of
trains, rails, engines, and so on; the millworks and all that is needed to
run them; and the kilns, the heat, light and power to run them in the
pencil factory. And we haven't even begun to examine the amount of
work that goes into making the pencil lead, paint, lacquer, metal brace
and eraser!

The answer to the second question, the point Read tried to make, is that
it takes an entire, very complex industrial economy to produce a pencil
(among millions of other things). This ability is a kind of societal
intelligence--an emergent function of large aggregates of self-aware,
self-interested, altruistic, interactive, and very individualistic people.
This ability may seem paradoxical, but is only apparently so (out of
many and one?) since the many and the one exist simultaneously at
different levels of organization. This ability cannot be forced,
commanded or ruled from the center. It can only Be.

The pencil explains, "Actually millions of human beings have had a
hand in my creation, not one of whom even knows more than a very few
of the others... each of these millions sees that he can exchange his tiny
know-how for the goods and services he needs or wants...millions of tiny
bits of know-how configuring naturally and spontaneously in response
to human necessity and desire and in the absence of any human master-
minding."

And if you think pencil making is tough, consider the esoterica that
would be necessary for the construction and maintenance of spaceships,
robots, and space stations--to name but a few things needed to keep a
galactic empire functional. Consider the vast amount of raw material,
human talent and skill, and wide array of tools that would be required
to develop asteroid mining into a going concern, for example. Imagine a
number which would express this complexity. Now multiply that
number by a googol megabytes. And consider that the configuration of
the economic state of that galactic empire would be changing every
nanosecond. Not even the 40 billion busy bureaucrats of Trantor could
manage such an avalanche of data.

Historically, as well as today, societal processes orchestrate the growing
and shipping of food and goods, the generation and allocation of energy,
the setting of political decision-making processes into motion, the
construction of intricate scientific theories out of thousands of ideas and
observations, and the communication of all this and more through ever-
changing networks of individual human beings. Researchers in chaos
theory tell us that not only are the actions of individuals impossible to
predict, but that the future states of entire dynamic systems are also
impossible to know in advance.

Here is where the world-building problems of the Foundation Trilogy
and the Robots series conjoin. Human societies cannot be grasped as
wholes, let alone understood and manipulated in the detail needed for
Asimov's dreams of Robots and Empire to come true. They can only be
allowed to happen. Dynamic systems, such as turbulence, weather, and
societies are notoriously unstable. Scientists describe them as "infinitely
sensitive to initial conditions." This means that if an initial description
of the state of a system is off by even less than a billionth of a percentage
point, after a few cycles of calculation the results will turn out to be
totally different than those predicted.
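
That sensitivity can be watched happening in a few lines. This sketch (my own, using the textbook logistic map x → 4x(1−x) as a stand-in for any chaotic system) starts two trajectories one part in a billion apart and tracks how far they drift:

```python
def logistic(x, r=4.0):
    # The logistic map; r = 4.0 puts it in its chaotic regime.
    return r * x * (1.0 - x)

a, b = 0.3, 0.3 + 1e-9   # initial states differing by a billionth
max_gap = 0.0
for step in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

# Within a few dozen iterations the billionth has grown to order one:
# the two "futures" no longer resemble each other at all.
print(max_gap)
```

The gap roughly doubles each step, so thirty-odd iterations suffice to erase nine decimal places of agreement, which is the review's point about Seldon's Plan in miniature.
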

It does not matter how wise or benevolent the Machines are, or how well
thought-out Seldon's Plan is, the infinite variety of universal processes
will defeat the best intentions of would-be planners every time.

Human society is a new kind of self-organizing system. It is the first one
we know of formed by intelligent beings. This gives society as a whole
the kind of power no other complex chemical or biological system has.
Perhaps we are at the beginning of a new kind of evolutionary
development, one that might lead to a society that spans the globe, the
solar system, or even the galaxy. If so, then Asimov's dreams may come
true, but in a radically different manner, a manner he couldn't have
possibly foreseen.

The Foundation series and the Robots stories, along with Arthur C.
Clarke's "Childhood's End," will probably be remembered as the last
great and most eloquent arguments put forth for the idea of collectivism
in the literature of science fiction. But even as we re-read and enjoy
them, we and our descendants will plunge headlong, unguided, into the
chaotic, self-creating, evolving, no-promises land of the future--of
society and the free human mind.