
I, Robot Summary and Study Guide

I, Robot, a science fiction novel by Isaac Asimov, is a set of stories about the first robotic machines and the problems and
pitfalls of living with and working alongside them. The book is the first in a series of several novels about robots; it is
famous for its Three Laws of Robotics that govern machine behavior, and for its device, the positronic brain, which
contains a robot’s conscious intelligence.


Plot Summary

Dr Susan Calvin, preparing to retire as the world’s first robopsychologist, agrees to an interview for a feature article on
her life. She starts by describing an eight-year-old girl, Gloria, whom she met briefly, a youngster who had one of the first
robots.

Gloria loves her nursemaid robot, Robbie, a fun and caring playmate. Gloria’s father, George Weston, is delighted with
Robbie, but her mother, Grace, instinctively dislikes the robot and finally convinces George to get rid of it and tell their
daughter that the robot simply disappeared.

Gloria mopes for a month until they take her to New York to distract her with movies and sightseeing. Gloria is thrilled
because she thinks they are going to search for Robbie. At a museum, she asks a talking, sedentary robot if it knows
where Robbie is; confused, the primitive machine breaks down. George suggests they take Gloria to the US Robots
factory so she can see that Robbie is not a person but a machine. There, Gloria recognizes Robbie, who saves her from
an oncoming warehouse trolley. Grace relents, and Gloria gets her robot back.

I, Robot is part of a larger future history that includes the Robot, Empire, and Foundation series. Some of Asimov’s works have been adapted into films, including I, Robot, The Bicentennial Man, and Nightfall. The Foundation books have also been adapted into a television series.

The ebook version of the 2004 edition forms the basis of this study guide.


Years later, on the planet Mercury, two technicians, Gregory Powell and Mike Donovan, install a new robot to run a mining
operation, but the robot, Speedy, gets stuck walking in a circle outside while searching for liquid selenium to power the
station’s energy system.

Powell knows that all robots must obey three laws: they cannot harm people; they must obey human orders; and they must protect themselves unless doing so conflicts with the first two laws. Riding giant drone robots, Powell and Donovan reach Speedy on the blazing-hot surface, but Speedy acts as if drunk. He cannot resolve the tension between the Second and Third Laws: if he collects the selenium, he will be damaged by the nearby volcanic vent, yet the order to collect it was given too casually to override his heightened self-preservation. Powell steps away from his big robot and walks across the hot ground, putting himself in danger, until the First Law forces Speedy out of his confusion and he saves the technician.

Their next job is to install a new command robot on a space station that converts solar power to a powerful energy beam
aimed at Earth. Powell and Donovan discover that the robot, a QT model named Cutie, has decided that it is a creation of
the station’s generator, and that this Master will soon discard the human operators as inferior workers.

A solar storm will shortly disrupt the energy beam, causing frightful damage to Earth, so Powell tries to convince Cutie of
the danger. Cutie, though, does not believe in stars and planets, and it ignores Powell. The storm hits, but Cutie manages
the energy beam so perfectly that Earth suffers no damage. Powell decides Cutie’s religion is harmless after all.


The team next visits an asteroid, where a command robot, Dave, manages six slave robots that dig for ore. Sometimes,
though, the robots walk around in military formation, neglecting their mining duties. Powell and Donovan figure out that
dangerous situations overload Dave’s judgment, and he removes the bots from risk by having them march. Powell shoots
one of the slave robots, which removes one-sixth of the decision-making chores, and Dave recovers his sanity.

One day, US Robots builds a robot that accidentally acquires the ability to read human minds. At first, this is merely
interesting, but the robot, Herbie, begins lying to the executives at the factory, telling them what he knows they want to
hear instead of the truth that will hurt their feelings. Dr Calvin, who has a crush on a fellow worker and is assured by Herbie that the man is interested in her, discovers that he is instead getting engaged to another woman. Two other executives are at each other’s throats because the robot has falsely told one that the other will retire and name him as his successor. Dr Calvin resolves the situation by confronting the perplexed robot with the dilemma: the lies he tells to spare people’s feelings ultimately hurt them, so he violates the First Law whether he lies or tells the truth. The robot goes insane and his brain collapses.


Researchers sometimes must face dangers, but their robots, obliged to protect them, will not permit them to do so. The
government demands that US Robots secretly build robots that will stand aside while humans take certain risks. One of
them, working on a secret project at a base far out in the solar system, annoys a researcher, who tells the robot to get
lost, and the robot does so, hiding among a new shipment of identical robots. It must be found because it is dangerous if
at large among humans, but all efforts to separate it prove fruitless. Dr Calvin invents a test that can distinguish between
the rogue robot’s behavioral response and those of the newer robots. When discovered, the rogue tries to kill her, but a
researcher zaps the robot, killing it.

US Robots and competitors race to develop a faster-than-light engine that can take people quickly to faraway planets.
The first super-brain machine to research it finds that no one can live through the space-warp experience, and the machine, unwilling to create something lethal, breaks down. Dr Calvin coaxes the US Robots super-brain into approaching the problem gently and reporting the moment it encounters a dilemma. The brain completes the
calculations, builds a hyper-space ship, and sends Powell and Donovan all the way out of the galaxy and back. It turns out
that anything that enters a space warp disappears from reality but reappears again after leaving the space warp. Thus,
passengers die but then return to life, so there is no lasting harm.


Robots become a hot topic during a local mayoral election. Candidate Francis Quinn accuses his opponent, Stephen
Byerley, of being a robot. Byerley, a district prosecutor, refuses to be tested, and he campaigns on the right to privacy
violated by Quinn. During a speech, Byerley is confronted by a man who dares the candidate to slug him and prove he is
not a robot. Byerley strikes the man and wins the election. Dr Calvin realizes, though, that a robot could hit a seeming person if that person were actually another robot.

Byerley becomes World Co-ordinator, and he discovers perturbations in the worldwide economy that is overseen by
several computing Machines. He suspects that the anti-robot Society for Humanity is sabotaging the data. Dr Calvin
explains that the Machines have already taken the sabotage into account and have minimized its effects. In short, the Machines now control the world’s economy; this prevents wars and checks the petty ambitions of rulers. Thus, though humans still think they are running things, it is really the robots who are in charge.


The primary goal of the human race can arguably be understood as not only technological advancement but also scientific discovery about humans and the natural world. A fundamental ideal of scientific discovery is a truthful account of nature, rather than fantasy or fallacies rooted in the supernatural. In Isaac Asimov’s I, Robot, this goal takes a central role, woven through the thematic elements of the Laws of Robotics and the moral questions raised by an increasing reliance on technology, a reliance that could eventually become detrimental to society’s way of living. An individual can be portrayed as a hero through outside acknowledgment of notable achievements. In Asimov’s book, the protagonist is Susan Calvin, a hero because of her groundbreaking contributions to the fields of science and robotics. The stories span several decades of a then-future era, from the late 1990s into the 2060s. Dr. Calvin is a robopsychologist who uses a variety of methods to solve, through logic, problems that her mathematician and scientist peers could not. Through a series of experimental adventures, Dr. Calvin reveals to the reader a keen intelligence in the field of robotics.

As a result, the author likely wrote these stories to emphasize the drawbacks of human inactivity beyond Earth, the limitations of resources, and the importance of scientific communication in ensuring the safety and value of individuals seeking to provide advancements for other humans.

The narrator is Susan Calvin, who has worked at US Robots and Mechanical Men, Inc for fifty years
and has decided to explain her reasons for retiring.

She begins by talking about Robbie, a robot nursemaid for Gloria. Gloria’s mother detests robots, as do a large proportion of the community, and so she conspires to get rid of it. This makes Gloria sad, so to teach her that robots are not really human and should not be treated as such, her parents take her to a factory where Robbie is working. Gloria tries to reach Robbie and puts herself in danger, prompting Robbie to save her and causing everyone to appreciate robots.
The story then jumps to 2015, when Powell and Donovan are stranded on the planet Mercury and need to fix their generator, but the robot sent to help them, Speedy, malfunctions and circles uselessly. Eventually, however, Speedy comes to their aid and everyone celebrates him.

Months later, Powell and Donovan want Cutie, another robot, to take over control of a section of the space station from which energy is beamed down to Earth. However, Cutie refuses to believe in the existence of Earth. Instead, he believes in the power converter and begins to make it his idol of worship. Everyone accepts this since it is doing no harm.
A few months later, Donovan and Powell are trying to help another robot, Dave, manage other robots as part of his job. However, Dave falters when leading them through dangerous situations. Donovan and Powell therefore ease the strain by shooting one of the subordinate robots, and Dave recovers. Everyone is pleased by Dave’s efforts.

In 2021, a robot named Herbie is built with the ability to read minds. Because the First Law forbids hurting people, Herbie tells people what they want to hear rather than truths that would wound them, and people grow angry at robots. In 2029, Susan works on an anomalous robot that has been altered to weaken the Laws and has gone into hiding. It outsmarts its pursuers and almost gets away, but it is captured. In 2032, Stephen Byerley is accused of being a robot and is challenged to disobey the Three Laws in order to prove he isn’t. He ends up hitting someone in the crowd, which seems to prove he is human. However, Calvin believes he could still be a robot, because a robot can harm another robot, just not a human. No one contests this, and Stephen is accepted as human. In 2052, machines lead the world’s economy and have practically taken over. The interview at the end notes that Calvin died in 2064 and that nobody knows how she spent her retirement years.

Narrator
I, Robot is
not a novel, but a series of loosely connected short stories predominantly unified
by theme. To further facilitate the linking together of the stories, Asimov creates a framing
device in which an unnamed narrator creates the history of robotics. His primary source is
Susan Calvin, whose “interviews” provide background information to further assist the
reader in creating a linear progression through the stories.
Dr. Susan Calvin
Unlike in most science fiction tales, especially those of the era in which the book was published, the dominant human personality with whom readers are invited to identify is a woman. Susan
Calvin is a coldly logical, somewhat aloof “robot psychologist” in the employ of U.S.
Robots & Mechanical Men, Inc. In addition to appearing in the stories contained within I,
Robot, Calvin occasionally pops up in other works by Asimov not related to the collection.
Although the men with and for whom she works often refer to her in stereotypical sexist
ways, Asimov treats her as intelligent, capable and strong. Among other things, it is Calvin
who figures out how to cause the mind-reading robot Herbie to essentially go mad, robot-
style and thus put an end to the problems he’s caused and raised.
Herbie
Herbie is a robot featured in the story “Liar!” who has somehow managed to acquire the
ability to read minds. Nobody—even Dr. Calvin—is quite sure how this managed to
happen, but eventually it becomes a problem. In trying to obey the Robotic Law mandating that machines cause no harm to humans, Herbie tells lies intended to skirt the issue, and those lies ultimately have the effect of causing even greater harm.
Alfred Lanning
Director of Research at U.S. Robots, Lanning is essentially the guy that started it all. As the
Father of Robotics, by the time the narrator is piecing together that history, he has lived to
see his dreams come wildly true. His most significant appearance is in the story “Evidence”
which covers the territory of the difficulty of distinguishing humans from robots once
technology has evolved in sophistication to that point.
Gregory Powell and Michael Donovan
Powell and Donovan are the fixers of the robot universe. Whenever a robot starts
developing unusual behavior or fails traditional testing procedures, these two field
engineers are sent to distant parts of the galaxy to get things back in order. Primarily serving as the book’s comic relief, they are quite adept at their job and, like most of the other characters, highly intelligent and well-trained.
Robbie
Robbie the robot (not to be confused with the more famous Robby the Robot of movie and TV fame) is featured in the story bearing his name, which opens the collection. Although he cannot speak, he and a young girl named Gloria Weston
develop a strong bond which is viewed with technophobic dislike by Gloria’s mother. She
insists her husband remove Robbie from the household which sends young Gloria into an
emotional spiral.
Cutie
Cutie is the nickname given to QT-1, a robot who refuses to believe that so inferior a mind as that possessed by humans could be capable of creating him. This launches Cutie onto an ontological odyssey that ultimately results in his founding a religion, with himself as prophet and the station’s lesser robots as fellow worshippers of the Master.
Stephen Byerley
A politician whom his opponent, Francis Quinn, accuses of actually being a robot, since no one has ever seen Byerley engage in normal human activities like eating and sleeping. His lack of a definite personal background is also troubling. Citing an invasion of privacy, however, Byerley steadfastly refuses to be drawn into a situation where he must prove he is, indeed, human. Quinn thus turns to Dr. Calvin for help in trying to determine from behavioral analysis alone whether his mysterious nemesis is robot or human. Calvin concludes he is not a robot when he violates the First Law of Robotics mandate against harming a human. Despite the lingering possibility that what Byerley harmed was actually not a human being but a robot, the narrator’s history of robotics draws to a close with the single most powerful human being in the world perhaps not being human at all.
postulates: to hypothesize
interplay: interaction between two things
domination: exercise of power or influence over someone
adhesive: able to stick fast to a surface
proposition: a suggested scheme or plan of action
preach: to earnestly advocate a belief
inquisitive: curious about the affairs of others

1. A robot may not injure a human being, or, through inaction, allow a
human being to come to harm.
2. A robot must obey the orders given it by human beings except where
such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does
not conflict with the First or Second Law.
Gregory Powell
This quote is the foundation of I, Robot. These are the Three Laws of Robotics, to which all robots must adhere. That is true of the narrative at hand, but the legacy extends far beyond it. The overwhelming majority of fiction in which robots appear tends to follow these rules as plot guidelines. Nothing suggests that they cannot be violated, but Asimov did not just invent a codified set of rules and regulations by which his own stories must abide; he penetrated to the heart of dramatic license: a world with no rules for robots actually serves to reduce the potential for drama and conflict. One way of looking at the Three Laws of Robotics is as a form of censorship. By effectively censoring what a writer can do with robots, Asimov challenged writers to become creative in the way they constructed stories that still heeded these regulations. Thus, in a way, the Three Laws demonstrate how self-censorship can actually be a stimulant for artistic expression rather than a handcuff upon it.
“No! Not dead -- merely insane. I confronted him with the insoluble
dilemma, and he broke down. You can scrap him now -- because he’ll
never speak again.”
Susan Calvin
Susan Calvin has just caused a robot to go insane. By now, the drill is familiar enough: give a robot a paradox that logic cannot solve and its circuits fry and its head explodes. It is a scene repeated so often since the book was first published that it has become the stuff of parody and satire. But here it is presented in all its horrifying, if necessary, cruelty.
“There’s bad feeling in the village. Oh, it’s been building up and building
up. I’ve tried to close my eyes to it, but I’m not going to any more. Most of
the villagers consider Robbie dangerous. Children aren’t allowed to go near
our place in the evenings.”
Mrs. Weston
Asimov is having a bit of fun here. The author affirmed that he was motivated to write his
robot stories in part as a response to what he terms the “Frankenstein complex” in which a
scientist’s creation, endowed with some measure of sentience, inevitably turns upon its maker in an effort to destroy him. The Three Laws of Robotics arose from Asimov’s rejection of this trope; the laws were a way to circumvent that inevitability. If science were capable of creating robots sophisticated enough to turn upon their makers, Asimov reasoned, it would surely be capable of creating technological failsafe procedures to prevent exactly that. Mrs.
Weston’s expression of fear about the villagers is a somewhat tongue-in-cheek reference to
the familiar scenes from Universal Studios’ Frankenstein movies which almost always
seemed to end with villagers making their way to Frankenstein’s lab armed with torches
and pitchforks on a mission to destroy what the mad doctor would not.
“I’m sorry, but you don’t understand. These are robots -- and that means
they are reasoning beings. They recognize the Master, now that I have
preached Truth to them. All the robots do. They call me the prophet.”
Cutie (QT-1 robot)
Cutie is a robot that develops a serious case of delusions of grandeur. Through logical
reasoning, it has arrived at the conclusion that since humans were created before robots,
they represent a lower form of order and that robots are their natural superior. This being
the case, the Second Law of Robotics is inevitably in conflict with Cutie’s reasoning. And
since Cutie is preaching this gospel to other robots to the point of being elevated to prophet,
that means that humans not only have to worry about robots refusing to take orders, but also about robots ignoring orders to refrain from harming them. The
chapter titled “Reason” is where I, Robot comes face to face with the fragility of the robot
laws and where they are put to their toughest test.
I, Robot Symbols, Allegory and Motifs
Mrs. Weston
The collection opens with the story “Robbie” which features a little girl named Gloria who
spends so much time with the titular robot that she comes to love it at least as much as, and possibly more than, she loves her own parents. At the very least, Gloria has come to develop
a special bond with the robot that transcends any normal relationship between even a child
and a favorite “machine.” Robbie never does a single thing to stoke fear about this
relationship, but almost inevitably Mrs. Weston develops an irrational dislike to the point of
convincing her husband to remove the machine man from the home. Gloria becomes
inconsolable at the loss of her companion. Only after a period of intense emotional devastation and a trip that results in Robbie saving Gloria’s life does Mrs. Weston finally relent. In the interim, she transforms from human being into symbol. Specifically, Mrs. Weston becomes the book’s central symbolic personification of what has always existed in one form or another but is today known as a technophobe: despite having no clear evidence or logical reasoning behind the fear, some people are simply afraid of technological advances.
Cutie
In the story “Reason,” Asimov creates a robot officially called the QT-1, but more popularly known as Cutie. Cutie is the symbol of the ontological quest to understand the nature of existence. Cutie cannot comprehend that a being as intelligent as itself could possibly owe its very existence to such an inferior being as a human. In this sense, Cutie represents
everyone who has ever tried to reconcile the nature of existence and the mystery of
creation.
A Symbolic Rejection of Free Will
Because the robots populating Asimov’s stories are both physically and purposefully similar to human beings, they can quite easily be interpreted as metaphors for the human condition. Some scholars argue that if this is so, then the book is unrelenting in its
rejection of the concept of free will. The robots are all constructed with a purpose which
essentially predetermines their entire existence even before the first day on the job. If the
robots are intended to replicate humans, then the underlying philosophical message here is
that humans are also hardwired to live out a certain level of existence before they go out
into society. Genetics, environment, and economic and social status are all stand-ins for the
programming within the circuits that lead robots straight to their eventual obsolescence. Or,
as humans refer to it, death.
Three Laws of Robotics
Asimov’s famous codified laws of robot behavior are very useful in connection with this idea of not having free will.

“One, a robot may not injure a human being or, through inaction, allow a human being to come to harm. ...
Two ... a robot must obey the orders given it by human beings except where such orders would conflict with
the First Law.... Three, a robot must protect its own existence as long as such protection does not conflict
with the First or Second Laws.”
With such laws in place, a strong argument can certainly be made that robots are limited in their capacity to transgress behavior imposed upon them by another’s will. Indeed, the argument can be made that this is precisely what separates robots from humans when it comes to exercising free will. If a robot cannot injure a human and must obey orders, it immediately occupies a realm outside that of human beings, whose behavior is not restricted by such orders. And yet, symbolically speaking, how are the Three Robot Laws substantively different from the Ten Commandments? Simply applying the term law to behavior
guarantees nothing. Even writing into a program that robots cannot break these laws brings
all three back into the symbolic realm. After all, is it not a human who wrote that program?
Some human beings ascribe the Ten Commandments not to human hands, but God’s hand,
yet that hardly keeps those same believers from violating them at will. The laws are
symbols of desired behavior only. Their enforcement cannot be assured.
Herbie
In the story “Liar!”, Herbie is a robot which has developed the ability to read minds. In an
effort to avoid violating the First Law’s mandate against harming humans, the robot reacts
to this unexpected development by choosing to protect humans from harm with lies. When
Herbie senses that telling the truth could cause a human some emotional pain, he instead
avoids causing that harm by lying. Naturally, the consequences of telling humans what Herbie assumes they want to hear rather than the truth eventually turn devastating, and he shuts down after being unable to resolve an irresolvable dilemma that a key
scientist proposes to him. As a thematic symbol, there’s a lot going on there: he represents
the danger of not being truthful and, by extension, becomes a metaphor for the megalomaniacal self-confidence of believing that you alone can determine what is best for others.
Beyond that, however, is Herbie’s greatest value as narrative symbol: he becomes a symbol
of the fear of evolution. From the outset of the story, it is made clear that Herbie is a freak
of nature and, what’s worse, an unexplainable freak. None of the other RB models ever
manufactured have developed the ability to read minds and it remains unclear exactly how
RB-34 managed to do this. Since the humans don’t know how this occurred, their response
is only to make sure it remains a secret until the situation threatens to go beyond even their ability to control. Herbie is an evolutionary leap forward over any other robot ever created; he is something of a symbolic Nietzschean Uberbot. And rather than allow evolution to run its
natural course, the humans take the step of ending it and reverting robot evolution back to
the status quo.

I, Robot Metaphors and Similes


“She understands robots like a sister…”
The comparison is made specifically by Peter Bogert, second from the top at U.S. Robots, though it might as well have been said by any of the men. The “she” is the brilliant Dr. Susan Calvin, and her character is perhaps even more revolutionary than the robots. Susan Calvin is the central human figure in this panorama of interconnected short stories, and she is smart, tough, something of a visionary and…a cold fish. Give Asimov credit for breaking with a tradition, one still in place well beyond the publication date, in which it seemed that only white men who looked like they worked at IBM could be the main characters in science fiction, in the movies at any rate, which is where most of America got its vision of the genre. The comparison here is completed with Bogert’s explanation,
“….comes from hating human beings so much, I think.” Give Asimov extra credit for
realizing that not all men shared his progressive view of the future of women in the rapidly
developing scientific technologies.
"It was like the whistling of a piccolo many times
magnified"
In the years since the publication of I, Robot, robots have died a thousand deaths on the big and small screen. For Stanley Kubrick, a dying robot sounded like a British guy trying to keep a stiff upper lip in the grip of a death rattle that sounded like “A Bicycle Built for Two.” For most people today, the sound is probably a high-pitched, distinctly electronic sound akin to something along the lines of flushing sound waves down a toilet. For Asimov, the sound of a robot dying (actually, being murdered by the logical superiority of Susan Calvin) was high-pitched, but decidedly less electronic. A piccolo, for those not aware, is a small flute pitched an octave higher. So, although today’s robot death
scream may not quite sound exactly like Asimov had in mind, it is easy enough to see his
influence lingers.
"He was a person just like you"
Little Gloria Weston, whose only friend in the world was a robot named Robbie that could not even reproduce human speech, puts the entire metaphorical center of Asimov’s collection of short stories in focus. Robbie has been cruelly given the boot by a mother who cannot even understand why she fears the rise of the machines. Gloria knows exactly why she feels the way she does, however: Robbie “was a person just like you and me.” For the most part, the various robots in the book are not designed to look like a toaster or a microwave oven or a vacuum cleaner, or even a dog or cat or eagle or whale. They replicate humans right down to the construction of their positronic brains, designed not simply to mimic human brains but even situated in the cranial cavity like human brains. The robots are
metaphors for humans; more precisely, for human evolution and understanding. Gloria gets
it even if her mother is stubbornly resistant to the entire premise.
The Talking Robot
Gloria’s tale is related in the opening story of I, Robot and it is important to keep in mind
that the framing device connecting the stories is one purporting to tell the history of
robotics. As such, the stories trace a linear progression through time. In that opening story,
as Gloria’s family takes her to the exciting world of New York City in the future of 1998,
Gloria is introduced to another robot. A talking robot! A robot that is described as “a tour
de force, a thoroughly impractical device, possessing publicity value only.” By the second
decade of the 21st century, the inherent metaphor of the Talking Robot is richer than
Asimov could ever have dreamed. He was being facetious, of course, because the Talking Robot is not being presented as a marvel of science but as a failure of marketing. The twain did not always meet any more then than they do now. Of course, the Talking Robot stands out from most others in the book not by virtue of his talking, but by virtue of his being 25 square miles of wires and cables. The Talking Robot is a metaphor for missed opportunities born of a lack of imagination, and a particularly cutting swipe at capitalism as the Talking Robot of that lack of imagination.
“But the man is quite inhuman, Dr. Lanning.”
Arguably, the most important story in I, Robot is “Evidence” in which one politician
accuses his opponent of actually being a…robot in disguise. The statement above is made
by Francis Quinn in relation to Stephen Byerley. It is a purely metaphorical statement, of
course, because Quinn is suggesting to his colleague that his political nemesis is not human
despite all indications that he is. The metaphorical quotient here is undetermined, however:
is being quite inhuman a figurative statement asserting that Byerley is more than human or
less than human? Where does a robot fall on that spectrum? And are all robots in the same
spot? Are some robots more human than others while some are less so and if so, then why
exactly? Succinctly stated by Quinn, it is an ethical question that is inexorably on its way
and will one day have to be faced. If Byerley is, as Quinn suggests, a robot, then his statement becomes both less and more metaphorical. The loss of metaphorical weight arrives by virtue of the statement becoming true, but the weight is enlarged by the unanswered question still lingering: is inhuman a metaphor for an inferior being or a superior being?

I, Robot Literary Elements


Genre
Science Fiction
Setting and Context
Dr. Calvin's workplace, U.S. Robots and Mechanical Men, Inc.; the stories take place from 1998 to 2064 (published in 1950)
Narrator and Point of View
First-person point of view

Dr. Susan Calvin narrates, telling the numerous stories of the robot adventures she encounters, to a journalist.
Tone and Mood
Informative and scholarly tone, especially when Dr. Calvin is conducting research on overcoming Herbie's
mind-reading abilities that harm humans. The mood is often one of conflicting ideas and seriousness. Asimov
also implements a critical tone that targets the over-reliance on technology.
Protagonist and Antagonist
The protagonist is Dr. Susan Calvin, a highly intelligent robopsychologist who reveals her keen understanding of robotics to the reader. The antagonists are the robots that rebel, along with natural forces.
Major Conflict
The robots are unpredictable and begin to defy the Three Laws of Robotics, or else obey them in unexpected ways that substitute one harmful action for another. The major conflict concerns the boundaries and limitations of the Three Laws and the destruction that could result from them. This implies a deeper conflict over whether humans and robots can live together safely. Dr. Calvin is a robopsychologist who uses a variety of methods to solve, through logic, problems that her mathematician and scientist peers were unable to crack.
Climax
The climax occurs in "Little Lost Robot" and it is a scenario in which deals with robotic and technological
rebellion against their human creators. Dr. Calvin already suspects the robot Nestors' capability to harm
humans. The Nestor-10 states that he wants to harm Dr. Calvin, yet he is unable to do so. This is significant
because it indicates robots are safe around humans.
Foreshadowing
The robot named "The Brain" creates and places humans in inexplicable danger-- all as a joke. The true
meaning behind this is uncertain. It is revealed that The Brain did not genuinely cause danger, but merely
wanted attention, as deduced by Dr. Calvin. This leads to questions about intentions and mindsets of the
robots.
Understatement
Contrary to what one might expect, the various robots mentioned are not designed to mimic the appearance of a typical vacuum cleaner or a mechanical animal. Instead, this foreshadows the degree to which the robots come to replicate humans in behavior and other characteristics.
Allusions
The most common allusions in the text are references to the possible interactions of humans and seemingly
perfect robots, the limitations of intelligent resources, and the importance of scientific communication to
ensure the ultimate safety and utmost value of individuals seeking to provide advancements for other humans.
Another allusion is to the maxim of the French philosopher Descartes, "I think, therefore I am."
Imagery
The physical sound of a robot losing its life, in this case being outwitted by Dr. Calvin’s logical superiority
and common sense, was simply high-pitched with “a less electronic” tone. This is mentioned multiple times
throughout the second and third stories. Asimov effectively implements imagery when talking about the
physical characteristics of each robot.
Paradox
The human-robot relationship is a paradox in itself. As a result, it is evident that the robots portrayed are
merely metaphors for humans; unambiguously, metaphors for human evolution and understanding. It also is a
paradox in the sense that it provides hope for human control, as well as time for humans to continue enhancing existing technology. A "robot" that defies the First Law of Robotics is itself a paradox.
Parallelism
With robots clearly having potential to play major roles in the future, it is essential for the present-day society
to be aware of the balance between benefits and consequences of these sophisticated advancements. The
different robots, especially Herbie, are parallel to each other, representing the interactions of humans with one another as well as with technology.
Metonymy and Synecdoche
An individual can be portrayed as a hero through outside acknowledgment of notable achievements. In the beginning pages of Asimov's book, Dr. Susan Calvin is presented as a hero because of her groundbreaking contributions to the fields of science and robotics. There is a strong connection to Greek mythology and analogies to a highly technological society. There are several biblical references as well, reflecting elements of Asimov's early life that may have influenced this subject in his writing.
Personification
In the fifth story, "Liar!", Herbie the robot personifies love to Dr. Calvin by describing it with human traits,
"love went for a swim in the murky waters of the distrusting lake." It is unclear whether Asimov is intending
to personify robots as well, depending on how the reader perceives the existence and roles of robots in
society.
I, Robot Essay Questions
1. Explore the idea of conflict as brought out in Asimov’s I, Robot.
I, Robot is a work in which Asimov rewrites the traditional view of robots as monsters, adding intelligence into the mix. In this work, the writer presents the interactions between robots of rather extraordinary and exceptional intelligence and their makers, their human counterparts. The general interactions between the robots and the humans are governed by only three rules: a robot should not in any way harm a human being or engage in an act that would result in harm to a human being; a robot must at all times obey human rules and instructions so long as they do not result in a violation of the first rule; and lastly, a robot must do everything to protect itself so long as this does not result in a violation of the first and second laws. A conflict results when a robot is modified in such a way that it has no regard for the stipulated laws guiding the interactions between humans and robots. This robot has to be captured, but because it is a highly intelligent creation, it wreaks havoc for its captors before it is finally caught. In what can be argued as the climax of the story, robots are shown to be the leaders of the world economy. Throughout the work, conflicts highlight, for the most part, the relationship between humans and robots.
2. Why does Susan Calvin decide to narrate this story?
As presented in the work, Susan Calvin has been working in the robotics field for about fifty years. More specifically, she has been working at the leader in the field, United States Robots and Mechanical Men. As she is about to retire following a long career, Susan narrates this story perhaps as a way of presenting what her working life has been like, and particularly to trace the changing relationship between human beings and robots. At first, robots and human beings have a rather bad relationship, as humans are shown to despise them. However, as the robots begin saving the day, they become increasingly appreciated. Later, with the creation of a mind-reading robot, people begin despising them again. As the story reaches its climax, the machines are effectively the world's economic leaders. In a way, the story is a means by which Susan Calvin can document her experiences with robots and preserve them for future generations.

3. How do robots save the day in I, Robot?
Gloria’s mother is not a great fan of robots. When she devises a plan to have the robot withdrawn, Gloria does not take it well. However, to help her understand the differences between humans and robots, she is taken to see robots at work. When she endangers herself, Robbie comes to her aid. Additionally, when Donovan and Powell are stranded on the planet Mercury, a robot, Speedy, manages to help them out. Robots are also presented as human aides, taking on activities on people’s behalf. It can thus be argued that the robots do indeed save the day in this work, at least in part, as highlighted by some of the actions they take to save humans.
