Joy's thesis about unforeseen effects includes a quote from Ted Kaczynski,
the Unabomber. The concept of unintended consequences, according to
Joy, is "a well-known problem with the design and use of technology..."
Regardless of the strength of Kaczynski's anti-technology argument—which I
also find flawed—it is difficult to argue against the existence of unintended
consequences.1 And it is easy to see why. The repercussions of an action
exist in the future relative to that action, and some consequences are
unknown since the future is uncertain. Furthermore, it goes without saying
that an unknown future and unknowable repercussions are inextricably
linked.
We may now be chasing our own tails as we try to develop defenses against
the hazards posed by new technologies. Every countermeasure may be as
harmful as the technology it was designed to combat. But Joy's conclusion
is puzzling: "The only realistic alternative I see is relinquishment: to limit
development of the technologies that are too dangerous, by limiting our
pursuit of certain kinds of knowledge." For starters, it is unrealistic to
believe that we could limit our pursuit of knowledge even if we wanted to,
or that doing so would be a good idea. Second, at current technological levels,
this "freeze" does not eliminate the danger; the danger exists now.
Joy's panic blinds him to the potential benefits of our knowledge, and his
pessimism prevents him from seeing our knowledge and its applications as
essential to our salvation. Instead, he appeals to the Dalai Lama's ethics to
redeem us, as if another religion's ethics will provide an escape from the
less honorable angels of our nature. I am aware of no good evidence that
religious ethical prescriptions have improved humanity's morality overall.
Indeed, the opposite case might well be made. Why not apply our
understanding to acquire control over ourselves? If we accomplish this,
mastery of our technology will follow. Joy's concerns are valid, but his
answers are unachievable. His deliberate halt to the pursuit of knowledge
would condemn humanity.
Yes, and what about his contention that humans have no business
researching robotics and artificial intelligence since we have "so much
trouble...understanding—ourselves"? The response to this objection, that
striving to understand the mind will not help us understand ourselves, is
that self-awareness is the very purpose of the pursuit of knowledge. His grandmother "had a
sense of the nature of the order of life, and of the importance of living with
and respecting that order," he writes sentimentally, but this is utterly naive
and contradicts reality.
Would he have us die destitute and young, fodder for creatures,
defenseless against sickness, leading lives that were "nasty, brutish, and
short," as Hobbes so succinctly described it? Respecting the natural order
implies impotence and passivity.
In fact, the life that Joy and most of the rest of us live was built on the
labors of people who fought valiantly against nature's order and the pain,
poverty, and suffering that it exudes. What would we be like without
Pasteur, Fleming, and Salk? As Joy points out, life is fragile, but it was more
so in the past, which was far from the idyllic paradise that he imagines.
5. What solutions can you propose so that we do not reach what he
predicts might happen?
I say abandon Joy's pessimism and reject all limits to our knowledge, health,
and longevity. Be cognizant of our past achievements, appreciative of all
that we are, but driven passionately and creatively forward by the
possibility of all that we might become. Therein lies the hope of humanity
and its descendants. In the words of Walt Whitman:
Joy then moves on to his argument that other technologies are making
things worse. Concerning genetic engineering, I can think of no reason, short of
infantile pleadings not to play God, to obstruct our growing ability to
perfect our bodies, eradicate disease, and prevent deformities. Failure to do
so would be immoral, making us responsible for enormous amounts of
avoidable pain and death.
Even if there are Gods who have equipped us with intelligence, it seems
unlikely that they did not intend for us to use it. In terms of
nanotechnology, Joy speaks eloquently about how "engines of creation"
might become "engines of destruction," but it's difficult to see why we or
anyone else would want that.
Joy also thinks that there is something sinister about the fact that NBC
(nuclear, biological, and chemical) technologies are mostly military in
nature and were developed by governments, while GNR (genetics,
nanotechnology, and robotics) technologies are primarily commercial in
nature and are being developed by corporations. Unfortunately, Joy provides us no
cause to share his terror. Are private firms' commercial products more
prone to inflict destruction than governments' military products? At first
look, the contrary appears to be more likely, and Joy gives us no reason to
reconsider.
a. What three concepts from the article will you never forget?
1. Can We Prevent a Future with Machines as Masters?
2. What Dimensions Should We Examine?
3. To Succeed, We Need to Change How We Innovate
b. What three realizations did you have after reading the article?
State your answer in the following manner:
Before reading the article, I thought that, even if the future does not really
need us, the overthrow of the human species by machines is by no means
inevitable. It would not happen overnight; there would necessarily be stages
that we as a species would witness. Initially, there would be a state of
reasonable reliance on machines to augment our thinking, before we
relegated it excessively and detrimentally to them. Conceivably, upon
encountering a situation that went too far, potentially threatening our
existence or relevance, we could intervene.
However, after reading, I can now say that I learned that economic return
to investors and shareholders is, of course, a significant priority when
launching a new entrepreneurial endeavor or corporate innovation
initiative. But the need to take other factors into account when selecting
where to focus our innovative capacities is increasingly urgent:
a. Societal benefit
b. Potential job displacement and commensurate strategies for buoyancy
c. Degree of collaboration among humans being fostered
d. Global climate impact
The intent here is not to delve into each of these dimensions and propose a
means for analyzing investment opportunities against each. The
measurement of the above factors is complex and not straightforward.
The aim, rather, is to look beyond each dimension individually and to
consider them collectively. It is arguably now more important than ever that we
approach innovation such that we are clear and intentional about what we
are advancing. We do this so that we can craft and escort ourselves into a
future that we desire; presumably, one in which human beings will remain
relevant.
c. What three things are still unclear to you after reading the article?
None.
References:
1. https://www.wired.com/2000/04/joy-2/
2. https://reasonandmeaning.com/2016/02/17/critique-of-bill-joys-why-the-future-doesnt-need-us/
3. https://www.forbes.com/sites/columbiabusinessschool/2020/03/26/why-the-future-does-need-us/?sh=aac7a1759f26