CHRISIA OSORIO BSIT-IB

Why The Future Doesn't Need Us

When the stunning article “Why the Future Doesn’t Need Us” by Bill Joy, chief scientist for Sun
Microsystems, made the cover of Wired Magazine in April 2000, it created quite a rumble in
high-tech circles. Its argument was that “our most powerful 21st century technologies—robotics,
genetic engineering, and nanotech—are threatening to make humans an endangered species.”

Bill Joy was writing about out-of-control, self-replicating technologies that, once the stuff of science fiction, were now only decades, if not years, away. Tens of thousands of scientists, engineers, mathematicians, and systems analysts are working in countries all over the world, churning out theories and specialized applications without much consideration of their overall impacts. The funding has been coming from various governments' military budgets, heavily contracted out to industrial corporations, and now, increasingly, from the commercial pursuits of global corporations. The rate of knowledge production has been exponential as computers become faster and are programmed to become more self-reliant. Seventy percent of the volume of stock trading in the U.S. is now driven by computers and their algorithms, a mere glimmer of the future pictured by Mr. Joy.

What worries sensitive futurists are both the intended and the unintended consequences. Autonomous weaponry, for example, may be intended by government militaries for certain purposes, but it can then produce more dreaded unintended consequences, as when these weapons decide for themselves when and whom to strike.

Last month, astrophysicist Stephen Hawking, Apple co-founder Steve Wozniak, and Elon Musk of Tesla Motors were among the many specialists who signed an open letter calling for a ban on autonomous weapons. The letter says, "If any major military power pushes ahead with artificial intelligence weapons, a global arms race is virtually inevitable," adding that "unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce."

Artificial intelligence (AI), or "thinking machines," worries far more of the serious scientists and technologists than the few who speak out publicly. Last December, in an interview with the BBC, Stephen Hawking, through his computer-generated voice, warned that "the development of full artificial intelligence could spell the end of the human race... It would take off on its own, and re-design itself at an ever increasing rate." Hawking, a big thinker, noted that "humans, who are limited by slow biological evolution, couldn't compete, and would be superseded."
