Chapter 1 Part 1
One important entrepreneurial venture of the early years was the Apple II personal
computer, introduced in 1977. This event forever changed how society perceived
computers: computing was now available to individuals and very small companies.
IBM tossed its hat into the personal computer ring with the release of the IBM Personal
Computer in 1981. By the end of 1982, 835,000 units had been sold. As software
vendors began to orient their products to the IBM PC, many companies began offering
IBM PC-compatibles, or clones. Today, the IBM PC and its clones have become a
powerful standard in the microcomputer industry.
In 1984, Apple introduced the Macintosh desktop computer with a very friendly
graphical user interface (GUI), proving that computers can be easy and fun to use.
GUIs began to change the complexion of the software industry. They changed the
interaction between the user and the computer from a short, character-oriented
exchange modeled on the teletypewriter to the now-famous WIMP interface
(WIMP stands for windows, icons, menus, and pointing devices).
In 1985, Microsoft adopted the GUI in its Windows operating system for
IBM PC-compatible computers. Windows did not enjoy widespread acceptance until
1990, with the release of Windows 3.0. That release gave a huge boost to the software
industry because larger, more complex programs could now be run on IBM PC-compatibles.
Subsequent releases made the PC even easier to use, fueling the PC explosion in the
1990s.
In 1991, Linus Torvalds developed Linux, a reliable and compactly designed operating
system that is an offshoot of UNIX and runs on many different hardware platforms.
Available free or at very low cost, Linux became a popular alternative to the costly
Windows operating system.
IBM PC-compatible PCs had started out using Intel microprocessor chips, followed by
a succession of ever more powerful chips. But not until the Intel Pentium, introduced
in 1993, and its successors did PCs do much with multimedia (the integration of motion
video, animation, graphics, sound, and so on). The emergence of the high-powered
Pentium processors and their ability to handle multimedia applications changed the way
people view and use PCs.
It was also in 1993 that millions of people began to tune into the Internet for news.
The World Wide Web (WWW), one of several Internet-based applications, came of age
as Web traffic grew 341,634%. The Web is unique in that it enables pages to be linked
across the Internet. A number of Web browsers were introduced (e.g., Mosaic and
Netscape Navigator, developed by Marc Andreessen, and Internet Explorer from
Microsoft Corporation). These browsers enabled users to navigate the World Wide Web
with ease. Today, the WWW is the foundation for most Internet communications and
services. The World Wide Web itself was created in 1991 by Tim Berners-Lee, an
engineer in Geneva, Switzerland.
The year 1996 marked the 50th year of computer history. The US Postal Service issued
stamps commemorating the 50th anniversary of ENIAC, the first full-scale computer,
and the 50 years of computer technology that followed. That same year, the handheld
computer was introduced, signaling to the world that tremendous computing power
could be placed in the palm of your hand. Nowadays, millions of people rely on
handhelds for a variety of personal information management applications, including
e-mail.
In 1999, the world was threatened by the Y2K problem, also known as the
millennium bug. It may have been one of the biggest challenges ever to confront the
businesses of the world. For most of the 20th century, information systems had used
only two digits to represent the year (e.g., 99 for 1999). When the new century began,
non-compliant computers would interpret the date 01-01-00, meant to represent
January 1, 2000, as January 1, 1900. Y2K heightened management's awareness of how
critical information technology is to the operation of any organization.
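The two-digit-year flaw can be sketched in a few lines of code. The function names below are hypothetical, chosen for illustration: one shows the non-compliant logic that always assumes the century is "19", and the other shows "windowing", one common remediation technique in which two-digit years below a pivot value are assigned to the 2000s.

```python
# Illustration of the Y2K two-digit-year bug (hypothetical example).
# Non-compliant systems stored only the last two digits of the year and
# assumed the century prefix "19", misreading dates from January 1, 2000 on.

def expand_year_noncompliant(two_digit_year: int) -> int:
    # Non-compliant logic: always prepend the century "19".
    return 1900 + two_digit_year

def expand_year_windowed(two_digit_year: int, pivot: int = 50) -> int:
    # A common fix ("windowing"): two-digit years below the pivot
    # are treated as 20xx, the rest as 19xx.
    if two_digit_year < pivot:
        return 2000 + two_digit_year
    return 1900 + two_digit_year

print(expand_year_noncompliant(99))  # 1999 - correct for the 20th century
print(expand_year_noncompliant(0))   # 1900 - wrong: 01-01-00 read as 1900
print(expand_year_windowed(0))       # 2000 - correct under the windowing fix
```

Windowing only postpones the ambiguity (a pivot of 50 breaks again for years from 2050 on), which is why permanent fixes expanded date fields to four digits.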
Jack Kilby’s first IC contained a single transistor. Tens of thousands of engineers
around the world have built on his invention, so that each year our society is the
beneficiary of smaller, more powerful, cheaper chips.
One continuing trend in computer development is microminiaturization, the effort to
compress more circuit elements into smaller and smaller chip space. In 1999, scientists
developed a circuit the size of a single layer of molecules, and in 2000 IBM announced
that it had developed new technology to produce computer chips that operate five times
faster than the most advanced models to date.
Researchers are also trying to speed up circuitry functions through the use of
superconductivity, the phenomenon of decreased electrical resistance observed in certain
materials at very low temperatures.
Whether we are moving into a fifth generation of computing is a subject of debate since
the concept of generations may no longer fit the continual, rapid changes occurring in
computer hardware, software, data, and networking technologies. But in any case, we can
be sure that progress in computing will continue to accelerate and that the development of
Internet-based technologies and applications will be one of the major forces driving
computing in the 21st century.