
Chapter 2

History of IT

Overview


Looking back to our past is fundamental to understanding the present and planning for the future. It keeps us attached to our heritage, enlightens and informs us about the struggles of those who came before us, and helps us shed light on current events. Indeed, studying and reflecting on our history gives us knowledge about what courses of action to pursue in the present (Michael, 2011).

Information Technology has long been part of civilization. It has become an essential tool in the creation, recreation, and especially the application of knowledge. One of its essential roles is in the communication process: it extends the distance that humans can reach using only their senses. Through Information Technology, communication encompasses even the opposite side of the world. Hence, development and innovation through the instant meeting of minds and sharing of ideas, regardless of whereabouts and time, become easier and possible.

However, before advancing to our next step, we need to learn the lessons of what happened in the past. As the Filipino saying goes, "Ang taong hindi marunong lumingon sa pinanggalingan ay hindi makakarating sa paroroonan" ("One who does not look back at where he came from will never reach his destination"). Thus, this chapter tackles the history and development of Information Technology to make us more prepared and complete in walking toward our destined future.



Learning Outcomes
At the end of this lesson, students are expected to:

1. Assess events, inventions, and personalities in the history of IT.
2. Express opinions on how the Information Age affects society.
3. Distinguish the stages of the Industrial Revolution.

Pre-test
Multiple Choice: Circle the letter that corresponds to the correct answer.

1. It describes an infrastructure that uses the Internet as a medium for offering and selling services.
a. Morse Code c. Abacus
b. Telecommunication d. Internet of Services

2. He is the Father of Information Theory.
a. Microcomputers c. Charles Babbage
b. Charles Xavier Thomas d. Claude E. Shannon

3. It is a weaving device incorporated in special looms to control individual warp yarns.
a. Jacquard loom c. Jacquard Mechanism
b. Jacquard Attachment d. All of the choices

4. He invented the Voltaic Battery.
a. Lisa Volt c. Alessandro Volta
b. John Mcffaeu d. Fernando Volta

5. He invented the telephone.
a. Harvard Mark c. Alexander Graham Bell
b. Guglielmo Marconi d. Samuel F.B. Morse

6. It refers to devices capable of solving problems by processing information in discrete form.
a. Abacus c. IoS
b. Digital computers d. Digital watches

7. The Fourth Generation of Computer is known as the Age of ______________.
a. The Age of Vacuum Tube c. The Age of Integrated Circuit
b. The Age of Transistor d. The Age of Microprocessor

8. He devised a punched-card tabulating machine.
a. Herman Hollerith c. Grace Murray Hopper
b. Jack S. Kilby d. All of the choices

9. Tim Berners-Lee developed the ____________.
a. Yahoo c. Facebook
b. COBOL d. HTML

10. It is the basis of the fifth generation computers.
a. Microprocessor c. Nanotechnology
b. Internet d. Artificial intelligence



Lesson 4. Four Basic Periods of Information Technology

A-Learning Outcomes
At the end of this lesson, students are expected to:

1. Assess events and inventions made during the Four Basic Periods of Information
Technology.
2. Identify the significant personalities during the periods of Information Technology.

B-Time Allotment
Week 3, Day 5: (1 hour & 30 minutes)

C-Discussion

Four Basic Periods of Computer History

1. Pre-mechanical Age
2. Mechanical Age
3. Electromechanical Age
4. Electronic Age

Pre-mechanical Age

The Pre-mechanical Age is the earliest age of information technology (3000 B.C. - 1450 A.D.).

According to Gates (1994), language was the earliest system for storing and transmitting information from one person to another. History, rituals, stories, prayers, and medical and other knowledge were passed on from generation to generation. When people realized that visual symbols could represent spoken words, they invented their second means of preserving and transmitting knowledge: writing, the chief medium used for this purpose for more than 5,000 years.

The first writings were crude pictures carved on rocks, stone, bark, metal, clay, or whatever materials were at hand, called petroglyphs. The first writing has three kinds:

1. Pictographic - representing an object.
2. Ideographic - representing the idea suggested by the object.
3. Phonographic - representing the sound of the object or idea.

Chinese pictographs. Image courtesy of Sarmad Hussain

Cuneiform.

From about 3600 to 2357 B.C., the Sumerian civilization flourished in the Tigris-Euphrates Valley. Perhaps their most significant contribution to human culture is cuneiform, the Sumerians' style of writing: the oldest known writing system and the oldest information dissemination tool of trade and commerce. The name comes from the Latin word "cuneus," meaning "wedge," an object with one pointed edge and one thick edge that is kept firmly in position. The materials used were soft clay and a wedge-shaped stylus of metal, ivory, or wood.

Sumerian Cuneiform Tablet. Image courtesy of The Metropolitan Museum of Art, New York; Purchase, Raymond and Beverly Sackler Gift, 1988, 1988.433.1, www.metmuseum.org

Sumer's culture passed to Babylonia in Lower Mesopotamia (Iraq), a civilization that lasted until 689 B.C. and which produced Hammurabi and his notable code of laws.

In both Sumerian and Babylonian writing, characters represented syllables rather than letters. The Babylonians used writing in business transactions and in recording noteworthy events. Thus, their books were devoted to government, law, history, and religion.

The Kingdom of Assyria, which existed at the same time as Babylonia, also inherited the Sumerian language and method of writing but modified the written characters until they resembled those of the Babylonians.

Around 2000 B.C., the Phoenicians created symbols expressing single syllables and consonants (the first true alphabet). The Greeks adopted the Phoenician alphabet and added vowels. The Romans then gave the letters Latin names to create the alphabet we use today.

Paper and Pens

The earliest known writings of the Egyptians date from ca. 3000 B.C. The writing material was the papyrus sheet, and the writing instrument was a brush-like pen made by fraying the edges of a reed. The writing style was hieroglyphic, a word derived from the Greek "hieros," meaning sacred, and "glyphein," meaning to carve; hieroglyphic writing thus means "sacred writing."

Hieroglyphics on a temple wall at Karnak, Egypt. Image credit of uwimages/Fotolia

• Sumerians – stylus and wet clay

Stylus. Image credit of Nic Rosenau (pinterest.ph)

Clay being inscribed with the written language (cuneiform). Image credit of http://oracc.museum.upenn.edu/nimrud/ancientkalhu/thewritings/sumerian/index.html

WPU-QSF-ACAD-82A Rev. 00 (09.15.20) 2ND SEMESTER, A.S.ENARIO 4


• Egyptians – reed pen and paper made from papyrus plants (2600 BC)

Reed pen. Image credit of Sutori.com (https://www.sutori.com/item/these-are-reed-pens-that-the-egyptians-made-in-3000-b-c-to-write-on-papyrus-scro)

Paper made from papyrus plant. Image credit of Experience_Ancient_Egypt.com (www.experience-ancient-egypt.com)

• Chinese – The Chinese were the first to master the art of papermaking. They most likely used bark at first but later moved to rags, bones, tortoiseshell, bamboo stalks, wooden tablets, and silk and linen.

Animal bone with writings from the Shang Dynasty, said to be a divination inscription dating to the 6th year of the reign of King Diyi or Dixin, reportedly unearthed at Anyang, Henan Province. Image by BabelStone, available under a Creative Commons License. (https://beyond-calligraphy.com/2010/03/05/oracle-bone-script/; https://www.chinasage.info/oracle-bones.htm)

Bamboo slices and a white writing brush made from fine animal hair. Each bamboo slice is attached by threads that hold them together. Image credit of https://ancientchina132.weebly.com/

Books and Libraries (permanent storage devices)

• Mesopotamia – religious leaders kept the earliest books
• Egyptians – kept scrolls
• Greeks – (600 BC) folded sheets of papyrus vertically into leaves and bound them together

First Numbering System

• Egyptians – vertical lines (|) for numbers 1-9; a U or O for 10; a coiled rope for 100; a lotus blossom for 1,000.

Ancient Egyptian Numbers & Numeral System. Image credit of Facts About Ancient Egyptians. https://ancientegyptianfacts.com/ancient-egyptian-numbers.html

• Hindus – (100-200 AD) nine-digit numbering system (1, 2, 3, 4, 5, 6, 7, 8, 9).
• Around 875 AD, the concept of zero was developed (0, 1, 2, 3, 4, 5, 6, 7, 8, 9).
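To make the additive idea of the Egyptian system concrete, here is a minimal Python sketch of writing a number as counts of symbols (the symbol names and the egyptian helper are illustrative assumptions, not a transliteration of actual hieroglyphs):

```python
# Sketch: an Egyptian-style additive numeral repeats each symbol
# as many times as needed, from the largest value down.
SYMBOLS = [(1000, "lotus blossom"), (100, "coiled rope"),
           (10, "U"), (1, "|")]

def egyptian(n: int) -> str:
    parts = []
    for value, name in SYMBOLS:
        count, n = divmod(n, value)  # how many of this symbol, and the remainder
        if count:
            parts.append(f"{count} x {name}")
    return ", ".join(parts)

print(egyptian(2316))  # 2 x lotus blossom, 3 x coiled rope, 1 x U, 6 x |
```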


This trend continued with the advent of formal language and better media such as rags, papyrus, and eventually paper. The first-ever calculator, the abacus, was invented in this period after the development of numbering systems.

Early civilizations used the abacus for trade and commerce. Probably of Babylonian origin, the abacus is a calculating instrument that uses beads sliding along a series of wires or rods set in a frame to represent the decimal places. It is the ancestor of the modern digital calculator (Merriam-Webster, n.d.).
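One rod per decimal place is simply positional notation: each rod's bead count is the digit for one power of ten. A minimal Python sketch of the idea (the rods helper is an illustrative assumption, not a model of any particular abacus):

```python
# Each abacus rod carries one decimal place; the beads on a rod
# encode that place's digit (0-9). Most significant rod first.
def rods(n: int, places: int = 5) -> list[int]:
    return [(n // 10 ** i) % 10 for i in range(places - 1, -1, -1)]

print(rods(4207))  # [0, 4, 2, 0, 7] -> 10000s, 1000s, 100s, 10s, 1s rods
```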



Mechanical Age

The Mechanical Age is when civilization started to see connections between our current technology and its ancestors. The Mechanical Age can be defined as the time between 1450 and 1840. Johannes Gutenberg, a German goldsmith, invented the modern printing press with movable metal type in 1450.

The first general-purpose computers

In 1614, the Scottish mathematician John Napier published the first table of logarithms to simplify and speed up calculations (multiplication, division, addition, and subtraction).
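Why do logarithm tables speed up arithmetic, and why does the slide rule described next work? Because log(a x b) = log a + log b, a multiplication becomes two table lookups plus an addition. A minimal Python sketch of the idea, with math.log10 standing in for Napier's printed tables:

```python
import math

a, b = 37.5, 482.0
log_sum = math.log10(a) + math.log10(b)  # two "table lookups" and an addition
product = 10 ** log_sum                  # inverse lookup (the antilogarithm)
print(product)  # ~18075.0
print(a * b)    # 18075.0, the direct multiplication, for comparison
```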

An English clergyman, William Oughtred, invented the slide rule: two sliding scales graduated according to the logarithms of the quantities to be calculated.

The first page of Napier's Tables. Image courtesy of Landmarks of Science Series, NewsBank-Readex

Two sliding scales. Image courtesy of Keuffel & Esser Co.

Wilhelm Schickard was a German mathematician famous for building the first automatic calculator in 1623.

The Pascaline was designed and built by the French mathematician-philosopher Blaise Pascal between 1642 and 1644. It could only do addition and subtraction. Pascal invented the machine for his father, a tax collector, so it was the first business machine, too (if one does not count the abacus).

Image courtesy of IBM Archives

Gottfried Leibniz, a German philosopher and mathematician, designed a calculating machine in 1671 and built it in 1673. This machine expanded on Pascal's ideas and did multiplication by repeated addition and shifting (Swaine & Freiberger, 2017).

Image courtesy of the Computer Museum History Center

Charles Xavier Thomas de Colmar of France developed the Arithmometer, an early calculating machine, in 1820. It was the first commercially mass-produced calculating device able to perform addition, subtraction, multiplication, and division.



Charles Babbage

He is considered the "Father of Computing" (born December 26, 1791, in London, England; died October 18, 1871, London), an English mathematician and inventor credited with having conceived the first automatic digital computer.

The Analytical Engine is generally considered the first computer, a general-purpose computing machine (Swaine & Freiberger, 2020).

Charles Babbage's Difference Engine, 1832. Image courtesy of Science Museum London

A portion (completed 1910) of Charles Babbage's Analytical Engine. Only partially built at Babbage's death in 1871, this portion contains the "mill" (functionally analogous to a modern computer's central processing unit) and a printing mechanism. Image courtesy of Science Museum London
The Jacquard loom, also called the Jacquard attachment or Jacquard mechanism, was invented by Joseph-Marie Jacquard. It is a weaving device incorporated in special looms to control individual warp yarns. It enabled looms to produce fabrics having intricate woven patterns such as tapestry, brocade, and damask, and it has also been adapted to the production of patterned knitted fabrics.

Ada Lovelace, in full Ada King, Countess of Lovelace, born Augusta Ada Byron, was an English mathematician and an associate of Charles Babbage. Her understanding of the machine enabled her to create instruction routines that could be fed into the computer. For this, she has been considered the first female computer programmer. The U.S. Department of Defense named the Ada programming language after her.

The Jacquard loom, invented by Joseph-Marie Jacquard. Image courtesy of The Bettmann Archive

The Electromechanical Age: 1840 - 1940.

The key advance made during this period was discovering ways to harness electricity. Knowledge and information could now be converted into electrical impulses. The electromechanical age heralded the beginnings of telecommunication as we know it today. This age can be defined roughly as the time between 1840 and 1940.



The Beginning of Telecommunication.

Telecommunication is defined as the science and technology of communication over a distance. Man's first attempts at distance communication were extremely limited. During the prehistoric period, people relied on fire and smoke signals, as well as drum messages, to carry information over a limited geographic area, especially during war.

Voltaic Battery

Alessandro Volta, an Italian physicist, invented the Voltaic Battery, the first electric battery that provides a continuous current. The volt, a unit of the electromotive force that drives current, was named in his honor in 1881 (The Editors of Encyclopaedia Britannica, 2020).

Alessandro Volta demonstrating his battery's generation of electric current before Napoleon (seated) in Paris in 1801. Image courtesy of Photos.com/Thinkstock

Telegraph.

The telegraph is a device or system that allows the transmission of information by coded signals over a distance. The term often refers to the electric telegraph, developed in the mid-19th century, which became a primary way of transmitting printed information by wire or radio waves (McGillem, 2020).

Morse Code.

It refers to systems of representing letters of the alphabet, numerals, and punctuation marks by an arrangement of dots, dashes, and spaces. These codes are transmitted as electrical pulses of varied lengths, or as analogous mechanical or visual signals such as flashing lights.

Samuel F.B. Morse invented one of the Morse code systems during the 1830s for electrical telegraphy. It was improved by Alfred Lewis Vail, his assistant and partner. In 1851, it was enhanced again by a conference of European nations, producing what is known as the International Morse Code or Continental Morse Code.

Morse Code Alphabet. Image courtesy of Codebug.org.uk
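As an illustration of the scheme, the following minimal Python sketch encodes text using a small excerpt of the International Morse alphabet (the MORSE table and to_morse helper are illustrative, covering only a few letters):

```python
# A few letters of International Morse Code, for illustration only.
MORSE = {
    "A": ".-", "E": ".", "H": "....", "L": ".-..",
    "O": "---", "S": "...", "T": "-",
}

def to_morse(text: str) -> str:
    # Encode each known letter as its dot/dash group, separated by spaces.
    return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)

print(to_morse("SOS"))    # ... --- ...
print(to_morse("HELLO"))  # .... . .-.. .-.. ---
```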
Telephone and Radio.

Alexander Graham Bell invented the telephone in 1876 and refined the phonograph in 1886 (Hochfelder, 2020).

Guglielmo Marconi invented the first radio in 1894, which is considered an important discovery since it showed that electrical waves could also travel through space. It was thus the starting point of the vast development of radio communication, navigation, and broadcasting (Smith-Rose, 2020).



Harvard Mark I (1944). The official name of the Mark I was the Automatic Sequence Controlled Calculator. It is considered the largest electromechanical computer, the first automatic computer, and the first automatic general-purpose digital computer. It could perform the four basic arithmetic operations (Albano, Atole & Ariola).

Electronic Age (1940-Present)

Harvard Mark I. Designed by Howard Aiken, this electromechanical computer, more than 50 feet (15 meters) long and containing some 750,000 components, was used to make ballistics calculations during World War II. Image courtesy of IBM Archives

This is the present age, and it started in 1940, when electronic equipment, including computers, began to take hold. At the beginning of this stage, it was realized that electronic vacuum tubes could be used instead of electromechanical parts.

The First High-Speed, General-Purpose Computer Using Vacuum Tubes: the ENIAC (Electronic Numerical Integrator and Computer)

One of the first fully functional computer systems was the ENIAC, developed by John Mauchly, J. Presper Eckert, Jr., and their colleagues at the Moore School of Electrical Engineering at the University of Pennsylvania. It is considered the first programmable general-purpose electronic digital computer.

ENIAC. Courtesy of the University of Pennsylvania Archives
The First Stored-Program Computer.

At the University of Manchester, Frederic C. Williams and Tom Kilburn built a simple stored-program computer known as the Baby in 1948. By 1949, Williams and Kilburn had extended the Baby into a full-size computer, the Manchester Mark I, which is regarded as the first stored-program digital computer.

Four months after the Baby first worked, the British government commissioned the Ferranti Mark I, built in coordination with Ferranti (an electronics company). It became the first commercial computer.
Because ENIAC had no means of storing program instructions in its memory, John von Neumann designed a new machine with the help of Mauchly and Eckert. Together they produced the EDVAC (Electronic Discrete Variable Automatic Computer).

Meanwhile, Maurice Wilkes, a British scientist at Cambridge University, completed the EDSAC (Electronic Delay Storage Automatic Calculator) two years before EDVAC was finished.

EDSAC, 1947. The EDSAC computer with designer Maurice Wilkes (kneeling in the center of the photograph). Courtesy of the Computer History Museum, Mountain View, CA



The First General-Purpose Computer for Commercial Use.

Computing burst into popular culture with UNIVAC (Universal Automatic Computer), the first computer to become a household name. Eckert and Mauchly developed it.

UNIVAC I Supervisory Control Console. Image courtesy of Richards, Mark (1951)

The Five Generations of Digital Computer

Information technology is divided into five stages or generations of computers, each marked by the technology used to create the main logic element (the electronic component used to store and process information) in computers during the period.

1. The First Generation Computer (1951-1958): The Age of Vacuum Tube

The first generation of computers was characterized by the use of the vacuum tube and was very large (a mainframe could occupy a whole room).

2. The Second Generation Computer (1959-1963): The Age of Transistor

The invention of the transistor marked the start of the second generation of computers, which were smaller in size (a mainframe could be the size of a closet).

3. The Third Generation Computer (1964-1979): The Age of Integrated Circuit

Although the first IC was invented earlier, during the era of first-generation computers, it was only introduced in the late 1960s, making it possible for many transistors to be fabricated on one silicon substrate, with interconnecting wires plated in place.

4. The Fourth Generation Computer (1979-Present): The Age of Microprocessor

The introduction of large-scale integration of circuitry (more circuits per unit of space) marks the beginning of the fourth generation of computers.

5. The Fifth Generation Computer (Present-Onward): Artificial Intelligence

Computer devices with artificial intelligence (AI) are still in development, but some of these technologies, such as voice recognition, are beginning to emerge and be used. AI is made possible by the use of parallel processing and superconductors. Looking to the future, computers will be radically transformed again by quantum computation, molecular computing, and nanotechnology (Burns, 2016).

The generations of computers will be discussed more in Lesson 5.



Lesson 5. Generations of Digital Computer

A-Learning Outcomes
At the end of the chapter, students are expected to:

1. Create a timeline, in a creative style, of the events and inventions that happened during the five generations of the computer.
2. Identify significant personalities in the five generations of computers.

B-Time Allotment
Week 3, Day 6: (1 hour and 30 minutes)

C-Discussion

The Five Generations of Digital Computer

1. The 1st Generation: The Age of Vacuum Tube (1951-1958)
2. The 2nd Generation: The Age of Transistor (1959-1963)
3. The 3rd Generation: The Age of Integrated Circuit (1964-1979)
4. The 4th Generation: The Age of Microprocessor (1979-Present)
5. The 5th Generation: Artificial Intelligence (Present-Onward)

The First Generation: The Age of Vacuum Tube (1951-1958)

Vacuum tubes were used as the internal components of first-generation computers. The first generation of computers began with the introduction of the first commercially viable electronic computer, the UNIVAC I (Universal Automatic Computer I). It was developed in 1951 to improve information processing in business organizations.

Characteristics:

• The sizes of computers were as large as a room.
• Possession of vacuum tubes to perform calculations.
• Usage of a program (an internally stored instruction).
• Usage of capacitors to store binary data and information.
• Usage of punched cards for the communication of input and output data and information.
• Generation of a lot of heat.
• Possession of about 1,000 circuits per cubic foot.

- Zakari & Yar, 2019

The history of computers dates back to the period of the scientific revolution (i.e., 1543-1678). The calculating machine invented by Blaise Pascal in 1642 (a mechanical calculator capable of addition and subtraction, called Pascal's calculator or the Pascaline) and that of Gottfried Leibniz (inventor of a digital mechanical calculator called the Stepped Reckoner) marked the genesis of the application of machines in industry.

In 1801, Joseph-Marie Jacquard invented a loom that used punched wooden cards to weave fabric designs automatically; these punched cards became a basis of the early computers.

In 1936, Alan Turing presented the notion of a universal machine, later called the Turing machine, capable of computing anything that is computable. According to Hodges (2013), Turing's work is regarded as the foundation of computer science and of the artificial intelligence program.
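To give a flavor of the formalism, here is a minimal Python sketch of a Turing machine: a tape, a head position, and a rule table mapping (state, symbol) to (write, move, next state). The toy machine below merely flips bits and halts at the first blank; it illustrates the model, not Turing's own construction:

```python
# A toy Turing machine: rules map (state, symbol) -> (write, move, state).
def run(tape, rules, state="scan", blank="_"):
    cells = dict(enumerate(tape))  # sparse tape
    pos = 0
    while state != "halt":
        symbol = cells.get(pos, blank)
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

flip_bits = {
    ("scan", "0"): ("1", "R", "scan"),
    ("scan", "1"): ("0", "R", "scan"),
    ("scan", "_"): ("_", "R", "halt"),
}
print(run("1011", flip_bits))  # 0100_
```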

In 1890, Herman Hollerith devised a punched-card tabulating machine. His invention was used to tabulate the 1890 US census. Six years later, he founded the Tabulating Machine Company. Five years afterward, his company merged with several others to form the Computing-Tabulating-Recording Company. Thomas J. Watson Sr., the company's general manager, changed its name to International Business Machines (IBM) in 1924.

In 1953, International Business Machines (IBM) produced the first of its computers,
the IBM 701—a machine designed to be mass-produced and easily installed in a customer’s
building. The success of the 701 led IBM to manufacture many other devices for commercial
data processing.

In 1954, the FORTRAN programming language, an acronym for FORmula TRANslation, was developed by a team of programmers at IBM led by John Backus.

Moreover, the invention of the integrated circuit (IC) started here, as Jack S. Kilby of Texas Instruments developed it in 1958.

The Second Generation: The Age of Transistor (1959-1964)

The invention of the integrated circuit (IC) by Jack S. Kilby in 1958 is considered a great invention that changed how the world functions. It is the heart of all electronic equipment today.

The transistor was used as the internal component of the second-generation computer. Transistors were much smaller, faster, and more reliable than vacuum tubes. They consume less electrical energy and need no warm-up time.

Characteristics:

• Computers were still large, but smaller than the first generation of computers.
• Use of transistors in place of vacuum tubes to perform calculations.
• Produced at a reduced cost compared to the first generation of computers.
• Usage of magnetic tapes for data storage.
• Usage of punched cards as input and output of data and information. The use of the keyboard was also introduced.
• Computers still generated a lot of heat, so an air conditioner was needed to maintain a cold temperature.
• Possession of about one thousand circuits per cubic foot.
- Zakari & Yar, 2019

The Honeywell 400 computer is considered the first in the line of second-generation computers.

In the 1950s and 1960s, only the largest companies could afford the six- to seven-digit price tags of mainframe computers. Thus, Digital Equipment Corporation introduced the PDP-8, considered the first successful transistor-based minicomputer. It became an instant hit and saw tremendous demand from business and scientific organizations.

Between 1959 and 1961, COBOL (Common Business Oriented Language) was invented by Grace Murray Hopper. It is an English-like programming language. Its establishment as a required language by the United States Department of Defense, its emphasis on data structures, and its English-like syntax led to its pervasive acceptance, especially in business applications. It became a standardized programming language that is hardware-independent.

The Third Generation: The Age of Integrated Circuit (1965-1970)

Although the first integrated circuit (IC) was invented earlier, in the first generation of computers, it was only introduced in the late 1960s.

Characteristics:

• Large-scale integrated circuits were used for both data processing and storage.
• Miniaturization of computers.
• A monitor, keyboard, and mouse were used.
• Programming languages like COBOL and FORTRAN were developed.
• Had a hundred thousand circuits per cubic foot.
- Zakari & Yar, 2019

In 1965, integrated circuits began to replace transistors as the internal components used to construct computers. An entire circuit board of transistors could be replaced completely with one chip (integrated circuit). Integrated circuits are made of silicon chips. Silicon is a crystalline semiconductor substance that can conduct electric current.

IBM System/360 computers made all previous computers obsolete because customers could move to the next model without worrying about converting their data.

In 1964, Beginner's All-purpose Symbolic Instruction Code (BASIC) was developed by John Kemeny and Thomas Kurtz at Dartmouth College. BASIC is a high-level programming language that gained enormous popularity because it is easy to use and learn.

In 1969, Dennis Ritchie and Ken Thompson developed Multics (Multiplexed Information and Computing Service). They also implemented an operating system (OS) named Unics, which eventually became UNIX. This OS is portable and machine-independent, can run on all types of computers, and supports multi-user processing, multitasking, and networking. Moreover, this OS was written in the C language, which Ritchie and Thompson also developed.

In 1971, the floppy disk was developed by Alan Shugart and his team of IBM engineers.
It allowed data to be shared among computers.

In 1973, Ethernet was developed by Robert Metcalfe. It was used for connecting
multiple computers and other hardware.

In 1976, the Apple I was developed by Steve Jobs and Steve Wozniak, who also started Apple Computer on April Fools' Day of that year. The Apple I was the first computer with a single circuit board.

The Fourth Generation: The Age of Microprocessor (1979-Present)

Characteristics:

• The microprocessor was used to perform all the tasks of a computer system used today.
• The size and cost of computers were reduced.
• Increased speed of computers.
• Very large-scale integrated (VLSI) circuits were used.
• Possession of millions of circuits per cubic foot.
- Zakari & Yar, 2019

WPU-QSF-ACAD-82A Rev. 00 (09.15.20) 2ND SEMESTER, A.S.ENARIO 13


The fourth generation is an extension of the third generation of computers. This generation put more power and capabilities into one chip called the microprocessor. The microprocessor is the brain of the computer; it can process almost all computations and operations of the computer. It can also be used in appliances, digital watches, pocket calculators, television sets, and cars (Pepito, 2002).

Bill Gates and Paul Allen developed a version of the BASIC programming language for the MITS Altair, the first commercially available microcomputer. They went on to form the Microsoft Corporation in 1975.

In 1977, the Apple II personal computer was introduced, making computing available to individuals and very small companies.

The Fifth Generation: Artificial Intelligence (1981-Onward)

The fifth generation of computers is characterized by the very large-scale integrated (VLSI) circuit (microchip), with many thousands of interconnected transistors etched into a single silicon substrate. It is also characterized by networked computers of all sizes, the Internet, intranets, and extranets.

In 1982, the Lotus Development Corporation (later acquired by IBM) was founded by Mitchell Kapor. It introduced an electronic spreadsheet product (Lotus 1-2-3), giving the IBM PC credibility in the business marketplace.

In 1984, the Macintosh desktop computer was introduced by Apple with a very friendly graphical user interface (GUI). This changed the interaction between the user and the computer from the short, character-oriented exchange modeled on the teletypewriter to the now-famous WIMP interface (WIMP stands for windows, icons, menus, and pointing devices).

In 1991, Linux was developed by Linus Torvalds. It is a reliable and compactly designed operating system that is an offshoot of UNIX, and it was used as an alternative to the costly Windows operating system.

In 1990, HyperText Markup Language (HTML) was developed by Tim Berners-Lee. It gave way to the World Wide Web (WWW).

In 1993, Mosaic was developed by Marc Andreessen, and in 1994, he also developed the
Netscape Navigator.

In 1994, Yahoo, a global internet services provider, was founded by Jerry Yang and David Filo. It has been owned by Verizon Communications since 2017.

In 1995, Internet Explorer was launched by the Microsoft Corporation.

The year 1996 marked the 50th year of computer history. An ENIAC commemorative stamp was issued by the US Postal Service to mark the celebration.

In 1999, the millennium bug, or Y2K, threatened the world. Many information systems registered only the last two digits of the year (i.e., 99 for 1999). Thus, "00" could mean January 1, 1900, rather than January 1, 2000.
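A minimal Python sketch of the ambiguity (the two-digit storage and the naive century rule below are illustrative, not any specific system's code):

```python
from datetime import date

# Legacy convention: store only the last two digits of the year,
# and assume the century is 1900 when expanding.
def expand_two_digit_year(yy: int) -> int:
    return 1900 + yy  # the naive rule that caused the Y2K scare

print(date(expand_two_digit_year(99), 12, 31))  # 1999-12-31: as intended
print(date(expand_two_digit_year(0), 1, 1))     # 1900-01-01: 2000 was meant!
```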

Microminiaturization is one of the continuing trends in computer development. It compresses more circuit elements into a smaller chip space. In 1999, scientists developed a circuit the size of a single layer of molecules, and in 2000 IBM announced that it had developed new technology to produce computer chips that operate five times faster than the most advanced models to date.



In 1996, the Google search engine was developed by Sergey Brin and Larry Page.

In 1997, Apple received an investment from Microsoft worth $150 million. This was when Apple ended its court case against Microsoft, which it had sued for allegedly copying the "look and feel" of Apple's operating system.

In 1999, the term Wi-Fi became part of the computing language, and users began
connecting to the Internet without wires.

In 2000, the Mac OS X operating system was unveiled by Apple. It provided protected memory architecture and pre-emptive multitasking, among other benefits. Not to be outdone, Microsoft rolled out Windows XP, which had a significantly redesigned graphical user interface (GUI).

In 2004, Facebook was launched by Mark Zuckerberg and his team of Eduardo Saverin, Dustin Moskovitz, and Chris Hughes. In the same year, the Mozilla Foundation released Firefox 1.0, a web browser.

In 2005, YouTube, a website for sharing videos, was launched. It was registered by
Steve Chen, Chad Hurley, and Jawed Karim.

In 2006, the MacBook Pro, Apple's first Intel-based, dual-core mobile computer, was introduced, along with the Intel-based iMac.

In 2007, the iPhone brought many computer functions to the smartphone.

In 2009, Microsoft launched Windows 7, which offers the ability to pin applications to
the taskbar and advances in touch and handwriting recognition, among other features.

In 2010, the iPad was unveiled by Apple.

In 2011, Chromebook was released by Google. This is a laptop that runs on Google
Chrome OS.

On October 4, 2012, Facebook gained 1 billion users.

In 2015, the Apple Watch was released by Apple, while Windows 10 was released by Microsoft.

The Japanese coined the term "fifth generation computer" to describe their plan to build a powerful computer by the mid-1990s. Later, the term evolved to include computer intelligence: artificial intelligence, natural language, and expert systems. However, the 5th generation focuses more on connectivity, linking computers to other computers (Pepito, 2002).

Artificial intelligence is the basis of the fifth generation of computers. Many technologies of this generation are still being completed or improved, including voice recognition, parallel processing, superconductors, quantum computation, and nanotechnology.

Characteristics:

• Extremely large-scale integration.
• Parallel processing.
• Possession of high-speed logic and memory chips.
• High performance, yet micro-miniaturization.
• The ability of computers to mimic human intelligence, e.g., voice recognition, face detection, and thumbprint recognition.
• Satellite links, virtual reality.
• Possession of billions of circuits per cubic foot.

Examples:
• Supercomputers
• Robots
• Face detectors
• Thumbprint scanners

