BSBA-IM-2C
LESSON 1
History of ICT
The first commercial computer was the UNIVAC I, developed by J. Presper Eckert and
John W. Mauchly in 1951. Its first unit was delivered to the U.S. Census Bureau, and in
1952 a UNIVAC I famously predicted the outcome of the presidential election on live
television. For the next twenty-five years, mainframe computers were
used in large corporations to do calculations and manipulate large amounts of
information stored in databases. Supercomputers were used in science and
engineering, for designing aircraft and nuclear reactors, and for predicting worldwide
weather patterns. Minicomputers came onto the scene in the mid-1960s in small
businesses, manufacturing plants, and factories.
In 1975, MITS (Micro Instrumentation and Telemetry Systems) introduced the Altair
8800, widely regarded as the first commercially successful microcomputer. Apple's first
microcomputer followed in 1976, and Tandy Corporation's Radio Shack TRS-80 and the
Apple II arrived in 1977. The market for microcomputers increased dramatically when
IBM introduced its first personal computer in the fall of 1981.
Because of dramatic improvements in computer components and manufacturing,
personal computers today do more than the largest computers of the mid-1960s at
about a thousandth of the cost. Computers today are divided into four categories by
size, cost, and processing ability. They are supercomputer, mainframe, minicomputer,
and microcomputer, more commonly known as a personal computer. Personal
computer categories include desktop, network, laptop, and handheld.
The term "information technology" evolved in the 1970s. Its basic concept, however,
can be traced to the World War II alliance of the military and industry in the
development of electronics, computers, and information theory. After the 1940s, the
military remained the major source of research and development funding for the
expansion of automation to replace manpower with machine power. Since the 1950s,
four generations of computers have evolved, each marked by hardware of decreasing
size and increasing capability.
The first generation used vacuum tubes, the second used transistors, the third used
integrated circuits, and the fourth used integrated circuits on a single computer chip.
Advances in artificial intelligence that will minimize the need for complex programming
characterize the fifth generation of computers, still in the experimental stage.
Lastly, ICT helped me realize how important school and knowledge are. Especially now
that we are in the new normal, with travel still limited, I can still continue my studies
using technology and computers even while I am at home. I am grateful for these
technologies, and I promise to use them for good.
LESSON 2
Definition of Computer
A computer is a device that accepts information (in the form of digitized data) and
manipulates it for some result based on a program, software, or sequence of
instructions on how the data is to be processed.
Complex computers include the means for storing data (including the program, which is
also a form of data) for some necessary duration. A program may be invariable and built
into the computer hardware (called logic circuitry, as it is on microprocessors), or
different programs may be provided to the computer (loaded into its storage and then
started by an administrator or user). Today's computers have both kinds of
programming.
Major types of computers
Analog computer - represents data by measurable quantities
Desktop computer - a personal computer that fits on a desk and is often used for
business or gaming
Digital computer - operates with numbers expressed as digits
Hybrid computer - combines features of both analog and digital computers
Laptop (notebook) - an easily transported computer that is smaller than a briefcase
Mainframe (big iron) computer - a centralized computer used for large-scale computing
Microcomputer - generally referred to as a PC (personal computer); uses a
microprocessor on a single integrated semiconductor chip
Minicomputer - an antiquated term for a computer that is smaller than a mainframe and
larger than a microcomputer
Netbook - a smaller and less powerful version of a laptop
Personal computer (PC) - a digital computer designed to be used by one person at a
time
Smartphone - a cellular telephone designed with an integrated computer
Supercomputer - a high-performance computer that operates at extremely high speeds
Tablet computer (tablet PC) - a wireless personal computer with a touch screen
Workstation - equipment designed for a single user to complete a specialized
technical/scientific task.
Modern computers inherently follow the ideas of the stored program laid out by John
von Neumann in 1945. Essentially, the program is read by the computer one instruction
at a time, an operation is performed, and the computer then reads the next instruction.
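The fetch-execute cycle described above can be sketched in a few lines of Python. The three-instruction machine below (LOAD, ADD, HALT) is invented purely for this illustration; real instruction sets are far larger, but the cycle is the same: fetch the instruction at the program counter, execute it, and move on to the next one.

```python
# A minimal sketch of the stored-program idea: the program lives in
# memory alongside its data, and the machine repeatedly fetches one
# instruction, executes it, and advances to the next.
# The three-instruction set (LOAD, ADD, HALT) is hypothetical,
# invented only for this example.

def run(memory):
    acc = 0   # accumulator register holding the working value
    pc = 0    # program counter: address of the next instruction
    while True:
        op, arg = memory[pc]   # fetch the instruction at the program counter
        pc += 1                # advance to the next instruction
        if op == "LOAD":       # copy a data value into the accumulator
            acc = memory[arg]
        elif op == "ADD":      # add a data value to the accumulator
            acc += memory[arg]
        elif op == "HALT":     # stop and return the result
            return acc

# Program and data share one memory: cells 0-2 hold instructions,
# cells 3-4 hold data.
memory = {
    0: ("LOAD", 3),
    1: ("ADD", 4),
    2: ("HALT", None),
    3: 2,
    4: 40,
}
print(run(memory))  # prints 42
```

Note that the instructions and the numbers they operate on sit in the same memory, which is the essence of von Neumann's stored-program design.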
From the mid-twentieth century to the present, the advancement of computers is divided into five
generations. While the year span for each generation varies depending on the reference
source, the most recognized generational timeline is below.
1940 to 1956
First generation computers were room-sized machines that used vacuum tubes for
circuitry and magnetic drums for limited internal storage. These machines
used punched cards for data input and a binary machine code (language). Examples of
first generation computers include the ABC (Atanasoff-Berry Computer), Colossus, the
IBM 650, and the EDVAC (Electronic Discrete Variable Automatic Computer).
1956 to 1963
Second generation computers replaced vacuum tubes with transistors, used magnetic
tape storage for increased storage capacity, used BAL (basic assembler language) and
continued to use punched cards for input. Transistors drew less power and generated
less heat than vacuum tubes. Examples of second-generation computers include
the IBM 7090, IBM 7094, IBM 1400 series, and the transistorized UNIVAC 1107.
1964 to 1971
Third generation computers used ICs (integrated circuits) with several transistors and
MOS (metal oxide semiconductor) memory. Smaller, cheaper and faster than their
predecessors, these computers used keyboards for input, monitors for output, and
employed programming languages such as FORTRAN (Formula Translation) and
COBOL (Common Business Oriented Language). Examples of third generation
computers include the IBM System/360 and System/370 series.
1972 to 2010
Fourth generation computers used integrated circuits and microprocessors with VLSI
(very large scale integration), RAM (random access memory), ROM (read-only
memory), and high-level programming languages including C and C++. The creation
and expansion of the World Wide Web and cloud computing (the ability to deliver
hosted services using the Internet) significantly enhanced computing capabilities during
this period. Examples of fourth generation computers include Apple's Macintosh and
IBM's PC.
Vacuum tube – an electronic device that controls the flow of electrons in a vacuum. It
was used as a switch, amplifier, or display screen in many older-model radios, televisions,
computers, etc.
Integrated circuit (IC) – a small electronic circuit printed on a chip (usually made of
silicon) that contains many of its own circuit elements (e.g. transistors, diodes, resistors,
etc.).
Magnetic drum – a cylinder coated with magnetic material, on which data and programs
can be stored.
Magnetic core – uses arrays of small rings of magnetized material called cores to store
information.
Assembly language is similar to the machine language that a computer can understand,
except that assembly language uses abbreviated mnemonics (e.g. ADD, SUB, DIV…) in
place of binary numbers (0s and 1s).
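As a rough sketch, the core job of an assembler can be shown in Python: translating each mnemonic into the numeric opcode the machine actually executes. The opcode values below are made up for the example and do not belong to any real instruction set.

```python
# A toy illustration of what an assembler does: translate mnemonics
# such as ADD and SUB into numeric opcodes.
# The opcode values are invented for this example only.
OPCODES = {"ADD": 0b0001, "SUB": 0b0010, "DIV": 0b0011}

def assemble(lines):
    """Translate 'MNEMONIC operand' lines into (opcode, operand) pairs."""
    program = []
    for line in lines:
        mnemonic, operand = line.split()
        program.append((OPCODES[mnemonic], int(operand)))
    return program

print(assemble(["ADD 7", "SUB 3"]))  # prints [(1, 7), (2, 3)]
```

The human-readable mnemonics on the left become the numbers on the right, which is exactly the substitution the definition above describes.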
Memory – a physical device that is used to store data, information, and programs in a
computer.
Artificial intelligence (AI) – an area of computer science that deals with the simulation
and creation of intelligent machines or intelligent behavior in computers (they think, learn,
work, and react like humans).
Fifth Generation of Computers – the present and the future, based on artificial
intelligence
The main characteristics of fifth generation computers:
Main electronic component: Ultra Large-Scale Integration (ULSI) technology,
combined with artificial intelligence and the parallel processing method.
o ULSI – millions of transistors on a single microchip
o Parallel processing method – uses two or more microprocessors to run
tasks simultaneously.
Language – understand natural language (human language).
Power – consume less power and generate less heat.
Speed – remarkable improvement of speed, accuracy and reliability (in
comparison with the fourth generation computers).
Size – portable and small in size, and have a huge storage capacity.
Input / output device – keyboard, monitor, mouse, trackpad (or touchpad),
touchscreen, pen, speech input (recognise voice / speech), light scanner, printer,
etc.
Example – desktops, laptops, tablets, smartphones, etc.
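The parallel processing method listed above can be sketched with Python's standard multiprocessing module, which spreads independent tasks across a pool of worker processes (and thus across the machine's processor cores) instead of running them one at a time.

```python
# A small sketch of parallel processing: a pool of worker processes
# computes the squares of several numbers at the same time, each
# worker handling a share of the list.
from multiprocessing import Pool

def square(n):
    return n * n

if __name__ == "__main__":
    with Pool(processes=2) as pool:          # two workers running in parallel
        results = pool.map(square, [1, 2, 3, 4])
    print(results)  # prints [1, 4, 9, 16]
```

For a task this small the overhead of starting worker processes outweighs the gain; the pattern pays off when each task involves substantial computation.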