What is a COMPUTER?
• A computer is a machine or device that can be instructed to carry
out sequences of arithmetic or logical operations automatically via
computer programming.
• Modern computers have the ability to follow generalized sets of
operations, called programs. These programs enable computers to
perform an extremely wide range of tasks.
Early Computing Devices
ABACUS
• Abacus is a Latin word derived from the Greek word abax, which means a calculating table.
• The abacus, also called a counting frame, is a calculating tool and the first known calculator; it was invented in Babylonia around 2400 BC.
• The abacus itself doesn't
calculate; it's simply a device
for helping a human being to
calculate by remembering what
has been counted.
PASCAL'S CALCULATOR
(1642)
The first practical mechanical calculator, also known as the Pascaline or the Arithmetique.
Pascal's calculator is a mechanical calculator invented by Blaise Pascal at about age 18 in the mid-17th century. Pascal was led to develop a calculator by the laborious arithmetical calculations required by his father's work as the supervisor of taxes in Rouen.
The machine had a series of
interlocking cogs (gear wheels with
teeth around their outer edges) that
could add and subtract decimal
numbers.
THE STEPPED RECKONER
(1672)
The stepped reckoner was a digital
mechanical calculator invented by
the German mathematician
Gottfried Wilhelm Leibniz around
1672 and completed in 1694.
The name comes from the
translation of the German term for
its operating mechanism,
Staffelwalze, meaning 'stepped
drum'.
It was the first machine that could execute all four arithmetic operations: addition, subtraction, multiplication, and division.
First Generation: Vacuum Tubes
(1940-1956)
• The first computer systems used vacuum tubes for circuitry and magnetic drums for memory, and
were often enormous, taking up entire rooms.
• First generation computers relied on machine language, the lowest-level programming language
understood by computers, to perform operations, and they could only solve one problem at a time.
• Computers of this generation could only perform a single task, and they had no operating system.
• It would take operators days or even weeks to set up a new problem. Input was based on
punched cards and paper tape, and output was displayed on printouts.
• These computers were very expensive to operate and in addition to using a great deal of
electricity, the first computers generated a lot of heat, which was often the cause of malfunctions.
ATANASOFF-
BERRY
COMPUTER
(1937-1942)
The first automatic electronic digital
computer was built by Dr. John V.
Atanasoff and Clifford Berry.
It was called the Atanasoff-Berry Computer (ABC), an early electronic digital computing device that has remained somewhat obscure.
It was designed only to solve systems
of linear equations and was
successfully tested in 1942.
The ABC pioneered important
elements of modern computing,
including binary arithmetic and
electronic switching elements, but its
special-purpose nature and lack of a
changeable, stored program
distinguish it from modern computers.
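The binary arithmetic the ABC pioneered can be illustrated with a minimal modern sketch of a ripple-carry binary adder. This is an illustration only, not a model of the ABC's actual add-subtract mechanism:

```python
def full_adder(a, b, carry_in):
    """One-bit full adder: returns (sum_bit, carry_out)."""
    total = a + b + carry_in
    return total % 2, total // 2

def add_binary(x_bits, y_bits):
    """Add two equal-length numbers given as lists of bits, least significant bit first."""
    carry = 0
    out = []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)  # final carry becomes the most significant bit
    return out

# 6 (LSB-first bits [0, 1, 1]) + 3 ([1, 1, 0]) = 9 ([1, 0, 0, 1])
print(add_binary([0, 1, 1], [1, 1, 0]))  # [1, 0, 0, 1]
```

The same cascade of one-bit adders, realized in vacuum-tube switching circuits rather than software, is the principle the ABC demonstrated.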
Conceived in 1937, the machine was built by Iowa State College mathematics and physics professor John
Vincent Atanasoff with the help of graduate student Clifford Berry.
Atanasoff and Berry's computer work was not widely known until it was rediscovered in the 1960s,
amidst conflicting claims about the first instance of an electronic computer. At that time ENIAC,
that had been created by John Mauchly and J. Presper Eckert, was considered to be the first
computer in the modern sense, but in 1973 a U.S. District Court invalidated the ENIAC patent and
concluded that the ENIAC inventors had derived the subject matter of the electronic digital
computer from Atanasoff.
Before the ABC's rediscovery, it was generally believed that the first electronic digital computers were the Colossus, built in England in 1943, and the ENIAC, built in the
United States in 1945.
COLOSSUS
(1943)
Colossus was an electronic
digital computer, built
during WWII from over 1700
valves (tubes).
It was used to break the
codes of the German Lorenz
SZ-40 cipher machine that
was used by the German
High Command.
Colossus is sometimes
referred to as the world's
first fixed-program, digital,
electronic computer.
ELECTRONIC NUMERICAL
INTEGRATOR AND COMPUTER
(1943-1945)
Two University of Pennsylvania professors, John
Mauchly and J. Presper Eckert, build the Electronic
Numerical Integrator and Computer (ENIAC).
Considered the grandfather of digital computers, it
fills a 20-foot by 40-foot room and has 18,000
vacuum tubes.
ENIAC was the world's first electronic general-purpose computer. It was Turing-complete, digital, and able to solve a "large class of numerical problems" through reprogramming.
Although ENIAC was designed and primarily used to
calculate artillery firing tables for the United States
Army's Ballistic Research Laboratory, its first program
was a study of the feasibility of the thermonuclear
weapon.
It was the first large-scale computer to run at electronic speed without being slowed by any mechanical parts. During its decade of service, which ended after a 1955 lightning strike, ENIAC may have run more calculations than all mankind had performed up to that point.
ELECTRONIC DISCRETE
VARIABLE AUTOMATIC
COMPUTER
(1946)
1954:
The FORTRAN programming language, an acronym for FORmula TRANslation,
is developed by a team of programmers at IBM led by John Backus, according
to the University of Michigan.
Second Generation: Transistors
(1956-1963)
• The world would see transistors replace vacuum tubes in the second generation of computers.
• The transistor was far superior to the vacuum tube, allowing computers to become smaller,
faster, cheaper, more energy-efficient and more reliable than their first-generation
predecessors. Though the transistor still generated a great deal of heat that could damage the
computer, it was a vast improvement over the vacuum tube.
• Second-generation computers still relied on punched cards for input and printouts for
output.
• From Binary to Assembly:
Second-generation computers moved from cryptic binary machine language to symbolic, or
assembly, languages, which allowed programmers to specify instructions in words.
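The step from binary machine language to symbolic assembly can be sketched with a toy assembler. The instruction set and opcodes below are invented for illustration and do not correspond to any real second-generation machine:

```python
# Hypothetical 3-instruction machine; the opcode values are invented.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def assemble(line):
    """Translate one symbolic instruction, e.g. 'ADD 5', into an 8-bit machine word."""
    mnemonic, operand = line.split()
    # High 4 bits hold the operation code; low 4 bits hold the operand address.
    return (OPCODES[mnemonic] << 4) | int(operand)

# The programmer writes words...
program = ["LOAD 9", "ADD 5", "STORE 9"]
# ...and the assembler produces the binary the machine actually executes.
words = [assemble(line) for line in program]
print([f"{w:08b}" for w in words])  # ['00011001', '00100101', '00111001']
```

First-generation programmers had to write the bit patterns on the last line by hand; the assembler lets them write the mnemonics instead.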
DATAMATIC 1000
(1957)
1970:
The newly formed Intel unveils the Intel 1103, the first Dynamic Random Access
Memory (DRAM) chip.
Fourth Generation: Microprocessors
(1971-Present)
A number of personal
computers hit the market,
including the Scelbi, the
Mark-8, the Altair, the IBM
5100, Radio Shack's TRS-80
(affectionately known as the
"Trash 80") and the
Commodore PET.
TRASH 80
(1977)
APPLE I
(1976)
A desktop computer released by the Apple Computer Company in 1976. It was designed and hand-built by Steve Wozniak.
The idea of selling the
computer came from
Wozniak's friend Steve Jobs.
1977:
Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast
Computer Faire. It offers color graphics and incorporates an audio cassette drive for
storage.
1978:
Accountants rejoice at the introduction of VisiCalc, the first computerized
spreadsheet program.
1979:
Word processing becomes a reality as MicroPro International releases WordStar.
"The defining change was to add margins and word wrap," said creator Rob Barnaby
in email to Mike Petrie in 2000. "Additional changes included getting rid of command
mode and adding a print function. I was the technical brains — I figured out how to
do it, and did it, and documented it."
ACORN
(1981)
1986:
Compaq brings the Deskpro 386 to market. Its 32-bit architecture provides speed comparable to mainframes.
1990:
Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops HyperText Markup Language (HTML), giving
rise to the World Wide Web.
1993:
The Pentium microprocessor advances the use of graphics and music on PCs.
1994:
PCs become gaming machines as "Command & Conquer," "Alone in the Dark 2," "Theme Park," "Magic Carpet," "Descent" and "Little Big
Adventure" are among the games to hit the market.
1996:
Sergey Brin and Larry Page develop the Google search engine at Stanford University.
1997:
Microsoft invests $150 million in Apple, which was struggling at the time, ending Apple's court case against Microsoft in which it alleged
that Microsoft copied the "look and feel" of its operating system.
1999:
The term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires.
Fifth Generation: Artificial Intelligence
(Present-Beyond)
2003:
AMD's Athlon 64, the first 64-bit processor for the consumer market, becomes available.
2004:
Mozilla's Firefox 1.0 challenges Microsoft's Internet Explorer, the dominant Web browser. Facebook, a social
networking site, launches.
2005:
YouTube, a video sharing service, is founded. Google acquires Android, a Linux-based mobile phone
operating system.
MACBOOK PRO
(2006)
2009:
Microsoft launches Windows 7, which offers the ability to pin
applications to the taskbar and advances in touch and handwriting
recognition, among other features.
iPAD
(2010)
Apple unveils the iPad, changing the way consumers view media and
jumpstarting the dormant tablet computer segment.
2011:
Google releases the Chromebook, a laptop that runs the Google Chrome OS.
2012:
Facebook reaches 1 billion users on October 4.
2015:
Apple releases the Apple Watch. Microsoft releases Windows 10.
2016:
The first reprogrammable quantum computer was created. "Until now, there hasn't been any quantum-computing
platform that had the capability to program new algorithms into their system. They're usually each tailored to
attack a particular algorithm," said study lead author Shantanu Debnath, a quantum physicist and optical engineer
at the University of Maryland, College Park.
2017:
The Defense Advanced Research Projects Agency (DARPA) is developing a new "Molecular Informatics" program
that uses molecules as computers. "Chemistry offers a rich set of properties that we may be able to harness for
rapid, scalable information storage and processing," Anne Fischer, program manager in DARPA's Defense Sciences
Office, said in a statement. "Millions of molecules exist, and each molecule has a unique three-dimensional atomic
structure as well as variables such as shape, size, or even color. This richness provides a vast design space for
exploring novel and multi-value ways to encode and process data beyond the 0s and 1s of current logic-based,
digital architectures."
That's all.
Group IV:
Argente, Angelica P.
Gaganao, Kate G.
Calvadores, Adam
Navidad, June