The evolution of computers from the 17th century to the present day is outlined in this timeline. Key developments include the first mechanical calculator by Blaise Pascal in 1642, Charles Babbage's conceptualization of the Analytical Engine in the 1830s, Alan Turing's work on computability and the Turing machine in the 1930s, the invention of the integrated circuit by Jack Kilby and Robert Noyce in 1958, the creation of one of the first personal computers by Steve Jobs and Steve Wozniak in 1976, Tim Berners-Lee's development of the World Wide Web in the early 1990s, the introduction of the iPhone and iPad by Apple in 2007 and 2010, IBM's Watson winning Jeopardy! in 2011, and more recent advances in deep learning, quantum computing, and connected technologies.


The evolution of computers from the 17th century to the present day

Timeline

1642 Blaise Pascal

In 1642, Blaise Pascal invented the Pascaline, an early mechanical
calculator designed to perform addition and subtraction
automatically, reducing human error and speeding up calculations.
The Pascaline laid the foundation for future developments in
computing and played a significant role in the history of
calculating devices.

1830s Charles Babbage

The Analytical Engine, conceived by Charles Babbage in the 1830s,
was a groundbreaking design that can be considered a precursor to
modern computers. It was intended to perform various types of
calculations, including arithmetic and logical operations, and
could be programmed using punched cards.

1936 Alan Turing

Alan Turing, a British mathematician, is known for his profound
contributions to the theory of computation and for the Turing
machine. In 1936, Turing introduced the idea of a theoretical
machine that could carry out any computation a human mathematician
could, formalizing the concept of computation itself.
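
As an illustrative aside (not part of the original timeline), the idea Turing formalized can be sketched in a few lines of Python: a machine that reads and writes symbols on a tape, moving left or right according to a fixed rule table. The states, symbols, and unary-increment rules below are invented purely for illustration.

    # A minimal Turing machine simulator (illustrative sketch only).
    # The rule table implements a toy machine that appends a '1' to a
    # unary-encoded number; states and symbols are made up for this example.
    def run_turing_machine(tape, rules, state="start", accept="halt"):
        cells = dict(enumerate(tape))  # sparse tape: position -> symbol
        head = 0
        while state != accept:
            symbol = cells.get(head, "_")  # '_' stands for a blank cell
            state, write, move = rules[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        # Read back the non-blank portion of the tape.
        return "".join(cells[i] for i in sorted(cells) if cells[i] != "_")

    # Rules: (state, symbol read) -> (next state, symbol to write, head move)
    rules = {
        ("start", "1"): ("start", "1", "R"),  # skip over the existing 1s
        ("start", "_"): ("halt", "1", "R"),   # write one more 1, then halt
    }

    print(run_turing_machine("111", rules))  # prints '1111' (3 + 1 in unary)

Tiny as it is, this loop-plus-rule-table structure can in principle express any computation, which is the point of Turing's formalization.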

1958 Jack Kilby & Robert Noyce

Jack Kilby and Robert Noyce are both credited with independently
inventing the integrated circuit (IC), a revolutionary technology
that paved the way for modern electronics and computing.

1976 Steve Jobs & Steve Wozniak

Steve Jobs and Steve Wozniak, the duo who founded Apple Computer in
1976, are among the most well-known revolutionaries of the computing
age. Their creation of one of the first true personal computers
changed people's ideas of what a computer could look like and what
it could do to make their lives easier and their work more
efficient. Apple continues to be one of the most popular brands of
personal computing devices in the world.

1991 Tim Berners-Lee

Tim Berners-Lee is the inventor of the World Wide Web, a system
that allows people to access and share information on the internet.
He developed the idea in the late 1980s and implemented the first
successful communication between a client and a server in 1990.

2007
In 2007, Apple introduced the iPhone, a revolutionary device that
transformed smartphones and mobile computing. Combining a phone, an
iPod, and an internet communication device in one, the iPhone
featured a touch screen interface and a sleek design, and was soon
followed by the App Store.

2010

In 2010, Apple unveiled the iPad, a groundbreaking tablet device
that popularized the concept of tablets. The iPad combined elements
of a smartphone and a laptop, featuring a larger touch screen, a
variety of apps, and a user-friendly interface. Its portability,
versatility, and ease of use appealed to a wide range of users,
from casual consumers to professionals.

2011
In 2011, IBM's Watson won the game show Jeopardy!, demonstrating the
remarkable potential of artificial intelligence (AI). Watson, a
computer system, used advanced natural language processing and
machine learning algorithms to analyze and answer complex questions
posed in natural language.

2015

In 2015, deep learning and neural networks made significant
advancements in various fields, including computer vision, natural
language processing, and speech recognition. These techniques
transformed the way machines understood and processed information,
leading to breakthroughs in tasks like image recognition, language
translation, and voice commands. This progress was driven by the
development of more sophisticated architectures, larger datasets,
and improved training techniques.

2020s
In the 2020s, quantum computing research achieved
breakthroughs in solving complex problems by
harnessing the principles of quantum mechanics.
Quantum computers utilize qubits, which can represent
multiple states simultaneously, allowing them to process
information in ways that classical computers cannot.
These advancements hold promise for tackling intricate
problems like cryptography, optimization, and simulating
quantum systems. However, practical quantum
computers are still in their early stages, facing
challenges in stability and error correction before they
can realize their full potential.
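
As an illustrative aside (not from the original document), the phrase "represent multiple states simultaneously" can be made concrete with a short Python sketch using NumPy: a single qubit's state is a two-amplitude vector, and a Hadamard gate puts it into an equal superposition of 0 and 1. The variable names here are invented for illustration.

    import numpy as np

    # A qubit's state is a length-2 vector of complex amplitudes
    # over the basis states |0> and |1>.
    ket0 = np.array([1.0, 0.0])  # the qubit starts in the state |0>

    # The Hadamard gate rotates |0> into an equal superposition.
    H = np.array([[1, 1],
                  [1, -1]]) / np.sqrt(2)

    state = H @ ket0  # amplitudes are now (1/sqrt(2), 1/sqrt(2))

    # Measurement probabilities are the squared magnitudes of the
    # amplitudes: the qubit is found in |0> or |1> half the time each.
    print(np.abs(state) ** 2)  # prints [0.5 0.5]

A classical bit would need two separate runs to cover both outcomes; the qubit's single state vector carries both amplitudes at once, which is what quantum algorithms exploit.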

PRESENT
In the present, cloud computing, AI (artificial intelligence), IoT
(Internet of Things), and 5G technologies continue to shape the
modern computing landscape. Cloud computing enables scalable and
flexible access to resources; AI is being integrated into various
applications for automation and data analysis; IoT connects
everyday objects to the internet for data collection and control;
and 5G provides faster, more reliable wireless connectivity,
facilitating real-time communication and enabling new possibilities
across industries. These technologies are driving innovation and
transforming how we interact with and leverage digital systems.
