SUBJECT:
CLOUD COMPUTING
ASSIGNMENT NO. 4
COURSE INSTRUCTOR:
DR. HUMAIRA IJAZ
COURSE CODE:
IT-4544
PREPARED BY:
WALEED AHMED
BSEF17E05
BSSE (7TH- SELF)
2017-2021
DATE:
Tuesday, June 22, 2021
DEPARTMENT OF CS & IT
UNIVERSITY OF SARGODHA
SARGODHA, 40100
INTRODUCTION:
HARDWARE EVOLUTION:
1. FIRST-GENERATION COMPUTERS:
2. SECOND-GENERATION COMPUTERS:
3. THIRD-GENERATION COMPUTERS:
4. FOURTH-GENERATION COMPUTERS:
CONCLUSION:
SOFTWARE EVOLUTION:
A) DISTRIBUTED SYSTEMS:
B) CLUSTER COMPUTING:
C) GRID COMPUTING:
D) VIRTUALIZATION:
E) WEB 2.0:
F) SERVICE ORIENTATION:
G) UTILITY COMPUTING:
EVOLUTION OF CLOUD COMPUTING
(IN TERMS OF HARDWARE AND SOFTWARE EVOLUTION)
INTRODUCTION:
In this assignment, I present an overview of the history of computing over the past decades, concentrating especially on the interaction between computing models and technologies, from mainframe computing to cloud computing, the most recent of these models. To frame that evolution, this assignment briefly explains the architecture, characteristics, advantages, applications, and issues of various computing models, such as PC computing and Internet computing, and their related technologies.
Cloud Computing is the fifth generation of computing (after mainframe, personal computer, client-server computing, and the web). Computing is the systematic study of algorithmic processes that describe and transform information: their theory, analysis, design, efficiency, implementation, and application. Cloud computing is defined as Internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand, like the electricity grid.
Around the turn of the 21st century, the term "cloud computing" came into wider use. Many new computing paradigms have been developed and adopted alongside technological advances such as multi-core processors and networked computing environments, edging ever closer toward the grand vision of computing.
These new computing paradigms are cluster computing, grid computing, P2P computing, service computing, market-oriented computing, and most recently cloud computing. Cloud computing has become a significant technology trend, driven by big players such as Amazon, Microsoft, Google, and Salesforce.com, and it is transforming the current IT industry.
Cloud computing delivers large-scale utility computing services to a wide range of consumers. Within cloud computing, users on various types of devices access programs, storage, processing, and applications over the Internet, offered by cloud computing providers, resulting in an unprecedented elasticity of resources.
The evolution of cloud computing falls into two primary areas: hardware and software.
HARDWARE EVOLUTION:
The first step along the evolutionary path of computers occurred in 1930, when binary arithmetic was developed. In 1941, the introduction of Konrad Zuse’s Z3 at the German Laboratory for Aviation in Berlin was one of the most significant events in the evolution of computers. Over the course of the next two years, the U.S. Army built computer prototypes to decode secret German messages.
1. FIRST-GENERATION COMPUTERS:
The first generation of modern computers can be traced to 1943, when the Mark I and Colossus
computers were developed. Colossus was used in secret during World War II to help decipher teleprinter messages encrypted by German forces using the Lorenz SZ40/42 machine. First-generation computers were built using hard-wired circuits and vacuum tubes (thermionic valves).
2. SECOND-GENERATION COMPUTERS:
Another general-purpose computer of this era was ENIAC (Electronic Numerical Integrator and Computer), which was built in 1946. It was the first Turing-complete digital computer capable of being reprogrammed to solve a full range of computing problems. ENIAC contained 18,000 thermionic valves, weighed over 60,000 pounds, and consumed about 25 kilowatts of electrical power.
3. THIRD-GENERATION COMPUTERS:
The integrated circuit, or microchip, was developed by Jack St. Claire Kilby, an achievement for which he received the Nobel Prize in Physics in 2000.
Kilby’s invention started an explosion in third-generation computers. Even though the first integrated
circuit was produced in September 1958, microchips were not used in computers until 1963.
In November 1971, Intel released the world’s first commercial microprocessor, the Intel 4004. The 4004
was the first complete CPU on one chip.
4. FOURTH-GENERATION COMPUTERS:
The fourth-generation computers that were being developed at this time utilized a microprocessor that
put the computer’s processing capabilities on a single integrated circuit chip.
By combining the microprocessor with random access memory (RAM), developed by Intel, fourth-generation computers were faster than ever before and had much smaller footprints. Still, the 4004 processor was capable of “only” 60,000 instructions per second.
The microprocessors that evolved from the 4004 allowed manufacturers to begin developing personal
computers small enough and cheap enough to be purchased by the general public. The first
commercially available personal computer was the MITS Altair 8800, released at the end of 1974. The PC
era had begun in earnest by the mid-1980s.
CONCLUSION:
Even though microprocessing power, memory and data storage capacities have increased by many
orders of magnitude since the invention of the 4004 processor, the technology for large-scale
integration (LSI) or very-large-scale integration (VLSI) microchips has not changed all that much. For this
reason, most of today’s computers still fall into the category of fourth-generation computers.
SOFTWARE EVOLUTION:
The conceptual foundation for creation of the Internet was significantly developed by three individuals.
Vannevar Bush: Bush introduced the concept of the MEMEX in the 1930s as a microfilm-based “device
in which an individual stores all his books, records, and communications, and which is mechanized so
that it may be consulted with exceeding speed and flexibility.”
Norbert Wiener: Wiener founded the field of cybernetics; his work inspired later researchers to focus on extending human capabilities with technology.
Marshall McLuhan: Marshall McLuhan put forth the idea of a global village that was interconnected by
an electronic nervous system as part of our popular culture.
In 1957, U.S. President Dwight Eisenhower created the Advanced Research Projects Agency (ARPA) to regain the technological lead in the arms race. ARPA (renamed DARPA, the Defense Advanced Research Projects Agency, in 1972) appointed J. C. R. Licklider to head the new Information Processing Techniques Office (IPTO). Licklider was given a mandate to further the research of the SAGE system.
SAGE stood for Semi-Automatic Ground Environment. SAGE was the most ambitious computer project
ever undertaken at the time, and it required over 800 programmers and the technical resources of some
of America’s largest corporations.
Licklider worked for several years at ARPA, where he set the stage for the creation of the ARPANET. He
also worked at Bolt Beranek and Newman (BBN), the company that supplied the first computers
connected on the ARPANET.
The idea for a common interface to the ARPANET was first suggested in Ann Arbor, Michigan, by Wesley
Clark at an ARPANET design session set up by Lawrence Roberts in April 1967. The IMP (Interface
Message Processor) would handle the interface to the ARPANET network. The physical layer, the data
link layer, and the network layer protocols used internally on the ARPANET were implemented on this
IMP. The Interface Message Processor interface for the ARPANET went live in early October 1969.
The first networking protocol that was used on the ARPANET was the Network Control Program (NCP).
The NCP provided the middle layers of a protocol stack running on an ARPANET-connected host
computer.
Over time, four increasingly capable versions of TCP/IP evolved: TCP v1, TCP v2, a split into TCP v3 and IP v3, and finally TCP v4 and IPv4. Today, IPv4 is the standard protocol, but it is in the process of being replaced by IPv6.
2) EVOLUTION OF IPV6:
The amazing growth of the Internet throughout the 1990s caused a vast reduction in the number of free
IP addresses available under IPv4. IPv4 was never designed to scale to global levels.
After examining a number of proposals, the Internet Engineering Task Force (IETF) settled on IPv6, which was recommended in January 1995 in RFC 1752. IPv6 is sometimes called the Next Generation Internet Protocol (IPng) or TCP/IP v6. From 2004, IPv6 was widely available from industry as an integrated TCP/IP protocol and was supported by most new Internet networking equipment.
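The pressure on the IPv4 address space is easy to quantify: IPv4 uses 32-bit addresses while IPv6 uses 128-bit addresses. A quick calculation makes the difference concrete:

```python
# IPv4 addresses are 32 bits wide; IPv6 addresses are 128 bits wide.
ipv4_space = 2 ** 32
ipv6_space = 2 ** 128

print(f"IPv4 addresses: {ipv4_space:,}")    # 4,294,967,296
print(f"IPv6 addresses: {ipv6_space:.3e}")  # about 3.403e+38
```

Roughly four billion addresses under IPv4, against about 3.4 × 10^38 under IPv6, which explains why IPv4 could never scale to global levels.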
A) DISTRIBUTED SYSTEMS:
It is a composition of multiple independent systems, all of which are presented as a single entity to the user. The purpose of distributed systems is to share resources and to use them effectively and efficiently. Distributed systems possess characteristics such as scalability, concurrency, continuous availability, heterogeneity, and independence of failures. Distributed computing led to three further types of computing: mainframe computing, cluster computing, and grid computing.
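The "single entity" idea can be sketched in a few lines of Python: several independent "nodes" each process part of a job, while the caller sees only one function and one result. This is a toy illustration using threads on one machine, not a real distributed system.

```python
# Toy sketch of the distributed-systems idea: independent workers,
# one entity from the caller's point of view.
from concurrent.futures import ThreadPoolExecutor

def node_task(chunk):
    # Each "node" sums its own share of the data independently.
    return sum(chunk)

def distributed_sum(data, nodes=4):
    # Split the data and farm the chunks out to the worker pool.
    chunks = [data[i::nodes] for i in range(nodes)]
    with ThreadPoolExecutor(max_workers=nodes) as pool:
        partials = pool.map(node_task, chunks)
    # The caller receives a single result: one entity, many workers.
    return sum(partials)

print(distributed_sum(list(range(100))))  # 4950
```

In a real distributed system the workers would be separate machines communicating over a network, with failure handling and coordination; the interface seen by the user, however, stays just as simple.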
B) CLUSTER COMPUTING:
In the 1980s, cluster computing emerged as an alternative to mainframe computing. Each machine in the cluster was connected to the others by a high-bandwidth network. Clusters were far cheaper than mainframe systems yet equally capable of heavy computation. This solved the cost problem to some extent, but the problem of geographical restriction remained. To solve it, the concept of grid computing was introduced.
C) GRID COMPUTING:
In the 1990s, the concept of grid computing was introduced. Systems placed at entirely different geographical locations were all connected via the Internet. These systems belonged to different organizations, so the grid consisted of heterogeneous nodes. Although this solved some problems, new ones emerged as the distance between nodes increased. The main problem encountered was the low availability of high-bandwidth connectivity, along with other network-related issues. Thus, cloud computing is often referred to as the “successor of grid computing”.
D) VIRTUALIZATION:
It was introduced nearly 40 years ago. It refers to the process of creating a virtual layer over the hardware that allows the user to run multiple instances simultaneously on the same hardware. It is a key technology in cloud computing and the base on which major cloud computing services such as Amazon EC2 and VMware vCloud are built. Hardware virtualization is still one of the most common types of virtualization.
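As a rough illustration of that "virtual layer over the hardware", the toy Python class below hands out slices of one physical machine's RAM to several instances. It is a deliberately simplified model for intuition only; real hypervisors such as those behind EC2 work very differently.

```python
# Toy model (illustrative only): a "virtual layer" partitions one
# physical machine's RAM among multiple instances running side by side.
class TinyHypervisor:
    def __init__(self, ram_gb):
        self.ram_gb = ram_gb   # total physical RAM on the host
        self.instances = {}    # name -> RAM allocated to that instance

    def launch(self, name, ram_gb):
        used = sum(self.instances.values())
        if used + ram_gb > self.ram_gb:
            # the virtual layer cannot promise more than the hardware has
            raise MemoryError("not enough physical RAM left")
        self.instances[name] = ram_gb
        return name

host = TinyHypervisor(ram_gb=16)
host.launch("vm1", 4)
host.launch("vm2", 8)
print(host.instances)  # {'vm1': 4, 'vm2': 8}
```

The point of the sketch is only that one physical resource pool backs many isolated instances, which is the property cloud providers build on.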
E) WEB 2.0:
It is the interface through which cloud computing services interact with clients. It is because of Web 2.0 that we have interactive and dynamic web pages, and it also increases flexibility among web pages. Popular examples of Web 2.0 include Google Maps, Facebook, and Twitter. Needless to say, social media is possible only because of this technology. It gained major popularity in 2004.
F) SERVICE ORIENTATION:
It acts as a reference model for cloud computing and supports low-cost, flexible, and evolvable applications. Two important concepts were introduced in this computing model: Quality of Service (QoS), which includes the Service Level Agreement (SLA), and Software as a Service (SaaS).
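A QoS guarantee such as an SLA is, at bottom, a measurable target. The sketch below checks measured uptime against an availability target; the 99.9% figure and the minute-based accounting are illustrative assumptions, not anything specified above.

```python
# Hedged sketch: verifying a service's measured availability against
# an SLA target. The 99.9% target is an illustrative assumption.
def availability(uptime_minutes, total_minutes):
    # availability as a percentage of the measurement window
    return 100.0 * uptime_minutes / total_minutes

def meets_sla(uptime_minutes, total_minutes, target_pct=99.9):
    return availability(uptime_minutes, total_minutes) >= target_pct

# A 30-day month is 43,200 minutes; 43 minutes of downtime
# leaves about 99.90% availability, just meeting the target.
print(meets_sla(43_200 - 43, 43_200))  # True
```

An SLA in practice also spells out remedies (service credits, penalties) when the target is missed, but the measurable core is a comparison like this one.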
G) UTILITY COMPUTING:
It is a computing model that defines service-provisioning techniques for major services such as compute, storage, and infrastructure, all provisioned on a pay-per-use basis.
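Pay-per-use provisioning means the bill is simply a function of metered consumption, like an electricity meter. A minimal sketch, with made-up rates that do not reflect any real provider's pricing:

```python
# Hedged sketch of utility-computing billing. The rates below are
# invented for illustration, not any real provider's price list.
RATES = {
    "compute_hours": 0.10,     # $ per hour of compute
    "storage_gb_month": 0.02,  # $ per GB stored per month
}

def monthly_bill(usage):
    # Charge only for what was actually consumed: the essence
    # of the pay-per-use model.
    return sum(RATES[item] * amount for item, amount in usage.items())

bill = monthly_bill({"compute_hours": 200, "storage_gb_month": 50})
print(f"${bill:.2f}")  # $21.00
```

Idle resources cost nothing, which is exactly what distinguishes utility computing from buying and amortizing hardware up front.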