
Cloud Computing

Cloud computing is on-demand access, via the internet, to computing resources—applications, servers (physical
servers and virtual servers), data storage, development tools, networking capabilities, and more—hosted at a
remote data center managed by a cloud services provider (or CSP). The CSP makes these resources available for a monthly
subscription fee or bills them according to usage.

Cloud computing enables customers to use infrastructure and applications via the internet, without installing and
maintaining them on-premises.

Compared to traditional on-premises IT, and depending on the cloud services you select, cloud computing helps do the
following:
• Lower IT costs: Cloud lets you offload some or most of the costs and effort of purchasing, installing, configuring, and
managing your own on-premises infrastructure.

• Improve agility and time-to-value: With cloud, your organization can start using enterprise applications in minutes,
instead of waiting weeks or months for IT to respond to a request, purchase and configure supporting hardware, and
install software. Cloud also lets you empower certain users—specifically developers and data scientists—to help
themselves to software and support infrastructure.

• Scale more easily and cost-effectively: Cloud provides elasticity—instead of purchasing excess capacity that sits
unused during slow periods, you can scale capacity up and down in response to spikes and dips in traffic. You can also
take advantage of your cloud provider’s global network to spread your applications closer to users around the world.
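The scale-up-and-down behaviour described above can be sketched in a few lines of Python. This is a simplified illustration of the idea, not any real provider's autoscaling policy; the target utilization and instance limits are invented:

```python
import math

def desired_instances(current, load_per_instance, target_load=0.6,
                      min_instances=1, max_instances=10):
    """Return how many instances to run so that average utilization per
    instance stays near target_load (each instance has capacity 1.0).
    Thresholds and limits here are illustrative only."""
    total_load = current * load_per_instance
    needed = math.ceil(total_load / target_load)
    # Clamp to the configured fleet bounds.
    return max(min_instances, min(max_instances, needed))

# Traffic spike: 4 instances each 90% loaded -> scale out to 6.
# Quiet period: 4 instances each 20% loaded -> scale in to 2.
```

A real autoscaler would also smooth the load signal over time to avoid flapping between fleet sizes.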

The term ‘cloud computing’ also refers to the technology that makes cloud work. This includes some form
of virtualized IT infrastructure—servers, operating system software, networking, and other infrastructure that’s
abstracted, using special software, so that it can be pooled and divided irrespective of physical hardware boundaries. For
example, a single hardware server can be divided into multiple virtual servers.

Virtualization enables cloud providers to make maximum use of their data center resources. Not surprisingly, many
corporations have adopted the cloud delivery model for their on-premises infrastructure so they can realize maximum
utilization and cost savings vs. traditional IT infrastructure and offer the same self-service and agility to their end-users.

If you use a computer or mobile device at home or at work, you almost certainly use some form of cloud computing every
day, whether it’s a cloud application like Google Gmail or Salesforce, streaming media like Netflix, or cloud file storage like
Dropbox. Industry analyst Gartner recently projected that worldwide end-user public cloud spending would reach nearly
USD 600 billion in 2023.

What is cloud computing?


Cloud computing is the on-demand delivery of IT resources over the Internet with pay-as-you-go pricing. Instead of buying, owning,
and maintaining physical data centers and servers, you can access technology services, such as computing power, storage, and
databases, on an as-needed basis from a cloud provider like Amazon Web Services (AWS).

Who is using cloud computing?


Organizations of every type, size, and industry are using the cloud for a wide variety of use cases, such as data backup, disaster
recovery, email, virtual desktops, software development and testing, big data analytics, and customer-facing web applications. For
example, healthcare companies are using the cloud to develop more personalized treatments for patients. Financial services
companies are using the cloud to power real-time fraud detection and prevention. And video game makers are using the cloud to
deliver online games to millions of players around the world.

CLOUD COMPUTING SERVICES


The three main types of cloud computing include Infrastructure as a Service, Platform as a Service, and Software as a Service. Each
type of cloud computing provides different levels of control, flexibility, and management so that you can select the right set of
services for your needs.

IaaS (Infrastructure-as-a-Service), PaaS (Platform-as-a-Service) and SaaS (Software-as-a-Service) are the three most
common models of cloud services, and it’s not uncommon for an organization to use all three.

A. SaaS (Software-as-a-Service)

SaaS—also known as cloud-based software or cloud applications—is application software that’s hosted in the cloud, and
that users access via a web browser, a dedicated desktop client, or an API that integrates with a desktop or mobile operating
system. In most cases, SaaS users pay a monthly or annual subscription fee; some providers offer ‘pay-as-you-go’ pricing
based on actual usage.

In addition to the cost savings, time-to-value, and scalability benefits of cloud, SaaS offers the following:
• Automatic upgrades: With SaaS, users take advantage of new features as soon as the provider adds them, without
having to orchestrate an on-premises upgrade.

• Protection from data loss: Because SaaS stores application data in the cloud with the application, users don’t lose
data if their device crashes or breaks.

SaaS is the primary delivery model for most commercial software today—there are hundreds of thousands of SaaS solutions
available, from the most focused industry and departmental applications, to powerful enterprise software database and AI
(artificial intelligence) software.

SaaS provides you with a complete product that is run and managed by the service
provider. In most cases, people referring to SaaS are referring to end-user
applications (such as web-based email). With a SaaS offering, you don’t have to think
about how the service is maintained or how the underlying infrastructure is
managed. You only need to think about how you will use that particular software.

B. PaaS (Platform-as-a-Service)

PaaS provides software developers with an on-demand platform—hardware, a complete software stack, infrastructure, and
even development tools—for developing, running, and managing applications without the cost, complexity, and inflexibility
of maintaining that platform on-premises.

With PaaS, the cloud provider hosts everything—servers, networks, storage, operating system software, middleware,
databases—at their data center. Developers simply pick from a menu to ‘spin up’ servers and environments they need to
run, build, test, deploy, maintain, update, and scale applications.

Today, PaaS is often built around containers, a virtualized compute model one step removed from virtual
servers. Containers virtualize the operating system, enabling developers to package the application with only the operating
system services it needs to run on any platform, without modification and without need for middleware.

Red Hat OpenShift is a popular PaaS built around Docker containers and Kubernetes, an open source container
orchestration solution that automates deployment, scaling, load balancing, and more for container-based applications.
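To make the orchestration idea concrete, the toy scheduler below places each container on the node with the most free CPU (a simple "spread" strategy). It is a deliberately reduced sketch, not how Kubernetes actually schedules pods:

```python
def schedule(containers, nodes):
    """Place containers on nodes, most demanding first.
    containers: {name: cpu_needed}; nodes: {name: cpu_free}.
    Returns {container: node}; raises if a container cannot fit."""
    placement = {}
    free = dict(nodes)  # remaining capacity per node
    for name, cpu in sorted(containers.items(), key=lambda kv: -kv[1]):
        node = max(free, key=free.get)  # node with the most spare CPU
        if free[node] < cpu:
            raise RuntimeError(f"no node can fit container {name}")
        placement[name] = node
        free[node] -= cpu
    return placement

# Two 4-CPU nodes: the scheduler spreads a 3-CPU and a 2-CPU container
# across them instead of stacking both on one node.
```

Real orchestrators add many more constraints (memory, affinity, health), but the core job is the same: match workload demand to pooled capacity.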

PaaS removes the need for you to manage underlying infrastructure (usually
hardware and operating systems), and allows you to focus on the deployment and
management of your applications. This helps you be more efficient as you don’t need
to worry about resource procurement, capacity planning, software maintenance,
patching, or any of the other undifferentiated heavy lifting involved in running your
application.

C. IaaS (Infrastructure-as-a-Service)

IaaS provides on-demand access to fundamental computing resources—physical and virtual servers, networking, and
storage—over the internet on a pay-as-you-go basis. IaaS enables end users to scale and shrink resources on an as-needed
basis, reducing the need for high, up-front capital expenditures or unnecessary on-premises or ‘owned’ infrastructure and
for overbuying resources to accommodate periodic spikes in usage.

In contrast to SaaS and PaaS (and even newer computing models such as containers and serverless), IaaS provides users
with the lowest-level control of computing resources in the cloud. IaaS was the most popular cloud computing model
when it emerged in the early 2010s. While it remains the cloud model for many types of workloads, use of SaaS and PaaS is
growing at a much faster rate.

IaaS contains the basic building blocks for cloud IT. It typically provides access to
networking features, computers (virtual or on dedicated hardware), and data
storage space. IaaS gives you the highest level of flexibility and management control
over your IT resources. It is most similar to the existing IT resources with which many
IT departments and developers are familiar.

Serverless computing

Serverless computing (also called simply serverless) is a cloud computing model that offloads all
the backend infrastructure management tasks–provisioning, scaling, scheduling, patching—to the cloud provider, freeing
developers to focus all their time and effort on the code and business logic specific to their applications.

What's more, serverless runs application code on a per-request basis only and scales the supporting infrastructure up and
down automatically in response to the number of requests. With serverless, customers pay only for the resources being used
when the application is running—they never pay for idle capacity.

FaaS, or Function-as-a-Service, is often confused with serverless computing when, in fact, it's a subset of serverless. FaaS
allows developers to execute portions of application code (called functions) in response to specific events. Everything
besides the code—physical hardware, virtual machine operating system, and web server software management—is
provisioned automatically by the cloud service provider in real-time as the code executes and is spun back down once the
execution completes. Billing starts when execution starts and stops when execution stops.
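The FaaS pattern just described can be sketched as a tiny event-to-function dispatcher with per-invocation metering. This is an illustrative toy, not any real platform's API:

```python
class TinyFaaS:
    """Toy event-to-function dispatcher illustrating the FaaS model.
    Real platforms add isolation, auto-scaling, and real billing."""

    def __init__(self):
        self.handlers = {}    # event type -> function
        self.invocations = 0  # billing counts executions, never idle time

    def register(self, event_type, handler):
        self.handlers[event_type] = handler

    def trigger(self, event_type, payload):
        # "Billing starts when execution starts": meter per invocation.
        self.invocations += 1
        return self.handlers[event_type](payload)

faas = TinyFaaS()
faas.register("image.uploaded", lambda p: f"thumbnail for {p['name']}")
result = faas.trigger("image.uploaded", {"name": "cat.png"})
```

Note that the function runs only when a matching event arrives, which is exactly why there is no charge for idle capacity.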

TYPES OF CLOUD COMPUTING

Public Cloud

Public cloud is a type of cloud computing in which a cloud service provider makes computing resources—anything from
SaaS applications, to individual virtual machines (VMs), to bare metal computing hardware, to complete enterprise-grade
infrastructures and development platforms—available to users over the public internet. These resources might be accessible
for free, or access might be sold according to subscription-based or pay-per-usage pricing models.

The public cloud provider owns, manages, and assumes all responsibility for the data centers, hardware, and infrastructure
on which its customers’ workloads run, and it typically provides high-bandwidth network connectivity to ensure high
performance and rapid access to applications and data.

Public cloud is a multi-tenant environment—the cloud provider's data center infrastructure is shared by all public cloud
customers. In the leading public clouds—Amazon Web Services (AWS), Google Cloud, IBM Cloud, Microsoft Azure, and Oracle
Cloud—those customers can number in the millions. Many enterprises are moving portions of their computing
infrastructure to the public cloud because public cloud services are elastic and readily scalable, flexibly adjusting to meet
changing workload demands. Others are attracted by the promise of greater efficiency and fewer wasted resources since
customers pay only for what they use. Still others seek to reduce spending on hardware and on-premises infrastructures.

Private cloud

Private cloud is a cloud environment in which all cloud infrastructure and computing resources are dedicated to, and
accessible by, one customer only. Private cloud combines many of the benefits of cloud computing—including elasticity,
scalability, and ease of service delivery—with the access control, security, and resource customization of on-premises
infrastructure.

A private cloud is typically hosted on-premises in the customer's data center. But a private cloud can also be hosted on an
independent cloud provider’s infrastructure or built on rented infrastructure housed in an offsite data center. Many
companies choose private cloud over public cloud because private cloud is an easier way (or the only way) to meet their
regulatory compliance requirements. Others choose private cloud because their workloads deal with confidential
documents, intellectual property, personally identifiable information (PII), medical records, financial data, or other sensitive
data. By building private cloud architecture according to cloud native principles, an organization gives itself the flexibility to
easily move workloads to public cloud or run them within a hybrid cloud (see below) environment whenever they’re ready.

Hybrid cloud

Hybrid cloud is just what it sounds like—a combination of public and private cloud environments. Specifically, and ideally,
a hybrid cloud connects an organization's private cloud services and public clouds into a single, flexible infrastructure for
running the organization’s applications and workloads.

The goal of hybrid cloud is to establish a mix of public and private cloud resources—and with a level of orchestration
between them—that gives an organization the flexibility to choose the optimal cloud for each application or workload and
to move workloads freely between the two clouds as circumstances change. This enables the organization to meet its
technical and business objectives more effectively and cost-efficiently than it could with public or private cloud alone.

Multicloud and hybrid multicloud

Multicloud is the use of two or more clouds from two or more different cloud providers. Having a multicloud environment
can be as simple as using email SaaS from one vendor and image editing SaaS from another. But when enterprises talk about
multicloud, they're typically talking about using multiple cloud services—including SaaS, PaaS, and IaaS services—from two
or more of the leading public cloud providers.

Hybrid multicloud is the use of two or more public clouds together with a private cloud environment. Organizations choose
multicloud to avoid vendor lock-in, to have more services to choose from, and to access more innovation. But the more
clouds you use—each with its own set of management tools, data transmission rates, and security protocols—the more
difficult it can be to manage your environment. Multicloud management platforms provide visibility across multiple provider
clouds through a central dashboard, where development teams can see their projects and deployments, operations teams
can keep an eye on clusters and nodes, and the cybersecurity staff can monitor for threats.

Cloud security

Traditionally, security concerns have been the primary obstacle for organizations considering cloud services,
particularly public cloud services. In response to demand, however, the security offered by cloud service providers is
steadily outstripping on-premises security solutions.

Maintaining cloud security demands different procedures and employee skillsets than in legacy IT environments.
Some cloud security best practices include the following:
• Shared responsibility for security: Generally, the cloud provider is responsible for securing cloud infrastructure and
the customer is responsible for protecting its data within the cloud—but it's also important to clearly define data
ownership between private and public third parties.

• Data encryption: Data should be encrypted while at rest, in transit, and in use. Customers need to maintain full control
over security keys and hardware security modules.

• User identity and access management: Customer and IT teams need full understanding of and visibility into network,
device, application, and data access.

• Collaborative management: Proper communication and clear, understandable processes between IT, operations, and
security teams will ensure seamless cloud integrations that are secure and sustainable.

• Security and compliance monitoring: This begins with understanding all regulatory compliance standards applicable
to your industry and setting up active monitoring of all connected systems and cloud-based services to maintain visibility
of all data exchanges between public, private, and hybrid cloud environments.
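As a sketch of the identity and access management point above, the snippet below checks requests against role-based permissions. The role and permission names are invented for illustration; real clouds have their own IAM vocabularies:

```python
# Hypothetical roles mapped to the actions they grant.
ROLES = {
    "viewer": {"storage:read"},
    "developer": {"storage:read", "compute:start", "compute:stop"},
    "admin": {"storage:read", "storage:write", "compute:start",
              "compute:stop", "iam:manage"},
}

def is_allowed(user_roles, action):
    """Allow a request if any of the user's roles grants the action."""
    return any(action in ROLES.get(role, set()) for role in user_roles)
```

Keeping the role-to-permission mapping explicit like this is what gives IT teams the "full understanding of and visibility into" access that the bullet above calls for.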

Cloud use cases

With 25% of organizations planning to move all their applications to cloud within the next year, it would seem that cloud
computing use cases are limitless. But even for companies not planning a wholesale shift to the cloud, certain initiatives
and cloud computing are a match made in IT heaven.

Disaster recovery and business continuity have always been a natural fit for cloud because cloud provides cost-effective
redundancy to protect data against system failures and the physical distance required to recover data and applications in
the event of a local outage or disaster. All of the major public cloud providers offer Disaster-Recovery-as-a-Service (DRaaS).

Anything that involves storing and processing huge volumes of data at high speeds—and requires more storage and
computing capacity than most organizations can or want to purchase and deploy on-premises—is a target for cloud
computing. Examples include:
• Big data analytics

• Internet of Things (IoT)

• Artificial intelligence—particularly machine learning and deep learning applications

For development teams adopting Agile or DevOps (or DevSecOps) to streamline development, cloud offers the on-
demand end-user self-service that keeps operations tasks—such as spinning up development and test servers—from
becoming development bottlenecks.

Benefits of cloud computing

Agility
The cloud gives you easy access to a broad range of technologies so that you can innovate faster and build
nearly anything that you can imagine. You can quickly spin up resources as you need them–from
infrastructure services, such as compute, storage, and databases, to Internet of Things, machine learning,
data lakes and analytics, and much more.
You can deploy technology services in a matter of minutes, and get from idea to implementation several
orders of magnitude faster than before. This gives you the freedom to experiment, test new ideas to
differentiate customer experiences, and transform your business.

Elasticity
With cloud computing, you don’t have to over-provision resources up front to handle peak levels of business activity
in the future. Instead, you provision the amount of resources that you actually need. You can scale these resources
up or down to instantly grow and shrink capacity as your business needs change.

Cost savings
The cloud allows you to trade fixed expenses (such as data centers and physical servers) for variable
expenses, and only pay for IT as you consume it. Plus, the variable expenses are much lower than what
you would pay to do it yourself because of the economies of scale.
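The trade of fixed for variable expenses can be made concrete with a back-of-the-envelope comparison. All prices below are invented for illustration:

```python
def on_prem_cost(server_price, servers, months, monthly_upkeep_per_server):
    # Fixed expense: hardware is bought up front, and upkeep accrues
    # whether or not the servers are busy.
    return server_price * servers + monthly_upkeep_per_server * servers * months

def cloud_cost(hourly_rate, hours_used):
    # Variable expense: pay only for the hours actually consumed.
    return hourly_rate * hours_used

# A workload that runs 8 hours a day for a year:
cloud = cloud_cost(0.10, 8 * 365)        # roughly 292 currency units
onprem = on_prem_cost(2000, 1, 12, 50)   # 2600 currency units
```

The comparison flips, of course, for workloads that run flat-out around the clock, which is why the decision depends on the usage profile.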

Deploy globally in minutes


With the cloud, you can expand to new geographic regions and deploy globally in minutes. For example, AWS has
infrastructure all over the world, so you can deploy your application in multiple physical locations with just a few
clicks. Putting applications in closer proximity to end users reduces latency and improves their experience.

Features of Cloud Computing – 10 Major Characteristics of Cloud Computing



After studying cloud computing architecture, here we come to the features of cloud computing.
The characteristics of cloud computing show its importance in the market.
So, let's start exploring the features of cloud computing.

Introduction to Cloud Computing


Cloud computing is getting more and more popular day by day. The reason behind this is the steady growth of companies
that need somewhere to store their data. Providers therefore compete to offer large amounts of storage space, along with
various features and quality service.

Cloud computing is a model for enabling ubiquitous, convenient, on-demand network access to computing resources.
There are many services and features of cloud computing.

Features of Cloud Computing


Following are the characteristics of Cloud Computing:

1. Resource Pooling
Resource pooling means that the cloud provider pools its computing resources to serve multiple customers using a
multi-tenant model. Different physical and virtual resources are dynamically assigned and reassigned depending on
customer demand. The customer generally has no control over, or knowledge of, the exact location of the provided
resources, but may be able to specify location at a higher level of abstraction.

2. On-Demand Self-Service
This is one of the most important and valuable features of cloud computing: the user can provision computing
capabilities as needed, without human interaction with the provider, and can continuously monitor server uptime,
capabilities, and allotted network storage.

3. Easy Maintenance
The servers are easy to maintain and downtime is very low; in some cases there is no downtime at all. Cloud providers
roll out updates continuously, gradually making the service better. The updates are more compatible with devices,
perform faster than older versions, and include bug fixes.

4. Large Network Access
The user can access data in the cloud, or upload data to the cloud, from anywhere with just a device and an internet
connection. These capabilities are available all over the network and are accessed with the help of the internet.

5. Availability
The capabilities of the cloud can be modified as per usage and can be extended considerably. The platform analyzes
storage usage and allows the user to buy extra cloud storage, if needed, for a very small fee.

6. Automatic System
Cloud computing automatically analyzes the resources needed and supports a metering capability at some level of
service. Usage can be monitored, controlled, and reported, which provides transparency for the host as well as the
customer.

7. Economical
For the host, the hardware is a one-time investment: the company buys the storage once, and portions of it can be
provided to many customer companies, saving them monthly or yearly infrastructure costs. Beyond that, spending goes
mostly to basic maintenance and a few other, comparatively small, expenses.

8. Security
Cloud security is one of the best features of cloud computing. Providers create snapshots of the stored data so that it is
not lost even if one of the servers is damaged. The data is stored within storage devices that other tenants cannot easily
access or misuse, and the storage service is quick and reliable.

9. Pay as You Go
In cloud computing, the user pays only for the service or the space actually used. There are no hidden or extra charges.
The service is economical, and some amount of space is often allotted for free.

10. Measured Service
Cloud resource usage is monitored by the provider and recorded for billing purposes. This resource utilization is
analyzed to support charge-per-use capabilities. In other words, resource usage (for example, the virtual server
instances running in the cloud) is monitored, measured, and reported by the service provider. The pay-as-you-go
charge varies with the customer organization's actual consumption.
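The resource pooling feature (item 1 above) can be sketched as a shared pool that assigns and releases capacity per tenant on demand. The class and numbers below are illustrative, not any real provider's API:

```python
class ResourcePool:
    """Shared capacity, assigned and reassigned to tenants on demand."""

    def __init__(self, total_gb):
        self.total_gb = total_gb
        self.assigned = {}  # tenant -> GB currently allocated

    def free(self):
        return self.total_gb - sum(self.assigned.values())

    def allocate(self, tenant, gb):
        if self.free() < gb:
            raise MemoryError("pool exhausted")
        self.assigned[tenant] = self.assigned.get(tenant, 0) + gb

    def release(self, tenant):
        # Returned capacity immediately becomes available to other tenants.
        self.assigned.pop(tenant, None)

pool = ResourcePool(100)
pool.allocate("tenant_a", 60)
pool.allocate("tenant_b", 30)
```

Because released capacity goes straight back into the pool, the provider can keep overall utilization high across many tenants.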

Summary
It is the cloud provider that maintains the servers, deals with server crashes, and takes care of the infrastructure. The
provider also buys the software and the licenses needed to operate the business. All of this is paid for out of the
monthly fees collected from the firms being served. Providers are strongly focused on delivering quality service,
because failing to do so would put them behind in the competition. This web-based platform can only be accessed
through the internet.
Cloud computing has numerous benefits that help both the host and the customer. There are myriad security features,
access times are low, and data can be uploaded and downloaded quickly. Companies today are in great need of data
storage, and cloud providers supply it very easily.

Cloud computing is the on-demand availability of computer system resources, especially data storage (cloud storage)
and computing power, without direct active management by the user.[2] Large clouds often have functions distributed over
multiple locations, each of which is a data center. Cloud computing relies on sharing of resources to achieve coherence and typically
uses a pay-as-you-go model, which can help in reducing capital expenses but may also lead to unexpected operating expenses for
users.

Definition
The National Institute of Standards and Technology's definition of cloud computing identifies "five essential characteristics":
• On-demand self-service. A consumer can unilaterally provision computing capabilities, such as server time and network
storage, as needed automatically without requiring human interaction with each service provider.
• Broad network access. Capabilities are available over the network and accessed through standard mechanisms that
promote use by heterogeneous thin or thick client platforms (e.g., mobile phones, tablets, laptops, and workstations).
• Resource pooling. The provider's computing resources are pooled to serve multiple consumers using a multi-tenant
model, with different physical and virtual resources dynamically assigned and reassigned according to consumer demand.
• Rapid elasticity. Capabilities can be elastically provisioned and released, in some cases automatically, to scale rapidly
outward and inward commensurate with demand. To the consumer, the capabilities available for provisioning often
appear unlimited and can be appropriated in any quantity at any time.
• Measured service. Cloud systems automatically control and optimize resource use by leveraging a metering capability at
some level of abstraction appropriate to the type of service (e.g., storage, processing, bandwidth, and active user
accounts). Resource usage can be monitored, controlled, and reported, providing transparency for both the provider and
consumer of the utilized service.
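NIST's measured-service characteristic can be illustrated with a small metering sketch: usage is recorded per consumer and reported transparently. The consumer names and units below are invented:

```python
from collections import defaultdict

class Meter:
    """Records resource usage per consumer and produces usage reports."""

    def __init__(self):
        self.usage = defaultdict(float)  # (consumer, resource) -> amount

    def record(self, consumer, resource, amount):
        self.usage[(consumer, resource)] += amount

    def report(self, consumer):
        # Transparency for both provider and consumer.
        return {res: amt for (c, res), amt in self.usage.items()
                if c == consumer}

m = Meter()
m.record("acme", "storage_gb_hours", 120)
m.record("acme", "cpu_hours", 5)
m.record("globex", "cpu_hours", 2)
```

A meter like this is the foundation for both the pay-per-use billing and the usage monitoring mentioned throughout this document.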
History
Cloud computing has a rich history that extends back to the 1960s, with the initial concepts of time-sharing becoming popularized
via Remote Job Entry (RJE). The "data center" model, where users submitted jobs to operators to run on mainframes, was
predominantly used during this era. This was a time of exploration and experimentation with ways to make large-scale computing
power available to more users through time-sharing, optimizing the infrastructure, platform, and applications, and increasing
efficiency for end users.
The use of the "cloud" metaphor to denote virtualized services traces back to 1994, when it was used by General Magic to describe
the universe of "places" that mobile agents in the Telescript environment could go. This metaphor is credited to David Hoffman, a
General Magic communications employee, based on its long-standing use in networking and telecom.[6] The expression cloud
computing became more widely known in 1996 when the Compaq Computer Corporation drew up a business plan for future
computing and the Internet. The company's ambition was to supercharge sales with "cloud computing-enabled applications". The
business plan foresaw that online consumer file storage would most likely be commercially successful. As a result, Compaq decided
to sell server hardware to internet service providers.
In the 2000s, the application of cloud computing began to take shape with the establishment of Amazon Web Services in 2002,
which allowed developers to build applications independently. 2006 saw the release of the beta version of Google Docs,
Amazon Simple Storage Service (Amazon S3), and Amazon Elastic Compute Cloud (EC2); in 2008 came NASA's development of
the first open-source software for deploying private and hybrid clouds.
The following decade saw the launch of various cloud services. In 2010, Microsoft launched Microsoft Azure, and Rackspace
Hosting and NASA initiated an open-source cloud-software project, OpenStack. IBM introduced the IBM SmartCloud framework
in 2011, and Oracle announced the Oracle Cloud in 2012. In December 2019, Amazon launched AWS Outposts, a service that
extends AWS infrastructure, services, APIs, and tools to customer data centers, co-location spaces, or on-premises facilities.
Since the global pandemic of 2020, cloud technology has surged in popularity due to the level of data security it offers and the
flexibility of working options it provides for all employees, notably remote workers.

Value proposition
Advocates of public and hybrid clouds claim that cloud computing allows companies to avoid or minimize up-front IT
infrastructure costs. Proponents also claim that cloud computing allows enterprises to get their applications up and running faster,
with improved manageability and less maintenance, and that it enables IT teams to more rapidly adjust resources to meet
fluctuating and unpredictable demand,[13][14][15] providing burst computing capability: high computing power at certain periods of
peak demand.

Additional value propositions of cloud computing include:


Topic Description
A public-cloud delivery model converts capital expenditures (e.g., buying servers) to operational expenditure. This
purportedly lowers barriers to entry, as infrastructure is typically provided by a third party and need not be
purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is "fine-
Cost
reductions
grained", with usage-based billing options. As well, less in-house IT skills are required for implementation of
projects that use cloud computing.[18] The e-FISCAL project's state-of-the-art repository[19] contains several articles
looking into cost aspects in more detail, most of them concluding that costs savings depend on the type of
activities supported and the type of infrastructure available in-house.
Device and location independence enable users to access systems using a web browser regardless of their location
Device
independence
or what device they use (e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a third-party)
and accessed via the Internet, users can connect to it from anywhere.
Maintenance of cloud environment is easier because the data is hosted on an outside server maintained by a
provider without the need to invest in data center hardware. IT maintenance of cloud computing is managed and
Maintenance
updated by the cloud provider's IT maintenance team which reduces cloud computing costs compared with on-
premises data centers.
Multitenancy enables sharing of resources and costs across a large pool of users thus allowing for:
• centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
Multitenancy • peak-load capacity increases (users need not engineer and pay for the resources and equipment to meet
their highest possible load-levels)
• utilization and efficiency improvements for systems that are often only 10–20% utilized
Performance is monitored by IT experts from the service provider, and consistent and loosely coupled
Performance
architectures are constructed using web services as the system interface.
Availability improves with the use of multiple redundant sites, which makes well-designed cloud computing
Availability
suitable for business continuity and disaster recovery.
Productivity may be increased when multiple users can work on the same data simultaneously, rather than waiting
Productivity for it to be saved and emailed. Time may be saved as information does not need to be re-entered when fields are
matched, nor do users need to install application software upgrades to their computer.
• Scalability and elasticity: dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis in near real-time (note that VM startup time varies by VM type, location, OS, and cloud provider[25]), without users having to engineer for peak loads. This gives the ability to scale up when usage increases or down when resources are not being used.[30] The time efficiency of cloud scalability also means faster time to market, greater business flexibility, and adaptability, as adding new resources does not take as much time as it used to.[31] Emerging approaches for managing elasticity include the use of machine learning techniques to propose efficient elasticity models.
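The simplest elasticity mechanism described above is reactive, threshold-based scaling. The following is a minimal sketch of such a policy; the thresholds and replica bounds are assumed values, not any particular provider's defaults:

```python
# Minimal sketch of a reactive autoscaling policy (hypothetical thresholds):
# add a replica when average utilization is high, remove one when it is low.
def desired_replicas(current, avg_utilization,
                     scale_out_above=0.75, scale_in_below=0.25,
                     min_replicas=1, max_replicas=10):
    if avg_utilization > scale_out_above:
        return min(current + 1, max_replicas)   # scale out, capped
    if avg_utilization < scale_in_below:
        return max(current - 1, min_replicas)   # scale in, floored
    return current                              # within the comfort band

desired_replicas(3, 0.90)   # high load: grow from 3 to 4 replicas
```

Real autoscalers add cooldown periods and smoothing so short spikes do not cause oscillation, but the decision step is essentially this comparison.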
• Security: can improve due to centralization of data and increased security-focused resources, but concerns can persist about loss of control over certain sensitive data and the lack of security for stored kernels. Security is often as good as or better than in traditional systems, in part because service providers are able to devote resources to solving security issues that many customers cannot afford to tackle or lack the technical skills to address. However, the complexity of security is greatly increased when data is distributed over a wider area or a greater number of devices, as well as in multi-tenant systems shared by unrelated users. In addition, user access to security audit logs may be difficult or impossible. Private cloud installations are in part motivated by users' desire to retain control over the infrastructure and avoid losing control of information security.

Challenges and limitations


One of the main challenges of cloud computing, in comparison to more traditional on-premises computing, is data security and
privacy. Cloud users entrust their sensitive data to third-party providers, who may not have adequate measures to protect it from
unauthorized access, breaches, or leaks. Cloud users also face compliance risks if they have to adhere to certain regulations or
standards regarding data protection, such as GDPR or HIPAA.
Another challenge of cloud computing is reduced visibility and control. Cloud users may not have full insight into how their cloud
resources are managed, configured, or optimized by their providers. They may also have limited ability to customize or modify
their cloud services according to their specific needs or preferences.
In addition, cloud migration is a significant issue. Cloud migration is the process of moving data, applications, or workloads from
one cloud environment to another or from on-premises to the cloud. Cloud migration can be complex, time-consuming, and costly,
especially if there are incompatibility issues between different cloud platforms or architectures. Cloud migration can also cause
downtime, performance degradation, or data loss if not planned and executed properly.

SERVICE MODELS
Cloud computing service models arranged as layers in a stack
The service-oriented architecture (SOA) promotes the idea of "Everything as a
Service" (EaaS or XaaS). This concept is operationalized in cloud
computing through several service models as defined by the National Institute of
Standards and Technology (NIST). The three standard service models are
Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a
Service (SaaS).[4] They are commonly depicted as layers in a stack, providing
different levels of abstraction. However, these layers are not necessarily
interdependent. For instance, SaaS can be delivered on bare metal, bypassing PaaS
and IaaS, and a program can run directly on IaaS without being packaged as SaaS.

Infrastructure as a service (IaaS)


Main article: Infrastructure as a service
"Infrastructure as a service" (IaaS) refers to online services that provide high-
level APIs used to abstract various low-level details of underlying network
infrastructure like physical computing resources, location, data partitioning,
scaling, security, backup, etc. A hypervisor runs the virtual machines as guests.
Pools of hypervisors within the cloud operational system can support large
numbers of virtual machines and the ability to scale services up and down
according to customers' varying requirements. Linux containers run in isolated partitions of a single Linux kernel running directly
on the physical hardware. Linux cgroups and namespaces are the underlying Linux kernel technologies used to isolate, secure and
manage the containers. Containerisation offers higher performance than virtualization because there is no hypervisor overhead.
IaaS clouds often offer additional resources such as a virtual-machine disk-image library, raw block storage, file or object storage,
firewalls, load balancers, IP addresses, virtual local area networks (VLANs), and software bundles.
The NIST's definition of cloud computing describes IaaS as "where the consumer is able to deploy and run arbitrary software, which
can include operating systems and applications. The consumer does not manage or control the underlying cloud infrastructure but
has control over operating systems, storage, and deployed applications; and possibly limited control of select networking
components (e.g., host firewalls)."
IaaS-cloud providers supply these resources on-demand from their large pools of equipment installed in data centers. For wide-
area connectivity, customers can use either the Internet or carrier clouds (dedicated virtual private networks). To deploy their
applications, cloud users install operating-system images and their application software on the cloud infrastructure. In this model,
the cloud user patches and maintains the operating systems and the application software. Cloud providers typically bill IaaS
services on a utility computing basis: cost reflects the number of resources allocated and consumed.
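The utility-computing billing model mentioned above can be illustrated with a short sketch. The resource names and hourly rates here are made up for illustration and do not correspond to any real provider's price list:

```python
# Sketch of utility-style IaaS billing: cost reflects the resources allocated
# and consumed over time. Rates and resource names are illustrative only.
HOURLY_RATES = {"vcpu": 0.02, "ram_gb": 0.005, "disk_gb": 0.0001}

def iaas_cost(allocation, hours):
    """Bill each allocated resource per unit per hour."""
    return sum(qty * HOURLY_RATES[res] * hours
               for res, qty in allocation.items())

# A hypothetical VM: 2 vCPUs, 8 GB RAM, 100 GB disk, running for one month.
vm = {"vcpu": 2, "ram_gb": 8, "disk_gb": 100}
monthly = iaas_cost(vm, hours=730)   # roughly 65.70 at the assumed rates
```

The key property is that the bill scales with allocation and duration rather than being a fixed fee for owned hardware.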

Platform as a service (PaaS)


Main article: Platform as a service
The NIST's definition of cloud computing defines Platform as a Service as:
The capability provided to the consumer is to deploy onto the cloud infrastructure consumer-created or acquired applications
created using programming languages, libraries, services, and tools supported by the provider. The consumer does not manage or
control the underlying cloud infrastructure including network, servers, operating systems, or storage, but has control over the
deployed applications and possibly configuration settings for the application-hosting environment.
PaaS vendors offer a development environment to application developers. The provider typically develops toolkit and standards
for development and channels for distribution and payment. In the PaaS models, cloud providers deliver a computing platform,
typically including an operating system, programming-language execution environment, database, and the web server. Application
developers develop and run their software on a cloud platform instead of directly buying and managing the underlying hardware
and software layers. With some PaaS, the underlying computer and storage resources scale automatically to match application
demand so that the cloud user does not have to allocate resources manually.
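To make the division of responsibility concrete: a PaaS consumer typically deploys only application code written against a runtime the provider supports, while the provider supplies the OS, web server, and scaling. The sketch below is a minimal Python web application using the standard WSGI interface; the greeting and structure are illustrative, not any specific platform's required layout:

```python
# Minimal sketch of the artifact a PaaS consumer deploys: application code
# only, written against a supported runtime (here, Python's WSGI interface).
# The platform, not the developer, runs and scales the server around it.
def application(environ, start_response):
    body = b"Hello from a PaaS-hosted app\n"
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]
```

On a PaaS, the provider's routing layer invokes `application` for each request; locally, any WSGI server (e.g., the standard library's `wsgiref.simple_server`) can host it for development.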

Some integration and data management providers also use specialized applications of PaaS as delivery models for data. Examples
include iPaaS (Integration Platform as a Service) and dPaaS (Data Platform as a Service). iPaaS enables customers to develop,
execute and govern integration flows. Under the iPaaS integration model, customers drive the development and deployment of
integrations without installing or managing any hardware or middleware. dPaaS delivers integration—and data-management—
products as a fully managed service. Under the dPaaS model, the PaaS provider, not the customer, manages the development and
execution of programs by building data applications for the customer. dPaaS users access data through data-visualization tools.

Software as a service (SaaS)


Main article: Software as a service
The NIST's definition of cloud computing defines Software as a Service as:
The capability provided to the consumer is to use the provider's applications running on a cloud infrastructure. The applications
are accessible from various client devices through either a thin client interface, such as a web browser (e.g., web-based email), or
a program interface. The consumer does not manage or control the underlying cloud infrastructure including network, servers,
operating systems, storage, or even individual application capabilities, with the possible exception of limited user-specific
application configuration settings.
In the software as a service (SaaS) model, users gain access to application software and databases. Cloud providers manage the
infrastructure and platforms that run the applications. SaaS is sometimes referred to as "on-demand software" and is usually priced
on a pay-per-use basis or using a subscription fee.[44] In the SaaS model, cloud providers install and operate application software in
the cloud and cloud users access the software from cloud clients. Cloud users do not manage the cloud infrastructure and platform
where the application runs. This eliminates the need to install and run the application on the cloud user's own computers, which
simplifies maintenance and support. Cloud applications differ from other applications in their scalability—which can be achieved
by cloning tasks onto multiple virtual machines at run-time to meet changing work demand.[45] Load balancers distribute the work
over the set of virtual machines. This process is transparent to the cloud user, who sees only a single access-point. To accommodate
a large number of cloud users, cloud applications can be multitenant, meaning that any machine may serve more than one cloud-
user organization.
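The load-balancing step described above, spreading requests across cloned instances behind a single access point, can be sketched in a few lines. The instance names are illustrative:

```python
import itertools

# Sketch of the load-balancing step: requests are spread round-robin across
# cloned application instances; the client sees only the balancer.
class RoundRobinBalancer:
    def __init__(self, instances):
        self._cycle = itertools.cycle(instances)

    def route(self, request):
        instance = next(self._cycle)      # pick the next instance in rotation
        return instance, request

lb = RoundRobinBalancer(["vm-1", "vm-2", "vm-3"])
assigned = [lb.route(f"req-{i}")[0] for i in range(6)]
# requests alternate: vm-1, vm-2, vm-3, vm-1, vm-2, vm-3
```

Production balancers use richer policies (least connections, health checks, session affinity), but round-robin captures the transparency the text describes: the cloud user sees a single endpoint regardless of how many instances serve it.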
The pricing model for SaaS applications is typically a monthly or yearly flat fee per user,[46] so prices are scalable and adjustable as users are added or removed at any point. It may also be free. Proponents claim that SaaS gives a business the potential to reduce
IT operational costs by outsourcing hardware and software maintenance and support to the cloud provider. This enables the
business to reallocate IT operations costs away from hardware/software spending and from personnel expenses, towards meeting
other goals. In addition, with applications hosted centrally, updates can be released without the need for users to install new
software. One drawback of SaaS comes with storing the users' data on the cloud provider's server. As a result, there could
be unauthorized access to the data. Examples of applications offered as SaaS are games and productivity software like Google Docs
and Office Online. SaaS applications may be integrated with cloud storage or File hosting services, which is the case with Google
Docs being integrated with Google Drive, and Office Online being integrated with OneDrive.

Mobile "backend" as a service (MBaaS)


Main article: Mobile backend as a service
In the mobile "backend" as a service (MBaaS) model, also known as "backend as a service" (BaaS), web app and mobile app developers
are provided with a way to link their applications to cloud storage and cloud computing services with application programming
interfaces (APIs) exposed to their applications and custom software development kits (SDKs). Services include user
management, push notifications, integration with social networking services and more. This is a relatively recent model in cloud
computing, with most BaaS startups dating from 2011 or later but trends indicate that these services are gaining significant
mainstream traction with enterprise consumers.
Serverless computing or Function-as-a-Service (FaaS)
Main article: Serverless computing
Serverless computing is a cloud computing code execution model in which the cloud provider fully manages starting and
stopping virtual machines as necessary to serve requests. Requests are billed by an abstract measure of the resources required to
satisfy the request, rather than per virtual machine per hour. Despite the name, serverless computing does not actually involve
running code without servers. The business or person using the system does not have to purchase, rent or provide servers or virtual
machines for the back-end code to run on.
Function as a service (FaaS) is a service-hosted remote procedure call that utilizes serverless computing to enable deploying
individual functions in the cloud to run in response to events. Some consider FaaS to fall under the umbrella of serverless
computing, while others use the terms interchangeably.
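A FaaS workload is typically a single handler function invoked once per event. The sketch below follows the common `(event, context)` convention popularized by AWS Lambda; the event shape and field names are assumptions for illustration, not a fixed schema:

```python
# Sketch of a FaaS-style function: one handler invoked per event, with no
# server managed by the author. The (event, context) signature follows the
# common AWS Lambda convention; the event's fields are illustrative.
def handler(event, context=None):
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}
```

The provider starts, scales, and stops the underlying execution environments, and billing follows invocations and resource use rather than server-hours, which is exactly the model the paragraph above describes.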

DEPLOYMENT MODELS - Cloud computing types

Private
Private cloud is cloud infrastructure operated solely for a single
organization, whether managed internally or by a third party, and hosted
either internally or externally.[4] Undertaking a private cloud project
requires significant engagement to virtualize the business environment,
and requires the organization to reevaluate decisions about existing
resources. It can improve business, but every step in the project raises
security issues that must be addressed to prevent serious vulnerabilities.
Self-run data centers are generally capital intensive. They have a
significant physical footprint, requiring allocations of space, hardware, and environmental controls. These assets have to be
refreshed periodically, resulting in additional capital expenditures. They have attracted criticism because users "still have to buy,
build, and manage them" and thus do not benefit from less hands-on management, essentially "[lacking] the economic model that
makes cloud computing such an intriguing concept".

Public
For a comparison of cloud-computing software and providers, see Cloud-computing comparison
Cloud services are considered "public" when they are delivered over the public Internet, and they may be offered as a paid
subscription, or free of charge.[63] Architecturally, there are few differences between public- and private-cloud services, but security
concerns increase substantially when services (applications, storage, and other resources) are shared by multiple customers. Most
public-cloud providers offer direct-connection services that allow customers to securely link their legacy data centers to their cloud-
resident applications. Several factors, such as the functionality of the solutions, cost, integration and organizational aspects, as well as safety and security, influence the decision of enterprises and organizations to choose a public cloud or an on-premises solution.[65]

Hybrid
See also: Hybrid cloud storage
Hybrid cloud is a composition of a public cloud and a private environment, such as a private cloud or on-premises
resources,[66][67] that remain distinct entities but are bound together, offering the benefits of multiple deployment models. Hybrid
cloud can also mean the ability to connect collocation, managed and/or dedicated services with cloud resources. Gartner defines
a hybrid cloud service as a cloud computing service that is composed of some combination of private, public and community cloud
services, from different service providers.

A hybrid cloud service crosses isolation and provider boundaries so that it cannot be simply put in one category of private, public,
or community cloud service. It allows one to extend either the capacity or the capability of a cloud service, by aggregation,
integration or customization with another cloud service.

Varied use cases for hybrid cloud composition exist. For example, an organization may store sensitive client data in house on a
private cloud application, but interconnect that application to a business intelligence application provided on a public cloud as a
software service. This example of hybrid cloud extends the capabilities of the enterprise to deliver a specific business service
through the addition of externally available public cloud services. Hybrid cloud adoption depends on a number of factors such as
data security and compliance requirements, level of control needed over data, and the applications an organization uses.

Another example of hybrid cloud is one where IT organizations use public cloud computing resources to meet temporary capacity
needs that can not be met by the private cloud. This capability enables hybrid clouds to employ cloud bursting for scaling across
clouds. Cloud bursting is an application deployment model in which an application runs in a private cloud or data center and
"bursts" to a public cloud when the demand for computing capacity increases. A primary advantage of cloud bursting and a hybrid
cloud model is that an organization pays for extra compute resources only when they are needed. [72] Cloud bursting enables data
centers to create an in-house IT infrastructure that supports average workloads, and use cloud resources from public or private
clouds, during spikes in processing demands.
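The cloud-bursting decision described above reduces to a simple capacity split: serve demand from private capacity first, and send only the overflow to the public cloud. A minimal sketch (capacities in arbitrary units, chosen for illustration):

```python
# Sketch of the cloud-bursting decision: run on private capacity until demand
# exceeds it, then "burst" the overflow to a public cloud.
def place_workload(demand, private_capacity):
    """Return (private_share, public_burst) for a given demand level."""
    private = min(demand, private_capacity)
    burst = max(demand - private_capacity, 0)
    return private, burst

place_workload(80, 100)    # fits entirely in the private cloud: (80, 0)
place_workload(150, 100)   # 50 units burst to the public cloud: (100, 50)
```

This captures the economic point in the text: the organization sizes its in-house infrastructure for average load and pays for public-cloud capacity only during spikes.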
Community
Community cloud shares infrastructure between several organizations from a specific community with common concerns (security,
compliance, jurisdiction, etc.), whether managed internally or by a third-party, and either hosted internally or externally. The costs
are spread over fewer users than a public cloud (but more than a private cloud), so only some of the cost savings potential of cloud
computing are realized.

Distributed
A cloud computing platform can be assembled from a distributed set of machines in different locations, connected to a single
network or hub service. It is possible to distinguish between two types of distributed clouds: public-resource computing and
volunteer cloud.
• Public-resource computing – This type of distributed cloud results from an expansive definition of cloud computing, because its resources are more akin to distributed computing than to cloud computing. Nonetheless, it is considered a sub-class of cloud computing.
• Volunteer cloud – Volunteer cloud computing is characterized as the intersection of public-resource computing and cloud computing, where a cloud computing infrastructure is built using volunteered resources. Many challenges arise from this type of infrastructure because of the volatility of the resources used to build it and the dynamic environment it operates in. It can also be called a peer-to-peer cloud or an ad-hoc cloud. An interesting effort in this direction is Cloud@Home, which aims to implement a cloud computing infrastructure using volunteered resources, providing a business model that incentivizes contributions through financial restitution.

Multi
Main article: Multicloud
Multicloud is the use of multiple cloud computing services in a single heterogeneous architecture to reduce reliance on single
vendors, increase flexibility through choice, mitigate against disasters, etc. It differs from hybrid cloud in that it refers to multiple
cloud services, rather than multiple deployment modes (public, private, legacy).

Poly
Poly cloud refers to the use of multiple public clouds for the purpose of leveraging specific services that each provider offers. It differs from multicloud in that it is not designed to increase flexibility or mitigate against failures, but rather to allow an organization to achieve more than could be done with a single provider.

Big data
The issues of transferring large amounts of data to the cloud as well as data security once the data is in the cloud initially hampered
adoption of cloud for big data, but now that much data originates in the cloud and with the advent of bare-metal servers, the cloud
has become a solution for use cases including business analytics and geospatial analysis.

HPC
HPC cloud refers to the use of cloud computing services and infrastructure to execute high-performance computing (HPC)
applications.[81] These applications consume a considerable amount of computing power and memory and are traditionally
executed on clusters of computers. In 2016 a handful of companies, including R-HPC, Amazon Web Services, Univa, Silicon
Graphics International, Sabalcore, Gomput, and Penguin Computing offered a high-performance computing cloud. The Penguin On
Demand (POD) cloud was one of the first non-virtualized remote HPC services offered on a pay-as-you-go basis.[82][83] Penguin
Computing launched its HPC cloud in 2016 as an alternative to Amazon's EC2 Elastic Compute Cloud, which uses virtualized
computing nodes.

ARCHITECTURE

Cloud computing sample architecture


Cloud architecture, the systems architecture of the software systems involved in
the delivery of cloud computing, typically involves multiple cloud
components communicating with each other over a loose coupling mechanism
such as a messaging queue. Elastic provision implies intelligence in the use of tight
or loose coupling as applied to mechanisms such as these and others.

Cloud engineering
Cloud engineering is the application of engineering disciplines of cloud computing. It brings a systematic approach to the high-
level concerns of commercialization, standardization and governance in conceiving, developing, operating and maintaining cloud
computing systems. It is a multidisciplinary method encompassing contributions from diverse areas such
as systems, software, web, performance, information technology engineering, security, platform, risk, and quality engineering.
Security and privacy
Cloud suppliers' security and privacy agreements must be aligned with customers' requirements and regulations.
Main article: Cloud computing security
Cloud computing poses privacy concerns because the service
provider can access the data that is in the cloud at any time. It
could accidentally or deliberately alter or delete information.
Many cloud providers can share information with third parties
if necessary for purposes of law and order without a warrant.
That is permitted in their privacy policies, which users must
agree to before they start using cloud services. Solutions to
privacy include policy and legislation as well as end-users'
choices for how data is stored. Users can encrypt data that is
processed or stored within the cloud to prevent unauthorized
access. Identity management systems can also provide
practical solutions to privacy concerns in cloud computing. These systems distinguish between authorized and unauthorized users
and determine the amount of data that is accessible to each entity. The systems work by creating and describing identities,
recording activities, and getting rid of unused identities.
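The identity-management idea described above, distinguishing authorized from unauthorized users and limiting what each identity can access, is essentially an authorization check. The sketch below is a toy role-based model; the roles, actions, and user names are assumptions for illustration:

```python
# Sketch of identity-based access control: distinguish authorized users and
# determine how much each identity may do. Roles and rules are illustrative.
PERMISSIONS = {
    "admin":   {"read", "write", "delete"},
    "analyst": {"read"},
}

def is_allowed(identities, user, action):
    """Unknown users are unauthorized; known users get their role's actions."""
    role = identities.get(user)
    return role is not None and action in PERMISSIONS.get(role, set())

identities = {"alice": "admin", "bob": "analyst"}
```

Real cloud identity systems add authentication, audit logging, and identity lifecycle management (the "creating, recording, and retiring identities" steps above), but every request ultimately passes through a check of this shape.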

According to the Cloud Security Alliance, the top three threats in the cloud are Insecure Interfaces and APIs, Data Loss & Leakage, and Hardware Failure, which accounted for 29%, 25%, and 10% of all cloud security outages respectively. Together, these form shared technology vulnerabilities. On a cloud provider platform shared by different users, information belonging to different customers may reside on the same data server. Additionally, Eugene Schultz, chief technology officer
at Emagined Security, said that hackers are spending substantial time and effort looking for ways to penetrate the cloud.

"There are some real Achilles' heels in the cloud infrastructure that are making big holes for the bad guys to get into". Because
data from hundreds or thousands of companies can be stored on large cloud servers, hackers can theoretically gain control of huge
stores of information through a single attack—a process he called "hyperjacking". Some examples of this include the Dropbox
security breach, and iCloud 2014 leak. Dropbox had been breached in October 2014, having over 7 million of its users passwords
stolen by hackers in an effort to get monetary value from it by Bitcoins (BTC). By having these passwords, they are able to
read private data as well as have this data be indexed by search engines (making the information public).

There is the problem of legal ownership of the data (If a user stores some data in the cloud, can the cloud provider profit from it?).
Many Terms of Service agreements are silent on the question of ownership. [90] Physical control of the computer equipment (private
cloud) is more secure than having the equipment off-site and under someone else's control (public cloud). This delivers great
incentive to public cloud computing service providers to prioritize building and maintaining strong management of secure
services.[91] Some small businesses that don't have expertise in IT security could find that it's more secure for them to use a public
cloud. There is the risk that end users do not understand the issues involved when signing on to a cloud service (people sometimes do not read the many pages of the terms of service agreement and simply click "Accept"). This matters now that cloud computing is common and required for some services to work, for example for an intelligent personal assistant such as Apple's Siri or Google Assistant. Fundamentally, a private cloud is seen as more secure, with higher levels of control for the owner, while a public cloud is seen as more flexible and requiring less investment of time and money from the user.

Market
According to International Data Corporation (IDC), global spending on cloud computing services has reached $706 billion and is expected to reach $1.3 trillion by 2025, while Gartner estimated that global public cloud services end-user spending would reach
$600 billion by 2023. As per a McKinsey & Company report, cloud cost-optimization levers and value-oriented business use cases
foresee more than $1 trillion in run-rate EBITDA across Fortune 500 companies as up for grabs in 2030. In 2022, more than $1.3
trillion in enterprise IT spending was at stake from the shift to the cloud, growing to almost $1.8 trillion in 2025, according to
Gartner.

Similar concepts
The goal of cloud computing is to allow users to take benefit from all of these technologies, without the need for deep knowledge
about or expertise with each one of them. The cloud aims to cut costs and helps the users focus on their core business instead of
being impeded by IT obstacles.[97] The main enabling technology for cloud computing is virtualization. Virtualization software
separates a physical computing device into one or more "virtual" devices, each of which can be easily used and managed to perform
computing tasks. With operating system–level virtualization essentially creating a scalable system of multiple independent
computing devices, idle computing resources can be allocated and used more efficiently. Virtualization provides the agility required
to speed up IT operations and reduces cost by increasing infrastructure utilization. Autonomic computing automates the process
through which the user can provision resources on-demand. By minimizing user involvement, automation speeds up the process,
reduces labor costs and reduces the possibility of human errors.

Cloud computing uses concepts from utility computing to provide metrics for the services used. Cloud computing attempts to
address QoS (quality of service) and reliability problems of other grid computing models.
Cloud computing shares characteristics with:
• Client–server model – Client–server computing refers broadly to any distributed application that distinguishes between
service providers (servers) and service requestors (clients).[98]
• Computer bureau – A service bureau providing computer services, particularly from the 1960s to 1980s.
• Grid computing – A form of distributed and parallel computing, whereby a 'super and virtual computer' is composed of
a cluster of networked, loosely coupled computers acting in concert to perform very large tasks.
• Fog computing – Distributed computing paradigm that provides data, compute, storage and application services closer to
the client or near-user edge devices, such as network routers. Furthermore, fog computing handles data at the network
level, on smart devices and on the end-user client-side (e.g. mobile devices), instead of sending data to a remote location
for processing.
• Utility computing – The "packaging of computing resources, such as computation and storage, as a metered service similar
to a traditional public utility, such as electricity."[99][100]
• Peer-to-peer – A distributed architecture without the need for central coordination. Participants are both suppliers and
consumers of resources (in contrast to the traditional client-server model).
• Cloud sandbox – A live, isolated computer environment in which a program, code or file can run without affecting the
application in which it runs.

See also
• Block-level storage • Edge computing
• Browser-based computing • Edge device
• Category:Cloud computing providers • e-Science
• Category:Cloud platforms • File system
• Communication protocol o Clustered file system
• Communications system o Distributed file system
• Cloud collaboration o Distributed file system for cloud
• Cloud-native computing • Fog computing
• Cloud-native processor • Fog robotics
• Cloud computing security • Green computing (environmentally sustainable computing)
• Cloud-computing comparison • Grid computing
• Cloud management • In-memory database
• Cloud research • In-memory processing
• Cloud robotics • Internet of things
• Cloud gaming • IoT security device
• Cloud storage • Microservices
• Cloudlet • Mobile cloud computing
• Computer cluster • Multi-access edge computing
• Cooperative storage cloud • Peer-to-peer
• Decentralized computing • Personal cloud
• Desktop virtualization • Robot as a service
• Dew computing • As a service
• Directory • Service-oriented architecture
• Distributed data store • Time-sharing
• Distributed database • Ubiquitous computing
• Distributed computing • Virtual private cloud
• Distributed networking • Private cloud computing infrastructure

Difference Between Cloud Based and Server Based

Technology has come a long way over the years, and it is hard to believe how far we have come in our ability to connect with others. As technology evolves, so does everything along with it; change is the only constant. We have witnessed a dramatic technological evolution within a short span of time, and it is remarkable how different things were just 10 years ago. Cloud technology is a case in point: it has been around for quite some time, and most of us have used it unknowingly through Amazon, Gmail, Google Docs, and more, yet we know very little about it. Let's take a look at what it means to be cloud based and server based.

What is Cloud Based?


The term cloud essentially refers to the internet, and it is everywhere. Cloud refers to a pool of shared computing resources available to users on demand through web-based tools via the internet. The era of cloud started in 2006, when Amazon released its first cloud services, Elastic Compute Cloud (EC2) and Simple Storage Service (S3), which came to be used by businesses and organizations in more than 200 countries. The services offered by cloud service providers and the number of cloud users have increased exponentially since then. The whole idea of cloud computing is to shift everything to the cloud so that users can access data remotely without being physically present at a specific place. This makes data processing and storage more convenient and efficient than ever. Many businesses and organizations have adopted this paradigm as a potential game changer for their business.

What is Server Based?


Server based computing refers to applications running on a server. As the name
suggests, the base of a server based network or system is the server itself, otherwise
known as the centralized server. A server is a dedicated computer tasked with
managing network resources. In simple terms, a server is an instance of a computer
program that accepts and responds to requests made by other programs on the
network, otherwise known as clients. The term "server-based computing" has been
around for several years; the idea behind it is to host data and other forms of
resources on a central computer known as a server, and to have clients such as desktop
computers and laptops request that the server share its resources with them.
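The request-and-respond cycle described above can be sketched in a few lines. This is a hypothetical, minimal illustration (an in-process HTTP server holding one shared resource), not a production setup; the resource name and contents are made up for the example.

```python
# Minimal illustration of the client-server model: a central server
# accepts requests from clients and responds with a shared resource.
import http.server
import threading
import urllib.request

SHARED_RESOURCE = b"report.txt contents"  # resource held centrally

class ResourceHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # The server responds to every client request with the resource.
        self.send_response(200)
        self.send_header("Content-Length", str(len(SHARED_RESOURCE)))
        self.end_headers()
        self.wfile.write(SHARED_RESOURCE)

    def log_message(self, *args):
        pass  # silence per-request logging

server = http.server.HTTPServer(("127.0.0.1", 0), ResourceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Two "clients" (say, a desktop and a laptop) request the same resource.
url = f"http://127.0.0.1:{server.server_port}/report.txt"
client_a = urllib.request.urlopen(url).read()
client_b = urllib.request.urlopen(url).read()
server.shutdown()
```

Both clients receive identical bytes because the resource lives in one central place, which is the defining property of the server-based model.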

Difference between Cloud Based and Server Based


Meaning
Cloud is everywhere, and it handles server and network infrastructure management. The term cloud-based refers to anything,
be it applications, resources, or services, that is made available to users on demand through web-based tools via the internet,
as opposed to through a direct connection to a server. It is called "cloud computing" because everything from applications to data centers
to services is found in the cloud. A server, on the other hand, is a computer program that provides services to other computer
programs and their users. Server based computing refers to the technology by which applications are implemented, controlled,
and operated on the server rather than the client.

Technology
The term cloud refers to a pool of dynamically configured shared resources, based on network technology, in which each user has
access to a private resource pool offered by a third-party cloud service provider. These cloud service providers
deliver their computing resources over the internet, where they can be accessed through a web browser. Server based computing,
on the other hand, refers to the technology where a device or a program, otherwise known as a server, is designed to manage
network resources. The server accepts and responds to requests made by another program, otherwise known as a client.

Application
A cloud based application is any software program that operates in the cloud, meaning it runs
on cloud infrastructure and can be accessed over the internet by various computing devices through a web browser or a program
interface. Cloud applications can be installed on either a private cloud or a public cloud. A server based application, on the
other hand, is a program stored on a remote server and accessed through a browser interface such as a web
browser. Servers provide services such as sharing resources or data among clients, along with data access and persistence.

Architecture
A cloud computing architecture is a conceptual model that encompasses all the components and subcomponents required for
cloud computing in a cloud space. Cloud provides on-demand access to a networked pool of shared resources like servers,
applications, storage, and networks, regardless of where the cloud is. Server architecture, on the other hand, is the basic
foundation on which the server is created or deployed. It basically refers to a network in which clients request and receive service
from a centralized server and the server then responds to the requests. It defines how a server along with its components is
designed, maintained and managed as a whole.

Summary of Cloud Based vs. Server Based


In a nutshell, there is a thin line between a cloud based application and a server or web based application, and the line remains as
blurred as ever. This is because they share many similarities in functionality, but there are noteworthy differences as well,
especially when cloud applications are used for redundancy rather than for computing power. Any program
that runs on the internet is said to be cloud based. In fact, almost everything tagged as a cloud product is software-as-a-service
with online storage space and remote access. Server based applications refer to applications running on the server.

Cloud engineering
Cloud engineering is the application of engineering disciplines to cloud computing. It brings a systematic approach to concerns of
commercialization, standardization, and governance of cloud computing applications. In practice, it leverages the methods and
tools of engineering in conceiving, developing, operating and maintaining cloud computing systems and solutions.[1] It is about the
process of designing the systems necessary to leverage the power and economics of cloud resources to solve business problems.[2]

Core features
Cloud engineering is a field of engineering that focuses on cloud services, such as "software as a service", "platform as a service",
and "infrastructure as a service". It is a multidisciplinary method encompassing contributions from diverse areas such as systems
engineering, software engineering, web engineering, performance engineering, information technology engineering, security
engineering, platform engineering, service engineering, risk engineering, and quality engineering. The nature of commodity-like
capabilities delivered by cloud services and the inherent challenges in this business model drive the need for cloud engineering as
the core discipline.

Elements of Cloud Engineering include:


• Foundation: the fundamental basics, concepts, guiding principles, and taxonomy
• Implementation: the building blocks and practice guides for Cloud realization
• Lifecycle: the end-to-end iteration of Cloud development and delivery
• Management: the design-time and run-time Cloud management from multiple perspectives
Profession
The professionals who work in the field of cloud engineering are primarily cloud architects and engineers. The key skills possessed
by cloud engineering professionals are:
• Know the language of business and domain knowledge
• Understand the conceptual, logical and physical architecture
• Master various cloud technologies, frameworks, and platforms
• Implement the solutions for quality of cloud services, e.g. HA, DR, scaling, performance
• Work on the security at multiple levels
• Develop applications for flexible deployment, provisioning, and management
• Leverage open source packages and products
• Apply agile and lean principles in design and construction

The demand for skills in advanced ICT (information and communication technology) has expanded rapidly in recent years as
business and society are transformed by the emergence of the Internet and the Web as ubiquitous media enabling a knowledge-
based global economy. This in turn has created a huge demand for network-enabled parallel and distributed computing
technologies, which are changing the way we conduct science, operate businesses, and tackle challenging problems such as epidemic
diseases and climate change.

Software
There are many platforms available for cloud engineering, enabling a variety of adaptive environments for architectural framework
design, access point sharing, and data retrieval analytics. Platform virtualization is also available, allowing multiple
operating systems to run under a hypervisor on shared cloud infrastructure.[3]

History
The notion of cloud engineering in the context of cloud computing had been used sparsely in discussions, presentations, and talks
on various occasions in the mid-2000s. The term cloud engineering was formally coined around 2007, and the concept
was officially introduced in April 2009. Various aspects and topics of the subject have since been extensively
covered at a number of industry events. Extensive research has been conducted on specific areas of cloud engineering, such as
development support for cloud patterns and cloud business continuity services. The first IEEE International Conference on Cloud
Engineering (IC2E) took place on March 25–28, 2013,[4] and the second was held on March 10–14, 2014.

Wiki
A wiki is an online hypertext publication (a website or database) that is
collaboratively edited and managed by its own audience (a
community of users), allowing any user with a web browser to add and
edit content.

A typical wiki contains multiple pages for the subjects or scope of the
project, and could be either open to the public or limited to use within
an organization for maintaining its internal knowledge base.

Wikis are enabled by wiki software, otherwise known as wiki engines.


A wiki engine, being a form of a content management system, differs
from other web-based systems such as blog software, in that the content is created without any defined owner or leader, and wikis
have little inherent structure, allowing structure to emerge according to the needs of the users. Wiki engines usually allow content
to be written using a simplified markup language and sometimes edited with the help of a rich-text editor.
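The "simplified markup language" idea can be sketched with a few regex rules that turn wiki-style markup into HTML. Real wiki engines use far richer grammars; the three rules below (bold, italic, internal links) are purely illustrative.

```python
# A toy wiki-markup-to-HTML converter: three substitution rules
# standing in for a real wiki engine's markup grammar.
import re

def wiki_to_html(text: str) -> str:
    text = re.sub(r"'''(.+?)'''", r"<b>\1</b>", text)              # '''bold'''
    text = re.sub(r"''(.+?)''", r"<i>\1</i>", text)                # ''italic''
    text = re.sub(r"\[\[(.+?)\]\]", r'<a href="\1">\1</a>', text)  # [[link]]
    return text

html = wiki_to_html("A '''wiki''' page may link to [[AnotherPage]].")
# html is now: A <b>wiki</b> page may link to <a href="AnotherPage">AnotherPage</a>.
```

Note that the bold rule must run before the italic rule, since `'''` would otherwise be consumed as an italic pair plus a stray quote.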

There are dozens of different wiki engines in use, both standalone and part of other software, such as bug tracking systems. Some
wiki engines are free and open-source, whereas others are proprietary. Some permit control over different functions (levels of
access); for example, editing rights may permit changing, adding, or removing material. Others may permit access without
enforcing access control. Other rules may be imposed to organize content.

There are hundreds of thousands of wikis in use, both public and private, including wikis functioning as knowledge management
resources, note-taking tools, community websites, and intranets. Ward Cunningham, the developer of the first wiki software,
WikiWikiWeb, originally described wiki as "the simplest online database that could possibly work". "Wiki" is a Hawaiian word
meaning "quick".

The online encyclopedia project Wikipedia is the most popular wiki-based website, and is one of the most widely viewed sites in
the world, having been ranked in the top twenty since 2007. Wikipedia is not a single wiki but rather a collection of hundreds of
wikis, with each one pertaining to a specific language. The English-language Wikipedia has the largest collection of articles: as of
July 2023, it has over 6 million articles.
What Is Cloud Storage?
Definition, Types, Benefits, and Best Practices
Cloud storage allows users to store digital data files on virtual servers.
Cloud storage is defined as a data deposit model in which digital information such
as documents, photos, videos and other forms of media are stored on virtual or
cloud servers hosted by third parties. It allows you to transfer data on an offsite
storage system and access them whenever needed. This article delves into the
basics of cloud storage.
Table of Contents
o What Is Cloud Storage?
o Types of Cloud Storage
o Benefits and Challenges of Cloud Storage Adoption
o Top 8 Best Practices to Implement Cloud Storage for Companies in 2021
What Is Cloud Storage?

Cloud storage is a cloud computing model that allows users to save important data or media files on remote, third-party servers.
Users can access these servers at any time over the internet. Also known as utility storage, cloud storage is maintained and
operated by a cloud-based service provider.
From greater accessibility to data backup, cloud storage offers a host of benefits, the most notable being large storage capacity
and minimal cost. Cloud storage is delivered on demand and eliminates the need to purchase and manage your own data storage
infrastructure. With "anytime, anywhere" data access, it gives you agility, global scale, and durability.
How Cloud Storage Works
Cloud storage works as a virtual data center. It offers end users and applications virtual storage
infrastructure that can be scaled to the application’s requirements. It generally operates via a web-based
API implemented remotely through its interaction with in-house cloud storage infrastructure.

Cloud storage includes at least one data server to which a user can connect via the internet. The user
sends files to the data server, which forwards them to multiple servers, manually or automatically, over
the internet. The stored data can then be accessed via a web-based interface.

To ensure the constant availability of data, cloud storage systems involve large numbers of data servers.
Therefore, if a server requires maintenance or fails, the user can be assured that the data has been
moved elsewhere to ensure availability.
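The replication behavior described above can be modeled in a few lines. This is a hypothetical toy sketch; real systems add consistency protocols, checksums, and geographic placement, none of which appear here.

```python
# Toy model of cloud storage redundancy: an upload is replicated to
# several data servers, so the file survives the failure of any one.

class CloudStore:
    def __init__(self, num_servers=3):
        # Each "server" is just a dict of filename -> bytes.
        self.servers = [dict() for _ in range(num_servers)]

    def upload(self, name, data):
        # The receiving server forwards the file to every replica.
        for server in self.servers:
            server[name] = data

    def fail(self, index):
        self.servers[index] = None  # simulate a server going down

    def download(self, name):
        # Any surviving replica can serve the file.
        for server in self.servers:
            if server is not None and name in server:
                return server[name]
        raise FileNotFoundError(name)

store = CloudStore(num_servers=3)
store.upload("photo.jpg", b"\x89JPEGDATA")
store.fail(0)                         # one data server goes offline
data = store.download("photo.jpg")    # still served by a replica
```

Because every upload lands on all three replicas, losing one server leaves two intact copies, which is exactly the availability guarantee the text describes.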


Types of Cloud Storage

Cloud services have made it possible for anyone to store digital data and access it from anywhere, which makes cloud storage
essentially a virtual hard drive. From saving important data such as word documents and video files to processing complex data
and running applications in the cloud, cloud storage is a versatile system. To decide which cloud storage is best, users
first need to determine their use cases. Let's look at the different types of cloud storage solutions:

1. Private cloud storage


Private cloud storage is also known as enterprise or internal cloud storage. In this case, data is stored on the company or
organization's intranet and protected by the company's own firewall. Private cloud storage is a good option for companies that
run their own data centers and can manage data privacy in-house. A major advantage of a private cloud is that it offers
the user complete control. On the other hand, a major drawback of private cloud storage is the cost and effort of
maintenance and updates, since the responsibility of managing the storage lies with the host company.

2. Public cloud storage


Public cloud storage requires few administrative controls and can be accessed online by the user and anyone else the user
authorizes. With public cloud storage, the user or company does not need to maintain the system. Public cloud storage is hosted by
various solution providers, so there is little opportunity to customize security settings, as they are common to all
users. Amazon Web Services (AWS), IBM Cloud, Google Cloud, and Microsoft Azure are a few popular public cloud storage
providers. Public cloud storage is easily scalable, affordable, and reliable, and offers seamless monitoring with zero maintenance.
3. Hybrid cloud storage
Hybrid cloud storage is a combination of private and public cloud storage. As the name suggests, it offers the
best of both worlds to the user: the security of a private cloud and the scalability and cost-effectiveness of a public cloud. In a hybrid cloud, data
can be stored on the private cloud while information processing tasks are assigned to the public cloud, with the help of
cloud computing services. Hybrid cloud storage is affordable and offers easy customization and greater user control.

4. Community cloud storage


Community cloud storage is a variation of the private cloud storage model, which offers cloud solutions for specific businesses or
communities. In this model, cloud storage providers offer their cloud architecture, software and other development tools to meet
the community’s requirements. Any data is stored on the community-owned private cloud storage to manage the community’s
security and compliance needs. Community cloud storage is a great option for health, financial or legal companies with strict
compliance policies.

Benefits and Challenges of Cloud Storage Adoption


The cloud is rapidly becoming the storage environment of choice for enterprises. 30% of all corporate data
was stored on the cloud in 2015, and this share rose to 50% in 2020. The cloud storage market has grown in tandem and is
expected to be worth $137.3 billion by 2025, as per MarketsandMarkets. This is because the cloud offers several benefits over
traditional on-premises storage systems.
Benefits of cloud storage:
o Flexibility and ease of access: Cloud storage means that your data is not tied down to any one location. Various stakeholders
can access assets stored on the cloud from a location and device of their choice without any download or installation hassles.
o Remote management support: Cloud storage also paves the way for remote management either by internal IT teams or
by managed service providers (MSPs). They can troubleshoot without being present on-site, speeding up issue resolution.
o Fast scalability: A major benefit of cloud storage is that you can provision new resources with only a few clicks without the need
for any additional infrastructure. When faced with an unprecedented increase in data volumes, this feature aids business
continuity.
o Redundancy for backup: Data redundancy (i.e., replicating the same data in multiple locations) is essential for an effective
backup mechanism. The cloud ensures your data is kept secure in a remote location in case of a natural disaster, accident, or
cyberattack.
o Long-term cost savings: In the long-term, cloud storage can save you significantly in the costs of hardware equipment, storage
facilities, power supply, and personnel, which are sure to multiply as your organization grows.

Challenges of cloud storage


While there are undeniable advantages of adopting cloud storage, there are a few cons to remember as well. By navigating these
cons or challenges, you can arrive at a pragmatic cloud storage strategy that maximizes its benefits.
o Risk of vendor lock-in: If all your data is stored in a single public cloud platform, there’s a risk of vendor lock-in and potential
inflexibilities. Address this with a hybrid or multi-cloud blueprint where there is sufficient interoperability between
environments.
o Security issues around multi-tenancy: Public cloud environments are shared by multiple tenants, which can multiply your
security vulnerabilities. You can prevent this through cloud data protection and by leveraging the private cloud for sensitive data.
o Fragmentation of IT landscape: Unplanned cloud storage adoption can cause your IT landscape to become fragmented over
time. That’s why you need a detailed strategic blueprint outlining your short, mid, and long-term cloud roadmap.
o Outage and downtime risk: Cloud platforms managed by external providers could suffer from an outage, rendering the data and
applications stored in these environments inaccessible. Service level agreements should specify downtime metrics, and you need
additional redundancy for your most critical data.
o Short-term budget overruns: Cloud cost worries are extremely common, where data storage and storage processes occupy more
space than estimated. A cloud resource management tool can help address this, giving you visibility and control.

Selecting the right cloud storage provider


Let’s look at the most critical aspects businesses need to consider when selecting a cloud storage provider.
o Storage space: The amount of data a business processes determines the requirement for storage space. A small organization
(around 250 employees) could opt for public cloud storage services, which offer employees storage space of over 15 GB each.
It is recommended to compare various public cloud storage pricing plans before signing the deal.
o Maintenance & uptime: Cloud servers need to be maintained to make sure the data stored is secure. However, downtimes and
network failures can occur anytime. Therefore, understanding the maintenance and uptime needed by cloud service providers
is essential. Organizations should ask their chosen cloud service providers to demonstrate their downtime plans and run checks
before buying any cloud solution.
o Security: If local data is compromised, cloud storage comes in handy as a backup. There is no guarantee, however, that
cloud storage providers are immune to security threats, so understanding the security measures in place at the cloud storage
provider is important. Two main factors need to be considered for security: the physical security of the cloud solution provider's
servers and the level of encryption applied to the stored data.
o Speed: The speed of downloads from the cloud has a major impact on businesses and their ability to process critical data. If
cloud storage providers place a cap on the download speed, retrieving data and running applications will take longer. Therefore,
organizations need to gauge the cloud storage download speeds of a provider before buying any storage space.

Top 8 Best Practices to Implement Cloud Storage for Companies in 2021


Even if it involves a few challenges, cloud storage implementation is now a top priority for companies. It enables easy access to
information for large, distributed teams operating in a WFH environment. It can help you gain from sophisticated data analytics
without investing in on-premise storage for large volumes of data. Most importantly, it enables interconnectivity between
different applications and data sources, generating efficiencies and business value. In the last year, cloud storage adoption has
accelerated at a dramatic pace, and the momentum will continue for the foreseeable future. Here are 8 best practices that can
help make the most of this opportunity.

1. Pilot cloud storage using non-business-critical data


The implementation of cloud storage marks a significant change in your IT operational approach, transforming how other related
processes are carried out. It will influence data-driven applications, business analytics, integrations, and other components of the
IT landscape. Therefore, it is important to first trial cloud storage at a limited scale before implementing it across the organization.
This will allow you to observe its impact on related IT processes, tweaking the implementation SLAs and configurations accordingly.
Conduct the initial pilot using non-business-critical data to avoid interrupting live processes and keep any adverse impacts
restricted to a sandboxed environment.

2. Leverage multi-cloud to avoid vendor lock-in


As the cloud storage market matures, providers are eager to deliver a wide variety of services and capabilities under one offering.
However, this could lead to vendor lock-in. If you rely on a single cloud environment for all your storage requirements, any
downtime or outage experienced by that environment could cripple your entire storage landscape. And, as your storage volumes
increase with time, you will find it increasingly hard to shift out if necessary. To prevent such a situation, it is advisable to adopt
a multi-cloud landscape in which different data and application buckets are stored in different cloud environments, with
interoperability among the platforms.
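One common way to keep that interoperability is to code against a small storage interface so data can move between providers. The sketch below uses two in-memory stand-ins for two clouds; real adapters would wrap each provider's SDK, and all names here are illustrative.

```python
# Avoiding lock-in by abstracting storage behind a common interface:
# both "clouds" expose the same put/get/keys operations.

class StorageBackend:
    def __init__(self):
        self._objects = {}

    def put(self, key, data):
        self._objects[key] = data

    def get(self, key):
        return self._objects[key]

    def keys(self):
        return list(self._objects)

def migrate(source: StorageBackend, target: StorageBackend):
    # Because both backends share one interface, moving off a
    # provider is a loop, not an application rewrite.
    for key in source.keys():
        target.put(key, source.get(key))

cloud_a, cloud_b = StorageBackend(), StorageBackend()
cloud_a.put("invoices/2021.csv", b"id,amount\n1,99")
migrate(cloud_a, cloud_b)
restored = cloud_b.get("invoices/2021.csv")
```

The design point is that `migrate` depends only on the shared interface, so switching or adding a provider means writing one new adapter rather than touching application code.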

3. Specify your data retention policies before migrating


Data retention refers to the practice of holding on to a data asset for a limited period, as per the wishes of the data owner, business
relevance, or industry rules. Retention policies not only mention how long to store data but also the timeline and methodology for
retiring it. Data retention policies will determine how much cloud storage you occupy, the frequency of backup and transfer
processes, and cloud storage costs. Without a detailed retention policy document in place, enterprises are likely to exceed their
projected storage volumes well ahead of time, leading to budget overruns. That’s why you need to specify your data retention
policies before migrating to the cloud, incorporating these into service level agreements (SLAs) to ensure predictability and
compliance later on.
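A retention policy of this kind can be expressed directly in code: given a retention period per data class, decide which stored objects are due for retirement. The data classes and periods below are made-up examples, not industry rules.

```python
# Sketch of a data retention check: objects older than their class's
# retention period are flagged for retirement.
from datetime import date, timedelta

RETENTION = {                      # data class -> retention period
    "logs": timedelta(days=90),
    "invoices": timedelta(days=365 * 7),
}

def due_for_retirement(objects, today):
    # objects: list of (name, data_class, stored_on) tuples
    return [name for name, cls, stored_on in objects
            if today - stored_on > RETENTION[cls]]

objects = [
    ("app.log", "logs", date(2021, 1, 1)),
    ("inv-001.pdf", "invoices", date(2021, 1, 1)),
]
expired = due_for_retirement(objects, today=date(2021, 6, 1))
# expired contains only "app.log": 151 days exceeds the 90-day log
# retention, while the invoice is still well within its 7 years.
```

Running a check like this on a schedule keeps actual storage volumes aligned with the projections in the SLA.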

4. Bring cloud storage under the ambit of IT compliance and audits


IT compliance and audits tend to focus on on-premise environments and first-party managed storage, overlooking data housed in
a remote location or by an external cloud vendor. This could cause non-compliance risks later on. Even if the data is stored by a
public cloud vendor or a third-party MSP, enterprises must still take complete ownership of regulatory compliance around data
privacy, compliance, and security. Cloud storage must be regularly audited with a detailed inventory of your assets, their utilization,
and retention plans.

5. Invest in the private cloud if you operate in a regulated industry


Regulated industries such as healthcare, financial services, governments, and educational institutions typically generate and store
large volumes of sensitive information. This could range from the medical histories of patients to the names and address
information of school students or payment card details. It can be helpful to leverage private cloud storage for information such as
this, protecting the data from the risks of a multi-tenant cloud architecture.
Private cloud storage also means that you are immune to vendor-related outages and downtimes, which would render these vital
data assets inaccessible. In fact, the private cloud is mission-critical for companies in regulated industries, where sensitive data is
essential for day-to-day business processes and not just compliance-related archives.

6. Make remote work a focus area when planning for cloud storage
Remote work is now a major use case for cloud storage implementation and is poised to be the new normal for the foreseeable
future. Therefore, your cloud storage strategy must take the needs of a remote worker into account, from connecting with the right
productivity tools to enforcing security policies that restrict remote access in certain scenarios. Outline measures to prevent
employees from accessing cloud storage from unfamiliar and unauthorized devices. Specify clear policies to regulate which data
can be stored on the cloud and which information needs to be kept on-premise.

7. Optimize data transfer to avoid egress fees


Most public cloud platforms charge you for data retrieval (also known as egress fees) to move data out of their cloud platform. This
tactic encourages more dependency and possibly vendor lock-in, as you keep data immobile on the cloud for longer periods. Your
data transfer frequency is directly linked to your cloud costs, and frequent retrieval (for example, to run on-premise analytics) will
add to your resource consumption in the form of egress fees.
There are two ways to address this. First, you can host analytics applications within the same public cloud so that data doesn’t
need to be moved out for processing. Second, you can optimize each transfer by compressing data volumes to reduce the retrieval
fees.
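The second optimization can be sketched in a few lines: compressing data before it leaves the cloud shrinks the number of bytes that egress fees are charged on. Actual savings depend on the data; repetitive content such as logs and CSVs compresses best, as the example below illustrates.

```python
# Compressing data before transfer to reduce egress volume.
import gzip

# Repetitive, log-like data: 1,000 identical CSV rows.
records = b"timestamp,status\n" + b"2021-06-01,OK\n" * 1000
compressed = gzip.compress(records)

savings = 1 - len(compressed) / len(records)
# Egress fees are billed per byte transferred, so moving the
# compressed form instead of the raw records cuts the fee in
# proportion to the savings ratio.
```

For highly repetitive data like this, gzip typically removes well over half the bytes; already-compressed media (JPEG, MP4) would see almost no benefit.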

8. Adopt a cloud-first cybersecurity solution


Finally, ensure that your cybersecurity solution takes your cloud storage investments into account. For example, Trend Micro offers
a cloud-first solution called Cloud One – Conformity, and there are several cloud access security broker (CASB) tools available. Even
if only a portion of your total data assets is stored in the cloud, it has to be covered by a cybersecurity solution to close any
vulnerabilities and demonstrate compliance with data protection laws.

Wrapping up
Even as the cloud plays a central role in data processing and storage, the future of cloud and data storage is changing rapidly. Data
security is one of the major concerns in cloud storage, and mass data breaches will remain a strong point of concern for
businesses that opt for cloud storage.
In such a scenario, will the cloud become obsolete? What are the possible alternatives to store complex data in the future? There
are many options on the table, including serverless computing. Our two essential tips for techies looking at optimizing cloud
services are conducting regular reviews and identifying redundant tasks on cloud services. The idea is to enjoy the freedom that
the cloud offers without overspending.
