
Cloud Computing Review, Research and Challenges: A Technical Aspect
Abstract: Cloud computing is the latest technology for delivering computing resources as a
service. It is an architecture that provides computing services via the Internet, on demand
and pay per use, from a pool of shared resources, namely networks, storage, servers,
services and applications, without the user physically acquiring them. Cloud computing is a
completely Internet-dependent technology in which client data is stored and maintained in the
data center of a cloud provider such as Google, Amazon, Salesforce.com or Microsoft. This
research paper outlines what cloud computing is, the various cloud models, and the main
security risks and issues that are currently present within the cloud computing industry. It
also analyses the key research challenges present in cloud computing and offers best
practices to service providers.

INTRODUCTION
Cloud computing is a model that allows universal, on-demand and easy network access to a
shared pool of configurable computing resources (e.g. networks, servers, storage, software
and services) that can be rapidly provisioned and released with minimal management effort or
service provider interaction. Cloud computing can be seen as a set of network-enabled
platforms that offer flexible, on-demand services with QoS guarantees, accessible over the
Internet. It is a distributed architecture that centralises server resources in order to provide
computing resources and services on demand on a scalable platform. Cloud service providers
(CSPs) give their users cloud systems on which to build their web services, just as Internet
service providers (ISPs) give customers high-speed broadband connectivity; in this sense,
both CSPs and ISPs are service providers. Three kinds of services are usually offered by the
cloud: Software as a Service (SaaS), Infrastructure as a Service (IaaS) and Platform as a
Service (PaaS). There are numerous reasons for companies to shift towards cloud computing
IT solutions, as they are only expected to pay for consumption-based services. Moreover,
companies can more easily meet the needs of rapidly evolving markets and ensure that their
customers remain on the leading edge. By leveraging this technology, users can access heavy
applications through lightweight portable devices such as cell phones, PCs and PDAs.

SERVICE MODELS OF CLOUD COMPUTING


There are usually three types of cloud services: Software as a Service (SaaS), Platform as a
Service (PaaS), and Infrastructure as a Service (IaaS).

A) Software as a Service (SaaS): This is where users simply use a web browser to access
software created and provided by others as a web service. At the SaaS level, users have no
power over, or access to, the underlying infrastructure that is used to host the applications.
Salesforce's Customer Relationship Management tools and Google Docs are common examples
of the cloud computing SaaS model.

B) Platform as a Service (PaaS): This is where a collection of programming languages and
software provided by a PaaS provider is used to build applications. PaaS offers a high degree
of abstraction that allows users to concentrate on designing their applications rather than
thinking about the technology underlying them.

C) Infrastructure as a Service (IaaS): This is where users receive computing resources, such
as processing power, memory and storage, from an IaaS provider and use the resources to
install and run their applications. In contrast to the PaaS model, the IaaS model is a low level
of abstraction that gives users access to the underlying infrastructure through virtual
machines. IaaS provides users with greater versatility than PaaS because it enables the user
to install any software stack on top of the operating system. Flexibility, however, comes at a
cost, and at the IaaS level users are responsible for upgrading and patching the operating
system. EC2 and S3 from Amazon Web Services are common examples of IaaS. Erdogmus
described Software as a Service as the central principle behind cloud computing, indicating
that "it's all software in the end"; it does not matter whether the software being distributed is
infrastructure, network or application. Although this is valid to some degree, the types of
service on offer have different abstraction levels, so it is nevertheless helpful to differentiate
between them. The service models mentioned in the NIST description are deployed in clouds,
but depending on who owns and uses them, different types of clouds exist. In the NIST
concept this is referred to as a cloud deployment model, and the four common models are:

Private Cloud: a cloud primarily used by one company. The company itself or a third party
can run the cloud. Examples of organisations providing private clouds are the St Andrews
Cloud Computing Co-laboratory and Concur Technologies.

Public Cloud: A cloud which the general public can use (for a fee). Public clouds need
considerable investment and are traditionally owned by major companies such as Microsoft,
Google or Amazon.

Community Cloud: a cloud owned by several entities and typically designed to suit their
particular requirements. The Open Cirrus cloud testbed could be regarded as a community
cloud that aims to support cloud computing research.

Hybrid Cloud: a cloud that is set up using a combination of the three deployment models
described above. Each cloud in a hybrid cloud could be operated separately, but it would be
possible to transfer software and data through the hybrid cloud. Hybrid clouds allow cloud
bursting to take place, which is where, when more resources are needed, a private cloud will
burst out into a public cloud.

CLOUD COMPUTING ENTITIES
The two key entities in the business sector are cloud providers and customers, but two
further emerging service-level players in the cloud world are service brokers and resellers.
These are discussed below.

Cloud Providers: Includes Internet service providers, telecommunications firms and major
business-process outsourcers that supply either the media (Internet connections) or the
infrastructure (hosted data centres) that enables customers to access cloud services. Cloud
providers can also include systems integrators that build and maintain private cloud hosting
data centres and provide various services (e.g. SaaS, PaaS, IaaS) to clients, service brokers
or resellers.

Cloud Service Brokers: Involves technology consultants, business professional service
organisations, licensed brokers and agents, and influencers who guide customers in the
choice of cloud computing solutions. Service brokers concentrate on negotiating the
relationships between consumers and providers without owning or managing the whole cloud
infrastructure. In addition, they add extra services on top of a cloud provider's infrastructure
to make up the customer's cloud ecosystem.

Cloud Resellers: Resellers may become an important force in the cloud industry as cloud
providers grow their business across continents. Cloud providers may choose local IT
consultancies or resellers of their established products to act as "resellers" for their cloud-
based products in a particular region.

Cloud Consumers: End users belong to the cloud consumers group. However, cloud service
brokers and resellers may also belong to this group whenever they are customers of another
cloud provider, broker or reseller. The key benefits of cloud computing, and the potential
threats and risks it faces, are described in the next section.

CLOUD COMPUTING SECURITY ARCHITECTURE


Since the computers used to deliver services do not belong to the users themselves,
security in cloud computing is a particularly troubling problem. Users have no control over
what could happen to their data, nor any knowledge of it. This is a great concern in situations
where customers have sensitive and personal information stored in a cloud computing
service. Users will not risk their privacy, so providers of cloud storage services must ensure
that customer information is protected. This, though, is becoming increasingly difficult
because, as security advances are made, there always seems to be someone who finds a way
to disable the protection and take advantage of user information. SLA Tracking, Metering,
Billing, Resource Provisioning, Scheduler & Dispatcher, Load Balancer, Advance Resource
Reservation Monitor, and Policy Management are some of the essential components of the
service provider layer. Some of the security concerns related to the Service Provider Layer
are Identity, Infrastructure, Privacy, Data Transmission, People and Identity, Audit and
Compliance, Cloud Integrity and Binding Issues.

The main components of the Virtual Machine Layer create and control the virtual
machines and their operating systems. Some of the security concerns at this layer are VM
Sprawl, VM Escape, Infrastructure, Customer Isolation, Cloud Legal and Regulatory Issues,
and Identity and Access Management. The important components of the Data Center
(Infrastructure) Layer include the servers, CPUs, memory and storage, and this layer is now
commonly referred to as Infrastructure-as-a-Service (IaaS). Securing data at rest and the
physical protection of networks and servers are some of the Data Center Layer security
concerns. To understand the risks associated with services in a business context, risk
assessment controls are critical. The National Institute of Standards and Technology (NIST),
USA (http://www.nist.gov/) has undertaken activities to support cloud computing standards.

Several standards groups and industry consortia are creating specifications and test
beds in order to overcome these challenges and to facilitate cloud computing. The Cloud
Security Alliance (CSA), Internet Engineering Task Force (IETF) and Storage Networking
Industry Association (SNIA) are some of the current standards and test bed organisations. A
cloud API, on the other hand, offers either a functional interface or a management interface
(or both). There are several facets of cloud management that can be standardised for
interoperability.

Federated security (e.g. identity) across clouds, metadata and data exchanges
between clouds, structured tracking, auditing, billing, reporting and notification outputs for
cloud applications and services, and cloud-independent representations for policies and
governance are some potential standards. Figure 2 shows a high-level view of the security
architecture of cloud computing.

KEY SECURITY ISSUES IN CLOUD COMPUTING


Cloud computing is made up of software, system and network components. Each component
conducts different activities and provides various products to companies and individuals
around the world. Software as a Service (SaaS), Utility Computing, Online Applications,
Platform as a Service (PaaS), Managed Service Providers (MSP), Service Commerce and
Internet Convergence are all included in the business application. Because it encompasses
several technologies, cloud computing inherits security concerns from each of them, including
networks, databases, operating systems, virtualization, resource scheduling, transaction
management, load balancing, concurrency control and memory management. The security
problems of all these systems and technologies therefore also apply to cloud computing. For
instance, the network that links the systems in a cloud must be secure, and the mapping of
virtual machines to physical machines must be carried out securely. Data security involves
encrypting the data as well as ensuring that appropriate policies are enforced for data
sharing. Given below are the various security concerns in a cloud computing environment.

 Access to server and application

 Data Transmission

 Virtual Machine Security

 Network Security

 Data Security

 Data Privacy

 Data Integrity

 Data Location

 Data Availability

 Data Segregation

 Security Policy and Compliance

 Patch management

RESEARCH CHALLENGES IN CLOUD COMPUTING


Cloud computing research addresses the challenges of meeting the requirements of the
next decade's private, public and hybrid cloud computing architectures, as well as the
challenges of allowing applications and development platforms to take advantage of the
benefits of cloud computing. Research in cloud computing is still at an early stage: many
existing problems have not been fully addressed, while new issues keep emerging from
industry applications. Some of the challenging cloud computing research problems are
given below.

 Service Level Agreements (SLAs)

 Cloud Data Management & Security

 Data Encryption

 Migration of Virtual Machines

 Interoperability

 Access Controls

 Multitenancy

 Server Consolidation

 Reliability & Availability of Service

 Common Cloud Standards

 Platform Management

Service Level Agreements (SLAs): The cloud is operated through service level agreements
that allow several instances of one application to be replicated on multiple servers if the need
arises; depending on a priority scheme, the cloud may scale down or shut down a lower-level
application. A major challenge for cloud customers is the evaluation of cloud vendors' SLAs.
Most vendors craft SLAs to create a protective shield against legal action while offering
customers only limited guarantees. Thus, there are some important concerns, such as data
protection, outages and price structures, that consumers need to weigh up before signing a
contract with a provider. If these issues are resolved at the right time, the specification of
SLAs will better represent the clients' needs.
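
To see why outage clauses matter in practice, the short sketch below (in Python, purely illustrative and not tied to any particular vendor's SLA) converts an advertised availability percentage into the downtime a customer could still experience without the provider breaching the agreement.

    # Illustrative only: convert an SLA availability percentage into the
    # downtime per period that is still within the agreement.
    def allowed_downtime_hours(availability_pct: float, period_hours: float) -> float:
        return period_hours * (1.0 - availability_pct / 100.0)

    for pct in (99.0, 99.9, 99.99):
        hours = allowed_downtime_hours(pct, 365 * 24)
        print(f"{pct}% availability still permits {hours:.2f} hours of downtime per year")
    # 99.0% -> 87.60 h, 99.9% -> 8.76 h, 99.99% -> 0.88 h per year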

Cloud Data Management: Cloud data management is an important research topic in cloud
computing. Cloud data can be very large (e.g. text-based or science applications),
unstructured or semi-structured, and is usually append-only, with rare updates. As service
providers usually do not have access to the physical security system of the data centre, they
must rely on the infrastructure provider to achieve full data security. Even for a virtual
private cloud, the service provider can only specify the security settings remotely, without
knowing whether they are fully enforced. In this setting, the infrastructure provider has to
achieve objectives such as confidentiality and auditability. Cloud data is typically stored in
purpose-built distributed file systems that differ from conventional distributed file systems in
their storage structure, access pattern and application programming interface. In particular,
they do not implement the standard POSIX interface, and therefore introduce compatibility
problems with legacy file systems and applications. This topic has been explored in many
research efforts.
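
To make the interface difference concrete, the sketch below contrasts POSIX-style file access with object-store access through boto3, the AWS SDK for Python; the bucket, key and local path are hypothetical.

    # POSIX-style access: byte-addressable files with open/read/seek semantics.
    with open("/data/records.csv", "rb") as f:   # hypothetical local path
        head = f.read(1024)                      # read the first kilobyte

    # Object-store access: whole objects addressed by bucket/key over HTTP,
    # with no seek or in-place update; typical of cloud storage interfaces.
    import boto3                                 # AWS SDK for Python
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket="example-bucket", Key="records.csv")  # hypothetical names
    body = obj["Body"].read()                    # the whole object is fetched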

Data Encryption: Encryption is a crucial data protection technique, and it is important to
understand both encryption of data in motion and encryption of data at rest. Note that
security can range from simple (easy to handle, low cost and, quite frankly, not very safe) to
highly secure (very difficult, costly to manage, and access-limiting). Until an object enters the
cloud it can be decrypted and processed, but is there an option to encrypt it before it is
saved? Before you upload a file to the cloud, do you want to handle encryption yourself, or do
you prefer the cloud computing service to do it for you automatically? There are ways to
understand the cloud computing solution on offer and make your choices based on the
desired security levels.
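
One concrete answer to these questions is to encrypt on the client before the data ever reaches the provider. Below is a minimal sketch using the Fernet recipe from the Python cryptography package; key management is deliberately simplified here, and in practice the key must be generated and stored outside the cloud.

    from cryptography.fernet import Fernet

    # Generate and keep the key on-premises; whoever holds it can decrypt.
    key = Fernet.generate_key()
    f = Fernet(key)

    # Encrypt before upload, so the provider only ever sees ciphertext.
    ciphertext = f.encrypt(b"confidential record ...")
    # ... upload `ciphertext` to the cloud store of your choice ...

    # Decrypt after download, back on the client side.
    plaintext = f.decrypt(ciphertext)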

Virtual Machine Migration: With virtualization, applications are not tied to particular
hardware; multiple programmes can run on one machine, or several machines can run one
programme. By allowing virtual machines to be migrated to balance load across the data
centre, virtualization can provide major advantages in cloud computing. Furthermore, virtual
machine migration in data centres enables robust and highly responsive provisioning. Virtual
machine migration has evolved from process migration techniques. More recently, Xen and
VMWare introduced "live" migration of VMs, with extremely short downtimes ranging from
tens of milliseconds to a second. The biggest advantage of VM migration is avoiding hotspots,
but this is not straightforward. At present, the detection of workload hotspots and the
initiation of a migration lack the agility to respond to sudden changes in workload. Moreover,
the in-memory state should be transferred consistently and efficiently, with integrated
consideration of the resources of both applications and physical servers.
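
The "live" migration mentioned above is typically realised as iterative pre-copy: memory pages are copied while the VM keeps running, pages dirtied in the meantime are re-sent, and only a small residue is transferred during the brief stop-and-copy phase. The toy sketch below shows the control loop only; the four callbacks stand in for hypervisor facilities and are hypothetical.

    # Toy sketch of pre-copy live migration; not a real hypervisor API.
    def precopy_migrate(get_dirty_pages, send_pages, stop_vm, resume_on_target,
                        residue_threshold=64, max_rounds=30):
        """Copy memory iteratively, stopping the VM only for the final residue."""
        for _ in range(max_rounds):
            dirty = get_dirty_pages()        # pages modified since the last round
            if len(dirty) <= residue_threshold:
                break                        # residue small enough to stop-and-copy
            send_pages(dirty)                # copy while the VM keeps running
        stop_vm()                            # downtime begins here ...
        send_pages(get_dirty_pages())        # ... final, small transfer ...
        resume_on_target()                   # ... and ends here (tens of ms in practice)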

Interoperability: This is the ability of two or more systems to work together, exchanging
information and using the information exchanged. Many public cloud networks are configured
as closed systems and are not designed to interact with one another. This lack of integration
makes it hard for organisations to combine their cloud IT systems and realise cost savings
and productivity gains. To address this challenge, industry standards must be developed that
help cloud service providers build interoperable platforms and enable data portability.
Organisations need to provision services automatically, manage VM instances, and work with
both cloud-based and enterprise-based applications using a single tool set that functions
across existing programmes and multiple cloud providers. In this situation there is a need for
cloud interoperability.
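
Libraries such as Apache Libcloud illustrate the "single tool set" idea: one API fronting many providers. A minimal sketch follows; the credentials are placeholders, and some drivers take extra arguments (region, project) not shown here.

    # Sketch using Apache Libcloud: the same calls work across providers.
    from libcloud.compute.types import Provider
    from libcloud.compute.providers import get_driver

    EC2Driver = get_driver(Provider.EC2)               # swap Provider.* to change clouds
    driver = EC2Driver("ACCESS_KEY_ID", "SECRET_KEY")  # placeholder credentials

    for node in driver.list_nodes():    # the same method exists for every provider
        print(node.name, node.state)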

Access Controls: Authentication and identity management are more critical than ever, yet
they are not fundamentally different in the cloud. What standard of password strength and
change frequency does the service provider invoke? What is the procedure for recovering
passwords and account names? How are passwords delivered to users after a change is
made? What about logs and the right to audit them? None of this differs greatly from how
you secure your internal systems and data: if you use strong passwords, updated regularly,
together with typical IT protection processes, you can protect that aspect of access in the
same way.
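
These questions translate naturally into an explicit, auditable policy. A minimal sketch of the kind of strength-and-rotation check an in-house tool might apply follows; the thresholds are illustrative assumptions, not a recommendation.

    import re
    from datetime import datetime, timedelta

    MIN_LENGTH = 12                 # illustrative thresholds, not a standard
    MAX_AGE = timedelta(days=90)

    def password_compliant(password: str, last_changed: datetime) -> bool:
        """Check length, character variety and rotation age against the policy."""
        classes = [r"[a-z]", r"[A-Z]", r"[0-9]", r"[^A-Za-z0-9]"]
        strong = (len(password) >= MIN_LENGTH
                  and all(re.search(c, password) for c in classes))
        fresh = datetime.now() - last_changed <= MAX_AGE
        return strong and fresh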

Multi-tenancy: Many types of cloud applications can be accessed by users via the Internet,
from small Internet-based widgets to large enterprise software applications whose security
requirements depend on the type of data stored on the software provider's infrastructure.
Such applications require multi-tenancy for several reasons, cost being the most significant.
Many customers accessing the same hardware, application servers and databases may affect
response times and performance for other customers. In application-layer multi-tenancy in
particular, resources are shared at each layer of the infrastructure, which raises legitimate
security and performance concerns: for example, many simultaneous service requests
accessing the same resources increase wait times, though not necessarily processor time,
and when the number of connections to an HTTP server is exhausted, the service must wait
until a connection becomes available or, in the worst case, is dropped.

Server Consolidation: The increased utilisation of resources and the reduction in power and
cooling requirements achieved by server consolidation are now being extended into the cloud.
In a cloud computing system, server consolidation is an effective approach to maximising
resource utilisation while minimising energy consumption. Live VM migration technology is
often used to merge VMs residing on multiple under-utilised servers onto a single server, so
that the remaining servers can be set to an energy-saving state. The problem of optimally
consolidating servers in a data centre is often formulated as a variant of the vector
bin-packing problem, which is an NP-hard optimisation problem. Various heuristics have been
proposed for this problem.
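
To give a flavour of those heuristics, the sketch below applies first-fit decreasing, a classic bin-packing heuristic, to a one-dimensional version of the problem; real consolidation is multi-dimensional (CPU, memory, I/O), and the capacities and demands here are made up.

    # First-fit decreasing for one-dimensional server consolidation (illustrative).
    def consolidate(vm_demands, server_capacity):
        """Place each VM on the first server with room, largest demands first."""
        free = []                                 # remaining capacity per server
        placement = {}
        for vm, demand in sorted(vm_demands.items(), key=lambda kv: -kv[1]):
            for i, room in enumerate(free):
                if demand <= room:                # first server that still fits
                    free[i] -= demand
                    placement[vm] = i
                    break
            else:                                 # no fit: power on another server
                free.append(server_capacity - demand)
                placement[vm] = len(free) - 1
        return placement, len(free)

    placement, used = consolidate({"vm1": 0.6, "vm2": 0.5, "vm3": 0.3, "vm4": 0.4}, 1.0)
    print(used, placement)                        # 2 servers suffice for this toy input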

Reliability & Availability of Service: The problem of reliability comes into the picture when a
cloud provider delivers on-demand software as a service. The application needs a consistent
quality factor so that users can access it under any network conditions (such as slow network
connections). A few failures of on-demand applications have been reported. One example is
Apple's MobileMe cloud service, which stores and synchronises data across multiple devices;
it got off to an embarrassing start when many users were unable to access mail and
synchronise their data correctly. To prevent such problems, providers are turning to
technologies such as Google Gears, Adobe AIR and Curl, which enable cloud-based
applications to run locally, some even in the absence of a network connection. These tools
give web applications access to the desktop's storage and processing resources, forming a
bridge between the cloud and the user's own computer. Considering applications such as 3D
games and video-conferencing systems, reliability remains a challenge to achieve for an IT
solution based on cloud computing.

Common Cloud Standards: Security-related accreditation for cloud computing will cover three
main areas: technology, personnel and operations. On the technology side, requirements are
likely to be guided by organisations such as the Jericho Forum before being ratified by
established bodies such as the ISO (International Organization for Standardization). On the
personnel side, the Institute for Information Security Professionals (IISP) provides formal
accreditation for security professionals. For the operational components there are some
workable solutions, such as modifying ISO 27001 and using it as the default measurement
standard within the SAS 70 framework. One of the key issues at present is that many
scattered activities are moving in the direction of cloud accreditation, but a common body to
organise those activities is lacking. Creating a single accreditation body to certify cloud
services will also be a major challenge.

Platform Management: This concerns the difficulty of delivering middleware capabilities for
developing, deploying, integrating and managing applications in a multi-tenant, elastic and
scalable environment. One of the most significant components of cloud systems provides
developers with various kinds of platforms on which to write applications that run in the
cloud, use cloud services, or both. Different names are used for this type of platform today,
including on-demand platform and platform as a service (PaaS). This modern way of
supporting applications has tremendous potential. When a development team builds an
on-premises application (i.e. one that will operate within an organisation), most of what the
application requires already exists: the operating system provides fundamental support for
programme execution, database interaction and more, while other computers in the
environment provide services such as remote storage.

Cloud Computing: Opportunities and Challenges
Opportunities in Cloud
1. Cloud Opportunities in Education

Education plays an important role in maintaining the economic growth of a country.


Nowadays classroom teaching is changing and students are becoming more technology
oriented. In this changing environment, it is therefore important to incorporate the latest
technologies into the teaching and learning process. The cloud gives students, teachers,
faculty, parents and staff on-demand access to critical information using any device from
anywhere. Both public and private institutions can use the cloud to deliver better services,
even as they work with fewer resources. Cloud computing technology can provide solutions
for the problems of a smart education system, enabling users to control and access data via
the Internet. The primary users of a typical higher education cloud include students, faculty,
administrative staff, the examination branch and the admission branch; all the primary users
of the institution are connected to the cloud, and a separate login is provided to each class
of user for their respective work. Teachers can upload their class tutorials, assignments and
tests to the cloud server, and students can access all the teaching materials provided by
their teachers via the Internet, using computers and other electronic devices, both at home
and at college, 24/7. The education system will make it possible for teachers to identify
problem areas in which students tend to make mistakes by analysing students' study records;
in doing so, it will also allow teachers to improve teaching materials and methods. It will
enable students to use online teaching materials not only during class hours but also at
home. Utilisation of cloud computing systems will reduce the cost of operation because
servers and learning materials are shared with other colleges.

Challenges
Security and Privacy:

This is a major concern for many institutions of higher learning considering the adoption of
cloud computing. Cloud computing introduces a third party, the platform provider, and hence
the privacy and security of data become harder to maintain.

Benefits

Most institutions of higher education are not yet convinced of the benefits that come with
cloud computing. Such institutions are more concerned with their conventional IT portfolio
and how to make cloud computing part of it. Yet students can learn at any time they wish,
and if a class is missed they can go through it again, since it is stored in the cloud.

Service Quality

This is one of the reasons cited by learning institutions for not shifting to cloud computing.
Institutions argue that the SLAs stipulated by the providers of cloud services are insufficient
when it comes to availability and security as well as scalability.

Lack of adequate network responsiveness

If the network bandwidth is inadequate, it becomes impossible to deliver complex services.
Most learning institutions lack adequate bandwidth and hence cannot adopt cloud computing
effectively.

Integration

Different applications require complex integration in order to connect to the available
on-premises applications as well as to cloud applications. This calls for the integration of
existing university data structures and systems with cloud applications; thus, there is a need
for a quick, cost-effective and simple way to connect university systems with cloud
applications.

2. Opportunities for Entrepreneurs

Several corporations are hoisting their computer networks into the "clouds". Cloud computing
is an emerging IT development, deployment and delivery model that enables real-time
delivery of products, services and solutions over the Internet. With cloud computing and the
associated cloud services coming in a myriad of forms, such as software-as-a-service,
storage on demand, and internal and external clouds, large corporations across multiple
industries are now discovering their ability to use cloud services to achieve cost savings,
expand their businesses, and even decrease their carbon footprints.

Challenges
Cloud computing, which some regard as a new technology, has helped many organizations in
doing business. Although cloud computing brings the benefits described above, there are
some shortcomings that decision makers need to take into consideration. When cloud
capacity is more than 80% occupied, the computers can become unresponsive, and there is a
risk of crashes between servers and computers, which can lead to the loss of valuable data
such as customer records and sales reports. Cloud attacks are also a major issue in cloud
computing: the cloud is a place for users to host their web services, such as web hosting and
cloud storage, and this has attracted hackers seeking to steal business data such as daily
sales, profit reports and financial reports.

3. Opportunities for Health Care

Health care, as with any other service operation, requires continuous and systematic
innovation in order to remain cost effective, efficient and timely, and to provide high-quality
services. Many managers and experts believe that cloud computing can improve health care
by reducing electronic health record startup expenses, such as hardware, software,
networking, personnel, and licensing fees, and therefore will encourage its adoption. One
example of a cloud-based healthcare service is a proposed system that automates the process
of collecting patients’ vital data via a network of sensors connected to legacy medical devices,
and of delivering the data to a medical center’s “cloud” for storage, processing and distribution.
However, there are many challenges facing health-care providers in moving all their data to
the cloud.

A typical smart health care system is depicted in Fig. 3. In this system, patients register
their details online and their information is stored in the cloud. After selecting a trusted
doctor, the patient can proceed to the diagnosis phase. Doctors can analyse a patient's
history by retrieving previous information about the patient from the cloud and comparing
that history with the present condition. The doctor then issues a prescription online and the
treatment begins. Patients and doctors can also interact online for reviews, and health care
analysis of the patients can be carried out via the cloud.

Challenges

The biggest issue in the health care industry is the security and privacy of information. For
example, if medical data is stored in the cloud, health-care services no longer have complete
control over the security of their patients' information. There are risk factors related to
privacy that increase the possibility of data being exposed or lost. Additionally, regulations
regarding patient information can vary from region to region, making compliance with these
various regulations potentially complicated. And if online server outages occur, availability of
the data is lost during that time.

Related work:

Several studies have examined security issues in cloud computing from different points of
view. Jarabek presented an overview of the benefits and drawbacks of virtualization in a
cloud context; he also examined side-channel information leaks, which are particularly critical
in a virtualized cloud environment, and the issues of security auditing and cloud
management. Ian Foster compared cloud computing with grid computing from various
perspectives. Cong [8] proposed a scheme that integrates storage correctness insurance with
data error localization; the proposed scheme is highly efficient and resilient against Byzantine
failure, malicious data modification attacks, and even server colluding attacks. Rohit
discussed various security concerns for the cloud computing environment from the network,
application and data storage perspectives and suggested some solutions. This paper presents
a survey of cloud computing attacks at different levels, namely the Cloud Service Provider
(CSP) level, the network level and, finally, the end-user level. Mitigation of these security
attacks is also discussed in this paper.

CLOUD SECURITY ATTACKS
Cloud computing involves three parties: the cloud customer or user, the Cloud Service
Provider (CSP), and the cloud network (usually the Internet, which can be considered the
transmission medium of the cloud).

There are many security threats at different levels: threats at the CSP level, at the network
level and at the user/host level. These threats must be dealt with, since it is necessary to
keep the cloud up and running continuously. In this section we study the different types of
attack at each level and the ways to reduce the damage they cause.

1. Cloud Service Provider (CSP) level attacks

The shared nature of the cloud and the increasing demand on its shared resources make
cloud computing an attractive target for attackers. End users should take the vulnerabilities
of cloud computing into consideration before migrating to it. Examples of shared resources
are computing capacity, storage and network [3]; this shared nature exposes the cloud to the
security breaches listed below:

(i) Guest-hopping attack:

This is defined as any separation failure between shared infrastructures, whereby an attacker
tries to gain access to one virtual machine by penetrating another virtual machine hosted on
the same hardware. One possible mitigation of guest-hopping attacks is to use forensics and
VM debugging tools to observe any attempt to compromise a VM.

Another possible mitigation is to use a High Assurance Platform (HAP), which provides a high
degree of isolation between virtual machines.

(ii) SQL injection:

This is often used to attack websites. It is accomplished by injecting SQL commands into the
database of a web application in order to dump or crash that database. To mitigate SQL
injection attacks, it is necessary to remove all stored procedures that are rarely used and to
assign the least possible privileges to users who have permission to access the database.
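
Beyond trimming privileges, the standard defence is never to build SQL statements by string concatenation. A minimal sketch using Python's built-in sqlite3 module follows; the table schema and the injection payload are hypothetical.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, role TEXT)")  # hypothetical schema
    user_input = "alice' OR '1'='1"                            # classic injection payload

    # Vulnerable pattern: attacker-controlled text becomes part of the statement.
    #   query = "SELECT * FROM users WHERE name = '" + user_input + "'"

    # Safe pattern: the driver binds the value, so it can never alter the query.
    rows = conn.execute("SELECT * FROM users WHERE name = ?",
                        (user_input,)).fetchall()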

(iii) Side channel attack:

This occurs when the attacker places a malicious virtual machine on the same physical
machine as the victim machine; in that way the attacker can access all the confidential
information on the victim machine. As a countermeasure, it may be preferable to ensure that
no legitimate user's VMs reside on the same hardware as other users' VMs; this removes the
co-residency on which side-channel attacks in a virtualized cloud environment depend.

(iv) Malicious Insider:

One of the cloud computing challenges located at the data centres of service providers arises
when an employee, such as an administrator, is granted access to sensitive data of some or
all customers. Such system privileges can expose this information to security threats. Strict
privilege planning and security auditing can minimise this threat.

(v) Data storage security:

In cloud computing, users' data is stored on the Cloud Service Provider's (CSP's) set of
servers, which run in a simultaneous and distributed manner. Ensuring data integrity and
confidentiality is vital. According to [8], [9], [16], there are several means of ensuring the
integrity and confidentiality of the data stored at the CSP, listed below (a sketch of
hash-based integrity checking follows the list).

1. Ensure limited access to the users' data by CSP employees.

2. Use strong authentication mechanisms to ensure that only legitimate employees gain
access to and control of CSP servers.

3. The CSP should use well-defined data backup and redundant data storage to make data
recovery possible.
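
The sketch below illustrates the integrity side of point 3 with a plain SHA-256 digest, using Python's standard hashlib; real deployments typically prefer keyed MACs or provable-data-possession protocols over a bare hash.

    import hashlib

    def sha256_of(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    # Before upload: record the digest and keep it outside the cloud.
    original = b"quarterly sales report ..."
    expected = sha256_of(original)

    # After download: recompute and compare to detect tampering or corruption.
    retrieved = original          # stand-in for the bytes fetched back from the CSP
    assert sha256_of(retrieved) == expected, "integrity check failed"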

(vi) Address Resolution Protocol (ARP) Cache Poisoning:

The Address Resolution Protocol (ARP) is used in the TCP/IP stack to resolve a (logical) IP
address into the (physical) MAC address of the receiving interface. The ARP cache stores a
table that maps the IP addresses of the networked devices to their corresponding MAC
addresses. An attacker can exploit weaknesses in the ARP protocol to map an IP address on
the network to a malicious MAC address, and then update the ARP cache with this malicious
entry. To mitigate this attack it is possible to use static ARP entries; this technique can work
for small networks such as private clouds, but on large-scale clouds it is better to use other
techniques, such as port security features that lock a specific port on the switch (or network
device) to a specific MAC address.
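
A small monitoring sketch in the same spirit, using the third-party scapy library to flag ARP replies that contradict a known IP-to-MAC table; the table contents are placeholders, and on a real network the bindings would come from your inventory.

    from scapy.all import ARP, sniff     # third-party package: scapy (needs root)

    # Placeholder inventory of known-good bindings for this network segment.
    KNOWN = {"10.0.0.1": "aa:bb:cc:dd:ee:01",
             "10.0.0.2": "aa:bb:cc:dd:ee:02"}

    def check(pkt):
        if pkt.haslayer(ARP) and pkt[ARP].op == 2:       # op 2 = ARP reply ("is-at")
            ip, mac = pkt[ARP].psrc, pkt[ARP].hwsrc.lower()
            if ip in KNOWN and KNOWN[ip] != mac:
                print(f"possible ARP poisoning: {ip} claimed by {mac}")

    sniff(filter="arp", prn=check, store=0)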

It should be noted that the CSP must deploy the latest network security enhancements, such
as firewalls, intrusion detection/prevention systems, and centralised antivirus and
anti-malware services that run multiple antivirus engines simultaneously to ensure the best
virus and malware protection. Another important measure is to use IPsec/VPN techniques
between the CSP and cloud users whenever possible. In addition, strong physical security at
the CSP data centre(s) is vital; physical data security includes access control for authorised
personnel only, special fire-suppression systems, and well-defined data storage and backup
strategies.

2. Network Level Security attacks

Cloud computing depends mainly on existing network infrastructure such as LANs, MANs and
WANs, which exposes it to the same security attacks as those networks. These attacks may
originate from users outside the cloud (a user intent on attacking the cloud for any purpose)
or from a malicious intermediary sitting between the user and the CSP and trying to intercept
the data travelling to or from the cloud. In this section we focus on network-level security
attacks and their possible countermeasures to ensure proper data confidentiality and
integrity.
(i) Domain Name System (DNS) attacks

On the Internet, hosts are identified by names that are easy for humans to remember, while
computers deal with numbers; each computer connected to the Internet has a globally unique
Internet Protocol (IP) address. The Domain Name System (DNS) converts host names into the
corresponding IP addresses using a distributed database scheme. Internet DNS servers are
subject to different types of attack, such as ARP cache poisoning (explained above), domain
hijacking, and man-in-the-middle attacks. These attacks are discussed below.

(ii) Domain hijacking

Domain hijacking is defined as changing the registration of a domain name without the
knowledge or permission of the domain's owner or creator. Domain hijacking enables
intruders to access sensitive corporate information and to perform illegal activities such as
phishing, where a website is replaced by an identical website that records private
information. One possible way to make domain hijacking very difficult, proposed by the
Internet Corporation for Assigned Names and Numbers (ICANN), is to enforce a 60-day
waiting period between a change in registration information and a transfer to another
registrar, since the domain creator is likely to discover the change within that period.
Another solution is the Extensible Provisioning Protocol (EPP), used by many domain
registries, which relies on an authorization code issued exclusively to the domain registrant
as a security measure to prevent unauthorized name changes.

(iii) IP Spoofing

IP spoofing is where an attacker gains unauthorized access to a computer by pretending that
the traffic has originated from a legitimate computer. IP spoofing is used to mount other
attacks, such as the Denial of Service attack and the Man-in-the-Middle attack:

Denial of service attacks (DoS):

The purpose of these attacks is to make the target network or computer resources
unavailable. In a DoS attack, the attacker floods the victim host with a huge number of
packets in a short period of time; DoS is concerned only with consuming the bandwidth and
resources of the target network or computer. The attacker uses a spoofed IP address as the
source address to make tracking and stopping the DoS attack very difficult. Furthermore, it is
possible for the attacker to use multiple compromised machines that he has already hijacked
to attack the victim machine at the same time (an attack known as Distributed DoS), which is
very difficult to track and stop.

TCP SYN flooding:

This is an example of a DoS attack in which the attacker floods the victim machine with a
stream of spoofed TCP SYN packets. The attack exploits the limitations of the TCP three-way
handshake in maintaining half-open connections.

Man-in-the-Middle Attack (MITM): An attacker gains access to the network traffic using a
network packet sniffer or flaws in routing and transport protocols; such attacks can be used
to steal confidential information. IP spoofing can be reduced through packet filtering by
firewalls, strong encryption, and origin authentication techniques.

End-user attacks: Most attacks on cloud users, such as phishing, fraud, and exploitation of
software vulnerabilities, still work and can threaten the cloud service infrastructure.

Phishing and fraud: These are attempts to steal the identity of a legitimate user, including
usernames, passwords and credit card details. Phishing is typically carried out by sending the
user an email containing a link to a fraudulent website that looks like a legitimate one; when
the user visits the fake website, his username and password are sent to the attacker, who
can use them to attack the cloud. Another form of phishing and fraud is to send the user an
email that pretends to come from the cloud service provider, asking the user to supply his
username and password, for maintenance purposes for example; in reality the spoofed email
comes from an attacker seeking the user's credentials in order to attack the cloud.
Countermeasures against phishing include spam filters, plug-in spam blockers in web
browsers and, finally, training users not to respond to spoofed emails and not to give their
credentials to any website.

Exploitation of software vulnerabilities: This refers to any security flaw or weakness in an
operating system or a piece of software that leads to a security breach, which an attacker
can then use to implant malware for his own purposes. Common examples are buffer
overflows, where the operating system or software hangs, and uncontrolled format strings,
which can be used to crash a program or to execute malicious code. Software vendors
regularly release security updates to address these flaws; keeping systems up to date with
the latest security updates can mitigate these attacks.

On The Privacy Of Cloud Computing


Cloud computing is a model for providing on-demand access to computing service via the
Internet. In this instance, the Internet is the transport mechanism between a client and a
server located somewhere in cyberspace, as compared to having computer applications
residing on an “on premises” computer. Adoption of cloud computing practically eliminates
two ongoing problems in IT service provisioning: the upfront costs of acquiring computational
resources and the time delay of building and deploying software applications. The technology
is not without a downside, which in this case is the privacy of business and personal
information. This paper provides a conspectus of the major issues in cloud computing privacy
and should be regarded as an introductory paper on this important topic.

It seems as though most computer users would like privacy and information security
while having convenient access to interlinked computing services both on-premises and in the
cloud. In this instance, the cloud is a metaphor for the Internet, which can be used as the
delivery vehicle for computing services and the storage of information. Advocates of cloud
computing are faced with two major problems, that is, in addition to the usual problem of
transferring one’s resources from one operational environment to another. The first of the
major problems is the ongoing feeling that we are experiencing the “déjà vu all over again”
syndrome. Many of us have gone through an avalanche of new technological advances
intended as solutions to our administrative and operational problems – at least, the ones
involving management and information systems. Some of the technical innovations we have
experienced include scalable main-frame computers, advanced operating systems, time
sharing, client/server, online systems, mini computers, personal computers, artificial
intelligence, hand-held computers, the Internet and the World Wide Web, mobile computers,
social networking, and by the time this paper is published, there will no doubt be several more
entries to add to the list. So one has reason to be skeptical of someone writing that cloud
computing is worthy of serious attention. Of course, we think it is, for obvious reasons.

The second major issue is privacy, and it stems from the fact that with cloud
computing, data and programs are stored off-premises and managed by a service provider.
When a third party gets a hold of your data, who knows what is going to happen to it. Many
proponents of cloud computing conveniently characterize it as analogous to the electric
utility. The basic idea is that the private generators of the twentieth century were replaced by
the electricity grids of today without undue concern. It is easy to imagine, however, that the
measurement of electricity usage would have been of concern to some people in the early
1900s. Although similar in some respects, cloud computing is different in one important way.
The cloud will typically handle information, which is the basic unit of exchange, about which
security and privacy are of paramount concern. With electricity, there is no interest in
individual electrons. With information, the key issues are identity, security, and privacy. The
side issues are one’s inherent identity attributes (such as age, gender, and race),
accountability (for online computing activities), and anonymity (in order to preserve free
speech and other forms of behavior for the parties involved). The main consideration may
turn out to be a matter of control, because from an organizational perspective, control over
information has historically been with the organization that creates or maintains it. From a
personal perspective, on the other hand, a person should have the wherewithal to control
their identity and the release of information about themselves, and in the latter case, a
precise determination of to whom it is released and for what reason. Who owns the data? Is it
the person about whom the data pertains? Is it the organization that prototypically manages
the data? Or, is it the cloud provider that physically stores the data somewhere out in
cyberspace? Consider your financial information. Is it your property or is it your bank’s
business property? We will try to provide a perspective on this important issue in the
following sections. Privacy issues are not fundamentally caused by cloud computing, but they
are exacerbated by employing the technology for economic benefit. To put it as diplomatically
as possible, if a business employs cloud computing to save money on its IT bill, should it be
allowed to do so at the “privacy” expense of its customers?

CLOUD COMPUTING CONCEPTS

Cloud computing is an architectural model for deploying and accessing computer facilities via
the Internet. A cloud service provider would supply ubiquitous access through a web browser

to software services executed in a cloud data center. The software would satisfy consumer
and business needs. Because software availability plays a major role in cloud computing, the
subject is often referred to as software-as-a-service (SaaS). Conceptually, there is nothing
particularly special about a cloud data center, because it is a conventional web site that
provides computing and storage facilities. The definitive aspect of a cloud data center is the
level of sophistication of hardware and software needed to scale up to service a large number
of customers. Cloud computing is a form of service provisioning where the service provider
supplies the network access, security, application software, processing capability, and data
storage from a data center and operates that center as a utility in order to supply ondemand
self service, broad network access, resource pooling, rapid application acquisition, and
measured service. The notion of measured service represents a “pay for what you use”
metered model applied to differing forms of customer service.

Cloud Service Characteristics

The operational environment for cloud computing supports three categories of informational
resources for achieving agility, availability, collaboration, and elasticity in the deployment and
use of cloud services that include software, information, and cloud infrastructure. The
software category includes system software, application software, infrastructure software,
and accessibility software. The information category refers to large collections of data and the
requisite database and management facilities needed for efficient and secure storage
utilization. The category of cloud infrastructure is comprised of computer resources, network
facilities, and the fabric for scalable consumer operations. We are going to adopt a description
of a cloud framework that necessarily includes three forms of description: terminology,
architectural requirements, and a reference model. The description generally adheres to the
National Institute of Standards and Technology (NIST) cloud-computing paradigm. (Mell
2009b, Brunette 2009) Agility generally refers to the ability to respond in a timely manner to
market and product changes through business alignment, which is achieved by decreasing the
lead time to deploy a new application and by reducing or eliminating the overhead of training,
hardware acquisition, and software acquisition. Thus, the IT department is able to respond
more quickly to business needs. Availability concerns two aspects of computer utilization: the
time that the facilities are available for use and the scope of the resources that are available.
Cloud computing facilitates collaboration through network access, provided that the software
tools for end user cooperation are available. Elasticity is the characteristic of cloud services
that permits computing and storage capability to be scaled up to meet demands on an on-
demand basis through resource pooling.

Based on this brief assessment, we can characterize cloud computing as possessing the following
characteristics: (Nelson 2009)

 On-demand self-service

 Broad network access

 Resource pooling

 Rapid elasticity

 Measured service

The benefit of lower costs and a less complex operating environment is particularly
attractive to small- to medium-sized enterprises, certain governmental agencies, research
organizations, and many countries.

Cloud Computing Utilization

There are four main actors – so to speak – in cloud computing: the cloud service provider, the
software service provider, the customer, and the user. Each of the actors represents centers
of computer-related activity that can overlap to some degree. The cloud service provider (CSP)
owns the infrastructure, hardware, software, and network facilities needed to supply cloud
computing services managed by a cloud operating system. The CSP performs a function
known as hosting that can be used to run computer programs, referred to as applications. This
facility, known in some circles as a cloud platform (CP), can be regarded as an application
service that runs in the cloud. More specifically, a cloud platform provides services to
applications in the same manner that “software as a service” programs provide services to
clients using the cloud as a transport medium. A cloud platform is as much about operating in
the cloud, as it is about developing applications for the cloud. A software service provider
develops applications that are used by customers to obtain computing services. The SSP can
be an independent software vendor (ISV) or an organization that develops a software package
that uses the CP as a delivery vehicle for computing and provides application services to
customers. ISV software can be used by many customers in the usual fashion for software
deployment. When it is shared during operation to achieve economies of scale, it is regarded
as a multi-tenant model, wherein each customer is one of the tenants. The customer (C) is
typically an enterprise comprising several employees who use the application and are
regarded as users. The user (U) is probably going to be a person who uses the cloud
computing service via a web browser in one of the following capacities: as an employee of an
organization that is contracted to use SaaS provided by an ISV, or that has acquired the
software independently to run in the cloud on a cloud platform; or as a user of third-party
SaaS developed by an ISV or the CSP. The four relevant scenarios are summarized by the
following schema:

CSP – CP – ISV – C – U

CSP – CP – ISV – U

CSP – CP – C – U

CSP – CP – U

For example, you will be using scenario CSP – CP – ISV – C – U if your company has acquired an
operational package from a software vendor and is hosting that software in the cloud.
Similarly, you will be using scenario CSP – CP – U if you are using an office package provided
by a CSP and accessed via your browser. This form of conceptualization is important from a

18
privacy point-of-view, because each exchange between modules represents a touch point for
privacy concerns.

Cloud Platform

A cloud platform provides the facility for an application developer to create applications that
run in the cloud or use cloud platform services that are available from the cloud. Chappell lists
three kinds of cloud services: SaaS user services, on-premises application development
services (attached services), and cloud application development services. (Chappell 2009) An
SaaS application runs entirely in the cloud and is accessible through the Internet from an on-
premises browser. Attached services provide functionality through the cloud to support
service-oriented architecture (SOA) type component development that runs on-premises.
Cloud application development services support the development of applications that
typically interact while running in the cloud and on-premises.

A cloud platform can be conceptualized as being composed of three complementary
groups of services: foundations, infrastructure services, and application services. The
foundation refers to the operating system, storage system, file system, and database system.
Infrastructure services include authorization/authentication/security facilities, integration
between infrastructure and application services, and online storage facilities. Application
services refer to ordinary business services that expose “functional” services as SOA
components. Cloud platforms are a lot like enterprise-level platforms, except that they are
designed to scale up to support Internet-level operations.

CLOUD ARCHITECTURE

Cloud architecture is a collection of three categories of information resources for the
deployment and use of cloud services that include software, information, and cloud
infrastructure. (Katzan 2009) The software category includes system software, application
software, infrastructure software, and accessibility software. The information category refers
to large collections of data and the requisite database and management facilities needed for
efficient and secure storage utilization. The category of cloud infrastructure includes compute
resources, network facilities, and the fabric for scalable consumer operations. We are going to
adopt an ontological formulation to the description of a cloud framework that necessarily
includes three classes of information: terminology, architectural requirements, and a
reference model. The description generally adheres to the National Institute of Standards and
Technology (NIST) cloud-computing paradigm. (Mell, op. cit.)

Service Models

The cloud service models give a view of what a cloud service is. It is a statement of being. A
cloud service system is a set of elements that facilitate the development of cloud applications.
(Youseff 2009) Here is a description of the three layers in the NIST service model description.

Cloud Software as a Service (SaaS). The capability provided to the consumer is to use the
provider’s applications running on a cloud infrastructure. The applications are accessible from

various client devices through a thin client interface such as a web browser (e.g., web-based
email). The consumer does not manage or control the underlying cloud infrastructure
including network, servers, operating systems, storage, or even individual application
capabilities, with the possible exception of limited user-specific application configuration
settings.

Cloud Platform as a Service (PaaS). The capability provided to the consumer is to deploy onto
the cloud infrastructure consumer-created or acquired applications created using
programming languages and tools supported by the provider. The consumer does not manage
or control the underlying cloud infrastructure including network, servers, operating systems,
or storage, but has control over the deployed applications and possibly application hosting
environment configurations.

Cloud Infrastructure as a Service (IaaS). The capability provided to the consumer is to provision processing, storage, networks, and other fundamental computing resources where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications. The consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, deployed applications, and possibly limited control of select networking components (e.g., host firewalls).

The three service model elements should be deployed in a cloud environment with the essential characteristics in order to achieve cloud status.

Service Deployment Models

The essential elements of a cloud service system are given above. In order to develop
enterprise-wide applications, a domain ontological viewpoint has to be assumed with
deployment models from the following list: (Mell op. cit.)

Private cloud. The cloud infrastructure is operated solely for an organization. It may be managed by the organization or a third party and may exist on-premises or off-premises.

Community cloud. The cloud infrastructure is shared by several organizations and supports a
specific community that has shared concerns (e.g., mission, security requirements, policy, and
compliance considerations). It may be managed by the organizations or a third party and may
exist on-premises or off-premises.

Public cloud. The cloud infrastructure is made available to the general public or a large
industry group and is owned by an organization selling cloud services.

Hybrid cloud. The cloud infrastructure is a composition of two or more clouds (private, community, or
public) that remain unique entities but are bound together by standardized or proprietary technology
that enables data and application portability (e.g., cloud bursting for load-balancing between clouds).

Most cloud software service application domains will be synthesized from a combination of the
deployment models.

CLOUD SECURITY
The scope of cloud security is huge by any objective measure. One ordinarily thinks of cloud
security in terms of authorization, authentication, accountability, end-to-end trust, and so
forth. However, it is important to view cloud security concerns in a broader context of data
protection, disaster recovery, and enterprise continuity. The storage of customer data may be
useful for operations and research but also opens the door for misuse and violation of privacy
policy. Government regulations, such as those of the FFIEC (Federal Financial Institutions Examination Council), HIPAA (Health Insurance Portability and Accountability Act), and PCI DSS (Payment Card Industry Data Security Standard), are in place, and strict adherence to the guidelines by cloud service providers can only be achieved through systems design and effective auditing. (Web Hosting Fan 2009) Accordingly, even though we are going to concentrate on the former, it is important to keep the latter in mind through PCI DSS, SOX, and HIPAA compliance. This
can be achieved through ISO/IEC 27001:2005 certification and SAS 70 Type I and II
attestations. (Shinder 2009) In this section, we are going to develop an operational basis for
cloud computing privacy based on security.

Identity

Identity is a means of denoting an entity in a particular namespace and is the basis of security and privacy – regardless of whether the context is digital identification or non-digital identification. We
are going to refer to an identity object as an entity. An entity may have several identities and
belong to more than one namespace.

A pure identity denotation is independent of a specific context, and a federated identity reflects a process that is shared between identity management systems. When one identity management system accepts the certification of another, a phenomenon known as “trust” is established. The execution of trust is often facilitated by a third party that is acknowledged by both parties and serves as the basis of digital identity in cloud services.

Access to computing facilities is achieved through a process known as authentication, whereby an entity makes a claim to its identity by presenting an identity symbol for verification and control. Authentication is usually paired with a related specification known as authorization to obtain the right to address a given service.

Authentication

In a cloud computing environment, an SaaS service provider is commonly faced with two
situations: the single tenant model and the multi-tenant model. In the single tenant model,
typified by the [CSP – CP – C – U] and [CSP – CP – U] scenarios, given previously, a single sign-
on to the cloud service is ordinarily required. This means that the end user would then have to
log on to the local computer and then log on to the application service at the cloud platform.
This is typically the case with consumer cloud services and customer-developed application
software. When the application requires an additional sign-on, it must maintain its own user accounts – a process known as delegated administration.

When authentication requires a sign-on to an enterprise system running on the cloud
and then on to a specific application, a multiple sign-on would ordinarily be required. With a
decentralized authentication system, the user would sign on to an authentication server that
would issue a token accepted by a federated server as proof of identity, required by specific
applications. An SaaS provider with thousands of customers would prefer a decentralized
solution in lieu of establishing a trust relationship with each of its customers.
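
To make the decentralized pattern concrete, the sketch below shows, in Python, one way an authentication server might issue a signed token that a federated or application server later accepts as proof of identity. It is a minimal illustration only: the shared key, claim fields, and function names are assumptions invented for this example, not a prescribed protocol.

```python
import base64, hashlib, hmac, json, time

# Hypothetical key pre-shared between the authentication server and the
# federated servers that accept its tokens (the "trust" relationship).
SHARED_KEY = b"demo-federation-key"

def b64(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).decode()

def issue_token(user_id, ttl=300):
    """Authentication server: sign a short-lived claim to the user's identity."""
    claim = json.dumps({"sub": user_id, "exp": time.time() + ttl}).encode()
    sig = hmac.new(SHARED_KEY, claim, hashlib.sha256).digest()
    return b64(claim) + "." + b64(sig)

def verify_token(token):
    """Federated/application server: accept the token instead of a second sign-on."""
    claim_b64, sig_b64 = token.split(".")
    claim = base64.urlsafe_b64decode(claim_b64)
    expected = hmac.new(SHARED_KEY, claim, hashlib.sha256).digest()
    if not hmac.compare_digest(base64.urlsafe_b64decode(sig_b64), expected):
        return None                      # signature invalid: token not trusted
    payload = json.loads(claim)
    return payload if payload["exp"] > time.time() else None  # reject if expired

token = issue_token("alice@example.com")
print(verify_token(token))  # {'sub': 'alice@example.com', 'exp': ...}
```

With a scheme like this, the SaaS provider only has to trust the token issuer, not each of its thousands of customers individually.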

Authorization

Typically, authorization refers to permission to perform certain actions. In cloud computing, users are assigned roles that must match corresponding roles associated with a requisite SaaS application. Each SaaS application contains a set of roles pertinent to the corresponding business function. Access is further controlled by business rules that specify conditions that must be met before access is granted. The role/business-rule modality also applies to storage in the cloud, and this is where the practice of privacy kicks in.

In general, the combination of identification and authentication determines who can sign on to a system – that is, who is authorized to use that system. Authorization, often established
with access control lists, determines what functions a user can perform. A related measure,
known as accountability, records a user’s actions. Authorization cannot occur without
authentication.

In general, there are two basic forms of access control: discretionary access control, and
mandatory access control. With discretionary access control (DAC), the security policy is
determined by the owner of the security object. With mandatory access control (MAC), the
security policy is governed by the system that contains the security object. Privacy policy
should, in general, be governed by both forms of access control. DAC reflects owner
considerations, and MAC governs inter-system controls.
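
As an illustration of the role/business-rule modality described above, the hypothetical Python sketch below grants access only when the user's role matches the application's required role and every business rule is satisfied; the two example rules loosely mirror an owner-set (DAC-style) condition and a system-set (MAC-style) condition. All names and fields here are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class SaaSApplication:
    name: str
    required_role: str
    business_rules: list = field(default_factory=list)  # callables: user -> bool

def is_authorized(user: dict, app: SaaSApplication) -> bool:
    if app.required_role not in user["roles"]:        # the role must match
        return False
    return all(rule(user) for rule in app.business_rules)  # every rule must hold

payroll = SaaSApplication(
    name="payroll",
    required_role="hr_manager",
    business_rules=[
        lambda u: u["department"] == "HR",  # owner-set (DAC-like) condition
        lambda u: u["clearance"] >= 2,      # system-set (MAC-like) condition
    ],
)

user = {"roles": ["hr_manager"], "department": "HR", "clearance": 2}
print(is_authorized(user, payroll))  # True: role matches and both rules hold
```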

Accountability

Accountability is determined by audit trails and user logs that are prototypically used to
uncover security violations and analyze security incidents. In the modern world of computer
and information privacy, accountability would additionally incorporate the recording of
privacy touch points to assist in managing privacy concerns. Although the Internet is a fruitful
technology, it garners very little trust. Why? It is very cumbersome to assign responsibility for
shortcomings and failure in an Internet operational environment. Failure now takes on an
additional meaning. In addition to operational failure, it is important to also include “failure to perform as expected” as a new dimension.
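
One common way to make audit trails trustworthy enough to assign responsibility is to chain each log record to a hash of its predecessor, so that after-the-fact tampering becomes detectable. The Python sketch below is a minimal, hypothetical illustration of that idea, not a complete accountability system.

```python
import hashlib, json, time

class AuditTrail:
    """Tamper-evident audit trail: each record stores the hash of the previous
    one, so altering or deleting history breaks the chain."""
    def __init__(self):
        self.records = []
        self._last_hash = "0" * 64

    def log(self, user, action):
        record = {"ts": time.time(), "user": user, "action": action,
                  "prev": self._last_hash}
        self._last_hash = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.records.append(record)

    def verify(self):
        prev = "0" * 64
        for rec in self.records:
            if rec["prev"] != prev:
                return False            # chain broken: evidence of tampering
            prev = hashlib.sha256(
                json.dumps(rec, sort_keys=True).encode()).hexdigest()
        return True

trail = AuditTrail()
trail.log("alice", "read:customer_db")      # hypothetical privacy touch point
trail.log("bob", "export:billing_records")
print(trail.verify())                       # True while the log is intact
```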

Trustworthy Computing

Trustworthy computing refers to the notion that people in particular and society as a whole
can trust computers to safeguard things that are important to them. Medical and financial
information are cases in point. Computing devices, software services, and reliable networks
are becoming pervasive in everyday life, but the lingering doubt remains over whether or not
we can trust them. Expectations have risen with regard to technology, such that those expectations now encompass safety, reliability, and the integrity of the organizations that supply the technology. Society will only accept a technological advance when an efficient and effective set of policies, engineering processes, business practices, and enforceable regulations is in place. We are searching for a framework to guide the way to efficacy in computing.

It is generally felt that a framework for understanding a technology should reflect the
underlying concepts required for its development and subsequent acceptance as an
operational modality. A technology should enable the delivery of value rather than constrain
it, and that is our objective with trustworthy computing. Security has changed with the advent
of the Internet and the well-publicized threats to computing and storage facilities. In the past,
security was primarily concerned with keeping harmful people out. In the modern world of
the Internet, the objective is to enable the right people to access the right information in a
trusted environment. Thus, security is an enabler for greater freedom and confidence in
information systems.

As with many utilities, trustworthy computing should be intuitive, controllable, reliable, and predictable. In order to achieve these lofty goals, we are going to look to the
framework developed at Microsoft (Mundie 2002) consisting of goals, means, and execution.
The set of goals reflects a subject’s perspective and is comprised of security, privacy,
reliability, and business integrity considerations. The set of means refers to the computer
industry’s viewpoint and includes secure-by-design, secure-by-default, secure-in-deployment,
fair-information principles, availability, manageability, accuracy, usability, responsiveness, and
transparency. Execution concerns the manner in which an organization does business and
includes intent, implementation, evidence, and integrity. One approach to using the
framework is through the concept of a trusted stack constructed from five important
elements: secure hardware, a trusted operating system, trusted applications, trusted people,
and trusted data. (Charney 2008)

A simple view of trustworthy computing is that it is comprised of security, privacy, and usability. Usability and security have been introduced. All that is required to achieve essential
trust in cloud computing is privacy.

CLOUD PRIVACY
The cloud will typically process and store information for which privacy is of paramount concern. The main issue is identity, which serves as the basis of privacy, and the lack of which undermines the trust of individuals and organizations in other entities. The key consideration
may turn out to be the integrity that organizations adopt when handling personal information
and how accountable they are about their information practices. From an organizational
perspective, control over information should remain with the end user or the data’s creator
with adequate controls over repurposing. From a personal perspective, the person should
have the wherewithal to control his or her identity as well as the release of socially sensitive
identity attributes. Who owns the data? Is it the person to whom the data pertains? Is it
the organization that prototypically stores the data? Or, is it the cloud provider that physically
stores the data somewhere out in cyberspace? As an example, is your financial information (as
personal data) your property or is it your bank’s business property?

Key Factors in Privacy Protection

One of the beneficial aspects of the present concern over information privacy is that it places
the person about whom data are recorded in proper perspective. Whereas such a person may
be the object in an information system, he or she is regarded as the subject in privacy
protection. This usage of the word subject is intended to imply that a person should in fact
have some control over the storage of personal information.

More specifically, the subject is the person, natural or legal, about whom data are
stored. The beneficial user is the organization or individual for whom processing is performed,
and the agency is the computing system in which the processing is performed and information
is stored. In many cases, the beneficial user and the subject are members of the same
organization. In most instances, however, this will not be the case. For example, the agency
may be a service company, and the subject may be a creditor.

In general, the beneficial user obtains value from the data processed and has some
control over the manner and time span in which the processing is performed. The agency
need not be aware of the end use of the information or how and when the processing is
performed.

The heart of the issue is privacy protection, which normally refers to the protection of
rights of individuals. While the concept may also apply to groups of individuals, the individual
aspect of the issue is that which raises questions of privacy and liberty.

Privacy Theory

Privacy refers to the claim of persons to determine when, how, and to what extent
information about themselves is communicated to others. Much of the literature is concerned
with the physical state of being private. The four states of being private are solitude, intimacy,
anonymity, and reserve. Solitude implies physical separation from the group. Intimacy implies
participation in a small unit that achieves corporate solitude. Anonymity implies freedom from
identification and surveillance, which may be informational or physical. Reserve implies the
creation of a psychological barrier that protects the individual from unwanted intrusion.
(Katzan 1980) The states serve to provide personal autonomy, emotional release, self
evaluation, and limited and protected communication. Privacy is needed to realize basic
personal and organizational objectives. Also, there is a universal tendency of individuals to
invade the privacy of others and of society as a whole to engage in surveillance to enforce its
norms.

Privacy Domain

Personal information is being collected about individuals through information and communication technology inherent in most social and economic activities. When we search the Web, our search phrases are being stored for possible analysis and review. (Conti 2009)
When we drive our cars, our license plate numbers and locations are stored by law
enforcement. When we purchase items with a credit card, a record of our activity is available
to organizations with authority. The list could go on and on, but is well summarized by Ann
Cavoukian.

Our digital footprints are being gathered together bit by bit, megabyte by megabyte,
terabyte by terabyte, into personas and profiles and avatars – virtual representations of us, in
a hundred thousand simultaneous locations. … novel risks and threats are emerging from this
digital cornucopia. Identity fraud and theft are the diseases of the Information Age, along with
new forms of discrimination and social engineering made possible by the surfeit of data.

There are other considerations to the subject of privacy. The majority of companies doing business online and offline have privacy policies in place that do little to protect consumer privacy. (ACLU 2010, p. 8) The policies give wide latitude to the companies and essentially provide nothing more than informing the consumer of what they do, as if telling constitutes legality. The consumer is given few choices to control personal information. In fact, the current situation concerning privacy is that consumers want greater control over their information: a 2009 study found that 69% of adult Internet consumers want the legal right to know everything that a company knows about them.

Privacy Assessment

The Federal Bureau of Investigation (U.S.A.) lists seven criteria for evaluating privacy concerns for individuals and for designing cloud computing applications:

 What information is being collected?

 Why is the information being collected?

 What is the intended use of the information?

 With whom will the information be shared?

 What opportunities will individuals have to decline to provide information or to consent to particular uses of the information?

 How will the information be secured?

 Is this a system of records?

Since privacy is a fundamental right in the United States, the above considerations
obviously resulted from extant concerns by individuals and privacy rights groups. In a 2009
Legislative Primer, the following concerns are expressed by the Center for Digital Democracy:

Tracking people’s every move online is an invasion of privacy. Online behavioral tracking is
even more distressing when consumers aren’t aware who is tracking them, that it’s
happening, or how the information will be used. Often consumers are not asked for their
consent and have no meaningful control over the collection and use of their information,
often by third parties with which they have no relationships.

Online behavioral tracking and targeting can be used to take advantage of vulnerable
consumers. Information about a consumer’s health, financial condition, age, sexual
orientation, and other personal attributes can be inferred from online tracking and used to
target the person for payday loans, sub-prime mortgages, bogus health cures and other
dubious products and services. Children are an especially vulnerable target audience since
they lack the capacity to evaluate ads.

Online behavioral tracking and targeting can be used to unfairly discriminate against
consumers. Profiles of individuals, whether accurate or not, can result in “online redlining” in
which some people are offered certain consumer products or services at higher costs or with
less favorable terms than others, or denied access to goods and services altogether.

Online behavioral profiles may be used for purposes beyond commercial purposes. Internet
Service Providers (ISPs), cell phone companies, online advertisers and virtually every business
on the web retains critical data on individuals. In the absence of clear privacy laws and
security standards these profiles leave individuals vulnerable to warrantless searches, attacks
from identity thieves, child predators, domestic abusers and other criminals. Also, despite a
lack of accuracy, employers, divorce attorneys, and private investigators may find the
information attractive and use the information against the interests of an individual.
Individuals have no control over who has access to such information, how it is secured, and
under what circumstances it may be obtained.

Based on these issues, the primer includes the following recommendations for legislative
consideration:

 Individuals should be protected even if the information collected about them in behavioral
tracking cannot be linked to their names, addresses, or other traditional “personally
identifiable information,” as long as they can be distinguished as a particular computer user
based on their profile.

 Sensitive information should not be collected or used for behavioral tracking or targeting.
Sensitive information should be defined by the FTC and should include data about health,
finances, ethnicity, race, sexual orientation, personal relationships and political activity.

 No behavioral data should be collected or used from children and adolescents under 18 to
the extent that age can be inferred.

 There should be limits to the collection of both personal and behavioral data, and any such
data should be obtained by lawful and fair means and, where appropriate, with the knowledge
or consent of the individual.

 Personal and behavioral data should be relevant to the purposes for which they are to be
used.

 The purposes for which both personal and behavioral data are collected should be specified
not later than at the time of data collection and the subsequent use limited to the fulfillment of
those purposes, and with any change of purpose of the data the individual must be alerted and
given an option to refuse collection or use.

 Personal and behavioral data should not be disclosed, made available or otherwise used for
purposes other than those specified in advance except: a) with the consent of the individual; or
b) by the authority of law.

 Reasonable security safeguards against loss, unauthorized access, modification, disclosure and other risks should protect both personal and behavioral data.

 There should be a general policy of openness about developments, practices, uses and
policies with respect to personal and behavioral data. Means should be readily available for
establishing the existence and nature of personal data, and the main purposes of their use, as
well as the identity and usual residence of the data controller.

 An individual should have the right: a) to obtain from a behavioral tracker, or otherwise,
confirmation of whether or not the behavioral tracker has data relating to him; b) to have
communicated to him data relating to him within a reasonable time; at a charge, if any, that is
not excessive; in a reasonable manner; and in a form that is readily intelligible to him; c) to be
given reasons if a request made under subparagraphs (a) and (b) is denied, and to be able to
challenge such denial; and d) to challenge data relating to him and, if the challenge is
successful, to have the data erased, rectified, completed or amended.

 Consumers should always be able to obtain their personal or behavioral data held by an
entity engaged in tracking or targeting.

 Every entity involved in any behavioral tracking or targeting activity should be accountable
for complying with the law and its own policies.

 Consumers should have the right of private action with liquidated damages; the appropriate
protection by federal and state regulations and oversight; and the expectation that online data
collection entities will engage in appropriate practices to ensure privacy protection (such as
conducting independent audits and the appointment of a Chief Privacy Officer).

 If a behavioral targeter receives a subpoena, court order, or legal process that requires the
disclosure of information about an identifiable individual, the behavioral targeter must, except
where otherwise prohibited by law, make reasonable efforts to a) notify the individual prior to
responding to the subpoena, court order, or legal process; and b) provide the individual with as
much advance notice as is reasonably practical before responding.

 The FTC should establish a Behavioral Tracker Registry.

 There should be no preemption of state laws.

Accordingly, it would seem that some form of data governance is in order to protect
the privacy rights of subjects.

Privacy Analysis of Cloud Computing

In order to integrate cloud computing and privacy issues, the World Privacy Forum has come up with a set of findings that are summarized in the following list:

 Cloud computing has significant implications for the privacy of personal information as well
as for the confidentiality of business and government information.

 A user’s privacy and confidentiality risks vary significantly with the terms of service and
privacy policy established by the cloud provider.

 For some types of information and some categories of cloud computing users, privacy and
confidentiality rights, obligations, and status may change when a user discloses information to
a cloud provider.

 Disclosure and remote storage may have adverse consequences for the legal status of or
protections for personal or business information.

 The location of information in the cloud may have significant effects on the privacy and
confidentiality protection of information and on the privacy obligations of those who process
or store the information.

 Information in the cloud may have more than one legal location at the same time, with
differing legal consequences.

 Laws could oblige a cloud provider to examine user records for evidence of criminal activity
and other matters.

 Legal uncertainties make it difficult to assess the status of information in the cloud as well
as the privacy and confidentiality protections available to users.

 Responses to the privacy and confidentiality risks of cloud computing include better policies
and practices by cloud providers, changes to laws, and more vigilance by users.

Some of the open items in cloud computing privacy that immediately come to mind are listed
as follows:

 A business sharing information with a cloud provider.

 Consequences of third party storage for individuals and business.

 Information disclosure to private parties.

 Location of cloud data and local law.

 Change of a cloud provider.

 Cloud provider disclosure obligations.

 Audits, security, and subpoenas.

Based on this analysis, it would seem that consumer-based cloud services would be good
candidates for public and community clouds. For business, education, and government,
private and hybrid clouds are prudent options until the legal questions can be resolved.

CLOUD COMPUTING ENVIRONMENT: A REVIEW


ABSTRACT

Cloud computing is a vigorous technology by which a user can get software, application,
operating system and hardware as a service without actually possessing it and paying only
according to usage. Cloud computing is a hot topic of research these days. With the rapid growth of Internet technology, cloud computing has become the main source of computing for small as well as big IT companies. In the cloud computing milieu, the cloud data centers and the users of cloud computing are globally situated, so it is a big challenge for cloud data centers to efficiently handle requests coming from millions of users and to serve them efficiently. Cloud computing is Internet-based development and use of computer technology. It is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users
need not have knowledge of, expertise in, or control over the technology infrastructure "in
the cloud" that supports them. Scheduling is one of the core steps to efficiently exploit the
capabilities of heterogeneous computing systems. On cloud computing platform, load
balancing of the entire system can be dynamically handled by using virtualization technology
through which it becomes possible to remap virtual machine and physical resources according
to the change in load. However, in order to improve performance, the virtual machines have
to fully utilize its resources and services by adapting to computing environment dynamically.
The load balancing with proper allocation of resources must be guaranteed in order to
improve resource utility. Load balancing is a critical aspect that ensures that all the resources
and entities are well balanced such that no resource or entity neither is under loaded nor
overloaded. The load balancing algorithms can be static or dynamic. Load balancing in this
environment means equal distribution of workload across all the nodes. Load balancing
provides a way of achieving the proper utilization of resources and better user satisfaction.
Hence, use of an appropriate load balancing algorithm is necessary for selecting the virtual
machines or servers. This paper focuses on the load balancing algorithm which distributes the
incoming jobs among VMs optimally in cloud data centers. In this paper, we have reviewed
several existing load balancing mechanisms and we have tried to address the problems
associated with them.

NEED FOR CLOUD COMPUTING

What could we do with 1000 times more data and CPU power? One simple question. That’s all
it took the interviewers to bewilder the confident job applicants at Google. This is a question
of relevance because the amount of data that an application handles is increasing day by day
and so is the CPU power that one can harness.

There are many answers to this question. With this much CPU power, we could scale
our businesses to 1000 times more users. Right now, we are gathering statistics about every
user using an application. With such CPU power at hand, we could monitor every single user
click and every user interaction such that we can gather all the statistics about the user. We
could improve the recommendation systems of users. We could model better price plan
choices. With this CPU power, we could simulate the case where we have, say, 100,000 users in the system without any glitches. There are lots of other things we could do with so much CPU power and data capability. But what is holding us back? One of the reasons is that the large-scale architecture that comes with these capabilities is difficult to manage. There may be many different problems with the architecture we have to support: the machines may start failing, the hard drives may crash, the network may go down, and many other such hardware problems may occur. The hardware has to be designed such that the architecture is reliable and scalable. This large-scale architecture has a very expensive upfront cost and high maintenance costs. It requires different resources like machines, power, cooling, etc. The system also cannot scale as and when needed, and so is not easily reconfigurable. Applications are also constrained by the available resources: as applications become large, they become I/O bound, and hard drive access speed becomes a limiting factor. Though the raw CPU power available may not be a factor, the amount of RAM available clearly becomes one, and it too is limited in this context. Even if the hardware problems are managed very well, software problems arise. There may be bugs in software that uses this much data. The workload also demands two important tasks from two completely different kinds of people: the software has to be bug-free, and it has to have good data-processing algorithms to manage all the data.

BENEFITS OF CLOUD COMPUTING

Some common benefits of cloud computing are:

• Reduced Cost: Since cloud technology is implemented incrementally (step-by-step), it reduces an organization's total expenditure.

• Increased Storage: Compared to private computer systems, much larger amounts of data can be stored.

• Flexibility: Compared to traditional computing methods, cloud computing allows an entire organizational segment, or a portion of it, to be outsourced.

• Greater mobility: Information can be accessed whenever and wherever needed, unlike traditional systems that store data on personal computers and allow access only when near them.

• Shift of IT focus: Organizations can focus on innovation (i.e., implementing new product strategies) rather than worrying about maintenance issues such as software updates or computing problems.

These benefits of cloud computing draw a lot of attention from the Information and Technology Community (ITC). Surveys by the ITC in 2008 and 2009 show that many companies and individuals are noticing that CC is proving to be helpful when compared to traditional computing methods.

KEY CHARACTERISTICS

● Cost is greatly reduced and capital expenditure is converted to operational expenditure. This lowers the barrier to entry, as infrastructure is typically provided by a third party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained with usage-based options, and minimal or no IT skills are required for implementation.

● Device and location independence enable users to access systems using a web browser
regardless of their location or what device they are using, e.g., PC, mobile. As infrastructure is
off-site (typically provided by a third-party) and accessed via the Internet the users can
connect from anywhere.

● Multi-tenancy enables sharing of resources and costs among a large pool of users, allowing
for:

● Centralization of infrastructure in areas with lower costs (such as real estate, electricity,
etc.).

● Peak-load capacity increases (users need not engineer for highest possible load-levels).

● Utilization and efficiency improvements for systems that are often only 10-20% utilized.

● Reliability improves through the use of multiple redundant sites, which makes it suitable for
business continuity and disaster recovery. Nonetheless, most major cloud computing services
have suffered outages and IT and business managers are able to do little when they are
affected.

LOAD BALANCING

One of the most common applications of load balancing is to provide a single quality service from multiple servers, typically called a server farm or data center. Load-balanced systems commonly operate inside popular Internet sites, big chat networks, high-bandwidth File Transfer Protocol (FTP) sites, and Domain Name System (DNS) servers. Load balancing additionally prevents clients from contacting back-end servers directly, which can have security advantages by hiding the structure of the internal network. Some load balancers provide a mechanism for improving one particular parameter within the back-end servers. Load balancing offers the IT team an opportunity to attain considerably higher fault tolerance: it can automatically provide the capacity required to handle any increase or decrease in application traffic. It is also necessary that the load balancer itself does not become a cause of failure. Load balancers implemented in high-availability configurations can additionally replicate the user’s session needed by the application. Load balancing divides workload between a set of computers so as to achieve good response times, keep all the nodes equally loaded and, in general, serve all users more quickly. Load balancing may be implemented with hardware, software, or a mix of both. Typically, an unbalanced load is the main reason for uneven server response times. Load balancing aims to optimize the usage of resources, maximize the overall success ratio, minimize waiting time, and avoid overloading of the resources. Using multiple algorithms and mechanisms for load balancing, rather than a single algorithm, may increase reliability and efficiency. Load balancing within the cloud differs from classical thinking on load balancing design and implementation in its use of commodity data center servers to serve requests on a first-come, first-served basis. Older load balancing algorithms simply allocate servers in the order in which client requests arrive. Load balancing is one of the central problems in cloud computing. It is a mechanism that distributes the dynamic workload equally across the nodes or virtual machines within the whole cloud so as to avoid a state of conflict where some virtual machines are heavily loaded while other nodes or hosts are idle or doing very little work. It helps to achieve high client satisfaction and a high resource utilization ratio, consequently increasing the performance and resource utility of the system. It also ensures that every computing resource in the cloud is distributed efficiently and fairly among all client requests, and it prevents bottlenecks of the system which can occur because of load imbalance.

GOALS OF LOAD BALANCING

The goals of load balancing are:

• To improve the performance of the system.

• To have a backup of the load or entire server just in case the system fails or even partly fails.

• To maintain system stability.

• To accommodate future modifications within the system.

LOAD BALANCING CLASSIFICATION

Load balancing is chiefly divided into two categories, the static load balancing mechanism and the dynamic load balancing mechanism:

1) Static approach: This approach is defined in advance, at design time, and is independent of the workload at any point of time. Static load balancing algorithms divide the traffic equally between all servers.

2) Dynamic approach: This approach makes decisions based solely on the current workload of the servers, using different load balancing algorithms and selection mechanisms. A dynamic approach is also required in cloud computing because the number of requests can never be predicted. Dynamic load balancing is divided into two varieties, the non-distributed (centralized) approach and the distributed approach, defined as follows:

a) Centralized approach: In the centralized approach, only one node is responsible for managing load and its distribution across the complete cloud system model. All other nodes are not responsible for handling requests and providing responses.

b) Distributed approach: In the distributed approach, every node independently builds its own load vector. The work is divided among all the nodes of the server pool, and each node aggregates the load information of the other nodes. The distributed approach is more appropriate for complicated and very large systems within cloud computing.

METRICS FOR LOAD BALANCING IN CLOUD

Various metrics considered in existing load balancing techniques in cloud computing are discussed below:

∙ Scalability is the ability of an algorithm to perform load balancing for a system with any finite number
of nodes. This metric should be improved.

∙ Resource Utilization is used to check the utilization of resources. It should be optimized for efficient load balancing.

∙ Performance is used to check the efficiency of the system. This has to be improved at a reasonable
cost, e.g., reduce task response time while keeping acceptable delays.

∙ Response Time is the amount of time taken to respond by a particular load balancing algorithm in a
distributed system. This parameter should be minimized.

∙ Overhead Associated determines the amount of overhead involved while implementing a load-balancing algorithm. It is composed of overhead due to movement of tasks, inter-processor communication and inter-process communication. This should be minimized so that a load balancing technique can work efficiently.

LOAD BALANCING ALGORITHMS

The following load balancing algorithms are currently prevalent in clouds:

Round Robin: In this algorithm, the processes are divided between all processors. Each process is assigned to a processor in round robin order. The process allocation order is maintained locally, independent of the allocations at remote processors. Though the workload is distributed equally between processors, the processing times of different jobs are not the same, so at any point of time some nodes may be heavily loaded while others remain idle. This algorithm is mostly used in web servers where HTTP requests are of a similar nature and can be distributed equally.
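
A minimal sketch of the idea in Python, assuming a fixed list of hypothetical VM names: jobs are handed out strictly in circular order, so the counts stay equal even though the resulting load may not.

```python
from itertools import cycle

# Hypothetical VM pool; each incoming job goes to the next VM in a fixed
# circular order, regardless of how loaded that VM currently is.
vms = ["vm-0", "vm-1", "vm-2"]
next_vm = cycle(vms)

jobs = [f"request-{i}" for i in range(7)]
for job in jobs:
    print(job, "->", next(next_vm))
# The job counts are equal, but one long-running job can still leave a VM
# heavily loaded while the others sit idle, as noted above.
```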

Connection Mechanism: A load balancing algorithm [8] can also be based on the least-connection mechanism, which is a part of dynamic scheduling. It counts the number of connections for each server dynamically to estimate the load. The load balancer records the connection count of each server: the count increases when a new connection is dispatched to the server, and decreases when a connection finishes or times out.
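
The least-connection idea can be sketched in a few lines of Python; the server names and counter layout below are assumptions for illustration only.

```python
# The balancer tracks active connections per server and dispatches each new
# connection to the server with the fewest.
connections = {"server-a": 0, "server-b": 0, "server-c": 0}

def dispatch():
    target = min(connections, key=connections.get)  # least-loaded server
    connections[target] += 1                        # new connection dispatched
    return target

def finish(server):
    connections[server] -= 1                        # connection ended or timed out

for _ in range(5):
    print(dispatch(), connections)
finish("server-a")
```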

Randomized: The randomized algorithm is static in nature. In this algorithm a process is handled by a particular node n with probability p. The process allocation order is maintained for each processor independently of allocations at remote processors. This algorithm works well when the processes impose equal load; problems arise when the loads have different computational complexities. The randomized algorithm does not maintain a deterministic approach. It works well where the Round Robin algorithm generates overhead for the process queue.
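
A possible Python rendering of the randomized approach, with made-up nodes and fixed probabilities chosen purely for illustration:

```python
import random

# Static randomized selection: node n handles a process with probability p(n).
# Weights are fixed in advance and ignore the nodes' current load.
nodes = ["node-1", "node-2", "node-3"]
probabilities = [0.5, 0.3, 0.2]       # hypothetical fixed probabilities

def pick_node():
    return random.choices(nodes, weights=probabilities, k=1)[0]

assignments = [pick_node() for _ in range(10)]
print(assignments)   # works well only if processes impose similar load
```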

Equally Spread Current Execution Algorithm: The equally spread current execution (ESCE) algorithm [9] handles processes with priorities. It distributes the load by checking the job size and transferring the job to the virtual machine that is lightly loaded, or that can handle the task easily in less time, thereby maximizing throughput. It is a spread-spectrum technique in which the load balancer spreads the load of the job in hand across multiple virtual machines.

Throttled Load Balancing Algorithm: The throttled algorithm is completely based on the state of the virtual machines. The client first requests the load balancer to find a suitable virtual machine that can take the load easily and perform the operations requested by the client or user.
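
A rough Python sketch of the throttled behavior, under the simplifying assumption that each VM handles one task at a time; the availability table and names are hypothetical.

```python
# The balancer keeps an availability table of VMs and returns the first
# available one; if none is free, the request must wait in a queue.
vm_available = {"vm-0": True, "vm-1": True, "vm-2": False}

def allocate_vm():
    for vm, free in vm_available.items():
        if free:
            vm_available[vm] = False   # mark busy until the task completes
            return vm
    return None                        # no VM available: caller queues the request

def release_vm(vm):
    vm_available[vm] = True            # task done: VM becomes available again

print(allocate_vm())  # 'vm-0'
print(allocate_vm())  # 'vm-1'
print(allocate_vm())  # None -> request is queued
```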

A Task Scheduling Algorithm Based on Load Balancing: Y. Fang et al. discussed a two-level task scheduling mechanism based on load balancing to meet the dynamic requirements of users and obtain high resource utilization. It achieves load balancing by first mapping tasks to virtual machines and then virtual machines to host resources, thereby improving task response time, resource utilization and the overall performance of the cloud computing environment.

Min-Min Algorithm: It begins with the set of all unassigned tasks. First, the minimum completion time for every task is found. Then, among these minima, the smallest value is selected, which is the minimum time among all tasks on any resource, and the corresponding task is scheduled on the corresponding machine. The execution times of all other tasks on that machine are then updated by adding the execution time of the assigned task, and the assigned task is removed from the list of tasks to be assigned. The same procedure is repeated until all tasks are assigned to resources. This approach, however, has a major drawback: it can lead to starvation (a code sketch covering both Min-Min and the Max-Min variant follows the next description).

Max-Min Algorithm: Max-Min is almost the same as the Min-Min algorithm except for the following: after the minimum completion times are found, the maximum value among them is selected, which is the maximum such time among all the tasks on any resource. Then, according to that maximum time, the task is scheduled on the corresponding machine, the execution times of all other tasks on that machine are updated by adding the execution time of the assigned task, and the assigned task is removed from the list of tasks to be assigned to the machines.
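
Both heuristics can be sketched with a single routine that differs only in whether the task with the smallest or the largest best-case completion time is chosen next. The Python below is an illustrative toy with a made-up execution-time matrix; it is not drawn from any of the cited papers.

```python
# etc[t][m] is the execution time of task t on machine m (hypothetical values);
# ready[m] is the time at which machine m becomes free.
def schedule(etc, n_machines, strategy="min"):
    ready = [0.0] * n_machines
    unassigned = set(range(len(etc)))
    plan = []
    while unassigned:
        # best machine (earliest completion) for each remaining task
        best = {t: min(range(n_machines), key=lambda m: ready[m] + etc[t][m])
                for t in unassigned}
        completion = lambda t: ready[best[t]] + etc[t][best[t]]
        # Min-Min picks the task with the smallest minimum completion time;
        # Max-Min picks the largest, serving long tasks first.
        t = (min if strategy == "min" else max)(unassigned, key=completion)
        m = best[t]
        ready[m] += etc[t][m]          # update the machine's ready time
        plan.append((t, m))
        unassigned.remove(t)
    return plan, ready

etc = [[3, 5], [2, 4], [10, 9], [1, 2]]   # 4 tasks x 2 machines
print(schedule(etc, 2, "min"))            # Min-Min: long task 2 waits until last
print(schedule(etc, 2, "max"))            # Max-Min variant
```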

RELATED WORK

Nguyen Khac Chien et al. (2016) proposed a load balancing algorithm that enhances the performance of the cloud environment based on a method of estimating the end of service time. They succeeded in improving the service time and the response time experienced by the user.

Ankit Kumar et al. (2016) focus on a load balancing algorithm that distributes incoming jobs among VMs optimally in cloud data centers. The proposed algorithm was implemented using the Cloud Analyst simulator, and its performance was compared, on the basis of response time, with three pre-existing algorithms. Because cloud data centers and cloud users are globally situated, it is a big challenge for the data centers to efficiently handle and serve requests coming from millions of users.

S. Yakhchi et al. (2015) discuss how energy consumption has become a major challenge in cloud computing infrastructures. They proposed a novel power-aware load balancing method, named ICAMMT, to manage power consumption in cloud computing data centers. The method exploits the Imperialist Competitive Algorithm (ICA) to detect over-utilized hosts and then migrates one or several virtual machines from these hosts to other hosts to decrease their utilization. Finally, it treats the remaining hosts as underutilized and, where possible, migrates all of their VMs to other hosts and switches them to sleep mode.

Surbhi Kapoor et al. (2015) aim at achieving high user satisfaction by minimizing the response time of tasks and improving resource utilization through even and fair allocation of cloud resources. The traditional Throttled load balancing algorithm is a good approach for load balancing in cloud computing as it distributes the incoming jobs evenly among the VMs. Its major drawbacks are that it works well only for environments with homogeneous VMs, does not consider the resource-specific demands of tasks, and has the additional overhead of scanning the entire list of VMs every time a task arrives. These issues are addressed by a proposed cluster-based load balancing algorithm that works well in heterogeneous-node environments, considers the resource-specific demands of tasks, and reduces scanning overhead by dividing the machines into clusters.

Shikha Garg et al. (2015) aim to distribute workload among multiple cloud systems or nodes to get better resource utilization, which is the prominent means of achieving efficient resource sharing and utilization. Load balancing has become a challenging issue in cloud computing systems. To meet users' huge number of demands, a distributed solution is needed, because practically it is not always possible or cost-efficient to maintain one or more idle servers, and servers cannot be assigned to particular clients individually. Cloud computing comprises a large network of components spread throughout a wide area. Hence, there is a need for load balancing across its different servers or virtual machines. They have proposed an algorithm that focuses on load balancing to reduce situations of overload or
underload on virtual machines, which improves the performance of the cloud substantially.

Reena Panwar et al. (2015) describe how cloud computing has become an essential buzzword in information technology and the next stage in the evolution of the Internet. The load balancing problem is an important and critical component of adequate operations in a cloud computing system, and left unsolved it can hinder the rapid development of cloud computing. Many clients from all around the world have been demanding various services at a rapid rate in recent times. Although various load balancing algorithms have been designed that are efficient in request allocation through the selection of correct virtual machines, they propose a dynamic load management algorithm for distributing the entire incoming request load among the virtual machines effectively.

Mohamed Belkhouraf et al. (2015) aim to deliver different services for users, such as infrastructure, platform or software, with a reasonable and steadily decreasing cost for the clients. To achieve those goals, some matters have to be addressed, mainly using the available resources in an effective way in order to improve overall performance, while taking into consideration the security and availability sides of the cloud. Hence, one of the most studied aspects is load balancing in cloud computing, especially for big distributed cloud systems that deal with many clients and large amounts of data and requests. The proposed approach mainly ensures better overall performance with efficient load balancing, continuous availability and security.

Lu Kang et al. (2015) improve the weighted least-connection scheduling algorithm and design an Adaptive Scheduling Algorithm Based on Minimum Traffic (ASAMT). ASAMT conducts real-time minimum-load scheduling of node service requests and configures available idle resources in advance to ensure the QoS requirements of the service. OPNET is adopted to simulate the traffic scheduling algorithm as applied to the cloud computing architecture.

Hiren H. Bhatt et al. (2015) present a Flexible Load Sharing algorithm (FLS) which introduces a third function that partitions the system into domains. This function is helpful for selecting other nodes present in the same domain. By applying flexible load sharing to particular domains in the distributed system, performance can be improved when any node is in an overloaded situation.

CLOUD SIM

Cloud service providers charge users depending upon the space or service provided. In R&D, it
is not always possible to have the actual cloud infrastructure for performing experiments. For
any research scholar, academician or scientist, it is not feasible to hire cloud services every
time and then execute their algorithms or implementations. For the purpose of research,
development and testing, open source libraries are available, which give the feel of cloud
services. Nowadays, in the research market, cloud simulators are widely used by research
scholars and practitioners, without the need to pay any amount to a cloud service provider.

Tasks performed by cloud simulators:

The following tasks can be performed with the help of cloud simulators:

• Modelling and simulation of large scale cloud computing data centres.


• Modelling and simulation of virtualised server hosts, with customisable policies for
provisioning host resources to VMs.

• Modelling and simulation of energy-aware computational resources.

• Modelling and simulation of data centre network topologies and message-passing applications.

• Modelling and simulation of federated clouds.

• Dynamic insertion of simulation elements, stopping and resuming simulation.

• User-defined policies for allocation of hosts to VMs, and policies for allotting host resources
to VMs.

The scope and features of cloud simulations include:

• Data centres

• Load balancing

• Creation and execution of cloudlets

• Resource provisioning

• Scheduling of tasks

• Storage and cost factors
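
To give a flavor of what such simulators model, here is a toy Python sketch (deliberately not the CloudSim API, which is a Java library): cloudlets are dispatched to whichever simulated VM becomes free first, and per-cloudlet response times are reported. All lengths and speeds are invented for the example.

```python
import heapq

def simulate(cloudlet_lengths, vm_speeds):
    """Dispatch each cloudlet to the VM that will be free soonest and report
    its response time (all cloudlets are assumed to arrive at time zero)."""
    vm_free_at = [(0.0, i) for i in range(len(vm_speeds))]  # (time free, vm id)
    heapq.heapify(vm_free_at)
    for n, length in enumerate(cloudlet_lengths):
        free_at, vm = heapq.heappop(vm_free_at)      # earliest-available VM
        finish = free_at + length / vm_speeds[vm]    # execution time on that VM
        heapq.heappush(vm_free_at, (finish, vm))
        print(f"cloudlet {n}: vm {vm}, response time {finish:.2f}s")

simulate(cloudlet_lengths=[4000, 1000, 2500, 3000],  # instructions (hypothetical)
         vm_speeds=[1000, 500])                      # instructions per second
```

Real simulators add the pieces listed above (provisioning policies, network topologies, energy models, cost factors) on top of this basic dispatch-and-measure loop.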

Data Security In Cloud Computing: A Review


NECESSARY CHARACTERISTICS OF CLOUD COMPUTING

Cloud technology is in the news quite often these days, but it still seems to be mysterious and
confusing to the non-techie crowd. Cloud options are enticing various industries across the
board, which is why it’s important to know its essential characteristics as a software offering.
Here are the five main characteristics that cloud computing offers businesses today.

i. On-demand self-service: - A consumer can unilaterally provision computing capabilities, such as server time and network storage, as needed automatically without requiring human interaction with each service provider.

ii. Broad network access: - Capabilities are available over the network and accessed through
standard mechanisms that promote use by heterogeneous thin or thick client platforms (e.g.,
mobile phones, tablets, laptops, and workstations).

iii. Resource pooling: -The provider’s computing resources are pooled to serve multiple
consumers using a multi-tenant model, with different physical and virtual resources
dynamically assigned and reassigned according to consumer demand. There is a sense of
location independence in that the customer generally has no control or knowledge over the
exact location of the provided resources but may be able to specify location at a higher level
of abstraction (e.g., country, state, or datacenter). Examples of resources include storage,
processing, memory, and network bandwidth.

iv. Rapid elasticity: - Capabilities can be elastically provisioned and released, in some cases
automatically, to scale rapidly outward and inward commensurate with demand. To the
consumer, the capabilities available for provisioning often appear to be unlimited and can be
appropriated in any quantity at any time.

v. Measured service: - Cloud systems automatically control and optimize resource use by
leveraging a metering capability at some level of abstraction appropriate to the type of
service (e.g., storage, processing, bandwidth, and active user accounts). Resource usage can
be monitored, controlled, and reported, providing transparency for both the provider and
consumer of the utilized service.
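
As a toy illustration of the measured-service characteristic, the Python sketch below meters per-tenant resource usage and turns it into a pay-per-use invoice; the resource names and rates are invented for the example.

```python
from collections import defaultdict

# Hypothetical per-unit prices for metered resources.
RATES = {"storage_gb_hours": 0.002, "cpu_hours": 0.05}

# usage[tenant][resource] accumulates metered consumption.
usage = defaultdict(lambda: defaultdict(float))

def meter(tenant, resource, amount):
    usage[tenant][resource] += amount    # recorded transparently for both parties

def invoice(tenant):
    return sum(RATES[r] * qty for r, qty in usage[tenant].items())

meter("acme", "cpu_hours", 120)
meter("acme", "storage_gb_hours", 5000)
print(f"acme owes ${invoice('acme'):.2f}")   # pay-per-use billing
```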

IMPORTANCE OF SECURITY IN CLOUD COMPUTING

The power, flexibility and ease of use of CC come with a lot of security challenges. Even though CC is a new, intuitive way to access applications and make work simple, there are a number of challenges/issues that can affect its adoption. A non-exhaustive search in this field reveals some of them: Service Level Agreements (SLAs), what to migrate, security, etc. Cloud Computing has a feature of automatic updates, which means a single change by an administrator to an application would reflect on all its users. This inadvertently also leads to the conclusion that any faults in the software are visible to a large number of users immediately, which is a major risk for any organization with little security.

It is also agreed upon by many researchers that security is a huge concern for the adoption of cloud computing. A survey by IDC of 263 executives also shows that security is ranked first among challenges in CC. Even if a company boasts top-class security, if it does not update its security policies from time to time it will be prone to security breaches in the near future.

SECURITY CONCERNS IN CLOUD COMPUTING

a. User authentication: The user authentication process must be improved to ensure that malicious users do not get access to powerful computing systems in cloud computing.

b. Leakage of data or data loss: Data can be at risk if an unauthorized person gains access to the shared pool of resources and deletes or modifies data. This risk increases further if no backup exists for that data.

c. Client trust: Strong authentication practices must be implemented to ensure that the client's data is protected from unauthorized access.
d. Malicious user handling: Malicious users can be attackers using cloud services with malicious intent, or insiders who have gained the trust of the company but work to gain access to sensitive information stored in the cloud.

e. Hijacking of sessions: These kinds of attacks happen when a legitimate user falls prey to phishing or to insecure application interfaces that can be exploited by attackers. Through such attacks, attackers gain user credentials and hijack legitimate users' sessions.

f. Wrong usage of CC and its services: Cloud computing service providers give access to try their cloud services free of charge for a limited period of time. Some users utilize this trial period to misuse the resources obtained from the CC service provider.

RESEARCH MOTIVATION
In this research work, we will try to enhance security between the client and the cloud. No doubt the cloud has multiple benefits, but we should not forget that there is a high risk of confidential information getting leaked. In order to avail ourselves of the benefits of the cloud, we must ensure the security of data being transferred between the client and the cloud. Security is the key to the cloud's success, and security in the cloud is now the main challenge of cloud computing. Until a few years ago all the business processes of organizations were on their private infrastructure and, though it was possible to outsource services, it was usually non-critical data/applications that ran on private infrastructures. Now, with cloud computing, the story has changed. The traditional network perimeter is broken, and organizations feel they have lost control over their data. New attack vectors have appeared, and the benefit of being accessible from anywhere becomes a big threat.

• No secure authentication: In the present work, there is no secure authentication procedure defined. When you log on to your machine and then try to access a resource, say a file server or database, something needs to assure that your username and password are valid. With the sensitive data of different users stored in the cloud, we need a strong authentication mechanism; data breaches occur because of absent or weak authentication.

• No gateway is defined: The user should not be directly connected to the cloud provider, as there is a high risk of data being stolen or hacked by a third-party intruder. There is a requirement for a gateway/broker that acts as an intermediary between the cloud provider and the client.

• No read/write policies have been defined: Different privileges should be given to different types of users.

With the continuous growth and expansion of cloud computing, security has become one of the serious issues. Cloud computing platforms need to provide reliable security technology to prevent security attacks, as well as the destruction of infrastructure and services. There is no doubt that cloud computing is the development trend of the future. Cloud computing brings us approximately infinite computing capability, good scalability, service on demand and so on, but also challenges in security, privacy, legal issues and so on, and solving the existing issues has become a matter of utmost urgency. To protect against compromise of the compliance integrity and security of their applications and data, proactive enterprises and service providers should apply firewalls, intrusion detection and prevention, integrity monitoring, log inspection, and malware protection on their cloud infrastructure, so that they can take advantage of cloud computing ahead of their competitors. These security solutions should have the intelligence to be self-defending and the ability to provide real-time detection and prevention of known and unknown threats. To advance cloud computing, the community must take proactive measures to ensure security.

Cloud Computing: A New Era of Computing in the Field of Information Management
COMPONENTS OF CLOUD

A cloud system consists of three major components: clients, the data center, and
distributed servers. Each element has a definite purpose and plays a specific role.

A. Clients

Clients in a cloud computing architecture are similar to the clients of an everyday local area
network (LAN). These are the computers residing on the desks of end users, where the frontend
applications are installed. They can be laptops, tablet computers, mobile phones, or PDAs. In
short, clients are the devices on the user side, used to manage client information. Physical
specifications place clients into the following three categories:

 Mobile - Mobile devices include smartphones, tablets, or PDAs.

 Thin – These are dumb terminals with no hard disk; they let the servers do all processing
and simply display the information.

 Thick - This type of client is a regular computer that uses a web browser such as Firefox or
Internet Explorer to connect to the cloud.

Today, thin clients are the popular choice for implementing a cloud solution,
because of the following characteristics.

 Low-cost hardware – The hardware specification of a thin client is inexpensive because it
has no storage or processing capabilities of its own; it simply transmits data to the server
for processing.

 Failure points – As all clients are managed by the server, there is very little chance of
infrastructure failure at the client side.

 Security - Since data is managed and processed centrally by servers, clients are largely
free from attacks such as malware, and there is less chance of data being lost if the client
computer crashes or is stolen.

 Infrastructure management – If a client fails or dies, it is easy to replace it in the cloud
infrastructure.

 Cost effective – Last but not least, thin clients consume less power than thick clients,
which saves energy (a scarce resource when clients are mobile) and in turn reduces cost for
users.

B. Data Center

The data center is the collection of servers hosting the applications to which users
subscribe. A data center server can be virtualized: software is installed on one physical
server but appears to the user as a separate server identity. In this way, half a dozen
virtual servers can run on one physical server.

C. Distributed Servers

The data center need not contain all its servers in one place. Servers are sometimes placed in
geographically disparate locations around the globe, yet from the end user's perspective the
data appears to come from a central server. In this approach, if one server is down or
momentarily unavailable to a client request, perhaps due to congestion, other servers take over
to serve the client. To provide seamless service, the data on these servers is synchronized
frequently; a minimal failover sketch follows.
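
A minimal sketch of this failover behaviour, under the assumption of a hypothetical replica
interface exposing a get(key) method; the names are illustrative.

import random

def fetch_with_failover(key, replicas):
    # Try geographically dispersed replicas in turn; to the end user the data
    # appears to come from one central server no matter which replica answers.
    for server in random.sample(replicas, len(replicas)):
        try:
            return server.get(key)          # hypothetical replica interface
        except ConnectionError:             # down or congested: try the next one
            continue
    raise RuntimeError("no replica available for " + key)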

LAYERS OF CLOUD COMPUTING MODELS

As the cloud computing model gains popularity, it is important to understand the service layers
that define it. Each layer of the cloud computing model conceptually builds on the foundation
of the previous layers and provides services to the adjacent layers, as in the OSI (Open
Systems Interconnection) model of computer networks.

Within this model, there are three service layers: Infrastructure as a Service (IaaS),
Platform as a Service (PaaS), and Software as a Service (SaaS). Additionally, there are three
layers, namely the hardware layer, the virtualization layer and the client layer, that are not
provided as user services. The hardware layer and the virtualization layer are owned and
operated by the cloud service provider, while the client layer is supplied by the end users.

A. The Hardware Layer

The hardware layer is also known as the server layer. It represents the physical hardware that
provides the actual resources that make up the cloud. Since in most cases the hardware is
provided by the cloud service provider, it may seem the least important layer in the cloud
infrastructure. This layer can be made redundant by utilizing multiple hardware platforms to
make the system fault tolerant.

B. The Virtualization Layer

This layer is also known as the infrastructure layer. The virtualization layer is the result of
various operating systems being installed as virtual machines; it is the software
implementation of a server that executes programs like a real machine. Virtualization is a very
useful concept in the context of cloud systems, enabling a user to use the different services
of a cloud. The remote data center delivers its services in a fully or partially virtualized
manner.

C. Infrastructure as a service (IaaS)

The infrastructure layer builds on the virtualization layer by offering virtual servers as a
service to users. Clients are billed for these virtual servers rather than for the actual
hardware, which removes the cost of procuring unnecessary physical servers or data storage
systems.

D. Platform as a service (PaaS)

This layer provides an operating system platform for hosting various applications. PaaS
solutions are essentially development platforms in which the development tools themselves are
hosted in the cloud through IaaS and accessed through a browser. With PaaS, developers can
build web applications much as they would on their native systems, and deploy them on the
cloud's virtualized servers without any specialized systems administration skills.

E. Software as a service (SaaS)

If users do not want to develop a cloud application, the SaaS layer [4][9][10] is the solution.
Users simply procure a service, such as email or CRM (Customer Relationship Management), and
billing can be based on utilization of these services. This is a simple way to get the
application functionality users need without incurring the cost of developing that application.

F. The Client

This is also known as the end-user layer, where users interact with the cloud. In cloud
computing, users generally access cloud resources through networked client devices, such as
desktop computers, laptops, tablets and smartphones; examples include thin clients, mobile
devices, thick clients and the browser-based Chromebook. Many of these applications do not
require specific software on the client and instead use a web browser to interact with the
cloud application.

CLOUD COMPUTING BENEFITS

To obtain the optimum benefit from cloud computing, developers and users must be able to port
their applications to the cloud system. The benefits of porting applications to a cloud server
include low processing and response times, low risk of infrastructure deployment, low cost of
implementation and a high pace of innovation.

A. Low processing time

In comparison with a single server, the processing speed of a pool of cloud servers is much
higher. For example, running a large batch job across 100 cloud servers is much faster than
running the same batch on a single server; a toy illustration follows.
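
To make the speedup claim concrete, here is a minimal, hedged Python sketch that fans a batch
out across a local process pool as a stand-in for a fleet of cloud servers; process_item and
the workload are illustrative assumptions.

from concurrent.futures import ProcessPoolExecutor
import math

def process_item(n: int) -> float:
    # Stand-in for one unit of work in a large batch job.
    return math.sqrt(n) * math.sin(n)

if __name__ == "__main__":
    items = range(100_000)
    # Fan the batch out across workers; on a cloud platform each worker
    # would be a separate rented server rather than a local process.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(process_item, items, chunksize=1_000))
    print(f"processed {len(results)} items")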

B. Low response time

As cloud servers are collections of high-speed parallel processors, the response time [16][17]
to a request will also be significantly low.

C. Low risk of infrastructure

Infrastructure risk can be reduced by adopting a cloud solution. For a public cloud [4], the
infrastructure risk is borne solely by the cloud service provider, and the client is simply
billed for the deployment. In case of any infrastructure failure, it is the service provider
who provides instant support to the customer, thus removing the customer's risk of purchasing
and deploying physical servers.

D. Lower cost of implementation

Since the infrastructure [4][6] is rented rather than purchased in a public cloud, the cost of
implementation can be controlled to the point where capital investment is almost zero.
Moreover, most cloud applications are ready-made, which reduces the cost and time of developing
a custom application.

E. High pace of innovation

Cloud computing can help to increase the pace of innovation. The low cost of entry to new
markets helps to level the playing field, allowing start-up companies to deploy new products
quickly and at low cost. This allows small companies to compete more effectively with
traditional organizations whose deployment process in enterprise data centers can be
significantly longer. Increased competition helps to increase the pace of innovation and with
many innovations being realized through the use of open source software, the entire industry
serves to benefit from the increased pace of innovation that cloud computing promotes.

FUTURE APPLICATIONS
Cloud solutions have applications in many areas, such as agriculture, the IT industry, and
libraries. These applications can be summarized as follows:

A. Cloud in Information Technology

In the current period of information technology, cloud computing is opening an epoch of
fundamental managerial change in business organizations, with organizations virtualized through
Web 2.0 tools, net PCs, mobile technology and various services. As one commentator puts it,
"the world shifts from using Information Technology (IT) for transaction and information
management to a far more organic Business Technology (BT) for collaboration and interaction
management." There is therefore no doubt that cloud computing will change the way IT
professionals work and the kinds of jobs they do. It will also bring a fundamental change in
how managers think about business and coordinate tasks and people.

B. Cloud in Agriculture

Agriculture has traditionally been maintained by farmers' communities, where the sharing of
knowledge is regarded as an important criterion of efficient farming. Collecting and sharing
knowledge will certainly improve overall efficiency and productivity. Applications of cloud
computing in agriculture include sales to customers and production planning for cultivated
land, which can be performed together with the help of cloud computing. Similarly, management
of all sorts of data relating to cultivated land, including location, land rights, area, soil
and land characteristics, can be integrated.

C. Cloud in Libraries

Cloud computing has already set its footprints in commercial sectors and is now beginning to
find a suitable place in library science. Libraries may put more and more content into the
cloud. Using cloud computing, a user would be able to browse a physical collection of books,
journals, CDs or DVDs, or check out a book by scanning a bar code into a tablet PC. All
historical and rare documents could be scanned into a comprehensive, easily searchable database
accessible to any researcher.

D. Cloud in Education

The potential of cloud computing to improve efficiency, cost and convenience for the
educational sector is being recognized by a number of educational institutions. Owing to
economic factors such as budget crises, many educational institutions are also adopting cloud
computing for a paperless teaching and learning process. Many universities have found cloud
computing attractive to use in courses focused on developing and deploying SaaS applications.

CLOUD COMPUTING ISSUES

Cloud computing raises many technical as well as legal issues; its reliance on the internet
means it inherits the legal and technical issues of the internet itself. Cloud computing is a
major achievement of the IT industry: companies can grow their business at very low cost, but
they need a safe cloud environment, since their prosperity or downfall may depend on it.

A. SECURITY: In cloud computing, the cloud acts as a big black box: nothing inside the cloud is
visible to the client, and clients have no idea of, or control over, what happens inside it.
Multiple users and companies share their personal and private information, and hackers can
attack their data through the cloud's login APIs, which is a big threat for users. Security is
therefore one of the most difficult tasks to implement in cloud computing.

B. PRIVACY: Cloud computing uses virtual computing technology in which users' personal data may
be saved in multiple virtual data centers rather than remaining in a single physical location,
and users may leak personal information when they access cloud computing services. Moreover,
all virtualized resources can easily be created or destroyed in any cluster.

C. AVAILABILITY: In a cloud computing infrastructure, the system must remain accessible to
authorized parties; services, software and data should be accessible only by authorized
clients. Because cloud computing offers an open platform to users, attackers become a big
threat to clients.

D. CONFIDENTIALITY: Cloud computing gives clients easy access to data and software, but cloud
servers may not be fully trustworthy, so users fear losing control over their data.

E. RELIABILITY: Servers in the cloud experience the same problems as resident servers, so it is
important to monitor the services being provided, either internally or through a third party.

F. PERFORMANCE AND BANDWIDTH COST: Cloud computing saves companies money on hardware, but they
have to spend more on bandwidth. The cloud offers low cost for smaller applications but can be
expensive for data-intensive applications, since delivering complex data over the network
requires sufficient bandwidth. For this reason, many companies are waiting for reduced costs
before switching to the cloud.

CLOUD AND GRID COMPUTING

Cloud computing uses multiple servers to provide virtual access to resources over the internet
and adopts object-oriented programming concepts to implement systems; it is used for both
small- and large-scale business activities. To perform complex calculations, high-performance
computer systems were traditionally interconnected by data links; this combined hardware and
software infrastructure is known as grid computing. A grid uses multiple processing units to
perform a single task: the task is broken into multiple subtasks and each machine is assigned
one. When the subtasks are complete, the machines report to a primary machine, which controls
all the machines' tasks and combines the results to produce the output.

1. In grid computing, multiple servers are allocated to fulfil a single task; in cloud
computing, by contrast, server virtualization allows one server to compute several tasks
concurrently.

2. Grid computing is used for job execution, i.e. running a program for a limited period of
time, whereas cloud computing frequently supports long-running services.

3. Cloud computing offers practically unlimited scalability, cost savings and workload
balancing, while grid computing is a complex technology whose complexity increases cost.
Overall, cloud computing is more beneficial than grid computing.

CLOUD AND MOBILE CLOUD COMPUTING

In a public cloud, a combination of hardware and software services is provided to public users,
giving them the flexibility to quickly add or scale up virtual servers and services according
to their needs. Cloud computing allows users to store data such as files, folders and documents
in a cloud area on the internet, and a client can access all files and data from anywhere in
the world, provided the client has a physical device with which to access them. Mobile
computing provides such physical devices to clients: mobile phones, smartphones, tablets and
laptops. Because of their small size, these devices can be used anywhere in the world. To gain
more advantage from cloud computing, mobile users can easily access multiple applications and
services over a wireless connection in a web browser; mobile cloud computing is thus the
combination of mobile web applications and cloud computing.

A mobile device does not need a highly configured system because multimedia services,
transcoding and complex calculations are processed in the cloud. In mobile cloud computing
[1], data is stored in the cloud, which reduces the chance of losing data from the mobile
device, as the cloud provides remote security services to mobile users. Mobile cloud computing
can thus reliably provide high data storage capacity, good processing power and security to
mobile users. However, it suffers from low bandwidth, which is a technical issue in MCC.
Bandwidth is governed by a data distribution policy for sharing data between users, which uses
calling profiles, signal-strength profiles and power profiles to build a bandwidth decision
table via a Markov decision process algorithm; that table decides whether or not a user may
send or receive data with another user given the bandwidth limitation.

In MCC, offloading takes place, which increases the battery life of the mobile device and
improves application performance. However, offloading consumes more energy than local
processing when the code size is small, so offloading remains a computational issue in MCC
(a sketch of such a decision follows). Security issues in MCC fall into two categories:
security for mobile users and security of data. Mobile cloud computing thus has multiple
advantages for mobile users, along with some technical issues that depend on both mobile
communication and cloud computing.
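
A hedged sketch of such an offloading decision: offload only when the estimated energy to ship
the code and data is lower than the energy to compute locally. The energy constants are
illustrative assumptions, not measured values.

def should_offload(cycles: float, payload_bytes: int,
                   joules_per_cycle: float = 1e-9,
                   joules_per_byte: float = 5e-6) -> bool:
    local_energy = cycles * joules_per_cycle          # cost of computing on the device
    offload_energy = payload_bytes * joules_per_byte  # cost of shipping code + data
    return offload_energy < local_energy

# Heavy computation with a small payload favours offloading; for small code
# sizes the radio cost dominates, matching the observation above.
print(should_offload(cycles=5e9, payload_bytes=200_000))   # True: offload
print(should_offload(cycles=1e6, payload_bytes=200_000))   # False: run locally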

Data Security In Cloud Computing


Cloud computing is Internet ("cloud") based development and use of computer technology
("computing"). It is a style of computing in which dynamically scalable and often virtualized
resources are provided as a service over the Internet. Cloud computing uses the internet and
central remote servers to support different data and applications, and it permits users to
access their personal files from any computer with internet access. The flexibility of cloud
computing is a function of allocating resources on request, and cloud computing also acts to
unite resources. Scientific computing in the 21st century has evolved from fixed to distributed
work environments, and the current trend of cloud computing (CC) allows business applications
to be accessed from anywhere simply by connecting to the Internet. Evidence shows that by
switching to CC, organizations' annual expenditure and maintenance costs are reduced to a great
extent. However, several challenges come along with the various benefits of cloud computing,
security aspects among them. Our aim is to identify the security challenges of adopting cloud
computing, together with real-world solutions for those challenges that do not yet have proper
mitigation strategies. Given the current absence of global standards and guidelines, this work
can help academics understand the state of practice and formulate better methods and standards
for secure interoperability. The identified cloud computing security challenges and solutions
can be used by practitioners to understand which areas of security need attention while
adapting or migrating to a cloud computing environment.

A Comprehensive Survey on Cloud Computing


Cloud computing is an innovative idea that helps to reduce computing cost. It offers better
computing through improved utilization and reduced administration and infrastructure costs, and
it realizes the long-held dream of computing as a utility. Cloud computing combines Software as
a Service (SaaS) and utility computing, and it shares characteristics with autonomic computing,
peer-to-peer, grid computing, the client-server model, mainframe computing and utility
computing. It has various open source resources which provide different platforms for better
computing utilization. Clouds are managed by cloud management tools and are loaded and tested
by various software testing tools; cloud modelling and simulation is done by CloudSim, SPECI,
GroundSim or DCSim, depending on the testing benchmark. Applications of cloud computing are
also discussed.

Cloud computing describes a family of computing concepts that involve a large number of
computers connected through a real-time communication network such as the Internet. It is
similar to distributed computing over a network, in that it can run a program on many connected
computers at the same time. Cloud computing relies mainly on the sharing of resources to
achieve coherence and economies of scale, much like utility computing, and its foundation is
the broader concept of converged infrastructure and shared services. Buyya et al. have
explained the cloud as follows: "Cloud is a parallel and distributed computing system
consisting of a collection of inter-connected and virtualized computers that are dynamically
provisioned and presented as one or more unified computing resources based on service level
agreement (SLA) established through negotiation between the service provider and consumers."
The cloud has reached into our daily life and led to a broad range of innovations, but people
often misunderstand what cloud computing is. Built on many older IT technologies, cloud
computing is actually an evolutionary approach that completely changes how computing services
are produced, priced and delivered. It allows access to services that reside in a distant data
center rather than on local computers or other Internet-connected devices, and cloud services
are charged based on the amount consumed by users worldwide. Such an idea of computing as a
utility is a long-held dream in the computer industry, but it remained immature until the
advent of low-cost data centers enabled this dream to come true.

Data centres, acting as "cloud providers", are computing infrastructures that provide many
kinds of agile and effective services to customers. A broad range of IT companies including
Amazon, Cisco, Salesforce, Yahoo, Facebook, Microsoft and Google have their own data centres
and provide pay-as-you-go cloud services. Two different but related types of cloud service
should be distinguished first: on-demand computing instances and on-demand computing capacity.

The advancement of cloud computing arose from the fast-growing use of the internet among the
public. Cloud computing is not a totally new technology but essentially a journey through
distributed, cluster and grid computing to the cloud. Indeed, with the surge of internet usage
all over the globe, cloud computing has already been ushered into the IT industry.

This section covers analogous systems, cloud computing basics, open source cloud resources,
cloud computing services, cloud management tools, and cloud simulation and modelling; cloud
testing benchmarks and applications of the cloud are also discussed.

ANALOGOUS SYSTEM

Cloud computing shares characteristics with: Autonomic computing - computer systems capable of
managing themselves, i.e. self-management. Client-server model - client-server computing refers
broadly to any distributed application that distinguishes between service providers (servers)
and service requesters (clients).

Grid computing - a form of distributed and parallel computing whereby a super, virtual
information processing system is composed of a cluster of networked, loosely coupled computers
acting in concert to perform very large tasks.

Mainframe computer - powerful computers used mainly by large organizations for critical
applications, typically bulk data processing such as censuses, industry and consumer
statistics, police and secret intelligence services, enterprise resource planning, and
financial transaction processing.

Utility Computing- The packaging of computing resources such as computation and storage as
a metered service similar to a traditional public utility such as electricity.

Peer-to-peer - a distributed architecture without the need for central coordination, in which
participants are at the same time both suppliers and consumers of resources.

CLOUD COMPUTING BASICS

Since 2007, the term "cloud" has become one of the biggest buzzwords in the IT industry. Dozens
of researchers have sought to define cloud computing from different application aspects, but
there is no single exact definition. Among the various definitions, we choose three widely
quoted ones, as follows:

 I. Foster: "A large-scale [2] distributed computing paradigm that is driven by economies of
scale, in which a pool of abstracted, virtualized, dynamically scalable, managed computing
power, storage, platforms, and services are delivered on demand to external customers over the
Internet."

As an academic representative, Foster focuses on the technological features that differentiate
cloud computing from other distributed computing paradigms: for example, computing entities are
virtualized and delivered as services, and these services are driven by economies of scale.

 Gartner: “A style of computing [2] where scalable and elastic IT capabilities are provided as a
service to multiple external customers using Internet technologies.”

Gartner is an IT consulting company, so it examines the qualities of cloud computing mostly
from the industry's point of view. This definition emphasizes functional characteristics, such
as whether cloud computing is scalable, elastic, offered as a service and Internet-based.

 NIST: "Cloud computing [2] is a model for enabling convenient, on-demand network access to a
shared pool of configurable computing resources (e.g., networks, servers, storage,
applications, and services) that can be rapidly provisioned and released with minimal
management effort or service provider interaction."

A. System Architecture

Clouds are usually referred to as a large pool of computing and storage resources which can
be accessed via standard protocols with an abstract interface.

The fabric layer contains the raw hardware-level resources, such as compute, memory, and
network resources. At the unified resource layer, resources have been virtualized so that they
can be exposed to the upper layers and to end users as integrated resources. The platform layer
adds a collection of specialized tools, services and middleware on top of the unified resources
to provide a development and deployment platform. The application layer contains the
applications that run in the clouds.

B. Characteristics of Cloud Computing

There are five essential characteristics of cloud computing, which explain its relation to and
difference from traditional computing.

 On-demand self-service - The consumer can provision or un-provision services when needed,
without human interaction with the service provider.

 Broad network access - Capabilities are available over the network and accessed through
standard mechanisms.

 Resource pooling - The provider's computing resources are pooled to serve multiple consumers
using a multi-tenant model, with physical and virtual resources dynamically assigned according
to consumer demand.

 Rapid Elasticity Services can be rapidly and elastically provisioned.

 Measured service - Cloud computing systems automatically monitor and optimize resource usage
by providing a metering capability appropriate to the type of service (e.g. storage, bandwidth,
processing, or active user accounts); a toy metering sketch follows this list.
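
As promised above, a toy metering sketch for the measured-service characteristic; the resource
names, rates and tenant are illustrative assumptions, not any provider's billing model.

from collections import defaultdict

usage = defaultdict(float)          # (tenant, resource) -> consumed units
RATES = {"storage_gb_hours": 0.02, "bandwidth_gb": 0.05}  # assumed price per unit

def meter(tenant: str, resource: str, amount: float) -> None:
    # Record consumption per tenant and per resource for later reporting/billing.
    usage[(tenant, resource)] += amount

meter("acme", "storage_gb_hours", 120.0)
meter("acme", "bandwidth_gb", 3.5)
bill = sum(units * RATES[r] for (t, r), units in usage.items() if t == "acme")
print(f"acme owes ${bill:.2f}")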

C. Cloud Deployment Model

Clouds are deployed in different modes, depending on their usage scope. There are four primary
cloud deployment models.

 Public Cloud

o Cloud infrastructure available to the general public for general use.

 Private Cloud

o Cloud infrastructure available mainly to a single organization, providing security for its resources.

 Community Cloud

o Cloud infrastructure that can be used by several organizations forming a private community,
supporting a specific community that shares resources.

 Hybrid Cloud

o A combination of public cloud, private cloud and even local infrastructure, which is typical
for most IT vendors.

D. Cloud Service Model

Cloud computing provides services through three basic models: software, platform and
infrastructure.

OPEN SOURCE RESOURCES FOR CLOUD COMPUTING

Open source software has been on the rise through the protracted economic downturn, and one of
the areas where it is starting to offer companies a lot of flexibility and cost savings is
cloud computing. Open source cloud deployments can save money, free businesses from vendor
lock-ins that could really sting over time, and offer flexible ways to unite public and private
applications. The following are a few top open source cloud applications, educational
resources, services, funding options and general items of interest.

A. Eucalyptus

Ostatic spread news about UC Santa Barbara's open source cloud project last year: an open
source infrastructure for cloud computing on clusters that duplicates the functionality of
Amazon's EC2, working mainly with the Amazon command line tools. The startup Eucalyptus Systems
was launched this year with venture funding, and its staff includes the original architects of
the Eucalyptus project. The company has released its first major update to the software
framework, which also powers the cloud features in the new version of Ubuntu Linux.

B. Red Hat's Cloud

Linux-focused open source player Red Hat has been rapidly expanding its focus on cloud
computing. Red Hat held its Open Source Cloud Computing Forum, which included a large number of
presentations from movers and shakers focused on open source cloud initiatives. Novell is
another open source focused company that is increasingly focused on cloud computing.

C. Traffic Server

Yahoo moved its open source cloud computing initiatives up a notch with the donation of its
Traffic Server product to the Apache Software Foundation. Traffic Server is used at Yahoo to
manage its own traffic, and it enables session management, authentication, configuration
management, routing and load balancing for entire cloud computing software stacks. It acts as
an overlay to raw cloud computing services and allows IT administrators to allocate resources,
handling thousands of virtualized services concurrently.

D. Cloudera

The open source Hadoop software framework is increasingly used in cloud computing deployments
thanks to its flexibility for data-intensive, cluster-based queries and other tasks. It is
managed by the Apache Software Foundation, and Yahoo has its own time-tested Hadoop
distribution. Cloudera is a promising startup focused on providing commercial support for
Hadoop.

E. Puppet

Virtual servers are on the rise in cloud computing deployments, and Reductive Labs' open source
software Puppet, built on the legacy of the Cfengine system, is highly respected by many system
administrators for managing them. It can be used to manage large numbers of systems or virtual
machines through automated procedures, without having to maintain a lot of complex scripting.

F. Enomaly

The company's Elastic Computing Platform (ECP) has its roots in the widely used Enomalism open
source provisioning and management software. It is designed to take much of the complexity out
of starting a cloud infrastructure: ECP is a programmable virtual cloud computing
infrastructure for small, medium and large businesses.

G. Joyent

Joyent acquired Reasonably Smart, a fledgling open source cloud startup based on JavaScript and
Git. Its cloud hosting infrastructure and cloud management software incorporate many open
source tools for public and private clouds, and the company can also help to optimize a speedy
implementation of the open source MySQL database for cloud use.

H. Zoho

Many people use Zoho's huge suite of free online applications, which competes with Google Docs.
Zoho's core is entirely open source: a shining example of how SaaS solutions can work in
harmony with open source.

I. Globus Nimbus

This open source toolkit allows businesses to turn clusters into Infrastructure-as-a-Service
(IaaS) clouds. The Amazon EC2 interface is carried over, but it is not the only interface that
can be selected.

J. Reservoir

This is a major European research initiative on virtualized infrastructure and cloud computing.
It is a far-reaching project targeted at developing open source technology for cloud computing
and helping businesses avoid vendor lock-in.

K. OpenNebula

The OpenNebula VM Manager, a core portion of Reservoir, is an open source answer to the many
virtual machine management offerings from proprietary players, and it interfaces easily with
cloud infrastructure tools and services. It is an open source virtual infrastructure engine
that enables the dynamic deployment and placement of virtual machines on a pool of physical
resources. It is good to see open source tools and resources competing in the cloud computing
space: the end result should be more flexibility for organizations that want to customize their
approaches, and open source clouds also have the potential to keep pricing for all competing
services on a level playing field.

CLOUD COMPUTING SERVICES

The three best cloud computing services are:

A. Heroku

Heroku is an impressive cloud hosting platform that is both robust and easy to use. Its
powerful features include compatibility with every major web language, fast application
deployment, analytics and monitoring, an intuitive user interface, easy scale management and
simple third-party add-on integration from outside vendors. It also provides a collection of
tutorials and documentation for uncomplicated use.

B. Amazon Elastic Beanstalk

Amazon Elastic Beanstalk is an excellent cloud platform that provides a comprehensive set of
cloud computing tools. Its services include Amazon EC2 (Elastic Compute Cloud), a Linux or
Windows server in the cloud; Amazon S3 (Simple Storage Service), where files such as documents,
photos and videos can be stored; Amazon RDS (Relational Database Service) for databases; and
Amazon ELB (Elastic Load Balancing), which manages traffic in and out of the cloud. A minimal
boto3 sketch of the S3 service follows.
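
As a concrete taste of the S3 service just described, this minimal boto3 sketch uploads and
retrieves a file; the bucket name and file paths are placeholders, and valid AWS credentials
are assumed to be configured in the environment.

import boto3

s3 = boto3.client("s3")
# Store a local file under a key in the (placeholder) bucket...
s3.upload_file("report.pdf", "example-bucket", "docs/report.pdf")
# ...and fetch it back to a new local path.
s3.download_file("example-bucket", "docs/report.pdf", "report-copy.pdf")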

C. Windows Azure

Windows Azure is perfect for anyone who wants to outsource all the administrative tasks
associated with cloud computing. It offers a platform-as-a-service that handles everything from
setting up and configuring to updating and maintaining your cloud server, and it also supports
.NET applications. The Windows Azure package has in-depth documentation and tutorials plus
email and phone support, which makes it one of the most user-friendly cloud services. Cbeyond,
a cloud services and communications provider for SMBs, suggests finding cloud computing
services that offer both cloud and network connectivity, to simplify logistics, save capital
and avoid the headaches of dealing with multiple vendors.

CLOUD MANAGEMENT TOOLS

Cloud computing environments are managed by tools known as cloud management tools. These tools
monitor, provision, or do both. The main cloud infrastructure management products offer similar
core features:

 Most support different cloud types (often referred to as hybrid clouds).

 Most support the on-the-fly creation and provisioning of new objects and the destruction of
unnecessary objects such as servers, apps and storage.

 Most provide the usual status reports on metrics such as response time, uptime and quota
use, and have a dashboard that can be drilled into.

When compared against those three criteria, only a few vendors offer comprehensive approaches
to handling provisioning and managing metrics in hybrid environments: RightScale, Zeus, Kaavo,
Scalr and Morph. There are also options offered by cloud vendors themselves that meet the
second and third criteria, such as CloudWatch from Amazon Web Services.

The large companies known for their traditional data center monitoring applications have been
slow to hit the cloud market: their products are rehashes of existing applications that provide
little beyond alerting and reporting tools. CA is on an acquisition spree to fix this and just
acquired 3Tera, a cloud provisioning player.

An example of the confusion in the industry is IBM's Tivoli product page for cloud computing,
where clicking the Getting Started tab results in a 404 error. Nice work, IBM. Meanwhile, HP's
OpenView (now called Operations Manager) can manage cloud-based servers, but so far only
manages them like any other server. BMC is working on a cloud management tool but does not yet
have anything beyond its normal products. The following are the good infrastructure management
and provisioning options available today.

A. RightScale

RightScale offers some of the better management tools for cloud environments, including a free
edition with limitations on features and capacity, designed as an introduction to the product.
RightScale's product is broken down into four parts:

 Cloud Management Environment

 Cloud-Ready Server Template and Best Practice Deployment Library

 Adaptable Automation Engine

 Multi-Cloud Engine

A fifth feature states that the "Readily Extensible Platform supports programmatic access to
the functionality of the RightScale Platform." In practice, these features are not really
separate from one another but make a nice integrated offering.

The management environment is the main interface users encounter in the software. It is
designed to walk the user through the initial process of migrating to the cloud using
RightScale's templates and library, and is then used for managing that environment through
continuing builds and by ensuring resource availability. This is where the automation engine
comes into play: it can quickly provision and put into operation additional capacity, or remove
excess capacity, as needed. There is also a Multi-Cloud Engine, which supports Amazon,
Eucalyptus, GoGrid and Rackspace, and RightScale is working on supporting the Chef open-source
systems integration specifications; Chef is designed from the ground up for the cloud.

B. Kaavo

Kaavo plays in a space very similar to RightScale's. The product is mainly used for:

 Single-click deployment of complex multi-tier applications in the cloud (Dev, QA, Prod)

 Handling demand bursts/variations by automatically adding/removing resources

 Run-time management of application infrastructure in the cloud

 Encryption of persisting data in the cloud

 Automation of workflows to handle run-time production exceptions without human intervention.

The core of Kaavo's product is named IMOD. IMOD handles configuration, changes ("adjustments"
in Kaavo's terminology) and provisioning to the cloud environment, across multiple vendors in a
hybrid model. Like all major CIM (cloud infrastructure management) players, Kaavo's IMOD sits
at the "top" of the stack, managing the infrastructure and application layers.

One particular feature of IMOD is its multi-cloud, single-system tooling: for example, you can
create a database backend in Rackspace while putting the presentation servers on Amazon. It
supports Amazon and Rackspace in the public space and Eucalyptus in the private space, though
it should be noted that most cloud management tools that support Amazon can also support
Eucalyptus.

Both Kaavo and RightScale offer scheduled "ramp-ups" and "ramp-downs" (dynamic allocation based
on demand) and monitoring tools to ensure that information and internal metrics (like SLAs) are
transparently available; the dynamic allocation helps in meeting the demands of those SLAs.
Both also provide the ability to keep templates to ease the deployment of multi-tier systems.

C. Zeus

Zeus was famous for its rock-solid web server, one that never had a great deal of market share
but did attract a band of rabid fans and top-tier customers in a market dominated by Apache
and, to a lesser extent, IIS, with its glut of load balancers. Zeus took its expertise in the
application server space and produced the Application Delivery Controller piece of the Zeus
Traffic Controller. It uses traditional load-balancing tools to test availability and to
spontaneously create or destroy additional instances in the cloud, providing on-the-fly
provisioning. It currently supports this on the Rackspace and Amazon platforms.

D. Scalr

Scalr is a small project hosted on Google Code and Scalr.net that creates dynamic clusters,
much like Kaavo and RightScale, on the Amazon platform. It supports triggered upsizing and
downsizing based on traffic demands, snapshots (which, incidentally, can be shared: a very cool
feature), and custom building of images for each server or server type. Scalr does not support
the wide range of platforms, applications, operating systems and databases the others do; it is
tied to the traditional expanded-LAMP architecture ([3] LAMP plus Ruby, Tomcat, etc.).

E. Morph

Morph is not a true management platform; the MSP-minded Morph products offer similar
functionality in their own private space. Morph CloudServer is a newer product that fills the
management and provisioning space as an appliance, aimed at enterprises seeking to deploy a
private cloud. The top-tier Morph CloudServer is based on the IBM BladeCenter and supports
hundreds of virtual machines.

Its core is the Ubuntu Linux operating system and the Eucalyptus cloud computing platform, and
it is aimed mainly at the managed service provider market, allowing the creation of private
clouds and dynamic provisioning within those closed clouds. Morph has made quite a splash and
bears watching, mostly because of its open-source roots and participation in open-cloud
systems.

F. Cloud Watch

Amazon's CloudWatch works on Amazon's platform only, which limits its overall usefulness since
it cannot be a hybrid cloud management tool; but Amazon's Elastic Compute Cloud (EC2) is the
biggest platform, so it still bears mentioning.

CloudWatch for EC2 supports dynamic provisioning (called auto-scaling), load balancing and
monitoring, all managed through the central management console used for Amazon Web Services.
Its biggest advantage is that there is no additional software to install and no additional
website through which to access applications. The product is clearly not for enterprises that
need hybrid support, but those which use Amazon exclusively should know that it is as robust
and functional as the other market players' offerings; a small monitoring sketch follows.
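
To illustrate the monitoring side, the sketch below publishes a custom metric with boto3, the
kind of signal an auto-scaling alarm could later act on. The namespace, metric name and value
are illustrative assumptions, and configured AWS credentials are assumed.

import boto3

cloudwatch = boto3.client("cloudwatch")
# Publish one data point of a custom metric; alarms and auto-scaling
# policies can then be attached to it from the AWS console or API.
cloudwatch.put_metric_data(
    Namespace="ExampleApp",                 # placeholder namespace
    MetricData=[{
        "MetricName": "ActiveSessions",     # placeholder metric
        "Value": 42.0,
        "Unit": "Count",
    }],
)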

SOFTWARE TESTING TOOLS TO TEST CLOUD APPLICATIONS


New open source testing tools are emerging with which the latest cloud computing software
applications can be deployed, managed and tested. With its dynamic scalability, flexibility and
virtualized resources provided as a service, cloud computing is seen as the dawn of a new era
for application services, and general-purpose applications such as Google Documents, Zoho,
Buzzword and Flickr show that cloud computing technology can be a most viable option for
application development and deployment. IT giants such as Microsoft, Amazon and Google are all
vying for a spot within the cloud computing space for the cloud-based software applications
currently available. For the near future, software testing appears to be the favoured use of
cloud environments.

A recent survey by Evans Data, an independent research firm that conducts periodic surveys of
developers, found that of those using cloud facilities to run applications, 49.8% were doing so
experimentally or for prototyping, 28.6% for non-critical business applications and 21.6% for
business-critical applications. Respondents see cloud environments as "better for testing
because they can be set up and torn down rapidly, sometimes at less expense than on-premise
facilities". The question, then, is what software testing tools are available to assist
developers and quality assurance staff in their application development and testing processes.
Testing tools designed for conventional applications are of little use for cloud testing, since
tools are needed that allow developers and testers to analyse the network, the desktop, and the
implications of changes within the cloud.

A growing variety of cloud-based open source software testing tools are being published. Cloud
Tools, for example, is a set of tools for deploying, managing and testing Java EE applications
on Amazon's Elastic Compute Cloud (EC2). It contains three main parts, including machine images
that can be configured to run Tomcat, and a Maven & Grails plug-in, making it an excellent tool
for open source cloud software testing.

PushToTest TestMaker is a distributed test environment that can run tests on test equipment or
in a cloud computing environment, and it introduces specific commands to support automated
cloud testing services. Cloud Tools and PushToTest TestMaker represent examples of products
that will help shape the future of robust cloud-based software testing applications. Though the
technology is at an early stage, various testing tools are emerging that can assist with cloud-
based software testing. Other popular testing tools include:

A. SoapUI

SoapUI is a functional testing tool for testing web services and APIs, with extensive support
for protocols such as SOAP, REST, JMS, HTTP and JDBC. One reason to use SoapUI is that it is
easy to get started with: it requires no programming knowledge, yet it is very powerful and has
extensive scripting support for all kinds of advanced use cases.

B. LoadUI

LoadUI is one of the world's most downloaded load-testing tools. It is free, open source and
easily extendable using Groovy, and it can integrate with SoapUI, letting you leverage existing
functional tests when load testing. It helps answer questions such as: "Does it perform? Does
it scale? What causes the bottlenecks? Is the experience consistent for users on different
continents?"
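
In the same spirit, a load test can be approximated in a few lines of Python: fire a batch of
concurrent requests at a target and summarize latency. This is a toy stand-in for LoadUI, not
its API; the URL, worker count and request count are placeholders.

from concurrent.futures import ThreadPoolExecutor
import time, urllib.request

URL = "http://example.com/"   # placeholder target

def one_request(_):
    # Time a single request end to end.
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

with ThreadPoolExecutor(max_workers=20) as pool:
    latencies = list(pool.map(one_request, range(100)))

print(f"mean latency: {sum(latencies) / len(latencies):.3f}s, "
      f"worst: {max(latencies):.3f}s")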

C. TestMaker

TestMaker is a distributed test environment that runs tests on local test equipment, in a cloud
computing environment, or both, and introduces specific commands to support automated cloud
testing. For example, you can identify a cloud testing service like Amazon EC2 in a
TestScenario: TestMaker creates the TestNodes in EC2 instances, runs the test, retrieves the
results, and shuts down the EC2 instances, all in a 'lights out' manner for fully automated
cloud testing.

APPLICATION OF CLOUD
Cloud computing has gained huge popularity in the industry due to its ability to host
applications for which the services can be delivered to consumers rapidly at minimal cost.
Applications from a range of domains, from science to engineering, gaming, and social
networking, are considered.

A. Scientific Applications

Scientific applications are a sector that is increasingly using cloud computing systems
and technologies. The immediate benefit seen by researchers and academics is the potentially
infinite availability of computing resources and storage at sustainable prices compared with a
complete in-house deployment. Cloud computing systems meet the needs of different types of
applications in the scientific domain: high-performance computing (HPC) applications,
high-throughput computing (HTC) applications, and data-intensive applications. The opportunity
to use cloud resources is even more appealing because minimal changes need to be made to
existing applications in order to leverage them. The most relevant option is IaaS solutions,
which offer the optimal environment for running bag-of-tasks applications and workflows, but
PaaS solutions have been considered as well: they allow scientists to explore new programming
models for tackling computationally challenging problems. Applications have been redesigned and
implemented on top of cloud programming models and platforms to leverage their unique
capabilities. Problems that require a higher degree of flexibility in structuring their
computation model can leverage platforms such as Aneka, which supports MapReduce and other
programming models; a toy MapReduce illustration follows.
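
The MapReduce model just mentioned can be illustrated with a deliberately small, single-process
Python sketch (Aneka itself is a .NET platform; none of its actual API appears here): map emits
key/value pairs, the runtime groups them by key, and reduce folds each group.

from collections import defaultdict

def map_phase(document: str):
    # Emit (word, 1) for every word in the document.
    for word in document.split():
        yield word.lower(), 1

def reduce_phase(word: str, counts: list) -> tuple:
    # Fold all the counts emitted for one key.
    return word, sum(counts)

documents = ["cloud computing in science", "science in the cloud"]
groups = defaultdict(list)
for doc in documents:                       # shuffle: group values by key
    for key, value in map_phase(doc):
        groups[key].append(value)

print(dict(reduce_phase(w, c) for w, c in groups.items()))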

B. Healthcare: ECG analysis in the cloud

Healthcare is a domain in which computer technology has found several and diverse
applications: from supporting the business functions to assisting scientists in developing
solutions to cure diseases. An important application is the use of cloud technologies to
support doctors in providing more effective diagnostic processes. The capillary development
of Internet connectivity and its accessibility from any device at any time has made cloud
technologies an attractive option for developing health-monitoring systems. ECG data analysis
and monitoring constitute a case that naturally fits into this scenario. ECG is the electrical
manifestation of the contractile activity of the heart’s myocardium. This activity produces a
specific waveform that is repeated over time and that represents the heartbeat. The analysis
of the shape of the ECG waveform is used to identify arrhythmias and is the most common
way to detect heart disease. Cloud computing technologies allow the remote monitoring of a
patient’s heartbeat data, data analysis in minimal time, and the notification of first-aid
personnel and doctors should these data reveal potentially dangerous conditions. [7] This way
a patient at risk can be constantly monitored without going to a hospital for ECG analysis.
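
As an illustration only (not this paper's system, and certainly not clinical guidance), the
following sketch shows the shape of such a cloud-side analysis: the device uploads RR intervals
(the time between beats) derived from the ECG waveform, and the service flags heart rates
outside a configurable band so that first-aid personnel can be notified. All thresholds and
names are assumptions.

def analyse_rr_intervals(rr_seconds, low_bpm=50, high_bpm=120):
    # Convert each RR interval to beats per minute and flag outliers.
    alerts = []
    for rr in rr_seconds:
        bpm = 60.0 / rr
        if not (low_bpm <= bpm <= high_bpm):
            alerts.append(f"abnormal heart rate: {bpm:.0f} bpm")
    return alerts

print(analyse_rr_intervals([0.8, 0.82, 0.31, 0.79]))  # 0.31 s -> ~194 bpm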

C. Biology: Protein Structure Prediction

Applications in biology often require high computing capabilities and often operate on
large datasets that cause extensive I/O operations. Because of these requirements, biology
applications have often made extensive use of supercomputing and cluster computing
infrastructures. Similar capabilities can be leveraged on demand using cloud computing
technologies in a more dynamic fashion, thus opening new opportunities for bioinformatics
applications.

Protein structure prediction is a computationally intensive task that is fundamental to
different types of research in the life sciences. Among these is the design of new drugs for the
treatment of diseases. The geometric structure of a protein cannot be directly inferred from
the sequence of genes that compose its structure, but it is the result of complex computations
aimed at identifying the structure that minimizes the required energy. This task requires the
investigation of a space with a massive number of states, consequently creating a large
number of computations for each of these states. The computational power required for
protein structure prediction can now be acquired on demand, without owning a cluster or
navigating the bureaucracy to get access to parallel and distributed computing facilities. Cloud
computing grants access to such capacity on a pay-per-use basis. This concept is distinctive of
cloud technologies and constitutes a strategic advantage when applications are offered and
delivered as a service.

D. Biology: Gene Expression Data Analysis for Cancer Diagnosis

Gene expression profiling is the measurement of the expression levels of thousands of
genes at once. It is used to understand the biological processes that are triggered by medical
treatment at a cellular level. Together with protein structure prediction, this activity is a
fundamental component of drug design, since it allows scientists to identify the effects of a
specific treatment. Another important application of gene expression profiling is cancer
diagnosis and treatment.

This problem is often approached with learning classifiers, which generate a population of
condition-action rules that guide the classification process. Among these, the eXtended
Classifier System (XCS) has been successfully utilized for classifying large datasets in the
bioinformatics and computer science domains. A variation of this algorithm, CoXCS, has proven
effective in these conditions. Cloud-CoXCS is a cloud-based implementation of CoXCS that
leverages Aneka to solve the classification problems in parallel and compose their outcomes.

E. Geoscience: Satellite Image Processing

Geoscience applications collect, produce, and analyse massive amounts of geospatial and
nonspatial data. As technology progresses and our planet becomes more instrumented (i.e.,
through the deployment of sensors and satellites for monitoring), the volume of data that needs
to be processed increases significantly. In particular, the geographic information system (GIS)
is a major element of geoscience applications. GIS applications capture, store, manage,
analyse, manipulate, and present all types of geographically referenced data. This type of
information is becoming increasingly relevant to a wide variety of application domains, from
advanced farming to civil security and natural resources management. As a result, a
considerable amount of geo-referenced data is ingested into computer systems for further
processing and analysis. Cloud computing is an attractive option for executing these demanding
tasks and extracting meaningful information to support decision makers.

F. Business and consumer applications

The business and consumer sector is the one that probably benefits the most from
cloud computing technologies. On one hand, the opportunity to transform capital costs into
operational costs makes clouds an attractive option for all enterprises that are IT-centric. On
the other hand, the sense of ubiquity that the cloud offers for accessing data and services
makes it interesting for end users as well. The combination of all these elements has made
cloud computing the preferred technology for a wide range of applications, from CRM and
ERP systems to productivity and social-networking applications.

CRM and ERP: Customer relationship management (CRM) and enterprise resource planning (ERP)
applications are market segments that are flourishing in the cloud, with CRM applications the
more mature of the two. Cloud CRM applications give small enterprises and start-ups a great
opportunity to have fully functional CRM software without large up-front costs, paying by
subscription instead. ERP solutions in the cloud are less mature and have to compete with
well-established in-house solutions. ERP systems integrate several aspects of an enterprise:
finance and accounting, human resources, supply chain management, manufacturing, project
management, and CRM. Their goal is to provide a uniform view of, and access to, all the
operations that need to be performed to sustain a complex organization.

Salesforce.com is probably the most popular and developed CRM solution available today. The
application provides customizable CRM solutions that can be integrated with additional features
developed by third parties. Salesforce.com is based on the Force.com cloud development
platform, a scalable and high-performance middleware that executes all the operations of all
Salesforce.com applications.

Microsoft Dynamics CRM is Microsoft's solution for customer relationship management. Dynamics
CRM is available either for installation on the enterprise's premises or as an online solution
priced as a monthly per-user subscription. NetSuite provides a collection of applications that
help customers manage every aspect of the business enterprise. Its offering is divided into
three major products: NetSuite Global ERP, NetSuite Global CRM+, and NetSuite Global Ecommerce;
moreover, an all-in-one solution, NetSuite OneWorld, integrates all three products together.

Productivity:

Productivity applications replicate in the cloud some of the most common tasks that we are used
to performing on our desktops: from document storage to office automation and complete desktop
environments hosted in the cloud. One of the core features of cloud computing is availability
anywhere, at any time, and from any Internet-connected device, so document storage is a natural
application for this technology. Online storage solutions preceded cloud computing, but they
never became popular.

Perhaps the most popular solution for online document storage is Dropbox, an online
application that allows users to synchronize any file across any platform and any device in a
seamless manner. Dropbox provides users with a free amount of storage that is accessible
through the abstraction of a folder. Another interesting application in this area is iCloud, a
cloud-based document-sharing application provided by Apple to synchronize iOS-based
devices in a completely transparent manner. There are other solutions for online document
sharing, such as Windows Live, Amazon Cloud Drive and CloudMe.

Google Docs is a SaaS application that delivers basic office automation capabilities
with support for collaborative editing over the Web. The application is executed on top of
the Google distributed computing infrastructure, which allows the system to scale
dynamically according to the number of users of the service. Google Docs allows users to
create and edit text documents, forms, spreadsheets, presentations and drawings. It aims to
replace desktop products such as Microsoft Office and OpenOffice and to provide a similar
interface and functionality as a cloud service.

Social networking applications have grown considerably in the last few years to
become the most active sites on the Web. To sustain their traffic and serve millions of users
seamlessly, services such as Twitter and Facebook have leveraged cloud computing
technologies. The possibility of continuously adding capacity while systems are running is the
most attractive feature for social networks, which constantly increase their user base.

Media applications are a niche that has taken considerable advantage of cloud
computing technologies. In particular, video-processing operations, such as encoding,
transcoding, composition, and rendering, are good candidates for a cloud-based
environment: these are computationally intensive tasks that can easily be offloaded to cloud
computing infrastructures, as the sketch below illustrates. Animoto is perhaps the most
popular example of a media application on the cloud. Video encoding and transcoding are
operations that can greatly benefit from using cloud technologies.
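To illustrate how such offloading might look in practice, the following is a minimal Python sketch that enqueues transcoding jobs on a cloud queue using boto3 (the AWS SDK for Python). The queue URL and the message format are illustrative assumptions, not taken from any of the systems discussed above.

# Enqueue transcoding tasks so that cloud workers can process them.
import json
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/transcode-jobs"  # hypothetical

def submit_transcode_job(source_key, target_format):
    # One message per clip; a fleet of cloud workers drains the queue.
    job = {"source": source_key, "format": target_format}
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(job))

for clip in ["intro.mov", "scene1.mov", "credits.mov"]:
    submit_transcode_job(clip, "mp4")

Because each clip is an independent task, adding more workers shortens the overall processing time almost linearly, which is exactly why these workloads suit the cloud.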

Cloud Computing Overview & Current Research Technologies


Cloud computing is a technology that allows users to access shared computing resources over
the internet on demand; it is an internet-based model for handling, storing, and processing
data, and it supports the development and deployment of flexible, enterprise-wide
operations on the cloud platform. In the cloud we store, manage and process data on remote
servers. Many industries, such as banking and education, have moved to the cloud because of
the efficiency of services provided by the pay-per-use pattern, billed on resources such as
processing power used, transactions carried out, bandwidth consumed and data transferred.
No experts are required for hardware and software maintenance, no server space is required,
and the cloud can provide better data security. Infrastructure as a Service (IaaS), Software as
a Service (SaaS), and Platform as a Service (PaaS) are the three types of cloud computing
services. This research paper analyses the cloud computing architecture, the different
deployment and service models, and current research technologies.

The term “cloud” refers to a network or the internet. It is a technology that allows you to
store and access data, such as files, audio, and video, over the internet. The advantages of
using cloud computing include:

i) High Security

ii) High Performance

iii) Multi-Sharing

iv) High Scalability

v) On-demand self-service

vi) Automatic software updates

vii) Pay per use

COMPUTING PLATFORMS AND TECHNOLOGIES

Amazon Web Services (AWS)

Amazon’s AWS (Amazon Web Services) is a secure cloud services platform. It provides services
including database storage, processing resources, content delivery, Simple Queue Service
(SQS), Simple Email Service (SES), Relational Database Service (RDS) and other features that
allow companies to scale.
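As a concrete illustration, the following minimal Python sketch stores and retrieves an object in Amazon S3 with boto3; it assumes boto3 is installed and AWS credentials are configured, and the bucket name is hypothetical.

import boto3

s3 = boto3.client("s3")

# Store an object in S3 and read it back on demand.
s3.put_object(Bucket="example-bucket", Key="notes/hello.txt", Body=b"Hello, cloud")
obj = s3.get_object(Bucket="example-bucket", Key="notes/hello.txt")
print(obj["Body"].read().decode())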

Google AppEngine

Google AppEngine is a scalable runtime environment and an example of Platform as a Service
(PaaS). It provides web application developers and enterprises with access to Google's
scalable hosting and tier-1 internet service. AppEngine supports applications written in
Java or Python.
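The kind of application AppEngine hosts can be sketched in a few lines of Python; the example below uses Flask, which the modern AppEngine Python runtime supports, and the route and message are purely illustrative.

# main.py - a minimal web app of the kind App Engine hosts and scales.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    # App Engine starts and stops instances of this handler automatically.
    return "Hello from a scalable runtime"

if __name__ == "__main__":
    # Local testing only; in production App Engine runs the app itself.
    app.run(host="127.0.0.1", port=8080)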

Windows Azure

Windows Azure is the former name of Microsoft Azure. It supports a broad range of
programming languages, databases, operating systems and frameworks, allowing IT
professionals to rapidly develop, deploy, and manage applications on a global network. It also
helps users to organise their services into various classes.

o Microsoft Azure is a scalable, versatile, and cost-effective cloud computing platform.

o It allows you to start for free and also offers a pay-per-use model.

o It supports a variety of programming languages, including C#, Node.js, Java, and others.

o Its IaaS architecture enables us to start a general-purpose virtual machine on a variety of
platforms, including Windows and Linux.
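The pay-per-use storage service can be illustrated with a minimal sketch using the azure-storage-blob Python SDK; the connection string (read here from an environment variable) and the container name are assumptions for illustration.

import os
from azure.storage.blob import BlobServiceClient

conn_str = os.environ["AZURE_STORAGE_CONNECTION_STRING"]  # assumed to be set
service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("example-container")

# Upload a small blob; billing follows the pay-per-use model described above.
container.upload_blob(name="greeting.txt", data=b"Hello from Azure", overwrite=True)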

Aneka

Aneka provides an application programming interface (API) and a virtual execution
environment, together with a set of APIs for expressing the business logic of applications.

o It harnesses multiple virtual machines.

o It supports multiple programming models and languages.

o Aneka is an implementation of the PaaS model.

o Aneka can act as a private, public, or hybrid cloud.

Aneka Container:

Aneka is middleware, and the Aneka container represents the runtime environment for
executing applications. It provides three types of services:

1) Execution Services

2) Foundation Services

3) Fabric Services

Data Security In Cloud Computing


IMPORTANCE OF SECURITY IN CLOUD COMPUTING

The power, flexibility and ease of use of cloud computing come with a lot of security
challenges. Even though cloud computing is a new, intuitive way to access applications and
make work simple, there are a number of challenges and issues that can affect its adoption. A
non-exhaustive search in this field reveals some of them: Service Level Agreements (SLAs),
what to migrate, security, etc. Cloud computing has a feature of automatic updates, which
means a single change by an administrator to an application is reflected for all its users. This
inadvertently also leads to the conclusion that any fault in the software is visible to a large
number of users immediately, which is a major risk for any organization with weak security.

It is also agreed upon by many researchers that security is a huge concern for the
adoption of cloud computing. A survey by IDC of 263 executives also shows that security is
ranked first among challenges in cloud computing. Even if a company boasts top-class
security, if it does not update its security policies from time to time it will be prone to security
breaches in the near future.

SECURITY CONCERNS IN CLOUD COMPUTING

a. User authentication: The user authentication process must be improved to ensure that
malicious users do not get access to powerful computing systems in cloud computing (a
minimal hashing sketch follows this list).

b. Data leakage or data loss: Data can be at risk if an unauthorized person gains access to the
shared pool of resources and deletes or modifies data. This risk increases further if there
exists no backup for that data.

c. Client trust: Strong authentication practices must be implemented to ensure that the
client's data is protected from unauthorized access.

d. Malicious user handling: Malicious users can be attackers using cloud services with
malicious intent, or insiders who have gained the trust of a company but work to gain access
to sensitive information stored in the cloud.

e. Hijacking of sessions: These kinds of attacks happen when a legitimate user falls prey to
phishing or to insecure application interfaces that can be exploited by attackers. Through
such attacks, attackers gain user credentials and hijack legitimate users' sessions.

f. Wrong usage of cloud computing and its services: Cloud service providers offer free trial
access to their cloud services for a limited period of time. Some users utilize this trial period
to misuse the resources obtained from the cloud service provider.
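For concern (a) above, the following is a minimal sketch of salted password hashing using only the Python standard library; the password strings and iteration count are illustrative, and a real deployment would combine this with other controls such as multi-factor authentication.

import hashlib
import hmac
import os

def hash_password(password, salt=None):
    # Derive a key with PBKDF2; store (salt, digest), never the raw password.
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, digest):
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("s3cret!")
assert verify_password("s3cret!", salt, digest)
assert not verify_password("wrong-guess", salt, digest)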

RESEARCH MOTIVATION

In this research work we will try to enhance security between the client and the cloud. No
doubt the cloud has multiple benefits, but we should not forget that there is a high risk of
confidential information being leaked. In order to avail ourselves of the benefits of the cloud,
we must ensure the security of data being transferred between the client and the cloud.
Security is key to the cloud's success, and security in the cloud is now the main challenge of
cloud computing. Until a few years ago all the business processes of organizations ran on
their private infrastructure and, though it was possible to outsource services, it was usually
only non-critical data and applications that left the premises. Now, with cloud computing, the
story has changed. The traditional network perimeter is broken, and organizations feel they
have lost control over their data. New attack vectors have appeared, and the benefit of being
accessible from anywhere has become a big threat.

• No secure authentication: In the existing work there is no secure authentication procedure
defined. When you log on to your machine and then try to access a resource, say a file server
or database, something needs to assure that your username and password are valid. With
sensitive data of different users stored in the cloud, we need a strong authentication
mechanism; data breaches occur because of missing or weak authentication.

• No gateway is defined: The user should not be directly connected to the cloud provider, as
there is a high risk of data being stolen or hacked by a third-party intruder. A gateway or
broker is required that acts as an intermediary between the cloud provider and the client.

• No read/write policies have been defined: Different privileges should be given to different
types of users, as sketched below.
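A minimal Python sketch of such per-role read/write policies follows; the role names and permission sets are illustrative assumptions, not a prescribed scheme.

# Map each role to the set of actions it is allowed to perform.
ROLE_PERMISSIONS = {
    "admin":  {"read", "write", "delete"},
    "editor": {"read", "write"},
    "viewer": {"read"},
}

def is_allowed(role, action):
    # Grant the action only if the user's role includes it.
    return action in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("editor", "write")
assert not is_allowed("viewer", "write")  # viewers get read-only access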

CHARACTERISTICS OF CLOUD COMPUTING

Cloud computing has the following characteristics in order to meet client or user
requirements and to provide qualitative services.

1) High scalability: It means on-request provisioning of resources on a huge scale without
requiring human interaction with each service provider.

2) High availability and reliability: Servers are highly available and reliable, which limits the
chances of failure in the infrastructure.

3) Agility: It shares the resources among users and works very quickly.

4) Multi-sharing: Multiple customers and applications work more effectively, at lower cost,
by sharing common infrastructure in distributed computing.

5) Maintenance: Maintenance of cloud computing applications is easier because they do not
need to be installed on each computer and can be accessed from various places, ultimately
reducing cost.

6) Low cost: It is cost-effective because the company no longer needs to set up its own
infrastructure; it pays according to the resources it has consumed.

7) Services in pay-per-use mode: APIs (Application Programming Interfaces) are given to
clients for accessing services on the cloud, and clients pay on the basis of the services they
use.

8) On-Demand Self Service: Cloud computing allows clients to use services and resources on
request, without human interaction with cloud service providers. One can log on to a website
at any time and use the resources. Computing resources include virtual machines, processing
power, storage, etc.

9) Broad network access: Resources such as virtual machines, storage and processing power
can be accessed over the internet using heterogeneous devices such as mobile phones,
laptops and desktop computers. Since cloud computing is internet-based, it can be accessed
at any time and from anywhere.

10) Resource Pooling: Cloud computing allows multiple tenants to share a pool of resources.
One can share a single physical instance of a database, hardware and basic
infrastructure. For example, a physical server may host several virtual machines belonging to
different users.

11) Rapid elasticity: It is very easy to scale resources up or down at any time. Resources used
by or currently assigned to customers are automatically monitored, which makes elastic
provisioning possible (a simple threshold-based sketch follows this list).

12) Measured Service: In a measured service, the cloud provider controls and monitors every
aspect of the cloud service; this underpins capacity planning, resource billing, optimization
and so on.

CLOUD COMPUTING TECHNOLOGIES

Several technologies work behind the cloud computing platform to make it reliable,
adaptable and usable:

A. Virtualization

B. Service-Oriented Architecture (SOA)

C. Grid computing

D. Utility Computing

A. Virtualization

Virtualization is a procedure that permits the sharing of a physical instance of a resource or
an application among multiple customers or organizations. It does so by assigning a logical
name to a physical resource and providing a pointer to that physical resource when
demanded (a toy sketch of this mapping follows the list below). One main use of this
technology is to provide applications with a standard version for cloud clients; for example, if
an updated version of an application is released, the cloud provider can deliver the updated
version to all clients at once. Providers such as VMware and Xen offer virtualized IT
infrastructures on demand. Virtual network technologies, for example Virtual Private
Networks (VPNs), support clients with a customized network environment for accessing
cloud resources. Virtualization techniques are the basis of cloud computing, since they
render scalable and flexible hardware services. Multitenant architecture offers virtual
isolation among the various tenants, so organizations can use and customize the application
as if each had its own instance running. Following are the types of virtualization:

1. Hardware Virtualization

2. Operating system Virtualization

3. Server Virtualization

4. Storage Virtualization

(Figure: the architecture of the virtual cloud model.)

1) Hardware Virtualization: If the Virtual Machine Manager (VMM) or Virtual Machine
Software (VMS) is directly installed on the hardware system, it is called hardware
virtualization. Hardware virtualization is used for server platforms because controlling a
virtual machine is easier than controlling a physical server.

2) Operating System Virtualization: If the VMM or VMS is installed on a host operating
system rather than directly on the hardware, it is called operating system virtualization.
Operating system virtualization is used for testing applications on various OS platforms.

3) Server Virtualization: If the VMM or VMS is directly installed on the server system, it is
called server virtualization. Server virtualization is used when a single physical server has to
be divided into multiple servers to balance load on demand.

4) Storage Virtualization: The process of pooling physical storage from multiple network
storage devices is known as storage virtualization. After pooling, the multiple storage devices
appear as a single storage device. Storage virtualization is used for backup and recovery
purposes.
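The logical-name-to-physical-resource mapping described above can be illustrated with a toy Python sketch; real hypervisors are far more involved, and the resource paths here are hypothetical.

class ResourceDirectory:
    # Maps logical names to whatever physical resource currently backs them.
    def __init__(self):
        self._mapping = {}

    def register(self, logical_name, physical_resource):
        self._mapping[logical_name] = physical_resource

    def resolve(self, logical_name):
        # Clients only ever ask for the logical name.
        return self._mapping[logical_name]

directory = ResourceDirectory()
directory.register("db-primary", "rack4/server12/disk3")
print(directory.resolve("db-primary"))   # rack4/server12/disk3
# Migration: remap the logical name without clients noticing.
directory.register("db-primary", "rack7/server02/disk1")
print(directory.resolve("db-primary"))   # rack7/server02/disk1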

B. Service-Oriented Architecture (SOA): Service-oriented architecture lets applications be
used as services by other applications, regardless of the vendor, product or technology
involved. In this way it is possible to exchange data between applications of different vendors
without additional programming or making changes to the services. SOA is an application
framework that takes everyday business applications and divides them into individual
business procedures and functions, called services. This component of cloud technology
enables organizations to access cloud-based computing solutions with features that can be
adapted on demand as business needs change.

SOA places the responsibility and costs of deployment, development and maintenance
of web service components on the web services provider, which allows a web services
consumer to access various web services without the cost or overhead associated with
traditional IT service delivery. SOA is a successful technological component of cloud
computing because it encourages integrated distribution and component reuse, which
significantly drives down the cost of software development and delivery. Service-oriented
computing introduces and diffuses two important concepts that are also fundamental for
cloud computing: Quality of Service (QoS) and Software as a Service (SaaS). Quality of Service
identifies a set of functional and non-functional attributes that can be used to evaluate the
behaviour of a service from different perspectives, while Software as a Service introduces a
new delivery model for applications, inherited from the world of Application Service
Providers (ASPs). A minimal sketch of a function exposed as a service follows.
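The sketch below shows SOA's core idea in Python with Flask: a business function exposed as a vendor-neutral web service exchanging JSON. The endpoint path and payload fields are illustrative assumptions.

from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/services/quote", methods=["POST"])
def quote_service():
    # Any client, regardless of vendor or technology, can call this
    # service over HTTP without extra programming.
    order = request.get_json()
    total = order["quantity"] * order["unit_price"]
    return jsonify({"total": total, "currency": "USD"})

if __name__ == "__main__":
    app.run(port=5000)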

C. Grid computing:

Grid computing is defined as distributed computing in which a number of computers from
multiple locations are connected together to achieve a common goal. The computing
resources are heterogeneous and geographically spread. Grid computing therefore breaks a
large problem into smaller pieces, and these smaller pieces are disseminated to systems that
reside within the grid (see the sketch below). A grid system is intended for the sharing of
resources through distributed and large-scale cluster computing. Grid computing is popular
in e-science, forms of research that often require huge computing power and collaboration
between various data and computing services. Scheduling applications on grids can be a
complex task, particularly when orchestrating the flow of data across distributed computing
resources. Grid workflow systems have been developed as a specialized form of workflow
management system, designed specifically to compose and execute a series of computational
or data-manipulation steps in a grid setting. A famous grid computing project is
Folding@home, which uses the unused computing power of thousands of computers to work
on a complex scientific problem. The goal of the project is "to understand protein folding,
misfolding, and related diseases".
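A minimal Python sketch of the divide-and-disseminate idea follows; here a local process pool stands in for geographically spread grid nodes, and the squared-sum workload is purely illustrative.

from concurrent.futures import ProcessPoolExecutor

def partial_sum(chunk):
    # One "grid node" computes its piece of the overall problem.
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    # Break the large problem into smaller pieces.
    chunks = [data[i:i + 100_000] for i in range(0, len(data), 100_000)]
    with ProcessPoolExecutor() as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # the pieces are recombined into the final answer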

D. Utility Computing: Utility computing relies upon a pay-per-use model. It provides
computational resources on demand as a metered service. Managed IT services, grid
computing and cloud computing all follow on from the concept of utility computing.

In reality, pricing on cloud computing can be very complex. As an example, the pricing of
Amazon S3 as of November 2009 is explained below. Amazon's charges for using S3 in the US
are divided into three parts: storage charges, data transfer charges and charges for the
number of requests. These charges are summed together to compute the total bill.

Data transfer charges are further divided into charges for data transfer in and data
transfer out; the rate for incoming data is $0.100 per GB. Utility computing helps in reducing
the initial investment: as the computing requirements of an individual or an organization
change, the billing changes accordingly, without incurring any additional cost, and if usage
drops, the bill drops with it. A toy calculation of such a bill follows.
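The three-part bill described above can be computed with a short Python sketch. Only the inbound rate of $0.100/GB comes from the text; the other rates are placeholder assumptions, since the text does not give them.

STORAGE_PER_GB = 0.150       # placeholder assumption
TRANSFER_IN_PER_GB = 0.100   # from the text (November 2009)
TRANSFER_OUT_PER_GB = 0.170  # placeholder assumption
PER_10K_REQUESTS = 0.01      # placeholder assumption

def monthly_bill(stored_gb, in_gb, out_gb, requests):
    # Sum the storage, data transfer, and request charges.
    return (stored_gb * STORAGE_PER_GB
            + in_gb * TRANSFER_IN_PER_GB
            + out_gb * TRANSFER_OUT_PER_GB
            + (requests / 10_000) * PER_10K_REQUESTS)

print(f"${monthly_bill(stored_gb=50, in_gb=20, out_gb=10, requests=120_000):.2f}")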

CONCLUSION
Cloud computing can be viewed as a new phenomenon that is set to change the way we use
the Internet, yet there is much to be careful about. Numerous new technologies are emerging
at a rapid rate, each with technical advances and the potential to make people's lives easier.
This paper gives a brief overview of cloud computing: deployment models, service models,
cloud computing techniques, virtualization, SOA, grid computing and utility computing. We
aimed to focus on the presentation and description of the emerging trends in the technology
used for cloud-based services. Several studies have been made in this field with the goal of
offering on-demand services to clients and limiting downtime while migrating a VM's
memory from one physical host to another. In future work we will attempt to unite them to
develop a practical approach to describe, discover, compose and manage the computing
resources and network components constituting the cloud, based on the SOA concept. This
approach will provide the flexibility and scalability needed to guarantee interoperability
among the systems and heterogeneous resources forming the cloud. We will focus on the
dynamic composition of the systems forming the cloud and allow the discovery,
decomposition and execution of "cloud services" on demand. The association and
orchestration of these services will ideally be managed under SOA.
With the continuous growth and expansion of cloud computing, security has become
one of the most serious issues. Cloud computing platforms need to provide reliable security
technology to prevent security attacks, as well as the destruction of infrastructure and
services. There is no doubt that cloud computing is the development trend of the future.
Cloud computing brings us approximately infinite computing capability, good scalability,
service on demand and so on, but it also brings challenges in security, privacy, legal issues
and more, and solving the existing issues is of the utmost urgency. To protect against the
compromise of the compliance, integrity and security of their applications and data,
organizations need firewalls, intrusion detection and prevention, integrity monitoring, log
inspection and malware protection. Proactive enterprises and service providers should apply
this protection to their cloud infrastructure to achieve security, so that they can take
advantage of cloud computing ahead of their competitors. These security solutions should
have the intelligence to be self-defending and the ability to provide real-time detection and
prevention of known and unknown threats. To advance cloud computing, the community
must take proactive measures to ensure security.

In the present day, cloud computing is one of the most significant platforms, providing
data storage at very low cost and available at all times over the internet. But it has critical
issues such as security, load management and fault tolerance. In this paper we discuss load
balancing approaches. The design of resource scheduling management in cloud computing is
an important problem: the scheduling model, cost, quality of service, time and conditions of
the request for access to services are the factors to be considered. A good task scheduler
should adapt its scheduling strategy to the changing environment and to the cloud's load
balancing task-scheduling policy. Cloud computing is high-utility software with the ability to
change the IT software industry and make software even more attractive. This paper is based
on cloud computing technology, which has very vast potential and is still largely unexplored;
the capabilities of cloud computing are endless. Cloud computing provides everything to the
user as a service, including platform as a service, application as a service and infrastructure
as a service. One of the major issues of cloud computing is load balancing, because
overloading of a system may lead to poor performance, which can make the technology
unsuccessful; there is therefore always a requirement for efficient load balancing algorithms
for efficient utilization of resources. Our paper focuses on the various load balancing
algorithms and their applicability in the cloud computing environment.

Cloud computing refers to "servers" that are accessed over the internet (present at a
remote location). Cloud computing means storing, managing and accessing data and
programs on remote servers hosted on the internet instead of on a computer's hard drive.
Cloud computing provides on-demand network access to a broad range of tools in the cloud
from a variety of service providers, such as Google Cloud, Microsoft Azure, IBM Cloud,
Amazon Web Services and Aneka. It saves money, but it can also lead to risk issues and
resource suspension when used in large quantities. This research paper presents an overview
of cloud computing; furthermore, the research challenges and technologies currently faced in
cloud computing are also highlighted.

Cloud computing is an emerging technology. It is an attractive solution when the
infrastructure or the IT personnel are not available or are too expensive, but it has its
drawbacks. The drawbacks can mainly be found in the security threats and vulnerabilities of
cloud computing. Unlike traditional solutions, where threats come from two known sources,
inside or outside the network, cloud computing security threats may originate from many
different sources. In this paper we discussed most of the cloud security threats from three
perspective levels: the application, network and user levels. We also addressed some possible
ways to reduce these security risks.


Cloud computing is the use of computing resources that are delivered as a service over
a network. It shares characteristics with autonomic computing, grid computing, the client-
server model, mainframe computing, utility computing and peer-to-peer systems. A detailed
study of cloud computing basics, such as deployment models, system architecture, cloud
services and types of cloud, has been carried out. The open-source resources of cloud
computing have been studied, and management tools, load tools and test tools for cloud
computing have been identified.



