
Mediterranean Journal of Basic and Applied Sciences (MJBAS)

Volume 4, Issue 4, Pages 194-209, October-December 2020

Secure and Privacy-Based Data Sharing Approaches in Cloud Computing for Healthcare Applications

Dr. Vinod Varma Vegesna
Sr. IT Security Risk Analyst, The Auto Club Group, United States of America. Email: vinodvarmava@gmail.com

DOI: http://doi.org/10.46382/MJBAS.2020.4409

Copyright: © 2020 Dr. Vinod Varma Vegesna. This is an open access article distributed under the terms of the Creative Commons Attribution License, which
permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Article Received: 12 September 2020 Article Accepted: 29 November 2020 Article Published: 31 December 2020

ABSTRACT

Cloud frameworks are extensively employed in the medical industry for a broad range of applications, including medical information storage, distribution, and administration. Given the advantages of cloud computing, several medical companies are exploring such techniques to address difficulties within the medical sector, and the cloud has evolved into an essential component of healthcare delivery. It can help medical companies concentrate on their core activities, medical assistance, and clinical management, and it provides a safer approach for sharing confidential material with hospitals as well as third-party researchers and medical institutes. Nevertheless, because cloud technology is still emerging and developing rapidly, in both theory and practice, significant legal/contractual, economic, customer-satisfaction, connectivity, cybersecurity, and confidentiality problems remain open research questions. In this article, we describe several cloud computing services and implementation strategies and highlight important problems. The study also focuses on options for safe and secure information exchange within cloud technology for medical domains.
Keywords: Privacy-based data sharing, Healthcare, Security breach.

1. Introduction
Transferring confidential clinical information, including patient records, to clouds maintained by private entities exposes it to security breaches and consequently poses serious privacy hazards. A physician at a clinic, for instance, once identified and authorized, gets full access to health services and in particular to private data. Patients, by contrast, are opposed to having private and confidential information shared with anyone who is not fully identified and authorized [1-4]. Accessing patient records in a cloud context therefore raises significant issues regarding the safety and confidentiality of this information, which require special consideration. It must be guaranteed that only legitimate users have access to highly confidential patient data. To prevent unwanted access, control over the information should be handed to the proper individuals, who can decrypt and use it appropriately [5-8]. Because exposure of patient data is a major infringement of a person's confidentiality, the data must be handled with caution. While security concerns are among the most important factors in the adoption of cloud-based technology in the health sector, security is also typically among the most critical considerations in e-health data centers, particularly when exchanging patient records. Numerous researchers have examined the information-privacy challenges associated with exchanging patient records and have proposed techniques to handle identity management. Cloud technology has grown in popularity owing to its several benefits, including faster performance, simplicity, efficiency improvements, and incident management. Cloud technologies are well known for their compatibility, which can give users several advantages. There is presently a demand for IT firms to improve their information-exchange activities. According to an InformationWeek poll, nearly all IT businesses share sensitive information: 74% of organizations share information with consumers, while 64% share it with providers. Nearly a quarter of the firms polled prioritize information exchange.

ISSN: 2581-5059 www.mjbas.com



The cloud, on the other hand, is vulnerable to several security and confidentiality threats. The most significant impediment to the advancement and widespread acceptance of the cloud is the set of confidentiality and security issues connected with it. According to an IDC Enterprise Panel poll, 75% of questioned respondents are concerned about the vulnerability of their most important information technology and business systems. Several security and privacy attacks originate inside the cloud service provider itself, since providers frequently have easy access to the archived information or may transfer it to other entities for financial gain. Figure 1 depicts a conceptual use of cloud services in healthcare.

Figure 1. Use of cloud services in healthcare

As a result, privacy and security of information are the primary concerns in cloud-based solutions. Cloud encryption standards protect information that is transmitted to or stored in a cloud environment [9-13]. Because all information placed in the public cloud is encrypted, users can effortlessly and safely use pooled cloud-based services. Cloud encryption approaches secure important documents without impeding communication. Among the most frequently used safeguards in information systems is access control: a policy that allows, denies, or restricts access to resources in a computer system, and that can also detect and log attempts to gain access to a machine. It is an extremely significant tool for cybersecurity management. Consequently, another difficult task in the cloud setting is determining how to exchange user information safely and securely.

Ciphertext-Policy Attribute-Based Encryption (CP-ABE) has emerged as an essential cryptographic solution to the difficulty of safe information transfer. In CP-ABE, a set of attributes defines a participant's private key, and an access policy is coupled with each encrypted message. The Data Owner (DO) has the authority to establish the access policy over the universe of attributes. Only when a participant's attribute set satisfies the access policy of an encrypted message can that participant decrypt it. Using a CP-ABE subsystem in a virtualized environment may, however, leave certain issues unresolved. All individuals' private keys must be provided by a completely trustworthy key authority (KA). This creates a safety risk known as the key escrow dilemma: knowing a participant's private key allows the KA to decrypt all of that participant's encrypted information, which is completely against the participant's wishes. An additional consideration is the complexity of managing attribute values.

2. Review of Existing Literature

This section gives an elaborate literature review of secure and privacy-based data-sharing approaches in cloud computing for healthcare as well as ordinary applications [14-20]. ABE can be seen as a generalization of the IBE framework beyond a single identity attribute. In an ABE scheme, the sender encrypts information under a set of attributes together with a threshold d; the ciphertext can be decrypted only if the receiver holds at least d of the specified attributes. Building on these ideas, an ABE system with fine-grained data management that accommodates monotonic access structures, such as AND, OR, and other threshold gates, was introduced. Investigators later presented an improved approach that supports non-monotonic structures, such as NOT gates. ABE is divided into two categories: KP-ABE and CP-ABE. In KP-ABE, the access structure is embedded in the private key, while attributes characterize the encrypted message. In contrast, CP-ABE encrypts the message under an access policy, and the cryptographic keys are generated from a set of attributes.
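The threshold behavior described above can be sketched in a few lines. This is a conceptual illustration only; the attribute names and the function are illustrative, not taken from any scheme in the paper:

```python
def fuzzy_abe_can_decrypt(key_attrs, ciphertext_attrs, d):
    """Sahai-Waters-style threshold matching (conceptual only):
    decryption succeeds iff at least d attributes overlap between
    the receiver's key and the ciphertext."""
    return len(set(key_attrs) & set(ciphertext_attrs)) >= d

# With d = 2, a key holding {A, B} overlaps {A, B, C} in two attributes.
print(fuzzy_abe_can_decrypt({"A", "B"}, {"A", "B", "C"}, 2))  # True
print(fuzzy_abe_can_decrypt({"A", "D"}, {"A", "B", "C"}, 2))  # False
```

The actual cryptography interpolates key shares over the overlapping attributes; only the matching condition is modeled here.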

Investigators established a safe cloud services strategy for healthcare data stored in the cloud by dividing clients into diverse domains employing Chase and Chow's multi-authority ABE technique; nevertheless, such a method is an outlier that is not widely used in cloud-based solutions. Others proposed an access-control process for outsourced information. In this approach, every piece of information is encrypted with a secret key and each client is allocated private keys; however, the overall cost of operations such as database creation and user grant is proportional to the number of users, rendering the scheme unscalable. Furthermore, certain privacy-protection problems in information outsourcing remain, spanning from information security to information usefulness. That study provides an excellent overview of the privacy and security of information in leased development servers.

Another study proposed a hierarchical access-control system based on Hierarchical IBE and CP-ABE. The design of this technique is tiered, with a root master and numerous domain masters producing credentials for clients; nonetheless, implementation is challenging owing to the enormous number of credentials necessary for every object. A number of cryptographic techniques have been provided for cloud applications; consider, for example, the traditionally researched basic cryptographic algorithms. The ABE technique has been further improved and extended into KP-ABE and CP-ABE.

Fuzzy IBE was created by academics as a foundational effort in attribute-based cryptography. Following this, researchers formally introduced attribute-based encryption. In this ABE system, both the user's private key and the encrypted message are associated with a collection of attributes. The client may decrypt the encrypted message if and only if a certain number of attributes of the ciphertext and of the user's private key coincide. Unlike standard public-key schemes such as Identity-Based Encryption, ABE is designed so that an encrypted message is not necessarily encoded for a single user but can be encoded for several recipients. The threshold semantics of Sahai and Waters' ABE scheme are still not expressive enough for constructing larger generic access-control schemes. Within ABE, regulations are established and enforced by the cryptographic mechanism itself. There are two main sorts of contemporary ABE schemes: KP-ABE systems and CP-ABE systems.

A KP-ABE technique was devised by experts. It allows for broader authentication and authorization and is an updated form of the previously described general ABE framework. In the KP-ABE system, attributes are attached to the ciphertext, and access restrictions are embedded in the user's private key. The access structure associated with the participant's private key must be satisfied by the attributes of the ciphertext in order to decrypt the message. The KP-ABE system uses a public-key mechanism designed for one-to-many transmission. For instance, suppose the attribute universe is A, B, C, and D. One encrypted message is produced under the attribute set {A, B}, and an access policy (A AND C AND D) is embedded in the participant's private key. In this scenario, the participant cannot decrypt that message, but would be able to decrypt an encrypted message carrying attributes A, C, and D.
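The running example can be mirrored in code. The AND-only policy below is a toy stand-in for the general access trees KP-ABE supports, and the function name is illustrative:

```python
def kp_abe_can_decrypt(key_policy_and, ciphertext_attrs):
    """KP-ABE (conceptual): the policy lives in the user's key and the
    attribute set labels the ciphertext. Here the policy is a plain
    AND over attributes -- a simplification of real access structures."""
    return set(key_policy_and) <= set(ciphertext_attrs)

key_policy = {"A", "C", "D"}                            # policy (A AND C AND D)
print(kp_abe_can_decrypt(key_policy, {"A", "B"}))       # False
print(kp_abe_can_decrypt(key_policy, {"A", "C", "D"}))  # True
```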

Another upgraded type of ABE, known as CP-ABE, was then suggested by researchers. In the CP-ABE technique, the access policy is attached to the encrypted message, while attributes are connected to the participant's private keys; only keys whose attributes fulfil the policy linked to the information may decrypt the encrypted message. CP-ABE thus operates in the direction opposite to KP-ABE. The encrypter specifies a threshold access-control policy over the relevant attributes when encrypting files. After the text is encrypted under the specified access-control policy, only customers whose private-key attributes fulfil that policy may decrypt the message. For example, suppose the attributes are A, B, C, and D, user 1 receives a private key for attributes A, B, and D, and user 2 receives a private key for attribute D alone. If an encrypted message is secured under the policy (A OR C) AND D, user 1 will be able to decrypt it whereas user 2 will not. The encryption remains secure even if the storage system is untrustworthy, which makes the CP-ABE approach more resistant to collusion attempts. For enforcing security controls on encrypted information, the CP-ABE technique is more intuitive to use than the KP-ABE method.
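The policy check from this example can be sketched as a tiny boolean evaluator. This is a toy model of CP-ABE policy satisfaction, not a cryptographic scheme; the tuple encoding of policies is an assumption for illustration:

```python
def satisfies(policy, attrs):
    """Evaluate a small boolean policy tree against an attribute set.
    Nodes are ("AND", ...) or ("OR", ...) tuples, or bare attribute
    strings at the leaves."""
    if isinstance(policy, str):
        return policy in attrs
    op, *children = policy
    checks = [satisfies(c, attrs) for c in children]
    return all(checks) if op == "AND" else any(checks)

policy = ("AND", ("OR", "A", "C"), "D")    # (A OR C) AND D
print(satisfies(policy, {"A", "B", "D"}))  # user 1: True
print(satisfies(policy, {"D"}))            # user 2: False
```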

Almost every extant CP-ABE system demands a fully trusted authority. To overcome the key escrow issue in a multi-authority environment, researchers recommended a distributed KP-ABE technique. Provided they do not collude with one another, every authority in this system participates in the key-generation procedure in a distributed manner. Because there is no centralized trusted authority holding master secrets, every attribute authority in the network must interact with the others to generate a participant's private key. The main problem with this strategy is performance degradation: it incurs O(N^2) communication complexity during both the setup and any rekeying phase, and every user must store O(N^2) additional values besides the attribute keys.

Utilizing the traditional CP-ABE, researchers have developed an enhanced secure information-exchange system. The key escrow problem is solved by employing an escrow-free key issuance protocol in which the key generation center and the data storage center collaborate to produce a private key for each user. The protocol necessitates interactive computation by both sides, so the computing expense of creating a participant's cryptographic key rises. According to the safety and security assessments, the suggested method is effective at controlling the information disseminated within the information-sharing environment.
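The escrow-free intuition — no single party ever holds the complete key material — can be pictured with a toy two-share construction. This XOR-of-shares sketch is an assumption for illustration, not the paper's two-party protocol, and the function names are hypothetical:

```python
import hashlib
import secrets

def issue_share():
    """Each party (key generation center, data storage center) samples
    its own 32-byte secret share independently."""
    return secrets.token_bytes(32)

def derive_user_key(share_kgc, share_dsc, user_id):
    """The user key depends on both shares, so neither party alone can
    recompute it -- a toy picture of escrow-free key issuance."""
    mixed = bytes(a ^ b for a, b in zip(share_kgc, share_dsc))
    return hashlib.sha256(user_id.encode("utf-8") + mixed).hexdigest()
```

In the real protocol the two parties run an interactive computation rather than simply XOR-ing shares, which is where the extra key-generation cost mentioned above comes from.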

Investigators also developed a cloud-technology access-control approach featuring efficient attribute and client revocation. By enhancing the existing CP-ABE technique, the computational complexity of client-side cryptography is reduced from O(2N) to O(N), where N represents the number of attributes, and the length of the encrypted message is around 50% of the plaintext size. Nevertheless, the system's trustworthiness is not adequately demonstrated. To create and distribute user private keys, several extant CP-ABE methods need a strong central authority holding a master private key. As a result, the key escrow problem is fundamental, with the authority having the capacity to decrypt all system users' encrypted messages.

To address the problem of dynamic membership administration, researchers designed a dynamic-state ABE. This gives significant flexibility in attribute management, allowing people to dynamically enter, exit, and edit their attributes. Every client may register with and exit any ABE platform, as well as update their attributes and the values associated with them. Whenever registration, departure, or attribute modification happens, nobody else needs to change their secret key.

3. Characteristics of Cloud

Cloud resources are distinguished by five key qualities that illustrate their relationship to, or distinction from, existing computational methodologies: (i) infrastructure abstraction, (ii) resource democratization, (iii) service-oriented design, (iv) flexibility or elasticity, and (v) a consumption-and-allocation utility model.

Table 1. Relationship between cloud properties and cloud computing technologies

S. No.  Cloud characteristics  Cloud mechanisms
1.      On-demand Usage        Automated scaling listener; Pay-per-use monitor
2.      Ubiquitous Access      Multi-device broker
3.      Multitenancy           Virtual server; Container; Resource replication
4.      Measured Usage         Pay-per-use monitor
5.      Elasticity             Virtual server; Automated scaling listener; State management database; Resource replication; Container; Pay-per-use monitor
6.      Resiliency             Failover system; State management database; Resource replication; Container
7.      Elasticity             Hypervisor; Cloud usage monitor; Automated scaling listener; Resource replication; Load balancer; Resource management system
8.      Measured Usage         Hypervisor; Cloud usage monitor; SLA monitor; Pay-per-use monitor; Audit monitor; SLA management system; Billing management system
9.      Resiliency             Hypervisor; Resource replication; Failover system; Resource cluster; Resource management system

Table 1 maps cloud properties to the cloud computing mechanisms that realize them. Fundamentally, the cloud service mechanisms listed above aid in fulfilling the corresponding cloud features.

a. Abstraction of infrastructure

The processing, networking, and storage facilities required to deliver services are decoupled from the applications and from resource allocation. From the standpoint of an application, or of the provider's capacity to offer it, where and on which physical resources particular information is processed, transported, and saved remains essentially transparent. Irrespective of the tenancy type used - shared or private - infrastructure resources are typically aggregated to supply services. Such abstraction is usually provided by high levels of virtualization at the hardware and OS layers, or through substantially customized data files and software platforms, including interoperability at higher levels.

b. Democratization of resources

The abstraction of infrastructure gives rise to resource democratization: whether infrastructure, programs, or data, aggregated assets can be rendered available and reachable to anyone and anything permitted to use them through recognized channels.

c. Service-oriented design

Because the separation of infrastructure from applications and data results in well-defined, loosely coupled, democratized resources, the idea of utilizing these elements in whole or in part, alone or in combination, yields a service-oriented architecture in which assets can be obtained and utilized in a systematic manner. The emphasis in this paradigm is on the delivery of services rather than on the network infrastructure.

d. Flexibility/dynamism

The on-demand cloud resource-management framework, in conjunction with high levels of automation, software as a service, and pervasive, dependable, high-bandwidth interconnection, enables massive expansion or contraction of resource allocation to match service definitions and specifications through a cloud framework that scales with demand. Because resources are shared, improved utilization and service levels are possible.

e. Consumption and allocation utility model

The cloud's abstracted, democratized, and flexible architecture, combined with strict control, management, deployment, and identity handling, enables the dynamic allocation of resources depending on any combination of governing criteria. Given this fine-grained transparency, network capacity can be exploited to give a regulated utility pricing and usage paradigm. This allows for increased economic efficiency and scalability, in addition to controlled and predictable expenditure. In essence, the fundamental parties in the cloud-based data model are the Data Owner (DO), the Data User (DU), and the Cloud Service Provider (CSP) (Figure 2). The DO uses the cloud's infrastructure to outsource its information. Furthermore, the DO relies on the cloud, which may offer other capabilities such as information computing, data search, or sharing of information. The user is the person or entity who uses data in the cloud and conducts actions including retrieval of information, information gathering, information calculation, and access.

Furthermore, the CSP has massive memory and processing capacity and is in charge of assigning services to information users when they require them inside the cloud infrastructure. In certain circumstances, cloud users may themselves be responsible for delivering solutions, like a CSP.

Figure 2. SPI Model Cloud Framework

4. Secure Cloud Communications

Once the preceding step of anonymized verification succeeds, a customer 'Ci' may transfer information to the cloud and, whenever necessary, retrieve it. Anonymous authentication enables users to access the common spaces of an Internet or FTP site without having to submit a username or password, while a bidirectional cipher helps safeguard the data.

Within the authentication step, AES is used for both encryption and decryption. In this situation, safe virtualized communications are maintained by taking into account the CSP's role as a third-party auditor in certain circumstances. A third-party auditor may itself be a fraudulent client; under this design, such a security breach in the cloud environment would be ineffective. Table 2 lists significant cloud security vulnerabilities [21-22].


Table 2. Significant Security Issues in Cloud Infrastructure

S. No.  Security Challenge     Description
1.      Availability           Temporary and permanent unavailability may cause service breakdown; DoS attacks, natural disasters, equipment failures
2.      Access Control Issues  Physical and logical control missing over the organization's internal and the DBaaS provider's employees; an increase in development and analysis cost is incurred when user management and granular access control are implemented
3.      Integrity Checks       Need to avoid modification of configuration, access, and data files; require integrity and accuracy of information
4.      Auditing and Monitoring  Configuration requirements change continuously; important for avoiding failures, backup maintenance, and configuration of auto fail-over mechanisms; requires robust network and physical devices, expertise, and relevant resources
5.      Data Sanitization      Data may be recovered by malicious sources if not properly discarded
6.      Data Confidentiality   Unencrypted data in memory, on disk, or in networks might result in data breaches; co-located application data is vulnerable to software bugs and errors in the clouds; external organizations may also generate attacks
7.      Data Replication and Consistency Management  Replication between multiple servers may cause management as well as consistency issues
8.      Network Security       Data flowing over the internet may be prone to hazardous attacks


Furthermore, in the suggested method of confidential internet communication, the information obtained from the system is processed, and the processed information is then handled on the service platform through an operating platform supporting cryptographic algorithms.

[Flowchart: the user inputs a username and password; the password is converted to its SHA-512 hash form; the hash is checked against the database; on success the user proceeds to the home system, otherwise control returns to the input step.]

Figure 3. SHA framework

Meanwhile, the generated password hash is sent to the CSP interface for access authorization, and information is uploaded via a third-party administrator (TPA). A TPA is a service firm that performs operations on behalf of another organization under the conditions of a service contract. The primary responsibility of the TPA is to secure the data: it generates hashes over the encoded packets acquired from the cloud platform, concatenates them, and generates fingerprints from the result. It then examines these fingerprints to determine whether the information stored in the cloud has been altered, and reports the outcome to the cloud client. SHA (Figure 3) is a family of cryptographic hash algorithms employed here to secure data. It converts information using binary arithmetic operations, segmented addition, and integrity techniques; the hash returns a fixed-length digest bearing no resemblance to the input, and such algorithms make it immensely difficult to recover the original information once it has been hashed. The researchers retained this cloud model's cryptography as well as its baseline confidentiality procedures. Utilizing the TPA, the pairing-based signature model provides a guaranteed reliable audit. Furthermore, the model employs a batch verification technique to eliminate server communication overhead and reduce the model's expenditure.
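The hash-and-fingerprint step the TPA performs can be sketched with the standard library. The function name and block layout are illustrative assumptions, not the paper's exact construction:

```python
import hashlib

def tpa_fingerprint(blocks):
    """TPA-style check (sketch): hash each encoded block with SHA-512,
    concatenate the digests, and fingerprint the concatenation. Any
    change to any block changes the final fingerprint."""
    digests = b"".join(hashlib.sha512(b).digest() for b in blocks)
    return hashlib.sha512(digests).hexdigest()

baseline = tpa_fingerprint([b"block-1", b"block-2"])
tampered = tpa_fingerprint([b"block-1", b"block-X"])
print(baseline != tampered)  # True: tampering is detected
```

Comparing a freshly computed fingerprint against the stored baseline tells the TPA whether the cloud copy has been altered.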

Security protocols that offer dynamic data handling were supplied by standard technology. The concept highlights the difficulty of offering a concurrent public audit model together with flexible information processing for remote data analysis, and the research establishes the requirement for a secured, unregistered transmission medium; however, none of the cryptographic methods is specified in the framework. In contrast, a related method provides consumer privacy inside the cloud context: it introduces a customer security administrator which reduces the leakage of customers' confidential communications.

Moreover, reliability modeling relied upon robustness modeling that includes the use of an Information Dispersal Algorithm (IDA). These approaches, nevertheless, were insufficient for client linkability in identifying fraudulent usernames and passwords.

5. Cloud Security Techniques

A. Delay-Optimized File Retrieval utilizing LT-Based Cloud Storage

Because of its fast recovery, the Luby Transform (LT) code remains one of the most common fountain codes in archival storage systems. Sequential multi-stage retrieval of fragments was shown in the study to be beneficial in reducing file-retrieval latency. First, delay models were created for several multi-stage retrieval strategies suitable for the investigated system. The study focused on overall file-retrieval delay, characterized as the time between the gateway receiving the request for an LT-coded document and the gateway sending out the final LT-coded packet. The time it takes to retrieve a document is a significant predictor of customer experience.

Researchers formulated a delay-optimal file-retrieval problem wherein retrieval latency is minimized by intelligently timing packet-retrieval requests; that is, file-retrieval latency is reduced by intelligently scheduling LT-coded packet requests. The suggested multi-stage request technique effectively reduced the mean file-retrieval latency for a given number of stages. A notable aspect of this approach is that it addresses the problem of optimal document-retrieval latency in a decentralized cloud storage system. Using this model, the researchers developed an efficient two-stage request technique for a given decoding probability. Both experimental and quantitative studies showed that the optimal technique can significantly reduce median latency.

The research thereby provides storage-facility administrators with a method for designing an efficient retrieval strategy for LT-based decentralized cloud storage systems.
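The delay trade-off behind multi-stage requests can be illustrated with a small Monte Carlo sketch. The i.i.d. exponential per-packet latency and the function names are modeling assumptions for illustration, not the paper's model:

```python
import random

def kth_arrival(n, k, rng, rate=1.0):
    """Delay until the k-th of n parallel packet fetches completes,
    assuming i.i.d. exponential per-packet latencies."""
    return sorted(rng.expovariate(rate) for _ in range(n))[k - 1]

def mean_delay(n, k, trials=4000, seed=7):
    """Average retrieval delay when n coded packets are requested and
    any k of them suffice to decode the file."""
    rng = random.Random(seed)
    return sum(kth_arrival(n, k, rng) for _ in range(trials)) / trials
```

Requesting extra coded packets (n > k) lowers the k-th arrival time at the cost of redundant traffic; a two-stage scheme requests a modest first batch and tops up only if decoding fails, which is the balance the optimized request schedule exploits.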

B. Enhancing Synchronization Effectiveness for Mobile Cloud Storage Services

Mobile cloud-storage providers have been seeing tremendous growth. This study analyzes, investigates, and then addresses a synchronization-inefficiency issue in current mobile cloud-storage systems. Even when incremental sync is performed, the findings show that traditional commercial sync solutions are incapable of making maximum use of bandwidth and therefore create a substantial amount of superfluous sync traffic in some cases.

Based on these results, QuickSync, a solution featuring three distinct techniques to increase sync effectiveness for mobile cloud-storage services, is offered; its structure builds on two existing sync services. The researchers also assessed its ability to improve network-utilization effectiveness and, using real applications, improved overall sync performance. When the performance of the original Seafile and Dropbox clients is compared with versions of the two service architectures enhanced with QuickSync, the enhanced versions outperform the originals. QuickSync works with both Dropbox and Seafile. Comprehensive testing showed that QuickSync can efficiently improve sync efficiency and substantially reduce traffic and latency for typical sync operations.
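The core idea of reducing superfluous sync traffic — upload only the pieces that changed — can be sketched with fixed-size chunk hashing. Real systems of this kind use smarter, content-defined chunking; this minimal stand-in and its function names are illustrative assumptions:

```python
import hashlib

def chunk_hashes(data, size=4):
    """Split data into fixed-size chunks and hash each one."""
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    return [hashlib.sha256(c).hexdigest() for c in chunks], chunks

def delta(old, new, size=4):
    """Return the (index, chunk) pairs that must be re-uploaded."""
    old_h, _ = chunk_hashes(old, size)
    new_h, new_c = chunk_hashes(new, size)
    return [(i, new_c[i]) for i, h in enumerate(new_h)
            if i >= len(old_h) or h != old_h[i]]

print(delta(b"aaaabbbbcccc", b"aaaaBBBBcccc"))  # [(1, b'BBBB')]
```

Only the middle chunk differs, so only it would be synced; unchanged chunks generate no traffic.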

C. Public Auditing Using Dynamic Hash Tables for Secured Data Storage

The dynamic hash table (DHT), a novel two-dimensional data structure maintained by a third-party auditor (TPA), serves to record data-property information for dynamic auditing. Unlike previous efforts, the developed scheme migrates the authorized information from the cloud service provider (CSP) to the TPA, considerably reducing computational load and communication overhead. Data privacy is achieved by combining a homomorphic authenticator based on public keys with the TPA's random masking, and batch auditing is accomplished using the aggregate BLS signature technique. The proposed scheme enables efficient, secure auditing for cloud storage and surpasses earlier techniques in computational complexity, storage cost, and communication overhead. Furthermore, for data protection, it incorporates a random mask generated by the TPA into the proof-generation process to blind the data contents.

It also uses the aggregate BLS signature technique, built on bilinear maps, to handle multiple auditing tasks at the same time. The basic idea is to aggregate the signatures generated by multiple users on distinct data blocks into a single short signature and verify it only once, reducing resource consumption during verification. A natural next step is therefore to build a more complete solution that supports diverse auditing requirements across different types of cloud environments.
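
As a rough illustration of the auditor-side index described above, the sketch below models a dynamic hash table as a per-file list of block records holding version numbers and timestamps (class and field names are illustrative; the actual scheme keeps this metadata alongside the cryptographic state needed for verification):

```python
import time
from dataclasses import dataclass, field

@dataclass
class BlockRecord:
    """Per-block metadata tracked by the auditor."""
    version: int
    timestamp: float

@dataclass
class DynamicHashTable:
    """Auditor-side index: file id -> ordered list of block records."""
    files: dict = field(default_factory=dict)

    def add_file(self, fid: str, n_blocks: int) -> None:
        self.files[fid] = [BlockRecord(1, time.time()) for _ in range(n_blocks)]

    def modify_block(self, fid: str, idx: int) -> None:
        rec = self.files[fid][idx]
        rec.version += 1
        rec.timestamp = time.time()

    def insert_block(self, fid: str, idx: int) -> None:
        self.files[fid].insert(idx, BlockRecord(1, time.time()))

    def delete_block(self, fid: str, idx: int) -> None:
        del self.files[fid][idx]

dht = DynamicHashTable()
dht.add_file("report.pdf", 4)
dht.modify_block("report.pdf", 2)   # bump version of block 2
dht.insert_block("report.pdf", 0)   # later blocks shift right by one
assert len(dht.files["report.pdf"]) == 5
assert dht.files["report.pdf"][3].version == 2  # the modified block, shifted
```

Because only this lightweight metadata lives at the TPA, block modifications, insertions, and deletions are handled locally at the auditor without retransmitting authenticators from the CSP.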

D. Outsourced Attribute-Based Encryption with Keyword Search Functionality

Attribute-based encryption (ABE) has been used to build fine-grained access control, presenting a potentially viable solution to cloud security problems. Outsourced ABE with a fine-grained access-control system can significantly lower computational costs for clients who need to retrieve encrypted files from the cloud by offloading heavy computation to the cloud providers.

However, as the volume of encrypted data stored in the cloud grows, efficient query execution becomes harder. To address this issue, a new cryptographic primitive was developed: an attribute-based encryption scheme with outsourced key issuing and outsourced decryption that also supports keyword search.

The time-consuming pairing operations can be delegated to a cloud service provider while clients perform only lightweight operations, reducing costs from both the client's and the trusted authority's perspectives. The proposed approach includes keyword search capability, which can considerably increase communication efficiency while protecting users' privacy and safety.
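
The keyword search capability can be illustrated, in a much-simplified symmetric setting rather than actual ABE, by an index keyed on HMAC "trapdoors": the server matches tokens against the index without learning the underlying keywords (all names and the scheme itself are illustrative, not the paper's construction):

```python
import hmac, hashlib, os

def trapdoor(key: bytes, keyword: str) -> bytes:
    """Deterministic keyword token; the server sees the token, not the word."""
    return hmac.new(key, keyword.encode(), hashlib.sha256).digest()

def build_index(key: bytes, docs: dict[str, list[str]]) -> dict[bytes, set[str]]:
    """Client-side: map each keyword token to the set of matching document ids."""
    index: dict[bytes, set[str]] = {}
    for doc_id, words in docs.items():
        for w in words:
            index.setdefault(trapdoor(key, w), set()).add(doc_id)
    return index

def search(index: dict[bytes, set[str]], token: bytes) -> set[str]:
    """Server-side: look up a token without knowing the plaintext keyword."""
    return index.get(token, set())

key = os.urandom(32)
docs = {"rec1": ["diabetes", "insulin"], "rec2": ["insulin", "dosage"]}
index = build_index(key, docs)
assert search(index, trapdoor(key, "insulin")) == {"rec1", "rec2"}
assert search(index, trapdoor(key, "cardiac")) == set()
```

In the outsourced-ABE setting the same idea is realized with attribute-based trapdoors, so only users whose attributes satisfy the access policy can generate valid search tokens.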

E. Storage Service Through Multiple Cloud Providers in Clouds

Numerous cloud service providers offer storage services through data centers located across the globe. These data centers exhibit different access latencies and different pricing for resource usage and reservation. To lower payment costs and service latency, the researchers proposed three enhancement techniques: (1) coefficient-based data reallocation, (2) multicast-based data transfer, and (3) request-redirection-based congestion control.

A customer's data center issues Get/Put requests, driven by the activity of the customer's users, to a storage data center that holds the required data. DAR seeks to build, for each customer, a plan that allocates every data item to a set of selected storage data centers, assigns request rates to those data centers, and determines reservation amounts so as to satisfy the service level objective (SLO) while minimizing the customer's payment cost.

This effort aims to minimize customers' payment costs while still guaranteeing SLOs by exploiting geographically distributed data centers belonging to different CSPs with varying service prices. The cost-minimization problem is first formulated as an integer program. Because the problem is NP-hard, the researchers proposed DAR as a practical heuristic that combines a data allocation algorithm across storage data centers with an effective resource reservation algorithm to reduce the cost incurred at each storage data center.
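
A heavily simplified sketch of the allocation idea (illustrative numbers and names; DAR additionally spreads request rates and reservations across several selected data centers) picks, for each data item, the cheapest data center whose latency satisfies the SLO:

```python
# Hypothetical data centers: (name, access latency in ms, price per GB served)
centers = [("dc-east", 40, 0.09), ("dc-west", 90, 0.04), ("dc-eu", 140, 0.02)]
SLO_LATENCY_MS = 100

def allocate(items: dict[str, float]) -> dict[str, str]:
    """Assign each item to the cheapest data center meeting the latency SLO."""
    eligible = [c for c in centers if c[1] <= SLO_LATENCY_MS]
    cheapest = min(eligible, key=lambda c: c[2])
    return {item: cheapest[0] for item in items}

def total_cost(plan: dict[str, str], items: dict[str, float]) -> float:
    """Payment cost of a plan under the per-GB prices above."""
    price = {name: p for name, lat, p in centers}
    return sum(items[i] * price[dc] for i, dc in plan.items())

items = {"scan-001": 2.0, "chart-007": 0.5}   # item -> GB served per period
plan = allocate(items)
assert plan == {"scan-001": "dc-west", "chart-007": "dc-west"}
# dc-eu is cheaper still, but its 140 ms latency violates the SLO.
```

The NP-hardness arises once replication, per-data-center capacity, and reservation pricing interact; the greedy choice above is only the intuition behind the heuristic.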

F. SLO-Guaranteed Cloud Storage Service

ES3, an economical, SLO-guaranteed multi-cloud storage service, derives data allocation and resource reservation plans that minimize payment cost while guaranteeing the SLO. ES3 includes a coordinated data allocation and resource reservation method, which allocates each data item to a data center and determines the reservation amount at each data center by exploiting all of the providers' pricing policies, together with a genetic-algorithm-based data allocation adjustment method, which reduces the Get/Put rate variance at each data center in order to maximize the reservation benefit. The problem of finding the optimal data allocation and reservation plans for cost minimization under SLO guarantees is formulated using integer programming.
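
The reservation benefit that the GA-based adjustment tries to maximize can be seen in a toy pricing model (hypothetical prices): reserved capacity is cheaper per request but is paid for whether used or not, so a data center with a steady request rate is far cheaper to serve than one with the same total load concentrated in bursts:

```python
RESERVED_PRICE = 0.02    # per reserved request slot, paid every period
ON_DEMAND_PRICE = 0.05   # per request beyond the reservation

def cost(reservation: int, rates: list[int]) -> float:
    """Total cost of serving a per-period request-rate series."""
    reserved = reservation * RESERVED_PRICE * len(rates)
    overflow = sum(max(0, r - reservation) for r in rates) * ON_DEMAND_PRICE
    return reserved + overflow

def best_reservation(rates: list[int]) -> int:
    """Brute-force the reservation level that minimizes total cost."""
    return min(range(max(rates) + 1), key=lambda r: cost(r, rates))

steady = [100, 100, 100, 100]
bursty = [10, 10, 10, 370]          # same 400 total requests, high variance
assert best_reservation(steady) == 100
assert cost(best_reservation(steady), steady) < cost(best_reservation(bursty), bursty)
```

This is why ES3 reshuffles data to flatten each data center's Get/Put rate: lower variance lets more of the load be covered by cheap reservations.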

The researchers offered this economical, SLO-guaranteed multi-cloud storage service for a cloud broker, delivering SLO assurance and reduced costs regardless of how the Get rate varies. ES3 is superior to earlier techniques because it fully leverages the multiple pricing policies and takes request-rate variation into account when lowering payment costs. To guarantee the SLO while minimizing payment cost, ES3 combines the data allocation and reservation technique with the GA-based data allocation adjustment method.

G. Key-Aggregate Cryptosystem

This work demonstrates how to share data via cloud storage securely, efficiently, and flexibly. The researchers provided a novel public-key cryptosystem that produces constant-size ciphertexts, allowing efficient delegation of decryption rights for any set of ciphertexts. The novelty is that any set of secret keys can be aggregated and made as compact as a single key, while retaining the power of all the keys being aggregated.

This compact aggregate key can easily be sent to others or stored on a smart card with very limited secure storage. The researchers provided formal security analysis of their schemes in the standard model and also discussed how these schemes can be applied in various settings. The construction is based on a collusion-resistant broadcast encryption technique. Although that approach allows constant-size secret keys, each key can only decrypt ciphertexts associated with particular indices; as a result, a new Extract algorithm and a corresponding Decrypt algorithm had to be designed.
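
The pairing-based construction itself cannot be sketched faithfully without a pairing library, but the flavor of constant-size delegation can be shown with a simple hash-chain scheme: a single delegated key unlocks a whole contiguous suffix of ciphertext classes (unlike the actual key-aggregate cryptosystem, which handles arbitrary sets; all names here are illustrative):

```python
import hashlib

def derive(key: bytes, steps: int) -> bytes:
    """Walk the one-way hash chain forward by the given number of steps."""
    for _ in range(steps):
        key = hashlib.sha256(key).digest()
    return key

master = b"\x01" * 32          # data owner's master secret
class_key = lambda i: derive(master, i)   # key for ciphertext class i

# Delegation: handing out class_key(5) lets the recipient derive the keys
# for classes 5, 6, 7, ... (one hash step each), but never classes 0..4,
# because SHA-256 cannot be inverted.
delegated = class_key(5)
assert derive(delegated, 1) == class_key(6)
assert derive(delegated, 2) == class_key(7)
```

The restriction to contiguous suffixes is exactly what the pairing-based construction removes for arbitrary key sets, at the cost of the tailored Extract and Decrypt algorithms mentioned above.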

6. Conclusions

Cloud services remain among the most appealing technology fields today, owing to their cost-effectiveness and versatility. This paper argues that, despite the surge in activity and attention, there are serious, ongoing concerns about cloud applications that are hindering progress and may ultimately jeopardize cloud computing's promise as a revolutionary IT procurement paradigm.

Despite the acknowledged financial and technological benefits of cloud computing services, many prospective cloud consumers are still hesitant to adopt the cloud, and even large organizations that are cloud customers mostly keep their less sensitive data there. This study found that a lack of control leads to vulnerability in cloud technology, which runs counter to the original premise of cloud adoption and undermines its value proposition. Moreover, this paper recommends that accountability be established, both for financial purposes and to alleviate concerns about the possibility of cybersecurity incidents.

