
Serverless Computing

Challenges and Road Ahead

Manan Sharma, Deepak Saini, Utpal Chaudhary, Riteek Raj Modi, Vani Gupta, Monika
Introduction
Cloud Computing
Cloud computing is the on-demand delivery of
computing resources, such as computing power and
storage, usually over the public network. It relies
on sharing resources among its users and typically
uses a ‘pay-as-you-go’ model, meaning users pay
only for OPEX (operational expense) rather than
CAPEX (capital expense). This has certainly helped
reduce capital expenses, but it has also led to
unexpected operating expenses for users.
AWS
Global Leader

AWS was one of the first companies to commercialize cloud computing. It was
launched in July 2002 with the aim of enabling innovative, entrepreneurial
applications on the cloud. By 2010, other vendors such as Microsoft Azure and
Google Cloud Platform had entered the picture. After June 2015, the term
‘serverless’ began to gain popularity following the release of the AWS service
called AWS Lambda. Today there are many cloud providers besides AWS, such as
Microsoft Azure, Google Cloud, Rackspace, IBM, and Oracle.
Architectures
Virtual Machine
• For many years in cloud computing, the entire machine, down to the hardware layers, was
virtualized with the help of a hypervisor. In this architecture, a pool of virtual machines, each
carrying its own copy of the operating system, was created on top of the physical machine. In
short, a virtual machine emulates a complete computer system, making it possible for a single
machine to host several virtual machines that behave as if they were separate pieces of
hardware.
• These virtual machines were self-contained units: they did not interact with other virtual
machines on the same host and contained copies of everything from the operating system to
libraries and databases.
• Over time, however, it turned out that this was not the most effective architecture.
Containers
Containers are a lighter-weight, more agile way of handling virtualization. Since they don't use a
hypervisor, they allow faster resource provisioning and speedier availability of new applications.
Instead of virtualizing the hardware, containers virtualize the operating system, so each individual
container holds only the application and its libraries and dependencies.
Containers deliver a much higher level of abstraction than virtual machines and also provision
resources faster.
Although containers are far more efficient and faster than virtual machines, their growth is still
constrained by a basic infrastructural element: the server. This gave rise to a new computing
architecture known as serverless computing.
SERVERLESS COMPUTING
Serverless Computing

• Despite the name, ‘serverless’ does not mean that no servers are involved. It means that users can develop their
applications without thinking about servers, because the cloud provider manages the servers on behalf of its
customers.
• Serverless computing is a programming model and cloud architecture in which the application is decomposed into ‘triggers’
(events) and ‘actions’ (functions), where small code snippets are executed in the cloud without any control over the
underlying resources.
• Billing is based not on the resources allocated but on the execution time.
• The application consumes resources only during execution and releases them once execution finishes. The pricing model
covers only the time during which the resources were in use, and the application developer need not pay for resources
that are not executing; hence the term ‘serverless’.
• Since billing is tied to execution time and the whole model revolves around functions, it is also known as
FaaS (Function as a Service): developers simply write functions, and the platform runs them on demand.
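The trigger/action decomposition described above can be sketched with a minimal FaaS-style handler in Python. This is an illustrative sketch, not a provider-specific API: the `handler(event, context)` signature mirrors the common AWS Lambda convention, and the local invocation at the bottom stands in for the platform firing a trigger.

```python
import json

# A minimal sketch of the trigger/action model: the platform invokes
# `handler` once per event; the function itself holds no server state.
def handler(event, context):
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Simulating a trigger firing the action locally:
response = handler({"name": "serverless"}, None)
print(response["body"])  # → {"message": "Hello, serverless!"}
```

In a real deployment the developer uploads only this function; the provider allocates resources when an event arrives and bills for the execution time alone.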
Challenges
• Lack of control: You don't own or have any control over the server infrastructure when using
serverless computing. This means that problems with the server's hardware, software, or other
components could have a significant negative impact on your business.
• Possible compatibility issues: You may wish to use cloud services from multiple suppliers,
which can present compatibility problems. Although this is theoretically possible, you can run
into compatibility issues if you use one provider for your serverless architecture but wish to
combine its capabilities with other cloud services.
• Potential effects on performance: A function is terminated after a period of inactivity. This
causes a temporary slowdown in the time it takes the code to execute when it is called again, a
condition known as a cold start, which may affect your business operations.
Cold Start
PROBLEM
• The capability of serverless to scale to zero during periods of inactivity is a key
selling point. If a function is not actively being used, its resources are spun down,
which frees platform capacity and lowers the cost to the user of reserving those
components. From a financial standpoint this is ideal, because users are charged only
for the time and resources their code actually uses.

• The drawback is that there is a known delay the next time the function needs to execute
after its resources have fully spun down. Running the function requires reallocating the
resources, which takes time. You end up with one set of performance characteristics for
recently used "hot" functions and another profile for "cold" functions that the platform
must first re-provision before they can run.
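The hot/cold distinction can be illustrated with a small Python sketch. The names and the `time.sleep` stand-in are assumptions for illustration: expensive setup placed at module scope runs once per cold start, while warm invocations reuse the already-loaded module and skip that cost.

```python
import time

# Module-level setup runs once per cold start; warm invocations reuse it.
_setup_start = time.perf_counter()
time.sleep(0.05)  # stand-in for importing SDKs / opening connections
INIT_SECONDS = time.perf_counter() - _setup_start

def handler(event, context):
    t0 = time.perf_counter()
    result = {"ok": True}  # real work would happen here
    return result, time.perf_counter() - t0

# A cold start pays INIT_SECONDS before the first call; warm calls do not.
_, warm_seconds = handler({}, None)
print(f"cold-start init: {INIT_SECONDS*1000:.0f} ms, "
      f"warm call: {warm_seconds*1000:.3f} ms")
```

This is why a common mitigation is to keep heavyweight initialization (clients, connections, parsed config) at module scope rather than inside the handler.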
Serverless Security Risks & Challenges:
• Unsafe configuration: Cloud providers offer a wide variety of options and
features. Unattended incorrect settings or configurations can pose serious security
risks and may serve as an entry point for attacks against serverless infrastructure.
• Overly powerful function permissions: Each autonomous function within the
serverless ecosystem has its own services and responsibilities for a specific purpose.
Users should not have access to more than they need, so the functions' rights and
permissions must be configured correctly. Otherwise, functions can become
over-privileged, posing a potential security risk.
• Event-data injection: Injection flaws are among the most frequent application
security threats. Besides untrusted inputs in application calls, injections can be
triggered by events from cloud storage, NoSQL databases, code modifications, and other
sources. The various kinds of input that may arrive from untrusted event sources
require careful assessment; this diverse collection of event sources expands the
potential attack surface.
• Insufficient function logging and monitoring: The early warning signs of an
attack may go unnoticed because serverless platforms may not offer sufficient security
features for logging and monitoring applications.
• Reliance on insecure third parties: Serverless applications integrate back-end
cloud services, database services, and other third-party dependencies. If these have
weaknesses, they can open the door to exploitation.
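The event-data injection risk above suggests validating every event before use, treating all event sources as untrusted. A minimal sketch, assuming a hypothetical `user_id` field and the same illustrative `handler(event, context)` convention:

```python
def validate_event(event):
    """Reject events that don't match the expected shape before any use."""
    if not isinstance(event, dict):
        raise ValueError("event must be a JSON object")
    user_id = event.get("user_id")
    if not isinstance(user_id, str) or not user_id.isalnum():
        raise ValueError("user_id must be an alphanumeric string")
    return {"user_id": user_id}

def handler(event, context):
    try:
        safe = validate_event(event)
    except ValueError as err:
        return {"statusCode": 400, "body": str(err)}
    # ... use safe["user_id"] in queries instead of raw event fields ...
    return {"statusCode": 200, "body": f"processed {safe['user_id']}"}

print(handler({"user_id": "abc123"}, None)["statusCode"])         # → 200
print(handler({"user_id": "1; DROP TABLE"}, None)["statusCode"])  # → 400
```

Because the same function may be wired to many event sources (storage, queues, HTTP), centralizing validation like this keeps the expanded attack surface manageable.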
Other Challenges
• Security: As perimeters evaporate, serverless architecture calls for new security
paradigms and best practices. Because each container or serverless workload provider
uses its own security frameworks, it is exceedingly challenging to apply security
standards reliably across the entire application.
• Observability: Monitoring and troubleshooting modern applications with outdated
techniques is difficult. Not only are old metrics no longer useful, but it is also hard
to instrument serverless applications with monitoring agents. In any case, distributed
asynchronous tracing must be implemented in addition to conventional logs.
• Cost: The costs of serverless computing can be both direct and indirect. For instance,
heavy reliance on API calls can increase expenses and lead to performance bottlenecks
that are difficult to fix.
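One common response to the observability challenge above is structured logging with a propagated correlation id, so that log lines emitted by many short-lived function instances can be stitched into a single trace downstream. A sketch, assuming a hypothetical `correlation_id` field on the incoming event:

```python
import json
import logging
import sys
import uuid

logging.basicConfig(stream=sys.stdout, level=logging.INFO, format="%(message)s")
log = logging.getLogger("fn")

def handler(event, context):
    # Reuse the caller's correlation id if present; mint one otherwise.
    corr_id = event.get("correlation_id") or str(uuid.uuid4())
    log.info(json.dumps({"correlation_id": corr_id, "stage": "start"}))
    result = {"statusCode": 200, "correlation_id": corr_id}
    log.info(json.dumps({"correlation_id": corr_id, "stage": "done"}))
    return result
```

Emitting JSON lines rather than free-form text is what lets a log aggregator group and query them later, which is the piece conventional agent-based monitoring struggles to provide here.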
Future of Serverless Computing
• This paper has discussed the benefits of serverless computing and the use of AWS
Lambda for serverless applications. Future iterations of the application may explore
alternative NoSQL databases in place of DynamoDB, as well as cross-platform
integration of AWS Lambda with other well-known suppliers of serverless technology,
such as Google Cloud and Microsoft Azure Functions. Serverless computing is here to
stay for a very long time and will change the way we design, test, deploy, and manage
our apps with the aid of AWS and comparable cloud services and their falling resource
costs.
• The amount of time needed to host a project is minimal with cloud computing, but
security and maintenance costs cannot be ignored. Serverless cloud computing, by
contrast, reduces execution time and maintenance costs and delivers strong security.
Its advantages include the fact that the cloud service provider, rather than the
developer, handles maintenance, server ownership, and resource allocation. Because
pricing is determined by how much time is spent using the program or resource,
maintenance costs stay low.
• A difficulty with serverless cloud computing is that it cannot be used for processes
that take a long time to complete, because the cost is determined by how long the code
takes to execute. And given that vendors are in charge of the entire backend, there are
certain security issues as well.
Conclusion
• Serverless computing is a growing market, and it continues to evolve as cloud providers
come up with new solutions for its drawbacks and challenges. Serverless computing allows
developers to focus on developing applications and business logic while the provider
takes care of the underlying resources.
• Sadly, there has been a lack of interest from the research community and academia in
this area. The task of solving the wide variety of technically challenging and
intellectually deep problems in this space, ranging from infrastructure issues such as
optimizations for the cold start problem to the design of a composable programming
model, seems to have been left to big cloud providers or future generations. We
strongly feel that if more academics get involved, it will benefit not only the cloud
industry but also humanity, since proper and efficient use of servers can lower the
carbon footprint to a significant degree.
