
Deploying Microservices Using Serverless

Architecture
Dorival Querino
May 1 · 4 min read

Serverless architecture (sometimes called FaaS, Function as a Service) is an execution model where the cloud provider is responsible for running pieces of code and dynamically allocating resources, and the customer pays only for the execution time.

The code typically runs in ephemeral containers and can be triggered by several types of events, including HTTP requests, queues, database events, monitoring alerts, file uploads, cron jobs, and so on. In this model, the client doesn't need to care about servers, because they are abstracted away by the provider's architecture.
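To make this concrete, here is a minimal sketch of such a function in Python, using the AWS Lambda handler signature (`event` carries the trigger payload, `context` the runtime metadata). The greeting logic is just an illustrative placeholder, not part of any real service:

```python
import json

def handler(event, context):
    """Minimal HTTP-triggered function in the AWS Lambda style.

    The provider runs this code on demand; there is no server for us to
    manage. The query-string parsing below assumes an HTTP-style event.
    """
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

The same handler could equally be wired to a queue or a file-upload event; only the shape of `event` would change.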

Let’s start by analyzing the architecture defined in Figure-1, where we can see a
conventional deployment of microservices.
Figure 1: A Basic Microservices Architecture

Web can be a SPA (Single Page Application) running on a web server (Tomcat, for example).

API Gateway could be a Spring Boot application running Zuul, a proxy that handles requests and dynamically routes them to the microservice applications.

Microservices (MS in the figure) are Spring Boot applications running RestControllers, Services, and Repositories.

Database (DB): MySQL databases. It can be a DB server with one or more instances per machine, where each service has its own DB instance.

Now let’s see how we can deploy our microservices using a serverless architecture.

We can start by turning our code into Serverless Functions (FaaS) and deploy them right
in the cloud.

Figure 2: Functions (FaaS): You code right in the cloud


But it appears something is missing here. How would the functions be accessible from outside the cloud, over the Internet?

Figure 3: Serverless functions themselves are not accessible from the internet

To publish our functions over the Internet, the natural candidate to help us is the API Gateway. It is the front door to most services in the cloud: a serverless resource used to expose services to the Internet. It offers the most common features you need to do that, such as caching, security, request throttling, CORS support, and others.
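Conceptually, the gateway maps an HTTP method and path to a backend function and wraps the result as an HTTP response. A rough sketch of that dispatch, with hypothetical routes and backend functions:

```python
import json

# Hypothetical fine-grained backend functions; in AWS each would be
# its own Lambda behind an API Gateway route.
def list_orders(event):
    return {"orders": []}

def create_order(event):
    return {"created": True}

# Routing table: (HTTP method, path) -> function, the mapping an
# API Gateway configuration expresses declaratively.
ROUTES = {
    ("GET", "/orders"): list_orders,
    ("POST", "/orders"): create_order,
}

def gateway_dispatch(event):
    """Sketch of what the gateway does conceptually: pick a backend
    function from (method, path) and wrap its result as an HTTP response."""
    fn = ROUTES.get((event["httpMethod"], event["path"]))
    if fn is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps(fn(event))}
```

In a real deployment you would not write this dispatcher yourself; the gateway service performs it, plus the caching, throttling, and CORS handling mentioned above.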
Figure 4: API Gateway is the most common way to expose cloud resources to the Internet

Meticulous developers and architects may look at the implementation of microservices using serverless functions with a certain criticism. Since serverless functions are only small pieces of code executed in the cloud, they are too small to be considered microservices; they look more like nano services, a microservice antipattern where services become too granular.

Figure 5: Serverless functions are small pieces of code; wouldn't they be considered nano services?

A single function can be too small to execute some of the complex tasks of a microservice. So, most of the time, we will use a set of functions working together, with some functions or the API Gateway acting as aggregators, so we can model our serverless microservices the right way.
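For example, an aggregator function can fan out to several fine-grained functions and merge their results into a single response. A sketch with hypothetical helpers (in AWS, each helper could be its own Lambda, invoked through the SDK):

```python
import json

# Hypothetical fine-grained functions; each stays small without
# being exposed on its own as a nano service.
def fetch_customer(customer_id):
    return {"id": customer_id, "name": "Alice"}

def fetch_orders(customer_id):
    return [{"order_id": 1, "total": 42.0}]

def customer_summary_handler(event, context):
    """Aggregator: one entry point that calls the smaller functions
    and merges their results into a single microservice-style response."""
    customer_id = event["pathParameters"]["id"]
    return {
        "statusCode": 200,
        "body": json.dumps({
            "customer": fetch_customer(customer_id),
            "orders": fetch_orders(customer_id),
        }),
    }
```

The aggregator is what the outside world sees as "the microservice"; the pieces behind it remain invisible implementation details.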
Figure 6: You can aggregate several functions to achieve more complex tasks

And finally, to complete our architecture, our functions will store our data, part in a NoSQL and part in a SQL serverless database, and also retrieve data from file storage. All of these are serverless resources.

Figure 7: Serverless functions can read from file storage and record data in a DB, Serverless DB, or NoSQL DB

Now we can implement our architecture on a cloud provider that offers the resources we need: AWS, as we can see in Figure 8.
Figure 8: Resources to implement serverless on AWS (Lambda, DynamoDB, API Gateway, Aurora, S3)

Lambda functions execute the compute tasks, some acting as aggregators and others executing specific tasks.

S3 is used as object storage, where we can save pictures or receive text data to be processed, such as a CSV file to be imported into the databases.

We have AWS Aurora Serverless, a highly scalable relational database where you pay only for the requests you execute. It's the best fit to work alongside the highly scalable Lambda functions.

And we have DynamoDB, a highly scalable serverless NoSQL database that serves data with single-digit millisecond latency.
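As a sketch of the storage side, a function persisting an order to DynamoDB might look like this. The table name and item shape are hypothetical, and the table resource is injected (in AWS it would come from boto3, the Python SDK, e.g. `boto3.resource("dynamodb").Table("Orders")`):

```python
def save_order(order, table):
    """Persist one order item to a DynamoDB table.

    `table` is assumed to be a boto3 Table resource (hypothetical table
    name "Orders"); injecting it keeps the function easy to test without
    touching AWS. DynamoDB stores numbers as strings on the wire, hence
    the str() conversion for the total.
    """
    item = {"order_id": order["id"], "total": str(order["total"])}
    table.put_item(Item=item)
    return item
```

Because the dependency is passed in, the same function body works against the real service in production and a stub in tests.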

As you can see, it's achievable to deploy your microservices as serverless resources. As you may know, serverless, like any other technology, will not fit every situation nor be the best solution to every problem. But it offers an incredible advantage in the time needed to put a project into production: it reduces infrastructure costs and eliminates concerns about scalability and availability. On the other hand, it is a highly distributed model, so you take on all the concerns of distributed systems, along with limitations on timeout, memory, and storage, among others.
. . .
