
Projects to Learn AWS

Project 1: Host a Static Portfolio Website

Services Used: Route 53, CloudFront, S3

Difficulty: Easy

Cost: $10

Related Content: YouTube Walkthrough

A blog or portfolio website is a great way to showcase your skills and interests. Why not set
one up from scratch and learn AWS while you’re at it?

In this project, you’ll need to register a domain using Amazon Route 53. Afterwards, you’ll
have a Route 53 hosted zone with DNS records. You’ll need to upload your website’s assets
(index.html, js files, css files, images, etc.) to S3 and link your S3 bucket to your domain’s
DNS records. Finally, you’ll configure a CloudFront distribution that will make your website
globally available and optimized for users all around the world.
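
If you’d like to script the upload step rather than use the console, here’s a minimal sketch
using boto3 (the AWS SDK for Python). The bucket name and local folder are hypothetical
placeholders; note that setting each file’s Content-Type matters so browsers render your
pages instead of downloading them.

import mimetypes
import pathlib

import boto3

s3 = boto3.client("s3")
BUCKET = "example.com"  # hypothetical: a bucket named after your domain
SITE_ROOT = pathlib.Path("site")  # local folder holding index.html, css, js, images

for path in SITE_ROOT.rglob("*"):
    if path.is_file():
        key = path.relative_to(SITE_ROOT).as_posix()
        # Set the Content-Type so browsers render files instead of downloading them
        content_type = mimetypes.guess_type(path.name)[0] or "binary/octet-stream"
        s3.upload_file(str(path), BUCKET, key, ExtraArgs={"ContentType": content_type})
        print(f"uploaded {key} ({content_type})")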

Note: Registering a domain costs approximately $10. This is a yearly fee should you choose
to keep your domain for more than 12 months.

Project 2: Event-Driven Facial Analysis with Rekognition

Services Used: S3, Lambda, Rekognition, IAM, DynamoDB

Difficulty: Easy/Medium

Cost: Free

Related Content: AWS Learning Accelerator Course

Facial Detection and Facial Analysis are two hot topics these days with utility in many
applications. Law enforcement uses this technology to identify potential criminals, and
public agencies use it to validate user-submitted photographs.

In this project, you’ll build an event-driven application that analyzes user-submitted
self-portraits uploaded to an S3 bucket. You’ll leverage AWS Lambda as your compute
tier and Amazon Rekognition to perform the facial analysis. Finally, you’ll evaluate the
results and save them to a DynamoDB database.
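
As a rough sketch of the Lambda piece, the handler below reacts to S3 upload events, calls
Rekognition’s detect_faces API, and writes the results to DynamoDB. The table name is a
hypothetical placeholder, and the function’s IAM role would need permissions for S3 reads,
rekognition:DetectFaces, and dynamodb:PutItem.

import json
import urllib.parse

import boto3

rekognition = boto3.client("rekognition")
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("FaceAnalysisResults")  # hypothetical table name

def handler(event, context):
    # The S3 PUT event trigger delivers one or more records per invocation
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Ask Rekognition to analyze faces directly from the S3 object
        response = rekognition.detect_faces(
            Image={"S3Object": {"Bucket": bucket, "Name": key}},
            Attributes=["ALL"],  # include emotions, age range, etc.
        )

        # Store the analysis; the face details are serialized to a JSON string
        # because DynamoDB does not accept the float values Rekognition returns
        table.put_item(Item={
            "ImageKey": key,
            "FaceCount": len(response["FaceDetails"]),
            "Details": json.dumps(response["FaceDetails"]),
        })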

If you’re interested in a hands-on walkthrough of this project (and much more!), check out
my course, The AWS Learning Accelerator, and use the coupon code BEABETTERDEV_10OFF
for a 10% discount! Check it out here.

Project 3: Two-Tier Backend with Caching

Services Used: API Gateway, Fargate, Lambda, RDS, DynamoDB, IAM

Difficulty: Medium/Hard

Cost: Free (without Caching)

Related Content: HTTP API with API Gateway & Lambda

If you’re a backend or fullstack developer looking to gain experience with APIs, databases,
and caching, this is the perfect project for you.

This project idea consists of building Create, Read, Update, and Delete (CRUD) APIs that
integrate with a scalable database and, optionally, a caching layer to improve performance.
You’ll use the API Gateway service to set up a resource and routes that map to your compute
tier. For compute, you have many options, but I suggest using either AWS Lambda or AWS
Fargate. AWS Lambda is a very popular serverless compute service that integrates easily
with API Gateway. AWS Fargate runs your containers on AWS-managed infrastructure, so
you won’t need to worry about maintaining it. Typically, folks use Docker containers
with Fargate to define their environments.

If you go with Lambda, you’ll need to create a Lambda Function and implement the
necessary logic for your CRUD APIs. If you go with Fargate, you’ll need to set up a cluster
and similarly implement the logic.

For the database layer, you also have many options. I suggest going with either Amazon RDS
for a relational database or Amazon DynamoDB if you prefer NoSQL. If you go with RDS,
you’ll need to create a database (I suggest trying Aurora Serverless) and the necessary
tables. You may also need to configure your database’s security groups to allow access
from your home machine. Going with DynamoDB is much more straightforward. You just
need to create a table and start adding your items. You’ll be able to view and interact with
your DynamoDB tables directly in the AWS console.
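
To make the Lambda + DynamoDB combination concrete, here’s a minimal sketch of a single
handler that routes by HTTP method, assuming an API Gateway HTTP API (payload format 2.0)
and a hypothetical Items table with a string "id" partition key.

import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("Items")  # hypothetical table with a string "id" partition key

def handler(event, context):
    # HTTP APIs (payload format 2.0) expose the method under requestContext.http
    method = event["requestContext"]["http"]["method"]

    if method == "POST":  # Create
        item = json.loads(event["body"])
        table.put_item(Item=item)
        return {"statusCode": 201, "body": json.dumps(item)}

    if method == "GET":  # Read
        item_id = event["queryStringParameters"]["id"]
        result = table.get_item(Key={"id": item_id})
        # default=str handles the Decimal numbers DynamoDB returns
        return {"statusCode": 200, "body": json.dumps(result.get("Item"), default=str)}

    if method == "PUT":  # Update (full overwrite, for simplicity)
        item = json.loads(event["body"])
        table.put_item(Item=item)
        return {"statusCode": 200, "body": json.dumps(item)}

    if method == "DELETE":  # Delete
        item_id = event["queryStringParameters"]["id"]
        table.delete_item(Key={"id": item_id})
        return {"statusCode": 204, "body": ""}

    return {"statusCode": 405, "body": "Method not allowed"}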

If you want to improve performance and get some experience with caching, you can go one
step further and add an ElastiCache Redis cluster. You can integrate it into your Lambda or
Fargate application during the read flow by first checking whether an item exists in the
cache before retrieving it from your RDS or DynamoDB table. Make sure to write the record
to your cache when you don’t find it there!
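
This read flow is commonly known as the cache-aside pattern. Here’s a minimal sketch of it
in Python, assuming the redis-py library is packaged with your application and a hypothetical
cluster endpoint; your compute will also need to run in the same VPC as the ElastiCache
cluster to reach it.

import json

import boto3
import redis  # redis-py, assumed to be packaged with your deployment

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("Items")  # same hypothetical table as above
cache = redis.Redis(
    host="my-cluster.abc123.0001.use1.cache.amazonaws.com",  # hypothetical endpoint
    port=6379,
)

def get_item(item_id):
    # 1. Check the cache first
    cached = cache.get(f"item:{item_id}")
    if cached is not None:
        return json.loads(cached)

    # 2. Cache miss: fall back to the database
    item = table.get_item(Key={"id": item_id}).get("Item")

    # 3. Populate the cache so the next read is served from memory (5 minute TTL)
    if item is not None:
        cache.set(f"item:{item_id}", json.dumps(item, default=str), ex=300)
    return item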

Project 4: Data Processing Workflow With Step Functions

Services Used: Step Functions, S3, Lambda, DocumentDB, SNS, IAM

Difficulty: Medium/Hard

Cost: Free

Related Video: Step Functions With Lambda & DynamoDB

Applications that need to perform a sequence of steps in a distributed but reliable way
are very common in cloud-based development.

In this project, you’ll leverage an orchestration service called AWS Step Functions to build
a data processing pipeline. You’ll create a Step Functions state machine that defines the
sequence of steps in your workflow. The workflow will iterate over all records in a CSV file
stored in S3 using the Step Functions Distributed Map state. For each record, you’ll
process the contents and save the data into a database, in this case Amazon DocumentDB
(which is MongoDB-compatible). Optionally, you can use any other database you prefer.
Once the batch of data is processed, your final step will be to broadcast the completion to
an SNS topic. This way, clients that are interested in the results can retrieve them.
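
As a rough sketch of the per-record processing step, the Lambda below receives one CSV row
(the Distributed Map’s CSV reader turns each row into a JSON object keyed by the header
names) and inserts it into DocumentDB. The connection string is a hypothetical placeholder,
DocumentDB requires TLS with Amazon’s CA bundle, and the pymongo library would need to be
packaged with the function.

import os

import pymongo  # assumed to be packaged with your function

# Hypothetical connection string, e.g. stored in an environment variable;
# DocumentDB requires TLS with Amazon's global-bundle.pem CA certificate
client = pymongo.MongoClient(
    os.environ["DOCDB_URI"],
    tls=True,
    tlsCAFile="global-bundle.pem",
)
collection = client["pipeline"]["records"]

def handler(event, context):
    # With a CSV input, each invocation receives one row as a JSON object
    # keyed by the CSV's header names
    collection.insert_one(dict(event))  # copy: insert_one mutates its argument
    return {"status": "processed"}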

Project 5: CI / CD Pipeline with CDK

Services Used: CDK, CloudFormation, CodeCommit, CodeBuild, CodeDeploy, CodePipeline

Difficulty: Medium

Cost: Free

Related Video: YouTube Walkthrough Using CDK Pipelines

Developing your application in the console is one thing, but what about automating the
provisioning and deployment of your components by linking them to your Git repository? This
is the perfect project for DevOps engineers or developers trying to get more experience
building a Continuous Integration / Continuous Deployment (CI/CD) pipeline. Further, you’ll
get to use the Cloud Development Kit (CDK) to define your infrastructure through code.

In this project, you’ll use the CDK Pipelines construct, available in any supported
language. This CDK construct makes it easy to provision, integrate, and deploy your
application. The CDK pipeline leverages AWS CodeCommit for source control storage,
CodeBuild for building your application and running unit tests, and CodeDeploy to deploy
the changes to your infrastructure. Finally, your CodePipeline will orchestrate the entire
deployment process while also providing great monitoring and visibility into the process.

You can use this same process to set up multiple stages (e.g. dev and prod) to ensure you
isolate your testing and production environments.
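
Here’s a minimal sketch of what that pipeline stack might look like in CDK Python, assuming
a hypothetical CodeCommit repository named my-app; the AppStage class is a placeholder where
your application’s stacks would go.

import aws_cdk as cdk
from aws_cdk import pipelines
from aws_cdk import aws_codecommit as codecommit
from constructs import Construct

class AppStage(cdk.Stage):
    # Placeholder stage; a real project would instantiate its application
    # stacks here, e.g. MyServiceStack(self, "Service")
    def __init__(self, scope: Construct, construct_id: str, **kwargs):
        super().__init__(scope, construct_id, **kwargs)

class PipelineStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs):
        super().__init__(scope, construct_id, **kwargs)

        # Hypothetical CodeCommit repo holding both your app and CDK code
        repo = codecommit.Repository.from_repository_name(self, "Repo", "my-app")

        pipeline = pipelines.CodePipeline(
            self, "Pipeline",
            synth=pipelines.ShellStep(
                "Synth",
                input=pipelines.CodePipelineSource.code_commit(repo, "main"),
                commands=[
                    "pip install -r requirements.txt",
                    "npx cdk synth",  # produces the CloudFormation templates
                ],
            ),
        )

        # Separate dev and prod stages keep test traffic out of production
        pipeline.add_stage(AppStage(self, "Dev"))
        pipeline.add_stage(AppStage(self, "Prod"))

app = cdk.App()
PipelineStack(app, "PipelineStack")
app.synth()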
