Course Curriculum
Prerequisites: Basic knowledge of Python programming.
Course Project: An engineer graduating from this course will use infrastructure-as-code (IaC) constructs
to build and operate a production-grade web application running across multiple AWS Regions. The
DevOps engineer will write a RESTful Python application on AWS, use CI/CD to automate multiple
deployment stages (prod vs. beta), write unit/integration tests so the application can be deployed
continuously without human intervention, store logs in CloudWatch, and automate metric-driven
rollback of a deployment in case of service degradation.
The IaC built in this course will rely on the industry-standard AWS Cloud Development Kit (CDK) to
deploy our infrastructure pieces, GitHub to version-control the software, and periodic and event-based
Lambda functions to handle the business logic. CodePipeline will be used to deploy the code
incrementally and safely in waves, and each deployment stage will run different kinds of tests to ensure
the code is deployed safely. Our application will use CloudWatch to store business-critical metrics along
with application logs. Apart from test cases, we'll also create CloudWatch alarms to send a notification if
anything goes wrong. We will also build Docker-based API testing clients, hosted on Elastic Compute
Cloud (EC2) instances, that will test the public endpoints built in this project. Other technologies used to
build this application include Lambda, CloudWatch, DynamoDB, API Gateway, SNS, SQS, Cloud9, and S3.

Sprint-wise Instruction Plan


Sprint Learning Objectives

Sprint 1 Project: Use the AWS CDK to build a canary in a Lambda function. This canary runs in one AWS
Region and measures availability and latency when accessing a predefined, public web
application. Push the code to a version-control repository and manage README files in markdown on
GitHub. A minimal sketch of a canary handler follows this sprint's concepts.
Concepts:
● Introduction to the DevOps Engineer Role and Infrastructure-as-Code (IaC)
● Introduction to AWS: Regions/AZs/Edge Services, Foundational services (EC2, S3,
CloudFront), Microservice architecture
● Learn AWS Services: IAM, Lambda
● Learn Tools: Shell and Scripting, Vim, GitHub
● Start writing code on AWS
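
As a rough illustration of the Sprint 1 canary, here is a minimal Lambda handler that measures
availability and latency for a single request. The target URL and the return shape are assumptions
for illustration, not part of the course materials:

    import time
    import urllib.request

    TARGET_URL = "https://example.com"  # hypothetical; the course uses a predefined public web app

    def lambda_handler(event, context):
        # Time a single GET request and record whether it succeeded.
        start = time.time()
        try:
            with urllib.request.urlopen(TARGET_URL, timeout=10) as resp:
                availability = 1.0 if resp.status == 200 else 0.0
        except Exception:
            availability = 0.0
        latency_ms = (time.time() - start) * 1000.0
        return {"availability": availability, "latency_ms": latency_ms}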

Sprint 2 Project: Extend the canary Lambda function into a web crawler: crawl a custom list of websites
(a JSON file in an S3 bucket), including the webpages that should be crawled on each of those
websites.
Run the crawler periodically on a 5-minute cadence and write <availability, latency> metrics for
each website and each run to CloudWatch using CloudWatch's API (a sketch follows the concepts
below). Create a CloudWatch Dashboard to monitor website health, and set up alarms for when
availability drops below or latency rises above prescribed thresholds. Every alarm is also published
to an email address using Simple Email Service (SES). Manage README files and runbooks in
markdown on GitHub.

Concepts:
● Introduction to the Art of Monitoring Web Applications
● Learn AWS Services: CloudWatch, SNS/SQS
● Extend IaC to automate CDK deployments across multiple AWS Regions
● Introduce scalability in the web application
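
Publishing the per-website metrics from the crawler might look like the following boto3 sketch;
the namespace, metric names, and dimension are illustrative assumptions:

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    def publish_metrics(url, availability, latency_ms):
        # One <availability, latency> pair per crawled URL per run.
        cloudwatch.put_metric_data(
            Namespace="WebCrawler",  # hypothetical namespace
            MetricData=[
                {"MetricName": "Availability",
                 "Dimensions": [{"Name": "URL", "Value": url}],
                 "Value": availability},
                {"MetricName": "Latency",
                 "Dimensions": [{"Name": "URL", "Value": url}],
                 "Value": latency_ms,
                 "Unit": "Milliseconds"},
            ],
        )

CloudWatch alarms on these metrics can then route notifications to email, as described above.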

Sprint 3 Project: Create a multi-stage pipeline with Beta/Gamma and Prod stages using CDK. Deploy the
project code in one Region. Each stage must have bake times, code review, and test blockers.
Write unit/integration tests for the web crawler. Emit CloudWatch metrics and alarms for the
operational health of the web crawler, including memory use and time-to-process for each crawler run.
Automate rollback to the last build if metrics are in alarm. Manage README files and runbooks
in markdown on GitHub. A CDK pipeline sketch follows this sprint's concepts.
Concepts:
● Introduction to CI/CD
● Learn AWS services: CodePipeline for build and test, CodeDeploy for CD
● Integrate AWS CodePipeline with GitHub
● Learn automated testing using PyTest
● Build a release process by writing merge-blocking automated tests for the canary on
CodePipeline
● Build operational CloudWatch metrics for web crawler
● Write rollback automation allowing rollback to last build
● Set up beta and prod environments in CodePipeline and deploy using CodeDeploy
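
Assuming the course uses CDK Pipelines (the pipelines module of CDK v2), a Beta-then-Prod
pipeline with a merge-blocking test step might be sketched as follows; the repository name, stage
contents, and test command are hypothetical:

    from aws_cdk import Stack, Stage
    from aws_cdk.pipelines import CodePipeline, CodePipelineSource, ShellStep
    from constructs import Construct

    class CrawlerStage(Stage):
        # Hypothetical stage wrapping the web-crawler stack from Sprints 1-2.
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)
            Stack(self, "CrawlerStack")  # placeholder for the real crawler stack

    class PipelineStack(Stack):
        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)
            pipeline = CodePipeline(
                self, "Pipeline",
                synth=ShellStep(
                    "Synth",
                    # Hypothetical GitHub repo; auth uses a token in Secrets Manager.
                    input=CodePipelineSource.git_hub("my-org/web-crawler", "main"),
                    commands=["pip install -r requirements.txt", "npx cdk synth"],
                ),
            )
            beta = pipeline.add_stage(CrawlerStage(self, "Beta"))
            # Test blocker: Prod deploys only if the Beta integration tests pass.
            beta.add_post(ShellStep("IntegTests", commands=["pytest tests/integration"]))
            pipeline.add_stage(CrawlerStage(self, "Prod"))

Bake times and alarm-driven rollback would be layered on top of this via additional pipeline steps
and CodeDeploy configuration.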

Sprint 4 Project: Build a public CRUD API Gateway endpoint for the web crawler to
create/read/update/delete the target list containing the websites/webpages to crawl.
First, move the JSON file from S3 to a database (DynamoDB). Then implement CRUD REST
commands on DynamoDB entries (a handler sketch follows the concepts below). Extend tests in
each stage to cover the CRUD operations and DynamoDB read/write time. Write API documentation
and commit it to GitHub. Manage README files and runbooks in markdown on GitHub.
Concepts:
● Learn AWS Services: API Gateway, DynamoDB
● Write a RESTful API Gateway interface for web crawler CRUD operations
● Write a Python function to implement the CRUD business logic against DynamoDB
● Extend tests and prod/beta CI/CD pipelines in CodeDeploy / CodePipeline
● Use CI/CD to automate multiple deployment stages (prod vs beta)
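
A minimal sketch of the CRUD business logic behind an API Gateway proxy integration might look
as follows; the table name, key shape, and routing are assumptions for illustration:

    import json
    import boto3

    # Hypothetical table holding the crawl target list moved from S3.
    table = boto3.resource("dynamodb").Table("CrawlTargets")

    def lambda_handler(event, context):
        # Route the REST verb from the API Gateway proxy event to a DynamoDB call.
        method = event["httpMethod"]
        if method in ("POST", "PUT"):
            item = json.loads(event["body"])
            table.put_item(Item=item)  # create, or update by overwrite
            return {"statusCode": 201 if method == "POST" else 200,
                    "body": json.dumps(item)}
        if method == "GET":
            items = table.scan()["Items"]
            return {"statusCode": 200, "body": json.dumps(items, default=str)}
        if method == "DELETE":
            table.delete_item(Key=json.loads(event["body"]))
            return {"statusCode": 204, "body": ""}
        return {"statusCode": 405, "body": "method not allowed"}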

Sprint 5 Project: Use docker-compose to build API test clients using pyresttest and Syntribos. Publish
the built images to Elastic Container Registry (ECR). Continuously run functional and security API
tests as a cron job on local machines. These tests will exercise the web crawler's CRUD endpoint
built in Sprint 4. A sketch of programmatic ECR login follows this sprint's concepts.
Concepts:
● Learn AWS services: ECR
● Learn Docker: Dockerfile, images, containers; Docker commands (build, start/stop a
container, delete a container)
● Learn container registries: Working with images (Pull/Push), auto-updates to pull new
images once published
● Learn API functional testing framework: pyresttest
● Learn API security testing framework: Syntribos
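
For pushing and pulling the test-client images, a cron job might log in to ECR programmatically.
This boto3 sketch assumes the region; the follow-on docker steps are indicated in comments:

    import base64
    import boto3

    ecr = boto3.client("ecr", region_name="us-east-1")  # assumed region
    auth = ecr.get_authorization_token()["authorizationData"][0]
    # The token decodes to "AWS:<password>" for docker login.
    user, password = base64.b64decode(auth["authorizationToken"]).decode().split(":")
    registry = auth["proxyEndpoint"]
    # Then: docker login --username AWS --password <password> <registry>
    # docker-compose services reference the pushed ECR image URIs and can
    # run `docker-compose pull` on a cron cadence to pick up new images.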

Sprint 6 Project: Deploy the API test clients from Sprint 5 on an EC2 instance. Build and push the API
test Docker images through CodePipeline. Push API test results into CloudWatch. Set up alarms and
notifications on API test metrics (a sketch follows the concepts below). Extend tests in each stage.
Manage README files and runbooks in markdown on GitHub.
Concepts:
● Learn AWS Services: EC2
● Extend tests and prod/beta CI/CD pipelines in CodeDeploy / CodePipeline
● Use CI/CD to automate multiple deployment stages (prod vs beta)
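
Alarming on the API test metrics pushed to CloudWatch might be set up as in this sketch; the
namespace, metric name, and SNS topic ARN are hypothetical:

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    # Fire whenever any API test case fails within a 5-minute window.
    cloudwatch.put_metric_alarm(
        AlarmName="ApiTestFailures",
        Namespace="ApiTests",          # hypothetical namespace
        MetricName="FailedCases",      # hypothetical metric emitted by the test clients
        Statistic="Sum",
        Period=300,
        EvaluationPeriods=1,
        Threshold=0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:api-test-alarms"],  # hypothetical topic
    )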
