
TRAINING REPORT

OF

During the Internship Period from 7 June 2021 to 30 June 2022


At

FICO

IN

DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING

ON

DEVSECOPS

SUBMITTED IN PARTIAL FULFILLMENT OF THE DEGREE

OF

BE(CSE)
DECLARATION

I hereby declare that the dissertation entitled Training Report submitted for the B.E

Degree is my original work, and the dissertation has not formed the basis for the award

of any degree, associateship, fellowship or any other similar titles.

Place: Panipat. Shivam Soin


Date: 06/10/2022 1910990510
CERTIFICATE
This is to certify that the project report entitled Training Report being submitted by

SHIVAM SOIN 1910990510

in partial fulfillment for the award of the Degree of Bachelor of Engineering in Computer Science and

Engineering to the Chitkara University, Punjab is a record of bonafide work prepared by the candidate(s) above as

part of their internship during the period from 28 February 2022 to 30 June 2023.
ACKNOWLEDGEMENT
It is my privilege to thank all the people who contributed to making this internship period
a great and unforgettable experience of my life.

First, I would like to thank all my teachers at Chitkara University for helping me out at
every step and Chitkara University, Punjab for providing me the opportunity of gaining
practical industry knowledge through internship training at FICO.

I take this opportunity to express my sincere gratitude and deep regards to my Manager,
Kiran Thomas for his exemplary guidance, monitoring and constant encouragement
throughout the course of this internship.

I also thank all my team members for sharing the knowledge and experience of their
respective fields, which assisted me in the successful completion of the project work.

Finally, I thank every member of the FICO family in technical and non-technical fields,
and the support staff who have helped me.

This internship program provided me opportunities to work in different settings and
exposed me to interaction with different sets of people. I was able to utilize and
enhance different skills like decision-making, communication, teamwork, and
coordination.
ABSTRACT
A student gets theoretical knowledge from the classroom and practical knowledge
from industrial training. When these two aspects of theoretical knowledge and
practical experience come together, a student is fully equipped to perform at his or her best.

In conducting a project study in an industry, students are exposed to real situations
in the work field and gain experience from them. The objective of the Industrial
Training is to provide an opportunity to experience the practical aspect of the
technology in an organization. It provides a chance to get the
feel of the organization and its functions.

FICO (Fair Isaac Corporation), originally Fair, Isaac and Company, is a data
analytics company based in San Jose, California focused on credit scoring services.
Bill Fair and Earl Isaac founded it in 1956. Its FICO score, a measure of consumer
credit risk, has become a fixture of consumer lending in the United States.

In an agile software development environment, the working model and the
operations need to be highly flexible to the ever-changing needs of the company.
DevOps, a combination of tools and practices, aids software development along
with IT operations, and the two go hand-in-hand. This cross-functional working model
aims to minimize the duration of the system's development life cycle and provides
continuous deployment and delivery.

Incorporating CI/CD into your organization’s development process reduces the
number of non-critical defects in your backlog. These small defects are detected
prior to production and fixed before being released to end-users. The benefits of
solving non-critical issues ahead of time are many.

In DevOps, automation means eliminating the need for human engineers to intervene
manually to facilitate DevOps practices.

This report gives an overview of what I have learnt and worked upon during my
industrial training. It gives an insight into my project work on CI/CD and
Automation.
TABLE OF CONTENTS

S. No. Topic Page No.

1. Details of the Organization 9-16

2. Training Process 17-26

3. Understanding Epics 27-29

4. Tasks Accomplished 30-37

5. Conclusion 38

6. References 39
LIST OF FIGURES

S. No. Figure description Page No.

1.1 FICO Customers 16

2.1 Intern Induction Program 17

2.2 First 30 Days 18

2.3 First 60 Days 19

2.4 First 90 Days 21

3.1 Docker File 32

3.2 Jenkins File 32-34

3.3 Jenkins Pipeline Build 35

3.4 Image in ECR Repository 35

4.1 Gremlin Dashboard 36

4.2 Monitor Attacks 39

5.1 File Structure 40

5.2 Jenkins Build for ror.020 41


CHAPTER 1
Details Of the Organization

FICO (Fair Isaac Corporation), originally Fair, Isaac and Company, is a data
analytics company based in San Jose, California focused on credit scoring services.
Bill Fair and Earl Isaac founded it in 1956. Its FICO score, a measure of consumer
credit risk, has become a fixture of consumer lending in the United States.

1.2 DESCRIPTION

FICO is a leading analytics software company, helping businesses in 90+
countries make better decisions that drive higher levels of growth, profitability and
customer satisfaction. The company’s ground-breaking use of Big Data and
mathematical algorithms to predict consumer behaviour has transformed entire
industries.

FICO provides analytics software and tools used across multiple industries to
manage risk, fight fraud, build more profitable customer relationships, optimize
operations and meet strict government regulations. Many of our products reach
industry-wide adoption — such as the FICO Score, the standard measure of
consumer credit risk in the United States. FICO solutions leverage open-source
standards and cloud computing to maximize flexibility, speed deployment and
reduce costs. The company also helps millions of people manage their personal
credit health.
Founded in 1956, FICO introduced analytic solutions such as credit scoring that have
made credit more widely available, not just in the United States but around the world.
We have pioneered the development and application of these technologies to help
businesses improve the precision, consistency, and agility of their complex,
high-volume decisions.

FICO has offices throughout the world serving industries including financial services, health care,
insurance, automotive, public sector, retail, pharmaceuticals, telecommunications, travel and
hospitality, media and entertainment, high tech and utilities.

FICO clients include more than half of the top 100 banks in the world, more than 600 personal and
commercial line insurers in North America and Europe including the top 10 US personal lines
insurers, 400+ retailers and general merchandisers, including one-third of the top 100 U.S. retailers,
95 of the 100 largest financial institutions in the U.S., and all the 100 largest U.S. credit card issuers
and more.

1.3 HISTORY AS FAIR, ISAAC AND COMPANY


FICO was founded in 1956 as Fair, Isaac and Company by engineer William Fair
and mathematician Earl Isaac, on the principle that data, used intelligently, can improve
business decisions. FICO built its first credit scoring system for American Investments in 1958,
and pitched its system to fifty American lenders.

In 1972, ASAP™, the first automated application-processing system, debuted at Wells
Fargo. In December 1976, the 500,000th document was entered through a CRT terminal.

FICO went public in 1986 and is traded on the New York Stock Exchange. The company debuted
its first general-purpose FICO score in 1989. FICO scores are based on credit reports and "base"
FICO scores range from 300 to 850, while industry-specific scores range from 250 to 900. Lenders
use the scores to gauge a potential borrower's creditworthiness.

In 1991, FICO credit bureau risk scores were made available at all three major US credit reporting
agencies: BEACONsm at Equifax, EMPIRICA® at Trans Union, and the Experian/FICO model
at Experian. Fannie Mae and Freddie Mac first began using FICO scores to help determine which
American consumers qualified for mortgages bought and sold by the companies in 1995.

In 2000, FICO expanded the availability of NextGen FICO scores through an agreement with
TransUnion LLC. FICO partnered with Call Credit, UK in 2001, to become the first to deliver consumer
credit scores directly over the web to UK lenders.
In 2003, Fair, Isaac and Company was renamed Fair Isaac Corporation.

1.4 HISTORY AS FAIR ISAAC CORPORATION


FICO announced its 10 millionth score sold to US consumers in 2005 through myFICO.com and its
partners. It launched the Fraud Alert Network forum for fraud professionals in 2008.

In 2009, Fair Isaac Corporation changed its brand name and stock symbol to FICO.

1.5 HISTORY AS FICO

In 2010, FICO received its 100th patent for analytics and decision management innovations. FICO
launched ScoreInfo.org to help US consumers understand new risk-based pricing rules.

In 2014, FICO launched the FICO Decision Management Suite, a complete platform for
decision management in the cloud.

1.6 TOP PRODUCTS


1.6.1 FICO® Xpress Optimization: It comprises four components:
FICO® Xpress Insight enables businesses to rapidly deploy optimization models as powerful
applications. It allows business and other users to work with models in easy-to-understand terms.
FICO® Xpress Executor provides standalone support for optimization execution services, allowing
business to deploy and execute optimization models quickly and easily.
FICO® Xpress Solver provides the widest breadth of industry leading optimization algorithms and
technologies to solve linear, mixed integer and non-linear problems.
FICO® Xpress Workbench is an Integrated Development Environment (IDE) for developing
optimization models, services and complete solutions. FICO® Xpress Workbench is used with and
supports all the other FICO Xpress Optimization components.
1.6.2 FICO® Decision Management: The FICO Decision Management Suite, along with the
FICO Analytic Cloud, provides a comprehensive environment that makes it quick and easy to glean
insights from data, and develop analytic models and decision services that operationalize those
insights. It seamlessly integrates analytics, decisioning, optimization, data visualization and
exploration, rapid application development and other FICO Decision Management Platform
capabilities to provide a complete and agile decisioning solution.

1.6.3 FICO® Decision Central™: FICO® Decision Central™ eases compliance management and
supports profitable business decisions through end-to-end model governance and decision support.
FICO® Decision Central™ monitors and manages the evolution of every component that goes into
making a decision. This gives users visibility and control over the entire decision strategy,
including predictive models, optimization models, strategy trees, rule flows and more.

1.6.4 FICO® Score: FICO® Scores help lenders make consistent, unbiased risk decisions,
and support compliance with national, local and global regulatory requirements, such as Basel
II. FICO works to ensure the scores are understood and accepted by regulators worldwide.
The FICO® Score is used by Fannie Mae, Freddie Mac and the FHA in the mortgage
secondary market; and Standard & Poor's and Fitch IBCA ratings agencies that enable
securitization of industry loan pools into bond securities. A FICO® Score comes with reason codes
that indicate why the score was not higher. These support regulatory compliance and
communication with consumers so they can improve their credit standing over time.

1.6.5 FICO® Score Open Access: With FICO® Score Open Access, FICO extends the license on
FICO® Scores you already purchase for account risk management decisions, enabling you to
display those scores to your end customers. Using those scores, which are already resident in your
customer database, you create your customer experience through your paper statements, online
banking and/or mobile channel. To support your channel displays, FICO provides a digital asset
package for the key branding elements, including the FICO® Score logo lockup and FICO® Score
Meter. You incorporate FICO-written educational content, "Understanding FICO® Scores" and
"Frequently Asked Questions about

FICO® Scores", into your website for customer self-service. Leveraging dedicated FICO
Implementation Consultants and a Planning Guide, you’ll quickly be on your way to opening the
door to better customer relationships.

1.6.6 FICO® Payment Integrity Platform: Healthcare claims processing is not a
static environment. Policies change, manipulation strategies evolve, and patterns emerge.
Adaptive predictive analytics scour data for indications of anomalous behavior across members,
providers, claims, and facilities. By applying machine-learning techniques, adaptive predictive
analytics find previously unknown indications of payment discrepancy, discrepancies that often
evade detection by rules systems alone. The foundation of the Payment Integrity Platform is FICO
Insurance Fraud Manager. This proven technology coordinates the analytics, link analysis, business
rules, and case management capabilities that help even the largest healthcare payers deliver value
quickly. In fact, Insurance Fraud Manager can analyze up to one million claims per hour while
prioritizing where to focus your limited review cycles for maximum return.

1.6.7 FICO® TONBELLER® Siron® AML: By recording individual and group behavior over
time, Siron® AML creates dynamic profiles and statistics based on customer, account and
transaction data. The transaction monitoring system immediately detects changes in customer
behavior and deviations from peer group behavior. Also, the system continuously classifies
customers according to the risks they pose. This dynamic profiling is the basis for the automated
application of appropriate customer due diligence according to the risk-based approach
recommended by international regulations.

1.6.8 FICO® TONBELLER® Siron® KYC: Siron® KYC supports enterprises through
the critical onboarding process by identifying who their customer is and what the
customer’s level of money-laundering or terrorist-financing risk is. Easily adaptable Know Your
Customer questionnaires are customized with necessary statutory requirements and industry
standards to automatically determine the risk rating of potential customers.

1.6.9 FICO® Debt Management Solutions: The debt collection landscape keeps evolving, but
one basic fact remains: organizations collecting debt need better strategies to connect with
consumers and to collect more successfully when they do. FICO® Debt Manager produces
impressive collection revenue increases by automating and streamlining the entire collection
process. Collectors are empowered with precise segmentation of delinquent accounts and full
visibility into consumer data to strategically guide interactions. Think about having the ability to
use focused segmentation, advanced analytics and behavior modeling to gain a deeper
understanding of the consumer. At every stage of the credit lifecycle, appropriate treatments are
applied to restore payment and maintain compliance. Debt Manager drives regulatory compliance
by enabling organizations to implement policies, data-based decisions, structured methodologies,
and documented actions.

1.6.10 FICO® Falcon® Fraud Manager: At the core of the FICO® Falcon® Platform, it provides
transaction event monitoring and decisioning technology that allows institutions to implement
fraud protection with an end-to-end, holistic approach. Its comprehensive core functionality
can easily be extended with add-on and custom capabilities, as well as integrated with back-end
payment monitoring and front-end authentication solutions.
1.6.11 FICO® Origination Manager: FICO Origination Manager supports connected decisions
across the entire lending lifecycle for a more holistic view of the customer relationship that can
then be factored into origination decisions. Origination Manager is composed of four core modules
that can operate together or independently to best fit your origination needs.
• The Application Processing Module includes preconfigured workflow processing steps, queues
and data capture that reduce the implementation cycle time.
• The Decision Module, leveraging FICO® Blaze Advisor® decision rules management system,
includes an intuitive user interface that enables business users to author rules and deploy business
strategies quickly, without the need for IT coding.
• The Data Acquisition Module provides access to external data sources such as consumer or
small business credit reporting agencies and alternative data providers. It also includes credit report
characteristics, which are the key components of FICO® Scores.
• The Analytic Module provides access to a FICO-hosted service that scores each applicant using the latest
versions of FICO® Pooled Models.
Small Business Scoring Service Version 7.0: The latest version offers new capabilities that provide
credit grantors with more choice and risk measures for identifying additional profit opportunities.
Capabilities include support for higher loan amounts, a bankruptcy score and additional models.

1.6.12 FICO® TRIAD® Customer Manager: It unifies and automates FICO’s cutting-edge work
in predictive analytics & decision strategies to drive profits. TRIAD clients benefit from precise
and consistent customer treatments in credit cards, current or demand deposit accounts, mortgages
and installment loans.

1.6.13 FICO® Strategy Director for Deposit Management: Strategy Director can
be implemented quickly to produce results with its predefined capabilities. It provides transparency
into the entire decision model, allowing you to adjust continuously to achieve sustained
performance over time. Strategy Director connects directly to existing systems and allows business
users to easily add or update decision variables through its web-based system without the need of
IT support.

1.7 LOCATIONS
• California
• Connecticut
• Colorado
• Delaware
• Florida
• Michigan
• Minnesota
• Montana
• New York
• Texas
• Virginia
• Montreal
• Toronto
• Brazil
• Chile
• Birmingham
• Africa

1.8 CUSTOMERS

Below are some of the customers of FICO:

CHASE, BMW, DELL, Citi Bank, Santander, Walmart, Capital Services, The Co-operative Bank,
Volk Bank, Riyad Bank, TBC Bank, Garanti, EnterCard, BDO, Barclays, Ecobank, African Bank,
ICICI Bank, American Airlines and LLOYDS Banking Group, with customers across markets
including Istanbul, London, Germany, Italy, Spain, Sweden, Poland, Lithuania, Australia,
China, India, Japan, Korea, Malaysia, the Philippines and Thailand.

Figure 1.1 FICO Customers


CHAPTER 2
Training Process

2.1 DESCRIPTION
The training program was a very well-structured process that included various pathways
(a collection of text documentation and online courses) which were suitable for everyone
to learn the basics of the applications that they would be using soon.
The pathway consisted of resources collected from various platforms like O’REILLY,
Udemy for Business and FICO Learning (a platform for the employees).

2.2 Intern Induction Program


This was an introductory program devised to help the interns onboard into the
organization. It consists of several courses that tell us about the company. This program
ran for the initial week of joining. The contents also included a few small team-building
activities and a few educational courses as post-work.
During this first week we had an opportunity to meet some of the senior
employees of the organization and a chance to interact with them, which gave us
inspiration and motivation to learn something different and work hard.

Figure 2.1 Intern Induction Program


2.3 First 30 Days at FICO Program
This is a program devised to help us with the first 4 weeks of onboarding. It is a completely
structured pathway that consists of numerous courses that help us gain an understanding of
what the various departments of the company are meant for, and helps us understand all
the policies that are to be followed by each employee.
A few courses that are covered under this pathway are:
 FICO HR Orientation
 Security and Data Privacy Compliance Training
 Preventing Workplace Harassment
 Global Technology Services On-boarding
o This program contains the complete information about the GTS department
and how the workflow operates.
 Developing for the Cloud: Requirements & Standards
 Privacy by Design and Default
 GTS Service Management
 Product, Technology, and Services Organization On-boarding Program

Figure 2.2 First 30 Days


2.4 First 60 Days at FICO Program

This program is focused on learning about DevOps and CI/CD. The
courses contain a lot of information on various topics such as:

 A practical introduction to DevOps


 The DevOps Essentials – The Handbook
 The Phoenix Project (E-Book)
 DevOps and CI/CD for Beginner’s
 Introduction to Continuous Integration & Continuous Delivery
 DevOps: CI/CD with Git, GitLab, Jenkins, Docker, and Django
 Project in DevOps: Jenkins CI/CD for Kubernetes Deployment.
Figure 2.3 First 60 Days
2.5 First 90 Days at FICO Program

Like the 30 and the 60 days programs, the 90 days program also focuses on providing
us a better understanding of the working process of the company, along with a strong
focus on the Scaled Agile Framework (SAFe). This pathway also consisted of
several courses and documentations that helped us to grow. They are:
 Privacy by Design and Default
 Empowering Talent: Work like a Genius
o It consists of a series of videos that help us learn the tools and techniques
to improve efficiency and productivity
 Empowering Talent: Building a culture of inclusion to Drive Business
Success
 FICO SAFe Essentials
 Introduction to SAFe 5.0
 PI Planning
 Pre-PI Planning Rolling Wave Readiness
 PI Objectives
 Program Increment
 Inspect and Adapt
 Innovation and Planning Iteration
 Agile Release Train
 Lean-Agile Mindset and SAFe Principles
Figure 2.4 First 90 Days
2.6 AWS DevOps Engineer Professional
This is a complete AWS DevOps Engineer Certification Course offered by Udemy that
focuses on many major aspects of AWS Services and their use in DevOps.
This pathway consists of a complete professional course for AWS DevOps, a course for
exam readiness and a link to multiple practice exams that help in preparing for the
exam. A few services that were included in this course include:
 SDLC Automation
o Code Commit
o Code Build
o Code Deploy
o Code Pipeline
o Code Star (Overview)
o Jenkins (Overview)
 Configuration Management and Infrastructure as Code
o Cloud Formation
o Elastic Beanstalk
o AWS Lambda
o API Gateway
o ECS
o ECR
 Monitoring and Logging
o CloudTrail
o CloudWatch
 Policies and Standards Automation
o SSM
o Config
o Secrets Manager
 Incident and Event Response, Fault Tolerance
o Auto Scaling Groups (ASG)
o DynamoDB
Figure 2.5 AWS DevOps Engineer Professional
CHAPTER 3 The Milestones

Problem Statement
Develop, manage, build and deploy a Java-based Web Application on AWS Cloud Infrastructure with a fully
automated continuous integration and continuous delivery pipeline.

Key Points
1. This assignment is distributed into 5 stages, with the final stage being a demonstration. Each stage has its own goal
with requirements.
2. Some references are provided in each stage which can be useful to understand the task. All these
references are available over the internet, and more information can be explored as deemed fit.
3. Each stage has been described with an expected time of completion. If any stage is completed ahead of time,
the next stage's work can be started right away.
4. This is an individual project assignment.

Prerequisites
Access
 FICO Bitbucket server gitserver.fairisaac.com:8443 https://jive.fico.com/docs/DOC-56030
 Access to FICO AWS Training AWS Account
 Go to https://ficoitservices.service-now.com/internaless?id=sc_home
 Select Order Something -> Global Technology Services -> Amazon Web Services -> Request
Access to AWS Console, API or CLI
 Account Name: AWS Training
 Role: Admin
 Notes/Comments: Need access for the DMS Internship training assignment. Administrative access is
needed to create/manage IAM roles for the assignment.

Open Source Software


 Java Development IDE (Eclipse or IntelliJ)
 Git CLI and Source Tree
 Maven Command line tool
 AWS CLI
 MySQL Workbench
 Tomcat

Stages
Stage I
1.1.1 Goal
Develop a sample REST API fronted Web Application project which accepts user inputs, with the data persisted in a
database.
1.1.2 Functional Requirements
 Develop a Web UI which asks the user to submit a form with the following fields and persists them in a MySQL database.
o Name
o Date of Birth
o Aadhaar Number
o Address
o Occupation
 Develop a Web UI which retrieves the user details based on Aadhaar Number and DOB.
 The server should throw an error in case of duplicate Aadhaar number inputs.
 The server should throw an error in case of a mismatch between the Aadhaar Number and Date of Birth.
1.1.3 Technical Requirements
 A standalone Spring application that implements all the required features (logic parts) of this web
application in Java; it should expose a set of REST APIs that can be consumed by the frontend.
 Any UI technology can be used for UI Development.
 Source code should be built via the Maven tool.
 Write unit tests for your Spring application
 MySQL database should be used for data persistence.
 Spring Data JPA (or Hibernate) for database integration.
 Web Application should be deployable on Tomcat server.

1.1.4 Time Duration


 4 Weeks
Stage II
1.1.5 Goal
Manage source code in Git and use the build tool Jenkins to build the source code
1.1.6 Functional Requirements and Technical Requirements
 Create a Git repository to save and manage source code.
 The Git repository should be created on the FICO Git Server https://gitserver.fairisaac.com:8443/
 Set up a Jenkins server locally.
 Set up a Jenkins job to build the source code from the Git repository and generate a deployable war file.
A minimal sketch of such a build pipeline is shown below.
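As a rough illustration only (not the exact job configuration used during the internship), a declarative Jenkinsfile for this stage might look like the following; the repository URL, credentials ID and Maven tool name are placeholders:

pipeline {
    agent any
    tools {
        // Name of the Maven installation configured in Jenkins (assumed)
        maven 'maven-3'
    }
    stages {
        stage('Checkout') {
            steps {
                // Clone the application source from the FICO Git server (credentials ID is hypothetical)
                git url: 'https://gitserver.fairisaac.com:8443/scm/demo/webapp.git',
                    credentialsId: 'fico-git-creds'
            }
        }
        stage('Build') {
            steps {
                // Compile, run unit tests and package the deployable war
                sh 'mvn -B clean package'
            }
        }
        stage('Archive') {
            steps {
                // Keep the generated war as a build artifact
                archiveArtifacts artifacts: 'target/*.war', fingerprint: true
            }
        }
    }
}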
1.1.7 Time Duration
 1 week

Stage III
1.1.8 Goal
Setup AWS Cloud Infrastructure and Deploy Web Application on Cloud
1.1.9 Functional Requirements and Technical Requirements
 Create an Amazon VPC with one public and one private subnet. The public subnet should be attached to an
Internet Gateway and the private subnet to a NAT Gateway. (Create tickets with GTS to get
the VPC created due to a recent process change.)
 Set up an Amazon EC2 Instance in the private subnet.
 Set up an Amazon Classic Load Balancer in the public subnet. It should be accessible from your network only,
i.e., accessible only from your Internet/Wi-Fi network.
 Attach the EC2 Instance to the Classic Load Balancer.
 Set up Tomcat and MySQL server on the EC2 Instance.
 Create an S3 Bucket and upload the deployable Web Application (war).
 Create an Amazon SSM Document which allows downloading a file from S3 and placing it on the EC2 instance.
 Execute the SSM document via the AWS Command Line or AWS Console to deploy the Web Application on the
Tomcat server.
 Access the Web Application via the Load Balancer's public DNS.

1.1.10 Time Duration


 2 weeks

Stage IV
1.1.11 Goal
Deploy AWS Cloud Infrastructure components using single click automation

1.1.12 Functional Requirements and Technical Requirements


 Develop an AWS CloudFormation template in JSON or YAML format to set up the entire AWS infrastructure
from Stage III. A sketch of invoking such a template is shown below.
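For illustration, assuming a hypothetical template file named infra.yaml and stack name demo-webapp-stack, the "single click" can wrap the standard CloudFormation deploy command in a one-stage pipeline:

pipeline {
    agent any
    stages {
        stage('Provision Infrastructure') {
            steps {
                // Create or update the whole stack (VPC, subnets, EC2, ELB) in one step;
                // the template file and stack names here are placeholders
                sh 'aws cloudformation deploy --template-file infra.yaml --stack-name demo-webapp-stack --capabilities CAPABILITY_NAMED_IAM'
            }
        }
    }
}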

1.1.13 Time Duration


 1 week

Stage V
1.1.14 Goal
Setup Fully Automated Continuous Delivery Pipeline

1.1.15 Technical and Functional Requirements


 One preconfigured Jenkins Server will be provided.
 Set up the Jenkins job described in Section II.a.iii on the above Jenkins server. This will be called the Continuous
Integration Job.
 Set up a Jenkins job which can execute the SSM Document developed in Section III.a.vii via the AWS CLI. This
will be called the Continuous Deployment Job.
 Set up a Jenkins Pipeline which can execute the Continuous Integration Job and the Continuous Deployment Job in a
sequential manner (a sketch follows after this list).
 Configure the Jenkins Pipeline to listen for git commit events on the specific branch of the git repository where the
source code of the web application is checked in.
 You can also write integration test cases to be executed as a stage in the pipeline (optional).
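A minimal sketch of the orchestrating pipeline, assuming the two jobs were created under the hypothetical names ci-job and cd-job:

pipeline {
    agent any
    triggers {
        // Poll the repository for new commits on the watched branch
        // (a Bitbucket webhook can replace polling, as described later in this report)
        pollSCM('H/5 * * * *')
    }
    stages {
        stage('Continuous Integration') {
            steps {
                // Build the deployable war from source (job name is a placeholder)
                build job: 'ci-job'
            }
        }
        stage('Continuous Deployment') {
            steps {
                // Execute the SSM document via the AWS CLI to deploy the war to Tomcat
                build job: 'cd-job'
            }
        }
    }
}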

1.1.16 Time Duration


 1 week

Stage VI
1.1.17 Goal
Demonstrate fully automated web application deployment on AWS Cloud Infrastructure
1.1.18 Requirements
 Show the source code repository in the git server and the Jenkins Pipeline configurations.
 Deploy the AWS Cloud Infrastructure with a single-click CloudFormation deployment.
 Make a dummy check-in to the git repository, which should trigger the Jenkins pipeline and eventually
deploy the Web Application on the Tomcat server configured on the AWS EC2 Instance.
 The Web Application should be accessible via the public DNS of the Classic Load Balancer deployed as part of the
CloudFormation stack.

1.1.19 Time Duration


1 Week.

Stage VII
1.1.20 Goal
Demonstrate fully automated web application deployment on AWS Cloud Infrastructure employing docker and
image registry.
1.1.21 Requirements
 Create a docker image of your web application.
 Push the image to ECR/Artifactory.
 Deploy the docker image to a Fargate cluster.
 The Web Application should be accessible via the public DNS of the application load balancer.

1.1.22 Time Duration

2 Weeks.

1.1.23 The Milestones

3.1 Milestone #1 Container & Pipeline

 Jenkins-Pipeline that builds a docker container and pushes it to a docker
repository (Artifactory or AWS ECR).
 Entry-Point for the Docker-Container needs to be a "program" in any
desired programming or scripting language.
 BONUS (if you have time left)
o Jenkins-Pipeline as code
o Unit Tests for the program
o Integration Test

3.2 Milestone #2 Infrastructure as Code & automated deployment

 Jenkins-Pipeline (as code) that uses Infrastructure as Code (CDK,
CloudFormation, Terraform) to provision all infrastructure required to run the
Docker Container as an AWS Fargate Task.
 Automated deployment after commit to source control
 BONUS (if you have time left)
o Expanded integration tests that ensure the infrastructure and changed
code in the program execute and behave as expected

3.3 Milestone #3 Integration into H2P and Promotion

 Get the code of an existing control, integrate it into this structure and into
your pipeline
 Test this delivered initial control in the DevX AWS infrastructure
 BONUS (if you have time left)
o Integrate your pipeline with H2P
1.1.24 Group Name – CheckmarX Debuggers

Brief Introduction to the Jenkins Pipeline and the Jenkinsfile:


Jenkins Pipeline is a suite of plugins which supports implementing and integrating
continuous delivery pipelines into Jenkins.
A continuous delivery (CD) pipeline is an automated expression of your process for
getting software from version control right through to your users and customers.
Every change to your software (committed in source control) goes through a complex
process on its way to being released. This process involves building the software in
a reliable and repeatable manner, as well as progressing the built software (called a
"build") through multiple stages of testing and deployment. Pipeline provides an
extensible set of tools for modelling simple to complex delivery pipelines "as code".
The definition of a Jenkins Pipeline is written into a text file (called a Jenkinsfile)
which in turn can be committed to a project's source control repository. This is the
foundation of "Pipeline-as-code": treating the CD pipeline as a part of the application
to be versioned and reviewed like any other code.

Creating a Jenkinsfile and committing it to source control provides several
immediate benefits:
 Automatically creates a Pipeline build process for all branches and pull
requests.
 Code review/iteration on the Pipeline (along with the remaining source code).
 Audit trail for the Pipeline.
 Single source of truth for the Pipeline, which can be viewed and edited by
multiple members of the project.
A minimal example of such a Jenkinsfile is sketched below.
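The following is a minimal, generic declarative Jenkinsfile sketch (not one of FICO's actual pipelines); the stage contents are placeholders:

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Compile and package the application (command is a placeholder)
                sh 'mvn -B clean package'
            }
        }
        stage('Test') {
            steps {
                // Run the automated test suite
                sh 'mvn test'
            }
        }
        stage('Deploy') {
            steps {
                // Ship the built artifact to the target environment (placeholder step)
                echo 'Deploying the build...'
            }
        }
    }
}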
1.1.1 Checkmarx Scanning

Checkmarx is the global leader in software security solutions for modern enterprise
software development. Checkmarx delivers the industry’s most comprehensive Software
Security Platform that unifies with DevOps and provides static and interactive application
security testing, software composition analysis and developer AppSec awareness and
training programs to reduce and remediate risk from software vulnerabilities. Checkmarx
is trusted by more than 40 percent of the Fortune 100 and half of the Fortune 50,
including leading organizations such as SAP, Samsung and Salesforce.com.

1.1.2 Protect workloads running on:

Amazon EKS
Prevent unauthorized images from running in your EKS cluster, enforce container
immutability, network segmentation and segregation of duties.

1.1.3 Secure applications running on AWS Fargate containers


Embed Aqua Micro Enforcer into your containers to ensure that workloads running
on AWS Fargate are only performing their intended function, and detect vulnerable or
compromised containers.

1.1.4 Extend security from Amazon ECR to Amazon ECS


Manage image vulnerabilities, ensure only trusted images can be deployed, automatically
whitelist legitimate container behaviour, and detect and block suspicious activities.

1.1.5 Application Of Checkmarx in Our Project:


We will download the scanner CLI, which will scan the repository
and let us know if there is any vulnerability in the docker image with
colour-coded signals.

TASK:
1. Open Java App -> This is done on AWS EC2: Docker and Jenkins are configured
inside an Ubuntu machine running on AWS EC2, and the image built through the
Jenkins-Pipeline is pushed to AWS ECR. A sketch of the push stage is shown below.
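As an illustrative sketch only (the account ID, region and image name are placeholders, not the actual values used), the docker build and ECR push stage could look like:

pipeline {
    agent any
    environment {
        // Placeholder registry, image and region values for illustration
        ECR_REGISTRY = '123456789012.dkr.ecr.us-east-1.amazonaws.com'
        IMAGE = 'demo-app'
        AWS_REGION = 'us-east-1'
    }
    stages {
        stage('Build and Push Image') {
            steps {
                sh '''
                    # Build the application image from the Dockerfile in the workspace
                    docker build -t ${IMAGE}:${BUILD_NUMBER} .

                    # Authenticate the docker client against ECR
                    aws ecr get-login-password --region ${AWS_REGION} | docker login --username AWS --password-stdin ${ECR_REGISTRY}

                    # Tag and push the image to the ECR repository
                    docker tag ${IMAGE}:${BUILD_NUMBER} ${ECR_REGISTRY}/${IMAGE}:${BUILD_NUMBER}
                    docker push ${ECR_REGISTRY}/${IMAGE}:${BUILD_NUMBER}
                '''
            }
        }
    }
}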

2. First Maven Project -> Done locally on a Mac machine with Jenkins, Maven and
Git. The Maven project is built with its test cases and the generated report is output. The
jar artifact is published to the Maven directory after the build is successful.

3. Simple-java-maven-app -> We will build a simple Java app,
using the FICO Bitbucket server. For this, go to your Bitbucket account
and click on manage account.

In the account, go to personal access tokens and create
a token.
Now go to Jenkins and manage plugins. Then install
Bitbucket Server Integration. Once installed, go to
Manage Jenkins, then Configuration, and add the token
under the Bitbucket settings there.
DOCKER FILE:
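The original Dockerfile appears here as a screenshot (Figure 3.1 Docker File). As a stand-in, a minimal Dockerfile for a Java web application of this kind might look like the following; the base image and war file name are assumptions, not the original contents:

# Illustrative Dockerfile (the original is shown in Figure 3.1)
# Run the packaged web application on a Tomcat base image
FROM tomcat:9.0

# Copy the war built by the pipeline into Tomcat's webapps directory
COPY target/demo-app.war /usr/local/tomcat/webapps/ROOT.war

# Tomcat listens on port 8080 by default
EXPOSE 8080

CMD ["catalina.sh", "run"]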

1.1.6 JENKINS FILE:

1. Firstly, we must specify which tools we will be using.
2. In the next stage we will check out our remote repository.
3. We will install the required dependencies, i.e., npm and cdk, for our
project.
4. We will synthesize our project, which will generate the CloudFormation
template.
5. We use cdk deploy to create our stack and deploy our application.
6. Once we are done, we will destroy our stack.
A sketch of a Jenkinsfile following these six steps is given below.
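For illustration, a Jenkinsfile shaped like the six steps above might read as follows; the tool name, repository URL and credentials ID are placeholders, not the actual project's values:

pipeline {
    agent any
    tools {
        // 1. Tools used by the pipeline (requires the Jenkins NodeJS plugin; name as configured, assumed)
        nodejs 'node-16'
    }
    stages {
        stage('Checkout') {
            steps {
                // 2. Check out the remote repository (URL and credentials are placeholders)
                git url: 'https://gitserver.fairisaac.com:8443/scm/demo/cdk-app.git',
                    credentialsId: 'fico-git-creds'
            }
        }
        stage('Install Dependencies') {
            steps {
                // 3. Install npm dependencies and the CDK CLI
                sh 'npm install'
                sh 'npm install -g aws-cdk'
            }
        }
        stage('Synthesize') {
            steps {
                // 4. Generate the CloudFormation template from the CDK app
                sh 'cdk synth'
            }
        }
        stage('Deploy') {
            steps {
                // 5. Create the stack and deploy the application
                sh 'cdk deploy --require-approval never'
            }
        }
        stage('Destroy') {
            steps {
                // 6. Tear the stack down once done
                sh 'cdk destroy --force'
            }
        }
    }
}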
1.1.7 JENKINS DASHBOARD:

Create a new job with 'Pipeline script from SCM' as the pipeline definition.

In our task, whenever we push anything to our remote repository, the build should be automatically
triggered; for that we will add a webhook in the Bitbucket repo.
Once done, if we make any change to the repo, the build is automatically triggered.

Figure 3.3 Jenkins Pipeline Build

ECR REPOSITORY:

Figure 3.4 Image in ECR Repository
CHAPTER 4 UNDERSTANDING GREMLIN

Turn failure into resilience. Gremlin provides you with the framework to simulate real outages
safely, securely, and simply with an ever-growing library of attacks. Using Chaos Engineering to
improve system resilience, Gremlin’s “Failure as a Service” makes it easy to find weaknesses in
your system before they cause problems for your customers.

It offers several categories of attacks to inject faults into our system:

 Resource: Starve your application of critical resources
 State: Change the state of the environment your application is running within
 Network: Simulate the inherently unreliable behaviour of the network
 Request: Impact individual requests as they hit the wire

Gremlin is a simple, safe, and secure way to use Chaos Engineering to improve system resilience.
The Gremlin Platform provides a range of attacks which you can run against your infrastructure.
This includes Resource Gremlins, Network Gremlins and State Gremlins. It is also possible to
schedule regular attacks, create attack templates, and view attack reports. It provides a library of
possible failure modes to test. You can impact system resources, delay or drop network traffic to
your dependencies, shut down your hosts, and much more! Each attack, or "gremlin", tests your
resilience in a different way.

Resource Gremlins
Resource gremlins are a great starting point: simple to run and understand. They reveal how your
service degrades when starved of CPU, memory, IO, or disk.

State Gremlins
State gremlins introduce chaos into your infrastructure so that you can observe how
well your service handles it or fails.

Network Gremlins
Network gremlins allow you to see the impact of lost or delayed traffic to your
application. Test how your service behaves when you are unable to reach one of your
dependencies, internal or external. Limit the impact to only the traffic you want to
test by specifying ports, hostnames, and IP addresses.

Stages
Stages are sorted by descending order of importance (the Running stage holds the
highest importance):

 Running: Attack running on the host
 Halt: Attack told to halt
 RollbackStarted: Code to rollback has started
 RollbackTriggered: Daemon started a rollback of client
 InterruptTriggered: Daemon issued an interrupt to the client
 HaltDistributed: Distributed to the host but not yet halted
 Initializing: Attack is creating the desired impact
 Distributed: Distributed to the host but not yet running
 Pending: Created but not yet distributed
 Failed: Client reported unexpected failure
 HaltFailed: Halt on client did not complete
 InitializationFailed: Creating the impact failed
 LostCommunication: Client never reported finishing/receiving execution
 ClientAborted: Something on the client/daemon side stopped the Gremlin and it was aborted without user intervention
 UserHalted: User issued a halt, and that is now complete
 Successful: Completed running on the host
 TargetNotFound: Attack not scoped to any current targets


Figure 4.1 Gremlin Dashboard

Figure 4.2 Monitor Attacks


CHAPTER 5 TASKS – E2E Controls

Like the milestones, I am working on the end-to-end controls. The control I am currently
working on is ror.020 ScaleUpDown. It is a Release and Operational Readiness
Control.

The Release and Operational Readiness chapter's (i.e., the ROR Chapter's) objective is
to identify, from a release and operational perspective, the relevant
controls/measurements for each Delivery Pipeline Phase, so that the quality of the
services/binaries produced is ensured. The main goal is to have "Everything as Code"
and repeatable.

The purpose of this control is to test the ability of any running application to scale
up or scale down under any load or any attack. No service interruption should be
seen by the end user.
We have used Gremlin to introduce network latency and measure the
corresponding impact on the application.

Control Rules: Gremlin can perform a 'health' validation. We may need to pass some
test data, but this should show no appreciable difference to the customer service (HTTP
response code and response time) to pass the test. Valid response codes with higher-
than-expected response times could be classed as an 'Amber' warning for the control.
A sketch of such a health validation stage is given below.
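As an illustrative sketch, assuming a hypothetical /health endpoint and a 2-second threshold (neither is from the actual control), the validation could be a pipeline stage like:

pipeline {
    agent any
    stages {
        stage('Health Validation') {
            steps {
                // Endpoint URL and the 2-second threshold are placeholders for illustration
                sh '''
                    code=$(curl -s -o /dev/null -w "%{http_code}" http://demo-app.example.com/health)
                    time=$(curl -s -o /dev/null -w "%{time_total}" http://demo-app.example.com/health)
                    echo "HTTP code: ${code}, response time: ${time}s"

                    # An invalid response code fails the control outright
                    if [ "${code}" != "200" ]; then
                        exit 1
                    fi

                    # A valid code with a higher-than-expected response time is an Amber warning
                    if [ "$(awk -v t="${time}" 'BEGIN { print (t > 2.0) }')" = "1" ]; then
                        echo "AMBER: valid response, but slower than expected"
                    fi
                '''
            }
        }
    }
}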

File Structure:

 Jenkins build for ror.020


CONCLUSION

During this period, I have collaborated efficiently with my team
many times, such as when working on the milestones or working on the
controls, which helped me explore a lot about cloud services like
AWS Services, lambda functions and CDK. I learnt different
automation tools like Terraform, Docker, Jenkins, etc. I also
worked with security tools like SonarQube and Checkmarx.

Beyond the technical skills, this helped me work efficiently in a
team. There is no doubt that there is always scope for
improvement, however good or efficient the program may be; the
important thing is that the system developed should be flexible
enough to accommodate any future enhancements. In these 6 months
I have learned not only technical details but also the corporate
policies and formalities. It helped me understand the responsibilities
of several designated persons, as well as mine as an intern. In this
phase I was asked to concentrate on my learning.

I am having an amazing experience, leveraging my skills under the
guidance of my manager with the resolute DevSecOps team.

I will continue to put my efforts so that I can learn and grow and
be a great asset for the company.

SPOT AWARDS.
1. Fixing Black Duck Security Control.

2. Consistent controls delivery in the E2E team.

REFERENCES

 FICO Learning
 Jive
 Udemy for Business
 Stack Overflow
 O’REILLY
 Medium Articles
 AWS documentation
 Blogposts
