
CONTINUOUS INTEGRATION AND CONTINUOUS

DELIVERY USING DevOps


(Skill Oriented Course III)
LAB MANUAL

Exercise 1
Software engineering and Agile software development. Get an understanding of the stages
in the software development lifecycle, the process models, the values and principles of
agility, and the need for agile software development. This will enable you to work on
projects following an agile approach to software development.
Software engineering and Agile software development
What is Agile?
Agile is the ability to create and respond to change. It is a way of dealing with, and ultimately
succeeding in, an uncertain and turbulent environment.
What is Agile Software Development?
Agile software development is more than frameworks such as Scrum, Extreme Programming,
or Feature-Driven Development (FDD).
Agile software development is more than practices such as pair programming, test-driven
development, stand-ups, planning sessions, and sprints.
Agile software development is an umbrella term for a set of frameworks and practices based on
the values and principles expressed in the Manifesto for Agile Software Development and the
12 Principles behind it. When you approach software development in a particular manner, it’s
generally good to live by these values and principles and use them to help figure out the right
things to do given your particular context.
One thing that separates Agile from other approaches to software development is the focus on
the people doing the work and how they work together. Solutions evolve through collaboration
between self-organizing cross-functional teams utilizing the appropriate practices for their
context.
What are Agile Methodologies?
Agile methodologies are the conventions that a team chooses to follow in a way that follows
Agile values and principles.
“Wait,” you’re probably saying, “I thought Scrum and XP were Agile methodologies.” Alistair
Cockburn applied the term framework to those concepts. They certainly were born from a single
team’s methodology, but they became frameworks when they were generalized for use by other
teams. Those frameworks help inform where a team starts with its methodology, but they
shouldn’t be the team’s methodology. The team will always need to adapt its use of a framework
to fit properly in its context.
What about Agile Project Management or Agile Business Analysis?
As Agile Software Development became more popular, people involved with software
development activities but who didn’t personally develop software looked for some way to
figure out how these Agile ideas applied to their line of work.
The Agile Manifesto and the 12 Principles were written by a group of software developers (and
a tester) to address issues that software developers faced. When you think of Agile as a mindset,
that mindset can be applied to other activities.
What are the Key Agile Concepts?
User Stories: In consultation with the customer or product owner, the team divides up the work
to be done into functional increments called “user stories.” Each user story is expected to yield
a contribution to the value of the overall product.
Daily Meeting: Each day at the same time, the team meets to bring everyone up to date on the
information that is vital for coordination: each team member briefly describes any
“completed” contributions and any obstacles that stand in their way.
Personas: When the project calls for it – for instance when user experience is a major factor in
project outcomes – the team crafts detailed, synthetic biographies of fictitious users of the future
product: these are called “personas.”
Team: A “team” in the Agile sense is a small group of people, assigned to the same project or
effort, nearly all of them on a full-time basis. A small minority of team members may be
part-time contributors or may have competing responsibilities.
Incremental Development: Nearly all Agile teams favor an incremental development strategy;
in an Agile context, this means that each successive version of the product is usable and each
builds upon the previous version by adding user-visible functionality.
Iterative Development: Agile projects are iterative insofar as they intentionally allow for
“repeating” software development activities, and for potentially “revisiting” the same work
products.
Milestone Retrospective: Once a project has been underway for some time, or at the end of
the project, all of the team’s permanent members (not just the developers) invest from one to
three days in a detailed analysis of the project’s significant events.

Exercise 2
Development & Testing with Agile: Extreme Programming. Get a working knowledge of
using extreme automation through XP programming practices of test first development,
refactoring and automating test case writing.
Development & Testing with Agile: Extreme Programming
What is Extreme Programming (XP)?
Extreme Programming (XP) is an agile software development framework that aims to produce
higher quality software, and higher quality of life for the development team.
XP is the most specific of the agile frameworks regarding appropriate engineering practices for
software development.
The general characteristics where XP is appropriate are
* Dynamically changing software requirements
* Risks caused by fixed time projects using new technology
* Small, co-located extended development team
* The technology you are using allows for automated unit and functional tests
AGILE TESTING
Agile Testing is quite different from traditional, sequential testing processes. Heavyweight
requirements documents are not a prerequisite for test processes in agile. Testers join the
developers and users in their initial planning meetings. During these meetings, testers note
down the requirements themselves and then check the developed code against those
requirements.
EXTREME PROGRAMMING
Extreme programming (XP) is a well-known agile practice. XP manages project tasks, project
manuals, and documentation in a way that reduces cost and supports changing requirements. XP
is based on an iterative and incremental approach, normally executed in small cycles. XP brings
the whole team together, thereby increasing productivity and creating a cohesive working
environment. The idea behind team clustering is to get enough feedback so that the team can
highlight its weak points and tune them accordingly. This is why XP is normally said to be a
people-oriented process rather than a process-oriented one.
XP has various dialects. Here the focus will be on:
1) Pair programming
2) Ping Pong Programming
Pair Programming:

Pair programming is a concept used within XP. In pair programming, two people collaborate
on the same algorithm, coding, or design task. The two participants sit side by side at one
working computer. One person is responsible for writing the code or designing the algorithm
while the other sits beside them and keeps reviewing the work. The person writing the code is
generally known as the “driver,” while the one who reviews the code is known as the
“navigator” or “observer.” This practice improves the software development process.
1) Code reviews
In code reviews, reviewers set aside dedicated time to go over someone’s code. Every reviewer
has comments on the code, but not all of them will work on that code on a daily basis. Everyone
seems engaged and involved in the discussion at the time, but once the review meeting is done,
not all of them will actually be part of the coding process, which weakens the feedback loop.
2) How code reviews relate to pair programming
Code reviews are done to analyse the quality of code, whereas pair programming is designed
to produce reviewed code. In pair programming, the driver and navigator are continuously
writing and reviewing the code, increasing the effectiveness of the feedback loop and
eliminating code review overhead in later stages. Introducing pair programming and its
variants into agile software development and testing increases code quality, which ultimately
reduces the need for code review meetings in the later testing phase.
Ping Pong Programming
Both members of the pair write code. One of them writes test code and the other writes the
actual code (product code). For example, A writes a test for product code written by B: A aims
to produce a failing test, whereas B aims to make that test pass.
Tested code
The ping pong approach is designed to produce tested code. There is a difference between
tested and reviewed code. Reviewed code is code that is clean and conforms to review
standards. Tested code is code that actually passes all the necessary test cases. Code that is
tested is not necessarily clean and reviewed.
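The ping pong loop above can be sketched even in shell (the toy `add` function and its test are illustrative stand-ins; in a Java project the test would typically be a JUnit case): partner A writes the failing test first, then partner B writes just enough code to make it pass.

```shell
# Hypothetical ping pong round: names add/test_add are illustrative.

# A's test, written first; it fails until add() exists and is correct.
test_add() {
    result=$(add 2 3)
    if [ "$result" -eq 5 ]; then
        echo "PASS"
    else
        echo "FAIL: expected 5, got $result"
        return 1
    fi
}

# B's implementation, written only to make A's test pass.
add() {
    echo $(( $1 + $2 ))
}

test_add   # prints PASS
```

The roles then swap: B writes the next failing test, and A implements.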

Exercise 3
DevOps adoption in projects. It is important to comprehend the need to automate the
software development lifecycle stages through DevOps. Gain an understanding of the
capabilities required to implement DevOps, continuous integration and continuous
delivery practices.
DevOps adoption in projects
What is DevOps?
DevOps is not a tool, technology or framework. Instead, it is a set of practices that help bridge
the gap between development and operations teams in an enterprise. By bridging the gap,
DevOps eliminates communication barriers and makes collaboration easier.

DevOps cannot be efficient without an agile setup. There is no point optimizing development
and accelerating build processes if the new code will not go to the users until the next ‘big’
release.
What is Agile DevOps?
To bring two technologies together, we must understand both of them individually, which in
turn, will help us understand how well they can gel with each other.
While both DevOps and Agile are modern software development practices designed to produce
a part of a product, a launch, or a release, the approaches they follow are different. Let us
try to compare the approaches that the two practices follow:
(See: How Can DevOps and Agile Work Together to Help Your Business Grow? – Appinventiv)

7 Steps to Successful DevOps Adoption

Although the idea of DevOps is not new—it has been around for more than ten years—many
firms have not yet put it into practice. And some organizations still have trouble using DevOps
to get the results they want. Here are the steps that will help in the successful adoption of
DevOps.
Adopt a DevOps mindset
The process doesn’t start just by saying “Let’s implement DevOps.” Everyone in your
organization must be willing to change the way things are currently done and have a clear
sense of what DevOps is and the specific business demands it can address.
Organizations frequently mix up automation and DevOps. While automation helps speed
up manual operations, cooperation and communication are the key objectives of DevOps.
Automating your operations won’t bring about the desired business benefits unless everyone
involved in the software development, delivery, testing, and operating processes adopts
excellent communication and collaborative practices.
The best way to implement DevOps effectively is to make sure that everyone involved in the
delivery cycle is more flexible and has an innovative mentality.
Everyone participating in the process should be aware of their duties and responsibilities and
trained to cooperate for DevOps to become the organization’s culture. For DevOps to succeed,
the organization’s leadership must have confidence in it and must assist in fostering a DevOps
culture.
Recognize your infrastructure requirements
There is no “one size fits all” DevOps solution, despite what those who offer DevOps solutions
will tell you. You can’t merely hire a self-described “DevOps engineer” or toss in an online
tool and expect success.

Each organization’s DevOps journey will be distinct and based on its own business, culture,
and infrastructure. The crucial next step is to have a deeper grasp of your application’s
requirements. It enables you to make DevOps adoption business-driven and match
infrastructure architecture with your organizational goals.
Evaluate your project delivery cycle and testing environments to find areas for improvement
and possible bottlenecks.
Your DevOps adoption won’t be successful without integrating Continuous Integration and
Continuous Delivery (CI/CD) pipelines into your workflow. Why? Because Continuous
Delivery enables your development teams to deploy changes in production, and Continuous
Integration helps them develop a product in small phases and identify and rectify faults
instantly.
Create a DevOps strategy
Program managers must establish a shared objective to bring teams together in a collaborative
setting. It instills a sense of responsibility and obligation in each team member. DevOps relies
heavily on best practices that promote innovative approaches to software development,
architecture, and testing while enhancing teamwork.
Your strategy should be focused on two objectives: helping the team as a whole do its work to
the best of its ability and facilitating the continuous deployment of processes that are ready for
production.
Choose the right DevOps tools
There isn’t a single tool that can handle all of the demands and key purposes of DevOps. The
best course of action is to select a collection of tools that are ideal for the organization’s software
delivery environment, applications, and teams.
The appropriate tools help organizations establish a solid DevOps framework, accomplish a
continuous process from development to delivery, aid in resource and cost optimization, support
seamless process execution, and ultimately fulfill organizational goals.

Organizations must take the following considerations into account when selecting the
appropriate DevOps tools:
The tools ought to be capable of enterprise-level automation. This will help scale business
workflows and continuously improve operations without adding more effort.
Integrating the entire delivery ecosystem is required in DevOps. Consequently, the tools you
select should have integration capabilities.
Increase test automation and align QA with development
DevOps requires appropriate automated testing in order to achieve faster delivery. Not all
testing types need to be automated. For instance, manual testing should still be done for
investigative, security, and usability testing. Functional testing may only be partially automated,
depending on the amount of writing effort required.
Development and testing are done simultaneously to prevent bugs after a release. The
recommended approach is to run automated tests 1-2 times per day while the program is still
being developed. If any issues are discovered, developers can concentrate on stabilizing the
software before deploying the latest build.
Application containerization
Application containerization is a rapidly developing technology that is altering how cloud-
based application instances are tested and run by developers. Your programs become
lightweight and simple to execute when you containerize them.
Container packaging makes software more reliable as it moves between environments.
Additionally, the software is independent of the broader infrastructure, thanks to its container
components. This improves its ability to operate in any context. Furthermore, containerizing
enables DevOps teams to quickly manage the application and make any adjustments required
for a specific microservice.
Focus on iterative adoption
Avoid attempting to launch a comprehensive DevOps initiative across the enterprise when just getting started.
Choose a pilot application, put together a cross-functional DevOps team made up of developers,
testers, and operations personnel, assess your value stream to discover bottlenecks and
restrictions, and develop a preliminary deployment pipeline that takes a few of your process
constraints into account.
Measure your success and growth, then repeat the process. Before starting to expand to
additional projects, you must go through a few iterations to gain trust in the framework and the
pilot.
Generally, since doing so would have the greatest commercial impact, you should start by
addressing your largest value-stream restrictions. Some of these restrictions will be simple to
overcome, while others will require a lot of time.

Exercise 4
Implementation of CICD with Java and open-source stack. Configure the web application
and version control with Git, using Git commands and version control operations.
Implementation of CICD with Java and open-source stack
What Is CICD?
Continuous integration (CI) and continuous delivery (CD) deliver software to a production
environment with speed, safety, and reliability.
Continuous Integration
Continuous Integration (CI) is a development practice that requires developers to integrate code
into a shared repository several times a day. Each check-in is then verified by an automated
build, allowing teams to detect problems early. By integrating regularly, you can detect errors
quickly, and locate them more easily.
Tools for CI
Jenkins—a free, open-source, Java-based tool that gives you a lot of flexibility.
Azure Pipelines—a Microsoft product free for up to five users and open-source projects.
Cloud Build—the managed service offering from Google Cloud Platform.
Travis CI—a popular tool for GitHub open-source projects that offers a hosted or self-hosted
solution.
GitLab CI—a free tool from GitLab that can also integrate with other tools via the API.
Circle CI—a tool that’s popular for GitHub projects and has a hosted and self-hosted solution.
You can start for free.
CodeShip—a self-hosted-only solution. You can start with the free version, but it’s a paid tool.
Continuous Delivery
Continuous Delivery is the ability to get changes of all types—including new features,
configuration changes, bug fixes and experiments—into production, or into the hands of users,
safely and quickly in a sustainable way. We achieve all this by ensuring our code is always in
a deployable state, even in the face of teams of thousands of developers making changes on a
daily basis.
Tools for CD
A few of the tools for CD are also tools for CI. That’s why I’ll repeat a few tools here from the
CI tools list. But there are also a few new ones.
Jenkins—can also be used for CD with its pipeline as code, Ansible, or Terraform plugins.

Azure Pipelines—has a release definition section that you can integrate with a build stage from
CI.
Spinnaker—gaining popularity, and it’s the tool that Netflix uses to do releases in a CD way.
GitLab CI—lets you configure deployment and release pipelines with GitLab.
GoCD—the ThoughtWorks offering that applies the principles I’ve discussed in this post.
A typical CI/CD workflow
1) An engineer codes application changes using Visual Studio.
2) When the code is ready for integration, it’s pushed to a Git repository.
3) CI automatically triggers the execution of test cases that will confirm that the code is
available for release.
4) In CICD Pipelines, the release pipeline triggers automatically to deploy the artifacts produced
in the CI stage.
5) An artifact is released into the Web App—let’s say to a development environment.
6) Application Insights collects information from the site to provide feedback to the team.
7) The team uses the information available after a release to know the status and impact of the
latest version.
8) Any new feature or bug fix is added and prioritized into the backlog.
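The stage sequence above can be sketched as one script (the stage names and the echoed commands are placeholders for illustration, not a real Jenkins or Azure Pipelines configuration):

```shell
# Sketch of a CI/CD run: each stage must succeed before the next starts.
set -e    # a failing stage aborts the pipeline, blocking a broken build

run_stage() {
    name="$1"; shift
    echo "== stage: $name =="
    "$@"
}

run_stage checkout echo "git clone <repo-url>"
run_stage build    echo "mvn -q -DskipTests package"
run_stage test     echo "mvn -q test"
run_stage deploy   echo "release artifact to the Web App"
echo "pipeline finished"
```

Real CI servers implement exactly this fail-fast behavior: a red build stops the release pipeline from ever triggering.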
Understanding the GIT Workflow
GIT is the most widely used open-source VCS (version control system) that allows you to track
changes made to files. Companies and programmers usually use GIT to collaborate on
developing software and applications.
A GIT project consists of three major sections: the working directory, the staging area, and the
git directory.
The working directory is where you add, delete, and edit the files. Then, the changes are staged
(indexed) in the staging area. After you commit your changes, the snapshot of the changes will
be saved into the git directory.
Everyone can use GIT as it is available for Linux, Windows, Mac, and Solaris. The software
may have a steep learning curve, but there are lots of GIT tutorials ready to help you.
Basic GIT Commands
Here are some basic GIT commands you need to know:
git init will create a new local GIT repository. The following Git command will create a
repository in the current directory:
➢ git init

Alternatively, you can create a repository within a new directory by specifying the project
name:
➢ git init [project name]
git clone is used to copy a repository. If the repository lies on a remote server, use:
➢ git clone username@host:/path/to/repository
Conversely, run the following basic command to copy a local repository:
➢ git clone /path/to/repository
git add is used to add files to the staging area. For example, the following basic Git command
will index the temp.txt file:
➢ git add temp.txt
git commit will create a snapshot of the changes and save it to the git directory.
➢ git commit -m "Message to go with the commit here"
git config can be used to set user-specific configuration values like email, username, file format,
and so on. To illustrate, the command for setting up an email will look like this:
➢ git config --global user.email youremail@example.com
The --global flag tells GIT that you’re going to use that email for all local repositories. If you
want to use different emails for different repositories, use the command below:
➢ git config --local user.email youremail@example.com
git status displays the list of changed files together with the files that are yet to be staged or
committed.
➢ git status
git push is used to send local commits to a branch of the remote repository. Here’s
the basic code structure:
➢ git push origin <branch-name>
git checkout creates branches and helps you to navigate between them. For example, the
following basic command creates a new branch and automatically switches you to it:
➢ git checkout -b <branch-name>
To switch from one branch to another, simply use:
➢ git checkout <branch-name>
git remote lets you view all remote repositories. The following command will list all
connections along with their URLs:
➢ git remote -v
To connect the local repository to a remote server, use the command below:
➢ git remote add origin <host-or-remoteURL>
Meanwhile, the following command will delete a connection to a specified remote repository:

➢ git remote rm <name-of-the-repository>
git branch will list, create, or delete branches. For instance, if you want to list all the branches
present in the repository, the command should look like this:
➢ git branch
If you want to delete a branch, use:
➢ git branch -d <branch-name>
git pull merges all the changes present in the remote repository to the local working directory.
➢ git pull
git merge is used to merge a branch into the active one.
➢ git merge <branch-name>
git diff lists down conflicts. In order to view conflicts against the base file, use
➢ git diff --base <file-name>
The following basic command is used to view the conflicts between branches before merging
them:
➢ git diff <source-branch> <target-branch>
To list down all the present conflicts, use:
➢ git diff
git tag marks specific commits. Developers usually use it to mark release points like v1.0 and
v2.0.
➢ git tag v1.0 <commit-id>
git log is used to see the repository’s history by listing certain commit’s details. Running the
command will get you an output that looks like this:
commit 15f4b6c44b3c8344caasdac9e4be13246e21sadw
Author: Alex Hunter <alexh@gmail.com>
Date: Mon Oct 1 12:56:29 2016 -0600
git reset command will reset the index and the working directory to the last git commit’s state.
➢ git reset --hard HEAD
git rm can be used to remove files from the index and the working directory.
➢ git rm filename.txt
git stash command will temporarily save the changes that are not ready to be committed. That
way, you can go back to that project later on.
➢ git stash
git show is a command used to view information about any git object.
➢ git show

git fetch allows users to fetch all objects from the remote repository that don’t currently reside
in the local working directory.
➢ git fetch origin
git ls-tree allows you to view a tree object along with the name, the mode of each item, and the
blob’s SHA-1 value. Let’s say you want to see the HEAD, use:
➢ git ls-tree HEAD
git cat-file is used to view the type and the size information of a repository object. Use the -p
option along with the object’s SHA-1 value to view the information of a specific object, for
example:
➢ git cat-file -p d670460b4b4aece5915caf5c68d12f560a9fe3e4
git grep lets users search through committed trees, working directory, and staging area for
specific phrases and words. To search for www.hostinger.com in all files, use:
➢ git grep "www.hostinger.com"
gitk shows the graphical interface for a local repository. Simply run:
➢ gitk
git instaweb allows you to browse your local repository in the git-web interface. For instance:
➢ git instaweb --httpd=webrick
git gc will clean unnecessary files and optimize the local repository.
➢ git gc
git archive lets users create a zip or a tar file containing the constituents of a single repository
tree. For instance:
➢ git archive --format=tar master
git prune deletes objects that don’t have any incoming pointers.
➢ git prune
git fsck performs an integrity check of the git file system and identifies any corrupted objects.
➢ git fsck
git rebase is used to apply certain changes from one branch to another. For instance:
➢ git rebase master
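The core commands above fit together as one small workflow. The following run (assuming git is installed; file names, identity, and messages are illustrative) walks a change from the working directory through the staging area into the git directory, then through a branch and merge:

```shell
# End-to-end run of the basic Git workflow in a throwaway repository.
set -e
repo=$(mktemp -d)
cd "$repo"

git init -q
git config user.email "student@example.com"   # local identity for commits
git config user.name  "Student"

echo "hello" > temp.txt
git add temp.txt                 # working directory -> staging area
git commit -q -m "Add temp.txt"  # staging area -> git directory

git checkout -q -b feature       # create a branch and switch to it
echo "world" >> temp.txt
git commit -q -am "Extend temp.txt"

git checkout -q -                # back to the default branch
git merge -q feature             # bring the branch's changes in
git log --oneline                # history now shows both commits
```

Running `git status` between any two steps shows which of the three sections each file currently sits in.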

Exercise 5
Implementation of CICD with Java and open-source stack. Configure a static code
analyser which will perform static analysis of the web application code and identify the
coding practices that are not appropriate. Configure the profiles and dashboard of the
static code analysis tool.
CI/CD pipelines with static code analysis
A static code analysis tool inspects your codebase through the development cycle, and it's able
to identify bugs, vulnerabilities, and compliance issues without actually running the program.
What is static code analysis?
Static code analysis is a practice that allows your team to automatically detect potential bugs,
security issues, and, more generally, defects in a software's codebase. Thus, we can view static
analysis as an additional automated code review process. Let's examine this analogy more in
detail. The tasks involved in static code analysis can be divided as follows:
1. Detecting errors in programs
2. Recommending code formatting with a formatter
3. Computing metrics, which give you a rating of how good your code is
Static Code Analysis tools:
Many commercial and free static code analysers support a wide range of programming
languages. One of the best known is SonarQube.
SonarQube is open-source software for continuous inspection of code quality. It performs
automatic reviews with static analysis on more than 20 programming languages. It can spot
duplicated code, compute code coverage and code complexity, and find bugs and security
vulnerabilities. In addition, it can record metrics history and provide evolution graphs via a
dedicated web interface.
SonarQube Benefits
So why SonarQube?
So why not just take existing, proven tools and configure them in the CI server ourselves?
Well, for SonarQube there are a lot of benefits:
▪ CI tools do not have a plugin that would make all of these tools work together easily
▪ CI tools do not have plugins to provide the nice drill-down features that SonarQube has
▪ CI plugins do not report an overall compliance value
▪ CI plugins do not provide a managerial perspective
▪ There is no CI plugin for design or architectural issues
▪ CI plugins do not provide a dashboard for overall project quality

Features of SonarQube are:
▪ Doesn’t just show you what’s wrong, but also offers quality and management tools to
actively help you correct issues
▪ Focuses on more than just bugs and complexity, and offers more features to help
programmers write code, such as coding rules, test coverage, duplication detection, API
documentation, and code complexity, all within a dashboard
▪ Gives a moment-in-time snapshot of your code quality today, as well as trends of past
and potentially future quality indicators, and provides metrics to help you make the
right decisions
Getting Started
Installation of SonarQube on Ubuntu
Perform a system update and install unzip
➢ sudo apt update
➢ sudo apt install unzip -y
Install Openjdk11
➢ sudo apt install openjdk-11-jdk -y
Install and Configure Postgres
➢ sudo sh -c 'echo "deb http://apt.postgresql.org/pub/repos/apt/ `lsb_release -cs`-pgdg main" >> /etc/apt/sources.list.d/pgdg.list'
➢ wget -q https://www.postgresql.org/media/keys/ACCC4CF8.asc -O - | sudo apt-key add -
➢ sudo apt-get -y install postgresql postgresql-contrib
Enable and Start Postgresql
➢ sudo systemctl enable postgresql
➢ sudo systemctl start postgresql
Change the passwd for postgres user
➢ sudo passwd postgres
Switch to the postgres user and create a user called sonar
➢ su - postgres
➢ createuser sonar
➢ psql
Set a password for the newly created SonarQube database user and create the database
ALTER USER sonar WITH ENCRYPTED password 'P@ssword';

CREATE DATABASE sonar OWNER sonar;

Exit the psql shell and switch back to your user by running the exit command
➢ \q
➢ exit
Download SonarQube
➢ wget https://binaries.sonarsource.com/Distribution/sonarqube/sonarqube-8.9.1.44547.zip
Unzip the SonarQube using following command
➢ sudo unzip sonarqube-8.9.1.44547.zip -d /opt
Rename the directory
➢ sudo mv /opt/sonarqube-8.9.1.44547 /opt/sonarqube
Create a non-sudo Linux user
➢ sudo adduser sonarq
Assign permissions to the sonarqube directory
➢ sudo chown -R sonarq:sonarq /opt/sonarqube/
SonarQube uses the Elasticsearch service, so increase vm.max_map_count
➢ sudo sysctl -w vm.max_map_count=262144
Open the Sonarqube properties file
➢ sudo nano /opt/sonarqube/conf/sonar.properties
and change the following properties
➢ sonar.jdbc.username=sonar
➢ sonar.jdbc.password=P@ssword
➢ sonar.jdbc.url=jdbc:postgresql://localhost/sonar
➢ sonar.web.javaAdditionalOpts=-server
Configure SonarQube as a service
➢ sudo nano /etc/systemd/system/sonar.service
Add the following content to sonar.service
➢ [Unit]
Description=SonarQube service
After=syslog.target network.target
[Service]
Type=forking
ExecStart=/opt/sonarqube/bin/linux-x86-64/sonar.sh start
ExecStop=/opt/sonarqube/bin/linux-x86-64/sonar.sh stop
User=sonarq
Group=sonarq
Restart=always
[Install]
WantedBy=multi-user.target

Now enable and start sonarqube


➢ sudo systemctl enable sonar
➢ sudo systemctl start sonar
➢ sudo systemctl status sonar
Now access SonarQube using the IP address of the server: http://<ipaddress>:9000. Log in to
SonarQube with the default credentials username: admin and password: admin
Integrate SonarQube scan into the pipeline
Once our project is created and configured, we can automatically trigger a code analysis by
adding a step to our CD pipeline. You can start an analysis by running this command:
➢ sonar-scanner
-Dsonar.projectKey=[PROJECT_KEY]
-Dsonar.sources=.
-Dsonar.host.url=[LOAD_BALANCER_URL]
-Dsonar.login=[LOGIN_KEY]
-Dsonar.qualitygate.wait=true

Exercise 6
Implementation of CICD with Java and open-source stack. Write a build script to build
the application using a build automation tool like Maven. Create a folder structure that
will run the build script and invoke the various software development build stages. This
script should invoke the static analysis tool and unit test cases and deploy the application
to a web application server like Tomcat.
Build/Compile/test Java Using Maven
Building Java Code
▪ Compiling each and every .java file
▪ Archiving (zipping) all the generated .class files into a war/jar/ear
To do this build activity, there are many build tools:
▪ ANT
▪ Maven
▪ Gradle
In this series we restrict ourselves to Maven.
Maven Installation on Ubuntu
Commands to install Maven:
➢ sudo apt-get update
➢ sudo apt-get install maven -y
➢ mvn -v
Maven
▪ Is a project management tool.
▪ Can be used for builds, dependency management, releases, documentation and test
executions.
▪ Maven prefers convention over configuration.
▪ Maven also works for Java-based languages such as Groovy and Scala.
▪ Maven uses a file called pom.xml to define dependencies, project information and
plugins.
Maven folder conventions
▪ Code: /src/main/java/
▪ Location of pom: /pom.xml
▪ Test: /src/test/java/
▪ Target Folder: /target

pom.xml
POM (Project Object Model) is an XML file which defines project info, dependencies, plugins
and profiles.
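As a minimal sketch, a pom.xml for a simple web application might look like the following; the groupId, artifactId and version are placeholder coordinates to replace with your own project's values.

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>

  <!-- Project information (placeholder coordinates) -->
  <groupId>com.example</groupId>
  <artifactId>sample-webapp</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>war</packaging>

  <!-- Dependencies, e.g. JUnit for the tests under /src/test/java -->
  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.13.2</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
</project>
```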
Goals
▪ compile: compile the code
▪ test: compile the code + test the code
▪ package: test the code + package the application
▪ install: pushes the pom file and jar/war to ~/.m2
▪ deploy: pushes the pom file and jar/war to Remote/Central Repo
▪ clean: remove the target folder
Executing goals
▪ single goal execution
➢ mvn compile
➢ mvn test
▪ multi goals
➢ mvn clean package
Maven Repository Architecture

Using Maven with Jenkins


Install Maven and ensure the Jenkins user has access to Maven.
In the Build section of a Freestyle project:
▪ Execute Shell: directly execute the maven command
▪ Invoke top-level Maven targets: specify the goals here

Integrating Maven tests with Jenkins
▪ To configure publishing of JUnit test results to Jenkins, ensure your Maven goal
includes test execution
▪ Navigate to Post-build Actions and select "Publish JUnit test result report"

▪ Select the XML files in the surefire-reports folder under target

Archiving the Artefact (Displaying the package built)

▪ In Post-build Actions, select the 'Archive the artifacts' section and give the artifacts
location (e.g.: target/*.jar)
Post Build Actions:
Activities that are performed after the build is completed.
The most common post-build actions are:
▪ Show Test Results
▪ Show the Package
▪ Call other Jenkins Project to start building
▪ Send Emails to the team
Jenkins Plugins
A plugin adds extra functionality to Jenkins.
Plugins can be installed into Jenkins in two ways: online, where the plugin gets downloaded
from the internet, and offline, where you upload the plugin file to Jenkins. Plugins come in
two popular formats (hpi, jpi).
Day Builds

The basic intention of day builds is:
▪ to give feedback on the code quality of the commit(s) made by one (or more) developers
over short periods during the day
▪ to finish quickly
Night Builds
The basic intention of night builds is:
▪ to give feedback on the product quality of the collective work done by developers
during the past day
▪ to execute extensive tests (unit, system, performance)
▪ time is no bar

Jenkins Build Triggers
These help in triggering a Jenkins build based on:
Schedule:
▪ Periodic: e.g. every 1 hour, every 2 hours, etc.
▪ On schedule: e.g. every weekday at 3 AM
Git commits: any new commits to the Git repo. There are two ways of doing this:
▪ Poll SCM: Jenkins will poll Git
▪ Git/Web hooks: Git will inform Jenkins whenever new commits happen
After Other Jobs are Built

Build Periodically
▪ Here you configure the schedules using a syntax much like cron jobs:
MINUTE HOUR DOM MONTH DOW
H/15 * * * *      (roughly every 15 minutes; H spreads the load)
0 4,15 * * *      (at minute 0 of hours 4 and 15, every day)

Exercise 7
Implementation of CICD with Java and open-source stack. Configure the Jenkins tool with the
required paths, path variables, users and pipeline views.
AWS free tier link
With the link provided below, we can sign up for an account:
https://portal.aws.amazon.com/billing/signup?refid=74413e6a-8eef-4b37-9d39-
d088b67a90c5&redirect_url=https%3A%2F%2Faws.amazon.com%2Fregistrationconfirmatio
n#/start/email
Jenkins installation on AWS
https://www.jenkins.io/doc/tutorials/tutorial-for-installing-jenkins-on-AWS/
The tutorial linked above explains how to install Jenkins on AWS.
Following are the commands to install Jenkins:
➢ sudo yum update -y
➢ sudo wget -O /etc/yum.repos.d/jenkins.repo \
https://pkg.jenkins.io/redhat-stable/jenkins.repo
➢ sudo rpm --import https://pkg.jenkins.io/redhat-stable/jenkins.io.key
➢ sudo yum upgrade
Java installation
Following are the commands to install java and run Jenkins:
➢ sudo amazon-linux-extras install java-openjdk11 -y
➢ sudo yum install jenkins -y
➢ sudo systemctl start jenkins
➢ sudo systemctl status jenkins
http://65.1.106.16:8080/
GitHub signup:
https://github.com/
https://github.com/tiruch192/Test-12.git
Following are the commands to implement in git bash:
➢ git config --global user.name ""
create a new repository on the command line
➢ echo "# Test-12" >> README.md
➢ git init

➢ git add README.md
➢ git commit -m "first commit"
➢ git branch -M main
➢ git remote add origin https://github.com/tiruch192/Test-12.git
➢ git push -u origin main
push an existing repository from the command line
➢ git remote add origin https://github.com/tiruch192/Test-12.git
➢ git branch -M main
➢ git push -u origin main
SSH Connection
▪ ssh -i "test12.pem" ec2-user@ec2-65-1-106-16.ap-south-1.compute.amazonaws.com

Exercise 8
Configure the Jenkins pipeline to call the build script jobs and configure to run it
whenever there is a change made to an application in the version control system. Make a
change to the background colour of the landing page of the web application and check if
the configured pipeline runs.
Jenkins Pipeline
Multibranch Pipeline with a Webhook on Jenkins
Sometimes it may be necessary to create a pipeline on Jenkins for each Git branch, and it can
be difficult to maintain an independent pipeline for every branch. Besides that, what if we
create or delete a branch in future? Someone would have to take care of the pipelines on
Jenkins whenever the branches change. That's where the Multibranch Pipeline comes into the
picture. With a Multibranch Pipeline we can create a pipeline for each branch in a repository,
and Jenkins also creates or removes pipelines when the branches change. Let's see how it works!

Pre-Requisites:
1. A Jenkins server is expected to already be up and running
2. A Git repository with more than one branch
In my case I am using a repository called https://github.com/ravdy/testnodejs-app.git
Create a Multibranch Pipeline
1. Login to Jenkins GUI
2. Click on “New Item” → Specify a job name → Select “Multibranch Pipeline option”

3. Give Display Name and Description

4. Under Branch Sources → Add source → Choose Git → and provide the GitHub URL and
credentials (credentials are optional if it is a public repo)

Git Repo Information


5. Under Build Configuration → choose the Jenkinsfile path. In most cases it will be under the
repo root directory.

Jenkins file path


6. Apply and save the job. Now Jenkins automatically scans the repository, creates a job for
each branch where it finds a Jenkinsfile, and initiates the first build.
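As a sketch, each branch can carry a minimal declarative Jenkinsfile like the one below for Jenkins to discover. The stage contents are placeholders (simple echo commands); substitute your project's real build and test commands.

```groovy
// Jenkinsfile (placed in the repo root of every branch)
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // placeholder build command -- use npm/mvn/etc. as appropriate
                sh 'echo "Building branch ${BRANCH_NAME}"'
            }
        }
        stage('Test') {
            steps {
                // placeholder test command
                sh 'echo "Testing branch ${BRANCH_NAME}"'
            }
        }
    }
}
```

BRANCH_NAME is an environment variable that Jenkins sets automatically in multibranch pipeline builds, so the same file works unchanged on every branch.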

Multibranch Pipeline
Using Webhook
If you wish to automate the build process in the multibranch pipeline, you can use a webhook.
This feature is not enabled until we install the "Multibranch Scan Webhook Trigger" plugin,
which enables a "Scan by webhook" option under "Scan Multibranch Pipeline Triggers". Here we
should give a token; I am giving it as "mytoken". By this time your job looks something like below.

Jenkins Multibranch pipeline with webhook
Now, to enable the auto build process, we should provide the Jenkins URL with the token in
GitHub. In this case the link should be
http://65.0.130.108:8080/multibranch-webhook-trigger/invoke?token=mytoken
For this, log into GitHub → Settings → Webhooks → Add webhook

Provide the Payload URL as
"http://65.0.130.108:8080/multibranch-webhook-trigger/invoke?token=mytoken", choose the
Content type as "application/json" and click on Add webhook

Once this is done you can see the new webhook, and it's time to make some changes in the
repository to test the webhook connection.

In this demonstration, I am going to create a new branch called "stage" and push the changes
to the remote repo.

New branch creation on Git and pushed changes onto GitHub
Now you can see that a build has been triggered automatically on Jenkins, and it scans and
creates a job for the new branch as well.

Hope this has given a fair idea of how Jenkins multibranch pipelines work with a webhook.

Exercise 9
Create a pipeline view of the Jenkins pipeline used in Exercise 8. Configure it with user
defined messages.
Step 1: Create a New Freestyle Project
1. Click the New Item link on the left-hand side of the Jenkins dashboard.

2. Enter the new project's name in the Enter an item name field and select the Freestyle project type.
Click OK to continue.

3. Under the General tab, add a project description in the Description field.

Step 2: Add a Build Step

1. Scroll down to the Build section.

2. Open the Add build step drop-down menu and select Execute Windows batch command.
3. Enter the commands you want to execute in the Command field. For this tutorial, we are using a
simple set of commands that display the current version of Java and Jenkins working directory:
java -version
dir
4. Click the Save button to save changes to the project.

Step 3: Build the Project
1. Click the Build Now link on the left-hand side of the new project page.

2. Click the link to the latest project build in the Build History section

3. Click the Console Output link on the left-hand side to display the output for the
commands you entered.

4. The console output indicates that Jenkins is successfully executing the commands, displaying
the current version of Java and the Jenkins working directory.

Exercise 10
In the configured Jenkins pipeline created in Exercise 8 and 9, implement quality gates
for static analysis of code.
SonarQube is an excellent tool for measuring code quality, using static analysis to find code smells,
bugs, vulnerabilities, and poor test coverage. Rather than manually analysing the reports, why not
automate the process by integrating SonarQube with your Jenkins continuous integration pipeline?
This way, you can configure a quality gate based on your own requirements, ensuring bad code
always fails the build.
Quality gates
In SonarQube a quality gate is a set of conditions that must be met in order for a project to be marked
as passed.
SonarQube and Jenkins
Running a SonarQube scan from a build on your local workstation is fine, but a robust solution needs
to include SonarQube as part of the continuous integration process. If you add SonarQube analysis
into a Jenkins pipeline, you can ensure that if the quality gate fails then the pipeline won’t continue to
further stages such as publish or release. After all, nobody wants to release crappy code into
production.
To do this, we can use the SonarQube Scanner plugin for Jenkins. It includes two features that we’re
going to make use of today:
SonarQube server configuration – the plugin lets you set your SonarQube server location and
credentials. This information is then used in a SonarQube analysis pipeline stage to send code analysis
reports to that SonarQube server.
SonarQube Quality Gate webhook – when a code analysis report is submitted to SonarQube,
unfortunately it doesn’t respond synchronously with the result of whether the report passed the quality
gate or not. To do this, a webhook call must be configured in SonarQube to call back into Jenkins to
allow our pipeline to continue (or fail). The SonarQube Scanner Jenkins plugin makes this webhook
available.

Here’s a full breakdown of the interaction between Jenkins and SonarQube:

1. a Jenkins pipeline is started


2. the SonarQube scanner is run against a code project, and the analysis report is
sent to the SonarQube server
3. SonarQube finishes the analysis and checks whether the project meets the configured
quality gate
4. SonarQube sends a pass or failure result back to the Jenkins webhook exposed
by the plugin
5. the Jenkins pipeline continues if the analysis result is a pass, or can optionally
be failed otherwise
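Putting the interaction above into pipeline form, the relevant stages might look like the following sketch. 'MySonarServer' is a placeholder for whatever server name you configured under the SonarQube Scanner plugin settings in Jenkins, and the Maven goal assumes a Maven project.

```groovy
// Sketch: SonarQube analysis + quality gate stages in a declarative pipeline
pipeline {
    agent any
    stages {
        stage('SonarQube analysis') {
            steps {
                // Injects the server URL and token configured in
                // Manage Jenkins -> SonarQube servers ('MySonarServer' is a placeholder)
                withSonarQubeEnv('MySonarServer') {
                    sh 'mvn sonar:sonar'   // or sonar-scanner for non-Maven projects
                }
            }
        }
        stage('Quality gate') {
            steps {
                // Waits for SonarQube's webhook callback; abort the build on a failed gate
                timeout(time: 10, unit: 'MINUTES') {
                    waitForQualityGate abortPipeline: true
                }
            }
        }
    }
}
```

The timeout guards against the webhook never arriving (for example, if the webhook was not configured on the SonarQube side), so the pipeline fails instead of hanging forever.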

Exercise 11
In the configured Jenkins pipeline created in Exercise 8 and 9, implement quality gates
for static unit testing.

Jenkins - Unit Testing

Example of a Junit Test in Jenkins

The following example will consider

A simple HelloWorldTest class based on JUnit.

Ant as the build tool within Jenkins to build the class accordingly.

Step 1 − Go to the Jenkins dashboard and Click on the existing HelloWorld project and choose the
Configure option

Step 2 − Browse to the section to Add a Build step and choose the option to Invoke
Ant.

Step 3 − Click on the Advanced button.

Step 4 − In the build file section, enter the location of the build.xml file.
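For reference, a minimal build.xml that compiles the sources and runs the JUnit test might look like the sketch below. The directory names and the junit.jar location are assumptions; adapt them to your workspace layout. The Reports folder is where the XML results that Jenkins publishes will land.

```xml
<!-- Minimal Ant build file sketch; paths and jar locations are placeholders -->
<project name="HelloWorld" default="test" basedir=".">
  <property name="src.dir" value="src"/>
  <property name="build.dir" value="build"/>
  <property name="reports.dir" value="Reports"/>

  <target name="compile">
    <mkdir dir="${build.dir}"/>
    <!-- junit.jar location is an assumption; point the classpath at your copy -->
    <javac srcdir="${src.dir}" destdir="${build.dir}" classpath="lib/junit.jar"/>
  </target>

  <target name="test" depends="compile">
    <mkdir dir="${reports.dir}"/>
    <junit printsummary="yes">
      <classpath path="${build.dir}:lib/junit.jar"/>
      <!-- The XML formatter produces the result files Jenkins will publish -->
      <formatter type="xml"/>
      <test name="HelloWorldTest" todir="${reports.dir}"/>
    </junit>
  </target>
</project>
```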

Step 5 − Next, click the option to add a post-build action and choose the option "Publish
JUnit test result report"

Step 6 − In the Test report XMLs field, enter the location as shown below. Ensure that
Reports is a folder created in the HelloWorld project workspace. The "*.xml" pattern
tells Jenkins to pick up the result XML files produced by running the JUnit test
cases. These XML files are then converted into reports which can be viewed later.
Once done, click the Save option at the end.

Step 7 − Once saved, you can click on the Build Now option.
Once the build is completed, the build status will show whether the build was successful
or not. In the build output information, you will now notice an additional section
called Test Result. In our case, we entered a failing test case so that the result
would fail, just as an example.

One can go to the console output to see further information. But what's more interesting is
that if you click on Test Result, you will now see a drill-down of the test results.

Exercise 12
In the configured Jenkins pipeline created in Exercise 8 and 9, implement quality gates
for code coverage.

Implementing quality gates for code coverage


Jenkins Job setup
We use Jenkins as our CI server and SonarQube as a code inspection tool. Therefore, we want to
force the Jenkins build job to fail if the code doesn't meet the specified quality gates.
Configure the Jenkins job to fail the build when the quality gates are not met.
Prerequisites — Install the Jenkins plugin "sonar-quality-gates-plugin" if not already present,
and the Email-ext plugin so that Jenkins is able to send emails. We also need a Jenkins job that
is already configured with Sonar analysis and whose build is passing successfully.
For example, here is a snapshot of the Jenkins build job that is currently passing, before the
quality gate setup.

Let's set up quality gate metrics in Sonar. We are going to create a quality gate only for the
metric called Code Coverage for demo purposes, but there are more metrics available that you
should consider selecting while creating quality gates.

Login to Sonar > go to Quality Gates as shown in the screen below.

Click on Create > Add Condition > choose the metric (in this example, we selected Code
Coverage) > select the operator along with the warning and error thresholds.

Select the project to add the quality gate to. We have selected a sample miqp project. In your
case, the project name will be different, so please change it accordingly.

Now go to the Jenkins job and configure the Quality Gate validation.

Click on the job, go to Post-build Actions, and provide the details of the project you
associated with the quality gate created in the earlier steps.

We have configured the project key of the miqp sample project, along with the job status to
set when the Sonar analysis fails.

Verify that our build fails after the quality check is enabled.

You can verify the same in the sonarqube server.

In the above screenshot, you can verify that the project miqp is reporting WARNING, as the
code failed to meet the code coverage metric (i.e. the threshold we set in one of the earlier steps).

