EXPERIMENT NO. 1
CLASS TE - IT
SEMESTER V
Aim: To understand DevOps: Principles, Practices, and DevOps
Engineer Role and Responsibilities
Theory:
What is DevOps?
DevOps stands for development and operations. It’s a practice that
aims at merging development, quality assurance, and operations
(deployment and integration) into a single, continuous set of
processes. This methodology is a natural extension of Agile and
continuous delivery approaches.
By adopting DevOps, companies gain three core advantages that cover technical,
business, and cultural aspects of development.
Higher speed and quality of product releases. DevOps speeds up product
release by introducing continuous delivery, encouraging faster feedback, and
allowing developers to fix bugs in the system in the early stages. Practicing DevOps,
the team can focus on the quality of the product and automate a number of
processes.
Culture
DevOps is, first and foremost, a culture and mindset forging strong collaborative bonds
between software development and infrastructure operations teams. This culture is
built upon the following pillars.
Constant collaboration and communication. These have been the building blocks
of DevOps since its dawn. Your team should work cohesively with the understanding
of the needs and expectations of all members.
Gradual changes. The implementation of gradual rollouts allows delivery teams to
release a product to users while having an opportunity to make updates and roll back
if something goes wrong.
Shared end-to-end responsibility. When every member of a team moves towards
one goal and is equally responsible for a project from beginning to end, they work
cohesively and look for ways of facilitating other members’ tasks.
Early problem-solving. DevOps requires that tasks be performed as early in the
project lifecycle as possible. So, in case of any issues, they will be addressed more
quickly.
DevOps model and practices
DevOps requires a delivery cycle that comprises planning, development, testing,
deployment, release, and monitoring with active cooperation between different
members of a team.
Agile planning
In contrast to traditional approaches of project management, Agile planning
organizes work in short iterations (e.g. sprints) to increase the number of releases.
This means that the team outlines only high-level objectives, while planning in detail
just two iterations in advance. This allows for flexibility and pivots once the ideas
are tested on an early product increment.
Continuous development
The concept of continuous “everything” embraces continuous or iterative software
development, meaning that all the development work is divided into small portions
for better and faster production. Engineers commit code in small chunks multiple
times a day for it to be easily tested.
Continuous automated testing
A quality assurance team sets up automated testing of committed code using tools like
Selenium, Ranorex, UFT, etc. If bugs and vulnerabilities are revealed, they are sent
back to the engineering team. This stage also entails version control to detect
integration problems in advance. A Version Control System (VCS) allows developers
to record changes in the files and share them with other members of the team,
regardless of their location.
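The record-changes cycle a VCS provides can be tried end to end in a few commands; the directory and file names below are placeholders for illustration, not part of the experiment:

```shell
# Create a throwaway repository and record two versions of a file.
rm -rf /tmp/vcs-demo && mkdir /tmp/vcs-demo && cd /tmp/vcs-demo
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"

echo "first draft" > notes.txt
git add notes.txt
git commit -q -m "Record first version"

echo "second draft" > notes.txt
git commit -q -am "Record second version"

git rev-list --count HEAD    # 2 — both versions are recorded and recoverable
```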
Continuous deployment
At this stage, the code is deployed to run in production on a public server. Code
must be deployed in a way that doesn’t affect already functioning features and can
be available for a large number of users. Frequent deployment allows for a “fail fast”
approach, meaning that the new features are tested and verified early. There are
various automated tools that help engineers deploy a product increment. The most
popular are Chef, Puppet, Azure Resource Manager, and Google Cloud Deployment
Manager.
Continuous monitoring
The final stage of the DevOps lifecycle is oriented to the assessment of the whole
cycle. The goal of monitoring is detecting the problematic areas of a process and
analyzing the feedback from the team and users to report existing inaccuracies and
improve the product’s functioning.
DevOps tools
The main reason to implement DevOps is to improve the delivery pipeline and
integration process by automating these activities. As a result, the product gets a
shorter time-to-market. To achieve this automated release pipeline, the team must
acquire specific tools instead of building them from scratch.
Currently, existing DevOps tools cover almost all stages of continuous delivery,
starting from continuous integration environments and ending with containerization
and deployment. While some of the processes are still automated with custom
scripts, DevOps engineers mostly use ready-made products. Let’s have a look at the
most popular ones.
Server configuration tools are used to manage and configure servers in DevOps.
Puppet is one of the most widely used systems in this category. Chef is a tool for
infrastructure as code management that runs both on cloud and hardware servers.
One more popular solution is Ansible that automates configuration management,
cloud provisioning, and application deployment.
CI/CD stages also require task-specific tools for automation — such as Jenkins that
comes with lots of additional plugins to tweak continuous delivery workflow or GitLab
CI, a free and open-source CI/CD instrument presented by GitLab.
Some DevOps experts partly disagree with this view. They also believe that a team
is the key to effectiveness, but in this interpretation the team – including
developers, a quality assurance leader, a code release manager, and an automation
architect – works under the supervision of a DevOps engineer.
So, the title of a DevOps Engineer is an arguable one. Nonetheless, DevOps
engineers are still in demand on the IT labor market. Some consider this person to
be either a system administrator who knows how to code or a developer with a
system administrator’s skills.
DevOps engineer responsibilities
In a way, both definitions are fair. The main function of a DevOps engineer is to
introduce the continuous delivery and continuous integration workflow, which
requires the understanding of the mentioned tools and the knowledge of several
programming languages.
Depending on the organization, job descriptions differ. Smaller businesses look for
engineers with broader skillsets and responsibilities. The basic and widely
accepted responsibilities of a DevOps engineer are:
EXPERIMENT NO. 2
Theory:
Version control allows you to keep track of your work and helps you to
easily explore the changes you have made, be it data, coding scripts,
notes, etc. You are probably already doing some type of version control
if you save multiple files, such as Dissertation_script_25thFeb.R,
Dissertation_script_26thFeb.R, etc. This approach will leave you with
tens or hundreds of similar files, making it rather cumbersome to
compare different versions directly, and such files are not easy to
share among collaborators. With version control software such as Git,
version control is much smoother and easier to implement. Using an
online platform like GitHub to store your files means that you have an
online backup of your work, which is beneficial for both you and your
collaborators.
Git uses the command line to perform more advanced actions, and we
encourage you to look through the extra resources at the end of the
tutorial to get more comfortable with Git. Until then, here is a gentle
introduction to syncing RStudio and GitHub, so you can start using
version control in minutes.
Step 1: Install Git and create a GitHub account
To create the account, go to the GitHub page and, once you reach it, click Sign up.
After clicking Create account, you will be able to see your account.
EXPERIMENT NO. 3
Linus Torvalds, the developer of the Linux kernel, created Git in 2005 to
help control the Linux kernel's development.
This change history lives on your local machine and lets you revert to a
previous version of your project with ease in case something goes
wrong.
Git makes collaboration easy. Everyone on the team can keep a full
backup of the repositories they're working on, on their local machine.
Then, thanks to an external server like Bitbucket, GitHub, or GitLab, they
can safely store the repository in a single place.
This way, different members of the team can copy it locally and
everyone has a clear overview of all changes made by the whole
team.
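This copy-locally, store-centrally model can be simulated without any external service by letting a local bare repository play the role of Bitbucket/GitHub/GitLab; every path, name, and file below is a placeholder:

```shell
# A bare repository stands in for the shared server.
rm -rf /tmp/central.git /tmp/alice /tmp/bob
git init -q --bare /tmp/central.git

# One team member clones, commits, and pushes to the single shared place.
git clone -q /tmp/central.git /tmp/alice
cd /tmp/alice
git config user.email "alice@example.com"
git config user.name "Alice"
echo "hello" > app.txt
git add app.txt
git commit -q -m "Add app.txt"
git push -q origin HEAD

# Another member clones the same repository and sees the change.
git clone -q /tmp/central.git /tmp/bob
cat /tmp/bob/app.txt    # hello
```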
Conclusion: Performed Git operations using the Git cheat sheet.
Ramrao Adik Institute of Technology
DEPARTMENT OF INFORMATION TECHNOLOGY
ACADEMIC YEAR: 2021-2022
EXPERIMENT NO. 4
Next we will install Jenkins. Issue the following four commands in sequence
to install Jenkins on Ubuntu:
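The four commands themselves are not reproduced in this copy of the manual; as a hedged sketch, the sequence published in the Jenkins project's own Debian install instructions at the time looked like this (repository key and URL per pkg.jenkins.io):

```shell
# Add the Jenkins repository key and package source, then install (requires root)
wget -q -O - https://pkg.jenkins.io/debian-stable/jenkins.io.key | sudo apt-key add -
sudo sh -c 'echo deb https://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'
sudo apt-get update
sudo apt-get install jenkins
```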
By default, Jenkins runs on port 8080. To open Jenkins, type the IP of your
VPS and the port number 8080 into your browser – it would look something
like 127.0.0.1:8080.
You will be asked to enter the administrator password. You can find
the password in
the /var/lib/jenkins/secrets/initialAdminPassword file. You can use
the cat command to display the password:
cat /var/lib/jenkins/secrets/initialAdminPassword
Conclusion: Successfully installed and configured Jenkins on Ubuntu.
EXPERIMENT NO. 5
EXPERIMENT TITLE: To build a pipeline of jobs using Maven / Gradle / Ant
in Jenkins, and create a pipeline script to test and deploy
an application over the Tomcat server.
Aim: To build a pipeline of jobs using Maven / Gradle / Ant in Jenkins, and create a
pipeline script to test and deploy an application over the Tomcat server.
LO3: To understand the importance of Jenkins to build and deploy software applications
in a server environment.
Theory:
Tomcat:
The following pieces of software are required to follow this example of a Jenkins
deployment of a WAR file to Tomcat:
For a successful Jenkins Tomcat deploy of a WAR file, you must add a new user
to Tomcat with manager-script rights. You can do this with an edit of the
tomcat-users.xml file, which can be found in Tomcat's conf directory.
After you edit the tomcat-users.xml file, it's a good idea to bounce the Tomcat server
to confirm the changes have taken effect.
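A minimal edit to conf/tomcat-users.xml for this purpose looks like the fragment below; the password is a placeholder, and the username Sam matches the credentials used later in this experiment:

```xml
<tomcat-users>
  <!-- grants the rights the Deploy to container plugin needs -->
  <role rolename="manager-script"/>
  <user username="Sam" password="change-me" roles="manager-script"/>
</tomcat-users>
```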
Out of the box, there are no built-in features that perform a Jenkins WAR file
deployment to Tomcat. That means a Jenkins Tomcat deploy plugin must be installed
in the CI tool to make a deployment happen.
The most popular Jenkins Tomcat deployment plugin is named Deploy to container,
which can be installed through the Plugin Manager tab under the "Manage Jenkins"
section of the tool.
With the Jenkins Tomcat deployment plugin installed, it's time to create a new
Jenkins build job that can build an application and deploy a packaged WAR file to
Tomcat.
Step 3A: Create a Jenkins freestyle project
This freestyle Jenkins job will build a WAR and deploy it to Tomcat.
The Jenkins build job will be configured with the following properties:
JDK: java8
Git Repository URL: https://github.com/cameronmcnz/rock-paper-scissors.git
Git branch specifier: */patch-1
Maven Goals: clean install
After a build, the final step of a Jenkins pipeline deploy to Tomcat is to use the Deploy
to container plugin in a post-build action.
Three of the four settings used by the Deploy WAR/EAR to a container plugin can
be typed in directly:
WAR/EAR files: **/*.war
Context path: rps
Containers: Tomcat 8.x
Tomcat URL: http://localhost:8081
To configure the credentials, you must click the Add button next to the empty entry
field and create a new Jenkins credentials object:
The username and password need to match what was entered into the
tomcat-users.xml file in an earlier step:
Username: Sam
Password: *******
Now that you have specified all of the configurations, the Jenkins build job can be
saved and run.
When the build job finishes, the Jenkins Tomcat deploy of a WAR file will have
also completed, and a file named rps.war will be visible in the webapps directory
of Tomcat.
A WAR file deployed to Tomcat through Jenkins.
With the WAR file deployed, test the application by running Tomcat and pointing
your browser to the following URL:
http://localhost:8081/rps/#
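Since the experiment also calls for a pipeline script, the freestyle steps above can be sketched as a declarative Jenkinsfile. The tool names (java8, maven3) and the credentials ID tomcat-creds are assumptions about the local Jenkins configuration; the deploy step comes from the same Deploy to container plugin:

```groovy
pipeline {
    agent any
    tools {
        jdk 'java8'        // JDK name as configured under Manage Jenkins > Tools
        maven 'maven3'     // assumed Maven installation name
    }
    stages {
        stage('Build') {
            steps {
                git branch: 'patch-1',
                    url: 'https://github.com/cameronmcnz/rock-paper-scissors.git'
                sh 'mvn clean install'
            }
        }
        stage('Deploy to Tomcat') {
            steps {
                deploy adapters: [tomcat8(credentialsId: 'tomcat-creds',
                                          url: 'http://localhost:8081')],
                       contextPath: 'rps',
                       war: '**/*.war'
            }
        }
    }
}
```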
To recap, here is a summary of the steps required to perform a Jenkins Tomcat WAR
file deployment:
Conclusion: We have successfully built the pipeline of jobs using Maven / Gradle
/ Ant in Jenkins and also created a pipeline script to test and deploy an application
over the Tomcat server.
EXPERIMENT NO. 6
3. Select New Node and enter the name of the node in the Node Name field.
4. Select Permanent Agent and click the OK button. Initially, you will get only one
option, "Permanent Agent." Once you have one or more slaves you will get the
"Copy Existing Node" option.
5. After clicking OK, the following configuration page will appear for the machine
Test; enter the required information.
6. The node is created.
8. In the same directory where you have downloaded the agent, you have to run the
command.
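The command itself is not shown in this copy; for an inbound (JNLP) agent it typically looks like the line below, where the host and secret are placeholders that must be copied from the node's own page in the Jenkins UI (the node name Test comes from step 3):

```shell
# Hypothetical values — copy the exact line from Jenkins > Nodes > Test
java -jar agent.jar \
     -jnlpUrl http://JENKINS_HOST:8080/computer/Test/slave-agent.jnlp \
     -secret AGENT_SECRET \
     -workDir .
```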
15. We can verify the history of executed builds under Build History by clicking
the build number.
16. Click on the build number and select Console Output. Here you can see that the
pipeline ran on a slave machine.
Conclusion: Successfully understood the Jenkins master-slave architecture, created
a slave node, and ran a pipeline on the slave machine.
EXPERIMENT NO. 7
For this tutorial, we will use the Eclipse (Juno) IDE for Java Developers to set up
the Selenium WebDriver project. Additionally, we need to add the m2eclipse plugin
to Eclipse to facilitate the build process and create the pom.xml file. Let’s add
the m2eclipse plugin to Eclipse with the following steps:
Step 1) In Eclipse IDE, select Help | Install New Software from Eclipse Main Menu.
Step 2) In the Install dialog, enter the
URL http://download.eclipse.org/technology/m2e/releases/ in the Work
with field and select the m2e plugin as shown in the following screenshot:
Step 3) Click on Next button and finish installation.
Step 7) Select pom.xml in the Project Explorer; the pom.xml file will open in the
editor section.
Step 8) Add the Selenium, Maven, TestNG, and JUnit dependencies to pom.xml in the
<project> node:
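The dependency entries themselves are not reproduced in this copy; a hedged sketch of the block (the version numbers are assumptions from that era, not from the original) would be:

```xml
<dependencies>
  <dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-java</artifactId>
    <version>3.141.59</version>
  </dependency>
  <dependency>
    <groupId>org.testng</groupId>
    <artifactId>testng</artifactId>
    <version>6.14.3</version>
  </dependency>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>4.13.2</version>
  </dependency>
</dependencies>
```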
Step 9) To add the TestNG library in Eclipse, install it from Help > Eclipse
Marketplace and restart the IDE after installation.
Step 10) Create a New TestNG Class. Enter Package name as “example” and “NewTest”
in the Name: textbox and click on the Finish button as shown in the
following screenshot:
Step 11) Eclipse will create the NewTest class as shown in the following screenshot:
Step 2) Select the Maven project button as shown in the following screenshot:
Using the Build a Maven Project option, Jenkins supports building and testing Maven
projects.
Step 5) On the WebdriverTest project page, click on the Build Now link.
Console Output:
Conclusion: Successfully set up and ran Selenium tests in Jenkins using Maven.
EXPERIMENT NO. 8
Practical Name: Docker architecture and container life cycle; install Docker and execute Docker
commands to manage images and interact with containers.
Aim: To understand Docker architecture and the container life cycle, install Docker, and execute
Docker commands to manage images and interact with containers.
Lab Outcome: LO1: To understand the fundamentals of DevOps engineering and be fully
proficient with DevOps terminologies, concepts, benefits, and deployment options to meet your
business requirements.
LO5: To understand the concept of containerization and analyze the containerization of an OS
image and the deployment of an application over Docker.
Theory:
What is Docker?
Developers can create containers without Docker, but the platform makes it easier, simpler, and
safer to build, deploy and manage containers. Docker is essentially a toolkit that enables
developers to build, deploy, run, update, and stop containers using simple commands and
work-saving automation through a single API.
Docker also refers to Docker, Inc., the company that sells the commercial version of
Docker, and to the Docker open-source project, to which Docker, Inc. and many other
organizations and individuals contribute.
Most notably, in 2008, Linux Containers (LXC) was implemented in the Linux kernel, fully
enabling virtualization for a single instance of Linux. While LXC is still used today, newer
technologies using the Linux kernel are available. Ubuntu, a modern, open-source Linux
operating system, also provides this capability.
Docker enhanced the native Linux containerization capabilities with technologies that enable:
● Improved and seamless portability: While LXC containers often reference
machine-specific configurations, Docker containers run without modification across any
desktop, data center, and cloud environment.
● Even lighter weight and more granular updates: With LXC, multiple processes can be
combined within a single container. With Docker containers, only one process can run
in each container. This makes it possible to build an application that can continue
running while one of its parts is taken down for an update or repair.
● Automated container creation: Docker can automatically build a container based on
application source code.
● Container versioning: Docker can track versions of a container image, roll back to
previous versions, and trace who built a version and how. It can even upload only the
deltas between an existing version and a new one.
● Container reuse: Existing containers can be used as base images, essentially like
templates for building new containers.
● Shared container libraries: Developers can access an open-source registry containing
thousands of user-contributed containers.
Step 2:
$ sudo apt-get install \
    apt-transport-https \
    ca-certificates \
    curl \
    gnupg \
    lsb-release
Step 3: Add Docker’s official GPG key from the official website and perform the next commands
in the list.
● echo \
  "deb [arch=$(dpkg --print-architecture) signed-by=/usr/share/keyrings/docker-archive-keyring.gpg] https://download.docker.com/linux/ubuntu \
  $(lsb_release -cs) stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
Docker Installation is complete on your system.
● $ sudo su
● docker image ls
Step 2:
● Create a git repository docker-java
● git clone https://github.com/Nidhhiii/docker.java.git
● cd docker-java
● ls
Step 3: nano Dockerfile
Type the following code, then press Ctrl + X and Y to save.
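The Dockerfile contents are not reproduced in this copy; a hedged minimal version consistent with the steps (base image and paths are assumptions, and it expects the HelloWorld.java created in the next step) might be:

```dockerfile
# Sketch of a Dockerfile that compiles and runs HelloWorld.java
FROM openjdk:8
WORKDIR /app
COPY HelloWorld.java .
RUN javac HelloWorld.java
CMD ["java", "HelloWorld"]
```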
Step 4: nano HelloWorld.java
Type the following code, then press Ctrl + X and Y to save.
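The file's contents are likewise not reproduced here; a minimal HelloWorld.java consistent with the Dockerfile step (the greeting text is an assumption) might be:

```java
// HelloWorld.java — minimal class run by the container's CMD
public class HelloWorld {
    static String greeting() {
        return "Hello, World!";
    }

    public static void main(String[] args) {
        System.out.println(greeting());
    }
}
```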
Conclusion:
Hence, we understood the Docker architecture and container life cycle, installed Docker, and
executed Docker commands to manage images and interact with containers.
EXPERIMENT NO. 9
GIVEN DATE: 15/09/2021, 22/09/2021
4. Now, go to the directory where our HTML, CSS, and Dockerfile are.
Now, we can make our container run by the command.
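The command itself is not shown in this copy; a typical build-and-run pair for a static-site image (the image and container names are placeholders) would be:

```shell
docker build -t static-site .                      # build the image from this directory's Dockerfile
docker run -d --name site -p 80:80 static-site     # run it, publishing port 80
docker ps                                          # the running container appears here
```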
EXPERIMENT NO. 10
Step 2: Connect the host and master, check the pinging, and install Ansible
on the master machine.
With the above command we can see that both master and host are able to connect
with each other with the help of Ansible.
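The inventory file that makes this possible is plain text; a hedged sketch, in which the group name, host alias, IP address, and user are all placeholders, looks like:

```ini
# /etc/ansible/hosts (sketch)
[webservers]
ansible_slave ansible_host=192.168.1.10 ansible_user=ubuntu
```

With an inventory like this in place, a command such as ansible all -m ping is the usual connectivity check referred to above.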
EXPERIMENT NO. 11
Theory: Playbooks use YAML format, so there is not much syntax needed, but
indentation must be respected. Ansible playbooks tend to be more of a
configuration language than a programming language. A playbook is a
collection of plays. Through a playbook, you can designate specific roles to
some of the hosts and other roles to other hosts. By doing so, you can
orchestrate multiple servers in very diverse scenarios, all in one playbook.
Each Ansible playbook works with an inventory file. The inventory file contains a
list of servers divided into groups, with details like the IP address and SSH port
for each host. An Ansible playbook is used here to install the LAMP stack with the
necessary packages and tools.
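The playbook itself is not reproduced in this copy; a hedged sketch of a LAMP playbook consistent with these steps, in which the host group, package list, and file paths are assumptions for Ubuntu, might be:

```yaml
# lampstack_1.yml (sketch)
- hosts: webservers
  become: yes
  tasks:
    - name: Install LAMP packages
      apt:
        name: [apache2, mysql-server, php, libapache2-mod-php, php-mysql]
        state: present
        update_cache: yes
    - name: Deploy index.html
      copy:
        src: codes/index.html
        dest: /var/www/html/index.html
```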
Ensure that you have copied the workspace folder, which has all the .yml files, PHP
files, and the users.sql file in it.
Set up with Ansible after checking that all clients ping with Ansible.
Create a workspace folder named “ansible”, cd into it, and with the following command
add the remote Ansible Git repository to the folder; then move to the codes folder.
After cd-ing to the codes folder, make an index.html file and write the content in it,
then open lampstack_1.yml.
To deploy the app with Ansible, run the command ansible-playbook lampstack_1.yml.
Now run the playbook. Before that, go to the browser and type the IP address of
ansible_slave.
Conclusion: Thus, we successfully provisioned the LAMP stack on Ubuntu using an
Ansible playbook on top of an AWS instance.
EXPERIMENT NO. 12
Batch : B/B2.
Aim: To deploy website code on the node by provisioning a MySQL server and
database using an Ansible playbook.
Theory: As in Experiment 11, playbooks use YAML format; a playbook is a collection
of plays and works with an inventory file that lists servers in groups, with details
like the IP address and SSH port for each host.
An Ansible playbook is used to install MySQL with the necessary packages and tools.
Ensure that you have copied the workspace folder, which has all the .yml files, PHP
files, and the users.sql file in it.
To Run and Deploy the Playbook
$ ansible-playbook mysqlmodule.yml
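mysqlmodule.yml is not reproduced in this copy; a hedged sketch of such a playbook, in which the host group, database name, and module options are assumptions, might be:

```yaml
# mysqlmodule.yml (sketch)
- hosts: dbservers
  become: yes
  tasks:
    - name: Install MySQL server and Python client
      apt:
        name: [mysql-server, python3-pymysql]
        state: present
    - name: Create the application database
      mysql_db:
        name: webapp
        state: present
        login_unix_socket: /var/run/mysqld/mysqld.sock
    - name: Import the users table
      mysql_db:
        name: webapp
        state: import
        target: users.sql
```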
Conclusion: Thus, we successfully deployed website code on the node by provisioning
a MySQL server and database using an Ansible playbook.
Name: Ayush Premjith
Roll No. :19IT2034
Batch:B2
DEVOPS LAB
ASSIGNMENT NO. 1
Case study on DevOps Implementation in real world
To all the companies that think building your own infrastructure and tools
from scratch is the best approach because no one can do it as well as you:
one of the main reasons Adobe is so successful today is that it realized
the scalability advantages of the cloud early on and let Amazon handle the
heavy lifting of building the best data centers.
Adobe has been building out a managed service that includes continuous
integration/continuous delivery (CI/CD) capabilities on the Adobe cloud
service. Now Adobe is extending the DevOps processes enabled by its
CI/CD platform to make them more customizable, in addition to expanding
the scope of the application development tools it provides to include
support for single-page application (SPA) JavaScript frameworks.
It’s hard to say to what degree Adobe will be able to drive adoption of
DevOps processes across the Adobe Cloud platform. What might be even
more interesting to see is how many of the organizations that rely on
Adobe Cloud to develop applications will even realize they made the
transition to DevOps.
Assignment No 2
Bitbucket :-
Step 5) After cloning, it will ask for the password of your Bitbucket
account, which you have to provide.
Step 6) We will now add some code to our repository by going to the Source
tab and providing the code with a file name.
Step 7) We can also add code and commit the changes via the code editor,
following the steps mentioned below.