
DESIGN AND DEVELOPMENT OF A SURVEILLANCE SPY DRONE

PROJECT REPORT
SUBMITTED IN PARTIAL FULFILLMENT OF THE
REQUIREMENTS FOR THE
AWARD OF THE DEGREE OF

“BACHELOR OF TECHNOLOGY”
IN
MECHANICAL ENGINEERING (MECHATRONICS)

Submitted by
K.SAI CHARAN [20261A1419]
M.SRINIVAS [20261A1427]
K.SUMANTH NAYAK [21265A1402]

Under the esteemed guidance of


MS M. PRATYUSHA
Asst. Professor
DEPARTMENT OF MECHANICAL ENGINEERING

MAHATMA GANDHI INSTITUTE OF TECHNOLOGY


Accredited by National Board of Accreditation, New Delhi,
(Affiliated to Jawaharlal Nehru Technological University,
Hyderabad) Gandipet, Hyderabad – 500 075 (T.S)
www.mgit.ac.in

MAHATMA GANDHI INSTITUTE OF TECHNOLOGY
Accredited by National Board of Accreditation, New Delhi,
(Affiliated to Jawaharlal Nehru Technological University, Hyderabad)
Gandipet, Hyderabad – 500 075 (T.S)
www.mgit.ac.in

CERTIFICATE

Submitted by

K.SAI CHARAN [20261A1419]


M.SRINIVAS [20261A1427]
K.SUMANTH NAYAK [21265A1402]

This is to certify that the project report entitled “DESIGN, DEVELOPMENT
AND FABRICATION OF A SURVEILLANCE SPY DRONE”, submitted in partial fulfillment
of the requirements for the award of the Degree of Bachelor of Technology in Mechanical Engineering,
is a record of bonafide work carried out by them under my guidance and supervision
during the academic year 2023-24. The results embodied in this project report have
not been submitted to any other University or Institute for the award of any Degree
or Diploma.

Internal Guide                                        Head of the Department
Ms M. Pratyusha                                       Prof. Dr. K. Sudhakar Reddy
Assistant Professor                                   Department of Mechanical Engineering
Department of Mechanical Engineering

Internal Examiner                                     External Examiner

ACKNOWLEDGEMENT

The completion of this project brings with it a sense of satisfaction, but it is never
complete without thanking the people who made it possible and whose constant
support has crowned our group's efforts with success.
With a deep sense of gratitude, we acknowledge the esteemed guidance, help and
active cooperation rendered by our guide Ms M. PRATYUSHA, Assistant
Professor of Mechanical Engineering. Her inspiring guidance has sustained the effort
that has led to the successful completion of the project.
We are also thankful to Prof. K. SUDHAKAR REDDY, Head of the Department of
Mechanical Engineering, for his valuable guidance and acceptance of our project.
Our sincere thanks to Prof. G. CHANDRA MOHAN REDDY, Principal,
Mahatma Gandhi Institute of Technology, for his encouragement and support
throughout our project work, without which it would not have been so successful.
Finally, yet importantly, we would like to express heartfelt thanks to the entire faculty
of the Department of Mechanical Engineering for providing us with the inputs
requisite for undertaking such an uphill task.

K.SAI CHARAN [20261A1419]


M.SRINIVAS [20261A1427]
K.SUMANTH NAYAK [21265A1402]

DECLARATION
We declare that the industry-oriented mini project entitled “DESIGN AND
DEVELOPMENT OF SPY DRONE” is original and unpublished project work
carried out by us under the supervision of Ms M. PRATYUSHA, Assistant Professor,
Department of Mechanical Engineering, MGIT, Hyderabad.
It has not been submitted in part or in full to any university or institute for the
award of any other degree. The sources from which we have sought information or
which we have consulted have been duly acknowledged; any omissions or errors are
entirely inadvertent.

Date:

K.SAI CHARAN [20261A1419]

M. SRINIVAS
[20261A1427]

K.SUMANTH NAYAK [21265A1402]

ABSTRACT

This project introduces an innovative fusion of a robust four-wheeled obstacle-surmounting
vehicle and a sophisticated "spy drone," amalgamated through artificial intelligence (AI) on a
Raspberry Pi for amplified control and autonomy. The solution is meticulously engineered to
navigate complex terrains, offering substantial potential across search and rescue operations,
border security, agricultural monitoring, and similar applications.

The four-wheeled vehicle, fortified with advanced obstacle avoidance technology, acts as a
versatile mobile platform for deploying and retrieving the surveillance-enabled spy drone.
Equipped with surveillance capabilities such as high-resolution cameras and intelligent
sensors, the drone operates covertly and effectively gathers critical information. The
integration of AI modules on the Raspberry Pi enables real-time image processing and
autonomous decision-making, empowering the spy drone to analyze live video feeds, identify
objects, and execute informed actions, augmenting operational efficiency.

This abstract encapsulates a strategic response to the burgeoning demand for comprehensive
surveillance and reconnaissance systems, leveraging the capabilities of a spy drone. The
integration of ground and aerial platforms, fortified with AI on a Raspberry Pi, offers
unprecedented adaptability, versatility, and cost-effectiveness across diverse applications. The
primary objective is to elevate situational awareness and operational efficacy, ushering in a
new era of surveillance technology.

This project represents a significant leap in surveillance methodologies, harnessing spy drone
technology to redefine the landscape of surveillance and reconnaissance operations. While
recognizing the sensitive nature of spy drone technology, its incorporation aims to ensure
enhanced security, safety, and efficiency across various domains.

CONTENTS

CHAPTER CONTENTS Pg.No

Certificates ii
Acknowledgement iii
Declaration iv
Abstract v
1 Introduction
1.1 Introduction & motivation
1.2 Problems 2
1.3 Objective 3
1.4 Project plan 4

2 History & Literature review


2.1 History
2.1.1 What is a drone?
2.1.2 Early attempts
2.1.2.1 Breguet-Richet Gyroplane(1907)
2.1.2.2 Oehmichen No.2(1920)
2.1.2.3 De Bothezat helicopter (1922)
2.1.3 Recent developments
2.1.4 Agricultural drone
2.1.5 Surveillance spy drone
2.2 Literature Review
2.2.1 Classification
2.2.2 Durham K. Giles, Ryan C. Billing’s drone
2.2.3 Dr Archibald Low drone
2.2.4 Abraham Karem drone
2.2.5 Nishant drone

3 Methodology & Working


3.1 Methodology
3.1.1 Parts used
3.2 Working
3.2.1 Flight dynamics
3.2.2 How quadcopters fly
3.2.3 Drag
3.2.4 Thrust
3.2.5 How the drone lifts
3.2.6 How do quadcopters hover?
3.2.7 Gaining & losing altitude
3.2.8 Roll of drone
3.2.9 Pitch
3.2.10 Yaw of drone
3.2.11 Controls

4 Design and calculations


4.1 Design
4.1.1 Parts of frame
4.1.1.1 Arms of frame
4.1.1.2 Base of frame
4.1.1.3 Frame assembly
4.1.2 Propellers
4.2 Calculations
4.2.1 Motor Specifications
4.2.2 Specifications of Propeller
4.2.3 Specifications of frame
4.2.4 Maximum Thrust For 1 Motor (Kv)

4.2.5 Dead Weight & Payload (in grams)

4.2.6 Drag force

4.2.7 Lift force

5 Conclusions
5.1 Advantages & Disadvantages

Conclusion
References

LIST OF FIGURES

Figure No. Title Pg. No


2.1 Drone 1
2.2 Breguet-Richet Gyroplane 2
2.3 Oehmichen No.2 3
2.4 Drone for photography 4
2.5 Delivery drone by amazon 5
2.6 An Agricultural drone 5
2.6.1 SPY Drone
2.7 Fixed wing drone 9
2.7.1 Durham K. Giles, Ryan C. Billing’s Drone
2.8 Dr. Archibald Low's drone 10
2.8.1 Abraham Karem
2.9 Nishant drone 12
3.1 Wiring diagram 16
3.1.1 Process flow diagram
3.1.2 Pixhawk pin description
3.2 Flight dynamics of drone 17
3.3 Flying setup of quadcopter 18
3.4 Quad X mode 18
3.5 Drag of quadcopter 19
3.6 Thrust force of quadcopter 19
3.7 Lift of drone 20
3.8 Hovering of quadcopter 21
3.9 Gaining & losing height by drone 23
3.10 Left roll of quadcopter 25
3.11 Right roll of quadcopter 25
3.12 Backward pitch of quadcopter 26

3.13 Yaw of quadcopter 27
3.14 Remote control system 30
4.1 Arms of frame 31
4.2 Upper base of frame 32
4.3 Lower base of frame 33
4.4 Assembled frame 34
4.5 Propeller 35

LIST OF TABLE

Table 2.1 Literature review

Table 3.1 List of parts used

Table 4.1 Specifications of motor

CHAPTER-1

INTRODUCTION

1.1 Introduction & Motivation

Spy drones, also known as unmanned aerial vehicles (UAVs) or remotely piloted
aircraft (RPA), are unmanned aircraft systems used for surveillance and reconnaissance
purposes. These drones are equipped with various sensors, cameras, and communication
systems to gather intelligence and monitor activities in both military and civilian contexts.

 Types of Spy Drones

There are different types of spy drones, ranging from small handheld devices to
large, long-endurance aircraft. Examples include the MQ-1 Predator, the MQ-9 Reaper, and
the RQ-4 Global Hawk, among others.

 Applications

Spy drones have numerous applications, primarily in military and intelligence


operations. They are used for aerial surveillance, target acquisition, intelligence gathering,
and reconnaissance missions. In addition, they can support law enforcement, border
security, disaster management, and environmental monitoring.

 Features and Capabilities

Spy drones are equipped with advanced technologies such as high-resolution cameras,
infrared sensors, radar systems, and communication devices. They can operate
autonomously or be remotely controlled, providing real-time data and images to operators
or command centers.

 Benefits

Spy drones offer several advantages. They provide a cost-effective and safer alternative to
manned aircraft for missions in hazardous or remote areas. They can be deployed quickly
and operated over extended periods, enhancing situational awareness and operational
effectiveness.

 Challenges and Concerns

There are also challenges and concerns associated with spy drones. These include
privacy issues, potential misuse, airspace regulations, vulnerability to hacking or jamming,
and the risk of collateral damage. Addressing these concerns is crucial for responsible and
ethical use of such technology.

 Future Developments

The field of spy drones continues to evolve rapidly. Advancements in artificial


intelligence, miniaturization of components, and improved battery life are driving the
development of more capable and autonomous drones. Additionally, efforts are being made
to enhance cybersecurity and establish regulations for their safe and responsible use.

The motivation for this project, the problems with recent work, and the project plan are as follows.

Motivation

In recent years, the use of unmanned aerial vehicles (UAVs) has been a major focus of active
research, since they extend our capabilities in a variety of areas. Applications such as
surveillance, medical evacuation, search-and-rescue missions, and agriculture are some of the
areas where the use of UAVs is being explored. Personal drones have been all the rage for the past
few years, as toys and primarily as new devices for capturing amazing aerial photography.
As the technology has matured and become more mainstream, a number of practical and
very interesting uses of drone technology have emerged. In the past few months we have
seen some amazing developments in the flying drone industry. These drones can be used in
agriculture, in defence, in transportation (delivery), and in many other fields, for example
for crop spraying and surveillance in the agriculture and defence sectors. Using drones offers the
following advantages:

1. Increased Efficiency

Drones are equipped with sensors that can collect data at a much faster pace than traditional
methods. This enables oil and gas companies to survey large areas quickly and allows them
to detect faults and geographical features at a much faster pace. Drones can also be used to
monitor pipelines, check for leaks, and conduct surveys of oil spill zones. All of this can be
done in a short amount of time and at a reduced cost because drones do not need as much
manpower as traditional methods require.

2. Improve Safety

Another significant advantage offered by drone technology is safety. Traditional methods


of oil and gas exploration often involve on-site staff who are at risk of physical harm and

exposure to hazardous substances. By using drones to conduct surveys and inspections, oil
and gas companies can avoid putting their staff in danger.

3. Access to Remote Areas

Some oil and gas exploration takes place in remote and inaccessible areas. By using drones,
oil and gas companies can survey these areas without putting their staff in danger. Drones
can also fly over rough terrain, making them ideal for inspections of pipelines and other
infrastructure placed in unmanageable terrain.

4. Cost-Effectiveness

Drone technology is more cost-effective than traditional methods of oil and gas exploration,
primarily because drones do not require the same level of human resources and time
investment that traditional methods involve. By using drones, oil and gas companies can
reduce their expenses while still collecting valuable data that can inform their operations.

5. Enhanced Data Collection

Drone technology is transforming data collection and analysis in the oil and gas industry,
allowing greater accuracy and speed. Drones are equipped with sensors that can measure
heat, pressure, and humidity levels, which helps detect fault zones and provides real-time
data about natural gas and oil reserves.

6. Improved Response to Accidents

In the event of an accident in the oil and gas industry, time is of the essence. Drones can
help oil and gas companies respond more quickly to accidents and provide more accurate
information about the extent of the damage. This can help prevent significant losses and
provide first responders with information to respond to the accident more efficiently.

Drones are transforming oil and gas exploration and providing numerous benefits for the
companies that use them. The technology allows for increased efficiency, improved safety,
access to remote areas, cost-effectiveness, enhanced data collection, and improved response
to accidents. As advancements in drone technology and sensors continue, the future of oil
and gas exploration likely will see these kinds of technological tools in wider use, evolving
beyond their current applications. It is clear that oil and gas companies that embrace drone
technology will have a significant advantage.

1.2 Problems
Recent research on drones has led to their use in many applications and fields. Today's drones
are more stable in flight and more advanced than ever, but they also have some drawbacks. A
spy drone can only be used in military operations if it follows international law; communication
failure is also one of the major concerns; and a spy drone can only observe, it cannot do more
than that.

1.3 Objective

The basic objective of the project is to reduce the cost of building the drone as well as to
increase its payload capacity, so that it can carry medicines and food during military
operations in extreme weather conditions while also helping to observe enemies at the
border. Other main objectives are to reduce the weight and size of the drone and to increase
its thrust and lift, so that it is difficult for enemies to detect and counter.

The final Drone design had to meet the following specifications:

 The Drone must be capable of carrying a payload of up to 1.5 kg.

 It must have long range.

 The Drone must be stable and balanced.

 The Drone must be cheap and components must be easily available.

 The drone must be capable of surveillance as well as carrying a payload.

1.4 Project plan

The project plan was divided into five major parts.

 Project Description and Plan of Work


 System Model
 Components Purchasing
 Implementation / Hardware / Software
 Project Demonstrations

CHAPTER 2

HISTORY & LITERATURE REVIEW

2.1 History

2.1.1 What is a drone?

An unmanned aerial vehicle (UAV), commonly known as a drone, is an aircraft without a
human pilot onboard. UAVs are a component of an unmanned aircraft system (UAS), which
includes a UAV, a ground-based controller, and a system of communications between
them. UAVs may operate with various degrees of autonomy: either under
remote control by a human operator or autonomously by onboard computers. Compared to
manned aircraft, UAVs were originally used for missions too "dull, dirty or dangerous" for
humans. While they originated mostly in military applications, their use is rapidly
expanding to commercial, scientific, recreational, agricultural, and other applications such
as policing, peacekeeping, surveillance, product deliveries, photography, and drone
racing, and they are also used in many military operations.

Fig. 2.1: Drone
A UAV is defined as a "powered, aerial vehicle that does not carry a human operator, uses
aerodynamic forces to provide vehicle lift, can fly autonomously or be piloted remotely,
can be expendable or recoverable, and can carry a lethal or nonlethal payload". Therefore,
missiles are not considered UAVs because the vehicle itself is a weapon that is not reused,
though it is also unmanned and in some cases remotely guided. [1]

2.1.2 Early attempts

The idea of the quadcopter drone has been experimented with before; some of the experiments
are as follows.

2.1.2.1 Breguet-Richet Gyroplane (1907)


A four-rotor helicopter designed by Louis Breguet. This was the first rotary wing aircraft to
lift itself off the ground, although only in tethered flight at an altitude of a few feet. In 1908
it was reported as having flown 'several times', although details are sparse.[2]

Fig. 2.2: Breguet-Richet Gyroplane

2.1.2.2 Oehmichen No.2 (1920)

Etienne Oehmichen experimented with rotorcraft designs in the 1920s. Among the six
designs he tried, his helicopter No.2 had four rotors and eight propellers, all driven by a
single engine. The Oehmichen No.2 used a steel-tube frame, with two-bladed rotors at the
ends of the four arms. The angle of these blades could be varied by warping. Five of the
propellers, spinning in the horizontal plane, stabilized the machine laterally. The aircraft
exhibited a considerable degree of stability and an increase in control accuracy for its time,

and made over a thousand test flights during the middle 1920s. By 1923 it was able to
remain airborne for several minutes at a time, and on April 14, 1924 it established the first-
ever FAI distance record for helicopters of 360 m (390 yd). It demonstrated the ability to
complete a circular course and later, it completed the first 1 kilometre (0.62 mi) closed-
circuit flight by a rotorcraft.[2]

Fig. 2.3: Oehmichen No.2

2.1.2.3 de Bothezat helicopter (1922)

Dr. George de Bothezat and Ivan Jerome developed this aircraft, with six-bladed rotors at the
end of an X-shaped structure. Two small propellers with variable pitch were used for thrust and
yaw control. The vehicle used collective pitch control. Built by the US Air Service, it made its
first flight in October 1922. About 100 flights were made by the end of 1923.[2]

2.1.3 Recent developments

In the last few decades, small-scale unmanned aerial vehicles have been used for many
applications. The need for aircraft with greater maneuverability and hovering ability has led to a
rise in quadcopter research. The four-rotor design allows quadcopters to be relatively simple in
design yet highly reliable and maneuverable. Research is continuing to increase the abilities of
quadcopters by making advances in multi-craft communication, environment exploration and
maneuverability. If these developing qualities can be combined, quadcopters would be capable
of advanced autonomous missions that are currently not possible with other vehicles

Some current programs include

 The Bell Boeing Quad TiltRotor concept takes the fixed quadcopter concept further by
combining it with the tilt rotor concept for a proposed C-130 sized military transport.
 Flying prototype of the Parrot AR.Drone : Parrot AR.Drone 2.0 take-off, Nevada, 2012.
 Parrot AR.Drone is a small radio controlled quadcopter with cameras attached to it built
by Parrot SA, designed to be controllable by smartphones or tablet devices
 Nixie is a small camera-equipped drone that can be worn as a wrist band.[3]
 Amazon has announced a service, which will deliver your orders right to your door, and
3D Robotics, a commercial drone maker, has received $36 million in funding. The
future of drones flying around everywhere is coming closer and closer to us.[4]

Fig. 2.4: Drone for photography

Fig. 2.5: Delivery drone by amazon

2.1.4 Agricultural drone

As drones entered use in agriculture, the Federal Aviation Administration (FAA) encouraged
farmers to use this new technology to monitor their fields. However, with the unexpected boom

of agricultural drones, the FAA quickly retracted such encouragement, pending new rules and
regulations. With incidents such as drones crashing into crop dusters, it was vital for the FAA
and the AFBF (American Farm Bureau Federation) to agree on regulations that would allow the
beneficial use of such drones in a safe and efficient manner. Although the American Farm
Bureau Federation would like small adjustments to some of the restrictions that have been
implemented, they are happy that the agricultural industry can actually use this new machinery
without the worry of facing any legal issues.

Fig. 2.6: An Agricultural drone

2.1.5 SPY drone (surveillance)

Spy drones, also known as unmanned aerial vehicles (UAVs) or remotely piloted aircraft
(RPA), are unmanned aircraft systems used for surveillance and reconnaissance purposes.
These drones are equipped with various sensors, cameras, and communication systems to gather
intelligence and monitor activities in both military and civilian contexts.

Fig. 2.6.1: A SPY Drone

2.2 Literature Review

A “Drone” is basically an Unmanned Aerial Vehicle (UAV) – an aircraft without a human pilot
aboard. In this section, we explore the different types of drones out there in the market, some of
which are just concepts, while most others are already in action.

2.2.1 Classification

“Drones” can be classified on different bases, for example by usage: drones for
photography, drones for aerial mapping, drones for surveillance, etc. However, the best
classification of drones can be made on the basis of the aerial platform. Based on the type of
aerial platform used, there are four major types of drones.

 Multi Rotor Drones


 Fixed Wing Drones
 Single Rotor Helicopter
 Fixed Wing Hybrid VTOL

Fig. 2.7: Fixed wing drone

2.2.2 Durham K. Giles, Ryan C. Billing’s Drone

In 2015, Durham K. Giles and Ryan C. Billing reported on the deployment and performance of a
UAV for crop spraying. The UAS used in their project was a commercially produced UAV with an
associated ground control station. The aircraft was a petroleum-powered helicopter (Model RMAX,
Yamaha Motor U.S. Co., Cypress, CA, USA) originally developed and deployed for
spraying agrochemicals onto rice in Asia (Fig. 2.7.1). The physical characteristics of the aircraft
were: vehicle mass = 100 kg; rotor diameter = 3.1 m; vehicle length = 3.6 m; vehicle
height = 1.1 m. The aircraft power plant was a two-stroke, 250 cm3 displacement, liquid-cooled,
13.6 kW engine. Control of the aircraft was through a radio-linked, 60 mW, dual-joystick
handheld transmitter operating in the 72 MHz band. The model used in this project had
no provisions for autonomous operation; operation was by direct operator manipulation of the
flight control surfaces and the engine throttle. Operation of the aircraft was limited to a 400 m
line-of-sight range.[5]

Fig. 2.7.1: Durham K. Giles, Ryan C. Billing’s Drone

2.2.3 Dr Archibald Low drone

 Dr. Archibald Low was an influential figure in the development of early drone
technology. He is known for his work in the field of radio-controlled aircraft, which laid
the foundation for modern drone systems.

 Low's contributions in the early 20th century played a crucial role in advancing the
concept of unmanned aerial vehicles. He developed a radio-controlled aircraft called the

"Aerial Target," which was successfully demonstrated in 1917. This early drone was
powered by an internal combustion engine and controlled through radio signals.

 The Aerial Target drone showcased the potential for unmanned aircraft in military
applications, particularly for target practice and aerial reconnaissance. It marked a
significant milestone in the history of drone technology, as it proved the feasibility and
effectiveness of remote-controlled flight.

 Dr. Archibald Low's pioneering work set the stage for further advancements in drone
technology. Following his research, the concept of unmanned aircraft continued to
evolve and expand, leading to the development of more sophisticated and versatile
drones in subsequent decades.

 Today, modern drones have evolved far beyond what Dr. Low initially envisioned. They
are used not only in military contexts but also in various civilian applications, such as
aerial photography, surveillance, delivery services, and scientific research.

 Dr. Archibald Low's contributions to the field of drone technology laid the groundwork
for the remarkable progress seen in unmanned aerial vehicles today. His work remains
significant in the history of aviation, and his early inventions continue to inspire and
influence the advancements being made in the drone industry.

Fig. 2.8: Dr. Archibald Low's drone

2.2.4 Abraham Karem drone

Abraham Karem is a renowned engineer and inventor who has made significant contributions to
the development of drone technology. He is often referred to as the "father of the modern
drone" due to his pioneering work in designing and building unmanned aircraft systems.

Karem's most notable creation is the General Atomics MQ-1 Predator, which revolutionized
military drone operations. He founded his company, Leading Systems Inc., in the 1980s and
collaborated with the U.S. Air Force to develop the Predator drone. This unmanned aerial
vehicle (UAV) was designed for long-endurance surveillance and reconnaissance missions.

The Predator drone incorporated advanced features such as autonomous flight, real-time
surveillance capabilities, and the ability to carry and deploy munitions. It played a crucial role
in military operations, particularly in the surveillance and targeting of adversaries.

Karem's expertise extended beyond the Predator. He also contributed to the development of
other notable drone platforms, including the follow-on MQ-9 Reaper, an upgraded and more
capable version of the Predator.

Abraham Karem's innovations in drone technology have had a profound impact not only in the
military domain but also in other sectors. Drones based on his designs have been employed in
law enforcement, border surveillance, disaster response, and environmental monitoring.

Karem's contributions to the drone industry have shaped its growth and paved the way for the
widespread use of unmanned aerial vehicles today. His expertise and vision have driven
advancements in flight control systems, sensor integration, and autonomous capabilities,
making drones more efficient, versatile, and reliable for various applications.

Fig. 2.8.1: Abraham Karem

2.2.5 Nishant drone

The Nishant drone is an Unmanned Aerial Vehicle (UAV) developed by the Defense Research
and Development Organization (DRDO) of India. It is a remotely piloted aircraft system
primarily designed for aerial reconnaissance and surveillance missions.

Here are some key features and aspects of the Nishant drone:

1. Design and Specifications: The Nishant drone is a fixed-wing aircraft with a high-wing
configuration. It has a length of approximately 4.6 meters and a wingspan of around 6.7 meters.
The drone is powered by a two-stroke, liquid-cooled, heavy fuel engine.

2. Payload and Sensors: The Nishant drone is equipped with a variety of sensors and payloads
to support its surveillance mission. These include electro-optical (EO) and infrared (IR) sensors
for gathering visual and thermal imagery, as well as an automatic identification system (AIS)
for maritime surveillance.

3. Communication and Control: The Nishant drone operates on a radio control link for
communication between the ground control station and the aircraft. It can be remotely piloted
by a ground-based operator using manual control inputs.

4. Endurance and Range: The Nishant drone is designed for short-range surveillance missions,
with an operational range of about 200 kilometers. It has a flight endurance of approximately
four to five hours, depending on payload configuration and mission requirements.

5. Launch and Recovery: The Nishant drone utilizes a launch system that includes a hydraulic
launcher for takeoff. It typically lands using a parachute and is recovered with the help of a net
or airbag system.

6. Applications: The Nishant drone is primarily used by the Indian Armed Forces for
intelligence, surveillance, and reconnaissance (ISR) operations. It aids in border surveillance,
target acquisition, battle damage assessment, and tactical support.

The Nishant drone represents India's indigenously developed unmanned aircraft capability. It
provides valuable aerial surveillance and reconnaissance capabilities to enhance situational
awareness and operational effectiveness for the Indian Armed Forces.

Fig. 2.9: Nishant Drone

CHAPTER 3

METHODOLOGY AND WORKING

3.1 Methodology

There are many designs that can be used to make the drone, and there are many ways to
control it, such as by mobile phone, by remote control, with GSM, with Bluetooth, or with GPS.
We have chosen to build a multirotor drone, specifically a quadcopter, which is capable of
carrying a payload and can also perform surveillance (spy on enemies). The methodology of
this drone is not complicated.

There are four motors attached to the frame. The frame is made of plastic with holes in it to
reduce its weight, and each arm of the frame is at 90 degrees to the adjacent arm. A motor is
mounted at the end of each arm. The motors used in this project are brushless DC motors rated
at 2200 KV and operated on 11-24 volts (3S or 4S LiPo batteries).

Each motor is fitted with a propeller. There are two sets of propellers: each set contains one
clockwise and one anticlockwise propeller. We have built the quadcopter in X mode, as shown in
the figure below, so one motor rotates clockwise while the adjacent motor rotates anticlockwise.
Likewise, the clockwise motors carry clockwise propellers and the anticlockwise motors carry
anticlockwise propellers, so that all of them generate upward thrust. The direction of each motor
is opposite to that of its adjacent motors, so the propeller torques cancel each other out. Control
of this drone is made easy by using a dedicated remote control system instead of a mobile phone
or laptop, which would require additional programming.

Because the motors of this drone are used for carrying weight and have high thrust and RPM
values, we use a speed controller circuit for each motor, called an ESC (electronic speed
controller), which receives signals and controls the motor speed. For stable flight, a flight
controller is needed, which sends signals to the ESCs to control the motors. There are many
flight controllers, but we are using the Pixhawk flight controller, and the sensor used is a
gyroscope. The receiver of the remote control is connected to the flight controller and
calibrated with it, so the remote can control the thrust of the motors and movements such as
pitch and yaw.

Fig. 3.1 Wiring diagram

Fig. 3.1.1: Process flow diagram

3.1.1 Parts used

The parts used in this project were selected with cost and low weight in mind:

1. Quadcopter Frame - It includes arms to hold motors and a chassis to hold the flight
controller, battery and other stuff on board.
2. Motors - Four motors are a must for a quadcopter to lift itself. There are various types of
motors available in the market for quadcopters; they are discussed in more detail
below.
3. ESC (Electronic Speed Controller) - Since the brushless motors used in quadcopters
need a three-phase supply, they cannot be driven directly from the battery. ESCs convert
the signals from the flight controller into the phase currents that drive each motor and control its speed.
4. Propellers - One of the most important parts of your drone is the propellers. These
spinning blades are the wings to your craft, the very part that creates the airflow that lifts
your machine into the air.
5. Flight Controller - Its function is to direct the RPM of each motor in response to input.
A command from the pilot for the Quad-copter to move forward is fed into the flight
controller, which determines how to manipulate the motors accordingly.
6. RC Transmitter & Receiver - Radio Transmitter is an electronic device that controls
the quadcopter manually. More about transmitter-receiver pair will be discussed later.
7. Battery - Your quadcopter battery is the power source that drives all the systems on
your drone and allows it to fly.
8. Miscellaneous - Various types of jumper cables, bullet connectors are needed.

PIXHAWK FLIGHT CONTROLLER

Fig. 3.1.2: Pixhawk pin description

3.2 Working

To understand the working, we have to take a look at the flight dynamics adopted in the drone.

3.2.1. Flight dynamics

Quadcopters generally use two pairs of identical fixed-pitch propellers; two clockwise (CW)
and two counterclockwise (CCW). These use independent variation of the speed of each rotor to
achieve control. By changing the speed of each rotor it is possible to specifically generate a
desired total thrust.

Fig 3.2 : Flight dynamics of drone

By varying rotor speeds, the centre of thrust can be shifted both laterally and longitudinally, and a
desired total torque, or turning force, can be created. Each rotor produces both a thrust and a
torque about its centre of rotation, as well as a drag force opposite to the vehicle's direction of
flight. If all rotors are spinning at the same angular velocity, with rotors one and three rotating
clockwise and rotors two and four counterclockwise, the net aerodynamic torque, and hence the
angular acceleration about the yaw axis, is exactly zero, which means there is no need for a tail
rotor as on conventional helicopters. The rotors act as wings: they generate thrust by rotating at
high speed, which pulls air downwards and keeps the quad in the air.

 The Thrust cancels out the acting weight and the quad hovers.

 A directional Thrust causes the quad to move in that direction.

 Or a decrease in Thrust overall causes the Drone to lose height

The setups for Flying is simple:

 Two adjacent motors spin in the opposite direction.

 Two opposite motors spin in the same direction

 A,C spin Clockwise (From our point of view)

 B,D Spin Counter-Clockwise

Why?

Physics says that, for stability, the net forces and torques acting on a body should be zero. So if
all the rotors were to spin in the same direction, the result would be a net torque causing the
whole quad to rotate.
Consider the motor rotating a ceiling fan. The motor pushes the air in the downward direction,
overcoming the resistance that the air puts on the blades due to inertia. Inertia is the property of
any object to resist a change in its motion. Hence the blades, which are connected to the motor,
push the air in the desired direction, overcoming the inertia of the air. This pushing is what, in
mechanical terms, we call a force: the blades apply a force on the air molecules. Newton's third
law then applies: any object applying a force on another object experiences a force of the same
magnitude in the opposite direction. As the blades push the air, the air pushes back on the blades
with the same magnitude in the opposite direction. The blades are connected to the motor in
such a way that the resulting force from the air produces a torque; a torque is a force which,
when applied to an object, tends to rotate it about an axis. Hence the motor experiences a
torque, that is, a tendency to rotate. This tendency must be resisted by some force, or else the
motor body will start spinning, which is undesirable. For the ceiling fan, that resistance is
provided by the hook from which the motor is hung; it supplies the necessary opposing torque.
As long as we are on land, the earth provides the necessary compensating torque, but once we go
up in the air, things are different: in the air, we have to somehow supply that opposing torque
ourselves to get the desired result. Different flying designs have tackled this problem in different
ways. The quadcopter tackles it by leveraging the fact that there are multiple motors which, on
average, run at the same speed.

Having two motors turn anticlockwise and the other two clockwise nullifies the overall effect on
the quadcopter. The torque applied by the air on a single motor would make the body of the quad
rotate in the direction opposite to that motor's spin; with four motors, the overall effect cancels
out.

3.2.2. How quadcopters fly?

Before getting into the specifics of how quadcopters fly, it is important to understand what drag
and thrust are.

3.2.3. Drag

‘Drag’ is essentially a mechanical force that opposes the motion of any object through a fluid.
In this context, since we are talking about a quadcopter passing through air, it is called
‘aerodynamic drag’ (as opposed to ‘hydrodynamic drag’ for objects passing through water).

Aerodynamic drag on a quadcopter is generated due to the difference in velocity between the
quadcopter and the air. It exists only when the quadcopter/multirotor is in motion (going up,
down, forward, backward, or turning) relative to the air. If the quadcopter is stationary relative
to the air, there is no aerodynamic drag.

3.2.4. Thrust

‘Thrust’ is the force generated by the propellers of the multirotor in order to work against one
of the forces that need to be overcome: the drag. Note that the thrust force is not the main force
responsible for getting the multirotor up in the air. Instead, it is the force that lets the multirotor
travel within the air, which is a fluid, overcoming its drag resistance.

The upward thrust force generated by the propellers is usually measured in pounds or grams. To
keep your drone flying at a hover, the upward thrust needs to equal the weight of your drone.
The thrust-to-weight ratio, TWR (thrust divided by weight), indicates how much thrust your
drone generates relative to its weight. A good rule of thumb is to design the TWR to be at least
a value of two. Typically, quadcopter propellers produce more thrust the faster they spin. They
are also influenced by the flight dynamics of your quadcopter.
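To make this rule of thumb concrete, the short Python sketch below checks the TWR for an example configuration; the thrust and weight figures are assumptions taken from the calculations in Chapter 4 of this report rather than measured values.

# Quick thrust-to-weight ratio (TWR) check.
# Example figures are assumptions taken from Chapter 4 of this report:
# four motors of roughly 1200 g thrust each and a dead weight of about 805 g.

def thrust_to_weight_ratio(total_thrust_g, all_up_weight_g):
    """Return TWR = total thrust / all-up weight (both in grams)."""
    return total_thrust_g / all_up_weight_g

total_thrust = 4 * 1200            # grams of thrust from four motors
all_up_weight = 805 + 1200         # grams: dead weight plus an example payload

twr = thrust_to_weight_ratio(total_thrust, all_up_weight)
print(f"TWR = {twr:.2f}")          # a value of at least 2 is the usual design target
print("OK for stable flight" if twr >= 2.0 else "Underpowered for this payload")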

Some propellers produce much more thrust when the drone is stationary, as opposed to when it
is flying. Other props perform much better at higher speeds. Torque is generated when the
propellers accelerate up or down. This force is responsible for
the ability of the drone to rotate on the yaw axis. Torque is an effect of Newton’s third law,

where every action has an equal and opposite reaction. As the propeller rotates, and pushes
through the air, the air pushes back and causes a counter rotation on the body of the drone. This
is why all of the propellers on a multirotor drone do NOT rotate in the same direction. Drones
use rotors for propulsion and control. You can think of a rotor as a fan, because they work
pretty much the same. Spinning blades push air down. Of course, all forces come in pairs,
which means that as the rotor pushes down on the air, the air pushes up on the rotor. This is the
basic idea behind lift, which comes down to controlling the upward and downward force. The
faster the rotors spin, the greater the lift, and vice-versa.

3.2.5 How the Drone Lifts

 To Rise above the ground, you need a net upward Force.

 The Motors generate Thrust that is greater than the Weight, making the quad rise
upwards.

 The lift is the force that acts against the weight of the craft, taking it up in the air. Like
we covered in the quadcopter blade rotation and lift post, The following are responsible
for the lift in a wing:

 Newton’s third law (every action has an equal and opposite reaction) – generates a lift in
a wing at the bottom, since the mass of air is pushed down and back (lift and drag).

 Bernoulli’s explanation is incomplete, but the pressure difference between the air at the
top and at the bottom of propeller due to the Coanda Effect generates a lift towards the
lower pressure at the top (read more in the quadcopter blade rotation post).

 The propellers on the multirotor generate a ‘lift’ force using similar principles (pushing
the air downwards and the difference in air pressure). In order for the multirotor to get
off the ground and be able to hover and fly around, this force must be greater than the
weight of the craft.

3.2.6. How do quadcopters hover?

In order for the quadcopter to hover in place, it is necessary to ensure two things.

The rotation speed must be sufficient for the quadcopter to generate a ‘lift’ counteracting its
own weight, but not so much that the quadcopter keeps climbing in altitude. And the torque
acting on the body of the quadcopter from each of the motors should be cancelled out;
otherwise, the quadcopter will tend to yaw in a certain direction.

 Hovering in Air is simple.

 Motors generate Thrust.

 All the motors rotate at the same speed (or RPM).

 The Thrust should equal the weight of the System.

 The two forces cancel and our drone Hovers.

 To hover, the net thrust of the four rotors push the drone up and must be exactly equal to
the gravitational force pulling it down

3.2.7. Gaining and losing altitude

In order for the quadcopter to gain altitude, all four motors must increase their speed of
rotation simultaneously. Conversely, to descend, all four motors must decrease their speed of
rotation simultaneously. This is what happens when you raise or lower the throttle control on
your transmitter: the speed of all the motors changes simultaneously.

 Gaining or losing height depends upon the thrust of all the motors.

 The thrust of all motors must be the same; thrust depends directly on motor speed.

 If the thrust is greater than the weight, the drone moves upward.

 If the thrust is smaller than the weight, the drone moves downward.

 If the thrust is equal to the weight, the drone hovers.

3.2.8. Roll of drone

The ‘roll’ control tells the quadcopter to move side to side. In order to ‘roll’ to the right, for
example, the speed of the motors on the left of the quadcopter must increase relative to the
speed of the motors on the right. This ‘rolls down’ the right side of the quadcopter, resulting in
a sideways swaying movement. Like pitch, this is achieved either by increasing the speed of the
left motors or by decreasing the speed of the right motors. Conversely, in order to ‘roll’ left, the
speed of the motors on the right of the quadcopter should increase relative to the speed of the
motors on the left.

The red arrows in the figure indicate the higher-thrust side and the blue arrows the lower-thrust side.

 To roll towards the left (our left), the thrust is increased on the motors on the right.

 The thrust on the motors on the left is decreased.

 To roll towards the right (our right), the thrust is increased on the motors on the left.

 The thrust on the motors on the right is decreased.

 This keeps the net torque zero while still producing a leftward or rightward net force.

3.2.9. Pitch

The ‘pitch’ control tells the quadcopter whether to fly forward or backward. In order to pitch
forward for example, the speed of the motors at the rear of the quadcopter must increase,
relative to the speed of the motors on the front. This ‘pitches’ the nose (front) of the quadcopter
down, resulting in the forward movement. What is the difference between moving forward or
backward? None, because the drone is symmetrical. The same holds true for side-to-side
motion. Basically a quadcopter drone is like a car where every side is the front. This means that
explaining how to move forward also explains how to move back or to either side.

In order to fly forward, we need a forward component of thrust from the rotors. You could
increase the rotation rate of rotors 3 and 4 (the rear ones) and decrease the rate of rotors 1 and 2;
the total thrust force will then remain equal to the weight, so the drone will stay at the same
vertical level.

Since one of the rear rotors is spinning counterclockwise and the other clockwise, the increased
rotation of those rotors will still produce zero angular momentum. The same holds true for the
front rotors, and so the drone does not rotate.

However, the greater force in the back of the drone means it will tilt forward. Now a slight
increase in thrust for all rotors will produce a net thrust force that has a component to balance
the weight along with a forward motion component. This is achieved by either increasing the
speed of the rear motors or decreasing the speed of the front motors. Conversely, in order to
‘pitch’ backwards, the speed of the motors at the front of the quadcopter must increase relative
to the speed of the motors at the back.

To pitch forwards (towards us):

The power to the rear motors is increased.

To pitch backward (away from us):

The power to the front motors is increased.

In each case, we also decrease the power to the opposite pair of motors to keep the total thrust
constant and the angular momentum conserved.

Fig. 3.12: Backward pitch of quadcopter

3.2.10. Yaw of drone

The ‘yaw’ or ‘rudder’ control is a rotation movement of the quadcopter. In this case, the rotation
speed of a diametrically opposing pair of motors is increased or decreased, varying the torque in
the direction of rotation of that pair (remember that diametrically opposing motors in a
quadcopter rotate in the same direction), causing the quadcopter to rotate in reaction to the
changed torque.

 To Yaw Clockwise.

 We increase the Thrust on the Anti-Clockwise moving Motors.

 Decrease the Thrust on Clockwise Rotating Motors.

 There is a resulting Anti-Clockwise Torque.

 The quad rotates clockwise to conserve angular momentum.
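The roll, pitch, yaw and throttle behaviour described above can be summarised as a simple motor-mixing rule. The Python sketch below is only an illustration of the idea, not the actual Pixhawk firmware; the motor labels, and the assumption that motor A sits at the front-left, follow the A/B/C/D layout used in the figures.

# Simplified motor mixer for an X-configuration quadcopter (illustrative only).
# Assumption: A = front-left (CW), B = front-right (CCW),
#             C = rear-right (CW), D = rear-left (CCW), as in the figures.
# throttle is 0..1; roll (+ = roll right), pitch (+ = pitch forward) and
# yaw (+ = yaw clockwise) are roughly -1..1.

def mix(throttle, roll, pitch, yaw):
    motors = {
        "A": throttle + roll - pitch - yaw,   # front-left, clockwise
        "B": throttle - roll - pitch + yaw,   # front-right, counter-clockwise
        "C": throttle - roll + pitch - yaw,   # rear-right, clockwise
        "D": throttle + roll + pitch + yaw,   # rear-left, counter-clockwise
    }
    # Clamp each output to the usable range of the ESCs.
    return {m: min(max(v, 0.0), 1.0) for m, v in motors.items()}

# Example: hover throttle with a small right-roll command -> left motors speed up.
print(mix(throttle=0.5, roll=0.1, pitch=0.0, yaw=0.0))

Increasing the thrust on one pair of motors while decreasing it on the opposite pair, as in the bullet points above, keeps the total thrust roughly constant while producing the desired roll, pitch or yaw moment.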

3.2.11. Controls

Roll

Done by pushing the right stick to the left or right. Literally rolls the quadcopter, which
maneuvers the quadcopter left or right.

Pitch

Done by pushing the right stick forwards or backwards. Tilts the quadcopter, which maneuvers
the quadcopter forwards or backwards.

Fig. 3.14: Remote control system


YAW

Done by pushing the left stick to the left or to the right. Rotates the quadcopter left or right.
Points the front of the copter different directions and helps with changing directions while
flying.

THROTTLE

Engaged by pushing the left stick forwards. Disengaged by pulling the left stick
backwards. This adjusts the altitude, or height, of the quadcopter.

AILERON
Same as the right stick. However, it relates directly to controlling roll (left and right
movement).

HOVERING
Staying in the same position while airborne. Done by controlling the throttle.

ROLL
Roll moves your quadcopter left or right. It’s done by pushing the right stick on your
transmitter to the left or to the right.

YAW
This is done by pushing the left stick to left or to the right.

CHAPTER 4

DESIGN AND CALCULATIONS

4.1. Design

4.1.1. Parts of frame

4.1.1.1. Arms Of frame

Fig. 4.1: Arms of frame

4.1.1.2. Base of frame

Fig. 4.2 & Fig 4.3 Upper & Lower base of Frame

4.1.1.3. Frame Assembly

Fig. 4.4: Assembled Frame


4.1.2. Propellers

Fig. 4.5: Propeller

4.2. Calculations:

4.2.1. Motor Specifications:

KV rating: 2200 KV

Voltage: DC 11-24 V

Current: 4-10 A

Thrust: 1200 grams

Power: 342 W

Efficiency: 80%

Rotation speed: approx. 24,420 RPM (with an 11.1 V LiPo battery)

Table 4.1: Specifications of motor

4.2.2 Specifications of Propeller

 Diameter: 10in

 Pitch: 4.5in

 Propeller diameter: 25.4cm

 Centre bore diameter: 6mm front and 9mm reverse side.

 Working area: 0.0290 m²

 Centre seat TH: 6mm

 Weight: about 10g/pair

 1 x CW Propeller & 1 x CCW Propeller

4.2.3. Specifications of frame:

 Model Flame Wheel 450 (F450)

 Frame Weight 282g

 Diagonal Wheelbase 450mm

 Take off Weight 800g ~ 1600g

4.2.4. Maximum Thrust for One Motor (2200 KV):

Maximum thrust for one motor = 1200 grams

(at 11.1 V with a 10 x 4.5 inch propeller)

For four motors = 1200 * 4

= 4800 grams

= 4.8 kg

4.2.5. Dead Weight & Payload (in grams)

1. Dead Weight

 Propeller = 20 grams

 Frame =280 grams

 Motors = For One Motor 50 grams

 For Four Motors 200 grams

 Tank = 25 grams

 Pump = 50 grams
 Battery = 150 grams

 Circuits = 30 grams

 Nozzle & Pipe = 50 grams

 Total Weight = 20+280+200+25+50+150+30+50 = 805 grams

2. Maximum Payload

Weight carrying capacity of the drone:

According to the thrust-to-weight rule of thumb, the thrust should be at least twice the weight
for a stable flight, so the usable weight = 4800 / 2 = 2400 grams = 2.4 kg.

Maximum payload during stable flight = usable weight - total dead weight

= 2400 - 805

= 1595 grams

≈ 1.6 kg
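The same weight budget can be checked with a short Python sketch; the component masses and per-motor thrust are the values listed in this section.

# Weight budget and payload estimate for the quadcopter (values from this section).
components = {
    "propellers": 20, "frame": 280, "motors": 200, "tank": 25,
    "pump": 50, "battery": 150, "circuits": 30, "nozzle_and_pipe": 50,
}

dead_weight = sum(components.values())        # 805 grams
max_thrust = 4 * 1200                         # grams, four 2200 KV motors
usable_weight = max_thrust / 2                # thrust-to-weight ratio of 2
max_payload = usable_weight - dead_weight     # grams

print(f"Dead weight     : {dead_weight} g")
print(f"Usable weight   : {usable_weight:.0f} g")   # 2400 g at TWR = 2
print(f"Maximum payload : {max_payload:.0f} g")     # about 1.6 kg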

4.2.6. Drag Force

Take a look at the drag equation:

Fd = ½ ρ Cd A V²

Where:

Fd = drag force in newtons
ρ = the density of air in kg/m³
Cd = the drag coefficient
A = the cross-sectional area of the quad in m² in the direction of movement
V = the velocity of the drone relative to the air in m/s

CALCULATIONS

ρ = density of air = 1.223 kg/m³

Cd = drag coefficient, taken as constant for a 10 x 4.5 inch propeller: Cd = 0.0475

A = area of the drone propellers = 0.0290 m²

V = assumed average velocity of the drone at maximum thrust = 8 m/s

Fd = ½ [ρ Cd A V²]

= ½ × 1.223 × 0.0475 × 0.0290 × 8²

≈ 0.054 N
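For reference, a short Python sketch evaluating the drag equation with the assumed values above:

# Aerodynamic drag estimate: Fd = 0.5 * rho * Cd * A * V^2
rho = 1.223    # air density, kg/m^3
cd = 0.0475    # drag coefficient assumed for a 10 x 4.5 inch propeller
area = 0.0290  # reference area of the propellers, m^2
v = 8.0        # assumed average velocity at maximum thrust, m/s

drag_force = 0.5 * rho * cd * area * v ** 2
print(f"Drag force = {drag_force:.3f} N")      # about 0.054 N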

4.2.7. Lift Force

LF = ½ [CL ρ A V²]

where CL is the lift coefficient, taken here as CL = 2L / (A ρ V), with L the distance covered by
the drone during lift (so CL varies with the distance covered).

ρ = density of air = 1.223 kg/m³

A = area of the drone propellers = 0.0290 m²

V = assumed average velocity of the drone at maximum thrust = 8 m/s

Assumed distance covered by the drone during lift, L = 3 m

CL = 2L / (A ρ V) = (2 × 3) / (0.029 × 1.223 × 8) = 21.146

LF = ½ ρ CL A V²

= ½ × 1.223 × 21.146 × 0.0290 × 8²

LF = 23.99 N
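The lift figure can be reproduced the same way. Note that this sketch follows the report's own expression CL = 2L / (A ρ V), with L taken as the assumed distance covered during lift, rather than the standard lift-coefficient definition.

# Lift estimate following the formulas used in this section.
rho = 1.223     # air density, kg/m^3
area = 0.0290   # propeller reference area, m^2
v = 8.0         # assumed average velocity at maximum thrust, m/s
dist = 3.0      # assumed distance covered during lift, m

cl = 2 * dist / (area * rho * v)               # report's lift-coefficient expression
lift_force = 0.5 * rho * cl * area * v ** 2    # LF = 0.5 * rho * CL * A * V^2
print(f"CL = {cl:.3f}, Lift force = {lift_force:.2f} N")   # about 21.15 and 24.0 N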

CHAPTER 5

SOFTWARE COMPONENTS USED IN THE PROJECT FOR FACE AND OBJECT RECOGNITION

5.1 OpenCV (Open Source Computer Vision Library): OpenCV is a library of programming
functions mainly for real-time computer vision. Originally developed by Intel, it was later
supported by Willow Garage, then Itseez. The library is cross-platform and licensed as free and
open-source software under the Apache License.

OpenCV's application areas include:

 2D and 3D feature toolkits


 Egomotion estimation
 Facial recognition system
 Gesture recognition
 Human–computer interaction (HCI)
 Mobile robotics
 Motion understanding
 Object detection
 Segmentation and recognition
 Stereopsis stereo vision: depth perception from 2 cameras
 Structure from motion (SFM)
 Motion video tracking
 Augmented reality

Fig 5.1

5.2 DEEP LEARNING FRAMEWORKS :


Deep learning frameworks provide tools and libraries for building and training neural networks,
essential for tasks like face and object recognition. Here are some notable deep learning
frameworks commonly used in computer vision applications:
1. TensorFlow:
 Features: Developed by Google Brain, TensorFlow offers a comprehensive
ecosystem for building and deploying machine learning models.
 Advantages: Known for its flexibility, scalability, and extensive community
support. TensorFlow provides high-level APIs (such as Keras) for easier model
building and deployment.
 Applications: Widely used in research and industry for various deep learning
tasks, including image recognition, natural language processing, and
reinforcement learning.
2. PyTorch:
 Features: Developed by Facebook's AI Research lab, PyTorch is known for its
dynamic computation graph, enabling more intuitive model building and
debugging.
 Advantages: Offers a more Pythonic and easy-to-understand interface, making
it popular among researchers and developers. It allows for flexible
experimentation with neural network architectures.
 Applications: Widely used in academia and research for its simplicity and
flexibility, especially in fields like computer vision and natural language
processing.

A face detection system has become very popular these days, as it can be very secure compared
to fingerprint and typed passwords. You may have seen the face unlock feature in your
smartphone which makes things very easy. Face detection is also used in many places such as
airports, railway stations, and roads for surveillance.
Here, we will build a face recognition system using the OpenCV library on a Raspberry Pi, as it
is portable enough to work as a surveillance system. This system has been tested and works
without any issue.
Like every other face recognition system, it includes two Python scripts: the first is a
training program that analyses a set of photos of a particular person and creates a
dataset (YML file) from it.
The second program is the recognizer program, which detects a face and then uses this
YML file to recognize the face and report the person's name. The programs here are optimized
for the Raspberry Pi (Linux).
5.2.1 What is OpenCV and How to Use it for Face Recognition?
OpenCV is an open-source library for computer vision, machine learning, and image
processing. Now it plays a major role in real-time operation which is very important in today’s
systems.
By using this library, anyone can process images and videos to identify objects, faces, and even
handwriting. When integrated with libraries such as NumPy, Python can process the OpenCV
array structure for analysis.
It identifies image patterns and their features, which are represented in vector space for
mathematical operations.
Working Process
Before we start, it's important to grasp that Face Detection and Face Recognition are two
different things. In Face Detection, only the face of an individual will be detected by the
software.
In Face Recognition, the software won't only detect the face but will recognize the person.
Now, it should be clear that we'd like to perform Face Detection before performing Face
Recognition.
A video feed from a webcam is nothing but a long sequence of images being updated one after
the other and each of those images is simply a set of pixels of various values put together in its
respective position.
There are plenty of algorithms for detecting a face from these pixels and further recognizing
the person in it; explaining them is beyond the scope of this report. Since we are using the
OpenCV library, which makes these steps simple to apply, face recognition can be performed
without getting deeper into the underlying concepts.
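As an illustration of the detection step alone, a minimal Python sketch is shown below; it assumes OpenCV is already installed and that the standard haarcascade_frontalface_default.xml file is present in the working directory.

import cv2

# Minimal face *detection* (not recognition) on a single webcam frame.
face_cascade = cv2.CascadeClassifier("haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)          # USB webcam (or the Pi camera via a V4L2 driver)
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    print(f"Detected {len(faces)} face(s)")
    cv2.imwrite("detected.jpg", frame)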
So now, let's install the packages required for face detection.
Note - Please install OpenCV before proceeding with further steps.
Setup Procedure
Power your Pi with an adapter and connect it to a display monitor via an HDMI cable; you can
also do the process in headless mode.
Install dlib: dlib is a toolkit for real-world machine learning and data analysis applications.
To install dlib, just enter the following command in the terminal.

Fig 5.2.1
 This should install dlib; when successful, you will get a screen like this on entering the
command again. Here, the setup for face recognition was already prepared, so all the
packages are already pre-installed.
 You can check in the same way, by typing the command again, whether it is installed
properly.
Install pillow: Pillow, also called PIL, stands for Python Imaging Library; it can open,
manipulate, and save images in many different formats. To install PIL, use the following
command.
Once installed, you will get a success message as shown below.

Fig 5.2.2
Install face_recognition: The face_recognition library for Python is considered one of the
simplest libraries for recognizing and manipulating faces. We will be using this library to train
and recognize faces. To install this library, follow the command.

 Once installed you will get a success message as shown below.

Fig 5.2.3
The library is heavy and many users will face memory problems, hence the --no-cache-dir
option is used to install the library without saving the cache files.

5.3 Train the system (How to feed the data)
Let's take a look at the Face_Trainer.py program. The purpose of the program is to open all
the pictures in the Face_Images directory and look for faces.
Once a face is detected, it crops the face, converts it to grayscale, and then converts it to a
NumPy array. We then use the libraries installed earlier to train on these faces and save the
result as a file called face-trainner.yml. The information in this file can later be used to
recognize faces.
The whole trainer program is given at the end; here the most important lines are explained.
Fig 5.3.1
 Next, we have to use the haarcascade_frontalface_default.xml classifier to detect faces
in images. Make sure you have placed this XML file in your project folder, otherwise you
will face errors.
 Then we use the recognizer variable to create a Local Binary Pattern Histogram (LBPH)
Face Recognizer.
face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')
recognizer = cv2.createLBPHFaceRecognizer()
 Then we have to get into the Face_Images directory to access the pictures inside it. This
directory should be placed inside your current working directory (CWD).
 The following line is used to get into that folder placed in the CWD.
Fig 5.3.2
Face_Images = os.path.join(os.getcwd(), "Face_Images")  # Tell the program where we have saved the face images
 We then use for loops to get into each sub-directory of the Face_Images directory and
open any files that end with jpeg, jpg, or png. The path of every image is stored in a
variable called path, and the folder name (which will be the person's name) in which the
pictures are placed is stored in a variable called person_name.
Fig 5.3.3
 If the name of the person changes, we increment a variable called Face_ID. This helps us
assign a unique Face_ID to each person, which we will later use to identify the name of the
person.
 As we know, it is much easier for OpenCV to work with grayscale images than with
colored images, since the colour (BGR) values can be ignored. So, to reduce the values in
the image, we convert it to grayscale and also resize the image to 550 so that all
images stay uniform.
 Ensure the face is in the middle of the picture, otherwise the face may be cropped out. Finally,
we convert these images to a NumPy array to get a mathematical representation of the
photographs, and then use the cascade classifier to detect the faces in the images and store
the result in a variable called faces.
 Once a face has been detected, we crop that area and treat it as our Region of
Interest (ROI). The ROI region is what is used to train the face recognizer. We append
every ROI face to a variable called x_train.
 Then we feed these ROI values, together with the Face_ID value, to the recognizer, which
gives us the training data. The data obtained is saved; when you run this program you will
find that the face-trainner.yml file gets updated each time.
 So make sure to run this program whenever you make any changes to the photos in the
Face_Images directory.
 When run, the Face_ID, path name, person name, and NumPy array are printed as shown
below for debugging purposes (a consolidated sketch of the whole trainer is given after
Fig 5.3.4 below).
Fig 5.3.4
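Putting the steps above together, here is a minimal sketch of the trainer. It assumes the same older OpenCV API used elsewhere in this chapter (on OpenCV 3 and later the equivalent calls are cv2.face.LBPHFaceRecognizer_create() and recognizer.write()), a 550x550 resize, and the folder layout described above; it is an illustration, not the exact program from the figures.

import os
import cv2
import numpy as np
from PIL import Image

face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')
recognizer = cv2.createLBPHFaceRecognizer()          # cv2.face.LBPHFaceRecognizer_create() on OpenCV 3+

Face_Images = os.path.join(os.getcwd(), "Face_Images")   # One sub-folder per person

current_id = 0
label_ids = {}   # person_name -> Face_ID
x_train = []     # cropped face regions (ROIs)
y_ID = []        # Face_ID for each ROI

for root, dirs, files in os.walk(Face_Images):
    for file in files:
        if file.lower().endswith(("jpeg", "jpg", "png")):
            path = os.path.join(root, file)
            person_name = os.path.basename(root)     # Folder name is the person's name
            if person_name not in label_ids:         # New person -> new Face_ID
                label_ids[person_name] = current_id
                current_id += 1
            Face_ID = label_ids[person_name]

            pil_image = Image.open(path).convert("L")    # Convert to grayscale
            pil_image = pil_image.resize((550, 550))     # Keep all images uniform
            image_array = np.array(pil_image, "uint8")   # NumPy array of pixel values

            faces = face_cascade.detectMultiScale(image_array, scaleFactor=1.5, minNeighbors=5)
            for (x, y, w, h) in faces:
                roi = image_array[y:y+h, x:x+w]          # Region of Interest: the face only
                x_train.append(roi)
                y_ID.append(Face_ID)

recognizer.train(x_train, np.array(y_ID))    # Train the LBPH recognizer on the ROIs
recognizer.save("face-trainner.yml")         # Re-run whenever Face_Images changes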
Face detection results
Now that we have our trained data ready, we can use it to recognize faces. In the Face
Recognizer program, we get a live video feed from a USB webcam and convert each frame to
an image.
 Then we have to use our face detection technique to detect faces in those frames and
compare them with all the Face_IDs that we created earlier.
 If we find a match, we box the face and write the name of the person who has been
recognized. The entire program is again given at the end; the explanation is as follows.
 The program shares a lot of similarities with the trainer program, so we import the
same modules that we used earlier and also use the same classifier, since we need to
perform face detection again.
import cv2 #For Image processing
import numpy as np #For converting Images to Numerical array
import os #To handle directories
from PIL import Image #Pillow lib for handling images
face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')
recognizer = cv2.createLBPHFaceRecognizer()
 Next, in the variable labels, you have to write the names of the persons as mentioned in
the folders. Make sure you follow the same order. In this example, the names are
"Apurva" and "Paul".
labels = ["Apurva", "Paul"]
 We then load the face-trainner.yml file into our program, since we are going to use the
information from that file to recognize faces.
recognizer.load("face-trainner.yml")
 The video feed is obtained from the USB webcam. If you have more than one camera
connected, replace 0 with 1 to access the secondary camera.
cap = cv2.VideoCapture(0)  # Get video feed from the camera
 Next, we break the video into frames (images), convert each frame to grayscale, and then
detect the faces in the image.
 Once the faces are detected, we crop that area just like we did earlier and save it
separately as roi_gray.
ret, img = cap.read()  # Break video into frames
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # Convert video frame to grayscale
faces = face_cascade.detectMultiScale(gray, scaleFactor=1.5, minNeighbors=5)  # Detect faces
for (x, y, w, h) in faces:
    roi_gray = gray[y:y+h, x:x+w]  # Crop the face region from the grayscale frame
    id_, conf = recognizer.predict(roi_gray)  # Recognize the face
 The variable conf tells us how confident the software is in recognizing the face. If the
confidence value is greater than 80, we get the name of the person from the ID number
using the line of code below.
 Then we draw a box around the face of the person and write the name of the person on top
of the box.
if conf >= 80:
    font = cv2.FONT_HERSHEY_SIMPLEX  # Font style for the name
    name = labels[id_]  # Get the name from the list using the ID number
    cv2.putText(img, name, (x, y), font, 1, (0, 0, 255), 2)
cv2.rectangle(img, (x, y), (x+w, y+h), (0, 255, 0), 2)  # Box around every detected face
 Finally, we display the video feed that we just analyzed and break out of the loop when
the wait key (here q) is pressed.
cv2.imshow('Preview', img)  # Display the video
if cv2.waitKey(20) & 0xFF == ord('q'):
    break
 Make sure the Pi is connected to a monitor through HDMI when this program is
executed. Run the program and a window named Preview will pop up with your video
feed in it.
 If a face is detected in the video feed, you will find a box around it, and if the program
recognizes the face, it will also display the name of the person (a consolidated sketch of
this recognizer loop is given below).
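The snippets above are fragments of a single loop. A minimal sketch of the assembled recognizer, under the same assumptions about the older OpenCV API as the trainer sketch, is:

import cv2
import numpy as np

labels = ["Apurva", "Paul"]                      # Same order as the training folders

face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')
recognizer = cv2.createLBPHFaceRecognizer()      # cv2.face.LBPHFaceRecognizer_create() on OpenCV 3+
recognizer.load("face-trainner.yml")             # recognizer.read(...) on OpenCV 3+

cap = cv2.VideoCapture(0)                        # USB webcam (use 1 for a second camera)
font = cv2.FONT_HERSHEY_SIMPLEX

while True:
    ret, img = cap.read()                        # Grab one frame
    if not ret:
        break
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.5, minNeighbors=5)

    for (x, y, w, h) in faces:
        roi_gray = gray[y:y+h, x:x+w]            # Crop the detected face
        id_, conf = recognizer.predict(roi_gray) # Predict Face_ID and confidence
        if conf >= 80:
            name = labels[id_]                   # Look up the person's name
            cv2.putText(img, name, (x, y), font, 1, (0, 0, 255), 2)
        cv2.rectangle(img, (x, y), (x+w, y+h), (0, 255, 0), 2)

    cv2.imshow('Preview', img)                   # Show the annotated feed
    if cv2.waitKey(20) & 0xFF == ord('q'):       # Quit on 'q'
        break

cap.release()
cv2.destroyAllWindows()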
5.4 OBJECT DETECTION AND RECOGNITION ON LIVE VIDEO FEED FROM PI CAMERA
The following program uses the TensorFlow Object Detection API (TensorFlow 1.x) with the
SSDLite MobileNet v2 COCO model to draw labelled detection boxes on a live video feed from
the Pi Camera (or a USB webcam when started with --usbcam).
import os
import cv2
import numpy as np
from picamera.array import PiRGBArray
from picamera import PiCamera
import tensorflow as tf
import argparse
import sys

# Camera resolution for the Pi Camera feed
IM_WIDTH = 1280
IM_HEIGHT = 720

# Default to the Pi Camera; pass --usbcam to use a USB webcam instead
camera_type = 'picamera'
parser = argparse.ArgumentParser()
parser.add_argument('--usbcam', help='Use a USB webcam instead of picamera',
                    action='store_true')
args = parser.parse_args()
if args.usbcam:
    camera_type = 'usb'

# Make the TensorFlow object_detection utilities importable
sys.path.append('..')
from utils import label_map_util
from utils import visualization_utils as vis_util

# Pre-trained SSDLite MobileNet v2 (COCO) model and its label map
MODEL_NAME = 'ssdlite_mobilenet_v2_coco_2018_05_09'
CWD_PATH = os.getcwd()
PATH_TO_CKPT = os.path.join(CWD_PATH, MODEL_NAME, 'frozen_inference_graph.pb')
PATH_TO_LABELS = os.path.join(CWD_PATH, 'data', 'mscoco_label_map.pbtxt')
NUM_CLASSES = 90

# Load the label map and build the category index (class id -> class name)
label_map = label_map_util.load_labelmap(PATH_TO_LABELS)
categories = label_map_util.convert_label_map_to_categories(
    label_map, max_num_classes=NUM_CLASSES, use_display_name=True)
category_index = label_map_util.create_category_index(categories)

# Load the frozen TensorFlow graph into memory and start a session
detection_graph = tf.Graph()
with detection_graph.as_default():
    od_graph_def = tf.GraphDef()
    with tf.gfile.GFile(PATH_TO_CKPT, 'rb') as fid:
        serialized_graph = fid.read()
        od_graph_def.ParseFromString(serialized_graph)
        tf.import_graph_def(od_graph_def, name='')
    sess = tf.Session(graph=detection_graph)

# Input and output tensors of the detection graph
image_tensor = detection_graph.get_tensor_by_name('image_tensor:0')
detection_boxes = detection_graph.get_tensor_by_name('detection_boxes:0')
detection_scores = detection_graph.get_tensor_by_name('detection_scores:0')
detection_classes = detection_graph.get_tensor_by_name('detection_classes:0')
num_detections = detection_graph.get_tensor_by_name('num_detections:0')

# Frame-rate bookkeeping
frame_rate_calc = 1
freq = cv2.getTickFrequency()
font = cv2.FONT_HERSHEY_SIMPLEX

if camera_type == 'picamera':
    # Initialize the Pi Camera and grab a reference to the raw capture
    camera = PiCamera()
    camera.resolution = (IM_WIDTH, IM_HEIGHT)
    camera.framerate = 10
    rawCapture = PiRGBArray(camera, size=(IM_WIDTH, IM_HEIGHT))
    rawCapture.truncate(0)

    for frame1 in camera.capture_continuous(rawCapture, format="bgr", use_video_port=True):
        t1 = cv2.getTickCount()

        # Copy the frame and prepare it for the detector (RGB, batch dimension)
        frame = np.copy(frame1.array)
        frame.setflags(write=1)
        frame_rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        frame_expanded = np.expand_dims(frame_rgb, axis=0)

        # Run the detection on this frame
        (boxes, scores, classes, num) = sess.run(
            [detection_boxes, detection_scores, detection_classes, num_detections],
            feed_dict={image_tensor: frame_expanded})

        # Draw the detection boxes and labels on the frame
        vis_util.visualize_boxes_and_labels_on_image_array(
            frame,
            np.squeeze(boxes),
            np.squeeze(classes).astype(np.int32),
            np.squeeze(scores),
            category_index,
            use_normalized_coordinates=True,
            line_thickness=8,
            min_score_thresh=0.40)

        # Overlay the frame rate and show the result
        cv2.putText(frame, "FPS: {0:.2f}".format(frame_rate_calc), (30, 50), font, 1,
                    (255, 255, 0), 2, cv2.LINE_AA)
        cv2.imshow('Object detector', frame)

        # Update the frame-rate estimate
        t2 = cv2.getTickCount()
        time1 = (t2 - t1) / freq
        frame_rate_calc = 1 / time1

        # Quit on 'q'
        if cv2.waitKey(1) == ord('q'):
            break

        # Clear the stream for the next frame
        rawCapture.truncate(0)

    camera.close()

cv2.destroyAllWindows()
CHAPTER 6
CONCLUSIONS
6.1 ADVANTAGES AND DISADVANTAGES
Fig. 6.1 Advantages and disadvantages
Conclusion:
Some of the points achieved with this drone, and the results obtained from it, are as follows:
1. Drones are crucial in military operations: personnel no longer have to patrol the borders on
foot, since they can observe from the base, and the drone can also deliver required payloads
such as medicines and food.
2. The drone is 20% faster than a person on foot and covers a large area in less time.
3. This helps in protecting soldiers' lives during surgical strike operations.
4. It reduces human effort and decreases the risk to human life, especially in the military.
5. The main achievement is the reduction in cost: the cost of our drone is Rs. 30,000/-, which is
much less than that of comparable drones on the market. The cost has been reduced by using
cost-effective parts.
6. The flight time has also been increased by using a larger battery; the drone flies for about
25 minutes while carrying a payload, although a larger battery increases the weight and
reduces the payload capacity.
7. A remote control has been used to operate the drone, which is easier to use and set up than a
laptop or mobile application.
REFERENCES
 https://en.wikipedia.org/wiki/Unmanned_aerial_vehicle
 https://en.wikipedia.org/wiki/Quadcopter
 https://en.wikipedia.org/wiki/Nixie_(drone)
 https://www.amazon.com/Amazon-Prime-Air/b?ie=UTF8&node=8037720011
 Durham K. Giles and Ryan C. Billing, "Deployment and Performance of a UAV for Crop
Spraying", Department of Biological & Agricultural Engineering, University of California,
https://www.aidic.it/cet/15/44/052.pdf
 Timothy Paul D. Picar, University of Mindanao, College of Engineering Education,
https://www.academia.edu/38769537/FINAL_NOV_20_Drone_Thesis_Body_Corrected
 "Freyr Drone: Pesticide/Fertilizers Spraying Drone - An Agricultural Approach", by
Spoorthi S., B. Shadaksharappa, Suraj S., V. K. Manasa; Tucker, Compton J.,
http://anveshanaindia.com/wp-content/uploads/conference/paper/freyr-drone-pesticidefertilizers-spraying-drone-an-agricultural-approach-ms-spoorthi-s.pdf
 "Unmanned Systems of the United States Military" by David J. Bryant (2015): provides an
overview of various unmanned systems used by the U.S. military, including spy drones.
 "Remote Warfare: The Evolution of Unmanned Aerial Systems" by Randy Vance (2019):
explores the evolution of unmanned aerial systems, including spy drones, and their impact on
modern warfare.
 "Drones: The Complete Manual" by Aaron Southwick (2017): covers various aspects of
drone technology, including surveillance drones and their applications.
 "The Future of Drone Use: Opportunities and Threats from Ethical and Legal Perspectives"
edited by Bart Custers, Simone van der Hof, and Marco Spruit (2016): discusses ethical and
legal considerations surrounding drone use, including surveillance and espionage drones.