Birla Institute of Technology & Science

Pilani, KK Birla Goa Campus

Design Report
Team No: 082

Regn No: AT2024-082


APPENDIX A

STATEMENT OF COMPLIANCE
Certification of Qualification

Team Name: Artemis

University/Institute: Birla Institute of Technology & Science, Pilani, KK Birla Goa Campus

Faculty Advisor:

Faculty Advisor's Email:

Statement of Compliance

As Faculty Advisor, I certify that the registered team members are enrolled in collegiate
courses. This team has designed the UAS for the SAE AEROTHON 2024 contest
without direct assistance from professional engineers, R/C model experts or pilots, or
related professionals.

Signature of Faculty Advisor                                        Date

Team Captain Information:

Team Captain's Name:

Team Captain's E-mail:

Team Captain's Phone:

Note:
A copy of this statement needs to be included in your Design Report as page 2


0 TABLE OF CONTENTS

1 Executive Summary 3

2 Management Summary 3
2.1 Team Organisation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3

3 Conceptual Design 4
3.1 Mission Requirements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4

4 Mechanical Design Requirements 5


4.1 Drone Configuration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
4.2 Choice of Materials . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
4.3 Weight Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
4.4 UAS Sizing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
4.5 Centre of Gravity Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
4.6 Structural Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
4.7 Computational Fluid Dynamics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
4.8 Testing . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13

5 Manual Operations 14
5.1 Choice of Electronics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
5.2 Choice of Onboard Computer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
5.3 Choice of Flight Controller . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
5.4 Camera . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
5.5 Choice of Communication Modules . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
5.6 RTK-GPS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
5.7 Mission Planner . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
5.8 Power Module . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
5.9 Battery . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
5.10 ELRS TX/RX . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
5.11 Analog and digital VTX . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
5.12 Telemetry module . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19

6 Autonomous Operations 20
6.1 System Design for Communication of Survey Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
6.1.1 Format Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
6.1.2 Transmission Mechanisms for Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
6.1.3 Security of Data Storage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
6.1.4 Performance Estimations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
6.2 Methodology for Autonomous Operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
6.2.1 Procedure of Autonomous Flight Operation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
6.2.2 Procedure for Object Identification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23


6.2.3 Procedure for Payload Drop . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23


6.3 Exploration Algorithms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
6.3.1 Grid size estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
6.3.2 Boustrophedon Search . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
6.3.3 Spiral Search . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
6.3.4 Simulation Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
6.4 Computer Vision . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
6.4.1 Haar Cascade Classifier . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
6.4.2 Tiny-YOLO v4 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
6.4.3 Faster R-CNN . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
6.4.4 Counting of Objects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
6.5 Drone Control System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
6.5.1 Frameworks for Drone Control . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
6.5.2 Features of the Control System . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26

7 Build 27

8 Optimized Final Design 28

9 Innovation 29

10 Detailed Exploded View 30

11 Appendix 31
11.1 PID Tuning using Mission Planner . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
11.2 Internet Speed Test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
11.3 Grid Size Estimation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
11.4 Bill of Materials . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
11.5 References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
11.5.1 Mechanical Operations- . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
11.5.2 Manual Operations- . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
11.5.3 Autonomous Operations- . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35


1 EXECUTIVE SUMMARY
This report documents the design, fabrication and testing of Team Artemis’s drone for the SAE India AeroTHON competition.
The objective of the competition is to design, build and fly a multirotor uncrewed aircraft system that can survey, classify
objects and deliver payloads. The UAS must be below 2kg in weight and must successfully carry out 4 missions.
Project Objectives
The team’s primary objective is to create a high-performance drone capable of competing in various challenging missions at
Aerothon 2024. These missions test the drone’s capabilities in areas such as navigation, payload delivery, endurance, and
autonomous operations. By participating, the team aims to showcase its technical ability, innovative thinking, and collaborative skills.
Development Process
The project commenced with an extensive research phase, during which the team analyzed previous competition entries
and emerging trends in drone technology. This informed our design decisions and allowed us to set realistic performance
targets. The design phase involved creating detailed CAD models, performing simulations, and conducting feasibility
studies to ensure our drone met the required specifications. The manufacturing phase saw these designs come to life
using advanced fabrication techniques, ensuring precision and reliability.
Innovation and Testing
A significant focus of the project has been on innovation. The team attempted to incorporate its self-made flight controller
based on the STM32 microcontroller, and also trialled toroidal propellers in place of the usual bi-blade propellers.
The team innovatively manufactured the carbon fiber plates in-house using CNC machines, which allowed us to achieve
customization and develop our expertise in cutting-edge manufacturing techniques of composites. The drone has
undergone multiple iterations of testing, including flight simulations and real-world flight trials. These tests have been
crucial in identifying and addressing potential issues, ensuring the drone performs reliably under competitive conditions.
Outcomes and Benefits
The ”Aerothon 2024” project has provided the team with invaluable hands-on experience, bridging the gap between
theoretical knowledge and practical application. It has fostered a spirit of innovation and teamwork, preparing us for future
careers in engineering and technology.

2 MANAGEMENT SUMMARY
The 2024 Team Artemis comprises 6 freshmen, 4 sophomores, 1 faculty advisor and external advisors. The team
receives funding from student-run innovation labs and clubs.
2.1 Team Organisation
The team is divided into subsystems as per Figure 1 for efficient workflow. The team captain acts as a central link across
subsystems and serves as a bridge between the team and the administration. The autonomous team is responsible for
integrating all computer vision, path planning, object detection and image processing algorithms. The manual team is
responsible for handling all electronic components. The mechanical team is responsible for the structural CAD design,
and overall manufacturing of the drone frame.


Figure 1: Team Organisation

3 CONCEPTUAL DESIGN
The design of the drone has been divided into 3 subsystems - (i) Mechanical (ii) Manual (iii) Autonomous. Each
subsystem's role is covered in the following subsections. The mission briefs are summarised below, and sensitivity
analyses were performed to optimise parameters for maximum scoring.
3.1 Mission Requirements
Mission 1: It is divided into 2 parts, testing the ability of the UAS to survey the field and capture hot spots. In the process,
it must identify, classify and count the number of objects based on their shapes using object detection algorithms. This
tests the ability of the UAS to fly under manual control and assesses the algorithm's ability to analyze images and log
them against their timestamps.
Mission 2: It tests the autonomous abilities of the UAS by capturing hot spots using computer vision and image processing.
This mission assesses the ability of the UAS to fly without any human or manual intervention, therefore evaluating its
computer vision algorithms.
Mission 3: It tests the agility of the UAS in an attempt to navigate an obstacle course with a payload, without damaging
either the payload or the UAS. This mission assesses the overall stability of the UAS with a suspended payload and
overall response to inputs from the pilot.
Mission 4: It tests the agility, computer vision, object detection and image processing algorithms in order to allow the UAS
to capture hot spots, deliver the payload at the required target and return to base. This mission is also divided into 2 parts,
wherein the UAS first has to survey the entire field and then deliver the payload at the target after running its algorithms.

The design phase began with an in-depth understanding of the mission requirements, scoring, and competition rules and
regulations. Accordingly, the dimensions, battery pack, flight controller and subsystems were finalised based on overall
feasibility and weight limitations.


4 MECHANICAL DESIGN REQUIREMENTS


A quadcopter configuration was chosen due to the weight constraint of 2 kg. A hexacopter configuration would have added
the weight of 2 additional arms and motors along with additional mounts, while a tricopter is lighter but less stable than a
quadcopter.
4.1 Drone Configuration
The most conventional configurations include tricopter, quadcopter, hexacopter and octacopter. A weighted analysis, as
per Table 1, was performed to choose the most suited configuration.

Parameter Tricopter Quadcopter Hexacopter Octacopter


Stability 0 1 2 3
Weight 3 2 1 0
Maneuverability 2 3 1 0
Efficiency 3 2 1 0
Total: 8 8 5 3

Table 1: Configuration Matrix

While the tricopter and quadcopter scored equally in the weighted analysis, a quadcopter configuration was chosen
due to its superior stability and ease of maneuverability, especially for autonomous missions to avoid any interference
with object recognition or image processing algorithms due to instability of the drone. Quadcopter configurations are also
easier for the pilot to maneuver in manual missions.
4.2 Choice of Materials
• Plates: The initial prototype consisted of laser cut plywood plates. Ultimately they were replaced with 1 mm carbon
fibre sheets which were CNC machined to achieve high precision. Carbon fibre was chosen due to its superior
strength and low weight.

• Arms: The 4 arms of the drone were made from square roll-wrapped carbon fibre tubes (20 mm outer, 18 mm inner
side length), chosen for their increased strength and because the flat faces resist twisting and are easier to constrain.
A circular arm would be only about 20 g lighter overall, while the square cross-section provides more strength. This has
been depicted as per Figure 2 (a).

• Mounts and Hub: The motor mounts, landing gear and central hub were 3D printed using Polylactic Acid (PLA)
and a gyroidal infill. PLA, along with the infill pattern, was chosen due to its strength. The team will be testing
Polycarbonate (PC) filament and this will be viewed as a potential alternative to PLA. The central hub involved 4
slots to slide the carbon fibre rods into, and the rods were secured by screwing into them. To prevent the holes in
the rods from expanding, the regions surrounding the holes were reinforced with Araldite.

• Propellers: The team experimented with toroidal propellers, which proved more efficient and quieter in early tests;
this testing is still in progress. Carbon-fibre-filled PLA was used to 3D print these propellers because of their delicate
geometry and the need for a high surface finish. Ultimately, conventional propellers were selected because the toroidal
ones failed to generate sufficient thrust in practical evaluations and raised durability concerns.


(a) Square vs Circular C/S (b) Weight Bifurcation

Figure 2: Carbon Fibre Strength and Drone Weight Bifurcation

4.3 Weight Distribution


The table of weights is shown in Table 2. These weights are depicted as per Figure 2 (b).

Component Number Weight per component(g) Total weight(g)


Motors 4 90 360
ESC 4 26 104
Dropping mechanism 1 25 25
Propellers 4 10 40
Battery 1 400 400
Cube Orange 1 75 75
GPS 1 40 40
Raspberry Pi 1 50 50
Camera 1 100 100
O3 air unit 1 40 40
RX+Telemetry 1 20 20
Dongle 1 30 30
Payload 1 200 200
Frame 1 316 316
Total 1800

Table 2: Weights Table


4.4 UAS Sizing


• Wheel base and Propeller Clearance: The wheelbase of the frame was chosen to be 420 mm, the smallest frame size
that does not compromise the maximum thrust achievable with larger propellers. This minimises weight while ensuring
that the propellers cannot touch or interfere with any other components; the frame must leave sufficient space for
propeller movement to maintain a safe clearance (a quick clearance check is sketched after this list). This is shown in
Figure 3.

Figure 3: Propeller Clearance

• Central Attachment (Hub) The hub is the central part of the UAS where the arms connect. It serves as the
structural core, distributing loads and stresses evenly across the drone. The hub of the drone was 3D printed using
PLA plastic and 25% infill to provide sufficient strength. It also ensured that the arms are correctly aligned with
respect to the plates. This is shown in Figure 4.

Figure 4: Hub placement and dimensions


• Landing Gear: The landing gear is the structure that supports the UAS on the ground and during takeoff and landing.
Properly sized landing gear protects the UAS's components and ensures that the camera does not touch the ground. The
landing gear height must also leave enough space to mount the payload. The team designed 4 landing gear legs, one at
the end of each arm, to ensure stable landing and take-off. This is shown in Figure 5.

Figure 5: Landing gear dimensions

• Rotor Arm The rotor arms are the extension from the hub to which the motors and propellers are attached. They
determine the overall span of the UAS and play a critical role in structural integrity and aerodynamics. The rotor
arm can either have a square or a circular cross section. The team chose to go with square cross-section as per
reasons pointed out in Figure 2 (a).

• Dropping Mechanism: The dropping mechanism, Figure 6, was implemented in the simplest way possible to achieve the
lowest weight, while ensuring that the payload cannot fall unintentionally. The mechanism works as follows:
(i) The payload is strapped with 2 rubber bands, which are hooked onto a custom 3D printed hook.
(ii) The hook is attached to a servo motor by a push rod passed through a loop on the hook; the push rod effectively
connects the servo and the hook.
(iii) To release the payload, the servo motor is turned, which unhinges the push rod from the hook. This sets the rubber
bands loose and, in turn, releases the payload.

Figure 6: Payload Dropping
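As a quick check of the clearance argument in the wheelbase bullet above, the following is a minimal sketch assuming the 420 mm diagonal wheelbase stated above and the 10-inch propellers shown in Figure 12; the helper names are ours, not part of the design tooling.

import math

def adjacent_motor_spacing_mm(wheelbase_mm: float) -> float:
    # For a symmetric quad (X configuration), the diagonal motor-to-motor
    # distance is the wheelbase; adjacent motors sit wheelbase / sqrt(2) apart.
    return wheelbase_mm / math.sqrt(2)

def tip_clearance_mm(wheelbase_mm: float, prop_diameter_in: float) -> float:
    prop_diameter_mm = prop_diameter_in * 25.4
    return adjacent_motor_spacing_mm(wheelbase_mm) - prop_diameter_mm

if __name__ == "__main__":
    clearance = tip_clearance_mm(wheelbase_mm=420, prop_diameter_in=10)
    print(f"Clearance between adjacent propeller tips: {clearance:.0f} mm")
    # ~43 mm of tip-to-tip clearance with 10-inch propellers on a 420 mm frame.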


4.5 Centre of Gravity Analysis


The centre of gravity (CG) affects flight performance and maneuverability. The CG was located using the Inspect feature
of Fusion 360 after assigning appropriate material properties. This provided the CG coordinates (X, Y, Z) relative to the
defined origin of the assembly. To ensure that the calculated CG was optimal, its location was compared to the drone's
geometric centre. The placement of heavy components such as the battery and flight controller was adjusted to fine-tune
the CG location, aiming for a symmetric arrangement of components. This is shown in Figure 7.

Figure 7: Three view of CG

4.6 Structural Analysis


In this section, the structural and modal analyses carried out by the team on the drone frame using the SolidWorks solver
are presented. The analyses performed include total deformation, equivalent Von Mises stress and modal analysis.
1. Total Deformation- A static structural analysis is carried out to determine this by considering non-linear deformation
based on the material and loads applied. An upward force of 20 N is applied on all four motor positions to simulate thrust.
Along with this, the self weight of the drone, approximately 20 N, is also applied downwards. Fixed constraints were applied
at the points where the frame would be attached to other structures. This results in a maximum deformation of only
1.246e-02 mm, which is expected given the small applied forces and the high tensile and yield strength of the carbon fibre
used.
2. Equivalent Von Mises Stress- This is used to determine the distribution of stress within the drone frame under the
applied loads. It helps in identifying the critical stress points that could lead to material failure.


(a) Total Deformation (b) Equivalent Von Mises Stress

Figure 8: (a) Total Deformation and (b) Equivalent Von Mises Stress

The maximum stress value observed is 4.905e+06 Pa which is well below the yield strength of the carbon fibre material
used, indicating that the material will not fail under the given loads.
3. Modal Analysis- Modal analysis is performed to determine the natural frequencies and mode shapes of the drone frame.
This analysis is essential for understanding the vibrational characteristics and ensuring that the operational frequencies do
not coincide with the natural frequencies, which could lead to resonance and hence damage to the frame. The analysis
was carried out by solving the eigenvalue problem for the drone frame. The natural frequencies obtained are tabulated in
Figure 9.

Figure 9: Modal Analysis


RPM of the rotor, N = 8000 RPM

ω = 2πN/60 = 2π × 8000/60 ≈ 840 rad/s
f = N/60 ≈ 133.33 Hz
Therefore, it can be concluded that the operating frequency is much lower than the lowest natural frequency of the drone
frame, ensuring that resonance would not be a concern during typical flight conditions. These analyses validate the design
of the drone frame, ensuring reliability and performance during operations.
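The following is a short sketch of the resonance check described above; the natural-frequency list is a placeholder standing in for the modal results tabulated in Figure 9.

import math

ROTOR_RPM = 8000
natural_freqs_hz = [450.0, 610.0, 820.0]   # placeholder values; substitute the modal results from Figure 9

omega = 2 * math.pi * ROTOR_RPM / 60       # angular speed, rad/s (~840 rad/s)
excitation_hz = ROTOR_RPM / 60             # 1-per-rev excitation frequency (~133.3 Hz)

margin = min(natural_freqs_hz) / excitation_hz
print(f"omega = {omega:.0f} rad/s, excitation = {excitation_hz:.1f} Hz, "
      f"lowest mode / excitation = {margin:.2f}")
# A margin comfortably above 1 indicates the 1-per-rev forcing lies below the first natural frequency.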

4.7 Computational Fluid Dynamics


Computational Fluid Dynamics (CFD) simulations are essential to analyse the aerodynamic performance of the drone. In
this section, the team details the CFD simulations conducted using SimScale to investigate parameters such as pressure
distribution, velocity field and flow visualisation around the drone. By employing quarter symmetry, we optimised the
computational resources and reduced run time while maintaining accuracy. An incompressible flow simulation was created,
as the maximum Mach number remains below 0.3 (flow speeds of roughly 100 m/s or less at standard conditions).

1. Model Preparation For simplicity, 1/4th symmetry was used. This approach leverages the geometric symmetry of
the drone, allowing us to simulate only one-fourth the drone’s volume and extrapolate the results to the entire model.
The cylinders modelled around the propeller represent the rotational zone. The volume represented by the blue cuboid
represents the external flow volume.
2. Mesh Generation An unstructured mesh was generated around the drone. Mesh refinement was also employed near
the rotor blades and body to capture detailed flow characteristics.

(a) Model Preparation (b) Mesh Generation


3. Boundary conditions
a. Drone Surface: No-slip wall condition is used for the drone faces.
b. Symmetry Planes: Two planes of symmetry have been assigned as per the configuration.
c. Atmosphere: The flow direction at the faces open to the atmosphere is not predetermined, hence the 'Pressure inlet-outlet
velocity' condition was selected for the velocity (U) setup. The gauge pressure is set to 'Total Pressure' with a value of 0
Pa, corresponding to atmospheric pressure. This allows the solver to automatically determine the flow direction.
d. Rotating Zone: A cylinder around the propeller was modelled to simulate the rotating zone, and an angular velocity of
840 rad/s was imparted to the zone.
4. Simulation Parameters
Turbulence Model: The k-ε turbulence model is used for its robustness and efficiency in predicting turbulent flow
characteristics.
Flow Conditions: Simulations were carried out under standard atmospheric conditions using the Newtonian viscosity
model. Air density was considered as 1.196 kg/m³.

(c) Pressure Distribution (d) Velocity Magnitude

(e) Velocity (Z) (f) Turbulent Kinematic Viscosity

Figure 10: CFD Results


4.8 Testing
There were 3 major design choices that required testing - (i) Toroidal propellers vs Conventional 2-blade propellers, (ii) PC
vs PLA filament for 3D printed parts and (iii) Plywood vs Carbon Fibre plates for the main frame.
(i) Toroidal propellers vs Conventional propellers: Table 3 depicts a ranking between the two options. The toroidal
propellers offered the advantage of higher efficiency, but the team has currently decided to move forward with conventional
propellers due to inadequate data on the toroidal design.

(a) First Iteration (b) Second Iteration (c) Third Iteration

Figure 11: Testing with Toroidal Propellers

Figure 12: Chosen Propeller: 10X4.5

Parameter Toroidal Conventional Propeller


Efficiency 1 0
Noise 1 0
Weight 1 1
Thrust Produced 0 1
Total: 3 2

Table 3: Propeller Matrix

(ii) PC vs PLA filament: PC filament is known to be among the strongest filaments for 3D printing. The motor mount was
3D printed in both materials while keeping all other parameters constant. Physical testing showed that the PLA part was
stronger in this respect; however, this could be attributed to printing flaws. Further testing could not be carried out due to
external factors, but will be performed in the near future. For now, the team has decided to move forward with PLA.
(iii) Carbon Fibre vs Plywood plates: Carbon fibre sheets offer a superior strength-to-weight ratio, roughly 10 times that
of plywood. With tensile strengths of 600-1500 MPa and yield strengths of 500-1400 MPa, carbon fibre outperforms
plywood's 40-80 MPa tensile and 20-40 MPa yield strengths. Although carbon fibre is denser (about 1.5 g/cm³ vs.
0.6-0.7 g/cm³ for plywood), its far higher strength allows much thinner, lighter plates for the same stiffness, and it resists
moisture, UV, and temperature damage better than plywood. Despite its higher cost, carbon fibre significantly enhances
drone performance and longevity, making it the preferred choice for this UAS.


Figure 13: Plywood and Carbon fiber plates side by side

5 MANUAL OPERATIONS
The manual subsystem dealt with the entire electronic configuration of the drone: the choice of flight controller and its
integration with auxiliary parts such as the camera, Raspberry Pi and 4G dongle, collecting telemetry data, setting
geofencing parameters and the overall fine tuning of the PID values. The team also aimed to develop a custom flight
controller based on the STM32 Blue Pill. However, this is still under development owing to the complexity of the project,
so the Cube Orange serves as a substitute in the meantime.
5.1 Choice of Electronics
(i) Brushless Motors A motor was selected with focus on efficiency and thermal management. After meticulous analysis
using eCalc data, the BrotherHobby Avenger 2816 1050KV motor, Figure 14 (a), was chosen. This motor was selected
to prioritize extended flight times and minimize motor heat buildup, which was crucial for reliable drone operation. Heat
generation was also minimized to protect 3D-printed mounts from potential deformation, ensuring the structural integrity
of the entire system.
(ii) ESCs (Electronic Speed Controllers) An ESC is an electronic circuit used to change the speed of an electric motor
precisely. The HobbyWing XRotor 40A ESC, Figure 14 (b), was chosen to optimise weight and space requirements on the
frame. It was also preferred for its higher headroom, which avoids overheating in the event of current surges.

(a) Motor (b) ESC (c) Raspberry Pi

Figure 14: (a) Motor, (b) Electronic Speed Controller and (c) Raspberry Pi

5.2 Choice of Onboard Computer


The options that were available to the team were (i) Raspberry Pi 5 (ii) Nvidia Jetson. The Raspberry Pi 5 was chosen
over the Nvidia Jetson for the following reasons:
1) Lower weight of the module
2) Sufficient computing power available through Raspberry Pi 5


5.3 Choice of Flight Controller


The options that were available to the team were (i) Pixhawk 2.4.8 (ii) Cube Orange (iii) Teensy 4.0 (iv) STM32 BluePill.
The Cube Orange was chosen due to the availability of active PID tuning, ease of integration with peripherals and
presence of an overall better processor. The team is developing a custom flight controller using STM32 Bluepill and
Teensy 4.0 microcontroller boards, having created several prototypes. However, these prototypes have not yet achieved
the necessary reliability for drone deployment. Meanwhile, the Cube Orange, Figure 15, remains the best option due to its
dependability and advanced capabilities.

1. Processor: 32-bit STM32H753VI (ARM Cortex-M7, 400 MHz, 1 MB RAM).
2. Sensors: 3 redundant IMUs, 2 MS5611 barometers, and a compass.
3. Safety features: integrated backup system for in-flight recovery and manual override; standalone power supply.
4. Interfaces: IO ports with 14 PWM servo outputs, 5x UART (serial ports), 2x CAN, RSSI.

(a) Cube Orange (b) STM32 Flight Controller

Figure 15: (a) Cube Orange and (b) STM32 BluePill FC

The ranking is shown in Table 4

Parameter Pixhawk Cube Orange Teensy 4.0 STM32 BluePill


Ease of integration 1 1 0 0
Stability 3 4 2 1
Active PID tuning 1 0 0 1
Reliability 1 1 0 0
Total: 6 6 2 2

Table 4: Flight Controller Matrix

5.4 Camera
The Siyi A8 mini camera (Figure 16(a)) was chosen for its compact design, lightweight construction, and respectable
image quality. The camera offered HDMI, Ethernet (RTSP), and CVBS (analog) outputs. Since Ethernet and analog do
not work together, HDMI will be used to stream high-quality video to the Raspberry Pi, while CVBS will be used for the
VTX to ensure low latency for real-time manual operation. This setup supported both manual and autonomous operations.


(a) Camera (b) 4G SIM Dongle

Figure 16: (a) Camera and (b) 4G SIM Dongle

5.5 Choice of Communication Modules


The options available were (i) 4G SIM Dongle (ii) LoRa module (iii) RasPi 4G Hat. The 4G SIM dongle (Figure 16 (b)) was
relatively easier to integrate with the Cube Orange than the 4G Hat, while the LoRa modules suffered from high latency,
which would affect transfer and communication times. A ranking as per Table 5 was performed.

Parameter Dongle LoRa module RasPi 4G Hat WiFi Router


Range 2 3 2 0
Latency 2 0 1 3
Ease of integration 3 0 1 2
Data Transfer Rate 3 1 2 1
Total: 10 4 6 6

Table 5: Communication Module Matrix

5.6 RTK-GPS
In order to achieve centimeter-level accuracy for the drone, the team adopted a dual Here 1 RTK GPS setup. This
was chosen over alternatives like the M8N and M9N due to their metre level accuracy. The dual module design further
increased reliability, mitigating the risk of mission critical GPS failures.

(a) Base Station (b) GPS

Figure 17: (a) Base Station and (b) Here GPS


5.7 Mission Planner


Mission Planner, a robust ground control station software for UAVs, offers advanced flight planning, real-time monitoring,
data analysis, and customization. It facilitates wireless connection to ArduPilot through Telemetry Modules.
Heads-up Display and Flight Data Screen- The heads-up display (HUD) of the mission planner ground station provides
detailed views of flight parameters, Heading direction, Bank angle, Telemetry connection link quality and more. FENCES
in Mission Planner- Mission Planner has a FENCES option as shown in Figure 18 for creating circular or polygonal
fences instead of traditional mission planning, which is crucial for geofencing. It also displays all parameters and sensor
controls of the connected flight controller, allowing for tuning and activation/deactivation of onboard sensors. This level of
control helps optimise the drone's performance for each competition leg.

(a) HUD (b) Geofences (c) RTK

Figure 18: (a) HUD, (b) Geofences and (c) Real-time Kinematics

RTK GPS and Fixed Baseline- Mission Planner facilitated the integration and utilisation of RTK (Real Time Kinematics)
GPS by providing precise configuration options, real-time data monitoring, and correction stream management as per
Figure 18. The "Fixed Baseline" feature of Mission Planner used RTK GPS to improve the position estimate from the
metre-level range of normal GPS down to the centimetre range (estimated at about 1.5-2 cm during testing).

5.8 Power Module


The PM-02 power module was chosen for the Cube Orange to monitor voltage and current from the LiPo cells, essential
for safety features like RTL (Return to Launch) in case of low power or critical battery discharge. It employs a hall-effect
current sensor for accuracy, lower power consumption, and temperature immunity. It can sustain 100 A continuously and
handle a maximum burst current of 800 A at 85°C for 1 second, comfortably above the currents observed during UAV testing.

(a) Power Module (b) Battery

Figure 19: (a) Power Module and (b) Lipo Battery


5.9 Battery
The choice of battery for the competition is a single 4-cell (4S) 4500 mAh (35C/50C) Lithium Polymer battery. eCalc
analysis estimated a flight time of about 10.5 minutes, which was deemed sufficient for each leg of the competition, while
maintaining a high thrust-to-weight ratio (2.7). The team may switch to a higher-capacity LiPo battery in further iterations
of testing.
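The following is a back-of-the-envelope hover-time sketch consistent with the eCalc estimate above; the average hover current used here is an assumed figure, not a measured value.

def hover_time_min(capacity_mah: float, usable_fraction: float, avg_current_a: float) -> float:
    # Usable charge (Ah) divided by average current draw (A) gives hours; convert to minutes.
    return capacity_mah / 1000 * usable_fraction / avg_current_a * 60

if __name__ == "__main__":
    # 4S 4500 mAh pack, 80% usable depth of discharge, ~20 A assumed average hover current.
    print(f"Estimated hover time: {hover_time_min(4500, 0.8, 20.0):.1f} min")
    # ~10.8 min, in line with the ~10.5 min eCalc estimate for the full system.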

5.10 ELRS TX/RX


The team switched from a Flysky transmitter to the RadioMaster Boxer MAX radio transmitter (ELRS 2.4 GHz M2-BLACK,
Figure 20) because of the advantages of the ExpressLRS (ELRS) system. ELRS offers longer range, better signal
penetration, lower latency, and higher update rates, which improve control for manual missions. It also provides advanced
monitoring capabilities with real-time feedback, and its open-source nature allows for customisation and community support.

(a) ELRS Transmitter (b) ELRS Receiver

Figure 20: (a) Transmitter and (b) ELRS Receiver

5.11 Analog and digital VTX


To choose between analog and digital video transmitters (VTX), the team conducted a thorough evaluation based on
various criteria; a ranking as per Table 6 was performed. The criteria prioritised, such as video quality, latency,
compatibility, and price, favoured analog transmission. This choice aligns with the project's requirements and budget
constraints, ensuring optimal performance and cost-effectiveness.

Parameter Analog VTX Digital VTX


Video Quality 0 1
Latency 1 0
Range 0 1
Compatibility 1 0
Ease of Use 1 0
Flexibility 1 0
Total: 4 2

Table 6: VTX Matrix


(a) Telemetry (b) Video Transmitter

Figure 21: (a) Telemetry and (b) Video Transmitter

5.12 Telemetry module


A telemetry module was included to enable real-time data transmission and enhance overall control and monitoring
capabilities. Vital flight parameters, system status, and environmental data can be tracked, ensuring safe and efficient
operations. Additionally, data logging for post-flight analysis was facilitated, allowing performance optimization and issue
troubleshooting.

The data from eCalc is shown in Figure 22 (a) and (b).

(a) eCalc inputs

(b) Data

Figure 22: eCalc


6 AUTONOMOUS OPERATIONS
The autonomous subsystem dealt with developing the computer vision, object detection, image recognition and path
planning algorithms. The team's goal was to achieve controlled flight without human intervention.
6.1 System Design for Communication of Survey Data
Survey data was stored on the Companion Computer before being securely transmitted to the cloud. It could then be
retrieved by a user with sufficient credentials. The data would be uploaded during flight using 4G, with data items being
assigned different priorities to enable sending time critical information first, to make efficient use of limited bandwidth.

Figure 23: System Design for Capturing Survey Data

(a) Communication between a static IP and a 4G-connected drone is allowed by CG-NAT. (b) A cloud server is required to
establish the initial connection, as the drone and GCS do not know each other's IPs due to CG-NAT.

Figure 24: A cloud server was needed due to CG-NAT and lack of static IP during competition.

6.1.1 Format Selection


A. Images: As training data for the CV models had not yet been released, it was unclear whether training on lossless
images whilst inferencing on lossy images would significantly reduce accuracy. Hence, a lossless format (PNG) was
chosen. Lossy formats (JPEG) may be considered after the release of the training data and further performance testing of
the final models.


B. Videos: The video streaming codec has a significant impact on compute performance, since video data is processed for
the entire duration of the flight. Choices were narrowed down to two popular codecs. The H265 codec was preferred over
H264 as it provides smaller file sizes. Since the amount of compute headroom is unknown, further testing is required to
evaluate whether the higher processing cost of H265 is worth the reduction in file size. The Matroska format was chosen
as the container for its versatility; corrupted Matroska video files have usually been easier to recover after unexpected
recording stops, so critical survey data could potentially be recovered.

6.1.2 Transmission Mechanisms for Data


A. Real Time Video Feed: Real time video feed would be obtained on the base station by two mechanisms. The Analog
VTX, and Streaming of Video Data via Cloud Server. Video streaming via the cloud would be accomplished using Open
Source Streaming software, using variable bitrate (VBR) streaming to conserve network bandwidth wherever possible.
B. Survey Capture Data: A Preferential Queuing system for data transmission was set up on the Companion Computer
to ensure time critical data would be sent over first, whilst less important data would be sent when network bandwidth was
free. This would be accomplished through a message proxy node (Figure 25) which would receive all outbound data.
Data that is not considered important will not be transmitted during flight and may be retrieved manually via SCP/SFTP.

Figure 25: Message Proxy node decides what to transmit based on the size and priority of Data in the Queue. The priority
level of each data type must be specified before flight.
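To illustrate the preferential queuing handled by the message proxy node, the following is a minimal sketch using Python's heapq; the data types and priority levels are illustrative, not the exact ones used on the drone.

import heapq
import itertools

class MessageProxy:
    """Queues outbound items and always hands back the highest-priority one first."""
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker keeps FIFO order within a priority level

    def enqueue(self, priority: int, payload: bytes, label: str):
        # Lower number = higher priority, so hotspot captures go out before bulk video.
        heapq.heappush(self._heap, (priority, next(self._counter), label, payload))

    def next_to_send(self):
        if not self._heap:
            return None
        priority, _, label, payload = heapq.heappop(self._heap)
        return label, payload

proxy = MessageProxy()
proxy.enqueue(2, b"...", "survey_video_chunk")
proxy.enqueue(0, b"...", "hotspot_image")       # time-critical, sent first
proxy.enqueue(1, b"...", "telemetry_snapshot")
while (item := proxy.next_to_send()) is not None:
    print("transmitting:", item[0])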

6.1.3 Security of Data Storage


A ranking of the various authentication options for security of Survey Data was performed as per Table 7.

Parameter Traditional authentication VPN Key/Token based authentication


Ease of setup 2 1 0
Security 0 2 1
Scalability 0 1 2
Total: 2 3 3

Table 7: Authentication Matrix

Ultimately, a VPN was chosen over another method of authentication as it was simpler to set up for a single device. The
VPN also allowed a static IP to be assigned to the Drone within the Virtual Private network which had Quality of Life
benefits when writing software. The choice of VPN was primarily based on ease of setup since only basic functionality was
required. Wireguard was chosen for its integration with the Linux Kernel and availability of Tutorials and guides for setup.


6.1.4 Performance Estimations


Size estimations for Video and Images
A. Video Streams: An exact size cannot be obtained for video data encoded in H264/H265. Video samples of a field,
encoded using H264 at 720p@30fps, are 10 Megabytes in size for 10 seconds of footage. Assuming similar compression
quality holds at lower frame rates, a bitrate of 1 Mbps would be sufficient for live streaming.
B. Raw Image Files: For uncompressed image data using 3-channel 8-bit colour, the size of a 4096 × 2160 resolution
image is 4096 × 2160 × 3 bytes, about 26 Megabytes. A conservative compression ratio of 75 percent was assumed,
since the image was expected to have a mostly homogeneous background with at most one hotspot. With a typical
4 Mbps connection, such a compressed image would be uploaded in on the order of 10 seconds.
Time to upload is given by the formula:
T (seconds) = (S × 8) / C
where T is the time to upload, S is the size in Megabytes, and C is the connection speed in Megabits per second.
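A small sketch of the upload-time estimate follows; the compressed size used in the example is an assumption based on the 75 percent compression figure above.

RAW_IMAGE_MB = 4096 * 2160 * 3 / 1e6   # ~26.5 MB for an uncompressed 3-channel 8-bit frame

def upload_time_s(size_mb: float, speed_mbps: float) -> float:
    # T = S * 8 / C : megabytes -> megabits, divided by link speed in Mbps.
    return size_mb * 8 / speed_mbps

print(f"raw frame:        {upload_time_s(RAW_IMAGE_MB, 4):.0f} s at 4 Mbps")
# Assuming the image compresses by ~75%, roughly a quarter of the raw size remains:
print(f"compressed frame: {upload_time_s(RAW_IMAGE_MB * 0.25, 4):.0f} s at 4 Mbps")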

6.2 Methodology for Autonomous Operations


The Autonomous Operations will be carried out using Robot Operating System 2 (ROS2) running on a Raspberry Pi 5.
The Raspberry Pi will command the drone having a Pixhawk flight controller using MAVROS. The images will be captured
using the Siyi A8 Mini camera and published as a topic in ROS2. Both Deep Learning based and Non Deep Learning
based methods have been tested for Computer Vision and their results are shown in section 6.4. Various exploration
algorithms have been tested for efficient coverage of the field and their simulations are shown in 6.3. ROS2 has been
preferred for carrying out autonomous operations because it provides easy and low overhead abstractions for sending
data to other parts of the ROS2 network, which is critical as the drone has to make multiple decisions at any given moment
to take the next steps during autonomous flight. The simulations have been carried out using ROS2 and Gazebo Garden.
Mission Planner will be used as the Ground Control Station for the operation.

Figure 26: Methodology for Autonomous Operation


6.2.1 Procedure of Autonomous Flight Operation


1. The drone will take off from the specified point and follow an exploration algorithm, flying at an altitude of 15 m.
2. The camera will continuously capture images of the field and pass them to the Computer Vision node in ROS2, which
will determine whether the captured picture contains a hotspot, payload drop marker or object detection zone.
3. If a hotspot is detected, the drone will move above the hotspot and capture a picture at 0 degrees with a timestamp.
After capturing the image, the drone will return to following the exploration algorithm.

6.2.2 Procedure for Object Identification


4. If the object detection zone is detected, the drone will move above the zone and capture multiple pictures to count the
objects; the median of the counts from the different images will be taken to avoid errors. The drone will then return to
following the exploration algorithm.

6.2.3 Procedure for Payload Drop


4. If the payload drop marker is detected, the drone will descend to a height of 5 m, move accurately over the marker and
drop the payload. The drone will then return to following the exploration algorithm.

If all hotspots have been detected along with the counting of objects/dropping of payload, the drone will return to base.
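A simplified sketch of the flight logic in 6.2.1-6.2.3 as a state machine follows; the state names and handler hooks are illustrative placeholders, not the actual ROS2 node interfaces.

from enum import Enum, auto

class State(Enum):
    EXPLORE = auto()
    CAPTURE_HOTSPOT = auto()
    COUNT_OBJECTS = auto()
    DROP_PAYLOAD = auto()
    RETURN_TO_BASE = auto()

SURVEY_ALT_M = 15.0   # exploration altitude from 6.2.1
DROP_ALT_M = 5.0      # descent altitude for payload drop from 6.2.3

def next_state(detection: str, mission_done: bool) -> State:
    """Map the latest CV detection onto the next flight state."""
    if mission_done:
        return State.RETURN_TO_BASE
    if detection == "hotspot":
        return State.CAPTURE_HOTSPOT        # move over hotspot, capture image with timestamp
    if detection == "object_zone":
        return State.COUNT_OBJECTS          # capture several frames, take the median count
    if detection == "drop_marker":
        return State.DROP_PAYLOAD           # descend to DROP_ALT_M and release the payload
    return State.EXPLORE                    # otherwise keep following the exploration pattern

print(next_state("hotspot", mission_done=False))   # State.CAPTURE_HOTSPOT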

6.3 Exploration Algorithms


Various exploration algorithms were tested and their simulations are shown below. The best approach will be decided after
testing on the drone before Phase-2 due to several real world factors which might remain unaccounted for in simulations.

6.3.1 Grid size estimation


An estimate of the approximate capture area was obtained mathematically in 11.3. The dimensions of the capture area
were found to be 25.62 × 13.51 m. This data has been used in simulations and to estimate the number of search grids.
The buffer length is a reduction in the capture dimensions to account for hotspots at the edge of an image.
The number of grids is given by:
Number of grids = L / (25.62 − n) × W / (13.51 − n), rounded up along each dimension
where n is the buffer length, L is the length of the field (200 m) and W is the width of the field (100 m).
With a 4 metre buffer, the grid size is 11 × 10.
With a 2 metre buffer, the grid size is 9 × 9.
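The grid-count estimate above can be reproduced with the short sketch below (field size 200 m × 100 m, capture footprint 25.62 m × 13.51 m from 11.3).

import math

FIELD_L, FIELD_W = 200.0, 100.0          # field dimensions, m
CAP_L, CAP_W = 25.62, 13.51              # capture footprint at 15 m altitude, m (see 11.3)

def grid_counts(buffer_m: float):
    # Reduce the usable footprint by the buffer, then round up along each axis.
    along = math.ceil(FIELD_L / (CAP_L - buffer_m))
    across = math.ceil(FIELD_W / (CAP_W - buffer_m))
    return along, across

for n in (0, 2, 4):
    a, b = grid_counts(n)
    print(f"buffer {n} m: {a} x {b} grid ({a * b} cells)")
# buffer 0 m: 8 x 8, buffer 2 m: 9 x 9, buffer 4 m: 10 x 11 (reported as 11 x 10 above)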

6.3.2 Boustrophedon Search


The map was subdivided into a grid and the drone was moved to a corner of the grid. The longest edge of the map was
picked and the drone was flown along it while searching for hotspots. On reaching the opposite end, the drone moved up
or down to the next strip of grids. This process was repeated until the whole area had been surveyed.
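A minimal waypoint-generation sketch for the boustrophedon pattern described above follows; the strip spacing and altitude come from the survey parameters, everything else is illustrative.

def boustrophedon_waypoints(length_m, width_m, strip_spacing_m, alt_m=15.0):
    """Yield (x, y, z) waypoints sweeping the long axis and stepping across the short axis."""
    n_strips = int(width_m // strip_spacing_m) + 1
    for i in range(n_strips):
        y = min(i * strip_spacing_m, width_m)
        xs = (0.0, length_m) if i % 2 == 0 else (length_m, 0.0)  # alternate sweep direction
        for x in xs:
            yield (x, y, alt_m)

# 200 m x 100 m field, ~9.5 m lateral spacing (capture width minus a 4 m buffer).
waypoints = list(boustrophedon_waypoints(200.0, 100.0, strip_spacing_m=9.5))
print(len(waypoints), "waypoints, e.g.", waypoints[:3])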

6.3.3 Spiral Search


The map was subdivided into a grid and the drone was moved to a corner of the grid. The longest edge of the map was
picked and the drone was flown along that edge while searching for hotspots. On reaching the opposite end, the drone
rotated 90 degrees and moved along the edges in a spiral manner.


(a) Boustrophedon Search (b) Spiral Search

Figure 27: Search Algorithms

6.3.4 Simulation Data

Buffer Length (m) Grid size (Boustro) Traversal time (Boustro) No. of Grids (Spiral) Traversal time (Spiral)
0 8x8 252s 76 356s
2 9x9 357.5s 97 406s
4 11x10 448.3s 135 604.9s

Table 8: Simulations in Gazebo show that Boustrophedon outperforms Spiral search in terms of Traversal Time.
Performance of CV Models will ultimately dictate the search algorithm used during Phase-2 as continuous image capture
is easier in spiral search.

6.4 Computer Vision


Various models that will be used in the identification of hotspots and the payload marker were tested and their results are
shown below. These models will be retrained on the datasets of hotspots and the payload marker once the datasets are
released. The results shown are for models trained on a dataset collected from public online sources.

6.4.1 Haar Cascade Classifier


Haar Cascade was considered for its higher FPS over Tiny-YOLOv4 and Faster R-CNN but it failed during complex scenes
which had variations in lighting. Given the uncertain lighting and weather conditions, it was not preferred.

6.4.2 Tiny-YOLO v4
YOLO v4 and Tiny-YOLO v4 models were considered for the identification of targets, as they were faster and more
accurate considering the computational power provided by the Raspberry Pi 5. Tiny-YOLO v4 was finalized due to much
faster response and acceptable accuracy. It offered higher FPS (26.8 FPS), compared to YOLO v4 (1.82 FPS).

6.4.3 Faster R-CNN


The team also tested the Faster R-CNN model for target detection. It offered 1.5 FPS on Raspberry Pi 5 and had a similar
response time as compared to its YOLO counterparts but with considerably lower accuracy.

6.4.4 Counting of Objects


For counting objects, thresholding algorithms were considered. A threshold is applied to the images, with pixels below a
certain value set to zero. The boundary points of the objects and the zero pixels are used to classify each object into
different shapes based on the number of corners detected. The median of the counts from multiple images would be
taken to avoid errors.
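A minimal OpenCV sketch of the thresholding-and-contour approach described above follows; the threshold value and the vertex-count-to-shape mapping are placeholders to be tuned on the released dataset.

import cv2
import numpy as np
from statistics import median

def count_shapes(image_bgr, thresh=127):
    """Threshold the frame, find object contours and bin them by approximate corner count."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    counts = {"triangle": 0, "rectangle": 0, "circle": 0}
    for c in contours:
        if cv2.contourArea(c) < 100:          # ignore noise blobs
            continue
        approx = cv2.approxPolyDP(c, 0.04 * cv2.arcLength(c, True), True)
        v = len(approx)
        if v == 3:
            counts["triangle"] += 1
        elif v == 4:
            counts["rectangle"] += 1
        else:
            counts["circle"] += 1             # many vertices ~ round object
    return counts

# Median over several frames, as described above, to reject per-frame errors.
frames = [np.zeros((480, 640, 3), dtype=np.uint8)]  # placeholder frames from the camera topic
circle_counts = [count_shapes(f)["circle"] for f in frames]
print("circle count (median):", median(circle_counts))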


(a) Haar Cascade Classifier (b) Tiny-YOLO v4 (c) Faster R-CNN

Figure 28: Computer Vision Models. Tiny-YOLOv4 and Faster R-CNN will be tested on the drone before Phase-II and
ultimately the model with higher accuracy and FPS will be chosen.

Figure 29: Counting of objects

6.5 Drone Control System


The drone control system is the autonomous subsystem on the Companion Computer of the drone, in charge of
performing mission objectives autonomously. This control system was designed with safety, performance and
extensibility as its primary goals.

6.5.1 Frameworks for Drone Control

Parameter MAVROS+ROS2 AP DDS+ROS2 Dronekit MAVSDK


Ease of use 1 2 3 0
Message passing capabilities 2 3 0 1
Hardware and compatibility testing 2 1 0 3
Total: 5 6 3 4

Table 9: Frameworks Matrix. These frameworks use MavLink which is a lightweight protocol used to share Flight Data
among Drones.

Various frameworks such as Dronekit, MAVROS, AP DDS, and MAVSDK were considered. Table 9 depicts a ranking of
these frameworks. Although AP DDS with ROS2 ranked first, it was still heavily under development and lacked critical
functionality at the time of writing. Thus, MAVROS with ROS2 was chosen for initial implementations.


6.5.2 Features of the Control System


ROS nodes act as a single point of management for a particular subsystem. To balance performance and safety, only the
subsystems that could cause physical harm to the drone on failure were built with the capability of safety-checking the
current state of the drone when necessary. To protect against the unexpected halting of nodes, a node termed the
Interrupter node was employed, whose sole purpose is to check the operational status of the other nodes. If a node
encounters an error, it is quickly reverted to a state with sufficient capabilities to ensure safety.

(a) The Interrupter checks the operational status of other nodes by sending "challenges" to them; an improper response
results in termination. (b) ROS nodes are independent processes, which enables parallel computation and reduces
processing times.

Figure 30: Features of the Control System

Figure 31: An example of a safety check involving the Navigation System
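The Interrupter's challenge-response check can be sketched in a framework-agnostic way as below; in the actual system this runs as a ROS2 node, and the failure handling here is illustrative.

import random

class MonitoredNode:
    """Stands in for a ROS node that must echo back a challenge token while healthy."""
    def __init__(self, name):
        self.name = name
        self.healthy = True

    def respond(self, token):
        return token if self.healthy else None   # a hung node never answers correctly

class Interrupter:
    def __init__(self, nodes, on_failure):
        self.nodes = nodes
        self.on_failure = on_failure             # e.g. terminate the node and revert to a safe state

    def check_once(self):
        for node in self.nodes:
            token = random.getrandbits(32)
            if node.respond(token) != token:
                self.on_failure(node)

nodes = [MonitoredNode("navigation"), MonitoredNode("vision")]
nodes[1].healthy = False                          # simulate a failed vision node
Interrupter(nodes, on_failure=lambda n: print(f"{n.name}: improper response, terminating")).check_once()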


7 BUILD
Manufacturing of Drone Body
The manufacturing of the drone frame involves transitioning from an initial prototype to a final model, optimising for
parameters such as weight, strength and production efficiency.
Initial Prototype Manufacturing:
1. Materials:
a. Plates: 3mm Plywood
b. Arms: Carbon Fibre Rods (20mm outer side length, 18mm inner side length)
c. Landing Gear, Motor Mounts and Central Hub: 3D printed PLA

2. Manufacturing Techniques:
a. Laser Cutting: The plywood sheets were laser cut using a CNC laser cutter. This technique allowed for precise cutting
of intricate shapes and designs necessary for the drone frame. It also allowed for mass reproducibility of the sheets.
b. Fused Deposition Modelling: Major mounts for the drone were 3D printed using PLA plastic as it provides adequate
strength and is also lightweight.

3. Strength and Limitations: Plywood, while easy to cut and assemble, has lower tensile strength and rigidity compared
to more advanced composites such as carbon fibre. It also results in a heavier and less durable frame. The plywood
frame may experience warping or splitting under stress.

Figure 32: Laser Cutting and 3D Printing of initial prototype

Current Prototype Manufacturing:


1. Materials:
a. Plates: 1mm Carbon Fibre
b. Arms: Carbon Fibre Rods (20mm outer side length, 18mm inner side length)
c. Landing Gear: 3D printed PLA
d. Motor Mounts: 3D printed PLA

2. Manufacturing Techniques:
a. CNC Machining: The carbon fibre sheets were milled using a 5-axis CNC vertical milling machine (VMC). This offered
high precision and repeatability, essential for producing consistent and complex parts. The mounts were not redesigned,
as they provided satisfactory results in impact testing.


3. Strength and Limitations: Carbon fibre plates significantly enhanced the frame's strength-to-weight ratio. They provide
high tensile strength, lower weight and greater resistance to environmental factors, yielding a more robust and durable
frame than plywood. Hence, the transition to a fully carbon fibre frame with 3D printed components represents a
significant improvement in the drone's structural integrity and overall performance, though it also increased the overall
cost, as carbon fibre sheets are more expensive than plywood sheets.

Figure 33: CNC Machined current prototype

8 OPTIMIZED FINAL DESIGN


The final optimised design of our drone included several key improvements for enhanced performance, reduced weight and
better functionality. These optimizations involved material upgrades, structural redesigns and efficient component
integration.

• Material Upgrade: The original 3mm plywood sheets were replaced with 1mm carbon fibre plates.

• Central Attachment Redesign: The central hub was redesigned to include internal holes for better wire management
and weight reduction. The new design allows for cleaner, organised routing of wires and reduced material volume,
resulting in a lighter hub without compromising strength.

• Landing Gear Optimisation: The landing gear was redesigned to better handle impact forces and scaled down to
provide just enough clearance for the payload, reducing unnecessary weight. This also lowered the CG, resulting in
improved stability.

Figure 34: Optimized final designs


9 INNOVATION
Our drone design incorporates several innovative features that enhance the flight performance, functionality and user
experience. They include advanced propulsion systems, custom-built electronics, state-of-the-art software integration and
modern connectivity solutions.

• Toroidal Propellers: Unlike conventional propellers, toroidal propellers improve aerodynamic efficiency and reduce
noise due to their unique ring-shaped design, enhancing the drone's stability and flight performance. The team custom
printed such propellers, but due to a lack of reliable data, more testing and redesigning is required.

• Custom Flight Controller with STM32: The team developed a custom flight controller using the STM32 microcon-
troller. It allows for greater flexibility and optimization tailored specifically for our drone.

• Integration of ROS2: ROS2 enables seamless integration of various sensors, actuators and other hardware
components, facilitating advanced functionalities such as autonomous navigation and real-time data processing
along with enhancing scalability and ease of development.

• 4G Enabled Connectivity: To ensure long-range communication, our drone will be equipped with 4G connectivity.
It allows the drone to be controlled and monitored remotely over cellular networks, providing greater operational
flexibility and range compared to traditional radio-frequency-based control systems.

(a) Toroidal (b) STM-32 based Flight Controller (c) STM-32 micro-controller

Figure 35: Innovation


10 DETAILED EXPLODED VIEW

Figure 37: Parts List


Figure 36: Exploded View



[Three-view drawing of the drone (Team Artemis, Birla Institute of Technology & Science, dated 04/06/2024, scale 1:3.4,
all dimensions in mm). The centre of gravity is marked; overall dimensions shown include 420.00 mm (wheelbase),
490.00 mm, 348.00 mm, 336.00 mm and 247.74 mm.]

11 APPENDIX
11.1 PID Tuning using Mission Planner
Adjusting the Proportional (P), Integral (I), and Derivative (D) gains using Mission Planner enhances the drone's flight
performance by ensuring stable and precise control.

Figure 38: PID Tuning
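To make the role of each gain concrete, the following is a generic discrete PID update; it illustrates the controller being tuned, not Mission Planner's or ArduPilot's implementation, and the gains shown are purely illustrative.

class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement, dt):
        error = setpoint - measurement
        self.integral += error * dt                      # I term: accumulates steady-state error
        derivative = (error - self.prev_error) / dt      # D term: damps fast changes in error
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: correcting a roll-angle error of 5 degrees at a 400 Hz loop rate.
controller = PID(kp=0.135, ki=0.09, kd=0.0036)
print(controller.update(setpoint=0.0, measurement=-5.0, dt=0.0025))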


11.2 Internet Speed Test


Upload Speed: 4 Mbps
Ping Latency: 100 − 400 milliseconds

Figure 39: Cloudflare Internet Speed Test


11.3 Grid Size Estimation


The camera used on the drone, the Siyi A8 Mini, has an 81° horizontal FOV, which was used in the simulations to estimate
grid sizes. The resolution of the image captured by the camera is 4096 × 2160. A field size of 200 m × 100 m was assumed.
A. Rough size of capture area
To find the width of the capture area at a height of 15 metres, the following formula was used:
width = 2 × tan(fov/2) × height

(a) Geometric Representation of Width of Capture (b) Height of capture is obtained by scaling the width

Figure 40: By reducing the length of a search grid by 4 meters (width of largest region of interest), hotspots that are at the
edge of an image or in two areas can be accounted for.

At 15 m,
width = 2 × tan(40.5°) × 15 = 25.62 m
Since 4096 pixels correspond to 25.62 m,
2160 pixels correspond to (2160/4096) × 25.62 = 13.51 m
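The same footprint calculation as a short sketch (81° horizontal FOV, 4096 × 2160 sensor, 15 m altitude):

import math

HFOV_DEG = 81.0
RES_W, RES_H = 4096, 2160
ALT_M = 15.0

width_m = 2 * math.tan(math.radians(HFOV_DEG / 2)) * ALT_M   # 25.62 m across the image width
height_m = width_m * RES_H / RES_W                            # 13.51 m, scaled by the aspect ratio
print(f"capture footprint at {ALT_M:.0f} m: {width_m:.2f} m x {height_m:.2f} m")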


11.4 Bill of Materials


11.5 References
11.5.1 Mechanical Operations-
• Mathworks : [Link]

• Solidworks : [Link]

• Ansys : [Link]

• Autodesk : [Link]

11.5.2 Manual Operations-

• Ardupilot : [Link]

• TBS unify vtx : [Link]

• Rapid-fire vrx : [Link]

• GetFPV : [Link]

• Siyi A8 Mini Manual : [Link] file/A8%20mini/A8%20mini%20User%20Manual%[Link]

11.5.3 Autonomous Operations-

• ROS2 Documentation : [Link]

• MAVROS : [Link]

• Gazebo Simulator : [Link]

• WireGuard : [Link]

• Siyi A8 Mini Manual : [Link] file/A8%20mini/A8%20mini%20User%20Manual%[Link]

• Matroska Video : [Link]

• CG-NAT : [Link] NAT
