
19.521 - Group Project Interim Report Semi-Autonomous Unmanned Aerial Vehicle

Philipp Karagiannakis Colin McGuire Niall McKenzie Neil Sinclair

December 19, 2008

Mentor: Professor R. Stewart

Abstract

The SUAVE project aims to create a Semi-Autonomous Unmanned Aerial Vehicle for use in observation and surveillance applications. A quadcopter helicopter based on a commercial open-source project called Mikrokopter will be used as foundation. Further systems will be added to increase the functionality of the quadcopter including an object detection system, GPS-powered navigation and a bi-directional wireless communication system using WiFi and GPRS. A camera will be integrated into the system to provide an example of a potential surveillance application.

Currently the project is on schedule and all of the required hardware has been ob- tained. The system design has been finalised and a work plan created to implement this in the coming semester.


Contents

1 Introduction
1.1 Aims and Objectives
1.2 Project Organisation
1.3 Report Structure
2 Applications
2.1 Microwave Cavity Resonator
2.2 Air Pollution Sensor (Inner-city)
2.3 Remote Volcano Sensing
2.4 Camera Surveillance
3 Background
3.1 How a Quadcopter Works
3.2 Proportional Integral Derivative Control
3.3 Communication System
3.3.1 WiFi
3.3.2 GPRS
3.4 Object Detection
3.4.1 Sensing
3.4.2 Imaging
4 Design Methodology
4.1 System Overview
4.2 Navigation and Autonomous Control System
4.2.1 Requirements 1 and 3 - Send and Receive Commands and Flight Data to the Communication System
4.2.2 Requirement 2 - Receive and Pre-Process Data
4.2.3 Requirement 4 - Process Available Data and Decide on Movements Required to Control the MikroKopter
4.2.4 Requirement 5 - Disengage Manual Control and Control the Quadcopter in Autonomous Mode
4.2.5 Requirement 6 - Monitor Current Situation and Prepare for Autonomous Control
4.3 Communications
4.4 Object Detection
5 Equipment
5.1 Flight Control
5.2 Navigation System Equipment
5.2.1 Central Processing Unit
5.2.2 GPS Module
5.2.3 Magnetometer
5.3 Communications System
5.3.1 Freescale Coldfire Microcontroller
5.3.2 Fargo Maestro-100 GPRS Modems
5.3.3 S103 WiFi Module
5.4 Surveillance Module and Compression
5.5 Object Detection
6 Progress to date
6.1 Construction and Test Flights
6.2 Selection of Navigation Equipment and Testing
6.2.1 Component Testing
6.3 Communications
6.4 Object Detection
7 Future Work and Goals
7.1 Navigation System
7.2 Communication System
7.3 Camera Surveillance
7.4 Object Detection
8 Conclusion
A Risk Assessment
B Gantt Chart

List of Figures

1 Quadcopter with mounted camera[2]
2 Quadcopter showing direction of each propeller[3]
3 Thrust comparison of 10" and 12" propellers
4 PID controller[8]
5 Illustration of using infrared or ultrasound to detect the presence of objects[11]
6 RoboRealm example, showing the original image (left) and the edge-detected image (right)[12]
7 RoboRealm example, showing the side-filled image (left) and the eroded image (right)[12]
8 RoboRealm example, showing the final image overlaid with the original[12]
9 System Overview
10 Basic Navigation System Connections
11 Navigation Processor Data Inputs and Outputs
12 The three levels of control: go-to command, object avoidance and basic flight stabilisation
13 Control loops used to achieve autonomous flight: (a) yaw controls the direction of the vehicle; (b) pitch controls the horizontal movement; (c) throttle controls the height of the vehicle
14 Transmission Structure of SUAVE communications
15 Reception Structure of SUAVE communications
16 Freescale MCF528x features[28]
17 Freescale Coldfire Microcontroller and Development board
18 Fargo Maestro-100 GPRS modems[29]
19 S103 WiFi module[31]
20 Polar response of a Parallax Ping Sensor, tested with a 12" square piece of white card[33]
21 MikroKopter kit before construction
22 Fully constructed MikroKopter
23 Magnetometer test rig schematic
24 Circuit diagram of a Sharp Distance Measuring Sensor in use
25 Screen captures of the output pin of the Sharp Distance Sensor

1 Introduction

Autonomous vehicles create substantial and varied engineering challenges. With ever-increasing processing power and the reduction in size, weight and cost of electronic devices, this challenge can now be extended into the sky.

Designing and constructing an autonomous aerial vehicle is a substantial challenge and requires knowledge of several areas, both mechanical and electrical. Expertise in digital communications, control engineering, embedded system design and integration, power economy and data processing is required. This multidisciplinary aspect and the novelty of flight appealed to all of the group members. A quad-rotor helicopter provides a stable and agile platform for surveillance. It avoids some of the complexities involved in maintaining stable flight with a standard helicopter, making it an excellent choice for an initial autonomous aerial vehicle project.

1.1 Aims and Objectives

The aim of the project is to design and construct a Semi-autonomous Unmanned Aerial Vehicle (SUAVE) for observation and surveillance.

The platform is based on a quad-rotor design which offers superior stability, ease of control and payload capacity compared to a standard helicopter. A quad-rotor design is also extremely agile and is free to move in any direction, making it ideal for surveillance and observation.

A navigation system will incorporate basic object avoidance to allow safe autonomous flight between predetermined points, and wireless communications will facilitate the transfer of control commands and sensor data between the vehicle and a base station.

The following objectives have been identified for this project:

- Construct a quad-rotor helicopter and achieve manually controlled stable flight of the vehicle using radio frequency (RF) control.
- Create a control system capable of autonomous hover.
- Create a system to facilitate wireless bi-directional data communication and control.
- Integrate a basic object detection system with an improved control system to avoid collisions.
- Incorporate a navigation system to allow autonomous flight and navigation.
- Integrate a surveillance device and pre-process the data for transmission over the communications channel.

1.2 Project Organisation

The project is divided into four manageable sections. Each group member is responsible for one section, although substantial overlap exists between them. The sections were assigned according to each member's interests. A Gantt chart (Appendix B) was created containing the parallel work streams to provide a schedule for the group to follow. To keep all group members and the project mentor informed of progress, formal weekly meetings are scheduled and an online record of the group's progress, in the form of a blog, is kept at http://project.rawxing.org

1.3 Report Structure

The aim of this report is to inform the reader of the current progress as of 18th December 2008, provide an overview of the design to be implemented and highlight the future work required to meet the project objectives.

Section 2 details the possible applications of an autonomous quadrotor helicopter and the chosen application for the SUAVE project.

Section 3 provides background information to aid the reader in understanding how a quadrotor helicopter operates as well as providing an overview of the technologies which will be referred to in this report.

Section 4 gives an overview of the current system design which will be implemented. It is divided into the following sections: Navigation and Autonomous Control, Communications, Object Detection System and Camera and Video Compression. Each section will discuss the architecture of the individual system, the connections to other systems and the equipment selected.

Section 5 details the equipment selected for each subsystem. Section 6 highlights the work accomplished to date in all sections, Section 7 discusses future work and goals, and Section 8 concludes the report.

Appendix A contains the specific risk assessment for this project as of 18th December 2008.

2 Applications

A number of options were considered as a possible final application for the quadcopter. Below is a list of some of the applications investigated and the reasons why they were eliminated.

2.1 Microwave Cavity Resonator

An idea put forward by Dr Ian Glover, a Strathclyde lecturer, involved testing the reflectivity of air contained in the boundary layer, the first few hundred metres above the earth's surface. The reflectivity of air affects the transmission of radio waves. The proposed method of measurement included building a microwave cavity resonator. The resonator chamber would allow air to pass through it, and by testing the capacitance of the air the reflectivity could be determined. The rotating propellers posed a problem because they would force air past the cavity, affecting the measurement.

A different method was to actively monitor the temperature and humidity of the surrounding air and perform a calculation using the two variables. The problem with this is obtaining an instantaneous and accurate measurement of the humidity. Once again the rotating blades would affect the measurements.

2.2 Air Pollution Sensor (Inner-city)

It was thought that the vehicle could be used to navigate a pre-determined flight path and measure several air pollutant levels en route. The resulting measurements could then be used as an indication of the air pollution in a particular area of a city. However, after some correspondence with the Technical Officer at the Land Environmental Services of Glasgow City Council, the idea was abandoned. The main reasons came down to a problem with the measurement sensors: they were either large and bulky, or a passive measurement was needed. A passive measurement would once again be affected by the moving vehicle and is usually taken over a longer period.

2.3 Remote Volcano Sensing

Andrew McGonigle, a Scottish scientist working for the University of Sheffield, has developed a novel means of determining whether or not a volcano is due to erupt[1]. It involves sending a radio-controlled (RC) helicopter into the crater of a volcano and measuring the level of various gases. This information can then be processed once the helicopter returns. We had hoped to develop an autonomous version capable of transmitting the data back for real-time processing. Once again the sensors were the main reason this idea was abandoned: they were found to be too expensive, and there was nowhere to test the vehicle, i.e. no local active volcano.

2.4 Camera Surveillance

Possibly the most obvious use for an aerial platform is as a mount for a camera, allowing it to be used as a mobile surveillance device, as shown in Figure 1. A decision was made to build the quadcopter as a platform for a camera which would be connected via a standard serial connection such as RS232. The standardised connection could then be used to attach different devices and essentially enable the quadcopter to be used for a multitude of applications.


Figure 1: Quadcopter with mounted camera[2]


3 Background

3.1 How a Quadcopter Works

A quad-rotor helicopter avoids much of the complexity needed to fly a standard helicopter by eliminating the need for variable-pitch rotor blades and a thrust-stabilising tail rotor. Instead it uses four identical, evenly spaced propellers, each producing an equal amount of lift and torque, with each pair of opposing propellers spun in the same direction. This gives a clockwise pair and an anti-clockwise pair, as shown in Figure 2. By varying the thrust produced by the propellers, movement of the helicopter in any direction is possible. One of the clockwise rotating arms is defined as the front of the vehicle and all movement is in relation to it.


Figure 2: Quadcopter showing direction of each propeller[3]

As all the propellers are spinning at the same angular velocity, a yaw-stabilising tail rotor like the one found on a standard helicopter is not needed. Yaw is achieved by increasing the torque produced by one pair of rotating motors (either the clockwise pair or the anti-clockwise pair) and decreasing the torque produced by the other pair. Similarly, the pitch of the rotors need not be altered. Instead, movements in the pitch (forward/backward) and roll (left/right) axes can be achieved separately, simply by altering the thrust produced by a pair of motors, and all this can be done without affecting the yaw of the vehicle. The front and rear propellers control the pitch of the vehicle while in flight and the right and left propellers control the roll of the vehicle.
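The movement scheme above can be sketched as a simple motor-mixing function. This is an illustrative sketch only; the signs and scaling are assumptions, not the MikroKopter firmware's actual mixer.

```python
def mix(throttle, pitch, roll, yaw):
    """Return thrust commands for the (front, rear, left, right) motors.
    The front/rear pair spins one way and the left/right pair the other,
    so a yaw command shifts torque between the pairs without changing
    total lift."""
    front = throttle + pitch - yaw
    rear = throttle - pitch - yaw
    left = throttle + roll + yaw
    right = throttle - roll + yaw
    return front, rear, left, right
```

With a pure yaw command the total thrust of the four motors is unchanged, so the vehicle turns without climbing or descending.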

The Flight Control (FC) is the main controlling board on the quadcopter; all the sensors and microprocessors used to achieve and maintain stable flight are located on this board. The gyroscopes (gyros, also known as rotational speed sensors) are considered the most important of the onboard sensors. The programmed microcontroller uses their measurements to compensate for external influences such as wind. The X, Y and Z axes are each assigned their own gyro, which measures the degrees per second the vehicle turns in that plane.

Acceleration is measured using three accelerometers; once again each axis is assigned its own sensor. It is possible to fly the vehicle without these accelerometers; however, it would not be possible for the quadcopter to return to a level state by itself. By including these sensors it is possible for the pilot to release the remote control (RC) sticks and the vehicle will stabilise itself. The final sensor is a barometer, which allows the vehicle to maintain a specific flying height.

Due to the nature of the project a strict power-to-weight budget was designed. From initial estimations, the maximum payload for the vehicle would be about 400 grams. The motors used in RC vehicles vary greatly, so a light motor with low power consumption was desired. The main requirement was that it provided 1000 rpm V⁻¹. A brushless motor fulfils the specifications. Brushless motors are more complicated to operate because they need to change the DC supply voltage of the battery into phased AC (usually three phase) with a controlled power output so that the speed of the motors can be accurately controlled. In total there are four Brushless Controllers (BL-Ctrl), one for each motor. There are numerous BL-Ctrls on the market. To achieve a very stable level of flight it is desirable to change the throttle value very rapidly (< 0.5 ms); this is best done using an Inter-Integrated Circuit (I²C) bus interface.

Control of the vehicle is achieved through an RF transmitter and receiver pair. A transmitter with a minimum of four channels is required to control the quadcopter in the various planes, and it must use Pulse Position Modulation (PPM). The receiver demodulates the instructions given by the pilot and converts them into electrical signals. A receiver with a summing signal is required for the MikroKopter (MK); this signal contains all of the channels sent by the transmitter and is made available to the FC for processing.

The design of the frame is limited by only one factor: it must be as light as possible. The frame design took on a basic cross shape similar to many open-source projects[4]. Bearing in mind that the frame must be lightweight, the materials considered included aluminium, carbon fibre and polystyrene; a combination of these is also possible. The most cost-effective material for this project is aluminium because it is readily available and cheap.

The batteries needed to power all the onboard electronics must have a low weight-to-power ratio so that maximum flight time may be achieved. The types of battery considered suitable for use in RC vehicles are lithium polymer (LiPo), nickel-cadmium (NiCd) and nickel metal hydride (NiMH). LiPo types are the most advanced and offer the most power. They are volatile and must be handled with care, but they can have a capacity of over 10,000 mAh. NiMH batteries are a slightly older technology but are still capable of capacities around 5000 mAh; however, in poor-quality batteries bad cells can develop quickly as the capacity is increased. NiMH batteries are cheaper than LiPo batteries and are also less volatile. NiCd batteries are the cheapest overall and have similar characteristics to NiMH but are unable to achieve an equal capacity. A decent flying time is desired to test the software and hardware which will be added, and due to the powerful motors and large propellers the best weight-to-power ratio with a high capacity is achieved using LiPo batteries.

The propellers play a major role in the take-off weight and the speed at which the vehicle will fly. The diameter of the propellers as well as their pitch must be considered. Due to the slow-spinning motors a low pitch is desired. After using a thrust calculator it was determined that a good combination would be 12" propellers with 4.5" pitch. A snapshot of the thrust calculator at 4000 rpm for 10" and 12" propellers is shown in Figure 3. The effect of increasing the blade diameter by 2" is to double the static thrust developed.
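The claimed doubling is consistent with the standard static-thrust model, in which thrust at fixed rpm scales with the fourth power of propeller diameter (T = Ct·ρ·n²·D⁴, with the thrust coefficient Ct and air density ρ held constant). A quick check of the ratio:

```python
def static_thrust_ratio(d_new: float, d_old: float) -> float:
    """Ratio of static thrusts at equal rpm, assuming T scales as D^4
    with the thrust coefficient and air density held constant."""
    return (d_new / d_old) ** 4

ratio = static_thrust_ratio(12, 10)  # 1.2**4 = 2.0736, roughly double
```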

3.2 Proportional Integral Derivative Control

Proportional Integral Derivative (PID) control is a common method used to control processes and robotic systems[5]. It allows automatic control of an output, depending on the current output and various inputs. It consists of three individual controllers combined together: Proportional, Integral and Derivative, as shown in Figure 4.


Figure 3: Thrust comparison of 10" and 12" propellers

Figure 4: PID controller[8]

As shown in Figure 4, the error signal is the input minus the current output. This error signal is processed by the PID controller and the outputs of each element are combined to create the output that controls the process.


Proportional control can be thought of as a gain. The error signal is multiplied by the gain K_p. It influences the amount the input has to change to cause a variation of 0-100% in the output[6].

Integral control uses feedback (most commonly positive) to reset the controller[7]. This is achieved by integrating the error signal and multiplying it by the gain K_i. This has the effect of controlling the output so it stabilises at a constant steady state.

Derivative control differentiates the error signal and multiplies it by the gain K_d. This influences the rate of change of the output signal.

Combining these three elements creates an effective controller which can be implemented in software.
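As a concrete illustration, the three elements can be combined in a few lines of code. This is a generic discrete-time sketch with made-up gains, not the controller that will run on the vehicle.

```python
class PID:
    """Discrete PID controller: output = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement               # input minus current output
        self.integral += error * self.dt             # integral term removes steady-state error
        derivative = (error - self.prev_error) / self.dt  # damps the rate of change
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Driving a simple first-order process with this controller settles the output at the setpoint, which is the steady-state behaviour the integral term provides.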

3.3 Communication System

3.3.1 WiFi

The WiFi version that will be implemented is 802.11b, which has the following characteristics[9]:

- Release date: October 1999
- Frequency band: 2.4 GHz
- Typical throughput: 4.5 Mbps
- Bit rate: 11 Mbps
- Range (indoor): 38 m

Relevant applications are short-range communications that do not require extremely large data rates. This version of WiFi is very commonly used as a wireless LAN technology. These aspects make it ideally suited to our application.

The main issue with 802.11b is that other devices also operate in the 2.4 GHz frequency band, including microwave ovens, Bluetooth devices, baby monitors and cordless phones. When these devices operate in the same region as 802.11b they can interfere with the WiFi signal.

3.3.2 GPRS

Whilst GSM is excellent for communications networks that involve the transmission and reception of voice calls, it is not designed for sending or receiving data: its data rate is not high enough for data transmission.

GPRS (General Packet Radio Service) improves the performance of GSM to allow it to send and receive data. GPRS is one of the communications technologies classed as 2.5G; these are designed to move the performance of 2G networks closer to that of 3G networks. The bit rate of GPRS for a pair of Class 10 modems is 16-24 kbps for uploading and 32-48 kbps for downloading[10].
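The gap between these rates and 802.11b matters for the surveillance application. As a rough illustration (the 20 kB image size is a made-up figure, not from the report), transfer time scales directly with the link rate:

```python
def upload_seconds(size_bytes: int, rate_kbps: float) -> float:
    """Time to send a payload at a given link rate (kilobits per second)."""
    return size_bytes * 8 / (rate_kbps * 1000)

gprs_time = upload_seconds(20_000, 24)    # Class 10 GPRS uplink, upper figure
wifi_time = upload_seconds(20_000, 4500)  # 802.11b typical throughput
```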

Due to GPRS being part of the standard cellular network, there is currently wide coverage, especially in urban areas. This means that if a remote control device is being controlled over the cellular network then it can be controlled from anywhere else with network coverage, no matter how far away.

3.4 Object Detection

Unlike humans, robots have no innate spatial awareness, and as such need to be constantly updated with their surroundings. When the quadcopter is flying from point A to point B, it is important that it does not collide with any obstacles en route. There are two potential approaches to object detection: sensing and imaging.

3.4.1 Sensing

The sensing approach uses either ultrasound or infrared waves to detect the presence of objects. A short pulse is emitted from a transmitter and a detector receives the reflected signal. Either the time delay or the magnitude of the signal is then used to calculate the distance to any objects within range of the sensor. This approach, as illustrated in Figure 5, is the same as that used in the touchless hand driers found in public toilets. There are three types of output that can be taken from a distance sensor: Analogue Range, Pulse Width Modulation (PWM) Range and Distance Switch.

Analogue Range is a variable voltage output (between ground and the supply voltage) that depends on the distance to the object. This can be quantised using an Analogue-to-Digital Converter (ADC) and, with the aid of a look-up table, a distance realised. A PWM Range output works in a similar manner, whereby the pulse width is varied depending on the distance. The Distance Switch is the most basic of the three: when an object is within range of the sensor, the output is toggled (either high to low or vice versa).

Figure 5: Illustration of using infrared or ultrasound to detect the presence of objects[11]
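For the time-delay method, the distance calculation is straightforward. The sketch below assumes an ultrasonic sensor and a nominal speed of sound of 343 m/s; neither value is specified in the report.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C (assumed)

def echo_delay_to_distance(delay_s: float) -> float:
    """The pulse travels out and back, so the one-way distance is half
    the round-trip time multiplied by the speed of sound."""
    return SPEED_OF_SOUND * delay_s / 2.0

# A 10 ms round trip corresponds to an object about 1.7 m away.
```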

3.4.2 Imaging

The use of a camera allows the quadcopter to "see" the surrounding environment. RoboRealm has open-source software[12] showing how a simple webcam can be used to detect objects and calculate a safe route. Firstly, edge detection is applied to an image, as seen in Figure 6.


Figure 6: RoboRealm example, showing the original image (left) and the edge detected image (right)[12]

A technique called "side-fill" is used whereby a vertical line is taken from the bottom of the image and white is placed in each pixel until an edge is detected, at which point the rest of the line is filled black. The white area of the image is then eroded by an object such that the area left is large enough for the vehicle to pass. These steps are shown in Figure 7.

Figure 7: RoboRealm example, showing the side-filled image (left) and the eroded image (right)[12]
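The side-fill step described above can be sketched in a few lines. Plain nested lists stand in for a real binary edge image; this is an illustrative reconstruction of the technique, not RoboRealm's implementation.

```python
def side_fill(edges):
    """edges[row][col] is True where an edge was detected (row 0 is the
    top of the image). Each column is filled white from the bottom up
    until the first edge, after which it remains black."""
    rows, cols = len(edges), len(edges[0])
    filled = [[False] * cols for _ in range(rows)]
    for col in range(cols):
        for row in range(rows - 1, -1, -1):  # walk up from the bottom
            if edges[row][col]:
                break                         # edge reached: stop filling
            filled[row][col] = True           # free space below the edge
    return filled
```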

The image is smoothed and the white pixel with the greatest y value is found; it is this pixel that the vehicle aims for. In this example, as seen in Figure 8, this is at an x value of 193.

Figure 8: RoboRealm example, showing the final image overlaid with the original[12]

4 Design Methodology

This section details the architecture of the overall system, and the design of and interconnections between the internal subsystems.

4.1 System Overview

A diagram of the complete system is shown in Figure 9. The navigation system controls the actual flight of the quadcopter and passes flight telemetry (current position, heading, height, battery level, gyroscope and accelerometer values) to the communications system to send back to the user. The communications system receives control commands telling the quadcopter where to go (latitude/longitude/height) from the user and passes them to the navigation system. It also encodes and transmits sensor data back to the user. The object detection system will detect objects in the quadcopter's path and inform the navigation system, which will take the appropriate action.


Figure 9: System Overview

Communication between the systems will be via an I²C bus. The navigation system will act as the master device and will ask the other two systems for data when required. This will be discussed further in Section 4.2.2.

The design of the navigation, communications and object detection systems is detailed in the sections below.

4.2 Navigation and Autonomous Control System

The navigation system is centred around a microcontroller hereafter referred to as NAVPRO. This is the centre of the whole system, controlling the overall operation of the Unmanned Aerial Vehicle (UAV) and acting as master to the communication and object detection systems (ODS). The MikroKopter Flight Control (MKFC) will be used to maintain stable flight when manually and autonomously controlled.

To meet the requirements specified in Section 1.1, a Global Positioning System (GPS) module and a magnetometer are required for autonomous flight. The GPS module allows the quadcopter to be aware of its current location (by providing a latitude and longitude) and the magnetometer allows it to be aware of its current orientation, so that it knows in which direction to start moving.
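Together the two sensors give the vehicle what it needs to turn toward a waypoint: the GPS fixes supply an initial bearing to the target, which can be compared against the magnetometer heading. A standard great-circle bearing calculation (an illustrative sketch, not taken from the report) looks like:

```python
import math

def initial_bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing in degrees (0 = north, 90 = east)
    from (lat1, lon1) to (lat2, lon2), all in decimal degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360
```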

A diagram of the navigation system and internal and external connections is shown in Figure 10.


Figure 10: Basic Navigation System Connections

The primary requirements of the navigation processor are:

R1 - Receive instructions from the communications system to determine mode of operation (manual flight or autonomous flight).

R2 - Receive and pre-process the following data:

Current position from the GPS module.

Current orientation from the magnetometer.

Object alerts from the object detection system.


Current height, fuel (battery level), and pitch, yaw and roll accelerometer and integral data from the MKFC.

R3 - Send the current flight telemetry (as specified in R2) to the communication system.

R4 - Process available data as specified in R2 to decide on movements required to control the MikroKopter.

R5 - Disengage manual control and send control commands to the MikroKopter FC when in autonomous mode.

R6 - Monitor current system and prepare for autonomous control when in manual mode.

Each requirement also requires a specific hardware and software design, detailed below. The flow of data to and from NAVPRO is shown in Figure 11.


Figure 11: Navigation Processor Data Inputs and Outputs


4.2.1 Requirements 1 and 3 - Send and Receive Commands and Flight Data to the Communication System

Communication between the Communications Processor (COMPRO) and the Navigation Processor (NAVPRO) will be via the I²C bus. There is also an option to use the second asynchronous serial port on the NAVPRO if the loading on the I²C bus becomes too high, but the I²C bus is the preferred option.

4.2.2 Requirement 2 - Receive and Pre-Process Data

Three separate communication methods will be used to allow the NAVPRO to obtain all the data required for autonomous flight, as shown in Figure 11. All the current data values will be stored in RAM for use in fulfilling Requirement 4.

As the I²C bus will be used to allow four devices to communicate, timing will be very important. The NAVPRO will act as the master device and will request data from each device in a predefined sequence at a regular interval. This will ensure the NAVPRO has all the data required to make a decision and fly the quadcopter when required.
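As a sketch of this predefined sequence, the polling pass below reads the compass, the ODS status byte and the latest COMPRO command in turn. The bus addresses for the COMPRO and the ODS and the `i2c_read()` driver call are placeholders (here stubbed with canned data so the logic can be exercised off-target); only the CMPS-03 address is the module's documented one.

```c
#include <stdint.h>
#include <string.h>

#define ADDR_COMPRO  0x10  /* assumed bus addresses for COMPRO and the ODS */
#define ADDR_ODS     0x20
#define ADDR_CMPS03  0x60  /* CMPS-03 7-bit address */

/* Stand-in for the Coldfire I2C driver: returns canned data here so the
 * polling sequence can be tested without the real bus. */
static int i2c_read(uint8_t addr, uint8_t *buf, int len)
{
    if (addr == ADDR_CMPS03 && len == 2) { buf[0] = 0x03; buf[1] = 0x84; return 2; } /* 90.0 deg */
    if (addr == ADDR_ODS && len == 1)    { buf[0] = 0x01; return 1; }  /* object ahead */
    memset(buf, 0, (size_t)len);
    return len;
}

typedef struct { uint16_t heading_deg10; uint8_t ods_flags; uint8_t cmd[8]; } bus_data_t;

/* One pass of the predefined polling sequence, run at a regular interval. */
void poll_cycle(bus_data_t *d)
{
    uint8_t buf[2];
    if (i2c_read(ADDR_CMPS03, buf, 2) == 2)     /* heading in tenths of a degree */
        d->heading_deg10 = (uint16_t)((buf[0] << 8) | buf[1]);
    if (i2c_read(ADDR_ODS, buf, 1) == 1)        /* object-detection status byte */
        d->ods_flags = buf[0];
    if (i2c_read(ADDR_COMPRO, d->cmd, 8) != 8)  /* latest user command packet */
        d->cmd[0] = 0;                          /* mark invalid on bus error */
}
```

On the real hardware only the `i2c_read()` body changes; the fixed master-driven order is what guarantees each slave is serviced at a known rate.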

GPS-320FW GPS Module

The GPS-320FW module uses an asynchronous serial connection to send and receive data, so one of the NAVPRO SCI modules will be used to obtain positional data and to initialise the GPS module when needed. The current position will be updated at 1 Hz, which is the maximum rate supported by the GPS-320FW. The serial interface protocol is based on the National Marine Electronics Association's NMEA 0183 specification, in which each message is a string of ASCII characters containing all relevant positional data. The NAVPRO will decode these messages and extract the current position from them.
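A minimal decoder for the position fields of a GGA sentence might look as follows. `parse_gga()` and `nmea_to_deg()` are our own illustrative helpers, not GPS-320FW vendor code, and the parser assumes a sentence carrying a valid fix (C's `strtok` collapses the empty fields of a no-fix sentence).

```c
#include <stdlib.h>
#include <string.h>

/* Convert an NMEA ddmm.mmmm field plus hemisphere letter to signed decimal
 * degrees (south and west negative). */
static double nmea_to_deg(const char *field, char hemi)
{
    double v   = atof(field);
    double deg = (double)((int)(v / 100));   /* leading degrees */
    double min = v - deg * 100.0;            /* remaining minutes */
    double d   = deg + min / 60.0;
    return (hemi == 'S' || hemi == 'W') ? -d : d;
}

/* Pull latitude/longitude out of a $GPGGA sentence.
 * Returns 0 on success, -1 if the sentence is not usable. */
int parse_gga(const char *s, double *lat, double *lon)
{
    char buf[96];
    char *tok[15];
    int n = 0;
    if (strncmp(s, "$GPGGA", 6) != 0) return -1;
    strncpy(buf, s, sizeof buf - 1);
    buf[sizeof buf - 1] = '\0';
    for (char *p = strtok(buf, ","); p && n < 15; p = strtok(NULL, ","))
        tok[n++] = p;
    if (n < 6) return -1;
    *lat = nmea_to_deg(tok[2], tok[3][0]);  /* fields: id, time, lat, N/S, lon, E/W */
    *lon = nmea_to_deg(tok[4], tok[5][0]);
    return 0;
}
```

On the NAVPRO the same routine would run on each complete line received by the SCI interrupt handler.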

CMPS-03 Magnetometer

The CMPS-03 magnetometer will be interfaced via the I²C bus due to its ease of use and the low refresh rate required.

Object Detection System

The ODS will communicate with the NAVPRO via the I²C bus. It will send data with required actions (e.g. a change in direction, stop) to the NAVPRO. The NAVPRO will send commands to request data from the object detection system and to manage the I²C bus. A relatively low data transfer rate, roughly six requests per second, will be required to communicate with the ODS.

MikroKopter Flight Controller

Communication with the MikroKopter FC will use the SPI interface. This will allow the NAVPRO to send control commands to the MikroKopter flight controller and to receive the current battery level, height, nick (pitch), roll and yaw accelerometer values and gyroscope positions. These values are already sent by the MKFC for debugging via the MikroKopter tool; the same values will be forwarded to the COMPRO.

The MikroKopter flight controller software will be modified to output the flight data and receive control commands over the SPI interface. The control commands will mimic those sent via the RF transmitter, i.e. provide a value for pitch, yaw, thrust and roll. The software will also contain a function to either use control commands from the PPM receiver, or disable this input and use the NAVPRO.

Communications System

The NAVPRO will receive commands from the COMPRO over the I²C bus. These commands will be sent periodically.

4.2.3 Requirement 4 - Process Available Data and Decide on Movements Required to Control the MikroKopter

The NAVPRO will use all the available data to decide on the best course of action to control the MikroKopter. There are three levels of control required, as shown in Figure 12. The highest level is the 'go to' command. This command will be received by the NAVPRO from the COMPRO and specifies the ultimate destination of the quadcopter. It is made up of three parts: GPS latitude, GPS longitude and height. It will be stored in a memory location and compared with the current GPS latitude, longitude, height and orientation to decide on what action is required. The NAVPRO must make sure there is no threat to the quadcopter in its current situation while meeting the go to command; if there is, it must act to stop the quadcopter.

The next level is object avoidance.



Figure 12: The three levels of control: go to command, object avoidance and basic flight stabilisation

The lowest level is basic flight stabilisation. This underpins the two layers above and is used to carry out the go to command and satisfy the object avoidance. Initially, all autonomous movements of the quadcopter will be made by rotating and pitching the vehicle. A set of control loops based on PID controllers (Section 1.2) will be used to control the pitch, yaw and height of the craft by providing pseudo control values to the MKFC. The MKFC will stabilise the roll of the vehicle. A block diagram of the control loops is shown in Figure 13. The 'system' block refers to the physical movement of the aircraft. There may be some options to use the control loops on the MKFC to help implement this requirement; this will be investigated further.
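The three loops share the same basic update step. The sketch below is our own generic PID implementation, not MikroKopter code; the structure name and gains are placeholders that would have to be tuned in flight.

```c
/* One PID controller instance per loop (yaw, pitch, throttle). */
typedef struct {
    float kp, ki, kd;   /* tuning gains                 */
    float integral;     /* accumulated error            */
    float prev_err;     /* error from the previous step */
} pid_ctl_t;

/* One controller update. dt is the loop period in seconds; the return value
 * is the pseudo control value passed to the MKFC. */
float pid_step(pid_ctl_t *c, float setpoint, float measured, float dt)
{
    float err = setpoint - measured;
    c->integral += err * dt;
    float deriv = (err - c->prev_err) / dt;
    c->prev_err = err;
    return c->kp * err + c->ki * c->integral + c->kd * deriv;
}
```

For the yaw loop the error term would first be wrapped into the range ±180° so the vehicle always takes the shorter rotation; a practical version would also clamp the integral term to avoid wind-up.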

4.2.4 Requirement 5 - Disengage Manual Control and Control the Quadcopter in Autonomous Mode

The MKFC has two sets of input command variables for thrust, yaw, pitch and roll: 'externalStick' and 'stick'. The 'externalStick' set of variables will be used for autonomous commands and will be set to the values sent over the SPI channel from the NAVPRO. There will be two methods of engaging autonomous control: through the manual RF transmitter, or via the advanced communications link (see Section 4.3). The software on the MKFC will be modified to let one of the RF PPM channels toggle autonomous mode and select whether to use primary or secondary control commands. The MKFC will then communicate to the NAVPRO which mode it is in. The alternative is via the COMPRO: when the COMPRO sends a signal to engage autonomous mode, the NAVPRO will contact the MKFC to disengage manual control, hold position and wait for further instructions.
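The selection step added to the MKFC main loop could look like the sketch below. Apart from 'externalStick' and 'stick', which the text takes from the MKFC source, the variable names and the 500 ms link-loss timeout are our own placeholders.

```c
#include <stdint.h>

typedef struct { int16_t thrust, yaw, pitch, roll; } stick_t;

stick_t stick;              /* decoded from the RF PPM receiver            */
stick_t externalStick;      /* latest values received from NAVPRO over SPI */
uint8_t autonomous;         /* toggled by a spare PPM channel or COMPRO    */
uint32_t now_ms, last_spi_ms;

#define SPI_TIMEOUT_MS 500u /* assumed link-loss threshold */

/* Returns the command set the motor mixer should use this cycle; falls back
 * to manual RF control if the NAVPRO link goes quiet in autonomous mode. */
stick_t select_input(void)
{
    if (autonomous && (now_ms - last_spi_ms) < SPI_TIMEOUT_MS)
        return externalStick;
    return stick;
}
```

The timeout fallback is a safety choice rather than anything in the MikroKopter source: if SPI frames stop arriving, the pilot's RF sticks regain authority immediately.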



Figure 13: Control loops used to achieve autonomous flight. (a) yaw: controls the direction of the vehicle; (b) pitch: controls the horizontal movement; (c) throttle: controls the height of the vehicle


4.2.5 Requirement 6 - Monitor Current Situation and Prepare for Autonomous Control

Requirement 5 demands that autonomous mode can be engaged at any time. The NAVPRO will therefore carry out its position-holding calculations at all times, while leaving the MKFC under manual RF control until autonomous mode is selected.

4.3 Communications

The design of the communications section of the project takes the form shown in Figures 14 and 15.


Figure 14: Transmission Structure of SUAVE communications


Figure 15: Reception Structure of SUAVE communications

The methodology is as follows: the data captured by the camera is passed to the microcontroller over its serial output and stored within the microcontroller. A large amount of RAM is required to buffer the camera data because of the resolution and the number of frames produced. Once the data has been compressed, it is output through another serial connection to the GPRS modem or WiFi module for wireless transmission.

When navigation data is sent from the base station, the quadcopter has to receive this information and move appropriately. The data received will be a series of bytes that make up a packet; this packet will then be broken up to obtain the navigation data. The navigation data will then be transmitted over the I²C bus to the other Coldfire board and processed to determine what movements the user requests from the quadcopter.
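Breaking the packet up might look like the sketch below. The layout (a header byte, three big-endian 32-bit fields for latitude, longitude and height, and an additive checksum) is an assumption for illustration only; the real SUAVE packet format is still to be defined.

```c
#include <stdint.h>

/* Latitude/longitude scaled by 1e7 to keep them in integers; height in cm. */
typedef struct { int32_t lat_1e7, lon_1e7, height_cm; } goto_cmd_t;

/* Read a big-endian 32-bit value from a byte buffer. */
static int32_t be32(const uint8_t *p)
{
    return (int32_t)((uint32_t)p[0] << 24 | (uint32_t)p[1] << 16 |
                     (uint32_t)p[2] << 8  | (uint32_t)p[3]);
}

/* Decode a 14-byte "go to" packet: 1 header + 12 payload + 1 checksum.
 * Returns 0 on success, -1 on bad length, header or checksum. */
int decode_goto(const uint8_t *pkt, int len, goto_cmd_t *out)
{
    if (len != 14 || pkt[0] != 0xA5) return -1;
    uint8_t sum = 0;
    for (int i = 0; i < 13; i++) sum += pkt[i];
    if (sum != pkt[13]) return -1;
    out->lat_1e7   = be32(pkt + 1);
    out->lon_1e7   = be32(pkt + 5);
    out->height_cm = be32(pkt + 9);
    return 0;
}
```

Whatever format is finally chosen, some integrity check of this kind is worth keeping, since a corrupted waypoint over GPRS would otherwise be flown to.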


4.4 Object Detection

The question arises as to how often objects must be searched for, and how often the results of the searches are returned to the navigation module. Consideration must be given to the maximum speed and stopping distance of the quadcopter, and to the range of the sensors. For example, if the quadcopter is flying at 5 m/s and the sensors have a range of 1 m, then searching must be performed at least 5 times a second in the direction of travel. Alternatively, continuous searching may be performed and the results communicated to the navigation module at least 5 times a second. These calculations take no account of the stopping distance, so the refresh rate may in fact have to be higher.
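This calculation can be made explicit, with the stopping distance included as an extra (assumed) parameter:

```c
/* Minimum search rate so that an object entering the sensor range is
 * reported before the vehicle is inside its stopping distance. */
float min_search_rate_hz(float speed_ms, float sensor_range_m,
                         float stop_dist_m)
{
    float usable = sensor_range_m - stop_dist_m;  /* room left to react in */
    if (usable <= 0.0f)
        return -1.0f;  /* cannot stop inside sensor range at any rate */
    return speed_ms / usable;
}
```

With the figures from the text (5 m/s, 1 m range) and zero stopping distance this gives the quoted 5 Hz; a 0.5 m stopping distance already doubles the requirement to 10 Hz, which illustrates why the refresh rate may have to be higher.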

If the decision is made to communicate only when objects are detected, a busy communications bus (assuming a shared bus; the magnetometer uses I²C and the GPS module uses an asynchronous serial link) may delay the transmission. This delay could prove disastrous for the vehicle. It is due to this problem that continuous searching will be used for this application, and that the navigation module will periodically contact the Object Detection System asking for a status update.

As mentioned in Section 3.4, the infrared and ultrasound sensors can provide variable outputs (be that voltage or pulse width); however, finding a microcontroller with multiple ADCs and PWM capture channels would prove problematic. To overcome this, one pin could be used as an input (either a PWM capture pin or a serial connection to an ADC) and each sensor toggled using separate pins. This has the advantage of allowing a lower-specification microcontroller, but removes the ability to sense continuously in all directions. If, however, distance-switching sensors are used, single pins can be used.


5 Equipment

5.1 Flight Control

As the flight controller is key to the ability to perform stable, autonomous flight, a substantial amount of time was spent deciding whether to build a flight controller or buy an off-the-shelf model.

The advantage of building a flight controller is the flexibility to meet the exact specifications of the project. The disadvantage is the substantial amount of time required to design and construct the flight controller, which could instead be spent on creating advanced navigation functionality. For this reason, several off-the-shelf flight controllers were considered using the following factors:

Ease of integration and modification: number and types of input-output ports, reprogrammability, open source.

Performance: flight history and community opinions, stability, etc.

Cost

Several contenders were considered:

Wolferl 3.1 UAVP - Universal Aerial Video Platform

The UAVP [13] is an international open source project to create a reasonably priced quadcopter. The flight controller is called Wolferl. Wolferl utilises three gyros, a 3-axis linear accelerometer, a compass sensor and a barometric sensor. The addition of a compass sensor is an advantage over competing flight controllers, but there are some disadvantages:

A populated and tested 'Universal Aerial Video Platform' V3.1x Flight Controller Board can be purchased for £265 [14], which is expensive.

An 8-bit PIC16F876 is the processor for the unit and is already heavily utilised, which would make it difficult to add functionality to the flight controller.

These disadvantages made it unsuitable for the project.


Paparazzi

Paparazzi is an open source hardware and software platform for autonomous aircraft [15]. It is designed for aeroplanes, but there is evidence that it has been adapted for a quadcopter [16]. The Paparazzi control board is called Tiny [17] and incorporates a GPS module and a powerful LPC2148 microcontroller. It is available preassembled with the GPS module, but without gyros, accelerometers and barometric sensors (an IMU), for £160 [18]. To use the Tiny an IMU would be required, which is likely to raise the total price to an unacceptable level. The GPS integration is advantageous, but there is little example software for quadcopters available, which means the software would have to be written from scratch. The high price and the lack of skeleton software made Paparazzi unattractive.

MikroKopter

The MikroKopter flight controller (MKFC) [19] is an open source flight controller which is part of the MikroKopter project. The MikroKopter project includes frame designs, a flight controller, electronic speed controllers, a magnetometer and a GPS-based navigation board, though the latter two are not open source. The MKFC was selected for the SUAVE project for the following reasons:

The MKFC is an open source software project which allows the code to be changed and adapted to suit the needs of this project.

A high quality software application is available (Flight Tool) for debugging, testing and changing the flight parameters and settings.

A strong and enthusiastic community exists [20] which helped ensure the MKFC met our specifications.

The MKFC has several synchronous and asynchronous input-output ports which allow interfacing with external hardware.

The MKFC costs £170 [21], which is cheaper than other flight controllers, especially considering the high performance evident from various videos and user comments on the internet [22].


5.2 Navigation System Equipment

5.2.1 Central Processing Unit

A central processing unit is required to perform the navigation calculations and control the navigation peripherals detailed in Section 4.2. The Freescale Coldfire MCF5282CVF66 microcontroller (MCU) was selected for the following reasons:

The MCU has a similar specification to the LPC2148 used in the Tiny flight controller, which performs autonomous navigation (Section 5.1): 32-bit core, 66 MHz clock speed, 512 kB ROM and 64 kB RAM.

The MCU also has the input-output ports required for interfacing with the other systems and peripherals: 1 x SPI (synchronous serial port), 3 x SCI (asynchronous serial port) and 1 x I²C (Inter-Integrated Circuit) port.

A development board was available within the department.

5.2.2 GPS Module

Several GPS modules were considered, including those used in other autonomous vehicle projects such as the LEA-4P [16], but these proved prohibitively expensive despite good performance. After researching alternatives, the RF Solutions GPS-320FW was selected [23], due to its excellent specification, integrated ceramic antenna and asynchronous TTL-level serial interface, all at a low price (£24) [24]. It features -155 dBm tracking sensitivity, 5 m CEP accuracy (positional error) and a 1 Hz update rate, which make it suitable for the SUAVE project.

5.2.3 Magnetometer

The CMPS-03 magnetic compass [25] was selected as it was used successfully in a robotics project last year [26]. It can be interfaced with using I²C or by capturing and decoding a pulse-width-modulated signal, giving flexibility in the system. The module also preprocesses its data and outputs the current heading as a number of degrees from north (0.0° to 359.9°), which simplifies the coding required on the navigation processor.
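The degrees-from-north reading still has to be turned into a steering error by the navigation processor. A small wrap-around helper of the following kind (our own sketch, not CMPS-03 code) gives the signed shortest rotation towards a desired bearing:

```c
/* Signed shortest rotation, in degrees, from the current compass heading to
 * a target bearing. Result is in (-180, 180]; positive = turn clockwise. */
float turn_error_deg(float heading_deg, float target_deg)
{
    float e = target_deg - heading_deg;
    while (e > 180.0f)   e -= 360.0f;
    while (e <= -180.0f) e += 360.0f;
    return e;
}
```

Without the wrap-around, a heading of 350° and a target of 10° would produce a 340° anticlockwise turn instead of the correct 20° clockwise one.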


5.3 Communications System

The hardware platforms within the project were chosen with the knowledge that they would be connected together to work as a whole in the final implementation. Once connected, the hardware will have the ability to capture video (camera), process the information (Freescale Coldfire MCF5282CVF66) and transmit it wirelessly (GPRS modem or WiFi module).

5.3.1 Freescale Coldfire Microcontroller

The choice of processing device had to fulfil the requirements of both the communications and other aspects of the project. The physical requirements of the communications processing device were significant processing power, a large flash memory, an I²C port, two SCI ports, a TCP/IP stack and a real-time operating system [27]. These are required because receiving video from the camera and transmitting it via the communications hardware must be completed in real time with little delay. The candidates were a Freescale Coldfire microcontroller and a Gumstix board, as both have high processing power.

The features of this microcontroller can be viewed in Figure 16.


Figure 16: Freescale MCF528x features[28]

The Freescale microcontroller board that will be implemented into the quadcopter design is a MCF5282CVF66. This is shown in Figure 17.



Figure 17: Freescale Coldfire Microcontroller and Development board

The decision was made in favour of the Freescale Coldfire microcontroller. Due to the department's strong relationship with Freescale, these were already stored within the university for use in student projects. The MCF5282CVF66 has an excellent learning and prototyping environment in the form of Freescale CodeWarrior. The choice of microcontroller was also made quick and easy by the Freescale C & I Microselector tool.

5.3.2 Fargo Maestro-100 GPRS Modem

The requirement of the GPRS modem is the ability to convert the RS232 signal from the microcontroller into a wireless transmission in the required format, and vice versa. The data rates involved have to be high enough to handle a constant transmission of video data, the modem must accept RS232 data, and it should be as small and light as possible.

The GPRS modems that will be used are Fargo Maestro 100s, as these fit the purpose required for the project and have already been bought by the university. The modem is shown in Figure 18.


Figure 18: Fargo Maestro-100 GPRS modems [29]


The Fargo Maestro-100 has the following features[30]:

Dual-band GSM 900/1800 MHz.

Support for data, SMS, voice and fax.

Max power output: 2 W (900 MHz), 1 W (1800 MHz).

GPRS Class B, multislot class 10 (4Rx+1Tx or 3Rx+2Tx) at maximum speed.

SIM Toolkit Class 2.

AT command set.

And the following interfaces:

SIM Holder.

15-pin Sub D connector (serial and audio connection).

4-pin power supply connector.

External Antenna.

5.3.3 S103 WiFi Module

The WiFi module that will be used is the S103 WLAN Compact Serial Module, as it is small and light and should automatically perform all the conversions required to change the RS232 output from the microcontroller into a WiFi data transmission, and vice versa. The module is shown in Figure 19.


Figure 19: S103 WiFi module [31]


The S103 WLAN Compact Serial Module has the following features[31]:

UDP version: configurable baud rate (9600, 38400, 57600, 115200 bps).

TCP version: configurable baud rate (9600, 38400, 57600 bps).

Full-duplex (serial data in and out).

Hardware flow control.

Firmware upgrade through RS232 port.

Easy set-up via console or web page.

LED status and activity indicators for easy installation, monitoring and diagnostics.

Link to WLAN Access Point (infrastructure mode).

5.4 Surveillance Module and Compression

Deciding to use a camera introduced an interesting set of problems. Firstly, not only does the camera need to be light and robust, but it must also have a suitable viewing range. The biggest challenge will be encoding the video data so that it can be transmitted back to a base station over a WiFi or GPRS link. The live video quality must be kept as high as possible, and the latency with which the base station receives and decodes the data must be as low as possible.

There are numerous cheap cameras and webcams on the market, mostly of the CMOS (Complementary Metal Oxide Semiconductor) and CCD (Charge-Coupled Device) types. This design helps to keep the cost of these cameras low; however, the majority have only a very limited ability to capture video from a distance. A second problem is that the video data must be compressed so that it can be transmitted over a low-rate data link.

A second approach is to purchase an integrated system which combines video capture and compression. One such device is the CAM-VGA 100 module offered by Round Solutions [32]. It is an integrated serial camera board module that can be attached to a wireless host, acting as a video camera or a JPEG-compressed still camera. It provides an RS232 serial interface and a JPEG compression engine, making it a low-cost, low-power camera module for use with GPRS/GSM modules. The main application of such a device is in mobile phones and PDAs, and thus the picture quality and viewing


range will be limited.

If a ready-made system such as the one described above were purchased, the data rates would be constrained to those designed by the manufacturer. The solution which offers the most flexibility would be to design our own camera board module, with our own camera and a video codec. This would involve buying a CMOS camera board and implementing a video codec. A number of codecs fit our purpose, including: MPEG4, MJPEG, the Strathclyde Compression Transform (SCT), H.263 and H.264. MPEG4, H.263 and H.264 are all more complex to implement than MJPEG but offer superior compression. Not enough is known about the SCT, and more investigation will be needed before a final decision is made on which compression technique to use.

5.5 Object Detection

When using the sensing approach detailed in Section 3.4, an imaginary protective barrier could be formed around the quadcopter; if an object is detected within this barrier, evasive action is taken. However, this leads to a problem: if the vehicle is required to navigate a corridor narrower than the barrier surrounding it, the system would not allow the vehicle to pass and would force it to find a different route. In this case, a method that allows the exact distance to be detected would be advantageous. What needs to be considered is the complexity of the navigation system: does the system need to adapt its trajectory on the fly, or does it stop and re-evaluate its options? Initially it was decided that non-adaptive navigation would be the easier route to take; if time is sufficient, adaptive navigation will be implemented.

As mentioned in Section 3.4, ultrasound or infrared could be used to sense whether an object is within range of the quadcopter. Ultrasound detectors appear to need a highly reflective surface, and also have a poor polar response to flat surfaces, as shown in Figure 20.

Use of an imaging device is, by default, a far more complex option; however, it can return much more usable results and can allow the navigation module to be more efficient in its evasive actions. When an object is encountered and the vehicle starts to rotate to avoid it, constant information can be passed to the navigation module to report whether the object has been cleared. However, imaging also has significant limitations: one camera requires a substantial amount of processing power (compared with ultrasound or infrared sensors), and object detection is restricted to one direction, meaning the quadcopter must rotate to see potentially viable routes. It is for these reasons that distance



Figure 20: Polar response of a Parallax Ping Sensor, tested with a 12” square piece of white card[33]

sensing will be used in all directions.

During development, a Microchip PIC microcontroller will be used for the Object Detection System, to minimise the complexity of having multiple systems running on one processor. In order for the Object Detection System to communicate with the navigation module, the PIC selected must have I²C or RS232 capability. The microcontroller must also have room for multiple inputs; for these reasons, the Microchip PIC16F876 was chosen [34]. A PIC microcontroller has the advantage over a conventional microcontroller development board (like the Freescale Coldfire board being used in the Communications and Navigation modules) that it does not carry any unnecessary components (such as DACs or switches) and has a very low power consumption.


6 Progress to date

This section details the work accomplished as of 17th December 2008. The work falls into four areas: selection and construction of the quadcopter, selection and testing of navigation equipment, selection of communications equipment, and selection and testing of object detection equipment. Each of these is detailed below.

6.1 Construction and Test Flights

The complete MK set was purchased and a kit containing all the components shown in Figure 21 was received. Building the frame was the first step of construction. The MK came with preassembled control boards and brushless motor controllers. All the surface mounted components were already soldered on.


Figure 21: MikroKopter kit before construction

However, several bulkier parts, such as the electrolytic capacitors, switches, barometric sensor and header connectors, still required soldering. The wiring assembly used to carry the power and control signals was prepared. All the parts were then connected together and a successful dry test of the electronics was undertaken. The separate components were then brought together and mounted on the frame. The propellers were fitted and a pre-flight check was carried out. Using the MK Tool, the RC transmitter channels were configured to match the MK receiver, and a successful maiden test flight was made [35]. Figure 22 shows the MK after construction.



Figure 22: Fully constructed MikroKopter

6.2 Selection of Navigation Equipment and Testing

Two core elements of the navigation and control system had to be selected: the standard flight controller, which performs basic stabilisation of the quadcopter, and the advanced navigation system, which will add the functionality required to meet the objectives in Section 1.1.

As the flight controller is key to the ability to perform stable, autonomous flight, a substantial amount of time was spent deciding whether to build a flight controller or buy an off-the-shelf model.

To ensure the specifications of the GPS unit, magnetometer and microcontroller are suitable to meet the project objectives, several modules were considered, as described in Section 5.2.

6.2.1 Component Testing

Currently only the CMPS-03 magnetometer has been tested. It was tested by interfacing it via I²C with a Softec HCS12E128 starter kit. This starter kit was used due to the availability of I²C on the microcontroller, the author's familiarity with the board, and the opportunity to reuse I²C code from a previous project. A Freescale PBMCUSLK development board with an LCD module was used as an output device to display the compass heading. The MC9S12E128 drove the LCD module via the Serial Peripheral Interface


(SPI). Jump leads were constructed to connect the required pins on the MC9S12E128 to the CMPS-03 and the PBMCUSLK. A circuit diagram is shown in Figure 23.


Figure 23: Magnetometer test rig schematic

The magnetometer functioned as expected, and a three-digit number representing degrees from north was displayed on the LCD. A possible problem emerged during testing: when the compass is tilted, an error occurs in the heading, and the further the magnetometer is tilted from horizontal, the larger the error. This will have to be addressed, possibly by adjusting the value using the gyroscope readings.

6.3 Communications