
Report on Improving Positioning Algorithm During Landing

Date: 12th October 2023.

Summary: This report explores the use of MAVLink and visual feedback to achieve precise drone landings over a designated marker. Initial challenges, such as an incorrect reference frame and command inconsistencies, were addressed.

Results: Videos from SN2's point of view



Objective:

The core objective is to align the drone directly above a designated marker, ensuring that the

marker remains centrally positioned within the drone's camera view before landing. MAVLink

serves as the primary communication protocol in achieving this task.

MAVLink Implementation:

Connection: A link is established with the drone via mavutil.mavlink_connection.

Telemetry: Real-time drone states, such as mode and altitude, are retrieved.
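
For illustration, a minimal sketch of these two steps using pymavlink (the connection string is an assumption, not the script's actual value):

    from pymavlink import mavutil

    # Establish the MAVLink connection (address/port are illustrative)
    master = mavutil.mavlink_connection('udpin:0.0.0.0:14550')
    master.wait_heartbeat()  # block until the autopilot is heard

    # Retrieve real-time state: flight mode and relative altitude
    msg = master.recv_match(type='GLOBAL_POSITION_INT', blocking=True)
    altitude_m = msg.relative_alt / 1000.0  # relative_alt is reported in millimetres
    print(f"mode={master.flightmode}, altitude={altitude_m:.1f} m")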

Sending Commands: The script uses MAVLink to send various commands to the drone:

● mission_item_send: Uploads waypoints for the drone to follow.

● command_long_send: Sends miscellaneous commands, such as taking off, starting a mission, or arming the drone.

● set_position_target_local_ned_message: Sends position or velocity adjustment commands. (NED stands for North, East, Down.)

These commands provide direct control over the drone's actions, guiding its movements during the landing and position-correction process, as sketched below.
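
Continuing the connection sketch above, two of these commands might be issued as follows; the arm flag and takeoff altitude are illustrative values, not the script's actual parameters:

    # Arm the drone via COMMAND_LONG (param1 = 1 means arm)
    master.mav.command_long_send(
        master.target_system, master.target_component,
        mavutil.mavlink.MAV_CMD_COMPONENT_ARM_DISARM,
        0,                     # confirmation
        1, 0, 0, 0, 0, 0, 0)

    # Take off to an assumed altitude of 10 m (param7 = target altitude)
    master.mav.command_long_send(
        master.target_system, master.target_component,
        mavutil.mavlink.MAV_CMD_NAV_TAKEOFF,
        0,
        0, 0, 0, 0, 0, 0, 10)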

Purpose: MAVLink ensures real-time drone control, leveraging visual feedback from the camera

to achieve precise landing or positioning.

Methodology:

Marker Detection:

The script utilizes OpenCV's ArUco module to detect markers in the camera's frame. The

aruco.detectMarkers() function identifies the marker's corners and its ID.
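
A minimal detection sketch, assuming the older contrib-style OpenCV ArUco API and a 4x4 marker dictionary (the script's actual dictionary is not stated):

    import cv2
    import cv2.aruco as aruco

    # Marker dictionary and detector settings (dictionary choice is an assumption;
    # newer OpenCV versions expose this through aruco.ArucoDetector instead)
    aruco_dict = aruco.getPredefinedDictionary(aruco.DICT_4X4_50)
    parameters = aruco.DetectorParameters_create()

    # Grab one frame from the drone's camera (camera index is an assumption)
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()

    # Detect markers: returns corner arrays and the matching marker IDs
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, rejected = aruco.detectMarkers(gray, aruco_dict, parameters=parameters)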


Computing Error:

The script calculates the center of the detected marker (cX, cY) and compares it with

the center of the camera's frame (centerX, centerY). The difference between the

marker's center and the frame's center provides an error metric which indicates how

off-center the marker is.
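
Continuing that sketch, the error computation might look like this (array shapes follow OpenCV's ArUco output):

    # Centre of the camera frame
    h, w = frame.shape[:2]
    centerX, centerY = w / 2, h / 2

    # corners[0] has shape (1, 4, 2): the four corner points of the first
    # detected marker; averaging them gives the marker centre (cX, cY)
    cX, cY = corners[0][0].mean(axis=0)

    errorX = cX - centerX  # positive -> marker is right of centre
    errorY = cY - centerY  # positive -> marker is below centre (image y grows downward)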

Angular Deviation:

This step calculates the angular deviation of the marker from the center of the camera's field of view, translating the pixel-based error into an angular error that guides how much the drone should adjust its position.
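
The original formula is not reproduced here; a plausible form, assuming a simple linear pixel-to-angle mapping based on the camera's field of view (the FOV values below are assumptions), is:

    import math

    HFOV = math.radians(62.2)  # horizontal field of view (assumed lens value)
    VFOV = math.radians(48.8)  # vertical field of view (assumed lens value)

    # Linear mapping from pixel error to angular error (radians)
    x_ang = errorX * (HFOV / w)
    y_ang = errorY * (VFOV / h)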

Position Correction:

The script uses the send_body_offset_ned_command function to correct the drone's

position based on the calculated angular error. The function is designed to work in two

modes:

● Velocity Mode: The drone adjusts its speed proportional to the error.

● Displacement Mode: The drone moves a fixed distance based on the

error, and then stops.

The mode is selected by a type mask (0b111111000111 for velocity mode, 0b110111111000 for displacement mode). Each set bit tells the autopilot to ignore the corresponding field, which is how the mask specifies whether the input values are interpreted as velocities or distances.
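
A hedged sketch of what send_body_offset_ned_command could look like on top of pymavlink; the function name matches the report, but the body below is an illustrative assumption:

    def send_body_offset_ned_command(master, x, y, z, velocity=True):
        # In the type mask, a SET bit tells the autopilot to IGNORE that field:
        # velocity mode ignores position and acceleration fields,
        # displacement mode ignores velocity and acceleration fields.
        type_mask = 0b111111000111 if velocity else 0b110111111000
        vx, vy, vz = (x, y, z) if velocity else (0, 0, 0)
        px, py, pz = (0, 0, 0) if velocity else (x, y, z)
        master.mav.set_position_target_local_ned_send(
            0,                                  # time_boot_ms (0 = not used)
            master.target_system, master.target_component,
            mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED,
            type_mask,
            px, py, pz,                         # offsets in metres (displacement mode)
            vx, vy, vz,                         # velocities in m/s (velocity mode)
            0, 0, 0,                            # accelerations (ignored)
            0, 0)                               # yaw, yaw rate (ignored)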

Continuous Correction:

The script continuously captures frames from the drone's camera, detects the marker,

computes the error, and sends correction commands. If the marker is detected,
corrections are made based on the marker's position. If the marker isn't detected, the

drone is commanded to stay in place (0 velocity or 0 distance move). The magnitude of

the correction is determined by the deviation of the marker from the center. Larger

deviations result in larger corrections.
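
Putting the earlier sketches together, the correction loop might look like the following; the gain, loop rate, and the mapping from image axes to body axes are assumptions that depend on camera mounting:

    import time

    GAIN = 0.5  # proportional gain: larger deviation -> larger correction (assumed value)

    while True:
        ok, frame = cap.read()
        if not ok:
            continue
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        corners, ids, _ = aruco.detectMarkers(gray, aruco_dict, parameters=parameters)

        if ids is not None:
            cX, cY = corners[0][0].mean(axis=0)
            x_ang = (cX - centerX) * (HFOV / w)
            y_ang = (cY - centerY) * (VFOV / h)
            # Assumed axis mapping: image "down" -> body forward, image "right" -> body right
            send_body_offset_ned_command(master, GAIN * y_ang, GAIN * x_ang, 0)
        else:
            # Marker lost: command zero velocity so the drone holds position
            send_body_offset_ned_command(master, 0, 0, 0)
        time.sleep(0.1)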

Landing Decision:

Although not explicitly mentioned in the provided script, once the marker's center aligns

closely with the camera's center (i.e., the error is below a certain threshold), the drone

can initiate the landing process.
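
Since this step is not in the provided script, the following is only a sketch of such a trigger; the threshold value is an assumption:

    LAND_THRESHOLD = math.radians(2)  # "centred enough" within ~2 degrees (assumed)

    if abs(x_ang) < LAND_THRESHOLD and abs(y_ang) < LAND_THRESHOLD:
        master.mav.command_long_send(
            master.target_system, master.target_component,
            mavutil.mavlink.MAV_CMD_NAV_LAND,
            0,
            0, 0, 0, 0, 0, 0, 0)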

Comparing Current and Previous Implementations for Improving SN2 Positioning:

Figure 1: Body frame and inertial frame.

The figure above shows the two coordinate systems: the body frame, whose coordinates move together with the quadcopter, and the inertial frame, which provides the fixed reference coordinates for the quadcopter. Because the inertial frame (or set point) is fixed and immovable, it serves as a stable reference for the quadcopter.

Previous Implementation:

Frame of Reference: Initially, the MAV_FRAME_LOCAL_NED reference frame was used: a local NED frame (x: North, y: East, z: Down) with its origin fixed relative to the earth.

In this case, the calculated x_ang and y_ang values are sent to the send_body_offset_ned_command() function, but the drone's movement is executed relative to the earth-fixed NED frame (the reference frame), so the drone's orientation is ignored.

Observation:

This caused the drone to yaw unnecessarily and produced inconsistent movements as soon as the marker was detected and x_ang and y_ang were calculated.

Modified Implementation:

Adjusted Frame of Reference: Transitioned to MAV_FRAME_BODY_OFFSET_NED, which is fixed to the drone itself and rotates with the drone's orientation. This shift noticeably improved the drone's correction performance.
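
In code terms, this change amounts to swapping the coordinate_frame constant passed with the SET_POSITION_TARGET_LOCAL_NED message, roughly:

    # Before: offsets interpreted in the earth-fixed local NED frame
    frame_id = mavutil.mavlink.MAV_FRAME_LOCAL_NED
    # After: offsets interpreted in the drone's body frame, rotating with it
    frame_id = mavutil.mavlink.MAV_FRAME_BODY_OFFSET_NED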

Observation:

After implementing these changes, we noted a significant and consistent improvement in correction performance. One behavior to be aware of is that SN2 tends to overshoot at low altitudes (below 5 m) because the marker appears larger due to proximity.

Some other changes made:


Error Rectification: Corrected the directional errors observed during keyboard control.

Eliminating Error Threshold: After removing the error threshold for direction commands, we saw improved correction performance. An explicit threshold is unnecessary because, at very small error values, both x_ang and y_ang already approach zero, so the commanded correction naturally vanishes.

Conclusion:

We have made significant improvements in achieving the desired drone positioning. By adjusting the frame of reference and rectifying command errors, the drone's positioning accuracy has improved substantially. Moving forward, consistent monitoring and iterative testing will be critical to refining performance further.
