Summary: This report explores the use of MAVLink and visual feedback to
achieve precise drone landings over a designated marker. Initial challenges, such as inconsistent movements under the earth-fixed NED reference frame, were resolved by switching to a body-fixed frame and removing the error threshold on direction commands.
Objective:
The core objective is to align the drone directly above a designated marker, ensuring that the
marker remains centrally positioned within the drone's camera view before landing. MAVLink serves as the command-and-telemetry link used throughout this process.
MAVLink Implementation:
Telemetry: Real-time drone states, such as mode and altitude, are retrieved.
Sending Commands: The script uses MAVLink to send various commands to the drone. These commands provide direct control over the drone's actions, guiding its movements toward the marker.
Purpose: MAVLink ensures real-time drone control, leveraging visual feedback from the camera to keep the marker centered before landing.
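The original script is not reproduced here, but the telemetry-then-command pattern it describes can be sketched with a stand-in link object. The `FakeLink` class, its method names, and the telemetry values below are illustrative assumptions, not the report's actual code; a real implementation would use a MAVLink connection (e.g. via pymavlink) in its place.

```python
# Illustrative sketch of the telemetry/command flow described above.
# FakeLink stands in for a real MAVLink connection; it records commands
# instead of transmitting them, so the flow can run without a drone.

class FakeLink:
    """Records commands instead of sending them over a real MAVLink link."""
    def __init__(self):
        self.sent = []
        self.mode = "GUIDED"       # assumed telemetry values
        self.altitude_m = 10.0

    def get_state(self):
        # Real code would parse incoming MAVLink telemetry messages.
        return {"mode": self.mode, "altitude_m": self.altitude_m}

    def send_velocity(self, vx, vy, vz):
        # Real code would send a velocity setpoint over MAVLink.
        self.sent.append(("velocity", vx, vy, vz))


def correct_once(link, vx, vy):
    """Read telemetry, then send one horizontal correction command."""
    state = link.get_state()
    if state["mode"] == "GUIDED":      # only command while in guided mode
        link.send_velocity(vx, vy, 0.0)
    return state


link = FakeLink()
state = correct_once(link, 0.5, -0.2)
print(state["altitude_m"])   # 10.0
print(link.sent)             # [('velocity', 0.5, -0.2, 0.0)]
```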
Methodology:
Marker Detection:
The script utilizes OpenCV's ArUco module to detect markers in the camera's frame.
The script calculates the center of the detected marker (cX, cY) and compares it with
the center of the camera's frame (centerX, centerY). The difference between the
marker's center and the frame's center provides an error metric which indicates how far, and in which direction, the drone must move to center the marker.
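The error computation can be sketched without the full OpenCV pipeline. Here the marker's four corner points (as ArUco detection would return them) and the 640x480 frame size are assumed for illustration; the centroid of the corners gives (cX, cY), and its offset from the frame center is the error metric:

```python
# Pixel-error computation described above: marker centroid vs. frame center.
# The corner coordinates and 640x480 frame size are assumed values.

def marker_center(corners):
    """Centroid (cX, cY) of the four ArUco corner points."""
    cX = sum(x for x, _ in corners) / len(corners)
    cY = sum(y for _, y in corners) / len(corners)
    return cX, cY

def pixel_error(corners, frame_w, frame_h):
    """Offset of the marker center from the frame center, in pixels."""
    cX, cY = marker_center(corners)
    centerX, centerY = frame_w / 2, frame_h / 2
    return cX - centerX, cY - centerY

# A marker whose corners straddle (400, 300) in a 640x480 frame:
corners = [(380, 280), (420, 280), (420, 320), (380, 320)]
print(pixel_error(corners, 640, 480))   # (80.0, 60.0)
```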
Angular Deviation:
The angular deviation of the marker from the center of the camera's field of view is obtained by scaling the pixel error by the camera's field of view per pixel:

x_ang = (cX - centerX) * (horizontal FOV / frame width)
y_ang = (cY - centerY) * (vertical FOV / frame height)

This translates the pixel-based error into an angular error, guiding the drone's corrective movements.
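The pixel-to-angle conversion can be sketched as follows. The 62.2-degree horizontal and 48.8-degree vertical field-of-view values are illustrative assumptions (they match a common Raspberry Pi camera module), not values from the report:

```python
import math

# Pixel-to-angle conversion for the angular deviation described above.
# Frame size and FOV defaults are assumed values for illustration.

def angular_error(cX, cY, frame_w=640, frame_h=480,
                  hfov_deg=62.2, vfov_deg=48.8):
    """Return (x_ang, y_ang) in radians from a pixel-space marker center."""
    x_ang = (cX - frame_w / 2) * math.radians(hfov_deg) / frame_w
    y_ang = (cY - frame_h / 2) * math.radians(vfov_deg) / frame_h
    return x_ang, y_ang

x_ang, y_ang = angular_error(400, 300)
print(math.degrees(x_ang))   # ~7.78 degrees (80 px of a 640 px frame)
```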
Position Correction:
A correction function adjusts the drone's position based on the calculated angular error. The function is designed to work in two modes:
● Velocity Mode: The drone adjusts its speed proportional to the error.
● Position Mode: The drone moves by a position offset proportional to the error.
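Velocity mode can be sketched as a proportional controller. The gain, the speed clamp, and the mapping of image axes to body axes below are all assumed tuning choices, not values from the report:

```python
# Proportional velocity command from angular error ("Velocity Mode").
# GAIN, MAX_SPEED, and the axis mapping are assumptions for illustration.

GAIN = 2.0        # m/s per radian of angular error (assumed)
MAX_SPEED = 1.0   # clamp to keep corrections gentle (assumed)

def clamp(v, limit):
    return max(-limit, min(limit, v))

def velocity_command(x_ang, y_ang):
    """Map angular error to (forward, right) body velocities."""
    vx = clamp(GAIN * y_ang, MAX_SPEED)   # vertical image error -> fwd/back
    vy = clamp(GAIN * x_ang, MAX_SPEED)   # horizontal image error -> left/right
    return vx, vy

print(velocity_command(0.10, -0.05))   # (-0.1, 0.2)
```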
Continuous Correction:
The script continuously captures frames from the drone's camera, detects the marker,
computes the error, and sends correction commands. If the marker is detected,
corrections are made based on the marker's position; if the marker isn't detected, no correction command is sent for that frame. The magnitude of the correction is determined by the deviation of the marker from the center: larger deviations produce larger corrections.
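The correction loop can be sketched with the camera and detector injected as plain functions, so it runs without hardware. Real code would pull frames from the drone's camera and run ArUco detection; everything below (the `detect` and `send_command` stand-ins, the fake detections) is an illustrative assumption:

```python
# Continuous-correction loop described above, with fake frame sources so it
# runs without a camera. detect() returning None models "marker not found".

def run_corrections(frames, detect, send_command, frame_w=640, frame_h=480):
    """For each frame, send a correction only when the marker is detected."""
    sent = []
    for frame in frames:
        center = detect(frame)             # (cX, cY) or None
        if center is None:
            continue                       # no marker: skip this frame
        ex = center[0] - frame_w / 2       # pixel error, x
        ey = center[1] - frame_h / 2       # pixel error, y
        sent.append(send_command(ex, ey))
    return sent

# Fake detector: each "frame" is just the marker position (or None).
detections = [(400, 300), None, (330, 250)]
result = run_corrections(detections, detect=lambda f: f,
                         send_command=lambda ex, ey: (ex, ey))
print(result)   # [(80.0, 60.0), (10.0, 10.0)]
```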
Landing Decision:
Although not explicitly mentioned in the provided script, once the marker's center aligns
closely with the camera's center (i.e., the error is below a certain threshold), the drone can initiate its landing sequence.
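The landing decision can be sketched as a threshold test on the pixel error. The 20-pixel threshold is an assumed value, not one stated in the report:

```python
# Landing decision: land once the marker center is close enough to the
# frame center. The threshold value is an illustrative assumption.

LAND_THRESHOLD_PX = 20.0

def should_land(ex, ey, threshold=LAND_THRESHOLD_PX):
    """True once the marker center is within `threshold` px of frame center."""
    return (ex * ex + ey * ey) ** 0.5 < threshold

print(should_land(5.0, 5.0))    # True  (error ~7.1 px)
print(should_land(80.0, 60.0))  # False (error 100 px)
```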
The above figure shows the two coordinate systems: the body frame, whose coordinates move together with the quadcopter, and the inertial frame, which provides the fixed reference coordinates for the quadcopter. The inertial frame, or set point, is fixed relative to the earth and does not move with the vehicle.
Previous Implementation:
Frame of Reference: Initially, the MAV_FRAME_LOCAL_NED reference frame was used: a NED local frame (x: North, y: East, z: Down) with its origin fixed relative to the earth.
In this case, the x_ang and y_ang calculation is sent to the send_body_offset_ned_command() function, but the drone's movement is made relative to the earth-fixed NED reference frame. So the commanded correction matches the camera-relative error only when the drone happens to be facing North.
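The mismatch can be illustrated with a yaw rotation: the same camera-relative (body-frame) correction maps to different earth-fixed NED velocities depending on heading. The 90-degree yaw below is an arbitrary example:

```python
import math

# Body-frame correction rotated into the earth-fixed NED frame.
# Shows why an earth-fixed command only matches the camera error at yaw = 0.

def body_to_ned(vx_body, vy_body, yaw_rad):
    """Rotate a body-frame (forward, right) velocity into (North, East)."""
    vn = vx_body * math.cos(yaw_rad) - vy_body * math.sin(yaw_rad)
    ve = vx_body * math.sin(yaw_rad) + vy_body * math.cos(yaw_rad)
    return vn, ve

# "Move 1 m/s forward" means North when facing North...
print(body_to_ned(1.0, 0.0, 0.0))                 # (1.0, 0.0)
# ...but East once the drone has yawed 90 degrees.
vn, ve = body_to_ned(1.0, 0.0, math.radians(90))
print(round(vn, 6), round(ve, 6))                 # 0.0 1.0
```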
Observation
This causes the drone to yaw unnecessarily and gives inconsistent movements as soon as the drone's heading deviates from North.
Modified Implementation:
The reference frame was switched to MAV_FRAME_BODY_OFFSET_NED, which is fixed to the drone itself and adjusts orientation based on the drone's rotation. This shift resulted in corrections that consistently match the camera view, regardless of heading.
Observation:
The body-fixed frame improved correction performance, which remained consistent across tests. One behavior to be aware of is that SN2 tends to overshoot at low altitudes (less than 5 m) because the marker appears larger due to proximity.
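The proximity effect follows from simple geometry: the same lateral offset subtends a larger angle (hence a larger error and a larger correction) at low altitude. The offset and altitude values below are arbitrary examples:

```python
import math

# Angular error for a fixed 1 m lateral offset at different altitudes.
# Illustrates why corrections grow, and can overshoot, close to the ground.

def offset_angle_deg(lateral_m, altitude_m):
    """Angle between straight down and the marker, in degrees."""
    return math.degrees(math.atan2(lateral_m, altitude_m))

print(round(offset_angle_deg(1.0, 10.0), 1))   # 5.7  at 10 m
print(round(offset_angle_deg(1.0, 3.0), 1))    # 18.4 at 3 m
```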
Eliminating Error Threshold: After removing the error threshold for direction commands, we saw improved correction performance. This is because, at very small error values, both x_ang and y_ang previously fell below the threshold, so no corrective command was issued and a small residual offset went uncorrected.
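The effect can be sketched by comparing thresholded and unthresholded corrections near zero error. The 0.02 rad threshold and the gain are assumed values:

```python
# With a dead-band threshold, small errors produce no command at all,
# so a small residual offset is never corrected. Values are assumed.

THRESHOLD = 0.02   # rad; errors below this were previously ignored
GAIN = 2.0         # m/s per rad

def command_with_threshold(err):
    return GAIN * err if abs(err) >= THRESHOLD else 0.0

def command_without_threshold(err):
    return GAIN * err

small_error = 0.01   # marker nearly centered, but not quite
print(command_with_threshold(small_error))     # 0.0  -> centering never finishes
print(command_without_threshold(small_error))  # 0.02 -> gentle final correction
```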
Conclusion:
By adjusting the frame of reference and rectifying command errors, the drone's positioning accuracy has improved substantially. Moving forward, consistent monitoring and iterative testing will be critical to maintaining reliable precision-landing performance.