
Visual-Inertial Navigation for a Camera-Equipped 25 g Nano-Quadrotor

Oliver Dunkley, Jakob Engel, Jürgen Sturm and Daniel Cremers
Technical University Munich

Abstract— We present a 25 g nano-quadrotor equipped with
a micro PAL-camera and wireless video transmitter, with which
we demonstrate autonomous hovering and figure flying using a
visual-inertial SLAM system running on a ground-based laptop.
To our knowledge this is the lightest quadrotor capable of
visual-inertial navigation with off-board processing. Further, we
show autonomous flight with external pose estimation, using
either a motion capture system or an RGB-D camera. The
hardware is low-cost, robust, easily accessible and has freely
available detailed specifications. We release all code in the
form of an open-source ROS package to stimulate and facilitate
further research into using nano-quadrotors as visual-inertial
based autonomous platforms.

I. INTRODUCTION
Research interest in autonomous micro-aerial vehicles
(MAVs) has grown rapidly in recent years. On the
one hand, we have seen aggressive flight manoeuvres using
external tracking systems (e.g. Vicon), which are however
limited to a lab environment. On the other hand, there have
been significant advances regarding autonomous flight in
GPS-denied environments using cameras as the main sensors,
both with on-board [10], [8] and off-board processing [5].
A. Motivation & Related Work
Even though current mono-vision controlled MAVs are
already much smaller than platforms based on RGB-D
cameras, most models are still too large for indoor flight
in a tightly constrained environment (e.g., a small office)
or too dangerous to fly close to people. In particular, the
weight of the platform plays a critical role, as it directly
correlates with the danger the MAV poses: even with a flight
weight of only 500 g, a crash can do significant damage to
the MAV itself as well as to its environment and, in particular,
to human bystanders. Furthermore, size and weight influence
the transportability, the usability e.g. as a flying scout for
a ground-based robot, the potential to be deployed in large
swarms and – in particular if mass-produced – the per-unit
cost. While most research regarding MAVs focuses on – in
the context of this paper – relatively large platforms, there are
some exceptions: In [6], a visual SLAM system is used on-board a quadrotor weighing around 100 g. In [9], [1], nano-MAVs (36 g and 46 g) are stabilized using optical flow. Most
notably, in [2], a 20 g flapping-wing MAV is presented using
on-board stereo vision for obstacle avoidance and navigation.
The research leading to these results was supported by the BMBF within
the Software Campus (NanoVis), No. 01IS12057. We thank Bitcraze AB
for their ongoing cooperation.

Fig. 1: Crazyflie nano-quadrotor with attached camera. 25 g
total weight, 3.5 minutes of flight time and 9 cm across.
B. Contribution
With this paper, we present a 25 g nano-quadrotor
based on the open-source, open-hardware Crazyflie by
Bitcraze (http://www.bitcraze.se/), equipped with an analogue on-board camera and
a wireless video transmitter. Using this platform, we demonstrate
(1) autonomous flight using external pose estimation (Vicon
or Kinect), and (2) autonomous flight with only on-board
sensors, using the on-board camera for visual-inertial pose
estimation.
We release all code required to reproduce the presented
results as an open-source ROS package, including the visual-inertial
SLAM system. In particular, we provide a full-fledged,
easy-to-set-up ROS driver with an intuitive GUI, which
exposes all of the nano-quadrotor's functionality to ROS. With
this we hope to encourage and ease further research into the
field of autonomous nano-quadrotors. More information can
be found in [4].
II. HARDWARE PLATFORM
The platform consists of (1) the Crazyflie Quadrotor Kit,
a mass-produced, open-source and open-hardware €170
quadrotor, and (2) the camera system, consisting of a micro
PAL-camera (720×576, 25 fps, interlaced) with a fisheye lens,
an analogue 2.4 GHz video transmitter, a voltage regulator
and a 3D-printed camera mount, amounting to €80 in total
(Fig. 1).
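
Since the PAL stream is interlaced, each digitalized frame interleaves two fields captured at different instants, which produces combing under motion. As a minimal sketch of one generic way to handle this (line interpolation over a single field; the paper does not specify the exact method its camera node uses), one can keep the even rows and interpolate the odd ones:

    import numpy as np

    def deinterlace_even(frame):
        # Keep the even rows (one field) of a 576-row PAL frame and
        # linearly interpolate the odd rows to suppress combing artefacts.
        out = frame.astype(np.float32)
        out[1:-1:2] = 0.5 * (out[0:-2:2] + out[2::2])
        out[-1] = out[-2]  # bottom row has no neighbour below
        return out.astype(frame.dtype)

    # Example on a synthetic 576x720 grayscale frame:
    frame = np.random.randint(0, 256, (576, 720), dtype=np.uint8)
    print(deinterlace_even(frame).shape)
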
The Flie features a programmable Cortex-M3 MCU, is
equipped with an IMU, barometer and magnetometer, and
communicates over a 2.4 GHz bi-directional link with a
ground-station.

The added camera system is powered by the Flie's battery (reducing flight time to 3.5 minutes) and transmits a PAL video to the ground station, where it is digitalized, amounting to a total delay of roughly 40 ms. The voltage regulator filters out most of the motor-induced noise, but some frames are still corrupted and should be dropped (see Fig. 2). For details on the hardware and assembly, see [4].

III. ROS SOFTWARE INTEGRATION

We provide all software as an open-source ROS package, which includes the following:

(1) The Crazyflie ROS driver, which exposes the entire Crazyflie functionality to a full-fledged, intuitive GUI and the ROS network, allowing for real-time telemetry (delay of around 8 ms), changing parameters on the Flie during runtime, as well as sending control commands.

(2) A camera processing node for efficiently detecting and dropping corrupted frames, de-interlacing images and rectifying them using the FOV model [3] (sketched below).

(3) A Crazyflie detector that uses a commodity RGB-D camera combined with on-board attitude estimation to provide a 6 DOF pose estimate, effectively providing an easy-to-set-up external tracking solution.

(4) A visual-inertial SLAM system loosely based on [7], specifically adapted for the Flie's sensor characteristics. IMU measurements are tightly integrated into the visual SLAM pipeline at an early stage, thus helping to compensate for the comparatively poor image quality of the analogue camera. The absolute scale of the world is set during initialization, either manually or using the barometer.

(5) A PID controller to control the Flie, with gain presets for use together with SLAM or external-tracking based pose estimates (a minimal sketch follows below).
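
To make the rectification step of item (2) concrete: the FOV model of Devernay and Faugeras [3] has a single parameter ω and maps a distorted radius r_d to an undistorted radius r_u = tan(r_d ω) / (2 tan(ω/2)). The sketch below applies this to pixel coordinates; the function name, intrinsics and the value of ω are illustrative assumptions, not values from the released package.

    import numpy as np

    def undistort_points_fov(pts, fx, fy, cx, cy, omega):
        # Undistort pixel coordinates with the FOV model [3]:
        # r_u = tan(r_d * omega) / (2 * tan(omega / 2)).
        # pts: (N, 2) array of distorted pixel coordinates.
        x = (pts[:, 0] - cx) / fx  # normalized image coordinates
        y = (pts[:, 1] - cy) / fy
        r_d = np.hypot(x, y)       # distorted radius
        # Limit of the scale factor as r_d -> 0:
        scale = np.full_like(r_d, omega / (2.0 * np.tan(omega / 2.0)))
        nz = r_d > 1e-8
        scale[nz] = np.tan(r_d[nz] * omega) / (2.0 * np.tan(omega / 2.0) * r_d[nz])
        return np.stack([x * scale * fx + cx, y * scale * fy + cy], axis=1)

    # Example with made-up intrinsics for a 720x576 fisheye image:
    pts = np.array([[100.0, 80.0], [360.0, 288.0]])
    print(undistort_points_fov(pts, fx=300.0, fy=300.0, cx=360.0, cy=288.0, omega=0.9))
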
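The PID controller of item (5) can be pictured as one independent loop per axis acting on the error between goal and estimated pose. Below is a minimal single-axis sketch with anti-windup; the gains and limits are illustrative placeholders, not the package's tuned presets.

    class PID:
        # Minimal single-axis PID; error is (goal - estimated position).
        def __init__(self, kp, ki, kd, i_limit=1.0):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.i_limit = i_limit  # anti-windup clamp on the integral term
            self.integral = 0.0
            self.prev_error = None

        def update(self, error, dt):
            self.integral = max(-self.i_limit,
                                min(self.i_limit, self.integral + error * dt))
            deriv = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * deriv

    # Example: one controller per axis, stepped at the pose-estimate rate.
    z_pid = PID(kp=0.8, ki=0.2, kd=0.4)
    thrust_correction = z_pid.update(error=0.1, dt=0.02)
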
IV. FLIGHT WITH EXTERNAL POSE ESTIMATION

In addition to manual flight, the Flie can be controlled using external positioning systems such as Vicon or, in contrast to e.g. [10], [8], [5], a Kinect, to facilitate usage if a motion capture setup is not available. Using a motion capture system and the PID controller, we evaluated the Flie's flight characteristics: the top speed reached with / without the attached camera is 1.3 m/s / 1.9 m/s, and typical RMSEs for hovering are around 6 cm. The flight time is around 7 minutes, after which the battery needs a 20 minute recharge via USB. Particularly noteworthy is the extraordinary robustness of the platform: it survives most crashes, including drops from several meters altitude or flying into a wall at high speed. Only the motors and propellers need to be replaced occasionally. In fact – without the camera and as long as it does not land upside down – it can typically take off again directly.

V. FLIGHT WITH ON-BOARD SENSORS

Furthermore, we tested autonomous hovering and waypoint flying using the implemented visual-inertial SLAM system. The estimated pose is directly used as input to the PID controller. Holding a position under disturbances is possible even without further filtering or delay compensation. Fig. 2 shows the ground-truth position over a 60 s hover flight using visual-inertial control: the nano-quadrotor stays within 22 cm of its set goal position, and the mean RMSE to the goal position is 15 cm.

Fig. 2: Autonomous hovering for 60 s using only on-board sensors (plot: X, Y, Z position [m] over time [minutes]; legend: Goal, VO, Vicon). The bottom row shows images used by the visual-inertial SLAM (left pair) and automatically dropped, corrupted ones (right pair).

VI. CONCLUSION

We presented an open-source and open-hardware 25 g nano-quadrotor with wireless video capability. In addition to manual flight, it can be controlled using external positioning systems such as Vicon or a Kinect. Furthermore, we demonstrated drift-free hovering capability by using the on-board camera for visual-inertial SLAM – to the best of our knowledge demonstrating the smallest and lightest quadrotor with such capability. By providing open and user-friendly ROS based access to the Crazyflie, along with some tools to get started, we hope to encourage and ease further research into the field of autonomous nano-quadrotors.

REFERENCES

[1] A. Briod, J.-C. Zufferey, and D. Floreano. Optic-flow based control of a 46g quadrotor. In Vision-based Closed-Loop Control and Navigation of Micro Helicopters in GPS-denied Environments, IROS, 2013.
[2] C. de Wagter, S. Tijmons, B. Remes, and G. de Croon. Autonomous flight of a 20-gram flapping wing MAV with a 4-gram onboard stereo vision system. In ICRA, 2014.
[3] F. Devernay and O. Faugeras. Straight lines have to be straight. Machine Vision and Applications, 13(1):14–24, 2001.
[4] O. Dunkley. Autonomous visual-inertial control of a nano-quadrotor. Master's thesis, TUM, 2014.
[5] J. Engel, J. Sturm, and D. Cremers. Scale-aware navigation of a low-cost quadrocopter with a monocular camera. Robotics and Autonomous Systems (RAS), 2014.
[6] C. Forster, M. Pizzoli, and D. Scaramuzza. SVO: Fast semi-direct monocular visual odometry. In ICRA, 2014.
[7] L. Kneip. Real-Time Scalable Structure from Motion: From Fundamental Geometric Vision to Collaborative Mapping. PhD thesis, ETH, 2012.
[8] S. Weiss, M.W. Achtelik, S. Lynen, M.C. Achtelik, L. Kneip, M. Chli, and R. Siegwart. Monocular vision for long-term micro aerial vehicle state estimation: A compendium. Journal of Field Robotics, 30(5):803–831, 2013.
[9] R. Moore, K. Dantu, G. Barrows, and R. Nagpal. Autonomous MAV guidance with a lightweight omnidirectional vision sensor. In ICRA, 2014.
[10] L. Meier, P. Tanskanen, L. Heng, G. Lee, F. Fraundorfer, and M. Pollefeys. Pixhawk: A micro aerial vehicle design for autonomous flight using onboard computer vision. Autonomous Robots, 2012.