Control A Drone Using Hand Movement in ROS Based On Single Shot Detector Approach
Abstract—These days, drones have become one of the most popular robots, with many applications in delivery, transportation, construction, agriculture, and data recording. Nevertheless, as the use of drones has increased, the rate of aviation accidents worldwide has increased too, mostly because of the difficulty of keeping the drone at the desired location and of superficial knowledge of multirotor structures. In this paper, a new method is introduced for controlling a drone, instead of the classical approach of controlling it via radio control. In this new method, a deep learning algorithm and machine vision are applied to detect an object and use its position to command the drone. In the experimental study, performed in the Human and Robot Interaction Laboratory, a human hand was used as the object (to simplify the problem), and its movement was made equivalent to the commands of a radio controller. To make the hand detection accurate, the so-called Single Shot Detector is used, trained on the EgoHands dataset. This leads to precise results under different conditions such as viewpoint variation, background clutter, and changing illumination. Furthermore, a stereo camera setup is used instead of a Kinect or Leap Motion device to estimate the depth of the hand, which makes it possible to control the drone in a three-dimensional coordinate system. The proposed algorithm is implemented in the Robot Operating System (ROS), and Gazebo is applied to obtain these positions and simulate the drone. The drone, based on PX4 as the flight controller in the Gazebo environment, receives its position from the MAVROS node, which connects the drone to ROS, and can move continuously along the x, y, and z axes. According to the results, the proposed method is easier to use and offers more functionality than previous methods reported in the literature.
Index Terms—drone, Single Shot Detector, ROS, Gazebo

I. INTRODUCTION

Nowadays, Human and Robot Interaction (HRI) has stimulated the interest of many researchers in different fields, including engineering, social science, and even psychology, and has become known as a multidisciplinary concept. Robotic devices can be classified into different types, and drones are known to pose a challenging control problem. Recently, due to advances in both the software and hardware of such robotic devices, drone applications have spread into different fields, for example the delivery of postal boxes; the pioneer of this application is Amazon, whose delivery of postal boxes to nearby places is still in progress. Controlling this type of robot is a challenging problem. By default, people use a joystick or a mobile phone to control drones, but it is very interesting to control them at a higher level. As first developed, and from a mechanical standpoint, the desired movement of the drone is transferred by means of a joystick, which can be regarded as the first solution that comes to mind. However, with the advent of vision technologies, an alternative has been proposed for controlling drones that eases this task, i.e., controlling a drone using hand movement and gesture. The latter can be regarded as a representative example of HRI.

Most of the papers in the literature regarding this issue combine machine vision, neural networks, and deep learning approaches. Given the advances made in recent years in deep learning, more emphasis has been placed on this approach. In this paper, a new methodology based on the so-called Single Shot Detector is proposed for the foregoing problem. In previous work, different methods have been proposed. In [1], Yangguang Yu et al. use color and simple image processing techniques to detect the hand and the operator. In [2], Sarkar et al. use a hand gesture sensor called Leap Motion to recognize hand gestures. In [3], Nagi et al. combine the information of head and face movement to give commands to the drone; in that work, faces are detected using Haar-like features. In [4], Lee et al. focus on deep learning techniques to find the gesture of the hand; in that study, three deep neural networks are used to determine the hand pose. In [5], Natarajan et al. introduce an open-source library for human-drone interaction based on hand gestures, in which a classic machine learning algorithm is used to detect five hand gestures. In [6], Tsetserukou
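The overall control mapping described in the abstract (hand displacement in the image, plus stereo depth, turned into an x/y/z position setpoint that MAVROS forwards to the flight controller) can be sketched without ROS. This is a minimal, ROS-free illustration; the gain values, the neutral hand position, and the axis conventions are assumptions for illustration, not the paper's actual parameters:

```python
def hand_to_setpoint(current, hand_px, hand_ref_px, depth_m, depth_ref_m,
                     gain_xy=0.002, gain_z=0.5):
    """Map hand displacement to a new (x, y, z) position setpoint.

    current      : current drone position (x, y, z) in metres.
    hand_px      : detected hand centre in image coordinates (u, v).
    hand_ref_px  : neutral ("no command") hand centre.
    depth_m      : stereo depth of the hand; depth_ref_m is its neutral value.
    In the actual system, the resulting setpoint would be published through
    the MAVROS node to the PX4 flight controller.
    """
    dx = gain_z * (depth_m - depth_ref_m)         # push hand forward -> move forward
    dy = gain_xy * (hand_ref_px[0] - hand_px[0])  # move hand left    -> move left
    dz = gain_xy * (hand_ref_px[1] - hand_px[1])  # raise hand        -> climb
    return (current[0] + dx, current[1] + dy, current[2] + dz)

sp = hand_to_setpoint((0.0, 0.0, 1.0),
                      hand_px=(320, 200), hand_ref_px=(320, 240),
                      depth_m=0.9, depth_ref_m=0.8)
print(sp)   # roughly (0.05, 0.0, 1.08)
```

Because only displacements from a neutral pose are used, the drone moves continuously along each axis while the hand is held away from its reference position, mirroring the behaviour of a radio-controller stick.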
978-1-7281-7296-5/20/$31.00 ©2020 IEEE
Authorized licensed use limited to: ULAKBIM UASL - Sakarya Universitesi. Downloaded on February 02,2021 at 14:52:38 UTC from IEEE Xplore. Restrictions apply.
2020 28th Iranian Conference on Electrical Engineering (ICEE)
A. Training the Single Shot Detector Algorithm

To detect the hand movement, a deep learning approach is adopted. This technique is more accurate than other methods, such as the color-based and basic image processing techniques used in previous work [1].

B. Stereo Vision for Calculating the Depth

In machine vision, calculating the depth of an image is a challenging problem that requires a stereo vision camera. For the purpose of this paper, a stereo vision setup is built; the device is shown in Fig. 4.
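To give a flavour of what the Single Shot Detector produces at its output stage, the following numpy sketch decodes SSD-style box regressions against default (anchor) boxes and keeps detections above a confidence threshold. The anchor values, offsets, and scores below are invented toy numbers, not outputs of the trained model in this paper:

```python
import numpy as np

def decode_ssd_boxes(anchors, offsets, scores, score_thresh=0.5):
    """Decode SSD box regressions against default anchor boxes.

    anchors : (N, 4) default boxes as (cx, cy, w, h), normalised coordinates.
    offsets : (N, 4) predicted regressions (tx, ty, tw, th).
    scores  : (N,) hand-class confidences.
    Returns the decoded (cx, cy, w, h) boxes that pass the threshold.
    """
    cx = anchors[:, 0] + offsets[:, 0] * anchors[:, 2]   # shift centre by tx * w
    cy = anchors[:, 1] + offsets[:, 1] * anchors[:, 3]   # shift centre by ty * h
    w = anchors[:, 2] * np.exp(offsets[:, 2])            # scale width
    h = anchors[:, 3] * np.exp(offsets[:, 3])            # scale height
    boxes = np.stack([cx, cy, w, h], axis=1)
    keep = scores >= score_thresh
    return boxes[keep], scores[keep]

# Toy example: two default boxes, only the first is a confident "hand".
anchors = np.array([[0.5, 0.5, 0.2, 0.2],
                    [0.2, 0.8, 0.1, 0.1]])
offsets = np.array([[0.1, 0.0, 0.0, 0.0],    # small horizontal shift
                    [0.0, 0.0, 0.0, 0.0]])
scores = np.array([0.9, 0.3])
boxes, kept = decode_ssd_boxes(anchors, offsets, scores)
print(boxes)   # one box kept, centre near (0.52, 0.5)
```

In the full detector, a non-maximum suppression step would follow to merge overlapping boxes; here only the decoding and thresholding are shown.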
[Figure: stereo vision pipeline — capture 30 images from both the right and left cameras (for calibration); single-camera calibration to remove distortions; dual-camera calibration to align the epipolar lines; create the stereo vision structure; apply a WLS filter; extract the depth image.]
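The final stage of a pipeline like the one above turns a disparity map into metric depth via the standard stereo relation Z = f·B/d, where f is the focal length in pixels and B the baseline between the two cameras. A minimal numpy sketch; the focal length and baseline below are made-up values, not the calibration of the setup used in this paper:

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Convert a disparity map (pixels) to a depth map (metres).

    Uses the pinhole stereo relation Z = f * B / d; pixels with zero or
    negative disparity (no stereo match) are marked invalid with depth 0.
    """
    depth = np.zeros_like(disparity, dtype=np.float64)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_m / disparity[valid]
    return depth

# Assumed (not from the paper): 700 px focal length, 6 cm baseline.
disp = np.array([[35.0, 70.0],
                 [0.0, 14.0]])   # 0 => no stereo match at that pixel
depth = disparity_to_depth(disp, focal_px=700.0, baseline_m=0.06)
print(depth)   # roughly [[1.2, 0.6], [0.0, 3.0]]
```

Averaging the depth over the detected hand's bounding box then gives the third coordinate needed to command the drone along the camera's viewing axis.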