
2021 IEEE International Conference on Consumer Electronics and Computer Engineering (ICCECE 2021)

Design of Smart Car Control System for Gesture Recognition Based on Arduino

DOI: 10.1109/ICCECE51280.2021.9342137


Zhexiang Zou, Qinyu Wu, Yuhang Zhang and Kaiyuan Wen
College of Industrial Automation, Beijing Institute of Technology, Zhuhai 519080, China.
Correspondence should be addressed to Kaiyuan Wen; 181112141@qq.com

Abstract—A smart car control system based on Bluetooth control for gesture recognition is designed. The system uses the MPU6050 six-axis attitude sensor to collect attitude information and, through multi-angle fusion and direction closed-loop control, creates corresponding digital gestures for different instructions. After digital recognition, the relevant instructions are transmitted to the Arduino smart car via radio frequency, which realizes remote control of the intelligent vehicle through human gestures to complete the corresponding command actions. The experimental results show that this design realizes the function of controlling the direction of the smart car through gesture changes. The movement trajectory of the smart car gives good feedback on the movement of the hand; the results also show that the system operates stably and has strong anti-interference ability.

Keywords—gesture recognition; intelligent vehicle; wireless transmission; posture sensor; Arduino

I. INTRODUCTION

With the increasing advancement of technology and the rapid development of the automotive industry, the design of somatosensory remote-controlled cars is in its infancy. For intelligent control, it is necessary to conduct human-computer interaction [1-3] in a more humane, concise, and natural way. Through human-computer interaction, related equipment can complete remote operation tasks on unmanned platforms in unknown environments. The operator can remotely and wirelessly control the smart car by wearing related equipment. Gesture recognition is a new starting point for the transformation of the human-computer interaction mode. The conversion of gestures is used as a signal change to control a physical device so that the device helps people complete specified tasks. Human-computer interaction equipment based on somatosensory devices has been applied in the entertainment and medical fields [4]. Compared with other human-computer interaction technologies, using gestures to communicate with intelligent devices is easier to understand and closer to daily needs. Besides, it has the advantages of low hardware cost and remote control.

II. SYSTEM SCHEME DESIGN

A gesture recognition scheme based on a computer vision system is too complicated, and its human-computer interaction is not smooth enough for control of a smart car. Each instruction must be issued separately, and the steps of each identification are too cumbersome. This method requires a camera and image processing functions; furthermore, for dynamic gesture recognition it is necessary to add equipment such as motion-sensing components. Therefore, this solution is not the best choice for the smart car, because it brings shortcomings such as cumbersome operation, limited functions, and high price [4-8].

The scheme based on data glove recognition mainly recognizes changes in hand posture through a gesture sensor. The core of the recognition system is the main control chip and motion processing device of the data glove. The receiver is the smart car at the controlled end, so the two must communicate through a wireless module. The data glove microprocessor responds quickly, and it can identify changes in the angle and velocity of the hand. The data is transmitted to the smart car, and after receiving the signal the car's central control chip drives the motors to move the car. At the same time, the movement trajectory of the car gives useful feedback on the direction of the hand, which makes the recognition smoother and the system simpler.

The main components of the data glove are the main control chip, the motion processing component, the wireless module, and the power module. The structure is relatively simple and does not require steps such as image processing and gesture segmentation [11], [12]. Currently, there are six-axis and nine-axis attitude sensors on the market, and the appropriate sensor is selected according to need. The glove can monitor changes of the hand in real-time and convert gesture changes into movement of the car by establishing movement coordinates. To sum up, this design chooses the gesture recognition scheme based on data gloves. The overall structure of the gesture control smart car system is shown in Fig. 1.

Figure 1. Overall structure diagram of gesture control smart car system

978-1-7281-8319-0/21/$31.00 ©2021 IEEE


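The glove-to-car exchange described above can be illustrated with a small fixed-layout radio packet. This is a minimal sketch, not the paper's actual wire format: the field names, the centi-degree scaling, and the one-byte checksum are all assumptions made for illustration; on real hardware the 6-byte buffer would simply be handed to the nRF24L01 driver for transmission.

```cpp
#include <cstdint>

// Hypothetical 6-byte packet sent from the glove's radio to the car:
// two 16-bit angles in hundredths of a degree, a flags byte, and a checksum.
struct GlovePacket {
    int16_t roll_cdeg;   // roll angle, centi-degrees (assumed scaling)
    int16_t pitch_cdeg;  // pitch angle, centi-degrees (assumed scaling)
    uint8_t flags;       // e.g. bit 0 = obstacle-warning acknowledge
    uint8_t checksum;    // sum of the preceding five bytes
};

// Serialize to a little-endian byte buffer (what the radio would transmit).
void packPacket(const GlovePacket& p, uint8_t out[6]) {
    out[0] = static_cast<uint8_t>(p.roll_cdeg & 0xFF);
    out[1] = static_cast<uint8_t>((p.roll_cdeg >> 8) & 0xFF);
    out[2] = static_cast<uint8_t>(p.pitch_cdeg & 0xFF);
    out[3] = static_cast<uint8_t>((p.pitch_cdeg >> 8) & 0xFF);
    out[4] = p.flags;
    uint8_t sum = 0;
    for (int i = 0; i < 5; ++i) sum += out[i];
    out[5] = sum;
}

// Parse on the car side; returns false if the checksum does not match.
bool unpackPacket(const uint8_t in[6], GlovePacket& p) {
    uint8_t sum = 0;
    for (int i = 0; i < 5; ++i) sum += in[i];
    if (sum != in[5]) return false;
    p.roll_cdeg  = static_cast<int16_t>(in[0] | (in[1] << 8));
    p.pitch_cdeg = static_cast<int16_t>(in[2] | (in[3] << 8));
    p.flags = in[4];
    p.checksum = in[5];
    return true;
}
```

A reasonable design choice here is keeping the frame well under the nRF24L01's 32-byte payload limit, so one radio frame carries one complete gesture sample.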
III. SYSTEM HARDWARE DESIGN

The whole system can be divided into two parts: the hand control end and the smart car control end. The software is programmed into each corresponding module on the Arduino platform, and the hardware modules are installed on the smart car and the control glove; the two communicate through the nRF24L01 wireless module. The main components of the whole system are the smart car's central control chip, motor drive module, power supply module, and infrared obstacle avoidance module, together with the control glove's main control chip, motion processing component, wireless module, etc. After the MPU6050 six-axis attitude sensor collects the hand posture change data, it passes the data, after filtering and calculation, to the sports glove microcontroller. It should be noted that the RF-NANO has a built-in nRF24L01 chip, so there is no need to install a separate wireless module on the sports glove. The nRF24L01 on the control glove transmits the signal to the microcontroller of the smart car, and the microcontroller of the smart car controls the motors to move the car according to the information from the control glove. The infrared obstacle avoidance module assists the operator in controlling the car remotely: when the distance between the car and an obstacle is less than the set safe distance, a blue warning light lights up to remind the operator to take timely measures to avoid a collision. The hardware function modules are shown in Fig. 2.

Figure 2. Hardware function module

IV. GESTURE RECOGNITION IMPLEMENTATION METHOD

The MPU6050 six-axis attitude sensor of the data glove checks the acceleration data ACCEL_X, ACCEL_Y, ACCEL_Z and the gyroscope data GYRO_X, GYRO_Y, GYRO_Z in real-time. The recognized data is passed to the quaternion processing function, and the program displays the MPU6050 Roll and Pitch angle states in real-time [9-12]. The movement direction coordinates are shown in Fig. 3.

Figure 3. Movement direction coordinates

When the sports glove rotates around the Y-axis shown in the figure, that is, when the hand is tilted back and forth, the Pitch angle is obtained; when the sports glove rotates around the X-axis, that is, when the back of the hand is tilted left and right, the Roll angle is obtained. It can be seen from this coordinate system that when the control glove is tilted to the front right, the motion falls in the 0°-90° region of the coordinate system. By analogy, movements of the glove in other directions correspond to the other angular regions of the coordinate system.

Through the posture of the sports glove, a three-dimensional model can be established to calculate the angle of the movement direction of the smart car. As shown in Figure 4, in the three-dimensional model of the sports glove, the direction of the glove's inclination is represented by OB, and its angle is the steering angle of the smart car. When the MPU6050 reports the motion state of the sports glove, the corresponding Roll and Pitch angles are obtained and ∠FOE is calculated:

    ∠FOE = arctan(Roll / Pitch)    (1)

What is obtained above is the steering angle in radians, which needs to be multiplied by a factor of 57.3 to convert it to degrees.

Figure 4. Three-dimensional model based on sports gloves
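The computation around Eq. (1) can be sketched in plain C++ (the function name is illustrative, not the paper's code). atan2 is used here instead of a bare arctangent so that a zero Pitch reading does not divide by zero; that substitution is a sketch choice, not something the paper specifies.

```cpp
#include <cmath>

// Eq. (1): direction angle ∠FOE from the Roll and Pitch readings.
// atan2(Roll, Pitch) equals arctan(Roll / Pitch) for positive Pitch,
// and the result in radians is scaled by 57.3 (≈ 180/π) to degrees,
// matching the conversion factor used in the paper.
double steeringAngleDeg(double rollDeg, double pitchDeg) {
    double radians = std::atan2(rollDeg, pitchDeg);
    return radians * 57.3;
}
```

For example, equal Roll and Pitch deflections place the glove on the front-right diagonal, giving a steering angle of about 45°.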

To control the speed of the smart car, ∠BOF is planarized, as shown in Fig. 5.

Figure 5. ∠BOF plane

∠B'OB is the angle of the sports glove relative to its initial state, so this inclination is used to control the speed of the smart car. From the foregoing, it is easy to know:

    ∠B'OB = ∠OBF = arcsin(BF / OB) = arcsin(√(OE² + EF²) / OB)    (2)

Knowing that EF = BD and OE = AB, we can get (3):

    ∠OBF = arcsin(√(sin²(Roll) + sin²(Pitch))) ≈ arcsin(√(Roll² + Pitch²))    (3)

The radian angle obtained above also needs to be multiplied by a factor of 57.3; since the range of this angle is 0°-90°, ∠B'OB can be used directly as the speed control quantity of the smart car.

V. SOFTWARE DESIGN

The Arduino-based gesture control smart car software designed in this article includes the control glove terminal and the smart car terminal. Each module is programmed on the open-source Arduino platform, function-tested, and then assembled.

The working process of the hand control terminal is as follows: (1) the system is initialized after the power is turned on, and it detects whether frequency modulation has been performed; (2) if the frequency has been adjusted, hand motion information is collected through the MPU6050 motion processing module; (3) the hand motion angle is calculated by the internal chip of the sensor; (4) the calculated angle is sent to the host computer, which checks whether the frequency is modulated. If the frequency is not modulated, the system continues to collect the next movement posture of the hand; if the frequency is modulated, the data is resent until the transmission is successful.

Figure 6. Software flow of hand control terminal

The working process of the controlled end of the car is as follows: (1) after the power is turned on, the system initializes and detects whether channel information is received; if not, the system continues to listen until the channel information is received and the corresponding configuration is performed; (2) the MPU6050 six-axis attitude sensor collects hand movement information in real-time and transmits it to the central controller; (3) the system checks whether frequency modulation is detected, and then collects the movement information of the smart car for direction closed-loop control; (4) if the control terminal fails to connect with the controlled terminal, it performs frequency modulation processing until the communication is successful.

Figure 7. Software flow of car controlled terminal

VI. EXPERIMENTAL VERIFICATION

Connecting the car with the smart glove and using gestures to indirectly control the car's forward motion and turning, the car's trajectory can remain in a stable state after many adjustments. The experimental car is shown in Fig. 8.
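The speed computation of Eq. (3) in Section IV, together with the 57.3 radian-to-degree factor, can be sketched as follows. The clamp and the PWM mapping are illustrative assumptions; the paper only states that the 0°-90° tilt angle is used directly as the speed quantity.

```cpp
#include <algorithm>
#include <cmath>

// Eq. (3): tilt angle ∠B'OB from Roll and Pitch (given in degrees),
// computed as arcsin(sqrt(sin²(Roll) + sin²(Pitch))) after converting
// the inputs to radians, then scaled back to degrees by 57.3 ≈ 180/π.
double speedAngleDeg(double rollDeg, double pitchDeg) {
    const double kDegToRad = 3.14159265358979323846 / 180.0;
    double sr = std::sin(rollDeg * kDegToRad);
    double sp = std::sin(pitchDeg * kDegToRad);
    double s = std::sqrt(sr * sr + sp * sp);
    s = std::min(s, 1.0);        // arcsin is only defined up to 1
    return std::asin(s) * 57.3;  // radians -> degrees
}

// Illustrative (assumed) mapping of the 0-90° tilt onto an 8-bit PWM duty.
int speedToPwm(double angleDeg) {
    double clamped = std::max(0.0, std::min(angleDeg, 90.0));
    return static_cast<int>(clamped / 90.0 * 255.0 + 0.5);
}
```

Mapping the full 0°-90° range onto the 0-255 duty-cycle range keeps the speed command proportional to how far the glove is tilted from its initial state.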

TABLE I. OVERALL FUNCTIONAL TEST RESULTS

Project               Response time   Accuracy   Stability
Technical parameter   0.02 s          98.5%      98%

Figure 8. Smart car and control gloves

By reading the MPU6050 three-axis acceleration data and three-axis gyroscope data in real-time, the Roll angle and Pitch angle are calculated through the built-in filtering algorithm. The system then transmits them to the Processing program to display the 3D motion status of the MPU6050 in real-time. As shown in Figure 9, the running results of the roll angle and pitch angle can be obtained.

The attitude angle of the MPU6050 can be monitored in real-time through the serial monitor, and the 3D motion state of the MPU6050 can be seen intuitively through the Processing demo interface. Figure 10 shows the motion attitude and the value of the attitude angle at a particular moment when the MPU6050 is rotated. This proves that the MPU6050 module works properly and that the attitude angle can be calculated normally.

Figure 10. Processing demo renderings

Gesture recognition technology together with the software design realizes the control of the smart car. Under this control system, the smart car has good operability, controllability, and stability; it can smoothly move forward, backward, left, and right, and perform other functions. The final test results are shown in Table I.

VII. CONCLUSIONS

Regarding the question of whether a smart car can be operated stably by human movements, a smart car control system based on gesture recognition is designed using the ability of data gloves to accurately recognize human gestures, and experiments are carried out to test its performance. The gesture-controlled smart car is designed around the ATMEGA328P microcontroller. Compared with the traditional smart car, this design adopts wireless frequency modulation technology. It uses the MPU6050 six-axis attitude sensor for gesture recognition, which realizes wireless control of the smart car's movement by the control glove and eliminates many tedious and complicated procedures. However, most of the experiments were carried out in ideal environments, and the technology is not yet mature enough for real environments, so there is still a long way to go before commercial application. Also, the dynamic gesture recognition algorithm is relatively complicated. In the future, we will further improve and optimize the algorithm and apply it in the actual environment.

Acknowledgments

The support for this research from the 2020 Guangdong Province Scientific Research Platform (Grant No. 2020KTSCX188) and the National Natural Science Foundation of China (Grant No. 2018A030313418) is gratefully acknowledged. Without their financial support, this work would not have been possible.

REFERENCES
[1] Yanan Chang, "Dynamic gesture recognition based on HMM," M.Phil. thesis, South China University of Technology, Guangzhou, GD, China, 2012.
[2] Hang Zhao, "Motion control design of gesture recognition for mobile robot," M.S. thesis, University of Electronic Science and Technology, Chengdu, SC, China, 2017.
[3] Chen Gao, "Research on the control of mobile robots based on dynamic and static gestures," M.S. thesis, Beijing University of Chemical Technology, Nanjing, JS, China, 2017.
[4] S. J. J. Gould, D. P. Brumby, A. L. Cox, et al., "Methods for Human-Computer Interaction Research," in ACM Conference Extended Abstracts, ACM, 2015, pp. 2473-2474.
[5] Tianxing Wang, "Gesture-based remote control technology for unmanned platforms," M.S. thesis, Nanjing University, JS, China, 2017.
[6] V. Bonato, A. K. Sanches, M. M. Fernandes, et al., "A Real Time Gesture Recognition System for Mobile Robots," in Proceedings of the First International Conference on Informatics in Control, Automation and Robotics (ICINCO 2004), Setubal, Portugal, Aug. 2004, pp. 207-214.
[7] R. M. Sanso, D. Thalmann, "A Hand Control and Automatic Grasping System for Synthetic Actors," Computer Graphics Forum, vol. 13, no. 3, pp. 167-177, Aug. 1994.
[8] Robert Y. Wang, "Real-time hand-tracking with a color glove," in ACM SIGGRAPH, ACM, 2009, pp. 1-8.

698

Authorized licensed use limited to: East Carolina University. Downloaded on June 19,2021 at 05:36:26 UTC from IEEE Xplore. Restrictions apply.
[9] S. Poularakis, I. Katsavounidis, "Finger detection and hand posture recognition based on depth information," in IEEE International Conference on Acoustics, Speech and Signal Processing, IEEE, 2014, pp. 4329-4333.
[10] Yunde Zhang, "Research and application of gesture recognition technology," M.S. thesis, Anhui University, Hefei, AH, China, 2013.
[11] Yuanxin Zhu, Guangyou Xu, "Appearance-based dynamic isolated gesture recognition," Journal of Software, vol. 11, no. 1, pp. 54-61, Jan. 2000.
[12] Xueping Zhao, Jianming Wang, Haiting Liu, et al., "Research on Recognition System of Indoor Robot Dynamic Gesture Command," Computer Engineering and Applications, vol. 47, no. 33, pp. 209-212, Nov. 2011.
