Abstract— A smart car control system, based on Bluetooth control for gesture recognition, is designed. It uses the MPU6050 six-axis attitude sensor to collect attitude information and, through multi-angle fusion and direction closed-loop control, creates corresponding digital gestures for different instructions. After digital recognition, the relevant instructions are transmitted to the Arduino smart car via radio frequency, realizing remote control of the intelligent vehicle through human gestures to complete the corresponding command actions. The experimental results show that this design realizes control of the smart car's direction through gesture changes, and that the movement trajectory of the smart car closely follows the movement of the hand. They also show that the system operates stably and has strong anti-interference ability.

Keywords— gesture recognition; intelligent vehicle; wireless transmission; posture sensor; Arduino

The reason why it is not the best choice to apply this solution to the smart car is that it causes shortcomings such as cumbersome operation, limited functions, and high price [4-8].

The scheme of data glove recognition mainly recognizes changes in hand posture through the gesture sensor. The core of the recognition system is the main control chip and motion processing device of the data glove. The receiver is the smart car at the controlled end, so the two must communicate through the wireless module. The data glove microprocessor responds quickly and can identify changes in the angle and velocity of the hand. The analogue data is transmitted to the smart car, whose central control chip drives the motor after receiving the signal, controlling the movement of the car. At the same time, useful feedback on the direction of the hand is given through the movement trajectory of the car, which makes the recognition smoother and the system simpler.
Authorized licensed use limited to: East Carolina University. Downloaded on June 19,2021 at 05:36:26 UTC from IEEE Xplore. Restrictions apply.
III. SYSTEM HARDWARE DESIGN
The whole system can be divided into two parts: the hand control end and the smart car control end. The software for each module is designed on the Arduino platform, and the hardware modules are installed on the smart car and the control glove; the two communicate through the nRF24L01 wireless module. The main components of the whole system are the smart car's central control chip, motor drive module, power supply module, infrared obstacle avoidance module, and the control glove's main control chip, motion processing component, wireless module, etc. After the MPU6050 six-axis attitude sensor collects the hand posture change data, it passes the data, after filtering and calculation, to the sports glove's microcontroller. It should be noted that the RF-NANO has a built-in nRF24L01 chip, so there is no need to install a separate wireless module on the sports glove. The nRF24L01 on the control glove transmits the signal to the smart car, and the car's microcontroller controls the motors according to the information from the control glove. The infrared obstacle avoidance module assists the operator in controlling the car remotely: when the distance between the car and an obstacle falls below the set safe distance, a blue warning light comes on to remind the operator to take timely measures and avoid a collision. The hardware function modules are shown in Fig. 2.

Figure 3. Movement direction coordinates

Rotating the sports glove around the Y-axis, as shown in the figure (the hand tilting back and forth), gives the Pitch angle; rotating it around the X-axis (the back of the hand tilting left and right) gives the Roll angle. It can be seen from this coordinate system that when the control glove is tilted to the front right, it falls in the 0°-90° area of the coordinate system. By analogy, the movement of the glove in other directions corresponds to different angular regions of the coordinate system.

Through the posture of the sports glove, a three-dimensional model can be established to calculate the angle of the smart car's movement direction. As shown in Figure 4, in the three-dimensional model of the sports glove, the direction of the glove's inclination is represented by OB, and its magnitude is the steering angle of the smart car. When the MPU6050 receives the motion state information of the sports glove, it obtains the corresponding Roll and Pitch angles and calculates ∠FOE:
IV. GESTURE RECOGNITION IMPLEMENTATION METHOD
Figure 5. ∠BOF plane

∠B'OB is the angle of the sports glove relative to its initial state, so this inclination is used to control the speed of the smart car. From the foregoing, it is easy to know:

∠B'OB = ∠OBF = arcsin(OF/OB) = arcsin((OE + EF)/OB)    (2)

Knowing that EF = BD and OE = AB, we can get (3):

∠OBF = arcsin(√(sin²(Roll) + sin²(Pitch))) ≈ arcsin(√(Roll² + Pitch²))    (3)

The radian value obtained above must also be multiplied by the factor 57.3 (≈ 180/π) to convert it to degrees; since this angle lies in the range 0°-90°, ∠B'OB can be used directly as the speed control quantity of the smart car.

V. SOFTWARE DESIGN

The working process of the hand control terminal: (1) after power-on, the system initializes and detects whether frequency modulation is needed; (2) if the frequency has been adjusted, hand motion information is collected through the MPU6050 motion processing module; (3) the hand motion angle is calculated by the sensor's internal chip; (4) the calculated angle is sent to the host computer, and the system detects whether the frequency is modulated. If the frequency is not modulated, the system continues to collect the next hand posture; if the frequency is modulated, the data is resent until the transmission succeeds.

The working process of the controlled end of the car: (1) after power-on, the system initializes and detects whether channel information is received; if not, it keeps receiving until the channel information arrives and the corresponding configuration is performed; (2) the MPU6050 six-axis attitude sensor collects hand movement information in real time and transmits it to the central controller; (3) the system detects whether frequency modulation is present, and then collects the movement information of the smart car for direction closed-loop control; (4) if the control terminal fails to connect with the controlled terminal, it performs frequency modulation processing until communication succeeds.
VI. EXPERIMENTAL VERIFICATION

With the car connected to the smart glove, gestures are used to control the car's forward motion and turning; after repeated adjustments, the car's trajectory remains consistently stable. The experimental car is shown in Fig. 8.
TABLE I. OVERALL FUNCTIONAL TEST RESULTS

Project                Response time    Accuracy    Stability
Technical parameter    0.02 s           98.5%       98%
VII. CONCLUSIONS

The use of gesture recognition technology, together with the software and program design, realizes control of the smart car. Under this control system, the smart car has good operability, controllability, and stability: it can smoothly move forward and backward, turn left and right, and perform other functions. The final test results are shown in Table I.

Figure 10. Processing demo renderings

Acknowledgments:

The support for this research under the 2020 Guangdong Province Scientific Research Platform (Grant No. 2020KTSCX188) and the National Natural Science Foundation of China (Grant No. 2018A030313418) is gratefully acknowledged. Without their financial support, this work would not have been possible.

REFERENCES

[1] Yanan Chang, "Dynamic gesture recognition based on HMM," M.Phil. thesis, South China University of Technology, Guangzhou, China, 2012.
[2] Hang Zhao, "Motion control design of gesture recognition for mobile robot," M.S. thesis, University of Electronic Science and Technology, Chengdu, China, 2017.
[3] Chen Gao, "Research on the control of mobile robots based on dynamic and static gestures," M.S. thesis, Beijing University of Chemical Technology, Beijing, China, 2017.
[4] S. J. J. Gould, D. P. Brumby, A. L. Cox, et al., "Methods for Human-Computer Interaction Research," in ACM Conference Extended Abstracts, ACM, 2015, pp. 2473-2474.
[5] Tianxing Wang, "Gesture-based remote control technology for unmanned platforms," M.S. thesis, Nanjing University, Nanjing, China, 2017.
[6] V. Bonato, A. K. Sanches, M. M. Fernandes, et al., "A Real Time Gesture Recognition System for Mobile Robots," in Proceedings of the First International Conference on Informatics in Control, Automation and Robotics (ICINCO 2004), Setubal, Portugal, Aug. 2004, pp. 207-214.
[7] R. M. Sanso, D. Thalmann, "A Hand Control and Automatic Grasping System for Synthetic Actors," Computer Graphics Forum, vol. 13, no. 3, pp. 167-177, Aug. 1994.
[8] Robert Y. Wang, "Real-time hand-tracking with a color glove," in ACM SIGGRAPH, ACM, 2009, pp. 1-8.
[9] S. Poularakis, I. Katsavounidis, "Finger detection and hand posture recognition based on depth information," in IEEE International Conference on Acoustics, Speech and Signal Processing, IEEE, 2014, pp. 4329-4333.
[10] Yunde Zhang, "Research and application of gesture recognition technology," M.S. thesis, Anhui University, Hefei, China, 2013.
[11] Yuanxin Zhu, Guangyou Xu, "Appearance-based dynamic isolated gesture recognition," Journal of Software, vol. 11, no. 1, pp. 54-61, Jan. 2000.
[12] Xueping Zhao, Jianming Wang, Haiting Liu, et al., "Research on Recognition System of Indoor Robot Dynamic Gesture Command," Computer Engineering and Applications, vol. 47, no. 33, pp. 209-212, Nov. 2011.