All content following this page was uploaded by George Shaker on 21 March 2019.
Manuscript received December 13, 2017; revised February 5, 2018; accepted February 21, 2018. Date of publication February 27, 2018; date of current version May 4, 2018.
Abstract— This article details the development of a gesture recognition technique using a mm-wave radar sensor for in-car
infotainment control. Gesture recognition is becoming a more prominent form of human–computer interaction and can be
used in the automotive industry to provide a safe and intuitive control interface that will limit driver distraction. We use
a 60 GHz mm-wave radar sensor to detect precise features of fine motion. Specific gesture features are extracted and
used to build a machine learning engine that can perform real-time gesture recognition. This article discusses the user
requirements and in-car environmental constraints that influenced design decisions. Accuracy results of the technique are
presented, and recommendations for further research and improvements are made.
Index Terms—Microwave/millimeter wave sensors, human-car interface, 60 GHz mm-wave radar, gesture sensing, random forest classifier, machine learning.
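The pipeline named in the abstract and index terms (per-gesture radar features feeding a random forest classifier for real-time recognition) can be sketched as follows. This is an illustrative sketch only: the feature statistics, frame/window sizes, and class counts are assumptions, not the paper's actual parameters, and scikit-learn stands in for whatever training framework the authors used.

```python
# Hypothetical sketch: radar frame sequence -> fixed-length feature vector
# -> random forest classifier. All dimensions and features are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def extract_features(frames: np.ndarray) -> np.ndarray:
    """Collapse a (n_frames, n_range_bins) magnitude sequence into per-bin
    summary statistics (mean, std, min, max) - a stand-in for the paper's
    gesture features."""
    return np.concatenate([frames.mean(0), frames.std(0),
                           frames.min(0), frames.max(0)])

# Synthetic training data: 3 gesture classes, 30 recordings each.
X, y = [], []
for label in range(3):
    for _ in range(30):
        frames = rng.normal(loc=label, scale=0.5, size=(64, 16))
        X.append(extract_features(frames))
        y.append(label)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(np.array(X), y)

# Real-time use: classify each incoming gesture window.
sample = extract_features(rng.normal(loc=2, scale=0.5, size=(64, 16)))
pred = clf.predict(sample.reshape(1, -1))[0]
```

A random forest suits this setting because it is cheap to evaluate per window, which matters for the real-time, in-car constraint the abstract mentions.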
1949-307X © 2018 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
Per-gesture recognition accuracy (five recorders, three users):

No.  Gesture       Rec. 1  Rec. 2  Rec. 3  Rec. 4  Rec. 5  User 1  User 2  User 3  Average
Set 1
1    Low wiggle    100%    100%    87%     90%     93%     93%     80%     83%     91%
2    Turn over     100%    100%    100%    100%    100%    93%     100%    100%    99%
3    High grab     93%     100%    90%     100%    87%     100%    90%     100%    95%
Set 2
4    Swipe         100%    93%     87%     97%     93%     87%     87%     83%     91%
5    Large circle  100%    100%    100%    100%    97%     93%     93%     90%     97%
6    Small circle  100%    100%    93%     100%    100%    93%     93%     87%     96%
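The rightmost column can be reproduced directly from the eight per-participant accuracies (five recorders plus three users), rounded to the nearest percent:

```python
# Per-gesture accuracies from the table: five recorders, then three users.
accuracies = {
    "Low wiggle":   [100, 100, 87, 90, 93, 93, 80, 83],
    "Turn over":    [100, 100, 100, 100, 100, 93, 100, 100],
    "High grab":    [93, 100, 90, 100, 87, 100, 90, 100],
    "Swipe":        [100, 93, 87, 97, 93, 87, 87, 83],
    "Large circle": [100, 100, 100, 100, 97, 93, 93, 90],
    "Small circle": [100, 100, 93, 100, 100, 93, 93, 87],
}
# Average over all eight participants, rounded to the nearest percent.
averages = {g: round(sum(v) / len(v)) for g, v in accuracies.items()}
```

This recovers the table's Average column (91, 99, 95, 91, 97, 96).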
IV. CONCLUSION

We have presented a gesture detection system using mm-wave radar for vehicular infotainment control. The gestures were designed to be distinguishable by the radar, be intuitive and memorable for the user, and fit the constraints of the environment. Specific decisions were made to maximize ease of use by drivers; further testing should be done to validate these usability design decisions. Demonstrations of the system in use were filmed and presented to showcase the use of intuitive gestures [14], ease of use by both driver and passengers, and the use of large spatial zones for more robust recognition. The accuracy of the system was tested with multiple users; it was found that involving more participants when recording a gesture set increased accuracy and robustness. In the future, studies are needed to define and optimize the required user input for the system training stages.

ACKNOWLEDGMENT

This work was supported by the Natural Sciences and Engineering Research Council of Canada. The radar transceiver used in this work was provided by Google's Advanced Technology and Projects group (ATAP) through the Project Soli Alpha DevKit Early Access Grant.

REFERENCES

[1] C. Pickering, K. Burnham, and M. Richardson, "A research study of hand gesture recognition technologies and applications for human vehicle interaction," in Proc. IET 3rd Inst. Eng. Technol. Conf. Automotive Electron., Warwick, U.K., 2007, pp. 1–15.
[2] K. Young and M. Regan, "Driver distraction: A review of the literature," Monash Univ. Accident Res. Centre, Clayton, VIC, Australia, 2007, pp. 379–405.
[3] J. P. Wachs, M. Kolsch, H. Stern, and Y. Edan, "Vision based hand gesture applications," Commun. ACM, vol. 54, no. 2, pp. 60–71, 2011. [Online]. Available: http://dx.doi.org/10.1145/1897816.1897838
[4] P. Breuer, C. Eckes, and S. Müller, "Hand gesture recognition with a novel IR time-of-flight range camera—A pilot study," in Proc. Int. Conf. Vis./Comput. Techn. Appl., 2007, pp. 247–260. [Online]. Available: http://dx.doi.org/10.1007/978-3-540-71457-6_23
[5] C. Keskin, F. Kırac, Y. E. Kara, and L. Akarun, "Real time hand pose estimation using depth sensors," in Proc. IEEE Int. Conf. Comput. Vis. Workshops, 2011, pp. 1228–1234. [Online]. Available: http://dx.doi.org/10.1109/ICCVW.2011.6130391
[6] P. Molchanov, S. Gupta, K. Kim, and K. Pulli, "Multi-sensor system for driver's hand-gesture recognition," in Proc. IEEE 11th Int. Conf. Workshops Automat. Face Gesture Recognit., Ljubljana, Slovenia, 2015, pp. 1–8. [Online]. Available: http://dx.doi.org/10.1109/FG.2015.7163132
[7] A. Riener, M. Rossbory, and M. Ferscha, "Natural DVI based on intuitive hand gestures," in Proc. INTERACT Workshop User Experience Cars, 2011, pp. 62–66.
[8] P. Molchanov, S. Gupta, K. Kim, and J. Kautz, "Hand gesture recognition with 3D convolutional networks," in Proc. IEEE Conf. Comput. Vis. Pattern Recognit. Workshops, Boston, MA, USA, 2015, pp. 1–7. [Online]. Available: http://dx.doi.org/10.1109/CVPRW.2015.7301342
[9] E. Ohn-Bar and M. M. Trivedi, "Hand gesture recognition in real time for automotive interfaces: A multimodal vision based approach and evaluations," IEEE Trans. Intell. Transportation Syst., vol. 15, no. 6, pp. 2368–2377, Dec. 2014. [Online]. Available: http://dx.doi.org/10.1109/TITS.2014.2337331
[10] Y. Jacob, F. Manitsaris, G. Lele, and L. Pradere, "Hand gesture recognition for driver vehicle interaction," in Proc. IEEE Comput. Soc. Workshop Observing Understand. Hands Action 28th IEEE Conf. Comput. Vis. Pattern Recognit., Boston, MA, USA, 2015, pp. 41–44.
[11] U. Reissner, "Gestures and speech in cars," Dept. Informat., Technische Univ. München, München, Germany, 2007.
[12] N. Gillian, "Gesture Recognition Toolkit, ver. 1.0," Oct. 2017. [Online]. Available: http://www.nickgillian.com/grt/
[13] P. Molchanov, S. Gupta, K. Kim, and K. Pulli, "Short-range FMCW monopulse radar for hand-gesture sensing," in Proc. IEEE Radar Conf., Johannesburg, South Africa, 2015, pp. 1491–1496. [Online]. Available: http://dx.doi.org/10.1109/RADAR.2015.7131232
[14] G. Shaker, "Gesture recognition using mm-waves," Waterloo Artif. Intell. Inst., Nov. 2017. [Online]. Available: https://goo.gl/iRqkJC
[15] J. Lien, "Soli: Ubiquitous gesture sensing with mm-wave radar," ACM Trans. Graphics, vol. 35, no. 4, 2016, Art. no. 142. [Online]. Available: http://dx.doi.org/10.1145/2897824.2925953
[16] S. Naik, H. R. Abhishek, K. N. Ashwal, and S. P. Balasubramanya, "A study on automotive human vehicle interaction using gesture recognition technology," Int. J. Multidisciplinary Cryptol. Inf. Secur., vol. 1, no. 2, pp. 6–12, 2012.
[17] M. Alpern and K. Minardo, "Developing a car gesture interface for use as a secondary task," in Proc. ACM Extended Abstracts Hum. Factors Comput. Syst., Fort Lauderdale, FL, USA, 2003, pp. 932–933. [Online]. Available: http://dx.doi.org/10.1145/765891.766078