
2020 IEEE Region 10 Symposium (TENSYMP), 5-7 June 2020, Dhaka, Bangladesh

Integration of Home Assistance with a Gesture Controlled Robotic Arm

Jahidul Islam∗, Amit Ghosh†, Md Ifraham Iqbal‡, Sadaf Meem§, Nazib Ahmad¶
†‡§Department of Computer Science & Engineering, United International University, Dhaka, Bangladesh
∗R&D Engineer, ¶Director, ANTT Robotics Ltd.
∗jahidulislamrahat97@gmail.com, †amitbd1508@gmail.com, ‡ifraham.iqbal@gmail.com, §sadafmeem1996@gmail.com, ¶nazibahmad10@gmail.com

Abstract—Technology today has escalated to a level where artificially intelligent robots are programmed and implemented to replace human activities. These robots are built to serve as humanitarian assistance. Nowadays robots are designed to perform and imitate housekeeping and human maneuvers in various kinds of work. These robots are pre-programmed and can be operated remotely without any human intervention. This paper discusses such a robot in detail, with the improved performance of its specially designed robotic arm. The robotic arm is tooled with Leap Motion to intensify and reflect the very touch of its user. Putting traditional controllers aside, the Leap Motion is a new technology that is being widely used to make robotic arm control more realistic and easier. With this device, we can track the human hand down to minute movements. The technology uses the position of the human hand to calculate the angles of the joints and helps maneuver the robotic arm with greater accuracy. This level of accuracy shall not only increase the acceptance of robots in housekeeping but also leverage their use to fend for young children and the elderly.

Index Terms—Robotic Arm; Leap Motion; Gesture Control; Home IOT; Home Assistant;

I. INTRODUCTION

Gesture in mankind shows comfort and kindness. In technology, however, gesture is used to intensify human-computer interaction. Gesture-controlled gadgets can be beneficial for a person who does not feel the urge to use voice commands or manual commands. A device that can be used remotely and is gesture-controlled has a greater demand among specially-abled people: the deaf, mute, or handicapped. It helps them regain their mental health and confidence [1]. Robotics has brought about many different technologies, such as prosthetic arms [2] and voice boxes for laryngeal cancer patients. These devices make specially-abled people feel more confident in their movements and more self-sufficient. Such a device also becomes more marketable because of its user adaptability. Robots are also increasingly taking over human work in industry [3]. In areas of research where human intervention is risky, robots are sent to inspect and carry out the research instead [4]–[6].

Our target mission is to serve children, the aged, and handicapped people with companionship and to fend for them. Our robot is a standalone IOT-based humanoid that takes commands in the form of gestures, text, and voice. Its features are designed around the needs of an elderly person or children. In addition, it can be used to operate any kind of electrical appliance in the household. Sensors such as smoke, temperature, light, sonic, microphone, and camera sensors are all installed in this robot so that every precise measurement of the environment or testing area can be tracked. These types of input leverage the robot's activities as humanitarian assistance. The hands of this robot incorporate the Leap Motion (LP) sensor to follow the exact movements of its user, just like a mirror reflection. The LP has two wide-angle cameras and three infrared LEDs with a wavelength of 850 nm to detect two hands and ten fingers [7]. The LP sensor has greater accuracy because of its large interaction space of around eight cubic feet and a viewing range of about two feet above the camera. The maximum capture rate is 200 frames per second. To use the LP, a CPU and USB controllers are required. To find the placement of an object, defining the mathematical terms and equations of the X, Y, and Z axes is necessary. The LP sensor uses infrared rays. In our system, we used the Python language for the Leap Motion Application Programming Interface (API).

Fig. 1: Arm in Action (Neutral Shoulder Position)   Fig. 2: Arm in Action (Shoulder Rotate Left)

II. LITERATURE REVIEW

In 2007, a microcontroller-based robotic arm was developed [8], built around the 8051 microcontroller (mc). The ON/OFF button shuts down or turns on the robotic arm. The START/STOP buttons allow the arm to start moving or return to its neutral state. The LEFT-RIGHT/RIGHT-LEFT button enables the robotic arm to move from left to right and vice versa. The 180/90 switch controls whether the arm rotates 90° or 180°. However, the number of commands is very limited, so it requires a lot of practice to control the robotic arm as per the user's needs. Furthermore, the 8051 mc cannot
978-1-7281-7366-5/20/$31.00 ©2020 IEEE 1

be connected through the internet. An update to the above-mentioned robot was the Android Operated Robotic Arm [9], designed in 2014 by Ali et al. An AT89C51 mc, a variant of the 8051 mc, was used to control the robotic arm. This robotic arm can be controlled with any Android mobile phone, provided that it is connected to the robotic arm with a USB link or Bluetooth. This enables the arm to be controlled through a mobile application and is very user friendly. When buttons on this application are pressed, it sends data to the IOIO board connected to the mobile application, which in turn sends data to the AT89C51 mc. The received data drives the control of the motors and actuators in the robotic arm. The arm is much easier to control as it can be operated remotely over a Bluetooth connection, but the robot still cannot be controlled through the internet. A Bluetooth connection has a maximum range of 100 meters¹ (if it belongs to Class 1). Moreover, the number of commands is still very limited, so mimicking a human arm is not possible. In another research effort [10], a mobile application was developed and connected to the robotic arm using a Raspberry Pi (Pi). The GPIO pins of the Pi controlled the movements of the robotic arm after it took input commands from the mobile application. The arm can be controlled easily from any place by the user, granted that the Pi is connected to the internet. However, this system still does not give the optimum accuracy required for the robotic arm to mimic human behaviors successfully.

As seen above, robotic arms have been widely used for industrial and research purposes before. But the concept of exactly mimicking a human reflection in the form of housekeeping and fending has not been explored to a great extent. This is where the concept of gesture-controlled robotic arms has great importance. For detecting gestures, we need to use vision-based sensors. The two most common systems available are the Leap Motion sensor² and the Xbox 360 Kinect controller³. The Kinect controller has a decent accuracy of about 1.5 cm [11], but the LP sensor has a depth accuracy of up to 0.7 mm [12], which is why the LP sensor is favored in most systems. A feasibility study by Chan et al. [13] concluded that the LP has very high accuracy when detecting gesture behavior. In 2015, a real-time robot control system using the LP [14] was introduced. The system used an OWI robotic arm connected to a computer over USB. A Java API was used to give commands to the robotic arm using the LP sensor. This system worked very efficiently and the robotic arm could be controlled with ease. However, for the LP-based robotic arm to serve as a home assistant and to work in areas that humans cannot access, some more features need to be implemented.

III. METHODOLOGY

A. Proposed Concepts

In 2019, a smart home system was designed [15] that enabled the user to have video surveillance of their home through the internet. The system also had numerous sensors that allowed the user to check the humidity and temperature in the house and provided vigilance against intruders breaking in. The robotic arm discussed below affirms how much easier it makes for a user to engage with the system to do numerous household tasks. The system discussed in the previous paper also uses a Pi. Since the LP sensor is supported by the Pi, the two systems are connected as shown here⁴. The joints (actuators) on the robotic arm are controlled using the Arduino Mega 2560. As the robotic arm is gesture-controlled, it enables the user to do regular movements like lifting, dropping, pulling, and other delicate work. A gesture is detected by the LP sensor, which sends commands to the Pi. Accordingly, the Pi sends commands to the Arduino, which manipulates the movement of the robotic arm.

B. Electrical System

In this research, we used the Arduino Mega 2560 as the mc, along with a relay controller module, a camera, a control circuit and applications, and a Lithium Polymer battery. Figure 3 shows the design of the connections of the prototype. The Arduino Mega 2560 is a mc board based on the ATmega2560. It has 54 digital input/output (I/O) pins and 16 analog pins. The operating voltage for this mc is 5V. The input voltage is between 7-12V, and the current rating is 200mA. The servo motors are controlled using 7 I/O pins. Commands are sent from the Pi to the Arduino, and the Arduino controls the motors according to those commands.

Fig. 3: Electrical System

C. Integration of the Leap Motion Controller

The LP controller is an external device that faces upwards, placed on any surface; it has two IR cameras and three infrared LEDs. The cameras detect the movements of the hand. The LEDs detect the shape and structure of the hand, concentrating on the joints to detect changes in motion and the gesture. The position of the hand is then used to calculate the joint

¹https://www.samsung.com/pk/support/mobile-devices/what-is-the-maximum-range-of-a-bluetooth-connection/
²https://www.ultraleap.com/product/leap-motion-controller/
³https://support.xbox.com/en-US/xbox-360/accessories/body-controller
⁴http://blog.leapmotion.com/integrate-leap-motion-arduino-raspberry-pi/


angles that, in turn, let the computer rotate the robotic arm joints with great speed.
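The position-to-angle conversion described above can be sketched as a simple linear mapping. The sketch below is ours: the tracking range, servo limits, and function names are illustrative assumptions, not the authors' actual values.

```python
# Hypothetical sketch: turning a tracked palm position (millimetres, as
# reported by the Leap Motion) into a servo angle. All ranges are assumed.

def map_range(value, src_lo, src_hi, dst_lo, dst_hi):
    """Linearly map value from [src_lo, src_hi] to [dst_lo, dst_hi], clamped."""
    value = max(src_lo, min(src_hi, value))
    scale = (dst_hi - dst_lo) / (src_hi - src_lo)
    return dst_lo + (value - src_lo) * scale

def palm_to_base_angle(palm_x_mm):
    # Assume a sweep of roughly +/-150 mm across the sensor, mapped onto the
    # base servo's 0-180 degree range (neutral at 90 degrees).
    return map_range(palm_x_mm, -150.0, 150.0, 0.0, 180.0)

print(palm_to_base_angle(0.0))  # palm centred over the sensor -> 90 degrees
```

Clamping the input keeps the servo command inside its mechanical limits even when the hand briefly leaves the tracked volume.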

D. Materials Used

The structure of the hand is made of Poly Lactic Acid (PLA). Unlike petrochemical-based plastics, PLA is considered biodegradable and environmentally friendly. PLA is also very affordable, and its properties make it one of the easiest materials to 3D print with [16].

TABLE I: Length of segments of robotic arm

Location | Length (cm)
Base | 11.5
Shoulder - Elbow (Shoulder) | 23.5
Elbow - Wrist (Elbow) | 16
Wrist - Wrist Rolling (Wrist) | 9.5
End effector | 12

E. Mechanical Structure Implementation

The degrees of freedom (DoF) of a robotic arm define the flexibility of the hands used in the robot. The capability and performance of a robotic arm become more human-like as the degrees of freedom increase⁵. The robotic arm used in our system has 6 DoF [17], replicating the joints, the ball-and-socket mechanism of a shoulder, the wrists, and the fingers to give the robotic arm a human-like capability. The joints are driven by internal DC gear motors and give near-to-millimeter accuracy using inverse kinematics [18]. Twelve (12) servo motors, the MG995 Metal Gear Servo⁶ with 180 degrees of rotation, are used in this prototype for balanced movement of links and joints. The DC gear motors operate at 12V DC, 3A. For controlling the DC motors, a 5V 4-channel relay interface board is used, equipped with high-current relays rated for 30V DC, 10A. The MG995 operates between 4.8V and 7.2V and has a stall torque of 9.4 kg/cm. The MG995 is used to operate the hand's links, joints, and head functionalities in this robot. The mechanism is discussed in detail based on Table I and Table II as follows:

1) Base Plate - Mimics the ball-and-socket mechanism of a human shoulder. It is the first DoF of the robotic arm and can rotate a full 180°. Driven by a servo motor and a metal bearing, it acts as the ball of the shoulder. Figures 1 and 2 show how the movement of the base plate can manipulate the robotic arm. The UP and DOWN 180° movement acts as the socket of a human shoulder. This is the second DoF of the robotic arm, geared by two servo motors. Its neutral position is 90°, the reverse position is 120°, and the forward position is 60°.

2) Elbow Joint - The elbow joint is the third DoF of the robotic arm and has two movements, UP and DOWN, over a full 180°, with a neutral position of 60°. The elbow joint enables lifting, pulling, and pushing tasks, just like a human elbow.

3) Wrist Joint - The wrist provides a full 180° UP and DOWN movement with a neutral position of 20°; this is the 4th DoF of the robotic arm. Just like the wrist of a human hand, wrist rolling, the 5th DoF of the robotic arm, has a rotation of 180° and a neutral position of 90°. Figures 4 and 5 show how this joint is affected by the motion of the actuator.

4) End Effector - The end effector mimics the fingers of the human hand. This robotic hand has 2 fingers with a neutral position of 90° and maximum and minimum angles of 120° and 30°, respectively.

Fig. 4: Arm in Action (Neutral Wrist)   Fig. 5: Arm in Action (Rotated Wrist)

TABLE II: Specification of motors used in Robotic Hand

Name | Min (Angle) | Max (Angle) | Neutral (Angle) | Torque (kg/cm)
Base Servo | 0 | 180 | 90 | 20
Shoulder 1 | 90 | 180 | 120 | 20
Shoulder 2 | 0 | 90 | 60 | 20
Elbow Servo | 40 | 90 | 60 | 20
Wrist Servo | 0 | 60 | 20 | 15
Wrist Rolling | 0 | 180 | 90 | 9
End Effector | 30 | 120 | 90 | 9

Fig. 6: Joint Angles

F. Joint Trajectory

Arm manipulator control for industrial applications has now reached a good level of maturity. For the robotic arm to perform its task with better stability and without any shaking or wavering, it needs a good calculation of its trajectory. The joint trajectory works with the quadruples αᵢ₋₁, aᵢ₋₁, d, and θ; from the coordinate arrangement of each link and joint, the basic transformation matrix has been calculated with respect to the base of the arm [19]. It is found as shown in equation 1.

T60 = | C1C5S234 + S1S3    −C1S234S5 + S1C5    C1C234    C1·A |
      | −S1C5C234 − C1S5    S1C234C5 + C1C5    S1C234    S1·A |
      | C234C5              C234S5             −S234     B    |
      | 0                   0                  0         1    |   (1)

A = L2S2 + L3S23 + L4C234   (2)

⁵https://link.springer.com/content/pdf/bbm%3A978-3-319-19788-3%2F1.pdf
⁶https://www.autobotic.com.my/servo-motor-mg995-metal-gear-180-degree

B = L1 + L2C2 + L3C23 − L4S23   (3)

A can be found using equation 2 and B by using equation 3. In equation 1, the upper-left 3×3 matrix represents the orientation, and the last column represents the position (X, Y, Z) of the end-effector with respect to the base.
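As a quick numerical check of equations 2 and 3, the position terms can be evaluated directly. The sketch below is ours, and it assumes that L1 through L4 correspond to the base, shoulder, elbow, and wrist segment lengths of Table I; that pairing is not stated explicitly in the text.

```python
import math

# Illustrative evaluation of equations (2) and (3), the position terms of
# the end-effector. Segment lengths (cm) are taken from Table I; mapping
# them to L1..L4 is our assumption.
L1, L2, L3, L4 = 11.5, 23.5, 16.0, 9.5

def position_terms(t2, t3, t4):
    """Return (A, B) for joint angles t2, t3, t4 given in radians."""
    S2, C2 = math.sin(t2), math.cos(t2)
    S23, C23 = math.sin(t2 + t3), math.cos(t2 + t3)
    C234 = math.cos(t2 + t3 + t4)
    A = L2 * S2 + L3 * S23 + L4 * C234      # equation (2)
    B = L1 + L2 * C2 + L3 * C23 - L4 * S23  # equation (3)
    return A, B

# With the shoulder raised 90 degrees and the elbow and wrist straight:
A, B = position_terms(math.pi / 2, 0.0, 0.0)
print(round(A, 2), round(B, 2))  # 39.5 2.0
```

At that pose the shoulder and elbow segments extend fully along one axis (A = L2 + L3 = 39.5 cm), which matches the segment lengths in Table I.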
Fig. 7: System Flow of Robotic Arm

TABLE III: Human Arm and Robotic Arm interaction

Number | Hand Action | Robot Action
1 | Hand Movement | Robot Movement
2 | Palm Half Grip | Grab
3 | Open Palm | Place
4 | Make a fist | Stop
5 | Palm Rotation | Rotation

Gesture data is sent to the Pi using the LP SDK, and the Pi decodes it. Details on how the data are mapped are discussed in subsubsection III-G4. The decoded data are then forwarded to the Arduino, which manipulates the servo motors accordingly. Here, data calculation is done using NumPy, with Python 3.7 as the programming language.

The inverse kinematics model is one of the most important measures in robotics [18]. Joint angles and positions are determined by minimizing the error between the experimental and theoretical values [20]. The angles of the joints of our robotic arm are given by θ1, θ2, θ3, θ4, and θ5, as shown in figure 6. We calculated the rotation and translation of the joints using the matrix shown in equation 4. In equation 4, Px, Py, and Pz are the translation of the end effector with respect to the base of the arm. Now, the general expressions of the angles are given by equations 5, 6, 7, 9, 10, 11, 8, and 12.

T = | nx  Ox  ax  Px |
    | ny  Oy  ay  Py |
    | nz  Oz  az  Pz |
    | 0   0   0   1  |   (4)

θ1 = Atan(Px, Py)   (5)

S = C1·ax + S1·ay   (6)

C234 = az   (7)

θ2 = Atan(S2, C2)   (8)

θ234 = Atan(S234, C234)   (9)

C3 = [(C1Px + S1Py + l4S234)² + (Pz − l1 + l4C234)² − l2² − l3²] / (2·l2·l3)   (10)

S2 = [−(C1Px + S1Py + l4S234)·S3·l3 + (C3·l3 + l2)·(Pz − l1 + l4C234)] / [(l3C3 + l2)² + S3²·l3²]   (11)

θ4 = θ234 − θ3 − θ2   (12)
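Equation 5 can be exercised directly to recover the base rotation from a target position. The sketch below is ours; we read Atan(Px, Py) as the two-argument arctangent, and the argument order passed to `atan2` is our assumption.

```python
import math

# Illustrative use of equation (5): base joint angle from the target
# end-effector position. Treating Atan(Px, Py) as a two-argument
# arctangent (atan2) is our reading of the notation.

def base_angle(Px, Py):
    """Equation (5): rotation of the base toward the target (Px, Py)."""
    return math.atan2(Py, Px)

# A target 30 cm ahead and 30 cm to the side implies a 45 degree base turn:
theta1 = base_angle(30.0, 30.0)
print(round(math.degrees(theta1), 1))  # 45.0
```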
G. Software Development

1) Gesture Recognition: The proposed system uses dynamic hand gesture recognition with a Support Vector Machine (SVM) [21]. The LP sensor detects the 3D motion, after which feature vectors of the gestures are extracted. Early recognition of the gesture can then be achieved by stretching the trial database. The test database is used to model the SVM.

2) Software Algorithm:
1) Connect the LP using the LP SDK
2) Add a hand-position-change listener
3) When the hand position changes:
   • Get the position (x, y, z) of the fingers and palm
   • Convert those positions according to the data mapping
   • Send the data to the connected Arduino using the pyduino library
4) Repeat

3) Console Application: For the console application we used CircleGesture⁷, KeyTapGesture⁸, ScreenTapGesture⁹, and SwipeGesture¹⁰. These gestures are read by the LP sensor.

⁷Continuous gesture to create a circle
⁸Tap gesture using a finger
⁹Gesture similar to tapping a vertical screen
¹⁰Gesture similar to a horizontal swipe

4) Data Mapping: One of the most important tasks when using the LP sensor is to ensure that the data is mapped properly. The LP uses a 3-dimensional coordinate system [22] in units of millimeters, with the top center of the hardware itself as the origin. The coordinate values found can be mapped onto the application-defined coordinate system using the formula in equation 13 for all 3 coordinates. Using the values found from equation 13, the Pi will manipulate the arm. Figure 8 shows how the Pi uses the mapped data to control the robotic arm. For example, if the value is between 1000 and 1180, the base plate is rotated until further instructions are forwarded.

xa = (xl − Ls) · (Appr / Leapr) + Apps   (13)

Fig. 8: Data Mapping

IV. RESULTS

To deduce the performance of our system, we created a sequence of commands arbitrarily and first implemented them with a joystick, ensuring that every command was carried out 20 times. The same sequence was then repeated using gesture control. The efficiency of our system

in identifying the different commands sent to the robotic arm was compared to when it was controlled by a joystick. Table III shows the different commands we have for controlling the robotic arm with gestures. Table IV shows the efficiency of our system.

TABLE IV: Efficiency of the System

Hand Action | Gesture Control Accuracy
Robot Movement | 100%
Grab | 80%
Place | 96%
Stop | 100%
Rotation | 88%

From Table IV we can see that our system has a mean efficiency of 92.8%. The robot can almost perfectly move, place an object, or stop moving when the user instructs it to. However, there were a few drawbacks. From Table III we can see that the Grab command is read when the LP controller detects a half-grip palm, and the Place command is read when the controller detects a fully open palm. The controller at times failed to identify the Grab command and mostly read it as the Place command instead. The same occurred with the Rotation command, as the LP sensor identified it as Movement instructions on some occasions. Although the Movement command is identified perfectly, we faced a few issues with the direction the arm moved in. The LP uses infrared rays, which cannot penetrate through an object. As a result, when overlapping occurs, the movement direction is not followed properly. The LP sensor also fails to detect the motion if the starting position is sideways or upside down.

V. CONCLUSIONS AND LIMITATIONS

As noted before, human-computer interactive devices, robots, and machines have recently become very popular. Undoubtedly, gesture-controlled robots can be handled with very basic technical knowledge and can serve anyone. The integration of the LP sensor to maneuver the robotic hand in the home-assistance robot is a remarkable way to control a complex robot. We have used this system in various situations where human intervention is rare, and we have had the whole system tested by specially-abled users, who were able to use the robotic arm very efficiently and with ease. The LP allows us to increase the level of performance and improve the system. Increasing the degrees of freedom will make the system more flexible and better to use. Using grippers at the tips of the fingers will make it easier to grab things. Currently, we have implemented the arm with two fingers; increasing the number of fingers will increase the maneuverability of the robotic arm.

REFERENCES

[1] M. M. M. Maung and S. N. L. Aung, "Stair climbing robot with stable platform," 2019.
[2] H. Takeda, N. Tsujiuchi, T. Koizumi, H. Kan, M. Hirano, and Y. Nakamura, "Development of prosthetic arm with pneumatic prosthetic hand and tendon-driven wrist," in 2009 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 5048–5051, IEEE, 2009.
[3] R. Goel and P. Gupta, "Robotics and industry 4.0," in A Roadmap to Industry 4.0: Smart Production, Sharp Business and Sustainable Development, pp. 157–169, Springer, 2020.
[4] V. B. Semwal, "Data driven computational model for bipedal walking and push recovery," arXiv preprint arXiv:1710.06548, 2017.
[5] V. B. Semwal, S. A. Katiyar, R. Chakraborty, and G. C. Nandi, "Biologically-inspired push recovery capable bipedal locomotion modeling through hybrid automata," Robotics and Autonomous Systems, vol. 70, pp. 181–190, 2015.
[6] N. Gouthami, A. S. Reddy, M. Susmitha, M. T. Sree, K. Mamatha, and N. S. Reddy, "Numerous sensors based environmental weather variables monitoring robotic system over iot," CLIO An Annual Interdisciplinary Journal of History, vol. 6, no. 2, pp. 604–611, 2020.
[7] L. Yang, J. Chen, and W. Zhu, "Dynamic hand gesture recognition based on a leap motion controller and two-layer bidirectional recurrent neural network," Sensors, vol. 20, no. 7, p. 2106, 2020.
[8] J. Olawale, A. Oludele, A. Ayodele, and N. M. Alejandro, "Development of a microcontroller based robotic arm," in Proceedings of the 2007 Computer Science and IT Education Conference, pp. 549–557, 2007.
[9] Z. A. Ali, M. Tanveer, H. Shaukat, and S. Anwar, "Android operated robotic arm," Universal Journal of Control and Automation, vol. 2, no. 1, p. 13, 2014.
[10] K. Premkumar and K. G. J. Nigel, "Smart phone based robotic arm control using raspberry pi, android and wi-fi," in 2015 International Conference on Innovations in Information, Embedded and Communication Systems (ICIIECS), pp. 1–3, IEEE, 2015.
[11] K. Khoshelham and S. O. Elberink, "Accuracy and resolution of kinect depth data for indoor mapping applications," Sensors, vol. 12, no. 2, pp. 1437–1454, 2012.
[12] F. Weichert, D. Bachmann, B. Rudak, and D. Fisseler, "Analysis of the accuracy and robustness of the leap motion controller," Sensors, vol. 13, no. 5, pp. 6380–6393, 2013.
[13] A. Chan, T. Halevi, and N. Memon, "Leap motion controller for authentication via hand geometry and gestures," in International Conference on Human Aspects of Information Security, Privacy, and Trust, pp. 13–22, Springer, 2015.
[14] T. V. S. N. Venna and S. Patel, "Real-time robot control using leap motion: a concept of human-robot interaction," ASEE, 2015.
[15] S. Kayastha and P. Upadhyaya, "Design and implementation of a cost-efficient smart home system with raspberry pi and cloud services," in 2019 Artificial Intelligence for Transforming Business and Society (AITB), vol. 1, pp. 1–7, IEEE, 2019.
[16] Y. Tajitsu, "Piezoelectric poly-l-lactic acid fabric and its application to control of humanoid robot," Ferroelectrics, vol. 515, no. 1, pp. 44–58, 2017.
[17] J. Iqbal, R. U. Islam, and H. Khan, "Modeling and analysis of a 6 dof robotic arm manipulator," Canadian Journal on Electrical and Electronics Engineering, vol. 3, no. 6, pp. 300–306, 2012.
[18] J. Q. Gan, E. Oyama, E. M. Rosales, and H. Hu, "A complete analytical solution to the inverse kinematics of the pioneer 2 robotic arm," Robotica, vol. 23, no. 1, pp. 123–129, 2005.
[19] T. I. R. Uday, N. Ahmad, A. Ghosh, J. Jahin, M. M. Rahman, M. I. Iqbal, M. T. Islam, F. Farzana, M. M. Rahman, G. E. U. Rahman, et al., "Design and implementation of the next generation mars rover," in 2018 21st International Conference of Computer and Information Technology (ICCIT), pp. 1–6, IEEE, 2018.
[20] V. B. Semwal and G. C. Nandi, "Generation of joint trajectories using hybrid automate-based model: a rocking block-based approach," IEEE Sensors Journal, vol. 16, no. 14, pp. 5805–5816, 2016.
[21] Y. Chen, Z. Ding, Y.-L. Chen, and X. Wu, "Rapid recognition of dynamic hand gestures using leap motion," in 2015 IEEE International Conference on Information and Automation, pp. 1419–1424, IEEE, 2015.
[22] S. Chen, H. Ma, C. Yang, and M. Fu, "Hand gesture based robot control system using leap motion," in International Conference on Intelligent Robotics and Applications, pp. 581–591, Springer, 2015.
