http://doi.org/10.4038/icter.v9i2.7180
[Figure: wireless sensor nodes (Sensor → MCU → nRF24L01) communicating with a PC]
Fig. 1 Kinematics [14]: (a) forward kinematics, (b) inverse kinematics
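The forward-kinematics step of Fig. 1(a) rotates a bone vector by the segment's orientation quaternion and adds it to the parent joint position, as in the paper's wrist equation (7). Below is a minimal sketch of that single step; the segment length, elbow position, and rotation values are illustrative assumptions, not data from the paper.

```python
import math

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z): q ⊗ v ⊗ q*."""
    w, x, y, z = q
    # efficient sandwich product: v' = v + w*t + u × t, with t = 2 (u × v)
    t = (2 * (y * v[2] - z * v[1]),
         2 * (z * v[0] - x * v[2]),
         2 * (x * v[1] - y * v[0]))
    return (v[0] + w * t[0] + y * t[2] - z * t[1],
            v[1] + w * t[1] + z * t[0] - x * t[2],
            v[2] + w * t[2] + x * t[1] - y * t[0])

# Hypothetical numbers: a 0.25 m forearm hanging along -z,
# rotated 90° about the x axis by the sensed orientation quaternion.
half = math.radians(90) / 2
q_forearm = (math.cos(half), math.sin(half), 0.0, 0.0)
p_elbow = (0.0, 0.0, 1.0)
v_forearm = (0.0, 0.0, -0.25)

# p_wrist = p_elbow + q ⊗ v ⊗ q*, per (7)
p_wrist = tuple(pe + d for pe, d in zip(p_elbow, quat_rotate(q_forearm, v_forearm)))
# p_wrist ≈ (0.0, 0.25, 1.0)
```

Chaining this step segment by segment (shoulder → elbow → wrist) yields the whole arm posture from the sensors' orientation quaternions.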
\[
p_{\mathrm{wrist}} = p_{\mathrm{elbow}} + q_{\mathrm{forearm}} \otimes v_{\mathrm{forearm}} \otimes q_{\mathrm{forearm}}^{*}. \tag{7}
\]

\[
(x_{\mathrm{elbow}}, y_{\mathrm{elbow}}, z_{\mathrm{elbow}})^{T} =
\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta_{RSx} & -\sin\theta_{RSx} \\ 0 & \sin\theta_{RSx} & \cos\theta_{RSx} \end{bmatrix}
\begin{bmatrix} \cos\theta_{RSy} & 0 & \sin\theta_{RSy} \\ 0 & 1 & 0 \\ -\sin\theta_{RSy} & 0 & \cos\theta_{RSy} \end{bmatrix}
\begin{bmatrix} 0 \\ 0 \\ -l_{0} \end{bmatrix}. \tag{10}
\]

Using the values $\theta_{RSx}$ and $\theta_{RSy}$ from (10), the motion range of the elbow joint can be calculated. By comparing this elbow range with the result of (6), we can filter the input sensor data: if the result of (6) is not within the elbow range, we assume it may not be a regular human motion.

For the right-elbow (RE) joint shown in Figure 7(b), we must deal not only with its own range of motion; since it is part of a kinematic chain, it also has to obey the transfer relation. So the angles can be described by the following ranges:

\[
\theta_{RSx} < \theta_{REx} < \theta_{RSx} + 150^{\circ}, \qquad
\theta_{RSy} < \theta_{REy} < \theta_{RSy} + 150^{\circ}.
\]

After we obtain the elbow's range of motion, with fore-arm length $l_{1}$ the end values can be calculated using

\[
(x_{\mathrm{wrist}}, y_{\mathrm{wrist}}, z_{\mathrm{wrist}})^{T} =
\begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\theta_{RSx} & -\sin\theta_{RSx} \\ 0 & \sin\theta_{RSx} & \cos\theta_{RSx} \end{bmatrix}
\begin{bmatrix} \cos\theta_{RSy} & 0 & \sin\theta_{RSy} \\ 0 & 1 & 0 \\ -\sin\theta_{RSy} & 0 & \cos\theta_{RSy} \end{bmatrix}
\begin{bmatrix} 0 \\ 0 \\ -l_{1} \end{bmatrix}
+
\begin{bmatrix} x_{\mathrm{elbow}} \\ y_{\mathrm{elbow}} \\ z_{\mathrm{elbow}} \end{bmatrix}. \tag{11}
\]

For a sensor at rest, the linear acceleration is quite small, so the accelerometer signal can be regarded as the gravitational acceleration. In most situations, however, kinematic linear acceleration is present. Thus the accelerometer and gyroscope data are fused using an EKF to estimate the gravitational acceleration. The state vector of the dynamical system is the gravitational acceleration expressed in the sensor coordinate frame, represented by $g^{s}$. The control vector of the system is the angular velocity from the gyroscope, denoted by $\omega^{s}$. The measurement vector of the system is the acceleration data from the accelerometer, denoted by $a^{s}$. Define $g^{s} = [g^{s}_{x}, g^{s}_{y}, g^{s}_{z}]^{T}$, $\omega^{s} = [\omega^{s}_{x}, \omega^{s}_{y}, \omega^{s}_{z}]^{T}$, $a^{s} = [a^{s}_{x}, a^{s}_{y}, a^{s}_{z}]^{T}$. Then the state equation of the dynamical system is represented by

\[
\begin{aligned}
g^{s}_{x}(t+T) &= g^{s}_{x}(t) - \left[\omega^{s}_{y}(t)\,g^{s}_{z}(t) - \omega^{s}_{z}(t)\,g^{s}_{y}(t)\right]T,\\
g^{s}_{y}(t+T) &= g^{s}_{y}(t) - \left[\omega^{s}_{z}(t)\,g^{s}_{x}(t) - \omega^{s}_{x}(t)\,g^{s}_{z}(t)\right]T,\\
g^{s}_{z}(t+T) &= g^{s}_{z}(t) - \left[\omega^{s}_{x}(t)\,g^{s}_{y}(t) - \omega^{s}_{y}(t)\,g^{s}_{x}(t)\right]T,
\end{aligned} \tag{12}
\]

i.e. $g^{s}(t+T) = g^{s}(t) - T\,\omega^{s}(t) \times g^{s}(t)$, where $T$ denotes the sampling period of the sensors. The measurement equation of the system is represented by

\[
\begin{bmatrix} a^{s}_{x}(t) \\ a^{s}_{y}(t) \\ a^{s}_{z}(t) \end{bmatrix} =
\begin{bmatrix} g^{s}_{x}(t) \\ g^{s}_{y}(t) \\ g^{s}_{z}(t) \end{bmatrix} +
\begin{bmatrix} n_{gx}(t) \\ n_{gy}(t) \\ n_{gz}(t) \end{bmatrix}, \tag{13}
\]

where $n = [n_{gx}, n_{gy}, n_{gz}]^{T}$ denotes the measurement noise.

The motion of the human skeleton is described using Euler angles, so all sensor data are converted into angles in order to reconstruct the dynamics of human motion. The gyroscope reading is converted to an angle by integrating over the sensor sampling time, and the accelerometer yields a deflection angle through trigonometry.
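For a given gyroscope reading, the state equation (12) is linear in $g^{s}$ and the measurement equation (13) has $H = I$, so the fusion can be sketched as a small Kalman filter. The class name, noise covariances, and initial state below are illustrative assumptions, not values from the paper.

```python
import numpy as np

def skew(w):
    """Cross-product matrix, so that skew(w) @ g == np.cross(w, g)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

class GravityEKF:
    """Tracks the gravity vector g^s in the sensor frame.

    Prediction follows (12): g(t+T) = g(t) - T * (w x g);
    the update follows (13): a = g + n, i.e. H = I.
    """
    def __init__(self, g0, q_var=1e-4, r_var=1e-2):
        self.g = np.asarray(g0, dtype=float)   # state: gravity in sensor frame
        self.P = np.eye(3) * 1e-2              # state covariance (assumed)
        self.Q = np.eye(3) * q_var             # process-noise covariance (assumed)
        self.R = np.eye(3) * r_var             # measurement-noise covariance (assumed)

    def step(self, omega, accel, T):
        # --- predict: rotate g against the sensed angular velocity, per (12) ---
        F = np.eye(3) - T * skew(omega)        # state-transition Jacobian
        self.g = F @ self.g
        self.P = F @ self.P @ F.T + self.Q
        # --- update with the accelerometer measurement, per (13) ---
        K = self.P @ np.linalg.inv(self.P + self.R)
        self.g = self.g + K @ (np.asarray(accel, dtype=float) - self.g)
        self.P = (np.eye(3) - K) @ self.P
        return self.g
```

For a stationary sensor (zero angular velocity, accelerometer reading gravity), the estimate converges to the measured gravity vector, while the prediction step keeps the estimate consistent during rotation.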
IF a > 0.4477539
    Angle_accel = sin⁻¹(x / gyro)
ELSE
    Angle_accel = tan⁻¹(x / z)
∀ g ∈ gyro
    g = Ext. Kal. Filter(gyro, accel)
    Angle_g = ∫ g dt
RETURN Angle_accel, Angle_g
END

F. Gait Estimation
Walking estimation is conducted in the following two steps, walking-status detection and kinematic analysis, as written in Algorithm 3. Walking-status detection determines the support leg, whose ankle joint is selected as the root joint of the lower-body model during walking. Human walking is a cyclical motion that can be divided into two phases, as shown in Figure 8(a): the stance phase (ST) and the swing phase (SW). When both legs are standing, they are in the double stance phase (DS). During the DS phase, we consider the leg that is just going into ST as the support leg. The stick model of the walking gait is shown in Figure 8(b): (p) left terminal swing, right terminal stance, right toe reference; (q), (r), (s) left support leg, left toe reference; (t) left terminal stance, right terminal swing, left toe reference. The grey dashed lines represent the lower-body movements.

Fig. 8 (a) Leg status during walking, (b) walking simulation.

Algorithm 3: Walking estimation
INPUT : leg status (LS), stance (ST), swing (SW)
OUTPUT: Kinematic analysis (Ka)
BEGIN
∀ LS
    IF both legs ST
        Ka = calculate(Forward Kinematics)
    ELSE IF one leg ST and the other SW
        IF leg stands
            Ka = calculate(Forward Kinematics)
        ELSE
            Ka = calculate(Inverse Kinematics)
RETURN Ka
END

IV. EXPERIMENTAL RESULTS AND DISCUSSION
We performed the following experiments to evaluate the efficacy of our inertial-sensor-based human motion tracking system. We used the OpenNI library, since it is open source and we could easily obtain the coordinate value of each joint in the human model. First, we compared the tracking results, recording the position data of 14 joints with both our method and Kinect: left shoulder, left elbow, left wrist, right shoulder, right elbow, right wrist, left hip, left knee, left ankle, right hip, right knee, right ankle, chest, and head.

A. Comparing with the Kinect Result
While running our method and the OpenNI project at the same time, we recorded the posture results. Inertial sensors are mounted on the human body segments, including the upper arms, forearms, chest, thighs and shanks. First we sat on a chair and swung the arms; then we stood and performed more postures. Figure 9 shows six screenshots of the recorded results. In each screenshot, the left picture shows the result of the proposed method while the right picture shows Kinect OpenNI. During this experiment, we observed that the proposed motion tracking system tracks the position of the human skeleton well in real time.
B. Coordinate Diagram
We have recorded the coordinate values of 13 skeleton joints, including the shoulder, elbow, wrist, hip, knee and ankle joints and the chest, for both methods. Among them we consider the chest, left wrist and right ankle as examples. Figure 10 depicts the comparative joint coordinate diagrams for our method and Kinect.

Fig. 10 Coordinate diagram results: (a) chest result using our method, (b) using Kinect OpenNI; (c) left wrist position result using our method, (d) using Kinect OpenNI; (e) right ankle result using our method, (f) Kinect result.

TABLE 1
MOTION ANALYSIS OVER TIME IN SECONDS AND JOINT POSITION VALUES CORRESPONDING TO FIGURES 10(a) AND 10(b)

Time |           (a)            |           (b)
     |    x       z       y     |    x       z       y
 0   | -0.0013   0.0076  0.0073 |  0        0.0075  0.0074
 1   | -0.0025   0.0075  0.0075 | -0.005    0.0076  0.0078
 2   | -0.0075   0.0076  0.0075 | -0.01     0.0077  0.0078
 3   |  0.005    0.0125  0.0125 | -0.0025   0.0076  0.0077
 4   |  0.0025   0.0075  0.0076 |  0.005    0.0078  0.0078
 5   |  0.01875  0.0075  0.0075 |  0.005    0.0076  0.0078

C. Applications
HCI is widely used in many parts of human life, such as video games, virtual reality, 3D technology and multimedia. We used our method to play some video games and to control Microsoft PowerPoint. Figure 11(a) shows a first-person shooting (FPS) game being played using our method. The chest joint angle controls the rotation of the player: when the chest joint leans to the right side, the player in the video game also leans to the right side. Ankle joint movements control the player's steps, and the player fires when we raise the right hand. A flight simulator game has also been tested with our method, as shown in Figure 11(b); we used the upper body's posture to control the plane's flight state.

Fig. 11 Game control using our method: (a) FPS game using our method, (b) flight simulator game using our method.

D. Walking Estimation
As we used two kinds of kinematics, our model is not fixed at the centre of mass; the model can walk just as we do in reality. We have recorded many walking frames, as shown in Figure 12.
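The two-kinematics walking estimation above follows Algorithm 3: a stance (ST) leg serves as the support leg, whose ankle becomes the root joint and is solved by forward kinematics, while a swing (SW) leg is solved by inverse kinematics. Below is a minimal sketch of that selection logic; the leg-state encoding, function names, and placeholder solvers are illustrative assumptions, not the paper's code.

```python
# Hypothetical leg states, matching the ST/SW labels in Algorithm 3.
ST, SW = "stance", "swing"

def forward_kinematics(leg):
    return f"FK({leg})"   # placeholder for the real forward-kinematics solver

def inverse_kinematics(leg):
    return f"IK({leg})"   # placeholder for the real inverse-kinematics solver

def walking_estimation(leg_status):
    """Select a solver per leg, e.g. leg_status = {'left': ST, 'right': SW}."""
    if all(s == ST for s in leg_status.values()):
        # double stance (DS): both legs solved by forward kinematics
        return {leg: forward_kinematics(leg) for leg in leg_status}
    ka = {}
    for leg, status in leg_status.items():
        if status == ST:
            # support leg: its ankle is the root joint, use forward kinematics
            ka[leg] = forward_kinematics(leg)
        else:
            # swing leg: solve with inverse kinematics
            ka[leg] = inverse_kinematics(leg)
    return ka
```

Because the root joint moves to whichever ankle is currently in stance, the skeleton is not pinned at the centre of mass, which is what lets the reconstructed model step forward frame by frame.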
[20] Holbrook, E. A., Barreira, T. V., & Kang, M. (2009). Validity and Reliability of Omron Pedometers for Prescribed and Self-Paced Walking. Medicine & Science in Sports & Exercise, 41(3), 670-674.
[21] Yang, C., & Hsu, Y. (2010). A Review of Accelerometry-Based Wearable Motion Detectors for Physical Activity Monitoring. Sensors, 10(8), 7772-7788.
[22] Bamberg, S., Benbasat, A., Scarborough, D., Krebs, D., & Paradiso, J. (2008). Gait Analysis Using a Shoe-Integrated Wireless Sensor System. IEEE Transactions on Information Technology in Biomedicine, 12(4), 413-423.
[23] Mihelj, M. (2006). Inverse Kinematics of Human Arm based on Multisensor Data Integration. Journal of Intelligent and Robotic Systems, 47, 139-153.
[24] Downs, L. (Ed.). (2003, June 03). CS184: Using Quaternions to Represent Rotation. Retrieved December 23, 2016, from http://web.archive.org/web/20040214133703/http://www.cs.berkeley.edu/~laura/cs184/quat/quaternion.html.