DOI: https://doi.org/10.32350/icr.31.05
History: Received: December 7, 2023; Revised: February 15, 2022; Accepted: May 25, 2023; Published: June 24, 2023
Citation: M. I. Majid, A. Gauhar, and A. Rasool, "Three axis kinematics study for motion capture using augmented reality," Innov. Comput. Rev., vol. 3, no. 1, pp. 71–108, 2023, doi: 10.32350/icr.31.05
A publication of
School of Systems and Technology
University of Management and Technology, Lahore, Pakistan
Three-axis Kinematics Study for Motion Capture using Augmented Reality
Muhammad Imran Majid1,2∗, Ali Gauhar2, and Aamir Rasool2
1Visiting Associate Professor, School of Engineering, University of Warwick, UK
2Karachi, Pakistan
ABSTRACT Motion Capture (MoCap) refers to the process of tracking, processing, recording, and translating real-world motions into a 3D model. MoCap provides vital realism to animal and human motion, not only in the entertainment industry but also in other sectors. It is frequently used in medical motion therapy, robotics, Augmented Reality (AR), and Virtual Reality (VR). Additionally, MoCap techniques and systems based on Inertial Measurement Units (IMUs) enable the development of compact sensors that can be worn on the body and send movement signals to a host computer. An IMU can estimate the orientation of a rigid body. To compute orientation, IMU technology employs data from three sensors, namely a magnetometer, an accelerometer, and a gyroscope. The movement of any object can be monitored using multiple IMU devices. The current study describes the design and execution of a real-time motion capture system to record the human arm's movement and a six-degree-of-freedom tracking system using COTS components and reduced noise. For this purpose, this study covers the assembly of the hardware components, the software-side analysis of the obtained motion data, and its implementation on a 3D model. The proposed design was adjustable and long-lasting. In terms of overall noise filtration across the six DoFs, the system was improved using RMS-based noise analysis, with filtration percentages of 10.924% and 40.787% for the gyroscope and accelerometer, respectively. The real-time motion capture technique using a prototype revealed that it is feasible to collect the values of motion parameters and track the motion of objects that can be mated with digital characters on the virtual screen using AR or VR. Several high-resolution cameras and specially designed attire make conventional MoCap prohibitively expensive. The main aim of this study is to introduce a cost-effective and less noisy prototype that makes capturing movement or postures more accessible to a broader variety of people.
INDEX TERMS complementary filter, Degrees of Freedom (DoF), host computer, Inverse Kinematics (IK), Inertial Measurement Unit (IMU), Internet of Things (IoT), Unity 3D
I. INTRODUCTION

Motion capture (mo-cap or mocap) is a technique which refers to capturing the motion of an object or human. It has innumerable applications in different fields, such as entertainment, military, medical, sports, robot validation, and computer vision. Primarily, it is used in video game and filmmaking production to capture and record human actions in order to animate digital character models in 3D or 2D on the host computer. There are various methods to implement a human-computer interface, but motion capture has been a well-researched topic with several applications in virtual reality, decision-making, health monitoring, demand reliability, and quick response upon input changes. In other words, it can be defined as a technique to digitally capture the movement of objects or humans with the assistance of a sensor mounted on a target, using simulation tools. Augmented Reality (AR) is an instance of pervasive computing technology, where designers enhance parts of users' real-world interaction using computer-generated input. AR interfaces, unlike interactive Virtual Reality, allow users to see the physical world while simultaneously displaying virtual imagery attached to real locations and objects with the help of Inverse Kinematics (IK) [1]. IK is the mathematical process of calculating the variable joint parameters which help to achieve the desired endpoint. This technique was developed for gait analysis in the life sciences industry; however, it is now widely used by VFX studios, sports therapists, and neuroscientists for computer vision, robotics validation, and control [2].

∗Corresponding Author: m.majid.1@warwick.ac.uk

School of System and Technology
Volume 3 Issue 1, Spring 2023

A. INVERSE KINEMATICS

The Inverse Kinematics (IK) technique sets all DoFs (Degrees of Freedom) in an automated fashion and requires end effectors, which are commonly the positions and orientations of the joints used. The user has to specify the end effectors; the remaining DoFs are then determined automatically according to the criteria of the user, which depend upon the variant of Inverse Kinematics.

B. PROBLEM STATEMENT

The project aim, scope, and applicability are discussed. The current study is based on IoT; therefore, the project is capable of extracting 6 DoFs and placing them on the cloud, along with capturing an object's motion and translating it onto a 3D skeleton [4]. Essentially, the project focuses on a prototype; however, future researchers can employ advanced sensors as a developmental factor. Furthermore, more financing could result in improvement of the structure and functionality of the project, with bias assessed with the help of COTS parameters.

C. OBJECTIVES

The aim of the current study is to provide a user-friendly mocap system with the following qualities:

1. Development of low-cost, self-contained, flexible wearable sensors for the human arm, interfaced with 3D animation [3].
2. A low-cost prototype with reduced noise.
3. A framework for testing noise in all three axes, using COTS (Commercial Off-the-Shelf) software in the analysis of the extracted data.

D. SCOPE

The suggested prototype is designed to measure the six degrees of freedom and capture the movement of any object within sensor detection. The prescribed prototype has two sensors and one microcontroller connected to the host computer; the mocap sensor collects the values of all parameters (6 DoFs) and sends them to the controller, which then processes them according to the uploaded program. The parameters it extracts are as follows:

• Acceleration (in all three axes)
• Orientation (in all three axes)

The prototype can be simply maintained and implemented anywhere, though a host computer and a good internet connection are essential for the working environment.

II. LITERATURE REVIEW

The motion capture technique is a movement detection technique, which is proposed to be utilized in capturing, analyzing, and monitoring human or animal motion [5][6] for certain applications within real-time virtual reality [7]. This technique records patterns of movement, especially digital movements [8], for digital characters, movies, and video games. A unique application lies in sports as well as in the performing arts, which require physical mobility and hence lead to injuries that can be prevented [9]. The current research uses data from an animation framework including six degrees of freedom and advanced inverse kinematics, as well as enhanced processing using stochastic analysis of noise. The extent to which mo-cap technologies are being used requires a review of current and previous research.

A. AN OVERVIEW OF MOTION CAPTURE

Man has always been compelled to create visual representations of the things he observes in his surroundings [10]. New inventions and developments since the industrial age have enabled man to make fuller use of his five senses [11]. Eadweard Muybridge [12] and Etienne-Jules Marey [13] undertook separate research on animal and human motion by capturing moving objects several times in a quick period and devising equipment to graphically capture and record locomotion like walking, jumping, running, swimming, flying, gait analysis, and so on [14].

The rotoscope, invented by cartoonist Max Fleischer in the 1900s [15], allowed animators to track actors by using photographs of their live performances. In the mid-1980s, Robert Abel employed performance animation to produce Brilliance, which marked the beginning of the second iteration of performance animation in the entertainment industry. This commercial was the first to use MoCap for 3D animation [16]. Moreover, Wavefront Technologies introduced and demonstrated mainstream 3D character animation in the 1980s, which became the first commercial product available for computer animation [17].

In the 1970s and 1980s, mocap began to be used as a photogrammetric analysis technique in the biomechanical field [18] and, as the technology advanced, it extended into a variety of fields.

FIGURE 1. (a) Right-hand motion of a human body captured by using an Electromagnetic Motion Capture system. (b) Right-hand motion tracking effect [27].
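The IK idea introduced above can be illustrated with a minimal two-link planar arm, for which the joint angles that place the end effector at a desired point can be solved in closed form. This sketch is illustrative only and is not part of the authors' system; the link lengths and target point are arbitrary.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form IK for a planar two-link arm.
    Returns (shoulder, elbow) joint angles in radians that
    place the end effector at (x, y)."""
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle from the target distance.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(c2)  # elbow-down solution
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def two_link_fk(shoulder, elbow, l1, l2):
    """Forward kinematics, used here only to verify the IK solution."""
    x = l1 * math.cos(shoulder) + l2 * math.cos(shoulder + elbow)
    y = l1 * math.sin(shoulder) + l2 * math.sin(shoulder + elbow)
    return x, y

s, e = two_link_ik(1.2, 0.5, 1.0, 1.0)
print(two_link_fk(s, e, 1.0, 1.0))  # reproduces (1.2, 0.5) up to rounding
```

Full-body IK, as used by the engines mentioned above, generalizes this idea to chains of many joints, usually solved iteratively rather than in closed form.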
Many of the reflecting markers may be concealed from the sight of a few of the recording cameras based on the actor's stance, and this blockage might cause problems in the data collection process [25]. The optical technique has the further drawback of not providing real-time feedback; since no orientation data is captured directly, results dependent on the cameras may be inverted. The technology decodes optical signals using photosensitive marker tags rather than retro-reflective or active LED markers. The tags attached to the photo sensors can identify and derive not only their positions in three dimensions and their speed in three-vector format, but also various other parameters affecting the amount of energy transferred, such as angle, incident illumination, and reflectance. These indicators can be included in garments to track movement while being undetectable to the wearer.

Choice of Motion Capture Systems

The MoCap system type and its use in a project or study depend on many factors. The type of motion to be captured, as well as the requirements of the subject, have a big impact on the chosen system. The size of the accessible capture volume, the overall impact of the hardware on the movements of the subject, and the desired accuracy of the joint orientations would need to be considered in the overall analysis. The budget required for the mocap session would also have a significant impact on the chosen solution; it depends on the type of financial incentives that the project offers. Selection of a system is also determined by finance and economics, apart from technical objectives. These need to be studied in depth to understand the importance of motion capture systems.

D. MOTION CAPTURE AND PERFORMING ARTS

Meador [29] presented a cross-departmental initiative in which real-life artists on stage were integrated with virtual actors in a virtual setting controlled by real-time motion tracking to create a public play. This initiative employed the Gypsy three electro-mechanical mocap suit, which used forty-three (43) potentiometers to capture joint orientations. This suit uses an exoskeleton (external skeleton). The range of motion restricts the exoskeleton to moving within the perimeter of a fifty-foot radius circle (battery powered), or twenty-five feet (AC mains power); hence the importance of geometry and the impact of the location and range of AC power sources.

Even though the Gypsy suit had benefits, including its cheap price, quick and easy setup, and mobility, the report stated that, due to the mechanics of the suit, it suffered from motion data drift, limits, and noise in the actor's movements. Motion drift was mentioned as a constant issue, and it was claimed that the virtual performer's function was frequently marred by jerkiness and noise. The drifting issue was blamed on the orientation trackers of the suit, according to the study.

John Haag [30] conducted similar research with the inertial mocap technique for live performance and discovered some quite similar issues. A difficulty with relative positioning was discovered. He further claimed that the accuracy of the data was dependent on the precision of the performer's measurements. According to Haag, highly energetic, acrobatic, or twisted movements may not be captured correctly, and actions like jumping and ascending stairs are hard to record. Similar to the analysis of Meador, Haag saw the substantial amount of hardware tied to the suit as an issue; however, he viewed this as a costume limitation instead of a performer limitation. Haag identified numerous good results from the research, some of which were similar to the findings of Meador. The mobility and convenience of setup, as well as the flexibility to run in real time, were praised. Haag mentioned another positive aspect, the range of motion of the suit, which counters Meador's concern; according to Haag, it could be operated from a range of up to two hundred meters. The complexity of the virtual space that might be used in the act is also a point of contention among the aforementioned authors. According to Meador, the production has a 2D feel to it; due to the simplicity of the design as well as of the imagery scene, the aesthetic and social impact of the artistic performance is the compromised feature. According to Haag, complex effects, multiple characters, and rich 3D environments can be utilized.

In a study comparable to Haag's, a multidisciplinary team [31] staged a live act in a virtual space employing real actors and digital characters controlled by an actor wearing an inertial mocap suit. With the MotionBuilder software of Autodesk, the movement data was translated to the virtual character. Andreadis, like Haag and Meador, discussed a problem of decreasing positional precision and drift, which got worse over time or was caused by electromagnetic interference. Andreadis's inquiry yielded promising results, including real-time communication between the real-life performers and avatars, and concurred with Haag that a complex virtual space with visual effects might be constructed to improve the live performance.

E. MOTION ANALYSIS AND PERFORMANCE DEVELOPMENT

Linden [32] published a study on vibrotactile feedback supplied in real time to novice violin players. The feedback was calculated based on the deviation of the player's bowing gesture from the desired trajectory; any deviation from the desired trajectory was flagged as a fault. Using an Animazoo Lycra suit, movement data was captured using IMUs (Inertial Measurement Units). The findings of the study indicated that participants who got real-time feedback carried on enhancing their bowing skills long after they had completed receiving it. A limitation was that the measurements were taken on a sample of only eight participants and concluded in six sessions; the cumulative impact of the technique and its application in a long-term, real-world teaching scenario are unknown.

The impacts of valgus pressure on baseball throwing-related elbow damage were studied using an optical system by Aguinaldo [33]. According to the findings, the pitching method has a substantial relationship with the torque of the elbow valgus. Aguinaldo examined the throws of 69 baseball players, a far larger sample size than Linden's. There was no real-time feedback because of the optical system used. Techniques related to elbow damage caused by elbow valgus torque have been studied in detail, but no strategies to reduce the effect have been investigated or established.

The impacts of balancing and plyometric exercise on resulting lower limb kinematics were tested using a camera-based mocap approach in a further study.
TABLE I
MOTION CAPTURE SYSTEM PROVIDERS FROM ACROSS THE WORLD WITH SUIT PRICE

S. No. | Vendor Name | Vendor Location | Motion Capture Suit Price
1 | OptiTrack [52] | United States | $325
2 | Xsens Technologies [53] | Netherlands | $3,790
3 | Rokoko [54] | San Francisco, USA | $3,495 (Body + Hands Bundle); $2,745 (Only Body Tracking)
4 | Noitom [55] | China | $1,500
An advantage of the proposed prototype is that it is a two-in-one project. Firstly, it captures the motion of the object. Secondly, it tracks the six DoFs, compares them by performing calculations, and performs noise analysis.

Xsens Technologies has a huge market in different countries, including India, Malaysia, Japan, the USA, and many more [37]. As the primary distributor of this system, Xsens makes sure that users develop software for their system; once a user pays for the software, a framework is provided to be followed during motion tracking. As far as the software is concerned, the proposed system uses a COTS product, a readily available software (Unity 3D), for motion capture, and works in any type of environment. With respect to cost, the manufacturing cost of our prototype (a pair of sensors) is around $15. This can increase if one uses higher-quality sensors and other components. Users can purchase the full-body suit for around $70 to $80, whereas OptiTrack delivers the cheapest commercial mocap suit for $325. As discussed, the proposed system can be used to track the six DoFs and calculate the noise to improve and prepare for the next run.

The current study discussed motion capture, its uses, innovations, and its recent market trends. These results were confirmed by a graph, which showed that the worldwide 3D mocap industry market is expected to grow at a CAGR of 10.6% from 2017 to 2024.

III. METHODOLOGY

A. SYSTEM OVERVIEW

In recent times, the demands for motion tracking systems are increasing [37]. This trend is driven by the medical and game industries. In the medical industry, a motion-tracking system is used to help patients in the rehabilitation process, which is an essential application of the MoCap system. On the other hand, the gaming industry uses this system to create a sophisticated virtual world designed to operate a game character. The microcontroller to be used should be flexible, low cost, and IoT enabled, and should be able to map the various programming interfaces available. NodeMCU fits these criteria and is preferred in both industries.
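The filtered sensor readings reported in the Results section are presumably produced by the complementary filter named in the index terms, the standard way to fuse gyroscope and accelerometer data into a stable tilt estimate. The following one-axis sketch is generic and not the authors' exact implementation; the blending weight ALPHA is an illustrative assumption, while the 100 ms sample period matches the demonstration described later.

```python
import math

ALPHA = 0.98   # gyro weight; illustrative assumption, not from the paper
DT = 0.1       # 100 ms sample period, as used in the demonstration

def complementary_filter(angle, gyro_rate, ax, az):
    """Fuse one gyroscope rate (deg/s) and one accelerometer sample
    into an updated tilt angle (degrees) about a single axis."""
    accel_angle = math.degrees(math.atan2(ax, az))  # tilt from gravity
    # Integrate the gyro for short-term accuracy, lean on the
    # accelerometer for long-term drift correction.
    return ALPHA * (angle + gyro_rate * DT) + (1 - ALPHA) * accel_angle

angle = 0.0
for gyro_rate, ax, az in [(15.0, 0.26, 0.97), (14.7, 0.25, 0.97)]:
    angle = complementary_filter(angle, gyro_rate, ax, az)
print(angle)
```

The high-pass/low-pass split is what suppresses gyroscope drift and accelerometer jitter simultaneously, which is consistent with the lower noise figures reported for the filtered columns in the tables below.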
The extracting systems and the real-time motion capture systems were put through their paces by employing Unity 3D. The entire implementation process went smoothly and the results were achieved.

V. RESULTS

A. PROJECT RESULTS FOR IOT-BASED SIX DOFS TRACKING SYSTEM

After the project's completion, it was inspected and tested, and the outcomes could have various implications. The values of the various parameters selected on the Blynk application were obtained by using the Internet of Things (IoT)-based system.

The entire prototype was set up in a self-made isolated room for testing, with no air ventilation and only two pieces of electromechanical equipment present: a laptop as a host computer (Dell with Intel(R) Core(TM) i7-3612QM CPU @ 2.10 GHz) and a smartphone (Oppo A1k) to note the results on the Blynk application. A tiny cable of the microcontroller with a built-in Wi-Fi module was connected to both sensors. The microcontroller was responsible for tracking, capturing, and transferring parameter data to the smartphone via cloud computing. As discussed above, the project is categorized into two sub-projects: Real-Time Motion Capture and the IoT-based Six DoFs Tracking System. First, the results of the IoT-based Six DoFs Tracking System, namely the gyroscope and accelerometer, were discussed. To make the calculation easier, five experiments were performed using a two-second sample time interval referred to one sensor. The sample rate was selected with the help of the time delay; each sample of the collected data was gathered after a time delay of 100 ms during the demonstration.

To run a system or device properly, one must calibrate it first, as was done in this study. First, the system was calibrated so that it is easily coordinated for the displacement and velocity dimensions tracked through the six DoFs. For calibration, the sensor was set to (0, 0, 0), as shown in Fig. 4.3.

Table II shows all the readings of the gyroscope and accelerometer for sensor-1 along the X-axis in the calibration process. A total of five experiments were performed. Using a sample time interval of two seconds, all the readings were observed on the Blynk IoT mobile application. Likewise, Table III and Table IV show the sensor-1 readings for the Y-axis and Z-axis, respectively. The offset bias can be calculated by:

Offset Bias = Experiment − Average of all Experiments   (5.1)

Here, the displacement is described along the X-axis. Specifically, the sensor is moved 2 inches along the X-axis, so the new location of sensor-1 is (2 inch, 0, 0), as shown in Fig. 4.4.

Table V shows the filtered and unfiltered readings, with and without offset bias, of the gyroscope of sensor-1 along the X-axis; the last column shows the running average error of the gyroscope values. Table VI shows the filtered and unfiltered readings, with and without offset bias, of the accelerometer of sensor-1 along the X-axis; the last column shows the running average error of the accelerometer values. The Power of Noise (PN) in dB is given with each table.
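Eq. (5.1) can be checked directly against the calibration data: each experiment's offset bias is its deviation from the mean of all five experiments. A short sketch (the function name is ours), using the unfiltered gyroscope readings of Table II as input:

```python
def offset_bias(readings):
    """Eq. (5.1): Offset Bias = Experiment - Average of all Experiments."""
    mean = sum(readings) / len(readings)
    return [round(r - mean, 3) for r in readings]

# Unfiltered gyroscope readings of sensor-1 along the X-axis (Table II).
gyro_x = [13.36, 15.01, 17.81, 19.57, 21.11]
print(offset_bias(gyro_x))  # [-4.012, -2.362, 0.438, 2.198, 3.738]
```

The output matches the Offset Bias column of Table II, confirming how the tabulated biases were derived.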
TABLE II
THE FILTERED, UNFILTERED AND OFFSET BIAS VALUES OF SENSOR 1 ALONG THE X-AXIS

Expr No. | Unfiltered Gyro | Offset Bias | Filtered Gyro | Offset Bias | Unfiltered Acc | Offset Bias | Filtered Acc | Offset Bias
1 | 13.36 | -4.012 | 12.57 | -3.156 | 1.38 | 0.318 | 1.21 | 0.34
2 | 15.01 | -2.362 | 14.67 | -1.056 | 1.15 | 0.088 | 0.97 | -0.1
3 | 17.81 | 0.438 | 15.78 | 0.054 | 1.01 | -0.052 | 0.83 | -0.04
4 | 19.57 | 2.198 | 17.32 | 1.594 | 0.93 | -0.132 | 0.69 | -0.18
5 | 21.11 | 3.738 | 18.29 | 2.564 | 0.84 | -0.222 | 0.65 | -0.22

TABLE III
THE FILTERED, UNFILTERED AND OFFSET BIAS VALUES OF SENSOR 1 ALONG THE Y-AXIS

Expr No. | Unfiltered Gyro | Offset Bias | Filtered Gyro | Offset Bias | Unfiltered Acc | Offset Bias | Filtered Acc | Offset Bias
1 | 13.22 | -5.66 | 12.63 | -4.37 | 1.36 | 0.314 | 1.26 | 0.332
2 | 16.83 | -2.05 | 15.01 | -1.99 | 1.19 | 0.144 | 1.05 | 0.122
3 | 18.61 | -0.27 | 16.94 | -0.06 | 1.03 | -0.016 | 0.91 | -0.018
4 | 21.99 | 3.11 | 19.48 | 2.48 | 0.91 | -0.136 | 0.79 | -0.138
5 | 23.75 | 4.87 | 20.94 | 3.94 | 0.74 | -0.306 | 0.63 | -0.298

TABLE IV
THE FILTERED, UNFILTERED AND OFFSET BIAS VALUES OF SENSOR 1 ALONG THE Z-AXIS

Expr No. | Unfiltered Gyro | Offset Bias | Filtered Gyro | Offset Bias | Unfiltered Acc | Offset Bias | Filtered Acc | Offset Bias
1 | 12.88 | -3.484 | 11.74 | -2.754 | 1.26 | 0.142 | 1.18 | 0.14
2 | 15.49 | -0.874 | 13.21 | -1.284 | 1.21 | 0.092 | 1.15 | 0.11
3 | 16.29 | -0.074 | 14.17 | -0.324 | 1.13 | 0.012 | 1.08 | 0.04
4 | 17.54 | 1.176 | 16.26 | 1.766 | 1.01 | -0.108 | 0.92 | -0.12
5 | 19.62 | 3.256 | 17.09 | 2.596 | 0.98 | -0.138 | 0.87 | -0.17
TABLE V
ALL THE READINGS OF THE GYROSCOPE ALONG THE X-AXIS, WHEN SENSOR-1 MOVES 2 INCHES ALONG THE X-AXIS
PN = -10.0118 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 14.97 | 18.982 | 13.88 | 17.036 | 0.76
2 | 15.43 | 17.792 | 14.27 | 15.326 | 0.95
3 | 17.37 | 16.932 | 16.21 | 16.156 | 0.12
4 | 18.05 | 15.852 | 17.89 | 16.296 | 0.02
5 | 20.11 | 16.372 | 19.13 | 16.566 | 0.29

TABLE VI
ALL THE READINGS OF THE ACCELEROMETER ALONG THE X-AXIS, WHEN SENSOR-1 MOVES 2 INCHES ALONG THE X-AXIS
PN = -44.707 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 2.21 | 1.892 | 2.11 | 1.77 | 0.012
2 | 2.08 | 1.992 | 1.94 | 1.84 | 0.058
3 | 1.93 | 1.982 | 1.81 | 1.85 | 0.068
4 | 1.78 | 1.912 | 1.63 | 1.81 | 0.028
5 | 1.53 | 1.752 | 1.42 | 1.64 | 0.142

TABLE VII
ALL THE READINGS OF THE GYROSCOPE ALONG THE Y-AXIS, WHEN SENSOR-1 WAS PLACED AT (2 INCH, 0, 0)
PN = -45.827 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 1.40 | 1.086 | 1.27 | 0.938 | 0.0504
2 | 1.29 | 1.146 | 1.18 | 1.058 | 0.0696
3 | 1.17 | 1.186 | 1.05 | 1.068 | 0.0796
4 | 1.03 | 1.166 | 0.86 | 0.998 | 0.0096
5 | 0.87 | 1.008 | 0.71 | 0.88 | 0.1084

TABLE VIII
ALL THE READINGS OF THE ACCELEROMETER ALONG THE Y-AXIS, WHEN SENSOR-1 MOVES 2 INCHES ALONG THE X-AXIS
PN = -1.577 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 13.26 | 18.92 | 12.19 | 16.56 | 1.7
2 | 15.17 | 17.22 | 13.00 | 14.99 | 0.13
3 | 16.57 | 16.84 | 14.07 | 14.13 | 0.73
4 | 18.87 | 15.76 | 16.54 | 14.06 | 0.8
5 | 20.81 | 15.94 | 18.50 | 14.56 | 0.3

TABLE IX
ALL THE READINGS OF THE GYROSCOPE ALONG THE Z-AXIS, WHEN SENSOR-1 WAS PLACED AT (2 INCH, 0, 0)
PN = -13.14869 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 13.26 | 16.744 | 11.19 | 14.944 | 1.8
2 | 15.17 | 16.044 | 12.00 | 14.284 | 1.76
3 | 16.57 | 16.644 | 14.15 | 14.394 | 2.25
4 | 18.87 | 17.694 | 15.45 | 14.774 | 2.92
5 | 20.81 | 17.554 | 19.05 | 15.904 | 1.65

TABLE X
ALL THE READINGS OF THE ACCELEROMETER ALONG THE Z-AXIS, WHEN SENSOR-1 MOVES 2 INCHES ALONG THE X-AXIS
PN = -35.081 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 1.51 | 1.368 | 1.38 | 1.24 | 0.202
2 | 1.36 | 1.268 | 1.25 | 1.14 | 0.102
3 | 1.15 | 1.138 | 1.03 | 0.99 | 0.048
4 | 0.94 | 1.048 | 0.82 | 0.94 | 0.098
5 | 0.87 | 1.008 | 0.71 | 0.88 | 0.158
TABLE XI
ALL THE READINGS OF THE GYROSCOPE ALONG THE X-AXIS, WHEN SENSOR-1 WAS PLACED AT (0, 2 INCH, 0)
PN = -13.14869 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 13.26 | 16.744 | 11.19 | 14.944 | 1.8
2 | 15.17 | 16.044 | 12.00 | 14.284 | 1.76
3 | 16.57 | 16.644 | 14.15 | 14.394 | 2.25
4 | 18.87 | 17.694 | 15.45 | 14.774 | 2.92
5 | 20.81 | 17.554 | 19.05 | 15.904 | 1.65

TABLE XII
ALL THE READINGS OF THE ACCELEROMETER ALONG THE X-AXIS, WHEN SENSOR-1 MOVES 2 INCHES ALONG THE Y-AXIS
PN = -31.164 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 2.37 | 2.052 | 2.26 | 1.92 | 0.022
2 | 2.22 | 2.132 | 2.13 | 2.23 | 0.288
3 | 2.04 | 2.092 | 1.90 | 1.94 | 0.002
4 | 1.85 | 1.982 | 1.73 | 1.91 | 0.032
5 | 1.69 | 1.912 | 1.49 | 1.71 | 0.232

TABLE XIII
ALL THE READINGS OF THE GYROSCOPE ALONG THE Y-AXIS, WHEN SENSOR-1 WAS PLACED AT (0, 2 INCH, 0)
PN = -38.433 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 2.01 | 1.696 | 1.90 | 1.568 | 0.028
2 | 1.92 | 1.776 | 1.75 | 1.628 | 0.088
3 | 1.85 | 1.866 | 1.60 | 1.618 | 0.078
4 | 1.53 | 1.666 | 1.42 | 1.558 | 0.018
5 | 1.22 | 1.526 | 1.03 | 1.328 | 0.212

TABLE XIV
ALL THE READINGS OF THE ACCELEROMETER ALONG THE Y-AXIS, WHEN SENSOR-1 WAS PLACED AT (0, 2 INCH, 0)
PN = -1.834 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 15.04 | 20.7 | 14.82 | 19.19 | 1.68
2 | 16.45 | 18.5 | 15.34 | 17.33 | 0.18
3 | 17.81 | 18.08 | 17.11 | 17.17 | 0.34
4 | 20.00 | 16.89 | 19.86 | 17.38 | 0.13
5 | 22.04 | 17.17 | 20.42 | 16.48 | 1.03

TABLE XV
ALL THE READINGS OF THE GYROSCOPE ALONG THE Z-AXIS, WHEN SENSOR-1 MOVES 2 INCHES ALONG THE Y-AXIS
PN = -12.529 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 13.26 | 16.744 | 12.09 | 14.844 | 0.286
2 | 14.71 | 15.584 | 13.40 | 14.684 | 0.446
3 | 16.32 | 16.394 | 15.27 | 15.594 | 0.464
4 | 17.48 | 16.304 | 16.46 | 14.694 | 0.436
5 | 19.16 | 15.904 | 18.43 | 15.834 | 0.704

TABLE XVI
ALL THE READINGS OF THE ACCELEROMETER ALONG THE Z-AXIS, WHEN SENSOR-1 MOVES 2 INCHES ALONG THE Y-AXIS
PN = -39.286 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 1.43 | 1.288 | 1.32 | 1.18 | 0.108
2 | 1.33 | 1.238 | 1.27 | 1.16 | 0.088
3 | 1.24 | 1.228 | 1.15 | 1.11 | 0.038
4 | 1.01 | 1.118 | 0.89 | 1.01 | 0.062
5 | 0.84 | 0.978 | 0.73 | 0.9 | 0.172
TABLE XVII
ALL THE READINGS OF THE GYROSCOPE ALONG THE X-AXIS, WHEN SENSOR-1 WAS PLACED AT (0, 0, 2 INCH)
PN = -35.342 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 1.85 | 1.532 | 1.71 | 1.37 | 0.118
2 | 1.72 | 1.632 | 1.62 | 1.72 | 0.232
3 | 1.60 | 1.652 | 1.48 | 1.52 | 0.032
4 | 1.42 | 1.552 | 1.29 | 1.47 | 0.018
5 | 1.29 | 1.512 | 1.14 | 1.36 | 0.128

TABLE XVIII
ALL THE READINGS OF THE ACCELEROMETER ALONG THE X-AXIS, WHEN SENSOR-1 MOVES 2 INCHES ALONG THE Z-AXIS
PN = -9.938 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 13.76 | 17.772 | 12.85 | 16.006 | 1.046
2 | 14.38 | 16.742 | 13.71 | 14.766 | 0.194
3 | 15.81 | 15.372 | 15.02 | 14.966 | 0.006
4 | 16.92 | 14.722 | 15.91 | 14.316 | 0.644
5 | 18.04 | 14.302 | 17.31 | 14.746 | 0.214

TABLE XIX
ALL THE READINGS OF THE GYROSCOPE ALONG THE Y-AXIS, WHEN SENSOR-1 WAS PLACED AT (0, 0, 2 INCH)
PN = -54.226 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 1.32 | 1.006 | 1.27 | 0.938 | 0.044
2 | 1.22 | 1.076 | 1.17 | 1.048 | 0.066
3 | 1.08 | 1.096 | 1.00 | 1.018 | 0.036
4 | 0.95 | 1.086 | 0.80 | 0.938 | 0.044
5 | 0.79 | 1.096 | 0.67 | 0.968 | 0.014

TABLE XX
ALL THE READINGS OF THE ACCELEROMETER ALONG THE Y-AXIS, WHEN SENSOR-1 WAS PLACED AT (0, 0, 2 INCH)
PN = -8.166 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 12.17 | 17.83 | 11.72 | 16.09 | 1.13
2 | 14.12 | 16.17 | 13.00 | 14.99 | 0.03
3 | 15.41 | 15.68 | 14.76 | 14.82 | 0.14
4 | 17.02 | 13.91 | 16.67 | 14.19 | 0.77
5 | 19.32 | 14.45 | 18.65 | 14.71 | 0.25

TABLE XXI
ALL THE READINGS OF THE GYROSCOPE ALONG THE Z-AXIS, WHEN SENSOR-1 MOVES 2 INCHES ALONG THE Z-AXIS
PN = -8.834 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 14.61 | 18.094 | 13.19 | 15.944 | 0.096
2 | 15.53 | 16.404 | 14.00 | 15.284 | 0.564
3 | 16.24 | 16.314 | 14.98 | 15.304 | 0.544
4 | 18.67 | 17.494 | 17.54 | 15.774 | 0.074
5 | 21.18 | 17.924 | 19.53 | 16.934 | 1.086

TABLE XXII
ALL THE READINGS OF THE ACCELEROMETER ALONG THE Z-AXIS, WHEN SENSOR-1 MOVES 2 INCHES ALONG THE Z-AXIS
PN = -40.189 dB

Expr No. | Unfiltered (with Offset Bias) | Unfiltered (without Offset Bias) | Filtered (with Offset Bias) | Filtered (without Offset Bias) | Running Avg. Relative Error
1 | 1.30 | 1.158 | 1.19 | 1.05 | 0.144
2 | 1.17 | 1.078 | 1.07 | 0.96 | 0.054
3 | 1.04 | 1.028 | 0.93 | 0.89 | 0.016
4 | 0.89 | 0.998 | 0.76 | 0.88 | 0.026
5 | 0.76 | 0.898 | 0.58 | 0.75 | 0.156
Eq. (5.2) helped to extract the offset bias results. The new location of the sensor-1
from filtered and unfiltered values of would be (0, 2inch, 0) as shown in Fig. 4.6.
gyroscope and acceleration.
Table XI is responsible to show the results
𝑊𝑊𝑊𝑊𝑊𝑊ℎ𝑜𝑜𝑜𝑜𝑜𝑜 𝑂𝑂𝑂𝑂𝑂𝑂𝑂𝑂𝑂𝑂𝑂𝑂 𝐵𝐵𝐵𝐵𝐵𝐵𝐵𝐵 = of filtered and unfiltered with and without
Without Offset Bias = With Offset Bias − Offset Bias  (5.2)

Eq. (5.3) helped in calculating the running average error for both the gyroscope and the acceleration readings:

Running Absolute Relative Error = |Filtered-without-Bias Value − Average of all Filtered-without-Bias Values|  (5.3)

To calculate the Power of Noise, Eq. (5.4) was used:

P_N (Power of Noise) = sqrt( (1/n) Σ_{i=1}^{n} (Running Absolute Relative Error_i)² ),  (5.4)

where n is the number of experiments. Eq. (5.5) converted the Power of Noise (P_N) into decibels:

P_N (dB) = 20 log(P_N)  (5.5)

Table VII shows the results of the filtered and unfiltered readings, with and without offset bias, of the gyroscope of sensor-1 along the Y-axis; column 7 shows the running average error of the gyroscope values. Table VIII shows the filtered and unfiltered readings, with and without offset bias, of the acceleration of sensor-1 along the Y-axis; the last column shows the running average error of the accelerometer values.

Table IX shows the results of the filtered and unfiltered readings, with and without offset bias, of the gyroscope of sensor-1 along the Z-axis; the last column shows the running average error of the gyroscope values. Table X shows the filtered and unfiltered readings, with and without offset bias, of the acceleration of sensor-1 along the Z-axis; column 7 shows the running average error of the accelerometer values.

Consequently, after displacing the sensor 2 inches along the Y-axis, the same analysis can be performed on the filtered and unfiltered readings, with and without offset bias, of the gyroscope of sensor-1 along the X-axis. Table XII shows the filtered and unfiltered readings, with and without offset bias, of the acceleration of sensor-1 along the X-axis, whereas column 7 of Table XI to Table XXII shows the running average error of the gyroscope readings and accelerometer values.

Table XIII shows the results of the filtered and unfiltered readings, with and without offset bias, of the gyroscope of sensor-1 along the Y-axis. Table XIV shows the filtered and unfiltered readings, with and without offset bias, of the acceleration of sensor-1 along the Y-axis.

To track the values of all six DoFs, the same procedure was performed for the Z-axis by moving sensor-1 2 inches in the direction of the Z-axis and analyzing the results. The new location of sensor-1 became (0, 0, 2 inches), as shown in Fig. 4.5.

Table XVII shows the results of the filtered and unfiltered readings, with and without offset bias, of the gyroscope of sensor-1 along the X-axis. Table XVIII shows the filtered and unfiltered readings, with and without offset bias, of the acceleration of sensor-1 along the X-axis.

Table XIX shows the results of the filtered and unfiltered readings, with and without offset bias, of the gyroscope of sensor-1 along the Y-axis. Table XX shows the filtered and unfiltered readings, with and without offset bias, of the acceleration of sensor-1 along the Y-axis.

Table XXI shows the results of the filtered and unfiltered readings, with and without offset bias, of the gyroscope of sensor-1 along the Z-axis. Table XXII shows the filtered and unfiltered readings, with and without offset bias, of the acceleration of sensor-1 along the Z-axis.
Innovative Computing Review
100
Volume 3 Issue 1, Spring 2023
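The noise-metric pipeline of Eqs. (5.2)–(5.5) can be sketched in code as follows. This is an illustrative sketch only, not the paper's implementation; the function name `power_of_noise`, the sample values, and the use of a base-10 logarithm for the decibel conversion are assumptions:

```python
import math

def power_of_noise(filtered_with_bias, offset_bias):
    """Illustrative sketch of Eqs. (5.2)-(5.5): remove the offset bias
    from each filtered reading, compute the running absolute relative
    error against the mean, then the power of noise and its dB value."""
    # Eq. (5.2): subtract the offset bias from every reading.
    without_bias = [v - offset_bias for v in filtered_with_bias]

    # Eq. (5.3): absolute error of each bias-corrected reading
    # against the average of all bias-corrected readings.
    mean = sum(without_bias) / len(without_bias)
    errors = [abs(v - mean) for v in without_bias]

    # Eq. (5.4): power of noise = sqrt(sum of squared errors / n),
    # where n is the number of experiments.
    n = len(errors)
    p_n = math.sqrt(sum(e * e for e in errors) / n)

    # Eq. (5.5): power of noise expressed in decibels.
    p_n_db = 20 * math.log10(p_n) if p_n > 0 else float("-inf")
    return p_n, p_n_db
```

A tighter spread of readings around their mean yields a smaller P_N, which is why the paper uses the P_N column as an indicator of the stability of the recorded values.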
Majid et al.
It was observed that the Rokoko suit does not have level support, so it cannot climb stairs or anywhere else, because the feet are practically tethered to the ground, whereas the Xsens and this prototype have level support, so they can be used to jump and to walk upstairs.

Secondly, Rokoko provides a full-body suit with built-in sensors; with respect to setting up the user, it is much faster, but it is generic in use. Both the Xsens and the proposed system feature individual sensors, which can be mounted on an object or a human body. This system takes some time to set up, but it can be utilized by bodies of all sizes and shapes.

Furthermore, Mr. Guru elaborated that with Xsens, users had to pay extra money for Xsens Studio, while Rokoko provided its users free Rokoko Studio with its suit. By comparison, our system uses easily available software such as Unity 3D and Blender; in our case, we used Unity 3D.

The results of the suggested prototype range from the calibration results to the values of all six DoFs. Moreover, Table II, Table III, and Table IV show all the gyroscope and accelerometer readings for sensor-1 during calibration along the X-axis, Y-axis, and Z-axis, respectively, which implies that calibration along all three axes is possible. Table V and the subsequent tables show the filtered and unfiltered results, with and without offset bias, of the gyroscope and accelerometer of sensor-1 along the X-axis, Y-axis, and Z-axis; the second-last column shows the running average error of the gyroscope and accelerometer values, whereas the last column shows the Power of Noise (PN) in dB. The PN column shows the stability and reliability of the proposed system. For the gyroscope values, the RMS percentages of the PN values were 16.194, 4.917, and 11.660 along the X-axis, Y-axis, and Z-axis, respectively; all these values show that much of the noise was removed from the recorded values. In the case of acceleration, these values were 37.500, 46.610, and 38.250. Furthermore, this section described the configuration process to capture the real-time motion of an object. Lastly, the proposed prototype was compared with the systems available in the market.

VI. CONCLUSION

The current research described the development of a real-time IMU-based mocap system to enhance the flexibility and usability of a serial-chain network, which is proposed to record orientation data from various IMU sensors. Based on the results of the experiments, the designed prototype, which comprises two IMU sensors, was capable of gathering orientation data from a reasonably spaced object.

Motion capture (mo-cap or mocap) is a technique that refers to capturing the motion of an object or human. It has innumerable applications in different fields, such as entertainment, military, medical, sports, robot validation [45], and computer vision [46]. Primarily, it is used in video game and filmmaking production to capture and record human actions in order to animate digital character models in 3D or 2D on the host computer [47-49]. It is often described as performance capture [50] when it involves the fingers and face or captures delicate expressions. Motion capture is sometimes referred to as motion tracking in numerous fields. This is repeated here to familiarize the user with the impact this technology has in today's world. The motions of one or more performers are sampled numerous times per second during motion capture events. This can be seen implemented in the various simulation techniques discussed in this paper. Unlike early systems, which used photographs from numerous cameras to determine 3D positions [50], the goal of motion capture is frequently to record just the actor's actions, rather than their physical appearance. This has implications for improving the efficiency of the visual arts, so that talent and creativity can be brought to the forefront. The animated data is then transferred to a 3D model, which executes the identical movements as the actor. This method can be compared with the traditional rotoscoping approach.

The IMU sensor-based motion capture solution was built for human arm motion capture using inverse kinematics. The orientation of the arm (upper and lower) would be tracked by the IMU sensors. Sensor calibration was required in this application to maximize the tracking performance and accuracy of the motion capture. Another contribution of this paper is a basic calibration procedure. The prototype was divided into two sub-projects: one for the IoT-based tracking of six degrees of freedom and transmitting the data to the cloud, where it can be accessed from anywhere on the planet, and the other for the motion capture system, which can be used to capture the movement of any object and translate it into the action of a computer-generated 3D character on screen.

The MoCap system introduced in this study has several advantages over other motion capturing techniques, which are as follows:

- Performance can be achieved with low latency, near to real-time. This can lower the expenses of keyframe-based animation for entertainment purposes.
- Unlike with traditional techniques, the amount of effort does not change much with the length or complexity of the act. This enables multiple tests to be performed with varied techniques or deliveries, resulting in varied performances, bound only by the talent of the actor.
- Compared with traditional animation approaches, a large amount of animation data can be created in a given period. This helps to meet production deadlines and achieve cost-effectiveness.
- Costs can be reduced further by using open-source software and third-party services.

Table II to Table X show the values of the accelerometer and gyroscope at different positions, with and without offset bias, for a better understanding of the noise. Going through these tables and analyzing the readings shows how this prototype can be improved. After analyzing all the readings, it can be concluded that the filter performed well on the accelerometer values, bringing them near the actual values. The noise along the Z-axis for the accelerometer values is more than along the X-axis and Y-axis. While the gyroscope values are also filtered well, the recorded values show that the noise along the Y-axis for the gyroscope values is less than along the X-axis and Z-axis [51]. The noise filtration for the whole system (both the gyroscope and the accelerometer) is about 17.237 percent.
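The smoothing stage referred to above can be illustrated with a minimal moving-average sketch. The function name `moving_average` and the window size are assumptions for illustration only; the prototype's actual filter implementation is not reproduced in this section:

```python
from collections import deque

def moving_average(samples, window=5):
    """Smooth a stream of raw IMU readings with a simple
    moving-average window (illustrative; window size assumed)."""
    buf = deque(maxlen=window)  # keeps only the most recent readings
    smoothed = []
    for s in samples:
        buf.append(s)
        smoothed.append(sum(buf) / len(buf))  # average of current window
    return smoothed
```

A larger window removes more noise but adds lag to the tracked motion, which is the usual trade-off when filtering IMU data for real-time capture.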
Based on mocap market analysis, an annual growth rate of 9.8 percent was expected between 2018 and 2023. However, this figure has been revised to 12.12 percent for the period 2019-2024. In fact, motion capture is anticipated to be a $252.8 million industry in five years. Furthermore, because the industry is growing at such a rapid pace, it is pushing the creation of new innovations and advancements, such as marker-less capture, which implies that ping-pong-ball markers will no longer be required. Higher-order DoFs have a positive impact on mocap performance. In addition, mocap manufacturers are attempting to make the equipment more portable and affordable.

REFERENCES

[1] W.-H. Tan, W.-J. Li, Y.-Z. Zheng, and X.-C. Zhou, "ePet: A physical game based on wireless sensor networks," Int. J. Distrib. Sens. Netw., vol. 5, no. 1, p. 68, Jan. 2009, doi: http://dx.doi.org/10.1080/15501320802555262

[2] L. Bo, "The development and design of motion capture system based on sensors," M.S. thesis, Beijing Inst. Technol., Beijing, China, 2011.

[3] Z. Wang, C. Zhao, and S. Qiu, "A system of human vital signs monitoring and activity recognition based on body sensor network," Sens. Rev., vol. 34, no. 1, pp. 42–50, Jan. 2014, doi: https://doi.org/10.1108/SR-12-2012-735

[4] K. D. Nguyen, I.-M. Chen, Z. Luo, S. H. Yeo, and H. B. L. Duh, "A wearable sensing system for tracking and monitoring of functional arm movement," IEEE/ASME Trans. Mechatron., vol. 16, pp. 213–220, Apr. 2011, doi: http://dx.doi.org/10.1109/TMECH.2009.2039222

[5] P.-Z. Chen, J. Li, M. Luo, and N.-h. Zhu, "Real-time human motion capture driven by a wireless sensor network," Int. J. Comput. Games Technol., vol. 2015, Art. no. 695874, Feb. 2015, doi: https://doi.org/10.1155/2015/695874

[6] Z. Wang, S. Qiu, Z. Cao, and M. Jiang, "Quantitative assessment of dual gait analysis based on inertial sensors with body sensor network," Sensor Rev., vol. 33, no. 1, pp. 48–56, Jan. 2013, doi: http://dx.doi.org/10.1108/02602281311294342

[7] B. Xu, "The design and implementation of a motion capture system based on MEMS sensors and Zigbee network," M.S. thesis, Univ. Elec. Sci. Technol., 2013.

[8] K. Motta-Valencia, "Dance-related injury," Phys. Med. Rehabil. Clin. N. Am., vol. 17, no. 3, pp. 697–723, Aug. 2006, doi: https://doi.org/10.1016/j.pmr.2006.06.001

[9] C. E. Hiller, K. M. Refshauge, and D. Beard, "Sensory motor control is impaired in dancers with functional ankle instability," Am. J. Sports Med., vol. 32, no. 1, pp. 216–223, Jan. 2004, doi: https://doi.org/10.1177/0363546503258887

[10] F. Thomas and O. Johnston, The Illusion of Life: Disney Animation. Disney, 1981.
Bouguet, and M. Holler, "A real time system for robust 3D voxel reconstruction of human motions," in Proc. IEEE Conf. Comput. Vision Pattern Recog. (CVPR), 2000.

[51] Ameri Research. "3D motion capture market size, analysis." https://www.ameriresearch.com/product-tag/3d-motion-capture-market/ (accessed Jan. 19, 2023).

[52] Optitrack. "Motion capture suits." https://optitrack.com/accessories/wear/ (accessed Jan. 19, 2023).

[53] Xsens. "Refreshed pricing model for Xsens' motion capture solution." https://www.xsens.com/news/xsens-motion-capture-pricing (accessed Jan. 19, 2023).

[54] Rokoko. "Quality motion capture in one simple suit." https://www.rokoko.com/products/smartsuit-pro (accessed Jan. 19, 2023).

[55] S. Hayden. "Noitom releases Perception Neuron 2.0 motion capture system." Road to VR. https://www.roadtovr.com/noitom-releases-perception-neuron-2-0-motion-capture-system/ (accessed Jan. 19, 2023).