

The 16th International Conference on Automation Technology (Automation 2019),
November 22-24, 2019, Taipei, Taiwan

Physical Activity Intensity Classification using a Convolutional Neural Network and Wearable Accelerometer

Arif Widianto1, Tommy Sugiarto1,2,3, Yi-Jia Lin1, Yung-Hsiang Lee4, and Wei-Chun Hsu1,2*
1 Graduate Institute of Biomedical Engineering, NTUST, Taipei City, Taiwan
2 Graduate Institute of Applied Science and Technology, NTUST, Taipei City, Taiwan
3 Information & Communication Research Laboratory, Industrial Technology Research Institute, Hsinchu County, Taiwan
4 Sporting Educational Office, NTUST, Taipei City, Taiwan

*Corresponding author: wchsu@mail.ntust.edu.tw

Abstract: By measuring intensity using metabolic equivalent (MET) values, we can classify physical activity (PA) as
sedentary, light, moderate, or vigorous. Wearable accelerometers, which have been widely used to collect PA-related
data, can be used to measure intensity. Furthermore, improvements in deep learning, especially convolutional neural
networks (CNNs), which automatically extract features from data, can facilitate classification. Therefore, a CNN was
adopted in this study, and its performance in classifying PA was evaluated by extracting features from wearable
accelerometers placed at five locations, with ground truth derived from MET values. We prepared temporal window (TW)
and sliding window (SW) sizes to evaluate model performance and to determine which of these most influenced the
performance of the CNN. After determining the optimal TW and SW, we further evaluated which sensor placement
location influenced the network the most. A 5-s TW with a 20-frame SW and the sensor placed on the sternum
demonstrated the optimal results, followed by the sensor placed on the right wrist.

Keywords: Physical activity intensity; MET; wearable accelerometer; convolutional neural network; window size

1. Introduction

Physical activity (PA) has become a parameter for evaluating the health of a person and can be categorized into sedentary, light, moderate, and vigorous levels [1]. A study [2] reported that moderate to vigorous PA performed for at least 60 minutes daily benefits body composition, mental health, and academic achievement in school-aged children.

The PA intensity required for certain tasks has been evaluated in metabolic equivalent (MET) units; 1 MET is approximately the amount of oxygen consumed when sitting quietly on a chair, 3.5 mL kg-1 min-1 [3]. However, measuring METs is difficult because it requires measurement of oxygen consumption and is limited to a laboratory setting. In [4] and [5], a VO2000 and a Cosmed K4b2, respectively, were used to measure oxygen consumption.

The rapid improvements in microelectromechanical system-based sensors and wearable devices have facilitated PA measurement. Studies have measured PA by placing accelerometers on the wrists [6] or ankles [7] of healthy adults or on the skin of children [8].

Moreover, the rapid improvement in artificial intelligence has improved PA classification. For example, [6] used a supervised machine learning algorithm to recognize PA. Manual feature extraction, however, is needed before samples can be classified using machine learning. By contrast, feature extraction is not necessary for deep learning frameworks, especially convolutional neural networks (CNNs), because they are scale-invariant [9]. Studies have reported the ability of deep neural networks to classify PA [10] and data from public datasets such as Opportunity and Skoda [11].

Our previous investigation of accelerometer-based PA intensity classification resulted in a CNN, which is the novelty of the method proposed herein. Some studies have used a statistical approach [12], a cut-point approach with ROC analysis [5, 13-16], a logistic regression approach [17], machine learning for recognizing and classifying PA intensity [18], and machine learning for PA recognition [6, 7, 19-23]. However, deep learning, including fully-connected deep neural networks [24-26], hybrid networks of CNNs and recurrent neural networks [9, 11], and CNN models [27, 28], has been widely used to classify PA [10]. One study [29] used deep learning for classifying PA intensity; however, the model used was an artificial neural network, and the PA intensities considered were different from those in this study.

This study evaluated a CNN for classifying PA with true labels produced using MET values. We examined performance using two variables: the temporal window (TW) and the sliding window (SW). We subsequently evaluated which features significantly influence the model, because wearing numerous sensors is impractical. This paper is arranged as follows: Section 2 discusses our method, Section 3 presents the results, and Section 4 draws conclusions.

Fig. 1: Flow chart of the protocol.
Fig. 2: Illustration of sensor placement.


2. Method

2.1 Participants, Equipment, and Protocol
Twelve individuals (age = 22.75 ± 3.85 years, height = 165 ± 6.98 cm, and weight = 62.17 ± 12.46 kg) agreed to participate in this study, and each provided a complete set of data. Each participant was asked to perform eight laboratory-scale PAs: slow walking, fast walking, jogging, lying on a mattress, sitting with hand activity, standing with hand activity, climbing up and down stairs, and resting. Excluding resting, which was performed for 6 minutes, all activities were performed for 4 minutes each. Only one participant performed every activity, including resting, for 4 minutes. Resting was performed as an inter-trial activity to avoid fatigue and to achieve a MET value of <1.5 before the next activity. Participants were advised not to perform any intensive movement during resting or lying. The walking and jogging activities were performed on a treadmill at 0.5 m/s for slow walking, 1.3 m/s for fast walking, and 1.67 m/s for jogging. The stair-climbing activity consisted of traversing three 25-cm steps. Participants played a mobile phone game during the sitting activity and caught a ball with a partner during the standing activity. Figure 1 provides a complete illustration.
Prior to the experiment, each participant provided consent to wear five triaxial Axivity AX3 (Axivity Ltd, Newcastle, UK) accelerometers placed on the left wrist (LW), right wrist (RW), dominant-side shank (DShank), lower back (L5), and sternum (between the collarbones), as shown in Figure 2. All participants were right-side dominant. The range of each Axivity AX3 was set to ±8 g, and the sampling frequency was set to 100 Hz. Regarding their working principle, the Axivity AX3 sensors use the real-time clock of the host computer to which they are connected during setup; by setting the starting time of the experiment according to the time shown on the host computer, we synchronized the clocks.
PA was classified on the basis of MET values provided by a CORTEX Metalyzer® (CORTEX Biophysik GmbH, Leipzig, Germany), which uses face-mask sensors to measure O2 and CO2 flow and volume. Oxygen consumption was used to infer the MET values. The MET thresholds from [5] were used to classify each activity, as shown in Table 1.

Table 1. MET-based Physical Activity Classification
Threshold        Class
MET < 1.5        Sedentary
1.5 ≤ MET < 4    Light
4 ≤ MET < 7      Moderate
MET ≥ 7          Vigorous
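The following minimal sketch expresses the Table 1 thresholds as a labeling function; the function name and string labels are illustrative assumptions rather than code from the study.

```python
def met_to_class(met: float) -> str:
    """Map a MET value to a PA intensity class using the Table 1 thresholds."""
    if met < 1.5:
        return "sedentary"
    elif met < 4:
        return "light"
    elif met < 7:
        return "moderate"
    else:
        return "vigorous"

# Example: a mean MET of 6.88 (jogging, reported later in Table 2) maps to "moderate".
print(met_to_class(6.88))
```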
2.2 Data Pre-processing
The data collected from the Axivity AX3 sensors were exported in .csv format by using Open Movement 1.0.0.37 (OmGui) and segmented by activity. The segmented signals were resampled to 100 Hz because of the fluctuating actual sampling frequency. Butterworth band-pass filtering with a 1-20 Hz passband was then applied. To avoid noise caused by the transitions between activities, and thus to ensure the robustness of our model, the first and last 5 seconds of data from the resting, lying, sitting, and standing activities were removed.
The TW size and SW size were then applied before the data were fed into the network. A TW divides time-series data into smaller segments comprising sufficient information; the values we selected for the TW were 1, 2, and 5 s. An SW is useful for data augmentation because it can be used to duplicate data by sliding the starting frame while maintaining the TW size. These variables may affect the performance of the model; in particular, the optimal TW size depends on the number and period of the samples used [30]. The values we prepared for the SW size were 20 frames, 50 frames, and no sliding window.
Raw MET values from the Cortex Metalyzer® were exported into .xlsx format at 1-s intervals. The values were resampled to 10 Hz and filtered using a Butterworth low-pass filter with a passband of <0.1 Hz. We considered the mean value of a 1-min interval in the middle of each activity to be the activity's MET value for each participant, because participants were assumed to be in a steady state during this period. These mean values were then averaged across all participants, and the resulting mean was the MET value used for analysis.
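The accelerometer pre-processing and windowing described above can be sketched as follows with SciPy and NumPy. The helper names, the filter order, and the synthetic example data are assumptions for illustration; this is not the authors' implementation.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 100  # sampling frequency after resampling (Hz); resampling itself is assumed done beforehand

def bandpass(x, low=1.0, high=20.0, fs=FS, order=4):
    """Butterworth band-pass filter with a 1-20 Hz passband, applied along the time axis."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=0)

def make_windows(x, tw_s, stride_frames):
    """Cut a (n_samples, n_channels) array into TW-sized windows.

    A stride smaller than tw_s * FS duplicates data (sliding window);
    a stride equal to tw_s * FS corresponds to no sliding window.
    """
    length = tw_s * FS
    starts = range(0, x.shape[0] - length + 1, stride_frames)
    return np.stack([x[s:s + length] for s in starts])

# Example on synthetic data: 4 minutes of 15 channels (5 sensors x 3 axes)
acc = np.random.randn(4 * 60 * FS, 15)
acc = bandpass(acc)
acc = acc[5 * FS:-5 * FS]                               # drop the first/last 5 s (static activities)
windows = make_windows(acc, tw_s=5, stride_frames=20)   # shape: (n_windows, 500, 15)
```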

We used the leave-one-subject-out (LOSO) method to evaluate the performance of our model. Therefore, we used the data of 11 of the 12 participants as the training dataset, and the data of the remaining participant constituted the validation and test datasets at a ratio of 7:3. We avoided imbalanced data by selecting only one resting activity for each participant. We used a seed number of 7 to keep the randomness the same.
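The LOSO evaluation protocol can be outlined as in the sketch below, assuming windowed data are stored per participant; the data layout and helper names are our assumptions, not the authors' code.

```python
import numpy as np

def loso_splits(windows_by_subject, labels_by_subject, seed=7):
    """Yield (train, validation, test) sets for leave-one-subject-out evaluation."""
    rng = np.random.RandomState(seed)  # fixed seed to keep the randomness the same
    subjects = sorted(windows_by_subject)
    for held_out in subjects:
        x_train = np.concatenate([windows_by_subject[s] for s in subjects if s != held_out])
        y_train = np.concatenate([labels_by_subject[s] for s in subjects if s != held_out])
        x_ho, y_ho = windows_by_subject[held_out], labels_by_subject[held_out]
        idx = rng.permutation(len(x_ho))
        n_val = int(0.7 * len(x_ho))       # 7:3 validation/test split of the held-out subject
        val, test = idx[:n_val], idx[n_val:]
        yield (x_train, y_train), (x_ho[val], y_ho[val]), (x_ho[test], y_ho[test])
```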
2.3 CNN Model Architecture
CNNs offer high performance along the spatial dimension. They can adapt to changes and thus exhibit scale invariance. Furthermore, CNNs extract local dependencies because they can consider adjacent signals, which tend to be correlated [9, 28].
A one-dimensional (1D) CNN (followed by a pooling layer) was used for PA classification in [27] and outperformed other models such as deep belief networks, support vector machines, and the discrete cosine transform (DCT), because the CNN could fully exploit adjacent signals, whereas the DCT, for example, extracts only information from the same axis. Moreover, a fully-connected layer may generalize the output of the convolutional layers before it is fed to the output layer. Therefore, a 1D CNN followed by pooling and fully-connected layers was used in this study. The architecture presented in this study is based on data obtained using a 1-s TW and no SW, with an accuracy of 0.9 on the test datasets as our threshold. The model is configured as follows: the first and second layers are 1D convolutional layers with filter sizes of 70 and 60, respectively, and use a rectified linear unit (ReLU) activation function. We initialized the kernels using Glorot uniform initialization with the same seed number. The third layer is a 1D max-pooling layer followed by a flattening layer that reshapes the output of the convolutional layers into a 1D array as input for the fully-connected layers. The first fully-connected layer comprises 18 units with a ReLU activation function, and the second is the output layer comprising 3 units with a softmax activation function. During the training process, we used early stopping with a minimum delta of 0.03 and a patience of 5 iterations; the accuracy on the validation dataset was the monitored parameter, and "restore best weights" was used. Figure 3 presents the complete architecture.

Fig. 3: Model's architecture.
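A Keras sketch consistent with the description above is given below, written against the current tf.keras API rather than the exact Keras 2.2.4/TensorFlow 1.14 versions used in the study. The reported filter sizes of 70 and 60 are interpreted here as the numbers of filters, and the kernel length, pool size, and optimizer are assumptions, since they are not stated in the paper.

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_model(tw_frames=100, n_features=15, n_classes=3, seed=7):
    """1D CNN sketch: Conv1D -> Conv1D -> MaxPooling1D -> Flatten -> Dense(18) -> softmax."""
    init = keras.initializers.GlorotUniform(seed=seed)   # Glorot uniform with a fixed seed
    model = keras.Sequential([
        # Two 1D convolutional layers; "filter size 70 and 60" is read as the number of
        # filters, and the kernel length (3 here) is our assumption.
        layers.Conv1D(70, kernel_size=3, activation="relu", kernel_initializer=init,
                      input_shape=(tw_frames, n_features)),
        layers.Conv1D(60, kernel_size=3, activation="relu", kernel_initializer=init),
        layers.MaxPooling1D(pool_size=2),                 # pool size assumed
        layers.Flatten(),
        layers.Dense(18, activation="relu", kernel_initializer=init),
        layers.Dense(n_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",                       # optimizer not reported; assumed
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Early stopping as described: min_delta = 0.03, patience = 5, monitoring validation
# accuracy and restoring the best weights.
early_stop = keras.callbacks.EarlyStopping(monitor="val_accuracy", min_delta=0.03,
                                           patience=5, restore_best_weights=True)
# model.fit(x_train, y_train, validation_data=(x_val, y_val), callbacks=[early_stop])
```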
2.4 Model Evaluation Parameters
Four parameters were calculated to evaluate model performance: accuracy, precision, recall, and F1 score. Accuracy is the sum of true positives (TPs) and true negatives (TNs) divided by all predictions, as in Eq. (1). Precision is the rate of TPs among all predicted positives, as in Eq. (2). Recall is the rate of TPs among all positive samples, as in Eq. (3). The F1 score is a combination of precision and recall, as in Eq. (4), in which FP and FN represent false positives and false negatives, respectively.

Accuracy = (TP + TN) / (TP + FP + TN + FN)    (1)
Precision = TP / (TP + FP)    (2)
Recall = TP / (TP + FN)    (3)
F1 score = 2 × (Precision × Recall) / (Precision + Recall)    (4)
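Equations (1)-(4) can be computed from the true and predicted labels, for example with scikit-learn; macro averaging over the three classes is an assumption, since the paper does not state how per-class scores were aggregated.

```python
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

def evaluate(y_true, y_pred):
    """Return accuracy, precision, recall, and F1 score (Eqs. 1-4)."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision": precision_score(y_true, y_pred, average="macro"),
        "recall": recall_score(y_true, y_pred, average="macro"),
        "f1": f1_score(y_true, y_pred, average="macro"),
    }
```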
2.5 Hardware and Software for Computation
All computation and training processes were conducted using Python 3.6.9 and the Keras 2.2.4 library with a TensorFlow 1.14.0 backend on a PC with a 6-core, 12-thread Intel® Core i7-8700 processor (3.2 GHz), 16 GB of RAM, and an NVIDIA GeForce RTX 2060 with a 1.68 GHz boost clock, a 6 GB frame buffer, and 14 Gbps memory speed.

3. Results and Discussion

3.1 MET Values
The ground truth is shown in Table 2. The MET values range from 1.4 to 6.88. Our protocol successfully reduced the MET values to <1.5 for the resting activity. As presented in Table 2, only three classes were identified in this study: sedentary, light, and moderate. This might have been caused by the ability and superior endurance of the young participants when jogging.

Fig. 4: Loss curve of the best model.
Fig. 5: Accuracy curve of the best model.
A previous report that identified jogging as vigorous [5] involved participants in an age range different from ours. Including a running activity is suggested for future studies in order to obtain PA at the vigorous intensity level.

Table 2. True Labels for Classification
Activity          MET [SD*]     PA class
Slow walking      3.34 [0.32]   Light
Fast walking      5 [0.6]       Moderate
Jogging           6.88 [0.5]    Moderate
Resting           1.39 [0.17]   Sedentary
Lying             1.4 [0.12]    Sedentary
Sitting           1.47 [0.14]   Sedentary
Standing          2.57 [0.5]    Light
Climbing stairs   4.26 [0.51]   Moderate
* standard deviation
3.2 Datasets
We obtained a sufficient number of samples from just 12 participants, unlike [27], which obtained 31688 samples (constituting the training and test datasets) from 100 participants. The number of training samples with a 1-s TW and 0-, 20-, and 50-frame SWs was 21880, 109048, and 43672, respectively. With a 2-s TW and 0-, 20-, and 50-frame SWs, the numbers were 10940, 108608, and 43496, respectively. Finally, with a 5-s TW and 0-, 20-, and 50-frame SWs, there were 4376, 107288, and 42968 samples, respectively.
The dimension information for the validation and test datasets is presented in Table 3, in which the first number states the number of samples, the second shows the product of the TW and the sampling frequency, and the third presents the number of features extracted from the five tri-axial Axivity AX3 accelerometers.

Table 3. Input Dimensions of the Validation and Test Datasets
TW (s)   SW (frames)   Validation Dataset   Test Dataset
1        0             1400, 100, 15        600, 100, 15
1        20            6977, 100, 15        2991, 100, 15
1        50            2794, 100, 15        1198, 100, 15
2        0             700, 200, 15         300, 200, 15
2        20            6949, 200, 15        2979, 200, 15
2        50            2783, 200, 15        1193, 200, 15
5        0             280, 500, 15         120, 500, 15
5        20            6865, 500, 15        2943, 500, 15
5        50            2749, 500, 15        1179, 500, 15
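The window dimensions in Table 3 follow directly from the windowing parameters: each input window has TW × 100 Hz frames and 5 sensors × 3 axes = 15 features, and a smaller stride yields more windows. A short arithmetic check, with illustrative helper names and an assumed recording length:

```python
FS = 100                    # sampling frequency (Hz)
N_SENSORS, N_AXES = 5, 3

def window_shape(tw_s):
    """Frame count and feature count of one input window."""
    return tw_s * FS, N_SENSORS * N_AXES

def n_windows(total_frames, tw_s, stride_frames):
    """Number of windows produced from a recording of total_frames samples."""
    length = tw_s * FS
    return (total_frames - length) // stride_frames + 1

print(window_shape(5))          # (500, 15), matching the 5-s rows of Table 3
print(n_windows(24000, 1, 20))  # e.g., a hypothetical 4-min recording, 1-s TW, 20-frame SW
```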
3.3 Model Performance
The model architecture presented in this study was obtained from the results for a 1-s TW size without an SW and exhibited accuracy, precision, recall, and F1 scores of 0.9. The average performance of the model evaluated through LOSO validation was used to observe the influence of the TW and SW sizes on model performance.

Table 4. Model's Performance: Test Scores for 1-s TW Size
SW size (frames)   Acc. [SD]   Prec. [SD]   Recall [SD]   F1-score [SD]
0                  .92 [.06]   .91 [.09]    .91 [.09]     .9 [.11]
20                 .94 [.06]   .93 [.09]    .93 [.09]     .93 [.09]
50                 .93 [.07]   .92 [.1]     .92 [.1]      .91 [.12]

Table 5. Model's Performance: Test Scores for 2-s TW Size
SW size (frames)   Acc. [SD]   Prec. [SD]   Recall [SD]   F1-score [SD]
0                  .94 [.06]   .93 [.09]    .93 [.09]     .92 [.11]
20                 .96 [.05]   .95 [.07]    .95 [.07]     .95 [.07]
50                 .95 [.06]   .94 [.09]    .94 [.09]     .93 [.11]

Table 6. Model's Performance: Test Scores for 5-s TW Size
SW size (frames)   Acc. [SD]   Prec. [SD]   Recall [SD]   F1-score [SD]
0                  .94 [.06]   .92 [.1]     .92 [.1]      .91 [.12]
20                 .98 [.03]   .97 [.04]    .97 [.04]     .97 [.04]
50                 .96 [.06]   .95 [.09]    .95 [.09]     .94 [.11]

As shown in Tables 4-6, the results obtained with 20- and 50-frame SWs were not substantially more accurate than those obtained without an SW. This was determined by observing the difference between the accuracy values with and without an SW for each TW size, where we considered an increment of at least 0.05 in the test score to be substantial in this study.

Since we used a fixed SW size, this finding agrees with [30], which concluded that a fixed SW size is not an effective approach; it should be noted, however, that the statement in [30] concerned PA recognition. On the other hand, by defining an SW size we could augment the data and produce a sufficient number of samples. Additionally, defining an appropriate SW size may keep the variability of the evaluation parameter values low: it can be observed in Tables 4-6 that the 20-frame SW size showed the lowest standard deviation values compared with the others.
The highest accuracy and the lowest standard deviation were our criteria for deciding on the best model. As shown in Tables 4-6, the highest model performance was obtained with a 5-s TW size and a 20-frame SW size. This indicates that a 5-s TW may contain sufficient information about PA patterns. The loss and accuracy curves for training and validation are shown in Figures 4 and 5, respectively, and the confusion matrix is shown in Figure 6. We noticed that our proposed model is able to distinguish most of the samples, with the accuracy for the sedentary intensity being the highest, followed by the moderate intensity, and the light intensity being the lowest (Figure 6).

Fig. 6: Normalized confusion matrix of the best model on the test dataset.
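A normalized confusion matrix such as the one in Figure 6 can be produced from the test predictions, for example with scikit-learn and matplotlib; the plotting route and the placeholder labels below are assumptions, as the paper does not state which tools were used to generate the figure.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.metrics import ConfusionMatrixDisplay

# Placeholder predictions; in practice these come from the model's output on the test set.
y_test = np.array([0, 0, 1, 1, 2, 2, 2])
y_pred = np.array([0, 0, 1, 2, 2, 2, 2])

ConfusionMatrixDisplay.from_predictions(
    y_test, y_pred,
    display_labels=["Sedentary", "Light", "Moderate"],
    normalize="true",   # row-normalize so each true class sums to 1
    cmap="Blues",
)
plt.title("Normalized confusion matrix on the test dataset")
plt.show()
```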
Table 7. Single-Placement Comparison from the Best Performance: Test Scores
Position   Acc. [SD]   Prec. [SD]   Recall [SD]   F1-score [SD]
DShank     .94 [.08]   .95 [.07]    .95 [.07]     .94 [.08]
L5         .94 [.08]   .93 [.09]    .93 [.09]     .93 [.1]
LW         .93 [.06]   .92 [.07]    .92 [.07]     .92 [.07]
RW         .94 [.05]   .94 [.06]    .94 [.06]     .94 [.06]
Sternum    .98 [.01]   .98 [.02]    .98 [.02]     .98 [.01]

3.4 Single Sensor Comparison
We further evaluated which sensor placement location influenced the model the most. Table 7 reveals that each placement accurately distinguished PA intensity, since all evaluation parameters are above 0.9. The accelerometer placed on the sternum is the most accurate; this is probably caused by the movement pattern and the values of certain axes that correspond to specific activities. The result for the DShank may reflect slight underfitting, since the training accuracy of 0.93 is lower than the test accuracy. An accelerometer placed on the LW can distinguish PA intensity with only marginally inferior accuracy to placement on the RW. This contrasts with the results obtained using the cut-point approach [16], in which the classification accuracy with the sensor placed on the LW was <0.8 among right-side-dominant participants. Additionally, placement on the RW was quite accurate in our study, unlike in [5], in which the RW was the least accurate. However, the protocols and age ranges of [16] and [5] differed from ours. Even though the results from the RW and LW are lower than those from the sternum, placing an accelerometer on either wrist may be the most practical approach, since an accelerometer placed on the sternum or lower back may be difficult to use for monitoring PA intensity.

4. Conclusions
We evaluated the performance of a 1D CNN in classifying PA as sedentary, light, or moderate. We found the optimal architecture through trial and error. Additionally, we evaluated the influence of the TW and SW. Compared with other studies, the 1D CNN provided good performance in distinguishing PA intensity, and setting an optimal TW may maximize the evaluation parameters. The best result was obtained from the dataset with a TW size of 5 s combined with an SW size of 20 frames. The model, however, requires further improvement, because only three classes were identified despite the high performance. We plan to continue our study with further observations in order to be able to identify vigorous PA intensity.

References
1 Shephard, R.J.: 'Absolute versus relative intensity of physical activity in a dose-response context', Medicine & Science in Sports & Exercise, 2001, 33, (6)
2 Janssen, I., and LeBlanc, A.G.: 'Systematic review of the health benefits of physical activity and fitness in school-aged children and youth', International Journal of Behavioral Nutrition and Physical Activity, 2010, 7, (1), pp. 40
3 Jette, M., Sidney, K., and Blümchen, G.: 'Metabolic equivalents (METS) in exercise testing, exercise prescription, and evaluation of functional capacity', Clinical Cardiology, 1990, 13, (8), pp. 555-565
4 de Almeida Mendes, M., da Silva, I., Ramires, V., Reichert, F., Martins, R., Ferreira, R., and Tomasi, E.: 'Metabolic equivalent of task (METs) thresholds as an indicator of physical activity intensity', PLoS ONE, 2018, 13, (7), pp. e0200701
5 Esliger, D.W., Rowlands, A.V., Hurst, T.L., Catt, M., Murray, P., and Eston, R.G.: 'Validation of the GENEA Accelerometer', Medicine & Science in Sports & Exercise, 2011, 43, (6), pp. 1085-1093
6 Cooper, K., Sani, S., Corrigan, L., MacDonald, H., Prentice, C., Vareta, R., Massie, S., and Wiratunga, N.: 'Accuracy of physical activity recognition from a wrist-worn sensor', Physiotherapy, 2017, 103, pp. e47
7 Mannini, A., Intille, S.S., Rosenberger, M., Sabatini, A.M., and Haskell, W.: 'Activity Recognition Using a Single Accelerometer Placed at the Wrist or Ankle', Medicine and Science in Sports and Exercise, 2013, 45, (11), pp. 2193-2203
8 Schneller, M.B., Bentsen, P., Nielsen, G., Brond, J.C., Ried-Larsen, M., Mygind, E., and Schipperijn, J.: 'Measuring Children's Physical Activity: Compliance Using Skin-Taped Accelerometers', Medicine and Science in Sports and Exercise, 2017, 49, (6), pp. 1261-1269
9 Lv, M., Xu, W., and Chen, T.: 'A hybrid deep convolutional and recurrent neural network for complex activity recognition using multimodal sensors', Neurocomputing, 2019, 362, pp. 33-40
10 Wang, J., Chen, Y., Hao, S., Peng, X., and Hu, L.: 'Deep learning for sensor-based activity recognition: A survey', Pattern Recognition Letters, 2019, 119, pp. 3-11
11 Ordonez, F.J., and Roggen, D.: 'Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition', Sensors (Basel), 2016, 16, (1)
12 Vaha-Ypya, H., Vasankari, T., Husu, P., Suni, J., and Sievanen, H.: 'A universal, accurate intensity-based classification of different physical activities using raw data of accelerometer', Clinical Physiology and Functional Imaging, 2015, 35, (1), pp. 64-70
13 Crouter, S.E., Flynn, J.I., and Bassett Jr, D.R.: 'Estimating physical activity in youth using a wrist accelerometer', Medicine and Science in Sports and Exercise, 2015, 47, (5), pp. 944
14 Trost, S.G., Loprinzi, P.D., Moore, R., and Pfeiffer, K.A.: 'Comparison of accelerometer cut points for predicting activity intensity in youth', Medicine and Science in Sports and Exercise, 2011, 43, (7), pp. 1360-1368
15 Van Loo, C.M., Okely, A.D., Batterham, M.J., Hinkley, T., Ekelund, U., Brage, S., Reilly, J.J., Trost, S.G., Jones, R.A., and Janssen, X.: 'Wrist accelerometer cut-points for classifying sedentary behavior in children', Medicine and Science in Sports and Exercise, 2017, 49, (4), pp. 813
16 Welch, W.A., Bassett, D.R., Thompson, D.L., Freedson, P.S., Staudenmayer, J.W., John, D., Steeves, J.A., Conger, S.A., Ceaser, T., Howe, C.A., Sasaki, J.E., and Fitzhugh, E.C.: 'Classification Accuracy of the Wrist-Worn Gravity Estimator of Normal Everyday Activity Accelerometer', Medicine and Science in Sports and Exercise, 2013, 45, (10), pp. 2012-2019
17 van Hees, V.T., Golubic, R., Ekelund, U., and Brage, S.: 'Impact of study design on development and evaluation of an activity-type classifier', Journal of Applied Physiology, 2013, 114, (8), pp. 1042-1051
18 Sasaki, J.E., Hickey, A.M., Staudenmayer, J.W., John, D., Kent, J.A., and Freedson, P.S.: 'Performance of Activity Classification Algorithms in Free-Living Older Adults', Medicine and Science in Sports and Exercise, 2016, 48, (5), pp. 941-950
19 Bonomi, A.G., Plasqui, G., Goris, A.H.C., and Westerterp, K.R.: 'Improving assessment of daily energy expenditure by identifying types of physical activity with a single accelerometer', Journal of Applied Physiology, 2009, 107, (3), pp. 655-661
20 Ellis, K., Kerr, J., Godbole, S., Lanckriet, G., Wing, D., and Marshall, S.: 'A random forest classifier for the prediction of energy expenditure and type of physical activity from wrist and hip accelerometers', Physiological Measurement, 2014, 35, (11), pp. 2191-2203
21 Khan, A.M.: 'Recognizing physical activities using the Axivity device', 2013
22 Kongsvold, A.M.: 'Validation of the AX3 accelerometer for detection of common daily activities and postures', NTNU, 2016
23 Pavey, T.G., Gilson, N.D., Gomersall, S.R., Clark, B., and Trost, S.G.: 'Field evaluation of a random forest activity classifier for wrist-worn accelerometer data', Journal of Science and Medicine in Sport, 2017, 20, (1), pp. 75-80
24 Radu, V., Tong, C., Bhattacharya, S., Lane, N.D., Mascolo, C., Marina, M.K., and Kawsar, F.: 'Multimodal Deep Learning for Activity and Context Recognition', Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 2018, 1, (4), pp. 1-27
25 Hammerla, N.Y., Halloran, S., and Plötz, T.: 'Deep, convolutional, and recurrent models for human activity recognition using wearables', arXiv preprint arXiv:1604.08880, 2016
26 Vepakomma, P., De, D., Das, S.K., and Bhansali, S.: 'A-Wristocracy: Deep learning on wrist-worn sensing for recognition of user complex activities', 2015, pp. 1-6
27 Chen, Y., and Xue, Y.: 'A Deep Learning Approach to Human Activity Recognition Based on Single Accelerometer', 2015, pp. 1488-1492
28 Zeng, M., Nguyen, L.T., Yu, B., Mengshoel, O.J., Zhu, J., Wu, P., and Zhang, J.: 'Convolutional Neural Networks for human activity recognition using mobile sensors', 2014, pp. 197-205
29 Staudenmayer, J., Pober, D., Crouter, S., Bassett, D., and Freedson, P.: 'An artificial neural network to estimate physical activity energy expenditure and identify physical activity type from an accelerometer', Journal of Applied Physiology, 2009, 107, (4), pp. 1300-1307
30 Noor, M.H.M., Salcic, Z., and Wang, K.I.K.: 'Adaptive sliding window segmentation for physical activity recognition using a single tri-axial accelerometer', Pervasive and Mobile Computing, 2017, 38, pp. 41-59
