Abstract

Exoskeletons have decreased physical effort and increased comfort in activities of daily living (ADL) such as walking, squatting, and running. However, this assistance is often activity-specific and does not accommodate a wide variety of activities. To overcome this limitation and broaden the scope of exoskeleton applications, an automatic human activity recognition (HAR) system is necessary. We developed two deep-learning models for HAR: a one-dimensional convolutional neural network (CNN) and a hybrid model combining CNNs with long short-term memory (LSTM). We trained both models on data collected from a single three-axis accelerometer placed on the chest of ten subjects. We classified five activities (standing, walking on level ground, walking on an incline, running, and squatting) with accuracies of 98.1% and 97.8%, respectively. A two-subject real-time trial was also conducted to validate the real-time applicability of the system; the real-time accuracy was measured at 96.6% for the CNN and 97.2% for the hybrid model. The high classification accuracy in both the test and the real-time evaluation suggests that a single sensor can distinguish human activities using machine-learning-based models.
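Models like these typically classify fixed-length windows of raw accelerometer samples rather than individual readings. The sketch below shows the sliding-window segmentation step that would precede either classifier; the window length, step size, and sampling rate are illustrative assumptions, not values taken from this paper.

```python
import numpy as np

def segment_windows(signal, window_len=128, step=64):
    """Split a (num_samples, 3) accelerometer stream into
    overlapping windows of shape (num_windows, window_len, 3),
    ready to feed to a 1D-CNN or CNN-LSTM classifier."""
    windows = [
        signal[start:start + window_len]
        for start in range(0, len(signal) - window_len + 1, step)
    ]
    return np.stack(windows)

# Example: 10 s of synthetic 3-axis data at an assumed 50 Hz rate,
# segmented with 50% overlap between consecutive windows.
acc = np.random.randn(500, 3)
batches = segment_windows(acc)
print(batches.shape)  # (6, 128, 3)
```

Each window would then be labeled with one activity class (standing, level walking, incline walking, running, or squatting) for training.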
