Temporal Convolutional Recurrent Neural Network for Elderly Activity Recognition
Manuscript Received: 18 April 2024, Accepted: 10 June 2024, Published: 15 September 2024, ORCiD: 0000-0002-3781-6623, https://doi.org/10.33093/jetap.2024.6.2.12

Main Article Content

Jia Hui Ng
Ying Han Pang
Sarmela Raja Sekaran
Shih Yin Ooi
Lillian Yee Kiaw Wang

Abstract

Research on smartphone-based human activity recognition (HAR) is prevalent in the field of healthcare, especially for elderly activity monitoring. Researchers usually propose using the accelerometer, gyroscope or magnetometer embedded in smartphones as a single sensing modality for human activity recognition. However, any one of these sensors alone captures only limited movement information, which constrains accurate human activity analysis. Thus, we propose a smartphone-based HAR approach that leverages the inertial signals captured by all three sensors to classify human activities. These heterogeneous sensors deliver complementary information about different aspects of movement, namely motion and orientation, offering a richer set of features for more accurate representations of the activities. Hence, we propose a deep learning approach that integrates long short-term memory (LSTM) into a temporal convolutional network (TCN). We use independent temporal convolutional networks, coined temporal convolutional streams, to separately analyse the temporal data of each sensing modality. We name this architecture multi-stream TC-LSTM. The performance of multi-stream TC-LSTM is assessed on a self-collected elderly activity database. Empirical results show that multi-stream TC-LSTM outperforms existing machine learning and deep learning models, achieving an F1 score of 98.3%.
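To make the architecture described above concrete, the following is a minimal NumPy sketch of the multi-stream idea: one causal dilated convolutional stream per inertial sensor (accelerometer, gyroscope, magnetometer), feature concatenation across streams, and a single LSTM cell over the fused sequence. All layer sizes, the single-layer depth, and variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def causal_dilated_conv(x, w, dilation):
    """Causal dilated 1D convolution for a (T, C_in) signal with kernel
    w of shape (K, C_in, C_out); output[t] depends only on x[<= t]."""
    T = x.shape[0]
    K, _, c_out = w.shape
    pad = (K - 1) * dilation
    xp = np.vstack([np.zeros((pad, x.shape[1])), x])  # left-pad for causality
    y = np.zeros((T, c_out))
    for t in range(T):
        for k in range(K):
            y[t] += xp[t + pad - k * dilation] @ w[k]
    return np.maximum(y, 0.0)  # ReLU activation

def lstm_step(x, h, c, W, U, b):
    """One LSTM cell step; gate order: input, forget, output, candidate."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i, f, o = sigmoid(z[:H]), sigmoid(z[H:2 * H]), sigmoid(z[2 * H:3 * H])
    g = np.tanh(z[3 * H:])
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

T, H = 128, 16  # window length and LSTM hidden size (assumed values)
streams = {name: rng.standard_normal((T, 3))          # tri-axial raw signals
           for name in ("accel", "gyro", "mag")}
conv_w = {name: 0.1 * rng.standard_normal((3, 3, 8))  # K=3, C_in=3, C_out=8
          for name in streams}

# One temporal convolutional stream per sensing modality, then feature fusion.
fused = np.concatenate(
    [causal_dilated_conv(streams[n], conv_w[n], dilation=2) for n in streams],
    axis=1)  # shape (T, 24)

# Run the fused feature sequence through a single LSTM cell.
W = 0.1 * rng.standard_normal((4 * H, fused.shape[1]))
U = 0.1 * rng.standard_normal((4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(T):
    h, c = lstm_step(fused[t], h, c, W, U, b)

# The final hidden state h would feed a softmax classifier over activities.
print(fused.shape, h.shape)
```

In the sketch, each stream is processed independently before fusion, mirroring the paper's point that motion (accelerometer, gyroscope) and orientation (magnetometer) carry complementary information; a real model would stack several dilated blocks with increasing dilation and train end to end.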

Article Details

Section
Articles

References

V. Soni, H. Yadav, V. B. Semwal, B. Roy, D. K. Choubey and D. K. Mallick, “A Novel Smartphone-Based Human Activity Recognition Using Deep Learning in Health Care,” Lect. Notes Electr. Eng., vol. 946, pp. 493–503, 2023.

O. Pavliuk, M. Mishchuk and C. Strauss, “Transfer Learning Approach for Human Activity Recognition Based on Continuous Wavelet Transform,” Algorithms, vol. 16, no. 2, pp. 77, 2023.

S. Mekruksavanich and A. Jitpattanakul, “CNN-Based Deep Learning Network for Human Activity Recognition During Physical Exercise from Accelerometer and Photoplethysmographic Sensors,” Lect. Notes Data Eng. Commun. Technol., vol. 117, pp. 531–542, 2022.

G. Sebestyen, I. Stoica and A. Hangan, “Human Activity Recognition and Monitoring for Elderly People,” in Proc. 2016 IEEE 12th Int. Conf. Intell. Comput. Commun. Process., Cluj-Napoca, Romania, pp. 341–347, 2016.

H. Cho and S. M. Yoon, “Divide and Conquer-Based 1D CNN Human Activity Recognition Using Test Data Sharpening,” Sensors, vol. 18, no. 4, pp. 1055, 2018.

L. Minh Dang, K. Min, H. Wang, M. Jalil Piran, C. Hee Lee and H. Moon, “Sensor-based and Vision-based Human Activity Recognition: A Comprehensive Survey,” Pattern Recognit., vol. 108, pp. 107561, 2020.

R. A. Voicu, C. Dobre, L. Bajenaru and R. I. Ciobanu, “Human Physical Activity Recognition using Smartphone Sensors,” Sensors, vol. 19, no. 3, pp. 458, 2019.

D. Gholamiangonabadi, N. Kiselov and K. Grolinger, “Deep Neural Networks for Human Activity Recognition with Wearable Sensors: Leave-One-Subject-Out Cross-Validation for Model Selection,” IEEE Access, vol. 8, pp. 133982–133994, 2020.

F. Hernández, L. F. Suárez, J. Villamizar and M. Altuve, “Human Activity Recognition on Smartphones Using A Bidirectional LSTM Network,” in 2019 22nd Symp. Image, Signal Process. Artif. Vision, Bucaramanga, Colombia, pp. 1–5, 2019.

C. Kok, V. Jahmunah, S. L. Oh, X. Zhou, R. Gururajan, X. Tao, K. H. Cheong, R. Gururajan, F. Molinari and U. R. Acharya, “Automated Prediction of Sepsis using Temporal Convolutional Network,” Comput. Biol. Med., vol. 127, pp. 103957, 2020.

N. Nair, C. Thomas and D. B. Jayagopi, “Human Activity Recognition Using Temporal Convolutional Network,” in Proc. 5th Int. Workshop on Sensor-based Activ. Recogn. and Interact., no. 17, pp. 1–8, 2018.

Y. W. Kim, K. L. Joa, H. Y. Jeong and S. Lee, “Wearable IMU-Based Human Activity Recognition Algorithm for Clinical Balance Assessment using 1D-CNN and GRU Ensemble Model,” Sensors, vol. 21, no. 22, pp. 7628, 2021.

K. Chen, D. Zhang, L. Yao, B. Guo, Z. Yu and Y. Liu, “Deep Learning for Sensor-based Human Activity Recognition,” ACM Comput. Surv., vol. 54, no. 4, pp. 1–40, 2021.

M. Alessandrini, G. Biagetti, P. Crippa, L. Falaschetti and C. Turchetti, “Recurrent Neural Network for Human Activity Recognition in Embedded Systems Using PPG and Accelerometer Data,” Electronics, vol. 10, no. 14, pp. 1715, 2021.

S. Jameer and H. Syed, “Deep SE-BiLSTM with IFPOA Fine-Tuning for Human Activity Recognition using Mobile and Wearable Sensors,” Sensors, vol. 23, no. 9, pp. 4319, 2023.

S. Raja Sekaran, P. Y. Han and O. S. Yin, “Smartphone-based Human Activity Recognition Using Lightweight Multiheaded Temporal Convolutional Network,” Expert Syst. Appl., vol. 227, pp. 120132, 2023.

L. Trinh and B. Ha, “An Incorporation of Deep Temporal Convolutional Networks with Hidden Markov Models Post-processing for Sensor-based Human Activity Recognition,” in Proc. 11th Int. Symp. Inform. and Commun. Technol., pp. 96–102, 2022.

Y. He and J. Zhao, “Temporal Convolutional Networks for Anomaly Detection in Time Series,” J. Phys. Conf. Ser., vol. 1213, no. 4, pp. 042050, 2019.