GenReGait: Gender Recognition using Gait Features


Yue Fong Ti
Tee Connie
Michael Kah Ong Goh

Abstract

Gender recognition based on gait features has gained significant interest due to its wide range of applications. This paper proposes GenReGait, a robust method for gender recognition utilizing gait features. Gait, the unique walking pattern of an individual, contains distinct gender-specific characteristics, such as stride length, step frequency, and body posture, making it a promising modality for gender estimation. The proposed GenReGait method begins by extracting landmark positions on the human body using a human keypoint estimation technique. These landmarks serve as informative cues for estimating gender based on their spatial and temporal characteristics. However, environmental factors can alter gait patterns and introduce fluctuations in the landmark points, degrading the accuracy of gender estimation. To overcome this challenge, GenReGait introduces a robust preprocessing step, the Weighted Exponential Moving Average, to smooth the gait signals and reduce noise caused by environmental factors. The smoothed signals are then fed into a deep learning network trained to perform gender estimation from the gait features extracted at the landmark positions. By leveraging deep learning algorithms, GenReGait effectively captures complex patterns and relationships within the gait features, enhancing the accuracy and reliability of gender recognition. Experimental evaluations conducted on the Gait in the Wild dataset and a self-collected dataset validate the robustness and effectiveness of the proposed approach.
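The abstract names a Weighted Exponential Moving Average as the preprocessing step that smooths noisy landmark trajectories before they reach the network. The exact weighting scheme used by the authors is not reproduced on this page, so the following is only a minimal illustrative sketch of exponentially weighted smoothing applied to a single landmark coordinate over video frames; the function name ewma_smooth, the smoothing factor alpha, and the sample trajectory are assumptions made for illustration, not the authors' implementation.

import numpy as np

def ewma_smooth(signal: np.ndarray, alpha: float = 0.3) -> np.ndarray:
    # Exponentially weighted moving average over a 1-D keypoint trajectory.
    # alpha weights the most recent frame; smaller alpha gives stronger smoothing.
    smoothed = np.empty_like(signal, dtype=float)
    smoothed[0] = signal[0]
    for t in range(1, len(signal)):
        smoothed[t] = alpha * signal[t] + (1 - alpha) * smoothed[t - 1]
    return smoothed

# Illustrative use: smooth the x-coordinate of one body landmark across frames.
noisy_x = np.array([120.0, 123.5, 119.8, 140.2, 122.1, 124.0, 121.7])
print(ewma_smooth(noisy_x, alpha=0.3))

In a full pipeline, each landmark coordinate returned by the keypoint estimator would be smoothed in this manner before the spatial and temporal features are assembled for the gender classifier.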

Article Details

How to Cite
Ti, Y. F., Connie, T., & Goh, M. K. O. (2023). GenReGait: Gender Recognition using Gait Features. Journal of Informatics and Web Engineering, 2(2), 129–140. https://doi.org/10.33093/jiwe.2023.2.2.10
Section
Regular issue
