Article

Walk as you feel: Privacy preserving emotion recognition from gait patterns

Journal

ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE

Publisher

PERGAMON-ELSEVIER SCIENCE LTD
DOI: 10.1016/j.engappai.2023.107565

Keywords

Emotion recognition; Gait analysis; Biometrics; Privacy preserving

This paper presents a gait-based emotion recognition method that does not rely on facial cues and achieves competitive performance on small, unbalanced datasets. The proposed approach uses modern deep learning architectures and reaches high recognition accuracy.

Emotion recognition from gait has gained significant interest due to its applicability in fields such as healthcare, social cues, surveillance, and smart applications. Gait, as a biometric trait, offers unique advantages: it allows remote identification and robust recognition even in uncontrolled scenarios. Moreover, gait analysis can provide valuable insights into an individual's emotional state. This work presents the Walk-as-you-Feel (WayF) framework, a novel approach for gait-based emotion recognition that does not rely on facial cues, thereby preserving user privacy. To address the challenges posed by small and unbalanced datasets, a balancing procedure suitable for deep learning architectures is also developed. Adapted Inception-v3 and EfficientNet models are employed for the feature extraction phase, and classification is performed with a Gated Recurrent Unit (GRU) network and a Transformer encoder. Experimental results demonstrate that the proposed approach is competitive with state-of-the-art works that also integrate facial cues. WayF reaches an average recognition rate of approximately 77% in its best configuration and, when the neutral emotion is excluded, achieves an overall accuracy of 83.3%.
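The abstract describes a two-stage pipeline: a CNN backbone (adapted Inception-v3 or EfficientNet) extracts per-frame features, and a recurrent or attention-based model (GRU or Transformer encoder) classifies the resulting sequence. The sketch below is a minimal, hypothetical illustration of such a pipeline in PyTorch, not the authors' implementation: the EfficientNet-B0 backbone, single-layer GRU, four emotion classes, 16-frame clips, and 224x224 RGB inputs are all illustrative assumptions.

```python
# Hypothetical WayF-style pipeline sketch (illustrative only, not the paper's code):
# a pretrained CNN backbone extracts per-frame gait features, and a GRU
# aggregates them over time to predict an emotion class.
import torch
import torch.nn as nn
from torchvision import models


class GaitEmotionGRU(nn.Module):
    def __init__(self, num_classes=4, hidden_size=256, feat_dim=1280):
        super().__init__()
        # EfficientNet-B0 with its classifier head removed -> 1280-d frame features
        backbone = models.efficientnet_b0(weights="IMAGENET1K_V1")
        backbone.classifier = nn.Identity()
        self.backbone = backbone
        # GRU summarizes the per-frame features across the gait sequence
        self.gru = nn.GRU(feat_dim, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, num_classes)

    def forward(self, frames):
        # frames: (batch, time, 3, 224, 224) -- e.g., silhouette or RGB gait frames
        b, t, c, h, w = frames.shape
        feats = self.backbone(frames.view(b * t, c, h, w))  # (b*t, feat_dim)
        feats = feats.view(b, t, -1)                         # (b, t, feat_dim)
        _, last_hidden = self.gru(feats)                     # (1, b, hidden_size)
        return self.head(last_hidden.squeeze(0))             # (b, num_classes)


if __name__ == "__main__":
    model = GaitEmotionGRU()
    clip = torch.randn(2, 16, 3, 224, 224)  # 2 clips of 16 frames each
    logits = model(clip)
    print(logits.shape)  # torch.Size([2, 4])
```

Swapping the GRU for a Transformer encoder (the abstract's alternative classifier) or the backbone for Inception-v3 would follow the same pattern; likewise, the dataset-balancing procedure mentioned in the abstract would sit in the data-loading stage and is not shown here.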
