Article

Attention-Based Residual BiLSTM Networks for Human Activity Recognition

Journal

IEEE Access
Volume 11, Pages 94173-94187

Publisher

Institute of Electrical and Electronics Engineers (IEEE)
DOI: 10.1109/ACCESS.2023.3310269

Keywords

Human activity recognition; wearable sensors; attention mechanism; residual block


Human activity recognition (HAR) commonly employs wearable sensors to collect time-series data from which specific actions are identified. However, existing approaches that fuse convolutional and recurrent neural networks struggle to differentiate between similar actions. To improve recognition accuracy on similar actions, we integrate a residual structure and layer normalization into a bidirectional long short-term memory (BiLSTM) network. This integration strengthens the network's feature extraction, and an attention mechanism refines the final feature representation, improving both the accuracy and stability of activity recognition. We evaluated the approach on three public datasets, UCI-HAR, WISDM, and KU-HAR, achieving overall recognition accuracies of 98.37%, 99.01%, and 97.89%, respectively. These results demonstrate that the method effectively improves the recognition accuracy of similar behaviors. A codebase implementing the described framework is available at: https://github.com/lyh0625/1DCNN-ResBLSTM-Attention.
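The abstract combines three ingredients: a residual connection around a recurrent block, layer normalization of the result, and attention pooling over the per-time-step features. The sketch below illustrates these three operations in plain NumPy on a stand-in feature tensor; it is not the authors' implementation (the linked GitHub repository contains the real code), and the toy `tanh` transform merely stands in for a BiLSTM layer.

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Normalize each time step's feature vector over the last axis.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def residual_block(x, transform):
    # Residual connection followed by layer normalization,
    # in the spirit of the abstract's residual BiLSTM block:
    # output = LayerNorm(x + F(x)).
    return layer_norm(x + transform(x))

def attention_pool(h, w):
    # Attention pooling over time steps: score each step with a
    # learned vector w, softmax the scores, take the weighted sum.
    scores = h @ w                        # shape (T,)
    scores -= scores.max()                # numerical stability
    alpha = np.exp(scores) / np.exp(scores).sum()
    return alpha @ h                      # shape (D,)

rng = np.random.default_rng(0)
T, D = 128, 64                            # time steps, feature dim
h = rng.standard_normal((T, D))           # stand-in for BiLSTM outputs
w = rng.standard_normal(D)                # hypothetical attention weights

z = residual_block(h, np.tanh)            # toy transform replacing a BiLSTM layer
ctx = attention_pool(z, w)                # fixed-size context vector for the classifier
print(ctx.shape)                          # (64,)
```

In the full model, the context vector `ctx` would feed a dense softmax classifier over the activity classes; training would learn the BiLSTM weights and the attention vector jointly.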

