Article

A hybrid deep convolutional and recurrent neural network for complex activity recognition using multimodal sensors

Journal

NEUROCOMPUTING
Volume 362, Pages 33-40

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2019.06.051

Keywords

Complex activity; Multimodal sensors; Convolutional neural network; Recurrent neural network

Funding

  1. Zhejiang Provincial Natural Science Foundation of China [LY18F020033]
  2. National Natural Science Foundation of China [U1509214, 61772026]


Complex activities refer to users' activities performed in their daily lives (e.g., having dinner or shopping). Complex activity recognition is an important problem in wearable and mobile computing. The time-series data collected from multimodal sensors exhibit sophisticated relationships that characterize complex activities (e.g., intra-sensor, inter-sensor, and temporal relationships), which makes traditional methods based on manually designed features ineffective. To this end, we propose HConvRNN, an end-to-end deep neural network for complex activity recognition with multimodal sensors that integrates a convolutional neural network (CNN) and a recurrent neural network (RNN). Specifically, it uses a hierarchical CNN to exploit the intra-sensor relationships among similar sensors and to merge the intra-sensor relationships of the different sensor modalities into inter-sensor relationships, and it uses an RNN to model the temporal dynamics of the signals. Experiments on real-world datasets show that HConvRNN outperforms existing complex activity recognition methods. (C) 2019 Published by Elsevier B.V.
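The hybrid architecture described in the abstract can be sketched in a few lines of PyTorch. The sketch below only illustrates the general idea (per-modality convolutions for intra-sensor relationships, a fusion convolution for inter-sensor relationships, and a recurrent layer for temporal dynamics); the channel counts, filter sizes, and the choice of a GRU are illustrative assumptions, not the authors' exact HConvRNN configuration.

# Minimal sketch of the hybrid CNN + RNN idea, assuming three sensor
# modalities with three channels each; all hyperparameters are illustrative.
import torch
import torch.nn as nn


class HybridConvRNN(nn.Module):
    def __init__(self, channels_per_modality=(3, 3, 3), num_classes=10,
                 conv_filters=32, rnn_hidden=64):
        super().__init__()
        # Per-modality 1-D convolutions: capture intra-sensor relationships
        # within each group of similar sensors (e.g., accelerometer axes).
        self.modality_convs = nn.ModuleList([
            nn.Sequential(
                nn.Conv1d(c, conv_filters, kernel_size=5, padding=2),
                nn.ReLU(),
            )
            for c in channels_per_modality
        ])
        # Fusion convolution: merges per-modality features into
        # inter-sensor relationships across modalities.
        self.fusion_conv = nn.Sequential(
            nn.Conv1d(conv_filters * len(channels_per_modality),
                      conv_filters, kernel_size=5, padding=2),
            nn.ReLU(),
        )
        # Recurrent layer: models the temporal dynamics of the fused features.
        self.rnn = nn.GRU(conv_filters, rnn_hidden, batch_first=True)
        self.classifier = nn.Linear(rnn_hidden, num_classes)

    def forward(self, inputs):
        # inputs: list of tensors, one per modality,
        # each of shape (batch, channels, time).
        per_modality = [conv(x) for conv, x in zip(self.modality_convs, inputs)]
        fused = self.fusion_conv(torch.cat(per_modality, dim=1))
        # (batch, filters, time) -> (batch, time, filters) for the GRU.
        seq = fused.transpose(1, 2)
        _, hidden = self.rnn(seq)
        return self.classifier(hidden[-1])


if __name__ == "__main__":
    # Random data: three modalities, 3 channels each, 128 time steps.
    batch = [torch.randn(8, 3, 128) for _ in range(3)]
    model = HybridConvRNN()
    logits = model(batch)
    print(logits.shape)  # torch.Size([8, 10])

Training such a model end to end (e.g., with cross-entropy loss over activity labels) is what lets the convolutional and recurrent stages learn the intra-sensor, inter-sensor, and temporal relationships jointly instead of relying on manually designed features.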
