Article

Resource-Efficient Continual Learning for Sensor-Based Human Activity Recognition

Publisher

ASSOC COMPUTING MACHINERY
DOI: 10.1145/3530910

Keywords

Continual learning; human activity recognition; deep learning

This paper proposes a resource-efficient and high-performance continual learning solution for sensor-based human activity recognition (HAR). By utilizing a neural network trained with a replay-based method and a highly-compressed replay memory, the method achieves accuracy improvements and faster operation on low-cost resource-constrained devices.
Recent advances in deep learning have granted unrivaled performance to sensor-based human activity recognition (HAR). However, in a real-world scenario, a HAR solution is subject to diverse changes over time, such as the need to learn new activity classes or variations in the data distribution of the already-included activities. To address these issues, previous studies have tried to directly apply continual learning methods borrowed from the computer vision domain, where continual learning is extensively explored. Unfortunately, these methods either lead to surprisingly poor results or demand copious amounts of computational resources, which is infeasible for the low-cost resource-constrained devices utilized in HAR. In this paper, we provide a resource-efficient and high-performance continual learning solution for HAR. It consists of an expandable neural network trained with a replay-based method that utilizes a highly-compressed replay memory whose samples are selected to maximize data variability. Experiments with four open datasets, conducted on two distinct microcontrollers, show that our method achieves substantial accuracy improvements over continual learning baselines such as Gradient Episodic Memory, while utilizing only one-third of the memory and being up to 3x faster.
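To make the abstract's core idea concrete, the following is a minimal sketch of a replay memory that (a) compresses stored samples by quantizing float features to 8 bits and (b) evicts the sample that contributes least to variability when the buffer overflows. All class and method names, the quantization range, and the greedy "distance to mean" eviction rule are illustrative assumptions for this sketch, not the paper's exact algorithm.

```python
import numpy as np

class CompressedReplayBuffer:
    """Sketch of a compressed replay memory for continual learning.

    Samples are stored as uint8 (4x smaller than float32) and the
    buffer keeps a subset chosen to maximize data variability.
    The eviction rule here is a hypothetical greedy criterion.
    """

    def __init__(self, capacity, lo=-1.0, hi=1.0):
        self.capacity = capacity
        self.lo, self.hi = lo, hi        # assumed sensor value range
        self.samples = []                # list of (uint8 array, label)

    def _quantize(self, x):
        # Map floats in [lo, hi] onto 0..255 (lossy 8-bit compression).
        x = np.clip(x, self.lo, self.hi)
        return np.round((x - self.lo) / (self.hi - self.lo) * 255).astype(np.uint8)

    def _dequantize(self, q):
        return q.astype(np.float32) / 255.0 * (self.hi - self.lo) + self.lo

    def add(self, x, y):
        self.samples.append((self._quantize(np.asarray(x)), y))
        if len(self.samples) > self.capacity:
            self._evict_least_diverse()

    def _evict_least_diverse(self):
        # Hypothetical variability criterion: drop the sample closest
        # to the mean of the stored set (it adds the least variance).
        data = np.stack([self._dequantize(q) for q, _ in self.samples])
        flat = data.reshape(len(self.samples), -1)
        dists = np.linalg.norm(flat - flat.mean(axis=0), axis=1)
        del self.samples[int(np.argmin(dists))]

    def replay_batch(self, k, rng=None):
        # Draw k stored samples to interleave with new-task batches.
        rng = rng or np.random.default_rng()
        idx = rng.choice(len(self.samples),
                         size=min(k, len(self.samples)), replace=False)
        xs = np.stack([self._dequantize(self.samples[i][0]) for i in idx])
        ys = np.array([self.samples[i][1] for i in idx])
        return xs, ys
```

During training on a new task, `replay_batch` would supply old-task examples to mix into each gradient step, which is the general shape of replay-based continual learning; the actual compression scheme and selection strategy in the paper may differ.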
