4.7 Article

Class-incremental object detection

Journal

PATTERN RECOGNITION
Volume 139

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2023.109488

Keywords

Class-incremental learning; Object detection; Information asymmetry; Non-affection distillation; Deep learning


Abstract

Deep learning architectures have shown remarkable results in the object detection task. However, they experience a critical performance drop when they are required to learn new classes incrementally without forgetting old ones. This catastrophic forgetting phenomenon impedes the deployment of artificial intelligence in real-world scenarios where systems need to learn new and different representations over time. Recently, many incremental learning methods have been proposed to avoid the catastrophic forgetting problem. However, current state-of-the-art class-incremental learning strategies aim at preserving the knowledge of old classes while learning new ones sequentially, which leads to the following problems: (1) To preserve information about old classes, only a small portion of the data from previous tasks is kept and replayed during training, which inevitably introduces a bias that favors the new classes and harms the old ones. (2) When the knowledge of previous classes is distilled into the new model, only a sub-optimal solution for the new task is obtained, since preserving the previous classes interferes with the training of the new ones. To address these issues, which we term Information Asymmetry (IA), we propose a double-head framework that preserves the knowledge of old classes and learns the knowledge of new classes separately. Specifically, we transfer the knowledge of the previous model to the currently learned one to overcome the catastrophic forgetting problem. Furthermore, considering that IA would affect the training of the new model, we propose a Non-Affection mask to distill the knowledge of the regions of interest at the feature level. Comprehensive experimental results demonstrate that our proposed method significantly outperforms other state-of-the-art class-incremental object detection methods on the PASCAL VOC and MS COCO datasets. (c) 2023 Elsevier Ltd. All rights reserved.
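To make the masked feature-level distillation idea mentioned in the abstract concrete, here is a minimal sketch in PyTorch. It is not the paper's verified implementation: the function names (non_affection_mask, masked_distillation_loss), the activation-based thresholding used to select "interested" regions, and the L2 formulation are illustrative assumptions about how a Non-Affection-style mask could restrict distillation to the old model's salient feature locations while leaving the rest free for new-class learning.

```python
# Hedged sketch of masked feature-level distillation for class-incremental detection.
# All names and the exact masking rule below are assumptions, not the authors' code.
import torch
import torch.nn.functional as F


def non_affection_mask(teacher_feat: torch.Tensor, threshold: float = 0.5) -> torch.Tensor:
    """Assumed mask: keep only feature locations the old (teacher) model responds to strongly,
    so the distillation term does not constrain regions that the new classes may need."""
    # Per-location activation strength, min-max normalized to [0, 1] per image.
    score = teacher_feat.abs().mean(dim=1, keepdim=True)  # (N, 1, H, W)
    lo = score.amin(dim=(2, 3), keepdim=True)
    hi = score.amax(dim=(2, 3), keepdim=True)
    score = (score - lo) / (hi - lo + 1e-6)
    return (score > threshold).float()


def masked_distillation_loss(student_feat: torch.Tensor, teacher_feat: torch.Tensor) -> torch.Tensor:
    """L2 distillation between old- and new-model feature maps, restricted to masked regions."""
    mask = non_affection_mask(teacher_feat)
    diff = F.mse_loss(student_feat, teacher_feat, reduction="none")  # (N, C, H, W)
    # Normalize by the number of masked positions (times channels) actually penalized.
    return (diff * mask).sum() / (mask.sum() * student_feat.size(1) + 1e-6)
```

In a double-head setup of the kind the abstract describes, a loss like this would be added to the new head's detection losses, so the old head's responses are preserved only where the old model was confident.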


