Article

SATS: Self-attention transfer for continual semantic segmentation

Journal

PATTERN RECOGNITION
Volume 138

Publisher

ELSEVIER SCI LTD
DOI: 10.1016/j.patcog.2023.109383

Keywords

Continual learning; Semantic segmentation; Self-attention transfer; Class-specific region pooling

In this study, a new knowledge transfer method is proposed to alleviate catastrophic forgetting in continual semantic segmentation. The method captures the relationships between elements within each image using self-attention maps from a Transformer-style segmentation model. Extensive evaluations show that, when combined with widely adopted strategies, the proposed method outperforms state-of-the-art solutions.
Continually learning to segment more and more types of image regions is a desired capability for many intelligent systems. However, such continual semantic segmentation exhibits catastrophic forgetting issues similar to those of continual classification learning. Unlike the existing knowledge distillation strategies for alleviating this problem, transferring a new type of information, namely, the relationships between elements (e.g., pixels) within each image that can capture both within-class and between-class knowledge, is proposed in this study. Such information can be effectively obtained from self-attention maps in a Transformer-style segmentation model. Considering that pixels belonging to the same class in each image typically share similar visual properties, a class-specific region pooling operator is novelly applied to provide reliable relationship information for knowledge transfer. Extensive evaluations on multiple public benchmarks reveal that the proposed self-attention transfer method can effectively alleviate the catastrophic forgetting issue. Furthermore, flexible combinations of the proposed method with widely adopted strategies considerably outperform state-of-the-art solutions. (c) 2023 Elsevier Ltd. All rights reserved.
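
For readers who want a concrete picture of the mechanism sketched in the abstract, the following is a minimal PyTorch illustration of relation distillation with class-specific region pooling. It is a sketch under assumptions, not the paper's implementation: the function names (class_region_pool, self_attention, attention_transfer_loss) are hypothetical, and the actual method draws its relation maps from the Transformer's own self-attention rather than the simplified dot-product relation used here.

```python
# Illustrative sketch only: one plausible way to distill per-image relation
# maps between an old (frozen) and a new segmentation model. Names and the
# relation formulation are assumptions, not the authors' exact method.
import torch
import torch.nn.functional as F


def class_region_pool(feats, labels, num_classes):
    """Average-pool features over the pixels of each class present in the image.

    feats:  (C, H, W) feature map from the segmentation backbone.
    labels: (H, W) integer class map (ground truth or pseudo-labels).
    Returns a (K, C) tensor of descriptors for the K classes in the image.
    """
    pooled = []
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            pooled.append(feats[:, mask].mean(dim=1))   # (C,)
    return torch.stack(pooled)                           # (K, C)


def self_attention(descriptors):
    """Relation (self-attention-style) matrix between class-region descriptors."""
    d = descriptors.size(-1)
    scores = descriptors @ descriptors.t() / d ** 0.5    # (K, K)
    return scores.softmax(dim=-1)


def attention_transfer_loss(old_feats, new_feats, labels, num_classes):
    """KL divergence between the old (frozen) and new model's relation maps."""
    with torch.no_grad():
        a_old = self_attention(class_region_pool(old_feats, labels, num_classes))
    a_new = self_attention(class_region_pool(new_feats, labels, num_classes))
    return F.kl_div(a_new.log(), a_old, reduction="batchmean")
```

Pooling per class keeps the relation matrix small (K x K for the K classes present in an image), so matching it between the frozen old model and the new model adds little overhead on top of the usual cross-entropy and distillation losses used in continual segmentation.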
