Article

Fully convolutional network with attention modules for semantic segmentation

Journal

SIGNAL IMAGE AND VIDEO PROCESSING
Volume 15, Issue 5, Pages 1031-1039

Publisher

SPRINGER LONDON LTD
DOI: 10.1007/s11760-020-01828-8

Keywords

Semantic segmentation; Fully convolutional network; Attention module

Funding

  1. Joint fund for regional innovation and development of NSFC [U19A2083]
  2. Science and Technology Plan Project of Hunan Province [2016TP1020]
  3. Hunan Provincial Key Laboratory of Intelligent Information Processing and Application, Hengyang Normal University [IIPA20K04]

The paper proposes a fully convolutional network with attention modules for semantic segmentation: a post-processing attention module and a skip-layer attention module enhance the relevancy among pixels, the network is trained with a cross-entropy loss, and the resulting model outperforms comparable approaches.
A fully convolutional network is a powerful end-to-end model for semantic segmentation. However, it predicts each pixel independently, which leads to weak intra-category consistency. This paper proposes a fully convolutional network with attention modules for semantic segmentation. Based on the fully convolutional network framework, a post-processing attention module and a skip-layer attention module are introduced to strengthen the relevancy among pixels. The post-processing attention module computes the similarity among pixels to capture global information. The skip-layer attention module combines semantic information from a deep, coarse layer with contour information from a shallow, fine layer to produce features with high resolution and strong semantics. The loss function, the cross-entropy between the estimated probabilities and the labels, is used to optimize the network. Extensive experiments demonstrate that the proposed approach surpasses DeepLab and other models in mean IoU with moderate computational complexity.
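The PyTorch sketch below illustrates the two attention mechanisms described in the abstract, under stated assumptions: the post-processing attention is rendered as a non-local-style self-attention that computes pixel-pair similarity for global context, and the skip-layer attention as a gated fusion of a deep, coarse feature with a shallow, fine one. Module names, channel reductions, and the gating scheme are illustrative guesses, not the authors' exact design; training would use the standard pixel-wise cross-entropy loss mentioned above.

```python
# Minimal sketch of the two attention modules; hyperparameters and layer
# choices are assumptions, not the paper's exact architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PostProcessingAttention(nn.Module):
    """Self-attention over spatial positions: pixel-pair similarity is used
    to aggregate global context (assumed non-local style)."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)  # assumed reduction ratio
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # (b, hw, c')
        k = self.key(x).flatten(2)                      # (b, c', hw)
        attn = torch.softmax(q @ k, dim=-1)             # pixel-pair similarity
        v = self.value(x).flatten(2).transpose(1, 2)    # (b, hw, c)
        out = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
        return self.gamma * out + x                     # residual fusion

class SkipLayerAttention(nn.Module):
    """Fuses a deep, coarse semantic feature with a shallow, fine contour
    feature; here the deep feature gates the shallow one (an assumption)."""
    def __init__(self, deep_ch, shallow_ch, out_ch):
        super().__init__()
        self.gate = nn.Sequential(nn.Conv2d(deep_ch, shallow_ch, 1), nn.Sigmoid())
        self.proj = nn.Conv2d(deep_ch + shallow_ch, out_ch, 3, padding=1)

    def forward(self, deep, shallow):
        deep_up = F.interpolate(deep, size=shallow.shape[2:],
                                mode="bilinear", align_corners=False)
        shallow = shallow * self.gate(deep_up)          # attention-weighted skip
        return self.proj(torch.cat([deep_up, shallow], dim=1))

# Pixel-wise cross-entropy between estimated probabilities and labels:
#   loss = nn.CrossEntropyLoss()(logits, labels)
# with logits of shape (b, num_classes, H, W) and labels of shape (b, H, W).
```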
