Article

Edge Distraction-aware Salient Object Detection

Journal

IEEE MULTIMEDIA
Volume 30, Issue 3, Pages 63-73

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/MMUL.2023.3235936

Keywords

Feature extraction; Image edge detection; Object detection; Visualization; Filling; Task analysis; Convolution

Abstract

Integrating low-level edge features has proven effective in preserving the clear boundaries of salient objects. However, the locality of edge features makes it difficult to capture globally salient edges, leading to distraction in the final predictions. To address this problem, we propose to produce distraction-free edge features by incorporating cross-scale holistic interdependencies between high-level features. In particular, we first formulate our edge feature extraction process as a boundary-filling problem. In this way, we enforce edge features to focus on closed boundaries rather than disconnected background edges. Second, we propose to explore cross-scale holistic contextual connections between every position pair of high-level feature maps, regardless of their distance across different scales. Features at each position are selectively aggregated based on their connections to all the others, simulating the contrast stimulus of visual saliency. Finally, we present a complementary feature integration module that fuses low- and high-level features according to their properties. Experimental results demonstrate that our proposed method outperforms previous state-of-the-art methods on benchmark datasets, with a fast inference speed of 30 FPS on a single GPU.
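The abstract's "holistic contextual connections between every position pair" describes attention-style aggregation: each spatial position is updated by a similarity-weighted sum over all other positions. The paper's exact formulation is not given here, so the following is a minimal sketch under assumed dot-product affinities (in the spirit of a non-local block); the shapes and function name are illustrative only.

```python
import numpy as np

def nonlocal_aggregate(feats):
    """Aggregate features at every position from all other positions,
    weighted by softmax-normalized pairwise dot-product similarity.

    feats: (N, C) array -- N flattened spatial positions, C channels.
    Returns an (N, C) array of context-enhanced features.
    """
    sim = feats @ feats.T                       # (N, N) pairwise affinities
    sim -= sim.max(axis=1, keepdims=True)       # numerical stability
    w = np.exp(sim)
    w /= w.sum(axis=1, keepdims=True)           # softmax over all positions
    return w @ feats                            # distance-independent weighted sum

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))    # e.g. a 4x4 feature map with 8 channels
y = nonlocal_aggregate(x)
print(y.shape)                      # (16, 8)
```

Because the weights depend only on feature similarity, not spatial distance, a position can draw context from anywhere in the map, which is what lets globally salient structure suppress locally strong but disconnected background edges. A cross-scale variant would additionally let positions attend to feature maps at other resolutions.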
