3.8 Proceedings Paper

Calibrated RGB-D Salient Object Detection

Publisher

IEEE COMPUTER SOC
DOI: 10.1109/CVPR46437.2021.00935


Funding

  1. Key Area Research and Development Program of Guangdong Province, China [2018B010111001]
  2. National Key R&D Program of China [2018YFC2000702]
  3. Scientific and Technical Innovation 2030-'New Generation Artificial Intelligence' Project [2020AAA0104100]
  4. University of Alberta Start-up Grant
  5. UAHJIC grants
  6. NSERC [RGPIN-2019-04575]


The proposed Depth Calibration and Fusion (DCF) framework addresses key challenges in Salient Object Detection by calibrating depth bias and fusing features from the RGB and depth modalities, outperforming state-of-the-art methods. The depth calibration strategy can also serve as a standalone preprocessing step that improves existing RGB-D SOD models.
Complex backgrounds and similar appearances between objects and their surroundings are generally recognized as challenging scenarios in Salient Object Detection (SOD). This naturally motivates incorporating depth information alongside the conventional RGB image as input, known as RGB-D SOD or depth-aware SOD. However, this emerging line of research has been considerably hindered by the noise and ambiguity that prevail in raw depth images. To address these issues, we propose a Depth Calibration and Fusion (DCF) framework with two novel components: 1) a learning strategy that calibrates the latent bias in the original depth maps to boost SOD performance; 2) a simple yet effective cross reference module that fuses features from both the RGB and depth modalities. Extensive experiments demonstrate that the proposed approach achieves superior performance against 27 state-of-the-art methods. Moreover, our depth calibration strategy alone can work as a preprocessing step; empirically, it yields noticeable improvements when applied to existing cutting-edge RGB-D SOD models.
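The abstract does not spell out the cross reference module's formulation, but its role, fusing RGB and depth features so each modality reinforces the other, can be illustrated with a minimal sketch. The function below is a hypothetical simplification (flat lists of floats standing in for feature tensors; an element-wise cross term in place of the paper's learned module), not the authors' actual implementation.

```python
# Hedged sketch of cross-modal feature fusion, loosely in the spirit of a
# "cross reference" module: each modality's features are re-weighted by a
# shared RGB-depth interaction term. The cross term and the residual-style
# combination are illustrative assumptions, not the paper's exact design.

def cross_reference_fuse(rgb_feat, depth_feat):
    """Fuse RGB and depth features via an element-wise cross term.

    Inputs are equal-length flat lists of floats (a stand-in for real
    4-D feature maps). Returns the fused RGB and fused depth features.
    """
    assert len(rgb_feat) == len(depth_feat), "feature maps must align"
    # Interaction term: large only where both modalities respond strongly.
    cross = [r * d for r, d in zip(rgb_feat, depth_feat)]
    # Residual-style update: each stream keeps its own features and adds
    # the cross-modal evidence.
    fused_rgb = [r + c for r, c in zip(rgb_feat, cross)]
    fused_depth = [d + c for d, c in zip(depth_feat, cross)]
    return fused_rgb, fused_depth
```

For example, a location where both streams fire (rgb=1.0, depth=0.5) is amplified in both fused outputs, while a location where depth is near zero leaves the RGB feature almost unchanged, which is the qualitative behavior one would want from fusing a noisy depth map.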


