Journal
KNOWLEDGE-BASED SYSTEMS
Volume 264, Issue -, Pages -
Publisher
ELSEVIER
DOI: 10.1016/j.knosys.2023.110322
Keywords
Cross-modality salient object detection; Universality; Anti-interference
Cross-modality salient object detection is improved in universality and anti-interference by a proposed network comprising a feature extraction strategy, a graph mapping reasoning module (GMRM), and a mutual guidance fusion module (MGFM). Experimental results show good performance in both universality and anti-interference.
Cross-modality salient object detection (SOD) mainly includes RGB-D and RGB-T salient object detection, in which depth or thermal infrared information compensates for the RGB information. Although cross-modality salient object detection has achieved excellent results, current methods still need improvement in universality and anti-interference. Therefore, we propose a cross-modality salient object detection network with universality and anti-interference. First, we offer a feature extraction strategy that enhances features in the feature extraction stage; it promotes the mutual improvement of different modal information and prevents interference from affecting the subsequent process. Then we use a graph mapping reasoning module (GMRM) to reason over high-level semantics and obtain valuable information, enabling our method to accurately locate objects in different scenes and under interference, thereby improving its universality and anti-interference. Finally, we adopt a mutual guidance fusion module (MGFM), consisting of a modality adaptive fusion module (MAFM) and an across-level mutual guidance fusion module (ALMGFM), to fuse multi-scale and multi-modality information efficiently and reasonably. To verify the universality and anti-interference of our method, we conduct experiments on many RGB-D/T SOD datasets and compare it with current state-of-the-art methods. Experimental results show that our method performs well in both universality and anti-interference. (c) 2023 Elsevier B.V. All rights reserved.
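The abstract does not specify the internals of the MAFM, but the idea of adaptively weighting RGB against depth/thermal features can be illustrated with a minimal sketch. Everything below is an assumption for illustration: the function name `modality_adaptive_fusion`, the gating-by-global-pooling scheme, and the feature shapes are hypothetical, not the paper's actual module.

```python
import numpy as np

def softmax(x, axis=0):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def modality_adaptive_fusion(rgb_feat, aux_feat):
    """Illustrative (not the paper's) adaptive fusion of two modalities.

    rgb_feat, aux_feat: feature maps of shape (C, H, W), e.g. RGB and
    depth/thermal features. Per-channel gates are derived from global
    average pooling, so the more informative modality dominates each
    channel instead of a fixed 50/50 blend.
    """
    g_rgb = rgb_feat.mean(axis=(1, 2))              # (C,) channel descriptors
    g_aux = aux_feat.mean(axis=(1, 2))              # (C,)
    gates = softmax(np.stack([g_rgb, g_aux]), 0)    # (2, C), sums to 1 per channel
    w_rgb = gates[0][:, None, None]
    w_aux = gates[1][:, None, None]
    return w_rgb * rgb_feat + w_aux * aux_feat      # (C, H, W)
```

With equal inputs the gates are 0.5 each and the output equals the input; when one modality carries stronger channel responses, its weight grows smoothly toward 1, which is the kind of modality-adaptive behavior the MAFM name suggests.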