Article

A differential information residual convolutional neural network for pansharpening

Journal

ISPRS Journal of Photogrammetry and Remote Sensing

Publisher

ELSEVIER

DOI: 10.1016/j.isprsjprs.2020.03.006

Keywords

Pansharpening; RCNN; Differential information mapping; Auxiliary gradient

Funding

  1. National Natural Science Foundation of China [61671334, 41701400, 41922008, 61971319]

Abstract

Deep learning based methods are the state-of-the-art in panchromatic (PAN)/multispectral (MS) fusion (which is generally called pansharpening). In this paper, to solve the problem of the insufficient spatial enhancement in most of the existing deep learning based pansharpening methods, we propose a novel pansharpening method based on a residual convolutional neural network (RCNN). Differing from the existing deep learning based pansharpening methods that are mainly devoted to designing an effective network, we make novel changes to the input and the output of the network and propose a simple but effective mapping strategy. This strategy involves utilizing the network to map the differential information between the high spatial resolution panchromatic (HR-PAN) image and the low spatial resolution multispectral (LR-MS) image to the differential information between the HR-PAN image and the high spatial resolution multispectral (HR-MS) image, which is called the differential information mapping strategy. Moreover, to further boost the spatial information in the fusion results, the proposed method makes full use of the LR-MS image and utilizes the gradient information of the up-sampled LR-MS image (Up-LR-MS) as auxiliary data to assist the network. Furthermore, an attention module and residual blocks are incorporated in the proposed network structure to maximize the ability of the network to extract features. Experiments on four data sets collected by different satellites confirm the superior performance of the proposed method compared to the state-of-the-art pansharpening methods.
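The core idea of the abstract, predicting the differential information between the HR-PAN and HR-MS images from the differential information between the HR-PAN and LR-MS images, with gradients of the up-sampled LR-MS image as auxiliary input, can be illustrated with a short sketch. The following is a minimal PyTorch-style sketch under stated assumptions: the layer widths, block count, per-band differential input, and finite-difference gradient operator are illustrative choices, not the authors' exact architecture, and the attention module is omitted.

```python
# Hypothetical sketch of the differential information mapping strategy.
# Sizes, the gradient operator, and module names are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


def spatial_gradient(x):
    """Finite-difference gradients of the up-sampled LR-MS image (auxiliary data)."""
    dx = x[:, :, :, 1:] - x[:, :, :, :-1]
    dy = x[:, :, 1:, :] - x[:, :, :-1, :]
    dx = F.pad(dx, (0, 1, 0, 0))   # pad width back to original size
    dy = F.pad(dy, (0, 0, 0, 1))   # pad height back to original size
    return torch.cat([dx, dy], dim=1)


class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)

    def forward(self, x):
        return x + self.conv2(F.relu(self.conv1(x)))


class DifferentialMappingNet(nn.Module):
    """Maps (HR-PAN - Up-LR-MS) to (HR-PAN - HR-MS); the fused HR-MS image is
    then recovered as HR-PAN minus the predicted differential."""

    def __init__(self, ms_bands=4, width=64, n_blocks=4):
        super().__init__()
        in_ch = ms_bands + 2 * ms_bands  # differential input plus gradient channels
        self.head = nn.Conv2d(in_ch, width, 3, padding=1)
        self.body = nn.Sequential(*[ResidualBlock(width) for _ in range(n_blocks)])
        self.tail = nn.Conv2d(width, ms_bands, 3, padding=1)

    def forward(self, pan, up_lrms):
        diff_in = pan - up_lrms              # differential information input
        grad = spatial_gradient(up_lrms)     # auxiliary gradient information
        feat = self.body(F.relu(self.head(torch.cat([diff_in, grad], dim=1))))
        diff_out = self.tail(feat)           # predicted (HR-PAN - HR-MS)
        return pan - diff_out                # fused HR-MS estimate


# Example: 4-band up-sampled LR-MS image and a single-band PAN replicated per band.
pan = torch.rand(1, 1, 256, 256).repeat(1, 4, 1, 1)
up_lrms = torch.rand(1, 4, 256, 256)
fused = DifferentialMappingNet()(pan, up_lrms)
print(fused.shape)  # torch.Size([1, 4, 256, 256])
```

Because the network only has to predict a residual-like differential rather than the full HR-MS image, the spectral content of the LR-MS image is carried through the subtraction, which is what the abstract credits for the improved spatial enhancement.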

