Article

Image translation between high-resolution optical and synthetic aperture radar (SAR) data

Journal

INTERNATIONAL JOURNAL OF REMOTE SENSING
Volume 42, Issue 12, Pages 4762-4788

Publisher

TAYLOR & FRANCIS LTD
DOI: 10.1080/01431161.2020.1836426

Keywords

-

Funding

  1. National Natural Science Foundation of China [61572515]

Abstract

This study introduces a novel approach for remote-sensing image translation between high-resolution optical and SAR data through machine learning methods. The efficiency of the proposed methods has been validated with different SAR parameters in three regions, showing that the translated images effectively preserve land cover types and exhibit excellent performance in classification accuracy and similarity indicators.
This paper presents a novel study: remote-sensing image translation between high-resolution optical and Synthetic Aperture Radar (SAR) data through machine learning approaches. To this end, conditional Generative Adversarial Networks (cGANs) guided by high-level image features are proposed. The efficiency of the proposed methods has been verified with different SAR parameters on three regions: Toronto and Vancouver in Canada, and Shanghai in China. The generated SAR and optical images have been evaluated by pixel-based image classification over detailed land cover types, including low- and high-density residential areas, industrial areas, construction sites, golf courses, water, forest, pasture, and crops. Results show that the translated images effectively preserve many land cover types, with classification accuracy comparable to the ground-truth data. Compared with state-of-the-art image translation approaches, the proposed methods improve the translation results under common similarity indicators. This is one of the first studies of multi-source remote-sensing data translation by machine learning.
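The abstract refers to "common similarity indicators" without naming them; peak signal-to-noise ratio (PSNR) is one widely used indicator for judging translated images against references. A minimal sketch in plain Python, with illustrative pixel values that are not taken from the paper:

```python
import math

def psnr(img_a, img_b, max_val=255.0):
    """Peak signal-to-noise ratio between two equally sized images,
    given as flat sequences of pixel intensities in [0, max_val]."""
    if len(img_a) != len(img_b):
        raise ValueError("images must have the same number of pixels")
    mse = sum((a - b) ** 2 for a, b in zip(img_a, img_b)) / len(img_a)
    if mse == 0:
        return float("inf")  # identical images: no noise at all
    return 10.0 * math.log10(max_val ** 2 / mse)

# Toy example: a "translated" patch compared with its reference patch.
reference = [52, 55, 61, 59, 79, 61, 76, 61]
translated = [50, 54, 62, 60, 80, 60, 75, 63]
print(round(psnr(reference, translated), 2))
```

Higher PSNR means the translated image is closer to the reference; values above roughly 30 dB are generally considered good for 8-bit imagery.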

