Article

LDA-GAN: Lightweight domain-attention GAN for unpaired image-to-image translation

Journal

NEUROCOMPUTING
Volume 506, Pages 355-368

Publisher

ELSEVIER
DOI: 10.1016/j.neucom.2022.07.084

Keywords

Generative Adversarial Networks; Unpaired Image Translation; Attention Mechanism; Style Transfer; Lightweight


In this paper, a lightweight domain-attention generative adversarial network (LDA-GAN) is proposed for unpaired image-to-image translation. With an improved domain-attention module and a novel separable-residual block, the generator focuses on important object regions and retains depth and spatial information, producing more realistic images.
Recently, image-to-image translation, whose purpose is to learn a mapping between two image domains, has attracted the interest of researchers. However, image translation becomes an intrinsically ill-posed problem when only unpaired training data are given, since infinitely many mappings exist between the two domains. Existing methods usually fail to learn a sufficiently accurate mapping, leading to poor-quality generated results. We believe that if the framework can focus on translating the important object regions instead of irrelevant information, such as the background, then the difficulty of learning the mapping will be reduced. In this paper, we propose a lightweight domain-attention generative adversarial network (LDA-GAN) for unpaired image-to-image translation, which has fewer parameters and lower memory usage. An improved domain-attention module (DAM) is introduced to establish a long-range dependency between the two domains, so that the generator can focus on the relevant regions and generate more realistic images. Furthermore, a novel separable-residual block (SRB) is designed to retain depth and spatial information during translation at a lower computational cost. Extensive experiments demonstrate the effectiveness of our model on various image translation tasks under both qualitative and quantitative evaluation. (c) 2022 Elsevier B.V. All rights reserved.
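The abstract gives no implementation details, so the sketch below only illustrates the two ideas it names, assuming the domain-attention module follows a standard self-attention layout over spatial positions and the separable-residual block combines depthwise and pointwise convolutions in a residual branch. The class names, channel counts, and layer choices are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (assumptions, not the paper's implementation): a
# self-attention-style module standing in for the domain-attention module
# (DAM) and a depthwise-separable residual block standing in for the
# separable-residual block (SRB).
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionModule(nn.Module):
    """Self-attention over spatial positions; assumed layout for a DAM-like block."""

    def __init__(self, channels: int):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q = self.query(x).view(b, -1, h * w).permute(0, 2, 1)   # B x HW x C'
        k = self.key(x).view(b, -1, h * w)                       # B x C' x HW
        attn = F.softmax(torch.bmm(q, k), dim=-1)                # B x HW x HW
        v = self.value(x).view(b, -1, h * w)                     # B x C x HW
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        return self.gamma * out + x  # long-range context added to the input


class SeparableResidualBlock(nn.Module):
    """Residual block built from depthwise + pointwise convolutions (assumed SRB layout)."""

    def __init__(self, channels: int):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1, groups=channels),  # depthwise
            nn.Conv2d(channels, channels, 1),                              # pointwise
            nn.InstanceNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, 3, padding=1, groups=channels),
            nn.Conv2d(channels, channels, 1),
            nn.InstanceNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.block(x)  # skip connection keeps spatial detail


if __name__ == "__main__":
    feats = torch.randn(1, 64, 32, 32)
    feats = AttentionModule(64)(feats)
    feats = SeparableResidualBlock(64)(feats)
    print(feats.shape)  # torch.Size([1, 64, 32, 32])
```

As a rough sense of the parameter saving behind such a design, a depthwise-separable 3x3 convolution over C channels uses about C*(9 + C) weights versus 9*C^2 for a standard 3x3 convolution, which is consistent with the "lightweight" claim.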

