Review

Image inpainting based on deep learning: A review

Journal

DISPLAYS
Volume 69

Publisher

Elsevier
DOI: 10.1016/j.displa.2021.102028

Keywords

Computer vision; Image inpainting; Variational autoencoder (VAE); Generative adversarial networks (GAN)

Funding

  1. National Science Foundation of China [61901436]
  2. Key Research Program of the Chinese Academy of Sciences [XDPB22]


This article reviews the latest research on image inpainting based on deep learning, covering inpainting methods built on different neural network structures, key technical improvement mechanisms, and a comprehensive evaluation of model architectures and restoration methods. Future research directions for addressing current problems in image inpainting are also outlined.

Image inpainting aims to restore the pixel content of damaged regions in an incomplete image and plays a key role in many computer vision tasks. Image inpainting based on deep learning is a major current research hotspot. To provide a deep understanding of related methods and technologies, this article organizes and summarizes the latest research in this field. First, we summarize inpainting methods based on different types of deep neural network structures, and then analyze important technical improvement mechanisms. In addition, various algorithms are comprehensively reviewed in terms of model network structure and restoration method, and representative image inpainting methods are selected for comparison and analysis. Finally, current problems in image inpainting are summarized, and future development trends and research directions are discussed.
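As a minimal illustration of the inpainting setting this review surveys (a sketch for readers, not code from the paper): a binary mask marks which pixels are known, a model (e.g. a VAE decoder or GAN generator) predicts content for the missing region, and the final image composites the prediction into the hole while keeping the known pixels untouched. The function name and the constant "prediction" below are illustrative placeholders.

```python
import numpy as np

def composite_inpainting(image, mask, predicted):
    """Standard inpainting composition: keep known pixels where mask == 1,
    fill the damaged region (mask == 0) with the model's prediction."""
    return mask * image + (1.0 - mask) * predicted

# Toy example: a 4x4 grayscale "image" with a 2x2 hole.
image = np.ones((4, 4))      # known content is all ones
mask = np.ones((4, 4))
mask[1:3, 1:3] = 0.0         # zeros mark the damaged region

# Stand-in for a network output; a real model would generate this.
predicted = np.full((4, 4), 0.5)

result = composite_inpainting(image, mask, predicted)
# Known pixels are preserved; only the hole receives predicted values.
```

Deep-learning inpainting methods differ mainly in how `predicted` is produced and how the loss encourages it to be consistent with the surrounding known pixels; the composition step itself is common across architectures.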
