Article

How to train your pre-trained GAN models

Journal

APPLIED INTELLIGENCE
Volume -, Issue -, Pages -

Publisher

SPRINGER
DOI: 10.1007/s10489-023-04807

Keywords

Deep learning; Generative adversarial networks (GAN); Computer vision (CV); Artificial intelligence (AI)


Summary

This paper provides a comprehensive review of recent transfer learning methods for generative adversarial networks (GANs) and proposes an effective method that fixes (freezes) some layers of the generator and discriminator to address training issues. The experiments use StyleGAN and evaluate performance with Fréchet Inception Distance (FID), coverage, and density. Results show that the proposed method avoids overfitting and outperforms existing methods on a variety of datasets.

Abstract

Generative adversarial networks (GANs) show excellent performance on a wide range of problems in computer vision, computer graphics, and machine learning, but they require large amounts of data and substantial computational resources. Training is also unstable: if the generator and discriminator diverge during training, the GAN subsequently struggles to converge. Various transfer learning methods have been introduced to tackle these problems, but mode collapse, a form of overfitting, often arises, and existing methods have difficulty learning the full distribution of the training data. In this paper, we provide a comprehensive review of the latest transfer learning methods as a solution to these problems, propose an effective method that fixes some layers of the generator and discriminator, and discuss future prospects. The experiments use StyleGAN, and performance is evaluated with Fréchet Inception Distance (FID), coverage, and density. The results show that the proposed method does not overfit and learns the distribution of the training data better than previously proposed methods. Moreover, it outperforms existing methods on the Stanford Cars, Stanford Dogs, Oxford Flower, Caltech-256, CUB-200-2011, and Insect-30 datasets.
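The abstract evaluates generated images with Fréchet Inception Distance (FID). As a quick reference, FID is the Fréchet distance between two Gaussians fitted to Inception-network feature activations of the real and generated image sets. A minimal NumPy/SciPy sketch of the closed-form distance is below; the Inception feature-extraction step is omitted, and the means/covariances are assumed to be precomputed feature statistics (not the paper's exact pipeline):

```python
import numpy as np
from scipy.linalg import sqrtm

def fid(mu1, sigma1, mu2, sigma2):
    """Fréchet distance between N(mu1, sigma1) and N(mu2, sigma2):
    ||mu1 - mu2||^2 + Tr(sigma1 + sigma2 - 2 * (sigma1 @ sigma2)^{1/2})."""
    diff = mu1 - mu2
    covmean = sqrtm(sigma1 @ sigma2)
    # sqrtm can return tiny imaginary components from numerical error.
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))

# Toy check: identical Gaussians give distance 0; with equal identity
# covariances, FID reduces to the squared mean distance.
mu, cov = np.zeros(2), np.eye(2)
fid(mu, cov, mu, cov)                       # ≈ 0.0
fid(mu, cov, np.array([3.0, 4.0]), cov)     # ≈ 25.0
```

Lower is better; in practice the statistics are computed over thousands of images, since FID is biased for small sample sizes.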

