3.8 Proceedings Paper

Sketch Your Own GAN

Publisher

IEEE
DOI: 10.1109/ICCV48922.2021.01379

Keywords

-

Funding

  1. Signify Lighting Research
  2. Naver Corporation
  3. DARPA [SAIL-ON HR0011-20-C-0022]

Abstract

In this study, a method called GAN Sketching is introduced to rewrite GAN models with one or more sketches, making it easier for novice users to train GANs. The experiments show that this method can shape GANs to match shapes and poses specified by sketches while maintaining realism and diversity. The resulting GAN has various applications, including latent space interpolation and image editing.
Can a user create a deep generative model by sketching a single example? Traditionally, creating a GAN model has required the collection of a large-scale dataset of exemplars and specialized knowledge in deep learning. In contrast, sketching is possibly the most universally accessible way to convey a visual concept. In this work, we present a method, GAN Sketching, for rewriting GANs with one or more sketches, to make GAN training easier for novice users. In particular, we change the weights of an original GAN model according to user sketches. We encourage the model's output to match the user sketches through a cross-domain adversarial loss. Furthermore, we explore different regularization methods to preserve the original model's diversity and image quality. Experiments show that our method can mold GANs to match shapes and poses specified by sketches while maintaining realism and diversity. Finally, we demonstrate a few applications of the resulting GAN, including latent space interpolation and image editing.
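
The abstract describes fine-tuning the weights of a pre-trained generator so that its outputs, once mapped into the sketch domain, match the user sketches under an adversarial loss, while regularization keeps the model close to the original to preserve diversity and image quality. Below is a minimal, hypothetical sketch of one such training step in PyTorch. The module names (G, G_orig, D_sketch, image_to_sketch) and the simple L2 weight anchor used as the regularizer are illustrative assumptions, not the authors' implementation; the abstract itself only says that different regularization methods are explored.

    # Hypothetical illustration of cross-domain adversarial fine-tuning.
    # G, G_orig, D_sketch, and image_to_sketch are placeholders for a
    # pre-trained generator, a frozen copy of it, a sketch-domain
    # discriminator, and a differentiable image-to-sketch network.
    import torch
    import torch.nn.functional as F

    def finetune_step(G, G_orig, D_sketch, image_to_sketch, user_sketches,
                      opt_G, opt_D, z_dim=512, batch_size=4, lambda_reg=0.1):
        z = torch.randn(batch_size, z_dim)

        # Discriminator step: real user sketches vs. generated images
        # translated into the sketch domain (non-saturating GAN loss).
        fake_sketch = image_to_sketch(G(z)).detach()
        d_loss = (F.softplus(-D_sketch(user_sketches)).mean()
                  + F.softplus(D_sketch(fake_sketch)).mean())
        opt_D.zero_grad()
        d_loss.backward()
        opt_D.step()

        # Generator step: cross-domain adversarial loss plus a simple L2
        # anchor toward the original weights, standing in for the paper's
        # regularization that preserves diversity and image quality.
        fake_sketch = image_to_sketch(G(z))
        g_adv = F.softplus(-D_sketch(fake_sketch)).mean()
        reg = sum(((p - p0.detach()) ** 2).sum()
                  for p, p0 in zip(G.parameters(), G_orig.parameters()))
        g_loss = g_adv + lambda_reg * reg
        opt_G.zero_grad()
        g_loss.backward()
        opt_G.step()
        return d_loss.item(), g_loss.item()

The key structural point is the cross-domain adversarial update: the discriminator never sees generated RGB images directly, only their sketch-domain translations, so only a handful of user sketches are needed to steer the generator.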

Authors

Sheng-Yu Wang, David Bau, Jun-Yan Zhu
