Article

Evolutionary Variational Optimization of Generative Models

Journal

Publisher

MICROTOME PUBL

Keywords

Expectation Maximization; Variational Methods; Evolutionary Algorithms; Sparse Coding; Denoising; Inpainting

Funding

  1. German Research Foundation (DFG) [352015383 (SFB 1330)]
  2. Enrico Guiraud, through a Wolfgang Gentner scholarship of the German Federal Ministry of Education and Research (BMBF) [13E18CHA, INST 184/157-1 FUGG]
  3. North-German Supercomputing Alliance [nim00006]


We combine two popular optimization approaches to derive learning algorithms for generative models: variational optimization and evolutionary algorithms. The combination is realized for generative models with discrete latents by using truncated posteriors as the family of variational distributions. The variational parameters of truncated posteriors are sets of latent states. By interpreting these states as genomes of individuals and by using the variational lower bound to define a fitness, we can apply evolutionary algorithms to realize the variational loop. The used variational distributions are very flexible and we show that evolutionary algorithms can effectively and efficiently optimize the variational bound. Furthermore, the variational loop is generally applicable (black box) with no analytical derivations required. To show general applicability, we apply the approach to three generative models (we use Noisy-OR Bayes Nets, Binary Sparse Coding, and Spike-and-Slab Sparse Coding). To demonstrate effectiveness and efficiency of the novel variational approach, we use the standard competitive benchmarks of image denoising and inpainting. The benchmarks allow quantitative comparisons to a wide range of methods including probabilistic approaches, deep deterministic and generative networks, and non-local image processing methods. In the category of zero-shot learning (when only the corrupted image is used for training), we observed the evolutionary variational algorithm to significantly improve the state-of-the-art in many benchmark settings. For one well-known inpainting benchmark, we also observed state-of-the-art performance across all categories of algorithms although we only train on the corrupted image. In general, our investigations highlight the importance of research on optimization methods for generative models to achieve performance improvements.
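The core loop described in the abstract can be sketched in a few lines: for each data point, the truncated posterior is represented by a small set of latent states, those states are treated as genomes, the log-joint of the generative model serves as the fitness, and mutation plus selection drives the set toward high-probability states, which monotonically tightens the variational bound. The sketch below is a minimal illustration under assumed toy parameters (a small binary sparse coding model with a Gaussian likelihood); it is not the authors' implementation, and all names (`log_joint`, `evolve_truncated_posterior`, the single-bit-flip mutation) are hypothetical choices for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary sparse coding model (hypothetical parameters, for illustration):
# latents s in {0,1}^H, observation y = W s + Gaussian noise, Bernoulli prior.
H, D = 8, 5
W = rng.normal(size=(D, H))
sigma2 = 0.5
pi = 0.2  # prior activation probability per latent


def log_joint(y, s):
    """Fitness of a latent state: log p(y, s) up to an additive constant."""
    prior = np.sum(s * np.log(pi) + (1 - s) * np.log(1 - pi))
    resid = y - W @ s
    lik = -0.5 * np.sum(resid ** 2) / sigma2
    return prior + lik


def evolve_truncated_posterior(y, K=10, n_gen=50, n_children=4):
    """One evolutionary loop: the set of K latent states *is* the
    variational parameter of the truncated posterior for y."""
    pop = {tuple(rng.integers(0, 2, H)) for _ in range(K)}
    for _ in range(n_gen):
        children = set()
        for parent in pop:
            for _ in range(n_children):
                child = np.array(parent)
                child[rng.integers(0, H)] ^= 1  # mutation: single bit flip
                children.add(tuple(child))
        # Selection: keep the K fittest states; since parents compete with
        # children, the variational bound can only improve.
        union = pop | children
        pop = set(sorted(union,
                         key=lambda s: log_joint(y, np.array(s)),
                         reverse=True)[:K])
    return [np.array(s) for s in pop]


# Usage: evolve the truncated set for one synthetic observation.
s_true = (rng.random(H) < pi).astype(int)
y = W @ s_true + rng.normal(scale=np.sqrt(sigma2), size=D)
states = evolve_truncated_posterior(y)
best = max(states, key=lambda s: log_joint(y, s))
```

Because selected states always include the fittest members of the previous generation, the sum of joint probabilities over the truncated set never decreases, which is what makes the evolutionary loop a valid (black-box) optimizer of the variational lower bound.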

