Article

Arbitrary-Scale Texture Generation From Coarse-Grained Control

Journal

IEEE TRANSACTIONS ON IMAGE PROCESSING
Volume 31, Pages 5841-5855

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TIP.2022.3201710

Keywords

Markov random fields; Solid modeling; Task analysis; Bayes methods; Visualization; Real-time systems; Pipelines; Texture synthesis; Bayesian hierarchy; Markov random field; fully convolutional network

Funding

  1. National Key Research and Development Program of China [2018AAA0100602]

Abstract

This paper proposes a method that generates textures directly from coarse-grained control or high-level guidance, dividing the generation process into a three-level Bayesian hierarchical model and using a fully convolutional network to produce diverse textures in real time.
Existing deep-network-based texture synthesis approaches focus on fine-grained control of texture generation by synthesizing images from exemplars. Since the networks employed by most of these methods are tied to individual exemplar textures, a large number of separate networks must be trained to model a variety of textures. In this paper, we propose to generate textures directly from coarse-grained control or high-level guidance, such as texture categories, perceptual attributes, and semantic descriptions. We accomplish this by factoring the generation process of a texture into a three-level Bayesian hierarchical model: a coarse-grained signal first determines a distribution over Markov random fields; a Markov random field drawn from this distribution then models the distribution of output textures; finally, an output texture is generated from the sampled Markov random field. At the bottom level of the Bayesian hierarchy, the isotropic and ergodic characteristics of textures favor a construction based on a fully convolutional network. The proposed method integrates texture creation and texture synthesis into one pipeline for real-time texture generation, and enables users to readily obtain diverse textures at arbitrary scales from high-level guidance alone. Extensive experiments demonstrate that the proposed method generates plausible textures faithful to user-defined control, and achieves impressive texture metamorphosis by interpolating in the learned texture manifold.
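
The three-level hierarchy described in the abstract can be summarized as a single marginalization. The factorization below is our reading of it, with c, theta, and x as placeholder symbols (our notation, not the paper's) for the control signal, the Markov random field, and the output texture:

```latex
% Our reading of the three-level hierarchy (notation not from the paper):
%   p(\theta \mid c): the coarse-grained control induces a distribution over MRFs
%   p(x \mid \theta): a sampled MRF defines a distribution over textures
p(x \mid c) = \int p(x \mid \theta)\, p(\theta \mid c)\, \mathrm{d}\theta
```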
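
As a concrete illustration, the sketch below mocks up such a pipeline in PyTorch: a small network maps the control signal to the parameters of a latent distribution (a Gaussian stand-in for the distribution over Markov random fields), a code is sampled from it, and a fully convolutional decoder expands code-conditioned noise into a texture at any requested resolution. Every name, layer size, and the Gaussian simplification here are our assumptions, not the authors' architecture.

```python
# Minimal sketch (not the authors' code) of the pipeline the abstract
# describes: control signal -> latent distribution -> sampled code ->
# fully convolutional decoding at an arbitrary output size.
import torch
import torch.nn as nn

class CoarseToLatent(nn.Module):
    """Level 1: control signal c -> distribution over latent codes."""
    def __init__(self, control_dim=16, latent_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(control_dim, 128), nn.ReLU(),
            nn.Linear(128, 2 * latent_dim),  # predicts mean and log-variance
        )

    def forward(self, c):
        mu, log_var = self.net(c).chunk(2, dim=-1)
        # Level 2: sample one latent code (reparameterization trick).
        return mu + torch.randn_like(mu) * (0.5 * log_var).exp()

class TextureFCN(nn.Module):
    """Level 3: a fully convolutional decoder; having no fixed input
    size, it can be evaluated at any resolution, which is what makes
    arbitrary-scale output possible."""
    def __init__(self, latent_dim=64, noise_ch=8):
        super().__init__()
        self.noise_ch = noise_ch
        self.body = nn.Sequential(
            nn.Conv2d(noise_ch + latent_dim, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1), nn.Sigmoid(),
        )

    def forward(self, z, height, width):
        noise = torch.randn(z.shape[0], self.noise_ch, height, width)
        # Broadcast the sampled code over the spatial grid.
        z_map = z[:, :, None, None].expand(-1, -1, height, width)
        return self.body(torch.cat([noise, z_map], dim=1))

# Usage: one control vector, one sampled code, two output scales.
encoder, decoder = CoarseToLatent(), TextureFCN()
c = torch.randn(1, 16)            # hypothetical control embedding
z = encoder(c)
small = decoder(z, 128, 128)      # (1, 3, 128, 128)
large = decoder(z, 512, 512)      # (1, 3, 512, 512)
```

In this reading, the texture metamorphosis the abstract mentions would correspond to decoding codes interpolated between two sampled z vectors.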

