Article

ZeroEGGS: Zero-shot Example-based Gesture Generation from Speech

Journal

COMPUTER GRAPHICS FORUM
Volume 42, Issue 1, Pages 206-216

Publisher

WILEY
DOI: 10.1111/cgf.14734

Keywords

animation; gestures; character control; motion capture


We present ZeroEGGS, a neural network framework for speech-driven gesture generation with zero-shot style control by example: style can be controlled with only a short example motion clip, even for motion styles unseen during training. Our model uses a variational framework to learn a style embedding, making it easy to modify style through latent-space manipulation or by blending and scaling style embeddings. The probabilistic nature of our framework further enables the generation of a variety of outputs for a given input, reflecting the stochastic nature of gesture motion. In a series of experiments, we first demonstrate the flexibility and generalizability of our model to new speakers and styles. In a user study, we then show that our model outperforms previous state-of-the-art techniques in naturalness of motion, appropriateness for speech, and style portrayal. Finally, we release a high-quality dataset of full-body gesture motion, including fingers, with speech, spanning 19 different styles. Our code and data are publicly available.
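The blending and scaling of style embeddings mentioned in the abstract can be sketched as simple vector operations in the learned latent space. The sketch below is illustrative only: the function names, the embedding dimension, and the random placeholder vectors are assumptions, not the paper's actual encoder or API.

```python
import numpy as np

# Hypothetical sketch: ZeroEGGS encodes a short example motion clip into a
# fixed-size style embedding; new styles can then be created by blending or
# scaling embeddings. The encoder is not reproduced here, so two random
# vectors stand in for encoded clips (dimension 64 chosen arbitrarily).

def blend_styles(style_a: np.ndarray, style_b: np.ndarray, alpha: float) -> np.ndarray:
    """Linearly interpolate two style embeddings (alpha in [0, 1])."""
    return (1.0 - alpha) * style_a + alpha * style_b

def scale_style(style: np.ndarray, gain: float) -> np.ndarray:
    """Scale a style embedding to exaggerate (gain > 1) or attenuate a style."""
    return gain * style

rng = np.random.default_rng(0)
happy = rng.standard_normal(64)  # placeholder for an encoded "happy" clip
sad = rng.standard_normal(64)    # placeholder for an encoded "sad" clip

mixed = blend_styles(happy, sad, 0.5)  # halfway between the two styles
exaggerated = scale_style(happy, 1.5)  # stronger rendition of "happy"
```

A generator conditioned on `mixed` or `exaggerated` instead of the original embedding would then produce gestures in the blended or intensified style.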

