Article

Unpaired Motion Style Transfer from Video to Animation

Journal

ACM TRANSACTIONS ON GRAPHICS
Volume 39, Issue 4

Publisher

ASSOC COMPUTING MACHINERY
DOI: 10.1145/3386569.3392469

Keywords

motion analysis; style transfer

Funding

  1. National Key R&D Program of China [2018YFB1403900, 2019YFF0302902]
  2. Israel Science Foundation [2366/16]

Abstract

Transferring the motion style from one animation clip to another, while preserving the motion content of the latter, has been a long-standing problem in character animation. Most existing data-driven approaches are supervised and rely on paired data, where motions with the same content are performed in different styles. In addition, these approaches are limited to transfer of styles that were seen during training. In this paper, we present a novel data-driven framework for motion style transfer, which learns from an unpaired collection of motions with style labels, and enables transferring motion styles not observed during training. Furthermore, our framework is able to extract motion styles directly from videos, bypassing 3D reconstruction, and apply them to the 3D input motion. Our style transfer network encodes motions into two latent codes, for content and for style, each of which plays a different role in the decoding (synthesis) process. While the content code is decoded into the output motion by several temporal convolutional layers, the style code modifies deep features via temporally invariant adaptive instance normalization (AdaIN). Moreover, while the content code is encoded from 3D joint rotations, we learn a common embedding for style from either 3D or 2D joint positions, enabling style extraction from videos. Our results are comparable to the state-of-the-art, despite not requiring paired training data, and outperform other methods when transferring previously unseen styles. To our knowledge, we are the first to demonstrate style transfer directly from videos to 3D animations, an ability which enables one to extend the set of style examples far beyond motions captured by MoCap systems.
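The temporally invariant AdaIN step described above can be sketched as follows: each channel of the content encoder's deep features is normalized over the time axis, then rescaled and shifted by parameters derived from the style code. This is a minimal numpy sketch of the general AdaIN mechanism, not the paper's implementation; the function name, array shapes, and the assumption that `style_gamma`/`style_beta` come from a separate style network are illustrative.

```python
import numpy as np

def adain_1d(content_feat, style_gamma, style_beta, eps=1e-5):
    """Temporally invariant adaptive instance normalization (sketch).

    content_feat: array of shape (channels, frames), deep features
        produced by temporal convolutional layers (hypothetical shape).
    style_gamma, style_beta: arrays of shape (channels,), per-channel
        scale and shift assumed to be predicted from the style code.
    """
    # Normalize each channel over time, so the statistics are
    # independent of the frame index (temporally invariant).
    mean = content_feat.mean(axis=1, keepdims=True)
    std = content_feat.std(axis=1, keepdims=True)
    normalized = (content_feat - mean) / (std + eps)
    # Re-inject style as new per-channel statistics.
    return style_gamma[:, None] * normalized + style_beta[:, None]
```

In this formulation the content features retain only their temporal shape after normalization, while the style code alone determines the per-channel scale and offset, which is what lets a single style embedding (from 3D or 2D joint positions) modulate the decoded motion.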


