Semi-supervised Character Motion Style Transfer
Abstract
Most existing motion style transfer methods rely on a small amount of style-labeled motion data, limiting their ability to generalize to unseen motions outside the training distribution. Other arbitrary motion style transfer methods use large-scale, unlabeled motion data to improve generalization, but often lose style consistency and produce unnatural results. To address these problems, this paper proposes a semi-supervised motion style transfer framework that combines limited style-labeled motion data with large-scale unlabeled motion data to achieve style transfer for arbitrary motion content. Specifically, the framework adopts graph convolutional networks and designs corresponding loss functions to preserve the content of unlabeled motions and the style of labeled motions. It further improves stylization quality through a StyleNet fusion module and a multi-level network architecture. Experimental results demonstrate that the proposed framework achieves better generalization and higher generation quality than state-of-the-art methods while better preserving style consistency.