Motion Style Transfer via Deep Autoencoder and Spatio-Temporal Feature Constraint
Hu Dong1,2), Peng Shujuan1,2)*, Liu Xin1,2), and Du Jixiang1,2)
1) (College of Computer Science and Technology, Huaqiao University, Xiamen 361021)
2) (Key Laboratory of Pattern Recognition and Computer Vision, Xiamen City, Xiamen 361021)
Motion style transfer, combined with motion editing techniques, can be exploited to meet the requirements of different motion processing purposes in computer animation. However, data-driven motion style processing often suffers from unnatural postures and poor adaptability to continuous motion. To tackle these problems, we present an efficient motion style transfer approach based on a deep autoencoder and a spatio-temporal feature constraint. According to the user's style transfer request, we decompose human motion into behavior motion and style motion. By embedding historical motion frames within the deep autoencoder model, discriminative spatio-temporal features corresponding to the behavior motion and the style motion can be effectively extracted. Finally, we exploit style constraints in the feature space, formulated via a Gram matrix, to control the motion style transfer, whereby motion styles of different semantics can be transferred from one style to another. Experiments show that the proposed approach produces a variety of movement styles, and the transferred motions are visually natural and vivid. Further experiments also demonstrate the good generalization ability and adaptability of the proposed model in comparison with existing counterparts.
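The Gram-matrix style constraint mentioned in the abstract can be illustrated with a minimal sketch. The snippet below is our own NumPy illustration, not the authors' implementation: it assumes a hypothetical feature map of shape (channels, time) taken from the autoencoder's hidden layer, computes its normalized Gram matrix (channel-wise correlations over time), and measures a style loss as the squared difference between the Gram matrices of two motions.

```python
import numpy as np

def gram_matrix(features):
    """Normalized Gram matrix of a (channels, time) spatio-temporal feature map.

    Entry (i, j) captures the correlation of feature channels i and j over
    time, which discards temporal placement and retains style statistics.
    """
    c, t = features.shape
    return (features @ features.T) / (c * t)

def style_loss(feat_content, feat_style):
    """Mean squared difference between the two Gram matrices.

    Minimizing this over the hidden features of a generated motion pushes
    its style statistics toward those of the reference style motion.
    """
    g1 = gram_matrix(feat_content)
    g2 = gram_matrix(feat_style)
    return float(np.mean((g1 - g2) ** 2))

# Toy usage with random hidden features (8 channels, 60 frames).
rng = np.random.default_rng(0)
fa = rng.standard_normal((8, 60))
fb = rng.standard_normal((8, 60))
print(style_loss(fa, fa))  # identical features -> loss of 0.0
print(style_loss(fa, fb))  # differing styles -> positive loss
```

In a full transfer pipeline, this loss would be combined with a reconstruction term on the behavior motion so that the output keeps the source behavior while matching the target style statistics.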