
Research on Data-Driven Motion Style Transfer for Virtual Characters

Posted on: 2022-11-17
Degree: Master
Type: Thesis
Country: China
Candidate: P Z Li
Full Text: PDF
GTID: 2518306788956579
Subject: Computer Software and Application of Computer
Abstract/Summary:
Motion style transfer is an important research direction in virtual-reality animation simulation. Inspired by image style transfer, the goal of this research is to generate different motion styles from the input motion behavior of a virtual character. In recent years, advances in deep neural networks have promoted the fusion of the two distinct concepts of image style transfer and motion style transfer: processing methods from state-of-the-art image style transfer networks can be effectively transplanted into a motion style transfer network model for motion data. Based on this consideration, the contributions of this thesis are summarized as follows.

First, drawing on how image style transfer extracts content features and style features, this thesis designs a neural network framework with a temporal convolutional network as its backbone to process human motion data, which has both temporal and spatial structure. Instance Normalization (IN), borrowed from image style transfer, is introduced into the encoder to extract motion content features and motion style features. Experiments show that this method can extract content and style features from unpaired datasets, just as an image style transfer network does.

Second, for the motion-generation decoder guided by motion style features, this thesis adapts the Spatially-Adaptive Normalization (SPADE) method from multimodal image generation into a motion decoder, and proposes an improved SPADE method for motion style transfer. Compared with the Adaptive Instance Normalization (AdaIN) method, the proposed method has better generality and accuracy, demonstrating that the improved SPADE method retains more style information and thereby guides better motion style transfer.

Then, in view of the many physical constraints on virtual character motion in objective reality, this thesis introduces forward kinematics into the network to constrain joint errors propagated along the kinematic chain, and applies inverse kinematics to the style transfer results to correct foot-contact constraints. Experiments on the style transfer results show that these kinematic constraints make the motion output by the network conform better to objective laws of motion, with a more natural visual effect.

Finally, to demonstrate data-driven virtual character style transfer, this thesis builds a virtual character visualization platform based on Unity3D and uses the experimental data to drive virtual character models, including character performances and skeletal-animation renderings of multiple raw and generated motion sequences. The results show that the proposed virtual character style transfer method can generate realistic animation effects, and has a degree of novelty and application value.
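To make the AdaIN baseline mentioned above concrete, the following is a minimal NumPy sketch of Adaptive Instance Normalization applied to motion features. The array shapes and the function name are illustrative assumptions for exposition, not the thesis's actual implementation:

```python
import numpy as np

def adain(content, style, eps=1e-5):
    """Adaptive Instance Normalization for motion features.

    content, style: arrays of shape (channels, frames).
    Each content channel is normalized over time to zero mean and
    unit variance, then re-scaled with the per-channel mean and
    standard deviation of the style features, so the output keeps
    the content's temporal structure but the style's statistics.
    """
    c_mu = content.mean(axis=1, keepdims=True)
    c_std = content.std(axis=1, keepdims=True) + eps
    s_mu = style.mean(axis=1, keepdims=True)
    s_std = style.std(axis=1, keepdims=True) + eps
    return s_std * (content - c_mu) / c_std + s_mu
```

By contrast, SPADE-style modulation replaces these single per-channel scale and shift values with modulation maps that vary across the normalized dimensions, which is one intuition for why the improved SPADE decoder can retain more style information than AdaIN.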
Keywords/Search Tags: deep learning, motion style transfer, data-driven, Spatially-Adaptive Normalization