
Facial Expression Recognition Based Interaction With Virtual Environment

Posted on: 2018-11-04    Degree: Master    Type: Thesis
Country: China    Candidate: Y Yang    Full Text: PDF
GTID: 2348330569486494    Subject: Control Science and Engineering
Abstract/Summary:
With the rapid development of Virtual Environment (VE) technology, more intelligent and affective interaction interfaces are in demand. In a VE, virtual characters or avatars with subtle, high-fidelity expressions can greatly enhance the practicality of VR applications, and a number of VR applications based on facial expression recognition have already appeared; virtual characters in 3D films, for example, display a wide range of rich expressions. However, the high cost of 3D capture and scanning devices hinders the wider adoption of such applications. This thesis studies static and dynamic facial expression recognition algorithms, constructs a mixed facial expression dataset of RGB-D images to test and verify the proposed algorithms, and designs and develops a natural interaction system for VEs based on facial expression recognition. The main contents are as follows.

1. For static facial expression recognition, a method using RGB-D images and multichannel features is proposed. HOG descriptors are extracted from the depth-data entropy map, the intensity-image entropy map, and the RGB saliency map as appearance features, while an Active Appearance Model (AAM) extracts facial feature points as shape features. The two kinds of features are concatenated into a multichannel feature vector, and a kernel-based multiclass Support Vector Machine (SVM) serves as the classifier. Experimental results show that the proposed method is more robust and accurate than traditional algorithms based on single-channel features. A hedged sketch of this pipeline is given after this list.

2. For dynamic facial expression recognition, Emotion Profiles (EPs) are used as the dynamic discriminative rule, and each frame is voted on by six SVM classifiers (see the second sketch after this list). Comparison tests were conducted under three head rotations with varying angles and three different illumination conditions, and the results show that the proposed method maintains a high recognition rate in these environments.

3. Animation generation for virtual avatars is studied, and a VE testing toolbox based on facial expression recognition is built for modeling and expression remapping. After modeling and skinning, a virtual avatar is driven by Kinect facial motion capture; the blend-shape combination weights are calibrated with 23 user-specific expressions, and retargeting in Unity3D smooths the motion and lets the user's expressions control the avatar.

4. Combining dynamic facial expression recognition with the virtual avatar animation, a VE interaction system based on facial expression recognition is developed. A feedback scheme covering facial motions and body gestures is designed according to the recognized expressions to improve the user experience in virtual environments. In a virtual laboratory scene, the Unity Profiler is used to analyze CPU, memory, and rendering performance and to optimize the system, which achieves a frame rate smooth enough for interaction.
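The following is a minimal Python sketch of the static multichannel pipeline described in item 1, under stated assumptions: the function name channel_features, the placeholder saliency channel, and the landmark input are illustrative, not the thesis implementation. The library calls (skimage's hog and local entropy filter, scikit-learn's SVC) are standard, but parameter choices such as the HOG cell size are assumptions.

```python
# Hypothetical sketch of the multichannel feature pipeline (not the thesis code).
import numpy as np
from skimage.feature import hog
from skimage.filters.rank import entropy
from skimage.morphology import disk
from skimage.util import img_as_ubyte
from sklearn.svm import SVC

def channel_features(rgb, depth, landmarks, saliency=None):
    """Build one multichannel feature vector from an aligned RGB-D face crop.

    rgb       : (H, W, 3) float image in [0, 1]
    depth     : (H, W) float depth map in [0, 1]
    landmarks : (K, 2) facial feature points, e.g. from an AAM fit (assumed given)
    saliency  : optional (H, W) saliency map of the RGB image
    """
    gray = rgb.mean(axis=2)
    if saliency is None:
        saliency = gray  # placeholder only; the thesis uses an RGB saliency map

    # Appearance channels: local entropy of depth and intensity, plus saliency.
    depth_ent = entropy(img_as_ubyte(depth), disk(5))
    gray_ent = entropy(img_as_ubyte(gray), disk(5))

    hog_params = dict(orientations=9, pixels_per_cell=(8, 8),
                      cells_per_block=(2, 2))
    appearance = np.concatenate([
        hog(depth_ent, **hog_params),
        hog(gray_ent, **hog_params),
        hog(saliency, **hog_params),
    ])

    # Shape channel: landmark coordinates normalized by image width and height.
    shape = (landmarks / np.array([rgb.shape[1], rgb.shape[0]])).ravel()
    return np.concatenate([appearance, shape])

# Kernel-based multiclass SVM over the concatenated feature vectors.
clf = SVC(kernel="rbf", decision_function_shape="ovr")
# clf.fit(np.stack([channel_features(...) for ...]), labels)  # trained offline
```

For the dynamic case in item 2, one simple reading of "each frame is voted by 6 SVM classifiers" is that the per-frame one-vs-rest scores cast votes that accumulate into an emotion profile over the clip; the sketch below illustrates that idea only, and the expression list and voting rule are assumptions.

```python
# Minimal sketch of frame voting into an Emotion Profile; assumes `clf` above
# has been trained on six basic expressions.
EXPRESSIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

def recognize_sequence(frames):
    """frames: iterable of per-frame feature vectors (see channel_features)."""
    profile = np.zeros(len(EXPRESSIONS))
    for feat in frames:
        scores = clf.decision_function(feat.reshape(1, -1)).ravel()
        profile[np.argmax(scores)] += 1        # each frame casts one vote
    profile /= profile.sum()                   # emotion profile over the clip
    return EXPRESSIONS[int(np.argmax(profile))], profile
```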
Keywords/Search Tags: Facial expression recognition, VR interaction, support vector machine, virtual character animation