
Non-human-like Characters’ Facial Animation Driven By One Or Two Actors

Posted on: 2015-08-02    Degree: Master    Type: Thesis
Country: China    Candidate: D Guo    Full Text: PDF
GTID: 2298330422471010    Subject: Computer software and theory
Abstract/Summary:
In recent years, more and more virtual characters, such as the monsters in the classic movie “Monsters, Inc.”, have appeared in computer-generated worlds and won people’s affection. These characters have facial structures that differ from the human face: the number or arrangement of their facial parts is unusual. We call them non-human-like characters. Because they are indispensable in games and movies, facial animation of non-human-like characters has become an important and challenging research topic. In this thesis, we present a novel method that transfers one actor’s performance, or simultaneously retargets two actors’ expressions, onto non-human-like characters.

First, to capture the actors’ facial expressions, we use a Kinect to record facial motion. The captured motion data is then decomposed into rigid head motion and subtle facial expressions. With the Kinect depth camera, the actor does not need to wear any equipment or facial markers, and can therefore perform freely in a natural environment.

Second, we observe that, when viewed from an appropriate angle, a local part of the character exhibits motion similar to that of the human face. Based on this observation, we define several local coordinate systems on a non-human-like character so that its facial motion can be associated with human facial motion. In other words, source expressions can be transferred to a non-human-like character within its local coordinate systems.

Third, we use Laplacian mesh deformation to complete the expression retargeting. The character’s facial feature points are divided into two kinds. The first kind has a direct correspondence with human facial feature points; these facial parts move similarly to the human face, so no local coordinate system is needed, and the actor’s expression displacements are added to them directly. The second kind corresponds to the human face only within a local coordinate system; after the actor’s expressions are transferred to these feature points in local coordinates, their global coordinates are computed through transformation matrices. Laplacian deformation then propagates the feature points’ displacements to the whole character mesh. With this approach, two actors’ facial expressions can be transferred onto different parts of the same character.

Last, using OpenGL and C++ on the Windows platform, we implemented a system in which one or two performers drive a non-human-like character’s facial animation. Experiments demonstrate that the system works well. Our method can be applied to games, movies, video conferencing and other areas.
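The decomposition in the first step can be illustrated with a small sketch. The code below is not the thesis implementation; it assumes Eigen is available and uses a standard least-squares rigid fit (Kabsch/Procrustes) between the neutral pose and the current frame of tracked feature points, then subtracts the fitted rigid motion to leave the subtle expression displacements. All names (`fitRigid`, `expressionOffset`) are hypothetical.

```cpp
// Minimal sketch (not the thesis code): separating rigid head motion from
// expression displacements for one frame of tracked 3D feature points.
#include <Eigen/Dense>
#include <vector>

struct RigidTransform {
    Eigen::Matrix3d R;      // rigid head rotation
    Eigen::Vector3d t;      // rigid head translation
};

// Least-squares rigid fit (Kabsch/Procrustes) between point sets of equal size.
RigidTransform fitRigid(const std::vector<Eigen::Vector3d>& neutral,
                        const std::vector<Eigen::Vector3d>& tracked) {
    const size_t n = neutral.size();
    Eigen::Vector3d cN = Eigen::Vector3d::Zero(), cT = Eigen::Vector3d::Zero();
    for (size_t i = 0; i < n; ++i) { cN += neutral[i]; cT += tracked[i]; }
    cN /= double(n);  cT /= double(n);

    Eigen::Matrix3d H = Eigen::Matrix3d::Zero();
    for (size_t i = 0; i < n; ++i)
        H += (neutral[i] - cN) * (tracked[i] - cT).transpose();

    Eigen::JacobiSVD<Eigen::Matrix3d> svd(H, Eigen::ComputeFullU | Eigen::ComputeFullV);
    Eigen::Matrix3d R = svd.matrixV() * svd.matrixU().transpose();
    if (R.determinant() < 0) {            // correct a possible reflection
        Eigen::Matrix3d V = svd.matrixV();
        V.col(2) *= -1.0;
        R = V * svd.matrixU().transpose();
    }
    return { R, cT - R * cN };
}

// Expression displacement of one feature point after removing the rigid motion.
Eigen::Vector3d expressionOffset(const RigidTransform& T,
                                 const Eigen::Vector3d& neutralPt,
                                 const Eigen::Vector3d& trackedPt) {
    return trackedPt - (T.R * neutralPt + T.t);
}
```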
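The second and third steps hinge on applying the source displacement inside a character-local coordinate system and then mapping the result back to global coordinates through a transformation matrix. The sketch below illustrates that idea under our own assumptions (an orthonormal per-part frame, hypothetical names such as `LocalFrame` and `retargetFeaturePoint`); it is not the thesis code.

```cpp
// Minimal sketch: retargeting a feature point that corresponds to the human
// face only within a local coordinate system.
#include <Eigen/Dense>

struct LocalFrame {
    Eigen::Matrix3d axes;     // columns: local x/y/z axes in global coordinates
    Eigen::Vector3d origin;   // frame origin in global coordinates
};

// Map a point from local to global coordinates with the frame's transformation.
Eigen::Vector3d localToGlobal(const LocalFrame& f, const Eigen::Vector3d& pLocal) {
    return f.axes * pLocal + f.origin;
}

// Map a point from global to local coordinates (axes assumed orthonormal).
Eigen::Vector3d globalToLocal(const LocalFrame& f, const Eigen::Vector3d& pGlobal) {
    return f.axes.transpose() * (pGlobal - f.origin);
}

// Add the actor's expression displacement inside the local frame, then return
// the feature point's new global position for the deformation stage.
Eigen::Vector3d retargetFeaturePoint(const LocalFrame& frame,
                                     const Eigen::Vector3d& restGlobal,
                                     const Eigen::Vector3d& actorDisplacement) {
    Eigen::Vector3d local = globalToLocal(frame, restGlobal);
    local += actorDisplacement;              // source expression applied locally
    return localToGlobal(frame, local);      // back to global via the frame matrix
}
```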
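For the final propagation step, a generic uniform-weight Laplacian deformation with soft positional constraints conveys the idea: the retargeted feature points are pinned to their new positions, and a sparse least-squares solve moves the remaining vertices consistently. The sketch assumes Eigen's sparse module and hypothetical mesh containers; the thesis' actual Laplacian formulation and constraint weights may differ.

```cpp
// Minimal sketch of Laplacian deformation with soft feature-point constraints.
#include <Eigen/Sparse>
#include <Eigen/Dense>
#include <vector>
#include <map>

// restPos: rest-pose vertex positions; neighbors: one-ring adjacency;
// constraints: vertex index -> retargeted feature-point position.
std::vector<Eigen::Vector3d> laplacianDeform(
        const std::vector<Eigen::Vector3d>& restPos,
        const std::vector<std::vector<int>>& neighbors,
        const std::map<int, Eigen::Vector3d>& constraints,
        double w = 10.0) {                       // soft-constraint weight
    const int n = int(restPos.size());
    const int rows = n + int(constraints.size());
    std::vector<Eigen::Triplet<double>> trips;
    Eigen::MatrixXd b = Eigen::MatrixXd::Zero(rows, 3);

    // Laplacian rows: L * x should reproduce the rest-pose differential coordinates.
    for (int i = 0; i < n; ++i) {
        const double d = double(neighbors[i].size());
        trips.emplace_back(i, i, 1.0);
        Eigen::Vector3d delta = restPos[i];
        for (int j : neighbors[i]) {
            trips.emplace_back(i, j, -1.0 / d);
            delta -= restPos[j] / d;
        }
        b.row(i) = delta.transpose();
    }
    // Constraint rows: pin feature points to their retargeted positions.
    int r = n;
    for (const auto& [idx, target] : constraints) {
        trips.emplace_back(r, idx, w);
        b.row(r) = w * target.transpose();
        ++r;
    }

    Eigen::SparseMatrix<double> A(rows, n);
    A.setFromTriplets(trips.begin(), trips.end());
    // Solve the normal equations of the over-determined system in least squares.
    Eigen::SparseMatrix<double> AtA = Eigen::SparseMatrix<double>(A.transpose()) * A;
    Eigen::SimplicialLDLT<Eigen::SparseMatrix<double>> solver(AtA);
    Eigen::MatrixXd x = solver.solve(Eigen::MatrixXd(A.transpose() * b));

    std::vector<Eigen::Vector3d> out(n);
    for (int i = 0; i < n; ++i) out[i] = x.row(i).transpose();
    return out;
}
```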
Keywords/Search Tags:Non-human-like characters, expression capture, Laplacian deformation, local-coordinate system, expression retargeting