
Affective Body Language Detection and Generation for Socially Assistive Robots

Posted on: 2016-06-01
Degree: Ph.D
Type: Thesis
University: University of Toronto (Canada)
Candidate: McColl, Derek
Full Text: PDF
GTID: 2475390017981542
Subject: Robotics
Abstract/Summary:
For socially assistive robots to be successfully integrated and accepted within society, especially by vulnerable populations such as the elderly, they need to be able to interpret human affective states (emotions, moods and attitudes) in order to respond appropriately during assistive human-robot interactions (HRI). Previous research has shown that body language is an important channel for communicating affective states during natural human social interaction. This thesis focuses on developing, implementing and evaluating non-contact, autonomous affective body language recognition and classification systems that allow socially assistive robots to estimate a person's affect and respond appropriately during interactions. Two novel recognition systems have been developed and implemented: one based on static body language and one based on dynamic body language. The first classifies body language using a categorical approach based on a person's accessibility (openness and rapport) towards a robot, while the second uses a dimensional approach based on a person's valence (degree of unpleasantness or pleasantness) and arousal (level of activity). Both systems are implemented on the human-like socially assistive robot Brian 2.1 and evaluated in one-on-one socially assistive HRI experiments. Results from experiments with young adult and elderly participants indicate that both systems accurately identify user affect.

This thesis further investigates how individuals interact with and perceive an affect-aware socially assistive robot, by designing Brian 2.1 to adapt its behaviours based on a user's degree of accessibility towards it throughout social HRI. Experiments indicate that participants were more accessible towards the accessibility-aware robot than towards a non-accessibility-aware robot, and perceived the former to be more socially intelligent.
With this in mind, this work lastly investigates the development of an emotionally intelligent robot capable of effectively displaying its own affective body language during such interactions. Uniquely, body language displays based on movements and postures grounded in human behaviour research are integrated into Brian 2.1 so that the robot can respond to a user's affective state with its own set of emotions. Importantly, these displays are designed and tested to be easily recognizable by non-expert users. The social HRI studies conducted in this work lay a foundational framework for developing socially assistive robots that can recognize and react to human affect while providing needed assistance.
Keywords/Search Tags: Socially assistive, Body language, Human, HRI