
Study And Application Of Hand Gesture Recognition Based Spatio-temporal Features With Multiple Motion Sensors

Posted on: 2022-08-15
Degree: Master
Type: Thesis
Country: China
Candidate: M Y Ran
Full Text: PDF
GTID: 2518306536972509
Subject: Software engineering
Abstract/Summary:
Human-computer interaction based on gesture interfaces can reduce the complexity of interaction. As a natural and intuitive interaction method, it is widely used in a variety of scenarios. Compared with other input modes, gestures help externalize thinking and enable simpler, more efficient interaction between users and devices. The essence of a gesture interface is to allow computing devices to understand the meaning expressed by gestures. According to the type of sensing device, gesture recognition technology can be divided into recognition based on environmental sensors and recognition based on wearable sensors. Although environmental-sensor-based recognition can detect gestures without touching the human body, it still has many limitations in coverage, mobility, privacy protection, and cost. Traditional wearable sensors based on physiological signals, such as electromyography sensors, require electrodes in direct contact with muscles, which makes this technology unsuitable for long-term motion monitoring. Moreover, the wearing mode and position of the sensor have a considerable impact on recognition accuracy, and such sensors are ill-suited to recognizing fast gestures and actions. Owing to their small size, high accuracy, and low cost, more and more researchers now choose inertial measurement units (motion sensors) for motion recognition.

A key challenge in wearable-based hand gesture recognition is that a gesture can be performed in several ways, each with its own configuration of motions and their spatio-temporal dependencies. However, existing methods generally focus on the characteristics of a single point on the hand and ignore the diversity of motion information over the hand skeleton; as a result, they face two key challenges in characterizing hand gestures over multiple wearable sensors: motion representation and motion modeling. This leads us to define a spatio-temporal framework that explicitly characterizes the hand motion context, i.e., the spatio-temporal relations among multiple bones, and detects hand gestures in real time. In particular, our framework incorporates a Lie group-based representation to capture the inherent structural varieties of hand motions with spatio-temporal dependencies among multiple bones. To evaluate the framework, we developed a hand-worn prototype device with multiple motion sensors. An in-lab study on a dataset collected from nine subjects suggests our approach significantly outperforms state-of-the-art methods. We also present in-the-wild examples that highlight the interaction capability of the framework.
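To make the Lie group-based representation concrete, the sketch below shows one common way to encode the relative motion between adjacent bones: the relative rotation between two bone-mounted sensors is an element of the rotation group SO(3), and its logarithm map yields an axis-angle vector in the Lie algebra so(3) that can serve as a per-frame feature. This is a minimal illustration of the general technique, not the thesis's actual pipeline; the function names and the chain of bone pairs are assumptions for the example.

```python
import numpy as np

def log_map(R):
    """Map a rotation matrix in SO(3) to its axis-angle vector in the
    Lie algebra so(3), via the Rodrigues log formula."""
    # Rotation angle from the trace; clip guards against numerical drift.
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(theta, 0.0):
        return np.zeros(3)
    # Extract the rotation axis (scaled by the angle) from the
    # antisymmetric part of R.
    return (theta / (2.0 * np.sin(theta))) * np.array([
        R[2, 1] - R[1, 2],
        R[0, 2] - R[2, 0],
        R[1, 0] - R[0, 1],
    ])

def relative_bone_features(rotations):
    """Given per-bone orientation matrices (one per motion sensor) at a
    single time step, return the concatenated so(3) features of each
    adjacent bone pair along the hand skeleton."""
    feats = []
    for Ra, Rb in zip(rotations, rotations[1:]):
        # Relative rotation of bone b with respect to bone a,
        # log-mapped into the Lie algebra.
        feats.append(log_map(Ra.T @ Rb))
    return np.concatenate(feats)
```

Stacking these feature vectors over time gives a spatio-temporal trajectory in the Lie algebra, which a sequence model can then consume for gesture classification.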
Keywords/Search Tags:Hand gesture recognition, Wearable motion sensors, Lie group, Motion modeling