
Research On Robot Interaction Method Based On Multi-space Unification

Posted on: 2021-01-16 | Degree: Master | Type: Thesis
Country: China | Candidate: B Zhang | Full Text: PDF
GTID: 2428330611466935 | Subject: Computer Science and Technology
Abstract/Summary:
Human-robot interaction plays an important role in robot teleoperation and is a research hotspot in the field of robotics. With the progress of science and technology, interaction between humans and robots has been developing in a more natural and efficient direction. To address the shortcomings of existing human-robot interaction methods, this thesis proposes a human-robot interaction method based on multi-space unification. It combines augmented reality glasses and a gesture sensor into a set of wearable mobile interaction devices. A virtual robot serves as the interaction object, and users interact with it through gesture and voice. The research work of this thesis comprises the following two parts:

1. Theory and technology of multi-space unification: Taking the real robot scene as a reference, this thesis establishes a virtual robot model and uses three-dimensional registration technology to superimpose the virtual model onto the real environment, so that the virtual space and the real space are fused. By establishing the coordinate systems of the interactive scene and analyzing the transformation relationships between them, the registration of the operation space to the virtual space is realized. The singular value decomposition (SVD) method is used to solve the coordinate-system transformation matrix. Finally, the gesture data of the user's bare hand can be transformed into the virtual space through coordinate transformation for interaction with the virtual robot.

2. Robot mobile interaction based on multi-space unification: This thesis introduces the concept of mobile interaction based on multi-space unification. A Leap Motion sensor acquires the gesture data, which is filtered and smoothed with an interval Kalman filter. A gesture-based guided human-robot interaction is designed, in which the end effector of the virtual robot follows the user's hand; the following is triggered by collision detection between the hand and the virtual robot. The user's speech is recognized with the Speech Platform SDK, and a text classifier based on a support vector machine is used to understand the recognized speech. Gesture information and speech information are fused by filling a command slot to generate robot control instructions, realizing the fusion of gesture and speech. At the end of the thesis, the feasibility and effectiveness of the proposed method are verified by experiments on robot manipulator control.
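The SVD-based solution of the coordinate-system transformation mentioned in part 1 is commonly realized with the Kabsch method: given paired points measured in the operation space and the virtual space, it recovers the rotation and translation between them. A minimal sketch (the function name and point format are illustrative, not taken from the thesis):

```python
import numpy as np

def rigid_transform(P, Q):
    """Estimate rotation R and translation t mapping points P onto Q
    (both N x 3 arrays) by minimizing sum ||R @ p_i + t - q_i||^2
    via the SVD of the cross-covariance matrix (Kabsch method)."""
    # Center both point sets on their centroids
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    P0, Q0 = P - cP, Q - cQ
    # Cross-covariance matrix and its SVD
    H = P0.T @ Q0
    U, S, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so that det(R) = +1
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

With at least three non-collinear point pairs (e.g. markers observed in both coordinate systems), the returned `R` and `t` let the bare-hand gesture data be mapped from the sensor's operation space into the virtual space.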
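The thesis filters the Leap Motion data with an interval Kalman filter; as a simplified illustration of the idea, the sketch below smooths a single position axis with a standard constant-velocity Kalman filter (the sample rate and noise parameters are assumed values, not from the thesis):

```python
import numpy as np

def kalman_smooth(zs, dt=1/60, q=1e-3, r=1e-2):
    """Smooth a 1-D stream of position samples zs with a
    constant-velocity Kalman filter (state = [position, velocity])."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])              # we observe position only
    Q = q * np.eye(2)                        # process noise covariance
    R = np.array([[r]])                      # measurement noise covariance
    x = np.array([zs[0], 0.0])               # initial state
    P = np.eye(2)                             # initial state covariance
    out = []
    for z in zs:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update with the new measurement
        y = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return out
```

In practice each of the x, y, z coordinates of the palm or fingertip would be filtered this way before being transformed into the virtual space.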
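The collision detection that triggers the virtual robot's following can be as simple as testing whether the hand enters a sphere around the end effector. A minimal sketch, assuming 3-D point coordinates and an illustrative trigger radius:

```python
def hand_hits_end_effector(hand, effector, radius=0.05):
    """Return True when the hand position enters a sphere of `radius`
    (assumed to be in meters) around the virtual end effector.
    Comparing squared distances avoids a square root per frame."""
    dx = hand[0] - effector[0]
    dy = hand[1] - effector[1]
    dz = hand[2] - effector[2]
    return dx * dx + dy * dy + dz * dz <= radius * radius
```

Once this test succeeds, the guided interaction switches on and the end of the virtual robot starts following the filtered hand position.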
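The fusion of gesture and speech by "filling the command slot" can be sketched as a small structure that is only emitted as a control instruction once both modalities have contributed. The field names and instruction format below are hypothetical, chosen only to illustrate the mechanism:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class CommandSlot:
    """Illustrative command slot: the action comes from the speech
    classifier, the target point from the gesture channel."""
    action: Optional[str] = None                        # e.g. "move"
    target: Optional[Tuple[float, float, float]] = None # virtual-space point

    def fill_from_speech(self, label: str) -> None:
        self.action = label

    def fill_from_gesture(self, xyz: Tuple[float, float, float]) -> None:
        self.target = xyz

    def ready(self) -> bool:
        return self.action is not None and self.target is not None

    def to_instruction(self) -> Optional[str]:
        """Emit a robot control instruction only when the slot is full."""
        if not self.ready():
            return None
        x, y, z = self.target
        return f"{self.action} {x:.3f} {y:.3f} {z:.3f}"
```

Either modality may arrive first; the instruction is generated only when the slot is complete, which is the essence of the gesture-speech fusion described above.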
Keywords/Search Tags: Human-Robot Interaction, Augmented Reality, Guided Interaction, Gesture, Speech