Human-computer interaction is a discipline that studies the interaction between users and computer systems. With the development of computer technology and the emergence of new interaction requirements, traditional two-dimensional interaction methods can no longer meet users' needs because of their small interaction space, low degree of interaction freedom, and limited interaction techniques, so three-dimensional interaction has become a research hotspot. Three-dimensional pen-type interaction uses the paper-and-pen metaphor to reduce users' learning cost and achieve efficient, natural human-computer interaction; it is an emerging 3D interaction method. A 3D pen gesture is an interaction in which the user performs a specific operation by moving a hand-held electronic pen; it offers a large interaction space, six degrees of freedom, and a rich variety of gestures. Therefore, taking a large-space 3D pen-type interactive platform as the hardware foundation, research on definition and recognition methods for 3D pen gestures that fuse multi-channel information provides new functions and applications for the 3D pen-type interactive platform and has significant application value.

This paper takes a 3D pen-type interactive device as the hardware platform and uses Unity 3D to build the 3D interaction and experimental environment. Focusing on 3D pen gesture recognition that fuses ultrasonic positioning and IMU data, the paper studies a 3D pen gesture definition method and a 3D pen gesture recognition method, and finally verifies them experimentally. The main research contents are as follows:

(1) A 3D pen gesture definition method that fuses multi-modal information. Considering that 3D pen gestures have a larger interaction space, more degrees of freedom, and more diverse holding postures, four attributes of 3D pen gestures are analyzed and defined: trajectory shape, motion direction, pen-body posture, and spatial position. Describing these four attributes requires multi-modal sensor data; this paper selects ultrasonic positioning data, triaxial acceleration data, and pen-body posture data. Among the four attributes, trajectory shape is the basic attribute shared by all 3D gestures and is invariant to scale, translation, and rotation. The motion direction attribute describes the movement direction of the pen tip and of the pen body. The pen-body posture attribute concerns how much the pen's posture changes during the gesture. The spatial position attribute defines the sub-space region in which the gesture takes place and is an important reference in a 3D interactive environment. On this basis, a 3D pen gesture definition method is proposed that takes trajectory shape as the basic attribute and motion direction, pen posture, and spatial position as supplementary attributes, and the choice of data modality for describing the feature information of each attribute is explained. Finally, the method is used to define 10 typical 3D pen gestures.

(2) Based on the definition method and the 3D pen gesture set, a 3D pen gesture recognition method that fuses ultrasonic positioning and IMU data is proposed. The method identifies the attribute information of a 3D pen gesture step by step according to priority, selecting the corresponding data modality and recognition technique for each currently determined attribute to decide the gesture label. It first recognizes the basic trajectory-shape attribute: the ultrasonic trajectory sequence is encoded with a normalized center-distance function and then matched against templates with a dynamic time warping algorithm. Second, the pen-tip positioning trajectory and the triaxial acceleration data are used to estimate the pen-tip trajectory direction and the pen-body movement direction. Then, translation and rotation of the pen body are distinguished by threshold screening and by the variance of the Euler-angle sequence. Finally, the spatial position where the gesture takes place is identified by summing the distances from the points in each sub-space to the center of the interface.

The main contributions and innovations of this paper are: (1) A 3D pen gesture definition method based on multi-modal information. Based on the multi-modal information of ultrasonic positioning coordinates, triaxial acceleration data, and pen posture data, a definition method is proposed with trajectory shape as the basic attribute and motion direction, pen posture, and spatial position as supplementary attributes. This method fully considers the large interaction space, six degrees of freedom, and varied holding postures of 3D pen gestures, and therefore better meets the requirements of 3D pen gesture interaction. (2) A 3D pen gesture recognition method based on the fusion of ultrasonic positioning and IMU data. The method determines the attribute information of 3D pen gestures step by step according to priority, selects the corresponding data modality for each currently determined attribute, and designs recognition procedures suited to the data characteristics to determine the gesture label. It fuses multi-modal information, recognizes 3D pen gestures in real time, and solves the problem that traditional methods cannot recognize 3D pen gestures carrying multiple kinds of attribute information.
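The trajectory-shape recognition step described above (a normalized center-distance encoding matched with dynamic time warping) can be sketched as follows. This is a minimal illustrative sketch, not the thesis's actual implementation: the function names, the NumPy N×3 trajectory format, and the simple nearest-template classifier are all assumptions introduced here.

```python
import numpy as np

def center_distance_encoding(traj):
    """Encode a trajectory (N x 3 array of ultrasonic positions) as a
    normalized center-distance sequence: each point's distance to the
    trajectory centroid, scaled by the maximum distance. This encoding
    is invariant to translation, uniform scaling, and rotation."""
    pts = np.asarray(traj, dtype=float)
    center = pts.mean(axis=0)
    d = np.linalg.norm(pts - center, axis=1)
    return d / d.max()

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def classify_shape(traj, templates):
    """Return the label of the template trajectory whose encoded
    sequence is closest to the query's under DTW.
    `templates` maps label -> example trajectory (hypothetical format)."""
    code = center_distance_encoding(traj)
    return min(templates,
               key=lambda k: dtw_distance(code, center_distance_encoding(templates[k])))
```

Because the encoding discards absolute position, scale, and orientation, a circle drawn anywhere in the interaction space at any size matches a circle template; DTW then absorbs differences in drawing speed when comparing the resulting 1-D sequences.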