
Human-computer Interaction Semantic Model For Layout Design System In Augmented Reality Environment

Posted on: 2015-03-28
Degree: Master
Type: Thesis
Country: China
Candidate: F Zhang
Full Text: PDF
GTID: 2252330428997371
Subject: Mechanical engineering

Abstract/Summary:
Augmented reality technology has laid the theoretical and technological foundation for creating a hybrid environment that combines a real plant with virtual equipment. It offers designers a brand-new way to design and modify plant layout schemes intuitively and naturally and to evaluate them directly by sight. Traditional human-computer interaction technology, however, cannot support intuitive, natural and efficient modification operations, so human-computer interaction methods for this hybrid environment need to be studied. This thesis applies hand-gesture-based interaction to plant layout design. The research covers the following aspects:

Firstly, the augmented reality plant layout task was analyzed and basic gestures were defined. Specific interaction tasks were extracted by analyzing the interaction characteristics (such as range limitations) and interaction requirements of plant layout design. A method was studied for conveniently operating virtual models across the wide-range augmented reality plant environment from within a limited working range. Drawing on how people manipulate objects in daily life, the basic operation process of plant layout design was decomposed, and the basic hand gestures were extracted and defined.

Secondly, an interaction semantic model was defined. A human-computer interaction semantic model based on gesture change was established and divided into a physical layer, a lexical layer, a syntax layer and a semantic layer; the function of each layer was analyzed and described with mathematical formulas. Using a data glove and a tracking device, the model infers the operator's intention (such as adding, moving, rotating or removing a model) from the change and timing of gestures, the movement of the hands and context information.

Thirdly, the interaction semantic model was instantiated.
The data of the semantic model (the gesture set, the gesture-change set and the interaction-rule set) were instantiated according to the gestures, gesture changes and operating conditions that occur during plant layout. The basic operation process of plant layout design was defined, and the process of judging operation intentions was analyzed.

Fourthly, the interaction semantic model was implemented. Gesture data were acquired with the data glove, and the k-nearest-neighbor method was used to train gesture templates and recognize the current gesture. Gesture changes were recognized by filling a gesture-sequence tank (a buffer of successively recognized gestures).

Lastly, a plant layout design system for the augmented reality environment was designed and implemented on the related hardware and software platform. An application for Android mobile terminals was developed to assist the user with operations such as setting speed, adding text remarks and saving layout schemes. Running the system verified the feasibility and effectiveness of the research methods and algorithms proposed in this thesis.
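The mapping from gesture-change sequences to operation intentions described in the semantic model above can be sketched roughly as a rule-lookup step. This is a minimal illustration only: the gesture names, the rule table and the intention labels here are hypothetical placeholders, not the thesis's actual gesture set or interaction-rule set.

```python
# Hypothetical interaction rules (syntax layer): a timed sequence of
# recognized gestures from the lexical layer is matched against rules
# to yield an operation intention at the semantic layer.
RULES = {
    ("point", "fist", "open"): "move_model",    # select, grasp-drag, release
    ("point", "point"): "add_model",            # double-select to place
    ("fist", "twist", "open"): "rotate_model",  # grasp, turn wrist, release
    ("point", "wave"): "remove_model",          # select, then dismiss
}

def infer_intention(gesture_sequence):
    """Map a recognized gesture-change sequence to an operation intention,
    or report that no rule matched."""
    return RULES.get(tuple(gesture_sequence), "no_operation")

print(infer_intention(["point", "fist", "open"]))  # move_model
print(infer_intention(["wave"]))                   # no_operation
```

In the thesis's model the lookup would also consult hand movement and context information (e.g. which model is currently selected); the sketch keeps only the gesture-sequence dimension.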
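The k-nearest-neighbor gesture recognition step described above can be sketched as follows. The sensor layout (a five-value finger-bend vector) and the template readings are assumptions for illustration; an actual data glove would supply more channels and calibrated values.

```python
import math
from collections import Counter

def knn_classify(sample, templates, k=3):
    """Classify a data-glove reading against labelled gesture templates.

    sample    -- tuple of normalized finger-bend values (hypothetical 5-sensor glove)
    templates -- list of (sensor_tuple, gesture_label) training pairs
    """
    # Euclidean distance from the live reading to every stored template
    dists = sorted((math.dist(sample, feat), label) for feat, label in templates)
    # Majority vote among the k nearest templates
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical gesture templates: 0.0 = finger straight, 1.0 = fully bent
templates = [
    ((0.9, 0.9, 0.9, 0.9, 0.9), "fist"),   # grasp
    ((0.8, 0.9, 0.8, 0.9, 0.8), "fist"),
    ((0.1, 0.1, 0.1, 0.1, 0.1), "open"),   # release
    ((0.0, 0.2, 0.1, 0.1, 0.2), "open"),
    ((0.1, 0.9, 0.9, 0.9, 0.1), "point"),  # select
    ((0.2, 0.8, 0.9, 0.8, 0.2), "point"),
]

print(knn_classify((0.85, 0.95, 0.9, 0.85, 0.9), templates))  # fist
```

Each newly classified gesture would then be pushed into the gesture-sequence tank, where changes between successive gestures are detected and matched against the interaction rules.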
Keywords/Search Tags:Augmented reality, Plant layout, Human-computer interaction, Semantic model, Data glove, Gesture recognition