
Research And Design Of Expert System To Capture Users' Intentions Oriented To Virtual Assembly Interface

Posted on: 2017-01-30    Degree: Master    Type: Thesis
Country: China    Candidate: L W Liang    Full Text: PDF
GTID: 2311330488968646    Subject: Computer Science and Technology
Abstract/Summary:
With the development of computer technology, communication technology, sensor technology, and other information technologies, and with users' rising expectations for product experience, natural and harmonious human-computer interaction interfaces have become a research hotspot. Traditional human-computer interfaces interact in a machine-centered way and impose heavy cognitive and operational loads on users, so they can hardly meet the needs of future human-centered computer applications; the interaction mode is thus becoming the main barrier to the development of computer applications. The ultimate goal of human-computer interaction is to make it as natural as human-human interaction, that is, machines should be able to perceive human intentions, so their sensing abilities must be enhanced.

At present, existing virtual assembly interfaces suffer from low perception ability and support only explicit interaction, so new interaction methods are needed to improve the users' interactive experience. Implicit human-computer interaction is a new type of interaction and an important research frontier in the field. In the implicit mode, users only need to attend to the interactive tasks themselves rather than to how the tasks are executed. Machines have human-like perceptual abilities: they perceive the users' explicit intentions and combine them with context information to infer the users' implicit intentions. Users carry out their explicit interaction intentions while the machines complete the implicit ones, and the two cooperate to finish the interactive tasks, which improves interaction efficiency. It is therefore of great importance to study how to integrate implicit interaction into the interactive interface of virtual assembly.

This thesis is supported by the National Natural Science Foundation of China (Grant No. 61472163) and the Key Project of the Natural Science Foundation of Shandong Province (Grant No. 2015GGX101025). A vision-based gesture interaction platform for virtual assembly serves as the research background, and reducing the users' cognitive and operational burdens is the research goal. On this basis, we study the intelligent perception of users' intentions in virtual assembly interfaces. Intelligent perception of users' intentions is a bridge for semantic communication between users and machines and an important part of implicit human-computer interaction. How to realize the intelligent perception of users' intentions and implicit interaction in the virtual assembly interface is the key point of this thesis. The innovations of this thesis are reflected in the following aspects:

(1) Interactive knowledge databases are constructed for intelligent perception. Different from existing methods of capturing operational intentions, this thesis analyzes users' operational intentions during interaction and divides them into explicit operational intentions and implicit operational intentions. We acquire knowledge of the virtual assembly interface from these two aspects and establish a scene knowledge model and a user knowledge model for knowledge representation. On this basis, an explicit interactive knowledge database and an implicit interactive knowledge database are constructed separately.
Through the reasoning mechanism of the expert system, the machine realizes intelligent perception of both explicit and implicit operational intentions (see the first sketch below).

(2) Man-machine opposite movement algorithms are proposed. These algorithms are designed for the interactive interface of virtual assembly in two scenarios. On the one hand, when the human hand is moving toward the target position, an algorithm based on the position relationship between the scene object and the virtual hand makes the scene actively move closer to the user. On the other hand, when the human hand is grasping or releasing an object, an algorithm rotates the object according to the direction relationship between the object and the virtual hand. These algorithms reflect the idea of human-centered interaction: the machine perceives the users' operational intentions and actively completes their implicit intentions, which reduces the users' operating time and moving distance (see the second sketch below).
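To make innovation (1) concrete, the following is a minimal sketch of how a rule-based expert system could combine a recognized gesture with context facts to infer explicit and implicit operational intentions. The data model, gesture labels, rules, and distance threshold are illustrative assumptions, not the knowledge bases actually built in the thesis.

```python
# Hypothetical sketch of rule-based intent perception: explicit knowledge maps
# gestures to explicit intentions; implicit knowledge maps an explicit
# intention plus context to implicit intentions the machine carries out.
from dataclasses import dataclass


@dataclass
class Context:
    """Minimal scene/user context used as facts for reasoning (illustrative)."""
    nearest_object: str        # id of the scene object closest to the virtual hand
    distance_to_object: float  # distance between the virtual hand and that object
    hand_state: str            # e.g. "moving", "grasping", "releasing"


# Explicit interactive knowledge: recognized gesture -> explicit intention.
EXPLICIT_RULES = {
    "move_toward_object": "approach_object",
    "close_hand": "grasp_object",
    "open_hand": "release_object",
}

# Implicit interactive knowledge: (explicit intention, context) -> implicit intention.
IMPLICIT_RULES = [
    # If the user is approaching a still-distant object, bring the scene closer.
    (lambda e, c: e == "approach_object" and c.distance_to_object > 0.5,
     "move_scene_toward_user"),
    # If the user is grasping or releasing, orient the object toward the hand.
    (lambda e, c: e in ("grasp_object", "release_object"),
     "rotate_object_toward_hand"),
]


def perceive(gesture: str, context: Context):
    """Infer the explicit intention, then apply the implicit rules to the context."""
    explicit = EXPLICIT_RULES.get(gesture)
    implicit = [goal for cond, goal in IMPLICIT_RULES
                if explicit and cond(explicit, context)]
    return explicit, implicit


if __name__ == "__main__":
    ctx = Context(nearest_object="bolt_03", distance_to_object=0.8, hand_state="moving")
    print(perceive("move_toward_object", ctx))
    # -> ('approach_object', ['move_scene_toward_user'])
```

In the thesis's setting, the implicit intentions inferred in this way would then be executed by the machine on the user's behalf, for example by triggering the opposite-movement behavior of innovation (2).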
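For innovation (2), the abstract does not give the algorithm's formulas, so the second sketch only illustrates the scene-translation case under simple assumptions: each frame the scene is moved a fixed fraction of the remaining hand-object gap and stops once the object is near the hand. The gain, stopping distance, and numpy representation are hypothetical, and the rotation-on-grasp case is omitted.

```python
# Hypothetical sketch of the "man-machine opposite movement" idea: the scene
# moves toward the hand so the user covers less distance. Gain and stopping
# distance are illustrative assumptions, not values from the thesis.
import numpy as np


def opposite_movement_step(hand_pos: np.ndarray,
                           target_pos: np.ndarray,
                           gain: float = 0.5,
                           stop_distance: float = 0.05) -> np.ndarray:
    """Return a per-frame translation to apply to the scene.

    The scene is carried toward the hand along the object-to-hand direction,
    covering `gain` of the remaining gap, and stops once the object lies
    within `stop_distance` of the hand.
    """
    gap = hand_pos - target_pos          # vector from the object to the hand
    distance = np.linalg.norm(gap)
    if distance <= stop_distance:
        return np.zeros(3)               # close enough: no further scene motion
    return gain * gap                    # translate the scene toward the hand


if __name__ == "__main__":
    hand = np.array([0.0, 0.0, 0.0])
    bolt = np.array([0.6, 0.0, 0.8])
    print(opposite_movement_step(hand, bolt))
    # -> [-0.3  0.  -0.4]: the object is carried halfway toward the hand this frame
```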
Keywords/Search Tags: Virtual assembly, knowledge model, explicit interactive knowledge database, implicit interactive knowledge database, expert system, intelligent perception, implicit interaction