
Research On The Measure Of Psychological Information And Its Applications To Reliability Engineering

Posted on: 2010-10-31
Degree: Doctor
Type: Dissertation
Country: China
Candidate: G B Wang
Full Text: PDF
GTID: 1102360308965880
Subject: Mechanical and electrical engineering
Abstract/Summary:
Information entropy theory traces back to the pioneering work of Hartley, Shannon and Wiener. In general, it refers to the theoretical system of information entropy proposed by Shannon and the maximum-entropy principle derived from it. Hartley pointed out that information has both physical attributes and psychological attributes. The objective of this research is to quantify the uncertainty of psychological information and to construct a measure of psychological information driven by its application demands in reliability engineering. Such a measure needs theoretical support, and practical engineering applications can be used to verify the measure developed; the theoretical and applied strands of this dissertation are therefore intertwined and complementary.

Firstly, starting from the origin of classical information theory and the generalization of the concept of information entropy, the problem of expressing the psychological attributes of information in a suitable space is identified, together with the need for measures of psychological information in reliability engineering. Secondly, the axiomatic requirements of a psychological information entropy function are studied, the psychological information space is defined, and the Wang-Huang entropy theorem and its related properties are proposed in order to establish a measure theory of multi-dimensional psychological information entropy. Then, the psychological information entropy theory is verified and extended in three main areas of reliability engineering practice: system reliability optimization by psychological information entropy, risk assessment based on the maximum psychological information entropy, and a ternary integrated method for quantifying system effectiveness. Through this theoretical and application-oriented research, the innovative achievements of the study include the following:

(1) The limitations of Shannon information entropy theory and their root causes are summarized. The meaning of the continuity of the information entropy function and of the definition of additivity is analyzed, as are the shortcomings of the classical examples used to verify the information entropy theorem. The relationship between the definitions of risk analysis and system effectiveness in reliability engineering and the "abuse" of the information entropy theorem is examined from the viewpoint of paradigm. The principal causes of the limitations of current information entropy theory are found to be the incompleteness and inapplicability of its axiomatic conditions; a boundedness condition is added to complete the axiomatic system of the psychological information entropy theory.

(2) The maximum and extreme-value conditions of psychological information entropy and of psychological information are analyzed, and the two concepts are defined and distinguished. A balanced characteristic of psychological information is defined according to the law of diminishing marginal benefit, and its regression characteristic is analyzed. The mapping from the system effectiveness function to the psychological uncertainty function is established. The relationship between psychological information entropy and psychological information is then given: psychological information entropy can be perceived as a measure of the degree of uncertainty of psychological information in a balanced-meaning paradigm.
(3) The Wang-Huang entropy theorem and the related properties of its entropy function are proposed. Through an entropy-function approximation method, the features that a maximum entropy of psychological information should have by nature are studied. Based on the assumptions of the Shannon information entropy theorem, axiomatic conditions for psychological information are developed according to its law of diminishing marginal benefit. For the improved axiomatic conditions that a psychological information entropy function should meet, a measure in the form of the Wang-Huang entropy is defined for quantifying multi-dimensional psychological information, and the uniqueness of the theorem is proved using a function approximation method. The maximum entropy and the cross-entropy derived from the Wang-Huang entropy theorem are analyzed and compared with the maximum entropy theorem and the cross-entropy method of the existing theoretical system.

(4) System reliability optimization with the psychological information entropy measure. The Kullback-Leibler cross-entropy concept and the cross-entropy method deduced from it in the current information theory system are introduced, and some typical reliability optimization problems are studied by applying the cross-entropy method (a sketch of the classical method is given after the enumerated contributions below). A definition of psychological cross-entropy based on the Wang-Huang entropy is provided and its boundary is analyzed. Using the psychological cross-entropy measure, the concept of a virtual shortest path is defined and a psychological cross-entropy method that can be applied to solve system reliability optimization problems is developed.

(5) Risk evaluation based on the maximum psychological information entropy method. The deficiencies of the risk priority number (RPN) method, which scores risk as the product of severity, occurrence and detection ratings, are identified: the combination errors inherent in forming the RPN and the non-equilibrium of the RPN function in weighting its indexes. Based on the maximum psychological information entropy, a new risk measure and risk evaluation method are provided in the form of a risk possibility number theorem, and the smallest risk possibility number is derived from it. The physical meaning of the risk possibility number is discussed, and the risk priority number and risk possibility number methods are compared.

(6) A ternary integrated method for quantifying system effectiveness. The deficiency of the current ADC method for system effectiveness quantification is analyzed, and the axiomatic requirements of a system effectiveness function are discussed. Based on the maximum entropy function, the axiomatic system of the system effectiveness function is expanded, the mapping from the system effectiveness function to the psychological uncertainty function is established, and a new method for system effectiveness quantification is developed by applying Peirce's ternary integrated logic model.
In the concluding part, the dissertation answers the spatial expression problem of the psychological attribute of information identified in the introduction, which can be summarized as follows: psychological information entropy and psychological information have different maximum values and different extreme-value conditions, and psychological information entropy provides a unifying measure by integrating three levels of meaning, namely syntax, pragmatics and semantics.
Keywords/Search Tags: information entropy theory, Wang-Huang entropy theorem, reliability engineering, maximum psychological information entropy, psychological cross-entropy method