
Entropy-based Research On Measurement Information Methods

Posted on: 2021-07-31
Degree: Doctor
Type: Dissertation
Country: China
Candidate: H Pan
Full Text: PDF
GTID: 1480306107455164
Subject: Control Science and Engineering
Abstract/Summary:
Measurement is the principal means by which mankind obtains information from the objective world. As the mathematical foundations and meaning of information theory have become better understood, information-entropy methods have produced results in many scientific fields that differ markedly from those of traditional approaches. At the beginning of this century, researchers in the measurement field at Chinese universities advanced the conclusion that "the essence of measurement is information acquisition". Guided by this conclusion, the dissertation explores and constructs a unified method that uses Shannon information entropy as the measure for describing, analyzing, and evaluating general measurement processes under random and uncertain conditions. The main contents of the dissertation are as follows.

Based on partial information, the probability distribution that satisfies the known constraints and has maximum entropy is unique and unbiased. The hypothesis that the entropy of an arbitrary source is composed of an infinite number of entropy subsets is proposed. On the basis of this principle, a maximum-entropy source estimation method with measurement significance is studied and applied to the interval design of the quantization unit. Based on sensor selectivity, and taking the convolution relationship between the actual measured source and the sensor as an example, an entropy description model of the source-sensor pair is established.

The basic theorems of measurement information theory are studied and summarized, and two new information measures and their theorems are proposed in view of the particularity of measurement. The existence and measurability of the true value of the theoretical entropy are demonstrated, which provides a theoretical basis for the expression and calculation of entropy. The entropy balance theorem is proposed and proved for a closed measurement system with a general structure, which provides a convenient method for system modeling and entropy calculation. For the problem of calculating the amount of information between any two points of a measurement system, two new information quantities are defined, the sum of mutual information and the sequential information amount, and the difference and complementary relationship between them are proved. Furthermore, an error-entropy-based decision criterion and a relative-entropy-based method for binary signal detection are studied and designed, which provide a theoretical basis and technical support for reconstruction of the measured source.

To explore the mechanism of measurement transformation in the information acquisition process, the entropy representations of the Laplace transform and the Z transform are derived and proved, realizing an information-entropy description of complex-frequency-domain transforms. On this basis, the equivalence between measurement transform processing and conditional entropy in the information acquisition process is proved, and the correspondence between the physical realization of information processing and its mathematical transform representation is revealed. Furthermore, the entropy characteristics of general series-connected and purely entropy-increasing/decreasing measurement systems are studied, which facilitates rapid entropy calculation and analysis of measurement systems.
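As a concrete illustration of the relative-entropy-based binary signal detection mentioned above, the following Python sketch decides between two hypothesized source distributions by comparing the Kullback-Leibler divergence of the empirical histogram of the observations against each hypothesis. The discrete alphabet, the two hypothesis distributions, and the minimum-divergence decision rule are illustrative assumptions, not the dissertation's actual design.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete relative entropy D(p || q) in bits."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log2(p / q)))

def detect_binary(samples, p0, p1, alphabet):
    """Decide H0 vs. H1 by comparing the relative entropy of the
    empirical distribution of `samples` to each hypothesized source
    distribution (illustrative decision rule)."""
    counts = np.array([np.sum(samples == a) for a in alphabet], dtype=float)
    empirical = counts / counts.sum()
    d0 = kl_divergence(empirical, p0)
    d1 = kl_divergence(empirical, p1)
    return (0 if d0 <= d1 else 1), d0, d1

# Example: quantized sensor output over a 4-symbol alphabet.
rng = np.random.default_rng(0)
alphabet = np.array([0, 1, 2, 3])
p0 = np.array([0.4, 0.3, 0.2, 0.1])   # hypothesized "signal absent" source
p1 = np.array([0.1, 0.2, 0.3, 0.4])   # hypothesized "signal present" source
samples = rng.choice(alphabet, size=500, p=p1)
decision, d0, d1 = detect_binary(samples, p0, p1, alphabet)
print(f"decision=H{decision}, D(emp||p0)={d0:.3f} bits, D(emp||p1)={d1:.3f} bits")
```

The empirical distribution concentrates its probability mass near whichever hypothesized source actually generated the data, so the divergence to that hypothesis is smaller; this is only a sketch of the general idea, not the error-entropy criterion developed in the dissertation.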
To verify the effectiveness of the proposed theoretical methods, entropy models of three typical information processing algorithms in measurement systems are presented: phase-sensitive detection, the hidden Markov model, and blind source separation. From the perspective of mutual information improvement, the effect of phase-sensitive detection on the performance of information acquisition is discussed. The entropy rate expression of the hidden Markov model is derived, and an entropy description of the maximum-entropy hidden Markov model is realized. An entropy representation of blind source separation is established, and the entropy change of the signal during blind source separation is revealed.

For the extended problems of information-entropy representation and evaluation in feature information extraction and selection, it is proved, based on the entropy characteristics of the measurement system, that the feature extraction process can improve the correlation between model input and output. A variable-tolerance fuzzy entropy is designed to characterize feature information and is combined with Fourier-transform multi-filter decomposition to realize high-quality feature extraction. On this basis, from the information processing perspective, the filtering effect of feature selection in the generalized measurement system is analyzed, and a feature selection method based on modified neighborhood mutual information is designed to achieve high-resolution feature selection.
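The dissertation's modified neighborhood mutual information criterion is not specified in this abstract; the Python sketch below uses plain discrete mutual information after equal-width binning as a stand-in, only to illustrate the general idea of ranking features by the information they share with the measured target. The bin count, the synthetic data, and the ranking rule are assumptions for illustration.

```python
import numpy as np

def discrete_mutual_information(x, y):
    """Mutual information I(X;Y) in bits for two discrete-valued 1-D arrays."""
    xs, x_idx = np.unique(x, return_inverse=True)
    ys, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((xs.size, ys.size))
    np.add.at(joint, (x_idx, y_idx), 1.0)       # joint histogram
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

def rank_features(X, y, n_bins=8):
    """Rank columns of X by mutual information with the target y,
    after simple equal-width binning of each feature (illustrative)."""
    scores = []
    for j in range(X.shape[1]):
        edges = np.histogram_bin_edges(X[:, j], bins=n_bins)[1:-1]
        scores.append(discrete_mutual_information(np.digitize(X[:, j], edges), y))
    return np.argsort(scores)[::-1], np.array(scores)

# Example: a synthetic feature matrix in which only column 0 carries the target.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=1000)
X = rng.normal(size=(1000, 5))
X[:, 0] += 2.0 * y                               # informative feature
order, scores = rank_features(X, y)
print("feature ranking:", order, "MI scores (bits):", np.round(scores, 3))
```

In this toy setting the informative column receives the highest score; a neighborhood-based variant would replace the hard binning with a tolerance radius around each sample, which is the direction the dissertation's modified criterion appears to take.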
Keywords/Search Tags:Measurement information theory, Entropy, Information acquisition, Measurement system modeling