With the advancement of science and technology and the development of informatization, massive amounts of information and data are collected in people's production and life, and various industries hope to better guide economic activity by analyzing this information. However, such information is often full of ambiguity, uncertainty, and conflict. How to measure the uncertainty of this information, and how to integrate different pieces of information to make reasonable decisions, have become research hotspots. Entropy is one of the most commonly used measures of uncertainty. The earliest information entropy was proposed by Shannon. Subsequently, Renyi entropy and Tsallis entropy were proposed as generalizations of Shannon entropy. Recently, Lad et al. proposed extropy as the complementary dual of Shannon entropy, which measures uncertainty from the perspective of negative events. Tsallis eXtropy was then proposed as the dual entropy of Tsallis entropy. This thesis studies Renyi eXtropy and Gini eXtropy on the basis of extropy. In addition, the negation operation proposed by Yager is similar in spirit to extropy: both consider a problem from the opposite side of the event. The negation operation on a probability distribution provides a way to represent information from a new perspective, so this thesis also studies the connection between negation and extropy. Because evidence theory can express more kinds of uncertainty, this thesis further extends extropy from probability theory to Dempster-Shafer evidence theory and proposes an information volume of the Basic Probability Assignment (BPA) based on the maximum extropy. Finally, this thesis proposes Renyi eXtropy and Gini eXtropy, together with two information fusion algorithms based on Gini eXtropy. The main research content of this thesis revolves around extropy and includes two new information fusion methods. The specific contributions are as follows:

(1) Research on extropy and negation. Firstly, the negation operation on the probability
distribution and the corresponding change of extropy are studied. Experiments show that the probability distribution becomes more evenly spread under iterated negation, and extropy increases synchronously; as the negation operation is iterated, the probability distribution eventually converges to the uniform distribution, at which point extropy attains its maximum value. Secondly, an extension of negation is proposed. The original negation operation takes 1 − p_i as the negation of p_i and then normalizes to obtain the negated probability distribution. This thesis extends the constant 1 to a parameter τ, constrained by max(p_i) < τ ≤ 1 + min(p_i) to ensure that the result is still a probability distribution. The experimental results show that, under the condition (max(p_i) + min(p_i))/2 < τ ≤ 1 + min(p_i), as the value of τ increases, the speed at which a given probability distribution converges to the uniform distribution increases, and the corresponding entropy also increases; that is, the negation operation increases the entropy value. The uniform distribution, by contrast, is unaffected by the negation operation and always remains the same. Finally, based on the maximum extropy, this thesis puts forward the information volume of a BPA in evidence theory: the belief value of each multi-element focal element is split equally over its subsets, the extropy of the resulting single-element focal elements is accumulated, and the value to which this process converges is the information volume of the BPA. Experiments show that the information volume obtained under complete uncertainty equals that obtained when the belief values are uniformly distributed over the power set of the frame of discernment, which is consistent with intuition.

(2) Propose Renyi eXtropy, Gini eXtropy, and two information fusion methods. Based on extropy and Renyi entropy, Renyi eXtropy, maximum Renyi eXtropy, and conditional Renyi eXtropy are proposed. It is proved that when the parameter of Renyi eXtropy
tends to 1, Renyi eXtropy degenerates into extropy. When the probability distribution is uniform, Renyi eXtropy takes its maximum value, and the maximum Renyi eXtropy is equal to the maximum extropy. Based on extropy and Gini entropy, this thesis proposes Gini eXtropy and maximum Gini eXtropy, verifies the complementary relationship between Gini eXtropy and Gini entropy, and analyzes the relationships between Gini eXtropy and Shannon entropy, extropy, Tsallis entropy, Tsallis eXtropy, Renyi entropy, and Renyi eXtropy. Because Gini eXtropy is simple to compute, the information fusion algorithms proposed in this thesis use it to calculate the entropy weight. Considering both the conflict between pieces of evidence and the reliability of the evidence itself, the first fusion method combines the credibility weight of the evidence with the entropy weight to obtain a comprehensive weight, and then performs N−1 Dempster-Shafer combinations. In the target detection experiment, the final result achieves higher recognition accuracy than other fusion methods. The second fusion method is evaluated on two public datasets. Owing to the characteristics of the iris dataset and the wheat seed dataset, this thesis uses the method of generating interval numbers proposed by Kang et al. when processing the data. This fusion method combines interval similarity and the entropy weight to obtain the average evidence; like the first method, it considers both distance and the information volume contained in each piece of evidence. The second fusion method is compared with Kang et al.'s method, Buono and Longobardi's method, and the KNN algorithm, and the results show that the proposed method achieves a better overall recognition rate.
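The iterated-negation experiment of contribution (1) can be sketched in a few lines of Python. This is a minimal illustration, not the thesis code: it assumes Yager's negation, which maps each p_i to its normalized complement (1 − p_i)/(n − 1), and the extropy J(p) = −Σ_i (1 − p_i) ln(1 − p_i) of Lad et al.; the starting distribution and iteration count are arbitrary choices.

```python
import math

def yager_negation(p):
    """Yager's negation: replace each p_i by its normalized complement."""
    n = len(p)
    return [(1 - pi) / (n - 1) for pi in p]

def extropy(p):
    """Extropy J(p) = -sum_i (1 - p_i) * ln(1 - p_i) (Lad et al.)."""
    return -sum((1 - pi) * math.log(1 - pi) for pi in p if pi < 1)

p = [0.6, 0.2, 0.15, 0.05]       # arbitrary starting distribution
history = [extropy(p)]
for _ in range(10):              # iterate the negation operation
    p = yager_negation(p)
    history.append(extropy(p))

n = len(p)
# Extropy of the uniform distribution, i.e. the maximum extropy
max_extropy = (n - 1) * math.log(n / (n - 1))
print(history[0], history[-1], max_extropy)
```

Running the sketch shows the extropy rising monotonically from its initial value toward (n − 1) ln(n/(n − 1)) ≈ 0.863 for n = 4, while the distribution converges to the uniform one, matching the behavior reported above.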