
Research On Exact Inference Algorithm In Bayesian Networks

Posted on: 2007-07-08
Degree: Master
Type: Thesis
Country: China
Candidate: H X Yan
Full Text: PDF
GTID: 2178360182996270
Subject: Software engineering
Abstract/Summary:
Uncertain information processing is an important research area in artificial intelligence. From the expert-system viewpoint, processing approaches fall into two categories: rule-based and model-based. Bayesian networks, a model-based approach, were developed in the 1980s and have received increasing attention since the 1990s. Compared with the early rule-based approaches, they have clearer semantics and usually draw more reasonable conclusions, but require far more computation.

Inference in general Bayesian networks is NP-hard. Tens of inference algorithms have been developed to make Bayesian networks as practical as possible, and all of them fall into two categories: exact and approximate. Three early algorithms, Belief-Net-Ask, message passing, and variable elimination, lay the foundation for the other inference algorithms. Belief-Net-Ask computes the posterior distribution of a single node in a tree-structured Bayesian network; message passing computes posterior distributions for multiple nodes in a tree-structured network; variable elimination computes the posterior distribution of a single node in a multiply connected network. The three algorithms are representative, each with its own characteristics. After introducing the basic concepts of Bayesian networks, this paper presents mathematical descriptions, comparisons, and characterizations of all three.

The Junction Tree Algorithm (JTA) is one of the most widely used exact algorithms, built on a deep analysis of the relationship between graph theory and probabilistic knowledge. JTA differs from other algorithms in that it does not analyze the original graph; instead it constructs an auxiliary data structure, the junction tree, defines a message-passing procedure on that tree, and then proceeds to belief computation. The structure supports computing the conditional probability of any single node given a set of observations. The main idea of JTA is to decompose the whole computation on the joint probability into a set of related local computations; the junction tree is the data structure that makes explicit the relationship between graph-theoretic locality and probabilistic inference.

JTA consists of five steps: first, graphical transformation, that is, converting the Bayesian network into a moral graph, triangulating it, and finally constructing the junction tree; second, initialization of the clique potentials; third, message passing; fourth, belief computation; and fifth, entering evidence. The execution process can be optimized at two points. First, the transformation is not unique, so one Bayesian network can yield different junction trees whose maximum cliques have different sizes; since the size of the maximum clique is the key factor in the algorithm's time and space complexity, searching for the junction tree with the smallest maximum clique reduces both. Second, after the conversion, global consistency of the junction tree must be restored by message passing every time new evidence is entered.
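To make the variable-elimination algorithm described above concrete, the following is a minimal sketch on a toy three-node chain A → B → C. The network, its CPTs, and the dict-based factor representation are illustrative assumptions, not anything taken from the thesis.

```python
# Minimal variable-elimination sketch on a toy chain A -> B -> C.
# Factors are (variable list, {assignment tuple: value}) pairs over
# binary variables; the CPT numbers below are made up for illustration.
from itertools import product

def multiply(f, g):
    """Pointwise product of two factors."""
    fv, ft = f
    gv, gt = g
    vars_ = fv + [v for v in gv if v not in fv]
    table = {}
    for assign in product([0, 1], repeat=len(vars_)):
        a = dict(zip(vars_, assign))
        table[assign] = (ft[tuple(a[v] for v in fv)] *
                         gt[tuple(a[v] for v in gv)])
    return vars_, table

def sum_out(f, var):
    """Marginalize var out of factor f."""
    fv, ft = f
    idx = fv.index(var)
    nv = fv[:idx] + fv[idx + 1:]
    table = {}
    for assign, val in ft.items():
        key = assign[:idx] + assign[idx + 1:]
        table[key] = table.get(key, 0.0) + val
    return nv, table

# Toy CPTs: P(A), P(B|A), P(C|B).
pA = (['A'], {(0,): 0.6, (1,): 0.4})
pBA = (['B', 'A'], {(0, 0): 0.7, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.8})
pCB = (['C', 'B'], {(0, 0): 0.9, (0, 1): 0.4, (1, 0): 0.1, (1, 1): 0.6})

# Query P(C): eliminate A, then B.
f = sum_out(multiply(pA, pBA), 'A')   # factor over B
f = sum_out(multiply(f, pCB), 'B')    # marginal over C
print(f)  # P(C=0) = 0.65, P(C=1) = 0.35 for these toy CPTs
```

Each elimination touches only the factors that mention the variable being removed, which is what keeps the algorithm local rather than materializing the full joint distribution.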
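The first of JTA's five steps, graphical transformation, begins with moralization. The sketch below shows that sub-step only, assuming a dict-of-parents encoding of the DAG; the node names are borrowed from the well-known Asia network mentioned later in this abstract.

```python
# Moralization sketch: marry the parents of every node, then drop the
# edge directions. The dict-of-parents input encoding is an assumption.
from itertools import combinations

def moralize(parents):
    """parents: {node: [its parents]} for a DAG. Returns the moral
    graph as an undirected adjacency map {node: set(neighbours)}."""
    nodes = set(parents) | {p for ps in parents.values() for p in ps}
    und = {n: set() for n in nodes}
    for child, ps in parents.items():
        for p in ps:                        # keep edges, drop direction
            und[child].add(p)
            und[p].add(child)
        for u, v in combinations(ps, 2):    # marry co-parents
            und[u].add(v)
            und[v].add(u)
    return und

# Fragment of Asia: dyspnoea depends on bronchitis and "either".
print(moralize({'dysp': ['bronc', 'either'], 'either': ['tb', 'lung']}))
# 'bronc'/'either' become adjacent, as do 'tb'/'lung'.
```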
It is easy to see that the hinge of optimizing JTA is an efficient message-passing scheme; this paper improves JTA by changing the message-passing process.

Probabilistic inference algorithms for Bayesian networks come in two sorts. The first is direct, query-based computation, with variable elimination as an instance. Here the inference task is to compute the distribution P(T|e) of a target variable T given evidence e: for each query Q, the variables irrelevant to Q are removed from the network, which involves two further operations, d-separation analysis and barren-variable removal, and this analysis must be repeated every time a query arrives. The second sort is indirect computation, which passes messages over a secondary graph structure converted from the original network, with the junction tree as an instance. The junction tree is built from the original Bayesian network once, offline, and all subsequent inference runs on it. The structure must therefore be large enough to propagate every possible dependence of the original network under any set of evidence; as a consequence, such algorithms perform only local computation, with no global analysis of the graph.

This paper introduces two methods that combine the two sorts of computation under a single propagation scheme: lazy propagation of messages. Under this scheme there are two algorithms, VE-LP and AR-LP, which are lazy-propagation improvements based on variable elimination (VE) and arc reversal (AR), respectively, together with mathematical formulations that capture the essence of each method. The two improved algorithms are tested experimentally on the Asia and Alarm networks and compared with the original JTA. The results show that both reduce the space complexity of the junction tree algorithm to some extent and improve actual computational efficiency by exploiting the conditional independences that arise during inference.
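As a rough illustration of what "lazy" message passing means, the sketch below keeps a clique's message as a list of factors instead of one multiplied-out potential, and eliminates each non-separator variable by combining only the factors that mention it (reusing the multiply, sum_out, pA, pBA, and pCB definitions from the earlier sketch). This is a simplified reading of the general idea under assumed data structures, not the thesis's VE-LP or AR-LP algorithms.

```python
# Lazy-propagation sketch: messages stay factorized, and variable
# elimination runs only over the factors that mention each variable.

def eliminate(factors, var):
    """Sum var out of a factor list, touching only factors over var."""
    touching = [f for f in factors if var in f[0]]
    rest = [f for f in factors if var not in f[0]]
    if not touching:          # variable absent: nothing to combine
        return rest
    prod = touching[0]
    for f in touching[1:]:
        prod = multiply(prod, f)
    return rest + [sum_out(prod, var)]

def lazy_message(factors, separator):
    """Message over a separator: eliminate every non-separator
    variable, keeping the result as a list of factors."""
    out = list(factors)
    for var in {v for f in out for v in f[0]} - set(separator):
        out = eliminate(out, var)
    return out

# Example: a clique holding {P(A), P(B|A), P(C|B)} sends a message
# over the separator {C}.
print(lazy_message([pA, pBA, pCB], ['C']))
```

Because each elimination multiplies only the factors that actually involve the variable, independences encoded in the factorization are never destroyed prematurely, which is the source of the space savings the experiments report.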
Keywords/Search Tags: Inference