
A sampling based approach to Bayesian inference in hidden Markov models

Posted on: 2008-02-10
Degree: M.Sc.
Type: Thesis
University: University of Manitoba (Canada)
Candidate: Hong, Say Pham
GTID: 2448390005451872
Subject: Statistics
Abstract/Summary:
Hidden Markov models (HMMs) have been studied extensively in both the frequentist and Bayesian literature. Typically, the expectation-maximization (EM) algorithm is used for likelihood inference, whereas Markov chain Monte Carlo (MCMC) has been applied in the Bayesian setting when the number of hidden states is known. When the number of hidden states is unknown, however, statistical computation and analysis of HMMs become extremely difficult because of the complexity of the model. From the frequentist perspective, penalized likelihood and penalized minimum methods have been used to estimate the number of hidden states, while the Bayesian approach typically relies on reversible jump MCMC to infer the number of hidden states.

The contribution of this thesis is to propose an alternative to reversible jump MCMC for Bayesian inference in HMMs. Our methodology is based on a sampling procedure developed by Fu & Wang (2002), which rests on the discretization of density functions with respect to the Lebesgue measure. A key feature of this method is its mathematical simplicity, which makes it easy to implement relative to most other sampling procedures, including reversible jump MCMC. In Chapter 5, several examples show how this technique can be used to estimate both the parameters and the number of hidden states of an HMM.
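To make the discretization idea concrete, below is a minimal sketch in Python of drawing approximate samples from a generic density by discretizing it on a regular grid, the basic mechanism the abstract attributes to Fu & Wang (2002). The function name, parameters, and target density are illustrative assumptions for this sketch, not taken from the thesis.

    import numpy as np

    def discretized_sampler(density, lo, hi, n_bins, n_samples, rng=None):
        # Illustrative only: approximate sampling from an unnormalized density
        # by discretizing it on a regular grid over [lo, hi], i.e. a simple
        # discretization with respect to the Lebesgue measure.
        rng = np.random.default_rng() if rng is None else rng
        edges = np.linspace(lo, hi, n_bins + 1)
        mids = 0.5 * (edges[:-1] + edges[1:])    # bin midpoints
        weights = density(mids)                  # density evaluated at midpoints
        probs = weights / weights.sum()          # normalized bin probabilities
        idx = rng.choice(n_bins, size=n_samples, p=probs)
        width = (hi - lo) / n_bins
        # Jitter uniformly within each chosen bin so the draws are continuous
        return mids[idx] + rng.uniform(-0.5, 0.5, size=n_samples) * width

    # Example: approximate draws from a standard normal restricted to [-5, 5]
    samples = discretized_sampler(lambda x: np.exp(-0.5 * x**2), -5.0, 5.0, 2000, 10000)
    print(samples.mean(), samples.std())

The appeal suggested by the abstract is that such grid-based sampling requires no transition-kernel design or acceptance tuning, which is where its implementation simplicity relative to reversible jump MCMC comes from.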