
The Study Of Training Problem And Decoding Problem About HMM

Posted on: 2005-11-10
Degree: Master
Type: Thesis
Country: China
Candidate: G J Han
Full Text: PDF
GTID: 2120360155971919
Subject: Probability theory and mathematical statistics
Abstract/Summary:
The hidden Markov model (HMM) is a kind of statistical signal model. The basic theory of the model was put forward by L. E. Baum in the 1960s, and it has since been applied in many fields such as speech recognition, gene analysis and gene recognition, character recognition, image processing, target tracking, and signal processing. Three problems must be solved for an HMM: training, recognition, and decoding; their solutions make up the theory of HMM. Parameter estimation (training) is the core problem of the training process. The work of this thesis can be summarized as follows.

Innovations of this thesis:

1. Traditional HMM training methods easily become trapped in local optima, and since parameter training is a constrained optimization problem, general optimization methods are difficult to apply directly. The genetic algorithm (GA), by contrast, searches for a global optimum comparatively well and handles constrained optimization naturally. This thesis therefore uses a GA, combined with a comparison mechanism, to train the parameters; simulation results demonstrate the reliability of the method (a code sketch of this idea follows the abstract).

2. HMMs with continuous time and a continuous observation space are rarely considered because they are difficult to handle. This thesis studies such models and derives a MAP estimate of the parameters.

3. Of the two recognition methods considered, the traditional Viterbi algorithm and a recognition algorithm for hidden Markov models based on filtering and smoothing, the latter performs better than the former. We therefore extend the latter method to inhomogeneous (nonstationary) HMMs in order to broaden its range of application, and we also analyze the error of the algorithm (a sketch of the traditional Viterbi baseline also follows the abstract).

4. The fifth section of the third chapter summarizes the theoretical foundations of applying HMMs to speech noise reduction. Some signals are transmitted together with strong noise; that is, besides the system error, the received signals also contain additive noise. To estimate the signal accurately, a preliminary processing step must be carried out first so that the processed signal is relatively close to the true signal, and the recognition quality can thereby be improved.
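As a minimal sketch of the GA-based training idea in innovation 1 (not the thesis's exact method: the comparison mechanism, operator choices, population size, and data below are assumptions for illustration), candidate (pi, A, B) parameter sets for a discrete HMM are scored by the forward-algorithm log-likelihood of a training sequence, and every variation step renormalizes rows so the stochastic constraints stay satisfied.

```python
# Illustrative GA training of a discrete HMM (assumed sizes and settings).
import numpy as np

rng = np.random.default_rng(0)
N, M, T = 3, 4, 50     # hidden states, observation symbols, sequence length (assumed)

def random_hmm():
    """Random stochastic parameters (pi, A, B)."""
    pi = rng.random(N);       pi /= pi.sum()
    A  = rng.random((N, N));  A  /= A.sum(axis=1, keepdims=True)
    B  = rng.random((N, M));  B  /= B.sum(axis=1, keepdims=True)
    return pi, A, B

def log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | pi, A, B)."""
    alpha = pi * B[:, obs[0]]
    log_p = 0.0
    for t in range(1, len(obs)):
        c = alpha.sum()
        log_p += np.log(c)
        alpha = (alpha / c) @ A * B[:, obs[t]]
    return log_p + np.log(alpha.sum())

def mutate(params, sigma=0.05):
    """Perturb each parameter array, then renormalize rows to stay stochastic."""
    out = []
    for p in params:
        q = np.abs(p + sigma * rng.standard_normal(p.shape)) + 1e-12
        out.append(q / q.sum(axis=-1, keepdims=True))
    return tuple(out)

def crossover(a, b):
    """Convex combination of two parameter sets (rows remain stochastic)."""
    w = rng.random()
    return tuple(w * x + (1 - w) * y for x, y in zip(a, b))

obs = rng.integers(0, M, size=T)              # synthetic observations (demo only)

pop = [random_hmm() for _ in range(40)]       # initial population
for gen in range(100):
    scores = [log_likelihood(obs, *ind) for ind in pop]
    order = np.argsort(scores)[::-1]
    elite = [pop[i] for i in order[:10]]      # keep the 10 fittest (elitism)
    children = []
    while len(children) < len(pop) - len(elite):
        i, j = rng.choice(len(elite), size=2, replace=False)
        children.append(mutate(crossover(elite[i], elite[j])))
    pop = elite + children

best = max(pop, key=lambda ind: log_likelihood(obs, *ind))
print("best log-likelihood found:", log_likelihood(obs, *best))
```

In practice the fitness would be computed over many training sequences, and the elitism step keeps the best candidates so the search never discards its current best solution, which is one way the local-optimum problem of purely iterative training is mitigated.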
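For comparison, the following is a minimal sketch of the traditional Viterbi decoder referred to in innovation 3; the filtering-and-smoothing-based recognizer and its inhomogeneous extension are not reproduced here. It works in log space for numerical stability, and the toy parameters are assumptions.

```python
# Traditional Viterbi decoding for a discrete HMM (log-space, toy parameters).
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden-state path and its log-probability."""
    N, T = len(pi), len(obs)
    log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)
    delta = np.zeros((T, N))            # best log-prob of any path ending in each state
    psi = np.zeros((T, N), dtype=int)   # back-pointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A       # rows: previous state, cols: current state
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):       # backtrack through the pointers
        path[t] = psi[t + 1][path[t + 1]]
    return path, delta[-1].max()

# Toy example with assumed parameters.
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
obs = [0, 1, 2, 2, 1]
path, score = viterbi(obs, pi, A, B)
print("decoded states:", path, "log-prob:", score)
```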
Keywords/Search Tags: HMM, Genetic Algorithm, Diffusion Process, Viterbi algorithm, Filter Extended EM Algorithm, Jensen Inequality