
Bayesian Analysis And Prediction For Mixed Poisson Autoregressive Models

Posted on: 2011-11-30    Degree: Master    Type: Thesis
Country: China    Candidate: Y Wang    Full Text: PDF
GTID: 2120360305955441    Subject: Probability theory and mathematical statistics
Abstract/Summary:
The Poisson distribution is mainly used to describe the statistical law of the number of specified events occurring in a unit of time or within a specified region; for example, the number of calls received by a switchboard per unit time and the number of particles emitted by a radioactive material can both be studied with the Poisson distribution. However, as samples become more complex and diverse, a single Poisson distribution can no longer accurately describe the observed data, so researchers have devoted themselves to studying mixed Poisson distributions. Existing research covers parameter estimation for mixed distributions, multivariate mixed Poisson distributions, mixed Poisson regression models, and so on. To the author's knowledge, however, the literature on mixed Poisson autoregressive models is still relatively sparse. The author therefore considers it necessary to study mixed Poisson autoregressive models, built on mixed Poisson distributions and the mixture autoregressive model, and to carry out Bayesian analysis and prediction for them.

This paper first introduces the development of mixed Poisson distributions, mixture autoregressive models, and the theory of Bayesian analysis and prediction. Secondly, it establishes mixed Poisson autoregressive models based on mixed Poisson distributions and the mixture autoregressive model, and carries out Bayesian analysis and prediction. Thirdly, a simulation study examines whether the estimated values are close to the theoretical values.

The first part of the second chapter establishes the mixed Poisson autoregressive model. Definition 2.1: let (k_1, …, k_n) be the time series observations of (x_1, …, x_n). The mixed Poisson autoregressive model with an unknown mixing distribution G is

    f(x_t | X_{t-1} = K_{t-1}, G) = ∫_U poisson(x_t | Φ'_p K_{p,t-1}) G(dΦ_p, dp).

In the paper, u denotes (Φ_p, p). The unknown distribution G is the joint distribution of (Φ_p, p), the autoregressive parameters and the autoregressive order, and poisson(x_t | Φ'_p K_{p,t-1}) is the Poisson distribution with mean Φ'_p K_{p,t-1}. For notational convenience, we also write poisson(x_t | X_{t-1} = K_{t-1}, u_t) for poisson(x_t | Φ'_p K_{p,t-1}). The hierarchical form of the mixed Poisson autoregressive model is

    x_t | X_{t-1} = K_{t-1}, u_t ~ poisson(x_t | X_{t-1} = K_{t-1}, u_t),    u_t | G ~ G (i.i.d.),    G ~ D(α_0 G_0),

where D(α_0 G_0) denotes the Dirichlet process prior with base measure α_0 G_0.

The second part of the second chapter calculates the posterior distribution. Write X_n = (x_1, x_2, …, x_n) and K_n = (k_1, k_2, …, k_n). From the hierarchical structure above, the conditional likelihood of the mixed Poisson autoregressive model given G and U_n = (u_1, u_2, …, u_n) is

    p(X_n = K_n | U_n, G) = p(X_n = K_n | U_n) = ∏_{t=1}^{n} poisson(x_t = k_t | X_{t-1} = K_{t-1}, u_t),

and the prior distribution of (U_n, G) is

    p(dU_n, dG) = ∏_{t=1}^{n} G(du_t) · D(dG | α_0 G_0),

so we obtain the joint posterior distribution

    h(dU_n, dG | X_n = K_n) ∝ D(dG | α_0 G_0 + ∑_{t=1}^{n} δ_{u_t}) ∏_{t=1}^{n} poisson(x_t = k_t | X_{t-1} = K_{t-1}, u_t) G_{t-1}(du_t),

where G_{t-1}(du_t) ∝ α_0 G_0(du_t) + ∑_{j=1}^{t-1} δ_{u_j}(du_t) is the Pólya-urn predictive measure. Integrating the joint posterior distribution with respect to G gives the marginal posterior distribution of U_n,

    h(dU_n | X_n = K_n) ∝ ∏_{t=1}^{n} poisson(x_t = k_t | X_{t-1} = K_{t-1}, u_t) G_{t-1}(du_t).

The second part of the second chapter also calculates the posterior mean. Let s(U_n, G) be a positive integrable function; the posterior mean of interest is E[s(U_n, G) | X_n = K_n]. It is difficult to obtain numerical estimates of E[s(U_n, G) | X_n = K_n] directly; a crucial step is to simulate from h(dU_n | X_n = K_n), which can be expressed as a finite sum over partitions; see Lo et al. (1996). Let p = {C_1, C_2, …, C_{n(p)}} denote a partition of the integers {1, 2, …, n} with n(p) cells, and let e_j be the cardinality of cell C_j for j = 1, 2, …, n(p); here each C_j is a subgroup of the partition. A natural numerical scheme for estimating the posterior mean of s(U_n, G) is Monte Carlo averaging, provided t(p) is analytically accessible: E[s(U_n, G) | X_n = K_n] can be approximated by (1/N) ∑_{i=1}^{N} t(p^{(i)}), where p^{(i)} is simulated from the partition distribution w(p). This paper adopts the Gibbs scheme of the weighted Chinese restaurant (gWCR) procedure.
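To make the model definition and its Dirichlet-process mixing more concrete, the following is a minimal Python simulation sketch, not code from the thesis. It assumes an order-1 model with Poisson mean u_t · k_{t-1}, a Uniform(0.1, 0.9) base measure G_0, and concentration α_0 = 1; these choices, the guard against the absorbing state at zero, and all function names are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the thesis's code): simulate a mixed Poisson
# autoregressive series of order 1 whose coefficient u_t is drawn from a
# Dirichlet-process mixing distribution G via the Polya urn.
import numpy as np

rng = np.random.default_rng(0)

def polya_urn_draws(n, alpha0, g0_sample):
    """Draw u_1,...,u_n from DP(alpha0 * G0) via the Polya urn: a fresh draw from
    G0 with probability alpha0/(alpha0 + t - 1), otherwise a copy of an earlier u_j."""
    us = []
    for t in range(n):
        if rng.uniform() < alpha0 / (alpha0 + t):
            us.append(g0_sample())          # new value from the base measure G0
        else:
            us.append(us[rng.integers(t)])  # reuse an earlier value (clustering)
    return np.array(us)

def simulate_mixed_poisson_ar1(n, alpha0=1.0, k0=5):
    """Generate k_1,...,k_n with x_t | x_{t-1} = k_{t-1}, u_t ~ Poisson(u_t * k_{t-1})."""
    us = polya_urn_draws(n, alpha0, g0_sample=lambda: rng.uniform(0.1, 0.9))
    ks = np.empty(n, dtype=int)
    prev = k0
    for t in range(n):
        mean_t = us[t] * max(prev, 1)       # max(.,1) keeps this toy chain from sticking at 0
        ks[t] = rng.poisson(mean_t)
        prev = ks[t]
    return ks, us

ks, us = simulate_mixed_poisson_ar1(200)
print(ks[:10], np.unique(np.round(us, 3)))  # few distinct u values => mixture clusters
```

Because the urn reuses earlier values, the simulated u_t take only a few distinct values, which is the clustering behaviour the partition-based gWCR sampler exploits.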
The first part of the third chapter generates in-sample predictions. For t ≤ n, the in-sample prediction value is

    E[ ∫_U poisson(x̃_t | X_{t-1} = K_{t-1}, u_t) G(du_t) | X_n = K_n ].

Given a sample p^{(M+1)}, …, p^{(M+N)} from w(p) obtained with the gWCR procedure, 100(1−α)% in-sample prediction intervals can be worked out by solving for the endpoints L^{(i)} and U^{(i)} of each sampled partition, and estimates of the prediction intervals are then formed as

    ( (1/N) ∑_{i=M+1}^{M+N} L^{(i)}, (1/N) ∑_{i=M+1}^{M+N} U^{(i)} ).

The second part of the third chapter generates out-of-sample predictions. Inference 1: let (k_1, …, k_n) be the time series observations of (x_1, …, x_n). The conditional distribution of x_t given X_{t-1} and G is

    f(x_t = k_t | X_{t-1} = K_{t-1}, G) = ∫_U poisson(x_t = k_t | X_{t-1} = K_{t-1}, u_t) G(du_t) = ∫_U poisson(x_t | Φ'_p K_{p,t-1}) G(dΦ_p, dp).

Let g be an arbitrary positive or quasi-integrable function. The expected value of g(x_{n+1}, …, x_{n+h}) given X_n = K_n can then be calculated via suitable recursions; let g* denote this particular g. The desired E(x_{n+h} | X_n = K_n) is obtained by the following sequential method: (1) use the gWCR procedure to obtain a Markov chain Monte Carlo (MCMC) sample of partitions.

The fourth chapter carries out a stochastic simulation study for mixed Poisson autoregressive models of order 1.
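As a rough illustration of how the averaged interval endpoints ((1/N) ∑ L^{(i)}, (1/N) ∑ U^{(i)}) could be computed, the sketch below forms a central Poisson quantile interval for each posterior draw and averages the endpoints. The posterior draws of u are placeholders standing in for gWCR output, and the per-draw quantile construction is an assumption, not the thesis's exact rule.

```python
# Minimal sketch (assumptions, not the thesis's gWCR code): Monte Carlo averaging
# of per-draw prediction-interval endpoints for a one-step-ahead Poisson predictive.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)

def prediction_interval(k_prev, posterior_u_draws, alpha=0.05):
    """For each posterior draw u^(i), take the central 100(1-alpha)% interval
    [L^(i), U^(i)] of Poisson(u^(i) * k_prev), then average the endpoints."""
    lo, hi = [], []
    for u in posterior_u_draws:
        mean = u * k_prev
        lo.append(poisson.ppf(alpha / 2, mean))
        hi.append(poisson.ppf(1 - alpha / 2, mean))
    return np.mean(lo), np.mean(hi)

# Illustrative posterior draws of the AR(1) coefficient (placeholder for gWCR output)
u_draws = rng.normal(loc=0.6, scale=0.05, size=1000).clip(0.01, 0.99)
L_bar, U_bar = prediction_interval(k_prev=8, posterior_u_draws=u_draws)
print(f"approx. 95% in-sample prediction interval: ({L_bar:.1f}, {U_bar:.1f})")
```

Averaging the endpoints across draws mirrors the abstract's estimator; with genuine gWCR partition samples, the placeholder u_draws would be replaced by the values implied by each sampled partition.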
Keywords/Search Tags: Mixed Poisson distribution, Mixture of autoregressive model, Bayesian analysis, Dirichlet process