Time series are families of random variables indexed by time and are widely used in practice. According to whether the observations are integer-valued or not, time series models can be divided into two categories: integer-valued models and noninteger-valued models. Conditional heteroscedasticity is a common phenomenon in both settings, so conditional heteroscedastic time series models can likewise be divided into noninteger-valued and integer-valued conditional heteroscedastic models. In this paper, based on the empirical likelihood method, we investigate parameter estimation and hypothesis testing problems for the autoregressive conditional heteroscedastic model and the threshold autoregressive conditional heteroscedastic model. Our main results are as follows.

Firstly, we employ the empirical likelihood method to estimate the unknown parameters of a Poisson autoregressive model in the presence of auxiliary information. Specifically, consider the following Poisson autoregressive model

$X_t \mid \mathcal{F}_{t-1} \sim \mathrm{Poisson}(\lambda_t), \qquad \lambda_t = \alpha_0 + \sum_{i=1}^{p}\alpha_i X_{t-i},$  (1)

where the $\sigma$-field $\mathcal{F}_{t-1} = \sigma(X_{t-1}, X_{t-2}, \ldots)$, $\alpha_0 > 0$, $\alpha_i \ge 0$ ($i = 1, 2, \ldots, p$), and $\alpha = (\alpha_0, \alpha_1, \ldots, \alpha_p)^{\tau}$ is the unknown parameter vector. We assume that we have some auxiliary information that can be represented as the conditional moment restrictions $E(g(X_t, \ldots, X_{t-p}; \theta_0) \mid X^{(t-1)}) = 0$, $t = 0, 1, 2, \ldots$, where the unknown parameter vector $\theta_0 \in \mathbb{R}^d$, $X^{(t-1)} = (X_{t-1}, \ldots, X_{t-p})$, and $g(x; \theta) \in \mathbb{R}^r$ is some function with $r \ge d$. By using this auxiliary information, we can obtain data-adaptive weights for a weighted least squares estimate based on the empirical likelihood method. Before stating our main results, we make the following assumptions:

Assumption 1.1 The parameter space $T$ is compact with $T = \{\alpha : \delta \le \alpha_0 \le M,\ 0 < \alpha_1 + \cdots + \alpha_p \le M^* < 1,\ \alpha_i \ge 0,\ i = 1, 2, \ldots, p\}$, where $\delta$ and $M$ are finite positive constants, and the true parameter value $\alpha^0$ is an interior point
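The Poisson autoregressive dynamics above are straightforward to simulate. The sketch below (not part of the paper; the parameter values are hypothetical, chosen only to satisfy Assumption 1.1) draws $X_t$ from a Poisson distribution with intensity $\lambda_t = \alpha_0 + \sum_{i=1}^{p}\alpha_i X_{t-i}$:

```python
# Minimal simulation sketch of a Poisson autoregressive model:
# X_t | F_{t-1} ~ Poisson(lambda_t), lambda_t = alpha_0 + sum_i alpha_i X_{t-i}.
# Parameter values are hypothetical illustrations.
import numpy as np

def simulate_poisson_ar(alpha, n, seed=None):
    """alpha = (alpha_0, alpha_1, ..., alpha_p); returns n observations."""
    rng = np.random.default_rng(seed)
    alpha = np.asarray(alpha, dtype=float)
    p = len(alpha) - 1
    x = np.zeros(n + p, dtype=np.int64)           # zero initial values
    for t in range(p, n + p):
        lam = alpha[0] + alpha[1:] @ x[t - p:t][::-1]  # alpha_i pairs with X_{t-i}
        x[t] = rng.poisson(lam)
    return x[p:]

x = simulate_poisson_ar([1.0, 0.3, 0.2], n=500, seed=0)
```

Since $\alpha_1 + \alpha_2 = 0.5 < 1$ here, the simulated series is stationary with marginal mean $\alpha_0/(1 - \alpha_1 - \alpha_2) = 2$.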
in $T$.

Assumption 1.2 There exists $\theta_0$ such that $E(g_t(\theta_0)) = 0$; the matrix $\Sigma(\theta) = E(g_t(\theta)g_t^{\tau}(\theta))$ is positive definite at $\theta_0$; $\partial g(x;\theta)/\partial\theta$ is continuous in a neighborhood of the true value $\theta_0$; $\|\partial g(x;\theta)/\partial\theta\|$ and $\|g(x;\theta)\|^3$ are bounded by some integrable function $W(x)$ in this neighborhood; and the rank of $E(\partial g_t(\theta)/\partial\theta)$ is $d$.

Note that the conditional moment restrictions imply that $E(g_t(\theta_0)) = 0$. Combining this with the empirical likelihood method, let

$L(\theta) = \max\Big\{\prod_{t=1}^{n} n\omega_t : \omega_t \ge 0,\ \sum_{t=1}^{n}\omega_t = 1,\ \sum_{t=1}^{n}\omega_t g_t(\theta) = 0\Big\},$

where $\theta_0$ is the unknown parameter. By using the auxiliary information, we can obtain data-adaptive weights $\omega_t$. Further, combining these with the least squares method, we can obtain the weighted least squares estimate $\hat{\alpha} = \arg\min_{\alpha}\sum_{t=1}^{n}\omega_t(X_t - Z_t^{\tau}\alpha)^2$, where $Z_t = (1, X_{t-1}, \ldots, X_{t-p})^{\tau}$. By introducing a Lagrange multiplier $\lambda \in \mathbb{R}^r$, standard derivations in the empirical likelihood lead to

$\omega_t(\theta_0) = \frac{1}{n}\,\frac{1}{1 + \lambda_{\theta_0}^{\tau} g_t(\theta_0)},$  (2)

where $\lambda_{\theta_0}$ satisfies $\frac{1}{n}\sum_{t=1}^{n}\frac{g_t(\theta_0)}{1+\lambda_{\theta_0}^{\tau}g_t(\theta_0)} = 0$. Utilizing the weights (2), we have

$\hat{\alpha} = \arg\min_{\alpha}\sum_{t=1}^{n}\omega_t(\theta_0)(X_t - Z_t^{\tau}\alpha)^2.$  (3)

In the following, we give the asymptotic properties of $\hat{\alpha}$.

Theorem 1 Assume that Assumptions 1.1 and 1.2 hold. If $\alpha^0$ is the true value of $\alpha$, then $\sqrt{n}(\hat{\alpha} - \alpha^0) \xrightarrow{d} N\big(0,\ W^{-1}(\Lambda - \Lambda_{12}\Sigma^{-1}(\theta_0)\Lambda_{12}^{\tau})W^{-1}\big)$, where $W = E(Z_t Z_t^{\tau})$, $\Lambda = E(Z_t Z_t^{\tau}(X_t - Z_t^{\tau}\alpha^0)^2)$ and $\Lambda_{12} = E(Z_t g_t^{\tau}(\theta_0)(X_t - Z_t^{\tau}\alpha^0))$.

To apply the proposed estimator (3), we need to further estimate the unknown parameter $\theta$. Let $\hat{\theta} = \arg\max_{\theta} L(\theta)$. Following the results in Qin and Lawless (1994), the corresponding weights are $\omega_t(\hat{\theta}) = \frac{1}{n}\frac{1}{1+\lambda_{\hat{\theta}}^{\tau} g_t(\hat{\theta})}$, where $\lambda_{\theta}$ is the solution to $\frac{1}{n}\sum_{t=1}^{n}\frac{g_t(\theta)}{1+\lambda_{\theta}^{\tau}g_t(\theta)} = 0$ and $\frac{1}{n}\sum_{t=1}^{n}\frac{(\partial g_t(\theta)/\partial\theta)^{\tau}\lambda_{\theta}}{1+\lambda_{\theta}^{\tau}g_t(\theta)} = 0$. Let

$\hat{\alpha}_1 = \arg\min_{\alpha}\sum_{t=1}^{n}\omega_t(\hat{\theta})(X_t - Z_t^{\tau}\alpha)^2.$  (4)

In order to study the estimator (4), we define $\Gamma(\theta_0) = E(\partial g_t(\theta_0)/\partial\theta)$, $\Omega(\theta_0) = (\Gamma^{\tau}(\theta_0)\Sigma^{-1}(\theta_0)\Gamma(\theta_0))^{-1}$ and $B = \Sigma^{-1}(\theta_0)(I - \Gamma(\theta_0)\Omega(\theta_0)\Gamma^{\tau}(\theta_0)\Sigma^{-1}(\theta_0))$, where $I$ is the identity matrix. The limiting distribution of $\hat{\alpha}_1$ is given in the following theorem.

Theorem 2 Assume that Assumptions 1.1 and 1.2 hold. If $\alpha^0$ is the true value of $\alpha$, then $\sqrt{n}(\hat{\alpha}_1 - \alpha^0) \xrightarrow{d} N\big(0,\ W^{-1}(\Lambda - \Lambda_{12}B\Lambda_{12}^{\tau})W^{-1}\big)$.

Secondly, we consider testing for the conditional heteroscedasticity of the $p$-order Poisson autoregressive model by means of two test methods: a parametric test based on the
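To make the weighting step concrete, the following sketch (not from the paper) solves the multiplier equation behind (2) by an unsafeguarded Newton iteration and then computes a weighted least squares fit in the spirit of (3). The auxiliary information is hypothetical: a known marginal mean $\mu = 2$, encoded by the scalar moment function $g_t = X_t - \mu$ (so $r = 1$), and the data are an i.i.d. stand-in rather than a fitted model:

```python
# Sketch of empirical-likelihood weights and the weighted LSE.
# Hypothetical auxiliary information: known marginal mean mu = 2,
# with scalar moment g_t = X_t - mu. Newton step is not safeguarded.
import numpy as np

def el_weights(g, tol=1e-10, max_iter=100):
    """Solve (1/n) sum g_t/(1 + lam*g_t) = 0 for lam; return weights as in (2)."""
    lam = 0.0
    for _ in range(max_iter):                 # Newton iteration on lam
        d = 1.0 + lam * g
        f = np.mean(g / d)
        fp = -np.mean(g ** 2 / d ** 2)
        step = f / fp
        lam -= step
        if abs(step) < tol:
            break
    return (1.0 / len(g)) / (1.0 + lam * g)

def weighted_lse(x, p, omega):
    """argmin_a sum omega_t (X_t - Z_t' a)^2 with Z_t = (1, X_{t-1},...,X_{t-p})."""
    Z = np.column_stack([np.ones(len(x) - p)] +
                        [x[p - i:len(x) - i] for i in range(1, p + 1)])
    y = x[p:]
    return np.linalg.solve(Z.T @ (omega[:, None] * Z), Z.T @ (omega * y))

x = np.random.default_rng(0).poisson(2.0, size=200).astype(float)
g = x[1:] - 2.0                  # hypothetical auxiliary info: E(X_t) = 2
omega = el_weights(g)
alpha_hat = weighted_lse(x, 1, omega)   # estimates (alpha_0, alpha_1)
```

At the solution the weights sum to one automatically, since $\frac{1}{n}\sum_t \frac{1}{1+\lambda g_t} = 1 - \lambda\cdot\frac{1}{n}\sum_t\frac{g_t}{1+\lambda g_t} = 1$.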
maximum likelihood method and a nonparametric test based on the empirical likelihood method. Specifically, consider the following $p$-order Poisson autoregressive model

$X_t \mid \mathcal{F}_{t-1} \sim \mathrm{Poisson}(\lambda_t), \qquad \lambda_t = \alpha_0 + \sum_{i=1}^{p}\alpha_i X_{t-i},$  (5)

where $\alpha_0 > 0$, $\alpha_i \ge 0$ ($i = 1, 2, \ldots, p$), and the $\sigma$-field $\mathcal{F}_{t-1} = \sigma(X_{t-1}, X_{t-2}, \ldots)$. In what follows, we test for the conditional heteroscedasticity of this model based on the recorded data $\{X_{1-p}, \ldots, X_0, X_1, \ldots, X_n\}$. For convenience, let $\alpha = (\alpha_0, \alpha_1, \ldots, \alpha_p)^{\tau}$, $T = (0, 1, \ldots, 1)^{\tau}$ and $Z_t = (1, X_{t-1}, \ldots, X_{t-p})^{\tau}$. Before stating our methods and main results, we make the following assumption.

Assumption 2.1 The parameter space $\Theta$ is compact with $\Theta = \{\alpha = (\alpha_0, \alpha_1, \ldots, \alpha_p) : \delta \le \alpha_0 \le M,\ 0 \le \alpha_1 + \cdots + \alpha_p \le M^* < 1\}$, where $\delta$, $M$ and $M^*$ are finite positive constants, and the true parameter value $\alpha^0$ is an interior point of $\Theta$.

We first use the maximum likelihood method to construct the test statistic. In order to test the conditional heteroscedasticity of model (5), consider the following hypothesis test:

$H_0: \alpha \in \Theta_0 \quad \text{vs.} \quad H_1: \alpha \in \Theta \setminus \Theta_0,$  (6)

where $\Theta_0 = \{(\alpha_0, 0, \ldots, 0) : (\alpha_0, 0, \ldots, 0) \in \Theta\}$. Let $P_t(\alpha) = \lambda_t^{X_t} e^{-\lambda_t}/X_t!$, where $\lambda_t = \alpha_0 + \sum_{i=1}^{p}\alpha_i X_{t-i}$. Based on the observed data, the conditional log-likelihood function can be written as $l(\alpha) = \sum_{t=1}^{n}\log P_t(\alpha)$. If $\alpha \in \Theta$, the maximum likelihood estimate $\hat{\alpha}$ of $\alpha$ is the solution of the likelihood equation $\partial l(\alpha)/\partial\alpha = 0$. If the null hypothesis is true, that is, the parameter value lies in $\Theta_0$, then $\alpha = g(\alpha_0) := (\alpha_0, 0, \ldots, 0)$ and $P_t(\alpha) = P_t(g(\alpha_0))$, which we further denote by $P_t(\alpha_0)$. Consequently, $P_t(\alpha_0) = \alpha_0^{X_t}e^{-\alpha_0}/X_t!$, and the maximum likelihood estimate $\hat{\alpha}_0$ is the solution of the likelihood equation $\partial l(\alpha_0)/\partial\alpha_0 = 0$. A simple calculation shows that $\hat{\alpha}_0 = \frac{1}{n}\sum_{t=1}^{n}X_t$. By the strong law of large numbers for independent and identically distributed random variables, it is easy to see that $\hat{\alpha}_0 \to \alpha_0$ as $n \to \infty$. For the test problem (6), we define the likelihood ratio test statistic

$\lambda = \frac{\prod_{t=1}^{n} P_t(\hat{\alpha})}{\prod_{t=1}^{n} P_t(\hat{\alpha}_0)}.$

For the test statistic $\lambda$, we have the following theorem:

Theorem 3 Assume that Assumption 2.1 holds. Then under $H_0$, for any $t > 0$, we have $\lim_{n\to\infty} P\{2\log\lambda \le t\} = P\{\chi^2(1) \le t\}$, where $\chi^2(1)$ is a chi-squared distribution with one degree of
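For intuition, the likelihood ratio statistic can be sketched numerically for $p = 1$. In the sketch below (not from the paper), the $X_t!$ terms cancel in the ratio and are dropped, the full-model maximizer is approximated by a coarse grid search rather than by solving the likelihood equation exactly, and the data are a hypothetical i.i.d. Poisson sample generated under the null:

```python
# Sketch of the likelihood-ratio statistic 2*log(lambda) for p = 1.
# The factorial terms cancel in the ratio; the full-model MLE is
# approximated by a coarse grid search (illustrative only).
import numpy as np

def lr_stat(x):
    """2*log(lambda) for H0: alpha_1 = 0 in a first-order model (5)."""
    y, xlag = x[1:], x[:-1]
    def ll(a0, a1):                           # log-likelihood up to a constant
        lam = a0 + a1 * xlag
        return np.sum(y * np.log(lam) - lam)
    a0_null = y.mean()                        # MLE under H0 (sample mean)
    a0_grid = np.linspace(max(a0_null - 2.0, 0.05), a0_null + 2.0, 60)
    a1_grid = np.linspace(0.0, 0.9, 60)
    ll_full = max(ll(a0, a1) for a0 in a0_grid for a1 in a1_grid)
    return 2.0 * max(ll_full - ll(a0_null, 0.0), 0.0)  # clip grid round-off

x = np.random.default_rng(2).poisson(2.0, size=400).astype(float)
stat = lr_stat(x)
```

The resulting `stat` would be compared with a chi-squared(1) quantile (3.841 at the 5% level) in the spirit of Theorem 3.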
freedom.

Next, by using the empirical likelihood method, we establish a test statistic for the conditional heteroscedasticity of model (5). Let $\beta = T^{\tau}\alpha = \alpha_1 + \cdots + \alpha_p$, and consider the following hypothesis test: $H_0: \beta = 0$ vs. $H_1: \beta > 0$. For this, we first consider the estimation of $\beta$. We can minimize $Q(\alpha) = \sum_{t=1}^{n}(X_t - Z_t^{\tau}\alpha)^2$ with respect to $\alpha$ to obtain the conditional least squares estimator $\hat{\alpha}$ of $\alpha$: solving $\mathrm{d}Q/\mathrm{d}\alpha = 0$ gives $\hat{\alpha} = (\sum_{t=1}^{n} Z_t Z_t^{\tau})^{-1}\sum_{t=1}^{n} X_t Z_t$. Let $\hat{\alpha}_0 = (1, 0, \ldots, 0)\hat{\alpha}$. Then $\hat{\alpha}_0$ is a consistent estimator of $\alpha_0$. Further, let $\alpha^* = (\hat{\alpha}_0, \alpha_1, \ldots, \alpha_p)^{\tau}$. Then the estimating equation of $\beta$ can be written as $\sum_{t=1}^{n} H_t(\beta) = 0$, where $H_t(\beta) = X_t Z_t^{\tau}\big(\frac{1}{n}\sum_{s=1}^{n} Z_s Z_s^{\tau}\big)^{-1}T - \beta$. According to Owen (1988), the empirical likelihood function can be constructed as

$L(\beta) = \max\Big\{\prod_{t=1}^{n} n p_t : p_t \ge 0,\ \sum_{t=1}^{n} p_t = 1,\ \sum_{t=1}^{n} p_t H_t(\beta) = 0\Big\}.$

Using the standard Lagrange multiplier arguments, the optimal value of $p_t$ is found to be $p_t = \frac{1}{n}\frac{1}{1+\lambda(\beta)H_t(\beta)}$, where $\lambda(\beta)$ satisfies $\sum_{t=1}^{n}\frac{H_t(\beta)}{1+\lambda(\beta)H_t(\beta)} = 0$. Therefore, ignoring the constant term $-n\log n$, the empirical log-likelihood function is defined as $l(\beta) = -\sum_{t=1}^{n}\log(1+\lambda(\beta)H_t(\beta))$. In order to test $H_0$, we define the empirical likelihood ratio statistic $T_n = -2\log\frac{L(0)}{\sup_{\beta \ge 0} L(\beta)}$ (written $T_n$ to avoid confusion with the vector $T$). For the test statistic $T_n$, we have the following theorem:

Theorem 4 Assume that Assumption 2.1 holds. Then under $H_0$, for any $t > 0$, we have $\lim_{n\to\infty} P\{T_n \le t\} = \frac{1}{2}P\{\chi^2(1) \le t\} + \frac{1}{2}$, where $\chi^2(1)$ is a chi-squared distribution with one degree of freedom.

Lastly, we consider the parameter estimation problem of the first-order threshold autoregressive conditional heteroscedastic model

$X_t = \theta_1 X_{t-1}^{+} + \theta_2 X_{t-1}^{-} + \varepsilon_t,$  (7)

where $\varepsilon_t = e_t\sqrt{h_t}$, $h_t = \alpha_0 + \alpha_1(\varepsilon_{t-1}^{+})^2 + \alpha_2(\varepsilon_{t-1}^{-})^2$, $\{e_t\}$ is a sequence of independent identically distributed random variables with $E e_t = 0$ and $\mathrm{Var}(e_t) = 1$; $\alpha_0$, $\alpha_1$ and $\alpha_2$ are model parameters with $\alpha_0 > 0$, $0 \le \alpha_j < 1$, $j = 1, 2$; and $X_t^{+} = \max(X_t, 0)$, $X_t^{-} = \min(X_t, 0)$. In what follows, we use the empirical likelihood method to estimate the model parameters. Before we state our main results, the following assumptions will be made:

Assumption 3.1 The probability density function $f(\cdot)$ of $\varepsilon_t$ has support $(-\infty, \infty)$ and satisfies $\theta_{\max} + \sqrt{\alpha_{\max}} < 1$, where $\theta_{\max} = \max\{|\theta_1|, |\theta_2|\}$ and $\alpha_{\max} = \max\{\alpha_1, \alpha_2\}$.

Assumption 3.2 $E(X_t^6) < \infty$.

Let $\theta = (\theta_1, \theta_2)^{\tau}$, $\mathbf{X}_t = (X_{t-1}^{+}, X_{t-1}^{-})^{\tau}$ and $H_t(\theta) = (X_t - \mathbf{X}_t^{\tau}\theta)\mathbf{X}_t$. By
using the least squares estimating equation $\sum_{t=1}^{n}(X_t - \mathbf{X}_t^{\tau}\theta)\mathbf{X}_t = 0$, where $\mathbf{X}_t = (X_{t-1}^{+}, X_{t-1}^{-})^{\tau}$, we can obtain the following profile empirical likelihood ratio function:

$L(\theta) = \max\Big\{\prod_{t=1}^{n} n p_t : p_t \ge 0,\ \sum_{t=1}^{n} p_t = 1,\ \sum_{t=1}^{n} p_t H_t(\theta) = 0\Big\}.$

Using the standard Lagrange multiplier arguments, the optimal value of $p_t$ is found to be $p_t = \frac{1}{n}\frac{1}{1 + b^{\tau}(\theta)H_t(\theta)}$, where $b(\theta)$ satisfies $\sum_{t=1}^{n}\frac{H_t(\theta)}{1+b^{\tau}(\theta)H_t(\theta)} = 0$. So we have $-2\log(L(\theta)) = 2\sum_{t=1}^{n}\log(1+b^{\tau}(\theta)H_t(\theta))$. Now we give the limiting distribution of $L(\theta)$.

Theorem 5 Assume that Assumptions 3.1 and 3.2 hold and that $\theta$ is the true parameter value. Then, as $n \to \infty$, $-2\log(L(\theta)) \to \chi^2(1)$ in distribution, where $\chi^2(1)$ is a chi-squared distribution with one degree of freedom.
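As an illustration of this last construction, the following sketch (not from the paper; parameter values are hypothetical and the Newton iteration for $b(\theta)$ is unsafeguarded) evaluates $-2\log L(\theta)$ for model (7) at the data-generating $\theta$:

```python
# Sketch: profile empirical likelihood ratio -2*log L(theta) for the
# threshold model (7). Newton iteration for b(theta) is not safeguarded,
# so this is meant for well-behaved inputs only.
import numpy as np

def neg2_log_el(x, theta, iters=50):
    xp = np.maximum(x[:-1], 0.0)                    # X_{t-1}^+
    xm = np.minimum(x[:-1], 0.0)                    # X_{t-1}^-
    resid = x[1:] - theta[0] * xp - theta[1] * xm
    H = resid[:, None] * np.column_stack([xp, xm])  # H_t(theta), shape (n, 2)
    b = np.zeros(2)
    for _ in range(iters):                          # Newton steps for b(theta)
        d = 1.0 + H @ b
        grad = np.mean(H / d[:, None], axis=0)
        hess = -(H / d[:, None] ** 2).T @ H / len(H)
        b -= np.linalg.solve(hess, grad)
    return 2.0 * np.sum(np.log1p(H @ b))

# hypothetical data from model (7): theta = (0.3, -0.2), e_t standard normal,
# h_t = 0.5 + 0.2*(eps_{t-1}^+)^2 + 0.1*(eps_{t-1}^-)^2
rng = np.random.default_rng(1)
x, eps = np.zeros(401), 0.0
for t in range(1, 401):
    h = 0.5 + 0.2 * max(eps, 0.0) ** 2 + 0.1 * min(eps, 0.0) ** 2
    eps = rng.standard_normal() * np.sqrt(h)
    x[t] = 0.3 * max(x[t - 1], 0.0) - 0.2 * min(x[t - 1], 0.0) + eps
stat = neg2_log_el(x, (0.3, -0.2))
```

Because $\prod_t np_t \le 1$ under the constraints, the returned value is nonnegative at the converged multiplier, matching the likelihood-ratio interpretation in Theorem 5.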