Generalized estimating equations (GEE), introduced by Liang and Zeger in 1986 as an extension of the generalized linear model, are an important method for the regression analysis of longitudinal data. The main characteristic of GEE is the introduction of a working correlation matrix into the estimating equations. Since its introduction by Liang and Zeger, the method has seen great progress in both theoretical research and applications. Traditional regression analysis generally assumes that the sample size n → +∞ while the parameter dimension p stays fixed. With the advent of high-dimensional data in many areas, the setting in which the sample size n → ∞ and the covariate dimension pn → ∞ has gradually attracted the attention of statisticians. This thesis mainly studies the asymptotic theory of generalized estimating equations for the Poisson model when the number of covariates pn goes to infinity with the number of clusters n. Under regularity conditions including pn³/n → 0 as n → ∞, we prove the existence, consistency and asymptotic normality of the GEE estimator. We extend the recent elegant work of Xie and Yang (Ann. Statist. 31 (2003) 310–347) and Balan and Schiopu-Kratina (Ann. Statist. 33 (2005) 522–541) to the case of high-dimensional covariates, and also generalize the results of Wang (Ann. Statist. 39 (2011) 389–417) to the case of an unbounded response variable.
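To make the object of study concrete, the following is a minimal numerical sketch (not part of the thesis) of solving the GEE for a log-link Poisson marginal model. It assumes the independence working correlation, under which Vᵢ = diag(μᵢ) and Dᵢ = diag(μᵢ)Xᵢ, so the estimating equation Σᵢ Dᵢᵀ Vᵢ⁻¹ (yᵢ − μᵢ) = 0 reduces to Xᵀ(y − μ) = 0 and can be solved by Fisher scoring; the clusters still enter through the robust sandwich covariance estimator. All function names and the simulated data are hypothetical illustrations.

```python
import numpy as np

def gee_poisson_independence(y, X, groups, tol=1e-8, max_iter=50):
    """Hypothetical helper: solve the Poisson GEE under the independence
    working correlation and return (beta_hat, sandwich covariance).

    With log link and independence working structure the GEE reduces to
    the score equation X'(y - mu) = 0, solved here by Fisher scoring.
    The cluster structure is used only for the robust sandwich variance."""
    beta = np.zeros(X.shape[1])
    for _ in range(max_iter):
        mu = np.exp(X @ beta)
        score = X.T @ (y - mu)              # estimating function at beta
        info = X.T @ (mu[:, None] * X)      # "bread": sum_i D_i' V_i^{-1} D_i
        step = np.linalg.solve(info, score)
        beta += step
        if np.max(np.abs(step)) < tol:
            break
    # Robust sandwich covariance: bread^{-1} * meat * bread^{-1},
    # where the "meat" sums outer products of per-cluster scores.
    mu = np.exp(X @ beta)
    resid = y - mu
    meat = np.zeros((X.shape[1], X.shape[1]))
    for g in np.unique(groups):
        idx = groups == g
        s_g = X[idx].T @ resid[idx]
        meat += np.outer(s_g, s_g)
    bread_inv = np.linalg.inv(X.T @ (mu[:, None] * X))
    cov = bread_inv @ meat @ bread_inv
    return beta, cov

# Simulated clustered Poisson data with a shared gamma frailty inducing
# within-cluster correlation (mean-one frailty keeps the marginal mean
# at exp(X beta), so beta remains the target of the GEE).
rng = np.random.default_rng(0)
n_clusters, m = 200, 3
groups = np.repeat(np.arange(n_clusters), m)
X = np.column_stack([np.ones(n_clusters * m),
                     rng.normal(size=(n_clusters * m, 2))])
beta_true = np.array([0.2, 0.5, -0.3])
frailty = rng.gamma(5.0, 0.2, size=n_clusters)   # mean 1, adds dependence
y = rng.poisson(frailty[groups] * np.exp(X @ beta_true))

beta_hat, cov_hat = gee_poisson_independence(y, X, groups)
print(beta_hat)
```

The sandwich covariance is what keeps inference valid when the working correlation (here, independence) is misspecified, which is the basic robustness property of GEE.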