With the rapid advances in computing power and other modern technology, more information is available in data. Statistical inference for high-dimensional data has been a hot research topic for more than ten years. In some practical problems, in addition to the sample information, we can usually obtain relevant auxiliary information about the regression coefficients. When such auxiliary information is available in advance, the efficiency of the estimator can be improved significantly. In addition, in many regression problems we are interested in finding the important explanatory factors for predicting the response variable, where each explanatory factor may be represented by a group of variables. Common examples include representing a multilevel categorical covariate in a regression model by a group of indicator variables, and representing the effect of a continuous variable by a set of basis functions. It is therefore important to develop group selection methods that can choose predictive variables in a grouped manner.

Chapter one introduces the background and preliminary knowledge of this thesis. As the oldest, simplest and most popular approach, linear regression analysis, in which investigators are interested in the relationships between a set of predictors and the response, is of utmost importance in statistics. However, we cannot know the true model in advance; if the underlying structure is not linear, wrong conclusions may be drawn. Nonparametric models are less dependent on model assumptions and hence able to uncover nonlinear effects hidden in the data. Semiparametric models, a class of models between linear and nonparametric models, combine the flexibility of nonparametric regression with the parsimony of linear regression, and have wide applications in practice.

Chapter two considers partially linear models in which the dimension of the parametric component diverges, and studies a restricted profile least-squares estimation for
the parametric component. We use polynomial splines to estimate the nonparametric component. The resulting estimator is shown to be consistent and asymptotically normal under some regularity conditions. Simulation studies are conducted to illustrate our approach.

Chapter three studies partially linear additive models with a diverging number of parameters when some linear constraints on the parametric part are available. We propose a constrained profile least-squares estimation for the parametric component after the nonparametric functions are estimated by basis function approximations. The consistency and asymptotic normality of the restricted estimator are established under certain conditions. We construct a profile likelihood ratio test statistic to test the validity of the linear constraints on the parametric component, and show that it asymptotically follows a chi-squared distribution under the null and local alternative hypotheses. The finite-sample performance of the proposed method is illustrated by simulation studies and a real data analysis.

In the framework of partially linear models, chapter four proposes an adaptive group bridge method to achieve group selection for high-dimensional partially linear models. We consider the choice of the index in the adaptive group bridge and use leave-one-observation-out cross-validation to implement this choice, which significantly reduces the computational burden. Furthermore, we establish the consistency, convergence rate and asymptotic distribution of the adaptive group bridge estimator, which is the global minimizer of the objective function.

The local linear approximation algorithm is an effective algorithm for computing a global solution of the folded concave penalization problem. However, its effectiveness is highly dependent on a reasonably good initial estimator, and it loses efficacy when the correlation among predictors is high. In chapter five, we propose a new local linear approximation ridge algorithm designed to deal with
highly correlated predictors. With the ridge estimator chosen as the initial estimator, the local linear approximation ridge algorithm is stable and effective. Simulation studies show that, in the presence of highly correlated predictors, the proposed algorithm has better performance than the local linear approximation algorithm.

Chapter six gives some concluding remarks and discusses some topics worthy of further work.
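For context on the restricted and constrained profile least-squares estimators of chapters two and three, the underlying algebra is that of classical restricted least squares. The following is only a sketch, with generic notation not taken from the thesis: once the nonparametric component has been profiled out with a spline basis, giving an adjusted design matrix X~ and response Y~, minimizing ||Y~ - X~b||^2 subject to the linear constraints Ab = c has the closed form

```latex
\hat{\beta}_r
  = \hat{\beta}
  - (\tilde{X}^{\top}\tilde{X})^{-1} A^{\top}
    \bigl[\,A(\tilde{X}^{\top}\tilde{X})^{-1}A^{\top}\bigr]^{-1}
    (A\hat{\beta} - c),
\qquad
\hat{\beta} = (\tilde{X}^{\top}\tilde{X})^{-1}\tilde{X}^{\top}\tilde{Y},
```

so the constrained estimator is the unconstrained profile least-squares estimator corrected in the direction of the violated constraints, which is one way to see why incorporating valid constraints reduces the estimator's variance.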
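The group bridge penalty behind the method of chapter four can be sketched in its standard form as follows; the groups G_k, weights w_k and bridge index gamma are generic placeholders here, and the adaptive version uses data-dependent weights with gamma selected by leave-one-observation-out cross-validation:

```latex
P_{\lambda}(\beta)
  = \lambda \sum_{k=1}^{K} w_k
    \Bigl(\sum_{j \in G_k} |\beta_j|\Bigr)^{\gamma},
\qquad 0 < \gamma < 1.
```

With gamma < 1 the penalty is concave in the group L1-norms, which is what allows entire groups of coefficients to be estimated as exactly zero while individual large coefficients are only lightly shrunk.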
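The local linear approximation idea of chapter five can be illustrated with a minimal numerical sketch, not the thesis's actual implementation: a SCAD-type folded concave penalty (with the conventional a = 3.7), LLA steps that each reduce to a weighted lasso solved by coordinate descent, and a ridge initial estimator. All function names, tuning values and the simulated data below are illustrative assumptions.

```python
import numpy as np

def scad_derivative(t, lam, a=3.7):
    """Derivative of the SCAD penalty; supplies the LLA weights."""
    t = np.abs(t)
    return np.where(t <= lam, lam, np.maximum(a * lam - t, 0.0) / (a - 1.0))

def weighted_lasso(X, y, w, n_sweeps=200):
    """Coordinate descent for (1/2n)||y - Xb||^2 + sum_j w_j |b_j|."""
    n, p = X.shape
    b = np.zeros(p)
    col_norm = (X ** 2).sum(axis=0) / n
    r = y.copy()                      # current residual y - Xb
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * b[j]       # remove coordinate j's contribution
            z = X[:, j] @ r / n
            b[j] = np.sign(z) * max(abs(z) - w[j], 0.0) / col_norm[j]
            r -= X[:, j] * b[j]
    return b

def lla_ridge(X, y, lam, ridge=1.0, lla_steps=3):
    """LLA for a SCAD penalty, started from a ridge estimator (sketch)."""
    n, p = X.shape
    # Ridge start: well-conditioned even when predictors are highly correlated.
    b = np.linalg.solve(X.T @ X + ridge * np.eye(p), X.T @ y)
    for _ in range(lla_steps):
        w = scad_derivative(b, lam)   # linearize the concave penalty at b
        b = weighted_lasso(X, y, w)   # solve the induced weighted lasso
    return b
```

The ridge start is the point of the sketch: unlike a lasso or least-squares initial estimator, it stays stable under near-collinearity, so the first set of LLA weights is already sensible when two predictors are strongly correlated.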