
Variable Selection via Penalized Empirical Likelihood and Jump-Penalized Least Squares

Posted on: 2011-11-04  Degree: Doctor  Type: Dissertation
Country: China  Candidate: Y W Ren  Full Text: PDF
GTID: 1110330335492162  Subject: Probability theory and mathematical statistics
Abstract/Summary:
Variable selection in high or ultrahigh dimensions has been studied intensively over the past fifteen years. Most of this research has focused on variable selection with the "loss + penalty" structure, on criteria for choosing tuning parameters, and on algorithms for the resulting optimization problems. In this thesis, we discuss variable selection under moment restriction models, including linear regression models, generalized linear models, quantile regression models, partially linear models, time series models, and other models satisfying certain moment restrictions.

In Chapters 2 and 3, for general moment conditions, we propose a penalized empirical likelihood (PEL) approach with nonconcave penalties, such as the MCP penalty, to select variables. We establish the oracle property of the PEL estimator both when p, the number of parameters, is fixed and when p diverges with the sample size n: with probability tending to 1, the selected subset consists exactly of the indices of the nonzero coefficients, and the PEL estimator of the nonzero coefficients is asymptotically normal. An approximate algorithm, a modified Newton-Raphson algorithm, together with a consistent BIC-type criterion for selecting the tuning parameters, is provided for PEL. As with empirical likelihood ratio statistics, PEL ratio statistics permit inference on a parameter without having to estimate the covariance of its estimator.

In Chapter 4, we propose an unbiased estimator, which we call the Jump selector (and its extension, the adaptive Jump selector), for variable selection in linear regression models. It is a penalized least squares estimator with the Jump penalty, which has a "jump" at some threshold value. The (adaptive) Jump selector can be viewed as a subset regression estimator, since the estimator in the selected model is exactly the ordinary least squares estimator. With a proper choice of tuning parameters, the (adaptive) Jump selector possesses the oracle property and unbiasedness.
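The defining feature claimed above, that the nonzero estimates coincide exactly with subset OLS, can be illustrated with a simplified hard-threshold analogue. This is not the dissertation's actual Jump algorithm (which builds the active set sequentially, LARS-style); it is a minimal two-stage sketch, with the function name `jump_style_selector` and threshold `tau` chosen here for illustration:

```python
import numpy as np

def jump_style_selector(X, y, tau):
    """Illustrative hard-threshold analogue of a jump-penalized least
    squares selector: fit OLS, keep coefficients whose magnitude
    exceeds the threshold tau, then refit OLS on the selected subset,
    so the nonzero estimates are exactly the subset-OLS estimates."""
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    active = np.abs(beta_ols) > tau
    beta = np.zeros_like(beta_ols)
    if active.any():
        # Refit on the active columns only: unbiased, no shrinkage.
        beta[active], *_ = np.linalg.lstsq(X[:, active], y, rcond=None)
    return beta, active
```

Because the second stage is plain OLS on the active set, the retained coefficients carry no shrinkage bias, in contrast to lasso-type selectors; this mirrors the unbiasedness claim for the Jump selector above.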
A Jump algorithm is proposed to carry out the optimization. Its per-step computational cost is the same as that of the least angle regression (LARS) algorithm. The adaptive version of the Jump algorithm guarantees, with probability tending to 1, that the important covariates enter the active set before the unimportant ones. To handle collinearity and high-dimensional variable selection simultaneously, we propose the (adaptive) Ridge-Jump selector, a combination of ridge regression and the Jump selector. Under mild conditions, the (adaptive) Ridge-Jump selector has the oracle property; since the estimators on the active set of the Ridge-Jump algorithm are ridge regression estimators, it can handle high-dimensional variable selection. In the ultrahigh-dimensional setting, under a partial orthogonality condition, the marginal (adaptive) Jump selector separates the covariates with zero coefficients from those with nonzero coefficients with probability tending to 1.

In Chapter 5, we discuss model selection in vector autoregressive models. Two approaches are provided. First, we straighten the coefficient matrices of the vector model into a vector and then use a penalized method to select the order and the subset simultaneously. Second, we use the group Jump algorithm to select the order, retaining those covariates whose coefficient matrices have nonzero norms, and then use the first method to select the subset. Under mild conditions, the estimated order and the parameter estimator are consistent, and with probability tending to 1 the estimators of the nonzero coefficients are asymptotically normal.

The proposed methods are illustrated with numerical studies and real examples.
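The "straightening" step in the first Chapter 5 approach can be sketched as follows. A VAR(p) model y_t = A_1 y_{t-1} + ... + A_p y_{t-p} + e_t is rewritten as one multivariate linear regression whose design matrix stacks the p lagged vectors, so the coefficient matrices become a single long coefficient vector and any penalized linear-regression selector applies. The helper name `var_design` is ours, not the dissertation's:

```python
import numpy as np

def var_design(Y, p):
    """Stack a (T, d) series into regression form for a VAR(p):
    returns X of shape (T - p, d * p), whose row for time t holds
    [y_{t-1}, ..., y_{t-p}], and the target Y[p:] of shape (T - p, d).
    """
    T, d = Y.shape
    # Column block k holds lag k + 1: rows Y[t - (k + 1)] for t = p..T-1.
    X = np.hstack([Y[p - k - 1:T - k - 1] for k in range(p)])
    return X, Y[p:]
```

With this form, selecting the order and the subset simultaneously amounts to variable selection over the d * p stacked columns, exactly as in the linear regression setting of Chapter 4.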
Keywords/Search Tags:Adaptive, High(Ultrahigh) dimensional data, Jump algorithm, Jump selector, Linear regression models, Moment restriction models, Oracle property, Penalized empirical likelihood, Tuning parameters, Variable selection, Vector autoregressive models