
Efficient computation and model selection for regularized quantile regression

Posted on: 2008-02-02
Degree: Ph.D
Type: Dissertation
University: University of Michigan
Candidate: Li, Youjuan
Full Text: PDF
GTID: 1449390005456225
Subject: Statistics
Abstract/Summary:
Quantile regression extends the statistical analysis of response models beyond conditional means. We consider regularized quantile regression models with two types of regularization: the L2-norm and the L1-norm. L2-norm regularization leads to quantile regression in reproducing kernel Hilbert spaces, which we refer to as kernel quantile regression (KQR). L1-norm regularization uses the sum of the absolute values of the regression coefficients as the penalty. We derive efficient algorithms that compute the exact entire solution path for both the KQR model and the L1-norm quantile regression (L1-norm QR) model. The solution path, ranging from the least regularized model to the most regularized model, is computed at a cost essentially the same as that of fitting a single model for one specified regularization parameter. We also derive an estimate of the effective dimension of the proposed model, which allows convenient selection of the regularization parameter. Both simulated and real-world data are used to demonstrate the KQR and L1-norm QR models.
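To make the L1-norm QR model concrete, the following is a minimal sketch of L1-penalized quantile regression fit at a single value of the regularization parameter, by rewriting the check-loss objective as a linear program. This is an illustrative reformulation using `scipy.optimize.linprog`, not the path-following algorithm the dissertation develops; the function name and parameter choices are the author's own for this example.

```python
import numpy as np
from scipy.optimize import linprog

def l1_quantile_regression(X, y, tau=0.5, lam=0.1):
    """Fit L1-penalized quantile regression via linear programming.

    Minimizes  sum_i rho_tau(y_i - b0 - x_i' b) + lam * ||b||_1,
    where rho_tau(r) = r * (tau - 1{r < 0}) is the check loss.
    Each residual is split as u_i - v_i with u_i, v_i >= 0, and each
    coefficient as b+ - b-, so the problem becomes a standard LP.
    """
    n, p = X.shape
    # Decision vector: [b0+, b0-, b+ (p), b- (p), u (n), v (n)], all >= 0.
    c = np.concatenate([
        [0.0, 0.0],               # intercept is not penalized
        lam * np.ones(2 * p),     # L1 penalty on b+ and b-
        tau * np.ones(n),         # check loss on positive residuals
        (1 - tau) * np.ones(n),   # check loss on negative residuals
    ])
    # Equality constraints: b0 + x_i' b + u_i - v_i = y_i for each i.
    A_eq = np.hstack([
        np.ones((n, 1)), -np.ones((n, 1)),
        X, -X,
        np.eye(n), -np.eye(n),
    ])
    res = linprog(c, A_eq=A_eq, b_eq=y,
                  bounds=[(0, None)] * len(c), method="highs")
    z = res.x
    b0 = z[0] - z[1]
    beta = z[2:2 + p] - z[2 + p:2 + 2 * p]
    return b0, beta
```

Solving this LP on a grid of `lam` values traces out an approximation to the solution path; the exact path algorithm described above exploits the piecewise linearity of the solution in the regularization parameter to avoid refitting at each grid point.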
Keywords/Search Tags:Model, Quantile regression, Regularized, KQR, Regularization, L1-norm