
Variable Selection In Mixture Of Quantile Regression Model

Posted on: 2018-01-15    Degree: Master    Type: Thesis
Country: China    Candidate: G Y Chen    Full Text: PDF
GTID: 2359330518492090    Subject: Statistics
Abstract/Summary:
Owing to rapid advances in modern technology, scientists can now collect data of unprecedented size and complexity. Applications often involve a large number of variables, yet the response may depend on only a small subset of them. In some applications the data may even come from several subpopulations, in which case selecting the correct variables for each subpopulation is crucial. At the same time, it is desirable to keep the model form parsimonious. For these reasons, variable selection has attracted increasing attention. Classical best-subset selection methods, such as the Akaike information criterion, the Bayesian information criterion and their modifications, become computationally prohibitive as the number of covariates and components grows, so a great deal of computing time is wasted.

Over the last decade, a new class of variable selection methods has been developed to handle large numbers of variables, and it performs well. These methods add a penalty term to the likelihood function, shrinking coefficients that are close to zero exactly to zero, so that important variables are selected and their effects estimated simultaneously within a single statistical model. The penalties depend on the size of the regression coefficients and on the model structure. The resulting estimators have been shown to be consistent for variable selection and asymptotically normal, and the approach also applies in large-sample settings.

Finite mixture models are flexible and are well suited to modeling data that arise from a heterogeneous population. In particular, a finite mixture of regressions (FMR) model segments the population into subpopulations and models each subpopulation with a distinct linear or generalized linear regression model; fitting an FMR model therefore reduces the bias that would arise from fitting a single regression directly. Mixture models have been widely used in many fields, including image processing, speech recognition, motif finding in biopolymer sequences and face recognition. Because linear regression models describe only changes in the conditional mean, quantile regression models have been proposed in recent years: they describe not only changes in the mean but also how the response varies with the covariates at different quantiles, and they enjoy good robustness properties. In this thesis I therefore introduce quantile regression into mixture models and study variable selection for mixture of quantile regression models. A BIC criterion for selecting the tuning parameters and an EM algorithm for efficient numerical computation are developed. Simulations show that the method performs well and requires much less computing power than existing methods.
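To make the model and the penalty concrete, the display below gives one standard formulation of an m-component mixture of quantile regressions at quantile level \tau, based on the asymmetric Laplace working likelihood together with a LASSO-type penalty. The abstract does not state the exact objective used in the thesis, so the notation and the specific penalty form here are illustrative assumptions.

\[
f(y \mid \mathbf{x}) \;=\; \sum_{k=1}^{m} \pi_k \,\frac{\tau(1-\tau)}{\sigma_k}
\exp\!\left\{-\rho_\tau\!\left(\frac{y-\mathbf{x}^{\top}\boldsymbol{\beta}_k}{\sigma_k}\right)\right\},
\qquad
\rho_\tau(u) \;=\; u\,\{\tau - I(u<0)\},
\]
\[
\ell_p(\boldsymbol{\theta}) \;=\; \sum_{i=1}^{n}\log f(y_i \mid \mathbf{x}_i)
\;-\; n\sum_{k=1}^{m}\lambda_k \sum_{j=1}^{p} \lvert \beta_{kj}\rvert ,
\]

where \pi_k are the mixing proportions, \boldsymbol{\beta}_k and \sigma_k are the component-specific regression coefficients and scale parameters, and \lambda_k are tuning parameters. Maximizing \ell_p shrinks small coefficients exactly to zero, which is the variable selection mechanism described above.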
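The EM algorithm and the BIC tuning rule can be sketched as follows. This is a minimal illustration written by the editor, not the thesis implementation: the function names (em_penalized_mixture_qr, bic_select), the use of the asymmetric Laplace working density in the E-step, and the generic Powell optimizer in the M-step are all assumptions made for the sake of a runnable example.

# Minimal sketch (not the thesis code): EM for a K-component mixture of quantile
# regressions with an L1 (LASSO) penalty, plus BIC selection of the tuning
# parameter.  All names and numerical choices here are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize


def check_loss(u, tau):
    # Quantile check loss rho_tau(u) = u * (tau - I(u < 0)).
    return u * (tau - (u < 0))


def al_density(y, fitted, sigma, tau):
    # Asymmetric Laplace working density, the usual quantile "likelihood".
    return tau * (1.0 - tau) / sigma * np.exp(-check_loss((y - fitted) / sigma, tau))


def em_penalized_mixture_qr(X, y, tau=0.5, K=2, lam=0.1, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = rng.normal(scale=0.5, size=(K, p))
    sigma = np.ones(K)
    pi = np.full(K, 1.0 / K)
    for _ in range(n_iter):
        # E-step: responsibilities (posterior component memberships).
        dens = np.column_stack(
            [pi[k] * al_density(y, X @ beta[k], sigma[k], tau) for k in range(K)]
        )
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: update mixing proportions, then each component's coefficients
        # by minimizing weighted check loss + L1 penalty (this is where shrinkage
        # to zero, i.e. variable selection, happens).
        pi = resp.mean(axis=0)
        for k in range(K):
            w = resp[:, k]

            def obj(b, w=w, s=sigma[k]):
                return (w * check_loss(y - X @ b, tau)).sum() / s + lam * n * np.abs(b).sum()

            beta[k] = minimize(obj, beta[k], method="Powell").x
            sigma[k] = max((w * check_loss(y - X @ beta[k], tau)).sum() / w.sum(), 1e-3)
    return pi, beta, sigma


def bic_select(X, y, tau, lam_grid, K=2):
    # BIC-type criterion: -2 log-likelihood + log(n) * (number of free parameters),
    # counting only the nonzero regression coefficients as selected variables.
    n = len(y)
    best = None
    for lam in lam_grid:
        pi, beta, sigma = em_penalized_mixture_qr(X, y, tau=tau, K=K, lam=lam)
        dens = np.column_stack(
            [pi[k] * al_density(y, X @ beta[k], sigma[k], tau) for k in range(K)]
        )
        loglik = np.log(dens.sum(axis=1)).sum()
        df = (np.abs(beta) > 1e-4).sum() + (K - 1) + K  # betas + weights + scales
        bic = -2.0 * loglik + np.log(n) * df
        if best is None or bic < best[0]:
            best = (bic, lam, pi, beta, sigma)
    return best

A call such as bic_select(X, y, tau=0.5, lam_grid=np.linspace(0.01, 0.5, 10)) would then return the tuning parameter and fit preferred by the BIC-type criterion; in a serious implementation the generic Powell step would be replaced by a dedicated weighted penalized quantile regression solver.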
Keywords/Search Tags: EM algorithm, LASSO, Mixture model, Quantile regression, Asymptotic property