
A Stepwise Partial Least Squares Regression Method And Its Application

Posted on: 2014-07-14
Degree: Master
Type: Thesis
Country: China
Candidate: G D Yang
Full Text: PDF
GTID: 2250330425972829
Subject: Probability theory and mathematical statistics
Abstract: Multiple regression analysis is an important branch of statistics, and the least squares method plays a key role in it. Sophisticated and systematic developments, both in theory and in application, have been achieved since the creation of the least squares method. Nevertheless, the method does not work well when the explanatory variables exhibit multicollinearity. To address this problem, many improved methods have been put forward, such as stepwise regression, ridge regression, and principal component regression. Although these methods can mitigate multicollinearity to a certain extent, they still have some defects. In 1983, S. Wold, C. Albano, et al. established the partial least squares regression method, which handles multicollinearity better and was therefore soon applied not only in chemistry but also in biology, medical science, mechanics, and economics. In this thesis, we establish a stepwise partial least squares regression method to improve the fitting precision of the regression model.

The thesis consists of three chapters. The first two chapters serve as a survey of the theory of the least squares method and the partial least squares method. In the first chapter we briefly present the idea of the least squares method and its related properties, describe the causes of multicollinearity, the methods for detecting it, and its influence on the regression model. In the second chapter, we first introduce the principle and the computational procedure of partial least squares regression, then introduce a simpler method of extracting components, discuss the limitations of partial least squares regression, and finally introduce the orthogonal projection algorithm for overcoming those limitations. The third chapter is the main part of this thesis.
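As context for the component extraction described in the second chapter, the standard partial least squares procedure can be sketched as a NIPALS-style PLS1 loop: each step finds a unit weight vector maximizing covariance with the response, extracts the corresponding score, and deflates both blocks. This is a generic illustration, not the thesis's own algorithm or code; all function and variable names are ours.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Sketch of PLS1 (NIPALS-style) regression on centered data."""
    X = X - X.mean(axis=0)           # center predictors
    y = y - y.mean()                 # center response
    n, p = X.shape
    W = np.zeros((p, n_components))  # weight vectors
    P = np.zeros((p, n_components))  # X-loadings
    q = np.zeros(n_components)       # y-loadings
    Xk, yk = X.copy(), y.copy()
    for k in range(n_components):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)       # unit weight maximizing cov(Xw, y)
        t = Xk @ w                   # score of the k-th component
        tt = t @ t
        p_k = Xk.T @ t / tt          # X-loading
        q_k = (yk @ t) / tt          # y-loading
        Xk = Xk - np.outer(t, p_k)   # deflate X
        yk = yk - q_k * t            # deflate y
        W[:, k], P[:, k], q[k] = w, p_k, q_k
    # coefficients expressed against the original (centered) predictors
    return W @ np.linalg.inv(P.T @ W) @ q

# Demo on nearly collinear predictors (columns 0 and 1 almost identical)
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
X = np.column_stack([x1, x1 + 0.01 * rng.normal(size=100),
                     rng.normal(size=100)])
y = X @ np.array([1.0, 1.0, 0.5]) + 0.1 * rng.normal(size=100)
B = pls1_fit(X, y, n_components=2)
```

Because each score is built from all predictors at once, the extraction remains stable even when `X` is nearly rank-deficient, which is exactly the situation where ordinary least squares breaks down.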
We present a stepwise partial least squares regression method based on variable selection, selecting variables either backward or forward. The idea is motivated by the variable selection methods of ordinary regression analysis. The difference is that in ordinary regression analysis variables are selected or rejected by significance tests, whereas in our case the decision is made by comparing the fitting precisions of the regression equations. Detailed steps for selecting variables and obtaining partial least squares regression equations are provided, and program code implementing the backward variable selection method is also presented. The effectiveness of our method is demonstrated by an example at the end of this chapter.
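The backward scheme just described drops a variable only when the refitted equation's precision improves. One common cross-validated precision criterion is the leave-one-out PRESS statistic, and the selection loop can be sketched with it as follows. For brevity this sketch refits by ordinary least squares rather than by partial least squares; the names (`press`, `backward_select`) and the stopping rule are illustrative assumptions, not the thesis's exact procedure.

```python
import numpy as np

def press(X, y):
    """Leave-one-out PRESS via the hat-matrix shortcut for least squares."""
    Xc = np.column_stack([np.ones(len(y)), X])  # add an intercept column
    H = Xc @ np.linalg.pinv(Xc)                 # hat matrix X(X'X)^{-1}X'
    resid = y - H @ y
    # LOO residual for obs i equals resid_i / (1 - h_ii)
    return np.sum((resid / (1 - np.diag(H))) ** 2)

def backward_select(X, y):
    """Greedily drop one variable at a time while PRESS keeps improving."""
    keep = list(range(X.shape[1]))
    best = press(X[:, keep], y)
    improved = True
    while improved and len(keep) > 1:
        improved = False
        for j in list(keep):
            trial = [k for k in keep if k != j]
            score = press(X[:, trial], y)
            if score < best:                    # precision improved: drop j
                best, keep, improved = score, trial, True
                break
    return keep, best

# Demo: column 2 is pure noise and is a candidate for removal
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.2 * rng.normal(size=100)
keep, best = backward_select(X, y)
```

The key contrast with classical stepwise regression is visible in the loop: no t- or F-statistic is computed, and a variable leaves the model purely because the cross-validated fitting precision of the reduced equation is better.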
Keywords/Search Tags: Multicollinearity, Cross validation, Orthogonal projection operator, Variable selection