
RIDGE REGRESSION: CONSTRAINED OPTIMIZATION WITH APPLICATIONS IN DESIGN OF EXPERIMENT

Posted on: 1982-08-25
Degree: Ph.D
Type: Dissertation
University: University of California, Riverside
Candidate: SMITH, KENT DOUGLAS
GTID: 1470390017965837
Subject: Statistics
Abstract/Summary:
In classical linear model estimation, the ordinary least squares estimators of the unknown parameters have minimum variance in the class of unbiased estimators. Applications of ordinary least squares methodology to regression models produce estimates that may be construed to indicate that a specific relationship exists between the dependent and independent variables. Often the indicated relationship contradicts physical evidence or knowledge of how the variables interact. In addition, the estimates may have unacceptably high variances. A common cause of these problems is singularity (or near-singularity) of the matrix of independent variables. An alternative to least squares estimation, known as ridge regression, was introduced by Hoerl and Kennard (1970a, b). The ridge estimates are biased but can be shown to have smaller variances and a smaller mean squared distance from the true parameter vector than the least squares estimates.

Hoerl and Kennard (1970a) demonstrated that ridge regression estimates may be viewed as the estimates that minimize the squared length of the parameter vector subject to a specified residual sum of squares. This research shows that ridge regression estimates are "constrained maximum likelihood" estimates. The constraint required to derive ridge regression estimates is contrasted with the constraints used in classical design-of-experiments estimation, and the required constraint is shown to be related to the ridge parameter. The application of ridge regression to the general linear model is furthered by developing testing procedures based on a constrained likelihood ratio technique. The resulting distributions are derived, and techniques for conducting tests of hypotheses based on ridge regression estimates are developed. Finally, ridge regression estimation procedures are developed for design of experiment models.
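The shrinkage property described above can be sketched numerically. This is a minimal illustration, not part of the dissertation: the data, variable names, and the ridge parameter k = 0.1 are all hypothetical, chosen only to exhibit a near-singular matrix of independent variables.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical near-singular design: two almost-collinear predictors.
n = 50
x1 = rng.normal(size=n)
x2 = x1 + 1e-3 * rng.normal(size=n)        # nearly identical to x1
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 2.0]) + rng.normal(scale=0.5, size=n)

def ridge(X, y, k):
    """Hoerl-Kennard ridge estimator (X'X + kI)^(-1) X'y; k = 0 gives OLS."""
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

b_ols = ridge(X, y, 0.0)
b_ridge = ridge(X, y, 0.1)

# The ridge estimate is shrunk: its squared length is shorter than that of
# the OLS estimate, which is the constrained-minimization view above.
print(np.linalg.norm(b_ridge) < np.linalg.norm(b_ols))
```

Because the two columns are nearly collinear, the OLS coefficients are wildly unstable while their sum is well determined; the ridge penalty trades a little bias for a much shorter, more stable estimate.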
Such models are ideally suited to ridge regression techniques since the design matrix is almost always singular, and parameter estimates are readily obtained since the design matrix has a well-defined structure.

Estimates of the design model's parameters are derived by taking the limit of the ridge regression estimates as the ridge parameter approaches zero. The limiting ridge regression estimates of estimable functions are shown to be the ordinary least squares estimates and are therefore unbiased. The limiting estimates remain biased for nonestimable functions; however, an estimate of that bias is derived.

Ridge regression estimates and test procedures are shown to exist when least squares estimates either do not exist or have unacceptably high variances. Design of experiment models are addressed as a specific example to demonstrate the viability of this alternative to least squares methodology, and statistical inference procedures are developed using a constraint on the parameter space.
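The limiting construction can also be sketched for a singular design-of-experiment model. The example below is hypothetical (a one-way classification with made-up group effects) and uses a small positive ridge parameter to stand in for the limit as it approaches zero; the minimum-norm behavior shown is a standard property of this limit, not a result quoted from the dissertation.

```python
import numpy as np

# Hypothetical one-way classification y_ij = mu + tau_i + e_ij.  The design
# matrix has an intercept column plus one dummy per group; the dummies sum
# to the intercept column, so X'X is singular and OLS is not unique.
g, r = 3, 4                                  # 3 groups, 4 replicates each
X = np.zeros((g * r, 1 + g))
X[:, 0] = 1.0
for i in range(g):
    X[i * r:(i + 1) * r, 1 + i] = 1.0

rng = np.random.default_rng(1)
y = np.repeat([2.0, 5.0, 8.0], r) + 0.1 * rng.normal(size=g * r)

def ridge(X, y, k):
    """Ridge estimator (X'X + kI)^(-1) X'y; defined even when X'X is singular."""
    return np.linalg.solve(X.T @ X + k * np.eye(X.shape[1]), X.T @ y)

# As k -> 0 the ridge estimate approaches the minimum-norm (pseudoinverse)
# solution, which exists although X'X has no ordinary inverse.
b_lim = ridge(X, y, 1e-8)
b_min_norm = np.linalg.pinv(X) @ y
print(np.allclose(b_lim, b_min_norm, atol=1e-5))

# Estimable functions -- here the cell means mu + tau_i, i.e. the fitted
# values -- are recovered uniquely, matching least squares.
print(np.allclose(X @ b_lim, X @ b_min_norm, atol=1e-5))
```

Individual parameters such as mu alone are nonestimable here and depend on the limiting construction; only functions in the row space of X, such as the cell means, are estimated without bias, mirroring the result stated above.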
Keywords/Search Tags: Ridge regression, Least squares, Parameter, Estimation, Constrained, Procedures