
The FWLS-SVM Regression Method Based on KPLS Feature Extraction

Posted on: 2009-05-28    Degree: Master    Type: Thesis
Country: China    Candidate: H J Lian    Full Text: PDF
GTID: 2120360308478083    Subject: Probability theory and mathematical statistics
Abstract/Summary:
In building mathematical models, the selection or extraction of features is crucial. The traditional PLS method transforms the original observation vector into a group of new features through a linear transformation; that is, every new feature is a linear combination of the original features. However, the interpretability of such features is poor, and the high dimension of the feature space reduces accuracy. By introducing a kernel function, nonlinear PLS is developed into KPLS, which first maps the original inputs into a high-dimensional feature space and then computes the components there. Extracting sample feature information with KPLS removes the correlation and noise in the data and reduces the dimension of the space.

After feature extraction, classical statistical theory, which rests on the law of large numbers, assumes that samples follow a specific distribution function. In practice, however, the number of samples is limited and their distribution is often unknown. Compared with classical statistics, modern statistical learning theory provides a unified framework for learning from finite samples. The learning method developed on this basis, the support vector machine (SVM), offers high fitting accuracy, few parameters to choose, strong generalization ability, and a globally optimal solution.

However, the traditional SVM method is computationally demanding, because it is ultimately transformed into a constrained convex quadratic programming problem; in addition, the kernel function must be positive definite. For this reason, the LS-SVM method proposed by Suykens et al. is adopted here, in which the SVM learning problem is transformed into a set of linear equations and the computational complexity is greatly reduced.

Both the traditional SVM and the LS-SVM method require the observed or experimental data to be crisp, whereas descriptions of real phenomena are often vague. On this basis, fuzzy data are processed and the LS-SVM method is extended into a method based on fuzzy data, so that the SVM method can be applied more widely in practice.

Considering the differences among samples, some researchers have proposed assigning different penalty coefficients to different samples. However, the existing methods for weighting samples are based on measuring distances and pay little attention to the degree of correlation among samples. Therefore, a fuzzy membership definition that combines several factors is proposed, in which not only the relation between a sample and the class center but also the relations among samples are taken into account. In this way, support vectors and noise samples can be distinguished effectively.
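The abstract gives no formulas or code; the following Python sketch is only an illustration of the pipeline it describes (KPLS feature extraction, a sample-dependent fuzzy weight, and a weighted LS-SVM regression solved as a linear system). It assumes single-output regression, an RBF kernel, and a simple distance-to-centroid membership rule in place of the thesis's combined membership measure; all function names, parameter values, and the toy data are invented for the example.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row-sample matrices A and B."""
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def kpls_features(K, y, n_components):
    """Extract KPLS score vectors from a kernel matrix (single output)."""
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one   # centre K in feature space
    Y = y.reshape(-1, 1).astype(float)
    scores = []
    for _ in range(n_components):
        t = Kc @ Y                               # score direction for this component
        t /= np.linalg.norm(t)
        scores.append(t.ravel())
        P = np.eye(n) - t @ t.T                  # deflate K and y before the next component
        Kc = P @ Kc @ P
        Y = P @ Y
    return np.column_stack(scores)

def fuzzy_weights(T, delta=1e-3):
    """Illustrative membership: samples far from the feature-space centroid
    get smaller weights, so suspected noise contributes less to the fit."""
    d = np.linalg.norm(T - T.mean(axis=0), axis=1)
    return 1.0 - d / (d.max() + delta)

def fwls_svm_fit(T, y, s, gamma=10.0, sigma=0.1):
    """Weighted LS-SVM regression: solve the linear KKT system instead of a QP."""
    n = len(y)
    Omega = rbf_kernel(T, T, sigma)              # kernel on the extracted features
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = Omega + np.diag(1.0 / (gamma * s))   # fuzzy weight scales each sample's penalty
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    return b, alpha, Omega

# toy run: noisy sinc data
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 80).reshape(-1, 1)
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(80)

K = rbf_kernel(X, X, sigma=1.0)
T = kpls_features(K, y, n_components=5)
s = fuzzy_weights(T)
b, alpha, Omega = fwls_svm_fit(T, y, s)
y_fit = Omega @ alpha + b
print("training RMSE:", np.sqrt(np.mean((y_fit - y) ** 2)))
```

The point the sketch illustrates is that the weighted LS-SVM replaces the quadratic program of the standard SVM with a single (n+1)-by-(n+1) linear system, and that each sample's fuzzy weight s_i enters only through the regularization term 1/(gamma * s_i), so low-membership (suspected noise) samples are allowed larger errors.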
Keywords/Search Tags: support vector machine, support vector regression, partial least squares, kernel, fuzzy membership function