
An Implementation Method for the Minimal VC Dimension Classifier

Posted on: 2008-04-03
Degree: Master
Type: Thesis
Country: China
Candidate: H R Wei
Full Text: PDF
GTID: 2120360215453823
Subject: Computational Mathematics
Abstract/Summary:
Machine learning has traditionally been based on classical statistics, which provides conclusions only for the situation where the sample size tends to infinity, so such methods may fail in practical cases with limited samples. Statistical Learning Theory (SLT), developed by Vapnik et al., is a statistics of small samples: it concerns mainly the statistical principles and learning methods that apply when samples are limited. SLT provides a framework for the general learning problem and a powerful learning method, the Support Vector Machine (SVM), which solves small-sample learning problems well. With a more complete theoretical basis and better learning performance, SVM has become a new research focus after neural networks.

Although the performance of SVM has been verified on many practical problems, the selection of the kernel parameter is still an open question in SVM research. Usually the parameters of the kernel function of a support vector machine have to be adjusted in advance. The constrained nonlinear program of the minimal VC dimension classifier studied in this thesis includes the RBF kernel parameter among its variables, so that parameter can be determined adaptively. The purpose of this thesis is to find a fast solution method for the minimal VC dimension classifier.

The thesis first introduces SLT, covering the generalization performance of learning, the structural risk minimization principle, and SVM. Several popular training algorithms are summarized and analyzed, especially Osuna's decomposition algorithm, which is used in this thesis. Three optimization algorithms, namely the gradient method, the penalty function method, and the complex method, are introduced briefly.

Based on the fundamental principle of the minimal VC dimension classifier, the thesis presents a basic algorithm: the gradient and penalty function methods are used to obtain an initial feasible point of the constrained nonlinear program, and, starting from that point, the program is solved by the complex method (a sketch of this two-stage scheme appears below). Experiments show that the resulting solution can be used for data classification, but as the number of samples increases the method becomes slower and slower, until it can no longer handle them.

Finally, to overcome this problem, an improved method is proposed that uses the idea of Osuna's decomposition algorithm: a large-scale constrained nonlinear program is solved through a series of small-scale constrained nonlinear programs (a toy illustration follows the first sketch below). Experiments demonstrate that the improved method runs faster and achieves higher precision than the well-known SVMlight.
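A minimal sketch of the basic two-stage algorithm follows, on a stand-in problem (the abstract does not give the thesis's formulas): stage one runs gradient descent on a quadratic penalty to find a strictly feasible point, and stage two minimizes the objective with Box's complex method. The objective f, the constraint list cons, the bounds, the margin parameter, and all function names are illustrative assumptions; in the thesis the variables would also include the RBF kernel parameter, and the objective would be a VC dimension bound.

    # Sketch only: a hypothetical two-stage solver, not the thesis's code.
    import numpy as np

    def num_grad(fun, x, h=1e-6):
        # Central-difference gradient of a scalar function.
        g = np.zeros_like(x)
        for i in range(x.size):
            e = np.zeros_like(x); e[i] = h
            g[i] = (fun(x + e) - fun(x - e)) / (2 * h)
        return g

    def feasible_point(cons, x0, margin=1e-3, steps=2000, lr=1e-2):
        # Stage 1: gradient descent on the quadratic penalty
        # sum_j max(0, g_j(x) + margin)^2; the margin pushes the point
        # strictly inside the feasible region {x : g_j(x) <= 0}.
        pen = lambda x: sum(max(0.0, g(x) + margin) ** 2 for g in cons)
        x = x0.astype(float)
        for _ in range(steps):
            if pen(x) < 1e-12:
                break
            x -= lr * num_grad(pen, x)
        return x

    def complex_method(f, cons, lo, hi, x_feas, alpha=1.3, iters=500, seed=0):
        # Stage 2: Box's complex method. Maintain 2n feasible points and
        # repeatedly reflect the worst one through the centroid of the rest.
        rng = np.random.default_rng(seed)
        n = x_feas.size
        pts = [x_feas.copy()]
        while len(pts) < 2 * n:
            x = lo + rng.random(n) * (hi - lo)
            while any(g(x) > 0 for g in cons):    # pull infeasible samples
                x = 0.5 * (x + x_feas)            # toward the feasible point
            pts.append(x)
        pts = np.array(pts)
        for _ in range(iters):
            vals = np.array([f(p) for p in pts])
            w = int(np.argmax(vals))                           # worst vertex
            c = (pts.sum(axis=0) - pts[w]) / (len(pts) - 1)    # centroid of rest
            x_new = np.clip(c + alpha * (c - pts[w]), lo, hi)  # over-reflection
            while any(g(x_new) > 0 for g in cons) or f(x_new) >= vals[w]:
                x_new = 0.5 * (x_new + c)         # retreat toward the centroid
                if np.linalg.norm(x_new - c) < 1e-10:
                    break
            pts[w] = x_new
        return pts[int(np.argmin([f(p) for p in pts]))]

    # Toy usage: minimize (x0-1)^2 + (x1-2)^2 subject to x0 + x1 <= 2.
    f = lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2
    cons = [lambda x: x[0] + x[1] - 2.0]
    lo, hi = np.full(2, -5.0), np.full(2, 5.0)
    x0 = feasible_point(cons, np.array([3.0, 3.0]))
    x_star = complex_method(f, cons, lo, hi, x0)   # approx. (0.5, 1.5)

Every trial point in the retreat loop re-evaluates all constraints, which in the minimal VC dimension program would involve every training sample; this is consistent with the slowdown the abstract reports as the sample size grows.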
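The improved method applies Osuna's decomposition idea to the thesis's own constrained nonlinear program, whose subproblems are not spelled out in the abstract. As a self-contained stand-in, the sketch below runs the same loop (freeze most variables, optimize a small working set, repeat until no optimality violations remain) on the standard SVM dual with an RBF kernel; the working set has size two so the subproblem is solvable in closed form (the SMO special case of decomposition). All names, the kernel width gamma, and the toy data are assumptions for illustration.

    # Sketch only: working-set-of-two decomposition on the standard SVM
    # dual, as a stand-in for the thesis's decomposition scheme.
    import numpy as np

    def rbf_gram(X, gamma):
        # Gram matrix of K(x, z) = exp(-gamma * ||x - z||^2).
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def decomposition_svm(X, y, C=1.0, gamma=0.5, tol=1e-3, max_passes=10, seed=0):
        # Outer loop: scan for multipliers violating the KKT conditions.
        # Inner step: solve the two-variable subproblem (i, j) analytically
        # while all other multipliers stay fixed.
        rng = np.random.default_rng(seed)
        n = X.shape[0]
        K = rbf_gram(X, gamma)
        a, b, passes = np.zeros(n), 0.0, 0
        while passes < max_passes:
            changed = 0
            for i in range(n):
                Ei = (a * y) @ K[:, i] + b - y[i]            # KKT residual at i
                if (y[i] * Ei < -tol and a[i] < C) or (y[i] * Ei > tol and a[i] > 0):
                    j = int(rng.integers(n - 1)); j += j >= i  # random j != i
                    Ej = (a * y) @ K[:, j] + b - y[j]
                    ai, aj = a[i], a[j]
                    if y[i] != y[j]:    # box bounds from the equality constraint
                        L, H = max(0.0, aj - ai), min(C, C + aj - ai)
                    else:
                        L, H = max(0.0, ai + aj - C), min(C, ai + aj)
                    eta = 2 * K[i, j] - K[i, i] - K[j, j]    # subproblem curvature
                    if L == H or eta >= 0:
                        continue
                    a[j] = float(np.clip(aj - y[j] * (Ei - Ej) / eta, L, H))
                    if abs(a[j] - aj) < 1e-6:
                        continue
                    a[i] = ai + y[i] * y[j] * (aj - a[j])    # keep sum(a*y) fixed
                    b1 = b - Ei - y[i] * (a[i] - ai) * K[i, i] - y[j] * (a[j] - aj) * K[i, j]
                    b2 = b - Ej - y[i] * (a[i] - ai) * K[i, j] - y[j] * (a[j] - aj) * K[j, j]
                    b = b1 if 0 < a[i] < C else (b2 if 0 < a[j] < C else 0.5 * (b1 + b2))
                    changed += 1
            passes = passes + 1 if changed == 0 else 0
        return a, b

    # Hypothetical usage: two Gaussian blobs labeled +1 / -1.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0.0, 1.0, (20, 2)), rng.normal(3.0, 1.0, (20, 2))])
    y = np.hstack([np.ones(20), -np.ones(20)])
    alpha, bias = decomposition_svm(X, y)

Osuna's original algorithm and SVMlight choose larger working sets with gradient-based heuristics; the pair-at-a-time variant above trades per-step progress for a closed-form subproblem, but the overall scheme, a series of small subproblems in place of one large program, is the same.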
Keywords/Search Tags: machine learning, support vector machines, kernel parameter, decomposition method, complex optimization method