
Support Vector Machines Classifier Based On Margin Vectors

Posted on: 2007-08-03
Degree: Master
Type: Thesis
Country: China
Candidate: B Kong
Full Text: PDF
GTID: 2120360242960850
Subject: Probability theory and mathematical statistics
Abstract/Summary:
Statistical Learning Theory (SLT) is a relatively new framework for machine learning and is well suited to learning from finite samples. The Support Vector Machine (SVM) is built on statistical learning theory and provides a principled approach to finite-sample learning. Compared with neural networks, genetic algorithms, and other artificial intelligence methods, it offers better generalization ability and nonlinear processing ability; in particular, when handling high-dimensional data it effectively mitigates the curse of dimensionality. SVMs are now widely applied to pattern recognition, regression estimation, and related tasks.

The support vector machine is a relatively recent machine learning technique, and many theoretical and practical questions in this field remain open. To study and address some of these problems, this thesis focuses on support vector machine algorithms.

Firstly, based on the relations among support vectors, the center distance ratio, margin vectors, and incremental learning, a new method called the incremental support vector machine based on center distance ratio is presented. Some support vectors are first extracted by the center distance ratio method; the remaining ones are then found by incremental learning, so that all support vectors are obtained. Compared with CDRM+SVM, the incremental support vector machine based on center distance ratio makes effective use of the center distance ratio and is well suited to incremental learning. The new method therefore greatly improves the training speed of the SVM while leaving its classification ability unaffected.

Secondly, a sparse least squares support vector machine classifier (SLS-SVM) is presented to impose sparseness on the LS-SVM. This is done by pre-extracting margin vectors with the center distance ratio method and using them as the training examples. The method not only imposes sparseness on the LS-SVM but also speeds up training and classification, while the classification ability of the LS-SVM is unaffected.

Finally, exploiting the sparsity of support vectors in the SVM and the fact that support vectors are distributed around the separating hyperplane, this thesis presents a new algorithm, Pre-extracting Relatively Closer Margin Vectors for the Chunking Algorithm, which improves the correct rate of FFMVM and reduces the number of iterations of the chunking algorithm. The new algorithm therefore greatly improves the speed of the SVM while its classification ability is unaffected.
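The abstract does not give the exact center distance ratio criterion, so the following is only a minimal illustrative sketch, assuming a simple ratio test between a sample's distance to its own class center and its distance to the opposite class center: samples for which the two distances are comparable lie near the region between the classes and are kept as candidate margin vectors, and a standard SVM is then trained on that reduced set. The function name, the threshold, and the use of scikit-learn's SVC are assumptions for illustration, not the thesis's implementation.

# Hypothetical sketch of center-distance-ratio pre-extraction of margin
# vectors, followed by SVM training on the reduced set only.
import numpy as np
from sklearn.svm import SVC

def extract_margin_vectors(X, y, ratio_threshold=0.8):
    # Keep samples whose distance to their own class center is at least
    # ratio_threshold times their distance to the nearest other class
    # center, i.e. points lying relatively close to the class boundary.
    # (The exact criterion and threshold are assumptions.)
    centers = {c: X[y == c].mean(axis=0) for c in np.unique(y)}
    keep = []
    for i, (x, c) in enumerate(zip(X, y)):
        d_own = np.linalg.norm(x - centers[c])
        d_other = min(np.linalg.norm(x - centers[k]) for k in centers if k != c)
        if d_own >= ratio_threshold * d_other:
            keep.append(i)
    return np.asarray(keep)

# Usage on synthetic two-class data: train the SVM only on the candidates.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (200, 2)), rng.normal(1.0, 1.0, (200, 2))])
y = np.r_[np.zeros(200), np.ones(200)]
idx = extract_margin_vectors(X, y)
clf = SVC(kernel="rbf").fit(X[idx], y[idx])  # smaller training set, similar boundary

Because only the near-boundary candidates enter training, the quadratic-programming (or, for LS-SVM, linear-system) problem is much smaller, which is the source of the speed-up claimed in the abstract; the hope is that the discarded interior points would not have become support vectors anyway.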
Keywords/Search Tags:statistical learning theory, support vector machine, least squares support vector machine, margin vector, support vector, center distance ratio, relatively closer margin vector