
The Researches On Support Vector Machine Classification And Regression Methods

Posted on: 2005-03-28
Degree: Doctor
Type: Dissertation
Country: China
Candidate: D S Sun
Full Text: PDF
GTID: 1100360125463942
Subject: Probability theory and mathematical statistics

Abstract/Summary:
Statistical learning theory, developed for learning from small samples, has recently received considerable attention; it is an important complement to, and development of, traditional statistics. Support Vector Machines (SVMs), built on the foundations of statistical learning theory, show excellent learning performance and have been successfully extended from basic classification tasks to regression, density estimation, novelty detection, and other problems. Unlike traditional methods, which minimize the empirical training error, SVMs apply the structural risk minimization principle, which tends to yield good generalization performance. SVMs also offer advantages over neural networks: they have only a small number of tunable parameters, and training amounts to solving a convex quadratic programming problem, so the solutions are global and usually unique.

This thesis consists of seven chapters and studies pattern classification, regression, and their applications.

Chapter 1 surveys the basic methods of classification and regression, including Bayesian methods, neural networks, and support vector machines.

Chapter 2 introduces the one-class support vector machine, binary classification, and regression algorithms, and then proposes a new multi-class classification algorithm based on the one-class classification idea, which greatly reduces computational complexity and unifies the one-class, two-class, and multi-class classification methods. A decomposition algorithm for multi-class classification is also proposed, providing a feasible approach to classification on large-scale data.

Chapter 3 first summarizes support vector machine algorithms based on linear programming.
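The abstract does not give the details of Chapter 2's multi-class algorithm; the sketch below only illustrates the underlying one-class idea it builds on — train one one-class SVM per class and assign a new point to the class whose model scores it highest. It uses scikit-learn's `OneClassSVM`; the toy Gaussian data and all parameter values are illustrative assumptions, not from the thesis.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Two toy Gaussian classes (illustrative data, not from the thesis).
X0 = rng.normal(loc=0.0, scale=0.5, size=(50, 2))
X1 = rng.normal(loc=3.0, scale=0.5, size=(50, 2))

# Fit one one-class model per class.
models = [OneClassSVM(kernel="rbf", gamma=0.5, nu=0.1).fit(X)
          for X in (X0, X1)]

def classify(x):
    # Assign x to the class whose one-class model gives the highest score.
    scores = [m.decision_function(x.reshape(1, -1))[0] for m in models]
    return int(np.argmax(scores))

print(classify(np.array([0.1, -0.2])))  # a point near the class-0 cluster
print(classify(np.array([3.2, 2.9])))   # a point near the class-1 cluster
```

One appeal of this scheme, consistent with the abstract's claim, is that each class model is trained independently on its own data, so adding a class does not require retraining the others.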
It then proposes three new regression models based on linear programming, which reduce model complexity while keeping good predictive performance. Finally, a new multi-class classification algorithm and its decomposed form are proposed based on linear programming; these achieve good recognition precision and greatly shorten training time. A new face recognition method is also proposed, based on kernel Principal Component Analysis (KPCA) and the multi-class classification algorithm; experiments on the ORL face image database show that the proposed method is feasible and effective.

Chapter 4 consists of three parts. First, a new Support Vector Regression (SVR) algorithm is proposed by introducing a single parameter, and its equivalence to standard support vector regression is proved. Second, a weighted support vector regression method is proposed to handle heteroscedasticity (non-constant error variance) in regression models. Third, a notion of prediction credibility is introduced for support vector regression, giving each predicted value a credibility measure, and the relationship between prediction credibility and noise is discussed. Finally, the connection between regression and classification is studied, providing a theoretical foundation for applying fast classification algorithms to regression models.

Chapter 5 discusses outlier detection. A method for outlier detection in regression is proposed that exploits properties of the structural risk function and the KKT conditions in support vector regression.
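Among the methods above, Chapter 3's KPCA-plus-multi-class-SVM face recognition pipeline can be sketched in outline. The thesis's LP-based classifier is not specified in the abstract, so scikit-learn's standard `SVC` stands in for it here, and synthetic vectors stand in for ORL face images; every parameter value is an illustrative assumption.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(1)
# Synthetic 64-dimensional "image" vectors for three classes
# (stand-ins for flattened ORL face images).
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(30, 64)) for c in (0, 2, 4)])
y = np.repeat([0, 1, 2], 30)

# KPCA extracts nonlinear features; a multi-class SVM classifies them.
# (SVC stands in for the thesis's LP-based multi-class algorithm.)
clf = make_pipeline(
    KernelPCA(n_components=10, kernel="rbf", gamma=1e-3),
    SVC(kernel="linear", C=1.0),
)
clf.fit(X, y)
print(clf.score(X, y))
```

The pipeline mirrors the two-stage design the abstract describes: nonlinear dimensionality reduction first, then a multi-class decision on the reduced features.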
In addition, a new method for outlier detection in time series is proposed by combining phase-space theory with the one-class classification method.

Chapter 6 discusses the relationship between neural networks and support vector machines, then applies different versions of support vector regression and RBF networks to the prediction of noisy chaotic time series and compares their predictive ability.

Chapter 7 summarizes the whole thesis and discusses directions for future work.
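Chapter 5's phase-space idea for time-series outlier detection can be sketched as follows: build a delay (Takens-style) embedding of the series, fit a one-class model to the embedded points, and flag the points the model rejects. The thesis's exact construction is not given in the abstract, so the embedding dimension, delay, and SVM parameters below are illustrative assumptions, and the series is synthetic.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(2)
# A smooth sine series with a little noise and one injected spike.
t = np.arange(400)
series = np.sin(0.1 * t) + 0.01 * rng.normal(size=t.size)
series[200] += 3.0  # the outlier

def delay_embed(x, dim=3, tau=5):
    # Phase-space (delay) embedding: row i is (x[i], x[i+tau], x[i+2*tau]).
    n = x.size - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

E = delay_embed(series)
oc = OneClassSVM(kernel="rbf", gamma=2.0, nu=0.02).fit(E)
flags = oc.predict(E)                 # -1 marks rejected embedded points
outlier_rows = np.where(flags == -1)[0]
print(outlier_rows)
```

The spike at index 200 appears in the embedded rows 190, 195, and 200, which lie far from the attractor traced out by the normal dynamics, so the one-class model is expected to reject at least one of them.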
Keywords/Search Tags: support vector machine, structural risk minimization principle, kernel function, classification, regression