
Support Vector Regression And Its Application

Posted on: 2006-09-04    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Y J Tian    Full Text: PDF
GTID: 1116360152992505    Subject: Management Science and Engineering
Abstract/Summary:
Support Vector Machines (SVMs) are data mining methods developed in recent years on the foundations of Statistical Learning Theory (SLT), and they have proven more powerful and robust than many existing methods. SVMs are developing rapidly in both theory and applications. Support vector machines for the classification problem are called support vector classification (SVC), while those for the regression problem are called support vector regression (SVR). This dissertation studies SVR from several aspects, covering both its theoretical foundations and its applications.
1. The success of SVMs depends heavily on model selection. One of the most popular approaches is to select the kernel and the parameters by minimizing a bound on the Leave-One-Out (LOO) error. Several well-known bounds have been proposed for SVC; here we derive three LOO bounds for three SVR algorithms and propose a new SVR algorithm (LOO-SVR).
2. A generalized SVR model is proposed whose optimization problem contains a selectable function. Different choices of this function recover several existing SVR algorithms, and the kernel function in this model need not be positive definite. An unconstrained SVR is also constructed by transforming the primal convex quadratic program into an unconstrained problem, so that efficient unconstrained optimization methods can be applied to SVR.
3. For the standard ε-SVR, two derivations are given. The first transforms the regression problem into a classification problem and applies SVC, from which the primal problem of standard ε-SVR is deduced. The second introduces the concept of margin for the regression problem, from which the primal problem of ε-SVR is derived by applying the idea of margin maximization.
4. Existing research on SVR rests on optimization theory in finite-dimensional spaces. This dissertation proves the relation between the solutions of the primal and dual problems in Hilbert space, which may be infinite-dimensional, thereby supplying the optimization-theoretic foundation for SVR.
5. For the problem of soil erosion prediction, we construct a support vector regression prediction model and choose the optimal parameters by minimizing the LOO error bound proposed above. Comparison with a traditional model shows the superiority of the new model. This application points to a new and meaningful direction for support vector regression.
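As an illustration of the model-selection idea in point 1, the following sketch (in Python with scikit-learn, neither of which is taken from the dissertation) selects ε-SVR hyperparameters by computing the exact leave-one-out error on synthetic data. Note that the dissertation minimizes a theoretical bound on the LOO error rather than computing it exactly, and applies the procedure to soil erosion records; the data, parameter grid, and library here are assumptions for illustration only.

    # Hedged sketch: choose epsilon-SVR hyperparameters by exact leave-one-out error.
    # The dissertation minimizes a theoretical *bound* on the LOO error; exact LOO
    # cross-validation stands in for that bound here, and the data are synthetic.
    import numpy as np
    from sklearn.svm import SVR
    from sklearn.model_selection import LeaveOneOut, cross_val_score

    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(60, 1))                 # illustrative inputs
    y = np.sin(X).ravel() + 0.1 * rng.normal(size=60)    # noisy regression target

    best_score, best_params = -np.inf, None
    for C in (1.0, 10.0, 100.0):
        for gamma in (0.1, 1.0):
            for eps in (0.01, 0.1):
                model = SVR(kernel="rbf", C=C, gamma=gamma, epsilon=eps)
                # mean negative squared error over all leave-one-out splits
                score = cross_val_score(
                    model, X, y, cv=LeaveOneOut(),
                    scoring="neg_mean_squared_error",
                ).mean()
                if score > best_score:
                    best_score, best_params = score, (C, gamma, eps)

    C, gamma, eps = best_params
    final_model = SVR(kernel="rbf", C=C, gamma=gamma, epsilon=eps).fit(X, y)
    print("selected (C, gamma, epsilon):", best_params)
    print("leave-one-out mean squared error:", -best_score)

In practice a closed-form LOO bound, as derived in the dissertation, avoids retraining the model once per sample, which is the main cost of the exact procedure above.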
Keywords/Search Tags: Support Vector Machines, Classification Problem, Regression Problem, Model Selection, Bound of LOO Error