
Iterative Training Algorithms Of Support Vector Machines And Applications

Posted on: 2015-10-24    Degree: Doctor    Type: Dissertation
Country: China    Candidate: B Li    Full Text: PDF
GTID: 1488304226989619    Subject: Control Science and Engineering
Abstract/Summary:
Support vector machine (SVM) is a novel approach to machine learning based on Statistical Learning Theory. Compared with traditional machine learning methods, it generalizes well, converges to a global optimum, and avoids the curse of dimensionality. Owing to this excellent performance it has been applied successfully in many fields and has broad prospects for further development.

The study of training algorithms is an important part of SVM research. Efficient training algorithms with lower time and space complexity are of great significance for both theory and applications. In this dissertation, from the optimization-method point of view, several efficient iterative training algorithms are proposed by investigating optimization methods for the training problem arising from SVM. Finally, SVM is applied to the efficiency monitoring, fault diagnosis, and fault prediction of wind turbines. The main work consists of the following parts:

1. For the least squares SVM (LS-SVM), the training problem is transformed into an unconstrained convex quadratic programming problem and reduced formulations are proposed; the conjugate gradient method is then used to obtain numerical solutions. With this transformation, the conjugate gradient method needs to be applied only once instead of twice, as in the previously proposed iterative conjugate gradient algorithm (a generic sketch of conjugate-gradient LS-SVM training is given after this abstract).

2. An iterative single-data approach with a stepsize-accelerating implementation is proposed for the reduced formulations of the unconstrained LS-SVM. Combining a variable selection rule with coordinate descent makes it converge faster than the successive over-relaxation method; updating only one variable per iteration makes it simpler and more flexible than the sequential minimal optimization method; and the stepsize-accelerating implementation makes it more efficient (a plain coordinate-descent sketch also follows this abstract).

3. A multiple-constraints-activated dual active set method is proposed. The conventional dual active set method is generalized so that several violated constraints can be added at each step without introducing additional difficulties, which improves convergence. When the method is applied to SVM, inverting two large-scale matrices is replaced by solving a smaller linear system whose size decreases more rapidly, so the computational cost is also reduced.

4. A decomposition gradient projection (GP) method is proposed by introducing a decomposition approach into the GP method. The proposed method reduces the scale of the problem solved by the GP method and saves the cost of seeking the exact optimizer of the sub-problem required by the conventional decomposition method. Moreover, a spectral steplength and a nonmonotone line search strategy are incorporated into the framework, and a binary search algorithm with linear time complexity is used to compute projections onto the specific feasible region arising in SVM; together these make the proposed method more efficient.

5. Wind power has the most promising commercial prospects among renewable energy technologies, and its development is significant for China's energy strategy. As wind power develops, the maintenance burden of wind turbines grows. SVM is applied to the efficiency monitoring, fault diagnosis, and fault prediction of wind turbines, and satisfactory results are achieved.
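The following is a minimal sketch of the idea behind item 1: training an LS-SVM classifier by solving its linear system with the conjugate gradient method. It uses the classical two-solve elimination of the bias term from the standard LS-SVM formulation, not the dissertation's reduced single-solve formulation, which is not reproduced here; the RBF kernel, the function names, and the parameters gamma and sigma are illustrative assumptions.

    import numpy as np
    from scipy.sparse.linalg import cg

    def rbf_gram(X1, X2, sigma=1.0):
        # Gram matrix of a Gaussian (RBF) kernel; sigma is an assumed hyperparameter
        d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
        return np.exp(-d2 / (2 * sigma**2))

    def train_lssvm_cg(X, y, gamma=10.0, sigma=1.0):
        # Standard LS-SVM classifier: solve the KKT system by eliminating the bias b,
        # which leaves two SPD linear systems solved here with conjugate gradient.
        # (The dissertation's reduced formulation needs only one CG solve; this
        # classical two-solve variant is shown purely for illustration.)
        n = len(y)
        H = (y[:, None] * y[None, :]) * rbf_gram(X, X, sigma) + np.eye(n) / gamma
        eta, _ = cg(H, y.astype(float))        # H eta = y
        nu, _ = cg(H, np.ones(n))              # H nu  = 1
        b = (y @ nu) / (y @ eta)               # enforces y^T alpha = 0
        alpha = nu - b * eta
        return alpha, b

    def lssvm_predict(X_train, y_train, alpha, b, X_test, sigma=1.0):
        # Decision function: sign( sum_i alpha_i y_i K(x_i, x) + b )
        K = rbf_gram(X_test, X_train, sigma)
        return np.sign(K @ (alpha * y_train) + b)

Since the Gram matrix is dense, the matrix-vector products inside the CG iterations dominate the cost; the point of the dissertation's reduced formulation is to halve the number of such solves.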
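Below is an equally generic sketch of the single-variable (coordinate descent) idea in item 2, applied to an unconstrained strictly convex quadratic F(z) = 0.5 z^T A z - c^T z such as the reduced LS-SVM problem. The dissertation's variable selection rule and stepsize-accelerating step are not reproduced; the cyclic sweep order and stopping rule here are assumptions.

    import numpy as np

    def coordinate_descent_qp(A, c, n_sweeps=100, tol=1e-8):
        # Minimize F(z) = 0.5 * z^T A z - c^T z with A symmetric positive definite
        # by exact single-coordinate minimization (one variable updated per step).
        n = len(c)
        z = np.zeros(n)
        for _ in range(n_sweeps):
            grad_inf = 0.0
            for i in range(n):
                g_i = A[i] @ z - c[i]      # partial derivative of F w.r.t. z_i
                z[i] -= g_i / A[i, i]      # exact minimizer along coordinate i
                grad_inf = max(grad_inf, abs(g_i))
            if grad_inf < tol:             # stop once the gradient is small enough
                break
        return z

Each inner update is the exact one-dimensional minimizer, so no stepsize tuning is needed for this plain version; the dissertation accelerates convergence further with its selection rule and stepsize scheme.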
Keywords/Search Tags: statistical learning theory, support vector machine, training algorithm, optimization method, iterative algorithm