
Improved Least Squares Support Vector Machine And Its Application In Chemistry And Chemical Engineering

Posted on: 2007-05-28
Degree: Doctor
Type: Dissertation
Country: China
Candidate: S H Tao
Full Text: PDF
GTID: 1101360212489196
Subject: Chemical Engineering and Technology
Abstract/Summary:
The least squares support vector machine (LSSVM) is a kernel learning machine that follows the structural risk minimization (SRM) principle during training, and it has recently been widely used in chemistry and chemical process modeling. In this dissertation, several new algorithms were proposed to address dimension reduction, selection of optimal hyperparameters, and sparseness in LSSVM modeling. These algorithms were applied to complex chemical pattern classification, process modeling with small-sample data, and vapor-liquid equilibrium problems; the results show that the new algorithms overcome some deficiencies of standard LSSVM. The main work is as follows:

1. The history, progress, and applications of statistical learning theory and the support vector machine (SVM) were reviewed first. SVM algorithms were then explained, and some deficiencies of LSSVM were pointed out.

2. Since sample vectors must be mapped from the original space into a high-dimensional reproducing kernel Hilbert space (RKHS) when SVM is used to solve nonlinear pattern classification problems, the linear classification correlative analysis (CCA) algorithm was extended to the RKHS via the kernel trick; the classification correlative components (CCC) extracted in the RKHS are nonlinear combinations of the sample vector elements in the original space. Collinearity and redundant information in the sample can be eliminated by CCA in the RKHS, i.e. nonlinear CCA (NLCCA), so the sample distribution in the RKHS is improved and the samples become easier to classify. Finally, the NLCCA algorithm was combined with a linear support vector classifier, called NLCCA-LSVC here, and NLCCA-LSVC was applied to two complex chemical pattern classification problems.

3. The G-LSSVM algorithm was proposed, based on the fast leave-one-out (LOO) method. With the LOO sum of squared errors (SSE) of prediction as the objective to be minimized, the gradient of the SSE with respect to the hyperparameters was derived, and gradient descent was then used to find optimal hyperparameters for LSSVM modeling with small-sample data. G-LSSVM was applied to model a citric acid fermentation process.

4. Black-box models such as ANN and LSSVM model a process using only experimental data, without making use of any prior knowledge, so the model's predictions may sometimes be inconsistent with the process mechanism. To solve this problem when calculating the vapor composition of a binary vapor-liquid equilibrium at constant temperature (or pressure), the Gibbs-Duhem equation was integrated with ANN and LSSVM to form hybrid models, GD-MFNN and GD-LSSVM, whose outputs are constrained by the Gibbs-Duhem equation. GD-MFNN and GD-LSSVM were applied to several thermodynamic examples.

5. Since the empirical risk is calculated with a quadratic loss function, LSSVM loses the sparseness of SVM, which decreases computational efficiency during classification. To sparsify LSSVM, statistical methods were used to select the important training examples as support vectors (SVs), and the information of the non-SV examples was transferred to the SVs; new sparse algorithms were thus proposed and applied to several real-life pattern classification problems.

6. Based on the singular value decomposition of the kernel matrix, SVD-LSSVM was proposed, which can save time in hyperparameter selection via cross-validation. SVD-LSSVM balances the empirical risk and model complexity through the singular value contribution, so it implements the SRM principle in a new way. Several UCI benchmark datasets and the olive classification problem were used to test SVD-LSSVM.
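To make the central method concrete: in standard LSSVM regression, training reduces to solving one linear system in the dual coefficients alpha and the bias b, with the kernel matrix regularized by gamma. The sketch below is illustrative only (it is not the dissertation's code); the RBF kernel, the width sigma, and gamma values are assumed choices. It also shows why every training point ends up with a nonzero coefficient, the non-sparseness that point 5 above addresses.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=0.5):
    # Gaussian (RBF) kernel matrix between two sample sets
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=100.0, sigma=0.5):
    # Solve the LSSVM dual system:
    #   [ 0      1^T          ] [b]     [0]
    #   [ 1   K + I / gamma   ] [alpha] [y]
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def lssvm_predict(X_train, alpha, b, X_new, sigma=0.5):
    # Prediction is a kernel expansion over ALL training points
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy usage: fit a sine curve from 50 samples
X = np.linspace(0.0, 2.0 * np.pi, 50)[:, None]
y = np.sin(X).ravel()
b, alpha = lssvm_train(X, y, gamma=100.0, sigma=0.5)
y_hat = lssvm_predict(X, alpha, b, X, sigma=0.5)
```

Because the quadratic loss yields equality constraints, `np.linalg.solve` returns a generally dense alpha vector, so prediction must sum over all 50 training points; pruning alpha toward zero for unimportant samples is exactly the sparsification goal described in point 5.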
Keywords/Search Tags: least squares support vector machine, empirical modeling, structural risk minimization rule, kernel function, optimal hyperparameters, artificial neural network, sparseness, modeling, hybrid model, reproducing kernel Hilbert space