
An Effective Iterative Algorithm for the Regularization Parameter in Least-Squares Regression in the Native Space of Radial Basis Functions (RBF)

Posted on: 2013-12-09    Degree: Master    Type: Thesis
Country: China    Candidate: X Y Rong    Full Text: PDF
GTID: 2240330395450583    Subject: Applied Mathematics
Abstract/Summary:
Regularized least squares, whose purpose is to find well-fitted functions for labeled data in a Reproducing Kernel Hilbert Space, is one of the important branches of learning theory. The regularization parameter plays a critical role in balancing the degree of data fitting against the smoothness of the fitted function. The native space of a radial basis function is a Reproducing Kernel Hilbert Space under a suitable inner product. This paper gives a selection method and an iterative algorithm for the regularization parameter in the regularized least squares regression problem in the native space of a radial basis function.

Based on the similarity between the form of the solution of regularized least squares regression in the radial basis function space and the solution of the Co-kriging method [1], a radial basis interpolation method, we propose that the regularization parameter should be selected as the prior variance of the white noise incurred when measuring the data. In the Bayesian sense, this proposal can be interpreted as maximum a posteriori estimation [2]. Because the prior variance is difficult to obtain, we design an iterative algorithm: let the regularization parameter be the posterior variance of the measurement error, solve the objective function of regularized least squares regression, and then compute a new posterior variance of the measurement error.

The advantage of this algorithm is that it has a physical meaning, unlike the commonly used cross-validation method, which tries all possible values to choose the best parameter. In this paper, we prove the convergence of the iterative algorithm and, using wavelet analysis and kernel principal component analysis, explain the asymptotic behavior of the iterative solution toward the prior variance of the white noise as the number of samples tends to infinity. Numerical examples verify our proofs, and a comparison with the generalized cross-validation method shows that our algorithm recovers local features better and requires less computation.
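The iterative scheme described above can be illustrated concretely. Below is a minimal Python sketch, assuming a Gaussian RBF kernel and using the mean squared residual as the posterior-variance estimate of the measurement error; the function names, the kernel choice, and the exact variance update are illustrative assumptions, not the thesis's verbatim method.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix (an assumed choice of radial basis function)."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def iterative_rls(X, y, lam0=1.0, gamma=1.0, tol=1e-10, max_iter=200):
    """Regularized least squares in the RBF native space, with the
    regularization parameter iteratively set to the posterior variance
    of the measurement error (a sketch of the abstract's algorithm)."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    lam = lam0
    for _ in range(max_iter):
        # Minimizing ||y - K c||^2 + lam * c^T K c gives (K + lam I) c = y.
        c = np.linalg.solve(K + lam * np.eye(n), y)
        residuals = y - K @ c
        lam_new = np.mean(residuals**2)  # posterior variance of the noise
        if abs(lam_new - lam) < tol:
            lam = lam_new
            break
        lam = lam_new
    return c, lam

# Hypothetical usage: noisy samples of sin(x).
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0 * np.pi, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
coef, lam = iterative_rls(X, y)
```

With this convention, a fixed point of the iteration satisfies lam = mean((y - K(K + lam I)^{-1} y)^2), so the loop stops once successive variance estimates agree to within the tolerance.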
Keywords/Search Tags: Radial basis function, Regularization parameter, Regularized least squares regression, Reproducing Kernel Hilbert Space, Variance of white noise