
Research on Linear Dimensionality Reduction Methods Based on Spectral Regularization

Posted on: 2013-01-06
Degree: Master
Type: Thesis
Country: China
Candidate: C Wang
Full Text: PDF
GTID: 2218330371960198
Subject: Computer application technology
Abstract/Summary:
Feature extraction based on linear transformations has long been a central topic in pattern recognition. Many such methods have proved effective and are applied in a wide range of fields; among them, principal component analysis (PCA) and linear discriminant analysis (LDA) are the most widely used for dimensionality reduction, and these two methods are the main subjects of this thesis.

Because the small eigenvalues estimated from limited samples can adversely affect PCA, we apply eigenspectrum regularization to PCA and propose a new method, called ER-PCA, in which spectrum regularization suppresses the influence of unreliable small eigenvalues. Experimental results on face-image datasets and several UCI datasets validate the effectiveness of the proposed method.

The projection space generated by classic LDA has dimension at most c - 1, where c is the number of classes. Moreover, LDA assumes that each class follows a Gaussian distribution, so it may perform poorly on non-Gaussian samples. Motivated by this, we redefine the within-class and between-class scatter matrices in terms of the k-nearest-neighbor (k-NN) relationships between samples, so that the distribution of a sample is reflected by its neighbors. On this basis we propose a new kNN-based LDA method, called KNN-LDA. Experiments on two handwritten-digit datasets show that the proposed method can generate a projection space of higher dimension and outperforms the original LDA.
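The eigenspectrum-regularization idea described above can be illustrated with a minimal NumPy sketch. The thesis's exact ER-PCA formulation is not given in this abstract, so the flooring scheme below (clipping eigenvalues beneath a quantile of the spectrum) and the `floor_quantile` parameter are illustrative assumptions, not the author's method:

```python
import numpy as np

def er_pca(X, n_components, floor_quantile=0.1):
    """Sketch of eigenspectrum-regularized PCA.

    Small covariance eigenvalues, which are unreliable when samples are
    limited, are clipped to a floor (here: a quantile of the spectrum --
    an assumed scheme) before the data are whitened and projected.
    """
    X = np.asarray(X, dtype=float)
    Xc = X - X.mean(axis=0)
    # Sample covariance and its eigendecomposition.
    C = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(C)                 # ascending order
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # make descending
    # Regularize: raise eigenvalues below the floor up to the floor value.
    floor = np.quantile(eigvals, floor_quantile)
    reg_vals = np.maximum(eigvals, floor)
    # Project onto the leading components, scaled by regularized spectrum.
    W = eigvecs[:, :n_components]
    Z = (Xc @ W) / np.sqrt(reg_vals[:n_components])
    return Z, W
```

The key point is that the projection directions are unchanged; only the weighting by (regularized) eigenvalues changes, which tempers the influence of directions whose variance estimates are dominated by sampling noise.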
Keywords/Search Tags:Linear feature extraction, PCA, LDA, Regularization, Scatter matrix
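The kNN-based scatter matrices behind KNN-LDA can be sketched as follows. The abstract does not give the thesis's exact definitions, so the pairwise-difference weighting and the function name here are illustrative assumptions: within-class scatter accumulates differences between each sample and its k nearest same-class neighbors, and between-class scatter uses its k nearest other-class neighbors, so no Gaussian class model is assumed and the rank of the between-class matrix is no longer tied to c - 1:

```python
import numpy as np

def knn_scatter_matrices(X, y, k=3):
    """Sketch of kNN-based within-class (Sw) and between-class (Sb)
    scatter matrices in the spirit of KNN-LDA (assumed formulation)."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    n, d = X.shape
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    # Full pairwise Euclidean distance matrix (fine for small n).
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    for i in range(n):
        same = np.where((y == y[i]) & (np.arange(n) != i))[0]
        other = np.where(y != y[i])[0]
        for idx, S in ((same, Sw), (other, Sb)):
            # k nearest neighbors of sample i within the candidate set.
            nbrs = idx[np.argsort(dist[i, idx])[:k]]
            for j in nbrs:
                v = (X[i] - X[j])[:, None]
                S += v @ v.T
    return Sw, Sb
```

Projection directions would then come from the generalized eigenproblem on (Sw, Sb), as in standard LDA; since Sb is built from local neighbor pairs rather than c class means, it can supply more than c - 1 useful directions.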