Perturbed neural network backpropagation learning and adaptive wavelets for dimension reduction for improved classification of high-dimensional datasets
| Posted on:2006-08-03 | Degree:Ph.D | Type:Dissertation |
| University:George Mason University | Candidate:Bosch, Edward H | Full Text:PDF |
| GTID:1458390008471682 | Subject:Mathematics |
| Abstract/Summary: |
| In this work we propose a two-part process aimed at reducing the computational load of neural networks applied to high-dimensional remotely sensed data (hyperspectral imagery) while retaining their classification accuracy. First, we compute a set of near-optimum adaptive wavelets that depend on the covariance of the data, which are then used to reduce the dimensionality of the data. This dimension-reducing linear transformation has several desirable properties. Second, we discuss two different types of modifications to the backpropagation learning rule that tend to decrease the convergence error rates and increase the neural network's classification accuracy. We show that the combination of the dimension reduction method and the modified backpropagation learning rule produces neural networks whose generalization capabilities are comparable to those of networks trained without these methods. We also discuss how a particular set of target vectors has a positive effect on the error and convergence of the discussed backpropagation algorithms. Finally, we test these techniques on remotely sensed hyperspectral imagery for classification purposes. |
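The abstract describes a two-stage pipeline: a covariance-dependent linear transform for dimension reduction, followed by a network trained with a perturbed backpropagation rule. The adaptive-wavelet construction and the exact perturbation scheme are not specified in the abstract, so the sketch below is only illustrative: it substitutes the top eigenvectors of the data covariance for the adaptive wavelets and adds small random noise to each gradient step as one plausible reading of "perturbed" backpropagation. The synthetic data and all parameters (reduced dimension m, hidden width h, learning rate, noise scale) are assumptions, not values from the dissertation.

```python
# Minimal sketch (not the author's method): covariance-based linear dimension
# reduction followed by a one-hidden-layer network trained with a perturbed
# gradient-descent backpropagation rule. The eigenvector transform and the
# additive gradient-noise term are illustrative stand-ins.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "hyperspectral" data: n pixels x d bands, k classes (assumed sizes)
n, d, k = 300, 64, 3
X = rng.normal(size=(n, d)) + np.repeat(np.eye(k), n // k, axis=0) @ rng.normal(size=(k, d)) * 3
y = np.repeat(np.arange(k), n // k)

# Step 1: covariance-driven linear dimension reduction (stand-in transform)
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
m = 8                                  # reduced dimension (assumed)
W_dr = eigvecs[:, ::-1][:, :m]         # top-m covariance eigenvectors
Z = Xc @ W_dr                          # reduced-dimension features

# Step 2: one-hidden-layer net with a perturbed backpropagation update
T = np.eye(k)[y]                       # one-hot target vectors (assumed choice)
h = 16                                 # hidden units (assumed)
W1 = rng.normal(scale=0.1, size=(m, h))
W2 = rng.normal(scale=0.1, size=(h, k))
lr, noise = 0.05, 1e-3                 # learning rate, perturbation scale (assumed)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

for epoch in range(200):
    H = sigmoid(Z @ W1)                # forward pass
    O = sigmoid(H @ W2)
    # standard backprop deltas for squared error with sigmoid units
    dO = (O - T) * O * (1 - O)
    dH = (dO @ W2.T) * H * (1 - H)
    # perturbed update: add small random noise to each gradient step
    W2 -= lr * (H.T @ dO + noise * rng.normal(size=W2.shape))
    W1 -= lr * (Z.T @ dH + noise * rng.normal(size=W1.shape))

pred = np.argmax(sigmoid(sigmoid(Z @ W1) @ W2), axis=1)
print("training accuracy:", np.mean(pred == y))
```

One-hot target vectors are used here for concreteness; the "particular set of target vectors" credited with improving error and convergence is not described in the abstract.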
| Keywords/Search Tags: | Classification, Backpropagation, Dimension reduction, Neural |