Development and evaluation of multilayer perceptron training algorithms

Posted on: 2002-04-20
Degree: Ph.D.
Type: Dissertation
University: The University of Texas at Arlington
Candidate: Kim, Tae-Hoon
GTID: 1469390011994955
Subject: Engineering
Abstract/Summary:
This dissertation concerns the evaluation of existing advanced neural network training algorithms and the development of new ones. Through this research, we obtained detailed properties of the existing algorithms, developed testing methods, and built new algorithms. The algorithms we start from are output weight optimization-hidden weight optimization (OWO-HWO) and the full conjugate gradient (FCG) method.

The tasks addressed are as follows: (1) developing optimal learning factors for training algorithms, (2) developing tests for learning factors, (3) developing methods for evaluating training algorithms, (4) investigating error functions in order to build fusing methods, and (5) developing algorithms that alternate between different types of algorithms.

We develop the optimal learning factor (OLF) because some existing learning factors are based on empirical methods while others take considerable time to calculate. Our OLF serves as a standard against which other learning factor methods are compared. Using our new testing methods for learning factors, we can compare learning factors of different types.

From experiments, we found that FCG performs better on random data while OWO-HWO performs better on correlated data. From this observation we devised alternating algorithms; the adaptive alternating algorithm performs slightly better than the fixed alternating algorithms.

From the performance observations and an examination of the algorithms, we find that gradient-based methods such as FCG are affected by input biases, whereas OWO-HWO is immune to input biases and variances when net control is applied.

Although we experimented with several fusing methods, the first and second fusing methods were found to have problems. A third fusing algorithm, based on the properties of OWO-HWO and FCG and using normalized and Karhunen-Loève transformed data, turned out to be successful.
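As a concrete illustration of the optimal-learning-factor idea, the sketch below computes z_opt = -E'(0)/E''(0) from a second-order Taylor expansion of the error along the current descent direction, so that the step size comes from the error surface itself rather than from an empirically tuned constant. The linear model standing in for the MLP, the function names, and the central-difference estimates of E'(0) and E''(0) are illustrative assumptions, not the dissertation's exact formulation.

```python
# A minimal sketch of an optimal learning factor (OLF) for one descent step.
# Quadratic model of the error along direction d:
#   E(z) ~ E(0) + z*E'(0) + 0.5*z^2*E''(0)  =>  z_opt = -E'(0) / E''(0).
# The linear model and the finite-difference step `eps` are assumptions
# chosen to keep the example self-contained.
import numpy as np

def mse(w, X, y):
    """Mean squared error of a linear model y ~ X @ w (stand-in for the MLP)."""
    r = X @ w - y
    return float(np.mean(r * r))

def optimal_learning_factor(w, d, X, y, eps=1e-4):
    """Estimate z_opt = -E'(0)/E''(0) along direction d via central differences."""
    e0 = mse(w, X, y)
    e_plus = mse(w + eps * d, X, y)
    e_minus = mse(w - eps * d, X, y)
    e1 = (e_plus - e_minus) / (2.0 * eps)        # E'(0)
    e2 = (e_plus - 2.0 * e0 + e_minus) / eps**2  # E''(0)
    return -e1 / e2 if e2 > 0 else 0.0           # fall back if curvature is not positive

# Usage: one steepest-descent step with the OLF instead of a hand-tuned rate.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = X @ rng.standard_normal(5) + 0.1 * rng.standard_normal(200)
w = np.zeros(5)
g = 2.0 / len(y) * X.T @ (X @ w - y)  # gradient of the MSE
d = -g                                # steepest-descent direction
z = optimal_learning_factor(w, d, X, y)
w += z * d
print(f"z_opt = {z:.4f}, new MSE = {mse(w, X, y):.4f}")
```

For a quadratic error surface, such as MSE under a linear output layer, this line search is exact; for a full MLP the derivatives would typically be obtained analytically from a Taylor expansion of the network error rather than by finite differences.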
Keywords/Search Tags: Algorithms, Training, FCG, OWO-HWO, Learning factors, Methods, Fusing