
Convergence Of Gradient Learning Algorithm And Output Representation For Multiple Output BP Neural Networks

Posted on: 2007-03-08
Degree: Master
Type: Thesis
Country: China
Candidate: F Q Zhou
Full Text: PDF
GTID: 2120360182961108
Subject: Computational Mathematics
Abstract/Summary:
The gradient algorithm has been widely used for training the weights of feedforward neural networks. Its convergence for feedforward networks with a single output unit has been thoroughly studied. In this paper, we study the convergence of the gradient algorithm for a three-layer BP neural network with multiple output units. We prove convergence results for the corresponding gradient algorithm and the monotonicity of the error function during the iteration.

The input representation of multilayer feedforward neural networks is very important and has been thoroughly studied, while the output representation has received little attention. Taking a classification problem with eight classes as an example, this paper investigates and compares BP neural networks with one, three, and eight output units, referred to as the approximation, binary, and one-for-each approaches, respectively. The experimental results show that the output representation is also important for network performance. In particular, for our example, the binary approach works much better than the other two. This observation implies that the one-for-each approach, which is commonly suggested in the literature, is not necessarily the best choice. A geometrical explanation of our observation is also presented.
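The three output representations compared above can be sketched as follows. This is an illustrative example, not code from the thesis; the function names and the [0, 1] scaling used in the approximation encoding are my own assumptions about how a class label would be mapped to network target vectors.

```python
# Hypothetical illustration of the three output encodings for an
# 8-class problem: one, three, and eight output units respectively.

def approximation_target(label, n_classes=8):
    """Approximation approach: one output unit.
    Class k is encoded as a single scalar, here scaled into [0, 1]."""
    return [label / (n_classes - 1)]

def binary_target(label, n_bits=3):
    """Binary approach: log2(n_classes) output units.
    Class k is encoded by its binary digits (most significant bit first)."""
    return [(label >> i) & 1 for i in reversed(range(n_bits))]

def one_for_each_target(label, n_classes=8):
    """One-for-each approach: n_classes output units, one-hot encoding."""
    return [1 if i == label else 0 for i in range(n_classes)]

# Compare the three target vectors for each of the eight classes.
for k in range(8):
    print(k, approximation_target(k), binary_target(k), one_for_each_target(k))
```

For example, class 5 becomes the scalar [5/7] under the approximation approach, the code [1, 0, 1] under the binary approach, and an 8-dimensional one-hot vector under the one-for-each approach; the network's output layer size differs accordingly.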
Keywords/Search Tags:BP neural network, Gradient algorithm, Convergence, Multiple output units, Classification problems, Output representation