
Role Of Transfer Function And Noise On Dynamical Characteristics Of Associative Memory Feedback Networks

Posted on: 2007-08-22    Degree: Doctor    Type: Dissertation
Country: China    Candidate: R M Wang    Full Text: PDF
GTID: 1100360215457793    Subject: Theoretical Physics
Abstract/Summary:
Associative memory is a remarkable function of feedback networks. Because it lends itself to hardware realization, it is widely used in many areas. At the same time, feedback networks have attracted considerable attention from nonlinear-science researchers as typical high-dimensional nonlinear dynamical systems. On the one hand, recent progress in nonlinear dynamics can be employed to understand the characteristics of these networks and to improve their performance; on the other hand, feedback networks exhibit rich dynamical behavior and phenomena, and thus provide a representative model for the study of high-dimensional dynamical systems with complex activity.

In this dissertation, the relation between a network's structure and its performance is investigated by studying the local field distribution of attractors in feedback networks. We show that the characteristics of a network are determined mainly by the local field distribution of the memory patterns, and are also affected by the neuron transfer function; moreover, the effect of the transfer function is closely tied to the local field distribution. We focus on two feedback networks: the Hopfield network designed by the Hebb learning rule, and the holistically optimized network recently proposed by Zhao [Phys. Rev. E 70, 066137 (2004)]. In both networks the local fields corresponding to the memory patterns and to the spurious (fake) memory patterns occupy different regions, but with different degrees of overlap. Based on this observation, a mechanism for how the transfer function and random noise act on network performance is presented. The results show that a continuous transfer function does suppress spurious attractors to some degree, as earlier researchers found, but the elimination is not complete. The main technique for eliminating spurious attractors is to control the local field distribution through the holistic optimization design method: as long as the local fields are distributed properly, the spurious attractors disappear.

The gain parameter of the transfer function is usually regarded as a temperature, that is, as playing the same role as random noise in neural networks. Although some investigators have pointed out that the two are different, the underlying mechanism has not been examined in detail. Another key issue in this thesis is therefore to analyze thoroughly how the gain parameter and random noise act, to clarify their similarities and differences, and to show how random noise can be employed to enhance performance. We also explore how the eigenvalue spectrum of the synaptic matrix varies with the design parameters under a given learning rule, and how it deviates from the random matrix spectrum.
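As a rough illustration of the central quantity discussed above, the following sketch builds a Hebb-rule Hopfield network and compares the aligned local fields of the stored memory patterns with those of a non-memory configuration. This is a minimal sketch for illustration only, not code or parameters from the dissertation; the network size N, pattern count P, and the use of NumPy are assumptions.

    import numpy as np

    rng = np.random.default_rng(0)
    N, P = 500, 25                       # neurons, stored patterns (illustrative values)

    # Random binary (+/-1) memory patterns, shape (P, N)
    patterns = rng.choice([-1, 1], size=(P, N))

    # Hebb learning rule: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu, no self-coupling
    J = patterns.T @ patterns / N
    np.fill_diagonal(J, 0.0)

    def local_fields(state):
        """Local fields h_i = sum_j J_ij s_j for a network state s."""
        return J @ state

    # Aligned local fields h_i * xi_i: for stored patterns they cluster around +1,
    # while for a random (non-memory) configuration they stay near 0.
    stored = np.concatenate([local_fields(xi) * xi for xi in patterns])
    random_state = rng.choice([-1, 1], size=N)
    nonstored = local_fields(random_state) * random_state

    print("stored patterns: mean aligned field %.2f" % stored.mean())
    print("random state:    mean aligned field %.2f" % nonstored.mean())

How far these two aligned-field distributions separate or overlap is, in the spirit of the abstract, what distinguishes a well-behaved memory from one plagued by spurious attractors.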
The contents of the chapters are arranged as follows:

1. The background of neural network research is introduced, including its origin, development, and main types, together with a brief account of outstanding contributions by physicists such as Gardner and Little.

2. Two associative memory neural network models are introduced in detail: the Hopfield network, and the holistically optimized network proposed by Prof. Zhao [Phys. Rev. E 70, 066137 (2004)]. Their learning rules, main characteristics, and dynamical behavior are presented.

3. Investigating the role of the transfer function in associative memory networks, we give the local field distributions of the memory patterns and the spurious patterns under the two learning rules. We argue that the shape of the local field distribution determines what the transfer function can accomplish, and we clarify the mechanism by which a continuous transfer function acts on network performance and dynamical behavior.

4. We show how random noise affects network performance. By adding noise to the dynamical evolution, the dynamical characteristics and the performance improvement of the networks are studied. We find that the effect of the noise depends on the local field distributions of the memory patterns and of the spurious attractors. A way of using noise to enhance network performance is also presented.

5. We explore the eigenvalue spectrum of the synaptic matrix. Because learned information is built into it, the synaptic matrix departs from a purely random matrix. We attempt to indicate a link between neural network research and random matrix theory; such an exploration helps in understanding the dynamical behavior of neural networks (a minimal numerical sketch of this comparison follows the list).
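The sketch below, referenced in item 5, contrasts the eigenvalue spectrum of a Hebb-rule synaptic matrix with that of a random symmetric matrix of matched element variance. It is an assumption-laden illustration, not the dissertation's analysis: the sizes N and P, the Gaussian comparison matrix, and the matched-variance choice are all hypothetical.

    import numpy as np

    rng = np.random.default_rng(1)
    N, P = 500, 25                               # illustrative sizes

    # Hebb-rule synaptic matrix built from P random +/-1 patterns
    patterns = rng.choice([-1, 1], size=(P, N))
    J_hebb = patterns.T @ patterns / N
    np.fill_diagonal(J_hebb, 0.0)

    # Random symmetric matrix with the same off-diagonal element variance
    sigma = np.sqrt(P) / N                       # std of the Hebbian off-diagonal entries
    A = rng.normal(0.0, sigma, size=(N, N))
    J_rand = (A + A.T) / np.sqrt(2)
    np.fill_diagonal(J_rand, 0.0)

    eig_hebb = np.linalg.eigvalsh(J_hebb)
    eig_rand = np.linalg.eigvalsh(J_rand)

    # The learned patterns push roughly P eigenvalues of order one away from the
    # bulk, whereas the purely random matrix keeps all eigenvalues inside a much
    # narrower (semicircle-like) band.
    print("Hebb matrix, largest eigenvalues:  ", np.round(np.sort(eig_hebb)[-3:], 2))
    print("random matrix, largest eigenvalues:", np.round(np.sort(eig_rand)[-3:], 2))

The departure of the learned spectrum from the random-matrix baseline is the kind of signature the fifth chapter uses to connect network design with random matrix theory.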
Keywords/Search Tags:local field distribution, feedback neural networks, associative memory, noise, eigenvalue spectrum