
A Delay-partitioning Approach To The Stability Analysis Of Neural Networks

Posted on: 2015-02-15  Degree: Master  Type: Thesis
Country: China  Candidate: J M Duan  Full Text: PDF
GTID: 2268330425974426  Subject: Applied Mathematics
Abstract/Summary:
In recent decades, neural networks have become a major topic of scientific research owing to their extensive applications in fields such as signal processing, artificial intelligence, image processing, neurophysiology, pattern recognition, and nonlinear dynamics. At the same time, time delays occur unavoidably in neural networks and may cause undesirable dynamic behaviors such as oscillation and even instability. The stability analysis of neural networks with time-varying delays has therefore received wide attention from scholars at home and abroad.

This thesis studies the stability of neural networks with time delays, addressing four problems: first, the asymptotic stability of discrete-time recurrent neural networks with time-varying delays; second, the asymptotic stability of discrete-time recurrent neural networks with randomly occurring nonlinearities and time-varying delays; third, the mean-square asymptotic stability of stochastic Markovian jump neural networks with randomly occurring nonlinearities; and last, the mean-square asymptotic stability of stochastic Markovian jump neural networks with different time scales and randomly occurring nonlinearities. The concrete contributions are as follows.

1. We study the stability of discrete-time recurrent neural networks with time-varying delays. Applying Lyapunov-Krasovskii stability theory combined with linear matrix inequalities, we obtain conditions for the asymptotic stability of the considered neural networks. In addition, we discuss the case where the lower bound of the delay is zero.

2. We study the stability of discrete-time recurrent neural networks with randomly occurring nonlinearities and time-varying delays. By constructing new Lyapunov-Krasovskii functionals and using a delay-partitioning technique, we propose stability criteria in terms of linear matrix inequalities (LMIs).
We also show that the conservatism of the conditions is a non-increasing function of the number of delay partitions.

3. We study the stability of stochastic Markovian jump neural networks with randomly occurring nonlinearities. For the first time, mean-square asymptotic stability criteria are derived for the considered stochastic neural networks via a delay-partitioning projection technique. We also show that the finer the delay partition, the more the conservatism is reduced.

4. We study the stability of stochastic Markovian jump neural networks with different time scales and randomly occurring nonlinearities. Delay-dependent stability criteria are derived for the considered neural networks both with and without information on the delay rates. Two different partitions are used to handle the time-varying delay, which makes the results applicable in a wider range of cases and leads to reduced conservatism.

For all the above studies, we use MATLAB to implement the criteria and obtain simulation results; the simulation results agree with the theoretical conclusions.
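The kind of conclusion the criteria above certify, asymptotic stability of a delayed discrete-time system, can be illustrated with a minimal sketch. This is not the thesis's LMI method: the matrices, the constant delay, and the augmented-state check are illustrative assumptions. For a linear recurrence x(k+1) = A x(k) + B x(k-d) with fixed delay d, stacking the delayed states into one augmented vector gives an ordinary linear system whose spectral radius decides stability, and a simulation (in the spirit of the MATLAB experiments mentioned above) confirms the decay.

```python
import numpy as np

# Illustrative system x(k+1) = A x(k) + B x(k-d); the matrices are
# hypothetical, not taken from the thesis.
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
B = np.array([[0.1, 0.05],
              [0.02, 0.1]])
d = 3          # constant delay (the thesis treats time-varying delays)
n = A.shape[0]

# Lift the delayed dynamics to an augmented state
# z(k) = [x(k); x(k-1); ...; x(k-d)], so that z(k+1) = Phi z(k).
Phi = np.zeros((n * (d + 1), n * (d + 1)))
Phi[:n, :n] = A                    # dependence on x(k)
Phi[:n, n * d:] = B                # dependence on x(k-d)
Phi[n:, :n * d] = np.eye(n * d)    # shift registers for the delayed states

rho = max(abs(np.linalg.eigvals(Phi)))   # spectral radius of Phi
print(f"spectral radius = {rho:.3f}")    # < 1  =>  asymptotic stability

# Cross-check by simulating the recurrence from a nonzero initial history.
hist = [np.ones(n)] * (d + 1)            # x(-d), ..., x(0)
for k in range(200):
    x_next = A @ hist[-1] + B @ hist[-1 - d]
    hist.append(x_next)
print(f"|x(200)| = {np.linalg.norm(hist[-1]):.2e}")
```

The LMI criteria in the thesis play the role of the spectral-radius test here, but remain applicable when the delay is time-varying or the network is nonlinear, where no such closed-form check exists.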
Keywords/Search Tags: neural networks, time-varying delay, stability, delay-partitioning, randomly occurring nonlinearities (RONs), Markovian jump, time scale