
Stability And Nonlinear Approximation Of Several Classes Of Neural Networks

Posted on: 2015-02-27
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Z Y Quan
GTID: 1268330431950249
Subject: Applied Mathematics

Abstract/Summary:
In this thesis, novel LMI-based sufficient conditions for the existence and global stability of equilibrium points of several classes of neural networks are obtained by using degree theory, the LMI method, new inequality techniques, and constructed Lyapunov functionals; these conditions can be easily verified with the Matlab LMI toolbox. Furthermore, the approximation errors of neural networks with two weights are estimated in the last two chapters. The thesis is divided into six chapters.

Chapter One, the introduction, first describes the artificial neural network and its structure. It then briefly presents the several classes of neural networks to be discussed in this thesis, together with the motivation and outline of the work. Some notation, definitions, and several lemmas are also listed in this chapter.

In Chapter Two, under the assumption that the activation functions satisfy only global Lipschitz conditions, a novel LMI-based sufficient condition for the global asymptotic stability of the equilibrium point of a class of BAM neural networks with reaction-diffusion terms and distributed delays is obtained by using degree theory, new inequality techniques, and constructed Lyapunov functionals. In our results, the boundedness and monotonicity assumptions on the activation functions imposed in existing papers are removed. A numerical example is also provided to show the effectiveness of the derived LMI-based stability condition.

In Chapter Three, the existence and global exponential stability of equilibrium points for inertial BAM neural networks with time delays are investigated. First, the system is transformed into first-order differential equations by a suitable variable substitution. Then, using homeomorphism theory and constructing Lyapunov functionals, an LMI-based sufficient condition for the existence and uniqueness of the equilibrium point of these inertial BAM neural networks is obtained via novel inequalities.
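The variable substitution used in Chapter Three to reduce the second-order inertial system to a first-order one can be sketched as follows; the schematic model and the parameter ξ_i are illustrative, not taken verbatim from the thesis:

```latex
% Inertial BAM subsystem (schematic second-order form):
\ddot{x}_i(t) = -a_i \dot{x}_i(t) - b_i x_i(t)
  + \sum_{j} c_{ji}\, f_j\bigl(y_j(t-\tau_{ji})\bigr) + I_i .
% With the substitution u_i(t) = \dot{x}_i(t) + \xi_i x_i(t), \ \xi_i > 0,
% the system becomes first order:
\dot{x}_i(t) = -\xi_i x_i(t) + u_i(t), \qquad
\dot{u}_i(t) = -(a_i - \xi_i)\, u_i(t)
  - \bigl(b_i - \xi_i (a_i - \xi_i)\bigr)\, x_i(t)
  + \sum_{j} c_{ji}\, f_j\bigl(y_j(t-\tau_{ji})\bigr) + I_i .
```

Lyapunov functionals and LMI conditions can then be formulated for the pair (x_i, u_i) in the usual first-order setting.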
Second, a new LMI-based condition ensuring the global exponential stability of the equilibrium point of the system is obtained by using a new LMI method and a new inequality technique. Our results extend and improve some earlier publications. A numerical example is given to illustrate the theoretical results. Chapter Four treats a similar but more difficult problem.

In Chapter Five, a neural network with two weights and one hidden layer is constructed to approximate Lp-integrable functions. Using inequality techniques and the modulus of smoothness as a metric tool, we show not only that the constructed network can approximate any Lp-integrable function to arbitrary accuracy in the Lp metric provided the number of hidden nodes is sufficiently large, but also that it has better approximation ability than the BP neural network constructed in a reference from the literature. Compared with the existing result, ours removes the assumption that the activation functions are odd. Moreover, the input weights and thresholds differ from those in the existing result.

Finally, in Chapter Six, the technique of approximate partition of unity, Fourier series methods, and inequality techniques are used to construct a neural network with two weights and sigmoidal activation functions. Furthermore, using inequality techniques and the modulus of continuity as a metric tool, we prove that this network has better approximation ability than the BP neural network constructed in a reference from the literature.
Keywords/Search Tags: Neural Network, Delay, Equilibrium Point, Stability, Linear Matrix Inequality (LMI), Approximation Error, Modulus