
Analysis Of Nonsmooth Dynamical System Based On Differential Inclusion And Its Applications

Posted on: 2011-10-04
Degree: Doctor
Type: Dissertation
Country: China
Candidate: S T Qin
Full Text: PDF
GTID: 1100360332456496
Subject: Basic mathematics
Abstract/Summary:
Based on differential inclusions and nonsmooth analysis, this dissertation studies the dynamical properties of delayed neural networks with discontinuous activations, neural networks described by subgradient systems, nonsmooth gradient-like systems, and evolution inclusions of Clarke subdifferential type in Hilbert space. The main results are as follows:

1. We study the exponential stability and finite-time convergence of delayed neural networks. Most existing stability results for such networks rely on the continuity and boundedness of the activation function. In this dissertation, without assuming continuity or boundedness of the activation function, we prove two stability properties: exponential stability and convergence in finite time. First, using set-valued topological degree theory, we establish the existence and uniqueness of the equilibrium point of the network. Then, by constructing a Lyapunov function, we prove that every initial value problem has a unique global solution and that this solution converges to the equilibrium point at an exponential rate, i.e., the network is exponentially stable. Many existing results follow as corollaries of this theorem; moreover, the conditions of the theorem are easily testable and robust. Finally, under some mild hypotheses, we prove that every trajectory of the network converges to the equilibrium point in finite time — a phenomenon peculiar to discontinuous systems. Two numerical examples illustrate the applicability of these results.

2. We study the dynamical behavior of a class of neural networks described by subgradient systems, which can be regarded as a generalization of the neural network models considered in the optimization context; full-range cellular neural networks (FR-CNNs) are a special case. We first prove the existence of a global solution and of an equilibrium point, and then study stability.
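The finite-time convergence described in part 1 can be seen already in the simplest discontinuous system. The following sketch simulates a scalar differential inclusion x' ∈ -a·sign(x) — a toy stand-in chosen here for illustration, not the dissertation's actual delayed network — whose Filippov solution reaches the equilibrium x = 0 exactly at t = x0/a and then stays there, unlike any smooth exponentially stable system:

```python
import numpy as np

def simulate_sign_system(x0=1.0, a=1.0, h=1e-3, t_max=2.0):
    """Forward-Euler sketch of the differential inclusion x' in -a*sign(x).

    sign(0) is set-valued ([-1, 1]); numerically we clamp each step so the
    state sticks at 0 once it reaches it, mimicking the Filippov sliding
    solution instead of chattering across zero.
    """
    x = x0
    xs, ts = [x0], [0.0]
    for k in range(int(t_max / h)):
        if x != 0.0:
            step = -a * np.sign(x) * h
            if abs(step) >= abs(x):
                x = 0.0          # would overshoot: land on the equilibrium
            else:
                x += step
        xs.append(x)
        ts.append((k + 1) * h)
    return np.array(ts), np.array(xs)

ts, xs = simulate_sign_system()
# theory: x(t) = max(0, x0 - a*t), so the hitting time is t = x0/a = 1.0
hit = ts[np.argmax(xs == 0.0)]
print(hit)
```

The hitting time reported by the simulation matches the exact value x0/a up to the step size h, and the state remains identically zero afterwards — the "convergence in finite time" that cannot occur for Lipschitz-continuous dynamics.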
Most stability results for such systems establish only quasi-convergence. In this dissertation, using the nonsmooth Łojasiewicz inequality, we prove asymptotic convergence of the trajectories of this subgradient system: starting from any initial point, each trajectory converges to an equilibrium point. As a direct application, this theorem implies the asymptotic stability of FR-CNNs, which greatly improves existing stability results for FR-CNNs. Moreover, the Łojasiewicz exponent allows the convergence rate of the solution to be computed easily. We then study a constrained minimization problem that can be associated with this neural network, and prove that the local constrained (strict) minima of the objective function coincide with the (asymptotically) stable equilibrium points of the network. Finally, we present two theorems on the approximation of solutions of this subgradient system, and several examples are given to illustrate them.

3. We study the dynamical behavior of a class of nonsmooth gradient-like systems, of which the well-known Hopfield neural network and the cellular neural network are special cases. First, using the homotopy invariance of topological degree and the maximal monotonicity of the convex subdifferential, we prove the existence and uniqueness of the global solution and of the equilibrium point of this system. Then, by means of a constructed Lyapunov function and proof by contradiction, we obtain the asymptotic stability of the system. We apply these results to seeking local minimum points of nonsmooth functions over {0,1}^n and to a class of nonlinear programming problems, with examples demonstrating their applicability. Finally, we investigate the existence of periodic solutions of this nonsmooth gradient-like system in three cases: (1) the activation function is bounded; (2) the activation function satisfies a sublinear growth condition; (3) the activation function belongs to C^2 and is strictly increasing.

4. We study the existence of solutions of evolution inclusions in Hilbert space. In recent decades, attention has focused on evolution inclusions with convex subdifferentials; in this dissertation we study the more general case of evolution inclusions with Clarke subdifferentials. Compared with the convex subdifferential, the Clarke subdifferential has wider applications in theory and practice; however, it lacks maximal monotonicity, which makes evolution inclusions with Clarke subdifferentials more difficult to study. We first establish the existence and uniqueness of the solution when the perturbation is a single-valued function, together with two important inequalities. Based on these inequalities, and using a continuous selection theorem and the Schauder fixed point theorem, we obtain an existence theorem for strong solutions of this evolution inclusion when the perturbation is a multivalued lower semicontinuous map. We then prove an existence theorem for extremal solutions via an extremal selection theorem, from which we derive a relaxation theorem: the extremal solution set is dense in the strong solution set of the evolution inclusion. Finally, we apply these results to two examples of parabolic PDEs and obtain existence theorems for their solutions.
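The correspondence between minima of a nonsmooth objective and stable equilibria of a subgradient system (parts 2 and 3) can be sketched on a one-dimensional toy problem. The function f(x) = |x - 1| + x²/2 below is an illustrative stand-in, not a model from the dissertation; its unique minimizer x* = 1 is exactly the point where 0 ∈ ∂f(x), and a discretized subgradient flow x' ∈ -∂f(x) settles there:

```python
import numpy as np

def subgradient_flow(x0=3.0, h=1e-3, n_steps=5000):
    """Euler sketch of the subgradient system x' in -df(x) for the
    nonsmooth convex f(x) = |x - 1| + x**2 / 2.

    A subgradient of |x - 1| is sign(x - 1) away from the kink, and any
    value in [-1, 1] at x = 1; taking np.sign selects one element of the
    subdifferential, so the iterate chatters in an O(h) band around the
    kink instead of sliding exactly on it.
    """
    x = x0
    for _ in range(n_steps):
        g = np.sign(x - 1.0) + x   # one element of the subdifferential df(x)
        x -= h * g
    return x

x_final = subgradient_flow()
# the unique minimizer is x* = 1, since 0 lies in df(1) = [-1, 1] + 1
print(x_final)
```

The trajectory approaches x* = 1 and then oscillates within a band of width on the order of the step size h — the discrete shadow of the exact Filippov solution, which reaches the minimizer and stays on it, matching the equilibria-equal-minima correspondence proved in the dissertation.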
Keywords/Search Tags:Differential Inclusion, Neural Network, Dynamical Behaviors, Optimization Problem, Evolution Inclusion