
The Study Of Neural Network Methods For Solving Two Types Of Non-smooth Optimization Problems

Posted on: 2023-07-30
Degree: Master
Type: Thesis
Country: China
Candidate: X Y Huang
Full Text: PDF
GTID: 2530306794483124
Subject: Computer technology

Abstract/Summary:
Optimization problems arise throughout engineering and have received considerable attention. Neural networks have developed rapidly as solvers for large-scale optimization problems because they offer parallel computation, fast convergence, and circuit implementation. However, most existing neural network optimization algorithms assume that the problem is convex and Lipschitz continuous; studies of pseudoconvex or non-Lipschitz continuous problems are scarce and have certain shortcomings. To address these gaps, this thesis proposes two neural networks, one for nonsmooth pseudoconvex optimization problems and one for non-Lipschitz continuous optimization problems.

First, for a class of nonsmooth pseudoconvex optimization problems with equality and inequality constraints, a novel single-layer recurrent neural network is proposed based on the ideas of regularization and differential inclusion. An effective regularization term guarantees that the state solutions of the proposed network remain bounded, which in turn ensures that they reach the feasible region in finite time and eventually converge to an optimal solution. Compared with existing algorithms, the proposed network is single-layer, eliminates the need to compute a penalty factor in advance, discards several strong assumptions, and allows the initial point to be chosen arbitrarily. More importantly, it can solve more general pseudoconvex optimization problems. The effectiveness and applicability of the algorithm are illustrated by numerical experiments and a dynamic portfolio problem.

Second, for a class of non-Lipschitz continuous optimization problems, a recurrent neural network is constructed based on the smoothing method. Under certain conditions, the global existence and convergence of the network's state solution are proved. Compared with existing algorithms for non-Lipschitz continuous optimization problems, this algorithm has a simple structure, solves a wider range of problems, requires neither a projection operator nor a penalty factor, and allows the initial point to be chosen arbitrarily. Finally, the validity of the theory and the effectiveness of the neural network algorithm are verified by numerical experiments, and the algorithm is applied to a signal recovery problem.
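The general idea of the smoothing method can be illustrated with a minimal sketch. This is a hypothetical example, not the thesis's actual network: it minimizes a signal-recovery objective f(x) = ½‖Ax − b‖² + λ Σᵢ |xᵢ|ᵖ with p ∈ (0, 1), whose penalty term is non-Lipschitz at zero, by replacing |t| with the smooth surrogate sμ(t) = √(t² + μ²), shrinking μ over time, and integrating the gradient flow dx/dt = −∇fμ(x) with forward Euler steps. All problem data and parameter choices below are assumptions for illustration.

```python
import numpy as np

def smoothed_grad(x, A, b, lam, p, mu):
    """Gradient of the smoothed objective f_mu, where |x_i| is
    replaced by s_mu(x_i) = sqrt(x_i**2 + mu**2)."""
    s = np.sqrt(x**2 + mu**2)
    grad_penalty = lam * p * x * s**(p - 2.0)   # d/dx s_mu(x)**p
    return A.T @ (A @ x - b) + grad_penalty

def smoothing_flow(A, b, lam=0.1, p=0.5, mu0=1.0, h=1e-3, steps=20000):
    """Forward-Euler discretization of the ODE dx/dt = -grad f_mu(x),
    with a decreasing smoothing parameter mu_k = mu0 / sqrt(k + 1)."""
    x = np.zeros(A.shape[1])                    # initial point is arbitrary
    for k in range(steps):
        mu = max(mu0 / np.sqrt(k + 1.0), 1e-6)
        x -= h * smoothed_grad(x, A, b, lam, p, mu)
    return x

# Toy noiseless signal-recovery instance: recover a sparse x_true from b = A x_true.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = smoothing_flow(A, b)
print(np.round(x_hat, 2))
```

The decreasing schedule for μ is the key design choice: each iterate follows the gradient of a smooth approximation, and as μ → 0 the approximation recovers the original non-Lipschitz objective, so no subgradient or projection computation is ever needed.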
Keywords/Search Tags: Neural networks, Nonsmooth pseudoconvex optimization, Non-Lipschitz continuity, Differential inclusion