
Two Kinds of Optimization Algorithms for Gradient Non-Lipschitz Continuous Problems

Posted on: 2023-02-27
Degree: Master
Type: Thesis
Country: China
Candidate: M X Zhang
Full Text: PDF
GTID: 2530306794977479
Subject: Mathematics
Abstract/Summary:
Optimization serves as a supporting technique in many research fields related to numerical computation, such as machine learning, deep learning, signal processing, and industrial design. Quasi-Newton methods are among the most effective methods for solving nonlinear equations and optimization problems, and a large number of quasi-Newton algorithms are included in today's optimization software for unconstrained, constrained, and large-scale problems. However, current quasi-Newton algorithms may fail when the gradient of the objective function is not Lipschitz continuous. This motivates us to develop, based on the classical quasi-Newton formulas, quasi-Newton methods for non-convex optimization problems whose gradients are non-Lipschitz continuous.

The Broyden-Fletcher-Goldfarb-Shanno (BFGS) method has become one of the most popular quasi-Newton methods owing to its rich theoretical results and excellent numerical performance, so this thesis builds on it. First, a damped BFGS update rule that helps to achieve trust-region properties is proposed. Second, an interesting property of the BFGS method is its self-correcting nature; to strengthen this property, the damped update rule is used to scale the third term of the update formula so that it corrects large eigenvalues more effectively. Global convergence of the algorithms is established under the Armijo or the weak Wolfe-Powell (WWP) line search, and a superlinear convergence rate is obtained under appropriate conditions. Moreover, by exploiting the trust-region properties, a class of algorithms that require no line search is proposed, and its global convergence is also established. Preliminary numerical experiments show that the computational efficiency of the proposed algorithms is competitive with other BFGS-type methods.

Quasi-Newton methods can be viewed as special conjugate gradient (CG) methods. Therefore, for CG methods, the adaptiveness of the algorithm is increased while the sufficient descent property is retained, thereby realizing trust-region properties; global convergence is obtained as well.
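The abstract refers to the BFGS update formula and its third term without reproducing them. As background only, assuming the standard notation s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, the classical BFGS update of the Hessian approximation B_k reads

    B_{k+1} = B_k - \frac{B_k s_k s_k^\top B_k}{s_k^\top B_k s_k} + \frac{y_k y_k^\top}{y_k^\top s_k},

whose third term, y_k y_k^\top / (y_k^\top s_k), is the one the proposed rule scales. A classical damped variant in the style of Powell replaces y_k by r_k = \theta_k y_k + (1 - \theta_k) B_k s_k, with \theta_k \in (0, 1] chosen so that s_k^\top r_k \ge 0.2\, s_k^\top B_k s_k, which keeps B_{k+1} positive definite even when the curvature condition y_k^\top s_k > 0 fails; the thesis's damped rule may differ in detail. The weak Wolfe-Powell (WWP) line search accepts any step size \alpha_k > 0 satisfying

    f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^\top d_k, \qquad g(x_k + \alpha_k d_k)^\top d_k \ge \sigma g_k^\top d_k, \qquad 0 < \delta < \sigma < 1.

As an illustration only, the following is a minimal Python sketch of a damped BFGS iteration with Armijo backtracking. It uses the classical Powell damping above, not the thesis's update rule; all names (armijo, damped_bfgs) and parameter values are hypothetical.

import numpy as np

def armijo(f, x, d, g, delta=1e-4, beta=0.5, max_backtracks=50):
    # Classical Armijo backtracking: shrink alpha until the
    # sufficient-decrease condition f(x + a d) <= f(x) + delta a g^T d holds.
    alpha, fx, slope = 1.0, f(x), g @ d
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= fx + delta * alpha * slope:
            return alpha
        alpha *= beta
    return alpha

def damped_bfgs(f, grad, x0, tol=1e-6, max_iter=500):
    # Sketch of a damped BFGS iteration with Powell's damping;
    # the thesis's own damped rule (which scales the third term
    # of the update formula) is not reproduced here.
    x = np.asarray(x0, float)
    n = len(x)
    B, g = np.eye(n), grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(B, -g)          # quasi-Newton direction
        alpha = armijo(f, x, d, g)
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        Bs = B @ s
        sBs = s @ Bs
        sy = s @ y
        # Powell damping: blend y with B s so that s^T r >= 0.2 s^T B s,
        # which preserves positive definiteness of B.
        theta = 1.0 if sy >= 0.2 * sBs else 0.8 * sBs / (sBs - sy)
        r = theta * y + (1.0 - theta) * Bs
        B = B - np.outer(Bs, Bs) / sBs + np.outer(r, r) / (s @ r)
        x, g = x_new, g_new
    return x

On a smooth test function such as the Rosenbrock function, this recovers the familiar BFGS behavior; the thesis's contribution lies in a modified damping that also covers objectives whose gradients are not Lipschitz continuous.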
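The final paragraph of the abstract appeals to the sufficient descent and trust-region properties of CG directions without stating the thesis's concrete formula. As a hedged sketch of the usual meaning of these terms for a direction d_k = -g_k + \beta_k d_{k-1} (with d_0 = -g_0): sufficient descent requires g_k^\top d_k \le -c \|g_k\|^2 for some constant c > 0 and all k, while the trust-region property requires \|d_k\| \le C \|g_k\| for some constant C > 0. Together these bound the search direction by the gradient and are standard ingredients of the global convergence arguments referred to above.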
Keywords/Search Tags:Quasi-Newton methods, BFGS method, Conjugate gradient methods, Gradient Lipschitz continuity, Global convergence