
Two Classes Of Modified Three-Term Conjugate Gradient Methods Based On The Truncation And Secant Equation

Posted on: 2021-07-12
Degree: Master
Type: Thesis
Country: China
Candidate: Z J Chen
Full Text: PDF
GTID: 2480306194490914
Subject: Operational Research and Cybernetics
Abstract/Summary:
The conjugate gradient method is one of the most effective methods for solving large-scale unconstrained optimization problems, owing to its simple iteration formula, low computational cost, low storage requirements, and global convergence. Taking into account the descent property, global convergence, and computational efficiency, and drawing on the characteristics of three-term conjugate gradient methods, the secant condition, and a new truncation form, this thesis builds on the MDL and MDL+ methods proposed by Yao et al. and the secant condition proposed by Zahra Khoshgam et al. to propose two classes of conjugate gradient methods with the sufficient descent property.

First, background on nonlinear conjugate gradient methods is introduced: common line searches, the research background and current state of research, two important assumptions and three important lemmas, two methods for processing numerical experimental data, and the main work of this thesis.

Second, to overcome the shortcomings that the non-negative truncation form of the MDL method does not satisfy the sufficient descent condition and that its global convergence relies on the strong Wolfe condition, this thesis combines the idea of the MSSMLBFGS method with the truncation term of the DK+ method in place of the non-negative truncation. The modified Dai-Liao (MDL) method is thereby improved, and two three-term conjugate gradient methods, NMDL and NMDL+, are given. Both methods are proved to possess the sufficient descent property; under the Wolfe line search, the NMDL method is strongly convergent for uniformly convex functions, and the NMDL+ method is globally convergent for general functions. Numerical experiments show that the NMDL+ method outperforms the HZ+ and DK+ methods.

Finally, because the standard secant equation uses only first-order (gradient) information and ignores objective function values, while some existing secant equations make full use of function values and gradient information but cannot guarantee conjugacy, this thesis combines the secant equations proposed by Saman Babaie-Kafaki et al. and Zahra Khoshgam et al. to give a modified secant equation, which is applied to the NMDL and NMDL+ methods, yielding two globally convergent three-term conjugate gradient methods, SNMDL and SNMDL+. Under the Wolfe line search, it is proved that the SNMDL method is strongly convergent for uniformly convex functions and the SNMDL+ method is globally convergent for general functions. Numerical experiments show that the SNMDL+ method outperforms the HZ+ and DK+ methods.
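
For reference, the standard framework that these methods build on can be sketched as follows, using standard formulations from the conjugate gradient literature (the specific NMDL and NMDL+ search directions and their truncation coefficients are defined in the full text):

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k + \theta_k y_k,
\]
where $g_k = \nabla f(x_k)$, $s_k = x_{k+1} - x_k$, $y_k = g_{k+1} - g_k$, and a generic three-term method adds a third term such as $\theta_k y_k$ to the classical two-term direction $-g_{k+1} + \beta_k d_k$. The Dai--Liao family, of which MDL is a modification, takes
\[
\beta_k^{DL} = \frac{g_{k+1}^{T} y_k}{d_k^{T} y_k} - t\,\frac{g_{k+1}^{T} s_k}{d_k^{T} y_k}, \qquad t > 0.
\]
The step length $\alpha_k$ is computed by the Wolfe line search,
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k g_k^{T} d_k, \qquad \nabla f(x_k + \alpha_k d_k)^{T} d_k \ge \sigma\, g_k^{T} d_k, \qquad 0 < \delta < \sigma < 1,
\]
and the sufficient descent property established for the proposed directions has the form
\[
g_k^{T} d_k \le -c\,\|g_k\|^{2} \quad \text{for some } c > 0 \text{ and all } k.
\]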
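
The secant conditions mentioned above can be sketched in the same way. The standard secant equation uses only gradient differences, whereas modified secant equations in the literature, such as the well-known Zhang-Deng-Chen form shown below, also incorporate objective function values; this is a representative form only, and the modified secant equation actually derived in the thesis is given in the full text:

\[
B_{k+1} s_k = y_k \qquad \text{(standard secant equation, gradient information only)},
\]
\[
B_{k+1} s_k = z_k, \qquad z_k = y_k + \frac{\vartheta_k}{s_k^{T} u_k}\, u_k, \qquad \vartheta_k = 6\bigl(f_k - f_{k+1}\bigr) + 3\bigl(g_k + g_{k+1}\bigr)^{T} s_k,
\]
where $B_{k+1}$ approximates the Hessian $\nabla^{2} f(x_{k+1})$, $f_k = f(x_k)$, and $u_k$ is any vector with $s_k^{T} u_k \neq 0$ (commonly $u_k = s_k$). Replacing $y_k$ by a modified difference vector of this kind in a Dai--Liao type parameter gives secant-based conjugate gradient directions of the sort used to construct the SNMDL and SNMDL+ methods.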
Keywords/Search Tags:Conjugate gradient methods, Sufficient descent property, Truncation, Secant equation, Global convergence