
Modified Conjugate Gradient Methods Based On Secant Condition And Approximation Idea

Posted on: 2021-12-14
Degree: Master
Type: Thesis
Country: China
Candidate: L Xie
Full Text: PDF
GTID: 2480306194490774
Subject: Computational Mathematics
Abstract/Summary:
The nonlinear conjugate gradient method is an important algorithm for solving large-scale unconstrained optimization problems, owing to its simple iteration scheme and its low demands on computation and storage. Starting from sufficient descent, global convergence, and numerical performance, this thesis further studies and modifies the conjugate gradient method and the spectral conjugate gradient method, and presents modified conjugate gradient methods whose sufficient descent property does not depend on the line search.

In the first part, several line searches commonly used in unconstrained optimization and several classical conjugate gradient methods, together with their respective advantages and disadvantages, are introduced. The recent research status of nonlinear conjugate gradient methods is then surveyed, followed by two basic assumptions and the key lemmas used in proving the convergence of nonlinear conjugate gradient methods. Finally, the main work of this thesis is briefly outlined.

In the second part, based on a modified secant condition and combined with the conjugate gradient method proposed by Liu et al., a new conjugate gradient method (the MD method) is derived. To obtain a cleaner theoretical analysis, the MD method is further modified into a corresponding conjugate gradient method (the MD+ method). Theoretically, both the MD and MD+ methods satisfy the sufficient descent condition independently of the line search. Under the Wolfe line search, the MD method is strongly convergent for uniformly convex functions, and the MD+ method is globally convergent for general functions. Numerically, the MD and MD+ methods perform slightly better than the DK+ method.

In the third part, based on the approximation idea, the MD conjugate gradient method of the second part is combined with the spectral gradient method to obtain an effective choice of the spectral parameter. To ensure sufficient descent, a new spectral conjugate gradient method (the MJC method) is proposed by applying a double truncation technique to the spectral parameter. Theoretically, the MJC method satisfies the sufficient descent condition independently of the line search, and it is globally convergent for general functions under the Wolfe line search. Numerically, the MJC method clearly outperforms the DK+ method.
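The abstract does not give the update formulas for the MD, MD+, or MJC methods, so the following is only a hedged sketch of the general framework they belong to: a nonlinear conjugate gradient iteration with a Wolfe line search and a sufficient-descent safeguard. The classical PRP+ rule stands in as a placeholder for the thesis' beta formula, and SciPy's `line_search` supplies a step length satisfying the Wolfe conditions.

```python
# Sketch of a generic nonlinear conjugate gradient method with a Wolfe
# line search.  The PRP+ choice of beta below is a placeholder, NOT the
# MD/MD+/MJC formula from the thesis (those formulas are not stated in
# this abstract).
import numpy as np
from scipy.optimize import line_search

def cg_wolfe(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step length satisfying the Wolfe conditions (SciPy line search).
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:               # search failed: restart along -g
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0]
            if alpha is None:
                alpha = 1e-4            # conservative fallback step
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Placeholder conjugacy parameter: PRP+ rule.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        # Sufficient-descent safeguard: restart if d is not a descent
        # direction (the thesis' methods guarantee this without a check).
        if g_new @ d > -1e-10 * (g_new @ g_new):
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage on a strongly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer is the solution of A x = b.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 2.0, 3.0])
x_min = cg_wolfe(lambda x: 0.5 * x @ A @ x - b @ x,
                 lambda x: A @ x - b,
                 np.zeros(3))
```

The safeguard in the loop illustrates why "sufficient descent independent of the line search" is a desirable property: methods like MD and MJC make the restart test unnecessary by construction.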
Keywords/Search Tags:Conjugate gradient method, Secant condition, Approximation idea, Wolfe line search, Sufficient descent, Global convergence