
Improved Conjugate Gradient Method Based On Strong Wolfe Line Search

Posted on: 2022-03-30
Degree: Master
Type: Thesis
Country: China
Candidate: L Yang
Full Text: PDF
GTID: 2480306488973189
Subject: Operational Research and Cybernetics
Abstract/Summary:
Unconstrained optimization theory and methods are the basis of optimization research and an important branch of the optimization field. The conjugate gradient method is one of the most effective methods for solving large-scale unconstrained optimization problems, and it remains a hot topic in the field. Based on the strong Wolfe line search condition, this thesis improves the classical conjugate parameters and spectral parameters, and proposes two improved conjugate gradient methods and a new spectral conjugate gradient method for solving unconstrained optimization problems.

Firstly, based on the second inequality of the strong Wolfe line search condition, the improved IDHS conjugate gradient method and IDP+ conjugate gradient method are proposed, exploiting the good theoretical properties of the Dai-Yuan (DY) conjugate gradient method together with improved Hestenes-Stiefel (HS) and Polak-Ribière-Polyak (PRP) formulas. Under the strong Wolfe line search condition, it is shown that every iteration of the two improved algorithms satisfies the sufficient descent condition, and their global convergence is proved under the usual assumptions. The two algorithms are tested against existing methods with good numerical performance; the numerical results, reported intuitively via performance profiles, show that both improved algorithms are effective.

Secondly, based on a class of effective conjugate-parameter selection techniques and combined with the idea of the spectral conjugate gradient method, a new hybrid conjugate parameter is given. To obtain a large decrease at each iteration, a spectral parameter greater than 1 is designed based on the strong Wolfe line search condition, and a hybrid spectral conjugate gradient method is established. The search direction generated by this algorithm is always sufficiently descending, independent of any line search condition. The global convergence of the algorithm is proved under the standard Wolfe line search condition, and a large number of numerical experiments show that the proposed spectral conjugate gradient method is effective.
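The framework the abstract builds on can be sketched in code. The following is a minimal illustrative implementation of the classical Dai-Yuan (DY) conjugate gradient method with a strong Wolfe line search, not the thesis's improved IDHS/IDP+ or spectral variants (whose exact formulas are not given in the abstract); the bisection-style line search, parameter values, and quadratic test problem are all assumptions for the sake of a runnable example:

```python
import numpy as np

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=60):
    """Return a step size a satisfying the strong Wolfe conditions:
         f(x + a d) <= f(x) + c1 * a * g'd        (sufficient decrease)
         |grad(x + a d)'d| <= c2 * |g'd|          (strong curvature)
       via a simple doubling/bisection scheme (illustrative only)."""
    phi0, dphi0 = f(x), grad(x) @ d     # dphi0 < 0 for a descent direction
    a, lo, hi = 1.0, 0.0, np.inf
    for _ in range(max_iter):
        if f(x + a * d) > phi0 + c1 * a * dphi0:
            hi = a                       # sufficient decrease fails: shrink
        else:
            dphi = grad(x + a * d) @ d
            if abs(dphi) <= -c2 * dphi0:
                return a                 # both strong Wolfe conditions hold
            if dphi > 0:
                hi = a                   # stepped past the 1-D minimizer
            else:
                lo = a                   # still descending: step too short
        a = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * a
    return a

def dy_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Dai-Yuan conjugate gradient method under strong Wolfe line search."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        a = strong_wolfe(f, grad, x, d)
        x = x + a * d
        g_new = grad(x)
        # DY conjugate parameter: beta = ||g_{k+1}||^2 / d_k'(g_{k+1} - g_k);
        # the denominator is positive under the Wolfe curvature condition.
        beta = (g_new @ g_new) / (d @ (g_new - g))
        d = -g_new + beta * d
        g = g_new
    return x

# Illustrative test problem (assumed): a strictly convex quadratic
# f(x) = 0.5 x'Ax - b'x with unique minimizer A^{-1} b = (1, 0.1).
A = np.diag([1.0, 10.0])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = dy_cg(f, grad, np.zeros(2))
```

On this quadratic the iteration drives the gradient norm below the tolerance and recovers the minimizer; the same loop applies to general smooth objectives, which is the setting of the thesis.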
Keywords/Search Tags: unconstrained optimization, conjugate gradient method, spectral conjugate gradient method, sufficient descent property, global convergence