
Study On Several Conjugate Gradient Methods

Posted on: 2015-11-07  Degree: Master  Type: Thesis
Country: China  Candidate: W B Ao  Full Text: PDF
GTID: 2180330431978752  Subject: Operational Research and Cybernetics
Abstract/Summary:
For solving large-scale unconstrained optimization problems, the conjugate gradient method has several advantages over Newton's method and quasi-Newton methods: the algorithm is simple, easy to program, and has small storage requirements. It is therefore an important method for large-scale unconstrained optimization. This thesis studies conjugate gradient algorithms with the sufficient descent property and verifies their effectiveness on a large set of numerical test functions. The main work is as follows. Chapter 2 gives a modified spectral DY conjugate gradient algorithm: by correcting the spectral DY iterative formula, the search direction satisfies the sufficient descent condition independently of the line search, and global convergence is proved for uniformly convex functions; numerical examples illustrate the effectiveness of the algorithm. Chapter 3 presents a modified LS conjugate gradient algorithm and proves its global convergence under a backtracking line search; good theoretical and numerical results are obtained. Chapter 4 presents a modified LS+ conjugate gradient algorithm and proves its global convergence under the strong Wolfe line search; some numerical results are given.
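To make the general scheme concrete, the following is a minimal sketch of a nonlinear conjugate gradient method using the standard Dai-Yuan (DY) parameter beta_k = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)) together with a backtracking (Armijo) line search, tested on a strongly convex quadratic. This illustrates only the textbook DY method, not the specific modified algorithms of the thesis; all function names and the restart safeguard are this sketch's own choices.

```python
import numpy as np

def cg_dy(f, grad, x0, tol=1e-8, max_iter=1000):
    """Nonlinear conjugate gradient with the Dai-Yuan beta and a
    backtracking (Armijo) line search. Illustrative sketch only."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Safeguard: restart with steepest descent if d is not a
        # descent direction (DY guarantees descent under Wolfe,
        # not under a pure Armijo backtracking search).
        if g @ d >= 0:
            d = -g
        # Backtracking line search on the Armijo condition
        alpha, rho, c = 1.0, 0.5, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Dai-Yuan conjugacy parameter:
        # beta = ||g_new||^2 / (d^T (g_new - g))
        denom = d @ (g_new - g)
        beta = (g_new @ g_new) / denom if abs(denom) > 1e-16 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Strongly convex quadratic test: f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_dy(f, grad, np.zeros(2))
print(x_star)
```

On a uniformly convex objective such as this quadratic, the iterates converge to the solution of A x = b; for general nonconvex functions the thesis's point is precisely that modified formulas and suitable line searches (backtracking or strong Wolfe) are needed to guarantee sufficient descent and global convergence.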
Keywords/Search Tags:unconstrained optimization, conjugate gradient method, sufficient descent, global convergence