
The Research On Global Convergence Of Nonlinear Conjugate Gradient Method Including Parameter

Posted on: 2012-09-08  Degree: Master  Type: Thesis
Country: China  Candidate: T Deng  Full Text: PDF
GTID: 2120330335953403  Subject: Applied Mathematics
Abstract/Summary:
In this thesis, taking the DY conjugate gradient method as the main object of study, we investigate the sufficient descent property and global convergence of conjugate gradient methods for unconstrained optimization. The introduction reviews the background, development, and advantages of nonlinear conjugate gradient methods, and describes several classical conjugate gradient methods and their characteristics; owing to their good properties, the PRP and DY conjugate gradient methods have been a research focus for many experts and scholars. The main results are as follows.

In the first part, a new conjugate gradient method for unconstrained optimization problems is proposed under exact line search. The algorithm guarantees sufficient descent of the objective function, and, under differentiability assumptions on the objective function, its global convergence is established.

In the second part, the DY conjugate gradient method is first modified so that it retains good properties for both geometric programming problems and equality-constrained optimization problems. Global convergence of the new algorithm generated by the new formula is proved under the Wolfe line search.

In the third part, combining an existing modified DY conjugate gradient method with a modified HS conjugate gradient method, a hybrid conjugate gradient method for solving unconstrained optimization problems is proposed. The sufficient descent property and strong global convergence of the new algorithm generated by the new formula are proved under the strong Wolfe line search.
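As background for the methods summarized above, the standard DY iteration can be sketched on a two-dimensional quadratic test problem, where the exact line search of the first part has a closed form: d_0 = -g_0, x_{k+1} = x_k + alpha_k d_k, and d_{k+1} = -g_{k+1} + beta_k d_k with the Dai-Yuan parameter beta_k = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)). The test matrix A, vector b, and starting point below are illustrative choices, not taken from the thesis.

```python
def dy_cg(A, b, x, tol=1e-10, max_iter=50):
    """Dai-Yuan conjugate gradient sketch for f(x) = 0.5 x^T A x - b^T x,
    with the exact line search available in the quadratic case."""
    n = len(x)

    def grad(x):
        # gradient of the quadratic: g = A x - b
        return [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(n)]

    g = grad(x)
    d = [-gi for gi in g]                     # initial direction: steepest descent
    for _ in range(max_iter):
        if sum(gi * gi for gi in g) < tol:    # stop when ||g||^2 is small
            break
        # exact line search for a quadratic: alpha = -g.d / d.Ad
        Ad = [sum(A[i][j] * d[j] for j in range(n)) for i in range(n)]
        alpha = -sum(g[i] * d[i] for i in range(n)) / sum(d[i] * Ad[i] for i in range(n))
        x = [x[i] + alpha * d[i] for i in range(n)]
        g_new = grad(x)
        # Dai-Yuan parameter: beta = ||g_new||^2 / d.(g_new - g)
        denom = sum(d[i] * (g_new[i] - g[i]) for i in range(n))
        beta = sum(gi * gi for gi in g_new) / denom
        d = [-g_new[i] + beta * d[i] for i in range(n)]
        g = g_new
    return x

# illustrative problem: A = [[4, 1], [1, 3]], b = [1, 2]; minimizer solves A x = b
sol = dy_cg([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0], [0.0, 0.0])
```

With exact line search on a strictly convex quadratic, the DY direction coincides with the linear conjugate gradient direction, so the iteration terminates in at most n steps; for general nonlinear objectives the thesis replaces the closed-form step with (strong) Wolfe line searches, which is what the convergence analysis hinges on.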
Keywords/Search Tags: Unconstrained Optimization, Nonlinear Conjugate Gradient Method, Line Search, Global Convergence, Sufficient Descent Direction