
A Class Of Nonlinear Conjugate Gradient Methods

Posted on: 2008-09-11
Degree: Master
Type: Thesis
Country: China
Candidate: X F Zheng
GTID: 2190360215475122
Subject: Applied Mathematics

Abstract/Summary:
This thesis studies the global convergence of nonlinear conjugate gradient methods for solving unconstrained optimization problems. It consists of three parts.

First, we briefly review the origin, development, and characteristics of nonlinear conjugate gradient methods, and introduce several important update formulas together with their background. Among these formulas, the DY (Dai-Yuan) method is well known for its good intrinsic properties. We present a new algorithm that combines the DY method with a Wolfe-type line search, and we establish its global convergence for strictly convex objective functions under suitable conditions, without requiring the descent property.

Second, we propose a new conjugate gradient formula and analyze two of its properties; in particular, the new formula produces a descent direction without any line search whenever the objective function is strictly convex. Using a hybrid of the new formula and the DY formula, we develop a conjugate gradient algorithm for unconstrained optimization with Wolfe line search rules, and we prove its global convergence without the descent condition.

Finally, we propose a hybrid of the new formula and the HS (Hestenes-Stiefel) formula. A new algorithm is developed from this hybrid formula together with the Wolfe line search rule; it satisfies Property (*). We prove the global convergence of the new algorithm under two standard assumptions and the sufficient descent condition.
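The thesis's new formula and its hybrids are not reproduced in this abstract, so as background the following sketch shows only the classical DY scheme the work builds on: a nonlinear conjugate gradient iteration with the DY update beta_k = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)) and a weak Wolfe line search. The test function, tolerances, and bisection line search are illustrative assumptions, not the thesis's algorithm.

```python
# Illustrative sketch (not the thesis's new method): Dai-Yuan nonlinear
# conjugate gradient with a weak Wolfe line search, on an assumed strictly
# convex test function f(x, y) = x^2 + 2*y^2.

def f(x):
    return x[0] ** 2 + 2.0 * x[1] ** 2

def grad(x):
    return [2.0 * x[0], 4.0 * x[1]]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def wolfe_step(x, d, c1=1e-4, c2=0.9):
    # Bisection search for a step length satisfying the weak Wolfe conditions:
    #   f(x + a d) <= f(x) + c1 * a * g^T d   (sufficient decrease)
    #   grad(x + a d)^T d >= c2 * g^T d       (curvature)
    lo, hi, alpha = 0.0, float("inf"), 1.0
    fx, gd = f(x), dot(grad(x), d)  # gd < 0 for a descent direction
    for _ in range(60):
        xn = [xi + alpha * di for xi, di in zip(x, d)]
        if f(xn) > fx + c1 * alpha * gd:
            hi = alpha                      # step too long
        elif dot(grad(xn), d) < c2 * gd:
            lo = alpha                      # step too short
        else:
            return alpha
        alpha = 0.5 * (lo + hi) if hi < float("inf") else 2.0 * lo
    return alpha

def dy_cg(x0, tol=1e-8, max_iter=200):
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]                   # first direction: steepest descent
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:
            break
        alpha = wolfe_step(x, d)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # DY formula: beta = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k));
        # under the Wolfe conditions the denominator is positive.
        y = [gn - go for gn, go in zip(g_new, g)]
        beta = dot(g_new, g_new) / dot(d, y)
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x
```

For example, `dy_cg([3.0, -2.0])` converges to the minimizer at the origin. The HS variant mentioned in the third part would replace the DY numerator with `dot(g_new, y)`; the hybrid rules studied in the thesis mix these choices of beta.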
Keywords: unconstrained optimization, conjugate gradient method, line search, global convergence