
Some Research on Nonlinear Conjugate Gradient Methods

Posted on: 2013-01-31  Degree: Master  Type: Thesis
Country: China  Candidate: S S Meng  Full Text: PDF
GTID: 2210330374961672  Subject: Applied Mathematics
Abstract/Summary:
Conjugate gradient methods are a class of effective methods for solving unconstrained optimization problems. Because of the simplicity of the algorithm, its small storage requirements, fast convergence, and ease of implementation, conjugate gradient methods are often used to solve large-scale optimization problems. This paper mainly studies the descent property and the global convergence of some nonlinear conjugate gradient methods for solving unconstrained optimization problems. This master's thesis consists of four parts:

The first part briefly introduces the research background and current state of some common nonlinear conjugate gradient methods, hybrid conjugate gradient methods, and memory gradient methods.

The second part proposes a new conjugate gradient formula and proves, under the strong Wolfe line search conditions, that the resulting algorithm has the descent property and is globally convergent. Numerical experiments show that the given algorithm is effective.

The third part first proposes a new conjugate gradient formula, then combines it with the DY formula to obtain a new hybrid conjugate gradient formula. Under the strong Wolfe line search conditions, a new hybrid conjugate gradient algorithm is proposed and its global convergence is verified. Numerical experiments show that the given algorithm is effective.

The fourth part proposes a new class of memory gradient methods, proves their global convergence under the strong Wolfe line search, and then proves that the new algorithm has a linear convergence rate. Numerical experiments show that the given algorithm is effective.
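The common setting throughout the abstract, a nonlinear conjugate gradient iteration driven by a strong Wolfe line search, can be sketched as follows. The thesis's new formulas are not reproduced on this page, so this sketch uses the classical Dai-Yuan (DY) beta, which the third part hybridizes with, as a stand-in; `scipy.optimize.line_search` enforces the strong Wolfe conditions.

```python
import numpy as np
from scipy.optimize import line_search

def cg_dy(f, grad, x0, tol=1e-6, max_iter=200):
    """Nonlinear conjugate gradient method with the Dai-Yuan (DY) beta.

    The step size is computed by scipy's line_search, which enforces the
    strong Wolfe conditions (sufficient decrease with c1, strong curvature
    with c2). This is an illustrative sketch, not the thesis's algorithm.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                   # line search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta_dy = (g_new @ g_new) / (d @ y)  # Dai-Yuan formula
        d = -g_new + beta_dy * d
        x, g = x_new, g_new
    return x

# Usage on a convex quadratic f(x) = 0.5 x^T A x - b^T x (minimizer solves Ax = b)
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_dy(f, grad, np.zeros(2))
```

Note that the strong Wolfe curvature condition guarantees d^T y > 0 along a descent direction, so the DY denominator stays positive; this is precisely why DY-type methods admit global convergence proofs under the strong Wolfe line search, the property the abstract repeatedly invokes.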
Keywords/Search Tags: conjugate gradient methods, hybrid conjugate gradient methods, memory gradient methods, strong Wolfe line search, global convergence