Some Classes Of Modified Dai-Kou Conjugate Gradient Methods

Posted on: 2018-03-05 | Degree: Master | Type: Thesis
Country: China | Candidate: H D Zhou | Full Text: PDF
GTID: 2310330515494381 | Subject: Computational Mathematics
Abstract/Summary:
Nonlinear conjugate gradient methods are an important class of methods for solving large-scale unconstrained optimization problems because of their merits: simple iterations, low storage requirements, low computational cost, and so on. To obtain conjugate gradient methods with better theoretical properties and numerical performance, we build on the Dai-Kou (DK) method and propose several modified Dai-Kou nonlinear conjugate gradient methods that are globally convergent under the improved Wolfe line search.

In Chapter 1, we introduce some basic concepts of conjugate gradient methods, several classical conjugate gradient methods, and the current state of research on them.

In Chapter 2, following the best-approximation idea of the DK method, we seek the conjugate gradient direction closest to the direction of a modified scaled memoryless BFGS method and propose a family of new conjugate gradient methods (called NDK methods), which are globally convergent for uniformly convex functions under the improved Wolfe line search. In analogy with the DK+ method, we also propose a modified conjugate gradient method (called the NDK+ method) and establish its convergence for general functions under the improved Wolfe line search. Numerical results show that the NDK methods perform better than the DK methods.

In Chapter 3, combining two modified secant conditions, we propose two kinds of modified NDK conjugate gradient methods (called MNDK1 and MNDK2 methods) and prove their global convergence under the improved Wolfe line search. Numerical results show that the MNDK1 and MNDK2 methods perform better than the DK methods.

In Chapter 4, combining the spectral conjugate gradient approach, we propose a family of spectral conjugate gradient methods (called NDKS methods) that possess the sufficient descent property independently of the line search, and we prove the global convergence of the NDKS method for uniformly convex functions under the improved Wolfe line search. To obtain global convergence for general functions, we propose a modified spectral conjugate gradient method (called the NDKS+ method) and establish its convergence for general functions under the modified Wolfe line search. Numerical results show that the NDKS+ method performs better than the DK method.
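For reference, the methods discussed above all share the generic nonlinear conjugate gradient iteration, stated here in standard notation; the specific choices of \beta_k that define the NDK, NDK+, MNDK, and NDKS methods are developed in the thesis itself and are not reproduced here:

    x_{k+1} = x_k + \alpha_k d_k,    d_0 = -g_0,    d_k = -g_k + \beta_k d_{k-1}  (k \ge 1),

where g_k = \nabla f(x_k) and the step size \alpha_k is chosen by a line search. The standard Wolfe conditions, of which the improved Wolfe line search used in the thesis is a relaxation (its sufficient-decrease condition is weakened to accept more steps), read

    f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^T d_k,
    \nabla f(x_k + \alpha_k d_k)^T d_k \ge \sigma g_k^T d_k,    0 < \delta < \sigma < 1.

The sufficient descent property cited in the keywords means g_k^T d_k \le -c \|g_k\|^2 for some constant c > 0 and all k, and a spectral conjugate gradient direction, as in Chapter 4, takes the form d_k = -\theta_k g_k + \beta_k d_{k-1} with a spectral parameter \theta_k > 0.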
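As a concrete illustration of the algorithmic skeleton these methods share, the following is a minimal Python sketch of a nonlinear conjugate gradient loop. It is not the thesis's algorithm: it uses SciPy's strong Wolfe line search in place of the improved Wolfe line search, and the Polak-Ribiere+ rule as a placeholder for \beta_k; an NDK-type formula would slot in at the marked line, and all names here are illustrative.

import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG skeleton; beta_k is a placeholder PR+ rule."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Strong Wolfe line search (stand-in for the improved Wolfe search).
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:        # line search failed; restart along -g
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Placeholder beta: Polak-Ribiere+ (an NDK-type beta would go here).
        y = g_new - g
        beta = max(0.0, g_new @ y / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

A typical call is nonlinear_cg(f, grad, x0) with f returning a scalar and grad its gradient (e.g., f = lambda x: 0.5 * x @ x and grad = lambda x: x). Note that the loop stores only a few vectors of the problem dimension, which is the low-storage merit of conjugate gradient methods mentioned above.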
Keywords/Search Tags:Nonlinear conjugate gradient methods, Sufficient descent property, Global convergence, Improved Wolfe line search