
Modified Conjugate Gradient Methods With The Sufficient Descent Property Independent Of Line Searches

Posted on: 2016-02-21    Degree: Master    Type: Thesis
Country: China    Candidate: J J Jiao    Full Text: PDF
GTID: 2180330461961700    Subject: Applied Mathematics

Abstract/Summary:
Conjugate gradient methods are important iterative methods for solving unconstrained optimization problems. They combine good convergence properties with low storage requirements, simple algorithms, and easy implementation, so they are widely used to solve large-scale nonlinear optimization problems. However, most traditional conjugate gradient methods may fail to be descent methods when an inexact line search is used, and even for methods that do provide descent directions for the objective function, the descent property often depends strongly on the line search employed. In this thesis, we study several modified conjugate gradient methods for solving unconstrained optimization problems. The new methods not only preserve the good properties of the original methods, but also satisfy the sufficient descent property independently of the line search used.

In Chapter 1, we introduce the background, relevant concepts, and development of conjugate gradient methods.

In Chapter 2, we present two groups of modified conjugate gradient methods. The first group is based on the ideas of [25] and [20]; along the same lines, we propose the MPL+, MMPRP+, MMLS+, MHPL+, MMHP+, MMHL+, and MMPL+ methods. These modified methods always generate a sufficient descent direction independently of any line search. Under the usual assumptions, we prove that the MHSCG+, MHPL+, MMHP+, and MMHL+ methods are globally convergent under the standard Wolfe line search, and that the MPL+, MMPRP+, MMLS+, and MMPL+ methods are globally convergent under the strong Wolfe line search. The second group of methods is motivated by the ideas of [19] and [25]. These modified methods likewise always generate a sufficient descent direction independently of any line search, and under the usual assumptions the new methods are globally convergent.
Finally, extensive numerical experiments are carried out and reported; the results show that the modified methods are promising.

In Chapter 3, we combine the search direction of [32] with the parameters βk of the previous chapter to obtain new conjugate gradient methods. The new methods satisfy the sufficient descent condition g_k^T d_k = -||g_k||^2 and converge globally for general objective functions when the strong Wolfe line search is used. Numerical experiments are given at the end; the results show that the modified conjugate gradient methods perform slightly better.
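To illustrate the kind of construction the abstract describes, the sketch below shows a generic three-term conjugate gradient iteration (in the spirit of modified PRP-type methods, not the thesis's exact MPL+/MMPRP+ variants) in which the third term is chosen to cancel the β_k g_k^T d_{k-1} contribution, so that g_k^T d_k = -||g_k||^2 holds identically, independent of the line search. The function names and the Armijo backtracking step are illustrative assumptions.

```python
import numpy as np

def three_term_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Illustrative three-term CG method whose search direction satisfies
    g_k^T d_k = -||g_k||^2 regardless of the line search used."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent, so g^T d = -||g||^2
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking; any positive step keeps the iteration valid,
        # since sufficient descent is built into the direction itself.
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        if np.linalg.norm(g_new) < tol:
            g = g_new
            break
        beta = g_new @ (g_new - g) / (g @ g)  # PRP-type parameter
        # The third term cancels beta * (g_new @ d) exactly, giving
        # g_new @ d_new = -||g_new||^2 by construction.
        d = (-g_new + beta * d
             - (beta * (g_new @ d) / (g_new @ g_new)) * g_new)
        g = g_new
    return x
```

For example, on the strictly convex quadratic f(x) = Σ(x_i - 1)^2 started from the origin, the iteration drives the gradient below the tolerance and returns a point near the minimizer (1, …, 1). The key design point is that descent never hinges on the Wolfe conditions; they are needed only for the global convergence proofs.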
Keywords/Search Tags:Unconstrained optimization, Conjugate gradient method, Sufficient descent property, Line search, Global convergence