
Hybrid Conjugate Gradient Methods With Global Convergence Property Under The Wolfe Line Search

Posted on: 2017-03-14
Degree: Master
Type: Thesis
Country: China
Candidate: H M Chen
Full Text: PDF
GTID: 2180330485470484
Subject: Operational Research and Cybernetics

Abstract/Summary:
Nonlinear conjugate gradient methods are an important class of methods for solving large-scale unconstrained optimization problems: their iterative form is simple, their computational cost per iteration is small, and their storage requirements are low. The conjugate gradient method converges faster than the steepest descent method and requires less storage than Newton's method. However, to guarantee global convergence, most existing conjugate gradient methods require the line search to satisfy the strong Wolfe conditions. This thesis focuses on weakening that assumption: we propose several hybrid conjugate gradient methods that are globally convergent whenever the standard Wolfe conditions are satisfied. The thesis is organized as follows.

In Chapter 1, the steps and related concepts of conjugate gradient methods are introduced, and some classical conjugate gradient methods and their research status are reviewed.

In Chapter 2, building on some existing hybrid conjugate gradient methods, we propose two hybrid conjugate gradient methods that have the descent property independently of the line search used. We prove that both methods are globally convergent under the Wolfe line search conditions. The numerical results reported in this chapter show that the performance of our methods is comparable with that of some well-established conjugate gradient methods.

In Chapter 3, building on the DL and DHS methods, we propose three hybrid conjugate gradient methods that have the sufficient descent property under the Wolfe conditions. We prove that two of the methods converge globally under the Wolfe conditions, and the third converges globally under the strong Wolfe conditions.
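For reference, the nonlinear conjugate gradient framework and the two line-search conditions contrasted in this abstract can be stated as follows (these are the textbook definitions, not the thesis's specific formulas):

```latex
\begin{align*}
&x_{k+1} = x_k + \alpha_k d_k, \qquad
  d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_{k+1} d_k, \\[4pt]
&\text{standard Wolfe conditions } (0 < c_1 < c_2 < 1): \\
&\qquad f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k\, g_k^{\mathsf T} d_k, \\
&\qquad g(x_k + \alpha_k d_k)^{\mathsf T} d_k \ge c_2\, g_k^{\mathsf T} d_k, \\[4pt]
&\text{strong Wolfe curvature condition: } \quad
  \bigl|\, g(x_k + \alpha_k d_k)^{\mathsf T} d_k \,\bigr| \le -c_2\, g_k^{\mathsf T} d_k,
\end{align*}
```

where \(g_k = \nabla f(x_k)\) and the scalar \(\beta_{k+1}\) is what distinguishes the individual methods (HS, FR, PRP, DY, DL, etc.); hybrid methods combine or truncate several choices of \(\beta_{k+1}\). The strong Wolfe curvature condition is strictly tighter than the standard one, which is why proving global convergence under only the standard conditions, as this thesis does, weakens the usual assumption.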
The numerical results show that the performance of the three methods proposed in this chapter is comparable with that of some well-established conjugate gradient methods.

In Chapter 4, building on the DL and JHS methods, we propose three hybrid conjugate gradient methods that have the descent property independently of the line search used. We prove that two of the methods are globally convergent under the Wolfe conditions, and the third converges globally under the strong Wolfe conditions. The numerical results show that two of the methods proposed in this chapter perform comparably with some well-established conjugate gradient methods, while the third outperforms the others.
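As an illustration of the kind of scheme the thesis studies, the sketch below runs a generic hybrid conjugate gradient method with a standard (not strong) Wolfe line search on a small convex quadratic. The hybrid rule used here, β = max(0, min(β_HS, β_DY)), is a classical truncation-style hybrid chosen for illustration; it is not claimed to be one of the thesis's proposed methods, and the test problem is invented for this example.

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def f(x):
    # Example problem: f(x) = 0.5 x^T A x - b^T x, A = [[4, 1], [1, 3]], b = [1, 2]
    x1, x2 = x
    return 0.5 * (4*x1*x1 + 2*x1*x2 + 3*x2*x2) - x1 - 2*x2

def grad(x):
    x1, x2 = x
    return [4*x1 + x2 - 1, x1 + 3*x2 - 2]

def wolfe_line_search(x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection/expansion search for a step satisfying the standard Wolfe
    conditions: sufficient decrease (Armijo) plus the curvature condition."""
    phi0, dphi0 = f(x), dot(grad(x), d)
    lo, hi, alpha = 0.0, float('inf'), 1.0
    for _ in range(max_iter):
        xa = [xi + alpha * di for xi, di in zip(x, d)]
        if f(xa) > phi0 + c1 * alpha * dphi0:       # Armijo fails: shrink step
            hi = alpha
        elif dot(grad(xa), d) < c2 * dphi0:         # curvature fails: grow step
            lo = alpha
        else:
            return alpha                            # both Wolfe conditions hold
        alpha = 0.5 * (lo + hi) if hi < float('inf') else 2.0 * alpha
    return alpha

def hybrid_cg(x, tol=1e-8, max_iter=500):
    g = grad(x)
    d = [-gi for gi in g]                           # initial steepest-descent direction
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 <= tol:
            break
        alpha = wolfe_line_search(x, d)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        y = [gn - gi for gn, gi in zip(g_new, g)]
        dy = dot(d, y)
        if abs(dy) < 1e-16:
            beta = 0.0                              # safeguard: restart
        else:
            beta_hs = dot(g_new, y) / dy            # Hestenes-Stiefel parameter
            beta_dy = dot(g_new, g_new) / dy        # Dai-Yuan parameter
            beta = max(0.0, min(beta_hs, beta_dy))  # hybrid truncation rule
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        if dot(g_new, d) >= 0:                      # not a descent direction:
            d = [-gn for gn in g_new]               # restart with steepest descent
        g = g_new
    return x

x_star = hybrid_cg([0.0, 0.0])
# The exact minimizer of this quadratic solves A x = b, i.e. x = (1/11, 7/11).
```

The descent-direction safeguard mirrors the concern raised in the abstract: the methods proposed in the thesis are constructed so that the descent (or sufficient descent) property holds by design, rather than being enforced by restarts.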
Keywords/Search Tags:Conjugate Gradient Method, Hybrid Conjugate Gradient Method, Wolfe Line Search, Sufficient Descent, Global Convergence