
Conjugate Gradient Algorithms Based On The Conjugate Gradient Parameter Of Wei-Yao-Liu

Posted on: 2016-11-09    Degree: Master    Type: Thesis
Country: China    Candidate: X L Li    Full Text: PDF
GTID: 2180330461961949    Subject: Operational Research and Cybernetics
Abstract/Summary:
Optimization is a widely applied and rapidly developing discipline, and the unconstrained optimization problem is one of its basic problems. Unconstrained optimization methods include the steepest descent method, Newton's method, the conjugate gradient method, and quasi-Newton methods. This thesis focuses on the conjugate gradient method, the most common method for solving large-scale unconstrained optimization problems, since it requires little storage and is simple and easy to program. It is widely applied in fields such as aerospace, oil exploration, atmospheric simulation, and engineering design. The ideas in this thesis build on domestic and international research results, and the main contributions are summarized as follows.

In Chapter 1, we briefly introduce several common methods for solving unconstrained optimization problems, together with the conjugate gradient method and its related background.

In Chapter 2, based on the VPRP conjugate gradient method, several modified conjugate gradient methods with disturbance factors are proposed. The new methods possess the sufficient descent property and are globally convergent for nonconvex minimization problems under certain line searches, such as the generalized Wolfe line search conditions. Preliminary numerical results show that the new methods are effective.

In Chapter 3, exploiting the complementary advantages of the PRP and FR, HS and DY, and LS and CD conjugate gradient methods, we follow Powell's suggestion of restricting β_k ≥ 0 and the idea of hybrid conjugate gradient methods, and present three new formulas with hybrid parameters. Under the strong Wolfe line search, the new methods satisfy the sufficient descent property and are globally convergent. We also report numerical experiments demonstrating the efficiency of the proposed methods.

In Chapter 4, we make full use of the role of the factors in different conjugate parameters and propose a new hybrid parameter. We then obtain a conjugate gradient method with the sufficient descent property, independent of any line search, and prove its global convergence under the strong Wolfe and generalized Wolfe line searches.
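To illustrate the hybrid conjugate gradient framework described above, the following Python sketch implements a generic nonlinear CG loop with a classical hybrid parameter β_k = max(0, min(β_PRP, β_FR)), which respects Powell's restriction β_k ≥ 0. The thesis's own hybrid formulas are not reproduced in this abstract, so this well-known hybrid stands in for illustration; likewise, a simple backtracking (Armijo) line search replaces the strong/generalized Wolfe searches used in the thesis's convergence analysis.

```python
# Sketch of a hybrid nonlinear conjugate gradient method (illustrative only;
# the thesis's specific parameters and line searches are not shown here).
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    x = x0.astype(float)
    g = grad(x)
    d = -g                                   # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search; the thesis instead uses strong /
        # generalized Wolfe conditions, which also bound the new directional
        # derivative.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        beta_prp = g_new.dot(y) / g.dot(g)       # Polak-Ribiere-Polyak
        beta_fr = g_new.dot(g_new) / g.dot(g)    # Fletcher-Reeves
        beta = max(0.0, min(beta_prp, beta_fr))  # hybrid, enforces beta_k >= 0
        d = -g_new + beta * d
        # Safeguard: restart with steepest descent if d is not a descent direction.
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from a standard starting point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = hybrid_cg(f, grad, np.array([-1.2, 1.0]))
```

The restart safeguard guarantees that every search direction is a descent direction, so the Armijo loop always terminates; the sufficient descent property studied in the thesis strengthens this by bounding g_k·d_k away from zero independently of the line search.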
Keywords/Search Tags:Unconstrained optimization, conjugate gradient method, sufficient descent property, global convergence, disturbance factors, line search