
Study On Several Modified Nonlinear Conjugate Gradient Methods And Their Global Convergence

Posted on: 2017-05-31
Degree: Master
Type: Thesis
Country: China
Candidate: X Q Zhou
Full Text: PDF
GTID: 2180330503478547
Subject: Computational Mathematics
Abstract/Summary:
This paper mainly discusses the global convergence of several modified nonlinear conjugate gradient methods under the corresponding line search conditions. The nonlinear conjugate gradient method is a branch of optimization and, together with optimization theory, has become widely used in production, economics and transportation. Especially for complex large-scale problems, the conjugate gradient method has the advantages of a simple idea, easy programming and small storage requirements, which makes it frequently used in practice and gives the research in this paper practical value.

The main work of this paper consists of two parts: the first is the preliminary knowledge of the conjugate gradient method, including the line search criteria used by the algorithms, the descent property and the convergence of the algorithms; the second is a series of modified conjugate gradient methods that we propose, together with proofs of their descent properties and convergence under the corresponding line search conditions. The main contents of this paper are as follows.

Chapter 1 describes the research background and current state of the field and introduces the basic knowledge of the conjugate gradient method.

In Chapter 2, we propose two modified conjugate gradient methods and prove their global convergence under the strong Wolfe line search. The first, obtained by modifying the PRP conjugate gradient method, satisfies the sufficient descent condition without any line search; the second is a modified HS conjugate gradient method that also satisfies the sufficient descent condition without any line search.

In Chapter 3, two modified DY conjugate gradient methods based on the Dai-Yuan method are proposed for unconstrained optimization problems, and their global convergence is proved under the standard Wolfe line search. One of the two modified methods provides a descent direction with the Wolfe line search; the other keeps a sufficient descent direction without any line search.

In Chapter 4, we prove the global convergence of the DPRP conjugate gradient method proposed by Zhifeng Dai under the generalized Wolfe line search.

Chapter 5 gives a brief summary and outlook on the modified nonlinear conjugate gradient methods and their global convergence under the corresponding line search conditions studied in this paper, laying a theoretical foundation for further numerical computations.
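For reference, the classical formulas behind the methods named in this abstract take the following standard forms; this is only a sketch of the well-known definitions, not the modified formulas proposed in the thesis, which are given in the full text. The conjugate gradient iteration for minimizing a smooth function $f$ is

$$x_{k+1} = x_k + \alpha_k d_k, \qquad d_1 = -g_1, \qquad d_k = -g_k + \beta_k d_{k-1} \ (k \ge 2),$$

where $g_k = \nabla f(x_k)$, $\alpha_k$ is the step length and $\beta_k$ is a scalar. The classical PRP, HS and DY choices of $\beta_k$ are

$$\beta_k^{PRP} = \frac{g_k^{T}(g_k - g_{k-1})}{\|g_{k-1}\|^{2}}, \qquad \beta_k^{HS} = \frac{g_k^{T}(g_k - g_{k-1})}{d_{k-1}^{T}(g_k - g_{k-1})}, \qquad \beta_k^{DY} = \frac{\|g_k\|^{2}}{d_{k-1}^{T}(g_k - g_{k-1})}.$$

The standard Wolfe line search requires $\alpha_k$ to satisfy

$$f(x_k + \alpha_k d_k) \le f(x_k) + \delta \alpha_k g_k^{T} d_k, \qquad g(x_k + \alpha_k d_k)^{T} d_k \ge \sigma g_k^{T} d_k, \qquad 0 < \delta < \sigma < 1,$$

and the strong Wolfe line search replaces the second inequality by $|g(x_k + \alpha_k d_k)^{T} d_k| \le -\sigma g_k^{T} d_k$. One common form of the generalized Wolfe line search bounds the curvature term from both sides, $\sigma_1 g_k^{T} d_k \le g(x_k + \alpha_k d_k)^{T} d_k \le -\sigma_2 g_k^{T} d_k$. A method has the sufficient descent property if there is a constant $c > 0$ with $g_k^{T} d_k \le -c\|g_k\|^{2}$ for all $k$.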
Keywords/Search Tags: conjugate gradient methods, sufficient descent property, descent property, Wolfe line search, strong Wolfe line search, generalized Wolfe line search, global convergence