
Two Classes Of Modified Conjugate Gradient Methods Based On The DK Method

Posted on: 2019-10-09    Degree: Master    Type: Thesis
Country: China    Candidate: L L Zhang    Full Text: PDF
GTID: 2370330545472474    Subject: Operational Research and Cybernetics

Abstract/Summary:
The conjugate gradient method is an important tool for solving large-scale unconstrained optimization problems. To obtain methods with both strong theoretical properties and good numerical performance, we propose two classes of modified conjugate gradient methods based on the DK method of Dai Yuhong and Kou Caixia; both classes satisfy the sufficient descent property.

Chapter 1 introduces the necessary background: several common line searches, the research status of nonlinear conjugate gradient methods, two important assumptions, and a key lemma. It closes with the two procedures used to process the numerical experimental data and an outline of the main work of this thesis.

In Chapter 2, based on the DK conjugate gradient algorithm, we propose a family of conjugate gradient methods (the MDK method) whose search direction is close to that of the modified self-scaling memoryless BFGS (MSSML-BFGS) method. We prove that the method possesses the sufficient descent property and that, under the improved Wolfe line search, it converges globally for uniformly convex functions. Following the truncated DK+ method, we obtain a truncated MDK+ method, which converges globally for general functions under the improved Wolfe line search. Numerical results show that the MDK and MDK+ methods are slightly better than the DK+ method, and both outperform the HZ+ method.

In Chapter 3, combining a modified secant condition proposed by Saman Babaie-Kafaki with the DK method, we propose another class of modified conjugate gradient methods (the SMDK method). We prove that this method also has the sufficient descent property and, under the improved Wolfe line search, converges globally for uniformly convex functions. Following the truncated DK+ method, we obtain a truncated SMDK+ method, which converges globally for general functions under the improved Wolfe line search. Numerical results show that the SMDK and SMDK+ methods are slightly better than the DK+ method, and both outperform the HZ+ method.
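To make the iteration scheme concrete, the following is a minimal, dependency-free sketch of a generic nonlinear conjugate gradient method. It is not the thesis's MDK or SMDK method: it uses a Hager-Zhang-type truncated beta (the HZ+ baseline the abstract compares against, and a close relative of the DK family), and a plain backtracking (Armijo) line search in place of the improved Wolfe line search, purely for brevity. The test function and all parameter values are illustrative assumptions.

```python
# Illustrative sketch only: generic nonlinear CG with a Hager-Zhang-type
# truncated beta, NOT the thesis's MDK/SMDK methods. Uses backtracking
# (Armijo) instead of the improved Wolfe line search for brevity.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def axpy(alpha, x, y):  # alpha * x + y, componentwise
    return [alpha * xi + yi for xi, yi in zip(x, y)]

def scale(alpha, x):
    return [alpha * xi for xi in x]

def norm(x):
    return dot(x, x) ** 0.5

# Assumed 2-D test problem: f(x) = (x0 - 3)^2 + 10 (x1 + 1)^2,
# minimized at (3, -1).
def f(x):
    return (x[0] - 3.0) ** 2 + 10.0 * (x[1] + 1.0) ** 2

def grad(x):
    return [2.0 * (x[0] - 3.0), 20.0 * (x[1] + 1.0)]

def backtracking(x, d, g, c1=1e-4, shrink=0.5):
    """Armijo backtracking line search (stand-in for improved Wolfe)."""
    alpha, fx, slope = 1.0, f(x), dot(g, d)
    while f(axpy(alpha, d, x)) > fx + c1 * alpha * slope:
        alpha *= shrink
    return alpha

def cg(x, tol=1e-6, max_iter=500):
    g = grad(x)
    d = scale(-1.0, g)                      # initial steepest-descent step
    for _ in range(max_iter):
        if norm(g) < tol:
            break
        alpha = backtracking(x, d, g)
        x_new = axpy(alpha, d, x)
        g_new = grad(x_new)
        y = [gn - gi for gn, gi in zip(g_new, g)]
        dy = dot(d, y)
        if abs(dy) < 1e-16:
            beta = 0.0                      # degenerate curvature: restart
        else:
            # Hager-Zhang-type beta, truncated from below (the "+" device)
            hz = axpy(-2.0 * dot(y, y) / dy, d, y)
            beta = dot(hz, g_new) / dy
            eta = -1.0 / (norm(d) * min(0.01, norm(g)))
            beta = max(beta, eta)
        d = axpy(beta, d, scale(-1.0, g_new))   # d = -g_new + beta * d
        if dot(d, g_new) >= 0.0:
            d = scale(-1.0, g_new)          # safeguard: force descent
        x, g = x_new, g_new
    return x

x_star = cg([0.0, 0.0])
```

With an Armijo-only line search the sufficient descent property is not guaranteed by the beta formula alone, hence the explicit restart safeguard; under the improved Wolfe line search, as in the thesis, such methods achieve sufficient descent and global convergence without that fallback.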
Keywords/Search Tags: conjugate gradient methods, sufficient descent property, improved Wolfe line search, global convergence