Improved Conjugate Gradient Methods Based On DDL And DLVHS Methods

Posted on: 2019-07-05  Degree: Master  Type: Thesis
Country: China  Candidate: X Liu  Full Text: PDF
GTID: 2370330545972474  Subject: Operational Research and Cybernetics
Abstract/Summary:
This thesis builds on existing research on nonlinear conjugate gradient algorithms, focusing on the study and modification of the DDL and DLVHS methods. To obtain new conjugate gradient algorithms with better theoretical properties and numerical performance, several modified conjugate gradient algorithms are proposed under the Wolfe and strong Wolfe line searches.

Chapter 1 briefly introduces several common algorithms for solving unconstrained optimization problems, together with their advantages and disadvantages. It also surveys the current state of research on conjugate gradient algorithms and presents some theoretical background for the algorithms studied in this thesis.

Chapter 2 further studies and revises the DDL methods proposed by Saman Babaie-Kafaki and Reza Ghanbari [44]. Two modified DDL methods, VDDL1 and VDDL2, are proposed; the search directions of both methods satisfy the sufficient descent condition. It is proved that both methods converge globally for uniformly convex functions under either the Goldstein line search or the Wolfe line search. In addition, borrowing the truncation idea of the HZ+ method, truncated versions of VDDL1 and VDDL2, denoted VDDL1+ and VDDL2+, are proposed. The truncated methods also satisfy the sufficient descent condition, the line search condition is weakened to the Wolfe line search, and global convergence is established for general functions. In the numerical experiments, the approximate Wolfe line search proposed by Hager and Zhang [37] is used, and the proposed methods are compared with the original (pre-modification) methods and with the best currently recognized numerical methods. The results show that the new methods are effective.

Chapter 3, based on the MDL method proposed by Saman Babaie-Kafaki and Reza Ghanbari [45] combined with the DLVHS method proposed by Yao Shengwei et al. [46], proposes a modified DLVHS conjugate gradient method, called the MDLVHS method. When the step size satisfies the strong Wolfe line search conditions, the method satisfies the sufficient descent condition, and its global convergence is proved for uniformly convex functions. Furthermore, using the secant equations of Dai and Wen [50], the MDLVHS method is further modified to obtain the MDLVHS-D method. The new method also satisfies the sufficient descent condition under the strong Wolfe line search, and its global convergence is established for general functions. The numerical results show that the MDLVHS method performs slightly better than the current best DK+ method, and that the MDLVHS-D method is comparable to some of the numerically better methods.
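For reference, the standard conditions the abstract invokes can be stated as follows. These are the textbook forms of the Wolfe and strong Wolfe line searches, the sufficient descent condition, and the Dai-Liao search direction from which DDL- and DLVHS-type methods descend; the thesis-specific parameter choices of VDDL1/VDDL2, VDDL1+/VDDL2+, MDLVHS, and MDLVHS-D are not given in this abstract, so none are reproduced here.

```latex
% Standard definitions; s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k.
\begin{align*}
  &\text{Wolfe: } && f(x_k+\alpha_k d_k) \le f(x_k)+\delta\,\alpha_k g_k^{\top} d_k,
     \quad g(x_k+\alpha_k d_k)^{\top} d_k \ge \sigma\, g_k^{\top} d_k, \\
  &\text{strong Wolfe: } && \lvert g(x_k+\alpha_k d_k)^{\top} d_k \rvert
     \le \sigma\, \lvert g_k^{\top} d_k \rvert, \qquad 0<\delta<\sigma<1, \\
  &\text{sufficient descent: } && g_k^{\top} d_k \le -c\,\lVert g_k \rVert^2
     \ \text{for some constant } c>0, \\
  &\text{Dai--Liao direction: } && d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad
     \beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{\top}(y_k - t\, s_k)}{d_k^{\top} y_k},
     \quad t>0.
\end{align*}
```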
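The sketch below shows how a Dai-Liao-type conjugate gradient loop of the kind discussed above fits together with a (strong) Wolfe line search. It is a minimal illustration, not the thesis's method: the beta formula is the classical Dai-Liao choice rather than any VDDL or MDLVHS variant, and the parameter t, the tolerances, and the restart safeguard are illustrative assumptions. SciPy's `line_search` enforces the strong Wolfe conditions.

```python
import numpy as np
from scipy.optimize import line_search

def dl_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=1000):
    """Generic Dai-Liao-type CG loop (illustrative parameters)."""
    x = x0.astype(float)
    g = grad(x)
    d = -g                              # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Step size satisfying the strong Wolfe conditions (c1 < c2).
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:               # line search failed: restart
            d = -g
            continue
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        denom = d @ y
        # Classical Dai-Liao beta; fall back to restart if denom is tiny.
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a simple quadratic with minimum at (1, 2):
f = lambda x: (x[0] - 1.0)**2 + 2.0 * (x[1] - 2.0)**2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] - 2.0)])
print(dl_cg(f, grad, np.array([0.0, 0.0])))
```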
Keywords/Search Tags:Conjugate gradient method, Wolfe line search, Strong Wolfe line search, Sufficient descent, Global convergence