Study On Dai-Liao Conjugate Gradient Method And Three-term Conjugate Gradient Method

Posted on: 2020-04-01
Degree: Master
Type: Thesis
Country: China
Candidate: K K Zhang
Full Text: PDF
GTID: 2370330602450568
Subject: Applied Mathematics
Abstract/Summary:
Optimization theory and methods form a highly applicable discipline. With the rapid development of science and technology, large-scale optimization problems are emerging rapidly in practice, so the design of fast and efficient optimization algorithms has become a focus of research. The conjugate gradient (CG) method is one of the most important methods for solving optimization problems: it has low storage requirements, a simple iterative scheme, and high stability. These advantages enable CG methods to meet the needs of big data and cloud computing. The CG method based on the Dai-Liao (DL) model is among the most effective and numerically stable variants; the key to the DL method is the choice of the parameter in the DL model, which has attracted wide attention from many scholars. With the emergence of ever larger problems, subspace techniques have become particularly important and are widely applied in the field of optimization. Recently, many scholars have combined subspace techniques with the three-term CG method and proposed effective algorithms; this new line of research on three-term CG methods can reduce the amount of computation and improve numerical performance. In this thesis, based on these two ideas, two different kinds of conjugate gradient algorithms are proposed. The specific work is as follows:

1. Based on the Dai-Liao conjugate gradient method, a new choice of the DL parameter is proposed, yielding a new CG method based on the DL model. By minimizing an upper bound on the spectral condition number of the matrix defining the search direction, so as to cluster all of its singular values, we obtain a better DL parameter, and the search direction with this parameter satisfies the sufficient descent condition. Under mild assumptions on the objective function, the global convergence of the proposed method under the Wolfe line search conditions is proved for both uniformly convex functions and general functions. Numerical experiments on the CUTEr library and the test problem collection given by Andrei show that the proposed method outperforms the method M1 of Babaie-Kafaki and Ghanbari, CG_DESCENT (5.3), and CGOPT.

2. By applying the idea of subspace minimization to the three-term conjugate gradient method, a new adaptive subspace three-term CG algorithm is proposed. At each iteration the subspace is selected dynamically, and adaptive selection criteria for the search direction under the different subspaces are given. Under mild assumptions, we prove two important properties of the search direction. Moreover, under non-monotone line search conditions, the global convergence of the proposed algorithm for general functions is proved. Numerical experiments show that the new three-term CG algorithm performs better than the classical CG_DESCENT (5.3), CGOPT, and SMCG_BB.
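To make the DL framework concrete, the following is a minimal sketch of a Dai-Liao-type CG iteration on a strictly convex quadratic, using the standard DL direction d_{k+1} = -g_{k+1} + beta_k d_k with beta_k = g_{k+1}^T(y_k - t s_k) / (d_k^T y_k), where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. The thesis derives an adaptive choice of the parameter t by minimizing a condition-number bound; that derivation is not reproduced here, so a fixed t is used purely for illustration.

```python
import numpy as np

def dai_liao_cg(A, b, x0, t=0.1, tol=1e-8, max_iter=200):
    """Minimize f(x) = 0.5 x'Ax - b'x with a Dai-Liao-type CG method.

    The DL parameter t is fixed here for illustration only; the thesis
    proposes an adaptive choice instead.
    """
    x = x0.copy()
    g = A @ x - b          # gradient of the quadratic
    d = -g                 # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))   # exact line search (quadratic case)
        x_new = x + alpha * d
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        # Dai-Liao direction: d_{k+1} = -g_{k+1} + beta_k d_k,
        # beta_k = g_{k+1}'(y_k - t s_k) / (d_k' y_k)
        beta = (g_new @ (y - t * s)) / (d @ y)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

With exact line search on a quadratic, g_{k+1}^T s_k = 0, so the DL direction reduces to the Hestenes-Stiefel one and the method converges in at most n steps; in the general nonlinear setting treated by the thesis, an inexact (Wolfe) line search is used instead.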
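For the second contribution, a classical three-term direction helps illustrate why three-term CG methods guarantee sufficient descent. The sketch below uses the well-known Hestenes-Stiefel-based three-term form d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k, not the thesis's adaptive subspace rule (which chooses among subspaces dynamically and is not reproduced here). With this choice, g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 holds identically, independent of the line search.

```python
import numpy as np

def three_term_direction(g_new, d, y):
    """Classical three-term CG direction (illustrative, not the thesis's rule):
        d_{k+1} = -g_{k+1} + beta*d_k - theta*y_k,
        beta  = g_{k+1}'y_k / (d_k'y_k),
        theta = g_{k+1}'d_k / (d_k'y_k).
    The beta and theta terms cancel in g_{k+1}'d_{k+1}, so the direction
    satisfies g_{k+1}'d_{k+1} = -||g_{k+1}||^2 exactly (sufficient descent).
    """
    dy = d @ y
    beta = (g_new @ y) / dy
    theta = (g_new @ d) / dy
    return -g_new + beta * d - theta * y
```

The built-in descent property is what makes three-term directions attractive under non-monotone line searches, where descent cannot be enforced through the step size alone.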
Keywords/Search Tags: Conjugate gradient, Dai-Liao model, Subspace minimization, Three-term conjugate gradient method, Line search condition, Sufficient descent condition, Global convergence