
Research On Two Types Of Conjugate Gradient Methods

Posted on: 2019-01-19
Degree: Master
Type: Thesis
Country: China
Candidate: M Li
Full Text: PDF
GTID: 2370330572458953
Subject: Applied Mathematics
Abstract/Summary:
With the wide application of big data and cloud computing technology, problems with huge amounts of data and complex structure arise throughout real life, and the dimension and scale of the problems to be solved are growing rapidly. Among nonlinear optimization methods, the conjugate gradient (CG) method has the advantages of a simple iterative format, low storage requirements, and good theoretical properties, and it is one of the most effective strategies for solving large-scale systems of linear equations and nonlinear optimization problems. It is therefore widely applied in practice, for example in image processing, engineering mechanics, neural networks, the finite element method, and pattern recognition.

The Dai-Liao (DL) conjugate gradient method is one of the most effective and numerically stable conjugate gradient methods, and the optimal selection of the DL parameter, an important open problem for conjugate gradient methods, has attracted wide attention from many scholars. The subspace minimization conjugate gradient (SMCG) method proposed by Yuan et al. is another important CG method that has recently received continued attention. Building on these two methods and existing nonlinear conjugate gradient methods, this thesis proposes two classes of conjugate gradient methods satisfying the sufficient descent property. The specific work is as follows:

1. For the Dai-Liao conjugate gradient method, a reasonable selection of the DL parameter is proposed, yielding an improved DL conjugate gradient method. By minimizing the distance between the DL direction and the direction of a three-term conjugate gradient method, a conjugate gradient method satisfying both the DL conjugacy condition and the sufficient descent condition is obtained. The improved method is a subfamily of the DL family of conjugate gradient methods and includes the Hager-Zhang family. Under standard assumptions on the objective function, the improved method is globally convergent under the Wolfe line search. Numerical results indicate that the proposed method is promising and outperforms CGOPT and CG_DESCENT on a set of unconstrained optimization test problems. (The quantities involved and a schematic DL iteration are sketched after this abstract.)

2. By extending the idea of the Barzilai-Borwein conjugate gradient (BBCG) method proposed by Dai et al. to general large-scale unconstrained optimization problems, an improved SMCG method based on a non-monotone line search is proposed. Combining the ideas of SMCG and BBCG, the two-dimensional subspace in the SMCG method is extended to a three-dimensional subspace, and three candidate subspaces are selected. By minimizing two models of the objective function on the different subspaces, different ways of selecting the search direction are proposed, and, combined with the ideas of the Barzilai-Borwein gradient method and the BBCG method, selection criteria for the search direction are given. Under mild conditions, every choice of search direction has the sufficient descent property. Moreover, under the non-monotone Wolfe line search condition, the improved SMCG method is proved to be globally convergent. Numerical results indicate that the improved SMCG method is also promising and outperforms CGOPT and CG_DESCENT on a set of unconstrained optimization test problems. (Sketches of a subspace-minimization direction and a non-monotone Wolfe test are given below.)
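For reference, the standard quantities underlying both contributions are the CG iteration and the Dai-Liao choice of the parameter beta. Writing s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k, the usual formulation from the CG literature is shown below; the thesis's specific choice of the DL parameter t is its contribution and is not reproduced here.

```latex
% CG iteration with the Dai-Liao parameter t > 0
% (g_k = \nabla f(x_k), s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k)
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0,
\qquad
\beta_k^{\mathrm{DL}} = \frac{g_{k+1}^{\top} y_k - t\, g_{k+1}^{\top} s_k}{d_k^{\top} y_k}.
% Sufficient descent property, for some constant c > 0:
g_k^{\top} d_k \le -c\, \|g_k\|^2 .
```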
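To make the iteration concrete, here is a minimal Python sketch of a generic DL conjugate gradient loop using SciPy's Wolfe line search. The value of t, the fallback step, and the restart safeguard are illustrative assumptions, not the thesis's tuned parameter selection.

```python
import numpy as np
from scipy.optimize import line_search

def dl_cg(f, grad, x0, t=0.1, tol=1e-6, max_iter=1000):
    """Generic Dai-Liao CG loop (illustrative sketch; t is a placeholder,
    not the optimized DL parameter proposed in the thesis)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Wolfe line search; SciPy returns None for alpha on failure
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            alpha = 1e-4  # crude fallback step
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        dy = d @ y
        beta = (g_new @ y - t * (g_new @ s)) / dy if abs(dy) > 1e-12 else 0.0
        d = -g_new + beta * d
        # safeguard: restart with steepest descent if descent is lost
        if d @ g_new >= 0.0:
            d = -g_new
        x, g = x_new, g_new
    return x

# usage on a simple convex quadratic: f(x) = 0.5 x^T A x - b^T x
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
xstar = dl_cg(lambda x: 0.5 * x @ A @ x - b @ x,
              lambda x: A @ x - b, np.zeros(3))
```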
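The subspace-minimization idea can be illustrated on the classical two-dimensional case: the search direction is sought as d = u*g + v*s, minimizing a quadratic model over span{g, s}. The sketch below uses the secant relation B s ≈ y and a BB-style scalar curvature estimate g^T B g ≈ rho * ||g||^2 with rho = (y^T y)/(s^T y); the thesis's actual models, its third subspace dimension, and its selection criteria are not reproduced here.

```python
import numpy as np

def smcg_direction_2d(g, s, y, eps=1e-12):
    """Direction d = u*g + v*s minimizing a quadratic model on span{g, s}.

    Illustrative assumptions: B s ~ y (secant relation) and
    g^T B g ~ rho * ||g||^2 with rho = (y^T y) / (s^T y).
    """
    gg, gs, gy = g @ g, g @ s, g @ y
    sy, yy = s @ y, y @ y
    if sy <= eps:          # curvature condition fails: steepest descent
        return -g
    rho = yy / sy          # Rayleigh-quotient-style curvature estimate
    # Stationarity conditions of m(u, v) = g^T d + 0.5 d^T B d:
    #   rho*gg * u + gy * v = -gg
    #   gy     * u + sy * v = -gs
    M = np.array([[rho * gg, gy], [gy, sy]])
    rhs = np.array([-gg, -gs])
    if abs(np.linalg.det(M)) <= eps:
        return -g
    u, v = np.linalg.solve(M, rhs)
    d = u * g + v * s
    return d if d @ g < 0 else -g   # enforce the descent property
```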
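Finally, a non-monotone Wolfe line search relaxes the Armijo part of the standard Wolfe conditions by comparing against a reference value that may exceed f(x_k). A common Grippo-style variant, shown below with illustrative constants, uses the maximum over a sliding window of recent function values; the thesis may use a different non-monotone reference.

```python
def nonmonotone_wolfe_ok(f_trial, slope_trial, f_hist, slope0, alpha,
                         c1=1e-4, c2=0.9):
    """Grippo-style non-monotone Wolfe acceptance test (one common variant).

    f_trial:     f(x_k + alpha * d_k)
    slope_trial: grad(x_k + alpha * d_k)^T d_k
    f_hist:      recent function values (sliding window including f(x_k))
    slope0:      grad(x_k)^T d_k, negative for a descent direction
    """
    armijo = f_trial <= max(f_hist) + c1 * alpha * slope0
    curvature = slope_trial >= c2 * slope0
    return armijo and curvature
```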
Keywords/Search Tags: conjugate gradient method, sufficient descent property, global convergence, line search condition, subspace minimization