
Hybrid Conjugate Gradient Algorithms For Unconstrained Optimization Problems

Posted on: 2018-01-06    Degree: Master    Type: Thesis
Country: China    Candidate: X Han    Full Text: PDF
GTID: 2310330536473199    Subject: Operational Research and Cybernetics
Abstract/Summary:
Conjugate gradient (CG) methods are widely used and effective for solving large-scale unconstrained optimization problems. Compared with Newton and quasi-Newton methods, their obvious merits are low storage requirements and a simple iterative scheme. As is well known, the global convergence properties and the computational performance of the various well-known CG methods can differ significantly. Naturally, many researchers have therefore sought new CG methods with both good convergence properties and excellent practical performance, either by directly modifying the conjugate parameter β_k of well-known CG methods or by hybridizing existing CG methods. This thesis studies hybrid CG methods. Motivated by recent hybrid CG methods and the good results obtained for them, two new hybrid CG methods are proposed. Good theoretical properties of the proposed methods are established, and numerical experiments are reported in tables and performance profiles. The main results are as follows.

Firstly, based on the ideas of Dai and Wen (Applied Mathematics and Computation, 2012, 218(14): 7421-7430), Jian et al. (Applied Mathematical Modelling, 2015, 39(3): 1281-1290) and Wei et al. (Applied Mathematics and Computation, 2006, 183(2): 1341-1350), a new hybrid CG method (referred to as NHC) is proposed, together with a new conjugate parameter β_k^NHC satisfying (?). The NHC method generates a sufficient descent direction at every iteration, and this property is independent of any line search. Under the standard Wolfe line search, global convergence of the NHC method is proved. Finally, numerical experiments and performance profiles are presented, which show that the NHC method is effective and promising, especially for high-dimensional problems.

Secondly, motivated by the ideas of Dai and Wen (Applied Mathematics and Computation, 2012, 218(14): 7421-7430) and Wei et al. (Applied Mathematics and Computation, 2006, 179(2): 407-430), another new hybrid CG method (referred to as HZW) is put forward for solving unconstrained optimization problems. Its conjugate parameter β_k^HZW satisfies 0 ≤ β_k^HZW ≤ β_k^FE. The HZW method also generates a sufficient descent direction at every iteration. Furthermore, we show that the proposed method is globally convergent under the standard Wolfe line search. Finally, some numerical results are reported, which demonstrate that the HZW method possesses good computational performance.
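To make the general setting concrete, the sketch below shows a generic hybrid CG iteration with a standard Wolfe line search and a sufficient descent safeguard. It does not implement the thesis's NHC or HZW parameters (their formulas are not reproduced in this abstract); the hybrid rule β_k = max(0, min(β_k^PRP, β_k^FR)) used here is the classical truncated hybridization and is purely illustrative, and all function names are hypothetical.

```python
"""Minimal sketch of a generic hybrid conjugate gradient method.
NOT the NHC/HZW methods of this thesis; the conjugate parameter below is the
classical hybrid beta_k = max(0, min(beta_PRP, beta_FR)), used only to illustrate
the overall iteration x_{k+1} = x_k + alpha_k d_k, d_{k+1} = -g_{k+1} + beta_k d_k."""

import numpy as np


def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.4, alpha0=1.0, max_iter=50):
    """Bisection/doubling search for a step satisfying the standard Wolfe conditions:
       f(x + a d) <= f(x) + c1 * a * g^T d   and   grad(x + a d)^T d >= c2 * g^T d."""
    lo, hi = 0.0, np.inf
    alpha = alpha0
    fx, gTd = f(x), grad(x) @ d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gTd:      # Armijo fails: step too long
            hi = alpha
        elif grad(x + alpha * d) @ d < c2 * gTd:          # curvature fails: step too short
            lo = alpha
        else:
            return alpha
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return alpha


def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic hybrid CG iteration with a descent safeguard."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                                # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta_fr = (g_new @ g_new) / (g @ g)               # Fletcher-Reeves parameter
        beta_prp = (g_new @ (g_new - g)) / (g @ g)        # Polak-Ribiere-Polyak parameter
        beta = max(0.0, min(beta_prp, beta_fr))           # illustrative hybrid, truncated at 0
        d = -g_new + beta * d
        if g_new @ d >= 0:                                # safeguard: restart if not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Example: minimize the Rosenbrock function from a standard starting point.
    f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
                               200 * (x[1] - x[0] ** 2)])
    print(hybrid_cg(f, grad, np.array([-1.2, 1.0])))
```

The descent safeguard is what the thesis's methods avoid by construction: NHC and HZW are stated to generate sufficient descent directions at every iteration, independently of the line search.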
Keywords/Search Tags: unconstrained optimization, hybrid conjugate gradient algorithm, sufficient descent, global convergence