
A Hybrid Conjugate Gradient Method For Nonlinear Unconstrained Optimization

Posted on: 2013-11-08    Degree: Master    Type: Thesis
Country: China    Candidate: Z Li    Full Text: PDF
GTID: 2230330374970107    Subject: Mathematics
Abstract/Summary:
In this dissertation a new hybrid conjugate gradient algorithm is proposed and analyzed. The parameter βk is computed as a convex combination of the HS (Hestenes-Stiefel) and DY (Dai-Yuan) conjugate gradient parameters, i.e. βk = θk βk^HS + (1 − θk) βk^DY. The parameter θk in the convex combination is chosen so that the generated direction is a descent direction and the conjugacy condition is satisfied as far as possible under the standard Wolfe conditions. The algorithm generates sufficient descent directions when the strong Wolfe conditions are used. The method is proved to be globally convergent for uniformly convex functions. Numerical comparisons with other conjugate gradient algorithms on a set of 511 unconstrained optimization problems, some of them from the CUTE library, show that the algorithm outperforms both the HS algorithm and the DY algorithm.
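To make the construction concrete, the following Python sketch computes the hybrid parameter and the corresponding search direction. The function names hybrid_beta and hybrid_direction are illustrative, and the value of θk is taken as an input here; the dissertation derives it from the conjugacy condition under the Wolfe line search, so this is a minimal sketch of the convex-combination idea rather than the full algorithm.

```python
import numpy as np

def hybrid_beta(g_new, g_old, d_old, theta):
    """Convex combination of the HS and DY conjugate gradient parameters.

    beta_k = theta_k * beta_k^HS + (1 - theta_k) * beta_k^DY, where
    beta_k^HS = g_{k+1}^T y_k / (d_k^T y_k),
    beta_k^DY = ||g_{k+1}||^2 / (d_k^T y_k),  y_k = g_{k+1} - g_k.
    The rule for choosing theta_k is thesis-specific; here it is an input.
    """
    y = g_new - g_old
    denom = d_old @ y
    if abs(denom) < 1e-12:              # guard against division by zero
        return 0.0                       # fall back to steepest descent
    beta_hs = (g_new @ y) / denom
    beta_dy = (g_new @ g_new) / denom
    theta = min(max(theta, 0.0), 1.0)    # keep the combination convex
    return theta * beta_hs + (1.0 - theta) * beta_dy

def hybrid_direction(g_new, g_old, d_old, theta):
    """Next search direction d_{k+1} = -g_{k+1} + beta_k d_k."""
    return -g_new + hybrid_beta(g_new, g_old, d_old, theta) * d_old
```

Clamping θk to [0, 1] keeps βk a genuine convex combination of the HS and DY values, which is what allows the hybrid method to inherit properties of both parameters.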
Keywords/Search Tags: Hybrid conjugate algorithm, unconstrained optimization, HS conjugate algorithm, DY conjugate algorithm