
A Class Of Hybrid Conjugate Gradient Algorithm

Posted on: 2007-06-26
Degree: Master
Type: Thesis
Country: China
Candidate: Z F Dai
Full Text: PDF
GTID: 2190360185464381
Subject: Applied Mathematics
Abstract/Summary:
With its rapid development, optimization is now applied on a large scale, and finding high-performance algorithms for nonlinear optimization is a heated research topic among optimization specialists. Many algorithms have been developed in recent years, among them the conjugate gradient method and the quasi-Newton method, together with much work on proving their convergence. This thesis presents some new conjugate gradient methods for unconstrained optimization based on the Hestenes-Stiefel (HS) algorithm and the Dai-Yuan (DY) algorithm, in an attempt to combine the advantages of both. We prove that the new methods converge globally under the Wolfe line search, without requiring the descent condition. Finally, numerical experiments show that the algorithms are efficient in comparison with the HS and PRP conjugate gradient methods.

In Chapter 1, we introduce the development of optimization and some standard optimality conditions for characterizing optimal solutions, and then review several standard derivative-based descent methods for unconstrained programming.

In Chapter 2, we present a hybrid conjugate gradient method for unconstrained optimization based on the Hestenes-Stiefel and Dai-Yuan algorithms [16], which takes advantage of both. We prove that the new method converges globally under the Wolfe line search without requiring the descent condition. Numerical experiments show that the algorithm is efficient in comparison with the HS and PRP conjugate gradient methods.

In Chapter 3, we improve the hybrid conjugate gradient method of [16] by allowing the parameter β_k to be selected in a wider range than in [16]. Following the same idea, we also propose a hybrid conjugate gradient method based on the Polak-Ribière-Polyak (PRP) and Dai-Yuan algorithms, which takes advantage of both. We prove that these methods converge globally under the Wolfe line search without requiring the descent condition. Numerical experiments show that the algorithms are efficient.
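The abstract does not reproduce the update formulas, so the following is an illustration only, not the thesis's exact method. With y_k = g_{k+1} - g_k, the HS and DY parameters share the denominator d_k^T y_k: β_k^HS = g_{k+1}^T y_k / (d_k^T y_k) and β_k^DY = ||g_{k+1}||^2 / (d_k^T y_k), and one well-known way to hybridize them is β_k = max{0, min{β_k^HS, β_k^DY}}. The sketch below implements that particular hybrid in Python, using SciPy's line_search routine (which enforces the strong Wolfe conditions) as a stand-in for the Wolfe line search; the function name, the restart rule, and all tolerances are assumptions for the sake of a runnable example.

```python
import numpy as np
from scipy.optimize import line_search

def hybrid_hs_dy_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Sketch of a hybrid HS/DY conjugate gradient method.

    Uses beta_k = max(0, min(beta_HS, beta_DY)); this is a common
    hybridization, not necessarily the one proposed in the thesis.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                      # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:
            break
        # Wolfe line search; scipy returns alpha=None on failure
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:       # restart with steepest descent (assumed rule)
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g)
            if alpha is None:
                break
        x = x + alpha * d
        g_new = grad(x)
        y = g_new - g
        denom = d @ y           # shared denominator of beta_HS and beta_DY
        if abs(denom) < 1e-12:  # guard against division by (near) zero
            beta = 0.0
        else:
            beta_hs = (g_new @ y) / denom
            beta_dy = (g_new @ g_new) / denom
            beta = max(0.0, min(beta_hs, beta_dy))
        d = -g_new + beta * d   # conjugate gradient direction update
        g = g_new
    return x
```

As a quick check, the sketch can be run on SciPy's Rosenbrock test function: hybrid_hs_dy_cg(rosen, rosen_der, np.array([-1.2, 1.0])) with rosen and rosen_der imported from scipy.optimize converges to the minimizer (1, 1).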
Keywords/Search Tags: Unconstrained optimization, Conjugate gradient method, Wolfe line search, Global convergence