
Some Classes Of The GSD+ Conjugate Gradient Methods

Posted on: 2018-06-01
Degree: Master
Type: Thesis
Country: China
Candidate: J Luo
Full Text: PDF
GTID: 2310330515994438
Subject: Operational Research and Cybernetics
Abstract/Summary:
Conjugate gradient methods are among the most important of the many optimization algorithms. With their modest storage requirements, stepwise convergence, high stability, and ease of implementation, they are a key tool for solving unconstrained optimization problems. As is well known, the sufficient descent property plays a crucial role in analyzing the convergence of conjugate gradient methods; however, many existing conjugate gradient methods do not always satisfy it. Therefore, in this thesis, from the perspective of descent, we study several classes of GSD+ conjugate gradient methods for solving unconstrained optimization problems. The new methods satisfy the sufficient descent property independently of the line search used. The main contents are as follows.

In Chapter 1, we briefly introduce conjugate gradient methods, some important lemmas and assumptions, and the main results obtained in this thesis.

In Chapter 2, based on the modified conjugate parameters given by Nakamura and by Zhang and Li, we present three GSD+ conjugate gradient methods, called the MTDPRP+, MTDHS+, and MTDLS+ methods, where the parameters satisfy ? > 1/4 and h > 0. These methods satisfy the sufficient descent property independently of the line search used. Under the usual assumptions, we prove global convergence with the strong Wolfe line search; moreover, we prove that the MTDHS+ method is globally convergent with the standard Wolfe line search and that the MTDLS+ method is globally convergent with the generalized Wolfe line search. Preliminary numerical results show that the new methods are effective, and the corresponding numerical results are reported.

In Chapter 3, adopting the idea of the spectral conjugate gradient method, we present five GSD+ spectral conjugate gradient methods, called the STDHS+, STDPRP+, STDHP+, SMTDHS+, and SMTDPRP+ methods, respectively. These methods satisfy the sufficient descent property independently of the line search used. Under the usual assumptions, we prove that the new methods are globally convergent. Preliminary numerical results show that the new methods are effective.
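The abstract does not reproduce the thesis's actual MTDPRP+/MTDHS+/MTDLS+ update formulas, so the following is only a generic sketch of the method family under discussion: a PRP+-type nonlinear conjugate gradient iteration with a steepest-descent restart safeguard that enforces a sufficient descent condition of the form g_k^T d_k <= -c ||g_k||^2. All names and parameter values here are illustrative assumptions, and the simple Armijo backtracking line search stands in for the (strong, standard, or generalized) Wolfe searches analyzed in the thesis.

```python
import numpy as np

def rosenbrock(x):
    # Classic unconstrained test problem with minimizer at (1, 1).
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def cg_prp_plus(f, grad, x0, tol=1e-6, max_iter=20000):
    """PRP+ conjugate gradient sketch with a descent safeguard."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (a stand-in for a Wolfe search).
        alpha, c1 = 1.0, 1e-4
        fx, gtd = f(x), g.dot(d)
        while f(x + alpha * d) > fx + c1 * alpha * gtd:
            alpha *= 0.5
            if alpha < 1e-16:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g
        # PRP+ parameter: truncate the PRP value at zero.
        beta = max(g_new.dot(y) / g.dot(g), 0.0)
        d_new = -g_new + beta * d
        # Descent safeguard: restart with steepest descent if the new
        # direction fails a (very lenient) sufficient descent test.
        if g_new.dot(d_new) > -1e-10 * g_new.dot(g_new):
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x, g
```

In this sketch the sufficient descent property is imposed by the restart test rather than built into the update formula itself; the point of the GSD+ constructions studied in the thesis is to guarantee that property for every iteration regardless of the line search, without needing such a safeguard.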
Keywords/Search Tags: Nonlinear conjugate gradient method, Sufficient descent property, Global convergence