
Research On A Class Of Conjugate Gradient Methods

Posted on: 2015-03-01    Degree: Doctor    Type: Dissertation
Country: China    Candidate: S W Yao    Full Text: PDF
GTID: 1260330428975579    Subject: Applied Mathematics
Abstract/Summary:
With the development of science and technology, especially the progress of the internet and information science, large-scale optimization problems arise frequently in many fields. The conjugate gradient method (CG method) plays a special role in solving large-scale nonlinear optimization problems because of its simplicity and very low memory requirements. Hence, much work has been devoted to finding conjugate gradient methods that are both globally convergent and numerically efficient.

In this thesis, we focus on new conjugate gradient methods for solving large-scale unconstrained optimization problems. The FR method and the PRP method are two of the most representative conjugate gradient methods: the FR method has nice theoretical convergence properties, whereas the PRP method has better numerical performance. Much effort has gone into combining the advantages of these two methods to obtain a method that possesses both global convergence and efficient numerical performance.

In Chapter 2, based on the HS and DY formulae, a combined formula is given in which the combination coefficient is adjusted by the gradients of successive iterations. The corresponding method is globally convergent under the Wolfe-Powell line search for general functions. Applying the same strategy to the LS and CD methods, a modified method is proposed and proved to be globally convergent.

Based on the DL conjugate gradient method, a class of DL-type conjugate gradient methods is given in Chapter 3. All of these methods possess a significant property: the directions generated by the methods are sufficient descent directions under the Wolfe-Powell line search, which guarantees the global convergence of the proposed methods.

In Chapter 4, in order to obtain more efficient methods under a Wolfe-type line search, a new conjugacy condition and the related method are given.
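For reference, the classical conjugate gradient iteration and the standard choices of the parameter β_k behind the method names above (FR, PRP, HS, DY) can be written as follows; this is the standard textbook notation, not taken verbatim from the thesis.

```latex
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0,
```

where $g_k = \nabla f(x_k)$ and $y_k = g_{k+1} - g_k$, with

```latex
\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \quad
\beta_k^{PRP} = \frac{g_{k+1}^{\top} y_k}{\|g_k\|^2}, \quad
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \quad
\beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^{\top} y_k}.
```

Hybrid and combined methods, such as those in Chapter 2, typically blend two of these formulae with an adaptively chosen coefficient.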
The given method is not only globally convergent but also performs better than the DL and HS methods in numerical experiments.

In Chapter 5, we extend the results obtained in Chapter 2 and propose a class of one-parameter conjugate gradient methods. The convergence of the given methods is analyzed with some unified tools, which establish the global convergence of the proposed methods. Numerical experiments on the CUTE collection show that the proposed methods are promising.

Some conclusions and an outlook on future research are discussed in Chapter 6.
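To make the general framework concrete, the following is a minimal sketch of a nonlinear conjugate gradient loop with the classical FR and (safeguarded) PRP choices of β_k. It is not the thesis's proposed method: it uses a simple Armijo backtracking line search in place of the Wolfe-Powell search analyzed in the thesis, and the restart and safeguard rules are common textbook choices.

```python
import numpy as np

def cg_minimize(f, grad, x0, beta_type="PRP", tol=1e-6, max_iter=2000):
    """Nonlinear conjugate gradient sketch (illustrative, not the thesis's method).

    Uses Armijo backtracking instead of the Wolfe-Powell line search,
    for brevity; beta_type selects the FR or safeguarded PRP formula.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        gtd = g @ d
        if gtd >= 0:              # safeguard: restart with steepest descent
            d = -g
            gtd = g @ d
        # Armijo backtracking: accept a with f(x + a d) <= f(x) + c1 * a * g^T d
        a, c1 = 1.0, 1e-4
        while f(x + a * d) > f(x) + c1 * a * gtd and a > 1e-12:
            a *= 0.5
        x_new = x + a * d
        g_new = grad(x_new)
        y = g_new - g
        if beta_type == "FR":
            beta = (g_new @ g_new) / (g @ g)
        else:                     # PRP with the common max(beta, 0) safeguard
            beta = max((g_new @ y) / (g @ g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from a standard starting point.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = cg_minimize(f, grad, np.array([-1.2, 1.0]))
```

The restart-to-steepest-descent safeguard when the direction fails to be a descent direction is one simple way to retain the sufficient descent property that Chapter 3's DL-type methods obtain directly from their construction.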
Keywords/Search Tags: Nonlinear programming, continuous optimization, unconstrained optimization, conjugate gradient method, global convergence