
Global Convergence Analysis Of Two Modified Conjugate Gradient Methods

Posted on: 2013-01-13
Degree: Master
Type: Thesis
Country: China
Candidate: R Yang
Full Text: PDF
GTID: 2230330374455079
Subject: Applied Mathematics
Abstract/Summary:
The conjugate gradient method is a common approach to unconstrained optimization and solves general unconstrained problems with good efficiency. Well-known conjugate gradient methods include the FR, PRP, CD, DY, HS, and LS methods. Based on different choices of search direction, earlier researchers have proposed various generalized conjugate gradient methods. Starting from the original HS and LS conjugate gradient methods, and guided by the sufficient descent property of the search direction, this thesis proposes two modified conjugate gradient methods: the NHS algorithm and the NLS algorithm. Under suitable conditions, the global convergence of the proposed methods is proved under the Wolfe, Armijo, and Goldstein line searches.

To check the validity of the algorithms, numerical experiments were carried out in MATLAB. Comparison of the numerical results shows that the modified conjugate gradient methods improve both computational cost and iteration counts, and that the number of iterations does not increase significantly as the problem dimension grows. The algorithms therefore perform well in practice and are especially suitable for large-scale unconstrained optimization problems.
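The abstract does not give the NHS/NLS update formulas, so they are not reproduced here. As background, the following is a minimal sketch of the classical Hestenes-Stiefel (HS) conjugate gradient method that the thesis builds on, combined with an Armijo backtracking line search (one of the three line searches mentioned above), applied to a small quadratic test problem. The test function, tolerances, and restart safeguard are illustrative assumptions, not the thesis's algorithm.

```python
# Sketch of the classical HS conjugate gradient method with an Armijo
# backtracking line search. Test problem (an assumption for illustration):
# f(x, y) = x^2 + 2y^2 - 2x - 4y, whose minimizer is (1, 1).

def f(x):
    return x[0]**2 + 2*x[1]**2 - 2*x[0] - 4*x[1]

def grad(x):
    return [2*x[0] - 2, 4*x[1] - 4]

def dot(u, v):
    return sum(a*b for a, b in zip(u, v))

def hs_cg(x, tol=1e-6, max_iter=500):
    g = grad(x)
    d = [-gi for gi in g]              # initial direction: steepest descent
    for _ in range(max_iter):
        if dot(g, g) ** 0.5 < tol:     # stop when the gradient norm is small
            break
        if dot(g, d) >= 0:             # safeguard: if d is not a descent
            d = [-gi for gi in g]      # direction, restart with -g
        # Armijo backtracking: accept t with f(x+t*d) <= f(x) + c*t*g'd
        t, c = 1.0, 1e-4
        while f([xi + t*di for xi, di in zip(x, d)]) > f(x) + c*t*dot(g, d):
            t *= 0.5
        x = [xi + t*di for xi, di in zip(x, d)]
        g_new = grad(x)
        # HS formula: beta = g_{k+1}' y_k / (d_k' y_k), y_k = g_{k+1} - g_k
        y = [gn - gi for gn, gi in zip(g_new, g)]
        denom = dot(d, y)
        beta = dot(g_new, y) / denom if abs(denom) > 1e-16 else 0.0
        d = [-gn + beta*di for gn, di in zip(g_new, d)]
        g = g_new
    return x

x_star = hs_cg([0.0, 0.0])             # converges to approximately (1, 1)
```

The NHS/NLS methods of the thesis differ from this sketch only in how the coefficient `beta` (and hence the search direction) is defined so that sufficient descent holds; the surrounding line-search loop is the same kind of structure.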
Keywords/Search Tags:Unconstrained optimization, Conjugate gradient method, Inexact line search, Descent direction, Global convergence