
Research On Nonlinear Conjugate Gradient Algorithm

Posted on: 2009-06-16
Degree: Master
Type: Thesis
Country: China
Candidate: L L Mo
Full Text: PDF
GTID: 2120360245467691
Subject: Applied Mathematics

Abstract/Summary:
Our main purpose in this thesis is to study two new conjugate gradient algorithms. We establish the global convergence of these methods and test them on a set of standard problems. The thesis is organized as follows.

In Chapter 1, we recall the foundational knowledge of the conjugate gradient method and survey some well-known results.

In Chapter 2, a new nonlinear conjugate gradient formula for solving unconstrained optimization problems is proposed, constructed by combining the famous formulas β_k^{FR} and β_k^{PRP}. The new formula satisfies the sufficient descent condition, and the corresponding algorithm is globally convergent under the strong Wolfe-Powell line search condition. Preliminary numerical results show that the method is promising.

In Chapter 3, building on the class of new quasi-Newton equations B_k s_{k-1} = y_{k-1}^*, where y_{k-1}^* = y_{k-1} + A_k^{(3)} s_{k-1} and A_k^{(3)} is a positive definite matrix, proposed by Wei et al. in [22], we give a modified conjugate gradient method. The modified method is globally convergent under mild conditions. Preliminary numerical results show that the method is efficient and well suited to nonlinear unconstrained optimization.
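The general scheme behind Chapter 2 can be sketched in code. This is an illustrative sketch only, not the thesis's method: it uses the classical hybrid β_k = max(0, min(β_k^{PRP}, β_k^{FR})) (in the spirit of combining the FR and PRP formulas) together with a simple Armijo backtracking line search instead of the strong Wolfe-Powell conditions; the function name `hybrid_cg` and all parameter values are assumptions made for illustration.

```python
import numpy as np

def hybrid_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear conjugate gradient with a hybrid FR/PRP beta.

    Illustrative sketch: uses the classical hybrid
    beta_k = max(0, min(beta_PRP, beta_FR)) and Armijo backtracking,
    not the thesis's specific combined formula or line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search enforcing the Armijo condition.
        alpha, c, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * slope:
            alpha *= rho
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Hybrid beta combining Fletcher-Reeves and Polak-Ribiere-Polyak.
        beta_fr = (g_new @ g_new) / (g @ g)
        beta_prp = (g_new @ (g_new - g)) / (g @ g)
        beta = max(0.0, min(beta_prp, beta_fr))
        d = -g_new + beta * d
        # Safeguard: restart with steepest descent if d is not a descent direction.
        if g_new @ d >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize a strictly convex quadratic f(x) = 0.5 x'Ax - b'x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
quad = lambda x: 0.5 * x @ A @ x - b @ x
quad_grad = lambda x: A @ x - b
x_star = hybrid_cg(quad, quad_grad, np.zeros(2))  # minimizer is A^{-1} b = [0.2, 0.4]
```

On a quadratic the PRP and FR choices coincide with linear conjugate gradient behavior, so the iteration converges quickly; the hybrid rule matters on general nonlinear problems, where pure FR can stall and pure PRP can fail to converge globally.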
Keywords/Search Tags: unconstrained optimization, conjugate gradient method, line search condition, global convergence