
Conjugate Gradient Method's Research For Unconstrained Optimization

Posted on: 2009-02-01    Degree: Master    Type: Thesis
Country: China    Candidate: G Yu    Full Text: PDF
GTID: 2120360245970063    Subject: Applied Mathematics
Abstract/Summary:
The conjugate gradient method enjoyed great interest among both domestic and overseas researchers during the 1960s and 1970s. More recently, the rapid development of computers and the large number of large-scale optimization problems arising in practice have revived research on the conjugate gradient method.

In this thesis, we mainly discuss the algorithms and the theory of the conjugate gradient method. The thesis is organized as follows. In the first chapter we survey the history of the conjugate gradient method and discuss several conjugate gradient methods in turn, namely the FR method, the PRP method, the HS method, and the DY method, among others; these are currently considered well-known methods for large-scale unconstrained optimization problems. In each section we discuss the global convergence properties, numerical behavior, and other aspects of the method concerned.

In the second chapter, we develop several new conjugate gradient algorithms, explore their convergence properties, and analyze their numerical results. By comparing the numerical results, we identify the advantages of these new algorithms. Furthermore, we investigate the HS method in depth and obtain a theoretical convergence result.

In the last chapter, we summarize the contents of the two preceding chapters and propose several problems that merit further research.
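For reference (these standard textbook definitions are not reproduced in the abstract itself), the methods named above all generate iterates x_{k+1} = x_k + \alpha_k d_k with directions d_0 = -g_0 and d_{k+1} = -g_{k+1} + \beta_k d_k, where g_k is the gradient at x_k and y_k = g_{k+1} - g_k; they differ only in the choice of the scalar \beta_k:

\beta_k^{FR} = \frac{\|g_{k+1}\|^2}{\|g_k\|^2}, \qquad
\beta_k^{PRP} = \frac{g_{k+1}^T y_k}{\|g_k\|^2}, \qquad
\beta_k^{HS} = \frac{g_{k+1}^T y_k}{d_k^T y_k}, \qquad
\beta_k^{DY} = \frac{\|g_{k+1}\|^2}{d_k^T y_k}.

The step size \alpha_k in such methods is commonly required to satisfy the Wolfe conditions mentioned in the keywords,

f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k g_k^T d_k, \qquad
g(x_k + \alpha_k d_k)^T d_k \ge c_2 g_k^T d_k, \qquad 0 < c_1 < c_2 < 1,

under which the global convergence analyses referred to above are typically carried out.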
Keywords/Search Tags: unconstrained optimization, conjugate gradient method, line search, the Wolfe condition, global convergence property