
Research On Two Methods For Solving Nonlinear Unconstrained Optimization Problems

Posted on: 2005-03-02  Degree: Master  Type: Thesis
Country: China  Candidate: Y H Xiao  Full Text: PDF
GTID: 2120360122998428  Subject: Applied Mathematics
Abstract/Summary:
The conjugate gradient method and the quasi-Newton method are two important methods for solving nonlinear optimization problems. Conjugate gradient methods are attractive for the simplicity of their iterations and their very low memory requirements. Quasi-Newton methods approximate Newton's method by using a symmetric positive definite approximation of the Hessian (or inverse Hessian) in place of the corresponding exact value, thereby attaining a fast convergence rate. The purpose of this thesis is to provide a modified HS method and a BFGS-type method. The chapter-by-chapter description of the thesis follows.

Chapter 1: Preliminary knowledge. Introduces the background material that will be used in this thesis.

Chapter 2: A nonlinear conjugate gradient type method. We present a conjugate gradient type method, namely a modified HS method. It has properties similar to those of the HS method, but its numerical results are better.

Chapter 3: A modified BFGS-type method. A new BFGS-type formula and a new BFGS-type method with the Wolfe-Powell (WWP) step size are presented. Global convergence is established, and the numerical results indicate that the new method is very encouraging.
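The abstract does not reproduce the thesis's modified HS formula, but the standard Hestenes-Stiefel conjugate gradient iteration it builds on can be sketched as follows. This is a minimal illustration with an Armijo backtracking line search and a steepest-descent restart safeguard; all function and parameter names are illustrative, not taken from the thesis.

```python
import numpy as np

def hs_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear conjugate gradient with the classical Hestenes-Stiefel beta:
        beta_k = g_{k+1}^T y_k / (d_k^T y_k),  y_k = g_{k+1} - g_k,
        d_{k+1} = -g_{k+1} + beta_k d_k.
    A sketch only -- not the thesis's modified HS variant."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along d
        alpha, c1, shrink = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c1 * alpha * g.dot(d):
            alpha *= shrink
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                                   # gradient difference
        denom = d.dot(y)
        beta = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
        d = -g_new + beta * d
        if g_new.dot(d) > -1e-12:                       # safeguard: restart
            d = -g_new                                  # with steepest descent
        x, g = x_new, g_new
    return x
```

The low memory requirement mentioned above is visible here: the method stores only the current point, gradient, and search direction, never a matrix.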
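Similarly, the standard BFGS iteration with a Wolfe-Powell (weak Wolfe) line search, which the thesis's modified BFGS-type method is built on, can be sketched as below. The new BFGS-type formula itself is not given in the abstract; this shows only the classical inverse-Hessian update, with illustrative names and tolerances.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, g, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection search for a step satisfying the weak Wolfe-Powell conditions:
    sufficient decrease (Armijo) and the curvature condition."""
    alpha, lo, hi = 1.0, 0.0, np.inf
    gd = g.dot(d)
    for _ in range(max_iter):
        if f(x + alpha * d) > f(x) + c1 * alpha * gd:
            hi = alpha                                  # step too long
        elif grad(x + alpha * d).dot(d) < c2 * gd:
            lo = alpha                                  # step too short
        else:
            return alpha
        alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
    return alpha

def bfgs(f, grad, x0, tol=1e-6, max_iter=200):
    """Classical BFGS: maintain a symmetric positive definite approximation H
    of the inverse Hessian, updated from the pair (s, y) at each step."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    H = np.eye(n)                                       # inverse-Hessian approx.
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                                      # quasi-Newton direction
        alpha = wolfe_line_search(f, grad, x, d, g)
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s.dot(y)
        if sy > 1e-12:              # curvature condition keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

The curvature check `sy > 0`, guaranteed by the Wolfe-Powell conditions for suitable functions, is what preserves the positive definiteness of H referred to in the abstract.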
Keywords/Search Tags: conjugate gradient method, quasi-Newton method, BFGS method, global convergence, inexact line search