
A New Descent Algorithm for Solving Constrained Optimization Problems

Posted on: 2007-08-16    Degree: Master    Type: Thesis
Country: China    Candidate: C Y Pan    Full Text: PDF
GTID: 2190360185464378    Subject: Applied Mathematics
Abstract/Summary:
With its rapid development, optimization is now applied on a large scale, and finding high-performance algorithms for nonlinear optimization is an active research topic among optimization specialists. Many algorithms have been developed in recent years, among them the Polak-Ribiere, Fletcher-Reeves, Hestenes-Stiefel, and Dai-Yuan conjugate gradient algorithms, together with proofs of their global convergence. This paper presents a class of efficient new descent methods for unconstrained optimization, developed from the Dai-Yuan method, and proves their global convergence under the Wolfe line search. As a further attempt, we combine the advantages of the Hestenes-Stiefel algorithm with those of our new methods, and show that the Wolfe line search conditions still ensure global convergence of the resulting hybrid algorithms. Finally, the new algorithms also yield better numerical results.

In Chapter 1, we introduce the development of optimization and some standard optimality conditions for identifying an optimal solution, and review several well-known derivative-based descent methods for unconstrained programming.

In Chapter 2, we propose a class of efficient new descent methods and prove their global convergence under the Wolfe line search without requiring the descent property. The conditions imposed on the objective function are also weak, similar to those required by the Zoutendijk condition. Moreover, we establish a property of the new methods that they share with the Dai-Yuan method.

In Chapter 3, we propose two hybrid conjugate gradient algorithms based on the Hestenes-Stiefel method and our new method, and prove their global convergence under the Wolfe line search without the descent condition. Numerical experiments show that our methods are very efficient, especially for large-scale problems.
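To illustrate the starting point of the thesis, the following is a minimal sketch of the classical Dai-Yuan conjugate gradient method with a Wolfe line search. It is not the thesis's new method: the bisection-style line search, the parameter values, and the quadratic test problem in the usage note are illustrative assumptions, and the Dai-Yuan update shown is the standard textbook formula beta_k = ||g_{k+1}||^2 / (d_k^T (g_{k+1} - g_k)).

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, alpha=1.0, max_iter=50):
    # Simple bracketing search for a step satisfying the (weak) Wolfe
    # conditions; a textbook sketch, not the thesis's implementation.
    fx = f(x)
    gd = grad(x) @ d                      # directional derivative, < 0 for descent d
    lo, hi = 0.0, np.inf
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * gd:
            hi = alpha                    # sufficient-decrease (Armijo) fails: shrink
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * gd:
            lo = alpha                    # curvature condition fails: grow / bisect
            alpha = 2.0 * alpha if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha                  # both Wolfe conditions hold
    return alpha

def dai_yuan_cg(f, grad, x0, tol=1e-6, max_iter=500):
    # Dai-Yuan conjugate gradient method: under the Wolfe line search the
    # denominator d^T (g_new - g) stays positive, so beta is well defined.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (d @ (g_new - g))   # Dai-Yuan formula
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

For example, on the convex quadratic f(x) = 0.5 x^T A x - b^T x (a hypothetical test problem), `dai_yuan_cg(f, grad, x0)` drives the gradient A x - b to (near) zero, recovering the solution of A x = b.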
Keywords/Search Tags:Unconstrained Optimization, Conjugate Gradient Method, Descent Method, Wolfe Line Search, Global Convergence