
Research on Reduced Hessian Algorithms for a Class of Nonlinear Equality Constrained Optimization Problems

Posted on: 2008-02-09    Degree: Master    Type: Thesis
Country: China    Candidate: W. Wang    Full Text: PDF
GTID: 2190360212987979    Subject: Applied Mathematics
Abstract/Summary:
Penalty function methods, feasible direction methods, sequential quadratic programming (SQP) methods, and reduced Hessian methods are widely applied to nonlinear constrained optimization problems. SQP has become the most important method for solving small- and medium-scale nonlinear constrained optimization problems. Moreover, the reduced Hessian method requires less memory and less computation per iteration than other kinds of methods, and many authors have studied it for these advantages. Reduced Hessian methods comprise one-sided and two-sided variants. In 1978, W. Murray and M. H. Wright suggested the idea of the two-sided reduced Hessian method; later, many authors studied it and proposed particular algorithms, whose convergence rates are locally two-step Q-superlinear. In 1985, J. Nocedal and M. L. Overton proposed the one-sided reduced Hessian method and proved that it has a local one-step Q-superlinear convergence property. In this paper, taking into account the faster convergence rate of the one-sided reduced Hessian method and the smaller memory requirement of the two-sided method, we describe an algorithm that updates approximations to the one-sided and two-sided reduced Hessians separately. It is shown that the method is locally one-step Q-superlinearly convergent under certain conditions. Based on the new reduced Hessian method, we also propose a reduced sequential quadratic programming algorithm that has a global convergence property.

In Chapter 1, we first introduce the development of optimization and some widely used optimality conditions for identifying optimal solutions, and we review several methods of nonlinear optimization.

In Chapter 2, the problem considered is that of minimizing a nonlinear function subject to a set of equality constraints. We describe an algorithm that updates approximations to the one-sided and two-sided reduced Hessians separately. It is shown that if at least one of the updates is performed at each iteration, the method is locally one-step Q-superlinearly convergent, and some numerical results are given.

In Chapter 3, based on the algorithm described in Chapter 2, we propose a reduced successive quadratic programming algorithm for solving optimization problems with nonlinear equality constraints. In order to avoid the Maratos effect, the merit functions used are approximations to Fletcher's differentiable exact penalty function. Global convergence is proved under certain conditions, and some numerical results are given.
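To illustrate the kind of iteration the abstract describes, the following is a minimal one-sided reduced-Hessian SQP sketch in Python/NumPy. It is not the thesis's algorithm (which updates one-sided and two-sided reduced Hessian approximations separately and uses a Fletcher-type merit function); it only shows the common skeleton: a null-space decomposition of the step and a quasi-Newton (BFGS) update of the reduced Hessian approximation. All names (`reduced_hessian_sqp`, the test problem) are hypothetical.

```python
import numpy as np

def reduced_hessian_sqp(grad, c, jac, x, tol=1e-8, max_iter=50):
    """Sketch of a one-sided reduced-Hessian SQP iteration (hypothetical
    helper, not the thesis algorithm): minimize f(x) subject to c(x) = 0."""
    n, m = x.size, c(x).size
    B = np.eye(n - m)                 # approximation to the reduced Hessian Z^T W Z
    for _ in range(max_iter):
        A = jac(x)                    # m x n constraint Jacobian
        g = grad(x)
        # Orthonormal bases: Y spans range(A^T), Z spans null(A).
        Q, _ = np.linalg.qr(A.T, mode='complete')
        Y, Z = Q[:, :m], Q[:, m:]
        # Range-space step restores feasibility of the linearized constraints.
        pY = -Y @ np.linalg.solve(A @ Y, c(x))
        # Null-space step from the reduced system B pZ = -Z^T g
        # (the one-sided variant drops the cross term involving pY).
        pZ = np.linalg.solve(B, -Z.T @ g)
        p = pY + Z @ pZ
        if np.linalg.norm(p) < tol and np.linalg.norm(c(x)) < tol:
            break
        x_new = x + p
        # BFGS update of B from reduced (projected) displacement and gradient change.
        s = Z.T @ (x_new - x)
        y = Z.T @ (grad(x_new) - g)
        if s @ y > 1e-12:             # skip update if curvature condition fails
            Bs = B @ s
            B = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (s @ y)
        x = x_new
    return x

# Toy problem: min x1^2 + x2^2  s.t.  x1 + x2 = 1  (minimizer (0.5, 0.5))
grad = lambda x: 2.0 * x
c = lambda x: np.array([x[0] + x[1] - 1.0])
jac = lambda x: np.array([[1.0, 1.0]])
x_star = reduced_hessian_sqp(grad, c, jac, np.array([2.0, -3.0]))
```

In this sketch the globalization machinery (merit function, line search) is omitted, which is precisely what Chapter 3's Fletcher-type merit function supplies to obtain global convergence while avoiding the Maratos effect.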
Keywords/Search Tags: constrained optimization, reduced Hessian, quasi-Newton methods, successive quadratic programming, local superlinear convergence, global convergence