
A Generalized Super-Memory Gradient Projection Method Of Strongly Sub-Feasible Directions With Strong Convergence For Nonlinear Inequality Constrained Optimization

Posted on: 2007-05-13
Degree: Master
Type: Thesis
Country: China
Candidate: Y F Ceng
Full Text: PDF
GTID: 2120360185987470
Subject: Applied Mathematics
Abstract/Summary:
In this paper, we consider nonlinear inequality constrained optimization problems. The gradient projection method is one of the early important feasible direction methods for solving this class of problems, and new generalized gradient projection methods have been studied in recent decades. On the other hand, in order to use information from previous iterations to generate a new iterate, some authors, combining this idea with generalized gradient projection methods, have extended the super-memory gradient method for unconstrained optimization to constrained optimization. At the same time, for problems that start from an infeasible initial point, methods of strongly sub-feasible directions form one of the effective classes of solution methods.

In this work, combining the properties of generalized super-memory gradient projection methods with the ideas of strongly sub-feasible direction methods, we present a new algorithm with strong convergence for nonlinear inequality constrained optimization. At each iteration, the proposed algorithm makes full use of the information from the previous t iterations to generate the new iterate. In particular, the intervals of the parameters in the super-memory gradient projection direction are adjustable. The main properties of the new algorithm are as follows:

(i) the improved super-memory gradient projection direction is a combination of the generalized gradient projection and the t-step super-memory gradients, which involve both the previous t search directions d_{k-1}, d_{k-2}, ..., d_{k-t} and the gradients ∇f(x_{k-1}), ∇f(x_{k-2}), ..., ∇f(x_{k-t}); moreover, only the gradients associated with the (ε_k, δ_k)-active constraint set I(x_k, ε_k, δ_k) are used, rather than the gradients of all constraints;

(ii) the initial point can be chosen arbitrarily, and at each iteration the number of inequality constraints satisfied is nondecreasing; in particular, once a feasible iterate is obtained, all subsequent iterates remain feasible;

(iii) under suitable assumptions, the algorithm possesses global and strong convergence.

Finally, some preliminary numerical results show that the proposed algorithm is promising.
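To illustrate the memory-gradient idea the abstract builds on, the following minimal sketch combines the current negative gradient with the previous t search directions on a simple unconstrained quadratic. The combination weights (`betas`) are purely illustrative, and the projection and (ε_k, δ_k)-active-set machinery of the thesis's actual direction is not reproduced here.

```python
import numpy as np

def supermemory_direction(grad, past_dirs, betas):
    """Combine the current negative gradient with the previous t
    search directions (a simple memory-gradient combination; the
    betas are illustrative, not the thesis's parameters)."""
    d = -grad
    for beta, d_prev in zip(betas, past_dirs):
        d = d + beta * d_prev
    return d

# Minimal unconstrained demo on f(x) = ||x||^2, with t = 2 memory steps.
f_grad = lambda x: 2.0 * x          # gradient of f(x) = ||x||^2
x = np.array([2.0, -1.0])           # arbitrary starting point
past = []                           # previous directions d_{k-1}, ..., d_{k-t}
t, step = 2, 0.1                    # memory length and fixed step size
for k in range(50):
    g = f_grad(x)
    betas = [0.2 / (i + 1) for i in range(len(past))]  # illustrative weights
    d = supermemory_direction(g, past, betas)
    x = x + step * d
    past = [d] + past[:t - 1]       # keep at most t previous directions
```

With small positive weights the memory terms act like momentum on this convex example, and the iterates contract toward the minimizer at the origin; the thesis's algorithm additionally handles inequality constraints via projection onto the active-set geometry.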
Keywords/Search Tags: inequality constrained optimization, super-memory gradient method, strongly sub-feasible directions, global convergence, strong convergence