
Super-memory Gradient Algorithm And GLP Projection Algorithm In Nonlinear Programming

Posted on: 2008-11-27
Degree: Master
Type: Thesis
Country: China
Candidate: P Cheng
Full Text: PDF
GTID: 2120360218963642
Subject: Computational Mathematics
Abstract/Summary:
Nonlinear programming is one of the active topics in the field of numerical computation. Besides being important in its own right, nonlinear programming sometimes arises as a subproblem in solving linear programming problems. For optimization researchers, designing effective methods for nonlinear programming problems is therefore an important task. This paper is divided into five chapters.

In Chapter 2, we consider the convergence properties of a new class of memory gradient methods with errors and a generalized Armijo step size rule for minimizing a continuously differentiable function f on R^n, assuming that the gradient ∇f(x) of f is uniformly continuous. By combining the quasi-Newton equation with our new method, quasi-Newton methods with errors are modified so as to have the global convergence property. Numerical results show that the new algorithms are efficient.

In Chapter 3, by using general projection matrix conditions and the memory gradient method, a general memory gradient projection method for nonlinear programming with nonlinear inequality constraints is presented. The global convergence properties of the new method are discussed. The numerical results illustrate that the new methods are effective.

In Chapter 4, a generalized memory gradient projection method for convex constrained optimization is presented by using the Goldstein-Levitin-Polyak (GLP) projection. Conditions are given on the scalar parameter to ensure that the general memory gradient projection direction is a descent direction. The global convergence properties of the new method are established with an exact step size rule and without assuming that the sequence of iterates is bounded. By combining the FR, PR, and HS methods with our new method, three new classes of memory gradient projection methods with conjugate gradient scalars are presented. The new methods require little storage, which makes them attractive for large-scale problems. Numerical results show that the algorithm is efficient in comparison with the Goldstein-Levitin-Polyak gradient projection method.
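The two building blocks the abstract relies on can be sketched in a few lines: a memory gradient direction d_k = -g_k + η·d_{k-1} safeguarded with a backtracking Armijo rule, and a GLP-style projected gradient step x⁺ = x + t·(P(x - g) - x) for convex (here, box) constraints. This is a minimal illustrative sketch, not the thesis's actual algorithms: the memory parameter `eta`, the box feasible set, and the backtracking constants are assumptions chosen for clarity.

```python
import numpy as np

def armijo_step(f, x, d, g, beta=0.5, sigma=1e-4):
    """Backtracking (Armijo) rule: shrink t until f(x + t d) <= f(x) + sigma t g.d."""
    t = 1.0
    while f(x + t * d) > f(x) + sigma * t * g.dot(d):
        t *= beta
        if t < 1e-12:          # safeguard against an endless loop
            break
    return t

def memory_gradient(f, grad, x0, eta=0.3, tol=1e-6, max_iter=500):
    """Memory gradient sketch: d_k = -g_k + eta * d_{k-1}, Armijo step size.
    eta is an assumed fixed memory parameter; the thesis derives conditions on it."""
    x = np.asarray(x0, dtype=float)
    d_prev = np.zeros_like(x)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g + eta * d_prev
        if g.dot(d) >= 0:      # safeguard: fall back to steepest descent
            d = -g
        x = x + armijo_step(f, x, d, g) * d
        d_prev = d
    return x

def projected_gradient(f, grad, x0, lo, hi, tol=1e-8, max_iter=1000):
    """GLP-style projected gradient sketch on a box [lo, hi]:
    d = P(x - g) - x is a feasible descent direction; step by Armijo."""
    x = np.clip(np.asarray(x0, dtype=float), lo, hi)
    for _ in range(max_iter):
        g = grad(x)
        d = np.clip(x - g, lo, hi) - x
        if np.linalg.norm(d) < tol:   # x is (approximately) stationary
            break
        x = x + armijo_step(f, x, d, g) * d
    return x
```

For instance, `memory_gradient` drives the quadratic f(x) = ‖x‖² to the origin, and `projected_gradient` minimizing (x − 2)² over [0, 1] stops at the boundary point x = 1, where the projected direction vanishes.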
Keywords/Search Tags: Nonlinear programming, Convex constrained optimization, Goldstein-Levitin-Polyak projection, Memory gradient method, Generalized Armijo step size rule