
Convergence Of Two Classes Of Conjugate Gradient Methods And Properties Of The Augmented Lagrangian Function

Posted on: 2012-01-09
Degree: Master
Type: Thesis
Country: China
Candidate: S Qin
Full Text: PDF
GTID: 2120330335951946
Subject: Operational Research and Cybernetics
Abstract/Summary:
This thesis studies line search procedures, conjugate gradient methods, and the augmented Lagrangian function in nonlinear programming. Against the background of current research in China and abroad, which centers on the basic theory and methods of line search, conjugate gradient methods, and penalty methods, the main work of this thesis can be summarized as follows:

1. A line search procedure based on polynomial interpolation is introduced for computing the step length, and a property of the resulting step length is established.

2. Two new classes of conjugate gradient algorithms for unconstrained optimization problems are presented. These algorithms possess the sufficient descent property, which depends neither on the line search used nor on the convexity of the objective function, and both classes are globally convergent under the strong Wolfe line search.

3. A new augmented Lagrangian function is introduced for general constrained nonlinear programming problems, and its properties are studied; in particular, the relationships between stationary points of this function and Kuhn-Tucker pairs of the constrained problem are established. A classical reference formula and a generic algorithm sketch follow below.
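For context only, since the abstract does not reproduce the thesis's own constructions: the classical augmented Lagrangian for an equality-constrained problem min f(x) s.t. h(x) = 0 is L_rho(x, lambda) = f(x) + lambda^T h(x) + (rho/2)||h(x)||^2, and a stationary point (x*, lambda*) of L_rho with h(x*) = 0 yields a Kuhn-Tucker pair of the constrained problem; the thesis's new function and its sharper results are not shown here. Likewise, the sketch below is a minimal, assumed illustration of a generic nonlinear conjugate gradient iteration with a strong Wolfe line search, using the standard Polak-Ribiere+ coefficient as a stand-in for the two new update formulas; the helper cg_minimize and the Rosenbrock test function are illustrative choices, not part of the thesis.

    import numpy as np
    from scipy.optimize import line_search

    def cg_minimize(f, grad, x0, tol=1e-6, max_iter=200):
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                                  # start with the steepest-descent direction
        for _ in range(max_iter):
            if np.linalg.norm(g) < tol:
                break
            # SciPy's line_search enforces the strong Wolfe conditions
            # (c1 = 1e-4 by default; c2 = 0.1 is a common choice for CG methods)
            alpha = line_search(f, grad, x, d, gfk=g, c2=0.1)[0]
            if alpha is None:                   # line search failed: restart along -g
                d = -g
                alpha = line_search(f, grad, x, d, gfk=g, c2=0.1)[0]
            if alpha is None:
                break
            x_new = x + alpha * d
            g_new = grad(x_new)
            beta = max(g_new @ (g_new - g) / (g @ g), 0.0)   # Polak-Ribiere+ coefficient
            d = -g_new + beta * d               # new conjugate gradient direction
            x, g = x_new, g_new
        return x

    # Usage example: minimize the Rosenbrock function
    rosen = lambda x: (1 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
    rosen_grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                                     200*(x[1] - x[0]**2)])
    print(cg_minimize(rosen, rosen_grad, [-1.2, 1.0]))

The sufficient descent property claimed in the thesis would replace the PR+ coefficient above with the two new formulas, so that g^T d <= -c ||g||^2 holds independently of the line search and of the convexity of f.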
Keywords/Search Tags: line search, conjugate gradient method, nonlinear programming, augmented Lagrangian function