
Some Augmented Lagrangian Function Based Methods For Constrained Optimization

Posted on: 2017-09-02
Degree: Doctor
Type: Dissertation
Country: China
Candidate: L Che
GTID: 1310330512459018
Subject: Computational Mathematics

Abstract/Summary:
The augmented Lagrangian method (ALM), originally known as the method of multipliers, and one of its relatives, the alternating direction method of multipliers (ADMM), have attracted much attention in recent years for solving constrained optimization problems, not only because of their effectiveness on problems arising from engineering and industry, but also because of the efficiency of the many practical solvers derived from them. Nevertheless, even though the classical ALM and ADMM were designed more than four decades ago and a large body of related work has since accumulated, the potential for designing better algorithms from the augmented Lagrangian function, the key ingredient of the ALM and the ADMM (both are recalled for reference after the summary of the first two contributions below), has still not been fully explored. This thesis is therefore devoted to the algorithmic design, theoretical analysis, and numerical implementation of several augmented Lagrangian function based methods for solving certain constrained optimization problems.

First, we present a second-order augmented Lagrangian method for solving nonlinear programming problems with both equality and inequality constraints, nonlinear second-order cone programming problems, and nonlinear semidefinite programming problems, in which the second-order update of the multipliers is carried out by a specially designed generalized Newton method that can handle the nonsmoothness caused by the inequality or conic constraints. We then conduct a thorough theoretical study of the convergence properties of this method. The results show that when the constraint nondegeneracy condition (or the linear independence constraint qualification) and the strong second-order sufficient condition hold, the method is locally convergent with a superlinear rate of convergence, even if the penalty parameter is fixed and/or strict complementarity fails.

Second, we propose an inexact multi-block ADMM-type first-order method for solving a class of high-dimensional convex composite conic optimization problems to moderate accuracy. The design of this method combines an inexact 2-block majorized semi-proximal ADMM with recent advances in the inexact symmetric Gauss-Seidel (sGS) technique for solving a multi-block convex composite quadratic programming problem whose objective contains a nonsmooth term involving only the first block variable. One of the most distinctive features of the proposed method, named sGS-imsPADMM, is that it needs only one cycle of an inexact sGS method, instead of an unknown number of cycles, to solve each of the subproblems involved. With some simple and implementable error tolerance criteria, the cost of solving the subproblems can be greatly reduced, and many steps in the forward sweep of each sGS cycle can often be skipped, which further contributes to the efficiency of the proposed method. We establish the global convergence of the proposed method, as well as its iteration complexity in the non-ergodic sense. Preliminary numerical experiments on high-dimensional linear and convex quadratic SDP problems with a large number of linear equality and inequality constraints are also presented. The numerical results show that for the vast majority of the tested problems, sGS-imsPADMM is 2 to 3 times faster than the directly extended multi-block ADMM with the aggressive step-length of 1.618, which is currently the benchmark among first-order methods for solving multi-block linear and quadratic SDP problems even though its convergence is not guaranteed.
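For reference, the augmented Lagrangian function and the classical first-order multiplier update can be recalled in their simplest, equality-constrained form; this is the standard textbook setting, deliberately simpler than the inequality- and conic-constrained problems treated in the thesis:

    \min_{x \in \mathbb{R}^n} f(x) \ \ \text{s.t.} \ \ h(x) = 0, \qquad
    \mathcal{L}_\sigma(x, \lambda) := f(x) + \langle \lambda, h(x) \rangle + \frac{\sigma}{2} \| h(x) \|^2,

    x^{k+1} \in \operatorname*{arg\,min}_{x} \mathcal{L}_\sigma(x, \lambda^k), \qquad
    \lambda^{k+1} = \lambda^k + \sigma\, h(x^{k+1}).

With inequality constraints g(x) \le 0 and multiplier \mu, the quadratic penalty term is replaced by \frac{1}{2\sigma}\big( \| \max(0, \mu + \sigma g(x)) \|^2 - \| \mu \|^2 \big); the max operation is precisely the source of the nonsmoothness that the generalized Newton method of the first contribution is designed to handle.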
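As a concrete illustration, a minimal sketch of the classical first-order ALM on an equality-constrained toy problem might look as follows; the toy objective, the fixed penalty parameter, and the use of scipy.optimize.minimize as the inner solver are all illustrative assumptions, not the algorithms developed in the thesis:

    import numpy as np
    from scipy.optimize import minimize

    # Toy problem: minimize f(x) = x1^2 + 2*x2^2  subject to  h(x) = x1 + x2 - 1 = 0.
    f = lambda y: y[0]**2 + 2.0 * y[1]**2
    h = lambda y: np.array([y[0] + y[1] - 1.0])

    sigma = 10.0        # penalty parameter (kept fixed, as in the local analysis above)
    lam = np.zeros(1)   # multiplier estimate
    x = np.zeros(2)     # primal iterate

    for k in range(50):
        # Inner step: approximately minimize the augmented Lagrangian in x.
        L = lambda y: f(y) + lam @ h(y) + 0.5 * sigma * (h(y) @ h(y))
        x = minimize(L, x, method="BFGS").x
        # Outer step: first-order multiplier update  lam <- lam + sigma * h(x).
        lam = lam + sigma * h(x)
        if np.linalg.norm(h(x)) < 1e-8:
            break

    print(x, lam)  # expect x ~ (2/3, 1/3) and lam ~ -4/3

Roughly speaking, the second-order ALM of the first contribution replaces the simple first-order multiplier update above with a generalized Newton step on the multipliers, which is what yields the superlinear rate.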
Third, we clarify some existing theoretical confusion in the understanding of the convergence properties of the ADMM. We construct a counterexample showing that the convergence statement for the ADMM applied to linearly constrained convex optimization problems in a highly influential paper can be false if no prior condition on the existence of solutions to all of the subproblems involved is assumed to hold. We then present fairly mild conditions that guarantee the existence of solutions to all the subproblems, and we provide a rigorous convergence analysis of the ADMM under the more general and useful semi-proximal ADMM (sPADMM) setting considered by Fazel et al., with a computationally attractive large step-length that can even exceed the practically much preferred golden-ratio value of 1.618.

Finally, we provide a convergence analysis of a generalized alternating direction method of multipliers with semi-proximal terms for solving linearly constrained 2-block separable convex optimization problems. This method, which relaxes both the primal and the dual variables uniformly and in a natural way with a single relaxation factor in the interval (0, 2), can enhance the performance of the classic ADMM (see the sketch below). More importantly, it can solve multi-block convex composite conic optimization problems when the semi-proximal terms are properly chosen by strategies such as the sGS technique. We tested the proposed method on 420 instances of DNN-SDP problems, and the numerical results indicate that the proposed method is not only effective in solving these problems but also more efficient owing to the relaxation step.
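To make the relaxation step concrete, here is a minimal sketch of a classical over-relaxed 2-block ADMM, with relaxation factor rho in (0, 2) and no semi-proximal terms, applied to a toy lasso problem; the problem data and parameter values are illustrative assumptions, and this simplified scheme should not be confused with the generalized sPADMM analyzed in the thesis:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 20))
    b = rng.standard_normal(40)
    mu, sigma, rho = 0.1, 1.0, 1.8   # rho in (0, 2) is the relaxation factor

    x = np.zeros(20)                 # first block variable
    z = np.zeros(20)                 # second block variable
    u = np.zeros(20)                 # scaled dual variable

    M = np.linalg.inv(A.T @ A + sigma * np.eye(20))   # cached solve for the x-subproblem
    Atb = A.T @ b
    soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)  # prox of t*||.||_1

    # Solve: minimize 0.5*||A x - b||^2 + mu*||z||_1  subject to  x - z = 0.
    for k in range(500):
        x = M @ (Atb + sigma * (z - u))        # x-update: ridge-type linear solve
        x_hat = rho * x + (1.0 - rho) * z      # relaxation of the primal iterate
        z = soft(x_hat + u, mu / sigma)        # z-update: soft-thresholding
        u = u + (x_hat - z)                    # dual update also uses the relaxed iterate

    print(np.linalg.norm(x - z))               # primal residual, should be near zero

With rho = 1 this reduces to the classic ADMM; values of rho larger than 1 are widely reported to speed up convergence in practice, consistent with the effect of the relaxation step observed in the DNN-SDP experiments above.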
Keywords/Search Tags:Constrained Optimization, Augmented Lagrangian Function, Augmented Lagrangian Method, Alternating Direction Method of Multipliers, Second-Order Augmented Lagrangian Method, Inexact Symmetric Gauss-Seidel Iteration