
Some Augmented Lagrangian Like Methods For Linearly Constrained Convex Optimization

Posted on: 2024-01-27
Degree: Doctor
Type: Dissertation
Country: China
Candidate: S J Xu
Full Text: PDF
GTID: 1520307376984859
Subject: Mathematics
Abstract/Summary:
The augmented Lagrangian method (ALM) is a fundamental algorithm for optimization problems with equality constraints. When the objective is convex and the constraints are linear, it also gives rise to a series of important operator splitting algorithms, including the alternating direction method of multipliers (ADMM). Most linearly constrained convex optimization problems arising in applications have a separable structure. Given the success of the ADMM for two-block convex optimization, it is natural to ask whether the direct extension of the ADMM still works for convex optimization problems with multiple separable blocks. However, convergence of the directly extended ADMM cannot be guaranteed even for three separable blocks. Hence, many topics in the field of separable convex optimization algorithms remain worthy of further research. This dissertation considers two more general classes of linearly constrained convex optimization problems and, by exploiting the unified framework of splitting and contraction algorithms for convex optimization proposed by Prof. Bingsheng He, develops several ALM-like algorithms that are simple to implement, impose fewer parameter restrictions, apply more widely, and generalize easily. The main research results are as follows.

1. We study convex optimization problems with linear equality constraints. To address the unequal computational difficulty of the ALM subproblems, we present a dual-primal balanced ALM, obtained by reversing the update order of the iteration variables in the recently proposed balanced ALM. Based on its equivalent prediction-correction scheme and the unified framework, we prove that the algorithm is globally convergent and has an O(1/N) convergence rate in both the ergodic and pointwise senses. Moreover, by refining the prediction matrix, we propose a generalized dual-primal balanced ALM for more general linearly constrained convex optimization problems. Finally, the superior computational performance of the balanced ALM-type methods is verified on basis pursuit and exchange problems.

2. We study convex optimization problems with linear inequality constraints. To ease the computational difficulty of the core subproblem of the inequality-constrained ALM, we first propose an inequality ALM-like method that is easier to implement, based on a novel representation of the equality-constrained ALM. To further reduce the difficulty of the core subproblem, we present a linearized variant obtained via the linearization technique. Based on the unified framework, the global convergence and O(1/N) convergence rate of both algorithms, in the ergodic and pointwise senses, are proved. To further relax the convergence condition, an indefinite linearized inequality ALM-like method is presented, and its global convergence and O(1/N) ergodic convergence rate are analyzed using the indefinite regularization technique. Finally, we demonstrate the efficiency of the inequality ALM-like methods on support vector machine and image segmentation models.

3. We study linearly constrained separable convex optimization problems. To overcome the limitation that the ALM cannot be split into arbitrarily many separable blocks in a Gaussian (Gauss-Seidel) manner, we first present a two-block Gaussian splitting ALM and its dual-primal variant by utilizing an equivalent scheme of the ADMM. In particular, by reorganizing the penalty term, the methods can handle both linear equality and inequality constraints. Furthermore, we propose an arbitrary-block Gaussian splitting ALM and its dual-primal version by extending the proximal term. Using block-matrix analysis and the unified framework, the global convergence and O(1/N) convergence rate of the generalized algorithms are proved. Finally, for the two-block equality-constrained case, we show through extensive numerical examples that the new algorithms are as efficient as the ADMM.

4. We study linearly constrained separable convex optimization problems. To overcome the limitation that the ALM cannot be split into arbitrarily many separable blocks in a Jacobian manner, and that existing Jacobian splitting ALM variants impose structural restrictions on the step size, we first present a Jacobian splitting ALM with a rank-two relaxed correction and a free step size, constructed by writing the prediction matrix as the sum of a diagonal matrix and a skew-symmetric matrix. Combined with the Kronecker product, its global convergence and O(1/N) convergence rate are established via the unified framework. Furthermore, we propose a proximal relaxed Jacobian splitting ALM with a smaller regularization factor by constructing the prediction matrix as a lower triangular matrix, and prove its global convergence and O(1/N) convergence rate via the unified framework. In particular, the proposed algorithms can directly solve separable convex optimization problems with linear inequality constraints. Finally, we illustrate the superior and stable numerical performance of the algorithms through a number of numerical experiments.
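As background for the ALM variants summarized above, the classical ALM iteration for an equality-constrained convex problem alternates an x-minimization of the augmented Lagrangian with a multiplier update. The following is a minimal sketch in Python/NumPy on a toy quadratic program; the problem data, penalty parameter, and iteration count are illustrative assumptions, not taken from the dissertation:

```python
import numpy as np

# Toy problem (illustrative): min 0.5*||x - c||^2  s.t.  Ax = b.
rng = np.random.default_rng(0)
m, n = 3, 6
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)
c = rng.standard_normal(n)

beta = 1.0                       # penalty parameter (an assumed choice)
lam = np.zeros(m)                # Lagrange multiplier
x = np.zeros(n)
# For this quadratic objective the x-subproblem reduces to a linear system
# with the fixed matrix I + beta * A^T A.
M = np.eye(n) + beta * A.T @ A

for k in range(300):
    # x-step: minimize L_beta(x, lam) = f(x) - lam^T (Ax - b) + (beta/2)||Ax - b||^2
    x = np.linalg.solve(M, c + A.T @ lam + beta * A.T @ b)
    # dual step: multiplier update along the constraint residual
    lam = lam - beta * (A @ x - b)

print(np.linalg.norm(A @ x - b))  # feasibility residual, should be near zero
```

The splitting methods discussed in the dissertation modify this basic scheme: the ADMM replaces the joint x-minimization with alternating minimization over separable blocks, the dual-primal variants reverse the update order of the variables, and the linearized and proximal variants simplify the x-subproblem at the cost of extra regularization terms.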
Keywords/Search Tags:augmented Lagrangian method, separable convex programming, linear equality and inequality constraints, distributed computation, prediction and correction scheme, convergence analysis