
Optimization Algorithms Based on the Alternating Direction Method of Multipliers and Operator Splitting

Posted on: 2020-04-19
Degree: Doctor
Type: Dissertation
Country: China
Candidate: X K Chang
Full Text: PDF
GTID: 1360330602963870
Subject: Applied Mathematics
Abstract/Summary:
The convex minimization problem with linear constraints and a separable objective function has numerous applications in many fields, such as the dual of (quadratic) semidefinite programming (SDP), image processing, compressed sensing, and machine learning. Designing fast and efficient algorithms that fully exploit the separable structure is a hot topic in optimization. As first-order methods, the alternating direction method of multipliers (ADMM) and operator splitting algorithms are very popular and efficient for solving separable convex programming problems. For multi-block problems, the direct extension of ADMM does not guarantee convergence, although it has many advantages, such as extreme simplicity and efficiency. Because of these advantages, ADMM has become one of the most widely used algorithms and has sparked a surge of research on first-order distributed algorithms. In this thesis, we design first-order algorithms for three separable convex optimization models with different structures, based on the basic frameworks of ADMM and operator splitting, and study their convergence, complexity, and numerical efficiency in a unified way.

Firstly, a modified ADMM is developed for solving the dual of convex quadratic SDP (CQSDP), which is naturally a 3-block separable convex optimization problem. The modified ADMM has the peculiar feature that the augmented Lagrangian function need not be minimized with respect to the block variable corresponding to the quadratic term of the objective function. This saves both computational cost and memory for variable storage. To further analyze convergence, the modified ADMM is interpreted as an application of the 3-operator splitting framework studied by Davis and Yin. For the H-weighted nearest correlation matrix problem, we show that the direct extension of ADMM (ADMMd) applied to its dual is equivalent, under special initial conditions, to a 2-block semi-proximal ADMM applied to a reformulation. Hence ADMMd is convergent for this special problem, requiring no correction steps to generate new iterates. Finally, by balancing the progress of the primal and dual feasibilities, we adjust the penalty parameter dynamically to improve the efficiency of the proposed method.

Secondly, we present a prediction-correction-based ADMM (PCB-ADMM) to solve multi-block separable convex minimization models with linear or quadratic blocks. The prediction step performs a special block coordinate descent (BCD) cycle to update the variable blocks; the correction step then adjusts the output slightly by computing a convex combination of two points, one from the prediction step and one from the previous iteration. Convergence is established using a variational inequality characterization together with the prediction-correction scheme. In particular, we obtain convergence of PCB-ADMM with unit step length in the correction step when a certain linear operator is invertible; in this case the correction step is unnecessary. Numerical results on the low patch-rank image decomposition problem illustrate that PCB-ADMM is efficient.

Finally, we construct a linearized symmetric ADMM (LSADMM) with indefinite proximal regularization for solving general multi-block separable convex minimization problems. LSADMM partitions the variables into two groups and updates the Lagrange multiplier twice, in different forms and with suitable step sizes. The two groups are updated in a Gauss-Seidel scheme, while the variables within each group are updated in a Jacobian scheme, which makes the method attractive for some problems in the big-data setting. Theoretically, we obtain the optimal lower bound of the proximal parameter. Numerical experiments on a sparse matrix problem arising in statistical learning validate the significant improvement of the proposed LSADMM over some recently developed state-of-the-art algorithms.
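The 2-block ADMM framework on which the thesis builds can be illustrated on a small example. The sketch below is not taken from the thesis; the lasso instance, parameter values, and variable names are illustrative assumptions. It applies scaled-form ADMM to min ½‖Ax − b‖² + λ‖z‖₁ subject to x − z = 0.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam, rho=1.0, iters=500):
    """Scaled-form 2-block ADMM for min 0.5||Ax-b||^2 + lam*||z||_1 s.t. x = z."""
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                      # scaled dual variable
    M = A.T @ A + rho * np.eye(n)        # normal-equations matrix for the x-update
    Atb = A.T @ b
    for _ in range(iters):
        x = np.linalg.solve(M, Atb + rho * (z - u))   # x-minimization step
        z = soft_threshold(x + u, lam / rho)          # z-minimization step
        u = u + x - z                                 # dual (multiplier) update
    return x, z

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
x, z = admm_lasso(A, b, lam=0.5)
print(np.linalg.norm(x - z))  # primal residual, should be tiny at convergence
```

Each ADMM pass minimizes the augmented Lagrangian over one block at a time with the other block fixed, then takes a dual ascent step; the multi-block algorithms in the thesis extend this alternating pattern.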
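The Davis–Yin 3-operator splitting framework used to explain the modified ADMM combines the proximal maps of two nonsmooth terms with the gradient of a smooth term. A minimal one-dimensional sketch follows; the test functions, step size, and iteration count are illustrative assumptions, not from the thesis. It minimizes f(x) + g(x) + h(x) with f = λ|x|, g the indicator of [0, ∞), and h(x) = ½(x − c)², whose solution is max(c − λ, 0).

```python
import numpy as np

def davis_yin(prox_f, prox_g, grad_h, z0, gamma, iters=500):
    """Davis-Yin splitting for min f(x) + g(x) + h(x), h smooth with L-Lipschitz
    gradient and step size gamma in (0, 2/L):
        x = prox_{gamma f}(z)
        y = prox_{gamma g}(2x - z - gamma * grad_h(x))
        z = z + y - x
    """
    z = z0
    for _ in range(iters):
        x = prox_f(z, gamma)
        y = prox_g(2 * x - z - gamma * grad_h(x), gamma)
        z = z + y - x
    return x

# Illustrative instance: f = 0.5*|x|, g = indicator of [0, inf), h = 0.5*(x-2)^2.
lam, c = 0.5, 2.0
prox_f = lambda v, g: np.sign(v) * max(abs(v) - g * lam, 0.0)  # soft-threshold
prox_g = lambda v, g: max(v, 0.0)                              # projection onto [0, inf)
grad_h = lambda x: x - c                                       # gradient of h

x_star = davis_yin(prox_f, prox_g, grad_h, z0=0.0, gamma=1.0)
print(x_star)  # closed-form solution is max(c - lam, 0) = 1.5
```

With two prox-friendly terms and one smooth term, the scheme needs no inner solves, which is what lets the modified ADMM skip the minimization over the block tied to the quadratic term.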
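The dynamic penalty adjustment that balances primal and dual feasibility is, in spirit, the standard residual-balancing heuristic from the ADMM literature; the rule and default constants below are that common heuristic, not necessarily the exact rule used in the thesis.

```python
def update_penalty(rho, primal_res, dual_res, mu=10.0, tau=2.0):
    """Residual-balancing heuristic for the ADMM penalty parameter rho.

    If the primal residual dominates, the constraint is enforced too weakly,
    so increase rho; if the dual residual dominates, decrease rho. The
    factors mu and tau are conventional defaults from the literature.
    """
    if primal_res > mu * dual_res:
        return rho * tau       # primal residual dominates: increase rho
    if dual_res > mu * primal_res:
        return rho / tau       # dual residual dominates: decrease rho
    return rho                 # residuals balanced: leave rho unchanged

print(update_penalty(1.0, 100.0, 1.0))  # -> 2.0
```

Note that in the scaled form of ADMM the scaled dual variable must be rescaled whenever rho changes (u is divided by the same factor by which rho grows).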
Keywords/Search Tags:Separable convex programming, Alternating direction method of multipliers, Operator splitting, Indefinite proximal regularization, Prediction-correction scheme