
Studies On Splitting Algorithms For Nonconvex Separable Optimization Problems

Posted on: 2022-05-19
Degree: Doctor
Type: Dissertation
Country: China
Candidate: M X Liu
Full Text: PDF
GTID: 1480306722957469
Subject: Operational Research and Cybernetics
Abstract/Summary:
Nonconvex separable optimization problems, an important class of structured nonconvex optimization problems, arise widely in machine learning, image processing, engineering design, and other fields. Research in this area is still in active development and has attracted extensive attention and interest from many scholars. At the same time, with the rapid development of science and technology, the growth of massive and increasingly complex data has made problem scales ever larger. It is therefore meaningful to study effective algorithms for nonconvex separable optimization problems in depth. This dissertation studies splitting algorithms for nonconvex separable optimization problems, further develops nonconvex optimization theory and algorithm design, and provides algorithmic support for solving more practical problems. The main results are summarized as follows.

Firstly, a strictly contractive symmetric alternating direction method of multipliers (ADMM) from the convex setting is improved and extended to nonconvex separable optimization problems with linear equality constraints. At each iteration, the Lagrange multiplier is updated twice, and both updates carry a relaxation factor; a minimal sketch of this symmetric iteration is given below. Under suitable conditions, including the Kurdyka–Łojasiewicz inequality, strong convergence of the algorithm as well as sublinear and linear convergence rates are established. Preliminary numerical results show that the proposed algorithm is efficient.

Secondly, combining the ideas of the sequential quadratic programming (SQP) method and proximal ADMM, a proximal splitting sequential quadratic programming (SSQP) algorithm that requires no line search is proposed for nonconvex separable optimization problems with linear equality constraints. With the SQP method as the main line, proximal ADMM is applied to the traditional quadratic approximation subproblem to obtain smaller-scale quadratic subproblems. At each iteration, the resulting subproblems are regularized by a Bregman distance so that they can be solved effectively, and the Lagrange multiplier is updated in a dual-ascent fashion. Under the Kurdyka–Łojasiewicz inequality, strong convergence together with sublinear and linear convergence rates is obtained.

Thirdly, combining the ideas of the SSQP algorithm and symmetric ADMM, a symmetric SSQP algorithm with a line search is proposed for general linearly constrained nonconvex separable optimization problems. By introducing an auxiliary variable, the original problem is equivalently transformed into a three-block nonconvex optimization problem with linear equality constraints. At each iteration, the algorithm solves regularized subproblems: search directions are generated by solving one quadratic subproblem and another subproblem that admits an explicit solution. Taking the augmented Lagrangian function as the merit function, an Armijo-type inexact line search is executed in succession to compute two step sizes (see the line-search sketch at the end of this summary), and the Lagrange multiplier is updated twice in a simple, explicit manner. Under suitable conditions, the global convergence and iteration complexity of the algorithm are proved, and the unit step size is shown to be acceptable. Numerical tests demonstrate the effectiveness of the algorithm.
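The symmetric multiplier update shared by the first and third algorithms can be made concrete in a short sketch. The Python code below is a minimal illustration of a symmetric ADMM iteration with two relaxed dual updates on a toy two-block problem; the objective functions f and g, the penalty parameter beta, and the relaxation factors r_fac and s_fac are illustrative assumptions, not the dissertation's exact scheme or parameter conditions.

```python
# Minimal sketch of a symmetric ADMM iteration with two relaxed dual updates
# for min f(x) + g(y) s.t. Ax + By = b.  The toy data, objectives, penalty,
# and relaxation factors below are assumptions made for illustration only.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
m, n, p = 5, 8, 6
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, p))
b = rng.standard_normal(m)

def f(x):  # smooth nonconvex block objective (assumed)
    return np.sum(x**2 / (1.0 + x**2))

def g(y):  # smooth convex block objective (assumed)
    return 0.5 * np.sum(y**2)

def aug_lag(x, y, lam, beta):
    res = A @ x + B @ y - b
    return f(x) + g(y) + lam @ res + 0.5 * beta * res @ res

beta, r_fac, s_fac = 1.0, 0.9, 0.9  # penalty and the two relaxation factors
x, y, lam = np.zeros(n), np.zeros(p), np.zeros(m)

for k in range(100):
    # x-step: minimize the augmented Lagrangian in x
    x = minimize(lambda v: aug_lag(v, y, lam, beta), x).x
    # first (relaxed) multiplier update
    lam = lam + r_fac * beta * (A @ x + B @ y - b)
    # y-step: minimize the augmented Lagrangian in y
    y = minimize(lambda v: aug_lag(x, v, lam, beta), y).x
    # second (relaxed) multiplier update
    lam = lam + s_fac * beta * (A @ x + B @ y - b)

print("constraint residual:", np.linalg.norm(A @ x + B @ y - b))
```

In convergence analyses of such schemes, the relaxation factors are typically restricted to an interval that keeps the iteration strictly contractive; the admissible range established in the dissertation is not reproduced here.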
Finally, SSQP algorithms with line search techniques are studied for general nonlinearly constrained nonconvex separable optimization problems. By introducing an auxiliary variable, the original problem is equivalently reformulated as a three-block nonconvex optimization problem with nonlinear equality constraints, whose quadratic approximation subproblem is then considered. Based on the Jacobi iterative strategy of splitting algorithms, two monotone SSQP algorithms are proposed. At each iteration, search directions are generated by solving two quadratic subproblems and a third subproblem that admits an explicit solution, and an Armijo-type inexact line search is performed with the augmented Lagrangian function as the merit function. The two algorithms differ in how the Lagrange multiplier is updated: one updates it in a simple, explicit manner, while the other updates it by a line search along a corrected negative-gradient direction. Under mild conditions, both algorithms are globally convergent, and numerical experiments show that they are efficient on the test problems.
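For the Armijo-type line searches used in the third and fourth algorithms, the sketch below shows a standard backtracking rule on a generic merit function phi along a descent direction d; the helper name armijo_step, its constants sigma and rho, and the toy merit function are hypothetical, standing in for the dissertation's augmented-Lagrangian merit function and SQP-generated directions.

```python
# Minimal sketch of an Armijo-type backtracking line search on a merit
# function phi along a direction d.  The constants below are conventional
# textbook choices, not the dissertation's parameters.
import numpy as np

def armijo_step(phi, grad_phi, z, d, sigma=1e-4, rho=0.5, t0=1.0, max_back=50):
    """Return t with phi(z + t*d) <= phi(z) + sigma * t * grad_phi(z) @ d."""
    t, phi0, slope = t0, phi(z), grad_phi(z) @ d
    assert slope < 0, "d must be a descent direction for phi"
    for _ in range(max_back):
        if phi(z + t * d) <= phi0 + sigma * t * slope:
            return t      # sufficient decrease achieved
        t *= rho          # backtrack
    return t

# usage on a toy merit function
phi = lambda z: 0.25 * np.sum(z**4) + 0.5 * np.sum(z**2)
grad = lambda z: z**3 + z
z = np.array([2.0, -1.5])
d = -grad(z)              # steepest-descent direction as a stand-in
print("accepted step:", armijo_step(phi, grad, z, d))
```

In the symmetric SSQP algorithm of the third result, such a search is executed in succession to produce the two step sizes mentioned above.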
Keywords/Search Tags: Nonconvex separable optimization, Symmetric alternating direction method of multipliers, Splitting sequential quadratic programming, Convergence, Rate of convergence, Iteration complexity