
Alternating Direction Method For Solving Three Separable Non-convex Optimization Problems

Posted on: 2019-09-24    Degree: Master    Type: Thesis
Country: China    Candidate: C S Li    Full Text: PDF
GTID: 2430330548496267    Subject: Computational Mathematics
Abstract/Summary:
The alternating direction method of multipliers (ADMM) originated in the 1970s from numerical methods for differential equations, and can be traced back to the 1950s through its relationship to the famous operator splitting methods, such as the Douglas-Rachford splitting method and the Peaceman-Rachford splitting method. ADMM was introduced into the optimization field in the 1980s and has since become a class of efficient methods. Recently, with the blossoming of big data and artificial intelligence, ADMM plays an increasingly important role in machine learning, traffic assignment, image processing, economic equilibrium, resource allocation, and so on.

These results mainly concern convex problems, for which algorithm design and convergence analysis are quite mature. However, many optimization models arising from real applications are nonconvex. For optimization problems whose objective functions are nonconvex or partially nonconvex, a common approach is to relax the original problem to a convex optimization problem, for which many tools are available to carry out the convergence analysis. For the nonconvex problem itself, the research is still in its infancy. Essentially, a nonconvex model tends to approximate the practical problem better than its convex counterpart. Therefore, more and more scholars have begun to study the convergence and the convergence rate of ADMM for nonconvex problems.

For convex optimization problems, the classical ADMM is usually efficient, and its convergence is well understood when the objective function has two blocks. However, a counterexample has been constructed to show that the directly extended multi-block ADMM can diverge. Studies of the multi-block ADMM have therefore split into two directions: the first investigates sufficient conditions under which the directly extended ADMM converges, and the second makes simple modifications of the algorithm to ensure convergence. Most recently, for a special class of three-block separable convex optimization problems, Sun et al. [36] proposed a convergent variant of the alternating direction method of multipliers. For nonconvex separable optimization problems, Guo et al. [17] proved that, under the assumption that the associated function satisfies the Kurdyka-Lojasiewicz inequality, the iterates generated by the alternating direction method of multipliers converge to a critical point of the associated augmented Lagrangian function; under some additional conditions on the problem data, they also analyzed its rate of convergence.

Combining the results in [36] and [17], in this paper we consider the three-block separable nonconvex optimization problem in which the third component function is a quadratic function. Under the assumption that the objective function satisfies the Kurdyka-Lojasiewicz inequality, we prove that any cluster point of the iterative sequence generated by the variant alternating direction method of multipliers is a critical point of its augmented Lagrangian function. We also present some sufficient conditions guaranteeing the linear convergence rate of the algorithm.
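To illustrate the classical two-block ADMM whose convergence the abstract discusses, the sketch below applies it to the lasso problem min 0.5||Ax-b||² + λ||z||₁ subject to x = z. This is a standard textbook instance, not the three-block variant studied in the thesis; the function names, step size ρ, and iteration count are illustrative choices, not taken from [36] or [17].

```python
import numpy as np

def soft_threshold(v, k):
    # Proximal operator of k * ||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=500):
    """Two-block ADMM for min 0.5||Ax-b||^2 + lam*||z||_1 s.t. x = z."""
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable (Lagrange multiplier / rho)
    # The x-update solves the same regularized normal equations every
    # iteration, so the matrix is formed once outside the loop.
    AtA = A.T @ A + rho * np.eye(n)
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(AtA, Atb + rho * (z - u))  # x-update: ridge step
        z = soft_threshold(x + u, lam / rho)           # z-update: prox of l1
        u = u + x - z                                  # dual update on multiplier
    return x, z
```

At a solution the primal residual x - z vanishes, so monitoring its norm is the usual stopping test in place of the fixed iteration count used here.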
Keywords/Search Tags:separable convex optimization problem, alternating direction method of multipliers, nonconvex optimization problem, Kurdyka-Lojasiewicz inequality, global convergence, linear rate of convergence