
Convergence Of The Generalized Alternating Direction Method Of Multipliers For Two Classes Of Nonconvex Optimization Problems

Posted on: 2020-02-14
Degree: Master
Type: Thesis
Country: China
Candidate: X Wang
Full Text: PDF
GTID: 2370330590462875
Subject: Applied Mathematics

Abstract/Summary:
The convex optimization problem is an important branch of mathematical optimization. It studies how to minimize a convex objective function over a compact convex set. If a practical problem can be described as a convex optimization problem, its global minimum can be obtained. The theory of convex optimization is quite mature, and many effective algorithms exist. In the real world, however, many optimization problems are nonconvex. A common approach is to relax a nonconvex problem into a convex one, but the resulting solution may differ substantially from the solution of the original problem. For objective functions that are nonconvex or partially nonconvex, research is still at an early stage and existing results are few. In this dissertation, the generalized alternating direction method of multipliers (GADMM) is applied directly to the original nonconvex problems.

For two classes of nonconvex optimization problems, under the assumption that the augmented Lagrangian function satisfies the Kurdyka-Łojasiewicz inequality, it is proved that when the penalty parameter of the augmented Lagrangian function is greater than a threshold, the iterative sequence generated by the GADMM converges to a critical point of the augmented Lagrangian function. Under additional assumptions, the convergence rate of the algorithm is also established.

The dissertation consists of five chapters. Chapter 1 introduces the research background, the research problems, and the structure of the dissertation. Chapter 2 presents the preliminary knowledge used throughout the dissertation. Chapter 3 applies the GADMM to the minimization of the sum of two functions subject to linear constraints, where one function is convex and the other can be expressed as the difference of two convex functions. For each subproblem of the GADMM, the linearization technique from difference-of-convex algorithms is employed. Under the assumption that the associated functions satisfy the Kurdyka-Łojasiewicz inequality, the sequence generated by the GADMM converges to a critical point of the augmented Lagrangian function when the penalty parameter is sufficiently large, and the convergence rate of the algorithm is established. Chapter 4 considers the convergence of the GADMM for linearly constrained nonconvex minimization models whose objective contains coupled functions. Under the assumption that the augmented Lagrangian function satisfies the Kurdyka-Łojasiewicz inequality, we prove that the sequence generated by the GADMM converges to a critical point of the augmented Lagrangian function when the penalty parameter is sufficiently large; the convergence rate is again established. Chapter 5 summarizes the dissertation, its main work, and its main conclusions.
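For orientation, the two-block problem of the kind treated in Chapter 3 and a generic GADMM iteration can be written as follows. This is the standard over-relaxed formulation from the ADMM literature; the notation (relaxation factor α, penalty β) is illustrative and not taken verbatim from the dissertation:

\begin{align*}
&\min_{x,\,y}\; f(x) + g(y) \quad \text{s.t.}\quad Ax + By = b,\\
&\mathcal{L}_{\beta}(x,y,\lambda) = f(x) + g(y) - \langle \lambda,\, Ax + By - b\rangle + \tfrac{\beta}{2}\,\|Ax + By - b\|^2,\\
&x^{k+1} = \arg\min_{x}\; \mathcal{L}_{\beta}(x, y^{k}, \lambda^{k}),\\
&r^{k+1} = \alpha\, A x^{k+1} - (1-\alpha)\,(B y^{k} - b), \qquad \alpha \in (0,2),\\
&y^{k+1} = \arg\min_{y}\; g(y) - \langle \lambda^{k},\, By\rangle + \tfrac{\beta}{2}\,\|r^{k+1} + By - b\|^2,\\
&\lambda^{k+1} = \lambda^{k} - \beta\,(r^{k+1} + B y^{k+1} - b).
\end{align*}

Taking α = 1 recovers the classical ADMM; the convergence results in the dissertation require β above a problem-dependent threshold.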
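To make the iteration concrete, here is a minimal runnable sketch of generalized (over-relaxed) ADMM on a toy convex instance, min ½‖x − c‖² + λ‖y‖₁ subject to x = y, whose exact solution is elementwise soft-thresholding of c. The problem choice, function names, and parameter values are illustrative assumptions, not taken from the dissertation:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def gadmm_toy(c, lam, beta=1.0, alpha=1.5, iters=300):
    """Generalized (over-relaxed) ADMM for
        min_x 0.5*||x - c||^2 + lam*||y||_1   s.t.  x = y,
    written with the scaled dual variable u = lambda/beta.
    alpha in (0, 2) is the relaxation factor; alpha = 1 is plain ADMM.
    """
    c = np.asarray(c, dtype=float)
    y = np.zeros_like(c)
    u = np.zeros_like(c)
    for _ in range(iters):
        # x-update: minimize 0.5||x - c||^2 + (beta/2)||x - y + u||^2 (closed form)
        x = (c + beta * (y - u)) / (1.0 + beta)
        # over-relaxation: blend the new x with the previous y
        x_hat = alpha * x + (1.0 - alpha) * y
        # y-update: proximal step on the l1 term
        y = soft_threshold(x_hat + u, lam / beta)
        # scaled dual update
        u = u + x_hat - y
    return y
```

Since the constraint forces x = y, the limit should equal soft_threshold(c, lam); running `gadmm_toy([3.0, -0.5, 1.5], 1.0)` returns approximately `[2.0, 0.0, 0.5]`. The nonconvex setting analyzed in the dissertation keeps this algorithmic skeleton but replaces convexity arguments with the Kurdyka-Łojasiewicz property of the augmented Lagrangian.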
Keywords/Search Tags: Generalized alternating direction method of multipliers, Kurdyka-Łojasiewicz inequality, Nonconvex optimization, Convergence