
Alternating Direction Method Of Multipliers And Quadratic Growth Condition Of Functions

Posted on: 2017-05-25    Degree: Doctor    Type: Dissertation
Country: China    Candidate: J J Wang    Full Text: PDF
GTID: 1220330503969812    Subject: Basic mathematics
Abstract/Summary:
It is well known that the convergence of the direct extension of the alternating direction method of multipliers (ADMM) to multi-block problems cannot be guaranteed. This fact has inspired many scholars to improve ADMM, mainly in two ways: one is to correct the output of the direct extension of ADMM via a simple correction step; the other is to add a simple proximal term so that each subproblem of the direct extension is solved inexactly. In this thesis, in order to solve multi-block separable convex minimization models effectively, we present two methods that combine these two approaches. Moreover, using the idea of the classical gradient method, we propose an effective algorithm for the more general nonsmooth and nonconvex composite optimization problem and establish its convergence. Finally, we establish several characterizations of the quadratic growth condition for prox-regular and subdifferentially continuous functions. The main contributions of this thesis are summarized as follows:

1. We propose two methods for solving multi-block separable convex minimization models (a sketch of the model and of the direct-extension iteration follows this summary). Both methods add to each subproblem a simple proximal term that can be chosen flexibly, so that the subproblems are easy to solve while convergence remains guaranteed. Since neither method requires the coefficient matrices to have full column rank, they apply to a wider range of problems. Numerical results show that both methods are effective.

2. We prove that ADMM converges under a condition weaker than strong convexity. This helps explain why ADMM often performs very well numerically even when the objective contains no strongly convex function.

3. We propose a gradient method for nonconvex and nonsmooth composite optimization problems. At each iteration, the method constructs an auxiliary problem by introducing a composite gradient mapping: the auxiliary objective is the sum of two terms, the linearization of the smooth function in the original objective and the nonsmooth function in the original objective. The new iterate is obtained by solving this auxiliary problem, and the method is provably convergent (an illustrative step of this type is sketched below).

4. We prove that, under suitable assumptions, the sharpest possible bound for the constant in the second-order growth condition of a proper lower semicontinuous function is attained, and we establish equivalences among the quadratic growth condition, strong metric subregularity, and strong metric regularity (standard forms of these notions are recalled after this summary).
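For concreteness, the multi-block separable convex model and the direct extension of ADMM referred to above are commonly written as follows; the notation (functions f_i, matrices A_i, penalty parameter β) is the standard textbook form, not necessarily the thesis's own:

\[
\min_{x_1,\dots,x_m}\ \sum_{i=1}^{m} f_i(x_i) \quad \text{s.t.} \quad \sum_{i=1}^{m} A_i x_i = b,
\]
\[
x_i^{k+1} = \arg\min_{x_i}\ \mathcal{L}_\beta\big(x_1^{k+1},\dots,x_{i-1}^{k+1},\,x_i,\,x_{i+1}^{k},\dots,x_m^{k},\,\lambda^k\big), \qquad i = 1,\dots,m,
\]
\[
\lambda^{k+1} = \lambda^k - \beta\Big(\sum_{i=1}^{m} A_i x_i^{k+1} - b\Big),
\]
where \(\mathcal{L}_\beta(x_1,\dots,x_m,\lambda) = \sum_{i=1}^{m} f_i(x_i) - \big\langle \lambda,\ \sum_{i=1}^{m} A_i x_i - b \big\rangle + \frac{\beta}{2}\big\|\sum_{i=1}^{m} A_i x_i - b\big\|^2\) is the augmented Lagrangian. For m = 2 this is the classical convergent ADMM; for m ≥ 3 it is the "direct extension" whose convergence can fail, and a proximal variant of the kind discussed in contribution 1 adds to each x_i-subproblem a term such as \(\frac{1}{2}\|x_i - x_i^{k}\|_{G_i}^2\) for a flexibly chosen positive semidefinite \(G_i\).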
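The auxiliary problem in contribution 3 has the flavor of a proximal-gradient step. Below is a minimal Python sketch of one such step for an objective g + h with smooth g and nonsmooth h = λ‖·‖₁; the ℓ₁ choice of h, the fixed step size t, and all names are illustrative assumptions, not the thesis's algorithm.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (the assumed nonsmooth part h).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def composite_gradient_step(x, grad_g, t, lam):
    # Solve the auxiliary problem
    #   min_y  g(x) + <grad_g(x), y - x> + (1/(2t))||y - x||^2 + lam*||y||_1,
    # i.e. the linearized smooth part plus the untouched nonsmooth part.
    # Its closed-form minimizer is a prox step on the gradient step.
    return soft_threshold(x - t * grad_g(x), t * lam)

# Illustrative usage with g(x) = 0.5*||Ax - b||^2; a nonconvex smooth g
# would be handled the same way, only grad_g changes.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([1.0, 1.0])
grad_g = lambda x: A.T @ (A @ x - b)

x = np.zeros(2)
for _ in range(100):
    x = composite_gradient_step(x, grad_g, t=0.2, lam=0.1)
print(x)
```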
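For contribution 4, standard formulations of the notions involved (generic statements, not the thesis's exact ones) read:

\[
f(x) \;\ge\; f(\bar{x}) + \frac{c}{2}\,\|x - \bar{x}\|^2 \quad \text{for all } x \text{ near } \bar{x}
\]
(quadratic, or second-order, growth of \(f\) at a local minimizer \(\bar{x}\) with constant \(c > 0\)), and
\[
\|x - \bar{x}\| \;\le\; \kappa\,\mathrm{dist}\big(0,\ \partial f(x)\big) \quad \text{for all } x \text{ near } \bar{x}
\]
(strong metric subregularity of the subdifferential \(\partial f\) at \(\bar{x}\) for \(0\), with modulus \(\kappa\)). Strong metric regularity additionally requires \((\partial f)^{-1}\) to admit a single-valued Lipschitz localization around \((0,\bar{x})\). The thesis establishes equivalences among these properties, and the sharpness of the constant, in the setting of prox-regular and subdifferentially continuous functions.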
Keywords/Search Tags: ADMM, global convergence, quadratic growth, metric regularity, nonconvex and nonsmooth composite optimization