
Linear Convergence Rate Analysis Of Two Classes Of Convex Optimization Algorithms

Posted on: 2018-11-30
Degree: Master
Type: Thesis
Country: China
Candidate: Y Ding
Full Text: PDF
GTID: 2310330518992680
Subject: Computational Mathematics
Abstract/Summary:
With the development of science and technology, convex programming has been applied ever more widely in many fields, such as engineering and management science. This paper considers a convex optimization problem arising from such applications, which can be modeled either as an unconstrained convex program with a coupled term or as a separable convex program with a linear equality constraint; these two formulations are tailored for the popular alternating minimization method and the alternating direction method of multipliers, respectively. Our purpose is to establish the linear convergence rates of these two algorithms under suitable conditions and to compare their efficiency and effectiveness numerically.

Under the condition that the objective function satisfies a quadratic growth condition at its solution, we first establish a relevant inequality and then prove the linear convergence rate of the alternating minimization method.

Next, under the condition that one of the separable objective functions is strongly convex, we construct another inequality for the augmented Lagrangian function and prove the linear convergence rate of the alternating direction method of multipliers. We further consider a variant of this method, the linearized alternating direction method of multipliers, and prove its linear convergence rate under the same condition.

Finally, we apply the algorithms to image restoration and location problems, showing that, from several points of view, the alternating direction method of multipliers achieves better numerical results than the alternating minimization method.
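To illustrate the iteration analyzed above, here is a minimal sketch of the alternating direction method of multipliers on a toy consensus problem, minimize (1/2)(x − a)² + (1/2)(z − b)² subject to x = z. Both terms of this instance are strongly convex (the thesis only requires strong convexity of one separable term), so linear convergence is expected. The problem instance, the penalty parameter rho, and all variable names are illustrative choices, not taken from the thesis.

```python
def admm(a, b, rho=1.0, iters=60):
    """Run ADMM on: minimize (1/2)(x-a)^2 + (1/2)(z-b)^2  s.t.  x = z.

    Each subproblem of the augmented Lagrangian has a closed-form
    minimizer, so the updates below are exact. Returns the final (x, z).
    """
    x = z = y = 0.0  # primal variables and Lagrange multiplier
    for _ in range(iters):
        # x-update: argmin_x (1/2)(x-a)^2 + y*x + (rho/2)(x - z)^2
        x = (a - y + rho * z) / (1.0 + rho)
        # z-update: argmin_z (1/2)(z-b)^2 - y*z + (rho/2)(x - z)^2
        z = (b + y + rho * x) / (1.0 + rho)
        # multiplier update on the equality residual x - z
        y += rho * (x - z)
    return x, z

x, z = admm(a=1.0, b=3.0)
# Both iterates approach the consensus solution (a + b) / 2 = 2,
# and the residual shrinks geometrically, i.e. at a linear rate.
```

With a = 1 and b = 3 the residual |x − z| is roughly halved at every iteration, which is exactly the geometric decay that a linear convergence rate describes.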
Keywords/Search Tags: Convex optimization, Alternating minimization method, Alternating direction method of multipliers, Linear convergence rate