
Study On Some Splitting Methods For Convex Optimization And Monotone Variational Inequality And Their Algorithmic Complexity

Posted on: 2014-01-28
Degree: Doctor
Type: Dissertation
Country: China
Candidate: X J Cai
Full Text: PDF
GTID: 1220330395995388
Subject: Computational Mathematics
Abstract/Summary:
Nowadays, many optimization problems arise from applications in fields such as electrical engineering, statistics, computer science, signal processing, and compressive sensing. These problems are large-scale, often ill-posed, and frequently possess favorable structures such as separability and sparsity. Many researchers have devoted themselves to solving them, and many algorithms are available, e.g., the alternating direction method (ADM), splitting algorithms, the augmented Lagrangian method, the proximal point algorithm (PPA), projection and contraction methods, Newton's method, and interior point methods. First-order algorithms are popular for large-scale problems because of their low per-iteration cost. However, the subproblems arising in these algorithms can still be difficult to solve. It is therefore important to design algorithms whose subproblems are easy to solve and whose convergence rate can be guaranteed.

In this thesis, we propose several first-order splitting algorithms for convex optimization problems and monotone variational inequalities (VIs), and study their algorithmic complexity.

In Chapter 1, we review some source problems, such as differentiable convex optimization problems with or without linear constraints, saddle point problems, and complementarity problems; we then summarize some important algorithms and their applications, and finally give the definition of an ε-optimal solution. In Chapter 2, we recall the properties of the projection operator, some basic and crucial inequalities for VIs, and some definitions and notation.

In Chapter 3, we consider convex programming problems with separable structure, whose objective function is the sum of two functions without coupled variables. The alternating direction method of multipliers (ADMM) is a benchmark for these problems. In this chapter, we show that ADMM can be regarded as an application of the PPA to the primal model with a customized choice of the proximal parameter. This PPA revisit of ADMM from the primal perspective also enables us to easily recover the generalized ADMM proposed by Eckstein and Bertsekas. A worst-case O(1/t) convergence rate in the ergodic sense is established for a slight extension of Eckstein and Bertsekas's generalized ADMM.

In Chapter 4, we propose a new correction strategy for some first-order primal-dual algorithms arising in total variation image restoration. With this strategy, we can prove convergence of the algorithm under conditions more flexible than those proposed most recently. Preliminary numerical results on image deblurring support that the new correction strategy improves numerical efficiency.

Nemirovski's analysis (2005) indicates that the extragradient method has an O(1/t) convergence rate for variational inequalities with Lipschitz continuous monotone operators. For the same problems, a class of Fejér monotone projection and contraction methods has been developed over the last decades. Until now, only convergence results have been available for these projection and contraction methods, although numerical experiments indicate that they always outperform the extragradient method when the 'optimal' step size in the contraction sense is used. In Chapter 5, we prove this O(1/t) convergence rate under a unified conceptual framework that includes the projection and contraction methods as special cases, thus completing the theory of the existing projection and contraction methods. Preliminary numerical results demonstrate that the projection and contraction methods converge about twice as fast as the extragradient method.
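For concreteness, the following is a minimal sketch of the classical two-block ADMM iteration that Chapter 3 takes as its benchmark, applied to a toy separable problem min 0.5||x-p||² + 0.5||y-q||² s.t. x + y = b, chosen so that both subproblems have closed-form solutions. The instance and all names (p, q, b, beta) are illustrative assumptions, not taken from the thesis.

```python
# Minimal two-block ADMM sketch for: min f(x) + g(y)  s.t.  Ax + By = b,
# on the toy instance f(x) = 0.5||x - p||^2, g(y) = 0.5||y - q||^2, A = B = I.
# Augmented Lagrangian: f(x) + g(y) - lam^T(x + y - b) + (beta/2)||x + y - b||^2.
import numpy as np

def admm(p, q, b, beta=1.0, iters=100):
    """beta is the penalty parameter; x, y are the two blocks."""
    n = p.size
    y = np.zeros(n)
    lam = np.zeros(n)               # Lagrange multiplier
    for _ in range(iters):
        # x-subproblem: argmin_x 0.5||x-p||^2 - lam^T x + (beta/2)||x + y - b||^2
        x = (p + lam - beta * (y - b)) / (1.0 + beta)
        # y-subproblem: same form, using the freshly updated x
        y = (q + lam - beta * (x - b)) / (1.0 + beta)
        # multiplier update
        lam = lam - beta * (x + y - b)
    return x, y, lam
```

On this toy instance the iterates approach the KKT point x* = p + (b - p - q)/2, y* = q + (b - p - q)/2, which can be used as a quick sanity check.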
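Similarly, a minimal sketch of the extragradient method analyzed in Chapter 5, for the monotone VI of finding u* in Omega with (u - u*)ᵀF(u*) ≥ 0 for all u in Omega. The instance below (an assumed affine monotone operator F(u) = Mu + q over the nonnegative orthant, so the projection is a simple clip) is illustrative, not from the thesis.

```python
# Minimal extragradient sketch for a monotone VI over Omega = R^n_+.
# F(u) = M u + q with M positive semidefinite; step size tau in (0, 1/L),
# where L = ||M|| is the Lipschitz constant of F.
import numpy as np

def extragradient(M, q, u0, tau, iters=1000):
    F = lambda u: M @ u + q
    proj = lambda u: np.maximum(u, 0.0)   # projection onto the nonnegative orthant
    u = u0.copy()
    for _ in range(iters):
        u_bar = proj(u - tau * F(u))      # prediction step
        u = proj(u - tau * F(u_bar))      # correction step with fixed length tau
    return u
```

The projection and contraction methods studied in the thesis keep the same prediction step but replace the fixed-length correction with a step length computed from the prediction ('optimal' in the contraction sense), which is consistent with the reported roughly twofold speedup over the extragradient method.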
Keywords/Search Tags: Convex programming, augmented Lagrangian method, alternating direction method, proximal point algorithm, projection and contraction methods, primal-dual algorithms, O(1/t) convergence rate