
Convergence Analysis Of Subgradient Algorithms On Riemannian Manifolds With Applications

Posted on: 2015-06-04
Degree: Doctor
Type: Dissertation
Country: China
Candidate: X M Wang
Full Text: PDF
GTID: 1220330464472388
Subject: Nonlinear optimization
Abstract/Summary:
In this work, we discuss two types of convex optimization problems on Riemannian manifolds: the problem of minimizing a convex function, and the convex feasibility problem (which consists in finding a point in the intersection of finitely many closed weakly convex sets in a Riemannian manifold). We introduce the subgradient algorithm and the cyclic subgradient projection algorithm for these two problems, respectively. Our work is organized as follows.

In Chapter 2, we establish the basic inequalities on Riemannian manifolds.

In Chapter 3, we study the subgradient algorithm for minimizing a convex function on a Riemannian manifold, which was proposed by Ferreira et al. in [37] for Riemannian manifolds of nonnegative sectional curvature. We assume that the underlying manifold M has sectional curvature bounded from below and that the objective function f: M → R is a proper convex function. We establish convergence properties of the subgradient algorithm under two step size rules: the diminishing step size rule and the dynamic step size rule. When the objective function is the sum of a large number of component functions, we propose the incremental subgradient algorithm and discuss its convergence properties. As an application, we apply the subgradient algorithm and the incremental subgradient algorithm to the problem of finding Riemannian Lp centers of mass. Based on our analysis and computational experiments, both algorithms appear promising and effective for calculating Riemannian Lp centers of mass. An interesting finding is that the incremental subgradient algorithm is more effective than the subgradient algorithm.

In Chapter 4, we investigate the cyclic projection algorithm for solving convex feasibility problems on Riemannian manifolds.
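To make the Chapter 3 setting concrete, here is a minimal sketch of the subgradient algorithm with the diminishing step size rule, applied to the L2 center of mass on the unit sphere S^2 (a manifold of constant positive curvature). The sphere, the specific test points, and the step constant c are illustrative choices, not taken from the dissertation; the update x ← exp_x(−t_k g) with t_k = c/(k+1) is the general pattern the abstract describes.

```python
import numpy as np

def exp_map(x, v):
    """Exponential map on the unit sphere: follow the geodesic from x along tangent v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return x
    return np.cos(nv) * x + np.sin(nv) * v / nv

def log_map(x, y):
    """Logarithm map on the sphere: tangent vector at x pointing toward y,
    with length equal to the geodesic distance d(x, y) = arccos(<x, y>)."""
    u = y - np.dot(x, y) * x                      # project y onto the tangent space at x
    nu = np.linalg.norm(u)
    if nu < 1e-12:
        return np.zeros_like(x)
    theta = np.arccos(np.clip(np.dot(x, y), -1.0, 1.0))
    return theta * u / nu

def riemannian_subgradient(points, x0, iters=200, c=1.0):
    """Subgradient method for f(x) = (1/n) * sum_i d(x, a_i)^2 on the sphere,
    using the diminishing step size rule t_k = c / (k + 1).
    The Riemannian gradient of f at x is -(1/n) * sum_i log_x(a_i)."""
    x = x0 / np.linalg.norm(x0)
    for k in range(iters):
        g = -np.mean([log_map(x, a) for a in points], axis=0)
        x = exp_map(x, -(c / (k + 1)) * g)        # geodesic step along -g
    return x
```

An incremental variant, as in Chapter 3, would instead step along the gradient of one component d(x, a_i)^2 at a time, cycling through the data points.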
For the special case where each convex set is given as a sublevel set of a convex function, we study the cyclic subgradient projection algorithm, which was first proposed by Bento et al. in [12] for Riemannian manifolds of nonnegative sectional curvature. Under the assumptions that the underlying manifold M has sectional curvature bounded from below and that the functions involved are convex, we establish the convergence of the algorithm. If, additionally, a Slater-type condition is satisfied, then the algorithm converges linearly. We also suggest a step size rule that yields finite convergence. Thereafter, we show that the linear convergence result is preserved if the problem satisfies a bounded error bound condition instead of the Slater condition. To the best of our knowledge, this is the first study of the linear convergence of the subgradient algorithm for the convex feasibility problem on Riemannian manifolds. Finally, the Hadamard-manifold version of the cyclic projection algorithm is introduced, its convergence and linear convergence results are established, and a numerical example is provided to illustrate the theoretical results.
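The cyclic subgradient projection scheme of Chapter 4 can be sketched in its Euclidean analogue (the Riemannian version replaces the straight-line update by a geodesic step exp_x(−t g)). The feasibility problem is to find x with f_i(x) ≤ 0 for all i; whenever a constraint is violated, a Polyak-type subgradient projection step is taken. The two-ball example in the usage below, the step formula t = f_i(x)/‖g‖², and the sweep/tolerance parameters are illustrative assumptions, not the dissertation's exact construction.

```python
import numpy as np

def cyclic_subgradient_projection(fs, subgrads, x0, sweeps=100, tol=1e-8):
    """Cyclic subgradient projection for the feasibility problem
    f_i(x) <= 0, i = 1, .., m (Euclidean sketch).
    fs: list of convex constraint functions; subgrads: matching subgradient oracles.
    Whenever f_i(x) > tol, take the Polyak step x <- x - (f_i(x) / ||g||^2) * g,
    which projects x onto the halfspace {y : f_i(x) + <g, y - x> <= 0}."""
    x = np.asarray(x0, dtype=float)
    for _ in range(sweeps):
        moved = False
        for f, sg in zip(fs, subgrads):
            val = f(x)
            if val > tol:                          # constraint i is violated
                g = sg(x)
                x = x - (val / np.dot(g, g)) * g   # subgradient projection step
                moved = True
        if not moved:                              # a full sweep with no violation: feasible
            break
    return x
```

For instance, with f_i(x) = ‖x − c_i‖ − r_i (distance to a ball of center c_i and radius r_i), the Polyak step reduces to the exact metric projection onto that ball, and cycling through the constraints drives the iterate into the intersection when it is nonempty.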
Keywords/Search Tags: Riemannian manifold, Sectional curvature, Convex optimization, Convex feasibility problem, Slater condition, Bounded error bound, Subgradient algorithm, Subgradient projection method, Projection method