
Accelerating A Family Of Common Convex Optimization Algorithms

Posted on: 2017-06-20
Degree: Master
Type: Thesis
Country: China
Candidate: L W Li
Full Text: PDF
GTID: 2370330569498563
Subject: Mathematics

Abstract/Summary:
Common convex optimization algorithms, such as the forward-backward splitting algorithm, the alternating direction method of multipliers, and the Douglas-Rachford splitting algorithm, can all be unified under the same theoretical framework, namely proximal splitting. Because of their excellent computational properties and wide applicability, these algorithms play a significant role in the theoretical analysis and numerical solution of convex optimization problems. They have been widely applied and perform well in many practical fields, such as image processing, machine learning, signal processing, and automatic control. However, many practical problems deviate from the ideal model and are not easy to solve, especially ill-conditioned problems with enormous condition numbers. Although convex optimization algorithms can still solve such ill-conditioned problems, their convergence slows down and they become time-consuming. For these problems, it is therefore necessary and practically meaningful to apply suitable preconditioning methods to accelerate convergence. This thesis focuses on preconditioning the common convex optimization algorithms for ill-conditioned problems and discusses the following issues:

1. Preconditioning methods for the forward-backward splitting algorithm on ill-conditioned problems, and their extension to the alternating direction method of multipliers and the Douglas-Rachford splitting algorithm. The thesis points out that equilibration theory and the Sinkhorn-Knopp algorithm offer a simpler and more concise way to compute the preconditioning matrix, and make the derivation easier to follow.

2. The influence of different norms on the Sinkhorn-Knopp algorithm. When computing the preconditioning matrix with the Sinkhorn-Knopp algorithm, we use three different norms, the 1-norm, the 2-norm, and the ∞-norm, and conduct numerical experiments for each. Based on the numerical results, we analyze how the choice of norm affects the computation of the preconditioning matrix. In addition, we study how to precondition when the model contains a non-smooth function or a rank-deficient matrix.
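To illustrate the equilibration step discussed above, the following is a minimal sketch of a Sinkhorn-Knopp-style alternating row/column scaling with a selectable vector norm (1-, 2-, or ∞-norm). It is not the thesis's implementation: the function name, iteration count, stopping tolerance, and the synthetic test matrix are illustrative assumptions.

```python
# Sketch of diagonal equilibration by Sinkhorn-Knopp-style alternating scaling.
# The chosen vector norm (1, 2, or infinity) controls how row/column sizes are
# measured, mirroring the norm comparison described in the abstract.
import numpy as np

def sinkhorn_knopp_equilibrate(A, norm_ord=2, iters=50, eps=1e-12):
    """Return diagonal scalings d_r, d_c so that diag(d_r) @ A @ diag(d_c)
    has row and column norms close to 1 in the chosen norm."""
    m, n = A.shape
    d_r = np.ones(m)
    d_c = np.ones(n)
    for _ in range(iters):
        # Rescale rows of the currently scaled matrix to unit norm.
        B = (d_r[:, None] * A) * d_c[None, :]
        row_norms = np.linalg.norm(B, ord=norm_ord, axis=1)
        d_r /= np.maximum(row_norms, eps)
        # Then rescale columns of the updated matrix to unit norm.
        B = (d_r[:, None] * A) * d_c[None, :]
        col_norms = np.linalg.norm(B, ord=norm_ord, axis=0)
        d_c /= np.maximum(col_norms, eps)
    return d_r, d_c

# Usage: equilibrate a badly column-scaled matrix and compare condition numbers.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 100)) * np.logspace(0, 6, 100)  # ill-conditioned scaling
d_r, d_c = sinkhorn_knopp_equilibrate(A, norm_ord=np.inf)
A_eq = (d_r[:, None] * A) * d_c[None, :]
print(np.linalg.cond(A), np.linalg.cond(A_eq))
```

In a preconditioned splitting method, the diagonal factors d_r and d_c would typically replace the single scalar step size, so that the algorithm effectively runs on the better-conditioned matrix A_eq rather than on A itself.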
Keywords/Search Tags:convex optimization algorithm, preconditioning, accelerating convergence, equilibration, ill-conditioned problem, Sinkhorn-Knopp algorithm