
Fast Iht Methods For L0 Regularized Convex Optimization Problems

Posted on: 2019-12-12
Degree: Master
Type: Thesis
Country: China
Candidate: F Wu
Full Text: PDF
GTID: 2370330566996442
Subject: Basic mathematics
Abstract/Summary:
Sparse optimization problems have a wide variety of applications, such as signal processing, image denoising, and visual coding. The main goal of these problems is to find a solution in which most of the elements are zero. Therefore, an optimization model with a cardinality term is the most direct and natural model for sparse optimization problems.

In this thesis, we first propose an accelerated projected gradient algorithm for solving constrained convex optimization problems. We show that when the objective function is strongly convex, the worst-case error of the algorithm decays exponentially, i.e., the algorithm converges linearly. We then propose an accelerated IHT (iterative hard thresholding) algorithm for solving the l0 regularized convex optimization problem with box constraints. We first show that there exists a threshold such that, if the extrapolation coefficients are chosen below it, the proposed algorithm reduces after finitely many iterations to the accelerated projected gradient algorithm applied to a convex optimization problem. Under an error bound condition on the data-fitting function, we prove that the iterate sequence converges R-linearly to a local minimizer of the problem. Moreover, the corresponding sequence of objective function values also converges R-linearly.
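The accelerated IHT iteration described above can be sketched as follows. This is a minimal illustration, not the thesis's exact method: it assumes a least-squares data-fitting term f(x) = ½‖Ax − b‖², a fixed extrapolation coefficient beta, and a simplified prox step (hard thresholding followed by box projection); the thesis's proximal map for the l0 term plus box constraint, and its choice of extrapolation coefficients, may differ.

```python
import numpy as np

def accelerated_iht(A, b, lam, lo, hi, beta=0.3, iters=200):
    """Sketch of an extrapolated (accelerated) IHT iteration for
        min 0.5*||A x - b||^2 + lam * ||x||_0   s.t.  lo <= x <= hi.
    Assumed illustration only; the exact prox step in the thesis may differ."""
    n = A.shape[1]
    t = 1.0 / np.linalg.norm(A, 2) ** 2     # step size 1/L, L = ||A||_2^2
    thresh = np.sqrt(2.0 * lam * t)         # hard-threshold level for the l0 prox
    x_prev = x = np.zeros(n)
    for _ in range(iters):
        y = x + beta * (x - x_prev)         # extrapolation step
        z = y - t * A.T @ (A @ y - b)       # gradient step on the data-fitting term
        z[np.abs(z) <= thresh] = 0.0        # hard thresholding (l0 proximal step)
        x_prev, x = x, np.clip(z, lo, hi)   # projection onto the box [lo, hi]
    return x
```

With beta = 0 this reduces to plain IHT; the thesis's result says that for sufficiently small extrapolation coefficients the support of the iterates eventually freezes, after which the method behaves like an accelerated projected gradient method on a convex subproblem.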
Keywords/Search Tags:sparse optimization problem, accelerated IHT algorithm, l0 regularization, linear convergence