
Theory And Algorithms Of Sparse Recovery Via Non-convex Optimization

Posted on: 2017-04-01
Degree: Doctor
Type: Dissertation
Country: China
Candidate: L M Che
Full Text: PDF
GTID: 1310330536958808
Subject: Information and Communication Engineering
Abstract/Summary:
Signals of interest ubiquitously possess inherent low-dimensional structures, such as the sparsity of vectors and the low rank of matrices. How to exploit these structures effectively has been a hot topic in the signal processing community over the last decade. Among sparse recovery algorithms, those based on non-convex optimization perform best, but they lack guarantees of convergence from the initial point to the sparse signal. In addition, noise and perturbations are inevitable in practical applications, so the denoising performance of sparse recovery algorithms is critical; yet there has been no thorough comparison of the denoising performance of greedy sparse recovery algorithms against that of oracle recovery. This thesis addresses these two crucial problems.

First, the global and local optimality of l_p minimization (0 < p < 1) is analyzed. A class of sparsity-inducing penalties, which can be used to approximate the l_p "norm", is introduced together with a characterization of their non-convexity. It is proved that the global optimum of the corresponding non-convex optimization problem approaches the global optimum of l_p minimization. To ensure that the sparse signal is the only local optimum of the non-convex problem within its neighborhood, requirements on the non-convexity and on the radius of the neighborhood are established, which lays the foundation for the convergence analysis of algorithms based on non-convex optimization.

Then, algorithms for the non-convex optimization problems are devised for different scenarios. It is proved that if the non-convexity is inversely proportional to the distance between the initial point and the sparse signal, these algorithms are guaranteed to converge to the sparse signal, which solves the first crucial problem. The required number of iterations and an upper bound on the recovery error are also derived. Numerical simulations verify the theoretical results and indicate that the algorithms can recover signals with more nonzero elements in less running time, with better denoising performance.

Furthermore, the results for sparse recovery are generalized to low-rank recovery. A class of low-rank-inducing penalties is introduced with a characterization of their non-convexity, and an algorithm is proposed to solve the corresponding non-convex optimization problem. Theoretical analysis derives a sufficient condition under which the algorithm is guaranteed to converge from the initial point to the low-rank matrix. Numerical simulations verify the theoretical results and indicate that the proposed algorithm performs among the best.

Finally, an algorithm is proposed that incorporates the idea of backtracking into the matching pursuit framework for sparse recovery. When the measurement vector, the sensing matrix, and the sparse signal are all contaminated, an upper bound on the recovery error of the proposed algorithm is derived. By comparing this upper bound with a lower bound on the recovery error of oracle recovery, the proposed algorithm is proved to achieve oracle-order denoising performance, which solves the second crucial problem.

In summary, this thesis provides convergence guarantees for high-performing algorithms based on non-convex optimization and derives the denoising performance of a greedy sparse recovery algorithm. This work helps promote the practical use of algorithms based on non-convex optimization and develops the theory of non-convex optimization.
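To make the l_p minimization setting concrete, the following is a minimal sketch of one standard approach to the non-convex problem min ||x||_p^p subject to Ax = y: iteratively reweighted least squares with a smoothing parameter that is gradually annealed. This is an illustration of the problem class only, not the specific penalties or algorithms proposed in the thesis; the function name, parameter choices, and annealing schedule are all assumptions made for the example.

```python
import numpy as np

def irls_lp(A, y, p=0.5, n_iter=100):
    """Sketch of iteratively reweighted least squares for l_p minimization
    (0 < p < 1) subject to A x = y. Illustrative only; this is NOT the
    thesis's algorithm, just a common baseline for the same problem class."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]  # least-norm starting point
    eps = 1.0                                  # smoothing, annealed below
    for _ in range(n_iter):
        # Weights w_i ~ (x_i^2 + eps)^{p/2 - 1} so that sum w_i x_i^2
        # approximates the smoothed l_p objective sum (x_i^2 + eps)^{p/2}.
        w = (x**2 + eps) ** (p / 2 - 1)
        W_inv = np.diag(1.0 / w)
        # Weighted least-norm solution: x = W^{-1} A^T (A W^{-1} A^T)^{-1} y,
        # which stays feasible (A x = y) at every iteration.
        x = W_inv @ A.T @ np.linalg.solve(A @ W_inv @ A.T, y)
        eps = max(0.7 * eps, 1e-8)             # anneal the smoothing
    return x
```

Each iteration solves a weighted quadratic surrogate of the smoothed l_p objective; small entries receive large weights and are pushed toward zero, which is how the non-convex penalty induces sparsity.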
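The idea of adding backtracking to matching pursuit can be illustrated with a subspace-pursuit-style sketch: each iteration first expands the candidate support using residual correlations, then prunes it back to k entries, so wrongly selected atoms can be discarded later. This is a generic sketch of the backtracking mechanism, assuming a known sparsity level k; the thesis's proposed algorithm and its perturbation analysis may differ in the details.

```python
import numpy as np

def sp_recover(A, y, k, n_iter=20):
    """Sketch of a greedy matching pursuit with backtracking (subspace-
    pursuit style). Hypothetical illustration of the expand-then-prune
    idea; not the thesis's exact algorithm."""
    n = A.shape[1]
    # Initial support: k columns most correlated with the measurements.
    support = np.argsort(np.abs(A.T @ y))[-k:]
    x = np.zeros(n)
    x[support] = np.linalg.lstsq(A[:, support], y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - A @ x                                   # current residual
        # Expand: merge the support with the k columns most correlated
        # with the residual (the candidate pool for backtracking).
        cand = np.union1d(support, np.argsort(np.abs(A.T @ r))[-k:])
        b = np.zeros(n)
        b[cand] = np.linalg.lstsq(A[:, cand], y, rcond=None)[0]
        # Prune: keep only the k largest entries -- the backtracking step
        # that lets previously chosen wrong atoms be dropped.
        new_support = np.argsort(np.abs(b))[-k:]
        x = np.zeros(n)
        x[new_support] = np.linalg.lstsq(A[:, new_support], y, rcond=None)[0]
        if np.array_equal(np.sort(new_support), np.sort(support)):
            break                                        # support stabilized
        support = new_support
    return x
```

In the noiseless case this style of algorithm typically recovers the support exactly; the thesis's contribution concerns its error bound when the measurements, the sensing matrix, and the signal are all perturbed.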
Keywords/Search Tags:sparse recovery, non-convex optimization, convergence analysis, low rank recovery, denoising performance