
Theoretical Analysis Of A Few Classic Algorithms In Sparse Approximation

Posted on: 2016-10-09
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Y Cai
Full Text: PDF
GTID: 1220330464472383
Subject: Applied Mathematics
Abstract/Summary:
Compressed sensing and low rank matrix recovery have been active and burgeoning topics in applied mathematics and engineering over the past several years. Their central theme is that a sparse signal or a low rank matrix can be recovered from a small number of nonadaptive linear measurements. Compressed sensing and low rank matrix recovery have a wide range of applications, including reconstruction of dynamic MRI, radar, computational biology, machine learning, quantum physics, and recommender systems. In this dissertation, we give a systematic theoretical analysis of a few classical algorithms in compressed sensing and low rank matrix recovery. Our main results are as follows.

First, we consider the compressed data separation problem, in which a signal composed of two distinct subcomponents is to be recovered from a small number of linear measurements. We investigate the dual-frame-based l1 split-analysis approach for recovering the two subcomponents, which are sparse in two general frames. We show that, when the measurement matrix is a Weibull random matrix (rather than Gaussian) and the two general frames satisfy a mutual coherence property, the dual-frame-based l1 split-analysis approach exactly recovers the two subcomponents with high probability.

Next, we study the performance of the projected gradient descent algorithm for solving the nonconvex Schatten-p quasi-norm minimization problem (0 < p < 1). Based on the matrix restricted isometry property (M-RIP), we establish a convergence guarantee and an error bound for this algorithm, and show that it is stable to noise with a linear convergence rate.

Last, we study the convergence and stability of iteratively reweighted least squares minimization for recovering a matrix from noisy linear measurements (the IRLS-M algorithm for short), which extends the known theoretical results for iteratively reweighted least squares minimization in the noiseless low rank matrix recovery problem. We not only rigorously prove the convergence of the IRLS-M algorithm in the noisy case, but also prove that the IRLS-M algorithm is stable to noise when the linear measurement map satisfies the matrix restricted isometry property.
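To illustrate the iteration structure behind the projected gradient descent result summarized above, the following Python sketch recovers a low rank matrix from noisy linear measurements. It is a minimal sketch, not the dissertation's algorithm: the projection step uses a simple rank-r truncated SVD in place of the Schatten-p quasi-norm machinery, and the Gaussian measurement operator A, the step size, and the synthetic test problem are illustrative assumptions.

import numpy as np

def project_rank_r(X, r):
    """Project onto the set of matrices of rank at most r via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[r:] = 0.0
    return (U * s) @ Vt

def projected_gradient_descent(A, b, shape, r, step=1.0, n_iter=200):
    """Recover a low-rank matrix X from measurements b = A @ vec(X).

    A     : (m, n1*n2) measurement matrix (rows are linear functionals)
    b     : (m,) observed measurements, possibly noisy
    shape : (n1, n2) of the unknown matrix
    r     : target rank used in the projection step
    """
    n1, n2 = shape
    X = np.zeros((n1, n2))
    for _ in range(n_iter):
        residual = A @ X.ravel() - b                 # gradient of 0.5*||A vec(X) - b||^2
        grad = (A.T @ residual).reshape(n1, n2)
        X = project_rank_r(X - step * grad, r)       # gradient step, then low-rank projection
    return X

# Tiny synthetic example: a rank-2 20x20 matrix and noisy Gaussian measurements.
rng = np.random.default_rng(0)
n1, n2, r, m = 20, 20, 2, 320
X_true = rng.standard_normal((n1, r)) @ rng.standard_normal((r, n2))
A = rng.standard_normal((m, n1 * n2)) / np.sqrt(m)
b = A @ X_true.ravel() + 0.01 * rng.standard_normal(m)
X_hat = projected_gradient_descent(A, b, (n1, n2), r, step=0.9, n_iter=300)
print(np.linalg.norm(X_hat - X_true) / np.linalg.norm(X_true))  # relative recovery error

Under a matrix restricted isometry condition of the kind invoked above, iterations of this form are known to contract linearly toward the true low rank matrix and to remain stable under measurement noise, which is the flavor of guarantee the dissertation establishes for the Schatten-p setting.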
Keywords/Search Tags:Sparse approximation, Compressed sensing, Low rank matrix recovery, Matrix restricted isometry property, Weibull random matrix, Projected gradient descent algorithm, Iteratively reweighted least squares algorithm