
Nonmonotone Step Length Alternating Minimization Algorithm For Solving Robust Principal Component Analysis

Posted on: 2016-08-25
Degree: Master
Type: Thesis
Country: China
Candidate: J X Zhao
Full Text: PDF
GTID: 2180330473961803
Subject: Mathematics
Abstract/Summary:
Robust principal component analysis, also known as low-rank matrix recovery, principal component pursuit, or rank-sparsity incoherence for matrix decomposition, is a convex optimization problem proposed in recent years and derived from compressed sensing theory; its goal is to recover an original low-rank matrix from a matrix contaminated by sparse, grossly corrupted observations. The theory has been widely applied to image denoising, video processing, web search, bioinformatics, and other areas. After surveying the state of research in China and abroad, we give a comprehensive summary and in-depth analysis of the current mainstream algorithms and examine the advantages and disadvantages of existing theories and techniques. The main contributions of this thesis are as follows.

1. We propose a nonmonotone step length alternating minimization algorithm (NSA), which uses the alternating minimization idea to solve the relaxed model containing dense small Gaussian noise. First, Taylor expansion, the singular value decomposition (SVD), and the shrinkage operator are used to derive the iterative direction matrices for the low-rank matrix and for the sparse, grossly corrupted observation matrix; four lemmas on monotonicity and descent directions are established to support this part theoretically. Second, the nonmonotone line search method is generalized to matrices so that the step length along each direction is determined dynamically (see the illustrative sketch below). Third, a continuation technique is incorporated into NSA to accelerate convergence.

2. In theory, the global convergence of the NSA algorithm is proved. Experimentally, we compare NSA with the current leading algorithms: the inexact augmented Lagrange multiplier method (IALM), the exact augmented Lagrange multiplier method (EALM), and the accelerated proximal gradient method with line search (APGL). Without dense small Gaussian noise, the running time of NSA differs little from that of IALM, currently the most efficient algorithm. With dense small Gaussian noise, NSA runs noticeably faster than APGL, the most efficient algorithm in that setting, and the relative error of its recovered low-rank matrix is slightly better than that of APGL.
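The abstract names the main building blocks of NSA (an SVD-based low-rank update, the shrinkage operator, and a nonmonotone line search generalized to matrices) without giving formulas. The Python sketch below shows how such pieces typically fit together for the relaxed model min ||L||_* + λ||S||_1 + (μ/2)||D − L − S||_F². It is a minimal illustration under assumed parameter choices and an assumed acceptance rule, not the thesis's actual NSA algorithm; the function names (rpca_alt_min, soft_threshold, svt) are introduced here only for illustration.

```python
import numpy as np

def soft_threshold(X, tau):
    """Entrywise shrinkage (soft-thresholding) operator."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def svt(X, tau):
    """Singular value thresholding: shrink the singular values of X by tau."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * soft_threshold(s, tau)) @ Vt

def objective(D, L, S, lam, mu):
    """Relaxed RPCA objective: nuclear norm + l1 penalty + quadratic data-fit term."""
    return (np.linalg.norm(L, 'nuc') + lam * np.abs(S).sum()
            + 0.5 * mu * np.linalg.norm(D - L - S, 'fro') ** 2)

def rpca_alt_min(D, lam=None, mu=1.0, max_iter=200, tol=1e-7, memory=5):
    """Alternating minimization with a simple nonmonotone step-length rule (illustrative)."""
    m, n = D.shape
    lam = 1.0 / np.sqrt(max(m, n)) if lam is None else lam  # common default weight
    L = np.zeros_like(D)
    S = np.zeros_like(D)
    hist = [objective(D, L, S, lam, mu)]
    for _ in range(max_iter):
        # Direction for the low-rank block: SVT of the current residual, relative to L.
        dL = svt(D - S, 1.0 / mu) - L
        # Direction for the sparse block: soft-thresholding of the residual after the L step.
        dS = soft_threshold(D - (L + dL), lam / mu) - S
        # Nonmonotone acceptance: compare the trial objective with the max of recent objectives.
        ref = max(hist[-memory:])
        alpha = 1.0
        while alpha > 1e-4:
            f_trial = objective(D, L + alpha * dL, S + alpha * dS, lam, mu)
            decrease = 1e-4 * alpha * (np.linalg.norm(dL, 'fro')**2 + np.linalg.norm(dS, 'fro')**2)
            if f_trial <= ref - decrease:
                break
            alpha *= 0.5  # backtrack the step length along both directions
        L, S = L + alpha * dL, S + alpha * dS
        hist.append(objective(D, L, S, lam, mu))
        if abs(hist[-1] - hist[-2]) <= tol * max(1.0, abs(hist[-2])):
            break
    return L, S
```

In this sketch the block minimizers themselves serve as search directions, so the unit step is usually accepted; the nonmonotone rule only backtracks when the trial objective exceeds the maximum of the last few iterates. For example, on a synthetic D = L0 + S0 plus small Gaussian noise, rpca_alt_min(D) returns estimates of the low-rank and sparse components.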
Keywords/Search Tags: robust principal component analysis, alternating minimization, nonmonotone line search, low-rank matrix