
The Design And Research On Proximal Splitting Methods For Some Sparse Optimization Problems

Posted on: 2019-05-08    Degree: Doctor    Type: Dissertation
Country: China    Candidate: Y Y Li    Full Text: PDF
GTID: 1360330593450296    Subject: Mathematics
Abstract/Summary:
Sparse optimization problems arise in denoising, model selection, image processing, and many other applications. This thesis proposes algorithms for general sparse models based on the LASSO, group LASSO, and sparse group LASSO, focusing on objective functions that are the sum of a smooth part and a non-smooth part. In detail:

Firstly, we propose a modified proximal gradient method built on the original proximal gradient method. The main idea is to compute the search direction from the proximal gradient step with a constant step-size. We further design a self-adaptive step-size along this search direction and prove a Q-linear convergence rate, which improves on the convergence rate of the original proximal gradient method. Numerical examples demonstrate the advantage of our algorithm in CPU time: the modified method uses a constant step-size, whereas the original proximal gradient method often requires a line search.

Secondly, because the first modified proximal gradient method is not monotonically decreasing, we propose a new modified proximal gradient method. We design a descent direction by combining the Lipschitz continuity of the gradient of the smooth part with the proximal gradient step; this direction can be computed easily. Moreover, the step-size is constant and optimal, and the convergence rate of the new method is Q-linear. Numerical experiments confirm the monotonicity and the advantage of the new modified proximal gradient method.

Building on the above two parts, our main aim is to obtain sparse and group sparse solutions. We propose a general sparsity constrained optimization model, of which the sparse and group sparse optimization problems are special cases, and solve it with an alternating minimization method. Using the necessary optimality conditions and the projected gradient method, we prove the global convergence of the proposed method; furthermore, if the objective function is convex, the convergence rate is sub-linear.

Finally, we propose a Difference-of-Convex (DC) algorithm to solve a class of sparse optimization problems. By linearly approximating the differentiable concave part, the problem can be handled by convex programming. We also prove the convergence of the algorithm under some standard assumptions.
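The abstract does not include code; the following is a minimal illustrative sketch of the baseline proximal gradient (ISTA) iteration for the LASSO problem that the first two chapters modify. The function names, the fixed 1/L step-size, and the stopping rule are assumptions made for illustration only and are not the thesis's modified algorithms.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (componentwise soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, max_iter=500, tol=1e-8):
    """Baseline proximal gradient sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1.

    Uses the constant step-size 1/L, where L = ||A^T A||_2 is the Lipschitz
    constant of the gradient of the smooth part (an assumption for this sketch;
    the thesis's methods refine the step-size and search direction).
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)           # gradient of the smooth part
        x_new = soft_threshold(x - step * grad, step * lam)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x
```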
Keywords/Search Tags: Proximal gradient method, Modified proximal gradient method, Sparse optimization, Sparsity constrained problems, Alternating minimization