
Research On Smoothed Trust Region Algorithm For A Class Of Group Sparse Optimization Problem

Posted on: 2024-07-10
Degree: Master
Type: Thesis
Country: China
Candidate: Y Y Su
Full Text: PDF
GTID: 2530307130970039
Subject: Mathematics
Abstract/Summary:
Group sparse optimization is a class of sparse optimization problems with group structure: the structural relationships among variables serve as prior information for grouping the variables, which improves the interpretability and predictive power of the results. Owing to the group sparse regularization term, group sparse optimization problems are nonconvex, nonsmooth, non-Lipschitz, and even discontinuous NP-hard problems. Existing results show that a nonconvex relaxation of the group sparse optimization problem using a folded concave penalty function can yield an unbiased solution of the original problem with the oracle property, and that under mild conditions a second-order directional stationary point is a strong local minimizer satisfying the second-order growth condition. However, no algorithm has been shown to converge to a second-order directional stationary point of the group sparse nonconvex relaxation problem.

Building on this work, we study a nonconvex relaxation model of the group sparse optimization problem in which the penalty term is a folded concave penalty function applied to the group sparse penalty. Applying a continuous nonconvex relaxation with folded concave penalty functions to all the group sparse penalties yields a composite nonsmooth nonconvex optimization model. Taking the smooth approximation of the relaxation problem, obtained by a smoothing method, as the objective function and solving the subproblems by a trust region method, we propose a group smoothing trust region algorithm (GSTR) that approaches a second-order directional stationary point of the relaxation problem through approximate solutions of the smoothing approximation problem. We prove that the nonzero groups of any accumulation point of the sequence generated by the GSTR algorithm have the lower bound property, and that the accumulation point coincides with a second-order directional stationary point of the group sparse relaxation problem. We performed signal recovery experiments in the presence of noise, comparing GSTR with the spectral projected gradient algorithm (SPGL1) and the group coordinate descent algorithm (GCD).

This thesis further investigates the nonconvex relaxation model for the sparse plus group sparse optimization problem, in which the penalty term contains both a sparse penalty and a group sparse penalty. As continuous relaxations, folded concave penalty functions are applied to both the sparse penalty and the group sparse penalty, which yields a composite nonsmooth nonconvex optimization model. To characterize the optimality of the nonconvex relaxation problem, the directional derivative and the directional stationary point are introduced, and the properties of directional stationary points and their local optimality are analyzed. To compute the directional stationary points of the relaxation model, a smoothing approximation problem is constructed, and it is proved that the stationary points of the smoothing problem converge to directional stationary points of the relaxation problem, providing a theoretical guarantee for computing the directional stationary points of the relaxation problem by smoothing methods.
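To make the folded concave group penalty concrete, the following is a minimal sketch of one common instance, an MCP-type penalty applied group-wise to the Euclidean norms of the groups. This is an illustration only: the specific penalty, the parameter names `lam` and `gamma`, and the function `group_mcp` are assumptions, not the exact formulation used in the thesis.

```python
import numpy as np

def group_mcp(x, groups, lam=1.0, gamma=3.0):
    """Folded concave (MCP-type) penalty on group norms.

    For each group g with t = ||x_g||_2, the scalar penalty is
        phi(t) = lam*t - t^2/(2*gamma)   if t <= gamma*lam,
        phi(t) = gamma*lam^2/2           otherwise,
    i.e. it grows like lam*t near zero (promoting group sparsity)
    and flattens out for large t (reducing estimation bias).
    """
    total = 0.0
    for g in groups:
        t = np.linalg.norm(x[g])
        if t <= gamma * lam:
            total += lam * t - t * t / (2.0 * gamma)
        else:
            total += gamma * lam * lam / 2.0
    return total
```

The flat region for large group norms is what makes the penalty nonconvex but unbiased: once a group is clearly nonzero, increasing its magnitude incurs no additional penalty.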
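The two ingredients the abstract combines, a smoothing approximation of a nonsmooth norm and a trust region step, can be sketched in isolation. This is not the thesis's GSTR algorithm; it is a minimal illustration under simple assumptions: the smoothing function `sqrt(t^2 + mu^2) - mu` is one standard choice, and the trust region step shown is the basic Cauchy point.

```python
import numpy as np

def smoothed_norm(v, mu):
    """Smooth approximation of the Euclidean norm ||v||.

    Differentiable everywhere for mu > 0, and converges to ||v||
    as the smoothing parameter mu tends to 0.
    """
    return np.sqrt(v @ v + mu * mu) - mu

def cauchy_trust_region_step(grad, hess, delta):
    """Cauchy-point step for the quadratic model
        m(p) = grad @ p + 0.5 * p @ hess @ p
    minimized along -grad subject to ||p|| <= delta."""
    g_norm = np.linalg.norm(grad)
    if g_norm == 0.0:
        return np.zeros_like(grad)
    gBg = grad @ hess @ grad
    if gBg <= 0.0:
        tau = 1.0  # model decreases along -grad out to the boundary
    else:
        tau = min(g_norm ** 3 / (delta * gBg), 1.0)
    return -tau * (delta / g_norm) * grad
```

In a smoothing trust region scheme of the kind the abstract describes, steps like this would be taken on the smoothed objective while `mu` is driven toward zero, so that stationary points of the smoothing problem approach stationary points of the nonsmooth relaxation.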
Keywords/Search Tags: Group sparse optimization, Sparse group sparse optimization, Nonconvex and nonsmooth optimization, Smoothing trust region method, Folded concave penalty function, Directional stationary point, Second-order directional stationary point