The zero-norm regularized optimization problem, an important nonconvex and nonsmooth optimization model for obtaining sparse solutions, has extensive and important applications in statistics, signal and image processing, machine learning, bioinformatics, finance, and many other fields. This paper focuses on a convex relaxation approach for a class of zero-norm regularized composite optimization problems. Inspired by the works [9,24], we start from the equivalent MPEC (mathematical program with equilibrium constraints) reformulation of the zero-norm regularized composite optimization problem, establish a global exact penalty for this MPEC, and apply alternating minimization to the exact penalty problem, thereby obtaining a multi-stage convex relaxation algorithm for this class of nonconvex and nonsmooth problems. Under a suitable restricted strong convexity condition, we establish $\ell_2$-norm error bounds for the optimal solution of the convex relaxation at each stage and quantify the reduction of the error bound between two successive stages. Finally, the convex relaxation algorithm is applied to zero-norm regularized logistic regression. By solving each convex relaxation subproblem with an augmented Lagrangian method combined with a semismooth Newton method, we give an implementation of the algorithm and compare it with the large-scale sparse logistic regression solver Lassplore on synthetic and real data. The numerical results show that the proposed convex relaxation algorithm has significant advantages in prediction performance, although its computation time exceeds that of Lassplore, which further confirms the theoretical results.
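
To make the MPEC reformulation and exact penalty concrete, the following display sketches the standard construction; the precise model treated in the paper may differ, and the composite loss $f$, regularization parameter $\lambda$, and penalty parameter $\rho$ are notational assumptions here. Writing $e$ for the all-ones vector,
\[
\min_{x\in\mathbb{R}^n}\; f(x) + \lambda\,\|x\|_0
\;\Longleftrightarrow\;
\min_{x\in\mathbb{R}^n,\; w\in[0,1]^n}\; f(x) + \lambda\,\langle e,\, e-w\rangle
\quad \text{s.t.}\quad \langle w,\, |x|\rangle = 0,
\]
since the constraint forces $w_i=0$ whenever $x_i\neq 0$. The associated global exact penalty problem, for sufficiently large $\rho>0$, is
\[
\min_{x\in\mathbb{R}^n,\; w\in[0,1]^n}\; f(x) + \lambda\,\langle e,\, e-w\rangle + \rho\,\langle w,\, |x|\rangle .
\]
Minimizing alternately over $w$ (which has a closed-form solution) and over $x$ (a weighted $\ell_1$-regularized convex problem) yields a multi-stage convex relaxation scheme of the kind described above.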
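
As an illustration of how the alternating scheme can play out for the logistic regression application, here is a minimal Python sketch that pairs the closed-form $w$-update with a proximal-gradient solver for the weighted $\ell_1$ subproblems. This is an assumption-laden sketch, not the paper's method: the paper solves each convex subproblem with an augmented Lagrangian method and a semismooth Newton method, and the names `multistage_relaxation`, `lam`, and `rho` are hypothetical.

```python
import numpy as np
from scipy.special import expit  # numerically stable sigmoid

def logistic_loss_grad(A, b, x):
    """Average logistic loss and its gradient; labels b in {-1, +1}."""
    z = b * (A @ x)
    loss = np.mean(np.logaddexp(0.0, -z))
    grad = A.T @ (-b * expit(-z)) / len(b)
    return loss, grad

def weighted_l1_prox(v, t):
    """Soft-thresholding with per-coordinate thresholds t >= 0."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def multistage_relaxation(A, b, lam, rho, stages=5, inner=300):
    """Multi-stage convex relaxation for zero-norm regularized
    logistic regression (illustrative proximal-gradient variant)."""
    m, n = A.shape
    step = 4.0 * m / np.linalg.norm(A, 2) ** 2  # 1/L, L = ||A||^2/(4m)
    x = np.zeros(n)
    w = np.ones(n)  # stage 1 reduces to a plain l1 (Lasso-type) relaxation
    for _ in range(stages):
        # x-step: approximately solve min_x f(x) + rho * sum_i w_i |x_i|
        for _ in range(inner):
            _, g = logistic_loss_grad(A, b, x)
            x = weighted_l1_prox(x - step * g, step * rho * w)
        # w-step: exact minimizer of the penalized objective over [0,1]^n
        w = (np.abs(x) <= lam / rho).astype(float)
    return x
```

Note how the $w$-step removes the penalty from coordinates that are already large in magnitude, so later stages shrink only the small entries; this is the mechanism behind the stage-wise reduction of the error bounds described in the abstract.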