Sparse optimization has emerged as an active research topic in recent years. As data volumes have grown rapidly, the questions of how to store, transmit, and mine data for valuable information have drawn attention across many industries. A sparse signal is one in which most of the elements are zero; exploiting sparsity makes it possible to extract key information and thus compress data significantly. Sparse optimization plays a central role in these efforts. This work first introduces the general form of sparse optimization and elaborates on lasso-type models in detail. For minimizing a regularized loss function in the form of a separable sum, the primal-dual fixed point algorithm (PDFP) is used to solve the optimization problem iteratively. Furthermore, a stochastic PDFP algorithm is proposed for application to online machine learning. Various numerical experiments demonstrate the efficiency and accuracy of the PDFP algorithms on lasso-type problems and confirm the feasibility of the stochastic variant.
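To make the lasso setting mentioned above concrete, the following is a minimal sketch of a proximal-type iteration for the lasso problem min_x 0.5||Ax - b||^2 + lam||x||_1. It uses plain proximal gradient descent (ISTA) rather than the full PDFP scheme developed in this work, and all names, parameters, and the demo data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrinks each entry toward zero by t
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista_lasso(A, b, lam, iters=2000):
    # Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 by proximal gradient (ISTA):
    # gradient step on the smooth term, then soft-thresholding for the l1 term.
    step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1/L, L = Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Small demo: recover a 3-sparse vector from noiseless random measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[[3, 40, 77]] = [1.5, -2.0, 1.0]
b = A @ x_true
x_hat = ista_lasso(A, b, lam=0.1)
```

The soft-thresholding step is what produces exactly-zero entries in the iterates, which is why proximal methods are the natural tool for the sparsity-inducing l1 penalty.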