Applying stochastic proximal (sub)gradient algorithms to stochastic composite optimization problems has become popular. It is usually assumed that stochastic (sub)gradient information on one component of the objective function is available and that the proximal mapping of the other component, called the regularization term, can be computed. However, few works consider the case in which the regularization term is not proximally tractable. In this paper, we propose a stochastic proximal stochastic subgradient algorithm for stochastic convex composite optimization, assuming that only stochastic information on both components of the objective function is available. First, we prove the almost sure convergence of the proposed algorithm under mild conditions. With the help of this almost sure convergence result, we also establish the asymptotic efficiency of the algorithm. We then provide non-asymptotic convergence results for the stochastic proximal stochastic subgradient algorithm in the convex and strongly convex cases, respectively. Finally, the convergence, convergence rates, and asymptotic efficiency of the proposed algorithm are demonstrated by numerical experiments.
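
To fix ideas, the following is a minimal, hypothetical sketch of the general scheme the abstract describes, not the paper's exact method: an outer stochastic subgradient step on the smooth-ish component $f$, followed by an inner stochastic-subgradient loop that only approximates the proximal mapping of the regularizer $r$, since $r$ is assumed not proximally tractable. All function names, step-size rules, and the inner-loop length are illustrative assumptions.

```python
import numpy as np

def stochastic_prox_subgradient(x0, sgrad_f, sgrad_r,
                                outer_steps=1000, inner_steps=10, alpha=0.1):
    """Illustrative sketch (not the paper's exact algorithm).

    Minimizes f(x) + r(x) given only stochastic subgradient oracles
    sgrad_f and sgrad_r.  Because prox_{s*r} is assumed unavailable in
    closed form, it is approximated by a short stochastic subgradient
    loop on the prox subproblem  z -> r(z) + ||z - y||^2 / (2 s).
    """
    x = np.asarray(x0, dtype=float)
    for k in range(1, outer_steps + 1):
        s = alpha / np.sqrt(k)            # diminishing outer step size
        y = x - s * sgrad_f(x)            # stochastic subgradient step on f
        z = y.copy()
        for t in range(1, inner_steps + 1):
            # stochastic subgradient of the prox subproblem
            g = sgrad_r(z) + (z - y) / s
            z -= (s / t) * g              # diminishing inner step size
        x = z                             # inexact proximal point
    return x

# Toy problem: minimize 0.5*(x-1)^2 + 0.1*|x|, whose minimizer is x = 0.9.
rng = np.random.default_rng(0)
sgrad_f = lambda x: (x - 1.0) + 0.05 * rng.standard_normal()
sgrad_r = lambda x: 0.1 * np.sign(x) + 0.05 * rng.standard_normal()
x_hat = stochastic_prox_subgradient(np.array(0.0), sgrad_f, sgrad_r)
```

On this toy instance the inner loop effectively reproduces a noisy soft-thresholding step, so the iterates drift toward the true minimizer 0.9 at the diminishing-step-size rate typical of stochastic subgradient methods.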