When computing sparse solutions of large-scale problems, we often transform the non-convex optimization problem into a convex one, so that only a large-scale convex programming problem needs to be solved. In recent years, gradient algorithms have been found to perform well on large-scale problems, and on this basis a variety of algorithms for large-scale convex programming have been developed, such as the projected subgradient method, the fixed point algorithm, the augmented Lagrangian algorithm, and Bregman distance algorithms. Building on these methods, this paper focuses on the details and improvement of the Augmented Lagrangian Method (ALM) and the Alternating Direction Method (ADM), and proves their convergence. In addition, we remove a limitation of the Predictor-Corrector Proximal Multiplier method (PCPM) proposed by Chen and Teboulle, and extend it to general separable convex programming problems. Finally, the feasibility and convergence of the improved PCPM algorithm are verified by several numerical experiments.
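To make the setting concrete, a minimal sketch of the alternating direction method applied to a standard sparse problem (the l1-regularized least squares, or lasso, problem) is given below. This is an illustration of the general ADM splitting scheme, not the specific improved algorithms developed in this paper; the function names, step parameter rho, and iteration count are illustrative choices.

```python
import numpy as np

def soft_threshold(v, kappa):
    # Proximal operator of the l1 norm: shrink each entry toward zero by kappa.
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=300):
    """Solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 by ADM.

    The problem is split as min 0.5*||Ax - b||^2 + lam*||z||_1 s.t. x = z,
    and the augmented Lagrangian is minimized alternately in x and z,
    followed by a dual (multiplier) update.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    # The x-update solves a fixed linear system; cache its Cholesky factor.
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(n_iter):
        # x-update: quadratic subproblem, solved via the cached factorization.
        x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
        # z-update: l1 subproblem, solved in closed form by soft-thresholding.
        z = soft_threshold(x + u, lam / rho)
        # Multiplier update on the constraint residual x - z.
        u = u + x - z
    return z
```

On a small noiseless instance with a genuinely sparse signal, the iterates recover the support after a few hundred sweeps; the per-iteration cost is dominated by the two triangular solves once the factorization is cached.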