Convex and nonconvex programming provide powerful tools for a wide class of problems arising in management science, statistics, economics, biology, and other research fields. With the advent of the big-data era, practical problems involve ever more data and therefore grow ever larger in scale, so designing fast and efficient numerical algorithms for them is of great practical significance. This thesis mainly considers two classes of optimization problems: nonconvex nonsmooth problems with block structure, and large-scale convex problems. We propose efficient algorithms for both classes and apply them to important practical problems such as Poisson linear inverse problems, signal and image recovery, and regularized logistic regression; compared with existing algorithms, the new methods solve these problems more effectively.

The proximal alternating minimization algorithm is simple and fast for nonconvex nonsmooth problems with block structure. Two popular variants are the proximal alternating linearized minimization algorithm (PALM) and the alternating structure-adapted proximal gradient descent algorithm (ASAP), each of which has its own advantages on different types of nonconvex nonsmooth problems. By introducing different inertial strategies, we propose accelerated versions of both algorithms, establish their global convergence, and apply them to practical problems that demonstrate their efficiency and practicability.

We then consider a more general model in which the above problem is solved over abstract constraint sets. Such problems have a wide range of practical applications, but the proximal alternating minimization method is no longer applicable because of the abstract constraints. By introducing a generalized Legendre function and an extended descent lemma, we propose the alternating structure-adapted proximal-like gradient descent algorithm (ASAPL). This algorithm not only extends ASAP to the abstractly constrained case but also circumvents the restrictive assumption of globally Lipschitz-smooth objective functions. We establish a global O(1/K) sublinear convergence rate for ASAPL, measured by the Bregman distance, and prove that every bounded sequence generated by ASAPL converges globally to a critical point of the problem when the underlying functions satisfy the Kurdyka-Łojasiewicz property. We also propose IASAPL, the inertial version of ASAPL, together with its theoretical analysis.

In addition, we study a broad class of convex minimization problems with an abstract constraint, whose objective is the sum of an average of n smooth convex functions and a nonsmooth convex function. When n is extremely large, computing the gradient of the average term is expensive, and the proximal stochastic gradient method is a powerful tool for this setting. Building on it, we propose the proximal-like stochastic gradient method (PLSG), which uses a Legendre function to capture the geometry of the abstract constraint, and we present an adaptive and accelerated form of PLSG named A2PLSG together with its theoretical analysis.
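To make the proximal-like update concrete, the following is a minimal sketch, not the thesis' actual implementation: it assumes the Boltzmann-Shannon entropy as the Legendre function (so the Bregman distance is the generalized Kullback-Leibler divergence), a least-squares finite sum over the positive orthant as the smooth part, and no nonsmooth term. All names, data, and step sizes below are illustrative assumptions.

```python
import numpy as np

def plsg_step(x, A, b, step, rng):
    """One illustrative proximal-like stochastic gradient step.

    Sketch assumptions (not the general setting of the thesis): the
    Legendre function is the Boltzmann-Shannon entropy
    h(x) = sum(x*log(x) - x), whose Bregman distance is the generalized
    Kullback-Leibler divergence; the smooth part is the average of
    least-squares terms f_i(x) = 0.5*(A[i] @ x - b[i])**2 over the
    positive orthant; the nonsmooth term is absent.  Since
    grad h(x) = log(x), the Bregman step grad h(x_next) = grad h(x) - step*g
    has the closed multiplicative form below, which keeps the iterate
    strictly positive without any projection.
    """
    i = rng.integers(len(b))         # sample one component uniformly
    g = A[i] * (A[i] @ x - b[i])     # stochastic gradient of f_i at x
    return x * np.exp(-step * g)     # multiplicative (mirror-descent-like) update

# Toy usage with synthetic data and a diminishing step size.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5))
b = A @ (rng.random(5) + 0.1)        # consistent right-hand side
x = np.ones(5)                       # strictly positive starting point
for k in range(2000):
    x = plsg_step(x, A, b, step=1e-2 / np.sqrt(k + 1), rng=rng)
```

Because the entropy already encodes positivity, the update never leaves the constraint set, which is precisely the geometric role the Legendre function plays in PLSG.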
Moreover, numerical experiments are presented to illustrate the effectiveness of the new algorithms.
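Returning to the inertial acceleration used in the new PALM and ASAP variants, the sketch below illustrates the extrapolation idea on a single block with an l1 nonsmooth term; the thesis' algorithms apply such steps blockwise and alternately, and every name and parameter here is an illustrative assumption rather than the actual method.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inertial_prox_grad_step(x, x_prev, grad_f, step, lam, beta):
    """One inertial proximal gradient step on a single block.

    Illustrative only: the accelerated PALM/ASAP variants update their
    blocks alternately with steps of this shape; here a single block
    with an l1 nonsmooth term stands in for the idea.
    """
    y = x + beta * (x - x_prev)      # inertial extrapolation from the last two iterates
    return soft_threshold(y - step * grad_f(y), step * lam)

# Toy usage on a lasso-type objective 0.5*||Ax - b||^2 + lam*||x||_1.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth gradient
grad_f = lambda x: A.T @ (A @ x - b)
x = x_prev = np.zeros(20)
for k in range(300):
    x, x_prev = inertial_prox_grad_step(x, x_prev, grad_f, 1.0 / L, lam, beta=0.4), x
```

The extrapolation y = x + beta*(x - x_prev) reuses the previous iterate at negligible extra cost per step, which is what makes inertial variants attractive in practice.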