In this paper we study a class of nonconvex and nonsmooth optimization problems whose objective function is the sum of a separable smooth function and a simple coupling term. We propose an inertial accelerated alternating proximal gradient algorithm. At each iteration, the method compares the objective values at a usual proximal gradient step and at a linear extrapolation step, and decides whether to accept the inertial point for the next iteration; the proposed algorithm is therefore a descent method. Under suitable assumptions we prove that every limit point of the sequence generated by our algorithm is a critical point of the objective function. Furthermore, if the objective function satisfies the Kurdyka-Łojasiewicz (KŁ) property, the generated sequence converges globally to a critical point. When the Lipschitz constant of the smooth part of the objective function is unknown or very large, we apply the Barzilai-Borwein (BB) rule and backtracking strategies to solve the subproblems. We also propose an adaptive scheme for selecting the inertial parameters to improve the numerical performance. Finally, the algorithm is applied to nonconvex quadratic programming and sparse logistic regression; the numerical results verify the effectiveness of the algorithm.
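
The acceptance rule described above — take a proximal gradient step from both the current iterate and a linearly extrapolated (inertial) point, then keep whichever has the smaller objective value — can be sketched as follows. This is a minimal single-block illustration, not the paper's alternating two-block method: the names `F`, `grad_f`, `soft_threshold`, the ℓ1 regularizer, and the fixed step size are all placeholder assumptions for a least-squares-plus-ℓ1 instance, and the BB/backtracking step-size rules and the adaptive inertial-parameter scheme mentioned in the abstract are omitted.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_prox_grad(F, grad_f, lam, x0, step, beta=0.5, iters=200):
    """Hedged sketch of an inertial proximal gradient step with a
    descent safeguard: compare the proximal gradient step taken from
    the plain iterate with the one taken from the extrapolated point,
    and accept whichever yields the smaller objective value, so the
    objective is monotonically nonincreasing (a descent method)."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        # extrapolated (inertial) point
        y = x + beta * (x - x_prev)
        # proximal gradient step from the plain iterate ...
        z_plain = soft_threshold(x - step * grad_f(x), step * lam)
        # ... and from the inertial point
        z_inert = soft_threshold(y - step * grad_f(y), step * lam)
        # descent safeguard: keep the better of the two candidates
        x_new = z_inert if F(z_inert) <= F(z_plain) else z_plain
        x_prev, x = x, x_new
    return x

# Toy instance: F(x) = 0.5||Ax - b||^2 + lam||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.1
L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of grad_f
F = lambda x: 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))
grad_f = lambda x: A.T @ (A @ x - b)
x0 = np.zeros(10)
x_final = inertial_prox_grad(F, grad_f, lam, x0, step=1.0 / L)
```

With the step size set to `1/L`, each candidate step already decreases the objective, so the safeguard only chooses between two descent directions; this mirrors (in simplified form) why the method in the paper is a descent method.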