Nonconvex minimization problems play an important role in optimization; they arise in molecular biology, economics and finance, data mining and knowledge discovery, information science and engineering, engineering design and control, and other fields. In this paper, we consider the class of nonconvex minimization problems of minimizing the sum of a proper closed function and a continuously differentiable function with a Lipschitz continuous gradient. Based on a suitable merit function, we discuss the convergence properties of the Douglas-Rachford splitting method and the forward-backward splitting method with self-adaptive proximal parameters for solving the above-mentioned nonconvex minimization problem under appropriate assumptions. Finally, we apply the Douglas-Rachford splitting method to solving nonconvex feasibility problems.
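As a concrete illustration of the model class (not taken from the paper), consider minimizing 0.5*||Ax - b||^2 + lam*||x||_0: the l0 penalty is a proper closed nonconvex function, and the least-squares term is smooth with Lipschitz gradient constant ||A||_2^2. The sketch below applies the forward-backward splitting method to this instance; for simplicity it uses a fixed proximal parameter 1/L rather than the self-adaptive rule studied in the paper, and the function name and parameters are illustrative choices.

```python
import numpy as np

def forward_backward(A, b, lam, step, iters=1000):
    """Forward-backward splitting sketch for
    min_x 0.5*||Ax - b||^2 + lam*||x||_0.
    The smooth part has Lipschitz gradient (constant ||A||_2^2);
    the l0 term is proper, closed, and nonconvex."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        # forward step: gradient descent on the smooth term
        grad = A.T @ (A @ x - b)
        y = x - step * grad
        # backward step: prox of step*lam*||.||_0 is hard thresholding,
        # zeroing entries with |y_i| <= sqrt(2*lam*step)
        thresh = np.sqrt(2.0 * lam * step)
        x = np.where(np.abs(y) > thresh, y, 0.0)
    return x

# Example: recover a sparse vector from exact measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[0], x_true[3] = 3.0, -2.0
b = A @ x_true
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
x = forward_backward(A, b, lam=0.1, step=1.0 / L)
```

Because A here has full column rank and b is noiseless, the iterates settle on the sparse solution; with a nonconvex penalty one can in general only expect convergence to a stationary point.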