
Accelerated Alternating Proximal Gradient Algorithm For A Class Of Nonconvex Nonsmooth Optimization Problems

Posted on: 2022-10-30  Degree: Master  Type: Thesis
Country: China  Candidate: X Yang  Full Text: PDF
GTID: 2510306722981779  Subject: Computational Mathematics
Abstract/Summary:
In this paper we study a class of nonconvex and nonsmooth optimization problems whose objective function is the sum of separable smooth functions and a simple coupling term. We propose an inertial accelerated alternating proximal gradient algorithm. The method compares the objective values produced by an ordinary proximal gradient step and by a linear extrapolation (inertial) step, and uses this comparison to decide whether to accept the inertial point in the next iteration, so the proposed algorithm is a descent method. Under suitable assumptions we prove that every limit point of the sequence generated by the algorithm is a critical point of the objective function; furthermore, if the objective satisfies the Kurdyka-Łojasiewicz (KŁ) property, the generated sequence converges globally to a critical point. When the Lipschitz constant of the smooth part of the objective is unknown or very large, we apply the Barzilai-Borwein (BB) rule and a backtracking strategy to solve the subproblems. We also propose an adaptive rule for selecting the inertial parameters to improve numerical performance. Finally, the algorithm is applied to nonconvex quadratic programming and sparse logistic regression problems, and the numerical results verify its effectiveness.
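To make the descent-safeguarded inertial step concrete, the following is a minimal single-block Python sketch: it computes one proximal gradient step from the current point and one from the extrapolated (inertial) point, then keeps whichever gives the smaller objective value, so the iteration never increases the objective. The thesis method is an alternating (multi-block) scheme with BB/backtracking step sizes and adaptive inertia; this simplified illustration, the L1-regularized indefinite quadratic test problem, and all function names and parameter choices below are assumptions, not the author's exact algorithm.

import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (the nonsmooth part g).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inertial_prox_grad(grad_f, f, g, prox_g, x0, step, beta=0.5, iters=200):
    # Inertial proximal gradient with a descent check (single-block sketch).
    #   grad_f, f : gradient and value of the smooth (possibly nonconvex) part
    #   g, prox_g : value and proximal map of the nonsmooth part
    #   step      : step size (e.g. 1/L for an L-Lipschitz gradient)
    #   beta      : inertial (extrapolation) parameter
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        # Candidate 1: proximal gradient step from the current iterate.
        z1 = prox_g(x - step * grad_f(x), step)
        # Candidate 2: proximal gradient step from the extrapolated point.
        y = x + beta * (x - x_prev)
        z2 = prox_g(y - step * grad_f(y), step)
        # Accept the inertial candidate only if it gives a smaller value of
        # F = f + g; otherwise fall back to the plain step (descent property).
        F1, F2 = f(z1) + g(z1), f(z2) + g(z2)
        x_prev, x = x, (z2 if F2 <= F1 else z1)
    return x

if __name__ == "__main__":
    # Hypothetical test problem: min 0.5*x'Qx + c'x + lam*||x||_1, Q indefinite.
    rng = np.random.default_rng(0)
    Q = rng.standard_normal((20, 20))
    Q = 0.5 * (Q + Q.T)                        # symmetric, possibly indefinite
    c = rng.standard_normal(20)
    lam = 0.1
    L = np.max(np.abs(np.linalg.eigvalsh(Q)))  # Lipschitz constant of grad f
    f = lambda x: 0.5 * x @ Q @ x + c @ x
    grad_f = lambda x: Q @ x + c
    g = lambda x: lam * np.sum(np.abs(x))
    prox_g = lambda v, t: soft_threshold(v, lam * t)
    x_star = inertial_prox_grad(grad_f, f, g, prox_g, np.zeros(20), 1.0 / L)
    print("final objective:", f(x_star) + g(x_star))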
Keywords/Search Tags: Nonconvex-nonsmooth optimization, Alternating minimization, Accelerated proximal gradient method, KŁ property, Convergence analysis