
Research on the Two-Time-Scale Alternating Gradient Descent Ascent Algorithm for Nonconvex-Concave Minimax Problems

Posted on: 2024-01-07
Degree: Master
Type: Thesis
Country: China
Candidate: J Wang
GTID: 2530306914474924
Subject: Operational Research and Cybernetics
Abstract/Summary:
In recent years, nonconvex minimax optimization has become a prominent research topic at the intersection of optimization, machine learning, and artificial intelligence, owing to its wide applications in machine learning and signal processing; it has attracted extensive attention and produced a rich body of results. As problem models grow more complex and data sets grow larger, single-loop algorithms built on gradient descent ascent have become the most popular methods for solving nonconvex minimax optimization problems. This thesis focuses on single-loop algorithms for the nonconvex-(strongly) concave minimax optimization problem, that is, min_x max_{y in Y} f(x, y), where f(x, .) is (strongly) concave in y but possibly nonconvex in x, with the aim of improving algorithmic complexity and verifying the performance of the new algorithms through numerical experiments. The research consists of the following two parts.

In the first part, for the nonconvex-(strongly) concave minimax optimization problem, the existing single-loop algorithms, namely the two-time-scale gradient descent ascent algorithm and the two-time-scale stochastic gradient descent ascent algorithm, are improved, and the new two-time-scale alternating gradient descent ascent algorithm and two-time-scale stochastic alternating gradient descent ascent algorithm are proposed. The new algorithms replace the simultaneous gradient update with an alternating gradient update, so the iterates x and y are updated sequentially, with each y-update using the freshly updated x. The complexity of the algorithms is analyzed theoretically in both the nonconvex-strongly-concave and nonconvex-concave settings, and numerical experiments on several data sets show that the new algorithms are more practical and efficient than the existing ones.

In the second part, for the nonconvex-(strongly) concave minimax optimization problem, the two-time-scale accelerated alternating gradient descent ascent algorithm is further proposed by combining the proposed two-time-scale alternating gradient descent ascent algorithm with a momentum acceleration technique. Numerical results in the nonconvex-strongly-concave and nonconvex-concave settings show that the new algorithm outperforms the two-time-scale alternating gradient descent ascent algorithm in adversarial training on several data sets.
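To make the update order concrete, the following minimal Python sketch illustrates a two-time-scale alternating gradient descent ascent iteration, with an optional heavy-ball momentum term standing in for the acceleration technique of the second part. The toy quadratic objective, the step sizes eta_x and eta_y, and the momentum parameter beta are illustrative assumptions for this sketch, not the thesis's exact algorithm or analysis.

```python
import numpy as np

# Toy objective f(x, y) = 0.5*||x||^2 + x^T A y - 0.5*||y||^2,
# a simple strongly-concave-in-y test problem; the thesis targets
# general nonconvex-(strongly) concave f.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))

def grad_x(x, y):
    return x + A @ y          # gradient of f with respect to x

def grad_y(x, y):
    return A.T @ x - y        # gradient of f with respect to y

def two_time_scale_alternating_gda(x, y, eta_x=0.01, eta_y=0.1,
                                   beta=0.0, iters=1000):
    """Sketch of two-time-scale alternating GDA.

    eta_x << eta_y gives the two time scales; beta > 0 adds a
    heavy-ball-style momentum term as an illustrative stand-in for
    the accelerated variant.
    """
    vx = np.zeros_like(x)
    vy = np.zeros_like(y)
    for _ in range(iters):
        # Descent step on x, using the current y.
        vx = beta * vx + grad_x(x, y)
        x = x - eta_x * vx
        # Ascent step on y, using the *already updated* x:
        # the alternating update that replaces the simultaneous one.
        vy = beta * vy + grad_y(x, y)
        y = y + eta_y * vy
    return x, y

x_out, y_out = two_time_scale_alternating_gda(
    rng.standard_normal(5), rng.standard_normal(5))
```

In this sketch the only difference from simultaneous gradient descent ascent is that the y-ascent evaluates the gradient at the freshly updated x, which is the alternating update the abstract describes.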
Keywords/Search Tags: nonconvex minimax optimization problem, single-loop algorithm, (stochastic) gradient descent ascent algorithm, alternating gradient update, momentum acceleration technique