
Research on Accelerated Gradient Algorithms

Posted on: 2022-07-15
Degree: Master
Type: Thesis
Country: China
Candidate: C K Li
Full Text: PDF
GTID: 2480306338469504
Subject: Mathematics
Abstract/Summary:
As artificial intelligence plays an increasingly important role in the scientific and technological revolution, deep learning has attracted wide attention as an important learning model. Since the core ideas of deep learning reduce to solving optimization problems, optimization algorithms have drawn growing interest. Among them, accelerated gradient algorithms are those that achieve a faster convergence rate than the classical gradient descent method while using only first-order gradient information.

First, this thesis introduces the Nesterov accelerated gradient algorithm, which attains the optimal convergence rate among methods that use only first-order gradient information. Because optimization algorithms can be linked to ordinary differential equations (ODEs), the acceleration mechanism of the Nesterov method can be understood through an ODE analysis, and discretizing such ODEs in turn yields new optimization algorithms.

Second, the symplectic discretization scheme is improved. By discretizing the high-resolution ODE corresponding to the Nesterov accelerated gradient algorithm, a new accelerated gradient algorithm is obtained. Experiments show that, at the same computational complexity, the new algorithm converges faster than the algorithm obtained from the original symplectic scheme.

Finally, three discretization schemes are applied to the inertial dynamical system: a symplectic scheme, the explicit Euler scheme, and the implicit Euler scheme, yielding three different optimization algorithms. A suitable Lyapunov function is constructed to prove that, when the objective function is μ-strongly convex, the exponential convergence rate of the inertial dynamical system carries over to a linear convergence rate under the discrete schemes. The algorithms obtained from the symplectic scheme and the implicit Euler scheme are proved to be accelerated gradient algorithms.
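To make the discretization idea concrete, the following Python sketch contrasts the textbook Nesterov accelerated gradient update with a semi-implicit (symplectic-Euler-style) discretization of the damped inertial dynamics x'' + 2·sqrt(μ)·x' + ∇f(x) = 0, a form commonly used in the literature for μ-strongly convex objectives. The function names, the step-size parameter, and the damping coefficient 2·sqrt(μ) are illustrative assumptions; the thesis's high-resolution ODE and its exact discrete schemes may contain additional correction terms not shown here.

import numpy as np

def nesterov_agd(grad, x0, step, iters):
    # Textbook Nesterov accelerated gradient method for a smooth convex f
    # (standard form; not necessarily the exact variant analysed in the thesis).
    x_prev = np.asarray(x0, dtype=float).copy()
    y = x_prev.copy()
    for k in range(1, iters + 1):
        x = y - step * grad(y)                        # gradient step at the extrapolated point
        y = x + (k - 1.0) / (k + 2.0) * (x - x_prev)  # momentum / extrapolation step
        x_prev = x
    return x_prev

def inertial_symplectic_euler(grad, x0, step, mu, iters):
    # Semi-implicit (symplectic-Euler-style) discretization of the damped
    # inertial dynamics  x'' + 2*sqrt(mu)*x' + grad f(x) = 0.
    # For a mu-strongly convex f and a small enough step, this type of scheme
    # is expected to converge linearly (assumption; the thesis's scheme may differ).
    x = np.asarray(x0, dtype=float).copy()
    v = np.zeros_like(x)
    for _ in range(iters):
        v = v + step * (-2.0 * np.sqrt(mu) * v - grad(x))  # update the velocity first
        x = x + step * v                                   # position update uses the new velocity
    return x

For a quadratic test function f(x) = ½ xᵀAx − bᵀx one can take grad = lambda x: A @ x - b; with a sufficiently small step both routines converge to the minimizer, the second at a rate governed by the strong-convexity parameter mu.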
Keywords/Search Tags: optimization, inertial dynamical system, discretization, Lyapunov function, accelerated gradient algorithm