
First-order Stochastic Algorithm For Solving Nonlinear Optimization Problems And Its Application

Posted on: 2021-08-14
Degree: Master
Type: Thesis
Country: China
Candidate: Y L Zhou
GTID: 2480306479459354
Subject: Operational Research and Cybernetics

Abstract/Summary:
With the advent of the era of big data, more and more fields face the optimization of massive data sets, for example in biometrics, pattern recognition, financial analysis, and smart healthcare. Because traditional nonlinear optimization methods must use all of the data exactly, their computational cost in practical applications is enormous, so the development of optimization methods that use only a randomly selected portion of the data is both inevitable and important. Owing to its simple form and ease of implementation, the first-order stochastic gradient algorithm has been widely used in big-data mining and has attracted the attention of many researchers in the field of mathematical programming.

The choice of the iterative step size in a first-order stochastic gradient algorithm is critical to both convergence and the rate of convergence. Commonly used step sizes include the steepest-descent step size and the Barzilai-Borwein (BB) step size. In this thesis, based on the two improved two-point step size gradient formulas of Dai and Yuan et al., and combined with the stochastic variance reduced gradient (SVRG) technique, stochastic gradient algorithms with a two-point interpolation step size are proposed, and the linear convergence of the algorithms under strong convexity is proved. Test results on public large-scale machine learning data sets show that the stochastic gradient algorithms with the improved step sizes effectively improve the convergence rate.

This thesis further studies the influence of the degree of randomness in the step size on the convergence efficiency of the stochastic gradient algorithm. Using the improved Barzilai-Borwein step size with different 'stochastic frequencies' in each iteration, two further improved stochastic BB step size gradient algorithms are obtained, and the linear convergence of the new algorithms in expectation is proved. Results of repeated numerical experiments on large data sets show that the convergence efficiency of the two improved stochastic BB step size gradient algorithms is significantly better.

This thesis also examines the numerical performance of the stochastic gradient algorithm with a completely randomized improved Barzilai-Borwein step size, in which the stochastic step size is updated once after every inner-loop iteration. The experimental results show that the convergence behavior of the first-order stochastic gradient algorithm is not directly proportional to the stochastic update frequency of the step size: a completely dynamic stochastic step size not only increases the computational cost of the algorithm but also reduces the convergence rate.
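As background for the class of methods summarized above, the following is a minimal sketch of the standard SVRG scheme with a Barzilai-Borwein step size recomputed once per outer loop (in the spirit of SVRG-BB). The function names, parameters, and the BB1 formula shown are illustrative assumptions for context only; they do not reproduce the thesis's improved two-point interpolation step sizes.

```python
import numpy as np

def svrg_bb(grad_full, grad_i, x0, n, m, eta0, num_epochs, rng=None):
    """Sketch of SVRG with a Barzilai-Borwein (BB) step size.

    grad_full(x): full gradient over all n component functions.
    grad_i(x, i): gradient of the i-th component function.
    x0: initial point; m: inner-loop length; eta0: step size for the first epoch.
    NOTE: illustrative only -- the thesis's improved step-size formulas differ.
    """
    rng = rng or np.random.default_rng(0)
    x_tilde = x0.copy()
    eta = eta0
    prev_x, prev_g = None, None

    for epoch in range(num_epochs):
        g_full = grad_full(x_tilde)        # full gradient at the snapshot point

        # BB1-type step size eta = ||s||^2 / (m * |s^T y|),
        # refreshed only once per outer loop.
        if prev_x is not None:
            s = x_tilde - prev_x
            y = g_full - prev_g
            denom = m * abs(s @ y)
            if denom > 1e-12:
                eta = (s @ s) / denom
        prev_x, prev_g = x_tilde.copy(), g_full.copy()

        x = x_tilde.copy()
        for _ in range(m):                 # inner loop with variance-reduced gradients
            i = rng.integers(n)
            v = grad_i(x, i) - grad_i(x_tilde, i) + g_full
            x -= eta * v
        x_tilde = x                        # new snapshot (last-iterate option)

    return x_tilde
```

In the completely randomized variant discussed in the last paragraph of the abstract, the step size would instead be refreshed inside the inner loop after every iteration; according to the thesis's experiments, this increases the computational cost without improving, and in fact reducing, the convergence rate.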
Keywords/Search Tags: First-order optimization algorithm, Barzilai-Borwein step size, Stochastic gradient method, Stochastic variance reduced gradient