
On The Barzilai-Borwein Gradient Method And Its Applications In Other Optimization Methods

Posted on: 2019-03-26
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Y T Zhe
Full Text: PDF
GTID: 1310330566964491
Subject: Computational Mathematics
Abstract/Summary:
The gradient method is one of the most fundamental methods for solving unconstrained optimization problems; its algorithm is simple and has low memory requirements. However, the choice of step length greatly affects the performance of gradient-type algorithms. Owing to its very satisfactory performance and better convergence properties compared with the steepest descent method, the Barzilai-Borwein (BB) gradient method has developed into a competitive method for large-scale problems and has attracted broad attention from numerous experts. This thesis mainly studies new Barzilai-Borwein-type gradient methods and their applications in other optimization algorithms. The research covers the following aspects:

Firstly, we aim to find new BB-type gradient algorithms. Based on Ford and Moghrabi's multi-step quasi-Newton equation, by correcting the original BB stepsize to satisfy this particular quasi-Newton property, we introduce a new family of BB-type stepsizes. Furthermore, we establish the global convergence of the gradient algorithm using the new stepsize for strictly convex quadratic minimization problems. When applied to practical and random test problems, the numerical results show that the modified BB method can outperform some existing gradient algorithms.

Secondly, we consider applications of BB-type stepsizes in other optimization algorithms. First, combined with a nonmonotone line search, we generalize the modified BB method proposed in the first part to solve general non-quadratic unconstrained optimization problems. Second, because of the close relationship between the improved quasi-Newton equation and the Dai-Liao conjugate gradient method, and because BB-type step lengths are widely used in both methods, we consider using the BB-type parameter value that was employed in the Dai-Liao method with certain optimal characteristics in the scaled BFGS method and the modified BFGS method. Finally, we embed the BB-type step length into the adaptive cubic regularization method. By taking a positive definite scale matrix as the exact Hessian or its quasi-Newton approximation, the minimizer of the resulting subproblems can be determined easily while the convergence of the algorithm is maintained. The modified algorithm is also suitable for large-scale problems. Numerical results show that these modifications are reasonable and effective. In this part, we also unify most of the existing improvements to the quasi-Newton equation by means of interpolation, which gives a natural interpretation of the development of these quasi-Newton equations.

Thirdly, we study the convergence of the BB-type method for solving symmetric indefinite linear systems. We conclude that, in the two-dimensional case, the gradient method adopting the second form of the BB stepsize is R-superlinearly convergent. This result is consistent with Dai's result for the BB method in the symmetric positive definite case.

Finally, we improve conjugate gradient methods. By modifying the conjugate parameters, we propose several improved conjugate gradient methods, including two Dai-Liao-type conjugate gradient methods. Under certain conditions, combined with the strong Wolfe line search, we analyze the strong convergence of the algorithms for uniformly convex functions and their global convergence for general objective functions. Numerical experiments show that our algorithms are more effective than some existing conjugate gradient methods.
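For readers unfamiliar with the method the thesis builds on, the classical BB iteration for a strictly convex quadratic can be sketched as follows. This is a minimal illustration of the standard two BB stepsizes (BB1 and BB2), not of the thesis's modified multi-step variants; the test problem (a random symmetric positive definite system) and all parameter values are illustrative assumptions, not data from the thesis.

```python
import numpy as np

# Barzilai-Borwein gradient method for f(x) = 0.5 x^T A x - b^T x,
# whose gradient is g(x) = A x - b. With s = x_k - x_{k-1} and
# y = g_k - g_{k-1}, the two classical BB stepsizes are
#   BB1: alpha = (s^T s) / (s^T y)   (the "long" stepsize)
#   BB2: alpha = (s^T y) / (y^T y)   (the "short" stepsize)
def bb_gradient(A, b, x0, tol=1e-8, max_iter=1000, variant="BB1"):
    x = x0.astype(float)
    g = A @ x - b
    alpha = 1.0 / np.linalg.norm(g)      # conservative first step
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g            # gradient step with BB stepsize
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        if variant == "BB1":
            alpha = (s @ s) / (s @ y)
        else:
            alpha = (s @ y) / (y @ y)
        x, g = x_new, g_new
    return x, k

# Illustrative problem: a random well-conditioned SPD quadratic.
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50.0 * np.eye(50)          # symmetric positive definite
b = rng.standard_normal(50)
x_star = np.linalg.solve(A, b)           # exact minimizer for comparison
x, iters = bb_gradient(A, b, np.zeros(50))
print(np.linalg.norm(x - x_star))
```

Note that the BB stepsize reuses only gradient differences, so each iteration costs one matrix-vector product and a few inner products, which is what makes the method attractive for the large-scale problems the thesis targets.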
Keywords/Search Tags:Barzilai-Borwein gradient method, quasi-Newton equation, BFGS method, adaptive regularization algorithm with cubics, conjugate gradient method