
A Type Of New BFGS Algorithm

Posted on: 2018-11-24    Degree: Master    Type: Thesis
Country: China    Candidate: Y X Wang    Full Text: PDF
GTID: 2310330536979432    Subject: Mathematics
Abstract/Summary:
Optimization methods seek the optimal solution of a given objective function under given conditions. At present they are widely applied in many fields, such as scientific research, industrial design, military defense, government policy, and public management. With the growth of computing power and data-handling capacity, optimization has become a vital decision-making tool in modern society. Methods for solving unconstrained optimization problems are the foundation of the field. Among them, the BFGS method is popular with researchers for its computational efficiency and good numerical stability, and work on modified BFGS algorithms has never stopped. The aim of this thesis is a preliminary exploration along this line of research.

This thesis proposes a new type of BFGS algorithm, inspired by the work of Zhang Jianzhong and Biggs. First, on the right-hand side of the quasi-Newton equation B_{k+1} s_k = y_k, the vector y_k is replaced by y_k* = y_k + θ_k s_k. Second, the matrix-correction format of the BFGS algorithm is improved by placing a dynamic parameter φ_k in front of the term y_k*(y_k*)^T / (s_k^T y_k*).

The convergence analysis follows the proof technique of Byrd and Nocedal for BFGS. Under the same hypotheses, this thesis establishes global convergence and a local superlinear rate of convergence. When the objective function is uniformly convex and twice continuously differentiable, the modified BFGS algorithm is globally convergent with an R-linear convergence rate. Two parameters are introduced, satisfying 0 ≤ θ_k ≤ θ_max and 0 < φ_min ≤ φ_k ≤ φ_max, which shows that the algorithm provides a theoretical framework for a more general class of methods.
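The modified update described above can be sketched as follows. This is an illustrative implementation, not the thesis's own code: it applies the standard BFGS correction with y_k replaced by y_k* = y_k + θ_k s_k and the last term scaled by the dynamic parameter φ_k.

```python
import numpy as np

def modified_bfgs_update(B, s, y, theta, phi):
    """One step of the modified BFGS matrix update (illustrative sketch).

    y_star = y + theta * s replaces y from the quasi-Newton equation
    B_{k+1} s = y, and phi scales the term y_star y_star^T / (s^T y_star).
    With theta = 0 and phi = 1 this reduces to the standard BFGS update.
    """
    y_star = y + theta * s
    Bs = B @ s
    return (B
            - np.outer(Bs, Bs) / (s @ Bs)          # remove old curvature along s
            + phi * np.outer(y_star, y_star) / (s @ y_star))  # scaled new curvature
```

Note that for phi = 1 the updated matrix satisfies the modified secant condition B_{k+1} s_k = y_k*, by construction of the rank-two correction.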
Further, if the Hessian matrix G(x) of the objective function is locally Lipschitz continuous at x*, and φ_k = 1 with θ_k = K‖s_k‖^b, then the modified BFGS algorithm converges superlinearly, where K is a constant and b ≥ 1. The inexact line search in this thesis uses only a simple backtracking rule with the Armijo criterion. Numerical results on a library of unconstrained optimization test problems show that the algorithm performs well and is numerically stable. Compared with the standard BFGS algorithm, it has certain advantages on difficult problems. The results also show that φ_k = 1 is a prerequisite for ensuring that the step length tends to 1.
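The backtracking Armijo line search mentioned above can be sketched as below. The shrink factor rho and sufficient-decrease constant c are typical textbook values, not parameters taken from the thesis.

```python
import numpy as np

def armijo_backtracking(f, grad_f, x, d, alpha0=1.0, rho=0.5, c=1e-4, max_iter=50):
    """Inexact line search: backtrack from alpha0 until the Armijo
    sufficient-decrease condition f(x + a d) <= f(x) + c a grad(x)^T d holds.
    d is assumed to be a descent direction (grad(x)^T d < 0)."""
    fx = f(x)
    slope = grad_f(x) @ d          # directional derivative along d
    alpha = alpha0
    for _ in range(max_iter):
        if f(x + alpha * d) <= fx + c * alpha * slope:
            return alpha
        alpha *= rho               # shrink the step and try again
    return alpha
```

Trying the full step alpha0 = 1 first matters here: quasi-Newton theory requires unit steps to be eventually accepted for the superlinear rate to be realized, which is why the condition on φ_k that guarantees step lengths tending to 1 is emphasized in the abstract.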
Keywords/Search Tags:unconstrained optimization, BFGS algorithm, inexact line search, global convergence, local superlinear convergence
Related items
1. An Improved Algorithms Of BFGS And Its Convergence Analysis
2. A New Class Of Quasi-newton Algorithm And Its Convergence
3. A New Class Of Quasi-newton Algorithm And Its Convergence
4. Nonlinear Optimization Problem For A Class Of Quasi-newton Algorithm
5. The Nonmonotone Line Search Technique And Its Applications
6. Based On The New Quasi-newton Equation Improved Bfgs Method
7. Nonmonotone BFGS Method For Nonconvex Unconstrained Optimization
8. A Class Of Non-monotonic Amendments To The Quasi-newton Algorithm And Its Convergence Analysis
9. The Perturbed BFGS Method For The Unconstrained Optimization With The Armijo Line Search
10. Research On New Methods For Solving Unconstrained Optimization Problem