
# A New Perturbed BFGS Method For Unconstrained Optimization

Posted on: 2020-07-19
Degree: Master
Type: Thesis
Country: China
Candidate: F Chen
Full Text: PDF
GTID: 2370330602960514
Subject: Mathematics
Abstract/Summary:
The classical BFGS method is among the most efficient quasi-Newton methods for solving optimization problems. Newton's method has a quadratic convergence rate and high precision, but it requires computing the Hessian matrix of the objective function at each iteration. Quasi-Newton methods approximate Newton's method: the iterative matrix is an approximation of the Hessian matrix and satisfies the well-known quasi-Newton equation. Quasi-Newton methods therefore reduce the amount of computation by avoiding the Hessian, while retaining local superlinear convergence. When the inner product between the difference of two adjacent iterates and the difference of the corresponding gradients is greater than zero, the BFGS iterative matrix is symmetric and positive definite, which guarantees the descent property of the quasi-Newton direction; the stepsize can then be computed by the Armijo line search or the Wolfe line search. It has been proved that the BFGS method converges globally for convex minimization when the Wolfe line search is used. However, the results of [4,5] show that the BFGS method may diverge for nonconvex problems even if the Wolfe search is used. To ensure global convergence for nonconvex optimization, it is usually necessary to modify the BFGS formula or the line search. Different from these globalization techniques, a perturbed BFGS method has been presented in the literature and shown to converge globally for nonconvex problems with the Wolfe search. Based on a modified BFGS formula and a perturbation technique from the literature, we introduce a new perturbed BFGS method for nonconvex optimization and study its convergence properties.

The paper is organized as follows. In Chapter 1, we introduce the preliminary knowledge, the background of the problem, and the main work of the paper. In Chapter 2, starting from a modified BFGS formula that approximates the Hessian matrix better than the classical BFGS update but converges globally only for convex problems, we combine it with the perturbation technique to present a new perturbed BFGS method for nonconvex problems; under suitable conditions, we establish its global convergence with the Wolfe search. In Chapter 3, under some assumptions, we investigate the local convergence properties of the proposed method and show its superlinear convergence. In Chapter 4, we report numerical experiments that demonstrate the efficiency of the proposed method.
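To illustrate the ingredients the abstract refers to (the quasi-Newton direction, the curvature condition guaranteeing positive definiteness, and the Armijo line search), the following is a minimal sketch of the *classical* BFGS method in Python. It is not the perturbed or modified method proposed in the thesis; the function name, tolerances, and test problem are illustrative assumptions.

```python
import numpy as np

def bfgs_armijo(f, grad, x0, tol=1e-8, max_iter=200):
    """Classical BFGS with Armijo backtracking line search (illustrative sketch)."""
    n = len(x0)
    H = np.eye(n)                     # approximation of the inverse Hessian
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                    # quasi-Newton direction; descent when H is positive definite
        # Armijo backtracking: accept t with f(x + t d) <= f(x) + c1 * t * g^T d
        t, c1 = 1.0, 1e-4
        while f(x + t * d) > f(x) + c1 * t * (g @ d):
            t *= 0.5
        s = t * d                     # difference of two adjacent iterates
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g                 # difference of the corresponding gradients
        sy = s @ y
        if sy > 1e-12:                # curvature condition s^T y > 0 keeps H positive definite
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(s, y)
            H = V @ H @ V.T + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x
```

The update of `H` is skipped whenever `s @ y` is not safely positive, which mirrors the role of the positivity condition discussed above; globalization techniques such as the perturbation studied in the thesis modify the update or the search precisely so that convergence can still be guaranteed on nonconvex problems.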
Keywords/Search Tags: Nonconvex, perturbation, BFGS method, global convergence, superlinear convergence
Related items
1. Nonmonotone BFGS Method For Nonconvex Unconstrained Optimization
2. Based On The New Quasi-newton Equation Improved Bfgs Method
3. The Perturbed BFGS Method For The Unconstrained Optimization With The Armijo Line Search
4. For Solving Nonlinear Constrained Optimization Problems Modified Bfgs Method
5. An Improved Algorithms Of BFGS And Its Convergence Analysis
6. Research On A Modified BFGS Method And SQP Method
7. A New Class Of Quasi-newton Algorithm And Its Convergence
8. A New Class Of Quasi-newton Algorithm And Its Convergence
9. A Type Of New BFGS Algorithm
10. Modified Bfgs Formula In The Sqp Algorithm