An optimization method seeks the optimal solution of a given objective function subject to given conditions. Optimization is now widely applied in fields such as scientific research, industrial design, military defense, government policy, and public management. With the growth of computing power and data-handling capacity, optimization methods have become a vital decision-making tool in modern society.

Methods for solving unconstrained optimization problems are the foundation of optimization theory. The BFGS method is popular with practitioners because of its computational efficiency and good numerical stability, and work on modified BFGS algorithms has never stopped. The aim of this thesis is a preliminary exploration of this line of study.

This thesis proposes a new type of BFGS algorithm, inspired by the work of Zhang Jianzhong and of Biggs. First, the right-hand side y_k of the quasi-Newton equation B_{k+1} s_k = y_k is modified: y_k is replaced by y_k* = y_k + γ_k s_k. Second, the matrix update formula of the BFGS algorithm is improved by placing a dynamic parameter t_k in front of the term y_k*(y_k*)^T / (s_k^T y_k*).

The convergence proof follows the framework of Byrd and Nocedal. Under the same hypotheses, this thesis establishes global convergence and a local superlinear convergence rate. When the objective function is uniformly convex and twice continuously differentiable, the modified BFGS algorithm of this thesis is globally convergent and has an R-linear convergence rate. We introduce two parameters satisfying 0 ≤ γ_k ≤ γ_max and 0 < t_min ≤ t_k ≤ t_max, which shows that this algorithm provides a theoretical framework for a more general class of algorithms.
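The modified update described above can be sketched in a few lines of NumPy. Because the parameter symbols are garbled in the source, the names γ_k (the shift defining y_k*) and t_k (the dynamic scaling factor) are assumptions; the formula below is a sketch of a standard BFGS update with those two modifications applied, not the thesis's verbatim formula:

```python
import numpy as np

def modified_bfgs_update(B, s, y, gamma_k, t_k):
    """One step of the modified BFGS matrix update (sketch).

    Standard BFGS:  B+ = B - (B s s^T B)/(s^T B s) + (y y^T)/(s^T y).
    Here y is replaced by y* = y + gamma_k * s, and the dynamic
    parameter t_k scales the rank-one term y* (y*)^T / (s^T y*).
    With gamma_k = 0 and t_k = 1 this reduces to standard BFGS.
    """
    y_star = y + gamma_k * s        # modified quasi-Newton vector y_k*
    Bs = B @ s
    sBs = s @ Bs                    # curvature s^T B s along the step
    sy = s @ y_star                 # denominator s^T y_k*
    return B - np.outer(Bs, Bs) / sBs + t_k * np.outer(y_star, y_star) / sy
```

With gamma_k = 0 and t_k = 1 the updated matrix satisfies the quasi-Newton (secant) equation B+ s = y exactly, which is an easy sanity check for an implementation.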
Further, if the Hessian matrix G(x) of the objective function is locally Lipschitz continuous at x*, and t_k = 1 with γ_k = K‖s_k‖^b, then the modified BFGS algorithm of this thesis converges superlinearly, where K and b ≥ 1 are constants. The algorithm uses only a simple inexact line search based on the Armijo backtracking criterion. Numerical results on a library of unconstrained optimization test problems show that the algorithm has good numerical stability. Compared with the standard BFGS algorithm, it has certain advantages on ill-conditioned problems. The results also show that t_k = 1 is a prerequisite for ensuring that the step length tends to 1.
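The inexact line search mentioned above can be sketched as a standard Armijo backtracking loop. This is a generic sketch of the Armijo criterion, not the thesis's exact procedure; the default values alpha0, rho, and c are conventional choices, not parameters taken from the source:

```python
import numpy as np

def armijo_backtracking(f, grad_f, x, d, alpha0=1.0, rho=0.5, c=1e-4):
    """Inexact line search by backtracking under the Armijo criterion.

    Starting from the trial step alpha0 (1.0, so that unit steps are
    accepted when possible, as superlinear convergence requires),
    shrink alpha by the factor rho until the sufficient-decrease
    condition  f(x + alpha d) <= f(x) + c * alpha * grad_f(x)^T d
    holds. The direction d must be a descent direction.
    """
    fx = f(x)
    slope = grad_f(x) @ d       # directional derivative, negative for descent
    alpha = alpha0
    while f(x + alpha * d) > fx + c * alpha * slope:
        alpha *= rho
    return alpha
```

In a quasi-Newton iteration, d would be the solution of B_k d = -∇f(x_k); accepting the unit step alpha = 1 near the solution is exactly the behavior the condition t_k = 1 is meant to guarantee.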