Quasi-Newton methods are extensively used in optimization and are based on the quasi-Newton equation. The traditional quasi-Newton equation uses only the gradient information of the objective function; clearly, this wastes available information. In this paper, we present a class of new generalized quasi-Newton algorithms for unconstrained optimization, obtained by combining the pseudo-quasi-Newton algorithm with the quasi-Newton algorithm. The new algorithms are very general, including the generalized quasi-Newton algorithms in Jiao's paper and in Zhang's, and even the Broyden family. Their search directions are conjugate, and they terminate in at most n steps when the objective function is quadratic. The new methods are invariant under linear transformations. The global convergence and superlinear convergence of the new algorithms with the Wolfe line search are also proved under weak conditions. Numerical experiments indicate that the new algorithms are feasible and effective. Interestingly, we also confirm that the BFGS method remains one of the best methods to date.
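To fix ideas, the following is a minimal sketch of the classical BFGS method referenced above, not the paper's generalized algorithm. It uses a simple Armijo backtracking line search in place of the Wolfe line search for brevity; the update enforces the standard quasi-Newton (secant) equation H_new y = s on the inverse-Hessian approximation.

```python
import numpy as np

def bfgs(f, grad, x0, tol=1e-8, max_iter=100):
    """Minimize f with the classical BFGS update and an Armijo
    backtracking line search (a simplification of the Wolfe search).
    Illustrative sketch only, not the paper's generalized method."""
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    H = np.eye(n)            # inverse-Hessian approximation
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g           # quasi-Newton search direction
        t, c = 1.0, 1e-4     # backtracking until Armijo condition holds
        while f(x + t * p) > f(x) + c * t * (g @ p):
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g        # the secant equation requires H_new @ y = s
        sy = s @ y
        if sy > 1e-12:       # skip update if curvature condition fails
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Quadratic test problem f(x) = 0.5 x^T A x - b^T x with A positive definite,
# where the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = bfgs(f, grad, np.zeros(2))
print(np.allclose(A @ x_star, b))  # True: the gradient vanishes at the solution
```

On a quadratic objective such as this one, the iterates recover the minimizer in a small number of steps, which is the finite-termination behavior the abstract claims for the generalized family.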