Based on fixed step-length and non-monotone techniques, we present two supermemory gradient methods for unconstrained optimization problems, denoted Algorithm A and Algorithm B. In Algorithm A, we combine the fixed step-length technique with a backtracking line search, thereby reducing the number of iterations and function evaluations. In Algorithm B, we improve the traditional non-monotone technique to make it more suitable for solving large-scale optimization problems. Under certain conditions, both methods are globally convergent. Numerical results show that they are effective and feasible.
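As an illustration only (not the authors' exact algorithms, whose details are not given in this abstract), a steepest-descent method with a non-monotone Armijo backtracking line search of the classical Grippo–Lampariello–Lucidi type can be sketched as follows; all function names and parameter values here are hypothetical:

```python
import numpy as np

def nonmonotone_backtracking(f, grad, x0, max_iter=200, m=5,
                             delta=1e-4, rho=0.5, tol=1e-8):
    """Gradient descent with a non-monotone Armijo backtracking line
    search: the sufficient-decrease test compares against the maximum
    of the last m function values instead of only f(x_k)."""
    x = x0.astype(float)
    history = [f(x)]                 # recent function values
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -g                       # steepest-descent direction
        f_ref = max(history[-m:])    # non-monotone reference value
        alpha = 1.0
        # backtrack until the non-monotone Armijo condition holds:
        # f(x + alpha*d) <= max_{j<=m} f(x_{k-j}) + delta*alpha*g^T d
        while f(x + alpha * d) > f_ref + delta * alpha * g.dot(d):
            alpha *= rho
        x = x + alpha * d
        history.append(f(x))
    return x

# usage: minimize a simple convex quadratic f(x) = ||x||^2
f = lambda x: x.dot(x)
grad = lambda x: 2.0 * x
x_star = nonmonotone_backtracking(f, grad, np.array([3.0, -4.0]))
```

Allowing the objective to rise temporarily (relative to the recent maximum) avoids the excessive backtracking that a strictly monotone Armijo test can cause, which is the motivation the abstract gives for the non-monotone approach on large-scale problems.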