In this thesis, we presented a super memory gradient method for solving unconstrained optimization problems (Algorithm ?) and a derivative-free memory gradient method for solving monotone nonlinear equations (Algorithm ?). In the first algorithm, we modified the non-monotone line search technique so that the search direction at each iteration always provides a sufficient descent step, which makes the algorithm more stable. In the second algorithm, we proposed an improved line search technique that relaxes the requirements on the selection of initial points, making the algorithm better suited to solving large-scale equations. Under appropriate conditions, both algorithms are globally convergent and converge R-linearly to x*. The numerical results show that both algorithms are effective.
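To illustrate the general framework behind derivative-free methods for monotone nonlinear equations, the following is a minimal sketch of a projection-type scheme with a derivative-free backtracking line search. It is a generic illustration, not the thesis's Algorithm: the function names, parameter values, and the simple steepest-descent-like direction are assumptions (a memory gradient method would additionally combine previous search directions).

```python
import numpy as np

def df_projection_method(F, x0, sigma=1e-4, beta=0.5, tol=1e-8, max_iter=1000):
    """Illustrative derivative-free projection method for monotone F(x) = 0.

    Uses only function values of F (no derivatives): a backtracking line
    search finds a trial point z, and the next iterate is the projection
    of x onto the hyperplane {y : F(z)^T (y - z) = 0} separating x from
    the solution set (valid because F is monotone).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) <= tol:
            return x
        d = -Fx  # simple direction; memory gradient variants also use past directions
        # Derivative-free line search: find t with -F(x + t*d)^T d >= sigma*t*||d||^2
        t = 1.0
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= beta
            if t < 1e-12:
                break
        z = x + t * d
        Fz = F(z)
        denom = Fz @ Fz
        if denom == 0.0:
            return z
        # Hyperplane projection step
        x = x - ((Fz @ (x - z)) / denom) * Fz
    return x
```

For example, with the monotone map F(x) = x, the iterates contract toward the unique zero at the origin, so the method stops once the residual norm falls below the tolerance.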