In this thesis, we presented a supermemory gradient method for solving unconstrained optimization problems and a derivative-free memory gradient method for solving monotone nonlinear equations, denoted Algorithm I and Algorithm II, respectively. In Algorithm I, we modified the non-monotone line search technique so that the search direction at each iteration always provides a sufficient descent step, which makes Algorithm I more stable. In Algorithm II, we proposed an improved line search technique that weakens the requirement on the selection of the initial point, so Algorithm II is better suited to solving large-scale systems of equations. Under appropriate conditions, both algorithms are globally convergent and converge R-linearly to the solution x*. The numerical results show that both algorithms are effective.
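To make the line-search idea concrete, the sketch below shows the classical Grippo-type non-monotone Armijo condition on which such modifications are typically built: a trial step is accepted as long as it improves on the maximum of the last few function values rather than on the current value alone. This is a minimal generic sketch, not the thesis's Algorithm I; the function names and parameter values (delta, rho, the memory length of 5) are illustrative assumptions.

import numpy as np

def nonmonotone_armijo(f, grad_f, x, d, f_history,
                       delta=1e-4, rho=0.5, max_backtracks=50):
    """Backtracking search: accept alpha once
    f(x + alpha*d) <= max(recent f-values) + delta*alpha*grad_f(x)^T d."""
    f_ref = max(f_history)            # non-monotone reference value
    g_dot_d = grad_f(x) @ d           # directional derivative; d must be a descent direction
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + delta * alpha * g_dot_d:
            return alpha
        alpha *= rho                  # backtrack
    return alpha                      # fall back to the last trial step

# Tiny usage example on the convex quadratic f(x) = 0.5*||x||^2.
f = lambda x: 0.5 * float(x @ x)
grad_f = lambda x: x
x = np.array([3.0, -4.0])
history = [f(x)]                      # keep the last few function values
for _ in range(20):
    d = -grad_f(x)                    # steepest descent here; a memory gradient
                                      # method would also blend in past directions
    alpha = nonmonotone_armijo(f, grad_f, x, d, history[-5:])
    x = x + alpha * d
    history.append(f(x))
print(x)                              # approaches the minimizer [0, 0]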