Optimization methods are an important part of operations research, and gradient-related memory methods are an important topic in unconstrained optimization. In this thesis, we present three gradient-related memory methods for unconstrained optimization that are globally convergent under mild conditions. The thesis consists of three chapters.

Chapter 1 is the introduction, which reviews the conjugate gradient method, gradient-related memory methods, and the main results obtained in this thesis.

In Chapter 2, we present a class of nonlinear conjugate gradient methods for unconstrained optimization problems and establish their convergence properties. Furthermore, numerical examples are given to illustrate the effectiveness of these methods.

In Chapter 3, we present two gradient-related memory methods for unconstrained optimization problems and prove a global convergence theorem under mild conditions.
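The abstract does not specify which update formula or line search the thesis uses, so as a concrete illustration only, a minimal sketch of a standard nonlinear conjugate gradient method (here the classical Fletcher–Reeves update with an Armijo backtracking line search, not necessarily the variant studied in the thesis) might look like:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear conjugate gradient with the Fletcher-Reeves update.

    Illustrative sketch only; the methods proposed in the thesis may
    use a different conjugacy parameter or line search.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search along d
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Fletcher-Reeves conjugacy parameter
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = x^T x, minimizer at 0
f = lambda x: x.dot(x)
grad = lambda x: 2 * x
x_star = nonlinear_cg(f, grad, np.array([3.0, -4.0]))
```

On this quadratic test problem the iterates converge to the unique minimizer at the origin; global convergence for general smooth objectives is exactly the kind of property established, under mild conditions, in Chapters 2 and 3.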