The nonlinear conjugate gradient method is an important class of methods for solving unconstrained optimization problems, especially large-scale problems. It is widely used because of its simplicity, small storage requirements, and fast convergence. Many practical problems, such as oil exploration, atmospheric modeling, and aerospace engineering, are large-scale optimization problems that are often solved by conjugate gradient methods.

The trust region method is another important class of methods for solving unconstrained optimization problems. Trust region methods were introduced in the 1970s and have since been widely studied in the nonlinear optimization community, where they remain an active research topic. Trust region methods have two advantages: strong stability and global convergence.

This thesis further investigates the conjugate gradient method and the trust region method for solving unconstrained optimization problems without convexity assumptions on the objective function. The two main results of this thesis are as follows.

Firstly, we present two classes of nonlinear conjugate gradient methods for solving unconstrained optimization problems and prove their global convergence under the strong Wolfe line search. Numerical results show that the two classes of methods are efficient.

Secondly, we give a new trust region method for solving unconstrained optimization problems, based on the quadratic model and a fixed-step modified BFGS formula, and prove its global convergence properties.
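To illustrate the general scheme the thesis builds on, the following is a minimal sketch of a nonlinear conjugate gradient iteration with a strong Wolfe line search. It uses the standard PR+ (Polak-Ribière-plus) update as a stand-in; the thesis's two new classes of methods are not reproduced here, and the function names `cg_pr_plus`, `f`, and `grad` are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import line_search  # enforces the (strong) Wolfe conditions

def cg_pr_plus(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the PR+ update and a Wolfe line search.

    A generic sketch, not the specific methods proposed in the thesis.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                    # line search failed: restart
            alpha, d = 1e-4, -g
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PR+ truncation: beta is set to zero whenever PR would be negative
        beta = max(g_new @ (g_new - g) / (g @ g), 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic, this iteration reduces to linear CG behavior and converges quickly; the PR+ truncation (together with the Wolfe conditions) is one standard way to obtain global convergence without convexity assumptions.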
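For comparison, a basic trust region iteration with a quadratic model can be sketched as follows. This uses the textbook Cauchy-point step and an exact Hessian for simplicity; the thesis's method instead uses a fixed-step modified BFGS formula for the model Hessian, which is not reproduced here, and all names (`trust_region_cauchy`, `hess`, the radius-update constants) are illustrative assumptions.

```python
import numpy as np

def trust_region_cauchy(f, grad, hess, x0, delta0=1.0, tol=1e-6, max_iter=200):
    """Trust region method with a quadratic model and Cauchy-point steps.

    A generic sketch of the trust region framework, not the thesis's method.
    """
    x, delta = np.asarray(x0, dtype=float), delta0
    for _ in range(max_iter):
        g, B = grad(x), hess(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # Cauchy point: minimize the quadratic model along -g within radius delta
        gBg = g @ B @ g
        tau = 1.0 if gBg <= 0 else min(gnorm**3 / (delta * gBg), 1.0)
        p = -(tau * delta / gnorm) * g
        pred = -(g @ p + 0.5 * p @ B @ p)    # model (predicted) reduction
        ared = f(x) - f(x + p)               # actual reduction
        rho = ared / pred if pred > 0 else -1.0
        if rho > 0.25:                       # accept the trial step
            x = x + p
        # Shrink the radius on poor agreement, expand on very good agreement
        if rho < 0.25:
            delta *= 0.25
        elif rho > 0.75:
            delta = min(2.0 * delta, 10.0)
    return x
```

The ratio `rho` between actual and predicted reduction is what gives trust region methods their stability: steps are only accepted, and the radius only enlarged, when the quadratic model agrees with the objective.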