
Nonlinear Conjugate Gradient Methods For Optimization Problems

Posted on: 2007-07-21
Degree: Doctor
Type: Dissertation
Country: China
Candidate: L Zhang
Full Text: PDF
GTID: 1100360185465946
Subject: Applied Mathematics
Abstract/Summary:
In this dissertation, we propose some new nonlinear conjugate gradient methods for solving unconstrained and box-constrained optimization problems. They are modifications of existing well-known conjugate gradient methods. We establish a global convergence theory for the proposed methods and report extensive numerical results.

In Chapters 2-4 we propose three modified nonlinear conjugate gradient methods, which we call the MFR method, the MPRP method, and the MHS method, respectively. An important property of these modified methods is that at each iteration they generate a sufficient descent direction d_k satisfying d_k^T g_k = -||g_k||^2. This property is independent of the line search used. Moreover, with an exact line search the MFR, MPRP, and MHS methods reduce to the standard FR, PRP, and HS methods, respectively. Consequently, when applied to minimize a strictly convex quadratic function, the proposed methods terminate at the solution in finitely many iterations.

Under mild conditions, we prove that the MFR method with a standard Armijo or Wolfe line search converges globally for nonconvex functions. Moreover, we propose a modified Armijo-type line search and establish a global convergence theory for the MPRP method with this line search.

To improve the performance of conjugate gradient methods, we also propose a strategy for choosing the initial steplength. Our numerical results show that this strategy does have some advantages: for most problems the initial steplength is essentially accepted.

To ensure global convergence of the MHS method, we also introduce two modified MHS methods, called the MMHS method and the CMHS method, respectively. These two methods still retain the property g_k^T d_k = -||g_k||^2. Under appropriate conditions, we prove that both the MMHS and CMHS methods with an Armijo or Wolfe line search converge globally for nonconvex minimization.
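As a minimal sketch of the kind of method described above, the following code combines a three-term PRP-type search direction with a backtracking Armijo line search. The abstract only states the descent property d_k^T g_k = -||g_k||^2, so the specific direction formula here (a three-term modification of PRP that satisfies this identity for any inputs) is an assumption drawn from the published literature on such methods, not necessarily the dissertation's exact scheme.

```python
import numpy as np

def mprp_direction(g, g_prev, d_prev):
    # Three-term PRP-type direction (assumed form, not taken from the
    # abstract). By construction the cross terms cancel, so the returned
    # direction d satisfies g @ d == -(g @ g) exactly, for any inputs.
    y = g - g_prev
    denom = g_prev @ g_prev
    beta = (g @ y) / denom
    theta = (g @ d_prev) / denom
    return -g + beta * d_prev - theta * y

def armijo_step(f, x, d, g, s=1.0, rho=0.5, sigma=1e-4):
    # Standard backtracking Armijo line search: shrink the trial step
    # until a sufficient-decrease condition holds.
    t = s
    fx = f(x)
    while f(x + t * d) > fx + sigma * t * (g @ d):
        t *= rho
    return t

def minimize(f, grad, x0, tol=1e-6, max_iter=500):
    # Conjugate-gradient-type iteration using the direction above.
    x = x0.astype(float)
    g = grad(x)
    d = -g                      # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = armijo_step(f, x, d, g)
        x_new = x + t * d
        g_new = grad(x_new)
        d = mprp_direction(g_new, g, d)
        x, g = x_new, g_new
    return x
```

For example, minimizing the strictly convex quadratic f(x) = 0.5 x^T A x - b^T x with A = diag(1, 2, 3) recovers the solution of A x = b. Note that the descent identity holds regardless of the line search, which is the point emphasized in the abstract.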
We also test these methods on many unconstrained problems from the CUTE library. The extensive numerical results show that the MPRP, MMHS, and CMHS methods perform very well; they are as good as the CG_DESCENT method.

In Chapter 5, we introduce a cautious control rule into the DY method and propose a hybrid method that combines the steepest descent method with the DY method. We show that the hybrid method is also a descent method. Under mild conditions, we prove that the hybrid method with a standard Armijo line search converges globally for nonconvex minimization.
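A hybrid of this flavor can be sketched as follows: compute the DY direction, but fall back to steepest descent whenever a descent test fails. The specific fallback rule below (reject the DY direction unless g^T d <= -c ||g||^2 for a small constant c) is a hypothetical stand-in; the dissertation's actual cautious control rule is not given in the abstract.

```python
import numpy as np

def hybrid_direction(g, g_prev, d_prev, c=0.1):
    # DY-type direction with a cautious fallback to steepest descent.
    # The test `g @ d >= -c * (g @ g)` is an assumed control rule, not
    # the dissertation's exact condition.
    denom = d_prev @ (g - g_prev)
    if abs(denom) < 1e-12:
        return -g                       # degenerate denominator: fall back
    beta = (g @ g) / denom              # standard DY beta
    d = -g + beta * d_prev
    if g @ d >= -c * (g @ g):           # not sufficiently descent
        return -g                       # fall back to steepest descent
    return d
```

Either branch yields a direction with g^T d <= -c ||g||^2 (for c <= 1), so the hybrid iteration is a descent method by construction, mirroring the property claimed for the method in Chapter 5.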
Keywords/Search Tags: Nonlinear conjugate gradient methods, Line search, Sufficient descent direction, Global convergence