
Nonlinear Conjugate Gradient Methods with Sufficient Descent

Posted on: 2010-05-20  Degree: Master  Type: Thesis
Country: China  Candidate: L J Shen  Full Text: PDF
GTID: 2190360278976269  Subject: Applied Mathematics
Abstract/Summary:
Optimization is a discipline widely applied to finding the best choice in decision making and to seeking the best solution to a problem. Although the practice of optimization can be traced back to classical extreme-value problems, it became an independent discipline in 1947, after Dantzig put forward the simplex method for linear programming. More recently, with the development of systems science and the wide application of computers, optimization has become an active subject applied, directly or indirectly, in engineering, national defence, economics, management, and several branches of mathematics.

The nonlinear conjugate gradient method, one such optimization method, is simple, easily implemented, and especially suitable for solving large-scale unconstrained optimization problems. In this thesis, some new nonlinear conjugate gradient methods are proposed. They are modifications of well-known existing conjugate gradient methods. We establish the global convergence theory for the proposed methods and report extensive numerical results.

In Chapter 1, the development of optimization and some optimality conditions frequently used to identify an optimal solution are introduced. Several kinds of derivative-based descent methods for unconstrained optimization are also reviewed.

In Chapter 2, the background, existing results, and the current state of research on this method are introduced.

In Chapters 3 and 4, two modified nonlinear conjugate gradient methods are proposed: a modified FR (Fletcher–Reeves) method and a modified HS (Hestenes–Stiefel) method, respectively. An important property of these modified methods is that they generate a sufficient descent direction at each iteration, that is, d_k satisfies g_k^T d_k = -||g_k||^2. This property is independent of the line search. Moreover, when an exact line search is used, the modified FR method and the modified HS method reduce to the standard FR method and HS method, respectively.
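The abstract does not give the exact update formulas of the modified methods, but the stated identity g_k^T d_k = -||g_k||^2, holding for any line search and collapsing to the standard method under exact line search, can be illustrated with a common three-term construction. The sketch below (an illustrative assumption, not necessarily the thesis's formula) applies it to the FR coefficient:

```python
import numpy as np

def beta_fr(g_new, g_old):
    """Classical Fletcher-Reeves coefficient: ||g_k||^2 / ||g_{k-1}||^2."""
    return g_new.dot(g_new) / g_old.dot(g_old)

def sufficient_descent_direction(g, d_prev, beta):
    """Three-term direction d_k = -g_k + beta*d_{k-1}
    - beta*(g_k^T d_{k-1} / ||g_k||^2) * g_k.
    The last term cancels beta * g_k^T d_{k-1}, so
    g_k^T d_k = -||g_k||^2 holds for ANY beta and any line search.
    Under exact line search g_k^T d_{k-1} = 0, and the formula
    reduces to the standard two-term update -g_k + beta*d_{k-1}."""
    return -g + beta * d_prev - beta * (g.dot(d_prev) / g.dot(g)) * g

# Verify the sufficient descent identity on random data.
rng = np.random.default_rng(0)
g_old = rng.standard_normal(5)
g = rng.standard_normal(5)
d_prev = rng.standard_normal(5)
d = sufficient_descent_direction(g, d_prev, beta_fr(g, g_old))
print(np.isclose(g.dot(d), -g.dot(g)))  # True: g_k^T d_k = -||g_k||^2
```

Because the identity holds by construction, d_k is a descent direction regardless of how the step length is chosen, which is exactly the property the convergence analysis relies on.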
Consequently, when they are applied to minimize a strictly convex quadratic function, the proposed methods terminate at the solution in finitely many steps. Under some mild conditions, we prove that the modified conjugate gradient methods with the standard Armijo line search converge globally for nonconvex functions. Moreover, in Chapter 4 we prove that the modified HS method converges globally under a nonmonotone line search. Finally, we carry out numerical experiments; the results show that our algorithms converge well, are efficient, and are especially suitable for solving large-scale unconstrained optimization problems.

In Chapter 5, the global convergence of the modified FR and modified HS methods with a fixed step-length is proved.
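The combination described here, a sufficient-descent conjugate gradient direction driven by the standard Armijo backtracking line search, can be sketched on a strictly convex quadratic. The three-term FR-type direction below is an illustrative construction satisfying g_k^T d_k = -||g_k||^2, not necessarily the thesis's exact method; the parameter values (sigma, rho) are conventional defaults, also assumptions:

```python
import numpy as np

def armijo(f, g_f, x, d, s=1.0, sigma=1e-4, rho=0.5, max_back=50):
    """Standard Armijo backtracking: find t = s * rho^j with
    f(x + t*d) <= f(x) + sigma * t * g(x)^T d."""
    fx, gTd = f(x), g_f(x).dot(d)
    t = s
    for _ in range(max_back):
        if f(x + t * d) <= fx + sigma * t * gTd:
            break
        t *= rho
    return t

# Strictly convex quadratic f(x) = 0.5 x^T A x - b^T x, minimizer A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A @ x) - b.dot(x)
g_f = lambda x: A @ x - b

x = np.zeros(2)
g = g_f(x)
d = -g                                  # first direction: steepest descent
for _ in range(500):
    if np.linalg.norm(g) < 1e-9:        # stop on a small gradient
        break
    t = armijo(f, g_f, x, d)
    x = x + t * d
    g_new = g_f(x)
    beta = g_new.dot(g_new) / g.dot(g)  # FR coefficient
    # Three-term update keeping g^T d = -||g||^2 (illustrative form).
    d = -g_new + beta * d - beta * (g_new.dot(d) / g_new.dot(g_new)) * g_new
    g = g_new
print(np.linalg.norm(A @ x - b) < 1e-6)  # True: converged to the minimizer
```

Because every direction satisfies the sufficient descent condition, the Armijo test always admits a positive step, which is the mechanism behind the global convergence results stated above.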
Keywords/Search Tags: Nonlinear Conjugate Gradient Method, Line Search, Sufficient Descent Direction, Global Convergence