
Conjugate Gradient Methods For Nonlinear Symmetric Equations

Posted on: 2011-05-31
Degree: Master
Type: Thesis
Country: China
Candidate: J Wu
Full Text: PDF
GTID: 2120360308468553
Subject: Computational Mathematics
Abstract/Summary:
Since it was originated by the famous French mathematician Cauchy in 1847, the steepest descent method has been a basic iterative method for solving unconstrained optimization problems. Because it uses the negative gradient as the search direction, it is also called the gradient method. The idea of the conjugate gradient method was originated by Hestenes and Stiefel in 1952 for solving linear equations, and was later extended to solve unconstrained optimization problems. Compared with the Newton method and quasi-Newton methods, conjugate gradient methods enjoy some nice properties such as low storage, fast convergence, and quadratic termination. They have become one of the most popular iterative methods for solving large-scale optimization problems. So far, however, research on the steepest descent method and conjugate gradient methods for solving systems of nonlinear equations is very limited. One of the major reasons is that the direction generated by these methods is generally not a descent direction for the norm function of the equations. As a result, these methods cannot be extended directly to solve systems of nonlinear equations.

Gu, Li, Qi and Zhou (2003) proposed a quasi-Newton method for solving symmetric nonlinear equations. The method is derivative-free, yet it is a descent method: the generated sequence of norm function values is decreasing. Under appropriate conditions, the method is proved to be globally and superlinearly convergent. Motivated by that method, in this paper we propose two iterative methods for solving symmetric nonlinear equations, which we call the approximate steepest descent method and the approximate modified PRP method, respectively. These two methods possess some nice properties: 1. They generate descent directions for the norm function of the equations without computing derivatives; 2. The generated sequence of norm function values is decreasing; 3. Under mild conditions, the methods are globally convergent. Due to their low storage requirement, the proposed methods can be applied to large-scale symmetric nonlinear equations. We also report numerical experiments on the proposed methods. The results show that they are quite efficient for solving large-scale symmetric nonlinear equations.
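To illustrate the derivative-free descent idea behind such methods, the following is a minimal sketch (not the thesis's exact algorithms). It uses the fact that for a symmetric Jacobian J(x), the gradient of the merit function f(x) = ½‖F(x)‖² is J(x)ᵀF(x) = J(x)F(x), which can be approximated by a finite difference of F along the direction F(x), so no derivatives are computed. The test problem, step sizes, and tolerances are illustrative choices, not taken from the source.

```python
import numpy as np

def approx_grad(F, x, eps=1e-6):
    # For symmetric J(x), grad f(x) = J(x) F(x); use the directional
    # finite difference J(x) v ~ (F(x + eps*v) - F(x)) / eps with v = F(x).
    Fx = F(x)
    return (F(x + eps * Fx) - Fx) / eps

def approx_steepest_descent(F, x0, tol=1e-8, max_iter=500):
    # Minimize f(x) = 0.5*||F(x)||^2 along the derivative-free
    # approximate steepest descent direction, with backtracking so that
    # the sequence of norm function values is decreasing.
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        d = -approx_grad(F, x)                  # descent direction
        t, f0 = 1.0, 0.5 * Fx @ Fx
        # Armijo-type backtracking; since d ~ -grad f, grad f . d ~ -||d||^2.
        while 0.5 * np.sum(F(x + t * d) ** 2) > f0 - 1e-4 * t * (d @ d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x

# Hypothetical symmetric test problem: F(x) = A x + x^3 - b with A symmetric,
# so the Jacobian A + diag(3 x^2) is symmetric as required.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
F = lambda x: A @ x + x ** 3 - b
x = approx_steepest_descent(F, np.zeros(2))
```

Only function values of F are evaluated, and each accepted step reduces ‖F‖, matching the two properties the abstract emphasizes: descent without derivatives and a decreasing norm-function sequence.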
Keywords/Search Tags: Symmetric nonlinear equations, Derivative-free methods, Global convergence