
Some Modified Nonlinear Conjugate Gradient Methods

Posted on: 2015-04-02    Degree: Master    Type: Thesis
Country: China    Candidate: Y M Qin    Full Text: PDF
GTID: 2180330431978875    Subject: Applied Mathematics
Abstract/Summary:
Nonlinear conjugate gradient methods are an important class of unconstrained optimization methods. Because of their simplicity, low storage requirements, and ease of implementation, they are often used to solve large-scale optimization problems. In this paper, we present some modified nonlinear conjugate gradient methods and prove their sufficient descent property and global convergence under suitable assumptions. Numerical results show that the proposed algorithms are efficient.

This paper is organized as follows.

In Chapter 1, the research status of some well-known nonlinear conjugate gradient methods is reviewed.

In Chapter 2, a modified conjugate gradient method is presented on the basis of an improved PRP method given by Zheng Li. The modified method possesses the sufficient descent property independently of the line search used. Its global convergence is proved under the Wolfe line search conditions. Numerical results show that the algorithm given in this chapter is efficient.

In Chapter 3, motivated by the work of Cheng Wanyou et al., a class of modified unconstrained optimization methods is obtained by modifying the search direction of the conjugate gradient method. The modified methods possess the sufficient descent property independently of the line search and of the conjugate gradient parameter. When the conjugate gradient parameter takes the form of the YT and YT+ methods, the corresponding modified YT and modified YT+ methods are obtained. Global convergence is proved for the modified YT method under the Armijo line search for uniformly convex functions, and for the modified YT+ method under the strong Wolfe line search for general nonlinear functions. Numerical results show that the modified algorithms are efficient.

In Chapter 4, motivated by the work of Yu Gaohang et al. and Chen Jihong et al., two new modified conjugate gradient methods are presented. It is proved that the modified methods possess the sufficient descent property and are globally convergent under the strong Wolfe line search. Numerical results show that the modified algorithms are efficient.
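For context on the ingredients named in the abstract: the PRP family of conjugate gradient methods and descent-guaranteeing line searches can be sketched as follows. This is a generic textbook-style illustration of a PRP+ method with Armijo backtracking, not the thesis's specific modified methods; the function name `prp_plus_cg` and the parameter values are illustrative assumptions.

```python
import numpy as np

def prp_plus_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic nonlinear conjugate gradient sketch with the PRP+ update
    beta_k = max(0, g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2)
    and an Armijo backtracking line search (illustrative, not the thesis's method)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:  # restart if d is not a descent direction
            d = -g
        # Armijo backtracking: shrink alpha until sufficient decrease holds
        alpha, rho, c = 1.0, 0.5, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP+ conjugate gradient parameter (truncated at zero)
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = x^T A x / 2 - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A @ x) - b.dot(x)
grad = lambda x: A @ x - b
x_star = prp_plus_cg(f, grad, np.zeros(2))
```

The restart test `g.dot(d) >= 0` is one simple way to guarantee a descent direction at every step; the sufficient descent property studied in the thesis is a stronger, built-in guarantee that makes such restarts unnecessary.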
Keywords/Search Tags: Unconstrained Optimization, Conjugate Gradient Method, Line Search, Sufficient Descent Property, Global Convergence