
Conjugate Gradient Algorithms Based On The Conjugate Gradient Parameter Of Wei Et Al.

Posted on: 2016-08-12
Degree: Master
Type: Thesis
Country: China
Candidate: W Y Ma
Full Text: PDF
GTID: 2180330461461762
Subject: Computational Mathematics
Abstract/Summary:
In this thesis, we present some modified conjugate gradient algorithms based on the conjugate gradient parameter of Wei et al., establish convergence results for the algorithms, and test their effectiveness through extensive numerical experiments.

Chapter 1 is the introduction of this thesis. It introduces nonlinear conjugate gradient methods, the context of the thesis, the main results obtained, and some important lemmas and assumptions.

In Chapter 2, we propose four modified nonlinear conjugate gradient methods based on the conjugate gradient parameter of Wei et al., called the NVLS, NVPRP*, NVHS* and NVLS* methods, respectively. The sufficient descent property and global convergence are proved under the strong Wolfe line search conditions: for the NVLS method with 0 < σ < 1, for the NVPRP* method with 0 < σ < 1/4, for the NVHS* method with 0 < σ < 1/3, and for the NVLS* method with 0 < σ < 1/2. Numerical results show that the NVPRP*, NVHS* and NVLS* methods are more efficient than the NVPRP, NVHS and NVLS methods, respectively.

In Chapter 3, we propose a class of conjugate gradient methods with double parameters, which can be regarded as a convex combination of the NVPRP*, NVHS* and NVLS* methods proposed in Chapter 2. The sufficient descent property and global convergence of the THCG* method are proved under the strong Wolfe line search conditions with σ ∈ (0, (μ1+μ2)/(2+μ2)).
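The abstract does not reproduce the formulas for the Wei et al. parameter or its NVPRP*/NVHS*/NVLS* variants, but the iteration they plug into is the standard nonlinear conjugate gradient scheme with a strong Wolfe line search. The sketch below uses the classical PRP parameter with the common non-negativity safeguard as a stand-in for those modified parameters; the function name `cg_prp`, the safeguard, and the restart rule are illustrative assumptions, not the thesis's methods.

```python
# Minimal sketch of a nonlinear conjugate gradient method with a strong
# Wolfe line search. The PRP parameter below is a stand-in: the thesis's
# NVPRP*/NVHS*/NVLS* methods replace it with modified parameters that
# guarantee sufficient descent under the stated bounds on sigma.
import numpy as np
from scipy.optimize import line_search  # enforces (strong) Wolfe conditions


def cg_prp(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                     # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                      # line search failed: restart
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = g_new @ (g_new - g) / (g @ g)   # PRP parameter (stand-in)
        d = -g_new + max(beta, 0.0) * d        # PRP+ safeguard aids descent
        x, g = x_new, g_new
    return x


# Usage: minimize the 2-D Rosenbrock function from the standard start point
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
x_star = cg_prp(f, grad, np.array([-1.2, 1.0]))
```

Swapping in a different conjugate gradient parameter only changes the single `beta` line; the line search and direction update are shared by all the methods the abstract names.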
Numerical results show that the THCG* method, although slightly less efficient than the NVHS* method, is more efficient than the PRP, NVPRP*, NVLS* and THCG methods.

In Chapter 4, we modify the NVPRP*, NVHS* and NVLS* methods, respectively, and propose the MDPRP*, MDHS* and MDLS* methods. When μ ≥ 0, the sufficient descent property and global convergence are proved under the strong Wolfe line search conditions: for the MDPRP* method with 0 < σ < 1/4, for the MDHS* method with 0 < σ < 1/3, and for the MDLS* method with 0 < σ < 1/2. Moreover, we show the global convergence of the MDPRP*, MDHS* and MDLS* methods under the Wolfe line search, respectively. Numerical results show that the MDPRP*, MDHS* and MDLS* methods are more efficient than the NVPRP*, NVHS* and NVLS* methods, respectively.

In Chapter 5, we propose a modified Dai-Liao conjugate gradient method, which we call the MDL* method. The sufficient descent property and global convergence of the MDL* method are proved under the strong Wolfe line search conditions with 0 < σ < 1/3. Numerical results show that the MDL* method is more efficient than the PRP, DL, MDL, NVHS* and DY methods.
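The double-parameter class of Chapter 3 combines several conjugate gradient parameters through a convex combination. Since the exact NVPRP*, NVHS* and NVLS* formulas are not stated in the abstract, the sketch below substitutes the classical PRP, HS and LS parameters to show the structure of such a hybrid; the name `beta_hybrid` and this particular pairing of weights with formulas are illustrative assumptions.

```python
# Hypothetical sketch of a two-parameter convex-combination CG parameter
# in the spirit of Chapter 3: beta is a convex combination of three
# underlying parameters, with weights mu1, mu2 >= 0 and mu1 + mu2 <= 1.
# The classical PRP, HS and LS formulas stand in for the thesis's
# NVPRP*, NVHS* and NVLS* parameters, which the abstract does not give.
import numpy as np


def beta_hybrid(g_new, g, d, mu1, mu2):
    assert mu1 >= 0 and mu2 >= 0 and mu1 + mu2 <= 1
    y = g_new - g                         # gradient difference y_k
    beta_prp = (g_new @ y) / (g @ g)      # Polak-Ribiere-Polyak
    beta_hs = (g_new @ y) / (d @ y)       # Hestenes-Stiefel
    beta_ls = (g_new @ y) / -(g @ d)      # Liu-Storey
    return mu1 * beta_prp + mu2 * beta_hs + (1 - mu1 - mu2) * beta_ls


# With mu1 = 1, mu2 = 0 the hybrid reduces to the PRP stand-in alone
g = np.array([1.0, 2.0])
g_new = np.array([3.0, 1.0])
d = np.array([-1.0, -1.0])
b = beta_hybrid(g_new, g, d, 1.0, 0.0)
```

A convex combination inherits boundedness properties from its extreme points, which is the structural reason results like the σ ∈ (0, (μ1+μ2)/(2+μ2)) bound can interpolate between the bounds of the individual methods.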
Keywords/Search Tags: inexact line search, conjugate gradient method, sufficient descent property, global convergence