
The Conjugate Gradient Method Under Wolfe Line Search

Posted on: 2017-03-25
Degree: Master
Type: Thesis
Country: China
Candidate: Z Guan
Full Text: PDF
GTID: 2310330485956809
Subject: Applied Mathematics
Abstract/Summary:
The conjugate gradient method is an important part of optimization theory and an effective mathematical tool for solving unconstrained optimization problems. Because it requires no second-order partial derivatives, converges quickly, needs little storage, and is simple to implement, it is especially well suited to large-scale optimization problems. Improving the convergence properties and numerical performance of conjugate gradient methods is a focus of current research. This thesis studies conjugate gradient methods under the Wolfe line search conditions. The main contents and results are as follows:

1. Through a study of the classical conjugate gradient methods, a new formula for the algorithm is derived step by step, and an improved line search condition is deduced on the basis of the Wolfe line search conditions. Under the new search condition, the descent property and global convergence of the algorithm are proved.

2. The MHS algorithm, the NHS algorithm, and hybrid conjugate gradient methods are studied; a modified HS algorithm and a new hybrid conjugate gradient method are proposed. The new algorithms possess global convergence and the sufficient descent property, and numerical experiments show that they are feasible and effective.
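To illustrate the framework the thesis builds on, the following is a minimal sketch (not the thesis's own modified algorithm) of a nonlinear conjugate gradient method with the classical Hestenes-Stiefel (HS) update, using a step length that satisfies the weak Wolfe conditions: the sufficient-decrease condition f(x + ad) <= f(x) + c1*a*g(x)'d and the curvature condition g(x + ad)'d >= c2*g(x)'d, with 0 < c1 < c2 < 1. The bisection-style line search and the descent-restart safeguard are standard textbook choices, assumed here for self-containedness.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection/expansion search for a step satisfying the weak Wolfe conditions."""
    alpha, lo, hi = 1.0, 0.0, np.inf
    fx = f(x)
    slope = grad(x) @ d          # directional derivative, assumed negative (descent)
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * slope:
            hi = alpha                       # sufficient decrease fails: shrink
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < c2 * slope:
            lo = alpha                       # curvature fails: grow the step
            alpha = 2.0 * alpha if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha                     # both Wolfe conditions hold
    return alpha

def hs_conjugate_gradient(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the Hestenes-Stiefel beta and Wolfe line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                        # gradient difference
        beta = (g_new @ y) / (d @ y)         # Hestenes-Stiefel formula
        d = -g_new + beta * d
        if g_new @ d >= 0:                   # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic f(x) = x'Ax/2 - b'x the iterates converge to the solution of Ax = b; the modified HS and hybrid methods studied in the thesis aim to retain this behavior while guaranteeing sufficient descent on general nonconvex problems.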
Keywords/Search Tags: Conjugate gradient method, Wolfe line search, Global convergence, Unconstrained optimization, Sufficient descent property