
Study On Nonlinear Conjugate Gradient Algorithms And Their Convergence

Posted on: 2015-11-03    Degree: Master    Type: Thesis
Country: China    Candidate: S M Wang    Full Text: PDF
GTID: 2180330422472106    Subject: Operational Research and Cybernetics
Abstract/Summary:
Optimization is a young and widely applied subject with rich branches, in which new methods emerge constantly. With the development of electronic science and technology, optimization theory and methods are used ever more widely in production, transportation, economics, and other fields, and the conjugate gradient algorithm is a principal method for solving large-scale unconstrained optimization problems. Because the conjugate gradient algorithm is simple, easy to program, and requires little memory, it has become an effective and commonly used method for many practical problems.

After introducing the concept and classification of optimization problems, this paper first briefly discusses the optimality conditions of unconstrained optimization, summarizes numerical algorithms for optimization in terms of step length and search direction, and introduces the relevant notion of conjugacy. Then, surveying the current research on conjugate gradient methods, the paper analyzes hybrid conjugate gradient methods, which have been intensively studied recently, improves the conjugate gradient parameter β_k, constructs a new search direction d_k, and proposes modified hybrid conjugate gradient methods as well as a modified HS conjugate gradient method. The paper not only proves the global convergence of the algorithms under some mild conditions, but also verifies their numerical performance.

This work extends the conjugate gradient method on the basis of earlier studies; the results obtained in the dissertation may be summarized as follows:

1. Starting from hybrid conjugate gradient methods, the author modifies two hybrid conjugate gradient methods whose search direction d_k is guaranteed to be descent, so that the new hybrid methods automatically generate a sufficient descent direction at every iteration, independent of the line search used. Under appropriate conditions and the Wolfe line search, the new methods are proved to be globally convergent.
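To illustrate the kind of method the abstract describes, the following is a minimal sketch of a hybrid nonlinear conjugate gradient iteration on a strictly convex quadratic, using the well-known hybrid parameter β_k = max(0, min(β_HS, β_DY)) and a backtracking Armijo line search. The hybrid choice, the test problem, and the Armijo search are illustrative assumptions; the thesis's own modified methods and its Wolfe line search are not reproduced here.

```python
import numpy as np

def hybrid_cg(A, b, x0, tol=1e-8, max_iter=200):
    """Hybrid conjugate gradient sketch for f(x) = 0.5 x^T A x - b^T x
    (A symmetric positive definite), with the hybrid parameter
    beta = max(0, min(beta_HS, beta_DY)) and a backtracking Armijo
    line search. Illustrative only, not the thesis's exact method."""
    x = x0.copy()
    g = A @ x - b                      # gradient of f at x
    d = -g                             # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # backtracking Armijo line search
        alpha, rho, c = 1.0, 0.5, 1e-4
        f = 0.5 * x @ A @ x - b @ x
        while True:
            x_new = x + alpha * d
            if 0.5 * x_new @ A @ x_new - b @ x_new <= f + c * alpha * (g @ d):
                break
            alpha *= rho
        g_new = A @ x_new - b
        y = g_new - g
        beta_hs = (g_new @ y) / (d @ y)         # Hestenes-Stiefel parameter
        beta_dy = (g_new @ g_new) / (d @ y)     # Dai-Yuan parameter
        beta = max(0.0, min(beta_hs, beta_dy))  # hybrid choice
        d = -g_new + beta * d                   # new search direction
        x, g = x_new, g_new
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 2.0])
print(hybrid_cg(A, b, np.zeros(2)))  # approaches the minimizer [0, 1]
```

Truncating β_k at zero and capping it by the Dai-Yuan value is one standard way hybrid methods keep the direction a descent direction while retaining the fast HS behavior on well-behaved problems.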
Preliminary numerical results show that the methods are efficient.

2. Based on the modified BFGS method, we present a three-term modified HS conjugate gradient method; the new method also possesses the sufficient descent property. The method is globally convergent for general nonlinear functions under different Armijo-type line searches, and its numerical results are verified under some mild conditions.
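The sufficient descent property mentioned for the three-term HS method can be checked numerically. The sketch below uses a known three-term HS-type direction of the form d_new = -g + β_HS d - θ y with θ = (gᵀd)/(dᵀy); this standard variant is an assumption for illustration, not necessarily the thesis's exact formula. By construction, gᵀd_new = -||g||² regardless of the line search, which is exactly the line-search-independent sufficient descent the abstract claims.

```python
import numpy as np

def three_term_hs_direction(g_new, d, y):
    """Three-term HS-type search direction (a known variant, assumed
    here for illustration):
        d_new = -g_new + beta_hs * d - theta * y,
    beta_hs = (g_new . y)/(d . y),  theta = (g_new . d)/(d . y).
    The two trailing terms cancel in g_new . d_new, giving the exact
    identity g_new . d_new = -||g_new||^2 (sufficient descent)."""
    dy = d @ y
    beta_hs = (g_new @ y) / dy
    theta = (g_new @ d) / dy
    return -g_new + beta_hs * d - theta * y

# check the descent identity on arbitrary vectors
rng = np.random.default_rng(0)
g = rng.standard_normal(5)
d = rng.standard_normal(5)
y = rng.standard_normal(5)
d_new = three_term_hs_direction(g, d, y)
print(np.isclose(g @ d_new, -(g @ g)))  # True: identity holds exactly
```

Because the identity is algebraic, the descent condition holds at every iteration without any assumption on the objective function or the step length, which is what allows global convergence proofs under weak, Armijo-type line searches.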
Keywords/Search Tags: unconstrained optimization, nonlinear conjugate gradient method, sufficient descent property, line search, global convergence