Three-term Conjugate Gradient Method Based On Least Square

Posted on: 2018-02-16    Degree: Master    Type: Thesis
Country: China    Candidate: C R Cui    Full Text: PDF
GTID: 2310330518964627    Subject: Operational Research and Cybernetics
Abstract/Summary:
As an important branch of operational research and cybernetics, optimization is widely used in practice, for example in economic management, engineering design, optimal control, and oil exploration. Unconstrained optimization plays a basic and important role in this field. Common methods for unconstrained optimization problems include the steepest descent method, Newton's method, quasi-Newton methods, conjugate gradient methods, and trust region methods. Because of its low storage requirements, the conjugate gradient method is one of the most effective methods for large-scale unconstrained optimization, and within this class the three-term conjugate gradient method is an active research topic.

In this thesis, we study three-term conjugate gradient methods for solving unconstrained optimization problems and nonlinear equations. For large-scale unconstrained optimization, we propose a three-term conjugate gradient method based on the least squares technique: we first study several three-term conjugate gradient methods that are effective for unconstrained optimization, and then approximate them by least squares to derive a new three-term conjugate gradient formula. The resulting algorithm has the following advantages: (1) it has the descent property regardless of the line search, that is, the descent property does not depend on the choice of line search technique; (2) under certain conditions, it is globally convergent; (3) numerical experiments on large-scale unconstrained optimization problems show that the algorithm is efficient.

For nonlinear equations, we propose an improved Polak-Ribiere-Polyak (PRP) conjugate gradient algorithm and prove its global convergence. Since the proposed algorithm has low storage requirements, it can be applied to large-scale nonlinear equations. The numerical results show that the algorithm still performs well even when the dimension of the problem is high.
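The abstract does not give the new least-squares formula itself, but the structure of a three-term conjugate gradient iteration, and the line-search-independent descent property claimed in advantage (1), can be illustrated with a minimal sketch. The sketch below uses a classical three-term PRP direction (the Zhang–Zhou–Li variant, not the thesis's new formula), for which d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k satisfies g_{k+1}^T d_{k+1} = -||g_{k+1}||^2 by construction; the Armijo parameters and the test problem are illustrative choices:

```python
import numpy as np

def three_term_prp(f, grad, x0, tol=1e-8, max_iter=500):
    """Three-term PRP conjugate gradient method (Zhang-Zhou-Li variant).

    With beta_k = g_{k+1}^T y_k / ||g_k||^2 and
    theta_k = g_{k+1}^T d_k / ||g_k||^2, the direction
        d_{k+1} = -g_{k+1} + beta_k d_k - theta_k y_k
    satisfies g_{k+1}^T d_{k+1} = -||g_{k+1}||^2, so it is a descent
    direction independent of the line search, as in advantage (1).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking line search (illustrative parameters)
        alpha, rho, c = 1.0, 0.5, 1e-4
        fx = f(x)
        while f(x + alpha * d) > fx + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g          # gradient difference y_k
        gg = g @ g             # ||g_k||^2
        beta = (g_new @ y) / gg
        theta = (g_new @ d) / gg
        d = -g_new + beta * d - theta * y
        x, g = x_new, g_new
    return x

# Toy test problem: strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = three_term_prp(f, grad, np.zeros(2))
```

On this quadratic the iterate converges to the solution of A x = b; the point of the sketch is that the descent inequality g^T d = -||g||^2 holds at every step by algebra alone, before any line search condition is imposed, which is what makes such three-term schemes attractive for large-scale problems.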
Keywords/Search Tags:unconstrained optimization, nonlinear equations, least squares, three-term conjugate gradient method, global convergence