
The Conjugate Gradient Methods For Large-scale Unconstrained Optimization

Posted on: 2017-03-30
Degree: Master
Type: Thesis
Country: China
Candidate: Y T Chen
Full Text: PDF
GTID: 2180330509955870
Subject: Mathematics
Abstract/Summary:
The conjugate gradient method is an important numerical method for solving general unconstrained optimization problems; it lies between the steepest descent method and Newton's method. It uses only objective function values and gradient values, so it overcomes the slow convergence of the steepest descent method while also avoiding the computation and storage of the Hessian matrix required by Newton's method. Because of these advantages, the conjugate gradient method is considered indispensable for solving large-scale optimization problems and has remained an active research direction. In this thesis, we mainly study a modified LS conjugate gradient method and three-term conjugate gradient methods based on a subspace technique.

In Chapter 2, based on the classical LS conjugate gradient method, we present a modified LS conjugate gradient method. Global convergence of the proposed method is established under the Wolfe line search for strongly convex functions. Moreover, a mixed strategy is incorporated into the proposed method, and the corresponding global convergence is obtained for non-convex problems.

In Chapter 3, combining with the subspace technique, we present a three-term conjugate gradient method for large-scale unconstrained optimization, in which the search direction is determined by minimizing a quadratic approximation of the objective function over the subspace spanned by the negative gradient at the current iterate, the difference between successive iterates, and the difference between successive gradients. Moreover, the search directions satisfy a sufficient descent property that is independent of the line search. Under appropriate assumptions, a global convergence result for the method is established.

In Chapter 4, we minimize a quadratic approximation of the objective function over the subspace spanned by the negative gradient at the current iterate and the last two search directions, and thereby present another three-term conjugate gradient method. The resulting search directions satisfy both the descent condition and the Dai-Liao conjugacy condition. A global convergence result for this method is also established under suitable assumptions.

We perform numerical experiments on all proposed methods and use performance profile plots to compare the numerical results, which show the effectiveness and applicable scope of the proposed methods.
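For reference, the methods above all fit the generic conjugate gradient iteration. The following is a minimal sketch in standard notation (the thesis's specific modification is not given in this abstract): with g_k = \nabla f(x_k),

\[
x_{k+1} = x_k + \alpha_k d_k, \qquad
d_k =
\begin{cases}
-g_k, & k = 0,\\
-g_k + \beta_k d_{k-1}, & k \ge 1,
\end{cases}
\]

where \alpha_k is a step size from the line search and \beta_k is the conjugate gradient parameter. The classical Liu-Storey (LS) choice that Chapter 2 modifies is

\[
\beta_k^{LS} = -\frac{g_k^{T}(g_k - g_{k-1})}{d_{k-1}^{T} g_{k-1}},
\]

and the Wolfe line search used in the convergence analysis requires, for constants 0 < \delta < \sigma < 1,

\[
f(x_k + \alpha_k d_k) \le f(x_k) + \delta\, \alpha_k g_k^{T} d_k, \qquad
\nabla f(x_k + \alpha_k d_k)^{T} d_k \ge \sigma\, g_k^{T} d_k.
\]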
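To make the subspace construction of Chapter 3 concrete, here is a sketch of the standard subspace-minimization subproblem; the model matrix B_k and the coefficient names are illustrative assumptions, since the abstract specifies only the spanning vectors. With s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}, the search direction solves

\[
\min_{d \in \Omega_k} \; g_k^{T} d + \tfrac{1}{2}\, d^{T} B_k d,
\qquad
\Omega_k = \operatorname{span}\{-g_k,\; s_{k-1},\; y_{k-1}\},
\]

where B_k approximates the Hessian \nabla^2 f(x_k). Writing d = -\mu g_k + \nu s_{k-1} + \omega y_{k-1} reduces the subproblem to three dimensions in (\mu, \nu, \omega), whose solution gives the three-term direction. The sufficient descent property mentioned above means g_k^{T} d_k \le -c\, \|g_k\|^2 for some constant c > 0 independent of the line search.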
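Chapter 4's direction is built analogously, but over \operatorname{span}\{-g_k,\; d_{k-1},\; d_{k-2}\}, and is required to satisfy the Dai-Liao conjugacy condition. In its standard form (the parameter t below is not specified in this abstract), that condition reads

\[
d_k^{T} y_{k-1} = -t\, g_k^{T} s_{k-1}, \qquad t \ge 0,
\]

which reduces to the classical conjugacy condition d_k^{T} y_{k-1} = 0 when t = 0 or when the line search is exact.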
Keywords/Search Tags: Unconstrained optimization, Conjugate gradient methods, Subspace, Global convergence