
Research On Adaptive Conjugate Gradient Methods

Posted on: 2016-08-28    Degree: Doctor    Type: Dissertation
Country: China    Candidate: X L Dong    Full Text: PDF
GTID: 1220330488457667    Subject: Applied Mathematics
Abstract/Summary:
Unconstrained optimization is widely and increasingly used in economics, engineering, management, military and space technology, and other areas. It is therefore of great significance to construct numerical methods for optimization, to explore their theoretical properties, and to study their computational performance. Modest storage requirements and a simple computational scheme have made conjugate gradient methods among the most efficient methods for solving large-scale optimization problems. During the past two decades, the conjugate gradient method has become an active area of research owing to two properties: the sufficient descent condition and the conjugacy condition (both are stated in the sketches following the contributions list). We first summarize the existing nonlinear conjugate gradient methods and then, from a practical point of view, restrict our attention to self-adjusting algorithms based on the two conditions above. The main contributions are as follows:

1. We introduce two kinds of self-adjusting conjugate gradient methods, which generate sufficient descent directions at each iteration. Differently from existing methods, the proposed methods dynamically adjust the conjugacy condition, which can be regarded as an inheritance and development of the standard Hestenes-Stiefel and Dai-Liao conjugacy conditions. Under mild conditions, we establish the global convergence of the proposed methods even when the objective function is nonconvex.

2. By modifying the six leading conjugate gradient methods, we propose several modified nonlinear conjugate gradient methods whose search directions satisfy the sufficient descent condition independently of any line search (one standard device of this kind is sketched after the contributions list). We also propose a general form of conjugate gradient method that always generates a sufficient descent direction regardless of the line search employed. The global convergence of our methods is established without Yuan's assumption that the steplength is bounded away from zero.

3. We construct a two-dimensional function that is not necessarily uniformly convex. This example shows that the sufficient condition for the global convergence of the TTCG method does not necessarily hold: whether or not the TTCG method is globally convergent when minimizing this example, the assumption s_k^T y_k > τ‖s_k‖², where τ > 0 is a constant, is not satisfied, the main reason being that s_k^T y_k is an infinitesimal of higher order than ‖s_k‖² (see the sketch below). On the other hand, we present a general form of three-term conjugate gradient method whose search directions simultaneously satisfy the self-adjusting conjugacy condition and the sufficient descent property.

4. We focus on finding a choice of the parameter ω that minimizes the condition number of the iteration matrix defining the search direction of the TTDES method. The numerical results of TTDES need some revision because the original choice of ω is improper: since the iteration matrix is neither symmetric nor normal, a cautious and reasonable strategy is based on a singular value analysis rather than an eigenvalue analysis (illustrated numerically below).
5. We study new Hestenes-Stiefel type and Polak-Ribière-Polyak type three-term conjugate gradient methods, in which sufficient descent directions are generated by affine combinations of different search directions, in such a way that the resulting directions are closest to the quasi-Newton direction or satisfy the conjugacy condition as the iterations evolve (the generic three-term form is sketched below). Global convergence is established under the Wolfe line search.

6. Preliminary numerical results are reported on a set of large-scale optimization problems and show that the proposed methods are promising.

In summary, a self-adjusting algorithmic mechanism benefits the conjugate gradient method both theoretically and computationally, and with the passage of time its significance will become more meaningful and profound.
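For orientation, the two conditions around which the thesis is organized can be written in standard conjugate gradient notation. This is a generic sketch using common conventions (g_k = ∇f(x_k), s_{k-1} = x_k − x_{k-1}, y_{k-1} = g_k − g_{k-1}, and generic constants c, t > 0), not formulas quoted from the dissertation:

\[
d_k = -g_k + \beta_k d_{k-1}, \qquad
g_k^{\top} d_k \le -c\,\|g_k\|^2 \quad \text{(sufficient descent condition)},
\]
\[
d_k^{\top} y_{k-1} = 0 \quad \text{(Hestenes-Stiefel conjugacy)}, \qquad
d_k^{\top} y_{k-1} = -t\, g_k^{\top} s_{k-1} \quad \text{(Dai-Liao conjugacy)}.
\]

A self-adjusting method, in the sense of contribution 1, varies the right-hand side of the conjugacy condition between these two extremes as the iterations proceed.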
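For contribution 2, one well-known device that enforces sufficient descent independently of the line search is to project the previous direction against the current gradient; the dissertation's actual modifications of the six methods may differ from this sketch:

\[
d_k = -g_k + \beta_k \left( d_{k-1} - \frac{g_k^{\top} d_{k-1}}{\|g_k\|^2}\, g_k \right)
\quad \Longrightarrow \quad
g_k^{\top} d_k = -\|g_k\|^2,
\]

so the sufficient descent condition holds with c = 1 for every choice of β_k and every steplength.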
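The counterexample logic of contribution 3 reduces to a simple observation: if s_k^T y_k vanishes faster than ‖s_k‖², then no positive constant can separate them, and the TTCG assumption fails along the iterates:

\[
\frac{s_k^{\top} y_k}{\|s_k\|^{2}} \longrightarrow 0
\qquad \Longrightarrow \qquad
\text{there is no } \tau > 0 \text{ such that } s_k^{\top} y_k > \tau\,\|s_k\|^{2} \text{ for all } k.
\]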
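The point of the singular value analysis in contribution 4 can be seen in a small numerical sketch. The matrix A below is an artificial non-normal example, not the TTDES iteration matrix; it shows that an eigenvalue analysis can drastically underestimate the condition number of a non-normal matrix, which is properly a ratio of singular values:

    import numpy as np

    # An artificial non-normal matrix (A @ A.T != A.T @ A); both eigenvalues equal 1.
    A = np.array([[1.0, 100.0],
                  [0.0,   1.0]])

    eigvals = np.linalg.eigvals(A)
    eig_ratio = np.max(np.abs(eigvals)) / np.min(np.abs(eigvals))  # ratio of eigenvalue moduli

    sigma = np.linalg.svd(A, compute_uv=False)  # singular values, in descending order
    kappa = sigma[0] / sigma[-1]                # 2-norm condition number

    print(f"eigenvalue ratio: {eig_ratio:.1f}")  # 1.0 -- looks perfectly conditioned
    print(f"condition number: {kappa:.1f}")      # ~10002.0 -- actually ill-conditioned

For a symmetric (or normal) matrix the two numbers coincide, which is why the distinction only matters in the non-symmetric, non-normal setting described above.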
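Finally, the three-term methods of contributions 3 and 5 are built on a generic direction of the form below; the specific parameter choices are the subject of the dissertation and are not reproduced here:

\[
d_k = -g_k + \beta_k\, d_{k-1} + \theta_k\, y_{k-1},
\]

where β_k and θ_k are chosen so that d_k satisfies the sufficient descent condition and, as the iterations evolve, either stays close to a quasi-Newton direction or satisfies the self-adjusting conjugacy condition.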
Keywords/Search Tags: conjugate gradient method, self-adjusting conjugacy condition, sufficient descent condition, global convergence, condition number