
Research On Subspace Minimization Conjugate Gradient Methods Based On Conic Model

Posted on: 2020-07-31
Degree: Master
Type: Thesis
Country: China
Candidate: Y F Li
GTID: 2370330602950572
Subject: Applied Mathematics

Abstract:
The conjugate gradient method is one of the main methods for solving unconstrained optimization problems. Owing to its simple iterative scheme, low memory requirements, and fast convergence, the conjugate gradient method can effectively solve large-scale optimization problems and has attracted considerable attention from researchers. With the emergence of ever larger problems, subspace techniques have become particularly important and are widely applied in optimization. By restricting each iteration to a low-dimensional subspace, they avoid solving a large-scale subproblem at every step and reduce both computation and storage; the key issue is the choice of the subspace in which the search direction lies.

Recently, many scholars have studied subspace minimization conjugate gradient methods, in which the search direction is generally obtained by minimizing a quadratic model of the objective function over a specific subspace. However, for objective functions with strong non-quadratic behavior, the quadratic model may perform poorly. Compared with the quadratic model, the conic model has more degrees of freedom, can interpolate more information, and can exploit function and gradient values from previous iterations. In this thesis, two new subspace minimization conjugate gradient methods based on the conic model are proposed, using different subspaces. The specific contributions are as follows.

First, a new conjugate gradient method for unconstrained optimization is proposed by minimizing a model over a two-dimensional subspace. The appropriate model is selected dynamically at each iteration: when the iterate is near the minimizer, the objective function is close to quadratic, and a quadratic model approximates it well; when the iterate is far from the minimizer or the objective function exhibits strong non-quadratic behavior, the conic model is used to approximate the original function. A criterion for model selection is given, and the search direction is generated by minimizing the selected model over the two-dimensional subspace; the search direction is further shown to satisfy a sufficient descent property. With an improved nonmonotone line search, the global convergence and R-linear convergence of the proposed method are established. Numerical results on two different test function collections show that the proposed algorithm is quite efficient and comparable to the classical CGOPT and CG_DESCENT methods.

Second, the two-dimensional subspace is extended to a three-dimensional subspace, and the current search direction is constructed from the current gradient and the latest two search directions. At each iteration, the quadratic or conic model is selected dynamically, three choices of subspace are given under the selected model, and selection criteria for the search direction under the different subspaces are provided. On this basis, a three-dimensional subspace minimization conjugate gradient method based on the conic model is proposed. Under certain conditions, two important properties of the search direction are proved. With the improved nonmonotone line search, the convergence of the proposed method is established. Numerical experiments on two different test function sets indicate that the proposed method is efficient, especially for large-scale unconstrained optimization.
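For reference, the two models compared in the abstract have the following standard forms from the conic-model literature (the notation is ours, not taken from the thesis: $f_k$, $g_k$ are the function value and gradient at $x_k$, $B_k$ is a Hessian approximation, and $b_k$ is the so-called horizon vector that gives the conic model its extra degrees of freedom):

```latex
% Quadratic model of f around x_k, with step s = x - x_k:
q_k(s) = f_k + g_k^{\mathsf T} s + \tfrac{1}{2}\, s^{\mathsf T} B_k s

% Conic model with horizon vector b_k:
c_k(s) = f_k + \frac{g_k^{\mathsf T} s}{1 - b_k^{\mathsf T} s}
             + \frac{s^{\mathsf T} B_k s}{2\,(1 - b_k^{\mathsf T} s)^2}

% When b_k = 0, the conic model reduces to the quadratic model,
% so the dynamic model selection interpolates between the two.
```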
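The core computational step, minimizing a model over a two-dimensional subspace, can be sketched as follows for the quadratic-model case. This is a minimal illustration under our own assumptions (the subspace is taken as span{g_k, d_{k-1}} and `B` as a symmetric positive definite Hessian approximation); it is not the thesis's exact algorithm, which also handles the conic model and degenerate subspaces:

```python
import numpy as np

def subspace_direction(g, d_prev, B):
    """Minimize m(d) = g^T d + 0.5 * d^T B d over span{g, d_prev}.

    Writing d = t1*g + t2*d_prev reduces the n-dimensional problem
    to a 2x2 linear system, which is the key cost saving of
    subspace minimization conjugate gradient methods.
    """
    P = np.column_stack([g, d_prev])  # basis of the 2D subspace
    A = P.T @ B @ P                   # 2x2 reduced Hessian
    b = P.T @ g                       # reduced gradient
    t = np.linalg.solve(A, -b)        # minimizer of the reduced model
    return P @ t                      # lift back to R^n
```

When B is positive definite on the subspace, the resulting direction satisfies g^T d < 0, i.e. it is a descent direction.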
Keywords: conjugate gradient method, conic model, subspace minimization, global convergence, R-linear convergence