
Research On Subspace Minimization Conjugate Gradient Methods Based On Modified Secant Equation

Posted on: 2021-07-20
Degree: Master
Type: Thesis
Country: China
Candidate: X L Diao
GTID: 2480306047985279
Subject: Applied Mathematics
Abstract/Summary:
Optimization theory and algorithms are widely used in many fields such as engineering design, financial analysis, and production. The conjugate gradient method is one of the most effective methods for solving unconstrained optimization problems because of its simple iteration form, small memory requirement, and good numerical performance. Subspace techniques avoid solving a large-scale subproblem at each iteration: one selects a low-dimensional subspace and solves the subproblem only within it. With the in-depth study of nonlinear conjugate gradient methods and subspace techniques, researchers have combined the two into the subspace minimization conjugate gradient (SMCG) method, which improves the efficiency of the pure conjugate gradient method.

The subspace minimization conjugate gradient method minimizes an approximate quadratic model of the objective function over a two-dimensional subspace spanned by the gradient at the current iterate and the search direction at the previous iterate. In this model, the Hessian matrix is replaced by an approximation satisfying the standard secant equation. Note, however, that the standard secant equation uses only gradient information at the iterates and ignores function-value information. To obtain a more accurate approximate quadratic model, this thesis proposes two new subspace minimization conjugate gradient algorithms, on different subspaces, based on a modified secant equation that incorporates both gradient and function-value information. The specific work is as follows.

For the two-dimensional subspace, exploiting the fact that the modified secant equation carries more information about the iterates than the standard one, we allow the Hessian approximation to satisfy either the standard or the modified secant equation, and we analyze and give criteria for choosing between them. On this basis, combining the idea of the Barzilai-Borwein (BB) method, the search direction is derived by minimizing the quadratic model over the two-dimensional subspace, and its sufficient descent property is established. The global convergence of the new algorithm is proved under an improved nonmonotone line search condition. Numerical experiments on two test-function sets show that the method is more efficient than the well-known CG_DESCENT and SMCG_BB algorithms.

For the three-dimensional subspace, we similarly approximate the Hessian in the quadratic model by a matrix satisfying the modified secant equation, and the search direction is derived by minimizing this quadratic model over the three-dimensional subspace. Numerical experiments show that, compared with the traditional three-dimensional subspace minimization conjugate gradient method, the improved algorithm is more efficient, exploits more function information, and estimates the Hessian matrix more accurately.
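For concreteness, the two-dimensional subproblem studied in this line of work can be written as follows (a standard formulation from the SMCG literature; the exact scaling and safeguards used in the thesis may differ). With $g_k$ the current gradient and $d_{k-1}$ the previous search direction,

    \min_{d \in \Omega_k} m_k(d) = g_k^T d + \frac{1}{2} d^T B_k d,
    \qquad \Omega_k = \mathrm{Span}\{g_k, d_{k-1}\},

where $B_k$ approximates the Hessian. The standard secant equation requires

    B_k s_{k-1} = y_{k-1}, \qquad s_{k-1} = x_k - x_{k-1}, \quad y_{k-1} = g_k - g_{k-1},

while one common modified secant equation (due to Zhang, Deng and Chen) also uses function values:

    B_k s_{k-1} = \bar{y}_{k-1} = y_{k-1} + \frac{\theta_{k-1}}{s_{k-1}^T u}\, u,
    \qquad \theta_{k-1} = 6(f_{k-1} - f_k) + 3(g_{k-1} + g_k)^T s_{k-1},

for any $u$ with $s_{k-1}^T u \neq 0$ (often $u = s_{k-1}$). When $f$ is quadratic, $\theta_{k-1} = 0$ and the two equations coincide, which is why the correction term can be read as higher-order curvature information extracted from function values.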
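To illustrate how such a direction can be computed, the following Python sketch solves the two-dimensional subproblem by writing $d = \mu g_k + \nu d_{k-1}$ and setting the gradient of the model to zero, which yields a 2x2 linear system. The function name, the BB-type estimate of $g^T B g$, and the choice $u = s_{k-1}$ are illustrative assumptions, not the thesis's exact algorithm, which additionally includes the selection criteria between the two secant equations:

    import numpy as np

    def smcg_direction(g, d_prev, s_prev, f_prev, f_curr, g_prev, use_modified=True):
        # Sketch of one SMCG direction on Span{g, d_prev} (illustrative only).
        # s_prev = alpha_{k-1} * d_prev is the last accepted step.
        y = g - g_prev
        if use_modified:
            # Zhang-Deng-Chen modified secant with u = s_prev:
            # B s_prev = y + (theta / s^T s) * s, where theta uses function values.
            theta = 6.0 * (f_prev - f_curr) + 3.0 * (g_prev + g).dot(s_prev)
            y = y + (theta / s_prev.dot(s_prev)) * s_prev
        # Since B s_prev = y and s_prev = alpha * d_prev, we get B d_prev = y / alpha.
        alpha = np.linalg.norm(s_prev) / np.linalg.norm(d_prev)
        Bd = y / alpha
        # g^T B g is not determined by the secant equation; use a BB-type scalar
        # estimate (an assumption; the thesis may use a different estimate).
        rho = (s_prev.dot(y) / s_prev.dot(s_prev)) * g.dot(g)
        # Stationarity of m(mu, nu) = g^T d + 0.5 d^T B d with d = mu*g + nu*d_prev.
        A = np.array([[rho,       g.dot(Bd)],
                      [g.dot(Bd), d_prev.dot(Bd)]])
        b = -np.array([g.dot(g), d_prev.dot(g)])
        try:
            mu, nu = np.linalg.solve(A, b)
        except np.linalg.LinAlgError:
            return -g  # fall back to steepest descent if the 2x2 system is singular
        return mu * g + nu * d_prev

In practice, SMCG methods also test whether the quadratic model is trustworthy (for example, via conditions on $s_{k-1}^T y_{k-1}$) and fall back to a standard conjugate gradient or steepest descent direction when it is not.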
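The three-dimensional variant follows the same pattern on a larger subspace. In much of the SMCG literature the subspace is taken as

    \Omega_k = \mathrm{Span}\{g_k, d_{k-1}, d_{k-2}\},

so that minimizing the quadratic model reduces to a 3x3 linear system in the coefficients of $d = \mu g_k + \nu d_{k-1} + \tau d_{k-2}$; the thesis's exact choice of the third basis vector may differ. The modified secant equation enters exactly as above, through the Hessian-vector products along the previous steps.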
Keywords/Search Tags: subspace minimization, conjugate gradient method, modified secant equation, Wolfe line search, global convergence