
Research On Two Types Of Three-term Conjugate Gradient Methods

Posted on: 2021-10-07    Degree: Master    Type: Thesis
Country: China    Candidate: J Li    Full Text: PDF
GTID: 2480306047485274    Subject: Applied Mathematics
Abstract/Summary:
In the field of optimization, the conjugate gradient method is widely used to solve unconstrained optimization problems because of its simple iteration scheme and low storage requirements. As problem sizes have grown, subspace techniques have become an important tool for large-scale optimization owing to their dimensionality-reduction property. In recent years, some scholars have combined subspace techniques with the conjugate gradient method to propose subspace conjugate gradient methods, which are very effective for large-scale unconstrained optimization problems. Because a three-term conjugate gradient method incorporates more information from the iterates, it can substantially improve the numerical performance of an algorithm, and it is therefore attracting the attention of more and more scholars. In addition, because the objective function behaves almost quadratically when the iterate is near a minimizer, most optimization algorithms approximate the objective function with a quadratic model. However, when the iterate is far from a minimizer and the objective function is strongly non-quadratic, a quadratic model is no longer a reasonable approximation; in that case we consider the more accurate tensor model a good choice. In this thesis, we propose the following two three-term conjugate gradient methods.

First, based on the theory of conjugate gradient algorithms, and weighing the pros and cons of the SMCG_NLS algorithm with respect to its choice of subspace, we extend the well-known BBCG algorithm to the three-dimensional subspace $\Omega_{k+1}=\mathrm{span}\{g_{k+1},s_k,y_k\}$. The approximate objective function is minimized over this subspace, and different choices of search direction are obtained for different subspace dimensions. On this basis, an adaptive three-term Barzilai-Borwein conjugate gradient method is proposed under a nonmonotone line search. We then establish two important theoretical properties of the search direction and prove the global convergence of the algorithm. Finally, numerical tests on a given set of test functions show that the new algorithm has a clear advantage over SMCG_NLS and SMCG_BB.

Second, when the iterate is far from a minimizer, the tensor model approximates the objective function more accurately than the quadratic model. An appropriate approximation model is therefore selected at each iterate according to a model criterion. Combining this with the idea of subspace-minimization three-term conjugate gradient methods, we obtain the corresponding search directions by minimizing the model function over a three-dimensional subspace. A subspace-minimization three-term conjugate gradient method based on the tensor model is then proposed under the ZH (Zhang-Hager) line search condition. Because the search direction takes a complicated form under the tensor model, the theoretical properties of the search direction of the new algorithm are proved with the aid of matrix-norm theory, and the global convergence of the algorithm for general functions is established. Finally, comparisons with CG_TENSOR, SMCG_Conic and SMCG_NLS show that the proposed algorithm is very effective.
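As a minimal sketch of the subspace-minimization framework, with the standard notation $g_{k+1}=\nabla f(x_{k+1})$, $s_k=x_{k+1}-x_k$ and $y_k=g_{k+1}-g_k$ (assumed here, not quoted from the thesis), the search direction is taken in $\Omega_{k+1}$ and determined by minimizing the approximate model over that subspace:

\[
d_{k+1}=\arg\min_{d\in\Omega_{k+1}} \; g_{k+1}^{\top} d+\tfrac{1}{2}\,d^{\top}B_{k+1}d,
\qquad d=u\,g_{k+1}+v\,s_k+w\,y_k,
\]

where $B_{k+1}$ is an approximation of the Hessian. When $\Omega_{k+1}$ degenerates to a two- or one-dimensional subspace, the same minimization over the smaller subspace yields the corresponding direction, which is the sense in which different selections of search direction arise under different dimensions.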
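For the model choice in the second method, one common quadratic-closeness test (stated here as an illustration under the notation above, not as the thesis's exact criterion) exploits the identity that for a quadratic objective $f_k - f_{k+1} + g_{k+1}^{\top} s_k = \tfrac{1}{2}\, s_k^{\top} y_k$, so the ratio

\[
t_k = \frac{2\,(f_k - f_{k+1} + g_{k+1}^{\top} s_k)}{s_k^{\top} y_k}
\]

equals $1$ exactly for quadratics. An iterate can then be deemed nearly quadratic when $|t_k - 1|$ falls below a small threshold, with the more accurate tensor model used otherwise.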
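To make the ingredients of the first method concrete, the following is a hedged Python sketch of a generic three-term conjugate gradient iteration with a Barzilai-Borwein trial step and a Zhang-Hager-style nonmonotone line search. The HS-based three-term coefficients and all parameter values are illustrative assumptions, not the formulas of the proposed algorithm.

    import numpy as np

    def three_term_cg(f, grad, x0, max_iter=1000, tol=1e-6, delta=1e-4, eta=0.85):
        # Hedged sketch: generic three-term CG with a BB trial step and a
        # Zhang-Hager-style nonmonotone line search. The HS-based three-term
        # coefficients below are an illustrative placeholder choice.
        x = np.asarray(x0, dtype=float).copy()
        g = grad(x)
        d = -g
        C, Q = f(x), 1.0          # Zhang-Hager nonmonotone reference value
        alpha_trial = 1.0         # trial step; later set by the BB1 formula
        for _ in range(max_iter):
            if np.linalg.norm(g) <= tol:
                break
            gtd = g @ d
            alpha = alpha_trial
            # nonmonotone Armijo test: f(x + a d) <= C + delta * a * g'd
            while f(x + alpha * d) > C + delta * alpha * gtd:
                alpha *= 0.5
            x_new = x + alpha * d
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            sy = s @ y
            if sy > 1e-12:
                alpha_trial = (s @ s) / sy           # BB1 step as next trial step
                beta = (g_new @ y) / sy              # HS-type coefficient
                theta = (g_new @ s) / sy
                d = -g_new + beta * s - theta * y    # three-term direction
            else:
                alpha_trial, d = 1.0, -g_new         # restart with steepest descent
            Q_new = eta * Q + 1.0                    # Zhang-Hager averaging
            C = (eta * Q * C + f(x_new)) / Q_new
            Q = Q_new
            x, g = x_new, g_new
        return x

    # usage example: minimize the Rosenbrock function
    f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
    grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                               200 * (x[1] - x[0]**2)])
    print(three_term_cg(f, grad, np.array([-1.2, 1.0])))

Starting the line search from the BB1 step keeps the trial step adaptive, and the averaged value C relaxes the monotone Armijo condition, which suits BB-type steps whose raw function values do not decrease monotonically.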
Keywords/Search Tags:three-term conjugate gradient method, subspace minimization, tensor model, global convergence