
Study on Gradient Methods Based on Cubic Regularization Techniques

Posted on: 2020-03-12
Degree: Master
Type: Thesis
Country: China
Candidate: W L Chu
GTID: 2370330602450569
Subject: Applied Mathematics
Abstract/Summary:
With the rapid development of the economy, science, and technology, a large number of large-scale optimization problems have emerged in practice. Large-scale optimization methods are widely applied in fields such as national defense, engineering design, and agricultural production. The gradient method is one of the most important approaches to solving optimization problems; because of its simple computation and low storage requirements, it is a good choice for large-scale unconstrained optimization and can adapt to the needs of big data and cloud computing. Among gradient methods, the Barzilai-Borwein (BB) algorithm has stimulated sustained research interest through its simplicity and strong numerical performance. As is well known, the construction and selection of the gradient model has a crucial influence on the numerical performance of a gradient algorithm, and increasingly efficient gradient models continue to be proposed. Constructing more accurate gradient models is therefore not only of theoretical significance but also of great practical importance.

In recent years, the cubic regularization method has been proposed as a new, efficient algorithm for unconstrained optimization, providing a new choice beyond the trust-region and line-search frameworks. Its basic idea is to approximate a minimizer of the objective function by globally minimizing, at each iteration, a cubic over-estimation model of the objective. When the Hessian of the objective function is Lipschitz continuous, the cubic regularization method enjoys a better worst-case iteration complexity than the steepest descent method. Adaptive cubic regularization and line search are both important tools for unconstrained optimization; combining them efficiently so as to exploit the advantages of each is a new challenge.

In this paper, two efficient gradient algorithms are proposed by combining the gradient method with the cubic regularization model. The main work is as follows.

First, since the cubic regularization model can capture more information about the objective function, we construct a cubic regularization model at the current iterate, compute an approximately optimal step size, and design a selection mechanism that chooses between the quadratic model and the cubic regularization model as appropriate. This yields an approximately optimal gradient method for the minimization problem, and its global convergence is established.

Second, by constructing BB-type parameters, a scalar matrix with BB-type parameters is used to approximate the Hessian of the objective function. A cubic regularization model is built at the current iterate, the trial step is obtained by minimizing this model, and a simple-model cubic regularization BB algorithm is proposed. Considering the nonmonotonicity of the BB method, a simple-model nonmonotone cubic regularization BB algorithm is further proposed by incorporating nonmonotone line search techniques. The proposed methods are essentially gradient methods, and their global convergence is established under mild assumptions. On the given test set of functions, the numerical performance of the proposed algorithms is better than that of BB and GM-AOS (cone), and they are also competitive with the well-known limited-memory conjugate gradient code CG_DESCENT (6.0).
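When the Hessian is approximated by a scalar matrix, minimizing the cubic regularization model along the negative gradient has a closed-form solution. The following is a minimal sketch, assuming the simple model m(s) = gᵀs + (γ/2)‖s‖² + (σ/3)‖s‖³ with a scalar Hessian approximation γ and regularization weight σ; the function name is illustrative, not the thesis's:

```python
import numpy as np

def cubic_model_step(g, gamma, sigma):
    """Minimize the scalar-Hessian cubic model
        m(s) = g.T s + (gamma/2)||s||^2 + (sigma/3)||s||^3
    over steps of the form s = -alpha * g with alpha > 0.

    Setting d m(-alpha*g)/d alpha = 0 yields the scalar quadratic
        sigma*||g||*alpha^2 + gamma*alpha - 1 = 0,
    whose positive root gives the step size.
    """
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:
        return np.zeros_like(g)  # already stationary
    alpha = (-gamma + np.sqrt(gamma**2 + 4.0 * sigma * gnorm)) / (2.0 * sigma * gnorm)
    return -alpha * g
```

Note that α → 1/γ as ‖g‖ → 0, so near a solution the step recovers the plain scalar quadratic-model (BB-type) step, while far from a solution the cubic term damps the step length.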
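The second contribution can be illustrated with a rough sketch of a simple-model cubic regularization BB iteration, assuming the BB1 parameter γ_k = sᵀy / sᵀs as the scalar Hessian approximation and a fixed regularization weight σ; the thesis's adaptive update of σ and its nonmonotone line search are omitted, and all names are ours:

```python
import numpy as np

def cubic_bb_gradient(f_grad, x0, sigma=1.0, tol=1e-8, max_iter=1000):
    """Sketch: gradient method whose trial step minimizes the cubic model
        m(s) = g^T s + (gamma/2)||s||^2 + (sigma/3)||s||^3
    along -g, with gamma updated by the BB1 formula gamma = (s^T y)/(s^T s)."""
    x = np.asarray(x0, dtype=float)
    g = f_grad(x)
    gamma = 1.0  # initial scalar Hessian approximation
    for _ in range(max_iter):
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        # positive root of sigma*||g||*alpha^2 + gamma*alpha - 1 = 0
        alpha = (-gamma + np.sqrt(gamma**2 + 4.0 * sigma * gnorm)) / (2.0 * sigma * gnorm)
        x_new = x - alpha * g
        g_new = f_grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        # safeguarded BB1 parameter; fall back to 1.0 if curvature is nonpositive
        gamma = max(sy / (s @ s), 1e-10) if sy > 0 else 1.0
        x, g = x_new, g_new
    return x
```

Because α < 1/γ always holds, each step is a damped BB step, with the damping vanishing as the gradient goes to zero; this is one plausible reading of how the simple cubic model moderates the BB method's nonmonotone behavior.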
Keywords/Search Tags: Large-scale unconstrained optimization, Gradient method, Adaptive cubic regularization method, Nonmonotone line search, Barzilai-Borwein method, Global convergence