
Research On Three Dimensional Subspace Gradient Algorithm Based On Quadratic Model

Posted on: 2022-05-23
Degree: Master
Type: Thesis
Country: China
Candidate: Y P Wu
Full Text: PDF
GTID: 2480306533996009
Subject: Mathematics

Abstract/Summary:
With the development of science and technology, the scale of the problems arising in production and daily life keeps expanding, and efficient, stable algorithms for large-scale optimization problems are a focus of current research. The main algorithms for unconstrained optimization include the conjugate gradient method, the steepest descent method, and Newton and quasi-Newton methods. The conjugate gradient method is one of the most effective methods for large-scale nonlinear optimization and is widely used for solving large-scale problems. For such problems, simplifying computation and storage is the key. Subspace techniques minimize an approximation model of the objective function over a selected low-dimensional subspace, forcing each iterate of the algorithm to be computed in that lower-dimensional subspace; this effectively reduces the computational cost of the algorithm and makes these techniques well suited to large-scale problems.

Based on these characteristics, this thesis studies three-dimensional subspace gradient algorithms based on a quadratic model. The research covers: (1) the construction of the subspace; (2) the approximate model of the objective function in the given subspace; (3) the optimal selection of the embedded parameters in the model.

The main results are as follows. Construction methods for three-dimensional subspaces based on two parameters and on three parameters are studied. The corresponding three-dimensional subspace is spanned by the gradient at the current iterate, the gradient at the previous iterate, and the latest search direction. On this basis, a quadratic approximation of the objective function is constructed and minimized over the given subspace. Combined with an improved secant equation and a non-monotone generalized line search, this yields a three-term conjugate gradient method with subspace techniques (TCGS) and a dynamically adjusted subspace conjugate gradient method (DSCG). Under general assumptions, the TCGS algorithm with Wolfe line search is proved to be globally convergent, and for uniformly convex functions the TCGS algorithm with the strong Wolfe line search is globally convergent. Under milder assumptions, global convergence of the DSCG algorithm with the non-monotone generalized line search is obtained. In particular, for uniformly convex functions the DSCG algorithm produces a sequence {f(x_k)} that converges R-linearly to f(x*). Both algorithms show high efficiency and robustness in numerical experiments on general unconstrained problems. We also apply the DSCG algorithm to image restoration, where it performs well in improving color saturation and clarity.
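As an illustration of the subspace idea described above (a sketch only, not the thesis's TCGS or DSCG algorithms), the following code minimizes a quadratic model of the objective over the three-dimensional subspace spanned by the current gradient, the previous gradient, and the latest search direction. The test function, problem size, and use of a least-squares solve for the reduced 3x3 system are all illustrative assumptions:

```python
import numpy as np

def subspace_step(A, b, x, g_prev, d_prev):
    """One iteration on f(x) = 0.5 x^T A x - b^T x: minimize the quadratic
    model over span{g, g_prev, d_prev} (the three-dimensional subspace idea)."""
    g = A @ x - b                                # exact gradient of the quadratic
    S = np.column_stack([g, g_prev, d_prev])     # basis of the 3-D subspace
    # Reduced problem in the coefficients a: minimize g^T S a + 0.5 a^T (S^T A S) a
    H = S.T @ A @ S
    rhs = -S.T @ g
    a, *_ = np.linalg.lstsq(H, rhs, rcond=None)  # lstsq guards against a rank-deficient basis
    d = S @ a                                    # step lifted back to the full space
    return x + d, g, d

# Illustrative test problem: a symmetric positive definite quadratic.
rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
b = rng.standard_normal(n)
f = lambda x: 0.5 * x @ A @ x - b @ x

x = np.zeros(n)
g_prev = A @ x - b            # bootstrap the history with the first gradient
d_prev = -g_prev
for _ in range(20):
    x, g_prev, d_prev = subspace_step(A, b, x, g_prev, d_prev)
```

Because the subspace always contains the current gradient, each step decreases the quadratic objective whenever the gradient is nonzero; on this convex test problem the residual shrinks rapidly. The full algorithms in the thesis replace the exact Hessian products with secant-equation information and add a (non-monotone generalized) line search for general nonlinear objectives.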
Keywords/Search Tags: Subspace method, Conjugate gradient method, Quadratic approximation, Image restoration