Research On Subspace Minimization Conjugate Gradient Methods

Posted on: 2024-05-25 | Degree: Doctor | Type: Dissertation | Country: China | Candidate: W M Sun | GTID: 1520307340461544 | Subject: Applied Mathematics

Abstract/Summary:

Large-scale optimization problems arise widely in fields such as engineering design, production management, economics, and finance. Because of its simple iteration form, low memory consumption, and good numerical performance, the conjugate gradient method is a preferred choice for solving such problems. As problems grow in size and complexity, subspace techniques have attracted many researchers because they avoid solving large-scale subproblems at each iteration. Improving the efficiency of the conjugate gradient method through subspace techniques is therefore of great theoretical value and practical significance for large-scale optimization.

This dissertation focuses on accelerated subspace minimization conjugate gradient (SMCG) algorithms. First, building on existing theory and algorithms, we design effective acceleration strategies and propose several classes of accelerated SMCG algorithms for large-scale unconstrained optimization. Second, the convergence rates of the proposed algorithms are analyzed in both the convex and the nonconvex case. Finally, the accelerated SMCG algorithms are applied to practical problems. The main work and innovative results are as follows.

In the first part, several classes of effective accelerated SMCG algorithms are proposed, in three main directions. 1) An effective acceleration strategy is proposed: exploiting the finite termination property of the linear conjugate gradient method, an acceleration parameter is derived from a quadratic interpolation function to improve the stepsize, so that the modified stepsize is closer to the one obtained by exact line search; a detailed acceleration criterion is given to further improve efficiency. An accelerated SMCG algorithm based on the BB idea and an accelerated SMCG algorithm based on a regularization model are designed, and the global convergence of both is established. Numerical results show that the two accelerated algorithms significantly outperform their unaccelerated counterparts. 2) Four accelerated SMCG algorithms based on different regularization models are proposed in combination with the acceleration strategy. Using properties of the objective function at the current iterate, 2-dimensional and 3-dimensional regularization models with different matrices are constructed on a two-dimensional subspace; the search direction is derived by minimizing these models together with an approximate quadratic model, and its important properties are analyzed. Under general assumptions, the global convergence of the proposed algorithms is established using a modified nonmonotone line search. Numerical results show that the proposed algorithms efficiently solve ill-conditioned problems from the CUTEr test library and, on test problems of dimension at least 50, are comparable to the limited-memory CG software CG_DESCENT (6.8). 3) An accelerated limited-memory SMCG algorithm is proposed based on the limited-memory technique. An improved regularized quasi-Newton method, combining a regularization technique with the BFGS method, improves the orthogonality of the algorithm by solving a subproblem in a low-dimensional subspace. A relatively simple acceleration criterion and an improved initial stepsize selection strategy are designed to further improve efficiency. In addition, the global convergence of the algorithm is established under weak conditions using a generalized nonmonotone line search. Numerical results show that the proposed algorithm significantly outperforms ASMCG_PR and also outperforms the limited-memory CG software CG_DESCENT (6.8) and CGOPT (2.0).

The second part is a theoretical analysis of the convergence rate of the accelerated SMCG algorithms proposed in the first part. Existing convergence-rate results for conjugate gradient methods are typically established under convexity assumptions on the objective function; such assumptions are rather restrictive for practical problems, whereas the weaker Kurdyka-Łojasiewicz condition is satisfied far more easily. Accordingly, first, a linear convergence rate for the function-value sequence of the accelerated SMCG algorithm is established under a convexity assumption. Second, under general assumptions, it is proved that every accumulation point of the iterate sequence generated by the accelerated SMCG algorithm is a stationary point. Finally, under the Kurdyka-Łojasiewicz property for nonconvex objective functions, linear and sublinear convergence rates of the accelerated SMCG algorithm are established for both the function-value sequence and the iterate sequence.

The third part investigates the application of the accelerated SMCG algorithm to graph partitioning. For graph partitioning with vertex-weight constraints and fixed-vertex constraints, a 0-1 quadratic integer programming model is constructed, and a recursive bipartition algorithm based on the accelerated SMCG algorithm is proposed. To reduce the difficulty of solving the model, the 0-1 quadratic integer program is transformed into an unconstrained optimization problem using an equilibrium (balance) term, an elimination method, and properties of trigonometric functions, and is then solved by the accelerated SMCG algorithm. From the optimal solution, an initial feasible partition is generated by a hyperplane rounding algorithm; two heuristic algorithms are further designed to improve this initial partition locally, and the approximate optimal solution of the graph partitioning problem is finally obtained by recursive bipartition. Numerical experiments show that the proposed algorithm is effective and feasible for graph partitioning under knapsack constraints and on several industrial instances.

Keywords/Search Tags: Conjugate gradient method, Subspace minimization, Regularization, Orthogonality, Limited memory, Kurdyka-Łojasiewicz property, Global convergence, Convergence rate, Graph partitioning
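The acceleration strategy of the first part improves a trial stepsize by quadratic interpolation. A minimal sketch of that idea, using only the standard construction (the dissertation's exact formula and acceptance criterion are not reproduced here): fit a quadratic to φ(0) = f(x_k), φ'(0) = g_k^T d_k, and φ(α) = f(x_k + α d_k), and take its minimizer as the improved step.

```python
def accelerated_step(phi0, dphi0, alpha, phi_alpha):
    """Quadratic-interpolation acceleration of a trial stepsize (sketch).

    phi0      = f(x_k)
    dphi0     = g_k^T d_k, the directional derivative (< 0 for descent)
    alpha     = trial stepsize from the line search
    phi_alpha = f(x_k + alpha * d_k)
    """
    # Coefficient of the quadratic term, scaled by alpha^2:
    # q(a) = phi0 + dphi0*a + c*a^2 with c = curv / alpha^2.
    curv = phi_alpha - phi0 - dphi0 * alpha
    if curv <= 0.0:
        return alpha  # interpolant not strictly convex: keep the trial step
    # Minimizer of q: a* = -dphi0 / (2c) = -dphi0 * alpha^2 / (2 * curv)
    return -dphi0 * alpha * alpha / (2.0 * curv)
```

For a one-dimensional quadratic the interpolation is exact: with φ(a) = (a − 2)², the trial step α = 1 is corrected to the exact minimizer a* = 2.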
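The core SMCG step chooses a search direction d_k = u g_k + v s_{k−1} by minimizing an approximate quadratic model on the two-dimensional subspace span{g_k, s_{k−1}}. The sketch below follows the general spirit of subspace minimization CG methods (e.g. the SMCG_BB line of work); the secant estimate B_k s ≈ y and the BB-type scalar used for g^T B_k g are illustrative assumptions, not the dissertation's exact regularization models.

```python
import numpy as np

def smcg_direction(g, s, y):
    """Direction from minimizing an approximate quadratic model on the
    2-D subspace span{g_k, s_{k-1}} (simplified illustrative sketch).

    g : current gradient g_k
    s : previous step s_{k-1} = x_k - x_{k-1}
    y : gradient difference y_{k-1} = g_k - g_{k-1}
    """
    gg, gs, gy, sy = g @ g, g @ s, g @ y, s @ y
    # Require sufficient positive curvature along s; otherwise fall back.
    if sy <= 1e-12 * np.linalg.norm(s) * np.linalg.norm(y):
        return -g
    rho = sy / (s @ s)  # BB-type curvature estimate for g^T B g
    # Model m(u, v) = g^T d + 0.5 d^T B d with d = u*g + v*s;
    # secant condition B s ~ y yields the 2x2 stationarity system below.
    M = np.array([[rho * gg, gy],
                  [gy,       sy]])
    rhs = -np.array([gg, gs])
    u, v = np.linalg.solve(M, rhs)
    d = u * g + v * s
    if d @ g >= 0.0:   # safeguard: guarantee a descent direction
        d = -g
    return d
```

The safeguard is essential in practice: whenever the curvature estimates are poor, the method degrades gracefully to steepest descent rather than producing an ascent direction.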
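The global convergence results above rely on modified and generalized nonmonotone line searches. As a point of reference, here is a sketch of one standard nonmonotone Armijo scheme (the Zhang-Hager style weighted-average reference value); the dissertation's modified versions differ in their acceptance conditions and parameters.

```python
def nonmonotone_armijo(f, x, d, g, C, Q,
                       delta=1e-4, eta=0.85, alpha0=1.0,
                       shrink=0.5, max_back=50):
    """One step of a Zhang-Hager style nonmonotone Armijo line search.

    Accepts alpha when f(x + alpha d) <= C + delta*alpha*g^T d, where C is
    a weighted average of past function values (not just f(x)), allowing
    occasional increases in f while still forcing overall descent.
    """
    gd = g @ d
    alpha = alpha0
    for _ in range(max_back):
        if f(x + alpha * d) <= C + delta * alpha * gd:
            break
        alpha *= shrink  # backtrack
    x_new = x + alpha * d
    f_new = f(x_new)
    # Update the reference value: C tracks a decaying average of f.
    Q_new = eta * Q + 1.0
    C_new = (eta * Q * C + f_new) / Q_new
    return x_new, f_new, C_new, Q_new
```

With eta = 0 this reduces to the classical (monotone) Armijo rule; larger eta makes the search more tolerant of nonmonotone behavior, which helps on ill-conditioned problems.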
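In the third part, rounding converts the continuous solution of the relaxed model into an initial feasible partition respecting the vertex-weight (knapsack) constraint. As a hypothetical simplification, the sketch below rounds relaxation scores by sorting and greedily filling one part up to a weight cap; the dissertation's actual hyperplane rounding and its two repair heuristics are not specified here.

```python
def round_partition(scores, weights, cap):
    """Greedily round continuous relaxation scores into a 0-1 partition
    whose part-1 total weight does not exceed `cap`.

    Illustrative stand-in for hyperplane rounding + local repair:
    vertices with the highest relaxation scores are assigned to part 1
    while the knapsack constraint permits.
    """
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    part = [0] * len(scores)
    load = 0.0
    for i in order:                      # highest score first
        if load + weights[i] <= cap:
            part[i] = 1
            load += weights[i]
    return part
```

In a recursive bipartition scheme, this rounding would be applied at each level, with the local-improvement heuristics refining the cut before recursing on the two parts.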