
Research On The Iteration Method For Several Kinds Of Consistent And Inconsistent Constrained Matrix Equation

Posted on: 2008-03-11
Degree: Doctor
Type: Dissertation
Country: China
Candidate: Y H Xiao
Full Text: PDF
GTID: 1100360215479793
Subject: Applied Mathematics
Abstract/Summary:
The purpose of this thesis is to study numerical methods for large-scale unconstrained optimization and bound constrained optimization. We establish the global convergence of these methods and test them on a set of large-scale problems.

In Chapter 2, based on the modified BFGS method of Wei, Li and Qi, we propose a limited memory BFGS method for large-scale unconstrained optimization. A notable feature of the proposed method is that the update formula uses both gradient and function-value information. We prove that the method is globally convergent if the objective function is uniformly convex. Numerical experiments indicate that the proposed method outperforms the standard limited memory BFGS algorithm.

In Chapter 3, based on the nonlinear conjugate gradient methods proposed by Dai-Liao and Li-Tang-Wei, respectively, we present two nonlinear conjugate gradient methods. An attractive property of the proposed methods is that the directions they generate are always descent directions; this property is independent of the line search used. We show that both methods are globally convergent even for nonconvex problems. The reported numerical experiments indicate that the performance of the proposed methods is comparable to that of the standard PRP method.

In Chapters 4 and 5, using the active set identification technique proposed by Facchinei, Judice, and Soares, we propose two algorithms for large-scale bound constrained optimization. The method in Chapter 4 takes full advantage of the strict complementarity assumption; by means of a backtracking technique, it generates a sequence of feasible iterates. The method given in Chapter 5 uses the gradient projection technique. We show that both proposed algorithms can add to, or drop from, the current estimated active set many constraints at each step. Under suitable conditions, we establish global convergence theorems.
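As background for the limited memory BFGS variants discussed above, the search direction of the standard L-BFGS method (which Chapter 2 modifies) can be computed with the classical two-loop recursion. The sketch below is a minimal illustration of that standard recursion only, not of the thesis's modified update; all function and variable names are my own:

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion.

    Returns d = -H g, where H approximates the inverse Hessian
    built from the stored pairs s_i = x_{i+1} - x_i and
    y_i = grad_{i+1} - grad_i (oldest pair first in the lists).
    """
    q = g.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    # Initial inverse-Hessian scaling from the most recent pair.
    if s_list:
        gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair to newest (alphas consumed in reverse).
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        b = rho * np.dot(y, r)
        r += (a - b) * s
    return -r
```

With no stored pairs the recursion reduces to steepest descent, and the direction is a descent direction whenever every curvature product `s·y` is positive.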
We also test the proposed methods on a set of bound constrained optimization problems.

In Chapter 6, we develop modifications of the subspace limited memory quasi-Newton method of Ni and Yuan. An important property of the new approach is that more limited memory BFGS updates are used. Extensive numerical results indicate that the modifications are beneficial to the performance of the algorithm.

In Chapter 7, based on the active set identification technique given by Facchinei, Judice, and Soares, we propose an active set subspace Barzilai-Borwein gradient algorithm for large-scale bound constrained optimization and establish its global convergence. We also present numerical experiments showing the effectiveness of the proposed method.

In Chapter 8, based on the active set identification technique given by Facchinei, Fischer, and Kanzow, we develop a projected Barzilai-Borwein gradient algorithm for solving large-scale bound constrained degenerate optimization problems. We show that the proposed method with a nonmonotone line search is globally convergent. Numerical experiments indicate that the proposed method outperforms the well-known SPG method.
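To illustrate the kind of projected Barzilai-Borwein iteration underlying Chapter 8, the sketch below implements a bare projected BB1 step for box constraints. It is a simplified assumption-laden illustration: it omits the nonmonotone line search and active set identification that the thesis's algorithm actually uses, and all names are my own:

```python
import numpy as np

def projected_bb(grad, x0, lo, hi, max_iter=500, tol=1e-10):
    """Projected Barzilai-Borwein gradient sketch for
    min f(x) subject to lo <= x <= hi (componentwise).

    `grad` returns the gradient of f. The BB1 step length
    alpha = (s.s)/(s.y) is safeguarded into [1e-10, 1e10];
    no line search globalization is included here.
    """
    proj = lambda z: np.clip(z, lo, hi)  # projection onto the box
    x = proj(np.asarray(x0, dtype=float))
    g = grad(x)
    alpha = 1.0  # initial step length
    for _ in range(max_iter):
        x_new = proj(x - alpha * g)
        if np.linalg.norm(x_new - x) < tol:  # projected step stalled
            break
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = np.dot(s, y)
        # BB1 step length; fall back to 1 if curvature is not positive.
        alpha = np.dot(s, s) / sy if sy > 1e-12 else 1.0
        alpha = min(max(alpha, 1e-10), 1e10)
        x, g = x_new, g_new
    return x
```

For a strictly convex quadratic with a diagonal Hessian, the iteration settles quickly on the projection of the unconstrained minimizer onto the box; in general, a (nonmonotone) line search safeguard such as the one the thesis analyzes is needed for global convergence.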
Keywords/Search Tags: Large-scale optimization, Unconstrained optimization, Conjugate gradient method, Limited memory BFGS method, Projected gradient, Barzilai-Borwein gradient method, Global convergence