
A Class Of Algorithms For Large Scale Sparse Unconstrained Optimization

Posted on: 2006-01-06
Degree: Master
Type: Thesis
Country: China
Candidate: J X Li
Full Text: PDF
GTID: 2120360152485466
Subject: Operational Research and Cybernetics

Abstract/Summary:
This dissertation studies a class of algorithms for solving large-scale sparse unconstrained optimization problems. Its main results may be summarized as follows:

1. In Chapter 4, a symmetric triangular partitioned secant method for large-scale sparse unconstrained optimization is constructed. It combines a secant method with a finite-difference method and depends on a consistent partition of the columns of the lower triangular part of the Hessian matrix (the partition-and-difference idea is sketched in the first code example below); it reduces the number of gradient evaluations required by Powell and Toint's algorithm (the indirect lower triangular substitution algorithm) by one at every iteration. A q-superlinear convergence result and an r-convergence rate estimate show that the method has good local convergence properties. In addition, error estimates are sharpened and Kantorovich-type analyses are improved for Broyden's algorithm and Schubert's algorithm.

2. In Chapter 5, a new method with direct partitioned column updates of the Cholesky factorization is proposed for large-scale sparse unconstrained optimization; it depends on a consistent partition of the columns and on a Cholesky factorization of the Hessian matrix. The method computes an initial Cholesky factorization of the partitioned approximate Hessian and then corrects the corresponding columns of the diagonal factor and the lower triangular factor directly and successively at each step. Iterates are generated by forward and backward substitution with the updated factors (see the second sketch below). A self-correcting property, a q-superlinear convergence result and an r-convergence rate estimate show that this method also has good local convergence properties.

3. Chapter 6 reports numerical experiments with the algorithms of Chapters 4 and 5. The results show that, for some large-scale sparse unconstrained optimization problems, the two algorithms are clearly more effective in several respects than the other algorithms with which they are compared.
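To make the partition idea concrete, here is a minimal sketch of finite-difference Hessian estimation over a consistent column partition, in the spirit of Curtis-Powell-Reid grouping. It is an assumption-laden illustration, not the dissertation's algorithm: it partitions full columns rather than the lower triangular part used by the thesis's symmetric scheme, and the names `greedy_partition` and `fd_hessian` are hypothetical. The point it shows is that columns whose sparsity patterns do not overlap can share a single gradient difference, which is why a good partition saves gradient evaluations.

```python
import numpy as np

def greedy_partition(pattern):
    """Greedily group column indices so that columns in the same group
    have structurally disjoint row supports (a 'consistent partition')."""
    supports = [set(np.nonzero(pattern[:, j])[0]) for j in range(pattern.shape[1])]
    groups, group_rows = [], []
    for j, rows in enumerate(supports):
        for g, used in zip(groups, group_rows):
            if rows.isdisjoint(used):      # safe to share one difference step
                g.append(j)
                used |= rows
                break
        else:                              # no compatible group: open a new one
            groups.append([j])
            group_rows.append(set(rows))
    return groups

def fd_hessian(grad, x, pattern, h=1e-6):
    """Estimate a sparse Hessian column by column, using one extra
    gradient evaluation per group instead of one per column."""
    n = x.size
    H = np.zeros((n, n))
    g0 = grad(x)
    for group in greedy_partition(pattern):
        step = np.zeros(n)
        step[group] = h                    # perturb every column in the group at once
        dg = grad(x + step) - g0
        for j in group:
            rows = np.nonzero(pattern[:, j])[0]
            H[rows, j] = dg[rows] / h      # disjoint supports: these rows see column j only
    return H
```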
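The second sketch shows how an iterate can be generated from an LDL^T (Cholesky-type) factorization of the Hessian approximation B = L D L^T by forward substitution, diagonal scaling, and backward substitution, assuming L is unit lower triangular and d holds the diagonal of D. The helper name `ldl_step` is hypothetical; the dissertation's actual contribution, the direct column-by-column correction of L and D at each step, is not reproduced here.

```python
import numpy as np
from scipy.linalg import solve_triangular

def ldl_step(L, d, g):
    """Solve (L D L^T) s = -g for the quasi-Newton step s."""
    y = solve_triangular(L, -g, lower=True, unit_diagonal=True)    # forward substitution: L y = -g
    z = y / d                                                      # diagonal scaling:     D z = y
    s = solve_triangular(L.T, z, lower=False, unit_diagonal=True)  # backward substitution: L^T s = z
    return s
```

An outer loop would then take x_{k+1} = x_k + ldl_step(L, d, grad(x_k)) and, in the spirit of Chapter 5, correct a few columns of L and d in place rather than refactorizing the whole matrix.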
Keywords/Search Tags: algorithm, superlinear convergence, numerical example, unconstrained optimization, sparsity, finite difference, substitution, secant, Hessian, Cholesky, update, partitioned, self-correcting property