Solving Large Scale Matrix Eigenvalue Problem With Optimization Methods

Posted on: 2008-07-05    Degree: Master    Type: Thesis
Country: China    Candidate: X Y Qian    Full Text: PDF
GTID: 2120360215997310    Subject: Computational Mathematics
Abstract/Summary:
This thesis first surveys the origin and development of eigenvalue computation. A new truncated Newton method is then proposed for computing an extreme eigenvalue of a large-scale sparse symmetric matrix. It is well known that the Hessian of the Rayleigh quotient becomes nearly singular and ill-conditioned as the iteration approaches the optimum. To overcome this, a new strategy is proposed that requires the correction vector to be orthogonal to the current approximate eigenvector. Under this strategy, the Hessian of the Rayleigh quotient is positive definite once the iteration is sufficiently close to the smallest eigenvalue, so the second-order sufficient optimality condition holds. To accelerate convergence to the smallest eigenvalue, the search subspace is augmented with the computed modified Newton direction, yielding a subspace-accelerated truncated Newton method. Convergence of the method is proved, and numerical experiments are carried out to confirm the analysis.

Analogously to the single-eigenvalue case, a block truncated Newton method and a subspace-accelerated block truncated Newton method are proposed for computing several extreme eigenvalues. Theoretical analysis and numerical experiments are also given.
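The single-eigenvalue method described above can be illustrated with a minimal sketch (not the thesis's exact algorithm): the Rayleigh quotient is minimized by a Newton iteration in which the correction is kept orthogonal to the current approximate eigenvector, and the resulting projected Newton system is solved approximately ("truncated") by conjugate gradients. All function and variable names below are illustrative assumptions; the fallback to a steepest-descent step on non-positive curvature is a standard truncated-Newton safeguard, not something stated in the abstract.

```python
import numpy as np

def truncated_newton_min_eig(A, x0, outer=50, inner=20, tol=1e-10):
    """Minimal sketch: minimize the Rayleigh quotient x'Ax / x'x of a
    symmetric matrix A by a truncated Newton iteration.  The correction s
    is constrained to be orthogonal to the current iterate x, so the
    projected Hessian (I - xx')(A - rho*I)(I - xx') is positive definite
    near the smallest eigenvalue; the Newton system is solved inexactly
    by a few conjugate-gradient steps."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(outer):
        rho = x @ A @ x                  # Rayleigh quotient (x has unit norm)
        g = A @ x - rho * x              # gradient direction; note g is orthogonal to x
        if np.linalg.norm(g) < tol:
            break

        def H(v):                        # projected Hessian applied to v
            v = v - (x @ v) * x
            w = A @ v - rho * v
            return w - (x @ w) * x

        # Truncated CG on H s = -g, starting from s = 0; the iterates stay
        # in the subspace orthogonal to x because g is orthogonal to x.
        s = np.zeros_like(x)
        r = -g
        p = r.copy()
        rs = r @ r
        for _ in range(inner):
            Hp = H(p)
            pHp = p @ Hp
            if pHp <= 0:                 # non-positive curvature: stop CG;
                if np.linalg.norm(s) == 0:
                    s = p.copy()         # fall back to a steepest-descent step
                break
            alpha = rs / pHp
            s += alpha * p
            r -= alpha * Hp
            rs_new = r @ r
            if np.sqrt(rs_new) < 1e-12:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new

        x = x + s                        # orthogonal correction, then renormalize
        x /= np.linalg.norm(x)
    return x @ A @ x, x
```

The projected correction equation solved here has the same form as the Jacobi–Davidson correction equation. Like any Newton method, the sketch needs a reasonable starting vector; the subspace acceleration described in the abstract (collecting the computed Newton directions into a growing search subspace and extracting the Ritz approximation from it) is one way to relax that requirement, but is not implemented here.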
Keywords/Search Tags: symmetric matrix, eigenvalue, eigenvector, truncated Newton method, subspace accelerated method