
The Researches Of Iterative Methods And Preconditioning Techniques For Linear Systems And Saddle Point Problems

Posted on: 2010-03-24
Degree: Doctor
Type: Dissertation
Country: China
Candidate: L. T. Zhang
Full Text: PDF
GTID: 1100360275480056
Subject: Applied Mathematics

Abstract/Summary:
The solution of large-scale sparse linear systems arises widely in scientific computing and in the engineering disciplines. Solving such systems plays a key role in scientific and engineering computation, often accounting for as much as 80% of the total computing time, so effective numerical methods for large-scale linear algebraic systems have attracted growing attention. Large sparse linear systems are usually solved by iterative methods, and the convergence and convergence rates of these methods have been studied in depth by many researchers. This thesis studies iterative methods for special classes of matrices arising in the iterative solution of large sparse linear systems, in particular the convergence of relaxed matrix multisplitting iterative methods, the performance of two Krylov subspace methods, and preconditioning techniques for saddle point problems. The thesis consists of four parts in six chapters.

The first part studies relaxed matrix multisplitting iterative methods for H-matrices and gives a detailed theoretical analysis and comparison of convergence and divergence rates. On the one hand, we present a relaxed matrix multisplitting TOR iterative method, study its convergence, compare the corresponding convergence and divergence rates, and carry out sequential and parallel experiments that demonstrate the validity of the method. On the other hand, we give a relaxed matrix multisplitting USAOR iterative method, study its convergence, give examples that put the algorithms into practice, and carry out numerical experiments comparing it with the corresponding existing methods.

The second part further studies new convergence results for several relaxed matrix multisplitting iterative methods for H-matrices.
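The relaxed multisplitting idea can be illustrated with a minimal NumPy sketch. The function name `relaxed_multisplitting` and the particular splittings below are illustrative assumptions, not the thesis's TOR or USAOR methods, which use specific splittings and several relaxation parameters; the sketch only shows the common structure of weighted local solves combined under a relaxation parameter.

```python
import numpy as np

def relaxed_multisplitting(A, b, splittings, weights, omega=1.0,
                           tol=1e-10, max_iter=500):
    """Relaxed matrix multisplitting iteration (illustrative sketch).

    Each splitting A = M_i - N_i yields a local update
    M_i^{-1} (N_i x + b); the diagonal weighting matrices E_i
    (summing to the identity) combine the local results, and
    omega is the relaxation parameter.
    """
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        x_new = np.zeros_like(x)
        for (M, N), E in zip(splittings, weights):
            x_new += E @ np.linalg.solve(M, N @ x + b)
        x_new = (1.0 - omega) * x + omega * x_new
        if np.linalg.norm(A @ x_new - b) < tol * np.linalg.norm(b):
            return x_new
        x = x_new
    return x

# Strictly diagonally dominant example (hence an H-matrix), with a
# forward and a backward Gauss-Seidel-type splitting weighted equally.
A = np.array([[4.0, -1.0, 0.0],
              [-1.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([2.0, 4.0, 10.0])
M1 = np.tril(A)                      # lower triangular part
M2 = np.triu(A)                      # upper triangular part
splittings = [(M1, M1 - A), (M2, M2 - A)]
weights = [0.5 * np.eye(3), 0.5 * np.eye(3)]
x = relaxed_multisplitting(A, b, splittings, weights, omega=0.9)
```

The weighting matrices are what make the scheme parallel: each local solve with M_i can be carried out on a separate processor before the weighted results are combined.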
We analyze non-stationary matrix multisplitting iterative methods for almost linear systems, matrix multisplitting iterative methods for linear complementarity problems, the relaxed matrix multisplitting SSOR iterative method, and the relaxed matrix multisplitting TOR iterative method for linear systems. This part studies the convergence theory of these methods, establishes new, weaker convergence conditions, and carries out comparative numerical experiments. Because multisplittings are inherently parallel, the study and theory of matrix multisplitting also contribute to the construction of multisplitting preconditioners and thus have both theoretical and practical value. Our methods have more optional parameters, so faster convergence rates can be obtained and more effective preconditioners constructed when approximately optimal relaxation parameters are chosen.

The third part designs a Krylov subspace conjugate residual squared (CRS) algorithm for nonsymmetric linear systems and an improved conjugate residual squared (ICRS) algorithm for distributed parallel computing, both derived from the biconjugate residual (BiCR) algorithm, and gives a theoretical analysis and comparison of the two algorithms. Sequential and parallel numerical experiments show that the CRS and ICRS methods converge faster than the BiCR method and that the ICRS method has better parallel performance than the CRS method.

The fourth part studies generalized block preconditioning techniques for the iterative solution of saddle point problems arising from the discretized time-harmonic Maxwell equations and from an interior point optimization method. On the one hand, a new block triangular preconditioner for linear systems arising from the interior point optimization method is obtained, and the corresponding theoretical analysis and numerical experiments demonstrate the validity of the preconditioner.
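The thesis's CRS algorithm is not reproduced here. As a structural stand-in, the classical conjugate gradient squared (CGS) method is sketched below: CGS applies to BiCG the same transpose-free "squaring" idea that CRS applies to BiCR, so the recurrence structure is closely analogous.

```python
import numpy as np

def cgs(A, b, tol=1e-10, max_iter=200):
    """Conjugate gradient squared (CGS) for nonsymmetric systems.

    A structural stand-in for CRS: both are transpose-free 'squared'
    Krylov methods, derived from BiCG and BiCR respectively.
    """
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x
    r_tilde = r.copy()              # fixed shadow residual
    u = r.copy()
    p = r.copy()
    rho = r_tilde @ r
    for _ in range(max_iter):
        v = A @ p
        alpha = rho / (r_tilde @ v)
        q = u - alpha * v
        x += alpha * (u + q)
        r -= alpha * (A @ (u + q))
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        rho_new = r_tilde @ r
        beta = rho_new / rho
        u = r + beta * q
        p = u + beta * (q + beta * p)
        rho = rho_new
    return x

# Small nonsymmetric test system.
A = np.array([[5.0, 1.0, 0.0],
              [2.0, 6.0, 1.0],
              [0.0, 2.0, 7.0]])
b = np.array([6.0, 9.0, 9.0])
x = cgs(A, b)
```

Like CRS, the method requires no products with the transpose of A, which is what makes this family attractive for distributed parallel implementations where transpose access is expensive.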
On the other hand, we study in depth block triangular preconditioners with multiple parameters, based on block triangular saddle point linear systems and on the structure of the special matrices arising from the time-harmonic Maxwell equations, and give a theoretical analysis together with a theoretical choice of optimal parameters. In particular, the analysis shows that all eigenvalues of the two generalized block triangular preconditioned matrices are strongly clustered. Finally, numerical experiments show that the presented block triangular preconditioners outperform the corresponding augmentation-free and Schur-complement-free block diagonal preconditioners.
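The eigenvalue-clustering effect referred to above can be seen in a minimal sketch. The ideal block triangular preconditioner with the exact Schur complement is assumed here for illustration; the thesis's multi-parameter preconditioners for the Maxwell problem are more elaborate and are not reproduced.

```python
import numpy as np

# Saddle point system K = [[A, B^T], [B, 0]] with the ideal block
# triangular preconditioner P = [[A, B^T], [0, -S]], where
# S = B A^{-1} B^T is the Schur complement. Then P^{-1} K is similar
# to a unit triangular matrix, so all its eigenvalues equal 1.
rng = np.random.default_rng(0)
n, m = 8, 3
A = rng.standard_normal((n, n))
A = A @ A.T + n * np.eye(n)          # SPD (1,1) block
B = rng.standard_normal((m, n))      # full-row-rank constraint block

K = np.block([[A, B.T], [B, np.zeros((m, m))]])
S = B @ np.linalg.solve(A, B.T)      # exact Schur complement
P = np.block([[A, B.T], [np.zeros((m, n)), -S]])

eigs = np.linalg.eigvals(np.linalg.solve(P, K))
# Spread of the eigenvalues around 1 (small, limited only by roundoff
# on the defective eigenvalue).
print(np.max(np.abs(eigs - 1.0)))
```

In practice the Schur complement is too expensive to form exactly, which is why approximations and tunable parameters, as studied in the thesis, are introduced; the closer the clustering of the preconditioned spectrum, the faster a Krylov method converges on the preconditioned system.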
Keywords: sparse nonsymmetric linear systems, H-matrix, M-matrix, eigenvalue, matrix multisplitting method, Krylov subspace method, conjugate residual squared method, global communication, preconditioning technique, saddle point problem