General
Yousef Saad. Iterative Methods for Sparse Linear Systems
Lecture 1. Background in Linear Algebra. Basic definitions. Types and structures of square matrices.
Lecture 2. Vector and matrix norms. Range and kernel. Existence of a solution. Orthonormal vectors.
Lecture 3. Gram-Schmidt process. Eigenvalues and their multiplicities. Basic matrix factorizations and canonical forms: QR, diagonal form.
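For Lecture 3, a minimal NumPy sketch of the Gram-Schmidt process, here in its modified variant, producing a thin QR factorization. The function name and the check at the end are illustrative assumptions, not part of the course materials.

```python
import numpy as np

def modified_gram_schmidt(A):
    """Thin QR factorization A = Q @ R via modified Gram-Schmidt.

    A is m x n with linearly independent columns; Q has orthonormal
    columns and R is upper triangular.
    """
    A = np.array(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    V = A.copy()
    for j in range(n):
        R[j, j] = np.linalg.norm(V[:, j])
        Q[:, j] = V[:, j] / R[j, j]
        # Orthogonalize the remaining columns against q_j immediately;
        # this is what distinguishes the modified from the classical variant.
        for k in range(j + 1, n):
            R[j, k] = Q[:, j] @ V[:, k]
            V[:, k] -= R[j, k] * Q[:, j]
    return Q, R

if __name__ == "__main__":
    A = np.random.rand(6, 4)
    Q, R = modified_gram_schmidt(A)
    print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(4)))
```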
Lecture 4. Basic matrix factorizations: Jordan form, Schur form, SVD, LU, Cholesky.
Lecture 5. Properties of normal, Hermitian, and positive definite matrices. Perturbation analysis and condition number. Errors and costs.
Lectures 6-7. Discretization of partial differential equations (PDEs). Finite differences. 1D Poisson's equation. 2D Poisson's equation. Overview of the finite element method (FEM). Assembly process in FEM.
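For Lectures 6-7, a sketch of the standard three-point finite-difference discretization of the 1D Poisson equation -u'' = f on (0, 1) with homogeneous Dirichlet boundary conditions. The dense assembly and the manufactured right-hand side are simplifying assumptions for illustration.

```python
import numpy as np

def poisson_1d(n):
    """Assemble the n x n finite-difference matrix for -u'' = f on (0, 1)
    with u(0) = u(1) = 0, using n interior points and mesh size h = 1/(n+1).
    The stencil is (-1, 2, -1) / h^2.
    """
    h = 1.0 / (n + 1)
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    return A, h

if __name__ == "__main__":
    n = 50
    A, h = poisson_1d(n)
    x = np.linspace(h, 1.0 - h, n)
    # Manufactured solution u(x) = sin(pi x), so f = pi^2 sin(pi x).
    f = np.pi**2 * np.sin(np.pi * x)
    u = np.linalg.solve(A, f)
    print("max error:", np.max(np.abs(u - np.sin(np.pi * x))))
```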
Lecture 8. Structure and graph representations of sparse matrices. Storage schemes for sparse matrices. Algorithms for matrix-by-vector multiplication.
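A possible illustration of Lecture 8: the Compressed Sparse Row (CSR) storage scheme and the corresponding matrix-by-vector product, written without SciPy so the three CSR arrays are explicit. All names here are assumptions.

```python
import numpy as np

def dense_to_csr(A):
    """Convert a dense matrix to CSR arrays (values, column indices, row pointers)."""
    vals, cols, rowptr = [], [], [0]
    for row in A:
        for j, a in enumerate(row):
            if a != 0.0:
                vals.append(a)
                cols.append(j)
        rowptr.append(len(vals))
    return np.array(vals), np.array(cols), np.array(rowptr)

def csr_matvec(vals, cols, rowptr, x):
    """y = A @ x where A is stored in CSR format."""
    n = len(rowptr) - 1
    y = np.zeros(n)
    for i in range(n):
        for k in range(rowptr[i], rowptr[i + 1]):
            y[i] += vals[k] * x[cols[k]]
    return y

if __name__ == "__main__":
    A = np.array([[4.0, 0, 1], [0, 3.0, 0], [1, 0, 2.0]])
    x = np.array([1.0, 2.0, 3.0])
    vals, cols, rowptr = dense_to_csr(A)
    print(np.allclose(csr_matvec(vals, cols, rowptr, x), A @ x))
```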
Lecture 9. Comparison of direct and iterative methods. Overview of direct solution methods. Direct sparse methods (Gaussian elimination with partial pivoting).
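As a reference point for Lecture 9, a compact sketch of dense LU factorization with partial pivoting (PA = LU); actual sparse direct solvers add fill-reducing orderings and symbolic analysis, which are omitted here.

```python
import numpy as np

def lu_partial_pivoting(A):
    """Return P, L, U with P @ A = L @ U, using row partial pivoting."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    P = np.eye(n)
    L = np.eye(n)
    U = A.copy()
    for k in range(n - 1):
        # Pick the row with the largest pivot in column k and swap it up.
        p = k + np.argmax(np.abs(U[k:, k]))
        if p != k:
            U[[k, p], :] = U[[p, k], :]
            P[[k, p], :] = P[[p, k], :]
            L[[k, p], :k] = L[[p, k], :k]
        # Eliminate the entries below the pivot.
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]
            U[i, k:] -= L[i, k] * U[k, k:]
    return P, L, U

if __name__ == "__main__":
    A = np.random.rand(5, 5)
    P, L, U = lu_partial_pivoting(A)
    print(np.allclose(P @ A, L @ U))
```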
Lecture 10. Iterative methods: general idea and convergence criterion. Classic iterative methods: Jacobi, Gauss-Seidel, Successive Over Relaxation (SOR), Symmetric Successive Over Relaxation (SSOR). Properties of diagonally dominant matrices, location of matrix eigenvalues. Convergence criteria for iterative methods.
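A minimal sketch of the Jacobi iteration from Lecture 10, x_{k+1} = D^{-1}(b - (A - D) x_k), which converges for example when A is strictly diagonally dominant; the relative-residual stopping test is an assumption.

```python
import numpy as np

def jacobi(A, b, x0=None, tol=1e-8, maxiter=1000):
    """Jacobi iteration: split A = D + R with D the diagonal of A,
    then iterate x_{k+1} = D^{-1} (b - R x_k)."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    D = np.diag(A)
    R = A - np.diag(D)          # off-diagonal part of A
    for k in range(maxiter):
        x_new = (b - R @ x) / D
        if np.linalg.norm(b - A @ x_new) <= tol * np.linalg.norm(b):
            return x_new, k + 1
        x = x_new
    return x, maxiter

if __name__ == "__main__":
    A = np.array([[4.0, 1, 0], [1, 5.0, 2], [0, 2, 6.0]])   # diagonally dominant
    b = np.array([1.0, 2.0, 3.0])
    x, iters = jacobi(A, b)
    print(iters, np.allclose(A @ x, b, atol=1e-6))
```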
Lecture 11. Projection methods: general formulation of a projection method. One-dimensional projection methods: Steepest Descent method (SDM), Minimal Residual Iteration method (MRIM), Residual Norm Steepest Descent method (RNSD).
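The one-dimensional projection methods of Lecture 11 share the update x <- x + alpha * r and differ in the choice of alpha; the sketch below contrasts Steepest Descent (SPD A, alpha minimizes the A-norm of the error) with Minimal Residual Iteration (alpha minimizes the residual 2-norm). Function names and the test problem are assumptions.

```python
import numpy as np

def steepest_descent(A, b, x0, tol=1e-8, maxiter=500):
    """Steepest Descent for SPD A: search along the residual direction r,
    with alpha = (r, r) / (A r, r)."""
    x = x0.astype(float)
    for k in range(maxiter):
        r = b - A @ x
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k
        Ar = A @ r
        x = x + (r @ r) / (r @ Ar) * r
    return x, maxiter

def minimal_residual_iteration(A, b, x0, tol=1e-8, maxiter=500):
    """Minimal Residual Iteration: same direction r, but alpha = (A r, r) / (A r, A r)
    minimizes the 2-norm of the new residual."""
    x = x0.astype(float)
    for k in range(maxiter):
        r = b - A @ x
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k
        Ar = A @ r
        x = x + (Ar @ r) / (Ar @ Ar) * r
    return x, maxiter

if __name__ == "__main__":
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    x0 = np.zeros(2)
    print(steepest_descent(A, b, x0)[0], minimal_residual_iteration(A, b, x0)[0])
```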
Lecture 12. Krylov subspace methods. Definition of a Krylov subspace. General formulation of a Krylov subspace method. The Arnoldi orthogonalization process for building a basis of the Krylov subspace. The Arnoldi relation and its properties.
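A sketch of the Arnoldi process from Lecture 12, using modified Gram-Schmidt to build an orthonormal basis of the Krylov subspace together with the upper Hessenberg matrix satisfying the Arnoldi relation A V_m = V_{m+1} \bar{H}_m. The breakdown tolerance is an assumption.

```python
import numpy as np

def arnoldi(A, v, m):
    """Run m steps of Arnoldi with modified Gram-Schmidt.

    Returns V (n x (m+1)) with orthonormal columns and H ((m+1) x m)
    upper Hessenberg such that A @ V[:, :m] = V @ H.
    """
    n = len(v)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-14:      # happy breakdown: the Krylov subspace is invariant
            return V[:, :j + 1], H[:j + 1, :j]
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

if __name__ == "__main__":
    A = np.random.rand(8, 8)
    v = np.random.rand(8)
    V, H = arnoldi(A, v, 5)
    print(np.allclose(A @ V[:, :5], V @ H))     # Arnoldi relation
```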
Lecture 13. Methods based on the Arnoldi process: the Full Orthogonalization Method (FOM). Derivation of FOM; restarted FOM.
Lecture 14. Methods based on the Arnoldi process: the Generalized Minimal Residual method (GMRES). Givens rotations in GMRES. Calculation of the residual in FOM and GMRES. Residual polynomials.
Lecture 15. Lanczos orthogonalization for symmetric systems. Lanczos methods for symmetric systems: classic and direct. Derivation of the Direct Lanczos method. Derivation of the Conjugate Gradient method (CG) for systems with symmetric positive definite matrices. Generalizations of CG for systems with Hermitian and nonsymmetric matrices: Conjugate Residual (CR), Generalized Conjugate Residual (GCR).
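A minimal sketch of the Conjugate Gradient method from Lecture 15 for symmetric positive definite systems; the stopping rule and the tridiagonal test problem are illustrative assumptions.

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, maxiter=None):
    """CG for symmetric positive definite A."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    maxiter = n if maxiter is None else maxiter
    for k in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            return x, k + 1
        beta = rs_new / rs      # new search direction is A-conjugate to the old ones
        p = r + beta * p
        rs = rs_new
    return x, maxiter

if __name__ == "__main__":
    n = 100
    # SPD test matrix: 1D Poisson-like tridiagonal.
    A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.ones(n)
    # Generous iteration cap to absorb finite-precision delay.
    x, iters = conjugate_gradient(A, b, maxiter=500)
    print(iters, np.linalg.norm(A @ x - b))
```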
Lecture 16. Lanczos biorthogonalization for nonsymmetric systems. Classic Lanczos method for nonsymmetric systems. Derivation of Biconjugate Gradient method (BiCG). Overview: Efficient and optimal methods.
Lecture 17. Basic ideas of the preconditioning technique. Examples of preconditioners: Jacobi, Gauss-Seidel, SOR and SSOR preconditioners, incomplete LU-factorization preconditioners. Extra class study: Preconditioned Krylov Subspace methods. Preconditioned Conjugate Gradient method (PCG), Split Preconditioned Conjugate Gradient method (Split PCG). Preconditioned Generalized Minimal Residual method; algorithms of GMRES with left and right preconditioning.
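A sketch of the Preconditioned Conjugate Gradient method from Lecture 17 with the simplest choice of preconditioner, M = diag(A) (Jacobi); an incomplete-LU or SSOR preconditioner would only change the solve with M. The test matrix is an assumption made so that Jacobi scaling is actually helpful.

```python
import numpy as np

def pcg(A, b, M_diag, x0=None, tol=1e-10, maxiter=None):
    """Preconditioned CG for SPD A with a diagonal (Jacobi) preconditioner
    M = diag(M_diag); the preconditioner solve is a componentwise division."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x
    z = r / M_diag                 # z = M^{-1} r
    p = z.copy()
    rz = r @ z
    maxiter = n if maxiter is None else maxiter
    for k in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k + 1
        z = r / M_diag
        rz_new = r @ z
        beta = rz_new / rz
        p = z + beta * p
        rz = rz_new
    return x, maxiter

if __name__ == "__main__":
    n = 200
    # SPD test matrix: 1D Poisson plus a varying diagonal shift.
    A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    A += np.diag(np.linspace(1.0, 10.0, n))
    b = np.ones(n)
    x, iters = pcg(A, b, np.diag(A))
    print(iters, np.linalg.norm(A @ x - b))
```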