## Course outline

### General

Yousef Saad. Iterative Methods for Sparse Linear Systems.

### Exam

### Individual project

### Module 1. Lecture 1

Lecture 1. Background in Linear Algebra. Basic definitions. Types and structures of square matrices. Vector and matrix norms.

### Lectures 2-4

Lecture 2. Range and kernel. Existence of a solution. Orthonormal vectors. Gram-Schmidt process. Thin and full QR factorization.
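As an illustration of the Gram-Schmidt material in Lecture 2, here is a minimal NumPy sketch (not part of the official course materials) of a thin QR factorization via modified Gram-Schmidt; the function name `mgs_qr` is illustrative:

```python
import numpy as np

def mgs_qr(A):
    """Thin QR via modified Gram-Schmidt: A (m x n, m >= n) = Q R,
    where Q has orthonormal columns and R is upper triangular."""
    A = np.array(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ v      # project out previous directions
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]          # normalize the new basis vector
    return Q, R
```

The modified (column-by-column update) variant is preferred over classical Gram-Schmidt for its better numerical stability.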

Lecture 3. Eigenvalues and their multiplicities. Canonical forms by similarity transformation: diagonal form, Jordan form, Schur form. Other matrix factorizations: SVD, LU, Cholesky. Positive definite matrices.

Lecture 4. Properties of normal and Hermitian matrices. Powers of matrices. Perturbation analysis and condition number. Errors and costs.

### Lectures 5-7

Lectures 5-6. Discretization of partial differential equations (PDEs). Finite differences. 1D Poisson’s equation. 2D Poisson’s equation. Overview of Finite element method. Assembly process in FEM.
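A minimal sketch of the finite-difference discretization of the 1D Poisson equation from Lectures 5-6 (illustrative only; the function name `poisson_1d` is an assumption):

```python
import numpy as np

def poisson_1d(n, f):
    """Discretize -u'' = f on (0, 1) with u(0) = u(1) = 0 by
    second-order central differences on n interior points.
    Returns the (dense, for clarity) tridiagonal matrix A,
    the interior grid x, and the right-hand side f(x)."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)       # interior grid points
    main = 2.0 * np.ones(n) / h**2       # diagonal: 2/h^2
    off = -np.ones(n - 1) / h**2         # off-diagonals: -1/h^2
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return A, x, f(x)
```

For f(x) = pi^2 sin(pi x), the exact solution is u(x) = sin(pi x), and the discrete solution agrees with it to O(h^2).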

Lecture 7. Structure and graph representations of sparse matrices. Storage schemes for sparse matrices. Algorithms for matrix-vector multiplication.
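As an illustration of the storage schemes in Lecture 7, a minimal sketch of matrix-vector multiplication in Compressed Sparse Row (CSR) format (illustrative, not from the course materials):

```python
import numpy as np

def csr_matvec(data, indices, indptr, x):
    """y = A x for A stored in CSR format:
    data    - nonzero values, listed row by row
    indices - column index of each nonzero
    indptr  - indptr[i]:indptr[i+1] delimits row i in data/indices"""
    n = len(indptr) - 1
    y = np.zeros(n)
    for i in range(n):
        # accumulate the dot product of row i with x,
        # touching only the stored nonzeros
        for k in range(indptr[i], indptr[i + 1]):
            y[i] += data[k] * x[indices[k]]
    return y
```

For example, the matrix [[1,0,2],[0,3,0],[4,0,5]] is stored as data=[1,2,3,4,5], indices=[0,2,1,0,2], indptr=[0,2,3,5].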

### Module 2. Lectures 8-10

Lecture 8. Comparison of direct and iterative methods. Overview of direct solution methods. Direct sparse methods (Gaussian elimination with partial pivoting).

Lecture 9. Iterative methods: general idea and convergence criterion. Classic iterative methods: Jacobi, Gauss-Seidel, Successive Over Relaxation (SOR), Symmetric Successive Over Relaxation (SSOR). Properties of diagonally dominant matrices, location of matrix eigenvalues. Convergence criteria for iterative methods.
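A minimal sketch of the Jacobi iteration from Lecture 9 (illustrative; parameter names are assumptions):

```python
import numpy as np

def jacobi(A, b, x0=None, tol=1e-10, maxiter=1000):
    """Jacobi iteration x_{k+1} = D^{-1} (b - (A - D) x_k),
    where D = diag(A). Converges, e.g., when A is strictly
    diagonally dominant."""
    A = np.asarray(A, dtype=float)
    b = np.asarray(b, dtype=float)
    d = np.diag(A)
    R = A - np.diag(d)                  # off-diagonal part of A
    x = np.zeros_like(b) if x0 is None else np.array(x0, dtype=float)
    for _ in range(maxiter):
        x_new = (b - R @ x) / d
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x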

Lecture 10. Projection methods: general formulation of a projection method. One-dimensional projection methods: Steepest Descent method (SDM), Minimal Residual Iteration method (MRIM), Residual Norm Steepest Descent method (RNSD).
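As an illustration of the one-dimensional projection methods in Lecture 10, a minimal sketch of the Steepest Descent method for SPD systems (illustrative only):

```python
import numpy as np

def steepest_descent(A, b, x0=None, tol=1e-10, maxiter=10000):
    """Steepest Descent for SPD A: at each step, minimize
    f(x) = 1/2 x'Ax - b'x along the residual direction r = b - A x."""
    x = np.zeros_like(b, dtype=float) if x0 is None else np.array(x0, dtype=float)
    r = b - A @ x
    for _ in range(maxiter):
        if np.linalg.norm(r) < tol:
            break
        Ar = A @ r
        alpha = (r @ r) / (r @ Ar)      # exact line search along r
        x = x + alpha * r
        r = r - alpha * Ar              # cheap residual update
    return x
```

The Minimal Residual Iteration differs only in the step length: alpha = (r'Ar)/((Ar)'(Ar)), which minimizes the residual norm instead of the A-norm of the error.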

### Module 3. Lectures 11-14

Lecture 11. Krylov subspace methods. Definition of a Krylov subspace. General formulation of a Krylov subspace method. The Arnoldi orthogonalization process for forming a basis of the Krylov subspace. The Arnoldi relation and its properties. Methods based on the Arnoldi process: Full Orthogonalization method (FOM). Derivation of FOM; restarted FOM.
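A minimal sketch of the Arnoldi process from Lecture 11 (illustrative; the function name `arnoldi` is an assumption). It builds an orthonormal basis V of K_m(A, v) and an (m+1) x m upper Hessenberg H satisfying the Arnoldi relation A V_m = V_{m+1} H:

```python
import numpy as np

def arnoldi(A, v, m):
    """m steps of Arnoldi orthogonalization starting from v."""
    n = len(v)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w       # Gram-Schmidt against the basis
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:         # "happy breakdown": K is invariant
            return V[:, :j + 1], H[:j + 2, :j + 1]
        V[:, j + 1] = w / H[j + 1, j]
    return V, H
```

FOM and GMRES both reduce the original system to a small problem involving H built by this process.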

Lecture 12. Methods based on the Arnoldi process: Generalized Minimal Residual method (GMRES). Givens rotations in GMRES. Calculation of the residual in FOM and GMRES. Residual polynomials.

Lecture 13. Lanczos orthogonalization for symmetric systems. Lanczos methods for symmetric systems: classic and direct. Derivation of Direct Lanczos method.

Lecture 14. Derivation of Conjugate Gradient method (CG) for systems with symmetric positive definite matrices. Generalization of CG for systems with Hermitian and nonsymmetric matrices: Conjugate Residual (CR), Generalized Conjugate Residual (GCR). Lanczos biorthogonalization for nonsymmetric systems. Classic Lanczos method for nonsymmetric systems.
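As an illustration of Lecture 14, a minimal sketch of the Conjugate Gradient method for symmetric positive definite systems (illustrative, not the course's reference implementation):

```python
import numpy as np

def cg(A, b, tol=1e-10, maxiter=1000):
    """Conjugate Gradient for symmetric positive definite A."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x                       # residual
    p = r.copy()                        # first search direction
    rs = r @ r
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rs / (p @ Ap)           # optimal step along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p       # A-conjugate direction update
        rs = rs_new
    return x
```

In exact arithmetic CG terminates in at most n steps, since the search directions are mutually A-conjugate.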

### Lectures 15-16

Lecture 15. Derivation of Biconjugate Gradient method (BiCG). Overview: Efficient and optimal methods. Basic ideas of preconditioning technique. Examples of preconditioners: Jacobi, Gauss-Seidel, SOR and SSOR preconditioners, incomplete LU-factorization preconditioners.

Lecture 16. Preconditioned Krylov subspace methods. Preconditioned Conjugate Gradient method (PCG), Split Preconditioned Conjugate Gradient method (Split PCG). Preconditioned Generalized Minimal Residual method; algorithms of GMRES with left and right preconditioning.
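A minimal sketch of PCG from Lecture 16, using the Jacobi (diagonal) preconditioner from Lecture 15 as M (illustrative; the function name `pcg` and the way M is passed are assumptions):

```python
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-10, maxiter=1000):
    """Preconditioned CG with a diagonal (Jacobi) preconditioner:
    M = diag(A), applied as z = M^{-1} r (elementwise division)."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x
    z = M_inv_diag * r                  # preconditioned residual
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x
```

Replacing the elementwise division with a sparse triangular solve gives the SSOR or incomplete-LU preconditioned variants with the same loop structure.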
