- Individual project
- Module 1. Lecture 1
Lecture 1. Background in Linear Algebra. Basic definitions. Types and structures of square matrices.
- Lectures 2-5
Lecture 2. Vector and matrix norms. Range and kernel. Existence of solutions. Orthonormal vectors. Gram-Schmidt process.
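The Gram-Schmidt process from Lecture 2 can be sketched in a few lines. This is a minimal illustration (the modified variant, which is numerically more stable than classical Gram-Schmidt), not course-provided code; the example matrix is arbitrary.

```python
import numpy as np

def gram_schmidt(A):
    """Modified Gram-Schmidt: orthonormalize the columns of A."""
    A = np.array(A, dtype=float)
    n, m = A.shape
    Q = np.zeros((n, m))
    for j in range(m):
        v = A[:, j].copy()
        for i in range(j):
            # Subtract the component along each already-built basis vector
            v -= (Q[:, i] @ v) * Q[:, i]
        Q[:, j] = v / np.linalg.norm(v)
    return Q

Q = gram_schmidt([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
```

The columns of `Q` are orthonormal: `Q.T @ Q` is the identity.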
Lecture 3. Eigenvalues and their multiplicities. Basic matrix factorizations and canonical forms: QR, diagonal form, Jordan form, Schur form.
Lecture 4. Basic matrix factorizations: SVD, LU, Cholesky. Properties of normal, Hermitian, and positive definite matrices.
Lecture 5. Perturbation analysis and condition number. Errors and costs.
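The effect of the condition number from Lecture 5 can be demonstrated numerically: for a system with condition number kappa(A), a relative perturbation of the right-hand side can be amplified in the solution by up to kappa(A). A minimal sketch with a deliberately ill-conditioned diagonal matrix (the matrix and perturbation are illustrative choices):

```python
import numpy as np

# Diagonal matrix with singular values 1 and 1e-8, so kappa(A) = 1e8
A = np.array([[1.0, 0.0], [0.0, 1e-8]])
b = np.array([1.0, 1e-8])          # exact solution x = (1, 1)
x = np.linalg.solve(A, b)

# A tiny perturbation of b changes the solution far more, in relative terms
db = np.array([0.0, 1e-10])
x_pert = np.linalg.solve(A, b + db)

kappa = np.linalg.cond(A)          # 2-norm condition number
amplification = (np.linalg.norm(x_pert - x) / np.linalg.norm(x)) \
                / (np.linalg.norm(db) / np.linalg.norm(b))
```

Here the relative error in `x` is amplified by a factor close to (but bounded by) `kappa`.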
- Lectures 6-8
Lectures 6-7. Discretization of partial differential equations (PDEs). Finite differences. 1D Poisson’s equation. 2D Poisson’s equation. Overview of Finite element method. Assembly process in FEM.
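The 1D Poisson discretization from Lectures 6-7 can be sketched directly: the standard second-order central difference turns -u'' = f on (0,1) with homogeneous Dirichlet conditions into a tridiagonal system. This is an illustrative sketch (dense storage for brevity; a real solver would use a sparse or banded format):

```python
import numpy as np

def poisson_1d(n, f):
    """Solve -u'' = f on (0,1), u(0) = u(1) = 0, with n interior points."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    # Second-order central difference: tridiag(-1, 2, -1) / h^2
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    u = np.linalg.solve(A, f(x))
    return x, u

# -u'' = pi^2 sin(pi x) has exact solution u = sin(pi x)
x, u = poisson_1d(100, lambda x: np.pi**2 * np.sin(np.pi * x))
```

The discretization error is O(h^2), so with 100 interior points the computed `u` matches sin(pi x) to roughly 1e-4.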
Lecture 8. Structure and graph representations of sparse matrices. Storage schemes for sparse matrices. Algorithms for matrix-by-vector multiplication.
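The matrix-by-vector multiplication from Lecture 8 is easiest to see against a concrete storage scheme. A minimal sketch of the Compressed Sparse Row (CSR) format, in which only nonzeros are stored and row i occupies the slice `indptr[i]:indptr[i+1]` of `data`/`indices`:

```python
def csr_matvec(data, indices, indptr, x):
    """Compute y = A @ x for A stored in CSR format."""
    n = len(indptr) - 1
    y = [0.0] * n
    for i in range(n):
        # Nonzeros of row i live in data[indptr[i]:indptr[i+1]]
        for k in range(indptr[i], indptr[i + 1]):
            y[i] += data[k] * x[indices[k]]
    return y

# A = [[2, 0, 1],
#      [0, 3, 0],
#      [4, 0, 5]]
data    = [2.0, 1.0, 3.0, 4.0, 5.0]
indices = [0, 2, 1, 0, 2]
indptr  = [0, 2, 3, 5]
y = csr_matvec(data, indices, indptr, [1.0, 1.0, 1.0])
```

The cost is proportional to the number of stored nonzeros, not to n^2.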
- Module 2. Lectures 9-12
Lecture 9. Comparison of direct and iterative methods. Overview of direct solution methods. Direct sparse methods (Gaussian elimination with partial pivoting).
Lecture 10. Iterative methods: general idea and convergence criterion. Classic iterative methods: Jacobi, Gauss-Seidel, Successive Over Relaxation (SOR), Symmetric Successive Over Relaxation (SSOR). Properties of diagonally dominant matrices, location of matrix eigenvalues. Convergence criteria for iterative methods.
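The Jacobi method from Lecture 10 can be sketched in a few lines: split A = D + R with D the diagonal, and iterate x_{k+1} = D^{-1}(b - R x_k). The sketch below uses a small strictly diagonally dominant system (an illustrative choice), for which convergence is guaranteed:

```python
import numpy as np

def jacobi(A, b, tol=1e-10, maxit=1000):
    """Jacobi iteration: x_{k+1} = D^{-1} (b - R x_k), A = D + R."""
    d = np.diag(A)
    R = A - np.diag(d)
    x = np.zeros_like(b)
    for _ in range(maxit):
        x_new = (b - R @ x) / d
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Strictly diagonally dominant, so Jacobi converges; exact solution (1, 2)
A = np.array([[4.0, 1.0], [2.0, 5.0]])
b = np.array([6.0, 12.0])
x = jacobi(A, b)
```

Gauss-Seidel and SOR follow the same splitting idea with different choices of the "easy" part of A.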
Lectures 11-12. Projection methods: general formulation of a projection method. One-dimensional projection methods: Steepest Descent method (SDM), Minimal Residual Iteration method (MRIM), Residual Norm Steepest Descent method (RNSD).
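The Steepest Descent method from Lectures 11-12 is the simplest one-dimensional projection method: for SPD A, step along the residual r = b - Ax with the exact line-search step alpha = (r, r)/(Ar, r). A minimal sketch (the test matrix is an arbitrary small SPD example):

```python
import numpy as np

def steepest_descent(A, b, tol=1e-10, maxit=10000):
    """Steepest descent for SPD A: one-dimensional projection onto span{r}."""
    x = np.zeros_like(b)
    for _ in range(maxit):
        r = b - A @ x
        if np.linalg.norm(r) < tol:
            break
        alpha = (r @ r) / (r @ (A @ r))   # exact minimizer along r
        x = x + alpha * r
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])    # symmetric positive definite
b = np.array([5.0, 5.0])                  # exact solution (1, 2)
x = steepest_descent(A, b)
```

MRIM replaces the step choice so that the residual norm, rather than the A-norm of the error, is minimized at each step.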
- Module 3. Lectures 13-15
Lecture 13. Krylov subspace methods. Definition of Krylov subspace. General formulation of a Krylov subspace method. The process of Arnoldi orthogonalization to form a basis for Krylov subspace. Arnoldi relation and its properties.
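The Arnoldi process from Lecture 13 can be sketched as modified Gram-Schmidt applied to successive products A v_j. A minimal illustration, not course code: it returns V with orthonormal columns spanning the Krylov subspace and an upper Hessenberg H satisfying the Arnoldi relation A V_m = V_{m+1} H (assuming no breakdown, i.e. H[j+1, j] != 0):

```python
import numpy as np

def arnoldi(A, v, m):
    """m steps of Arnoldi: A @ V[:, :m] == V @ H, H upper Hessenberg."""
    n = len(v)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    V[:, 0] = v / np.linalg.norm(v)
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):          # orthogonalize against V[:, :j+1]
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)  # breakdown if this is zero
        V[:, j + 1] = w / H[j + 1, j]
    return V, H

A = np.array([[2.0, 1.0, 0.0], [0.0, 3.0, 1.0], [1.0, 0.0, 4.0]])
V, H = arnoldi(A, np.array([1.0, 0.0, 0.0]), 2)
```

Both FOM and GMRES (Lectures 14-15) work by projecting the problem onto this small Hessenberg system.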
Lecture 14. Methods based on Arnoldi process: Full Orthogonalization method (FOM). Derivation of FOM, restarted FOM.
Lecture 15. Methods based on Arnoldi process: Generalized Minimal Residual method (GMRES). Givens rotations in GMRES. Calculation of residual in FOM and GMRES. Residual polynomials.
- Lectures 16-18
Lecture 16. Lanczos orthogonalization for symmetric systems. Lanczos methods for symmetric systems: classic and direct. Derivation of Direct Lanczos method. Derivation of Conjugate Gradient method (CG) for systems with symmetric positive definite matrices. Generalization of CG for systems with Hermitian and nonsymmetric matrices: Conjugate Residual (CR), Generalized Conjugate Residual (GCR).
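The Conjugate Gradient method from Lecture 16 can be sketched with its standard two-term recurrences: the search directions are A-conjugate, so for an SPD n-by-n system the exact solution is reached in at most n steps in exact arithmetic. A minimal sketch (the test system is an arbitrary small SPD example):

```python
import numpy as np

def cg(A, b, tol=1e-10, maxit=1000):
    """Conjugate Gradient for symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(maxit):
        if np.sqrt(rs) < tol:
            break
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p    # new direction, A-conjugate to the old
        rs = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
b = np.array([1.0, 2.0])
x = cg(A, b)
```

Note that only one matrix-vector product per iteration is needed, which is what makes CG attractive for large sparse SPD systems.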
Lecture 17. Lanczos biorthogonalization for nonsymmetric systems. Classic Lanczos method for nonsymmetric systems. Derivation of Biconjugate Gradient method (BiCG). Overview: Efficient and optimal methods.
Lecture 18. Basic ideas of preconditioning technique. Examples of preconditioners: Jacobi, Gauss-Seidel, SOR and SSOR preconditioners, incomplete LU-factorization preconditioners. Extra class study: Preconditioned Krylov Subspace methods. Preconditioned Conjugate Gradient method (PCG), Split Preconditioned Conjugate Gradient method (Split PCG). Preconditioned Generalized Minimal Residual method, algorithms of GMRES with left and right preconditioning.
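The simplest preconditioner from Lecture 18, Jacobi (M = diag(A)), combined with the PCG recurrence gives a compact illustration of how preconditioning enters a Krylov method: each iteration applies M^{-1} to the residual to get the preconditioned residual z. A minimal sketch under the assumption that A is SPD with a positive diagonal (the test system is illustrative):

```python
import numpy as np

def pcg_jacobi(A, b, tol=1e-10, maxit=1000):
    """PCG with the Jacobi preconditioner M = diag(A)."""
    d = np.diag(A)              # applying M^{-1} is division by the diagonal
    x = np.zeros_like(b)
    r = b - A @ x
    z = r / d                   # preconditioned residual z = M^{-1} r
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        if np.linalg.norm(r) < tol:
            break
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        z = r / d
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

A = np.array([[10.0, 1.0], [1.0, 20.0]])
b = np.array([11.0, 21.0])     # exact solution (1, 1)
x = pcg_jacobi(A, b)
```

Setting `d` to all ones recovers plain CG; stronger preconditioners (SSOR, incomplete LU) only change how z is computed from r.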