This course focuses on algorithms that improve the performance and resource utilization of sparse linear solvers, which solve systems of the form Ax = b arising in many numerical applications. The matrix A can be dense or sparse, that is, mostly made of zero entries. Sparse matrices appear, for instance, when discretizing Partial Differential Equations (PDEs) on 2D and 3D finite element or finite volume meshes. A classical example is the modeling of the heat equation, where the continuous function is discretized at a set of points. Each point corresponds to a row of the matrix A, and the non-zero entries of A correspond to interactions between nearby points, remote interactions being neglected. Graphs and other combinatorial objects are used to model matrices and the computations performed on them. This course presents several such solvers and introduces methods from the combinatorial optimization and scheduling domains that make them efficient.
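As a small illustration (not part of the course material), here is a minimal sketch, assuming NumPy and SciPy are available, of the sparse matrix arising from a 1D finite-difference discretization of the heat equation: each grid point gives one row of A, and the only non-zeros couple a point with its immediate neighbours. The grid size and right-hand side are arbitrary.

```python
# Minimal sketch (assuming NumPy/SciPy): sparse matrix from a 1D
# finite-difference discretization of the heat equation. Each grid point is a
# row of A; non-zeros couple a point with its close neighbours only.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

n = 8                                    # number of interior grid points (illustrative)
main = 2.0 * np.ones(n)                  # diagonal: a point interacting with itself
off = -1.0 * np.ones(n - 1)              # off-diagonals: interactions with neighbours
A = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csr")

b = np.ones(n)                           # an arbitrary right-hand side
x = spsolve(A, b)                        # solve the sparse system Ax = b
print(A.toarray())
```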
Evaluation:
Bi-monthly homework assignments (1/3 of the mark):
HW1 and HW2 by B. Uçar
HW3 by G. Pichon, due on January 9th
A report and a presentation on a research topic of choice (2/3 of the mark):
In groups of one or two students
Research article
6-page report (deadline 19/01) and 30 min presentation (+ 15 min of questions)
Presentations on 23/01 and 27/01 (on a Friday for once), with the other students attending
For both the report and the presentation, we are asking for:
A presentation of the background regarding the work
The general idea of the work
A detailed analysis of some parts of the work
Discussions regarding the state of the art
You will have to show why the work presented in the paper is of interest and also give some feedback on it (what could have been done, potential extensions, parts that are not clear enough, missing information...)
Presentation on Monday, 23rd January, 16h45, given by Yasaman Asgari and Fatemeh Gashemi Mohsen. Topic: from theory to practice, optimizing the volume of communications for Cholesky factorization ("I/O-Optimal Algorithms for Symmetric Linear Algebra Kernels").
Lecture notes
Lecture 1 Notes, 14 Nov. Homework 1 is out with two questions (see the lecture notes), and is due 24 Nov, before class.
Lecture 2, 17 Nov: Linear algebra refresher. Gaussian Elimination yields the U factor of the LU decomposition; keep track of the elementary row transformations L_n × ⋯ × L_2 × L_1 × A = U; then A = L_1^{-1} × L_2^{-1} × ⋯ × L_n^{-1} × U. Each L_i can be inverted in place, and multiplying the inverses amounts to writing each inverse's entries in their place, which gives the L factor directly. What happens in the sparse case?
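A minimal dense NumPy sketch (illustrative only, without pivoting; the function name is ours) of this bookkeeping: each elimination step applies an elementary transformation L_k, and the multipliers, i.e. the entries of the L_k^{-1}, are stored in their place below the diagonal, so that at the end A = L × U.

```python
# Illustrative sketch (assuming NumPy, no pivoting): Gaussian elimination that
# records the multipliers of each elementary row transformation, so that the
# inverses placed below the diagonal form L and A = L @ U.
import numpy as np

def lu_no_pivot(A):
    A = A.astype(float).copy()
    n = A.shape[0]
    L = np.eye(n)
    for k in range(n - 1):                # eliminate entries below the k-th pivot
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]         # multiplier of the elementary transformation L_k
            A[i, k:] -= m * A[k, k:]      # row operation: apply L_k to the current matrix
            L[i, k] = m                   # store the inverse transformation in its place
    return L, np.triu(A)

A = np.array([[4.0, 3.0, 0.0],
              [6.0, 3.0, 1.0],
              [0.0, 2.0, 5.0]])
L, U = lu_no_pivot(A)
print(np.allclose(L @ U, A))              # True: A = L_1^{-1} ... L_n^{-1} U = L U
```

In the sparse case the row operations can create new non-zeros (fill-in), which is precisely what the combinatorial tools studied later in the course aim to control.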
Lecture 3 Notes, 21 Nov. Homework 2 is out with two questions (see the lecture notes), and is due 1 Dec, before class.
A Note on Perfect Elimination Digraphs, D. J. Kleitman, SIAM Journal on Computing, 3 (1974), pp. 280--282 (doi).
Toward Characterization of Perfect Elimination Digraphs, L. Haskins and D. J. Rose, SIAM Journal on Computing, 2 (1973), pp. 217--224 (doi).
Algorithmic aspects of vertex elimination on directed graphs, D. J. Rose and R. E. Tarjan, SIAM Journal on Applied Mathematics, 34 (1978), pp. 176--197 (doi).