Randomized (numerical) linear algebra
Motivated largely by technological developments that generate extremely large scientific and Internet data sets, recent years have seen exciting progress in the theory and practice of matrix algorithms. Particularly remarkable is the use of randomization (typically assumed to be a property of the input data due to, e.g., noise in the data generation mechanisms) as an algorithmic or computational resource for the development of improved algorithms for fundamental matrix problems such as matrix multiplication, least-squares (LS) approximation, and low-rank matrix approximation. From an applied perspective, randomized numerical linear algebra (RandNLA) is a vital new tool for machine learning, statistics, and data analysis.
In this seminar we will acquire the core concepts of RandNLA and work through its fundamental algorithms.
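To give a flavor of the subject, the following minimal sketch (in Python/NumPy, not part of the seminar materials) approximates a matrix product by sampling column/row outer products with probabilities proportional to the corresponding norms, in the spirit of the randomized matrix multiplication algorithm discussed in [DM], Chapter 4; the function name and parameters are illustrative only.

```python
import numpy as np

def approx_matmul(A, B, c, rng=None):
    """Approximate A @ B from c sampled column/row outer products."""
    if rng is None:
        rng = np.random.default_rng(0)
    # Sampling probabilities proportional to ||A[:, i]|| * ||B[i, :]||.
    norms = np.linalg.norm(A, axis=0) * np.linalg.norm(B, axis=1)
    p = norms / norms.sum()
    idx = rng.choice(A.shape[1], size=c, p=p)
    # Rescale each sampled outer product so that the estimator is unbiased.
    C = np.zeros((A.shape[0], B.shape[1]))
    for i in idx:
        C += np.outer(A[:, i], B[i, :]) / (c * p[i])
    return C

# The relative error decreases as the number of sampled outer products grows.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 300))
B = rng.standard_normal((300, 100))
exact = A @ B
approx = approx_matmul(A, B, c=1000)
print(np.linalg.norm(approx - exact) / np.linalg.norm(exact))
```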
Literature
- [DM] Petros Drineas & Michael W. Mahoney. Lectures on Randomized Numerical Linear Algebra. https://arxiv.org/abs/1712.08880v1
- [Mah] Michael W. Mahoney. Lecture Notes on Randomized Linear Algebra. https://arxiv.org/abs/1608.04481v1
- [Mar] Per-Gunnar Martinsson. Randomized methods for matrix computations. https://arxiv.org/abs/1607.01649
- [Ver] Roman Vershynin. Four lectures on probabilistic methods for data science. https://arxiv.org/abs/1612.06661
Additional Information
The seminar will consist of weekly student talks (~60 min), each followed by a discussion. Each talk will be moderated by another student participant of the seminar. Every speaker should present a detailed concept of their talk to P. Koltai at least one week before the talk; please make an appointment via peter.koltai@fu-berlin.de.
The final grade will be based on the speaker's own talk(s) and a short, 3-5 page written summary of it. Summaries must be submitted no later than February 2, 2020.
Topics will be assigned during the first class on Wednesday, Oct 16. These include:
- Concentration of sums of independent random variables ([Ver], Lecture 1)
- Concentration of sums of independent random matrices ([Ver], Lecture 2)
- Random matrix multiplication ([DM], Chap. 4)
- * A random projection algorithm for approximating matrix multiplication ([Mah], Lecture 5)
- RandNLA approaches for regression problems ([DM], Chap. 5)
- A RandNLA algorithm for low-rank matrix approximation ([DM], Chap. 6)
- * The CUR decomposition and the Nyström method ([Mar], Chap. 9 & 11)
- * Fast random projections ([Mah], Chap. 8-10); matrix completion ([Mah], Chap. 22-23; [Ver], Lecture 3)
Topics marked with an * are optional, and may be replaced or dropped in due course.
Schedule
Date | Topic | Speaker | Moderator |
---|---|---|---|
30.10.19 | Linear algebra & discrete probability ([DM] 2-3) | Özmen | Li |
06.11.19 | Concentration of sums of scalars ([Ver] 1) | Merten | Özmen |
13.11.19 | Concentration of sums of matrices ([Ver] 2) | Sirin | Lieckfeld |
20.11.19 | Randomized matrix multiplication ([DM] 4) | Xie | Kassem |
27.11.19 | Covariance estimation and matrix completion ([Ver] 3) | Atlasov | Xie |
04.12.19 | RandNLA for regression problems ([DM] 5) | Kassem | Atlasov |
11.12.19 | No class | | |
18.12.19 | RandNLA for low-rank approximation ([DM] 6) | Miller | Sirin |
08.01.20 | CUR decomposition ([Mar] 9 & 11) | Li | Merten |
22.01.20 | Randomized graph sparsification | Cvetkovic | |
??? | Random projection for matrix multiplication ([Mah] 5) | ??? | ??? |