Lecture: Wed 10am-12pm by Péter Koltai (peter.koltai@fu-berlin.de)
Exercise: Wed 4pm - 6pm by Mattes Mollenhauer (mattes.mollenhauer@fu-berlin.de)
The lectures will be held online in Webex Meetings. The link to the upcoming event will be communicated through the Announcements.
Lecture notes and additional material will be provided in Resources.
To facilitate a smooth workflow, please consider the following:
Tutorial sessions will be held online via Webex, just like the main lecture. There will be exercise sheets. It is expected that you prepare presentations of solutions to the exercises - you will not be required to submit written solutions in any form.
Possible ways to participate in the tutorial sessions include sharing your screen and presenting a particular exercise.
Before presenting in a tutorial session, please perform a screen share in a test session with the Webex application on your platform.
We will discuss more details in the first tutorial session.
Time: Fri, July 24, 2020, 8am.
Location: Room L115, seminar center, address:
Silberlaube (ground floor)
Otto-von-Simson-Str. 26
14195 Berlin-Dahlem
Time: Wed, September 2, 2020, 8am.
Location: Room SR031, Arnimallee 6 (Pi-building)
In this lecture we will discuss the mathematical foundations of 'machine learning'. There are roughly two classes of methods: 'supervised learning' usually refers to methods that, from given input data x1, ..., xn and associated output data y1, ..., yn, estimate a function y = f(x) linking input and output. In 'unsupervised learning' no output data is available; instead, one tries to find structure in the input data. This structure can be geometric in nature (does the input data lie on a manifold?) or topological in nature (which input data are 'similar'? Are there interesting subgroups? How are they connected?).
In the lecture we will work out the mathematical foundations of various machine learning methods. Our focus will be on understanding why (and in what sense) these methods work. We will deepen our understanding by means of many numerical examples. Focal points include: kernel regression, support vector machines, manifold learning, spectral clustering, and high-dimensional probability.
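To give a flavor of the first focal point, kernel regression estimates f(x) as a locally weighted average of the observed outputs. The following is a minimal sketch of a Nadaraya-Watson estimator with a Gaussian kernel on synthetic data; it is an illustration only, not course material, and the function and parameter names are our own.

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_query, bandwidth=0.3):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    Estimates f at each query point as a weighted average of the
    training outputs, with weights decaying in the distance to the
    query point.
    """
    # Pairwise squared distances between query and training points
    d2 = (x_query[:, None] - x_train[None, :]) ** 2
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))  # Gaussian kernel weights
    return (w @ y_train) / w.sum(axis=1)      # locally weighted average

# Noisy samples of an unknown function (here: sine)
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 100))
y = np.sin(x) + 0.1 * rng.normal(size=100)

# Evaluate the estimator away from the boundary of the data
x_new = np.linspace(0.5, 5.5, 10)
y_hat = nadaraya_watson(x, y, x_new)
print(np.max(np.abs(y_hat - np.sin(x_new))))  # small approximation error
```

The bandwidth controls the bias-variance trade-off: a small bandwidth follows the noise, a large one oversmooths. How to choose it in a principled way is one of the questions treated in the lecture.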
[BBL] O. Bousquet, S. Boucheron, and G. Lugosi. Introduction to statistical learning theory. In Advanced lectures on machine learning, pp. 169–207. Springer, 2004.
[DGL] L. Devroye, L. Györfi, and G. Lugosi. A probabilistic theory of pattern recognition. In Applications of Mathematics: Stochastic Modelling and Applied Probability, volume 31. Springer, 2013.
[Roj] R. Rojas. Neural networks: a systematic introduction. Springer, 2013.
[SC] I. Steinwart and A. Christmann. Support Vector Machines. Springer, 2008.
[SS] A. J. Smola and B. Schölkopf. Learning with kernels. MIT Press, 2002.
[Vap] V. N. Vapnik. Statistical learning theory, volume 1. Wiley New York, 1998.
The lectures are based on material by Ralf Banisch from a previous instance of this course.
Course No | Course Type | Hours
---|---|---
19234501 | Lecture (Vorlesung) | 2
19234502 | Exercise (Übung) | 2
Time Span | 15.04.2020 - 02.09.2020
---|---
Instructors | Péter Koltai
0089c_MA120 | 2014, MSc Informatik (Mono), 120 LP
0280b_MA120 | 2011, MSc Mathematik (Mono), 120 LP
0280c_MA120 | 2018, MSc Mathematik (Mono), 120 LP
0496a_MA120 | 2016, MSc Computational Science (Mono), 120 LP
Day | Time | Location | Details
---|---|---|---
Wednesday | 10-12 | A3/SR 119 Seminarraum | 2020-04-15 - 2020-07-15
Day | Time | Location | Details
---|---|---|---
Tuesday | 14-16 | A3/019 Seminarraum | Exercise 02 (Übung 02)
Wednesday | 16-18 | A6/SR 007/008 Seminarraum | Exercise 01 (Übung 01)
Sunday | ? - ? | | Pseudo-tutorial for capacity planning - prospective exercise participants, please register here!