Organization

Lecture:  Wed 10am-12pm by Péter Koltai (peter.koltai@fu-berlin.de)
Exercise: Wed 4pm-6pm by Mattes Mollenhauer (mattes.mollenhauer@fu-berlin.de)

Lecture

The lectures will be held online in Webex Meetings. The link to the upcoming event will be communicated through the Announcements.
Lecture notes and additional material will be provided in Resources.

To facilitate a smooth workflow, please consider the following:

  • Get comfortable with Webex in advance: learn how to join a meeting, manage audio and video settings, and use its other features.
  • Get a good headset.
  • Before class: make sure you have everything prepared and set up the best conditions the circumstances allow (e.g., no unnecessary noise, good Wi-Fi reception).
  • During class: mute your microphone unless you are speaking.

Exercise class

Tutorial sessions will be held online via Webex, just like the main lecture. There will be exercise sheets; you are expected to prepare presentations of your solutions to the exercises, but you will not be required to submit written solutions in any form.

Possible ways to participate in the tutorial sessions include sharing your screen and presenting a particular exercise using

  • short and concise LaTeX notes;
  • a readable(!) scan/photo of a handwritten document;
  • deriving a solution live in a note-taking app (Microsoft OneNote, Samsung Notes, etc.) if you own a tablet (preferred);
  • additional ideas are always welcome.

Before presenting in a tutorial session, please test screen sharing with the Webex application on your platform in a test session.

We will discuss more details in the first tutorial session.

Examination

Time: Fri, July 24, 2020, 8am.
Location: Room L115, seminar center, address:
Silberlaube (ground floor)
Otto-von-Simson-Str. 26
14195 Berlin-Dahlem

  • No auxiliary material is allowed
  • Please bring a pen (blue or black color, no pencil)
  • Please bring your student ID and some official ID (such as ID card, passport, or driver's license)

Repeat examination

Time: Wed, September 2, 2020, 8am.
Location: Room SR031, Arnimallee 6 (Pi-building)

  • No auxiliary material is allowed
  • Please bring a pen (blue or black color, no pencil)
  • Please bring your student ID and some official ID (such as ID card, passport, or driver's license)

Content

In this lecture we will discuss the mathematical foundations of 'machine learning'. There are roughly two classes of methods: 'supervised learning' usually refers to methods that estimate a function y = f(x), which links input and output data, from given input data x1, ..., xn and associated output data y1, ..., yn. In 'unsupervised learning' no output data is available; instead, one tries to find structure in the input data. This structure can be geometric in nature (does the input data lie on a manifold?) or topological in nature (which input data are 'similar'? Are there interesting subgroups? How are they connected?).
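
To make the distinction concrete, here is a minimal Python/numpy sketch (illustrative only; the data and all parameter choices are assumptions for this example, not course material): an ordinary least-squares fit as a toy supervised method, followed by a few hand-rolled k-means iterations as a toy unsupervised method.

    import numpy as np

    rng = np.random.default_rng(0)

    # Supervised: estimate a function y = f(x) from input/output pairs.
    x = rng.uniform(-1, 1, 30)                    # inputs x_1, ..., x_n
    y = 2.0 * x + 0.1 * rng.standard_normal(30)   # outputs y_1, ..., y_n
    A = np.vstack([x, np.ones_like(x)]).T         # design matrix for a line
    slope, intercept = np.linalg.lstsq(A, y, rcond=None)[0]

    # Unsupervised: no outputs; look for structure (here: two clusters).
    X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
    centers = X[rng.choice(len(X), 2, replace=False)]
    for _ in range(10):                           # plain k-means iterations
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == k].mean(axis=0) for k in range(2)])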

In the lecture we will work out the mathematical basics of various machine learning methods. Our focus will be on understanding why (and in what sense) these methods work. We will deepen our understanding by means of many numerical examples. Focal points include: kernel regression, support vector machines, manifold learning, spectral clustering, and high-dimensional probability.
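
As a foretaste of the first focal point, here is a minimal numpy sketch of kernel ridge regression with a Gaussian kernel; the kernel choice, bandwidth sigma, and regularization parameter lam are illustrative assumptions, not prescriptions from the course.

    import numpy as np

    def gaussian_kernel(x, y, sigma=0.5):
        # k(x, y) = exp(-|x - y|^2 / (2 sigma^2)) for 1-D inputs
        return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * sigma ** 2))

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 2 * np.pi, 50)              # inputs x_1, ..., x_n
    y = np.sin(x) + 0.1 * rng.standard_normal(50)  # noisy outputs y_1, ..., y_n

    lam = 1e-2                                     # regularization strength
    K = gaussian_kernel(x, x)                      # kernel matrix K_ij = k(x_i, x_j)
    alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)  # (K + lam*I) alpha = y

    x_new = np.linspace(0, 2 * np.pi, 200)
    f_hat = gaussian_kernel(x_new, x) @ alpha      # estimated f on new inputs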


Prerequisites 

  • Calculus (Analysis I-II)
  • Linear algebra (Lineare Algebra I-II)
  • Basic probability theory (see Handout 1)
  • Basic programming (e.g., Matlab or Python)


Literature

  • [BBL] O. Bousquet, S. Boucheron, and G. Lugosi. Introduction to statistical learning theory. In Advanced Lectures on Machine Learning, pp. 169–207. Springer, 2004.

  • [CZ] F. Cucker and D.-X. Zhou. Learning Theory: An Approximation Theory Viewpoint. Cambridge University Press, 2007.

  • [DGL] L. Devroye, L. Györfi, and G. Lugosi. A Probabilistic Theory of Pattern Recognition, volume 31 of Applications of Mathematics: Stochastic Modelling and Applied Probability. Springer, 2013.

  • [Roj] R. Rojas. Neural Networks: A Systematic Introduction. Springer, 2013.

  • [SC] I. Steinwart and A. Christmann. Support Vector Machines. Springer, 2008.

  • [SS] B. Schölkopf and A. J. Smola. Learning with Kernels. MIT Press, 2002.

  • [Vap] V. N. Vapnik. Statistical Learning Theory, volume 1. Wiley, New York, 1998.

Additional Information

Acknowledgments

The lectures are based on material by Ralf Banisch from a previous instance of this course.