Mathematische Aspekte in machine learning S20

Description

Organization

Lecture:  Wed 10am-12pm by Péter Koltai (peter.koltai@fu-berlin.de)
Exercise: Wed 4pm - 6pm by Mattes Mollenhauer (mattes.mollenhauer@fu-berlin.de)

Lecture

The lectures will be held online in Webex Meetings. The link to the upcoming event will be communicated through the Announcements.
Lecture notes and additional material will be provided in Resources.

To facilitate a smooth workflow, please consider the following:

  • Get comfortable with Webex in advance: learn how to join a meeting, manage audio and video settings, and use the other features.
  • Get a good headset.
  • Before class: have everything prepared and set up the best conditions the circumstances allow (e.g., no unnecessary noise, good Wi-Fi reception).
  • During class: mute your microphone unless you are speaking.

Exercise class

Tutorial sessions will be held online via Webex, just like the main lecture. There will be exercise sheets. You are expected to prepare presentations of your solutions to the exercises; you will not be required to submit written solutions in any form.

Possible ways to participate in the tutorial sessions are sharing your screen and presenting a particular exercise via

  • short and concise LaTeX notes;
  • a readable(!) scan/photo of a handwritten document;
  • deriving a solution live in a note-taking app (Microsoft OneNote, Samsung Notes, etc.) if you own a tablet (preferred);
  • additional ideas are always welcome.

Before presenting in a tutorial session, please test screen sharing with the Webex application on your platform in a test session.

We will discuss more details in the first tutorial session.

Examination

Time: Fri, July 24, 2020, 8am.
Location: Room L115, seminar center, address:
Silberlaube (ground floor)
Otto-von-Simson-Str. 26
14195 Berlin-Dahlem

  • No auxiliary material is allowed
  • Please bring a pen (blue or black color, no pencil)
  • Please bring your student ID and some official ID (such as ID card, passport, or driver's license)

Repeat examination

Time: Wed, September 2, 2020, 8am.
Location: Room SR031, Arnimallee 6 (Pi-building)

  • No auxiliary material is allowed
  • Please bring a pen (blue or black color, no pencil)
  • Please bring your student ID and some official ID (such as ID card, passport, or driver's license)

Content

In this lecture we will discuss the mathematical foundations of 'machine learning'. There are roughly two classes of methods: 'supervised learning' usually refers to methods that estimate a function y = f(x), linking input and output, from given input data x1, ..., xn and associated output data y1, ..., yn. In 'unsupervised learning' the output data is not available; instead, one tries to find structure in the input data. This structure can be geometric in nature (does the input data lie on a manifold?) or topological in nature (which input data are 'similar'? Are there interesting subgroups? How are they connected?).
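As a rough illustration of these two settings, here is a minimal sketch (not part of the course material; the toy data, the affine model and the number of clusters are assumptions made purely for this example):

    import numpy as np

    rng = np.random.default_rng(1)

    # Supervised: estimate y = f(x) from input/output pairs (x_i, y_i),
    # here with the (assumed) model f(x) = w*x + b fitted by least squares.
    x = rng.uniform(0.0, 1.0, size=50)
    y = 2.0 * x + 1.0 + 0.05 * rng.standard_normal(50)
    w, b = np.linalg.lstsq(np.c_[x, np.ones_like(x)], y, rcond=None)[0]
    print("supervised fit:", w, b)   # roughly the true slope 2 and offset 1

    # Unsupervised: no outputs; look for grouping structure in the inputs alone,
    # here with a few k-means iterations on two well-separated point clouds.
    pts = np.vstack([rng.normal(0.0, 0.3, (30, 2)), rng.normal(3.0, 0.3, (30, 2))])
    centers = pts[[0, -1]].copy()    # start with one point from each cloud
    for _ in range(10):
        labels = np.argmin(np.linalg.norm(pts[:, None] - centers[None], axis=2), axis=1)
        centers = np.array([pts[labels == k].mean(axis=0) for k in range(2)])
    print("unsupervised cluster centers:\n", centers)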

In the lecture we will work out the mathematical foundations of various machine learning methods. Our focus will be on understanding why (and in what sense) these methods work. We will deepen our understanding by means of many numerical examples. Focal points include: kernel regression, support vector machines, manifold learning, spectral clustering, and high-dimensional probability.
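As a small preview of the first focal point, the following NumPy sketch performs Gaussian-kernel ridge regression on toy data; the kernel, bandwidth and regularisation parameter are assumed choices for illustration, and the code is not taken from the lecture material:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy supervised data: noisy samples of an unknown function.
    x_train = rng.uniform(-3.0, 3.0, size=40)
    y_train = np.sin(x_train) + 0.1 * rng.standard_normal(40)

    def gaussian_kernel(a, b, bandwidth=0.5):
        """k(a, b) = exp(-(a - b)^2 / (2 * bandwidth^2)) for 1D inputs."""
        diff = a[:, None] - b[None, :]
        return np.exp(-diff**2 / (2.0 * bandwidth**2))

    # Kernel ridge regression: solve (K + n*lam*I) alpha = y and predict
    # f(x) = sum_i alpha_i k(x, x_i).
    lam = 1e-3
    K = gaussian_kernel(x_train, x_train)
    alpha = np.linalg.solve(K + len(x_train) * lam * np.eye(len(x_train)), y_train)

    x_test = np.linspace(-3.0, 3.0, 7)
    y_pred = gaussian_kernel(x_test, x_train) @ alpha
    print(np.c_[x_test, y_pred, np.sin(x_test)])   # prediction vs. ground truth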

 

Prerequisites 

  • Calculus (Analysis I-II)
  • Linear algebra (Lineare Algebra I-II)
  • Basic probability theory (see Handout 1)
  • Basic programming (e.g., Matlab or Python)

 

Literature

  • [BBL] O. Bousquet, S. Boucheron, and G. Lugosi. Introduction to Statistical Learning Theory. In Advanced Lectures on Machine Learning, pp. 169–207. Springer, 2004.
  • [CZ] F. Cucker and D.-X. Zhou. Learning Theory: An Approximation Theory Viewpoint. Cambridge University Press, 2007.
  • [DGL] L. Devroye, L. Györfi, and G. Lugosi. A Probabilistic Theory of Pattern Recognition. Applications of Mathematics: Stochastic Modelling and Applied Probability, volume 31. Springer, 2013.
  • [Roj] R. Rojas. Neural Networks: A Systematic Introduction. Springer, 2013.
  • [SC] I. Steinwart and A. Christmann. Support Vector Machines. Springer, 2008.
  • [SS] A. J. Smola and B. Schölkopf. Learning with Kernels. MIT Press, 2002.
  • [Vap] V. N. Vapnik. Statistical Learning Theory, volume 1. Wiley, New York, 1998.

 


Acknowledgments

The lectures are based on material by Ralf Banisch from a previous instance of this course.

 

Basic Course Info

Course No   Course Type           Hours
19234501    Lecture (Vorlesung)   2
19234502    Exercise (Übung)      2

Time Span 15.04.2020 - 02.09.2020
Instructors
Péter Koltai

Study Regulation

0089c_MA120 2014, MSc Computer Science (Mono), 120 credit points
0280b_MA120 2011, MSc Mathematics (Mono), 120 credit points
0280c_MA120 2018, MSc Mathematics (Mono), 120 credit points
0496a_MA120 2016, MSc Computational Science (Mono), 120 credit points


Main Events

Day         Time    Location                      Details
Wednesday   10-12   A3/SR 119 (seminar room)      2020-04-15 - 2020-07-15

Accompanying Events

Day         Time    Location                      Details
Tuesday     14-16   A3/019 (seminar room)         Exercise 02
Wednesday   16-18   A6/SR 007/008 (seminar room)  Exercise 01
Sunday      ? - ?                                 Pseudo-tutorial for capacity planning - prospective exercise participants, please register here!
