Machine Learning — Fundamentals & Advanced (5/10 ECTS)

Overview

This course provides a two-track path through modern ML.

  • Wednesdays 12-14 (Fundamentals, 5 ECTS): core concepts and classical methods—ideal if you need a concise, practice-oriented introduction.

  • Thursdays 14-16 (Advanced, +5 ECTS): deeper and newer topics that build directly on Wednesday’s lecture—recommended for data scientists and anyone taking 10 ECTS.

Format & Credits

  • Two lectures per week: Wednesday = foundational (x.1), Thursday = advanced (x.2).

  • 5 ECTS: attend the Wednesday (foundational) lecture and pass the foundational assignments.

  • 10 ECTS: attend both lecture days and complete the foundational and advanced assignments.

What you’ll learn

By the end of the course you will be able to:

  • Frame problems for classification, regression, and unsupervised learning.

  • Train, tune, and validate models responsibly (avoid data leakage; use nested cross-validation; see the sketch after this list).

  • Understand and implement trees, ensembles, linear & kernel methods.

  • Build and optimize neural networks (MLPs, CNNs, RNNs, Transformers).

  • Apply representation learning (AE, VAE, contrastive, self-supervised).

  • Reason about Bayesian and generative modeling (GANs, diffusion).
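
As a first taste of the validation toolkit, here is a minimal nested cross-validation sketch in scikit-learn; the dataset, model, and parameter grid are placeholder choices, not course material:

    # Nested CV: an inner loop tunes hyperparameters, an outer loop
    # estimates the performance of the whole tuning procedure.
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import GridSearchCV, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)

    # Keeping the scaler inside the pipeline avoids data leakage:
    # outer test folds are never seen during preprocessing or tuning.
    pipe = make_pipeline(StandardScaler(), SVC())
    inner = GridSearchCV(pipe, param_grid={"svc__C": [0.1, 1, 10]}, cv=3)

    scores = cross_val_score(inner, X, y, cv=5)  # outer loop
    print(f"nested-CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")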

Topics

Lecture ID    Topic    Date
1.1 Introduction (Overview, KNN classifier) Oct 15
1.2 KNN for regression, DNNR Oct 16
2.1 Clustering (k-Means, DBSCAN) Oct 22
2.2 Hierarchical & Soft-Clustering (EM, GMM); Deep Embedded Clustering (DEC); Contrastive Clustering; Spectral Clustering Oct 23
3.1 Linear Models Oct 29
3.2 SVMs; Multinomial Logistic Regression (Softmax); Generalized Linear Models Oct 30
4.1 Principal Component Analysis (Dimensionality Reduction, Covariance Matrix, Gaussian Models) Nov 05
4.2 ICA; Nonlinear Dimensionality Reduction (t-SNE, UMAP) Nov 06
5.1 Model Validation Nov 12
5.2 Hyperparameter optimization; ablation studies; validation metrics; data leakage; nested cross-validation Nov 13
6.1 Decision Trees, Bagging, Random Forest Nov 19
6.2 Ensembling: Extremely Randomized Trees; Rotation/Oblique Decision Trees; NODE (Neural Oblivious Decision Ensembles); DeepGBM (brief) Nov 20
7.1 Boosting (AdaBoost + Viola–Jones); Gradient-Boosted Trees (GBTs) Nov 26
7.2 XGBoost; CatBoost; LightGBM Nov 27
8.1 Multi-Layer Perceptron (classic + modern) Dec 03
8.2 Boltzmann Machines; Deep Boltzmann Machine (DBM); probabilistic modeling; Boltzmann generators; Hopfield networks Dec 04
9.1 Network Optimization (Gradient Descent + Backpropagation) Dec 10
9.2 Optimizers: Nesterov Accelerated Gradient (NAG); adaptive methods (AdaGrad, RMSProp, Adam); gradient-free alternatives (evolutionary, Bayesian optimization) Dec 11
10.1 Convolutional Neural Networks (conv layer types; batch norm; dropout) Dec 17
10.2 Vision models (classification, detection, segmentation, pose); CNN architectures (VGG, ResNet, DenseNet, spatial transformer networks) Dec 18
11.1 Autoencoders & Variational Autoencoders; disentangled representation learning (β-VAE, FactorVAE) Jan 07
11.2 Bayesian inference; Variational inference; Bayesian neural networks; MCMC Jan 08
12.1 Generative Adversarial Networks (GANs) Jan 14
12.2 Diffusion; Flow matching Jan 15
13.1 Recurrent Neural Networks (RNNs), LSTMs, GRU Jan 21
13.2 State Space Models (Mamba, Hyena) Jan 22
14.1 Attention & Transformers Jan 28
14.2 Large Language Models Jan 29
15.1 Contrastive Learning, SimCLR Feb 04
15.2 BYOL, I-JEPA, VICReg Feb 05
16.1 Recap + Q&A Feb 11
16.2 Recap + Q&A Feb 12
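
To get a feel for where the course starts, here is a minimal k-nearest-neighbours classifier in the spirit of Lecture 1.1; the iris dataset and k = 5 are placeholder choices:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.25, random_state=0
    )

    # Predict by majority vote among the 5 nearest training points.
    knn = KNeighborsClassifier(n_neighbors=5)
    knn.fit(X_train, y_train)
    print(f"test accuracy: {knn.score(X_test, y_test):.3f}")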

Assessment 

  • 5 ECTS: weekly assignments and an exam, covering foundational topics only.

  • 10 ECTS: weekly assignments and an exam, covering foundational and advanced topics.

Tutorials

Prerequisites & tools

  • Comfort with linear algebra, calculus, probability, and Python (NumPy/PyTorch/Scikit-learn).

  • We provide notebooks and data; coding is required for both tracks.
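
Before the first session, a quick way to sanity-check your setup (assuming a standard install, e.g. pip install numpy scikit-learn torch):

    # Prints the versions of the three libraries used throughout the course.
    import numpy
    import sklearn
    import torch

    print("NumPy:", numpy.__version__)
    print("scikit-learn:", sklearn.__version__)
    print("PyTorch:", torch.__version__)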

Discord server for communication: https://discord.gg/TGFVFAH3c6