Requirements:

Basics in linear algebra:


- Fluency with matrix-vector operations and their properties


- Matrix decompositions and their properties (eigenvalue decomposition, singular value decomposition)


- see: Linear Algebra 1+2, Numerics 1
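
As a quick self-check of this prerequisite, you should be comfortable reading a snippet like the following (a minimal sketch using NumPy, which this course relies on):

```python
import numpy as np

# A small symmetric matrix, so the eigendecomposition is real.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# Eigenvalue decomposition of a symmetric matrix: A = V diag(w) V^T.
w, V = np.linalg.eigh(A)
assert np.allclose(V @ np.diag(w) @ V.T, A)

# Singular value decomposition: A = U diag(s) Vt.
U, s, Vt = np.linalg.svd(A)
assert np.allclose(U @ np.diag(s) @ Vt, A)

# For a symmetric matrix, the singular values are the absolute eigenvalues.
assert np.allclose(np.sort(s), np.sort(np.abs(w)))
```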

Basics in calculus:


- Multivariate calculus: integration and differentiation, partial derivatives


- Basics of optimization: properties of minima, maxima, and saddle points


- see: Calculus 1+2
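
For instance, you should know how the eigenvalues of the Hessian classify a critical point. A minimal sketch (the function f(x, y) = x² − y² is just an illustrative choice):

```python
import numpy as np

# f(x, y) = x**2 - y**2 has a critical point at the origin (zero gradient).
# Its Hessian there is diag(2, -2): one positive and one negative
# eigenvalue, so the critical point is a saddle, not a minimum or maximum.
H = np.array([[2.0, 0.0],
              [0.0, -2.0]])

eigvals = np.linalg.eigvalsh(H)
is_min = bool(np.all(eigvals > 0))     # positive definite -> local minimum
is_max = bool(np.all(eigvals < 0))     # negative definite -> local maximum
is_saddle = not (is_min or is_max)     # indefinite -> saddle point
print(is_saddle)  # True
```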

Basics in statistics:


- Random variables, PDF, CDF, moments and their properties.


- Transformations between random variables, Jacobians.

- see: Stochastics 1 or Statistical Physics 1
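
A short self-check sketch for this prerequisite, assuming SciPy's `scipy.stats` module: evaluate the PDF, CDF, and moments of a standard normal, and verify a transformed variable empirically (Y = exp(X) is log-normal, whose density picks up a Jacobian factor 1/y).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Standard normal random variable: PDF, CDF, and first two moments.
X = stats.norm(loc=0.0, scale=1.0)
print(X.pdf(0.0))          # ~0.3989, i.e. 1/sqrt(2*pi)
print(X.cdf(0.0))          # 0.5
print(X.mean(), X.var())   # 0.0 1.0

# Transformation Y = exp(X): change of variables introduces the Jacobian
# factor 1/y, yielding the log-normal density. Sanity-check by sampling:
# the median of Y is exp(median of X) = exp(0) = 1.
samples = np.exp(X.rvs(size=100_000, random_state=rng))
print(abs(np.median(samples) - 1.0) < 0.05)  # True
```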

Basics in functional transforms:


- Fourier transform / DFT / FFT.
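
As a self-check, you should be able to predict the output of a small DFT computation like this one (a sketch using NumPy's FFT routines; sampling rate and signal frequency are arbitrary choices):

```python
import numpy as np

# Sample a pure 5 Hz cosine at 64 Hz for one second.
fs = 64
t = np.arange(fs) / fs
x = np.cos(2 * np.pi * 5 * t)

# The DFT (computed via the FFT) of a real signal: a single peak at 5 Hz.
X = np.fft.rfft(x)
freqs = np.fft.rfftfreq(len(x), d=1 / fs)
peak = freqs[np.argmax(np.abs(X))]
print(peak)  # 5.0
```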

Programming:


Python, NumPy, SciPy, Jupyter notebooks, Git, GitHub.


- Check the worksheets of this course to see if you are ready:


https://github.com/cwehmeyer/scipro
 

Additional Information

 

This lecture/lab course is suitable for master's students of Mathematics, Computer Science, or Computational Sciences.

Students of the Computational Sciences program can combine this lecture/lab course with 19234502 + 19234501 (Mathematical aspects in machine learning) to complete “Complex Algorithms A/B”.

Physics modules matching this course are: BSc Complex Algorithms B and MSc Aufbaumodul Numerik IV.

Qualification objectives: Students have a basic understanding of the algebraic and computational methods behind deep neural networks, know their scope of application, and can build and train such networks in practice with state-of-the-art software tools. They are familiar with typical deep learning architectures and understand their relationship to shallow counterparts.

Content:

- Perceptron

- Multilayer neural network and universal representation theorem

- Backpropagation

- Deep feedforward networks

- Convolutional Neural Networks

- Autoencoder versus principal component analysis

- Time-autoencoder versus time-lagged independent component analysis

- Generative networks: Variational Autoencoders and Generative Adversarial Networks

- Active learning
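
To illustrate the level at which the first topic is treated: a minimal Rosenblatt perceptron, trained with the classic update rule on linearly separable data (here the logical AND function; learning rate and epoch count are illustrative choices, not values from the course):

```python
import numpy as np

# Training data: the logical AND function, which is linearly separable.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])

w = np.zeros(2)  # weights
b = 0.0          # bias
lr = 0.1         # learning rate

for _ in range(20):  # a few epochs suffice for this separable problem
    for xi, yi in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        # Perceptron rule: update only on misclassification,
        # moving the decision boundary toward the missed point.
        w += lr * (yi - pred) * xi
        b += lr * (yi - pred)

preds = [1 if xi @ w + b > 0 else 0 for xi in X]
print(preds)  # [0, 0, 0, 1]
```

By the perceptron convergence theorem, this loop is guaranteed to reach zero training error on any linearly separable dataset.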