Heidelberg University

Modern Bayesian Inference for Machine Learning

Melih Kandemir, Bosch Center for AI

Abstract:

Given a sufficiently large set of labeled samples, deep learning provides a well-established pipeline for accurately learning complex functional dependencies. The next grand challenge is to perform equally well with far fewer and noisier observations, in other words, to render learning models more intelligent. This course aims to build the technical background for this challenge. It adopts probability theory, specifically its Bayesian interpretation, as a principled way of modeling uncertainty. The course provides an overview of the most recent approaches to Bayesian inference in the machine learning community, with particular focus on their application to deep neural networks. The course syllabus consists of: i) Bayesian framework basics, ii) Variational Inference, iii) Markov Chain Monte Carlo inference with Hamiltonian and Langevin dynamics, iv) Deep Generative Networks: Generative Adversarial Nets and Variational Auto-Encoders, v) Stein Variational Gradient Descent. Probability theory and linear algebra are prerequisites for the course; basic machine learning knowledge is a plus.
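As a taste of item (i), the sketch below shows Bayesian updating in its simplest conjugate form, a Beta-Bernoulli model: a prior belief over a coin's bias is combined with a handful of noisy observations to yield a closed-form posterior. This is a minimal illustrative sketch, not material from the course slides; all names and values are hypothetical.

    # Minimal sketch of the Bayesian framework: Beta-Bernoulli conjugacy.
    # (Illustrative only; values and names are hypothetical.)
    import numpy as np

    rng = np.random.default_rng(0)

    # Prior belief over the coin bias theta: Beta(alpha, beta)
    alpha, beta = 2.0, 2.0

    # A small, noisy set of observations (1 = heads), true bias 0.7
    data = rng.binomial(n=1, p=0.7, size=10)

    # Conjugacy: Beta prior + Bernoulli likelihood gives a Beta posterior
    alpha_post = alpha + data.sum()
    beta_post = beta + len(data) - data.sum()

    print(f"Posterior: Beta({alpha_post:.0f}, {beta_post:.0f})")
    print(f"Posterior mean of theta: {alpha_post / (alpha_post + beta_post):.3f}")

Most models of interest in the course lack such closed-form posteriors, which is what motivates the approximate inference methods in items (ii)-(v).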

You can find the slides here:

Part 1: Download

Part 2: Download

Part 3: Download

Part 4: Download