Probabilistic machine learning has gained considerable practical relevance over the past 15 years: it is highly data-efficient, allows practitioners to easily incorporate domain expertise, and, thanks to recent advances in approximate inference, scales well. Moreover, it is closely related to causal inference, one of the key methods for measuring cause-effect relationships in machine learning models and in explainable artificial intelligence. This course introduces recent developments in probabilistic modeling and inference, covering theoretical as well as practical and computational aspects of probabilistic machine learning. In the course, we will implement the inference techniques and apply them to real-world problems.
| Session | Duration |
|---|---|
| Probability | 01:18:09 |
| Julia | 01:31:23 |
| Inference & Decision Making | 01:17:35 |
| Tutorial 2 - Recap Theory Unit 1 & 2 | 01:08:55 |
| Graphical Models: Independence | 01:13:09 |
| Tutorial | 00:56:37 |
| Graphical Models: Inference | 01:21:29 |
| Tutorial 4 - Recap Theory Unit 3 & 4 | 01:15:19 |
| Bayesian Ranking | 01:32:20 |
| Practical Tutorial | 00:44:41 |
| Linear Basis Function Models | 01:21:19 |
| Tutorial 6 - Recap Theory Unit 5 | 01:21:47 |
| Tutorial 7 - Recap Theory Unit 6 | 01:26:37 |
| Bayesian Regression | 01:24:34 |
| Practical Tutorial | 00:40:42 |
| Tutorial 9 - Recap Theory Unit 8 | 01:16:26 |
| Non-Bayesian Classification | 01:28:49 |
| Tutorial 10 - Recap Theory Unit 9 | 01:05:29 |
| Gaussian Processes | 01:25:21 |
| Practical Tutorial | 00:46:02 |
| Information Theory | 00:59:51 (audio starts at 00:36:24) |
| Real-World Applications | 01:08:00 |
| Exam Preparation | 01:00:43 |