English | MP4 | AVC 1280×720 | AAC 44 kHz 2ch | 30 lectures (4h 47m) | 1.19 GB
Fundamentals of Bayesian Machine Learning Parametric Models
Welcome to Bayesian Linear Regression!
I first started this course series on Bayesian Machine Learning many years ago, with a course on A/B Testing. I had always intended to expand the series (there’s a lot to cover!) but kept getting pulled in other directions.
Today, I am happy to announce that the Bayesian Machine Learning series is finally back on track!
In the first course, a lot of students asked, "but where is the 'machine learning'?", since they thought of machine learning in terms of the typical supervised/unsupervised parametric model paradigm. The A/B Testing course was never meant to cover such models, but that is exactly what this course is for.
If you’ve studied machine learning before, then you know that linear regression is the first model everyone learns about. We will approach Bayesian Machine Learning the same way.
Bayesian Linear Regression has many nice properties (easy transition from non-Bayesian Linear Regression, closed-form solutions, etc.). It is the best and most efficient "first step" into the world of Bayesian Machine Learning.
Also, let’s not forget that Linear Regression (including the Bayesian variety) is simply very practical in the real world. Bayesian Machine Learning can get very mathematical, so it’s easy to lose sight of the big picture: the real-world applications. By exposing yourself to Bayesian ideas slowly, you won’t be overwhelmed by the math, and you’ll always keep the application in mind.
It should be stated, however: Bayesian Machine Learning really is very mathematical. If you’re looking for a plug-and-play, scikit-learn-like experience, Bayesian Machine Learning is not for you. Most of the "work" involves algebraic manipulation. At the same time, if you can tough it out to the end, you will find the results really satisfying, and you will be awed by their elegance.
Sidenote: If you made it through my Linear Regression and A/B Testing courses, then you’ll do just fine.
Suggested Prerequisites:
- Python coding: if/else, loops, lists, dicts, sets
- Numpy and Pandas coding: matrix and vector operations, loading a CSV file
- Basic math: calculus, linear algebra, probability
- Linear regression
- Bayesian Machine Learning: A/B Testing in Python (know about conjugate priors)
What you’ll learn
- Understand Bayesian Linear Regression: Learn how Bayesian inference applies to linear regression using priors and posteriors.
- Derive and Implement the Model: Work through the math and code Bayesian Linear Regression from scratch in Python.
- Compare Bayesian vs. Frequentist Methods: Explore key differences and benefits of Bayesian over traditional linear regression.
- Apply Bayesian Regression to Data: Use probabilistic modeling to analyze real-world datasets and quantify uncertainty.
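To give a flavor of what "from scratch in Python" looks like, here is a minimal sketch of the conjugate posterior update for Bayesian Linear Regression. It assumes a Gaussian likelihood with known noise precision `beta` and a zero-mean isotropic Gaussian prior with precision `alpha` on the weights; the function name and parameter values are illustrative, not taken from the course.

```python
import numpy as np

def posterior(X, y, alpha=1.0, beta=25.0):
    """Posterior N(w | m_N, S_N) for Bayesian linear regression,
    assuming prior w ~ N(0, (1/alpha) I) and known noise precision beta.
      S_N = (alpha * I + beta * X^T X)^{-1}
      m_N = beta * S_N @ X^T y
    """
    D = X.shape[1]
    S_N_inv = alpha * np.eye(D) + beta * X.T @ X
    S_N = np.linalg.inv(S_N_inv)
    m_N = beta * S_N @ X.T @ y
    return m_N, S_N

# Toy data: y ≈ 2x plus Gaussian noise (std 0.2, so beta = 1/0.2**2 = 25)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=50)
X = np.column_stack([np.ones_like(x), x])  # bias column + input
y = 2.0 * x + rng.normal(0, 0.2, size=50)

m_N, S_N = posterior(X, y)
# Predictive mean at a new input x* is m_N @ [1, x*];
# predictive variance is 1/beta + phi^T S_N phi, where phi = [1, x*].
```

Unlike the frequentist point estimate, the output is a full distribution over the weights: `m_N` plays the role of the (regularized) least-squares solution, while `S_N` quantifies the remaining uncertainty, and both feed directly into the predictive distribution.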
Table of Contents
Introduction
1 Introduction
2 Outline
3 Where to get the code
4 The Big Picture (Optional)
5 How to Succeed in this Course
6 What Are Dog Food Lectures
Review of Classical Linear Regression
7 Simple Linear Regression Review
8 Distribution of w Estimate
9 Linear Regression Review Dog Food
10 Relationship to Maximum Likelihood Estimation
11 MAP Estimation
12 MLE and MAP Dog Food
13 Suggestion Box
Bayesian Linear Regression With One Input
14 The Bayesian Approach
15 Review of Conjugate Priors
16 Training Posterior w
17 Making Predictions (pt 1)
18 Making Predictions (pt 2)
19 Making Predictions (pt 3)
20 Training Dog Food
21 Prediction Dog Food
Bayesian Linear Regression With Multiple Inputs
22 Multivariate Bayesian Linear Regression (Fitting)
23 Multivariate Bayesian Linear Regression (Predictions)
Bayesian Linear Regression in Code
24 Code Preparation
25 Code
Appendix & FAQ
26 How to Succeed in this Course (Long Version)
27 Machine Learning and AI Prerequisite Roadmap (pt 1)
28 Machine Learning and AI Prerequisite Roadmap (pt 2)
29 Where to Get the Code Troubleshooting
30 BONUS