Welcome to Bayesian Classification!

This course is the sequel to Bayesian Linear Regression, and it's a part of my series on Bayesian Machine Learning. While the previous course looked at regression (predicting a numerical output), this course looks at classification (predicting a categorical output).

This course takes the Bayes classifier (which, despite its name, is not Bayesian), and makes it Bayesian by placing priors on its parameters. In this course we will study the Bayesian Bayes classifier through the lens of Naive Bayes, so it would be a good idea to have a good handle on Naive Bayes before starting this course.
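As a quick refresher on the non-Bayesian starting point (this is not code from the course — the function names and toy data are illustrative), a Gaussian Naive Bayes classifier fits per-class means and variances as maximum-likelihood point estimates and classifies with Bayes' rule. These point estimates are exactly the parameters the course replaces with priors:

```python
import numpy as np

def fit(X, y):
    """Fit per-class Gaussian parameters by maximum likelihood."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        # (mean, variance, class prior); small epsilon avoids zero variance
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0) + 1e-9, len(Xc) / len(X))
    return params

def predict(X, params):
    """Pick the class maximizing log p(x | c) + log p(c)."""
    classes = list(params.keys())
    scores = []
    for c in classes:
        mu, var, prior = params[c]
        log_lik = -0.5 * np.sum(np.log(2 * np.pi * var) + (X - mu) ** 2 / var, axis=1)
        scores.append(log_lik + np.log(prior))
    best = np.argmax(np.stack(scores, axis=1), axis=1)
    return np.array([classes[i] for i in best])

# Two well-separated 2-D clusters as toy data
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
params = fit(X, y)
print(predict(np.array([[0.0, 0.0], [5.0, 5.0]]), params))
```

The "naive" part is the diagonal (per-feature) variance: features are treated as conditionally independent given the class.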

How does this course compare to Bayesian Linear Regression? Bayesian Linear Regression introduced a lot of the math needed for Bayesian Machine Learning, and it built upon the A/B Testing course (mainly the concept of conjugate priors and how to compute the posterior distribution). In this course, we will move more quickly through the math we've already seen, so that we can focus on the new and interesting parts. Unlike the Bayesian Linear Regression course, the real learning opportunity in this course is in implementing each algorithm you learn about.

Why Bayesian Machine Learning? The main advantage of using Bayesian Machine Learning is that it doesn't require you to find a single best guess for the optimal model parameters (a point estimate). Instead, Bayesian ML allows us to integrate over *all possible* values of the parameters (of which there are usually an infinite number).
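To make the point estimate vs. integration contrast concrete, here is a small sketch using the conjugate Beta-Bernoulli setup from the A/B Testing prerequisite. The Beta(1, 1) prior and the toy data are my own illustrative choices, not from the course:

```python
import numpy as np

# Observe three successes in three Bernoulli trials.
data = np.array([1, 1, 1])
n, k = len(data), int(data.sum())

# Point estimate (MLE): p(next = 1) = k / n -> 1.0, which is overconfident
# after only three observations.
mle = k / n

# Bayesian: integrate p(next = 1 | theta) * p(theta | data) over all theta.
# With a conjugate Beta(a, b) prior this integral has a closed form,
# the posterior predictive mean: (a + k) / (a + b + n).
a, b = 1.0, 1.0  # illustrative Beta(1, 1) uniform prior
posterior_predictive = (a + k) / (a + b + n)

print(mle)                   # 1.0
print(posterior_predictive)  # 0.8
```

Even though theta can take infinitely many values, conjugacy collapses the integral to a simple ratio — the same pattern (integrate out the parameters, exploit conjugate structure) that runs through the rest of the course.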

Suggested Prerequisites:

- Python coding: if/else, loops, lists, dicts, sets
- Numpy and Pandas coding: matrix and vector operations, loading a CSV file
- Basic math: calculus, linear algebra, probability
- Naive Bayes classifiers
- Bayesian Machine Learning: A/B Testing in Python (know about conjugate priors)
- Bayesian Linear Regression: know about the posterior predictive distribution