Do you ever wonder how we get the data that we use in our supervised machine learning algorithms?

We always seem to have a nice CSV or a table, complete with Xs and corresponding Ys.

If you haven’t been involved in acquiring data yourself, you might not have thought about this, but someone has to make this data!

Those “Y”s have to come from somewhere, and a lot of the time that involves manual labor.

Sometimes, you don’t have access to this kind of information or it is infeasible or costly to acquire.

But you still want to have some idea of the structure of the data.

This is where unsupervised machine learning comes into play.

In this course we are first going to talk about clustering. This is where instead of training on labels, we try to create our own labels! We’ll do this by grouping together data that looks alike.
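To make the idea of "grouping together data that looks alike" concrete, here is a minimal sketch of the k-means loop (an illustrative implementation, not the course's code): repeatedly assign each point to its nearest center, then move each center to the mean of its assigned points.

```python
import numpy as np

def kmeans(X, k, n_iter=20, seed=0):
    # Pick k random data points as the initial cluster centers.
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: label each point with its nearest center.
        # These labels are the "Y"s we create ourselves.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each center to the mean of its points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers
```

Run on two well-separated blobs, this assigns each blob its own consistent label; the course builds this algorithm up step by step, including a "soft" version.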

There are two methods of clustering we'll talk about:

- k-means clustering
- hierarchical clustering

Next, because in machine learning we like to talk about probability distributions, we'll go into Gaussian mixture models (GMMs).

One interesting fact is that under certain conditions, Gaussian mixture models and k-means clustering are exactly the same! We'll prove that this is the case.
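To preview that connection (a sketch under simplifying assumptions, not the course's full derivation): if every Gaussian in the mixture has equal weight and a shared covariance of sigma^2 * I, the E-step responsibilities reduce to a softmax over negative squared distances, and as sigma shrinks toward zero they become the hard nearest-center assignments of k-means. The function below is illustrative:

```python
import numpy as np

def responsibilities(X, centers, sigma):
    # E-step of a GMM with equal mixing weights and shared covariance
    # sigma^2 * I: r[n, k] is proportional to exp(-||x_n - mu_k||^2 / (2 sigma^2)).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    logits = -d2 / (2 * sigma ** 2)
    logits -= logits.max(axis=1, keepdims=True)  # subtract max for numerical stability
    r = np.exp(logits)
    return r / r.sum(axis=1, keepdims=True)
```

With a large sigma the responsibilities stay soft; as sigma shrinks, each row approaches a one-hot vector that picks the nearest center, which is exactly the hard k-means assignment.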

All the algorithms we'll talk about in this course are staples in machine learning and data science. If you want to automatically find patterns in your data with data mining and pattern extraction, without needing anyone to put in manual work to label that data, then this course is for you.

All the materials for this course are FREE. You can download and install Python, Numpy, and Scipy with simple commands on Windows, Linux, or Mac.

This course focuses on implementing these algorithms yourself, in code, rather than just calling a library function.

Suggested Prerequisites:

- calculus
- linear algebra
- probability
- Python coding: if/else, loops, lists, dicts, sets
- Numpy coding: matrix and vector operations, loading a CSV file

Tips for success:

- Use the video speed changer! Personally, I like to watch at 2x.
- Take handwritten notes. This will drastically increase your ability to retain the information.
- Write down the equations. If you don't, I guarantee it will just look like gibberish.
- Ask lots of questions on the discussion board. The more the better!
- Don't get discouraged if you can't solve every exercise right away. Sometimes it'll take hours, days, or maybe weeks!
- Write code yourself; this is an applied course! Don't be a "couch potato".

You probably already know this, but some of us really and truly appreciate you. BTW, I spent a reasonable amount of time making a learning roadmap based on your courses and have started the journey.

Looking forward to your new stuff.

I am signing up so that I have an easy refresher when needed and to see what you consider important, as well as to support your great work. Thank you.


I wish you a happy and safe holiday season. I am glad you chose to share your knowledge with the rest of us.

And, I couldn't agree more with some of your "rants", and found myself nodding vigorously!

You are an excellent teacher, and a rare breed.

And your courses are, frankly, more digestible and teach a student far more than some of the top-tier Ivy League courses I have taken in the past.

(I plan to go through many more courses, one by one!)

I know you must be deluged with complaints in spite of having the best content around. That's just human nature.

Also, satisfied people rarely take the time to write, so I thought I would write in for a change. :)

While completing my Master's at Hunan University, China, I am writing this feedback to express my deep gratitude for all the knowledge and skills I have gained studying your courses and following your recommendations.

The first course of yours I took was on Convolutional Neural Networks ("Deep Learning p.5", as far as I remember). Answering one of my questions on the Q&A board, you suggested I should start from the beginning, with the Linear and Logistic Regression courses. Although I assumed I already knew many of the basics at that time, I overcame my "pride" and decided to start my journey in Deep Learning from scratch. ...


- Introduction (05:03) (FREE preview available)
- Course Outline (04:34)
- What is unsupervised learning used for? (05:31)
- Why Use Clustering? (09:20)
- Where to get the code (04:36)
- How to Succeed in this Course (03:04)

- An Easy Introduction to K-Means Clustering (07:07)
- Hard K-Means: Exercise Prompt 1 (09:13)
- Hard K-Means: Exercise 1 Solution (11:09)
- Hard K-Means: Exercise Prompt 2 (05:04)
- Hard K-Means: Exercise 2 Solution (07:08)
- Hard K-Means: Exercise Prompt 3 (06:55)
- Hard K-Means: Exercise 3 Solution (16:22)
- Hard K-Means Objective: Theory (13:01)
- Hard K-Means Objective: Code (05:13)
- Visual Walkthrough of the K-Means Clustering Algorithm (Legacy) (02:59)
- Soft K-Means (05:42)
- The K-Means Objective Function (01:40)
- Soft K-Means in Python Code (10:04)
- How to Pace Yourself (03:19)
- Visualizing Each Step of K-Means (02:19)
- Examples of where K-Means can fail (07:33)
- Disadvantages of K-Means Clustering (02:14)
- How to Evaluate a Clustering (Purity, Davies-Bouldin Index) (06:34)
- Using K-Means on Real Data: MNIST (05:01)
- One Way to Choose K (05:16)
- K-Means Application: Finding Clusters of Related Words (08:39)
- Clustering for NLP and Computer Vision: Real-World Applications (06:58)
- Suggestion Box (03:10)

- Visual Walkthrough of Agglomerative Hierarchical Clustering (02:36)
- Agglomerative Clustering Options (03:39)
- Using Hierarchical Clustering in Python and Interpreting the Dendrogram (04:39)
- Application: Evolution (14:01)
- Application: Donald Trump vs. Hillary Clinton Tweets (18:35)

- Gaussian Mixture Model (GMM) Algorithm (15:31)
- Write a Gaussian Mixture Model in Python Code (18:54)
- Practical Issues with GMM / Singular Covariance (09:07)
- Comparison between GMM and K-Means (03:55)
- Kernel Density Estimation (06:24)
- GMM vs Bayes Classifier (pt 1) (09:28)
- GMM vs Bayes Classifier (pt 2) (11:30)
- Expectation-Maximization (pt 1) (11:45)
- Expectation-Maximization (pt 2) (02:24)
- Expectation-Maximization (pt 3) (08:09)

- Description of the Gaussian Mixture Model and How to Train a GMM (03:05)
- Comparison between GMM and K-Means (01:45)
- Write a Gaussian Mixture Model in Python Code (10:00)
- Practical Issues with GMM / Singular Covariance (02:56)
- Kernel Density Estimation (02:11)
- Expectation-Maximization (02:02)
- Future Unsupervised Learning Algorithms You Will Learn (01:02)

- Pre-Installation Check (04:13)
- Anaconda Environment Setup (20:21)
- How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:33)

- How to Code Yourself (part 1) (15:55)
- How to Code Yourself (part 2) (09:24)
- Proof that using Jupyter Notebook is the same as not using it (12:29)
- Python 2 vs Python 3 (04:38)

- How to Succeed in this Course (Long Version) (10:25)
- Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced? (22:05)
- What order should I take your courses in? (part 1) (11:19)
- What order should I take your courses in? (part 2) (16:07)

- What is the Appendix? (02:48)
- Where to get discount coupons and FREE deep learning material (05:49)