Unsupervised Deep Learning in Python

Autoencoders and Restricted Boltzmann Machines for Deep Neural Networks in Theano / Tensorflow, plus t-SNE and PCA


Course Data

Lectures: 82
Length: 09h 59m
Skill Level: All Levels
Languages: English
Includes: Lifetime access, 30-day money back guarantee

Course Description

This course is the next logical step in my deep learning, data science, and machine learning series. I've done a lot of courses about deep learning, and I just released a course about unsupervised learning, where I talked about clustering and density estimation. So what do you get when you put these two together? Unsupervised deep learning!

In this course we'll start with some very basic stuff - principal components analysis (PCA), and a popular nonlinear dimensionality reduction technique known as t-SNE (t-distributed stochastic neighbor embedding).
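
As a quick preview, here is roughly what PCA boils down to: center the data, eigendecompose the covariance matrix, and project onto the top eigenvectors. This Numpy sketch is only an illustration (the sizes and variable names are my own assumptions, not the course's code); t-SNE, being an iterative nonlinear method, has no comparably short closed form and is covered in its own section.

```python
import numpy as np

def pca(X, k):
    X = X - X.mean(axis=0)                   # center each feature
    cov = np.cov(X, rowvar=False)            # D x D covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigh: covariance is symmetric
    idx = np.argsort(eigvals)[::-1][:k]      # indices of the top-k eigenvalues
    return X @ eigvecs[:, idx]               # project onto the top-k components

X = np.random.randn(500, 10)                 # 500 samples, 10 features
Z = pca(X, 2)                                # 500 x 2 reduced representation
```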

Next, we'll look at a special type of unsupervised neural network called the autoencoder. After describing how an autoencoder works, I'll show you how you can link a bunch of them together to form a deep stack of autoencoders, which leads to better performance of a supervised deep neural network. Autoencoders are like a nonlinear form of PCA.
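
To make the idea concrete, here is a minimal sketch of a single autoencoder in Keras (Keras itself is covered in the Basics Review section below). The layer sizes and training settings are illustrative assumptions, not the course's code; the key point is that the network is trained to reproduce its own input, so no labels are needed.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

# encoder compresses 784 inputs down to 64 hidden units; decoder reconstructs
model = Sequential([
    Dense(64, activation='relu', input_shape=(784,)),   # encoder
    Dense(784, activation='sigmoid'),                   # decoder
])
model.compile(optimizer='adam', loss='binary_crossentropy')

X = np.random.rand(1000, 784)   # stand-in for MNIST pixels in [0, 1]
model.fit(X, X, epochs=5)       # note: the target is the input itself
```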

Last, we'll look at Restricted Boltzmann Machines (RBMs). These are another popular type of unsupervised neural network that you can use in the same way as autoencoders to pretrain your supervised deep neural network. I'll show you an interesting way of training restricted Boltzmann machines, known as Gibbs sampling, a special case of Markov Chain Monte Carlo, and I'll demonstrate how, even though this method is only a rough approximation, it still ends up reducing other cost functions, such as the one used for autoencoders. This method is also known as Contrastive Divergence or CD-k. As in physical systems, we define a concept called free energy and attempt to minimize this quantity.
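
To give a flavor of what that training loop looks like, here is a minimal Numpy sketch of one CD-1 update for a Bernoulli RBM, together with the free energy it implicitly drives down. Variable names, shapes, and the learning rate are illustrative assumptions, not the course's code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b, c, lr=0.1):
    """One CD-1 update for a Bernoulli RBM; modifies W, b, c in place.
    v0: N x D batch of visible vectors, W: D x M weights,
    b: D visible biases, c: M hidden biases."""
    # positive phase: sample the hidden units given the data
    h0_prob = sigmoid(v0 @ W + c)
    h0 = (np.random.rand(*h0_prob.shape) < h0_prob).astype(float)
    # negative phase: one Gibbs step back to the visibles, then the hiddens
    v1_prob = sigmoid(h0 @ W.T + b)
    h1_prob = sigmoid(v1_prob @ W + c)
    # move toward the data statistics and away from the model's statistics
    W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / len(v0)
    b += lr * (v0 - v1_prob).mean(axis=0)
    c += lr * (h0_prob - h1_prob).mean(axis=0)

def free_energy(v, W, b, c):
    # F(v) = -v.b - sum_j log(1 + exp(c_j + v.W_j)); lower means a better fit
    return -v @ b - np.logaddexp(0, v @ W + c).sum(axis=1)
```

Running cd1_step repeatedly over batches of binarized data tends to lower the free energy of the training examples relative to the model's own samples, which is the sense in which CD-k "roughly" minimizes the cost.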

Finally, we’ll bring all these concepts together and I’ll show you visually what happens when you use PCA and t-SNE on the features that the autoencoders and RBMs have learned, and we’ll see that even without labels the results suggest that a pattern has been found.

All the materials used in this course are FREE. Since this course is the 4th in the deep learning series, I will assume you already know calculus, linear algebra, and Python coding. You'll want to install Numpy, Theano and Tensorflow for this course. These are essential items in your data analytics toolbox.

If you are interested in deep learning and you want to learn about modern deep learning developments beyond just plain backpropagation, including using unsupervised neural networks to interpret what features can be automatically and hierarchically learned in a deep learning system, this course is for you.

This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.



HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:

  • calculus
  • linear algebra
  • probability
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations, loading a CSV file
  • linear regression, logistic regression
  • neural networks and backpropagation
  • Can write a feedforward neural network in Theano and TensorFlow


TIPS (for getting through the course):

  • Watch it at 2x.
  • Take handwritten notes. This will drastically increase your ability to retain the information.
  • Write down the equations. If you don't, I guarantee it will just look like gibberish.
  • Ask lots of questions on the discussion board. The more the better!
  • Realize that most exercises will take you days or weeks to complete.
  • Write code yourself, don't just sit there and look at my code.

Lectures

Introduction and Outline

  1. Introduction and Outline (01:55) (FREE preview available)
  2. Where does this course fit into your deep learning studies? (02:58)
  3. How to Succeed in this Course (03:13)
  4. Where to get the code and data (05:03)
  5. Tensorflow or Theano - Your Choice! (04:09)
  6. What are the practical applications of unsupervised deep learning? (05:35)

Principal Components Analysis

  1. What does PCA do? (04:32)
  2. How does PCA work? (11:21)
  3. Why does PCA work? (PCA derivation) (10:12)
  4. PCA only rotates (05:29)
  5. MNIST visualization, finding the optimal number of principal components (03:40)
  6. PCA implementation (03:29)
  7. PCA for NLP (03:37)
  8. PCA objective function (02:06)
  9. PCA Application: Naive Bayes (09:51)
  10. SVD (Singular Value Decomposition) (10:58)

t-SNE (t-distributed Stochastic Neighbor Embedding)

  1. t-SNE Theory (04:28)
  2. t-SNE Visualization (04:33)
  3. t-SNE on the Donut (05:51)
  4. t-SNE on XOR (04:37)
  5. t-SNE on MNIST (02:13)

Autoencoders

  1. Autoencoders (03:20)
  2. Denoising Autoencoders (01:56)
  3. Stacked Autoencoders (03:33)
  4. Writing the autoencoder class in code (Theano) (11:56)
  5. Testing our Autoencoder (Theano) (03:05)
  6. Writing the deep neural network class in code (Theano) (12:43)
  7. Autoencoder in Code (Tensorflow) (08:29)
  8. Testing greedy layer-wise autoencoder training vs. pure backpropagation (03:34)
  9. Cross Entropy vs. KL Divergence (04:40)
  10. Deep Autoencoder Visualization Description (01:33)
  11. Deep Autoencoder Visualization in Code (11:14)
  12. An Autoencoder in 1 Line of Code (04:50)

Restricted Boltzmann Machines

  1. Basic Outline for RBMs (04:52)
  2. Intro to RBMs (08:22)
  3. Motivation Behind RBMs (06:51)
  4. Intractability (03:11)
  5. Neural Network Equations (07:44)
  6. Training an RBM (part 1) (11:35)
  7. Training an RBM (part 2) (06:19)
  8. Training an RBM (part 3) - Free Energy (07:21)
  9. RBM Greedy Layer-Wise Pretraining (04:50)
  10. RBM in Code (Theano) and Greedy Layer-wise Pre-training on MNIST (14:24)
  11. RBM in Code (Tensorflow) (05:03)

The Vanishing Gradient Problem

  1. The Vanishing Gradient Problem Description (03:08)
  2. The Vanishing Gradient Problem Demo in Code (12:18)

Extras + Visualizing what features a neural network has learned

  1. Exercises on feature visualization and interpretation (02:08)

Applications to NLP (Natural Language Processing)

  1. Application of PCA and SVD to NLP (Natural Language Processing) (02:31)
  2. Latent Semantic Analysis in Code (10:08)
  3. Application of t-SNE + K-Means: Finding Clusters of Related Words (08:38)

Applications to Recommender Systems

  1. Recommender Systems Section Introduction (12:30)
  2. Why Autoencoders and RBMs work (05:58)
  3. Data Preparation and Logistics (05:34)
  4. Autoencoders (AutoRec) Discussion (10:14)
  5. Autoencoders (AutoRec) Code (11:45)
  6. Categorical RBM for Recommender System Ratings (11:32)
  7. Recommender RBM Code pt 1 (07:27)
  8. Recommender RBM Code pt 2 (04:17)
  9. Recommender RBM Code pt 3 (11:43)
  10. Speeding up the Recommender RBM Code (07:53)

Basics Review

  1. Theano Basics (07:48)
  2. Theano Neural Network in Code (09:18)
  3. Tensorflow Basics (07:27)
  4. Tensorflow Neural Network in Code (09:43)
  5. Keras Basics (06:49)
  6. Keras Neural Network in Code (06:38)
  7. Keras Functional API (04:26)

Optional - Legacy RBM Lectures

  1. Restricted Boltzmann Machine Theory (09:32)
  2. Deriving Conditional Probabilities from Joint Probability (06:18)
  3. Contrastive Divergence for RBM Training (02:45)
  4. How to derive the free energy formula (06:33)

Appendix

  1. What is the Appendix? (02:48)
  2. Windows-Focused Environment Setup 2018 (20:21)
  3. How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:33)
  4. Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced? (22:04)
  5. How to Code Yourself (part 1) (15:55)
  6. How to Code Yourself (part 2) (09:23)
  7. What order should I take your courses in? (part 1) (11:19)
  8. What order should I take your courses in? (part 2) (16:07)
  9. Python 2 vs Python 3 (04:38)
  10. How to Succeed in this Course (Long Version) (10:25)
  11. Where to get discount coupons and FREE deep learning material (02:21)