Deep Learning: GANs and Variational Autoencoders

Generative Adversarial Networks and Variational Autoencoders in Python, Theano, and Tensorflow

Register for this Course

$19.00 $180.00 USD 89% OFF!


Course Data

Lectures: 48
Length: 06h 48m
Skill Level: All Levels
Languages: English
Includes: Lifetime access, 30-day money back guarantee

Course Description

Variational autoencoders and GANs have been two of the most interesting recent developments in deep learning and machine learning.

Yann LeCun, a deep learning pioneer, has said that the most important development in recent years has been adversarial training, referring to GANs.

GAN stands for generative adversarial network, a setup in which two neural networks compete with each other: a generator that produces fake samples, and a discriminator that tries to tell real samples from fake ones.
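As a toy illustration (this is not course code, and the function names and numbers are made up for the sketch), the competition can be written as two opposing cross-entropy losses: the discriminator is rewarded for scoring real samples near 1 and fake samples near 0, while the generator is rewarded for fooling the discriminator into scoring its fakes near 1.

```python
import numpy as np

def discriminator_loss(d_real, d_fake):
    # the discriminator wants d_real -> 1 and d_fake -> 0
    return -np.mean(np.log(d_real) + np.log(1.0 - d_fake))

def generator_loss(d_fake):
    # the generator wants the discriminator to output 1 on fake samples
    return -np.mean(np.log(d_fake))

# hypothetical discriminator outputs (probabilities of "real")
d_real = np.array([0.9, 0.8, 0.95])   # scores on real data
d_fake = np.array([0.1, 0.2, 0.05])   # scores on generated data

print(discriminator_loss(d_real, d_fake))  # small: the discriminator is winning
print(generator_loss(d_fake))              # large: the generator is losing
```

Training alternates between the two: update the discriminator to lower its loss, then update the generator to lower its own. The full cost functions and their game-theoretic interpretation are covered in the GAN section of the course.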

What is unsupervised learning?

Unsupervised learning means we're not trying to map input data to targets; instead, we're trying to learn the structure of the input data itself.

Once we’ve learned that structure, we can do some pretty cool things.
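As a minimal sketch of this idea (a toy example, not course code, assuming the "structure" is just a Gaussian): we fit a distribution to unlabeled data, then sample from the fit to generate new points that resemble the originals.

```python
import numpy as np

rng = np.random.default_rng(0)

# pretend this is our unlabeled input data: 1000 samples, 2 features
X = rng.normal(loc=[5.0, -3.0], scale=[1.0, 0.5], size=(1000, 2))

# "learn the structure": estimate the mean and covariance of the data
mu = X.mean(axis=0)
cov = np.cov(X, rowvar=False)

# "generate more stuff that resembles the original data": sample from the fit
new_samples = rng.multivariate_normal(mu, cov, size=5)
print(new_samples.shape)  # (5, 2) -- new points drawn from the learned model
```

VAEs and GANs do the same thing in spirit, but use neural networks to learn far richer structure than a single Gaussian can capture.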

One example is generating poetry, which we've done in past courses.

But poetry is a very specific thing. What about writing in general?

If we can learn the structure of language, we can generate any kind of text. In fact, big companies are investing heavily in research on having machines write the news.

But what if we go back to poetry and take away the words?

Well, then we get art in general.

By learning the structure of art, we can create more art.

How about art as sound?

If we learn the structure of music, we can create new music.

Imagine the top 40 hits you hear on the radio are songs written by robots rather than humans.

The possibilities are endless!

You might be wondering, "how is this course different from the first unsupervised deep learning course?"

In that first course, we also tried to learn the structure of data, but for different reasons.

We wanted to learn the structure of data in order to improve supervised training, which we demonstrated was possible.

In this new course, we want to learn the structure of data in order to produce more stuff that resembles the original data.

This by itself is really cool, but we'll also be incorporating ideas from Bayesian Machine Learning, Reinforcement Learning, and Game Theory. That makes it even cooler!

Thanks for reading and I’ll see you in class. =)



NOTES:

All the code for this course can be downloaded from my github:

https://github.com/lazyprogrammer/machine_learning_examples

In the directory: unsupervised_class3

Make sure you always "git pull" so you have the latest version!



HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:

  • Calculus
  • Probability
  • Object-oriented programming
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations
  • Linear regression
  • Gradient descent
  • Know how to build a feedforward and convolutional neural network in Theano and TensorFlow


TIPS (for getting through the course):

  • Watch it at 2x.
  • Take handwritten notes. This will drastically increase your ability to retain the information.
  • Write down the equations. If you don't, I guarantee it will just look like gibberish.
  • Ask lots of questions on the discussion board. The more the better!
  • Realize that most exercises will take you days or weeks to complete.
  • Write code yourself, don't just sit there and look at my code.


Lectures

Introduction and Outline

  1. Welcome (04:33) (FREE preview available)
  2. Where does this course fit into your deep learning studies? (05:00)
  3. Where to get the code and data (03:51)
  4. How to succeed in this course (05:19)

Generative Modeling Review

  1. What does it mean to Sample? (04:57)
  2. Sampling Demo: Bayes Classifier (03:57)
  3. Gaussian Mixture Model Review (10:31)
  4. Sampling Demo: Bayes Classifier with GMM (03:54)
  5. Why do we care about generating samples? (11:20)
  6. Neural Network and Autoencoder Review (07:26)
  7. Tensorflow Warmup (04:07)
  8. Theano Warmup (04:54)

Variational Autoencoders

  1. Variational Autoencoders Section Introduction (05:39)
  2. Variational Autoencoder Architecture (05:57)
  3. Parameterizing a Gaussian with a Neural Network (08:00)
  4. The Latent Space, Predictive Distributions and Samples (05:13)
  5. Cost Function (07:28)
  6. Tensorflow Implementation (pt 1) (07:18)
  7. Tensorflow Implementation (pt 2) (02:29)
  8. Tensorflow Implementation (pt 3) (09:55)
  9. The Reparameterization Trick (05:05)
  10. Theano Implementation (10:52)
  11. Visualizing the Latent Space (03:09)
  12. Bayesian Perspective (10:11)
  13. Variational Autoencoder Section Summary (04:02)

Generative Adversarial Networks (GANs)

  1. GAN - Basic Principles (05:13)
  2. GAN Cost Function (pt 1) (07:23)
  3. GAN Cost Function (pt 2) (04:56)
  4. DCGAN (07:38)
  5. Batch Normalization Review (08:01)
  6. Fractionally-Strided Convolution (08:35)
  7. Tensorflow Implementation Notes (13:23)
  8. Tensorflow Implementation (18:13)
  9. Theano Implementation Notes (07:26)
  10. Theano Implementation (19:47)
  11. GAN Summary (09:43)

Appendix

  1. What is the Appendix? (02:48)
  2. Windows-Focused Environment Setup 2018 (20:20)
  3. How to install Numpy, Theano, Tensorflow, etc... (17:22)
  4. Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced? (22:04)
  5. How to Succeed in this Course (Long Version) (10:24)
  6. How to Code by Yourself (part 1) (15:54)
  7. How to Code by Yourself (part 2) (09:23)
  8. What order should I take your courses in? (part 1) (11:18)
  9. What order should I take your courses in? (part 2) (16:07)
  10. Python 2 vs Python 3 (04:38)
  11. Is Theano Dead? (10:03)
  12. Where to get discount coupons and FREE deep learning material (02:20)

Extras

  • GAN Tutorial PDF
  • Variational Autoencoder Tutorial PDF
  • Pre-trained Style Transfer Network Ready for Use