Deep Learning: Recurrent Neural Networks in Python

GRU, LSTM, + more modern deep learning, machine learning, and data science for sequences

Register for this Course

$29.99 (regularly $199.99 USD) 85% OFF!


Course Data

Lectures: 113
Length: 17h 21m
Skill Level: All Levels
Languages: English
Includes: Lifetime access, certificate of completion (shareable on LinkedIn, Facebook, and Twitter), Q&A forum

Course Description

Like the course I just released on Hidden Markov Models, Recurrent Neural Networks are all about learning sequences. But whereas Markov Models are limited by the Markov assumption (the next state depends only on the current state, not the full history), Recurrent Neural Networks are not. As a result, they are more expressive and more powerful than anything we've seen before on tasks where progress had stalled for decades.

So what’s going to be in this course and how will it build on the previous neural network courses and Hidden Markov Models?

In the first section of the course we are going to add the concept of time to our neural networks.

I’ll introduce you to the Simple Recurrent Unit, also known as the Elman unit.
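
To give you a taste of how simple it really is, here is a minimal Numpy sketch of one forward step of an Elman unit (my own toy code with made-up weight names, not the exact code from the course):

    import numpy as np

    def elman_step(x_t, h_prev, Wx, Wh, bh):
        # One step of a simple (Elman) recurrent unit: the new hidden
        # state depends on the current input AND the previous hidden
        # state -- that is where "time" enters the network.
        return np.tanh(x_t @ Wx + h_prev @ Wh + bh)

    D, M = 2, 4                            # toy input and hidden sizes
    rng = np.random.default_rng(0)
    Wx, Wh, bh = rng.normal(size=(D, M)), rng.normal(size=(M, M)), np.zeros(M)

    h = np.zeros(M)                        # initial hidden state
    for x_t in rng.normal(size=(10, D)):   # a length-10 input sequence
        h = elman_step(x_t, h, Wx, Wh, bh)
    print(h)                               # final state summarizes the sequence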

We are going to revisit the XOR problem, but we're going to extend it so that it becomes the parity problem. You'll see that regular feedforward neural networks have trouble solving this problem, but recurrent networks work because the key is to treat the input as a sequence.
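
To make that concrete, here is one way to generate parity data (a sketch under my own assumptions, not necessarily the exact setup used in the lectures):

    import numpy as np

    def make_parity_data(n_sequences, seq_len, seed=0):
        # Each input is a sequence of bits; each target is the parity
        # (running XOR) of the bits seen so far. A feedforward net sees
        # the whole bit string at once and struggles; an RNN can carry
        # the running parity in its hidden state, one bit at a time.
        rng = np.random.default_rng(seed)
        X = rng.integers(0, 2, size=(n_sequences, seq_len))
        Y = np.cumsum(X, axis=1) % 2       # running parity at each position
        return X, Y

    X, Y = make_parity_data(3, 8)
    print(X[0])   # e.g. [0 1 1 0 ...]
    print(Y[0])   # running XOR of the bits so far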

In the next section of the course, we are going to revisit one of the most popular applications of recurrent neural networks - language modeling.

You saw when we studied Markov Models that we could do things like generate poetry, and it didn't look too bad. We could even discriminate between two different poets just from the sequences of parts-of-speech tags they used.

In this course, we are going to extend our language model so that it no longer makes the Markov assumption.
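
Loosely, the difference looks like this (a sketch of my own, with made-up names and biases omitted for brevity): a bigram Markov model predicts the next word from the previous word alone, while an RNN language model predicts it from a hidden state that summarizes the entire history.

    import numpy as np

    def softmax(a):
        e = np.exp(a - a.max())
        return e / e.sum()

    # Bigram Markov model: p(w_t | w_{t-1}) is just row w_{t-1} of a
    # V x V transition matrix A, i.e. p_next = A[w_prev].

    def rnn_lm_step(w_prev, h_prev, We, Wh, Wo):
        # RNN language model: h carries the WHOLE history, so this is
        # p(w_t | w_1, ..., w_{t-1}) -- no Markov assumption.
        h = np.tanh(We[w_prev] + h_prev @ Wh)   # We[w] = embedding of word w
        p_next = softmax(h @ Wo)                # distribution over the vocabulary
        return p_next, h

    V, D = 5, 8   # toy vocabulary and hidden sizes
    rng = np.random.default_rng(0)
    We, Wh, Wo = rng.normal(size=(V, D)), rng.normal(size=(D, D)), rng.normal(size=(D, V))
    p, h = rnn_lm_step(2, np.zeros(D), We, Wh, Wo)
    print(p.sum())   # ~1.0, a valid probability distribution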

Another popular application of neural networks for language is word vectors or word embeddings. The most common technique for this is called Word2Vec, but I’ll show you how recurrent neural networks can also be used for creating word vectors.
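
The connection is less mysterious than it sounds: multiplying a one-hot word vector by the input-to-hidden weight matrix just selects one row of that matrix, so the input weights are the word embeddings. A tiny sketch of the equivalence (my own toy code):

    import numpy as np

    V, D = 10, 4                    # vocabulary size, embedding dimension
    rng = np.random.default_rng(0)
    We = rng.normal(size=(V, D))    # input-to-hidden weights = embedding matrix

    word_idx = 7
    one_hot = np.zeros(V)
    one_hot[word_idx] = 1.0

    # multiplying by a one-hot vector is the same as a row lookup, so
    # after training, row w of We is the word vector for word w
    assert np.allclose(one_hot @ We, We[word_idx])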

In the section after, we'll look at the very popular LSTM, or long short-term memory unit, and the more modern and efficient GRU, or gated recurrent unit, which has been shown to yield comparable performance.
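
For orientation, here is one forward step of a GRU in Numpy, following the standard formulation (the weight names are mine, and biases are omitted for brevity; the LSTM is similar in spirit but uses three gates and a separate cell state):

    import numpy as np

    def sigmoid(a):
        return 1 / (1 + np.exp(-a))

    def gru_step(x, h_prev, Wxz, Whz, Wxr, Whr, Wxh, Whh):
        # Two gates decide how much of the old state to keep.
        z = sigmoid(x @ Wxz + h_prev @ Whz)            # update gate
        r = sigmoid(x @ Wxr + h_prev @ Whr)            # reset gate
        h_hat = np.tanh(x @ Wxh + (r * h_prev) @ Whh)  # candidate state
        return (1 - z) * h_prev + z * h_hat            # blend old and new

    D, M = 3, 5   # toy input and hidden sizes
    rng = np.random.default_rng(0)
    shapes = [(D, M), (M, M), (D, M), (M, M), (D, M), (M, M)]
    weights = [rng.normal(size=s) for s in shapes]
    h = gru_step(rng.normal(size=D), np.zeros(M), *weights)
    print(h)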

We’ll apply these to some more practical problems, such as learning a language model from Wikipedia data and visualizing the word embeddings we get as a result.
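
If you're wondering what "visualizing the word embeddings" means, the usual trick is to project the high-dimensional embedding matrix down to 2D. Here is a minimal sketch using t-SNE from Scikit-Learn plus Matplotlib (the variable names and the use of Scikit-Learn here are my assumptions, not a claim about the course's exact code):

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.manifold import TSNE

    # Stand-ins: We would be the learned V x D embedding matrix and
    # idx2word the index-to-word mapping from your training script.
    V, D = 100, 32
    We = np.random.randn(V, D)
    idx2word = [f"word{i}" for i in range(V)]

    Z = TSNE(n_components=2, random_state=0).fit_transform(We)  # V x 2
    plt.scatter(Z[:, 0], Z[:, 1], s=5)
    for i in range(0, V, 10):   # label a few points to keep it readable
        plt.annotate(idx2word[i], (Z[i, 0], Z[i, 1]))
    plt.show()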

All of the materials required for this course can be downloaded and installed for FREE. We will do most of our work in Numpy, Matplotlib, and Theano. I am always available to answer your questions and help you along your data science journey.

This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts"; it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.

See you in class!



Suggested Prerequisites:

  • calculus
  • linear algebra
  • probability (conditional and joint distributions)
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations, loading a CSV file
  • neural networks and backpropagation
  • the XOR problem
  • can write a feedforward neural network in Theano and TensorFlow


Tips for success:

  • Use the video speed changer! Personally, I like to watch at 2x.
  • Take handwritten notes. This will drastically increase your ability to retain the information.
  • Write down the equations. If you don't, I guarantee it will just look like gibberish.
  • Ask lots of questions on the discussion board. The more the better!
  • Don't get discouraged if you can't solve every exercise right away. Sometimes it'll take hours, days, or maybe weeks!
  • Write code yourself; this is an applied course! Don't be a "couch potato".

Testimonials and Success Stories


I am one of your students. Yesterday, I presented my paper at ICCV 2019. You have a significant part in this, so I want to sincerely thank you for your in-depth guidance to the puzzle of deep learning. Please keep making awesome courses that teach us!

I just watched your short video on “Predicting Stock Prices with LSTMs: One Mistake Everyone Makes.” Giggled with delight.

You probably already know this, but some of us really and truly appreciate you. BTW, I spent a reasonable amount of time making a learning roadmap based on your courses and have started the journey.

Looking forward to your new stuff.

Thank you for doing this! I wish everyone who calls themselves a Data Scientist would take the time to do this, either as a refresher or to learn the material. I have had to work with so many people in prior roles who wanted to jump right into machine learning on my teams and didn't even understand the first thing about the basics you have in here!!

I am signing up so that I have an easy refresher when needed, to see what you consider important, and to support your great work. Thank you.

Thank you, I think you have opened my eyes. I was using APIs to implement deep learning algorithms, and each time I felt I was missing out on some things. So thank you very much.

I have been intending to send you an email expressing my gratitude for the work that you have done to create all of these data science courses in Machine Learning and Artificial Intelligence. I have been looking long and hard for courses that have mathematical rigor relative to the application of the ML & AI algorithms, as opposed to just exhibiting some 'canned routine' and then, voilà, here is your neural network or logistic regression. ...


I have now taken a few classes from some well-known AI profs at Stanford (Andrew Ng, Christopher Manning, …) with an overall average mark in the mid-90s. Just so you know, you are as good as any of them. But I hope that you already know that.

I wish you a happy and safe holiday season. I am glad you chose to share your knowledge with the rest of us.

Hi Sir, I am a student from India. I've been wanting to write a note to thank you for the courses that you've made, because they have changed my career. I wanted to work in the field of data science but did not have proper guidance; then I stumbled upon your "Logistic Regression" course in March, and since then there's been no looking back. I learned ANNs, CNNs, RNNs, TensorFlow, NLP, and whatnot by going through your lectures. The knowledge that I gained enabled me to get a job as a Business Technology Analyst at one of my dream firms, even in the midst of this pandemic. For that, I shall always be grateful to you. Please keep making more courses with the level of detail that you do, in low-level libraries like Theano.

I just wanted to reach out and thank you for your most excellent course that I am nearing finishing.

And, I couldn't agree more with some of your "rants", and found myself nodding vigorously!

You are an excellent teacher, and a rare breed.

And your courses are, frankly, more digestible and teach a student far more than some of the top-tier Ivy League courses I have taken in the past.

(I plan to go through many more courses, one by one!)

I know you must be deluged with complaints in spite of having the best content around. That's just human nature.

Also, satisfied people rarely take the time to write, so I thought I would write in for a change. :)

Hello, Lazy Programmer!

In the process of completing my Master’s at Hunan University, China, I am writing this feedback to you in order to express my deep gratitude for all the knowledge and skills I have obtained studying your courses and following your recommendations.

The first course of yours I took was on Convolutional Neural Networks ("Deep Learning p.5", as far as I remember). Answering one of my questions on the Q&A board, you suggested I should start from the beginning – the Linear and Logistic Regression courses. Although I assumed I already knew many of the basics at that time, I overcame my "pride" and decided to start my journey in Deep Learning from scratch. ...


By the way, if you are interested to hear: I used the HMM classification, as it was in your course (95% of the script; I made small adjustments), for the Customer-Care department of a big, well-known fintech company, to predict who will call them so they can call the customer before the rush hours and improve the service. Instead of a poem, I had a sequence of the customer's events from the last 24 hours, like: "Loaded money", "Usage in the food service", "Entering the app", "Trying to change the password", etc. The label was whether or not the customer called. The outcome was great. They use it for their VIP customers. Our data science department and I got a lot of praise.

Lectures

Welcome

4 Lectures · 22min
  1. Introduction and Outline (03:18) (FREE preview available)
  2. Where to get the code and data - instant access (01:42)
  3. How to use Github & Extra Coding Tips (Optional) (11:12)
  4. How to Succeed in this Course (05:52)

Google Colab

3 Lectures · 33min
  1. Intro to Google Colab, how to use a GPU or TPU for free (12:32)
  2. Uploading your own data to Google Colab (11:41)
  3. Where can I learn about Numpy, Scipy, Matplotlib, Pandas, and Scikit-Learn? (08:54)

Machine Learning and Neurons

13 Lectures · 01hr 39min
  1. Review Section Introduction (02:37)
  2. What is Machine Learning? (14:26)
  3. Code Preparation (Classification Theory) (15:59)
  4. Classification Notebook (08:40)
  5. Exercise: Predicting Diabetes Onset (02:34)
  6. Code Preparation (Regression Theory) (07:18)
  7. Regression Notebook (10:34)
  8. Exercise: Real Estate Predictions (02:33)
  9. The Neuron (09:58)
  10. How does a model 'learn'? (10:53)
  11. Making Predictions (06:45)
  12. Saving and Loading a Model (04:27)
  13. Suggestion Box (03:10)

Feedforward Artificial Neural Networks

11 Lectures · 02hr 33min
  1. Artificial Neural Networks Section Introduction (06:00)
  2. Forward Propagation (09:40)
  3. The Geometrical Picture (09:43)
  4. Activation Functions (17:18)
  5. Multiclass Classification (08:41)
  6. How to Represent Images (12:36)
  7. Color Mixing Clarification (55:00)
  8. Code Preparation (ANN) (12:42)
  9. ANN for Image Classification (08:36)
  10. ANN for Regression (11:05)
  11. Exercise: E. Coli Protein Localization Sites (02:21)

Recurrent Neural Networks, Time Series, and Sequence Data

19 Lectures · 03hr 15min
  1. Sequence Data (18:27)
  2. Forecasting (10:35)
  3. Autoregressive Linear Model for Time Series Prediction (12:01)
  4. Proof that the Linear Model Works (04:12)
  5. Recurrent Neural Networks (21:34)
  6. RNN Code Preparation (05:50)
  7. RNN for Time Series Prediction (11:11)
  8. Paying Attention to Shapes (08:27)
  9. GRU and LSTM (pt 1) (17:35)
  10. GRU and LSTM (pt 2) (11:36)
  11. A More Challenging Sequence (09:19)
  12. Demo of the Long Distance Problem (19:26)
  13. RNN for Image Classification (Theory) (04:41)
  14. RNN for Image Classification (Code) (04:00)
  15. Stock Return Predictions using LSTMs (pt 1) (12:03)
  16. Stock Return Predictions using LSTMs (pt 2) (05:45)
  17. Stock Return Predictions using LSTMs (pt 3) (11:59)
  18. Other Ways to Forecast (05:14)
  19. Exercise: More Forecasting (01:52)

Natural Language Processing (NLP)

5 Lectures · 42min
  1. Embeddings (13:12)
  2. Code Preparation (NLP) (13:17)
  3. Text Preprocessing (05:30)
  4. Text Classification with LSTMs (08:19)
  5. Exercise: Sentiment Analysis (02:01)

In-Depth: Loss Functions

3 Lectures · 23min
  1. Mean Squared Error (09:11)
  2. Binary Cross Entropy (05:58)
  3. Categorical Cross Entropy (08:06)

In-Depth: Gradient Descent

6 Lectures · 54min
  1. Gradient Descent (07:52)
  2. Stochastic Gradient Descent (04:36)
  3. Momentum (06:11)
  4. Variable and Adaptive Learning Rates (11:46)
  5. Adam Optimization (pt 1) (13:15)
  6. Adam Optimization (pt 2) (11:14)

Introduction and Outline (Legacy)

4 Lectures · 11min
  1. Outline of this Course (02:56)
  2. Review of Important Deep Learning Concepts (03:31)
  3. Where to get the Code and Data (01:50)
  4. How to Succeed in this Course (03:13)

The Simple Recurrent Unit (Legacy)

9 Lectures · 01hr 05min
  1. Architecture of a Recurrent Unit (04:40)
  2. Prediction and Relationship to Markov Models (05:15)
  3. Unfolding a Recurrent Network (01:56)
  4. Backpropagation Through Time (BPTT) (04:18)
  5. The Parity Problem - XOR on Steroids (04:33)
  6. The Parity Problem in Code using a Feedforward ANN (15:06)
  7. Theano Scan Tutorial (12:41)
  8. The Parity Problem in Code using a Recurrent Neural Network (15:15)
  9. On Adding Complexity (01:17)

Recurrent Neural Networks for NLP (Legacy)

8 Lectures · 59min
  1. Word Embeddings and Recurrent Neural Networks (05:02)
  2. Word Analogies with Word Embeddings (02:26)
  3. Representing a sequence of words as a sequence of word embeddings (03:15)
  4. Generating Poetry (04:24)
  5. Generating Poetry in Code (part 1) (19:24)
  6. Generating Poetry in Code (part 2) (04:35)
  7. Classifying Poetry (03:40)
  8. Classifying Poetry in Code (16:43)

Advanced RNN Units (Legacy)

11 Lectures · 01hr 27min
  1. Rated RNN Unit (03:25)
  2. RRNN in Code - Revisiting Poetry Generation (08:50)
  3. Gated Recurrent Unit (GRU) (05:18)
  4. GRU in Code (06:29)
  5. Long Short-Term Memory (LSTM) (04:31)
  6. LSTM in Code (08:15)
  7. Learning from Wikipedia Data (06:58)
  8. Alternative to Wikipedia Data: Brown Corpus (06:04)
  9. Learning from Wikipedia Data in Code (part 1) (17:57)
  10. Learning from Wikipedia Data in Code (part 2) (08:38)
  11. Visualizing the Word Embeddings (11:07)

Batch Training (Legacy)

1 Lecture · 10min
  1. Batch Training for Simple RNN (10:26)

TensorFlow (Legacy)

1 Lecture · 07min
  1. Simple RNN in TensorFlow (07:39)

Extra Help

1 Lecture · 02min
  1. How to install wp2txt or WikiExtractor.py (02:22)

Setting Up Your Environment (Appendix/FAQ by Student Request)

2 Lectures · 37min
  1. Anaconda Environment Setup (20:21)
  2. How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:33)

Extra Help With Python Coding for Beginners (Appendix/FAQ by Student Request)

5 Lectures · 52min
  1. How to Code Yourself (part 1) (15:55)
  2. How to Code Yourself (part 2) (09:24)
  3. Proof that using Jupyter Notebook is the same as not using it (12:29)
  4. Python 2 vs Python 3 (04:38)
  5. Is Theano Dead? (10:04)

Effective Learning Strategies for Machine Learning (Appendix/FAQ by Student Request)

5 Lectures · 01hr 13min
  1. Beginner's Coding Tips (13:22)
  2. How to Succeed in this Course (Long Version) (10:25)
  3. Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced? (22:05)
  4. What order should I take your courses in? (part 1) (11:19)
  5. What order should I take your courses in? (part 2) (16:07)

Appendix / FAQ Finale

2 Lectures · 08min
  1. What is the Appendix? (02:48)
  2. Where to get discount coupons and FREE deep learning material (05:31)