Natural Language Processing with Deep Learning in Python

A complete guide to deriving and implementing word2vec, GloVe, word embeddings, and sentiment analysis with recursive nets

Register for this Course

$29.99 $199.99 USD 85% OFF!


Course Data

Lectures: 110
Length: 13h 49m
Skill Level: All Levels
Languages: English
Includes: Lifetime access, certificate of completion (shareable on LinkedIn, Facebook, and Twitter), Q&A forum

Course Description

In this course we are going to look at NLP (natural language processing) with deep learning.

Previously, you learned about some of the basics, like how many NLP problems are just regular machine learning and data science problems in disguise, and simple, practical methods like bag-of-words and term-document matrices.

These allowed us to do some pretty cool things, like detect spam emails, write poetry, spin articles, and group together similar words.

In this course I’m going to show you how to do even more awesome things. We’ll learn not just one, but four new architectures.

First up is word2vec.

In this course, I’m going to show you exactly how word2vec works, from theory to implementation, and you’ll see that it’s merely the application of skills you already know.

Word2vec is interesting because it magically maps words to a vector space where you can find analogies, like:

  • king - man = queen - woman
  • France - Paris = England - London
  • December - November = July - June
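
To see what this arithmetic means in code, here is a minimal sketch (with made-up toy vectors, purely for illustration) of how an analogy query is answered: combine word vectors with addition and subtraction, then pick the nearest word by cosine similarity.

    import numpy as np

    # Toy vectors for illustration only -- real embeddings are learned from data.
    vocab = ["king", "man", "woman", "queen"]
    word2idx = {w: i for i, w in enumerate(vocab)}
    W = np.random.randn(len(vocab), 50)  # one 50-dim vector per word

    def analogy(a, b, c):
        # Solve "a - b = ? - c", i.e. find the word closest to a - b + c.
        target = W[word2idx[a]] - W[word2idx[b]] + W[word2idx[c]]
        sims = W @ target / (np.linalg.norm(W, axis=1) * np.linalg.norm(target))
        for i in np.argsort(-sims):  # best match first, skipping the inputs
            if vocab[i] not in (a, b, c):
                return vocab[i]

    # With random vectors the answer is meaningless; with trained vectors
    # this is exactly how "king - man + woman = queen" is recovered.
    print(analogy("king", "man", "woman"))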


For those beginners who find algorithms tough and just want to use a library, we will demonstrate the use of the Gensim library to obtain pre-trained word vectors, compute similarities and analogies, and apply those word vectors to build text classifiers.
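
As a taste of what that looks like, here is a minimal sketch using Gensim's downloader API (assuming gensim is installed; "glove-wiki-gigaword-50" is one of its published pretrained vector sets):

    import numpy as np
    import gensim.downloader as api

    # Downloads the vectors on first use; returns a KeyedVectors object.
    vectors = api.load("glove-wiki-gigaword-50")

    # Similarity between two words (cosine similarity of their vectors).
    print(vectors.similarity("france", "england"))

    # Analogy: king - man + woman should be closest to queen.
    print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1))

    # Averaged word vectors make simple features for a text classifier.
    sentence = "the movie was great".split()
    features = np.mean([vectors[w] for w in sentence if w in vectors], axis=0)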

We are also going to look at the GloVe method, which also finds word vectors but uses a technique called matrix factorization, a popular algorithm for recommender systems.

Amazingly, the word vectors produced by GloVe are just as good as the ones produced by word2vec, and it’s way easier to train.
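
To give a flavor of the idea (a toy sketch only; real GloVe adds per-word bias terms and a weighting function on the counts), matrix factorization here means finding two low-rank matrices whose product approximates the log co-occurrence matrix:

    import numpy as np

    # Toy matrix factorization on fake co-occurrence counts. The core idea:
    # W[i] . U[j] should approximate log X[i, j].
    V, D = 100, 10                                           # vocab size, embedding dim
    X = np.random.randint(1, 50, size=(V, V)).astype(float)  # fake counts
    logX = np.log(X)

    W = np.random.randn(V, D) * 0.1
    U = np.random.randn(V, D) * 0.1
    lr = 0.001  # hypothetical learning rate -- tune for real data

    for epoch in range(200):
        delta = W @ U.T - logX   # prediction error, shape (V, V)
        gW = delta @ U           # gradient of 0.5 * sum(delta**2) wrt W
        gU = delta.T @ W         # ...and wrt U
        W -= lr * gW
        U -= lr * gU

    print("mean squared error:", ((W @ U.T - logX) ** 2).mean())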

We will also look at some classical NLP problems, like parts-of-speech tagging and named entity recognition, and use recurrent neural networks to solve them. You’ll see that just about any problem can be solved using neural networks, but you’ll also learn the dangers of having too much complexity.
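
For example, before reaching for an RNN, a classic baseline tags each word with its most frequent tag from the training data; here is a minimal sketch with made-up data:

    from collections import Counter, defaultdict

    # Made-up training data: (word, tag) pairs.
    train = [("the", "DET"), ("dog", "NOUN"), ("runs", "VERB"),
             ("the", "DET"), ("run", "VERB"), ("dog", "NOUN")]

    # Count how often each tag occurs for each word.
    tag_counts = defaultdict(Counter)
    for word, tag in train:
        tag_counts[word][tag] += 1

    def tag_word(word):
        # Predict the most frequent tag seen for this word in training.
        if word in tag_counts:
            return tag_counts[word].most_common(1)[0][0]
        return "NOUN"  # common fallback for unseen words

    print([(w, tag_word(w)) for w in "the dog runs".split()])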

Lastly, you’ll learn about recursive neural networks, which finally help us solve the problem of negation in sentiment analysis. Recursive neural networks exploit the fact that sentences have a tree structure, and we can finally get away from naively using bag-of-words.
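
The core recursion is short; here is a minimal sketch of the forward pass with random weights (the real model learns W and b, plus a sentiment classifier at each node):

    import numpy as np

    D = 10                                        # embedding / hidden dimension
    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(D, 2 * D))    # combines [left; right] children
    b = np.zeros(D)
    embeddings = {w: rng.normal(size=D) for w in ["not", "good", "movie"]}

    def node_vector(tree):
        # A tree is either a word (leaf) or a (left, right) pair of subtrees.
        if isinstance(tree, str):
            return embeddings[tree]
        left, right = tree
        children = np.concatenate([node_vector(left), node_vector(right)])
        return np.tanh(W @ children + b)

    # "not (good movie)": the recursion lets "not" modify the whole phrase --
    # exactly the structure that bag-of-words throws away.
    h = node_vector(("not", ("good", "movie")))
    print(h.shape)  # (10,)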

All of the materials required for this course can be downloaded and installed for FREE. We will do most of our work in Numpy, Matplotlib, and Theano. I am always available to answer your questions and help you along your data science journey.

This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.

See you in class!



Suggested Prerequisites:

  • calculus
  • linear algebra
  • probability (conditional and joint distributions)
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations, loading a CSV file
  • neural networks and backpropagation
  • can write a feedforward neural network in Theano and TensorFlow
  • can write a recurrent neural network / LSTM / GRU in Theano and TensorFlow

Testimonials and Success Stories


I am one of your students. Yesterday, I presented my paper at ICCV 2019. You have a significant part in this, so I want to sincerely thank you for your in-depth guidance to the puzzle of deep learning. Please keep making awesome courses that teach us!

I just watched your short video on “Predicting Stock Prices with LSTMs: One Mistake Everyone Makes.” Giggled with delight.

You probably already know this, but some of us really and truly appreciate you. BTW, I spent a reasonable amount of time making a learning roadmap based on your courses and have started the journey.

Looking forward to your new stuff.

Thank you for doing this! I wish everyone who calls themselves a Data Scientist would take the time to do this, either as a refresher or to learn the material. I have had to work with so many people in prior roles who wanted to jump right into machine learning on my teams and didn’t even understand the first thing about the basics you have in here!!

I am signing up so that I have an easy refresher when needed, to see what you consider important, and to support your great work. Thank you.

Thank you, I think you have opened my eyes. I was using APIs to implement deep learning algorithms and each time I felt I was missing out on some things. So thank you very much.

I have been intending to send you an email expressing my gratitude for the work that you have done to create all of these data science courses in Machine Learning and Artificial Intelligence. I have been looking long and hard for courses that have mathematical rigor relative to the application of the ML & AI algorithms, as opposed to just exhibiting some 'canned routine' and then, voilà, here is your neural network or logistic regression. ...


I have now taken a few classes from some well-known AI profs at Stanford (Andrew Ng, Christopher Manning, …) with an overall average mark in the mid-90s. Just so you know, you are as good as any of them. But I hope that you already know that.

I wish you a happy and safe holiday season. I am glad you chose to share your knowledge with the rest of us.

Hi Sir, I am a student from India. I've been wanting to write a note to thank you for the courses that you've made, because they have changed my career. I wanted to work in the field of data science but did not have proper guidance; then I stumbled upon your "Logistic Regression" course in March and since then, there's been no looking back. I learned ANNs, CNNs, RNNs, Tensorflow, NLP and whatnot by going through your lectures. The knowledge that I gained enabled me to get a job as a Business Technology Analyst at one of my dream firms, even in the midst of this pandemic. For that, I shall always be grateful to you. Please keep making more courses with the level of detail that you do in low-level libraries like Theano.

I just wanted to reach out and thank you for your most excellent course that I am nearing finishing.

And, I couldn't agree more with some of your "rants", and found myself nodding vigorously!

You are an excellent teacher, and a rare breed.

And your courses are, frankly, more digestible and teach a student far more than some of the top-tier courses from Ivy League schools I have taken in the past.

(I plan to go through many more courses, one by one!)

I know you must be deluged with complaints, in spite of having the best content around. That's just human nature.

Also, satisfied people rarely take the time to write, so I thought I would write in for a change. :)

Hello, Lazy Programmer!

In the process of completing my Master’s at Hunan University, China, I am writing this feedback to you in order to express my deep gratitude for all the knowledge and skills I have obtained studying your courses and following your recommendations.

The first course of yours I took was on Convolutional Neural Networks (“Deep Learning p.5”, as far as I remember). Answering one of my questions on the Q&A board, you suggested I should start from the beginning – the Linear and Logistic Regression courses. Although I assumed I already knew many basic things at that time, I overcame my “pride” and decided to start my journey in Deep Learning from scratch. ...


By the way, if you are interested to hear: I used the HMM classification, just as it was in your course (95% of the script; I made little adjustments), for the Customer-Care department of a big, well-known fintech company, to predict who will call them, so they can call each customer before the rush hours and improve the service. Instead of a poem, I had a sequence of the last 24 hours' events that the customer had, like: "Loaded money", "Usage in the food service", "Entering the app", "Trying to change the password", etc. The label was: called or didn't call. The outcome was great. They use it for their VIP customers. Our data science department and I got a lot of praise.

Lectures

Outline, Review, and Logistical Things

6 Lectures · 30min
  1. Introduction, Outline, and Review (05:35) (FREE preview available)
  2. Where to get the code / data for this course (09:17)
  3. How to Succeed in this Course (05:52)
  4. Tensorflow or Theano - Your Choice! (04:10)
  5. Preprocessed Wikipedia Data (03:04)
  6. How to Open Files for Windows Users (02:18)

Beginner's Corner: Working with Word Vectors

9 Lectures · 58min
  1. What are vectors? (07:56)
  2. What is a word analogy? (07:58)
  3. Trying to find and assess word vectors using TF-IDF and t-SNE (07:42)
  4. Pretrained word vectors from GloVe (11:05)
  5. Pretrained word vectors from word2vec (06:31)
  6. Text Classification with word vectors (04:24)
  7. Text Classification in Code (06:14)
  8. Using pretrained vectors later in the course (03:32)
  9. Suggestion Box (03:03)

Review of Language Modeling and Neural Networks

10 Lectures · 01hr 22min
  1. Review Section Intro (03:14)
  2. Bigrams and Language Models (14:47)
  3. Bigrams in Code (14:19)
  4. Neural Bigram Model (07:56)
  5. Neural Bigram Model in Code (06:48)
  6. Neural Network Bigram Model (09:13)
  7. Neural Network Bigram Model in Code (03:31)
  8. Improving Efficiency (14:35)
  9. Improving Efficiency in Code (04:52)
  10. Review Section Summary (03:26)

Word Embeddings and Word2Vec

15 Lectures · 01hr 34min
  1. Return of the Bigram (03:07)
  2. CBOW (07:39)
  3. Skip-Gram (04:00)
  4. Hierarchical Softmax (08:22)
  5. Negative Sampling (14:11)
  6. Negative Sampling - Important Details (05:09)
  7. Why do I have 2 word embedding matrices and what do I do with them? (02:16)
  8. Word2Vec implementation tricks (04:49)
  9. Word2Vec implementation outline (04:09)
  10. Word2Vec in Code with Numpy (10:47)
  11. Word2Vec Tensorflow Implementation Details (03:58)
  12. Word2Vec Tensorflow in Code (04:06)
  13. How to update only part of a Theano shared variable (05:29)
  14. Word2Vec in Code with Theano (09:57)
  15. Alternative to Wikipedia Data: Brown Corpus (06:03)

Word Embeddings using GloVe

14 Lectures · 01hr 43min
  1. GloVe Section Introduction (02:19)
  2. Matrix Factorization for Recommender Systems - Basic Concepts (21:08)
  3. Matrix Factorization Training (08:11)
  4. Expanding the Matrix Factorization Model (09:23)
  5. Regularization for Matrix Factorization (06:18)
  6. GloVe - Global Vectors for Word Representation (04:13)
  7. Recap of ways to train GloVe (02:32)
  8. GloVe in Code - Numpy Gradient Descent (16:48)
  9. GloVe in Code - Alternating Least Squares (04:43)
  10. GloVe in Code - Theano Gradient Descent (03:51)
  11. GloVe in Tensorflow with Gradient Descent (07:04)
  12. Visualizing country analogies with t-SNE (04:24)
  13. Hyperparameter Challenge (02:19)
  14. Training GloVe with SVD (Singular Value Decomposition) (10:38)

Unifying Word2Vec and GloVe

2 Lectures · 19min
  1. Pointwise Mutual Information - Word2Vec as Matrix Factorization (12:07)
  2. PMI in Code (07:22)

Using Neural Networks to Solve NLP Problems

13 Lectures · 01hr 21min
  1. Parts-of-Speech (POS) Tagging (05:01)
  2. How can neural networks be used to solve POS tagging? (04:08)
  3. Parts-of-Speech Tagging Baseline (15:18)
  4. Parts-of-Speech Tagging Recurrent Neural Network in Theano (13:06)
  5. Parts-of-Speech Tagging Recurrent Neural Network in Tensorflow (12:18)
  6. How does an HMM solve POS tagging? (07:57)
  7. Parts-of-Speech Tagging Hidden Markov Model (HMM) (05:59)
  8. Named Entity Recognition (NER) (03:01)
  9. Comparing NER and POS tagging (02:02)
  10. Named Entity Recognition Baseline (05:55)
  11. Named Entity Recognition RNN in Theano (02:19)
  12. Named Entity Recognition RNN in Tensorflow (02:14)
  13. Hyperparameter Challenge II (02:14)

Recursive Neural Networks (Tree Neural Networks)

11 Lectures · 01hr 29min
  1. Recursive Neural Networks Section Introduction (07:15)
  2. Sentences as Trees (05:30)
  3. Data Description for Recursive Neural Networks (06:53)
  4. What are Recursive Neural Networks / Tree Neural Networks (TNNs)? (05:41)
  5. Building a TNN with Recursion (04:47)
  6. Trees to Sequences (06:40)
  7. Recursive Neural Network in Theano (18:35)
  8. Recursive Neural Tensor Networks (06:23)
  9. RNTN in Tensorflow (Tips) (12:20)
  10. RNTN in Tensorflow (Code) (11:19)
  11. Recursive Neural Network in TensorFlow with Recursion (04:13)

Legacy Word2vec Lectures

12 Lectures · 01hr 13min
  1. What is a word embedding? (10:01)
  2. Using pre-trained word embeddings (02:18)
  3. Word analogies using word embeddings (03:52)
  4. TF-IDF and t-SNE experiment (12:25)
  5. Word2Vec introduction (05:08)
  6. CBOW (02:19)
  7. Skip-Gram (03:31)
  8. Negative Sampling (07:37)
  9. Why do I have 2 word embedding matrices and what do I do with them? (01:36)
  10. Word2Vec in Code with Numpy (part 1) (19:49)
  11. Word2Vec in Code with Numpy (part 2) (01:54)
  12. Converting a sequence of word indexes to a sequence of word vectors (03:15)

Theano and Tensorflow Basics Review

4 Lectures · 34min
  1. Theano Basics (07:48)
  2. Theano Neural Network in Code (09:18)
  3. Tensorflow Basics (07:27)
  4. Tensorflow Neural Network in Code (09:43)

Extra Help

1 Lecture · 02min
  1. How to install wp2txt or WikiExtractor.py (02:22)

Setting Up Your Environment (Appendix/FAQ by Student Request)

2 Lectures · 37min
  1. Anaconda Environment Setup (20:21)
  2. How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:33)

Extra Help With Python Coding for Beginners (Appendix/FAQ by Student Request)

5 Lectures · 52min
  1. How to Code Yourself (part 1) (15:55)
  2. How to Code Yourself (part 2) (09:24)
  3. Proof that using Jupyter Notebook is the same as not using it (12:29)
  4. Python 2 vs Python 3 (04:38)
  5. Is Theano Dead? (10:04)

Effective Learning Strategies for Machine Learning (Appendix/FAQ by Student Request)

4 Lectures · 59min
  1. How to Succeed in this Course (Long Version) (10:25)
  2. Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced? (22:05)
  3. What order should I take your courses in? (part 1) (11:19)
  4. What order should I take your courses in? (part 2) (16:07)

Appendix / FAQ Finale

2 Lectures · 08min
  1. What is the Appendix? (02:48)
  2. Where to get discount coupons and FREE deep learning material (05:31)