In this course we are going to look at **advanced** NLP.

Previously, you learned about some of the basics, like how many NLP problems are just regular **machine learning** and **data science** problems in disguise, and simple, practical methods like **bag-of-words** and **term-document matrices**.

These allowed us to do some pretty cool things, like **detect spam** emails, **write poetry**, **spin articles**, and group together similar words.
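
As a quick refresher, here's a minimal sketch of a term-document matrix built from a toy corpus; the documents and vocabulary here are made up purely for illustration.

```python
import numpy as np

docs = [
    "free money win money",   # spam-like document
    "meeting at noon today",  # ham-like document
]

# Build the vocabulary: one column index per unique word.
vocab = sorted({w for d in docs for w in d.split()})
word2idx = {w: i for i, w in enumerate(vocab)}

# Term-document matrix: X[i, j] = count of word j in document i.
X = np.zeros((len(docs), len(vocab)))
for i, d in enumerate(docs):
    for w in d.split():
        X[i, word2idx[w]] += 1

print(vocab)
print(X)  # each row is the bag-of-words vector for one document
```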

In this course, I’m going to show you how to do even more awesome things. We’ll learn not just 1, but **4** new architectures.

First up is **word2vec**.

In this course, I’m going to show you exactly how word2vec works, from theory to implementation, and you’ll see that it’s merely the application of skills you already know.

Word2vec is interesting because it magically maps words to a vector space where you can find analogies, like:

- king - man = queen - woman
- France - Paris = England - London
- December - November = July - June

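To make the idea concrete, here's a minimal sketch of analogy search by vector arithmetic. It assumes you already have a `word2vec` dict mapping words to Numpy vectors (e.g. loaded from pretrained GloVe or word2vec files, which we cover in the course); the tiny hand-built dict at the bottom is just for demonstration.

```python
import numpy as np

def find_analogy(a, b, c, word2vec):
    """Solve a - b = d - c for d, i.e. d ~ a - b + c (king - man + woman ~ queen)."""
    target = word2vec[a] - word2vec[b] + word2vec[c]
    best_word, best_sim = None, -np.inf
    for word, vec in word2vec.items():
        if word in (a, b, c):
            continue  # don't return one of the query words
        # cosine similarity between candidate vector and target vector
        sim = vec.dot(target) / (np.linalg.norm(vec) * np.linalg.norm(target))
        if sim > best_sim:
            best_word, best_sim = word, sim
    return best_word

# Toy 2-D vectors where the analogy holds by construction:
word2vec = {
    "king": np.array([1.0, 1.0]),  "queen": np.array([1.0, -1.0]),
    "man":  np.array([0.0, 1.0]),  "woman": np.array([0.0, -1.0]),
}
print(find_analogy("king", "man", "woman", word2vec))  # -> queen
```
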
We are also going to look at the **GloVe** method, which also finds word vectors, but uses a technique called **matrix factorization**, a popular algorithm for **recommender systems**.

Amazingly, the word vectors produced by GloVe are just as good as the ones produced by word2vec, and it’s way easier to train.
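
To preview the mechanics, here's a rough sketch of GloVe-style training as weighted matrix factorization with plain gradient descent. It assumes you already have a word-word co-occurrence matrix; here `X` is a random stand-in, and the sizes, learning rate, and epoch count are toy values (the weighting function uses the GloVe paper's common defaults, alpha = 0.75 and x_max = 100).

```python
import numpy as np

V, D = 100, 10                       # vocab size, embedding dimension (toy values)
X = np.random.randint(0, 5, (V, V))  # stand-in for a real co-occurrence matrix

logX = np.log(X + 1)                       # smooth to avoid log(0)
fX = np.minimum((X / 100.0) ** 0.75, 1.0)  # GloVe weighting function f(X_ij)

W = np.random.randn(V, D) / np.sqrt(D)  # word vectors
U = np.random.randn(V, D) / np.sqrt(D)  # context vectors
b = np.zeros(V)                         # word biases
c = np.zeros(V)                         # context biases

lr = 1e-4
for epoch in range(100):
    # delta_ij = w_i . u_j + b_i + c_j - log(X_ij)
    delta = W.dot(U.T) + b[:, None] + c[None, :] - logX
    cost = np.sum(fX * delta ** 2)       # weighted squared error
    grad = 2 * fX * delta
    gW, gU = grad.dot(U), grad.T.dot(W)  # compute both grads before updating
    W -= lr * gW
    U -= lr * gU
    b -= lr * grad.sum(axis=1)
    c -= lr * grad.sum(axis=0)
    if epoch % 20 == 0:
        print(epoch, cost)
```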

We will also look at some classical NLP problems, like **parts-of-speech tagging** and **named entity recognition**, and use **recurrent neural networks** to solve them. You’ll see that just about any problem can be solved using neural networks, but you’ll also learn the dangers of having too much complexity.
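
To give a flavor of the framing, here's a minimal sketch of a recurrent tagger's forward pass: one hidden state update and one tag distribution per word. All sizes and weights are untrained toy stand-ins, not the course's actual implementation.

```python
import numpy as np

V, D, M, K = 1000, 20, 15, 10  # vocab size, embedding dim, hidden dim, number of tags

We = np.random.randn(V, D) * 0.01  # word embedding matrix
Wx = np.random.randn(D, M) * 0.01  # input-to-hidden weights
Wh = np.random.randn(M, M) * 0.01  # hidden-to-hidden weights
Wo = np.random.randn(M, K) * 0.01  # hidden-to-output weights

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

def tag_sentence(word_idxs):
    """Return one predicted tag index per word in the sentence."""
    h = np.zeros(M)
    tags = []
    for idx in word_idxs:
        h = np.tanh(We[idx].dot(Wx) + h.dot(Wh))    # recurrent state update
        tags.append(np.argmax(softmax(h.dot(Wo))))  # tag distribution -> best tag
    return tags

print(tag_sentence([5, 42, 7]))  # one tag per token (arbitrary, since untrained)
```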

Lastly, you’ll learn about **recursive neural networks**, which finally help us solve the problem of negation in **sentiment analysis**. Recursive neural networks exploit the fact that sentences have a tree structure, and we can finally get away from naively using bag-of-words.
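
Here's a minimal sketch of the core idea: a phrase vector is built bottom-up by recursively combining child vectors over a binary parse tree. The weights and the hand-built tree are illustrative stand-ins only.

```python
import numpy as np

D = 8                                  # word / phrase vector dimension
W_left = np.random.randn(D, D) * 0.1   # transform applied to the left child
W_right = np.random.randn(D, D) * 0.1  # transform applied to the right child

class Node:
    def __init__(self, vec=None, left=None, right=None):
        self.vec = vec    # word vector at a leaf, else None
        self.left = left
        self.right = right

def compose(node):
    """Recursively combine child vectors into a phrase vector."""
    if node.left is None:  # leaf: just return the word vector
        return node.vec
    hl, hr = compose(node.left), compose(node.right)
    return np.tanh(W_left.dot(hl) + W_right.dot(hr))

# A tree like ("not" ("very" "good")): negation is composed with its
# argument, which a bag-of-words representation cannot capture.
leaf = lambda: Node(vec=np.random.randn(D))
tree = Node(left=leaf(), right=Node(left=leaf(), right=leaf()))
sentence_vec = compose(tree)  # feed this vector to a sentiment classifier
print(sentence_vec.shape)
```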

All of the materials required for this course can be downloaded and installed for FREE. We will do most of our work in Numpy, Matplotlib, and Theano. I am always available to answer your questions and help you along your data science journey.

This course focuses on **"how to build and understand"**, not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts", it's about **"seeing for yourself" via experimentation**. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.

See you in class!

HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:

- calculus
- linear algebra
- probability (conditional and joint distributions)
- Python coding: if/else, loops, lists, dicts, sets
- Numpy coding: matrix and vector operations, loading a CSV file
- neural networks and backpropagation
- Can write a feedforward neural network in Theano and TensorFlow
- Can write a recurrent neural network / LSTM / GRU in Theano and TensorFlow

TIPS (for getting through the course):

- Watch it at 2x.
- Take handwritten notes. This will drastically increase your ability to retain the information.
- Write down the equations. If you don't, I guarantee it will just look like gibberish.
- Ask lots of questions on the discussion board. The more the better!
- Realize that most exercises will take you days or weeks to complete.
- Write code yourself, don't just sit there and look at my code.

- Introduction, Outline, and Review (06:25) (FREE preview available)
- Where to get the code / data for this course (02:33)
- How to Succeed in this Course (03:13)
- TensorFlow or Theano - Your Choice! (04:09)

- What are vectors? (07:57)
- What is a word analogy? (07:59)
- Trying to find and assess word vectors using TF-IDF and t-SNE (07:42)
- Pretrained word vectors from GloVe (11:05)
- Pretrained word vectors from word2vec (06:32)
- Text Classification with word vectors (04:25)
- Text Classification in Code (06:15)
- Using pretrained vectors later in the course (03:32)

- Review Section Intro (03:14)
- Bigrams and Language Models (14:47)
- Bigrams in Code (14:20)
- Neural Bigram Model (07:57)
- Neural Bigram Model in Code (06:49)
- Neural Network Bigram Model (09:14)
- Neural Network Bigram Model in Code (03:31)
- Improving Efficiency (14:36)
- Improving Efficiency in Code (04:52)
- Review Section Summary (03:26)

- Return of the Bigram (03:08)
- CBOW (07:40)
- Skip-Gram (04:00)
- Hierarchical Softmax (08:23)
- Negative Sampling (14:12)
- Negative Sampling - Important Details (05:09)
- Why do I have 2 word embedding matrices and what do I do with them? (02:16)
- Word2Vec implementation tricks (04:50)
- Word2Vec implementation outline (04:09)
- Word2Vec in Code with Numpy (10:47)
- Word2Vec TensorFlow Implementation Details (03:59)
- Word2Vec TensorFlow in Code (04:07)
- How to update only part of a Theano shared variable (05:29)
- Word2Vec in Code with Theano (09:57)
- Alternative to Wikipedia Data: Brown Corpus (06:04)

- GloVe Section Introduction (02:20)
- Matrix Factorization for Recommender Systems - Basic Concepts (21:08)
- Matrix Factorization Training (08:11)
- Expanding the Matrix Factorization Model (09:23)
- Regularization for Matrix Factorization (06:18)
- GloVe - Global Vectors for Word Representation (04:13)
- Recap of ways to train GloVe (02:31)
- GloVe in Code - Numpy Gradient Descent (16:48)
- GloVe in Code - Alternating Least Squares (04:42)
- GloVe in Code - Theano Gradient Descent (03:51)
- GloVe in TensorFlow with Gradient Descent (07:04)
- Visualizing country analogies with t-SNE (04:25)
- Hyperparameter Challenge (02:19)
- Training GloVe with SVD (Singular Value Decomposition) (10:38)

- Pointwise Mutual Information - Word2Vec as Matrix Factorization (12:07)
- PMI in Code (07:22)

- Parts-of-Speech (POS) Tagging (05:00)
- How can neural networks be used to solve POS tagging? (04:09)
- Parts-of-Speech Tagging Baseline (15:18)
- Parts-of-Speech Tagging Recurrent Neural Network in Theano (13:05)
- Parts-of-Speech Tagging Recurrent Neural Network in TensorFlow (12:17)
- How does an HMM solve POS tagging? (07:58)
- Parts-of-Speech Tagging Hidden Markov Model (HMM) (05:58)
- Named Entity Recognition (NER) (03:01)
- Comparing NER and POS tagging (02:02)
- Named Entity Recognition Baseline (05:55)
- Named Entity Recognition RNN in Theano (02:19)
- Named Entity Recognition RNN in TensorFlow (02:14)
- Hyperparameter Challenge II (02:13)

- Recursive Neural Networks Section Introduction (07:15)
- Sentences as Trees (05:29)
- Data Description for Recursive Neural Networks (06:52)
- What are Recursive Neural Networks / Tree Neural Networks (TNNs)? (05:41)
- Building a TNN with Recursion (04:47)
- Trees to Sequences (06:39)
- Recursive Neural Network in Theano (18:35)
- Recursive Neural Tensor Networks (06:23)
- RNTN in TensorFlow (Tips) (12:19)
- RNTN in TensorFlow (Code) (11:19)
- Recursive Neural Network in TensorFlow with Recursion (04:12)

- What is a word embedding? (10:00)
- Using pre-trained word embeddings (02:18)
- Word analogies using word embeddings (03:51)
- TF-IDF and t-SNE experiment (12:24)
- Word2Vec introduction (05:08)
- CBOW (02:20)
- Skip-Gram (03:31)
- Negative Sampling (07:36)
- Why do I have 2 word embedding matrices and what do I do with them? (01:37)
- Word2Vec in Code with Numpy (part 1) (19:50)
- Word2Vec in Code with Numpy (part 2) (01:53)
- Converting a sequence of word indexes to a sequence of word vectors (03:15)

- What is the Appendix? (02:48)
- Windows-Focused Environment Setup 2018 (20:21)
- How to install wp2txt or WikiExtractor.py (02:21)
- How to install Numpy, Scipy, Matplotlib, Pandas, Scikit-Learn, IPython, Theano, and TensorFlow (17:33)
- How to Code Yourself (part 1) (15:55)
- How to Code Yourself (part 2) (09:23)
- What order should I take your courses in? (part 1) (11:19)
- What order should I take your courses in? (part 2) (16:07)
- Proof that using Jupyter Notebook is the same as not using it (12:29)
- Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced? (22:04)
- Python 2 vs Python 3 (04:38)
- How to Succeed in this Course (Long Version) (10:25)
- Is Theano Dead? (10:03)
- Where to get discount coupons and FREE deep learning material (02:21)