Machine Learning: Natural Language Processing in Python (V2)

NLP: Use Markov Models, NLTK, Artificial Intelligence, Deep Learning, Machine Learning, and Data Science in Python

Register for this Course

$54.99 USD (regularly $219.99) 75% OFF!



Course Data

Lectures: 170
Length: 24h 41m
Skill Level: All Levels
Languages: English
Includes: Lifetime access, certificate of completion (shareable on LinkedIn, Facebook, and Twitter), Q&A forum

Course Description

Hello friends!

Welcome to Machine Learning: Natural Language Processing in Python (Version 2).

This is a massive 4-in-1 course covering:

  • 1) Vector models and text preprocessing methods
  • 2) Probability models and Markov models
  • 3) Machine learning methods
  • 4) Deep learning and neural network methods


In part 1, which covers vector models and text preprocessing methods, you'll learn why vectors are so essential in data science and artificial intelligence. You'll learn about various techniques for converting text into vectors, such as the CountVectorizer and TF-IDF, and you'll learn the basics of neural embedding methods like word2vec and GloVe.
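
As a small preview, here's a minimal sketch of both vectorizers using scikit-learn (the three-document corpus is invented purely for illustration):

    # Turning text into vectors with scikit-learn (toy corpus for illustration).
    from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

    corpus = [
        "I love natural language processing",
        "language models turn text into vectors",
        "vectors make text usable by machine learning models",
    ]

    # Bag-of-words: each document becomes a vector of word counts
    count_vec = CountVectorizer()
    X_counts = count_vec.fit_transform(corpus)   # sparse matrix, shape (3, vocab_size)

    # TF-IDF: the same counts, re-weighted so common words count for less
    tfidf_vec = TfidfVectorizer()
    X_tfidf = tfidf_vec.fit_transform(corpus)

    print(count_vec.get_feature_names_out())
    print(X_counts.toarray())
    print(X_tfidf.toarray().round(2))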

You'll then apply what you learned for various tasks, such as:

  • Text classification
  • Document retrieval / search engine
  • Text summarization


Along the way, you'll also learn important text preprocessing steps, such as tokenization, stemming, and lemmatization.
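
For example, here's a quick sketch of those steps with NLTK (the example sentence is made up):

    # Tokenization, stemming, and lemmatization with NLTK.
    import nltk
    from nltk.tokenize import word_tokenize
    from nltk.stem import PorterStemmer, WordNetLemmatizer

    nltk.download("punkt")    # tokenizer model
    nltk.download("wordnet")  # lemmatizer dictionary

    tokens = word_tokenize("The striped bats were hanging on their feet")

    stemmer = PorterStemmer()
    lemmatizer = WordNetLemmatizer()

    print([stemmer.stem(t) for t in tokens])          # crude suffix chopping: 'hanging' -> 'hang'
    print([lemmatizer.lemmatize(t) for t in tokens])  # dictionary-based: 'feet' -> 'foot'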

You'll be introduced briefly to classic NLP tasks such as parts-of-speech tagging.

In part 2, which covers probability models and Markov models, you'll learn about one of the most important models in all of data science and machine learning in the past 100 years. It has been applied in many areas in addition to NLP, such as finance, bioinformatics, and reinforcement learning.

In this course, you'll see how such probability models can be used in various ways, such as:

  • Building a text classifier
  • Article spinning
  • Text generation (generating poetry)


Importantly, these methods are an essential prerequisite for understanding how the latest Transformer (attention) models such as BERT and GPT-3 work. Specifically, we'll learn about two important tasks that correspond to the pre-training objectives for BERT and GPT.
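
To give you a flavor of part 2, here is a tiny sketch of a bigram Markov language model (the toy training text is invented for illustration; the versions built in the course add probability smoothing and log-probabilities):

    # A tiny bigram Markov language model trained on a toy text.
    import random
    from collections import defaultdict, Counter

    text = "the cat sat on the mat the cat ate the rat".split()

    # Count transitions: word -> Counter of observed next words
    transitions = defaultdict(Counter)
    for w1, w2 in zip(text[:-1], text[1:]):
        transitions[w1][w2] += 1

    def sample_next(word):
        counts = transitions[word]
        words, weights = zip(*counts.items())
        return random.choices(words, weights=weights)[0]

    # Generate a short sequence starting from "the"
    word = "the"
    generated = [word]
    for _ in range(5):
        if not transitions[word]:  # dead end: no observed successor
            break
        word = sample_next(word)
        generated.append(word)
    print(" ".join(generated))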

In part 3, which covers machine learning methods, you'll learn about more of the classic NLP tasks, such as:

  • Spam detection
  • Sentiment analysis
  • Latent semantic analysis, also known as latent semantic indexing (LSA / LSI)
  • Topic modeling
  • Text summarization (re-visit)


This section will be application-focused rather than theory-focused, meaning that instead of spending most of your effort on the details of various ML algorithms, you'll focus on how they can be applied to the tasks above.

Of course, you'll still need to learn something about those algorithms in order to understand what's going on. The following algorithms will be used:

  • Naive Bayes
  • Logistic Regression
  • Principal Components Analysis (PCA) / Singular Value Decomposition (SVD)
  • Latent Dirichlet Allocation (LDA)
  • Non-negative Matrix Factorization
  • TextRank (based on Google's PageRank)


These are not just "any" machine learning / artificial intelligence algorithms, but rather ones that have been staples in NLP and are thus an essential part of any NLP course.
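
As a taste of this application-focused style, here is a minimal Naive Bayes spam detector sketch in scikit-learn (the four-message dataset is invented; the course works with real data):

    # A minimal Naive Bayes text classifier, in the spirit of the spam detection section.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    texts = [
        "win a free prize now", "limited offer click here",  # spam
        "meeting at noon tomorrow", "see you at lunch",      # ham
    ]
    labels = ["spam", "spam", "ham", "ham"]

    model = make_pipeline(CountVectorizer(), MultinomialNB())
    model.fit(texts, labels)

    # With such a tiny dataset this is only illustrative, but it should
    # print something like ['spam' 'ham']
    print(model.predict(["free prize click now", "lunch meeting tomorrow"]))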

In part 4, which covers deep learning methods, you'll learn about modern neural network architectures that can be applied to solve NLP tasks. Thanks to their great power and flexibility, neural networks can be used to solve any of the aforementioned tasks in the course.

You'll learn about:

  • Feedforward Artificial Neural Networks (ANNs)
  • Embeddings
  • Convolutional Neural Networks (CNNs)
  • Recurrent Neural Networks (RNNs)


The study of RNNs will involve modern architectures such as the LSTM and GRU, which have been widely used by Google, Amazon, Apple, Facebook, etc. for difficult tasks such as language translation, speech recognition, and text-to-speech.
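
To make this concrete, here is a compact sketch of an LSTM text classifier in Tensorflow / Keras, in the spirit of this part of the course (the vocabulary size, sequence length, and layer sizes are arbitrary illustrative choices):

    # A compact LSTM-based binary text classifier in Tensorflow / Keras.
    import tensorflow as tf

    V = 10_000  # vocabulary size (arbitrary)
    T = 100     # padded sequence length (arbitrary)
    D = 32      # embedding dimension (arbitrary)
    M = 64      # LSTM hidden units (arbitrary)

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(T,)),
        tf.keras.layers.Embedding(V, D),  # word indices -> dense vectors
        tf.keras.layers.LSTM(M),          # final hidden state summarizes the sequence
        tf.keras.layers.Dense(1, activation="sigmoid"),  # binary classification
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.summary()
    # model.fit(X_train, y_train, ...) once the text is tokenized and padded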

Obviously, as the latest Transformers (such as BERT and GPT-3) are examples of deep neural networks, this part of the course is an essential prerequisite for understanding Transformers.

VIP-only: In the VIP version of this course, you will get your first taste of the power of Transformers. In this section, we will use the Hugging Face library to apply pre-trained NLP Transformer models to tasks such as:

  • Sentiment analysis
  • Converting text into embedding vectors for document retrieval
  • Named entity recognition (NER)
  • Text generation and language modeling
  • Masked language modeling and article spinning
  • Text summarization
  • Neural language translation
  • Question answering
  • Zero-shot classification


You'll notice the first few tasks have been seen earlier in the course. This is intentional.

This section will "connect the dots" between what you learned previously, and the state-of-the-art today.

To end the section, we will go beyond just the familiar tasks to look at some very impressive feats of the modern NLP era, like zero-shot classification.
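
To give you a sense of how accessible this has become, here is a quick sketch of the Hugging Face pipeline API for two of these tasks (the default pre-trained models are downloaded automatically on first use):

    # Sentiment analysis and zero-shot classification with Hugging Face pipelines.
    from transformers import pipeline

    # Sentiment analysis with a default pre-trained model
    clf = pipeline("sentiment-analysis")
    print(clf("This course connects the dots beautifully!"))
    # -> [{'label': 'POSITIVE', 'score': ...}]

    # Zero-shot classification: labels the model was never explicitly trained on
    zsc = pipeline("zero-shot-classification")
    print(zsc(
        "The new GPU doubles our training throughput",
        candidate_labels=["technology", "sports", "politics"],
    ))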

MORE BONUS CONTENT

This VIP section will contain even more content than what was included in the original VIP section (released elsewhere). In particular, you will get the following extra bonus notebooks:

  • Stock Movement Prediction Using News
  • LSA / LSI for Recommendations
  • LSA / LSI for Classification (Feature Engineering)
  • LSA / LSI for Text Summarization
  • LSA / LSI for Topic Modeling
  • Article spinner (masked language model) with LSTMs
  • Text generator (forward language model) with LSTMs
  • CNN for POS Tagging with custom loss for masking


The final notebooks, which show how to build an article spinner and seq2seq model with LSTMs, will help to "bridge the gap" between RNNs and Transformers. Specifically, masked language modeling is a training objective for some Transformers, while seq2seq introduces the "encoder-decoder" paradigm.

Thank you for reading and I hope to see you soon!

Testimonials and Success Stories


I am one of your students. Yesterday, I presented my paper at ICCV 2019. You have a significant part in this, so I want to sincerely thank you for your in-depth guidance through the puzzle of deep learning. Please keep making awesome courses that teach us!

I just watched your short video on “Predicting Stock Prices with LSTMs: One Mistake Everyone Makes.” Giggled with delight.

You probably already know this, but some of us really and truly appreciate you. BTW, I spent a reasonable amount of time making a learning roadmap based on your courses and have started the journey.

Looking forward to your new stuff.

Thank you for doing this! I wish everyone who calls themselves a Data Scientist would take the time to do this, either as a refresher or to learn the material. I have had to work with so many people in prior roles who wanted to jump right into machine learning on my teams and didn't even understand the first thing about the basics you have in here!!

I am signing up so that I have an easy refresher when needed and to see what you consider important, as well as to support your great work. Thank you.

Thank you, I think you have opened my eyes. I was using APIs to implement deep learning algorithms, and each time I felt I was missing out on some things. So thank you very much.

I have been intending to send you an email expressing my gratitude for the work that you have done to create all of these data science courses in Machine Learning and Artificial Intelligence. I have been looking long and hard for courses that have mathematical rigor relative to the application of the ML & AI algorithms, as opposed to just exhibiting some 'canned routine' and then, voila, here is your neural network or logistic regression. ...


I have now taken a few classes from some well-known AI profs at Stanford (Andrew Ng, Christopher Manning, …) with an overall average mark in the mid-90s. Just so you know, you are as good as any of them. But I hope that you already know that.

I wish you a happy and safe holiday season. I am glad you chose to share your knowledge with the rest of us.

Hi Sir, I am a student from India. I've been wanting to write a note to thank you for the courses that you've made, because they have changed my career. I wanted to work in the field of data science but didn't have proper guidance; then I stumbled upon your "Logistic Regression" course in March, and since then there's been no looking back. I learned ANNs, CNNs, RNNs, Tensorflow, NLP and whatnot by going through your lectures. The knowledge that I gained enabled me to get a job as a Business Technology Analyst at one of my dream firms, even in the midst of this pandemic. For that, I shall always be grateful to you. Please keep making more courses with the level of detail that you do, in low-level libraries like Theano.

I just wanted to reach out and thank you for your most excellent course that I am nearing finishing.

And, I couldn't agree more with some of your "rants", and found myself nodding vigorously!

You are an excellent teacher, and a rare breed.

And your courses are, frankly, more digestible and teach a student far more than some of the top-tier Ivy League courses I have taken in the past.

(I plan to go through many more courses, one by one!)

I know you must be deluged with complaints in spite of having the best content around. That's just human nature.

Also, satisfied people rarely take the time to write, so I thought I would write in for a change. :)

Hello, Lazy Programmer!

In the process of completing my Master’s at Hunan University, China, I am writing this feedback to you in order to express my deep gratitude for all the knowledge and skills I have obtained studying your courses and following your recommendations.

The first course of yours I took was on Convolutional Neural Networks ("Deep Learning p.5", as far as I remember). Answering one of my questions on the Q&A board, you suggested I should start from the beginning: the Linear and Logistic Regression courses. Although I assumed I already knew many of the basics at that time, I overcame my "pride" and decided to start my journey in Deep Learning from scratch. ...


By the way, in case you are interested to hear: I used the HMM classification, just as it was in your course (95% of the script, with a few small adjustments of mine), for the customer-care department of a big, well-known fintech company, to predict who will call them, so they can call the customer before the rush hours and improve the service. Instead of a poem, I had a sequence of the last 24 hours' events for each customer, like "Loaded money", "Usage in the food service", "Entering the app", "Trying to change the password", etc. The label was called or didn't call. The outcome was great. They use it for their VIP customers. Our data science department and I got a lot of praise.

Lectures

Introduction

2 Lectures · 15min
  1. Introduction and Outline (10:40) (FREE preview available)
  2. Are You Beginner, Intermediate, or Advanced? All are OK! (05:06)

Getting Set Up

4 Lectures · 18min
  1. Where to get the code and data - instant access (01:42)
  2. How to use Github & Extra Coding Tips (Optional) (11:12)
  3. How to Succeed in this Course (03:04)
  4. Temporary 403 Errors (02:58)

Vector Models and Text Preprocessing

22 Lectures · 03hr 32min
  1. Vector Models & Text Preprocessing Intro (03:40)
  2. Basic Definitions for NLP (05:01)
  3. What is a Vector? (10:41)
  4. Bag of Words (02:32)
  5. Count Vectorizer (Theory) (13:45)
  6. Tokenization (14:44)
  7. Stopwords (04:51)
  8. Stemming and Lemmatization (12:03)
  9. Stemming and Lemmatization Demo (13:26)
  10. Count Vectorizer (Code) (15:43)
  11. Vector Similarity (11:35)
  12. TF-IDF (Theory) (14:16)
  13. (Interactive) Recommender Exercise Prompt (02:36)
  14. TF-IDF (Code) (20:25)
  15. Word-to-Index Mapping (10:54)
  16. How to Build TF-IDF From Scratch (15:08)
  17. Neural Word Embeddings (10:15)
  18. Neural Word Embeddings Demo (11:25)
  19. Vector Models & Text Preprocessing Summary (03:50)
  20. Text Summarization Preview (01:21)
  21. How To Do NLP In Other Languages (10:41)
  22. Suggestion Box (03:10)

Probabilistic Models (Introduction)

1 Lecture · 04min
  1. Probabilistic Models (Introduction) (04:46)

Markov Models (Intermediate)

13 Lectures · 01hr 47min
  1. Markov Models Section Introduction (02:42)
  2. The Markov Property (07:34)
  3. The Markov Model (12:30)
  4. Probability Smoothing and Log-Probabilities (07:50)
  5. Building a Text Classifier (Theory) (06:33)
  6. Building a Text Classifier (Exercise Prompt) (07:29)
  7. Building a Text Classifier (Code pt 1) (10:32)
  8. Building a Text Classifier (Code pt 2) (12:06)
  9. Language Model (Theory) (10:15)
  10. Language Model (Exercise Prompt) (06:52)
  11. Language Model (Code pt 1) (10:45)
  12. Language Model (Code pt 2) (09:25)
  13. Markov Models Section Summary (03:00)

Article Spinner (Intermediate)

6 Lectures · 51min
  1. Article Spinning - Problem Description (07:55)
  2. Article Spinning - N-Gram Approach (04:24)
  3. Article Spinner Exercise Prompt (05:45)
  4. Article Spinner in Python (pt 1) (17:31)
  5. Article Spinner in Python (pt 2) (10:00)
  6. Case Study: Article Spinning Gone Wrong (05:42)

Cipher Decryption (Advanced)

13 Lectures · 01hr 31min
  1. Section Introduction (04:50)
  2. Ciphers (03:59)
  3. Language Models (Review) (16:06)
  4. Genetic Algorithms (21:23)
  5. Code Preparation (04:46)
  6. Code pt 1 (03:06)
  7. Code pt 2 (07:20)
  8. Code pt 3 (04:52)
  9. Code pt 4 (04:03)
  10. Code pt 5 (07:11)
  11. Code pt 6 (05:25)
  12. Cipher Decryption - Additional Discussion (02:56)
  13. Section Conclusion (06:00)

Machine Learning Models (Introduction)

1 Lecture · 05min
  1. Machine Learning Models (Introduction) (05:50)

Spam Detection

6 Lectures · 01hr 00min
  1. Spam Detection - Problem Description (06:32)
  2. Naive Bayes Intuition (11:37)
  3. Spam Detection - Exercise Prompt (02:07)
  4. Aside: Class Imbalance, ROC, AUC, and F1 Score (pt 1) (12:25)
  5. Aside: Class Imbalance, ROC, AUC, and F1 Score (pt 2) (11:02)
  6. Spam Detection in Python (16:23)

Sentiment Analysis

7 Lectures · 01hr 03min
  1. Sentiment Analysis - Problem Description (07:27)
  2. Logistic Regression Intuition (pt 1) (17:36)
  3. Multiclass Logistic Regression (pt 2) (06:52)
  4. Logistic Regression Training and Interpretation (pt 3) (08:15)
  5. Sentiment Analysis - Exercise Prompt (04:00)
  6. Sentiment Analysis in Python (pt 1) (10:38)
  7. Sentiment Analysis in Python (pt 2) (08:28)

Text Summarization

10 Lectures · 01hr 09min
  1. Text Summarization Section Introduction (05:34)
  2. Text Summarization Using Vectors (05:30)
  3. Text Summarization Exercise Prompt (01:50)
  4. Text Summarization in Python (12:40)
  5. TextRank Intuition (08:03)
  6. TextRank - How It Really Works (Advanced) (10:50)
  7. TextRank Exercise Prompt (Advanced) (01:23)
  8. TextRank in Python (Advanced) (14:33)
  9. Text Summarization in Python - The Easy Way (Beginner) (06:06)
  10. Text Summarization Section Summary (03:22)

Topic Modeling

9 Lectures · 01hr 03min
  1. Topic Modeling Section Introduction (03:06)
  2. Latent Dirichlet Allocation (LDA) - Essentials (10:54)
  3. LDA - Code Preparation (03:41)
  4. LDA - Maybe Useful Picture (Optional) (01:52)
  5. Latent Dirichlet Allocation (LDA) - Intuition (Advanced) (14:54)
  6. Topic Modeling with Latent Dirichlet Allocation (LDA) in Python (11:38)
  7. Non-Negative Matrix Factorization (NMF) Intuition (10:21)
  8. Topic Modeling with Non-Negative Matrix Factorization (NMF) in Python (05:33)
  9. Topic Modeling Section Summary (01:37)

Latent Semantic Analysis (Latent Semantic Indexing)

5 Lectures · 39min
  1. LSA / LSI Section Introduction (04:06)
  2. SVD (Singular Value Decomposition) Intuition (12:11)
  3. LSA / LSI: Applying SVD to NLP (07:46)
  4. Latent Semantic Analysis / Latent Semantic Indexing in Python (09:15)
  5. LSA / LSI Exercises (06:00)

Deep Learning (Introduction)

1 Lecture · 04min
  1. Deep Learning Introduction (Intermediate-Advanced) (04:57)

The Neuron

7 Lectures · 58min
  1. The Neuron - Section Introduction (02:20)
  2. Fitting a Line (14:23)
  3. Classification Code Preparation (07:20)
  4. Text Classification in Tensorflow (12:09)
  5. The Neuron (09:58)
  6. How does a model learn? (10:53)
  7. The Neuron - Section Summary (01:51)

Feedforward Artificial Neural Networks

15 Lectures · 02hr 02min
  1. ANN - Section Introduction (06:59)
  2. Forward Propagation (09:40)
  3. The Geometrical Picture (09:43)
  4. Activation Functions (17:18)
  5. Multiclass Classification (08:41)
  6. ANN Code Preparation (04:35)
  7. Text Classification ANN in Tensorflow (05:43)
  8. Text Preprocessing Code Preparation (11:33)
  9. Text Preprocessing in Tensorflow (05:30)
  10. Embeddings (10:13)
  11. CBOW (Advanced) (04:07)
  12. CBOW Exercise Prompt (00:57)
  13. CBOW in Tensorflow (Advanced) (19:24)
  14. ANN - Section Summary (01:32)
  15. Aside: How to Choose Hyperparameters (Optional) (06:25)

Convolutional Neural Networks

9 Lectures · 01hr 25min
  1. CNN - Section Introduction (04:34)
  2. What is Convolution? (16:38)
  3. What is Convolution? (Pattern Matching) (05:56)
  4. What is Convolution? (Weight Sharing) (06:41)
  5. Convolution on Color Images (15:58)
  6. CNN Architecture (20:58)
  7. CNNs for Text (08:07)
  8. Convolutional Neural Network for NLP in Tensorflow (05:31)
  9. CNN - Section Summary (01:27)

Recurrent Neural Networks

12 Lectures · 01hr 47min
  1. RNN - Section Introduction (04:46)
  2. Simple RNN / Elman Unit (pt 1) (09:20)
  3. Simple RNN / Elman Unit (pt 2) (09:42)
  4. RNN Code Preparation (09:45)
  5. RNNs: Paying Attention to Shapes (08:26)
  6. GRU and LSTM (pt 1) (17:35)
  7. GRU and LSTM (pt 2) (11:36)
  8. RNN for Text Classification in Tensorflow (05:56)
  9. Parts-of-Speech (POS) Tagging in Tensorflow (19:50)
  10. Named Entity Recognition (NER) in Tensorflow (05:13)
  11. Exercise: Return to CNNs (Advanced) (03:19)
  12. RNN - Section Summary (01:58)

Transformers with Hugging Face

13 Lectures · 02hr 15min
  1. Transformers Section Introduction (10:14)
  2. From RNNs to Attention and Transformers - Intuition (17:01)
  3. Sentiment Analysis (10:32)
  4. Sentiment Analysis in Python (17:00)
  5. Text Generation (10:47)
  6. Text Generation in Python (11:47)
  7. Masked Language Modeling (Article Spinner) (11:37)
  8. Masked Language Modeling (Article Spinner) in Python (08:26)
  9. Question Answering (07:20)
  10. Question Answering in Python (06:14)
  11. Zero-Shot Classification (05:30)
  12. Zero-Shot Classification in Python (13:47)
  13. Transformers Section Summary (04:53)

Course Conclusion

2 Lectures · 13min
  1. What to Learn Next (06:27)
  2. Where is BERT, ChatGPT, GPT-4, ...? (07:01)

Setting Up Your Environment (Appendix/FAQ by Student Request)

3 Lectures · 42min
  1. Pre-Installation Check (04:13)
  2. Anaconda Environment Setup (20:21)
  3. How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:33)

Extra Help With Python Coding for Beginners (Appendix/FAQ by Student Request)

3 Lectures · 37min
  1. How to Code Yourself (part 1) (15:55)
  2. How to Code Yourself (part 2) (09:24)
  3. Proof that using Jupyter Notebook is the same as not using it (12:29)

Effective Learning Strategies for Machine Learning (Appendix/FAQ by Student Request)

4 Lectures · 59min
  1. How to Succeed in this Course (Long Version) (10:25)
  2. Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced? (22:05)
  3. What order should I take your courses in? (part 1) (11:19)
  4. What order should I take your courses in? (part 2) (16:07)

Appendix / FAQ Finale

2 Lectures · 08min
  1. What is the Appendix? (02:48)
  2. Where to get discount coupons and FREE deep learning material (05:49)

Extras

  • GloVe Word Embeddings Demo
  • Stock Movement Prediction Using News
  • LSA / LSI for Recommendations
  • LSA / LSI for Classification (Feature Engineering)
  • LSA / LSI for Topic Modeling
  • LSA / LSI for Text Summarization (Method 1)
  • LSA / LSI for Text Summarization (Method 2)
  • LSTM for Text Generation Notebook
  • Language Model Training Efficiency
  • Masked language model with LSTM Notebook
  • CNN POS Tagging Custom Loss