Data Science: Supervised Machine Learning in Python

Complete Guide to Implementing Classic Machine Learning Algorithms in Python and with Scikit-Learn

Register for this Course

$29.99 $199.99 USD 85% OFF!


Course Data

Lectures: 53
Length: 6h 22m
Skill Level: All Levels
Languages: English
Includes: Lifetime access, certificate of completion (shareable on LinkedIn, Facebook, and Twitter), Q&A forum

Course Description

In recent years, we've seen a resurgence in AI, or artificial intelligence, and machine learning.

Machine learning has led to some amazing results, like being able to analyze medical images and predict diseases on par with human experts.

Google's AlphaGo program was able to beat a world champion at the strategy game Go using deep reinforcement learning.

Machine learning is even being used to program self-driving cars, which is going to change the automotive industry forever. Imagine a world with drastically fewer car accidents, simply by removing the element of human error.

Google famously announced that they are now "machine learning first", meaning that machine learning is going to get a lot more attention and will drive innovation in the coming years. It's already embedded in all sorts of different products.

Machine learning is used in many industries, like finance, online advertising, medicine, and robotics.

It is a widely applicable tool that will benefit you no matter what industry you're in, and it will also open up a ton of career opportunities once you get good.

Machine learning also raises some philosophical questions. Are we building a machine that can think? What does it mean to be conscious? Will computers one day take over the world?

In this course, we are first going to discuss the K-Nearest Neighbor algorithm. It’s extremely simple and intuitive, and it’s a great first classification algorithm to learn. After we discuss the concepts and implement it in code, we’ll look at some ways in which KNN can fail.
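To give a taste of how simple the idea is, here is a minimal from-scratch sketch (my own illustration, not the course's code) that classifies a point by majority vote among its K closest training points:

    import numpy as np
    from collections import Counter

    def knn_predict(X_train, y_train, x, k=3):
        # X_train: (N, D) numpy array, y_train: length-N label array, x: single (D,) point
        # Euclidean distance from x to every training point
        distances = np.linalg.norm(X_train - x, axis=1)
        # Indices of the k nearest neighbors
        nearest = np.argsort(distances)[:k]
        # Majority vote among the neighbors' labels
        return Counter(y_train[nearest]).most_common(1)[0][0]

Notice that "training" is just storing the data; all the work happens at prediction time, which is one reason KNN can become slow on large datasets.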

It’s important to know both the advantages and disadvantages of each algorithm we look at.

Next we’ll look at the Naive Bayes Classifier and the General Bayes Classifier. This is a very interesting algorithm to look at because it is grounded in probability.

We’ll see how we can transform the Bayes Classifier into a linear and quadratic classifier to speed up our calculations.
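As a rough preview (a sketch of my own, not the course material), scikit-learn already packages these ideas: GaussianNB for Naive Bayes, and LinearDiscriminantAnalysis / QuadraticDiscriminantAnalysis for the linear and quadratic variants, all sharing the same fit/predict interface:

    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB
    from sklearn.discriminant_analysis import (
        LinearDiscriminantAnalysis,
        QuadraticDiscriminantAnalysis,
    )

    # Small built-in digits dataset as a stand-in for MNIST
    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    for model in (GaussianNB(), LinearDiscriminantAnalysis(), QuadraticDiscriminantAnalysis()):
        # QDA may warn about collinear pixel features on this data, but it still runs
        model.fit(X_train, y_train)
        print(type(model).__name__, model.score(X_test, y_test))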

Next we’ll look at the famous Decision Tree algorithm. This is the most complex of the algorithms we’ll study, and most courses you’ll come across won’t implement it. We will, since I believe implementation is good practice.
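For a sense of what’s involved (a minimal sketch with my own naming, not the course’s implementation), the key quantities are the entropy of a set of labels and the information gain of a candidate split:

    import numpy as np

    def entropy(labels):
        # H(Y) = -sum_k p_k * log2(p_k) over the class proportions p_k
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def information_gain(labels, left_mask):
        # Entropy reduction from splitting `labels` into left/right groups
        left, right = labels[left_mask], labels[~left_mask]
        n = len(labels)
        after = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
        return entropy(labels) - after

A tree is grown by repeatedly picking the split that maximizes this gain, which is what the "Information Entropy" and "Maximizing Information Gain" lectures below build up to.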

The last algorithm we’ll look at is the Perceptron algorithm. Perceptrons are the ancestor of neural networks and deep learning, so they are important to study in the context of machine learning.
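As a preview of the learning rule (a simplified sketch, not the course’s exact code), the classic perceptron just nudges its weight vector whenever it misclassifies a point:

    import numpy as np

    def train_perceptron(X, y, epochs=100, lr=1.0):
        # Assumes binary labels y in {-1, +1}; the bias is handled by appending a constant 1 feature
        Xb = np.column_stack([X, np.ones(len(X))])
        w = np.zeros(Xb.shape[1])
        for _ in range(epochs):
            for xi, yi in zip(Xb, y):
                if yi * xi.dot(w) <= 0:   # point is misclassified (or on the boundary)
                    w += lr * yi * xi     # classic perceptron update
        return w

Because the decision boundary is a hyperplane, a single perceptron cannot solve problems like XOR, which is exactly the kind of limitation the lectures explore.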

Once we’ve studied these algorithms, we’ll move on to more practical machine learning topics: hyperparameters, cross-validation, feature extraction, feature selection, and multiclass classification.
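For example (a small sketch of my own using scikit-learn, with the built-in digits data standing in for the course’s datasets), cross-validation is the standard way to choose a hyperparameter like K in KNN:

    from sklearn.datasets import load_digits
    from sklearn.model_selection import GridSearchCV
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_digits(return_X_y=True)

    # Try several values of the hyperparameter K using 5-fold cross-validation
    search = GridSearchCV(KNeighborsClassifier(), {"n_neighbors": [1, 3, 5, 7, 9]}, cv=5)
    search.fit(X, y)
    print(search.best_params_, search.best_score_)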

We’ll do a comparison with deep learning so you understand the pros and cons of each approach.

We’ll discuss the Sci-Kit Learn library, because even though implementing your own algorithms is fun and educational, you should use optimized and well-tested code in your actual work.
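To illustrate the point (again a sketch, not the course’s code), the from-scratch perceptron sketched above shrinks to a few lines with the library, and the same fit/predict pattern applies to KNN, Naive Bayes, and decision trees:

    from sklearn.datasets import load_digits
    from sklearn.linear_model import Perceptron
    from sklearn.model_selection import train_test_split

    X, y = load_digits(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = Perceptron()
    model.fit(X_train, y_train)          # training
    print(model.score(X_test, y_test))   # accuracy on held-out data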

We’ll cap things off with a very practical, real-world example by writing a web service that runs a machine learning model and makes predictions. This is something that real companies do and make money from.
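As a rough sketch of the idea (assuming Flask and a model pickled to a hypothetical model.pkl, neither of which is necessarily what the course uses), a prediction endpoint can be as small as this:

    import pickle

    import numpy as np
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # Load a previously trained scikit-learn model (hypothetical file name)
    with open("model.pkl", "rb") as f:
        model = pickle.load(f)

    @app.route("/predict", methods=["POST"])
    def predict():
        # Expect JSON like {"features": [[5.1, 3.5, 1.4, 0.2]]}
        features = np.array(request.json["features"])
        return jsonify(predictions=model.predict(features).tolist())

    if __name__ == "__main__":
        app.run(port=8888)

A client then just POSTs feature vectors to /predict and gets predictions back as JSON.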

All the materials for this course are FREE. You can download and install Python, Numpy, and Scipy with simple commands on Windows, Linux, or Mac.

This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts"; it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.



Suggested Prerequisites:

  • calculus
  • linear algebra
  • geometry
  • probability
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations, loading a CSV file
  • linear regression would be useful


Tips for success:

  • Use the video speed changer! Personally, I like to watch at 2x.
  • Take handwritten notes. This will drastically increase your ability to retain the information.
  • Write down the equations. If you don't, I guarantee it will just look like gibberish.
  • Ask lots of questions on the discussion board. The more the better!
  • Don't get discouraged if you can't solve every exercise right away. Sometimes it'll take hours, days, or maybe weeks!
  • Write code yourself; this is an applied course! Don't be a "couch potato".

Testimonials and Success Stories


I am one of your students. Yesterday, I presented my paper at ICCV 2019. You have a significant part in this, so I want to sincerely thank you for your in-depth guidance to the puzzle of deep learning. Please keep making awesome courses that teach us!

I just watched your short video on “Predicting Stock Prices with LSTMs: One Mistake Everyone Makes.” Giggled with delight.

You probably already know this, but some of us really and truly appreciate you. BTW, I spent a reasonable amount of time making a learning roadmap based on your courses and have started the journey.

Looking forward to your new stuff.

Thank you for doing this! I wish everyone who calls themselves a Data Scientist would take the time to do this, either as a refresher or to learn the material. I have had to work with so many people in prior roles that wanted to jump right into machine learning on my teams and didn't even understand the first thing about the basics you have in here!!

I am signing up so that I have an easy refresher when needed and can see what you consider important, as well as to support your great work. Thank you.

Thank you, I think you have opened my eyes. I was using APIs to implement deep learning algorithms and each time I felt I was missing out on some things. So thank you very much.

I have been intending to send you an email expressing my gratitude for the work that you have done to create all of these data science courses in Machine Learning and Artificial Intelligence. I have been looking long and hard for courses that have mathematical rigor relative to the application of the ML & AI algorithms, as opposed to just exhibiting some 'canned routine' and then voila, here is your neural network or logistic regression. ...


I have now taken a few classes from some well-known AI profs at Stanford (Andrew Ng, Christopher Manning, …) with an overall average mark in the mid-90s. Just so you know, you are as good as any of them. But I hope that you already know that.

I wish you a happy and safe holiday season. I am glad you chose to share your knowledge with the rest of us.

Hi Sir, I am a student from India. I've been wanting to write a note to thank you for the courses that you've made, because they have changed my career. I wanted to work in the field of data science but I did not have proper guidance, but then I stumbled upon your "Logistic Regression" course in March and since then, there's been no looking back. I learned ANNs, CNNs, RNNs, Tensorflow, NLP and whatnot by going through your lectures. The knowledge that I gained enabled me to get a job as a Business Technology Analyst at one of my dream firms even in the midst of this pandemic. For that, I shall always be grateful to you. Please keep making more courses with the level of detail that you do in low-level libraries like Theano.

I just wanted to reach out and thank you for your most excellent course that I am nearing finishing.

And, I couldn't agree more with some of your "rants", and found myself nodding vigorously!

You are an excellent teacher, and a rare breed.

And your courses are, frankly, more digestible and teach a student far more than some of the top-tier Ivy League courses I have taken in the past.

(I plan to go through many more courses, one by one!)

I know you must be deluged with complaints in spite of the best content around. That's just human nature.

Also, satisfied people rarely take the time to write, so I thought I would write in for a change. :)

Hello, Lazy Programmer!

In the process of completing my Master’s at Hunan University, China, I am writing this feedback to you in order to express my deep gratitude for all the knowledge and skills I have obtained studying your courses and following your recommendations.

The first course of yours I took was on Convolutional Neural Networks ("Deep Learning p.5", as far as I remember). Answering one of my questions on the Q&A board, you suggested I should start from the beginning – the Linear and Logistic Regression courses. Even though I assumed I already knew many of the basics at that time, I overcame my "pride" and decided to start my journey in Deep Learning from scratch. ...


By the way, if you are interested to hear: I used the HMM classification, as it was in your course (95% of the script; I made small adjustments), for the customer-care department of a big, well-known fintech company, to predict who will call them, so they can call the customer before the rush hours and improve the service. Instead of a poem, I had a sequence of the customer's events from the last 24 hours, like "Loaded money", "Usage in the food service", "Entering the app", "Trying to change the password", etc. The label was whether the customer called or didn't call. The outcome was great. They use it for their VIP customers. Our data science department and I got a lot of praise.

Lectures

Introduction and Review

4 Lectures · 29min
  1. Introduction and Outline (09:34) (FREE preview available)
  2. How to Succeed in this Course (05:52)
  3. Where to get the Code and Data (10:45)
  4. Review of Important Concepts (03:28)

K-Nearest Neighbor

9 Lectures · 38min
  1. K-Nearest Neighbor Intuition (04:11)
  2. K-Nearest Neighbor Concepts (05:02)
  3. KNN in Code with MNIST (07:41)
  4. When KNN Can Fail (03:50)
  5. KNN for the XOR Problem (02:05)
  6. KNN for the Donut Problem (02:37)
  7. Effect of K (05:48)
  8. KNN Exercise (04:05)
  9. Suggestion Box (03:03)

Naive Bayes and Bayes Classifiers

9 Lectures · 01hr 02min
  1. Bayes Classifier Intuition (Continuous) (18:16)
  2. Bayes Classifier Intuition (Discrete) (10:58)
  3. Naive Bayes (09:01)
  4. Naive Bayes Handwritten Example (03:28)
  5. Naive Bayes in Code with MNIST (05:57)
  6. Non-Naive Bayes (04:05)
  7. Bayes Classifier in Code with MNIST (02:03)
  8. Linear Discriminant Analysis (LDA) and Quadratic Discriminant Analysis (QDA) (06:07)
  9. Generative vs Discriminative Models (02:47)

Decision Trees

6 Lectures · 38min
  1. Decision Tree Intuition (04:48)
  2. Decision Tree Basics (04:58)
  3. Information Entropy (03:59)
  4. Maximizing Information Gain (07:58)
  5. Choosing the Best Split (04:02)
  6. Decision Tree in Code (13:10)

Perceptrons

4 Lectures · 19min
  1. Perceptron Concepts (07:07)
  2. Perceptron in Code (05:26)
  3. Perceptron for MNIST and XOR (03:16)
  4. Perceptron Loss Function (04:01)

Practical Machine Learning

6 Lectures · 31min
  1. Hyperparameters and Cross-Validation (04:16)
  2. Feature Extraction and Feature Selection (03:54)
  3. Comparison to Deep Learning (04:40)
  4. Multiclass Classification (03:20)
  5. Sci-Kit Learn (09:03)
  6. Regression with Sci-Kit Learn is Easy (05:50)

Building a Machine Learning Web Service

2 Lectures · 10min
  1. Building a Machine Learning Web Service Concepts (04:12)
  2. Building a Machine Learning Web Service Code (06:12)

Conclusion

1 Lecture · 02min
  1. What’s Next? Support Vector Machines and Ensemble Methods (e.g. Random Forest) (02:51)

Setting Up Your Environment (Appendix/FAQ by Student Request)

2 Lectures · 37min
  1. Anaconda Environment Setup (20:21)
  2. How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:33)

Extra Help With Python Coding for Beginners (Appendix/FAQ by Student Request)

4 Lectures · 42min
  1. How to Code Yourself (part 1) (15:55)
  2. How to Code Yourself (part 2) (09:24)
  3. Proof that using Jupyter Notebook is the same as not using it (12:29)
  4. Python 2 vs Python 3 (04:38)

Effective Learning Strategies for Machine Learning (Appendix/FAQ by Student Request)

4 Lectures · 59min
  1. How to Succeed in this Course (Long Version) (10:25)
  2. Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced? (22:05)
  3. What order should I take your courses in? (part 1) (11:19)
  4. What order should I take your courses in? (part 2) (16:07)

Appendix / FAQ Finale

2 Lectures · 08min
  1. What is the Appendix? (02:48)
  2. Where to get discount coupons and FREE deep learning material (05:31)

Extras

  • Creating Linear Models out of Gaussian, Bernoulli, and Multinomial Naive Bayes