Deep Learning: Advanced Computer Vision (GANs, SSD, +More!)

VGG, ResNet, Inception, SSD, RetinaNet, Neural Style Transfer, GANs +More! In Tensorflow, Keras, and Python

Register for this Course

$29.99 $199.99 USD 85% OFF!


Course Data

Lectures: 147
Length: 19h 08m
Skill Level: All Levels
Languages: English
Includes: Lifetime access, certificate of completion (shareable on LinkedIn, Facebook, and Twitter), Q&A forum

Course Description

This is one of the most exciting courses I’ve done and it really shows how fast and how far deep learning has come over the years.

When I first started my deep learning series, I didn’t ever consider that I’d make two courses on convolutional neural networks.

I think you’ll find that this course is so different from the previous one, you will be impressed at just how much material we have to cover.

Let me give you a quick rundown of what this course is all about:

We’re going to bridge the gap between the basic CNN architecture you already know and love and modern, novel architectures such as VGG, ResNet, and Inception (named after the movie, which, by the way, is also great!).

We’re going to apply these to images of blood cells, and create a system that is a better medical expert than either you or I. This brings up a fascinating idea: that the doctors of the future are not humans, but robots.

In this course, you’ll see how we can turn a CNN into an object detection system that not only classifies images but also locates each object in an image and predicts its label.

You can imagine that such a task is a basic prerequisite for self-driving vehicles. (They must be able to detect cars, pedestrians, bicycles, traffic lights, etc., in real time.)

We’ll be looking at a state-of-the-art algorithm called SSD, which is both faster and more accurate than its predecessors.
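To give you a sense of what the end result looks like, here is a minimal sketch of running a pretrained SSD detector. The TensorFlow Hub handle, the image file name, and the output dictionary keys are assumptions based on the publicly available TF2 detection models, not the exact code we write in the course:

```python
# A hedged sketch of running a pretrained SSD detector from TensorFlow Hub.
# The hub handle, the image file name, and the output dictionary keys are
# assumptions based on the public TF2 detection models, not the course's code.
import numpy as np
import tensorflow as tf
import tensorflow_hub as hub

detector = hub.load("https://tfhub.dev/tensorflow/ssd_mobilenet_v2/2")

# Read an image as a uint8 tensor of shape [1, height, width, 3]
img = tf.io.decode_jpeg(tf.io.read_file("street.jpg"))
result = detector(img[tf.newaxis, ...])

boxes = result["detection_boxes"][0].numpy()    # normalized [ymin, xmin, ymax, xmax]
scores = result["detection_scores"][0].numpy()
classes = result["detection_classes"][0].numpy().astype(int)  # COCO label ids

for box, score, cls in zip(boxes, scores, classes):
    if score > 0.5:  # keep only confident detections
        print(f"class {cls} at {np.round(box, 2)} (score {score:.2f})")
```

In the course we go further than this: running detection on video, and training on custom datasets.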

Another very popular computer vision task that makes use of CNNs is called neural style transfer.

This is where you take one image, called the content image, and another image, called the style image, and combine them to make an entirely new image: it’s as if you hired a painter to paint the content of the first image in the style of the second. Unlike a human painter, this can be done in a matter of seconds.
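To make that concrete, here is a minimal sketch of the idea, assuming a pretrained VGG19 as the feature extractor. The layer choices, the style weight, the step count, and the file names are illustrative only; the course builds this up step by step:

```python
# A minimal sketch of the neural style transfer idea, assuming a pretrained
# VGG19 feature extractor. Layer choices, the style weight, the step count,
# and the file names ("content.jpg", "style.jpg") are illustrative only.
import tensorflow as tf
from tensorflow.keras.applications import vgg19

def load_image(path, size=(224, 224)):
    img = tf.keras.utils.load_img(path, target_size=size)
    x = tf.keras.utils.img_to_array(img)[None, ...]
    return vgg19.preprocess_input(x)

content = tf.constant(load_image("content.jpg"))
style = tf.constant(load_image("style.jpg"))

# One deep layer for content, several shallower layers for style
content_layers = ["block5_conv2"]
style_layers = ["block1_conv1", "block2_conv1", "block3_conv1", "block4_conv1"]
base = vgg19.VGG19(weights="imagenet", include_top=False)
extractor = tf.keras.Model(
    base.input, [base.get_layer(n).output for n in content_layers + style_layers]
)

def gram(x):
    # Gram matrix: correlations between feature maps, used as the "style" signature
    _, h, w, c = x.shape
    f = tf.reshape(x, (-1, c))
    return tf.matmul(f, f, transpose_a=True) / tf.cast(h * w, tf.float32)

n_c = len(content_layers)
content_targets = extractor(content)[:n_c]
style_targets = [gram(f) for f in extractor(style)[n_c:]]

generated = tf.Variable(content)  # start the optimization from the content image
opt = tf.keras.optimizers.Adam(learning_rate=5.0)

@tf.function
def train_step():
    with tf.GradientTape() as tape:
        feats = extractor(generated)
        content_loss = tf.add_n([tf.reduce_mean((f - t) ** 2)
                                 for f, t in zip(feats[:n_c], content_targets)])
        style_loss = tf.add_n([tf.reduce_mean((gram(f) - t) ** 2)
                               for f, t in zip(feats[n_c:], style_targets)])
        loss = content_loss + 1e-2 * style_loss  # style weight is an arbitrary choice
    grads = tape.gradient(loss, generated)
    opt.apply_gradients([(grads, generated)])

for step in range(200):
    train_step()
```

The content loss keeps the generated image close to the content image’s deep features, while the Gram-matrix style loss matches the feature correlations of the style image.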

I will also introduce you to the now-famous GAN architecture (Generative Adversarial Networks), where you will learn some of the technology behind how neural networks are used to generate state-of-the-art, photo-realistic images.
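As a preview, here is a minimal GAN sketch in Keras: a simple fully-connected generator and discriminator trained on MNIST digits. The layer sizes, optimizer settings, and number of steps are illustrative assumptions, not the exact code used in the course:

```python
# A minimal GAN sketch in Keras: a fully-connected generator and discriminator
# trained on MNIST digits. The layer sizes, learning setup, and number of steps
# are illustrative assumptions, not the exact code used in the course.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

latent_dim = 100

# Generator: random noise vector -> 28x28 "fake" image with values in [-1, 1]
generator = models.Sequential([
    tf.keras.Input(shape=(latent_dim,)),
    layers.Dense(256, activation="relu"),
    layers.Dense(512, activation="relu"),
    layers.Dense(28 * 28, activation="tanh"),
    layers.Reshape((28, 28)),
])

# Discriminator: image -> probability that it is real
discriminator = models.Sequential([
    tf.keras.Input(shape=(28, 28)),
    layers.Flatten(),
    layers.Dense(512, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# Combined model: train the generator to fool the (frozen) discriminator
discriminator.trainable = False
gan = models.Sequential([generator, discriminator])
gan.compile(optimizer="adam", loss="binary_crossentropy")

(x_train, _), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train / 127.5 - 1.0  # scale pixels to [-1, 1] to match the tanh output

batch_size = 64
for step in range(1000):
    # 1) Train the discriminator on a half-real, half-fake batch
    real = x_train[np.random.randint(0, len(x_train), batch_size)]
    noise = np.random.randn(batch_size, latent_dim)
    fake = generator.predict(noise, verbose=0)
    discriminator.train_on_batch(real, np.ones((batch_size, 1)))
    discriminator.train_on_batch(fake, np.zeros((batch_size, 1)))
    # 2) Train the generator (through the combined model) to make fakes look real
    noise = np.random.randn(batch_size, latent_dim)
    gan.train_on_batch(noise, np.ones((batch_size, 1)))
```

The two networks are trained in alternation: the discriminator learns to separate real images from generated ones, while the generator learns to fool it.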

We also implement object localization, an essential first step toward implementing a full object detection system.

Finally, I teach you about the controversial technology behind facial recognition: how to identify a person based on a photo of their face.

I hope you’re excited to learn about these advanced applications of CNNs. I’ll see you in class!



AWESOME FACTS:

  • One of the major themes of this course is that we’re moving away from the CNN itself to systems involving CNNs.
  • Instead of focusing on the detailed inner workings of CNNs (which we've already done), we'll focus on high-level building blocks. The result? Almost zero math. (If that's what you're looking for, earlier courses in the series are math-heavy, which was required to understand the inner workings of these building blocks.)
  • Another result? No complicated low-level code such as that written in Tensorflow, Theano, or PyTorch (although some optional exercises may use them for the very advanced students). Most of the course will be in Keras, which means a lot of the tedious, repetitive stuff is written for you (see the short transfer-learning sketch just below for a taste).
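Here is a minimal transfer-learning sketch, assuming a pretrained VGG16 "body" plus a small new classification "head" in Keras. The image size, head size, class count, and data-loading call are hypothetical placeholders, shown only to illustrate how little code the high-level API requires:

```python
# A minimal transfer-learning sketch: pretrained VGG16 "body" + new Keras "head".
# The image size, head size, class count, and data-loading call are hypothetical.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

# Pretrained convolutional feature extractor (ImageNet weights), frozen
base = VGG16(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False

# New classification head for our own task
model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(4, activation="softmax"),  # an arbitrary class count for illustration
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# train_ds would be built from your own images, e.g. with
# tf.keras.utils.image_dataset_from_directory("path/to/images", image_size=(224, 224))
# model.fit(train_ds, epochs=5)
```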




Suggested Prerequisites:

  • Know how to build, train, and use a CNN using some library (preferably in Python)
  • Understand basic theoretical concepts behind convolution and neural networks
  • Decent Python coding skills, preferably in data science and the Numpy Stack




Tips for success:

  • Use the video speed changer! Personally, I like to watch at 2x.
  • Take handwritten notes. This will drastically increase your ability to retain the information.
  • Write down the equations. If you don't, I guarantee it will just look like gibberish.
  • Ask lots of questions on the discussion board. The more the better!
  • Don't get discouraged if you can't solve every exercise right away. Sometimes it'll take hours, days, or maybe weeks!
  • Write code yourself; this is an applied course! Don't be a "couch potato".

Testimonials and Success Stories


I am one of your students. Yesterday, I presented my paper at ICCV 2019. You have a significant part in this, so I want to sincerely thank you for your in-depth guidance to the puzzle of deep learning. Please keep making awesome courses that teach us!

I just watched your short video on “Predicting Stock Prices with LSTMs: One Mistake Everyone Makes.” Giggled with delight.

You probably already know this, but some of us really and truly appreciate you. BTW, I spent a reasonable amount of time making a learning roadmap based on your courses and have started the journey.

Looking forward to your new stuff.

Thank you for doing this! I wish everyone who calls themselves a Data Scientist would take the time to do this, either as a refresher or to learn the material. I have had to work with so many people in prior roles that wanted to jump right into machine learning on my teams and didn’t even understand the first thing about the basics you have in here!!

I am signing up so that I have an easy refresher when needed and to see what you consider important, as well as to support your great work. Thank you.

Thank you, I think you have opened my eyes. I was using APIs to implement deep learning algorithms and each time I felt I was missing out on some things. So thank you very much.

I have been intending to send you an email expressing my gratitude for the work that you have done to create all of these data science courses in Machine Learning and Artificial Intelligence. I have been looking long and hard for courses that have mathematical rigor relative to the application of the ML & AI algorithms as opposed to just exhibiting some 'canned routine' and then voilà, here is your neural network or logistic regression. ...


I have now taken a few classes from some well-known AI profs at Stanford (Andrew Ng, Christopher Manning, …) with an overall average mark in the mid-90s. Just so you know, you are as good as any of them. But I hope that you already know that.

I wish you a happy and safe holiday season. I am glad you chose to share your knowledge with the rest of us.

Hi Sir, I am a student from India. I've been wanting to write a note to thank you for the courses that you've made because they have changed my career. I wanted to work in the field of data science but didn't have proper guidance; then I stumbled upon your "Logistic Regression" course in March, and since then there's been no looking back. I learned ANNs, CNNs, RNNs, Tensorflow, NLP and whatnot by going through your lectures. The knowledge that I gained enabled me to get a job as a Business Technology Analyst at one of my dream firms even in the midst of this pandemic. For that, I shall always be grateful to you. Please keep making more courses with the level of detail that you do in low-level libraries like Theano.

I just wanted to reach out and thank you for your most excellent course that I am nearing finishing.

And, I couldn't agree more with some of your "rants", and found myself nodding vigorously!

You are an excellent teacher, and a rare breed.

And your courses are, frankly, more digestible and teach a student far more than some of the top-tier Ivy League courses I have taken in the past.

(I plan to go through many more courses, one by one!)

I know you must be deluged with complaints in spite of having the best content around. That's just human nature.

Also, satisfied people rarely take the time to write, so I thought I would write in for a change. :)

Hello, Lazy Programmer!

In the process of completing my Master’s at Hunan University, China, I am writing this feedback to you in order to express my deep gratitude for all the knowledge and skills I have obtained studying your courses and following your recommendations.

The first course of yours I took was on Convolutional Neural Networks (“Deep Learning p.5”, as far as I remember). Answering one of my questions on the Q&A board, you suggested I should start from the beginning – the Linear and Logistic Regression courses. Even though I assumed I already knew many of the basics at that time, I overcame my “pride” and decided to start my journey in Deep Learning from scratch. ...


By the way, if you are interested to hear: I used the HMM classification, just as it was in your course (95% of the script; I made small adjustments), for the Customer-Care department of a well-known fintech company, to predict who will call them, so they can call the customer before the rush hours and improve the service. Instead of a poem, I had a sequence of the customer's events from the last 24 hours, like: "Loaded money", "Usage in the food service", "Entering the app", "Trying to change the password", etc. The label was 'called' or 'didn't call'. The outcome was great. They use it for their VIP customers. Our data science department and I got a lot of praise.

Lectures

Welcome

5 Lectures · 28min
  1. Introduction (02:36) (FREE preview available)
  2. Outline and Perspective (06:50)
  3. Where to get the code and data - instant access (01:42)
  4. How to use Github & Extra Coding Tips (Optional) (11:12)
  5. How to Succeed in this Course (05:52)

Google Colab

3 Lectures · 33min
  1. Intro to Google Colab, how to use a GPU or TPU for free (12:32)
  2. Uploading your own data to Google Colab (11:41)
  3. Where can I learn about Numpy, Scipy, Matplotlib, Pandas, and Scikit-Learn? (08:54)

Machine Learning and Neurons

14 Lectures · 01hr 44min
  1. Review Section Introduction (02:37)
  2. What is Machine Learning? (14:26)
  3. Code Preparation (Classification Theory) (15:59)
  4. Beginner's Code Preamble (04:39)
  5. Classification Notebook (08:40)
  6. Code Preparation (Regression Theory) (07:18)
  7. Exercise: Predicting Diabetes Onset (02:34)
  8. Regression Notebook (10:34)
  9. Exercise: Real Estate Predictions (02:33)
  10. The Neuron (09:58)
  11. How does a model 'learn'? (10:53)
  12. Making Predictions (06:45)
  13. Saving and Loading a Model (04:27)
  14. Suggestion Box (03:10)

Feedforward Artificial Neural Networks

11 Lectures · 02hr 33min
  1. Artificial Neural Networks Section Introduction (06:00)
  2. Forward Propagation (09:40)
  3. The Geometrical Picture (09:43)
  4. Activation Functions (17:18)
  5. Multiclass Classification (08:41)
  6. How to Represent Images (12:36)
  7. Color Mixing Clarification (55:00)
  8. Code Preparation (ANN) (12:42)
  9. ANN for Image Classification (08:36)
  10. ANN for Regression (11:05)
  11. Exercise: E. Coli Protein Localization Sites (02:21)

Convolutional Neural Networks

12 Lectures · 01hr 58min
  1. What is Convolution? (part 1) (16:38)
  2. What is Convolution? (part 2) (05:56)
  3. What is Convolution? (part 3) (06:41)
  4. Convolution on Color Images (15:58)
  5. CNN Architecture (20:58)
  6. CNN Code Preparation (15:13)
  7. CNN for Fashion MNIST (06:46)
  8. CNN for CIFAR-10 (04:28)
  9. Data Augmentation (08:51)
  10. Batch Normalization (05:14)
  11. Improving CIFAR-10 Results (10:22)
  12. Exercise: Facial Expression Recognition (01:35)

VGG and Transfer Learning

9 Lectures · 41min
  1. VGG Section Intro (03:05)
  2. What's so special about VGG? (07:01)
  3. Transfer Learning (08:23)
  4. Relationship to Greedy Layer-Wise Pretraining (02:20)
  5. Getting the data (02:18)
  6. Code pt 1 (09:24)
  7. Code pt 2 (03:42)
  8. Code pt 3 (03:27)
  9. VGG Section Summary (01:48)

ResNet (and Inception)

17 Lectures · 01hr 21min
  1. ResNet Section Intro (02:50)
  2. ResNet Architecture (12:45)
  3. Transfer Learning with ResNet in Code (08:32)
  4. Blood Cell Images Dataset (03:03)
  5. How to Build ResNet in Code (11:17)
  6. Low-Level ResNet Implementation Exercise Intro (01:13)
  7. Building ResNet - Strategy (02:25)
  8. Building ResNet - Conv Block Details (03:34)
  9. Building ResNet - Conv Block Code (06:09)
  10. Building ResNet - Identity Block Details (01:24)
  11. Building ResNet - First Few Layers (02:28)
  12. Building ResNet - First Few Layers (Code) (04:16)
  13. Building ResNet - Putting it all together (04:20)
  14. 1x1 Convolutions (04:04)
  15. Optional: Inception (06:48)
  16. Different sized images using the same network (04:13)
  17. ResNet Section Summary (02:28)

Object Detection (SSD / RetinaNet)

18 Lectures · 02hr 08min
  1. SSD Section Intro (05:05)
  2. Object Localization (06:37)
  3. What is Object Detection? (02:54)
  4. How would you find an object in an image? (08:41)
  5. The Problem of Scale (03:48)
  6. The Problem of Shape (03:53)
  7. SSD Tensorflow Object Detection API (pt 1) (12:05)
  8. SSD Tensorflow Object Detection API (pt 2) (12:16)
  9. SSD for Video Object Detection (12:00)
  10. 2020 Update - More Fun and Excitement (Legacy) (05:45)
  11. Using Pretrained RetinaNet (Legacy) (11:14)
  12. RetinaNet with Custom Dataset (pt 1) (Legacy) (04:26)
  13. RetinaNet with Custom Dataset (pt 2) (Legacy) (09:20)
  14. RetinaNet with Custom Dataset (pt 3) (Legacy) (07:05)
  15. SSD in Tensorflow (Legacy) (09:58)
  16. Modifying SSD to work on Video (Legacy) (05:05)
  17. Optional: Intersection over Union & Non-max Suppression (05:07)
  18. SSD Section Summary (02:53)

Neural Style Transfer

7 Lectures · 43min
  1. Style Transfer Section Intro (02:53)
  2. Style Transfer Theory (11:24)
  3. Optimizing the Loss (08:03)
  4. Code pt 1 (07:47)
  5. Code pt 2 (07:14)
  6. Code pt 3 (03:50)
  7. Style Transfer Section Summary (02:22)

Class Activation Maps

2 Lectures · 17min
  1. Class Activation Maps (Theory) (07:10)
  2. Class Activation Maps (Code) (09:55)

Facial Recognition

10 Lectures · 51min
  1. Facial Recognition Section Introduction (03:39)
  2. Siamese Networks (10:18)
  3. Code Outline (05:02)
  4. Loading in the data (04:41)
  5. Splitting the data into train and test (04:25)
  6. Converting the data into pairs (05:03)
  7. Generating Generators (04:21)
  8. Creating the model and loss (03:13)
  9. Accuracy and imbalanced classes (07:08)
  10. Facial Recognition Section Summary (03:29)

GANs (Generative Adversarial Networks)

2 Lectures · 28min
  1. GAN Theory (15:52)
  2. GAN Code (12:11)

Object Localization Project

15 Lectures · 01hr 50min
  1. Localization Introduction and Outline (13:38)
  2. Localization Code Outline (pt 1) (10:40)
  3. Localization Code (pt 1) (09:11)
  4. Localization Code Outline (pt 2) (04:53)
  5. Localization Code (pt 2) (11:04)
  6. Localization Code Outline (pt 3) (03:19)
  7. Localization Code (pt 3) (04:17)
  8. Localization Code Outline (pt 4) (03:20)
  9. Localization Code (pt 4) (02:07)
  10. Localization Code Outline (pt 5) (07:43)
  11. Localization Code (pt 5) (08:40)
  12. Localization Code Outline (pt 6) (07:07)
  13. Localization Code (pt 6) (07:38)
  14. Localization Code Outline (pt 7) (04:59)
  15. Localization Code (pt 7) (12:08)

Basics Review

6 Lectures · 36min
  1. Keras Discussion (06:49)
  2. Keras Neural Network in Code (06:38)
  3. Keras Functional API (04:27)
  4. How to easily convert Keras into Tensorflow 2.0 code (01:49)
  5. TensorFlow Basics: Variables, Functions, Expressions, Optimization (07:27)
  6. Building a neural network in TensorFlow (09:43)

Review (Legacy)

4 Lectures · 22min
  1. Review of CNNs (10:35)
  2. Where to get the code and data (02:27)
  3. Fashion MNIST (02:21)
  4. Review of CNNs in Code (07:00)

Setting Up Your Environment (Appendix/FAQ by Student Request)

2 Lectures · 37min
  1. Anaconda Environment Setup (20:21)
  2. How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:33)

Extra Help With Python Coding for Beginners (Appendix/FAQ by Student Request)

4 Lectures · 42min
  1. How to Code Yourself (part 1) (15:55)
  2. How to Code Yourself (part 2) (09:24)
  3. Proof that using Jupyter Notebook is the same as not using it (12:29)
  4. Python 2 vs Python 3 (04:38)

Effective Learning Strategies for Machine Learning (Appendix/FAQ by Student Request)

4 Lectures · 59min
  1. How to Succeed in this Course (Long Version) (10:25)
  2. Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced? (22:05)
  3. What order should I take your courses in? (part 1) (11:19)
  4. What order should I take your courses in? (part 2) (16:07)

Appendix / FAQ Finale

2 Lectures · 08min
  1. What is the Appendix? (02:48)
  2. Where to get discount coupons and FREE deep learning material (05:31)

Extras

  • Data Links
  • Super-Resolution and Fast Style Transfer