This page is designed to answer the most common question we receive: "What order should I take your courses in?" Feel free to skip any course whose subject matter you already understand. Do not skip courses that contain prerequisites for later courses you want to take.

Many people do not understand that if you know nothing about machine learning and the first thing you try to tackle is word2vec, you WILL NOT succeed. Take your time to learn and strengthen the fundamentals.

See the course descriptions for a more in-depth review of what is contained within each course. This chart mostly explains the dependencies (i.e., why one course will teach you things that are needed in the next).

Deep learning-specific courses are in green; non-deep learning machine learning courses are in blue. All contain techniques that tie into deep learning.

It's very important to note that learning about machine learning is a very *nonlinear* process. In other words, it's not a matter of learning one subject, then learning the next, and the next, and so on. Sometimes you might need two separate, unrelated courses to provide background for one. Sometimes one course might provide background for two different courses. Thus, this linear chart provides only a very rough guideline. For a "graphical" representation, please scroll down to the bottom of this page. You'll be able to drag around each course and see arrows pointing from prerequisite to sequel. Click here to jump down.

Legend:

The course above leads to the course below.

The course above can be taken at the same time as the course below.

(However, note again that the linear chart is a *very rough* guideline; the graphical representation is more accurate!)

- Take this first if you're unfamiliar with the Numpy stack
- Covers common operations in Numpy, Scipy, Matplotlib, Pandas
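To give a feel for the kinds of operations this covers, here is a minimal sketch (illustrative only, not code from the course) contrasting element-wise and matrix arithmetic in NumPy, plus a simple Pandas summary:

```python
import numpy as np
import pandas as pd

# Element-wise arithmetic vs. true matrix multiplication in NumPy
a = np.array([[1.0, 2.0], [3.0, 4.0]])
b = np.array([[5.0, 6.0], [7.0, 8.0]])
elementwise = a * b    # element-by-element product
matmul = a @ b         # matrix product

# Column-wise summary statistics in Pandas
df = pd.DataFrame({"x": [1, 2, 3, 4], "y": [2.0, 4.0, 6.0, 8.0]})
mean_y = df["y"].mean()
```

Distinguishing `*` from `@` is exactly the kind of basic fluency this course builds before the later math-heavy courses rely on it.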

- Use this *massive* course as your intro to learn a wide variety of deep learning applications
- ANNs (artificial neural networks), CNNs (convolutional neural networks), and RNNs (recurrent neural networks)
- Image classification, sequence modeling, prediction, and forecasting
- Stock prediction
- NLP (natural language processing)
- GANs (generative adversarial networks)
- Transfer Learning
- Recommender Systems
- Deep Reinforcement Learning (applied to create a trading bot)
- DeepDream
- Object Localization
- After you take this, go and do my other courses to go more in-depth on each topic

- Use this *massive* course as your intro to learn a wide variety of deep learning applications
- ANNs (artificial neural networks), CNNs (convolutional neural networks), and RNNs (recurrent neural networks)
- Image classification, sequence modeling, prediction, and forecasting
- Stock prediction
- NLP (natural language processing)
- GANs (generative adversarial networks)
- Transfer Learning
- Recommender Systems
- Deep Reinforcement Learning (applied to create a trading bot)
- Quantifying prediction uncertainty
- Face Recognition with Siamese Networks
- After you take this, go and do my other courses to go more in-depth on each topic

- Exploratory data analysis for stock prices and stock returns
- Hypothesis testing, QQ-plots, Gaussian Mixture Models
- Time series analysis and forecasting
- EWMA and ARIMA
- Exposing frauds that teach "predicting stock prices with LSTMs"
- Portfolio optimization, Markowitz portfolio theory
- Efficient frontier, Sharpe ratio, tangency portfolio
- Convex optimization, CVXOPT, quadratic programming, linear programming
- CAPM (capital asset pricing model)
- Algorithmic trading
- Trend-following strategy
- Reinforcement learning (Q-learning) for trading
- Statistical factor models
- Hidden Markov Models for regime detection
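For a flavor of the computations involved, here is a minimal sketch (illustrative, not course code) that computes simple returns and an annualized Sharpe ratio from a made-up price series, assuming a zero risk-free rate and 252 trading days:

```python
import numpy as np

# Made-up daily closing prices
prices = np.array([100.0, 101.0, 99.5, 102.0, 103.5])

# Simple daily returns: r_t = p_t / p_{t-1} - 1
returns = prices[1:] / prices[:-1] - 1

# Annualized Sharpe ratio (zero risk-free rate assumed, 252 trading days)
sharpe = np.sqrt(252) * returns.mean() / returns.std()
```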

- ETS and Exponential Smoothing
- Holt’s Linear Trend Model and Holt-Winters Model
- ARIMA, SARIMA, SARIMAX, and Auto ARIMA
- ACF and PACF
- Exposing frauds that teach "predicting stock prices with LSTMs"
- Vector Autoregression and Moving Average Models (VAR, VMA, VARMA)
- Machine Learning Models (including Logistic Regression, Support Vector Machines, and Random Forests)
- Deep Learning Models (Artificial Neural Networks, Convolutional Neural Networks, and Recurrent Neural Networks)
- GRUs and LSTMs for Time Series Forecasting
- VIP: AWS Forecast (Amazon’s state-of-the-art low-code forecasting API)
- VIP: GARCH (financial volatility modeling)
- VIP: FB Prophet (Facebook’s time series library)
- VIP: LinkedIn's Greykite time series library
- VIP: Granger Causality
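As a small illustration of the simplest of these models, here is a sketch (not course code) of simple exponential smoothing, where the level follows l_t = alpha * y_t + (1 - alpha) * l_{t-1}:

```python
import numpy as np

def simple_exponential_smoothing(y, alpha):
    """Recursively update the level: l_t = alpha * y_t + (1 - alpha) * l_{t-1}."""
    level = y[0]
    fitted = [level]
    for obs in y[1:]:
        level = alpha * obs + (1 - alpha) * level
        fitted.append(level)
    return np.array(fitted)

smoothed = simple_exponential_smoothing(np.array([1.0, 2.0, 3.0]), alpha=0.5)
```

Holt's and Holt-Winters models extend this same recursion with trend and seasonal components.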

- Possibly the simplest machine learning model
- Fitting a line
- Regression (predicting a number)
- The form "wx + b" will appear in every subsequent deep learning course
- Has a simple calculus-based "closed-form" solution
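To make the "closed-form" point concrete, here is a minimal sketch (illustrative only) of the normal-equation solution for fitting w and b on noise-free synthetic data:

```python
import numpy as np

# Noise-free data generated by y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0

# Design matrix with a column of ones so the model is y = w*x + b
X = np.column_stack([x, np.ones_like(x)])

# Normal-equation (closed-form) solution: minimize ||Xw - y||^2
w, b = np.linalg.solve(X.T @ X, X.T @ y)
```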

- Linear models with alternative loss functions, fit using Linear Programming
- L1 Loss (absolute error), Maximum absolute deviation (MAD), Exponential Loss
- Applicable to operations research, quantitative finance, engineering

- Binary classification (predicting a category)
- Add to the form "wx + b" to do classification
- Learn how this models a "neuron", and thus forms the building block for neural networks
- No closed-form solution, use gradient descent
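As a rough illustration of gradient descent on this model (a sketch with a tiny made-up dataset, not course code):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny 1-D binary classification problem
x = np.array([-2.0, -1.0, 1.0, 2.0])
y = np.array([0.0, 0.0, 1.0, 1.0])

# Gradient descent on the cross-entropy loss (no closed-form solution exists)
w, b, lr = 0.0, 0.0, 0.1
for _ in range(1000):
    p = sigmoid(w * x + b)                 # predicted probabilities
    w -= lr * np.dot(p - y, x) / len(x)    # dL/dw
    b -= lr * np.mean(p - y)               # dL/db
```

The expression `sigmoid(w * x + b)` is the single "neuron" referred to above.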

- Not much math, focus on algorithms and geometrical visualization
- Practical ML concepts (generalization, cross-validation)
- Bigger picture: How you would fit ML into a web-service
- Naive Bayes, Decision Trees, K-Nearest Neighbor, Perceptron (ancestor of deep learning)

- Confidence intervals, frequentist A/B testing
- Improvements on vanilla A/B test: greedy-epsilon and UCB1
- The Bayesian paradigm
- Adaptive learning using Bayesian techniques
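A minimal epsilon-greedy sketch (illustrative only, with made-up conversion rates) shows the explore-exploit idea behind these improvements:

```python
import numpy as np

rng = np.random.default_rng(0)
true_rates = [0.3, 0.5]   # made-up conversion rates for variants A and B
counts = np.zeros(2)
values = np.zeros(2)      # running estimate of each variant's rate
eps = 0.1

for _ in range(5000):
    # Explore with probability eps, otherwise exploit the current best estimate
    arm = int(rng.integers(2)) if rng.random() < eps else int(values.argmax())
    reward = float(rng.random() < true_rates[arm])
    counts[arm] += 1
    values[arm] += (reward - values[arm]) / counts[arm]   # incremental mean
```

UCB1 and Thompson sampling (the Bayesian approach) replace the fixed-eps rule with smarter exploration.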

- Go from binary classification to multi-class classification (softmax)
- Combine neurons to form a neural network
- Derive the backpropagation algorithm
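For instance, a minimal softmax implementation (illustrative only) turns a vector of scores into class probabilities:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - np.max(z))   # subtract the max for numerical stability
    return e / e.sum()

probs = softmax(np.array([2.0, 1.0, 0.1]))   # probabilities over 3 classes
```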

- Improve backpropagation using momentum and adaptive learning rates (RMSProp, Adam)
- Modern regularization techniques like dropout, batch normalization
- Speedup with GPU on AWS
- Do backpropagation automatically with Theano and TensorFlow
- Learn Theano and TensorFlow from the ground up
- Other modern libraries like Keras, PyTorch, MXNet, CNTK

- Derive and demonstrate bias-variance tradeoff
- Bootstrap and bagging, combining models by resampling
- Random forest
- Derive theory behind AdaBoost and implement it
- Compare random forest and AdaBoost to deep learning techniques
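To illustrate the bootstrap idea underlying bagging (a sketch with made-up data, not course code):

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.array([1.0, 2.0, 3.0, 4.0, 5.0])

# Bootstrap: resample with replacement, compute the statistic on each resample,
# then aggregate -- the same resampling idea bagging applies to whole models
boot_means = [rng.choice(data, size=len(data), replace=True).mean()
              for _ in range(500)]
bagged_estimate = float(np.mean(boot_means))
```

Bagging trains one model per resample and averages their predictions; a random forest adds random feature selection on top of that.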

- Add another element (convolution) to neural network
- Deep learning for images
- Learn special pre-processing techniques for images
- Learn the principles behind how famous deep neural networks are built, such as the LeNet, VGG, AlexNet, and Inception

- Go from supervised to unsupervised paradigm
- K-Means Clustering, Hierarchical Clustering, Gaussian Mixture Models (GMM)
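A bare-bones K-Means loop (illustrative only, with a deliberately naive initialization; real implementations use random restarts or k-means++ seeding):

```python
import numpy as np

def kmeans(X, k, n_iter=20):
    centers = X[:k].copy()   # naive init: first k points (use k-means++ in practice)
    for _ in range(n_iter):
        # Assign each point to its nearest center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its assigned points
        centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
    return centers, labels

# Two well-separated made-up blobs
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
centers, labels = kmeans(X, k=2)
```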

- Practice the unsupervised paradigm with deep learning
- Explore the vanishing gradient problem and techniques to solve it
- Deep learning for visualizing data
- PCA, t-SNE, Autoencoders, Restricted Boltzmann Machines (RBMs)

- Apply machine learning (both supervised and unsupervised) to language

- Extend unsupervised paradigm from vectors to sequences
- Add GMM to HMM to model continuous data
- Apply Theano in a non-deep learning setting, and learn basic tools needed to code recurrent neural networks
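As a tiny taste of the Markov part (a sketch, not course code), transition probabilities can be estimated from a state sequence by counting transitions and row-normalizing:

```python
import numpy as np

# A made-up 2-state sequence (e.g., 0 = "down day", 1 = "up day")
seq = [0, 0, 1, 1, 1, 0, 1, 0, 0, 1]

# Count transitions, then row-normalize to get transition probabilities
counts = np.zeros((2, 2))
for s, s_next in zip(seq[:-1], seq[1:]):
    counts[s, s_next] += 1
A = counts / counts.sum(axis=1, keepdims=True)
```

An HMM adds a layer on top of this: the states themselves are hidden and must be inferred from observations.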

- Apply Markov models to the Markov Decision Process (MDP) - the framework for RL problems
- Continue learning about the explore-exploit dilemma, originally seen in Bayesian Machine Learning: A/B testing
- See how deep learning can be applied to reinforcement learning

- Deep learning for sequences
- Recurrent architectures (LSTM, GRU)
- Some language modeling (sentences are sequences of words)

- Learning word vectors / word embeddings (word2vec, GloVe)
- State-of-the-art sentiment analysis with Recursive Neural Networks and Recursive Neural Tensor Networks (RNTNs) - these are extensions of RNNs

- Learn about MDPs, Monte Carlo, and Temporal Difference learning more in-depth
- Apply deep neural networks to reinforcement learning
- Play with the OpenAI Gym (CartPole, MountainCar, Atari)
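To make the core ideas concrete, here is a minimal tabular Q-learning sketch (illustrative only, on a made-up 3-state chain environment, not from the course):

```python
import numpy as np

# Made-up 3-state chain: start at state 0, reward 1.0 for reaching state 2
n_states, n_actions = 3, 2   # actions: 0 = left, 1 = right
Q = np.zeros((n_states, n_actions))
rng = np.random.default_rng(0)
alpha, gamma, eps = 0.5, 0.9, 0.2

def step(s, a):
    s_next = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
    reward = 1.0 if s_next == n_states - 1 else 0.0
    return s_next, reward, s_next == n_states - 1

for _ in range(200):                      # episodes
    s = 0
    for _ in range(20):                   # step limit per episode
        # Epsilon-greedy action selection
        a = int(rng.integers(n_actions)) if rng.random() < eps else int(Q[s].argmax())
        s_next, r, done = step(s, a)
        # Q-learning (temporal difference) update
        Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
        if done:
            break
        s = s_next
```

Deep RL replaces the Q table with a neural network so the same update can scale to environments like Atari.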

- Learn state-of-the-art techniques for generating realistic, high-quality images using convolutional neural networks
- Apply game theory and Bayesian machine learning to deep learning
- Understand the connection between GANs and reinforcement learning

- Learn state-of-the-art architectures such as VGG, ResNet, and Inception
- Train state-of-the-art models fast with transfer learning
- Object detection principles with SSD
- Neural style transfer
- Super resolution (a.k.a. image enhancement)

- Learn state-of-the-art NLP and RNN techniques such as seq2seq, attention, and memory networks
- Apply CNNs to NLP
- Apply RNNs to image classification
- Speech recognition

- Learn the ranking algorithms that drive Reddit, Hacker News, Google PageRank, BuzzFeed, and other news sites
- Learn powerful rating prediction algorithms based on matrix factorization (used by Amazon, Netflix, and more)
- Apply deep learning (supervised and unsupervised) to rating predictions

- A robust (and very elegant) plug-and-play type of machine learning classifier
- Technically only relies on knowledge of Logistic Regression, but goes very deep theoretically, and you'll appreciate it more if you understand neural networks too

- Continue your reinforcement learning journey with modern algorithms developed on top of the original DQN and policy gradient, including DDPG and A2C
- Learn a completely new way to train RL agents called Evolution Strategies
- All-new environments, such as Atari (Breakout, Pong, Space Invaders, etc.), MuJoCo (physics simulator), and Flappy Bird

- No dependencies, can be taken at any time
- Learn SQL (a universal language), which can then be applied to mobile and desktop (SQLite), web services (MySQL, PostgreSQL), and big data technologies (Hadoop Hive, Spark)
- Mostly useful for scalable data preparation (after which I assume you would feed the data into one of the algorithms below)
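For example, using Python's built-in sqlite3 module (an illustrative sketch with made-up ticker data, not course code), the same SQL runs against an in-memory SQLite database:

```python
import sqlite3

# In-memory SQLite database: create a table, insert rows, aggregate with SQL
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (symbol TEXT, close REAL)")
conn.executemany("INSERT INTO prices VALUES (?, ?)",
                 [("AAPL", 150.0), ("AAPL", 152.0), ("MSFT", 300.0)])
rows = conn.execute(
    "SELECT symbol, AVG(close) FROM prices GROUP BY symbol ORDER BY symbol"
).fetchall()
conn.close()
```

The same `SELECT ... GROUP BY` pattern carries over nearly unchanged to MySQL, PostgreSQL, Hive, and Spark SQL.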

- No dependencies, can be taken at any time
- My very first course!
- Part 1 is focused on MATLAB programming language basics: syntax, data types, programming logic, arithmetic, and plotting
- Part 2 is focused on the application of MATLAB to signal processing with sound and images

Check out this draggable graph of courses! This best illustrates the fact that the learning path is nonlinear. Sometimes the same course will provide background for more than one other course. Sometimes it is the reverse: you'll need two or more different topics to prepare you for one. Not surprisingly, this still does not tell the whole story.

Sometimes, not all of a course is relevant to the next: a prerequisite might cover one useful element of the sequel while the rest of its content is unrelated. Sometimes there is a very strong connection, where the entire prerequisite is critical to success in the sequel. These nuances are described in a lecture that can be found in the appendix of any course: "What order should I take your courses in?"

Note that you can zoom in/out and drag the nodes around.

Just in case you missed it above, you CAN drag the items around and zoom into the graph. Use your mouse!