The Hidden Markov Model, or HMM, is all about learning sequences. A lot of the data that would be very useful for us to model comes in sequences. Stock prices are sequences of prices. Language is a sequence of words. Credit scoring involves sequences of borrowing and repaying money, and we can use those sequences to predict whether or not you're going to default. In short, sequences are everywhere, and being able to analyze them is an important skill in your data science toolbox.
The easiest way to appreciate the kind of information you get from a sequence is to consider what you are reading right now. If I had written the previous sentence backwards, it wouldn’t make much sense to you, even though it contained all the same words. So order is important.
While the current fad in deep learning is to use recurrent neural networks to model sequences, I want to first introduce you guys to a machine learning algorithm that has been around for several decades now - the Hidden Markov Model.
This course follows directly from my first course in unsupervised machine learning, Cluster Analysis and Unsupervised Machine Learning in Python, where you learned how to measure the probability distribution of a random variable. In this course, you'll learn how to measure the probability distribution of a sequence of random variables.
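To make that concrete, here is a minimal sketch (a toy example of my own, not code from the course) of how a first-order Markov chain assigns a probability to a sequence of states - the probability of the first state times the product of the transition probabilities:

import numpy as np

# Toy first-order Markov chain with made-up parameters.
# States: 0 = "sunny", 1 = "rainy" (hypothetical example).
pi = np.array([0.7, 0.3])            # initial state distribution
A = np.array([[0.8, 0.2],            # A[i, j] = p(next state is j | current state is i)
              [0.4, 0.6]])

def sequence_prob(states):
    # p(x_1, ..., x_T) = pi[x_1] * prod_t A[x_{t-1}, x_t]
    p = pi[states[0]]
    for prev, curr in zip(states[:-1], states[1:]):
        p *= A[prev, curr]
    return p

print(sequence_prob([0, 0, 1, 1]))   # e.g. sunny, sunny, rainy, rainy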
You guys know how much I love deep learning, so there is a little twist in this course. We've already covered gradient descent, and you know how central it is for solving deep learning problems. I claimed that gradient descent could be used to optimize any objective function. In this course I will show you how you can use gradient descent to solve for the optimal parameters of an HMM, as an alternative to the popular expectation-maximization algorithm.
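To give a flavor of what that looks like (a sketch of my own, not the course's implementation, and using a plain Markov chain rather than a full HMM to keep it short), you can parameterize the transition matrix with a softmax so its rows stay valid probability distributions, then run gradient descent on the negative log-likelihood:

import numpy as np

def softmax(W):
    # row-wise softmax: each row of W becomes a probability distribution
    e = np.exp(W - W.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Hypothetical training data: C[i, j] = number of times state i was
# followed by state j in the observed sequences (made-up counts).
C = np.array([[90., 10.],
              [30., 70.]])

W = np.zeros((2, 2))                 # unconstrained parameters
lr = 0.01
for _ in range(2000):
    A = softmax(W)                   # current transition matrix
    # gradient of the negative log-likelihood  -sum_ij C[i,j] * log A[i,j]
    grad = C.sum(axis=1, keepdims=True) * A - C
    W -= lr * grad                   # gradient descent step

print(softmax(W))  # approaches the count-normalized estimate [[0.9, 0.1], [0.3, 0.7]]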
We're going to do it in Theano and TensorFlow, which are popular libraries for deep learning. This is also going to teach you how to work with sequences in Theano and TensorFlow, which will be very useful when we cover recurrent neural networks and LSTMs.
This course is also going to go through the many practical applications of Markov models and hidden Markov models. We're going to look at a model of sickness and health, and calculate how long you can expect to stay sick if you get sick. We're going to talk about how Markov models can be used to analyze how people interact with your website, and fix problem areas like high bounce rate, which could be affecting your SEO. We'll build language models that can be used to identify a writer and even generate text - imagine a machine doing your writing for you. HMMs have been very successful in natural language processing, or NLP.
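As a small preview of the sick/healthy model (with made-up numbers of my own), the length of a sick streak in a two-state Markov chain is geometric, so the expected number of consecutive sick days is 1 / (1 - p), where p is the probability of staying sick:

import numpy as np

# Toy version of the sick/healthy idea with a made-up probability.
# If p(sick tomorrow | sick today) = 0.6, the expected sick streak
# is 1 / (1 - 0.6) = 2.5 days.
p_stay_sick = 0.6
print("expected:", 1 / (1 - p_stay_sick))

# sanity check by simulation
rng = np.random.default_rng(0)
streaks = []
for _ in range(100_000):
    days = 1                          # you are sick today
    while rng.random() < p_stay_sick: # stay sick with probability 0.6
        days += 1
    streaks.append(days)
print("simulated:", np.mean(streaks))  # close to 2.5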
We'll look at what is possibly the most famous application of Markov models of all time - Google's PageRank algorithm. And finally, we'll discuss even more practical applications of Markov models, including generating images, smartphone autosuggestions, and using HMMs to answer one of the most fundamental questions in biology - how is DNA, the code of life, translated into physical or behavioral attributes of an organism?
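And as a preview of the PageRank idea (a minimal sketch of my own with a made-up link graph, not Google's actual system), PageRank treats web pages as states of a Markov chain and scores each page by the chain's stationary distribution, which you can find with power iteration:

import numpy as np

# Tiny made-up link graph: links[i, j] = 1 if page i links to page j.
links = np.array([[0, 1, 1],
                  [1, 0, 0],
                  [1, 1, 0]], dtype=float)
A = links / links.sum(axis=1, keepdims=True)   # row-stochastic transition matrix

d = 0.85                          # damping factor
n = A.shape[0]
G = d * A + (1 - d) / n           # random surfer with teleportation

rank = np.ones(n) / n             # start from the uniform distribution
for _ in range(100):
    rank = rank @ G               # one step of the random surfer
print(rank)                       # stationary distribution = PageRank scores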
All of the materials for this course can be downloaded and installed for FREE. We will do most of our work in Numpy and Matplotlib, along with a little bit of Theano. I am always available to answer your questions and help you along your data science journey.
This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation. It's not about "remembering facts"; it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.
See you in class!
Suggested Prerequisites:
- calculus
- linear algebra
- probability
- Python coding: if/else, loops, lists, dicts, sets
- Numpy coding: matrix and vector operations, loading a CSV file
- Be comfortable with the multivariate Gaussian distribution
- Cluster Analysis and Unsupervised Machine Learning in Python will provide you with sufficient background
Tips for success:
- Use the video speed changer! Personally, I like to watch at 2x.
- Take handwritten notes. This will drastically increase your ability to retain the information.
- Write down the equations. If you don't, I guarantee it will just look like gibberish.
- Ask lots of questions on the discussion board. The more the better!
- Don't get discouraged if you can't solve every exercise right away. Sometimes it'll take hours, days, or maybe weeks!
- Write code yourself; this is an applied course! Don't be a "couch potato".