Common scenario: You try to get into **machine learning** and **data science**, but there's SO MUCH MATH.

Either you never studied this math, or you studied it so long ago you've forgotten it all.

What do you do?

Well my friends, that is why I created this course.

**Linear Algebra** is one of the most important math prerequisites for machine learning. It's required to understand probability and statistics, which form the foundation of data science.

The "data" in data science is represented using **matrices** and **vectors**, which are the central objects of study in this course.

If you want to do machine learning beyond just copying library code from blogs and tutorials, you must know linear algebra.

In a normal STEM college program, linear algebra is split into multiple semester-long courses.

Luckily, I've refined these teachings into just the essentials, so that you can learn everything you need to know on the scale of hours instead of semesters.

This course will cover systems of linear equations, matrix operations (dot product, inverse, transpose, determinant, trace), low-rank approximations, positive-definiteness and negative-definiteness, and eigenvalues and eigenvectors.

It will even include machine learning-focused material you wouldn't normally see in a regular college course, such as how these concepts apply to GPT-4, and fine-tuning modern neural networks like diffusion models (for generative AI art) and LLMs (Large Language Models) using **LoRA**.

We will even demonstrate many of the concepts in this course using the **Python** programming language (don't worry, you don't need to know Python for this course).

In other words, instead of the dry old college version of linear algebra, this course takes just the most practical and impactful topics, and provides you with skills directly applicable to machine learning and data science, so you can start applying them today.
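To give you a taste of what those Python demonstrations look like, here's a minimal sketch (my own illustrative example, not actual course code) of the dot product, one of the first operations we cover:

```python
import numpy as np

# Two vectors in R^3 (illustrative values only)
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

# The dot product: multiply componentwise, then sum
dot = a @ b  # 1*4 + 2*5 + 3*6 = 32.0
```

If that one-liner looks mysterious now, it won't by the end of the course.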

Are you ready?

Let's go!

Suggested prerequisites:


- Firm understanding of high school math (functions, algebra, trigonometry)

You probably already know this, but some of us really and truly appreciate you. BTW, I spent a reasonable amount of time making a learning roadmap based on your courses and have started the journey.

Looking forward to your new stuff.

I am signing up so that I have an easy refresher when needed, to see what you consider important, and to support your great work. Thank you.


I wish you a happy and safe holiday season. I am glad you chose to share your knowledge with the rest of us.

And, I couldn't agree more with some of your "rants", and found myself nodding vigorously!

You are an excellent teacher, and a rare breed.

And, your courses are frankly, more digestible and teach a student far more than some of the top-tier courses from ivy leagues I have taken in the past.

(I plan to go through many more courses, one by one!)

I know you must be deluged with complaints in spite of having the best content around. That's just human nature.

Also, satisfied people rarely take the time to write, so I thought I would write in for a change. :)

In the process of completing my Master’s at Hunan University, China, I am writing this feedback to you in order to express my deep gratitude for all the knowledge and skills I have obtained studying your courses and following your recommendations.

The first course of yours I took was on Convolutional Neural Networks ("Deep Learning p.5", as far as I remember). Answering one of my questions on the Q&A board, you suggested I should start from the beginning – the Linear and Logistic Regression courses. Although I assumed I already knew many of the basics at that time, I overcame my "pride" and decided to start my journey in Deep Learning from scratch. ...


- Introduction and Outline (09:30) (FREE preview available)
- How to Succeed in this Course (08:45)
- Where to get the code (01:42)
- How to Take this Course (02:05)

- Lines and Planes (10:14)
- 2 Equations and 2 Unknowns (12:58)
- 3 Equations and 3 Unknowns (17:23)
- Gaussian Elimination (22:48)
- No Solutions (05:10)
- Infinitely Many Solutions (08:22)
- Review Summary (03:59)
- Suggestion Box (03:10)
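To preview the kind of computation this section builds up to, here's a minimal NumPy sketch (my own illustrative example, not actual course code) of solving 2 equations and 2 unknowns, plus a singular system with no unique solution:

```python
import numpy as np

# The system:  x + 2y = 5,  3x + 4y = 11
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 11.0])

# np.linalg.solve performs the elimination for us
x = np.linalg.solve(A, b)  # solution: x = 1, y = 2

# A singular system (the second equation is a multiple of the first)
# has no unique solution, and solve() raises an error:
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
try:
    np.linalg.solve(S, b)
    singular_detected = False
except np.linalg.LinAlgError:
    singular_detected = True
```

The course works through the elimination by hand first, so you understand what `solve` is doing under the hood.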

- What is a Vector? (20:05)
- Adding and Subtracting Vectors (12:12)
- Dot Product (15:56)
- Dot Product (pt 2) (09:06)
- Dot Product Exercises in Python (17:49)
- Bonus Application: Neural Embeddings, Cosine Similarity (Optional) (22:34)
- Exercise: Normalizing a Vector (08:02)
- Exercise: The Vector Normal to a Plane (05:09)
- What is a Matrix? (27:59)
- Matrix Addition and Scalar Multiplication (03:52)
- Matrix Multiplication (18:02)
- Properties of Matrix Multiplication (08:19)
- Matrix-Vector Product (12:53)
- Application: Neural Networks (07:28)
- Element-Wise Product (03:23)
- Outer Product (09:50)
- Bonus Application: Replicating GPT-4 (07:11)
- Matrix Exercises in Python (24:08)
- Linear Systems Revisited (06:19)
- Vectors and Matrices Summary (10:41)
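As a flavor of the Python exercises in this section, here's a minimal sketch (my own illustrative example, not actual course code) of the dot product, cosine similarity between two "embedding" vectors, and a matrix-vector product like the one inside a neural network layer:

```python
import numpy as np

u = np.array([1.0, 0.0, 1.0])
v = np.array([1.0, 1.0, 0.0])

# Dot product: 1*1 + 0*1 + 1*0 = 1.0
dot = u @ v

# Cosine similarity: dot product of the normalized vectors, in [-1, 1]
cos = dot / (np.linalg.norm(u) * np.linalg.norm(v))

# Matrix-vector product, as in one layer of a neural network
W = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0]])
h = W @ u
```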

- Identity Matrix (06:01)
- Diagonal Matrices (08:48)
- Matrix Inverse (24:20)
- Exercise: Inverse of the Inverse (07:59)
- Singular Matrices (08:14)
- Matrix Transpose (18:38)
- Properties of the Matrix Transpose (25:11)
- Symmetric Matrices (07:53)
- Transpose in Higher Dimensions (13:51)
- Orthogonal and Orthonormal Matrices and Vectors (14:27)
- Exercise: Orthogonal Matrices (03:21)
- Exercise: Inverse of a Product (02:25)
- Exercise: Transpose of Inverse of Symmetric Matrix (04:02)
- Exercise: Why Are Orthogonal Matrices Length- and Angle-Preserving? (09:26)
- Determinants (pt 1) (18:50)
- Determinants (pt 2) (23:09)
- Determinant Formula (Optional) (12:05)
- Determinant Identities (Optional) (06:01)
- Exercise: Determinant of a Unitary Matrix (02:23)
- Matrix Trace (Optional) (07:36)
- Positive Definite and Negative Definite Matrices (23:37)
- Exercise: Inverse of a Positive Definite Matrix (03:50)
- Exercise: Complete the Square (21:12)
- Matrix Operations Exercises in Python (13:56)
- Matrix Operations and Special Matrices Summary (11:35)
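Here's a minimal sketch of the operations this section covers (my own illustrative example, not actual course code): inverse, transpose, determinant, trace, and a positive-definiteness check:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Inverse: A times its inverse gives the identity matrix
A_inv = np.linalg.inv(A)
assert np.allclose(A @ A_inv, np.eye(2))

det = np.linalg.det(A)  # 2*2 - 1*1 = 3
tr = np.trace(A)        # 2 + 2 = 4

# A is symmetric (A equals its transpose), and positive definite:
# x^T A x > 0 for any nonzero x (here, 2 - 1 - 1 + 2 = 2 > 0)
x = np.array([1.0, -1.0])
assert np.allclose(A, A.T)
assert x @ A @ x > 0
```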

- Linear Independence and Dependence (34:31)
- Geometric Interpretation of Linear Combinations (07:46)
- The Rank of a Matrix (20:17)
- Matrix Decompositions (SVD, QR, LU, Cholesky) (24:50)
- Rank After Multiplication (22:37)
- Low-Rank Approximations and Frobenius Norm (13:25)
- Applications: Recommender Systems and Topic Modeling (Optional) (17:09)
- Applications of SVD: Data Visualization and Feature Selection (Optional) (11:57)
- Bonus Application: LoRA for Diffusion Models and LLMs (09:14)
- Exercise: Generating a Positive Semi-Definite Matrix (05:15)
- Matrix Decompositions in Python (20:09)
- Matrix Rank and Decompositions Summary (05:05)
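As a preview of this section's Python material, here's a minimal sketch (my own illustrative example, not actual course code) of a rank-1 approximation via the SVD — the same idea behind recommender systems and LoRA:

```python
import numpy as np

A = np.array([[3.0, 2.0,  2.0],
              [2.0, 3.0, -2.0]])

# Singular value decomposition: A = U diag(s) V^T
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Best rank-1 approximation (in the Frobenius norm):
# keep only the largest singular value
k = 1
A_k = U[:, :k] * s[:k] @ Vt[:k, :]

# The approximation error equals the dropped singular values
err = np.linalg.norm(A - A_k, 'fro')
assert np.isclose(err, np.sqrt(np.sum(s[k:] ** 2)))
```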

- How to Find Eigenvalues and Eigenvectors (pt 1) (24:04)
- How to Find Eigenvalues and Eigenvectors (pt 2) (03:04)
- Exercise: Rotation Matrix (21:38)
- Exercise: Why Do A^TA and AA^T Have the Same Eigenvalues? (02:53)
- Exercise: Eigenvalues of the Inverse (02:48)
- Conjugate Transpose and Hermitian Matrices (11:31)
- Hermitian Matrices Have Real Eigenvalues (06:39)
- Why Do Hermitian Matrices Have Orthogonal Eigenvectors? (06:29)
- Test for Positive Definiteness Using Eigenvalues (08:18)
- Determinant From Eigenvalues (03:02)
- Invertibility From Eigenvalues (Positive Definite Matrices Are Invertible) (04:54)
- Diagonalization (24:11)
- Constructing the SVD ('Proof' of SVD) (26:00)
- Matrix Powers (08:30)
- Application: The Vanishing Gradient Problem (08:16)
- Functions of Matrices (Optional) (13:45)
- Eigenvalues in Python (25:16)
- Quiz: Square Root of a Matrix (05:50)
- Eigenvalues and Eigenvectors (11:16)

- Anaconda Environment Setup (20:21)
- How to install Numpy, Scipy, Matplotlib, Pandas, IPython, Theano, and TensorFlow (17:33)

- Can YouTube Teach Me Calculus? (Optional) (15:08)
- Is this for Beginners or Experts? Academic or Practical? Fast or slow-paced? (22:05)
- What order should I take your courses in? (part 1) (11:19)
- What order should I take your courses in? (part 2) (16:07)

- What is the Appendix? (02:48)
- Where to get discount coupons and FREE deep learning material (05:31)

- PDF Notes