FastML

Machine learning made easy

Math for machine learning

Sometimes people ask what math they need for machine learning. The answer depends on what you want to do, but in short, our opinion is that it's good to have some familiarity with linear algebra and multivariate differentiation.

Linear algebra is a cornerstone because everything in machine learning is a vector or a matrix. Dot products, distances, matrix factorization, eigenvalues, etc. come up all the time.
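
To make this concrete, here's a quick NumPy sketch of ours (not from any of the courses below) showing a dot product, a Euclidean distance and an eigendecomposition:

```python
import numpy as np

# two vectors
a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

print(a.dot(b))               # dot product
print(np.linalg.norm(a - b))  # Euclidean distance

# a small symmetric matrix and its eigenvalues / eigenvectors
M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
eigenvalues, eigenvectors = np.linalg.eigh(M)
print(eigenvalues)            # [1. 3.]
```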

Differentiation matters because of gradient descent, which again is almost everywhere*. It has found its way even into the tree domain in the form of gradient boosting - gradient descent in function space.
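
In code, vanilla gradient descent fits in a few lines. Here's a toy sketch of ours (made-up data, made-up learning rate) that fits a single weight by repeatedly stepping against the gradient of a squared error:

```python
import numpy as np

# toy data: y is roughly 2 * x
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 7.8])

w = 0.0             # initial guess for the weight
learning_rate = 0.01

for step in range(200):
    error = w * x - y
    gradient = 2 * (error * x).mean()  # derivative of mean squared error w.r.t. w
    w -= learning_rate * gradient      # move against the gradient

print(w)  # ends up close to 2
```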

We file probability under statistics, which is why we don't mention it here.

Now for some recommendations. Trey Causey once tweeted about Gilbert Strang’s linear algebra course. We like it very much.


[Embedded video: executing a determinant formula in lecture 19]

If you like books, there are a few free ones online, for example Linear Algebra by Jim Hefferon. People also put up PDFs of pretty much any book you can imagine; you can google them if you've lost your copy.

For a shorter intro, check out the upcoming book on deep learning by Yoshua Bengio, Ian Goodfellow and Aaron Courville. It has three separate chapters on linear algebra, probability and numerical computation. The drafts are online.

MOOCs

There are a few MOOCs about linear algebra specifically. They differ in style.

Here's a shorter MOOC at Coursera, with linear algebra and some calculus for financial applications. The lecturer looks convincing.

And here's a calculus course with some linear algebra.

More math MOOCs

*Almost in the mathematical sense, that is, everywhere except on a set of measure zero.
