First Steps in Linear Algebra for Machine Learning
- Systems of linear equations and linear classifiers
- In the first week, we introduce multi-dimensional geometry and matrix algebra. We then study methods for solving systems of linear equations based on Gaussian elimination and LU decomposition, illustrating the methods with Python code examples of matrix calculations.
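The LU approach described above can be sketched in a few lines of NumPy. The 2×2 system below is a made-up toy example, and the decomposition is done without pivoting, which is fine for this well-behaved matrix but not for general use:

```python
import numpy as np

# Toy system A x = b (illustrative data)
A = np.array([[4.0, 3.0],
              [6.0, 3.0]])
b = np.array([10.0, 12.0])

# Doolittle-style LU decomposition without pivoting: A = L @ U
n = A.shape[0]
L = np.eye(n)
U = A.copy()
for k in range(n - 1):
    for i in range(k + 1, n):
        L[i, k] = U[i, k] / U[k, k]   # elimination multiplier
        U[i] -= L[i, k] * U[k]        # zero out the entry below the pivot

# Solve L y = b (forward substitution), then U x = y (back substitution)
y = np.linalg.solve(L, b)
x = np.linalg.solve(U, y)
# x == [1.0, 2.0] for this system
```

Once L and U are computed, the same factorization can be reused to solve A x = b for many different right-hand sides b, which is the practical advantage over repeating Gaussian elimination from scratch.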
- Full rank decomposition and systems of linear equations
- The second week is devoted to fundamental notions of linear algebra: vector spaces, linear independence, and basis. Next, we discuss what the rank of a matrix is and how it helps us decompose a matrix. In addition, we talk about the properties of the set of solutions of a system of linear equations. At the end of this week, we apply this theory to processing a scanned document.
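As a small illustration of rank and full rank decomposition, here is one way to factor a rank-deficient matrix A into F @ G, where F has as many columns as the rank of A. The matrix below is made up for illustration, and the truncated SVD is just one of several ways to obtain such a factorization:

```python
import numpy as np

# A rank-2 matrix: the third row equals the sum of the first two
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 4.0]])

r = np.linalg.matrix_rank(A)

# Full rank decomposition A = F @ G via the truncated SVD:
# F is (3, r), its columns span the column space of A;
# G is (r, 3), its rows span the row space of A.
U, s, Vt = np.linalg.svd(A)
F = U[:, :r] * s[:r]
G = Vt[:r, :]
```

Storing F and G instead of A needs only r * (m + n) numbers rather than m * n, which is the idea behind using low-rank structure for compression, e.g. of a scanned document image.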
- Euclidean spaces
- In the third week, we first introduce coordinates in an abstract vector space. This allows us to apply the usual matrix arithmetic to abstract vectors. Next, we discuss the concept of Euclidean space, which lets us measure distances and angles in vector spaces. We then use these measures in the least squares method to find approximate solutions of linear systems, and in the linear regression model built on it. Finally, we describe the core of a widely used linear classifier, the Support Vector Machine (SVM).
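The least squares method above can be sketched as fitting a line to a few points. The data here is made up for illustration; the same solution is computed twice, once with `np.linalg.lstsq` and once via the normal equations, to show they agree:

```python
import numpy as np

# Overdetermined system: fit y ≈ w0 + w1 * x to four points (illustrative data)
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.8])

# Design matrix with a bias column; lstsq minimises ||X w - y||^2
X = np.column_stack([np.ones_like(x), x])
w, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)

# The same solution from the normal equations X^T X w = X^T y
w_normal = np.linalg.solve(X.T @ X, X.T @ y)
```

For small, well-conditioned problems the normal equations are a convenient way to see the algebra; `lstsq` is the numerically safer routine in practice.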
- Final Project
- In the final week, we apply our acquired knowledge of linear regression and SVM models in the final project.
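As a taste of the SVM side of the project, here is a minimal linear SVM trained by full-batch subgradient descent on the regularised hinge loss. This is illustrative toy code on made-up separable data, not the course's own project setup; a real project would typically use a library implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Linearly separable toy data: label each point by the sign of x0 + x1 - 1
X = rng.uniform(-2.0, 2.0, size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] - 1.0 > 0, 1.0, -1.0)

w, b = np.zeros(2), 0.0
lam, lr, n = 0.01, 0.1, len(y)   # regularisation strength, step size
for _ in range(500):
    margins = y * (X @ w + b)
    mask = margins < 1.0                                   # margin violators
    grad_w = lam * w - (y[mask, None] * X[mask]).sum(axis=0) / n
    grad_b = -y[mask].sum() / n
    w -= lr * grad_w
    b -= lr * grad_b

accuracy = float(np.mean(np.sign(X @ w + b) == y))
```

Only the points violating the margin contribute to the subgradient, which is the "support vector" intuition: the decision boundary is determined by the points closest to it.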