Do you remember, back in your school days, when you did some vector or matrix operations? Or some probability and statistics? Or performed some differentiation and integration, and then got confused about what the math question you had just finished solving was actually about 😆? Haha! You know, right?

Ermm! I hope you were pretty attentive during those Math classes 👀, because when we do Machine Learning, **we are essentially solving a Math problem**, and one of the Math topics with enormous applications in Machine Learning is **Linear Algebra**!!!

Linear Algebra plays a huge role in machine learning! Back then, we modelled real-world problems as systems of linear equations and solved those equations via substitution, elimination or the graphing method. Take a look at a linear system with 2 unknowns, for example: x + y = 5 and 2x - y = 1.

A linear system is one in which the variables, e.g. x and y above, are raised to a power (exponent) of 1, i.e. they are first-degree variables. The system above has just 2 unknowns, so it can easily be solved via substitution, elimination or graphing. So, yeah! We can solve that with pen and paper, but machines can't solve it in that form! Just imagine each equation as a row or observation in a dataset, where the right side of the equation is the target (also called the response or dependent variable) and the left side represents the features (also called predictors, independent or input variables). Then, think of a system of 1000 equations with 1000 unknowns 🤔 - solving those with the methods mentioned earlier would be time- and energy-consuming. This challenge makes room for **Matrices**, a key data structure in linear algebra that helps us handle much more data than just the 2 observations above. To represent the system above in matrix form, see below:
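As a minimal sketch of the matrix form, take an illustrative 2-unknown system such as x + y = 5 and 2x - y = 1 (the numbers here are just an example, not from the original post). It becomes **Ax = b**, where A holds the coefficients, x the unknowns, and b the right-hand sides, and a machine can solve it directly:

```python
import numpy as np

# Coefficient matrix A and right-hand-side vector b for:
#    x + y = 5
#   2x - y = 1
A = np.array([[1.0, 1.0],
              [2.0, -1.0]])
b = np.array([5.0, 1.0])

# Solve Ax = b; the same one-liner scales to 1000 equations with 1000 unknowns
solution = np.linalg.solve(A, b)
print(solution)  # [2. 3.]  i.e. x = 2, y = 3
```

This is exactly why matrices matter: the pen-and-paper elimination steps become a single, general-purpose linear solve.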

There are numerous examples of Linear Algebra in Machine Learning. To do Machine Learning (ML), data needs to be in a form that ML models or algorithms can ingest - most ML models take numeric inputs. Datasets usually contain a number of observations (rows) characterized by features (columns). The dataset is a **matrix**, with each column representing a **vector** - a collection of vectors is a matrix. Matrix operations such as addition, subtraction, multiplication, transposition etc. are applied all over Machine Learning!
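To make the dataset-as-matrix idea concrete, here is a tiny sketch with made-up numbers, showing a few of the matrix operations just mentioned:

```python
import numpy as np

# A toy "dataset": 3 observations (rows) x 2 features (columns)
X = np.array([[1, 2],
              [3, 4],
              [5, 6]])

# Each column of the dataset matrix is a feature vector
first_feature = X[:, 0]   # [1, 3, 5]

# Matrix operations used throughout ML:
X_shifted = X + 1         # elementwise addition
X_T = X.T                 # transpose: now 2 x 3
gram = X.T @ X            # matrix multiplication: a 2 x 2 matrix
print(gram)
```

The `X.T @ X` product, for instance, shows up later in this series when we get to the maths behind Linear Regression.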

Also, in this light, for an image classification problem, the inputs to the neural network are **tensors**. For a classic Artificial Neural Network (ANN), a batch of grayscale images is a 3D tensor: the 1st dimension is the index of the image, and the other 2 dimensions give the dimensions of the array that contains the image's pixels. To feed such images into an ANN, this 3D tensor is flattened into a 2D tensor, where the 1st dimension tells which row an image corresponds to and the 2nd dimension is a single vector containing that image's pixels. Images are represented as tensors so that computers can process them, and guess what? **A tensor is just a generalization of vectors and matrices to (potentially) higher dimensions**!
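Here is a sketch of that flattening step. The batch and image sizes below are assumptions just for illustration (10 images of 28 x 28 pixels, a common grayscale setup):

```python
import numpy as np

# A pretend batch of 10 grayscale images, each 28 x 28 pixels:
# a 3D tensor with shape (image index, height, width)
images = np.zeros((10, 28, 28))

# Flatten to a 2D tensor: one row per image,
# each row a single vector of that image's 784 pixels
flat = images.reshape(10, 28 * 28)
print(flat.shape)  # (10, 784)
```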

Furthermore, depending on the prediction task, when we do some data preprocessing using the popular one-hot encoding technique to convert categorical variables, we get a **binary vector** - a data format ML algorithms can actually be trained on, in order to give better predictions.
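A minimal one-hot encoding sketch in plain Python (the category names here are made up for illustration):

```python
# A hypothetical categorical feature
colors = ["red", "green", "blue", "green"]

# Map each category to a binary vector with a single 1
categories = sorted(set(colors))   # ['blue', 'green', 'red']
one_hot = [[1 if cat == c else 0 for cat in categories] for c in colors]

for c, vec in zip(colors, one_hot):
    print(c, vec)
# red   [0, 0, 1]
# green [0, 1, 0]
# blue  [1, 0, 0]
# green [0, 1, 0]
```

Each categorical value becomes a vector, and the whole encoded column becomes - you guessed it - a matrix.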

So, it has lots and lots of applications. See some examples below:

- Want to implement **Principal Component Analysis (PCA)**, a popular dimensionality reduction technique to avert the curse of dimensionality? Linear Algebra is used here!
- What about **Word Embeddings**, from the field of Natural Language Processing (NLP), which seems like the hottest field in Machine Learning right now? Linear Algebra is applied here too! 😎
- Did you just say **Optimizing Deep Learning Models**? Still Linear Algebra!
- Even in **Computer Vision**, **Encoding Data** as I briefly explained above, and lots more that I have not even mentioned, Linear Algebra shows itself! Yeah! We are so stuck with it! 😂

Now! Will you keep running away from Math or will you rather just embrace it whole-heartedly? 😄

So, I have just shed some light on how Linear Algebra is used in Machine Learning. Stay tuned to this series, where we start dissecting one application at a time and go a little deeper, starting with **The Maths behind Linear Regression**, in order to understand its ins and outs. Have an amazing week ahead!

## Discussion

Averting the curse of dimensionality! I couldn't have said it better.

Another thing to note is that any graph can be represented by a matrix. Which ones *should* be represented that way in a computing problem is another discussion, but all can. And graphs are a big thing in computer science, so there you have another major use case for matrices.

Yea! Thanks, @n8chz
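To make the graph-as-matrix point concrete, here is a hedged sketch with a made-up 3-node graph, using its adjacency matrix:

```python
import numpy as np

# Undirected graph on 3 nodes, with edges 0-1 and 1-2:
# entry (i, j) is 1 if nodes i and j are connected
adjacency = np.array([[0, 1, 0],
                      [1, 0, 1],
                      [0, 1, 0]])

# A neat linear-algebra fact: entry (i, j) of A @ A counts
# the walks of length 2 from node i to node j
walks_of_len_2 = adjacency @ adjacency
print(walks_of_len_2[0, 2])  # 1: the single walk 0 -> 1 -> 2
```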

A very nice article. Waiting eagerly for the other blogs in the series.

Haha, sure! Glad you like it! @amananandrai