Salvietta150x40

Posted on • Originally published at ma-no.org

How to Multiply Matrices in JavaScript

It may seem strange to want to know how to multiply matrices in JavaScript, but the operation turns out to be genuinely useful, for example when handling coordinates as matrices, which we will see in a following article.

But let's go step by step, and the first thing to do is to create a matrix in JavaScript. A matrix is simply an array in which each position is itself an array of elements, for example numbers. In this way we can initialize two matrices in JavaScript as follows:

let m1 = [[1, 2, 3], [4, 5, 6]];
let m2 = [[7, 8], [9, 10], [11, 12]];


To understand what the rows and columns of the matrix are, keep in mind that the elements of the outer array are the rows, and the elements of each inner array are the columns. The two arrays we have instantiated in these lines of code therefore correspond to the matrices:

Matrix 2x3
| 1   2   3 |
| 4   5   6 |

Matrix 3x2
|  7   8 |
|  9  10 |
| 11  12 |
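
With this convention, m1[f][c] is the element at row f+1, column c+1, since indices start at 0. A quick check with the matrices above:

console.log(m1[1][2]); // 6  -> row 2, column 3 of the 2x3 matrix
console.log(m2[2][0]); // 11 -> row 3, column 1 of the 3x2 matrix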


If we want to know how many rows and columns the matrix has, we can calculate it in the following way:

// fil = number of rows, col = number of columns
let fil_m1 = m1.length;
let col_m1 = m1[0].length;
let fil_m2 = m2.length;
let col_m2 = m2[0].length;


We can see that the number of rows is obtained through the .length property of the outer array, and the number of columns by reading the .length property of its first element, that is, of the first row.
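
For the two example matrices this gives:

console.log(fil_m1, col_m1); // 2 3
console.log(fil_m2, col_m2); // 3 2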

This is very important because, in order to multiply two matrices, the number of columns of the first matrix must equal the number of rows of the second matrix. We perform this check in the following way:

if (col_m1 != fil_m2)
  throw new Error("Matrices cannot be multiplied");


The next step is to create the matrix for the result. It will have as many rows as matrix 1 and as many columns as matrix 2, so we create it as follows:

let multiplication = new Array(fil_m1);
for (let x = 0; x < multiplication.length; x++)
  multiplication[x] = new Array(col_m2).fill(0);


We first create an array with one slot per row, and then for each slot we create a new array for that row. We rely on the .fill() method, which lets us fill the array with a value; in this case we initialize every element to 0.
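
A side note: .fill() is safe here because we fill with a number, which is copied by value. Filling with an array instead would put the same reference in every row. A minimal sketch of the pitfall, plus an equivalent one-liner using Array.from:

// Pitfall: every slot points to the SAME inner array
let wrong = new Array(2).fill(new Array(3).fill(0));
wrong[0][0] = 99;
console.log(wrong[1][0]); // 99, because both rows are the same array

// Equivalent to the loop above: a fresh row per slot
let safe = Array.from({ length: fil_m1 }, () => new Array(col_m2).fill(0));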

The next step is to perform the multiplication itself. To do this we traverse the result matrix and, at each position x,y, we assign the sum of the products of row x of the first matrix with column y of the second matrix; in code terms, multiplication[x][y] = m1[x][0]*m2[0][y] + m1[x][1]*m2[1][y] + ... up to the last column of m1.

The scheme would be as follows:

(Diagram: matrix multiplication scheme)

And the code that implements it would be:

for (let x = 0; x < multiplication.length; x++) {
  for (let y = 0; y < multiplication[x].length; y++) {
    for (let z = 0; z < col_m1; z++) {
      multiplication[x][y] += m1[x][z] * m2[z][y];
    }
  }
}
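
To see the loop at work on a single cell, take x = 0 and y = 0 with our example matrices; the inner z loop accumulates:

multiplication[0][0] = m1[0][0]*m2[0][0] + m1[0][1]*m2[1][0] + m1[0][2]*m2[2][0]
                     = 1*7 + 2*9 + 3*11
                     = 58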


With this, the multiplication is complete and the result is stored in the multiplication matrix: we have managed to multiply matrices in JavaScript.
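
Putting all the steps together, here is a complete sketch; the function name multiplyMatrices is my own choice, not something prescribed by the snippets above:

function multiplyMatrices(m1, m2) {
  // fil = rows, col = columns
  let fil_m1 = m1.length;
  let col_m1 = m1[0].length;
  let fil_m2 = m2.length;
  let col_m2 = m2[0].length;

  // Columns of the first matrix must match rows of the second
  if (col_m1 != fil_m2)
    throw new Error("Matrices cannot be multiplied");

  // Result matrix: fil_m1 rows x col_m2 columns, initialized to 0
  let multiplication = new Array(fil_m1);
  for (let x = 0; x < multiplication.length; x++)
    multiplication[x] = new Array(col_m2).fill(0);

  // Each cell is the dot product of a row of m1 and a column of m2
  for (let x = 0; x < multiplication.length; x++)
    for (let y = 0; y < multiplication[x].length; y++)
      for (let z = 0; z < col_m1; z++)
        multiplication[x][y] += m1[x][z] * m2[z][y];

  return multiplication;
}

console.log(multiplyMatrices([[1, 2, 3], [4, 5, 6]], [[7, 8], [9, 10], [11, 12]]));
// [[58, 64], [139, 154]]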
