Another day, another question at the bottom.
As always, a question means I stumbled upon something that differs from the book; this time, though, it isn't the solution itself but the way to get there. Still, since I've read to the end of the section, this might be the approach I keep using until proven otherwise.
I'll break down the steps with so much scrutiny that even my mental state will join the fun... the mental-breakdown type of fun :D. So, let's begin.
Transformation matrix
The transformation matrix is much like the last topic: where the previous day was about vectors, this one is about matrices.
So why do you need this at all if it's just going from vectors to matrices?
To answer that, I'll explain the method that works in the book's example, and then explain what's really going on underneath.
Explanation from the book
Consider vector spaces V and W with corresponding ordered bases B = (b_1, ..., b_n) of V and C = (c_1, ..., c_m) of W.
We're also considering a linear mapping Φ: V → W.
For each basis vector b_j, this gives the following unique representation of its image: Φ(b_j) = α_{1j} c_1 + ... + α_{mj} c_m.
As I've said, this is the unique representation with respect to C. Then we call the m×n matrix A_Φ, whose elements are given by A_Φ(i, j) = α_{ij},
the transformation matrix of Φ with respect to the ordered bases B of V and C of W.
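Putting that together in one picture (my own rendering of the definition, so treat the notation as mine): the j-th column of the transformation matrix is just the coordinate vector of Φ(b_j) with respect to C.

$$
A_\Phi =
\begin{pmatrix}
\alpha_{11} & \alpha_{12} & \cdots & \alpha_{1n} \\
\alpha_{21} & \alpha_{22} & \cdots & \alpha_{2n} \\
\vdots      &             &        & \vdots      \\
\alpha_{m1} & \alpha_{m2} & \cdots & \alpha_{mn}
\end{pmatrix},
\qquad
\text{column } j = \text{coordinates of } \Phi(b_j) \text{ with respect to } C.
$$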
Example
Consider a homomorphism Φ: V → W with an ordered basis B of V and an ordered basis C of W.
The book writes out each image Φ(b_j) as a linear combination of the basis vectors in C.
The transformation matrix A_Φ with respect to B and C then satisfies Φ(b_j) = α_{1j} c_1 + ... + α_{mj} c_m for every j,
and is given by collecting those coefficients into columns, as in the small example below.
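Here's a small example of my own that follows the same pattern (these are not the book's numbers, just an illustration). Take B = (b_1, b_2), C = (c_1, c_2, c_3), and suppose

$$
\begin{aligned}
\Phi(b_1) &= 2c_1 + 1c_2 + 0c_3 \\
\Phi(b_2) &= -1c_1 + 3c_2 + 5c_3
\end{aligned}
\qquad\Longrightarrow\qquad
A_\Phi =
\begin{pmatrix}
2 & -1 \\
1 & 3 \\
0 & 5
\end{pmatrix}.
$$

Each Φ(b_j)'s coefficients run straight down the j-th column of A_Φ.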
You get all that?
Honestly, I sure don't. So let's try going into even more depth than the book intended.
So in order to build the transformation matrix, we need to express the results we want (the images of the basis vectors) in the target basis, which is why the coefficients are with respect to C instead of B.
If we lay them out in a transposed-vector kind of way, it'll look something like the sketch below.
Then just remove the basis vectors and you have the same matrix as before.
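This is how I picture that layout (my notation, so take it as a sketch): each Φ(b_j) is a row of basis vectors times a column of coefficients.

$$
\Phi(b_j) =
(c_1, c_2, \dots, c_m)
\begin{pmatrix}
\alpha_{1j} \\ \alpha_{2j} \\ \vdots \\ \alpha_{mj}
\end{pmatrix}
$$

Drop the row of c's and what's left is exactly the j-th column of A_Φ; stack those columns side by side and you're back at the matrix from the definition.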
But Terra, I get how it's made now, but how do you use it?
So say the vector we're trying to end up with is w (it lives in the target space W), and there are three variables, so we can write the whole thing as shown below. Remember, A is the transformation matrix and v is the vector from beforehand.
Then we expand the formula component by component:
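Something like this, using a_ij for the entries of A (my own generic notation; strictly speaking, v and w are the coordinate vectors with respect to B and C):

$$
w = A v
\qquad\Longleftrightarrow\qquad
\begin{aligned}
w_1 &= a_{11} v_1 + a_{12} v_2 + a_{13} v_3 \\
w_2 &= a_{21} v_1 + a_{22} v_2 + a_{23} v_3 \\
w_3 &= a_{31} v_1 + a_{32} v_2 + a_{33} v_3
\end{aligned}
$$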
And that's it: just multiply the vector you had before by the transformation you want and you're all set!
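To convince myself this actually works, here's a tiny NumPy sketch using the same toy numbers as my example above (again, my own numbers, not the book's):

```python
import numpy as np

# V = R^2, W = R^3, with the standard bases for simplicity.
# The mapping sends:
#   b1 -> 2*c1 + 1*c2 + 0*c3
#   b2 -> -1*c1 + 3*c2 + 5*c3
# The coordinate vectors of those images become the COLUMNS of A.
A = np.array([
    [2.0, -1.0],
    [1.0,  3.0],
    [0.0,  5.0],
])

# A vector v given by its coordinates with respect to B.
v = np.array([4.0, 2.0])

# Multiplying by A gives the coordinates of the mapped vector with respect to C.
w = A @ v
print(w)  # [ 6. 10. 10.]  ->  6*c1 + 10*c2 + 10*c3
```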
A question.
I haven't found the answer and probably never will considering it took me way longer than I imagined. So I'll just leave it here :D
The question is about the equation for finding the transformation matrix: Φ(b_j) = α_{1j} c_1 + ... + α_{mj} c_m.
The solution I came up with, and which hasn't been disproven yet (a.k.a. not scientifically proven):
Just... transpose it.
Here's the formula again: Φ(b_j) = α_{1j} c_1 + ... + α_{mj} c_m for each j.
Split it into one equation per basis vector and remove the constants (the c_i).
Notice that, compared with the result before, each row of coefficients corresponds to a column of the matrix?
So change the rows into columns and vice versa! I've sketched what I mean below.
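Roughly, this is the picture in my head (my own layout, so take it as a sketch):

$$
\begin{aligned}
\Phi(b_1) &= \alpha_{11} c_1 + \alpha_{21} c_2 + \dots + \alpha_{m1} c_m \\
\Phi(b_2) &= \alpha_{12} c_1 + \alpha_{22} c_2 + \dots + \alpha_{m2} c_m \\
&\;\;\vdots \\
\Phi(b_n) &= \alpha_{1n} c_1 + \alpha_{2n} c_2 + \dots + \alpha_{mn} c_m
\end{aligned}
$$

Reading the coefficients of each equation left to right gives a table with one row per b_j; the matrix A_Φ from the definition puts those same coefficients into its j-th column instead, so the two layouts are transposes of each other.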
Seriously, what even is math?? What am I doing wrong? I know the authors of this Cambridge book are much smarter than I am, so they must have anticipated that someone would think this is just a transposed version. So why? :(
Acknowledgement
I can't overstate this: I'm truly grateful that this book has been open-sourced for everyone. Many people will be able to learn and understand machine learning at a fundamental level. Whether changing careers, demystifying AI, or just learning in general, this book offers immense value even for a fledgling composer such as myself. So, Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong, thank you for this book.
Source:
Axler, S. (2015). Linear Algebra Done Right. Springer.
Deisenroth, M. P., Faisal, A. A., & Ong, C. S. (2020). Mathematics for Machine Learning. Cambridge University Press. https://mml-book.com