Honestly, this is because I forgot that the book shouldn't need a bachelor's in mathematics to read :(
A more mathematical proof
So, remember the long breakdown I did before? It turns out the author has a fancier (and probably the correct) way of showing why there's an inverse in our formula beforehand.
We are going to look at
$$\tilde{A}_\Phi = T^{-1} A_\Phi S \in \mathbb{R}^{m \times n}$$
With:
1. $S \in \mathbb{R}^{n \times n}$ being the transformation matrix of $\mathrm{id}_V$ that maps coordinates with respect to $\tilde{B}$ onto coordinates with respect to $B$
and
2. $T \in \mathbb{R}^{m \times m}$ being the transformation matrix of $\mathrm{id}_W$ that maps coordinates with respect to $\tilde{C}$ onto coordinates with respect to $C$
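Before unpacking the definitions, here's a tiny numerical sketch of what $S$ does. The basis vectors below are made-up values of my own, not from the book: if we store the vectors of $B$ and $\tilde{B}$ as columns of matrices, then $S$ is exactly the matrix that converts $\tilde{B}$-coordinates into $B$-coordinates.

```python
import numpy as np

# Made-up bases of V = R^2, stored as matrix columns (not from the book)
B       = np.array([[1.0, 1.0],
                    [0.0, 1.0]])
B_tilde = np.array([[2.0, 1.0],
                    [1.0, 1.0]])

# Column j of S holds the B-coordinates of the j-th vector of B~,
# so S maps coordinates w.r.t. B~ onto coordinates w.r.t. B.
S = np.linalg.solve(B, B_tilde)

# The same vector of V, described in both coordinate systems:
x_tilde = np.array([3.0, -1.0])   # coordinates w.r.t. B~
x = S @ x_tilde                   # coordinates w.r.t. B
assert np.allclose(B @ x, B_tilde @ x_tilde)  # same underlying vector
```

The same construction with $C$ and $\tilde{C}$ gives $T$.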
What does this mean?
We'll get to that, before that though, remember this?
$$\tilde{b}_j = s_{1j} b_1 + \dots + s_{nj} b_n = \sum_{i=1}^{n} s_{ij} b_i, \quad j = 1, \dots, n$$

With the left-hand side of the equation, $\tilde{b}_j$, being the vectors of the new basis $\tilde{B}$ (the one with a weird eyebrow) of $V$.
And,
$$\tilde{c}_k = t_{1k} c_1 + \dots + t_{mk} c_m = \sum_{l=1}^{m} t_{lk} c_l, \quad k = 1, \dots, m$$
With the left-hand side of the equation, $\tilde{c}_k$, being the vectors of the new basis $\tilde{C}$ (also with a weird eyebrow) of $W$.
Finally
$$S = ((s_{ij})) \in \mathbb{R}^{n \times n} \quad \text{and} \quad T = ((t_{lk})) \in \mathbb{R}^{m \times m}$$

being the transformation matrix that maps coordinates with respect to $\tilde{B}$ (eyebrow) onto coordinates with respect to $B$, and the transformation matrix that maps coordinates with respect to $\tilde{C}$ (eyebrow) onto coordinates with respect to $C$, respectively.
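With $S$ and $T$ both pinned down, we can sanity-check the whole theorem numerically. This is a sketch under my own made-up setup, not the book's: $\Phi : \mathbb{R}^3 \to \mathbb{R}^2$ is given by a matrix $M$ in standard coordinates, so $A_\Phi = C^{-1} M B$, $\tilde{A}_\Phi = \tilde{C}^{-1} M \tilde{B}$, $S = B^{-1}\tilde{B}$ and $T = C^{-1}\tilde{C}$.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2  # dim V = 3, dim W = 2 (made-up choice)

# Random bases, columns = basis vectors (invertible with probability 1)
B, B_t = rng.normal(size=(n, n)), rng.normal(size=(n, n))
C, C_t = rng.normal(size=(m, m)), rng.normal(size=(m, m))
M = rng.normal(size=(m, n))            # Phi in standard coordinates

A   = np.linalg.solve(C, M @ B)        # A_Phi  w.r.t. bases B and C
A_t = np.linalg.solve(C_t, M @ B_t)    # A~_Phi w.r.t. bases B~ and C~

S = np.linalg.solve(B, B_t)            # B~-coordinates -> B-coordinates
T = np.linalg.solve(C, C_t)            # C~-coordinates -> C-coordinates

# The theorem: A~_Phi = T^{-1} A_Phi S
assert np.allclose(A_t, np.linalg.solve(T, A @ S))
```

Note that $S$ is $3 \times 3$ and $T$ is $2 \times 2$, which is exactly why the two basis changes have to stay separate when $V \neq W$.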
Now, let's find out about the first equation
$$\Phi(\tilde{b}_j) = \sum_{k=1}^{m} \tilde{a}_{kj} \tilde{c}_k$$
Note:
The first half of the equation above is what's stated in the book, but since this is the transformation applied to the new basis of $V$, the result has to be expressed in terms of $\tilde{C}$, the (changed) basis of the codomain $W$.
But Terra, does that mean the method in the last blog is wrong?
A simple answer is no, it still holds, since it works :D
also because the map in that example was an endomorphism (a homomorphism from a space to itself), making V and W the same.
What does that mean?
It means that $b_j$ and $c_k$ are the same, making $T$ and $S$ the same. The use of $P$ is just arbitrary notation: since $S = T$, we can call that single matrix $P$ and get the same result.
What's clear is that if the map isn't an endomorphism, then the formula to use is this one, which keeps the two basis changes separate so that it still works even when the matrix isn't square.
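To tie this back to the previous post, here's the endomorphism special case in the same made-up numpy setup: with $V = W$, $B = C$ and $\tilde{B} = \tilde{C}$, we get $S = T$, and calling that single matrix $P$ turns the theorem into the familiar similarity transform $\tilde{A} = P^{-1} A P$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3  # endomorphism: V = W, so a single dimension (made-up value)

B, B_t = rng.normal(size=(n, n)), rng.normal(size=(n, n))  # old and new basis of V
M = rng.normal(size=(n, n))             # Phi: V -> V in standard coordinates

A   = np.linalg.solve(B, M @ B)         # A_Phi, same basis on both sides
A_t = np.linalg.solve(B_t, M @ B_t)     # A~_Phi w.r.t. the new basis
P   = np.linalg.solve(B, B_t)           # here S = T = P

# The general formula collapses to the similarity transform A~ = P^{-1} A P
assert np.allclose(A_t, np.linalg.solve(P, A @ P))
```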
Acknowledgement
I can't overstate this: I'm truly grateful for this book being open-sourced for everyone. Many people will be able to learn and understand machine learning on a fundamental level. Whether changing careers, demystifying AI, or just learning in general, this book offers immense value even for a fledgling composer such as myself. So, Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong, thank you for this book.
Source:
Axler, S. (2015). Linear Algebra Done Right. Springer.
Deisenroth, M. P., Faisal, A. A., & Ong, C. S. (2020). Mathematics for Machine Learning. Cambridge University Press. https://mml-book.com