Terra

Posted on • Edited on • Originally published at pourterra.com

Mathematics for Machine Learning - Day 9

Always has been vector meme

Get it? Because they're in space, but if they're on Earth the vector (in) space disappears and only 0 remains :D

Vector Space

With your knowledge of groups, we can start to discuss vector spaces! And if you read till the end, the mathematical notation of vector spaces will be a breeze!

What's a vector space?

With the four conditions from groups holding true, we can start to add operations outside of the group, such as:

\text{The multiplication of a vector } x \in \mathscr{G} \text{ by a scalar } \lambda \in \reals

A real valued vector space:

\nu = (\nu, +, \bullet) \text{ is a set } \nu \text{ with two operations}

Those operations are:

(+ : \nu \times \nu \to \nu) \\ (\bullet : \reals \times \nu \to \nu)

Side note: The brackets don't mean anything; the + sign doesn't work at the beginning of a KaTeX line.

What does that mean?

Good question, much like my mental health, let's have a breakdown!

Abelian Group

\nu = (\nu, +) \text{ is an Abelian group}

An Abelian group is the same as a normal group but with an additional condition:

\forall x,y \in \mathscr{G} : x \bigotimes y = y \bigotimes x

Then

G = (\mathscr{G}, \bigotimes)

In the book, there are a lot of examples to ensure we have a good understanding of what Abelian groups and groups in general are, but I won't go too deep into them here. I feel that as long as we understand the conditions for a group and the additional one for Abelian groups, we're good to go.

For example:

(\natnums_0, +)

is not a group. Although the sum of any two natural numbers is still a natural number (closure holds), and the inclusion of 0 (which is why there's a subscript 0) means there's a neutral element,

the "group" lacks inverse elements: under addition, that means it lacks negative numbers.
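A quick sketch of this failure (the helper name and the finite search range are my own choices, since we can't enumerate all of ℕ₀):

```python
# Probe whether an element has an additive inverse within a (finite slice of)
# the natural numbers with 0. The neutral element is 0.
def has_inverse(x, elements, op, neutral):
    """Return True if some e in elements satisfies op(x, e) == neutral."""
    return any(op(x, e) == neutral for e in elements)

naturals = range(0, 100)  # finite stand-in for N_0
add = lambda a, b: a + b

print(has_inverse(0, naturals, add, 0))  # True: 0 + 0 = 0
print(has_inverse(3, naturals, add, 0))  # False: 3 + n > 0 for every n >= 0
```

No matter how far we extend the range, 3 never finds a partner that sums to 0, so (ℕ₀, +) can't be a group.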

Distributivity

Yeah, yeah, this is the third time I've mentioned distributivity so I'll just show the formula and continue on.

\text{A. } \forall \lambda \in \reals, x,y \in \nu : \lambda(x+y) = \lambda x + \lambda y \\
\text{B. } \forall \lambda, \psi \in \reals, x \in \nu : (\lambda + \psi)x = \lambda x + \psi x
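Both laws are easy to spot-check numerically in ℝ³ (the scalars and vectors below are arbitrary choices; `np.allclose` handles floating-point comparison):

```python
import numpy as np

lam, psi = 2.5, -1.5
x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, 5.0, 6.0])

# A. lambda * (x + y) == lambda * x + lambda * y
print(np.allclose(lam * (x + y), lam * x + lam * y))   # True

# B. (lambda + psi) * x == lambda * x + psi * x
print(np.allclose((lam + psi) * x, lam * x + psi * x))  # True
```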

Associativity

Ditto (I really hope you're not skipping chapters; many of these conditions repeat, and they're easier to pick up starting from vectors than jumping in here straight away)

\forall \lambda, \psi \in \reals, x \in \nu : \lambda(\psi x) = (\lambda \psi)x

Neutral elements

... Hi.

\forall x \in \nu : 1 \bullet x = x
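The same kind of numerical spot-check works for associativity of scalar multiplication and the neutral scalar 1 (again with arbitrary sample values):

```python
import numpy as np

lam, psi = 3.0, -0.5
x = np.array([1.0, -2.0, 4.0])

# Associativity: lambda * (psi * x) == (lambda * psi) * x
print(np.allclose(lam * (psi * x), (lam * psi) * x))  # True

# Neutral element of scalar multiplication: 1 * x == x
print(np.array_equal(1 * x, x))                       # True
```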

A few remarks

\text{The elements } x \in \nu \text{ are called vectors} \\ \reals^n, \reals^{n \times 1}, \reals^{1 \times n} \text{ are only different ways vectors can be written}

The only difference between these notations is that Rn is for column vectors, so it's the same as Rnx1, while R1xn is for row vectors.

\reals^n, \reals^{n \times 1} = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix} \qquad \reals^{1 \times n} = (x_1, x_2, \dots, x_n)
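In NumPy the three notations map onto three array shapes for the same data (n = 3 in this sketch):

```python
import numpy as np

v = np.array([1, 2, 3])     # R^n: plain 1-D array, shape (3,)
col = v.reshape(-1, 1)      # R^{n x 1}: column vector, shape (3, 1)
row = v.reshape(1, -1)      # R^{1 x n}: row vector, shape (1, 3)

print(v.shape, col.shape, row.shape)  # (3,) (3, 1) (1, 3)
```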

Vector Subspaces.

The authors reminded us of the importance of vector subspaces, as they will be revisited later on (i.e. Chapter 10, Dimensionality Reduction).

Let:

\nu = (\nu, +, \bullet)

be a vector space, and

\upsilon \subseteq \nu, \upsilon \not = \empty

Then:

\upsilon = (\upsilon, +, \bullet) \text{ is a vector subspace of } \nu \text{ (or linear subspace)}

Subsets

To determine:

\upsilon \subseteq \nu, \upsilon \not = \empty

The subset needs to inherit many properties from the vector space.

So what?

\text{(1.) } \upsilon \not = \empty \\ \text{(2.) } \forall \lambda \in \reals, \forall x \in \upsilon : \lambda x \in \upsilon \text{ (Outer operation)} \\ \text{(3.) } \forall x,y \in \upsilon : x + y \in \upsilon \text{ (Inner operation)}

Points two and three are the closure conditions from V.
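Here's a sketch of testing those three conditions on two subsets of ℝ², each given by a membership predicate (the set names `U1`/`U2` and the sample points are my own invention): a line through the origin, and the same line shifted up by 1.

```python
import numpy as np

# U1 = {(t, 2t)}     - a line through the origin
# U2 = {(t, 2t + 1)} - the same line shifted up by 1
in_u1 = lambda p: np.isclose(p[1], 2 * p[0])
in_u2 = lambda p: np.isclose(p[1], 2 * p[0] + 1)

def looks_like_subspace(member, p, q, lam=2.5):
    """Check the three conditions at two sample points p, q of the set."""
    assert member(p) and member(q), "samples must lie in the set"
    contains_zero = member(np.zeros(2))  # (1) non-empty: contains 0
    outer = member(lam * p)              # (2) scalar multiples stay inside
    inner = member(p + q)                # (3) sums stay inside
    return bool(contains_zero and outer and inner)

p1, q1 = np.array([1.0, 2.0]), np.array([-3.0, -6.0])  # on the origin line
p2, q2 = np.array([0.0, 1.0]), np.array([1.0, 3.0])    # on the shifted line
print(looks_like_subspace(in_u1, p1, q1))  # True
print(looks_like_subspace(in_u2, p2, q2))  # False: no 0, and closure fails
```

Of course, a few numerical samples can only falsify the conditions, not prove them; the shifted line fails immediately because it misses the zero vector.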

Example

Vector subspace example

  1. For every vector space, the trivial subspaces are the vector space itself and {0}.
  2. Only example D is a subspace of R² (with the usual inner/outer operations). In A and C, the closure property is violated; B doesn't contain 0.
  3. The solution set of a homogeneous system of linear equations Ax = 0 with n unknowns x = [x1, x2, ..., xn] transposed is a subspace of Rn.
  4. The solution set of an inhomogeneous system of linear equations Ax = b, b ≠ 0, is not a subspace of Rn.
  5. The intersection of arbitrarily many subspaces is a subspace itself.

If you're confused, so am I, let me try to break it down.

  1. Every vector space has at least two subspaces: the vector space itself and {0}.
  2. A violates the closure condition, since we can use addition to find elements outside of the box.
  3. B violates the neutral element condition, since it doesn't contain (0,0).
  4. C violates the closure condition as well, since using the operations we can find elements outside the weird shape.
  5. D fulfills all conditions: no violation of associativity, distributivity, neutral element, or inverse element.
  6. Ax = 0 is a homogeneous system. This is a subspace.

Why? Because it fulfills the neutral element condition: x = 0 always solves Ax = 0, and (just like example D) the solution set doesn't violate any of the other conditions.

  7. Ax = b with b ≠ 0 is an inhomogeneous system. This isn't a subspace.
  8. Let's say we use examples A and C, assuming both of them are subsets of the same vector space. There are areas where example A overlaps with example C, and where subspaces overlap/intersect (intersect is for points), the result will also be a subspace.
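The homogeneous vs. inhomogeneous distinction can be checked numerically too. Below, `A` and `b` are a made-up one-equation system in three unknowns; the decisive test is whether the zero vector is a solution:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0]])  # one equation, three unknowns
b = np.array([6.0])
zero = np.zeros(3)

print(np.allclose(A @ zero, 0))  # True: 0 solves Ax = 0, neutral element present
print(np.allclose(A @ zero, b))  # False: 0 never solves Ax = b when b != 0

# Closure also holds for Ax = 0: sums of solutions are solutions,
# since A(x1 + x2) = A x1 + A x2 = 0 + 0 = 0.
x1 = np.array([3.0, 0.0, -1.0])
x2 = np.array([-2.0, 1.0, 0.0])
print(np.allclose(A @ (x1 + x2), 0))  # True
```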

Ending Note

Honestly, yesterday's short summary was worth it. It was so much faster understanding this concept after fully understanding groups and sets. So we covered two topics today, vector spaces and their subspaces, hooray! :D


Acknowledgement

I can't overstate this: I'm truly grateful for this book being open-sourced for everyone. Many people will be able to learn and understand machine learning on a fundamental level. Whether changing careers, demystifying AI, or just learning in general, this book offers immense value even for a fledgling composer such as myself. So, Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong, thank you for this book.

Source:
Deisenroth, M. P., Faisal, A. A., & Ong, C. S. (2020). Mathematics for Machine Learning. Cambridge: Cambridge University Press.
https://mml-book.com
