<aside> 💡 Recommended watching: https://www.3blue1brown.com/topics/linear-algebra

</aside>

Vectors, matrices, and linear transformations

Elementary vector operations

A geometric view of vector addition and subtraction:


When subtracting vectors, most of the time we do not care where the origin is. Given $u,v \in \R^d$, we can think of the new vector $v-u$ as the arrow with $u$ and $v$ as its endpoints, pointing from $u$ toward $v$:

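This picture can be checked numerically. A minimal sketch using NumPy (the notes contain no code, so the library choice here is an assumption):

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, 1.0])

# v - u is the arrow from the endpoint of u to the endpoint of v:
# adding it back onto u lands exactly on v.
diff = v - u
assert np.allclose(u + diff, v)
```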

The dot product and Euclidean norm

<aside> 💡 Dot product. For $x, y\in \R^d$, define

$$ \langle x, y \rangle := x \cdot y := x^\top y = \sum_{i=1}^d x_i y_i. $$

</aside>
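The three notations in the definition all denote the same scalar, which a quick NumPy sketch makes concrete (an illustrative example, not part of the original notes):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -1.0, 2.0])

# Three equivalent ways to compute <x, y> = sum_i x_i y_i:
s1 = np.dot(x, y)      # dot product
s2 = x @ y             # matrix-style product x^T y
s3 = np.sum(x * y)     # explicit elementwise sum
assert s1 == s2 == s3  # 1*4 + 2*(-1) + 3*2 = 8
```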

<aside> 💡 Euclidean norm (also known as the $l_2$ norm). For $x\in \R^d$, define

$$ \| x \| := \| x \|_2 := \sqrt{x^\top x} = \sqrt{\sum_{i=1}^d x_i^2}. $$

</aside>
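The equality $\|x\| = \sqrt{x^\top x}$ can be verified directly; a small sketch, again assuming NumPy for illustration:

```python
import numpy as np

x = np.array([3.0, 4.0])

# ||x|| = sqrt(x^T x) = sqrt(sum_i x_i^2); here sqrt(9 + 16) = 5.
assert np.isclose(np.linalg.norm(x), np.sqrt(x @ x))
assert np.isclose(np.linalg.norm(x), 5.0)
```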

Our convention is that vectors are column vectors; thus $u^\top\in \R^{1 \times d}$ can be understood as a matrix, and the dot product above can be interpreted as a linear transformation:

$$ \begin{align*} u^\top(\cdot) &= \langle u,\cdot\rangle \colon \R^d \to \R \end{align*} $$

This linear transformation induced by $u$ returns the signed length of the projection of its input onto the one-dimensional subspace spanned by $u$, scaled by $\|u\|$. When $u$ has unit length, we get the actual signed projection length:

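To see the projection interpretation in action, here is a short NumPy sketch (an illustration under the unit-length assumption, not part of the original notes):

```python
import numpy as np

u = np.array([1.0, 1.0]) / np.sqrt(2.0)  # unit vector spanning a line
x = np.array([3.0, 1.0])

signed_len = u @ x        # the linear map u^T(.) applied to x
proj = signed_len * u     # orthogonal projection of x onto span(u)

# The residual x - proj is orthogonal to u, confirming that proj is
# the actual projection when u has unit length.
assert abs(u @ (x - proj)) < 1e-12
```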

When the angle between two vectors is acute, their dot product is positive; when it is obtuse, the dot product is negative. (More precisely, $a^\top b = \|a\|\|b\| \cos \theta$, so for unit vectors the dot product equals the cosine of the angle between them.)

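The sign behavior and the cosine formula can be checked numerically; a minimal sketch, assuming NumPy:

```python
import numpy as np

a = np.array([1.0, 0.0])
b_acute = np.array([1.0, 1.0])    # 45 degrees from a
b_obtuse = np.array([-1.0, 1.0])  # 135 degrees from a

assert a @ b_acute > 0   # acute angle  => positive dot product
assert a @ b_obtuse < 0  # obtuse angle => negative dot product

# a^T b = ||a|| ||b|| cos(theta), so we can recover cos(theta):
cos_theta = (a @ b_acute) / (np.linalg.norm(a) * np.linalg.norm(b_acute))
assert np.isclose(cos_theta, np.cos(np.pi / 4))
```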

Matrix-matrix product

Let

$$ \begin{align*} \R^{m\times d} \ni A &= \begin{pmatrix} - & a_1^\top & - \\ & \vdots & \\ - & a_m^\top & - \end{pmatrix} = \begin{pmatrix} | & & | \\ A_1& \cdots & A_d \\ | & & | \end{pmatrix} \\ \R^{d\times n} \ni B &= \begin{pmatrix} - & b_1^\top & - \\ & \vdots & \\ - & b_d^\top & - \end{pmatrix} = \begin{pmatrix} | & & | \\ B_1& \cdots & B_n \\ | & & | \end{pmatrix} \end{align*} $$

Then the matrix $AB \in \R^{m\times n}$ is given by: