
You might observe that sometimes the shapes collapse to a line, or sometimes the eigenvectors won't show due to them being complex. Let us look at why this happens in some detail.

A point $(x, y)$ transforms under $A$ to $(x',y')$ as follows

$$ \begin{bmatrix} x'\\ y' \end{bmatrix} = \underbrace{\begin{bmatrix} a_{00} & a_{01} \\ a_{10} & a_{11} \end{bmatrix}}_{A} \begin{bmatrix} x\\ y \end{bmatrix} $$

It is helpful to see where the unit basis vectors map to under the transformation.

$$ \begin{bmatrix} 1\\ 0 \end{bmatrix} \stackrel{A}{\longrightarrow} \begin{bmatrix} a_{00}\\ a_{10} \end{bmatrix} \qquad \begin{bmatrix} 0\\ 1 \end{bmatrix} \stackrel{A}{\longrightarrow} \begin{bmatrix} a_{01}\\ a_{11} \end{bmatrix} $$
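In other words, the columns of $A$ are exactly the images of the basis vectors. A minimal NumPy check, using a hypothetical example matrix:

```python
import numpy as np

# Hypothetical example matrix; any 2x2 A behaves the same way.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

# The image of each basis vector is the corresponding column of A.
print(A @ e1)  # first column:  [2. 0.]
print(A @ e2)  # second column: [1. 3.]
```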

Eigenvectors are precisely those vectors that satisfy $$ Av = \lambda v $$ They determine the directions that remain invariant under the transformation. The corresponding eigenvalue $\lambda$ determines how much space is stretched or squished along that direction. An eigenvalue of zero implies that the transformed space collapses in that direction. Since the determinant is the product of the eigenvalues, it quantifies how much the transformation scales areas.
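Both facts are easy to verify numerically. A short sketch with `numpy.linalg.eig`, on a hypothetical symmetric matrix:

```python
import numpy as np

# Hypothetical symmetric example; symmetric matrices have real eigenvalues.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals, eigvecs = np.linalg.eig(A)

# Each column of eigvecs satisfies A v = lambda v.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

# The determinant equals the product of the eigenvalues.
assert np.isclose(np.linalg.det(A), np.prod(eigvals))
```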

To obtain the eigenvalues, we solve the characteristic equation $\left| A - \lambda I \right| = 0$ which in our case expands to

$$ \lambda^2 -(a_{00}+a_{11})\lambda + (a_{00}a_{11} - a_{01}a_{10})=0 $$

When the matrix $A$ is singular, the transformed space collapses to a line (or a point). We can verify that $\lambda = 0$ solves the characteristic equation if and only if $\left|A\right| = a_{00}a_{11} - a_{01}a_{10} = 0$. Each of the following conditions also makes $A$ singular:

  • One row is a multiple of the other (trivially true when either row is identically zero); the same holds for the columns.
  • Any three (or more) of the entries are zero.
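The collapse is easy to see numerically: a singular matrix maps every point onto a single line through the origin. A small sketch with a hypothetical rank-1 matrix:

```python
import numpy as np

# A singular matrix: the second row is a multiple of the first.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

assert np.isclose(np.linalg.det(A), 0.0)

# lambda = 0 is an eigenvalue, as predicted by the characteristic equation.
eigvals = np.linalg.eigvals(A)
assert np.isclose(min(abs(eigvals)), 0.0)

# Transform a cloud of random points: all images lie on one line,
# so the matrix of transformed points has rank 1.
pts = np.random.default_rng(0).uniform(-1.0, 1.0, size=(2, 100))
out = A @ pts
assert np.linalg.matrix_rank(out) == 1
```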

Solving for $\lambda$, we get $$ \lambda = \frac{1}{2} \left(a_{00}+a_{11}\pm \sqrt{(a_{00}-a_{11})^2+4 a_{01} a_{10}}\right) $$
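This closed form can be checked against `numpy.linalg.eigvals`, here with hypothetical entries chosen so the discriminant is positive:

```python
import numpy as np

a00, a01, a10, a11 = 2.0, 1.0, 4.0, 3.0   # hypothetical matrix entries
D = (a00 - a11) ** 2 + 4 * a01 * a10       # discriminant (here D > 0)

lam_plus  = 0.5 * (a00 + a11 + np.sqrt(D))
lam_minus = 0.5 * (a00 + a11 - np.sqrt(D))

A = np.array([[a00, a01],
              [a10, a11]])

# The quadratic-formula roots match the numerically computed eigenvalues.
assert np.allclose(sorted([lam_minus, lam_plus]),
                   sorted(np.linalg.eigvals(A)))
```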

The quantity under the square root sign is called the discriminant, denoted by $D$. When $D < 0$, the eigenvalues, and consequently the eigenvectors, are complex. On the other hand, when $A$ is symmetric ($a_{01} = a_{10}$), the discriminant $(a_{00}-a_{11})^2 + 4a_{01}^2$ is non-negative and the eigendecomposition is real.

Some common transformations include

| Name | Matrix | Explanation |
|------|--------|-------------|
| Stretch | $\begin{bmatrix} s_{x} & 0 \\ 0 & s_{y} \end{bmatrix}$ | Stretches by $s_x$ in the $x$-direction and by $s_y$ in the $y$-direction. When $s_x = s_y = s$, this is equivalent to scaling by $s$. |
| Shear | $\begin{bmatrix} 1 & s_{x} \\ s_{y} & 1 \end{bmatrix}$ | Shears simultaneously by $s_x$ in the $x$-direction and by $s_y$ in the $y$-direction. When $s_x = -s_y$, this is equivalent to a rotation combined with a scaling. |
| Rotate | $\begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$ | Rotates by $\theta$ in the anti-clockwise direction. Since every vector changes direction (for $\theta$ not a multiple of $\pi$), the eigenvalues and eigenvectors are complex. |
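The rotation case illustrates the complex eigenvalues nicely: a $2 \times 2$ rotation by $\theta$ has eigenvalues $\cos\theta \pm i\sin\theta$, which lie on the unit circle. A quick sketch, with a hypothetical choice of angle:

```python
import numpy as np

theta = np.pi / 3                      # 60 degrees, a hypothetical choice
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

eigvals = np.linalg.eigvals(R)

# Eigenvalues are cos(theta) +/- i*sin(theta): complex for theta
# not a multiple of pi.
assert np.allclose(sorted(eigvals, key=lambda z: z.imag),
                   [np.cos(theta) - 1j * np.sin(theta),
                    np.cos(theta) + 1j * np.sin(theta)])

# They have unit modulus: a pure rotation preserves lengths and areas.
assert np.allclose(np.abs(eigvals), 1.0)
```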