# Eigenvalues and Eigenvectors (Find and Use Them)

## Eigenvalues and Eigenvectors

Let $\mathbb{F}$ be either the real numbers or the complex numbers. A nonzero vector $v$ in $\mathbb{F}^n$ is called an eigenvector of an $n\times n$ matrix $A$ if $A v$ is a scalar multiple of $v$, that is, $A v= \lambda v$ for some scalar $\lambda.$ Note that this scalar $\lambda$ may be zero. The scalar $\lambda$ is called the eigenvalue associated with the eigenvector $v.$ Even though $A 0=\lambda 0$ for every scalar $\lambda$, we do not call $0$ an eigenvector. Over the real numbers a matrix need not have any eigenvalues or eigenvectors, but notice that if $v$ is an eigenvector of a matrix $A$, then $v$ is an eigenvector of the matrices $A^2$, $A^3$, … as well, with $A^t v=\lambda^t v$ for all positive integers $t.$ If $\mathbb{F}=\mathbb{C}$, then, counting multiplicities, every $n\times n$ matrix has exactly $n$ eigenvalues.
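As a quick numerical illustration of the definition, take the hypothetical matrix $A=\begin{bmatrix}2&1\\0&3\end{bmatrix}$ and the vector $v=(1,1)$, which satisfies $Av=3v$; the sketch below checks both $Av=\lambda v$ and $A^t v=\lambda^t v$:

```python
# Minimal check of the eigenvector definition; the matrix A and vector v
# below are hypothetical examples, not taken from the text.

def mat_vec(A, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[2, 1], [0, 3]]
v = [1, 1]
lam = 3

# A v = lambda v, so v is an eigenvector of A with eigenvalue 3.
assert mat_vec(A, v) == [lam * x for x in v]

# A^t v = lambda^t v for every positive integer t.
w = v
for t in range(1, 6):
    w = mat_vec(A, w)                      # now w = A^t v
    assert w == [lam**t * x for x in v]
```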

If $v$ is an eigenvector of the $n\times n$ matrix $A$ with associated eigenvalue $\lambda$, what can you say about $\ker(A-\lambda I_n)$? Is the matrix $A-\lambda I_n$ invertible? We know $A v=\lambda v$, so $(A-\lambda I_n) v=A v-\lambda I_n v=\lambda v-\lambda v=0.$ Thus the nonzero vector $v$ is in the kernel of $A-\lambda I_n.$ Therefore, $\ker(A-\lambda I_n)\neq \{0\}$ and so $A-\lambda I_n$ is not invertible.

Lemma. Let $A$ be an $n\times n$ matrix and $\lambda$ a scalar. Then $\lambda$ is an eigenvalue of $A$ if and only if $\det(A-\lambda I_n)=0.$

Proof. The proof follows from the chain of equivalent statements:

• $\lambda$ is an eigenvalue of $A$,
• there exists a nonzero vector $v$ such that $(A -\lambda I_n ) v=0$,
• $\ker(A -\lambda I_n )\neq \{0\}$,
• matrix $A-\lambda I_n$ fails to be invertible, and
• $\det(A -\lambda I_n )=0.$
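For $2\times 2$ matrices the lemma is easy to test numerically. The sketch below uses a hypothetical matrix whose eigenvalues are $2$ and $3$, and checks that $\det(A-\lambda I_2)$ vanishes exactly at those values:

```python
# Checking det(A - lambda I) = 0 at the eigenvalues of a hypothetical
# 2x2 matrix (and nonzero at a non-eigenvalue).

def det2(M):
    """Determinant of a 2x2 matrix given as a list of rows."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def char_value(A, lam):
    """det(A - lambda I) for a 2x2 matrix A."""
    return det2([[A[0][0] - lam, A[0][1]],
                 [A[1][0], A[1][1] - lam]])

A = [[2, 1], [0, 3]]          # hypothetical matrix with eigenvalues 2 and 3

assert char_value(A, 2) == 0  # 2 is an eigenvalue
assert char_value(A, 3) == 0  # 3 is an eigenvalue
assert char_value(A, 5) != 0  # 5 is not an eigenvalue
```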

Example. Find all eigenvectors and eigenvalues of the identity matrix $I_n.$ Since $I_n v = v = 1 v$ for all $v\in \mathbb{R}^n$, every nonzero vector in $\mathbb{R}^n$ is an eigenvector of $I_n$ with eigenvalue $\lambda=1.$

Lemma. The eigenvalues of a triangular matrix are its diagonal entries.

Proof. Let $A$ be a triangular matrix. Then $A-\lambda I_n$ is also triangular, with diagonal entries $a_{ii}-\lambda$, and so $$\det(A-\lambda I_n)=\prod_{i=1}^n (a_{ii}-\lambda).$$ This product is zero if and only if $\lambda=a_{ii}$ for some $i$; thus the eigenvalues of $A$ are exactly its diagonal entries.

Example. Find a basis of the linear space $V$ of all $2\times 2$ matrices for which $e_1$ is an eigenvector. For an arbitrary $2\times 2$ matrix we want$$\begin{bmatrix} a & b \\ c & d\end{bmatrix} \begin{bmatrix} 1\\ 0\end{bmatrix} = \begin{bmatrix}a\\ c\end{bmatrix} = \begin{bmatrix} \lambda \\ 0\end{bmatrix} = \lambda \begin{bmatrix} 1\\ 0\end{bmatrix}$$ for some scalar $\lambda.$ Hence $a, b, d$ are free and $c=0$; thus a desired basis of $V$ is$$\left( \begin{bmatrix} 1 & 0 \\ 0 & 0\end{bmatrix}, \begin{bmatrix} 0 & 1 \\ 0 & 0\end{bmatrix}, \begin{bmatrix} 0 & 0 \\ 0 & 1\end{bmatrix} \right).$$

Example. Find a basis of the linear space $V$ of all $4\times 4$ matrices for which $e_2$ is an eigenvector. We want to find all $4 \times 4$ matrices $A$ such that $A e_2=\lambda e_2.$ Thus the second column of an arbitrary $4 \times 4$ matrix $A$ must be of the form $\begin{bmatrix} 0\\ \lambda\\ 0\\ 0\end{bmatrix}$, so $$A=\begin{bmatrix} a & 0 & c & d \\ e & \lambda & f & g \\ h & 0 & i & j \\ k & 0 & l & m\end{bmatrix}.$$ Let $E_{ij}$ denote the $4\times 4$ matrix with all entries zero except for a 1 in the $i$-th row and $j$-th column. Then a basis for $V$ is $$\left( E_{11}, E_{21}, E_{31}, E_{41}, E_{22}, E_{13}, E_{23}, E_{33}, E_{43}, E_{14}, E_{24}, E_{34}, E_{44} \right)$$ and so the dimension of $V$ is 13.

Example. Find the eigenvalues, a basis for each eigenspace, and an eigenbasis for $A=\begin{bmatrix}1 & 0 & 0 \\ -5 & 0 & 2 \\ 0 & 0 & 1 \end{bmatrix}.$ The eigenvalues are $\lambda_1=0$ and $\lambda_2=\lambda_3=1.$ A basis for $E_{0}$ is $\left(\begin{bmatrix} 0\\ 1\\ 0\end{bmatrix}\right).$ A basis for $E_{1}$ is $\left(\begin{bmatrix}1\\ -5\\ 0\end{bmatrix},\begin{bmatrix}0\\ 2\\ 1\end{bmatrix}\right).$ An eigenbasis for $A$ is $\left(\begin{bmatrix}0 \\ 1\\ 0\end{bmatrix},\begin{bmatrix}1\\ -5\\ 0 \end{bmatrix}, \begin{bmatrix}0\\ 2\\ 1\end{bmatrix}\right).$
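The eigenbasis found in this example can be verified by direct multiplication:

```python
# Checking that each basis vector found above satisfies A v = lambda v
# for the matrix A of this example.

def mat_vec(A, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[1, 0, 0], [-5, 0, 2], [0, 0, 1]]

# E_0 basis vector: A v = 0 * v
assert mat_vec(A, [0, 1, 0]) == [0, 0, 0]

# E_1 basis vectors: A v = 1 * v
assert mat_vec(A, [1, -5, 0]) == [1, -5, 0]
assert mat_vec(A, [0, 2, 1]) == [0, 2, 1]
```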

Example. Find a basis of the linear space $V$ of all $2\times 2$ matrices $A$ for which $\begin{bmatrix} 1\\ -3\end{bmatrix}$ is an eigenvector. For an arbitrary $2\times 2$ matrix we want $$\begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} 1\\ -3\end{bmatrix} = \lambda\begin{bmatrix} 1\\ -3\end{bmatrix} = \begin{bmatrix} \lambda \\ -3\lambda\end{bmatrix}.$$ Thus $a-3b=\lambda$, $c-3d=-3\lambda$ and so $c=-3a+9b+3d.$ Thus $A$ must be of the form $$\begin{bmatrix} a & b \\ -3a+9b+3d & d\end{bmatrix}=a\begin{bmatrix} 1 & 0 \\-3 & 0 \end{bmatrix}+b \begin{bmatrix} 0 & 1 \\ 9 & 0 \end{bmatrix}+ d\begin{bmatrix} 0 & 0 \\ 3 & 1\end{bmatrix}.$$ Thus a basis of $V$ is$$\left( \begin{bmatrix} 1 & 0 \\ -3 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 1 \\ 9 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 0 \\ 3 & 1\end{bmatrix} \right)$$ and so the dimension of $V$ is $3.$

Example. Find a basis of the linear space $V$ of all $3\times 3$ matrices $A$ for which both $\begin{bmatrix} 1\\ 0 \\ 0 \end{bmatrix}$ and $\begin{bmatrix}0 \\ 0 \\ 1\end{bmatrix}$ are eigenvectors. Since $A \begin{bmatrix} 1\\ 0 \\ 0 \end{bmatrix}$ is simply the first column of $A$, the first column must be a multiple of $e_1.$ Similarly, the third column must be a multiple of $e_3.$ There are no other restrictions on the form of $A$, meaning it can be any matrix of the form $$\begin{bmatrix} a & b & 0 \\ 0 & c & 0 \\ 0 & d & e \end{bmatrix}.$$ Thus a basis of $V$ is$$\left( \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 1 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 1 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix} \right)$$ and so the dimension of $V$ is 5.

Theorem. If $A$ is an $n\times n$ matrix, then $\det(A-\lambda I_n)$ is a polynomial of degree $n$, of the form $$\label{chpo} f_A(\lambda)=(-\lambda)^n+\mathop{trace} (A) (-\lambda)^{n-1}+\cdots +\det(A).$$

## Characteristic Equation

The equation $\det(A-\lambda I_n)=0$ is called the characteristic equation of $A.$ The polynomial in \eqref{chpo} is called the characteristic polynomial and is denoted by $f_A(\lambda).$ We say that an eigenvalue $\lambda_0$ of a square matrix $A$ has algebraic multiplicity $k$ if $\lambda_0$ is a root of multiplicity $k$ of the characteristic polynomial $f_A(\lambda)$ meaning that we can write $$f_A(\lambda)=(\lambda_0-\lambda)^k g(\lambda)$$for some polynomial $g(\lambda)$ with $g(\lambda_0)\neq 0.$

## Examples on Eigenvalues and Eigenvectors

Example. Find the characteristic equation for a $2\times 2$ matrix $A.$ The characteristic equation of $A=\begin{bmatrix} a& b \\ c & d\end{bmatrix}$ is \begin{align*}f_A(\lambda) & =\det \begin{bmatrix} a-\lambda& b \\ c & d-\lambda \end{bmatrix} \\ & =(a-\lambda)(d-\lambda)-bc \\ & =\lambda^2-(a+d)\lambda +(ad-bc)=0. \end{align*}
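Since the characteristic equation of a $2\times 2$ matrix is a quadratic in $\lambda$, its roots can be computed with the quadratic formula. The entries in the sketch below are hypothetical:

```python
# Solving lambda^2 - (a+d) lambda + (ad - bc) = 0 with the quadratic
# formula, for a hypothetical 2x2 matrix with real eigenvalues.
import math

def eigenvalues_2x2(a, b, c, d):
    """Real eigenvalues of [[a, b], [c, d]], largest first."""
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    if disc < 0:
        raise ValueError("eigenvalues are complex")
    r = math.sqrt(disc)
    return ((tr + r) / 2, (tr - r) / 2)

# For [[1, 2], [2, 1]]: trace 2, determinant -3, eigenvalues 3 and -1.
assert eigenvalues_2x2(1, 2, 2, 1) == (3.0, -1.0)
```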

Example. Use the characteristic polynomial $f_A(\lambda)$ to determine the eigenvalues and their multiplicities of $$A=\begin{bmatrix} -1 & -1 & -1 \\ -1 & -1 & -1 \\ -1 & -1 & -1 \end{bmatrix}.$$ The characteristic polynomial is $f_A(\lambda)=-\lambda^2(\lambda+3).$ So $\lambda_1=0$ with algebraic multiplicity 2 and $\lambda_2=-3$ with algebraic multiplicity 1 are the eigenvalues of $A.$
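A direct check: evaluating $\det(A-\lambda I)$ at a few sample values of $\lambda$ and comparing against $-\lambda^2(\lambda+3)$ confirms the factored form found above.

```python
# Verifying f_A(lambda) = -lambda^2 (lambda + 3) for the all-(-1) matrix
# by evaluating det(A - lambda I) at several sample points.

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along the first row."""
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
            - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
            + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

A = [[-1, -1, -1] for _ in range(3)]

def char_at(lam):
    """det(A - lambda I) for the 3x3 matrix A above."""
    M = [[A[i][j] - (lam if i == j else 0) for j in range(3)] for i in range(3)]
    return det3(M)

for lam in (-3, -1, 0, 1, 2):
    assert char_at(lam) == -lam**2 * (lam + 3)
```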

Example. Consider the matrix $A=\begin{bmatrix} a & b \\ b & c \end{bmatrix}$, where $a, b, c$ are nonzero constants. For which values of $a, b, c$ does $A$ have two distinct eigenvalues? The characteristic polynomial is $f_A(\lambda)=\lambda^2-(a+c)\lambda+(a c-b^2).$ The discriminant of this quadratic equation is \begin{align*} (a+c)^2-4(ac-b^2) & =a^2+2ac+c^2-4ac+4b^2 \\ & =(a-c)^2+4b^2. \end{align*} The discriminant is always positive since $b\neq 0.$ Thus the matrix $A$ always has two distinct real eigenvalues.
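The discriminant identity above is easy to spot-check numerically; the sample entries $(a, b, c)$ below are hypothetical:

```python
# Spot-checking the identity (a+c)^2 - 4(ac - b^2) = (a-c)^2 + 4 b^2
# on a few hypothetical symmetric 2x2 matrices with b != 0.

def discriminant(a, b, c):
    """Discriminant of lambda^2 - (a+c) lambda + (ac - b^2)."""
    return (a + c) ** 2 - 4 * (a * c - b * b)

for a, b, c in [(1, 2, 1), (0, 1, 5), (-3, 4, 2)]:
    assert discriminant(a, b, c) == (a - c) ** 2 + 4 * b * b
    assert discriminant(a, b, c) > 0   # hence two distinct real eigenvalues
```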

Example. In terms of the eigenvalues of $A$, for which $2\times 2$ matrices $A$ does there exist an invertible matrix $S$ such that $AS=SD$, where $D=\begin{bmatrix} 2 & 0 \\ 0 & 3 \end{bmatrix}$? If we let $S=\begin{bmatrix} v_1 & v_2\end{bmatrix}$, then $AS=\begin{bmatrix} A v_1 & A v_2\end{bmatrix}$ and $SD=\begin{bmatrix} 2 v_1 & 3 v_2\end{bmatrix}.$ So $v_1$ must be an eigenvector of $A$ with eigenvalue 2, and $v_2$ must be an eigenvector of $A$ with eigenvalue 3. Since eigenvectors for distinct eigenvalues are linearly independent, any such choice of columns makes $S$ invertible. Therefore such an $S$ exists if and only if 2 and 3 are eigenvalues of $A.$

Exercise. Let $A$ be a matrix with eigenvalues $\lambda_1, \ldots, \lambda_k.$

• Show the eigenvalues of $A^T$ are $\lambda_1, \ldots, \lambda_k.$
• Show the eigenvalues of $\alpha A$ are $\alpha\lambda_1, \ldots, \alpha \lambda_k.$
• Show $A^{-1}$ exists if and only if $\lambda_1 \cdots \lambda_k\neq 0.$
• Also, show that if $A^{-1}$ exists then its eigenvalues are $1/\lambda_1,\ldots,1/\lambda_k.$

Example. Let $A$ be a matrix with eigenvalues $\lambda_1, \ldots, \lambda_k$ and let $m$ be a positive integer. Show that the eigenvalues of $A^m$ are $\lambda^m_1, \ldots, \lambda^m_k.$

Exercise. Show that any given polynomial $a_0 \lambda^n + a_1\lambda^{n-1} + \cdots +a_{n-1}\lambda +a_n$ of degree $n$, where $a_0\neq 0$, may be regarded (up to a nonzero constant multiple) as the characteristic polynomial of the matrix of order $n$ $$\begin{bmatrix} 0 & 1 & 0 & \cdots & 0\\ 0 & 0 & 1 & \cdots & 0\\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & 1 \\ \frac{-a_n}{a_0} & \frac{-a_{n-1}}{a_0}& \frac{-a_{n-2}}{a_0} & \cdots & \frac{-a_1}{a_0} \end{bmatrix}.$$ This matrix is called the companion matrix of the given polynomial.
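As a sanity check on a small hypothetical instance: for $p(\lambda)=\lambda^2-5\lambda+6=(\lambda-2)(\lambda-3)$ the companion matrix is $\begin{bmatrix}0&1\\-6&5\end{bmatrix}$, and for each root $r$ of $p$ the vector $(1, r)$ is an eigenvector with eigenvalue $r$:

```python
# Checking that the roots of p(x) = x^2 - 5x + 6 are eigenvalues of its
# companion matrix, with eigenvectors (1, r). The polynomial is a
# hypothetical example chosen for its integer roots.

def mat_vec(A, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

C = [[0, 1], [-6, 5]]       # companion matrix of x^2 - 5x + 6 (a0 = 1)

for r in (2, 3):            # the roots of p
    v = [1, r]
    assert mat_vec(C, v) == [r * x for x in v]
```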

Example. Let $A$ and $B$ be $n\times n$ matrices. Show that $AB$ and $BA$ have the same eigenvalues.

Example. Let $A$ and $B$ be real $n\times n$ matrices with distinct eigenvalues. Prove that $AB=BA$ if and only if $A$ and $B$ have the same eigenvectors.

Example. Prove that the characteristic polynomial of the block-triangular matrix $A=\begin{bmatrix} B & C\\ 0 & D\end{bmatrix}$ is the product of the characteristic polynomials of $B$ and $D.$

Example. Suppose that $A$ is an invertible $n\times n$ matrix. Prove that $$f_{A^{-1}}(x)=(-x)^n\det(A^{-1})f_A\left(\frac{1}{x}\right).$$

Example. Let $A$ be an $n\times n$ matrix. Prove that $A$ and $A^T$ have the same characteristic polynomial and hence the same eigenvalues.

If $\lambda$ is an eigenvalue of an $n\times n$ matrix $A$, then the kernel of the matrix $A-\lambda I_n$ is called the eigenspace associated with $\lambda$ and is denoted by $E_\lambda.$ The dimension of the eigenspace is called the geometric multiplicity of eigenvalue $\lambda.$ In other words, the geometric multiplicity is the nullity of the matrix $A-\lambda I_n.$

Theorem. Let $A$ be an $n\times n$ matrix. If $\lambda_1, \ldots, \lambda_k$ are distinct eigenvalues of $A$, and $v_1, \ldots, v_k$ are any nonzero eigenvectors associated with these eigenvalues respectively, then $v_1, \ldots, v_k$ are linearly independent.

Proof. Suppose there exist constants $c_1,\ldots, c_k$ such that $$\label{eigenveclin} c_1 v_1+\cdots +c_k v_k=0.$$ Using the fact that $A v_i=\lambda_i v_i$, we multiply \eqref{eigenveclin} by $A$ to obtain $$\label{eigenveclin2} c_1 \lambda_1 v_1+\cdots +c_k \lambda_k v_k=0.$$ Repeating this again we obtain $$\label{eigenveclin3} c_1 \lambda^2_1 v_1+\cdots +c_k \lambda^2_k v_k=0.$$ Multiplying by $A$ a total of $k-1$ times, we are led to the matrix equation $$\begin{bmatrix} c_1 v_1 & \cdots & c_k v_k \end{bmatrix}_{n\times k} \begin{bmatrix} 1 & \lambda_1 & \lambda_1^2 & \cdots & \lambda_1^{k-1} \\ 1 & \lambda_2 & \lambda_2^2 & \cdots & \lambda_2^{k-1} \\ 1 & \lambda_3 & \lambda_3^2 & \cdots & \lambda_3^{k-1} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & \lambda_k & \lambda_k^2 & \cdots & \lambda_k^{k-1} \end{bmatrix}_{k\times k} =0_{n\times k}.$$ Since the eigenvalues are distinct, the second factor is an invertible Vandermonde matrix. Multiplying on the right by its inverse shows that $$\begin{bmatrix} c_1 v_1 & \cdots & c_k v_k \end{bmatrix} =0_{n\times k}.$$ Since each $v_i$ is nonzero, every $c_i$ must be zero. Hence $v_1, \ldots, v_k$ are linearly independent.
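The key step is the invertibility of the Vandermonde factor. A small exact-arithmetic sketch (using Python's `fractions` module, with hypothetical eigenvalue lists) confirms that its determinant is nonzero when the values are distinct and zero when one repeats:

```python
# Exact determinant of a Vandermonde matrix via fraction-based Gaussian
# elimination; the eigenvalue lists below are hypothetical.
from fractions import Fraction

def det(M):
    """Determinant by exact Gaussian elimination with partial pivoting."""
    M = [[Fraction(x) for x in row] for row in M]
    n, d = len(M), Fraction(1)
    for i in range(n):
        p = next((r for r in range(i, n) if M[r][i] != 0), None)
        if p is None:
            return Fraction(0)
        if p != i:
            M[i], M[p] = M[p], M[i]
            d = -d                      # row swap flips the sign
        d *= M[i][i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
    return d

def vandermonde(lams):
    """Rows (1, lam, lam^2, ..., lam^(k-1)), as in the proof above."""
    return [[l ** j for j in range(len(lams))] for l in lams]

assert det(vandermonde([1, 2, 3])) != 0   # distinct values: invertible
assert det(vandermonde([1, 2, 2])) == 0   # a repeated value: singular
```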

A basis of $\mathbb{F}^n$ consisting of eigenvectors of $A$ is called an eigenbasis for $A.$ In particular, if an $n\times n$ matrix $A$ has $n$ distinct eigenvalues, then there exists an eigenbasis for $A$: choose one eigenvector for each eigenvalue, and these are linearly independent by the preceding theorem.

Example. Find the characteristic equation, the eigenvalues, and a basis for each eigenspace of $$A=\begin{bmatrix} 3 & 2 & 4 \\ 2 & 0 & 2\\ 4 & 2 & 3 \end{bmatrix}.$$ The eigenvalues are $\lambda_1=8$ (with algebraic multiplicity 1) and $\lambda_2=-1$ (with algebraic multiplicity 2) since $$\det(A-\lambda I) =\begin{vmatrix} 3-\lambda & 2 & 4 \\ 2 & -\lambda & 2\\ 4 & 2 & 3-\lambda \end{vmatrix} =-\lambda^3+6\lambda^2+15\lambda+8=-(\lambda-8)(\lambda+1)^2=0.$$ For $\lambda_1=8$ we solve $$\begin{bmatrix} -5 & 2 & 4 \\ 2 & -8 & 2\\ 4 & 2 & -5 \end{bmatrix} \begin{bmatrix}x_1\\ x_2\\ x_3\end{bmatrix}=\begin{bmatrix}0\\ 0\\ 0\end{bmatrix}$$ and obtain the eigenvector $v_1=\begin{bmatrix}2\\ 1\\ 2\end{bmatrix}$ with $E_8=\mathop{span}(v_1).$ Therefore the geometric multiplicity of $\lambda_1=8$ is 1. For $\lambda_2=-1$ we solve $$\begin{bmatrix} 4 & 2 & 4 \\ 2 & 1 & 2\\ 4 & 2 & 4 \end{bmatrix} \begin{bmatrix}x_1\\ x_2 \\ x_3 \end{bmatrix} =\begin{bmatrix}0\\ 0\\ 0\end{bmatrix}$$ and obtain the eigenvectors $v_2=\begin{bmatrix}1\\ -2 \\ 0\end{bmatrix}$ and $v_3=\begin{bmatrix}0 \\ -2 \\ 1 \end{bmatrix}$ with $E_{-1} =\mathop{span}(v_2, v_3).$ Therefore the geometric multiplicity of $\lambda_2=-1$ is 2.
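The eigenvectors found in this example can be checked by direct multiplication:

```python
# Verifying A v1 = 8 v1 and A v = -v for the eigenvectors found above.

def mat_vec(A, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[3, 2, 4], [2, 0, 2], [4, 2, 3]]

# lambda = 8: A v1 = 8 v1
v1 = [2, 1, 2]
assert mat_vec(A, v1) == [8 * x for x in v1]

# lambda = -1: A v = -v for both basis vectors of E_{-1}
for v in ([1, -2, 0], [0, -2, 1]):
    assert mat_vec(A, v) == [-x for x in v]
```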

Example. Show that for each of the following matrices, $\lambda=3$ is an eigenvalue of algebraic multiplicity 4. In each case, compute the geometric multiplicity of $\lambda.$

$$\begin{bmatrix} 3 & 0 & 0 & 0 \\ 0 & 3 & 0 & 0 \\ 0 & 0 & 3 & 0\\ 0 & 0 & 0 & 3 \end{bmatrix} \qquad \begin{bmatrix} 3 & 1 & 0 & 0 \\ 0 & 3 & 0 & 0 \\ 0 & 0 & 3 & 0\\ 0 & 0 & 0 & 3 \end{bmatrix} \qquad \begin{bmatrix} 3 & 1 & 0 & 0 \\ 0 & 3 & 1 & 0 \\ 0 & 0 & 3 & 0\\ 0 & 0 & 0 & 3 \end{bmatrix} \qquad \begin{bmatrix} 3 & 1 & 0 & 0 \\ 0 & 3 & 1 & 0 \\ 0 & 0 & 3 & 1\\ 0 & 0 & 0 & 3 \end{bmatrix}$$
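The geometric multiplicity of $\lambda=3$ is the nullity of $A-3I$, which can be computed as $4-\operatorname{rank}(A-3I)$. A sketch in exact arithmetic for the four matrices above:

```python
# Geometric multiplicity of lambda = 3 for the four matrices above,
# computed as n - rank(A - 3I) via exact Gaussian elimination.
from fractions import Fraction

def rank(M):
    """Rank via fraction-exact row reduction."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols, r = len(M), len(M[0]), 0
    for c in range(cols):
        p = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if p is None:
            continue
        M[r], M[p] = M[p], M[r]
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            for j in range(c, cols):
                M[i][j] -= f * M[r][j]
        r += 1
    return r

def geo_mult(A, lam):
    """Nullity of A - lam I, i.e. the geometric multiplicity of lam."""
    n = len(A)
    shifted = [[A[i][j] - (lam if i == j else 0) for j in range(n)]
               for i in range(n)]
    return n - rank(shifted)

J0 = [[3, 0, 0, 0], [0, 3, 0, 0], [0, 0, 3, 0], [0, 0, 0, 3]]
J1 = [[3, 1, 0, 0], [0, 3, 0, 0], [0, 0, 3, 0], [0, 0, 0, 3]]
J2 = [[3, 1, 0, 0], [0, 3, 1, 0], [0, 0, 3, 0], [0, 0, 0, 3]]
J3 = [[3, 1, 0, 0], [0, 3, 1, 0], [0, 0, 3, 1], [0, 0, 0, 3]]

assert [geo_mult(J, 3) for J in (J0, J1, J2, J3)] == [4, 3, 2, 1]
```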

Theorem. Similar matrices $A$ and $B$ have the same determinant, trace, characteristic polynomial, rank, nullity, and the same eigenvalues with the same algebraic multiplicities.

Proof. The cases of the determinant and trace were proven previously. Since $A$ and $B$ are similar, there exists an invertible matrix $P$ such that $B=P^{-1}AP.$ Because $P$ and $P^{-1}$ are invertible, $B$ has the same rank as $A$, and the nullity claim then follows from the rank–nullity theorem. For the characteristic polynomial we find \begin{align*} \det(B-\lambda I) &=\det(P^{-1}AP-\lambda I) \\ & =\det(P^{-1}AP-P^{-1}\lambda I P) \\ & =\det(P^{-1}(A-\lambda I) P) \\ & =\det(P^{-1})\det(A-\lambda I) \det(P) \\ & =\det(P^{-1})\det(P) \det(A-\lambda I) \\ & =\det(P^{-1}P)\det(A-\lambda I) \\& =\det(A-\lambda I). \end{align*} Thus $A$ and $B$ have the same characteristic polynomial, and therefore also the same eigenvalues with the same algebraic multiplicities.
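The theorem can be spot-checked on a hypothetical pair of similar matrices: with $P=\begin{bmatrix}1&1\\0&1\end{bmatrix}$ and $A=\begin{bmatrix}2&1\\0&3\end{bmatrix}$, the matrix $B=P^{-1}AP$ gives the same value of $\det(M-\lambda I)$ as $A$ at every sample $\lambda$:

```python
# Comparing det(A - lambda I) and det(B - lambda I) for B = P^{-1} A P,
# using hypothetical 2x2 matrices A and P.

def mat_mul(A, B):
    """Matrix product of two matrices given as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def char_at(M, lam):
    """det(M - lambda I) for a 2x2 matrix M."""
    return (M[0][0] - lam) * (M[1][1] - lam) - M[0][1] * M[1][0]

A = [[2, 1], [0, 3]]
P = [[1, 1], [0, 1]]
P_inv = [[1, -1], [0, 1]]                  # inverse of P
B = mat_mul(P_inv, mat_mul(A, P))          # similar to A

for lam in (-1, 0, 2, 3, 7):
    assert char_at(A, lam) == char_at(B, lam)
```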

If $T$ is a linear transformation from $V$ to $V$ then a scalar $\lambda$ is called an eigenvalue of $T$ if there exists a nonzero element $v$ in $V$ such that $T(v)=\lambda v.$ Assuming $V$ is finite-dimensional then a basis $\mathcal{D}$ of $V$ consisting of eigenvectors of $T$ is called an eigenbasis for $T.$

Theorem. Let $T$ be a linear transformation on a finite-dimensional vector space $V$, and let $\lambda$ be an eigenvalue of $T.$ The geometric multiplicity of $\lambda$ is less than or equal to the algebraic multiplicity of $\lambda.$

Proof. Let $k$ represent the geometric multiplicity of $\lambda$ and assume $\dim V=n.$ First notice, by definition, the eigenspace $E_{\lambda}$ must contain at least one nonzero vector, and thus $k=\dim E_{\lambda} \geq 1.$ Choose a basis $v_1, \ldots,v_k$ for $E_{\lambda}$ and extend it to a basis $\mathcal{B}=(v_1, \ldots, v_k, v_{k+1},\ldots, v_n)$ of $V.$ For $1\leq i \leq k$, notice $$[T(v_i)]_{\mathcal{B}} =[\lambda v_i]_{\mathcal{B}} = \lambda[v_i]_{\mathcal{B}} =\lambda e_i.$$ Thus the matrix representation for $T$ with respect to $\mathcal{B}$ has the form $$B= \begin{bmatrix}\lambda I_k & C\\ 0 & D \end{bmatrix}$$where $C$ is a $k\times (n-k)$ submatrix, $0$ is an $(n-k)\times k$ zero submatrix, and $D$ is an $(n-k)\times (n-k)$ submatrix. We determine the characteristic polynomial of $T$; here it is convenient to use the monic version $|xI_n-B|$, which has the same roots with the same multiplicities: \begin{align*}f_T(x) & =f_B(x) =|x I_n-B| \\ & =\begin{vmatrix} (x-\lambda)I_k & -C\\ 0 & xI_{n-k}-D \end{vmatrix} \\ & =(x-\lambda)^k\,|xI_{n-k}-D| \\ & =(x-\lambda)^k f_D(x). \end{align*} It follows that $f_T(x)=(x-\lambda)^{k+m} g(x)$ where $g(\lambda)\neq 0$ and $m$ is the number of factors of $x-\lambda$ in $f_D(x).$ Hence the algebraic multiplicity of $\lambda$ is $k+m\geq k$, which is the desired conclusion.

Example. Let $T(M)=M-M^T$ be a linear transformation from $\mathbb{R}^{2\times2}$ to $\mathbb{R}^{2\times2}.$ For each eigenvalue find a basis for the eigenspace and state the geometric multiplicity. Since $M=M^T$ for every symmetric matrix $M$, we notice $$T(M)=M-M^T=M-M=0$$ whenever $M$ is a symmetric matrix. Thus the nonzero symmetric matrices are eigenmatrices with eigenvalue $0.$ Also notice the nonzero skew-symmetric matrices have eigenvalue $2$ since $$T(M)=M-M^T=M+M=2M.$$ For eigenvalue $\lambda=0$ we have eigenspace $E_0$ with basis $$\left( \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix},\begin{bmatrix}0 & 1 \\ 1 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} \right).$$ This follows from the condition $M=M^T$ in $\mathbb{R}^{2\times2}.$ Therefore the geometric multiplicity of $\lambda=0$ is 3. For eigenvalue $\lambda=2$ we have eigenspace $E_2$ with basis $$\left( \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix} \right),$$ which follows from the condition $M=-M^T$ in $\mathbb{R}^{2\times2}.$ Therefore the geometric multiplicity of $\lambda=2$ is 1. We have an eigenbasis $$\left( \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}, \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}, \begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix} \right)$$ for $T.$
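The eigenmatrices found in this example can be checked directly:

```python
# Verifying T(M) = M - M^T on the eigenbasis found above: symmetric
# matrices map to 0 (eigenvalue 0), the skew-symmetric one to 2M.

def transpose(M):
    return [list(col) for col in zip(*M)]

def T(M):
    """The transformation T(M) = M - M^T on 2x2 matrices."""
    Mt = transpose(M)
    return [[M[i][j] - Mt[i][j] for j in range(2)] for i in range(2)]

def scale(c, M):
    return [[c * x for x in row] for row in M]

# Symmetric basis matrices: eigenvalue 0.
symmetric = [[[1, 0], [0, 0]], [[0, 1], [1, 0]], [[0, 0], [0, 1]]]
for M in symmetric:
    assert T(M) == scale(0, M)

# Skew-symmetric basis matrix: eigenvalue 2.
skew = [[0, 1], [-1, 0]]
assert T(skew) == scale(2, skew)
```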