Invariant Subspaces and Generalized Eigenvectors

Master of Science in Mathematics
Lecture Notes.

Invariant Subspaces of a Linear Transformation

We let $V$ and $W$ denote real or complex vector spaces. Suppose $T \in \mathcal{L}(V)$ and $U$ is a subspace of $V.$ We say $U$ is an invariant subspace of $T$ if $u\in U$ implies $Tu\in U.$

Lemma. Suppose $T \in \mathcal{L}(V)$ and $U$ is a subspace of $V.$ Then all of the following hold:

$U$ is invariant under $T$ if and only if $T|_U$ is an operator on $U$

$\mathop{ker} T$ is invariant under $T$, and

$\mathop{im} T$ is invariant under $T.$

Proof. The proof of each part follows.

  • By definition, $U$ is invariant under $T$ if and only if $u\in U \implies Tu\in U$, which is precisely the condition that $T|_U$ maps $U$ into $U$, i.e., that $T|_U$ is an operator on $U.$
  • If $u \in \mathop{ker} T $ then $Tu=0$, and hence $Tu\in \mathop{ker} T.$
  • If $u\in \mathop{im} T $, then in particular $u\in V$, so $Tu\in\mathop{im} T$ by the definition of range.

Theorem. Suppose $T\in \mathcal{L}(V).$ Let $\lambda_1,\ldots,\lambda_m$ denote the distinct eigenvalues of $T.$ Then the following are equivalent:

(1) $T$ has a diagonal matrix with respect to some basis of $V$;

(2) $V$ has a basis consisting of eigenvectors of $T$;

(3) there exist one-dimensional $T$-invariant subspaces $U_1, \ldots, U_n$ of $V$ such that $V=U_1 \oplus \cdots \oplus U_n$;

(4) $V=\text{null} (T-\lambda_1 I) \oplus \cdots \oplus \text{null}(T-\lambda_m I)$;

(5) $\mathop{dim} V = \mathop{dim} \text{null} (T-\lambda_1 I) + \cdots + \mathop{dim} \text{null} (T-\lambda_m I).$

Proof. 1. $\Longleftrightarrow$ 2.: Exercise.

2. $\Longleftrightarrow$ 3.: Suppose 2 holds; thus suppose $V$ has a basis $(v_1,\ldots,v_n)$ consisting of eigenvectors of $T.$ For each $j$, let $U_j=\mathop{span}(v_j).$ Obviously each $U_j$ is a one-dimensional subspace of $V$ that is invariant under $T$ (because each $v_j$ is an eigenvector of $T$). Because $(v_1,\ldots,v_n)$ is a basis of $V$, each vector in $V$ can be written uniquely as a linear combination of $(v_1,\ldots,v_n).$ In other words, each vector in $V$ can be written uniquely as a sum $u_1+\cdots +u_n$, where each $u_j\in U_j.$ Thus $V=U_1 \oplus \cdots \oplus U_n.$ Hence 2 implies 3. Conversely, suppose now that 3 holds; thus there are one-dimensional subspaces $U_1,\ldots,U_n$ of $V$, each invariant under $T$, such that $V=U_1 \oplus \cdots \oplus U_n.$ For each $j$, let $v_j$ be a nonzero vector in $U_j.$ Then each $v_j$ is an eigenvector of $T.$ Because each vector in $V$ can be written uniquely as a sum $u_1+\cdots + u_n$, where each $u_j\in U_j$ (so each $u_j$ is a scalar multiple of $v_j$), we see that $(v_1,\ldots,v_n)$ is a basis of $V.$ Thus 3 implies 2.

2 $\Longrightarrow$ 4: Suppose 2 holds; thus suppose $V$ has a basis consisting of eigenvectors of $T.$ Thus every vector in $V$ is a linear combination of eigenvectors of $T.$ Hence $V=\text{null} (T-\lambda_1 I) + \cdots + \text{null} (T-\lambda_m I).$ To show that the sum above is direct, suppose that $0=u_1+\cdots + u_m$, where each $u_j\in \text{null} (T-\lambda_j I).$ Because nonzero eigenvectors corresponding to distinct eigenvalues are linearly independent, this implies that each $u_j$ equals 0. This implies that the sum is a direct sum, completing the proof that 2 implies 4. 4 $\Longrightarrow$ 5: Exercise.

5 $\Longrightarrow$ 2: Suppose 5 holds; thus $\mathop{dim} V=\mathop{dim} \text{null} (T-\lambda_1I)+\cdots + \mathop{dim} \text{null} (T-\lambda_m I).$ Choose a basis of each $\text{null}(T-\lambda_j I)$; put all these bases together to form a list $(v_1,\ldots,v_n)$ of eigenvectors of $T$, where $n=\mathop{dim} V.$ To show that this list is linearly independent, suppose $a_1 v_1+\cdots + a_n v_n=0$, where $a_1,\ldots,a_n$ are scalars. For each $j=1, \ldots, m$, let $u_j$ denote the sum of all the terms $a_k v_k$ such that $v_k\in \text{null}(T-\lambda_j I).$ Thus each $u_j$ lies in the eigenspace of $T$ for the eigenvalue $\lambda_j$, and $u_1+\cdots + u_m=0.$ Because nonzero eigenvectors corresponding to distinct eigenvalues are linearly independent, this implies that each $u_j$ equals 0. Because each $u_j$ is a sum of terms $a_k v_k$, where the $v_k$'s were chosen to be a basis of $\text{null} (T-\lambda_j I)$, this implies that all the $a_k$'s equal 0. Thus $(v_1,\ldots,v_n)$ is linearly independent and hence a basis of $V.$ Thus 5 implies 2.
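The equivalence can also be checked numerically. Below is a small sketch (not from the notes; the matrix is a hypothetical example) that verifies condition 5 for a diagonalizable operator by computing each eigenspace dimension as a nullity:

```python
import numpy as np

def eigenspace_dim(T, lam):
    """dim null(T - lam*I), computed as dim V minus the rank."""
    n = T.shape[0]
    return n - np.linalg.matrix_rank(T - lam * np.eye(n))

# Hypothetical example: upper triangular with eigenvalues 2 and 3,
# diagonalizable even though it is not diagonal.
T = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 0.0],
              [0.0, 0.0, 3.0]])
dims = [eigenspace_dim(T, lam) for lam in (2.0, 3.0)]
print(dims, sum(dims))  # the dimensions sum to dim V = 3, so condition 5 holds
```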

A vector $v$ is called a generalized eigenvector of $T$ corresponding to $\lambda$, where $\lambda$ is an eigenvalue of $T$, if $(T-\lambda I)^j v=0$ for some positive integer $j.$
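For a concrete sketch (a hypothetical $2\times 2$ example, not from the notes), a Jordan block exhibits a generalized eigenvector that is not an ordinary eigenvector:

```python
import numpy as np

# A 2x2 Jordan block with eigenvalue 5; v = (0, 1) is a generalized
# eigenvector of T corresponding to 5, but not an ordinary eigenvector.
lam = 5.0
T = np.array([[lam, 1.0],
              [0.0, lam]])
I = np.eye(2)
v = np.array([0.0, 1.0])
print((T - lam * I) @ v)                            # nonzero: v is not an eigenvector
print(np.linalg.matrix_power(T - lam * I, 2) @ v)   # zero: (T - 5I)^2 v = 0
```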

Lemma. Suppose $T\in \mathcal{L}(V)$ and $m$ is a nonnegative integer such that $\mathop{ker} T^m=\mathop{ker} T^{m+1}.$ Then

(1) $\{0\}=\mathop{ker} T^0 \subseteq \mathop{ker} T^1 \subseteq \cdots \subseteq \mathop{ker} T^m = \mathop{ker} T^{m+1} = \mathop{ker} T^{m+2} = \cdots$

(2) $\mathop{ker} T^{\mathop{dim} V}=\mathop{ker} T^{\mathop{dim} V+1} =\mathop{ker} T^{\mathop{dim} V+2} = \cdots$

(3) $V=\mathop{im} T^0 \supseteq \mathop{im} T^1 \supseteq \cdots \supseteq \mathop{im} T^k \supseteq \mathop{im} T^{k+1} \supseteq \cdots $

(4) $\mathop{im} T^{\mathop{dim} V}= \mathop{im} T^{\mathop{dim} V+1} = \mathop{im} T^{\mathop{dim} V+2} = \cdots$
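Parts (1) and (2) can be illustrated numerically. The sketch below (a hypothetical example, not from the notes) computes $\mathop{dim} \mathop{ker} T^k$ for increasing $k$ and shows the chain of kernels growing and then stabilizing by $k=\mathop{dim} V$:

```python
import numpy as np

def nullity(A):
    """dim ker A via the rank-nullity theorem."""
    return A.shape[0] - np.linalg.matrix_rank(A)

# Hypothetical example: a nilpotent shift operator on R^3.
T = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
dims = [nullity(np.linalg.matrix_power(T, k)) for k in range(5)]
print(dims)  # [0, 1, 2, 3, 3]: the chain stops growing by k = dim V = 3
```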

An operator is called nilpotent if some power of it equals 0.

Lemma. Suppose $N\in \mathcal{L}(V)$ is nilpotent. Then $N^{\mathop{dim} V}=0.$

Use generalized eigenvectors for finding invariant subspaces

Theorem. Let $T\in \mathcal{L}(V)$ and $\lambda\in\mathbb{F}.$ Then for every basis of $V$ with respect to which $T$ has an upper-triangular matrix, $\lambda$ appears on the diagonal of the matrix of $T$ precisely $\mathop{dim} \mathop{ker} (T-\lambda I)^{\mathop{dim} V}$ times.

The multiplicity of an eigenvalue $\lambda$ of $T$ is defined to be the dimension of the subspace of generalized eigenvectors corresponding to $\lambda$; that is, the multiplicity of $\lambda$ is equal to $\mathop{dim} \mathop{ker}(T-\lambda I)^{\mathop{dim} V}.$
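A sketch of this definition in numpy (the matrix is a hypothetical example, not from the notes): the multiplicity is the nullity of $(T-\lambda I)^{\mathop{dim} V}$, which can exceed the ordinary eigenspace dimension:

```python
import numpy as np

def multiplicity(T, lam):
    """dim ker (T - lam*I)^(dim V), per the definition above."""
    n = T.shape[0]
    M = np.linalg.matrix_power(T - lam * np.eye(n), n)
    return n - np.linalg.matrix_rank(M)

# Hypothetical example: eigenvalue 3 has a one-dimensional eigenspace,
# but its space of generalized eigenvectors is two-dimensional.
T = np.array([[2.0, 0.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 3.0]])
print(multiplicity(T, 2.0), multiplicity(T, 3.0))  # 1 2
```

Note that the multiplicities $1+2$ sum to $\mathop{dim} V=3$, consistent with the next theorem.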

Theorem. If $V$ is a complex vector space and $T\in \mathcal{L}(V)$, then the sum of the multiplicities of all the eigenvalues of $T$ equals $\mathop{dim} V.$

Let $\lambda_1,\ldots,\lambda_m$ denote the distinct eigenvalues of $T$ and let $d_j$ denote the multiplicity of $\lambda_j$ as an eigenvalue of $T.$ The polynomial $$ (z-\lambda_1)^{d_1} \cdots (z-\lambda_m)^{d_m} $$ is called the characteristic polynomial of $T.$

Theorem. Suppose that $V$ is a complex vector space and $T\in \mathcal{L}(V).$ Let $q$ denote the characteristic polynomial of $T.$ Then $q(T)=0.$
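A numerical sanity check of this theorem for a hypothetical matrix (a sketch, not from the notes); `np.poly` applied to a square matrix returns the coefficients of its characteristic polynomial, highest degree first:

```python
import numpy as np

# Hypothetical example matrix with eigenvalues 2, 3, 3.
T = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 4.0],
              [0.0, 0.0, 3.0]])
coeffs = np.poly(T)  # monic characteristic polynomial of T
# Evaluate q(T) = T^n + c_1 T^(n-1) + ... + c_n I.
qT = sum(c * np.linalg.matrix_power(T, len(coeffs) - 1 - i)
         for i, c in enumerate(coeffs))
print(np.allclose(qT, 0))  # True: q(T) = 0
```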

Theorem. If $T\in \mathcal{L}(V)$ and $p\in \mathcal{P}(\mathbb{F})$, then $\mathop{ker} p(T)$ is invariant under $T.$

Invariant Subspaces Questions

Theorem. Suppose $V$ is a complex vector space and $T\in \mathcal{L}(V).$ Let $\lambda_1,\ldots,\lambda_m$ be the distinct eigenvalues of $T$, and let $U_1,\ldots,U_m$ be the corresponding subspaces of generalized eigenvectors. Then

(1) $V=U_1\oplus \cdots \oplus U_m$;

(2) each $U_j$ is invariant under $T$;

(3) each $\left.(T-\lambda_j I) \right|_{U_j}$ is nilpotent.

Theorem. Suppose $V$ is a complex vector space and $T\in \mathcal{L}(V).$ Then there is a basis of $V$ consisting of generalized eigenvectors of $T.$

Theorem. Suppose $N$ is a nilpotent operator on $V.$ Then there is a basis of $V$ with respect to which the matrix of $N$ has the form $$ \begin{bmatrix} 0 & & * \\ & \ddots & \\ 0 & & 0 \end{bmatrix}; $$ here all entries on and below the diagonal are 0’s.

Theorem. Suppose $V$ is a complex vector space and $T\in \mathcal{L}(V).$ Let $\lambda_1,\ldots,\lambda_m$ be the distinct eigenvalues of $T.$ Then there is a basis of $V$ with respect to which $T$ has a block diagonal matrix of the form $$ \begin{bmatrix} A_1 & & 0 \\ & \ddots & \\ 0 & & A_m \end{bmatrix}, $$ where each $A_j$ is an upper-triangular matrix of the form $$ \begin{bmatrix} \lambda_j & & * \\ & \ddots & \\ 0 & & \lambda_j \end{bmatrix}. $$

Theorem. Suppose $N\in\mathcal{L}(V)$ is nilpotent. Then $I+N$ has a square root.
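The standard proof uses the binomial series for $\sqrt{1+x}$, which terminates when $x$ is replaced by a nilpotent $N.$ A sketch with a hypothetical nilpotent matrix (not from the notes):

```python
import numpy as np

# Hypothetical nilpotent operator on R^3 (strictly upper triangular).
N = np.array([[0.0, 2.0, 5.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])
n = N.shape[0]
I = np.eye(n)

# Sum the binomial series for sqrt(1 + x) with x = N; coefficients
# C(1/2, j) are built iteratively, and powers N^n and beyond vanish.
R = np.zeros_like(N)
c = 1.0
Npow = I.copy()
for j in range(n):
    R += c * Npow
    Npow = Npow @ N
    c *= (0.5 - j) / (j + 1)
print(np.allclose(R @ R, I + N))  # True: R is a square root of I + N
```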

On real vector spaces there exist invertible operators that have no square roots. For example, the operator of multiplication by $-1$ on $\mathbb{R}$ has no square root because no real number has its square equal to $-1.$

Theorem. Suppose $V$ is a complex vector space. If $T\in \mathcal{L}(V)$ is invertible, then $T$ has a square root.

The minimal polynomial of $T$ is the monic polynomial $p\in \mathcal{P}(\mathbb{F})$ of smallest degree such that $p(T)=0.$

Theorem. Let $T\in \mathcal{L}(V)$ and let $q\in \mathcal{P}(\mathbb{F}).$ Then $q(T)=0$ if and only if the minimal polynomial of $T$ divides $q.$

Theorem. Let $T\in \mathcal{L}(V).$ Then the roots of the minimal polynomial of $T$ are precisely the eigenvalues of $T.$

For every $T\in \mathcal{L}(V)$, where $V$ is a complex vector space, there is a basis of $V$ with respect to which $T$ has a nice upper-triangular matrix. We can do even better: there is a basis of $V$ with respect to which the matrix of $T$ contains zeros everywhere except possibly on the diagonal and on the line directly above the diagonal.

Suppose $N\in \mathcal{L}(V)$ is nilpotent. For each nonzero vector $v\in V$, let $m(v)$ denote the largest nonnegative integer such that $N^{m(v)}v\neq 0.$

Theorem. If $N\in \mathcal{L}(V)$ is nilpotent, then there exist vectors $v_1,\ldots,v_k \in V$ such that

(1) $\left(v_1,N v_1,\ldots,N^{m(v_1)}v_1,\ldots,v_k, N v_k, \ldots, N^{m(v_k)}v_k\right)$ is a basis of $V$;

(2) $\left(N^{m(v_1)}v_1,\ldots,N^{m(v_k)}v_k\right)$ is a basis of $\mathop{ker} N.$

Examples

Example. Suppose $T\in \mathcal{L}(V).$ Prove that if $U_1,\ldots,U_m$ are subspaces of $V$ invariant under $T$, then $U_1 + \cdots +U_m$ is invariant under $T.$ Suppose $v\in U_1+\cdots + U_m.$ Then there exist $u_1,\ldots,u_m$ such that $v=u_1+\cdots +u_m$ with $u_j\in U_j.$ Then $Tv=T u_1+\cdots + T u_m.$ Since each $U_j$ is invariant under $T$, $T u_j \in U_j$, so $T v \in U_1+ \cdots + U_m.$

Example. Suppose $T\in \mathcal{L}(V).$ Prove that the intersection of any collection of subspaces of $V$ invariant under $T$ is invariant under $T.$ Suppose we have subspaces $\{U_j\}$ with each $U_j$ invariant under $T.$ Let $v\in \cap_j U_j.$ Then $v\in U_j$ for each $j$, so by invariance $Tv\in U_j$ for each $j.$ Hence $Tv\in \cap_j U_j$, and so $\cap_j U_j$ is an invariant subspace under $T.$

Example. Prove or give a counterexample: if $U$ is a subspace of $V$ that is invariant under every operator on $V$, then $U=\{0\}$ or $U=V.$ We will prove the contrapositive: if $U$ is a subspace of $V$ with $U\neq \{0\}$ and $U\neq V$, then there exists an operator $T$ on $V$ such that $U$ is not invariant under $T.$ Let $(u_1,\ldots,u_m)$ be a basis for $U$, which we extend to a basis $(u_1,\ldots,u_m, v_1,\ldots,v_n)$ of $V.$ The assumption $U\neq \{0\}$ and $U\neq V$ means that $m\geq 1$ and $n\geq 1.$ Define a linear map $T$ on this basis by $Tu_1=v_1$ and $Tw=0$ for every other basis vector $w.$ Since $u_1\in U$ but $Tu_1=v_1\not \in U$, the subspace $U$ is not invariant under the operator $T.$

Example. Suppose that $S,T\in \mathcal{L}(V)$ are such that $S T= T S.$ Prove that $\text{null }(T-\lambda I)$ is invariant under $S$ for every $\lambda \in F.$ Suppose $v\in \mathop{ker}(T-\lambda I).$ Then $Tv = \lambda v$ and, using $TS=ST$, $Sv$ satisfies $ T(S v)=S(T v)=S(\lambda v)=\lambda(S v). $ Thus $S v\in \mathop{ker}(T-\lambda I)$ and so $\mathop{ker} (T-\lambda I)$ is an invariant subspace under $S.$

Example. Define $T\in \mathcal{L}(F^2)$ by $T(w,z)=(z,w).$ Find all eigenvalues and eigenvectors of $T.$ Suppose $(w,z)\neq (0,0)$ and $T(w,z)=(z,w)=\lambda(w,z).$ Then $z=\lambda w$ and $w=\lambda z.$ Of course this leads to $w=\lambda z=\lambda^2w$, $z=\lambda w=\lambda^2 z.$ Since $w\neq 0$ or $z\neq 0$, we see that $\lambda^2=1$ so that $\lambda =\pm 1.$ A basis of eigenvectors is $(w_1,z_1)=(1,1)$, $(w_2,z_2)=(-1,1)$ and they have eigenvalues 1 and $-1$ respectively.

Example. Define $T\in \mathcal{L}(F^3)$ by $T(z_1,z_2,z_3)=(2z_2,0,5z_3).$ Find all eigenvalues and eigenvectors of $T.$ Suppose $(z_1,z_2,z_3)\neq (0,0,0)$ and $$T(z_1,z_2,z_3)=(2z_2,0,5z_3)=\lambda (z_1,z_2,z_3).$$ If $\lambda=0$ then $z_2=z_3=0$, so the eigenvectors for eigenvalue 0 are the nonzero multiples of $v_1=(1,0,0).$ If $\lambda\neq 0$, then the second coordinate gives $\lambda z_2=0$, so $z_2=0$; the first coordinate then gives $\lambda z_1=2z_2=0$, so $z_1=0$; and the third coordinate gives $5z_3=\lambda z_3$ with $z_3\neq 0$, so $\lambda =5.$ The eigenvectors for $\lambda=5$ are the nonzero multiples of $v_2=(0,0,1).$ These are the only eigenvalues, and each eigenspace is one-dimensional.

Example. Suppose $n$ is a positive integer and $T\in \mathcal{L}(\mathbb{F}^n)$ is defined by $$ T(x_1,\ldots,x_n)=(x_1+ \cdots + x_n,\ldots,x_1+\cdots +x_n). $$ Find all eigenvalues and eigenvectors of $T.$ First, any nonzero vector of the form $(\alpha,\ldots,\alpha)$, for $\alpha\in \mathbb{F}$, is an eigenvector with eigenvalue $n.$ Second, any nonzero vector $(x_1,\ldots,x_n)$ with $x_1+\cdots + x_n=0$ is an eigenvector with eigenvalue 0. A full set of $n$ independent eigenvectors is $v_1=(1,1,\ldots,1)$ together with $v_j=e_1-e_j$ for $j=2,\ldots,n$, where $e_j$ denotes the $j$-th standard basis vector.
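A quick numerical check of this example in the hypothetical case $n=4$: the matrix of $T$ in the standard basis is the all-ones matrix, whose eigenvalues should be $n$ once and 0 with multiplicity $n-1$:

```python
import numpy as np

n = 4
T = np.ones((n, n))  # matrix of T in the standard basis
vals = np.sort(np.linalg.eigvals(T).real)
print(vals)  # eigenvalue 0 with multiplicity 3, eigenvalue 4 once
```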

Example. Suppose $T\in \mathcal{L}(V)$ is invertible and $0\neq \lambda \in F.$ Prove that $\lambda$ is an eigenvalue of $T$ if and only if $\frac{1}{\lambda}$ is an eigenvalue of $T^{-1}.$
Suppose $v\neq 0$ and $T v =\lambda v.$ Then $v=T^{-1}T v=\lambda T^{-1}v$, or $T^{-1}v=\frac{1}{\lambda}v$, and the other direction is similar.

Example. Suppose $S,T\in \mathcal{L}(V).$ Prove that $S T$ and $T S$ have the same eigenvalues. Suppose $v\neq 0$ and $STv=\lambda v.$ Apply $T$ to get $T S(Tv)=\lambda T v.$ Thus if $Tv\neq 0$, then $\lambda$ is also an eigenvalue of $TS$, with nonzero eigenvector $Tv.$ On the other hand, if $T v=0$, then $\lambda v = STv=0$, so $\lambda =0$ is an eigenvalue of $ST.$ In this case $T$ is not invertible, so $\mathop{im} TS \subseteq \mathop{im} T$ is not equal to $V$; hence $TS$ has a nontrivial null space, and 0 is an eigenvalue of $TS$ as well.

Example. Suppose $T\in \mathcal{L}(V)$ is such that every vector in $V$ is an eigenvector of $T.$ Prove that $T$ is a scalar multiple of the identity operator. Pick a basis $(v_1,\ldots,v_N)$ for $V.$ By assumption, $T v_n=\lambda_n v_n$ for each $n.$ Pick any two distinct indices $m,n.$ Since $v_m+v_n$ is also an eigenvector, $T(v_m+v_n)=\lambda(v_m+v_n)$ for some scalar $\lambda$; on the other hand, $T(v_m+v_n)=\lambda_m v_m + \lambda_n v_n.$ Write this as $0=(\lambda-\lambda_m)v_m+(\lambda-\lambda_n)v_n.$ Since $v_m$ and $v_n$ are independent, $\lambda=\lambda_m=\lambda_n$, and so all the $\lambda_n$ are equal.

Example. Suppose $S, T\in \mathcal{L}(V)$ and $S$ is invertible. Prove that if $p \in \mathcal{P}(\mathbb{F})$ is a polynomial, then $p(S T S^{-1})=S p(T) S^{-1}.$ First let’s show that for positive integers $n$, $(STS^{-1})^n=S T^n S^{-1}.$ We may do this by induction, with nothing to show if $n=1.$ Assume it’s true for $n=k$, and consider $$ (STS^{-1})^{k+1}=(STS^{-1})^k(STS^{-1})=ST^k S^{-1}STS^{-1}=S T^{k+1}S^{-1}. $$ Now suppose $p(z)=a_N z^N+\cdots + a_1 z+a_0.$ Then

\begin{align} p(STS^{-1})&=\sum_{n=0}^N a_n (STS^{-1})^n=\sum_{n=0}^N a_n ST^n S^{-1}\\ & =S\left( \sum_{n=0}^N a_n T^n \right) S^{-1}=Sp(T)S^{-1}. \end{align}
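The identity can be sanity-checked numerically with randomly generated (hence almost surely invertible) matrices and a sample polynomial $p(z)=z^2+3z+1$; this is a sketch, not part of the notes:

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3))
S = rng.standard_normal((3, 3))  # generic, hence invertible
Sinv = np.linalg.inv(S)

def p(A):
    """Evaluate the sample polynomial p(z) = z^2 + 3z + 1 at a matrix."""
    return A @ A + 3 * A + np.eye(3)

lhs = p(S @ T @ Sinv)
rhs = S @ p(T) @ Sinv
print(np.allclose(lhs, rhs))  # True: p(S T S^-1) = S p(T) S^-1
```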

Example. Suppose $F=\mathbb{C}$, $T\in \mathcal{L}(V)$, $p\in \mathcal{P}(\mathbb{C})$, and $\alpha\in \mathbb{C}.$ Prove that $\alpha$ is an eigenvalue of $p(T)$ if and only if $\alpha=p(\lambda)$ for some eigenvalue $\lambda $ of $T.$ Suppose first that $v\neq 0$ is an eigenvector of $T$ with eigenvalue $\lambda$; that is, $T v = \lambda v.$ Then for positive integers $n$, $T^n v=T^{n-1} \lambda v = \cdots = \lambda^n v$, and so $p(T)v=p(\lambda) v.$ That is, $\alpha=p(\lambda)$ is an eigenvalue of $p(T)$ if $\lambda$ is an eigenvalue of $T.$ Conversely, suppose now that $\alpha$ is an eigenvalue of $p(T)$, so there is a $v\neq 0$ with $p(T)v =\alpha v$, or $(p(T)-\alpha I)v=0.$ Since $\mathbb{F}=\mathbb{C}$, we may factor the polynomial $p(z)-\alpha$ into linear factors $p(z)-\alpha=c(z-\lambda_1)\cdots(z-\lambda_N)$, so that $$ 0=(p(T)-\alpha I)v=c(T-\lambda_1 I)\cdots (T-\lambda_N I)v. $$ Since $v\neq 0$, the product operator is not injective, so at least one of the factors is not injective; that is, at least one of the $\lambda_n$, say $\lambda_1$, is an eigenvalue of $T.$ Let $w\neq 0$ be an eigenvector for $T$ with eigenvalue $\lambda_1.$ Then $$ 0=c(T-\lambda_N I)\cdots (T-\lambda_1 I)w=(p(T)-\alpha I) w, $$ so $w$ is an eigenvector for $p(T)$ with eigenvalue $\alpha.$ Indeed, by the first part of the argument, $p(T)w=p(\lambda_1)w=\alpha w$ and $\alpha=p(\lambda_1).$

Example. Show that the previous exercise does not hold with $F=\mathbb{R}.$ Take $T: \mathbb{R}^2 \rightarrow \mathbb{R}^2$ given by $T(x,y)=(-y,x).$ We’ve seen previously that $T$ has no real eigenvalues. On the other hand, $T^2(x,y)=(-x,-y)=-1(x,y).$
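Numerically, with the rotation matrix representing $T(x,y)=(-y,x)$ (a sketch of the counterexample above):

```python
import numpy as np

# Matrix of T(x, y) = (-y, x) in the standard basis of R^2.
T = np.array([[0.0, -1.0],
              [1.0,  0.0]])
print(np.isreal(np.linalg.eigvals(T)))  # [False False]: no real eigenvalues
print(np.allclose(T @ T, -np.eye(2)))   # True: T^2 = -I has eigenvalue -1
```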

Example. Suppose $V$ is a complex vector space and $T\in \mathcal{L}(V).$ Prove that $T$ has an invariant subspace of dimension $j$ for each $j=1,\ldots, \mathop{dim} V.$ Let $(v_1,\ldots,v_N)$ be a basis with respect to which $T$ has an upper-triangular matrix. Then by a previous theorem, $\mathop{span}(v_1,\ldots,v_j)$ is invariant under $T$ for each $j=1,\ldots,N.$

Example. Give an example of an operator whose matrix with respect to some basis contains only 0’s on the diagonal, but the operator is invertible. Consider $T=\begin{bmatrix} 0 & 1 \\ 1 & 0 \end{bmatrix}$: its diagonal entries are all 0, yet $T$ is invertible since $T^2=I.$

Example. Give an example of an operator whose matrix with respect to some basis contains only nonzero numbers on the diagonal, but the operator is not invertible. Take $T=\begin{bmatrix} 1 & 1 \\ 1 & 1 \end{bmatrix}.$ If $v=(1,-1)$, then $Tv=0$, so $T$ is not invertible.

Example. Give an example of an operator on $\mathbb{C}^4$ whose characteristic and minimal polynomials both equal $z (z-1)^2(z-3).$

Example. Give an example of an operator on $\mathbb{C}^4$ whose characteristic polynomial equals $z (z-1)^2(z-3)$ and whose minimal polynomial equals $z (z-1)(z-3).$
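Candidate answers to the two exercises above can be checked numerically. In this sketch the candidate matrices are assumptions, not given in the notes: for the first exercise, put a Jordan block of size 2 at eigenvalue 1 (so the minimal polynomial needs the factor $(z-1)^2$); for the second, use a diagonal matrix, so the minimal polynomial has only simple roots:

```python
import numpy as np

def is_zero_poly_of(A, roots_with_mult):
    """Check whether prod (A - r*I)^m is the zero matrix."""
    n = A.shape[0]
    P = np.eye(n)
    for r, m in roots_with_mult:
        P = P @ np.linalg.matrix_power(A - r * np.eye(n), m)
    return np.allclose(P, 0)

# First exercise: Jordan block of size 2 at eigenvalue 1.
A = np.diag([0.0, 1.0, 1.0, 3.0])
A[1, 2] = 1.0
print(is_zero_poly_of(A, [(0, 1), (1, 2), (3, 1)]))  # True:  z(z-1)^2(z-3) kills A
print(is_zero_poly_of(A, [(0, 1), (1, 1), (3, 1)]))  # False: z(z-1)(z-3) does not

# Second exercise: diagonal matrix, same characteristic polynomial,
# but the minimal polynomial drops to z(z-1)(z-3).
B = np.diag([0.0, 1.0, 1.0, 3.0])
print(is_zero_poly_of(B, [(0, 1), (1, 1), (3, 1)]))  # True
```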

Example. Suppose $a_0, \ldots, a_{n-1}\in \mathbb{C}.$ Find the minimal and characteristic polynomials of the operator on $\mathbb{C}^n$ whose matrix is $$ \begin{bmatrix} 0 & & & & & -a_0 \\ 1 & 0 & & & & -a_1 \\ & 1 & \ddots & & & -a_2 \\ & & \ddots & \ddots & & \vdots \\ & & & 1 & 0 & -a_{n-2} \\ & & & & 1 & -a_{n-1} \end{bmatrix} $$ with respect to the standard basis.
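Both polynomials of this companion-style matrix equal $z^n + a_{n-1}z^{n-1}+\cdots+a_1 z + a_0.$ A numerical sketch with hypothetical coefficients $a_0=6$, $a_1=-5$, $a_2=2$ (so $n=3$):

```python
import numpy as np

a = [6.0, -5.0, 2.0]            # hypothetical a_0, a_1, a_2, so n = 3
n = len(a)
C = np.zeros((n, n))
C[1:, :-1] = np.eye(n - 1)      # subdiagonal of 1's
C[:, -1] = [-ai for ai in a]    # last column: -a_0, ..., -a_{n-1}
coeffs = np.poly(C)             # characteristic polynomial, highest degree first
print(np.round(coeffs, 6))      # [1, 2, -5, 6], i.e. z^3 + 2z^2 - 5z + 6
```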
