
Inner Products and Orthonormal Bases


Master of Science in Mathematics
Lecture Notes

Recall that the norm of $x\in \mathbb{R}^n$ defined by $\left|\left |x\right|\right | = \sqrt{x_1^2+\cdots+x_n^2}$ is not linear. To inject linearity into the discussion we introduce the dot product: for $x,y\in \mathbb{R}^n$ the dot product of $x$ and $y$ is defined as $x \cdot y=x_1 y_1+\cdots +x_n y_n.$ Obviously $x \cdot x=\left|\left |x\right|\right |^2$, and because the dot product is so useful, we generalize it to the notion of an inner product on a vector space $V.$

An inner product on $V$ is a function that takes each ordered pair $(u,v)$ of elements of $V$ to a number $\left \langle u,v \right \rangle \in\mathbb{F}$ and has the following properties:

  • (positivity) $\left \langle v,v \right \rangle \geq 0$ for all $v\in V$;
  • (definiteness) $\left \langle v,v \right \rangle =0$ if and only if $v=0$;
  • (additivity in the first slot) $\left \langle u+v,w \right \rangle = \left \langle u,w \right \rangle +\left \langle v,w \right \rangle $ for all $u,v,w\in V$;
  • (homogeneity in the first slot) $\left \langle av, w \right \rangle =a\left \langle v,w \right \rangle $ for all $a\in \mathbb{F}$ and all $v,w\in V$;
  • (conjugate symmetry) $\left \langle v,w \right \rangle = \overline{\left \langle w,v \right \rangle }$ for all $v,w\in V.$

Recall that for $z\in \mathbb{C}^n$, we define the norm of $z$ by $$ \left|\left| z\right|\right| =\sqrt{|z_1|^2+\cdots + |z_n|^2} $$ where the absolute values are needed because we want $\left|\left| z\right|\right| $ to be a non-negative number. Then \begin{equation}\label{cp} \left|\left| z \right|\right| ^2=z_1 \overline{z_1}+\cdots + z_n \overline{z_n} \end{equation} because every $\lambda\in\mathbb{C}$ satisfies $|\lambda|^2 =\lambda \overline{\lambda}.$ Since $\left|\left| z\right|\right|^2$ should be the inner product of $z$ with itself, as in $\mathbb{R}^n$, Equation \eqref{cp} suggests that the inner product of $w\in \mathbb{C}^n$ with $z$ should equal $$ w_1 \overline{z_1}+\cdots+w_n \overline{z_n}. $$ We should expect that the inner product of $w$ with $z$ equals the complex conjugate of the inner product of $z$ with $w$, thus motivating the definition of conjugate symmetry.

Definition. An inner-product space is a vector space $V$ along with an inner product on $V.$ The inner product defined on $\mathbb{F}^n$ by $$ \left \langle (w_1,\ldots,w_n),(z_1,\ldots,z_n) \right\rangle = w_1 \overline{z_1}+\cdots + w_n \overline{z_n} $$ is called the Euclidean inner product.
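As a quick numeric sketch of this definition (assuming NumPy is available; the helper `inner` is ours, not a library routine), one can check conjugate symmetry and positivity directly:

```python
import numpy as np

def inner(w, z):
    # Euclidean inner product <w,z> = w_1 conj(z_1) + ... + w_n conj(z_n)
    return np.sum(w * np.conj(z))

w = np.array([1 + 2j, 3 - 1j])
z = np.array([2 - 1j, 1 + 4j])

print(np.isclose(inner(w, z), np.conj(inner(z, w))))          # conjugate symmetry
print(np.isclose(inner(z, z).real, np.linalg.norm(z) ** 2))   # <z,z> = ||z||^2 >= 0
```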

From now on, $V$ denotes a real or complex inner-product space. In this section we develop the basic properties of the norm. For $v\in V$, the norm of $v$ is defined by $||v||=\sqrt{\left \langle v, v \right \rangle }.$ Two vectors $u,v\in V$ are orthogonal if $\left \langle u, v \right\rangle = 0 .$

Theorem. (Pythagorean Theorem) If $u$ and $v$ are orthogonal vectors in $V$, then $$ \left|\left| u + v \right|\right|^2 = \left|\left| u \right|\right|^2 + \left|\left| v \right|\right|^2. $$

Proof. Suppose that $u,v$ are orthogonal vectors in $V.$ Then $$ \left|\left| u+v\right|\right|^2=\left \langle u+v,u+v \right \rangle=\left|\left| u \right|\right|^2+\left|\left| v \right|\right|^2 + \left \langle u,v\right \rangle+\left \langle v,u \right \rangle = \left|\left| u \right|\right|^2 + \left|\left| v \right|\right|^2, $$ as desired.

Theorem. (Orthogonal Decomposition) If $u,v\in V$ and $v$ is nonzero, then $u$ can be written as a scalar multiple of $v$ plus a vector orthogonal to $v.$

Proof. Let $a\in \mathbb{F}.$ Then $$ u=a v+(u-av). $$ Thus we need to choose $a$ so that $v$ is orthogonal to $(u-a v).$ In other words, we want $$ 0=\left \langle u-av,v\right \rangle = \left \langle u,v\right \rangle-a \left \langle v,v\right \rangle = \left \langle u,v\right \rangle-a\left|\left| v \right|\right|^2. $$ The equation above shows that we should choose $a$ to be $\left \langle u,v\right \rangle/\left|\left| v \right|\right|^2$ (here the hypothesis $v\ne 0$ guarantees we are not dividing by 0).
Making this choice of $a$, we can write $$ u=\frac{ \left \langle u,v\right \rangle}{\left|\left| v \right|\right|^2} v+\left(u-\frac{\left \langle u,v\right \rangle }{\left|\left| v \right|\right|^2}v\right). $$ Thus, if $v\neq 0$ then the equation above writes $u$ as a scalar multiple of $v$ plus a vector orthogonal to $v.$
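A minimal numeric sketch of this decomposition (assuming NumPy, with real vectors for simplicity):

```python
import numpy as np

u = np.array([3.0, 1.0, 2.0])
v = np.array([1.0, 1.0, 0.0])

a = np.dot(u, v) / np.dot(v, v)     # a = <u,v> / ||v||^2
w = u - a * v                       # the part of u orthogonal to v

print(np.allclose(u, a * v + w))    # u = a v + w
print(np.isclose(np.dot(w, v), 0))  # w is orthogonal to v
```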

Theorem. (Cauchy-Schwarz) If $u, v \in V$, then $$ | \left \langle u,v\right \rangle | \leq \left|\left| u \right|\right| \, \left|\left| v \right|\right| $$ where equality holds if and only if one of $u, v$ is a scalar multiple of the other.

Proof. Let $u,v\in V.$ If $v=0$, then both sides equal $0$ and the desired inequality holds. Thus we can assume that $v\neq 0.$ Consider the orthogonal decomposition $$ u=\frac{\left \langle u,v\right \rangle }{\left|\left| v \right|\right|^2}v+w $$ where $w$ is orthogonal to $v.$ By the Pythagorean theorem, \begin{align} \left|\left| u \right|\right|^2 & =\left|\left| \frac{\left \langle u,v\right \rangle }{\left|\left| v \right|\right|^2} v \right|\right| ^2+\left|\left| w\right|\right|^2\\ & =\frac{|\left \langle u,v\right \rangle |^2}{\left|\left| v \right|\right|^2}+\left|\left| w \right|\right|^2\\ & \geq \frac{|\left \langle u,v\right \rangle |^2}{\left|\left| v \right|\right|^2}. \end{align} Multiplying both sides by $\left|\left| v \right|\right|^2$ and then taking square roots gives the Cauchy-Schwarz inequality. Notice that there is equality if and only if $w=0$, that is, if and only if $u$ is a multiple of $v.$
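A brief numeric illustration of the inequality and its equality case (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(0)
u, v = rng.standard_normal(5), rng.standard_normal(5)

# |<u,v>| <= ||u|| ||v||, strict unless u and v are parallel
print(abs(np.dot(u, v)) <= np.linalg.norm(u) * np.linalg.norm(v))

# equality when u is a scalar multiple of v
u2 = 3.0 * v
print(np.isclose(abs(np.dot(u2, v)), np.linalg.norm(u2) * np.linalg.norm(v)))
```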

Theorem. (Triangle Inequality) If $u, v \in V$, then $$ \left|\left|u+v\right|\right| \leq \left|\left| u \right|\right|+\left|\left| v \right|\right| $$ where equality holds if and only if one of $u, v$ is a nonnegative multiple of the other.

Proof. Let $u,v\in V.$ Then

\begin{align} \left|\left| u+v \right|\right| ^2 & =\left \langle u+v,u+v \right \rangle \\ &= \left \langle u,u\right \rangle + \left \langle v,v\right \rangle + \left \langle u,v\right \rangle + \left \langle v,u\right \rangle \\ &= \left|\left| u \right|\right|^2 + \left|\left| v \right|\right|^2 + \left \langle u,v\right \rangle +\overline{\left \langle u,v\right \rangle } \\ &\leq \left|\left| u \right|\right|^2+\left|\left| v \right|\right|^2+2 | \left \langle u,v\right \rangle | \label{ti1} \\ &\leq \left|\left| u \right|\right|^2+\left|\left| v \right|\right|^2+2 \left|\left| u \right|\right|\left|\left| v \right|\right| \label{ti2} \\ &= \left(\left|\left| u \right|\right|+\left|\left| v \right|\right|\right)^2 \end{align}

and so taking square roots of both sides yields the triangle inequality. This proof shows that the triangle inequality is an equality if and only if we have equality in \eqref{ti1} and \eqref{ti2}. Thus we have equality in the triangle inequality if and only if \begin{equation}\label{ti3} \left \langle u,v \right \rangle =\left|\left| u \right|\right|\left|\left| v \right|\right|. \end{equation} If one of $u,v$ is a nonnegative multiple of the other, then \eqref{ti3} holds. Conversely, suppose \eqref{ti3} holds. Then the condition for equality in the Cauchy-Schwarz inequality implies that one of $u,v$ must be a scalar multiple of the other. Clearly, then \eqref{ti3} forces the scalar in question to be nonnegative, as desired.

Theorem. (Parallelogram Equality) If $u, v \in V$, then $$ \left|\left| u+v\right|\right| ^2+\left|\left| u-v\right|\right| ^2= 2 \left( \left|\left| u \right|\right|^2+\left|\left| v \right|\right|^2 \right). $$

Proof. If $u, v \in V$, then \begin{align} \left|\left| u+v\right|\right| ^2+\left|\left| u-v\right|\right| ^2 &= \left \langle u+v,u+v \right \rangle+\left \langle u-v,u-v \right \rangle \\ & = \left|\left| u \right|\right|^2+\left|\left| v \right|\right|^2+\left \langle u,v \right \rangle + \left \langle v,u \right \rangle + \left|\left| u \right|\right|^2+\left|\left| v \right|\right|^2-\left \langle u,v \right \rangle -\left \langle v,u \right \rangle \\ & =2 \left( \left|\left| u \right|\right|^2+\left|\left| v \right|\right|^2 \right ) \end{align} as desired.

A list of vectors is called orthonormal if the vectors in it are pairwise orthogonal and each vector has norm 1.

Theorem. If $(e_1,\ldots,e_m)$ is an orthonormal list of vectors in $V$, then \begin{equation}\label{innorm} \left|\left| a_1 e_1+\cdots +a_m e_m\right|\right| ^2=|a_1|^2+\cdots + |a_m|^2 \end{equation} for all $a_1,\ldots,a_m\in\mathbb{F}.$

Proof. Because each $e_j$ has norm 1 and the $e_j$'s are pairwise orthogonal, this follows easily from repeated application of the Pythagorean theorem.

Theorem. Every orthonormal list of vectors is linearly independent.

Proof. Suppose $(e_1,\ldots,e_n)$ is an orthonormal list of vectors in $V$ and $a_1,\ldots,a_n\in \mathbb{F}$ are such that $a_1 e_1+\cdots + a_n e_n=0.$ Then by \eqref{innorm}, $|a_1|^2+\cdots + |a_n|^2=\left|\left| a_1 e_1+\cdots + a_n e_n\right|\right|^2=0$, which means that all the $a_j$'s are 0, as desired.

An orthonormal basis of $V$ is an orthonormal list of vectors in $V$ that is also a basis of $V.$

The importance of orthonormal bases stems mainly from the following theorem.

Theorem. Suppose $(e_1,\ldots,e_n)$ is an orthonormal basis of $V.$ Then \begin{equation} \label{inim1} v=\left \langle v,e_1 \right \rangle e_1 + \cdots + \left \langle v,e_n \right \rangle e_n \end{equation} and \begin{equation} \label{inim2} \left|\left| v \right|\right|^2=|\left \langle v,e_1 \right \rangle |^2+\cdots + |\left \langle v,e_n \right \rangle |^2 \end{equation} for every $v\in V.$

Proof. Let $v\in V.$ Because $(e_1,\ldots,e_n)$ is a basis of $V$, there exist scalars $a_1,\ldots,a_n$ such that $v=a_1 e_1+\cdots + a_n e_n.$ Take the inner product of both sides of this equation with $e_j$, getting $\left \langle v,e_j \right \rangle =a_j.$ Thus \eqref{inim1} holds. Clearly \eqref{inim2} follows from \eqref{inim1} and \eqref{innorm}.
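The coordinate formula \eqref{inim1} is easy to test numerically. Here is a sketch, assuming NumPy; a QR factorization of a random matrix manufactures an orthonormal basis of $\mathbb{R}^4$:

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))  # columns of Q: orthonormal basis
v = rng.standard_normal(4)

coords = Q.T @ v                  # coords[j] = <v, e_j>
print(np.allclose(v, Q @ coords))                               # v = sum <v,e_j> e_j
print(np.isclose(np.linalg.norm(v) ** 2, np.sum(coords ** 2)))  # norm formula
```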

Theorem. (Gram-Schmidt) If $(v_1,\ldots,v_m)$ is a linearly independent list of vectors in $V$, then there exists an orthonormal list $(e_1,\ldots,e_m)$ of vectors in $V$ such that \begin{equation} \label{gmeq} \operatorname{span}(v_1,\ldots,v_j)=\operatorname{span}(e_1,\ldots,e_j) \end{equation} for $j=1,\ldots,m.$

Proof. Suppose $(v_1,\ldots,v_m)$ is a linearly independent list of vectors in $V.$ To construct the $e$'s, start by setting $e_1=\frac{v_1}{\left|\left| v_1\right|\right| }.$ This satisfies \eqref{gmeq} for $j=1.$ We will choose $e_2,\ldots,e_m$ inductively, as follows. Suppose $j>1$ and an orthonormal list $(e_1,\ldots,e_{j-1})$ has been chosen so that \begin{equation} \label{gspan} \operatorname{span}(v_1,\ldots,v_{j-1})=\operatorname{span}(e_1,\ldots,e_{j-1}). \end{equation} Let \begin{equation}\label{gsproj} e_j=\frac{v_j- \left \langle v_j,e_1\right \rangle e_1-\cdots - \left \langle v_j,e_{j-1} \right \rangle e_{j-1} }{\left|\left| v_j - \left \langle v_j,e_1 \right \rangle e_1-\cdots - \left \langle v_j,e_{j-1} \right \rangle e_{j-1} \right|\right| }. \end{equation} Note that $v_j \not\in \operatorname{span}(v_1,\ldots,v_{j-1})$ (because $(v_1,\ldots,v_m)$ is linearly independent) and thus $v_j\not \in \operatorname{span}(e_1,\ldots,e_{j-1}).$ Hence we are not dividing by 0 in the equation above, and so $e_j$ is well-defined. Dividing a vector by its norm produces a new vector with norm 1; thus $\left|\left| e_j\right|\right| =1.$

Let $1\leq k <j.$ Then

\begin{align} \left \langle e_j,e_k \right \rangle & = \frac{\left \langle v_j,e_k\right\rangle -\left \langle v_j,e_k \right\rangle \left \langle e_k,e_k \right\rangle}{\left|\left| v_j-\left \langle v_j,e_1 \right\rangle e_1-\cdots - \left \langle v_j,e_{j-1} \right\rangle e_{j-1} \right|\right| } \\ &=0. \end{align}

Thus $(e_1,\ldots,e_j)$ is an orthonormal list.

From \eqref{gsproj}, we see that $v_j\in \operatorname{span}(e_1,\ldots,e_j).$ Combining this information with \eqref{gspan} shows that $$ \operatorname{span}(v_1,\ldots,v_{j})\subseteq \operatorname{span}(e_1,\ldots,e_{j}). $$ Both lists above are linearly independent (the $v$'s by hypothesis, the $e$'s by orthonormality). Thus both subspaces above have dimension $j$, and hence must be equal, completing the proof.
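The construction in the proof translates almost line for line into code. A minimal sketch, assuming NumPy and a linearly independent input list:

```python
import numpy as np

def gram_schmidt(vs):
    """Orthonormalize a linearly independent list of vectors (the rows of vs)."""
    es = []
    for v in vs:
        # subtract the projections onto the e's chosen so far
        for e in es:
            v = v - np.dot(v, e) * e
        es.append(v / np.linalg.norm(v))  # norm is nonzero by independence
    return np.array(es)

vs = np.array([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])
E = gram_schmidt(vs)
print(np.allclose(E @ E.T, np.eye(3)))  # the rows of E are orthonormal
```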

Theorem. Every finite-dimensional inner-product space has an orthonormal basis.

Proof. Choose a basis of $V.$ Apply the Gram-Schmidt procedure to it, producing an orthonormal list. This list is linearly independent and it spans $V.$ Thus it is an orthonormal basis.

Theorem. Every orthonormal list of vectors in $V$ can be extended to an orthonormal basis of $V.$

Proof. Suppose $(e_1, \ldots,e_m)$ is an orthonormal list of vectors in $V.$ Then $(e_1, \ldots , e_m)$ is linearly independent and so can be extended to a basis $$ \mathcal{B}=(e_1, \ldots, e_m, v_1, \ldots, v_n) $$ of $V.$ Now apply the Gram-Schmidt procedure to $\mathcal{B}$, producing an orthonormal list $(e_1,\ldots,e_m,f_1,\ldots,f_n)$; here the Gram-Schmidt procedure leaves the first $m$ vectors unchanged because they are already orthonormal. Clearly $(e_1,\ldots,e_m,f_1,\ldots,f_n)$ is an orthonormal basis of $V$ because it is linearly independent and its span equals $V.$ Hence we have our extension of $(e_1,\ldots,e_m)$ to an orthonormal basis of $V.$

Recall that if $V$ is a complex vector space, then for each operator on $V$ there is a basis with respect to which the matrix of the operator is upper-triangular. Now for inner-product spaces we would like to know whether the same can be accomplished with an orthonormal basis.

Theorem. Suppose $T\in\mathcal{L}(V).$ If $T$ has an upper-triangular matrix with respect to some basis of $V$, then $T$ has an upper-triangular matrix with respect to some orthonormal basis of $V.$

Proof. Suppose $T$ has upper-triangular matrix with respect to some basis $(v_1,\ldots,v_n)$ of $V.$ Thus $\operatorname{span}(v_1,\ldots,v_j)$ is invariant under $T$ for each $j=1,\ldots,n.$ Apply the Gram-Schmidt procedure to $(v_1,\ldots,v_n)$, producing an orthonormal basis $(e_1,\ldots,e_n)$ of $V.$ Because $$ \operatorname{span}(e_1,\ldots,e_j)=\operatorname{span}(v_1,\ldots,v_j) $$ for each $j$, we conclude that $\operatorname{span}(e_1,\ldots,e_j)$ is invariant under $T$ for each $j=1,\ldots,n.$ Thus, $T$ has an upper-triangular matrix with respect to the orthonormal basis $(e_1,\ldots,e_n).$

Theorem. Suppose $V$ is a complex vector space and $T\in \mathcal{L}(V).$ Then $T$ has an upper-triangular matrix with respect to some orthonormal basis of $V.$

Proof. Every operator on a complex vector space has an upper-triangular matrix with respect to some basis, so this follows immediately from the previous theorem.

If $U$ is a subset of an inner-product space $V$, then the orthogonal complement of $U$ is defined as $U^\bot=\{v\in V\, : \, \left \langle v, u \right \rangle =0 \text{ for all } u\in U\}.$

Theorem. (Orthogonal Decomposition) If $U$ is a subspace of an inner-product space $V$, then $V=U\oplus U^\perp.$

Proof. Suppose that $U$ is a subspace of $V.$ First we will show that \begin{equation}\label{sumfirst} V=U + U^\perp. \end{equation} To do this, suppose $v\in V.$ Let $(e_1,\ldots,e_m)$ be an orthonormal basis of $U.$ Obviously, $$ v = \underbrace{ \left \langle v,e_1 \right \rangle e_1 + \cdots + \left \langle v,e_m \right \rangle e_m }_u + \underbrace{ v-\left \langle v,e_1 \right \rangle e_1-\cdots -\left \langle v,e_m \right \rangle e_m}_w. $$ Clearly, $u\in U.$ Because $(e_1,\ldots,e_m)$ is an orthonormal list, for each $j$ we have $$ \left \langle w,e_j \right \rangle = \left \langle v,e_j \right \rangle - \left \langle v,e_j \right \rangle = 0. $$ Thus $w$ is orthogonal to every vector in $\operatorname{span}(e_1,\ldots,e_m).$ In other words, $w\in U^\perp$, completing the proof of \eqref{sumfirst}.

If $v\in U\cap U^\perp$, then $v$ (which is in $U$) is orthogonal to every vector in $U$ (including $v$ itself), which implies that $\left \langle v,v \right \rangle = 0$, which implies that $v=0.$ Thus \begin{equation}\label{orthogonalss} U\cap U^\perp=\{0\}. \end{equation} Now \eqref{sumfirst} and \eqref{orthogonalss} imply that $V=U\oplus U^\perp.$
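A numeric sketch of the decomposition $v=u+w$ with $u\in U$ and $w\in U^\perp$ (assuming NumPy; QR supplies an orthonormal basis of $U$):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 2))   # columns of A span a subspace U of R^5
Q, _ = np.linalg.qr(A)            # columns of Q: orthonormal basis (e_1, e_2) of U
v = rng.standard_normal(5)

u = Q @ (Q.T @ v)                 # u = <v,e_1> e_1 + <v,e_2> e_2, an element of U
w = v - u
print(np.allclose(Q.T @ w, 0))    # w is orthogonal to U
```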

Theorem. If $U$ is a subspace of an inner-product space $V$, then $$ U=(U^\perp)^\perp. $$

Proof. Suppose that $U$ is a subspace of $V.$ First we will show that \begin{equation} \label{subsetorth} U\subseteq (U^\perp)^\perp. \end{equation} To do this, suppose that $u\in U.$ Then $\left \langle u,v \right \rangle = 0$ for every $v\in U^\perp$ (by definition of $U^\perp$). Because $u$ is orthogonal to every vector in $U^\perp$, we have $u\in (U^\perp)^\perp$, completing the proof of \eqref{subsetorth}.

To prove the inclusion in the other direction, suppose $v\in (U^\perp)^\perp.$ We can write $v=u+w$, where $u\in U$ and $w\in U^\perp.$ We have $v-u=w\in U^\perp.$ Because $v\in (U^\perp)^\perp$ and $u\in(U^\perp)^\perp$ (from \eqref{subsetorth}), we have $v-u\in (U^\perp)^\perp.$ Thus $v-u\in U^\perp\cap (U^\perp)^\perp$, which implies that $v=u$, which implies that $v\in U.$ Thus $(U^\perp)^\perp \subseteq U$, which along with \eqref{subsetorth} completes the proof.

Let $V=U \oplus U ^\bot$ and for $v\in V$ write $v=u+w$ where $u\in U$ and $w\in U ^\bot.$ Then $u$ is called the orthogonal projection of $v$ onto $U$ and is denoted by $P_U v.$

Theorem. Suppose $U$ is a subspace of an inner-product space $V$ and $v\in V.$ Then $\left|\left| v-P_U v\right|\right| \leq \left|\left| v-u\right|\right| $ for every $u\in U.$ Furthermore, if $u\in U$ and the inequality above is an equality, then $u=P_U v.$

Proof. Suppose $u\in U.$ Then

\begin{align} \left|\left| v-P_U v\right|\right| ^2 & \leq \left|\left| v-P_Uv\right|\right|^2 + \left|\left| P_U v-u\right|\right| ^2 \label{mp1} \\ & = \left|\left| (v-P_U v)+(P_Uv-u)\right|\right| ^2 = \left|\left| v-u\right|\right| ^2 \label{mp2} \end{align}

where \eqref{mp2} comes from the Pythagorean theorem, which applies because $v-P_U v\in U^\perp$ and $P_U v-u\in U.$ Taking square roots gives the desired inequality. The inequality is an equality if and only if \eqref{mp1} is an equality, which happens if and only if $\left|\left| P_U v-u\right|\right| =0$, which happens if and only if $u=P_U v.$
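This minimization property is the basis of least-squares approximation. A quick check (assuming NumPy) that $P_U v$ beats random competitors from $U$:

```python
import numpy as np

rng = np.random.default_rng(3)
Q, _ = np.linalg.qr(rng.standard_normal((5, 2)))  # orthonormal basis of U
v = rng.standard_normal(5)
Pv = Q @ (Q.T @ v)                                # the projection P_U v

for _ in range(5):
    u = Q @ rng.standard_normal(2)                # a random element of U
    assert np.linalg.norm(v - Pv) <= np.linalg.norm(v - u)
```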

Example. Show that if $c_1,\ldots,c_n$ are positive numbers, then $$ \left \langle (w_1,\ldots,w_n),(z_1,\ldots,z_n) \right \rangle = c_1 w_1 \overline{z_1}+\cdots + c_n w_n \overline{z_n} $$ defines an inner product on $\mathbb{F}^n.$

Example. Show that if $p,q\in \mathcal{P}_m(\mathbb{F})$, then $ \left \langle p ,q \right \rangle = \int_0^1 p(x)\overline{q(x)}dx $ is an inner product on the vector space $\mathcal{P}_m(\mathbb{F}).$

Example. Show that every inner product is a linear map in the first slot, and that in the second slot it is additive and conjugate homogeneous: $\left \langle u, a v \right \rangle = \overline{a}\left \langle u,v \right \rangle.$ (In particular, on a real inner-product space an inner product is linear in each slot.)

Example. If $v\in V$ and $a\in \mathbb{F}$, then $\left|\left| a v\right|\right| = |a| \left|\left| v \right|\right|$, and if $v$ is nonzero then $u=\frac{1}{\left|\left| v \right|\right|} v$ is a unit vector. Since $\left|\left| a v\right|\right| ^2=\left \langle a v , a v \right \rangle = a \overline{a} \left \langle v,v \right \rangle =|a|^2 \left|\left| v \right|\right|^2$, taking square roots gives $\left|\left| a v\right|\right| =|a| \left|\left| v \right|\right|.$

Example. Prove that if $x, y$ are nonzero vectors in $\mathbb{R}^2$, then $$ \left \langle x,y \right \rangle =\left|\left |x\right|\right |\left|\left| y\right|\right| \cos \theta, $$ where $\theta$ is the angle between $x$ and $y.$ The law of cosines gives $$ \left|\left| x-y\right|\right| ^2=\left|\left |x\right|\right |^2+\left|\left| y\right|\right| ^2-2\left|\left |x\right|\right |\left|\left| y\right|\right| \cos \theta. $$ The left hand side of this equation is $$ \left|\left| x-y\right|\right| ^2=(x-y)\cdot (x-y)=\left|\left |x\right|\right |^2-2 (x \cdot y) +\left|\left| y\right|\right| ^2 $$ so $$ x\cdot y= \left|\left |x\right|\right |\left|\left| y\right|\right| \cos \theta. $$

Example. Suppose $u,v \in V.$ Prove that $\left \langle u,v \right \rangle = 0$ if and only if $||u||\leq ||u+a v||$ for all $a\in \mathbb{F}.$ If $\left \langle u,v \right \rangle = 0$, then by the Pythagorean theorem $$ \left|\left| u+a v\right|\right| ^2=\left|\left| u \right|\right|^2+\left|\left| a v\right|\right| ^2\geq \left|\left| u \right|\right|^2. $$ Conversely, we will prove the contrapositive, that is, we will prove: if $\left \langle u,v \right \rangle \neq0$ then there exists $a \in \mathbb{F}$ such that $\left|\left| u \right|\right|>\left|\left| u+a v\right|\right| .$ Suppose $\left \langle u,v \right \rangle \neq 0.$ Then $u$ and $v$ are both nonzero vectors. By the orthogonal decomposition, we can write \begin{equation} \label{ex3} u=\alpha v+w \end{equation} for some $\alpha \in \mathbb{F}$ and where $\left \langle w,v \right \rangle =0.$ Notice $\alpha \neq 0$ since $\left \langle u,v \right \rangle \neq 0.$ Since $v$ and $w$ are orthogonal, $$ \left|\left| u \right|\right|^2=|\alpha|^2\left|\left| v \right|\right|^2+\left|\left| w\right|\right| ^2. $$ Let $a=-\alpha.$ Then by equation \eqref{ex3}, $$ \left|\left| u+a v\right|\right| ^2=\left|\left| w\right|\right| ^2 $$ and so $$ \left|\left| u \right|\right|^2=|a|^2\left|\left| v \right|\right|^2 +\left|\left| u+a v\right|\right| ^2 > \left|\left| u+a v\right|\right| ^2, $$ which implies $$ \left|\left| u \right|\right|>\left|\left| u+a v\right|\right|, $$ as desired.

Example. Prove that $$ \left (\sum_{k=1}^{n} a_k b_k \right )^2 \leq \left ( \sum_{k=1}^n k a_k^2 \right ) \left ( \sum_{k=1}^n \frac{b_k^2}{k}\right ) $$ for all real numbers $a_1,\ldots,a_n$ and $b_1,\ldots,b_n.$ This is a simple trick. $$ \left (\sum_{k=1}^{n} a_k b_k \right )^2 = \left ( \sum_{k=1}^n \sqrt{k} a_k \frac{b_k}{\sqrt{k}} \right )^2 \leq \left ( \sum_{k=1}^n k a_k^2 \right ) \left ( \sum_{k=1}^n \frac{b_k^2}{k}\right ) $$ where the last inequality is from the Cauchy-Schwarz inequality.

Example. Suppose $u, v\in V$ are such that $||u||=3$, $||u+v||=4$, and $||u-v||=6.$ What number must $||v||$ equal? Use the parallelogram equality $$ \left|\left| u+v\right|\right| ^2+\left|\left| u-v\right|\right| ^2=2 \left(\left|\left| u \right|\right|^2+\left|\left| v \right|\right|^2\right) $$ to get $$ 16+36=2(9+\left|\left| v \right|\right|^2), \qquad \left|\left| v \right|\right|=\sqrt{17}. $$

Example. Prove or disprove: there is an inner product on $\mathbb{R}^2$ such that the associated norm is given by $||(x_1,x_2)||=|x_1|+|x_2|$ for all $(x_1,x_2)\in \mathbb{R}^2.$ There is no such inner product. Take for instance, $$ u=(1/4,0), \qquad v=(0,3/4), \qquad u+v=(1/4,3/4). $$ Then we have equality in the triangle inequality $$ 1=\left|\left| u+v\right|\right| \leq \left|\left| u \right|\right|+\left|\left| v \right|\right|=1/4+3/4=1. $$ By the equality condition in the triangle inequality, we must have $u=a v$ or $v=a u$, with $a\geq 0.$ But clearly no such $a\in \mathbb{F}$ exists.

Example. Prove that if $V$ is a real inner-product space, then $$\left \langle u,v \right \rangle =\frac{||u+v||^2-||u-v||^2}{4}$$ for all $u,v\in V.$ Expressing the norms as inner products

\begin{align} \frac{\left|\left| u+v\right|\right| ^2- \left|\left| u-v\right|\right| ^2}{4} &=\frac{\left \langle u+v,u+v \right \rangle-\left \langle u-v,u-v \right \rangle }{4} \\ &=\frac{\left \langle u,u \right \rangle + \left \langle v,v \right \rangle + \left \langle u,v \right \rangle +\left \langle v,u \right \rangle - \left \langle u,u \right \rangle -\left \langle v,v \right \rangle+\left \langle u,v \right \rangle +\left \langle v,u \right \rangle }{4} \\ &=\frac{2\left \langle u,v \right \rangle + 2\left \langle v,u \right \rangle }{4} \\ &=\frac{2\left \langle u,v \right \rangle + 2\left \langle u,v \right \rangle }{4} \quad \text{(because $V$ is a real inner product space)} \\ &=\left \langle u,v \right \rangle \end{align}

as desired.

Example. Prove that if $V$ is a complex inner-product space, then $ \left \langle u,v \right \rangle $ is $$ \frac{||u+v||^2-||u-v||^2 + ||u+i v||^2 i – ||u-i v||^2 i}{4} $$ for all $u,v\in V.$

Example. A norm on a vector space $U$ is a function $||\text{ }|| : U\rightarrow [0,\infty)$ such that $||u||=0$ if and only if $u=0$, $||\alpha u ||= |\alpha | ||u||$ for all $\alpha \in \mathbb{F}$ and all $u \in U$, and $||u+v|| \leq ||u||+||v||$ for all $u,v \in U.$ Prove that a norm satisfying the parallelogram equality comes from an inner product (in other words, show that if $|| \text{ }||$ is a norm on $U$ satisfying the parallelogram equality, then there is an inner product $\left \langle \quad , \quad \right \rangle $ on $U$ such that $||u||=\left \langle u, u \right \rangle ^{1/2}$ for all $u \in U$).

Example. Suppose $n$ is a positive integer. Prove that $$ \left ( \frac{1}{\sqrt{2\pi}},\frac{\sin x}{\sqrt{\pi}},\frac{\sin 2x}{\sqrt{\pi}},\ldots,\frac{\sin n x}{\sqrt{\pi}}, \frac{\cos x}{\sqrt{\pi}},\frac{\cos 2x}{\sqrt{\pi}},\ldots,\frac{\cos n x}{\sqrt{\pi}} \right ) $$ is an orthonormal list of vectors in $\mathcal{C}[-\pi,\pi]$, the vector space of continuous real-valued functions on $[-\pi,\pi]$ with inner product $$ \left \langle f,g \right \rangle = \int_{-\pi}^{\pi} f(x)g(x) \, d x . $$ Computation of these integrals is based on the product-to-sum formulas from trigonometry:

\begin{align} \sin(A)\sin(B)&=\frac{1}{2}\cos(A-B)-\frac{1}{2}\cos(A+B) \\ \cos(A)\cos(B)&=\frac{1}{2}\cos(A-B)+\frac{1}{2}\cos(A+B) \\ \sin(A)\cos(B)&=\frac{1}{2}\sin(A-B)+\frac{1}{2}\sin(A+B). \end{align}

Here is a sample computation, valid for $m,n=1,2,3,\ldots$ when $m\neq n.$

\begin{align} \left \langle \sin(mx),\cos(nx) \right \rangle &= \int_{-\pi}^{\pi}\sin(mx)\cos(nx) \, dx \\ &= \int_{-\pi}^{\pi}\left(\frac{1}{2}\sin((m-n)x)+\frac{1}{2}\sin((m+n)x)\right)\, dx \\ &= \left. \frac{-1}{2(m-n)}\cos\left((m-n)x\right)+\frac{-1}{2(m+n)}\cos\left((m+n)x\right) \right|^{\pi}_{-\pi} \\ &=0, \end{align} where the last step holds because cosine is even, so each antiderivative takes the same value at $\pi$ and $-\pi.$

Example. On $\mathcal{P}_2(\mathbb{R})$, consider the inner product given by $$ \left \langle p,q \right \rangle = \int_0^1 p(x) q(x) \, d x. $$ Apply the Gram-Schmidt procedure to the basis $(1,x,x^2)$ to produce an orthonormal basis of $\mathcal{P}_2(\mathbb{R}).$ Computing $e_1$: A calculation gives $$ \left \langle 1,1 \right \rangle =\int_0^1 1 \, dx=1, $$ so $e_1=1.$ Computing $e_2$: A calculation gives $$ \left \langle x,1 \right \rangle = \int_0^1 x \, dx =\frac{1}{2}. $$ Let $$ f_2=x-1/2. $$ Then $$ \left|\left| f_2\right|\right| ^2=\left \langle x-1/2,x-1/2 \right \rangle =\int_0^1\left(x^2-x+\frac{1}{4}\right)\, dx =\frac{1}{12}, $$ so $$ e_2=\frac{f_2}{\left|\left| f_2\right|\right| }=2\sqrt{3}\left(x-\frac{1}{2}\right). $$ Computing $e_3$: A calculation gives $$ \left \langle x^2,1 \right \rangle = \int_0^1 x^2 \, dx =\frac{1}{3} $$ and $$ \left \langle x^2, 2 \sqrt{3} \left(x-\frac{1}{2}\right) \right\rangle =2\sqrt{3}\int_0^1\left(x^3-\frac{x^2}{2}\right) \, dx =\frac{1}{2\sqrt{3}}. $$ Let $$ f_3=x^2-\frac{1}{2\sqrt{3}}\cdot 2\sqrt{3}\left(x-\frac{1}{2}\right)-\frac{1}{3}=x^2-x+\frac{1}{6}. $$ Then $$ \left|\left| f_3\right|\right| ^2=\int_0^1\left(x^2-x+\frac{1}{6}\right)^2 \, dx=\frac{1}{180}, $$ so $$ e_3=\frac{f_3}{\left|\left| f_3\right|\right| }=6\sqrt{5}\left(x^2-x+\frac{1}{6}\right). $$
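These calculations can be confirmed symbolically; here is a sketch assuming SymPy is available:

```python
from sympy import Rational, integrate, simplify, sqrt, symbols

x = symbols('x')
e1 = 1
e2 = 2 * sqrt(3) * (x - Rational(1, 2))
e3 = 6 * sqrt(5) * (x**2 - x + Rational(1, 6))

def ip(p, q):
    # inner product <p,q> = integral of p*q over [0,1]
    return integrate(p * q, (x, 0, 1))

print([[simplify(ip(p, q)) for q in (e1, e2, e3)] for p in (e1, e2, e3)])
# expected output: the 3x3 identity matrix
```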

Example. What happens if the Gram-Schmidt procedure is applied to a list of vectors that is not linearly independent? By examining the proof, notice that the numerator in the Gram-Schmidt formula is the difference between $v_j$ and the orthogonal projection $P_U v_j$ of $v_j$ onto the subspace $$ U=\operatorname{span}(v_1,\ldots,v_{j-1})=\operatorname{span}(e_1,\ldots,e_{j-1}). $$ If $v_j\in U$, then $v_j-P_U v_j=0$, so the numerator has norm 0 and division by the denominator is not defined. The algorithm can be adapted to handle this case by testing for 0 in the denominator. If 0 is found, throw $v_j$ out of the list and continue. The result will be an orthonormal basis for $\operatorname{span}(v_1,\ldots,v_m).$
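A sketch of the adapted procedure (assuming NumPy; the tolerance `tol` is our choice, not part of the mathematics):

```python
import numpy as np

def gram_schmidt_pruned(vs, tol=1e-12):
    """Orthonormalize, discarding vectors that are (numerically) dependent."""
    es = []
    for v in vs:
        for e in es:
            v = v - np.dot(v, e) * e
        n = np.linalg.norm(v)
        if n > tol:                # keep v_j only if it adds a new direction
            es.append(v / n)
    return np.array(es)

vs = np.array([[1.0, 0.0], [2.0, 0.0], [1.0, 1.0]])  # second vector is dependent
print(gram_schmidt_pruned(vs))   # only two orthonormal vectors survive
```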

Example. Suppose $V$ is a real inner-product space and $(v_1,\ldots,v_m)$ is a linearly independent list of vectors in $V.$ Prove that there exist exactly $2^m$ orthonormal lists $(e_1,\ldots,e_m)$ of vectors in $V$ such that span $(v_1,\ldots,v_j)=$ span $(e_1,\ldots,e_j)$ for all $j\in \{1,\ldots,m\}.$

Example. Suppose $(e_1,\ldots,e_M)$ is an orthonormal list of vectors in $V.$ Let $v \in V.$ Prove that $$ ||v||^2=\sum_{n=1}^M \left|\left \langle v,e_n \right \rangle\right|^2 $$ if and only if $v \in \operatorname{span}(e_1,\ldots,e_M).$ Extend $(e_1,\ldots,e_M)$ to an orthonormal basis $(e_1,\ldots,e_N)$ of $V.$ Then $$ v=\sum_{n=1}^N\left \langle v,e_n \right \rangle e_n $$ and $$ \left|\left| v \right|\right|^2=\sum_{n=1}^N \left|\left \langle v,e_n \right \rangle \right|^2. $$ If $v\in \operatorname{span}(e_1,\ldots,e_M)$ then $\left \langle v,e_n \right \rangle = 0$ for $n>M$, and $$ \left|\left| v \right|\right|^2=\sum_{n=1}^M \left|\left \langle v,e_n \right \rangle \right|^2. $$ If $v \not\in\operatorname{span}(e_1,\ldots,e_M)$ then for some $n>M$ we have $\left \langle v,e_n \right \rangle \neq 0.$ This gives $$ \left|\left| v \right|\right|^2=\sum_{n=1}^N\left|\left \langle v,e_n \right \rangle \right|^2>\sum_{n=1}^M\left|\left \langle v,e_n \right \rangle\right|^2. $$

Example. Find an orthonormal basis of $\mathcal{P}_2(\mathbb{R})$, such that the differentiation operator on $\mathcal{P}_2(\mathbb{R})$ has an upper-triangular matrix with respect to this basis.

Example. Suppose $U$ is a subspace of $V.$ Prove that $\mathop{dim} U^{\bot} = \mathop{dim} V – \mathop{dim} U.$

Example. Suppose $U$ is a subspace of $V.$ Prove that $U^{\bot} = \{0\}$ if and only if $U=V.$ If $U=V$ and $v\in U^\perp$, then $v\in U\cap U^\perp$, so $\left \langle v,v \right \rangle =0$ and $v=0$; thus $U^\perp=\{0\}.$ Conversely, if $U^\perp=\{0\}$, then $V=U\oplus U^\perp=U.$

Example. Prove that if $P\in \mathcal{L}(V)$ is such that $P^2=P$ and every vector in null $P$ is orthogonal to every vector in range $P$, then $P$ is an orthogonal projection.

Example. Prove that if $P\in \mathcal{L}(V)$ is such that $P^2=P$ and $|| P v ||\leq ||v||$ for every $v\in V$, then $P$ is an orthogonal projection.

Example. Suppose $T\in \mathcal{L}(V)$ and $U$ is a subspace of $V.$ Prove that $U$ is invariant under $T$ if and only if $P_U T= T P_U.$

Example. Suppose $T\in \mathcal{L}(V)$ and $U$ is a subspace of $V.$ Prove that $U$ and $U^\bot$ are both invariant under $T$ if and only if $P_U T= T P_U.$

Example. In $\mathbb{R}^4$, let $U=\operatorname{span} \left ( (1,1,0,0),(1,1,1,2) \right ).$ Find $u\in U$ such that $\left|\left| u-(1,2,3,4)\right|\right| $ is as small as possible. We want the orthogonal projection $P_U(1,2,3,4).$ Notice that $U=\operatorname{span}((1,1,0,0),(0,0,1,2)).$ An orthonormal basis for $U$ is $$ \left( \frac{1}{\sqrt{2}},\frac{1}{\sqrt{2}} ,0,0\right ),\left(0,0,\frac{1}{\sqrt{5}},\frac{2}{\sqrt{5}} \right). $$ Thus the desired vector is $$ P_U(1,2,3,4)=\left(\frac{3}{2},\frac{3}{2},0,0\right)+\left(0,0,\frac{11}{5},\frac{22}{5}\right)=\left(\frac{3}{2},\frac{3}{2},\frac{11}{5},\frac{22}{5}\right). $$
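Checking the arithmetic numerically (assuming NumPy):

```python
import numpy as np

U = np.array([[1.0, 1.0, 0.0, 0.0], [1.0, 1.0, 1.0, 2.0]]).T  # columns span U
Q, _ = np.linalg.qr(U)                  # orthonormal basis of U
v = np.array([1.0, 2.0, 3.0, 4.0])

print(Q @ (Q.T @ v))   # expect (3/2, 3/2, 11/5, 22/5) = (1.5, 1.5, 2.2, 4.4)
```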

Example. Find a polynomial $p\in \mathcal{P}_3(\mathbb{R})$ such that $p(0)=0$, $p'(0)=0$, and $$\int_0^1 |2+3x-p(x) |^2 \, dx$$ is as small as possible.

Example. Find a polynomial $p\in \mathcal{P}_5(\mathbb{R})$ such that $$\int_{-\pi}^{\pi} |\sin x - p(x) |^2 \, dx$$ is as small as possible.

Example. Find a polynomial $q\in \mathcal{P}_2(\mathbb{R})$ such that $$ \phi(p)=p \left( \frac{1}{2} \right) = \int_{0}^{1} p(x) \, q(x) \, dx $$ for every $p\in \mathcal{P}_2(\mathbb{R}).$ Here is the direct approach. Every $q\in\mathcal{P}_2(\mathbb{R})$ can be expressed as $$ \alpha+\beta(x-1/2)+\gamma(x-1/2)^2. $$ The desired polynomial $q$ must satisfy $$ p(x)=1: \qquad \phi(1)=p(1/2)=1=\int_0^1\left( \alpha+\beta(x-1/2)+\gamma(x-1/2)^2 \right) \, dx = \alpha+\gamma \frac{1}{12}. $$ Moving to $p(x)=x-1/2$ we find \begin{align} p(x)&=x-1/2: \qquad \phi(x-1/2)=p(1/2)=0 \\ &=\int_0^1(x-1/2)[\alpha+\beta(x-1/2)+\gamma(x-1/2)^2] \, dx =\beta \frac{1}{12}, \end{align} so $\beta=0.$ Finally $$ p(x)=(x-1/2)^2: \qquad \phi((x-1/2)^2)=p(1/2)=0 $$ $$ =\int_0^1(x-1/2)^2[\alpha+\beta(x-1/2)+\gamma(x-1/2)^2] \, dx =\alpha \frac{1}{12}+\gamma\frac{1}{80}. $$ Solving gives $$ \alpha=\frac{9}{4}, \qquad \beta=0, \qquad \gamma=-15. $$ Thus $$ q(x)=\frac{9}{4}-15(x-1/2)^2. $$

Example. Find a polynomial $q\in \mathcal{P}_2(\mathbb{R})$ such that $$ \phi(p)=\int_{0}^{1} p(x) \, (\cos \pi x) \, dx= \int_{0}^{1} p(x) \, q(x) \, dx $$ for every $p\in \mathcal{P}_2(\mathbb{R}).$ Taking the same approach as in the previous example, we compute $$ p(x)=1: \qquad \phi(1)=\int_0^1\cos(\pi x) \, dx=0 $$ $$ =\int_0^1[\alpha+\beta(x-1/2)+\gamma(x-1/2)^2]\,dx=\alpha+\gamma\frac{1}{12}. $$ Moving to $p(x)=x-1/2$ we find $$ p(x)=x-1/2: \qquad \phi(x-1/2)=\int_0^1(x-1/2)\cos(\pi x)\, dx=\int_0^1 x \cos (\pi x) \, dx $$ $$ =-\frac{2}{\pi^2}=\int_0^1(x-1/2)[\alpha+\beta(x-1/2)+\gamma(x-1/2)^2]\, dx=\beta\frac{1}{12}, $$ so $\beta=\frac{-24}{\pi^2}.$ Finally, since $\cos(\pi x)$ is odd about $x=1/2$, $$ p(x)=(x-1/2)^2: \qquad \phi((x-1/2)^2)=\int_0^1(x-1/2)^2\cos(\pi x)\, dx =0 $$ $$ =\int_0^1(x-1/2)^2[\alpha+\beta(x-1/2)+\gamma(x-1/2)^2]\, dx=\alpha \frac{1}{12} +\gamma\frac{1}{80}. $$ Solving gives $$ \alpha=\gamma=0, \qquad \beta=\frac{-24}{\pi^2}. $$ Thus $$ q(x)=\frac{-24}{\pi^2}(x-1/2). $$
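A symbolic confirmation of this representation (a sketch assuming SymPy is available):

```python
from sympy import Rational, cos, integrate, pi, simplify, symbols

x = symbols('x')
q = -24 / pi**2 * (x - Rational(1, 2))

for p in (1, x, x**2):  # enough to check on a basis of P_2(R)
    lhs = integrate(p * cos(pi * x), (x, 0, 1))
    rhs = integrate(p * q, (x, 0, 1))
    print(simplify(lhs - rhs))  # prints 0 three times
```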

Example. Give an example of a real vector space $V$ and $T\in \mathcal{L}(V)$ such that trace$(T^2) < 0.$

Example. Suppose $V$ is a real vector space, $T\in \mathcal{L}(V)$, and $V$ has a basis consisting of eigenvectors of $T.$ Prove that trace$(T^2)\geq 0.$

Example. Suppose $V$ is an inner-product space and $v, w\in V.$ Define $T\in \mathcal{L}(V)$ by $T u=\langle u,v \rangle w.$ Find a formula for trace $T.$

Example. Prove that if $P\in \mathcal{L}(V)$ satisfies $P^2=P$, then trace $P$ is a nonnegative integer.

Example. Prove that if $V$ is an inner-product space and $T\in \mathcal{L}(V)$, then trace $T^*=\overline{\text{trace} T}.$

Example. Suppose $V$ is an inner-product space and $T\in \mathcal{L}(V)$ is a positive operator with trace $T=0.$ Prove that $T=0.$

Example. Suppose $T\in \mathcal{L}(\mathbb{C}^3)$ is the operator whose matrix is $$ \begin{bmatrix} 51 & -12 & -21 \\ 60 & -40 & -28 \\ 57 & -68 & 1 \end{bmatrix} . $$ If $-48$ and $24$ are eigenvalues of $T$, find the third eigenvalue of $T.$

Example. Prove or give a counterexample: if $T\in \mathcal{L}(V)$ and $c\in \mathbb{F}$, then trace$(c T ) = c$ trace $T.$

Example. Prove or give a counterexample: if $S, T\in \mathcal{L}(V)$, then trace $(S T)=(\text{trace} S)(\text{trace} T).$

Example. Suppose $T\in \mathcal{L}(V).$ Prove that if $\text{trace} (ST)=0$ for all $S\in \mathcal{L}(V)$, then $T=0.$

Example. Suppose $V$ is an inner-product space and $T\in \mathcal{L}(V).$ Prove that if $(e_1,\ldots,e_n)$ is an orthonormal basis of $V$, then $ \text{trace}(T^* T)=|| T e_1||^2+\cdots + ||T e_n||^2. $ Conclude that the right side of the equation above is independent of which orthonormal basis $(e_1,\ldots,e_n)$ is chosen for $V.$
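For intuition: with the standard basis this says trace$(T^*T)$ is the squared Frobenius norm of the matrix of $T.$ A quick numeric illustration (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(4)
T = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

lhs = np.trace(T.conj().T @ T).real                     # trace(T* T)
rhs = sum(np.linalg.norm(T @ e) ** 2 for e in np.eye(3))  # sum ||T e_j||^2
print(np.isclose(lhs, rhs))
```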

Example. Suppose $V$ is a complex inner-product space and $T\in \mathcal{L}(V).$ Let $\lambda_1,\ldots,\lambda_n$ be the eigenvalues of $T$, repeated according to multiplicity, and suppose $$ \begin{bmatrix} a_{1,1} & \cdots & a_{1,n} \\ \vdots & & \vdots \\ a_{n,1} & \cdots & a_{n,n} \\ \end{bmatrix}$$ is the matrix of $T$ with respect to some orthonormal basis of $V.$ Prove that $$ |\lambda_1|^2+\cdots + |\lambda_n|^2\leq \sum^n_{k=1} \sum_{j=1}^n |a_{j,k}|^2. $$

Example. Suppose $V$ is an inner-product space. Prove that $\langle S, T\rangle=\text{trace}(S T^*)$ defines an inner-product on $\mathcal{L}(V).$

Example. Suppose $V$ is an inner-product space and $T\in \mathcal{L}(V).$ Prove that if $||T^* v ||\leq ||T v||$ for every $v \in V$, then $T$ is normal.

Example. Prove or give a counterexample: if $T\in \mathcal{L}(V)$ and $c\in \mathbb{F}$, then $\det(cT)=c^{\mathop{dim} V} \det T.$

Example. Prove or give a counterexample: if $T\in \mathcal{L}(V)$, then $\det(S+T)=\det S+ \det T.$

Example. Suppose $A$ is a block upper-triangular matrix $$A=\begin{bmatrix} A_1 & & * \\ & \ddots & \\ 0 & & A_m\end{bmatrix},$$ where each $A_j$ along the diagonal is a square matrix. Prove that $\det A =(\det A_1) \cdots (\det A_m).$
