
- Calculus, Volume 2: Multi-Variable Calculus and Linear Algebra with Applications to Differential Equations and Probability
- Tom M. Apostol
- Second Edition
- 1991
- 978-1-119-49676-2
1.15 Orthogonal components. Projections
$\quad$ Let $V$ be a Euclidean space and let $S$ be a finite-dimensional subspace. We wish to consider the following type of approximation problem: Given an element $x$ in $V,$ to determine an element in $S$ whose distance from $x$ is as small as possible. The distance between two elements $x$ and $y$ is defined to be the norm $\|x - y\|.$
$\quad$ Definition. $\quad$Let $S$ be a subset of a Euclidean space $V.$ An element in $V$ is said to be orthogonal to $S$ if it is orthogonal to every element of $S.$ The set of all elements orthogonal to $S$ is denoted by $S^{\perp}$ and is called "S perpendicular".
$\quad$ Theorem 1.15. $\quad$ Orthogonal Decomposition Theorem. $\quad$ Let $V$ be a Euclidean space and let $S$ be a finite-dimensional subspace of $V.$ Then every element $x$ in $V$ can be represented uniquely as a sum of two elements, one in $S$ and one in $S^{\perp}.$ That is, we have: \begin{align*} (1.17) \qquad x = s + s^{\perp}, \qquad \text{where} \quad s \in S \qquad \text{and} \quad s^{\perp} \in S^{\perp} \end{align*} Moreover, the norm of $x$ is given by the Pythagorean formula: \begin{align*} (1.18) \qquad \|x\|^2 = \|s\|^2 + \|s^{\perp}\|^2 \end{align*}
$\quad$Proof.$\quad$ First, we show that an orthogonal decomposition of $x$ exists. Since $S$ is finite-dimensional, it has an orthonormal basis, say $\{e_1, \dots, e_n\}.$ Given $x,$ define the elements $s$ and $s^{\perp}$ as follows:
\begin{align*}
(1.19) \qquad s &= \sum_{i=1}^n (x, e_i)e_i,
\qquad s^{\perp} = x - s.
\end{align*}
Since $s$ is a linear combination of the basis elements $e_1, \dots, e_n,$ it lies in their linear span, which is $S.$ To show that $s^{\perp}$ is orthogonal to every element of $S,$ we take the inner product of $s^{\perp}$ with an arbitrary basis element $e_j$ for $j \leq n:$
\begin{align*}
(s^{\perp}, e_j) &= (x - s, e_j)
\\
&= (x, e_j) - (s, e_j)
\end{align*}
Using $s$ as defined in $(1.19)$ and noting that $(e_i, e_j) = 0$ for $i \neq j$ and $(e_j, e_j) = 1,$ we find that the inner product $(s, e_j) = (x, e_j),$ and hence $(s^{\perp}, e_j) = 0$ for every $j \leq n.$ Since every element of $S$ is a linear combination of the $e_j,$ this makes $s^{\perp}$ orthogonal to all elements of $S.$
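The construction of $s$ and $s^{\perp}$ can be checked numerically. The following is a minimal sketch, not from the text: $V = \mathbb{R}^3$ with the standard dot product, $S$ spanned by two orthonormal vectors $e_1, e_2$ chosen for the example.

```python
import numpy as np

# Illustrative choice: S spanned by the orthonormal vectors e1, e2 in R^3
e1 = np.array([1.0, 0.0, 0.0])
e2 = np.array([0.0, 1.0, 0.0])
x = np.array([3.0, -2.0, 5.0])

# s = sum_i (x, e_i) e_i, as in (1.19); s_perp = x - s
s = np.dot(x, e1) * e1 + np.dot(x, e2) * e2
s_perp = x - s

# (s_perp, e_j) = 0 for each basis element e_j of S
print(np.dot(s_perp, e1), np.dot(s_perp, e2))  # 0.0 0.0
```

Here $s = (3, -2, 0)$ and $s^{\perp} = (0, 0, 5),$ and both inner products vanish as the proof predicts.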
$\quad$ To show that $x$ is represented uniquely by $s$ and $s^{\perp},$ suppose $x$ can be represented in two ways:
\begin{align*}
x &= s + s^{\perp},
\qquad
\text{and}
\qquad
x = t + t^{\perp}
\end{align*}
where $s$ and $t$ are elements of $S$ and $s^{\perp}$ and $t^{\perp}$ are elements of $S^{\perp}.$ We wish to show that $s - t = 0$ and $s^{\perp} - t^{\perp} = 0.$ Since both sums equal $x,$ we have $s + s^{\perp} = t + t^{\perp},$ which gives us $s - t = t^{\perp} - s^{\perp}.$ If we take the inner product of $s - t$ with itself, we find that:
\begin{align*}
(s - t, s - t) &= (s - t, s) - (s - t, t)
\\
&= (s, t^{\perp} - s^{\perp}) - (t, t^{\perp} - s^{\perp})
\\
&= (s, t^{\perp}) - (s, s^{\perp}) - (t, t^{\perp}) + (t, s^{\perp})
\end{align*}
By definition, $s^{\perp}$ and $t^{\perp}$ are orthogonal to all elements of $S,$ and $s$ and $t$ belong to $S,$ so each inner product on the right-hand side is zero. But $(s - t, s - t) = 0$ implies that $s - t = O,$ hence $s = t.$ Then $t^{\perp} - s^{\perp} = s - t = O$ as well, so $s^{\perp} = t^{\perp}.$ This shows that the decomposition is unique.
$\quad$ To show that the norm of $x$ satisfies the Pythagorean formula $(1.18),$ we expand $\|x\|^2 = (x, x)$ using the decomposition $x = s + s^{\perp}:$
\begin{align*}
(x, x) &= (s + s^{\perp}, s + s^{\perp})
\\
&= (s, s) + 2(s, s^{\perp}) + (s^{\perp}, s^{\perp})
\\
&= \|s\|^2 + 2(s, s^{\perp}) + \|s^{\perp}\|^2
\end{align*}
But we know that $(s, s^{\perp}) = 0,$ giving us $\|x\|^2 = \|s\|^2 + \|s^{\perp}\|^2. \quad \blacksquare$
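As a quick numerical check of $(1.18),$ here is a sketch in NumPy; the vectors are illustrative choices, not from the text, with an orthonormal basis that is not axis-aligned:

```python
import numpy as np

# S spanned by two orthonormal vectors in R^3 (illustrative choice)
e1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)
e2 = np.array([0.0, 0.0, 1.0])
x = np.array([2.0, 0.0, 3.0])

s = np.dot(x, e1) * e1 + np.dot(x, e2) * e2   # projection, as in (1.19)
s_perp = x - s

# Pythagorean formula (1.18): ||x||^2 = ||s||^2 + ||s_perp||^2
lhs = np.dot(x, x)
rhs = np.dot(s, s) + np.dot(s_perp, s_perp)
print(bool(np.isclose(lhs, rhs)))  # True
```

Here $s = (1, 1, 3)$ and $s^{\perp} = (1, -1, 0),$ so $\|x\|^2 = 13 = 11 + 2 = \|s\|^2 + \|s^{\perp}\|^2.$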
$\quad$Definition.$\quad$ Let $S$ be a finite-dimensional subspace of a Euclidean space $V,$ and let $\{e_1, \dots, e_n\}$ be an orthonormal basis for $S.$ If $x \in V,$ the element $s$ defined by the equation \begin{align*} s &= \sum_{i=1}^n (x, e_i)e_i \end{align*} is called the projection of $x$ on the subspace $S.$
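This definition translates directly into code. A minimal sketch, where the function name `projection` and the sample vectors are our own illustrative choices, not from the text; the basis passed in is assumed orthonormal:

```python
import numpy as np

def projection(x, basis):
    """Projection of x on the subspace spanned by an orthonormal basis,
    computed as s = sum_i (x, e_i) e_i."""
    return sum(np.dot(x, e) * e for e in basis)

# Example: project x onto the xy-plane of R^3
basis = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
x = np.array([4.0, -1.0, 7.0])
s = projection(x, basis)
print(s)  # [ 4. -1.  0.]
```

Note that the formula is only valid when the basis is orthonormal; a non-orthogonal spanning set would require orthonormalizing first.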
$\quad$ In the next section, we will prove that the projection of $x$ on $S$ is the solution to the approximation problem presented at the beginning of this section.