
- Calculus, Volume 2: Multi-Variable Calculus and Linear Algebra with Applications to Differential Equations and Probability
- Tom M. Apostol
- Second Edition
- 1991
- 978-1-119-49676-2
1.12 Orthogonality in a Euclidean space
Definition.$\quad$ In a Euclidean space $V,$ two elements $x$ and $y$ are called orthogonal if their inner product is zero. A subset $S$ of $V$ is called an orthogonal set if $(x, y) = 0$ for every pair of distinct elements $x$ and $y$ in $S.$ An orthogonal set is called orthonormal if each of its elements has norm $1.$
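To make the definition concrete, here is a minimal sketch in Python/NumPy (an illustration, not from the text) that checks the definition for a small hypothetical set in $\mathbb{R}^3,$ using the inner product $(x, y) = \sum_i x_i\bar{y}_i$ so the convention matches the formulas in this section; the particular vectors are assumptions chosen for the example.

```python
import numpy as np

# Inner product matching the text's convention: linear in the first
# argument, conjugate-linear in the second: (x, y) = sum_i x_i * conj(y_i).
def ip(x, y):
    return np.sum(x * np.conj(y))

# A hypothetical orthogonal set in R^3 (chosen for illustration).
S = [np.array([1.0, 1.0, 0.0]),
     np.array([1.0, -1.0, 0.0]),
     np.array([0.0, 0.0, 2.0])]

# Orthogonal set: (x, y) = 0 for every pair of distinct elements.
for i in range(len(S)):
    for j in range(i + 1, len(S)):
        assert np.isclose(ip(S[i], S[j]), 0.0)

# Dividing each nonzero element by its norm yields an orthonormal set.
T = [x / np.sqrt(ip(x, x).real) for x in S]
for x in T:
    assert np.isclose(ip(x, x).real, 1.0)
```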
Theorem 1.10.$\quad$ In a Euclidean space $V,$ every orthogonal set of nonzero elements is independent. In particular, in a finite-dimensional Euclidean space with $\dim V = n,$ every orthogonal set consisting of $n$ nonzero elements is a basis for $V.$
Proof.$\quad$Let $S =\{x_1, \dots, x_n\}$ be an orthogonal set of $n$ nonzero elements in $V$ and let $a_1, \dots, a_n$ be scalars such that \begin{align*} \sum_{i = 1}^n a_ix_i &= O. \end{align*} Suppose, for contradiction, that $S$ is a dependent set; that is, $a_1, \dots, a_n$ are not all zero in the above sum. Then some $x_i$ with a nonzero coefficient can be written as a combination of the remaining $n - 1$ elements. For simplicity, suppose $a_1 \neq 0$ (we can rearrange the elements of $S$ as needed), so that $a_1x_1 \neq O.$ Then we rewrite $a_1x_1$ as \begin{align*} a_1x_1 &= -\sum_{i = 2}^n a_ix_i. \end{align*} Taking the inner product of $a_1x_1$ with itself and using the orthogonality of $S,$ we get \begin{align*} (a_1x_1, a_1x_1) &= \left(a_1x_1, -\sum_{i = 2}^n a_ix_i\right) \\ &= \sum_{i = 2}^n (a_1x_1, -a_ix_i) \\ &= \sum_{i = 2}^n -a_1\bar{a_i}(x_1, x_i) \\ &= 0. \end{align*} But this is a contradiction, since the inner product of a nonzero element with itself must be positive, and $a_1x_1 \neq O.$ Hence $S$ must be independent. From Theorem 1.7 we know that in a linear space $V$ of dimension $n,$ any set of $n$ independent elements forms a basis for $V.$ Thus, $S$ forms a basis for $V.\quad\blacksquare$
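The independence argument has a simple numerical counterpart. In the sketch below (reusing the hypothetical set from the previous example), the Gram matrix $G$ with entries $G_{ij} = (x_i, x_j)$ of an orthogonal set of nonzero elements is diagonal with positive diagonal entries, hence invertible, which forces independence; the final rank check reflects the basis conclusion of the theorem.

```python
import numpy as np

def ip(x, y):
    return np.sum(x * np.conj(y))

# The same hypothetical orthogonal set of nonzero elements in R^3.
S = np.array([[1.0, 1.0, 0.0],
              [1.0, -1.0, 0.0],
              [0.0, 0.0, 2.0]])

# Gram matrix G[i][j] = (x_i, x_j): diagonal for an orthogonal set, with
# positive diagonal entries since every x_i is nonzero.
G = np.array([[ip(a, b) for b in S] for a in S])
assert np.allclose(G, np.diag(np.diag(G)))
assert np.all(np.diag(G) > 0)

# n = 3 independent elements in a space of dimension 3 form a basis, so
# the matrix whose rows are x_1, ..., x_n has full rank.
assert np.linalg.matrix_rank(S) == 3
```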
Theorem 1.11.$\quad$ Let $V$ be a finite-dimensional Euclidean space with dimension $n,$ and assume that $S = \{e_1, \dots, e_n\}$ is an orthogonal basis for $V.$ If an element $x$ is expressed as a linear combination of the basis elements, say \begin{align*} (1.8) \qquad x &= \sum_{i = 1}^n c_ie_i, \end{align*} then its components relative to the ordered basis $(e_1, \dots, e_n)$ are given by the formulas \begin{align*} (1.9) \qquad c_j &= \frac{(x, e_j)}{(e_j, e_j)} \quad \text{for} \quad j = 1, 2, \dots, n. \end{align*} In particular, if $S$ is an orthonormal basis, each $c_j$ is given by \begin{align*} (1.10) \qquad c_j &= (x, e_j). \end{align*}
Proof.$\quad$By definition, each element of the basis is orthogonal to the remaining elements of the set. Therefore, for each $j = 1, 2, \dots, n,$ the inner product $(x, e_j)$ is \begin{align*} (x, e_j) &= \overline{(e_j, x)} \\ &= \overline{\left(e_j, \sum_{i = 1}^n c_ie_i\right)} \\ &= \overline{(e_j, c_je_j)} \\ &= \overline{\bar{c_j}(e_j, e_j)} \\ &= c_j(e_j, e_j), \end{align*} where the last step uses the fact that $(e_j, e_j)$ is real. Since $e_j \neq O,$ we have $(e_j, e_j) > 0,$ so dividing both sides by $(e_j, e_j)$ we obtain \begin{align*} c_j &= \frac{(x, e_j)}{(e_j,e_j)}. \end{align*} If $S$ is an orthonormal basis, then $\|e_j\| = (e_j, e_j)^{1/2} = 1,$ which means that $\|e_j\|^2 = (e_j, e_j) = 1,$ giving us $c_j = (x, e_j). \quad \blacksquare$
If $\{e_1, \dots, e_n\}$ is an orthonormal basis, then we can write Equation $(1.8)$ as \begin{align*} (1.11) \qquad x &= \sum_{i = 1}^n (x, e_i)e_i \end{align*}
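Formulas $(1.8)$ through $(1.11)$ are easy to check numerically. The sketch below (with a hypothetical orthogonal basis of $\mathbb{R}^3$ and an arbitrarily chosen $x,$ neither of which comes from the text) computes the components by $(1.9)$ and reconstructs $x$ by $(1.8),$ then normalizes the basis and repeats the reconstruction using $(1.11).$

```python
import numpy as np

def ip(x, y):
    return np.sum(x * np.conj(y))

# A hypothetical orthogonal (not yet normalized) basis of R^3, and a target x.
e = [np.array([1.0, 1.0, 0.0]),
     np.array([1.0, -1.0, 0.0]),
     np.array([0.0, 0.0, 2.0])]
x = np.array([3.0, 1.0, 4.0])

# Components via (1.9): c_j = (x, e_j) / (e_j, e_j).
c = [ip(x, ej) / ip(ej, ej) for ej in e]

# Reconstruction via (1.8): x = sum_j c_j e_j.
assert np.allclose(sum(cj * ej for cj, ej in zip(c, e)), x)

# With an orthonormal basis, (1.10) and (1.11) drop the denominator.
u = [ej / np.sqrt(ip(ej, ej).real) for ej in e]
assert np.allclose(sum(ip(x, uj) * uj for uj in u), x)
```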
The next theorem shows that in a finite-dimensional Euclidean space with an orthonormal basis, the inner product of two elements can be computed in terms of their components relative to that basis.
Theorem 1.12.$\quad$ Let $V$ be a finite-dimensional Euclidean space of dimension $n,$ and assume that $\{e_1, \dots, e_n\}$ is an orthonormal basis for $V.$ Then, for every pair of elements $x$ and $y$ we have \begin{align*} (1.12) \qquad (x, y) &= \sum_{i = 1}^n(x, e_i)\overline{(y, e_i)} \qquad \text{(Parseval's formula)} \end{align*} In particular, when $x = y$ we have \begin{align*} (1.13) \qquad \|x\|^2 &= \sum_{i = 1}^n \left|(x, e_i)\right|^2. \end{align*}
Proof.$\quad$ Recall from Equation $(1.11)$ that since $\{e_1, \dots, e_n\}$ is an orthonormal basis, we can write $x$ as \begin{align*} x &= \sum_{j=1}^n (x, e_j)e_j \end{align*} and similarly for $y.$ Then the inner product $(x, y)$ becomes \begin{align*} (x, y) &= \left(\sum_{j=1}^n (x, e_j)e_j, \sum_{i=1}^n (y, e_i)e_i\right) \\ &= \sum_{i = 1}^n \left(\sum_{j=1}^n (x, e_j)e_j, (y, e_i)e_i\right) \\ &= \sum_{i = 1}^n \overline{\left((y, e_i)e_i, \sum_{j=1}^n (x, e_j)e_j\right)} \\ &= \sum_{i = 1}^n \sum_{j = 1}^n \overline{\left((y, e_i)e_i, (x, e_j)e_j\right)}. \end{align*} But we know that $(e_i, e_j) = 0$ for every $j \neq i,$ so the last sum simplifies to \begin{align*} (x, y) &= \sum_{i = 1}^n \overline{\left((y, e_i)e_i, (x, e_i)e_i\right)} \\ &= \sum_{i = 1}^n \left((x, e_i)e_i, (y, e_i)e_i\right) \\ &= \sum_{i = 1}^n (x, e_i)\overline{(y, e_i)}\left(e_i, e_i\right). \end{align*} And since $\{e_1, \dots, e_n\}$ is an orthonormal basis, $(e_i, e_i) = 1$ for all $i,$ giving us \begin{align*} (x, y) &= \sum_{i = 1}^n (x, e_i)\overline{(y, e_i)} \qquad (1.12) \end{align*} If we set $y = x,$ Equation $(1.12)$ becomes \begin{align*} \|x\|^2 = (x, x) &= \sum_{i = 1}^n (x, e_i)\overline{(x, e_i)} \\ &= \sum_{i = 1}^n \left|(x, e_i)\right|^2 \qquad (1.13) \end{align*} This completes the proof.$\quad\blacksquare$
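Parseval's formula can likewise be verified numerically. The sketch below (an illustration under assumed data, not Apostol's construction) builds a hypothetical orthonormal basis of the complex space $\mathbb{C}^4$ from the columns of a unitary matrix produced by a QR factorization, then checks $(1.12)$ and $(1.13)$ for randomly chosen $x$ and $y.$

```python
import numpy as np

def ip(x, y):
    return np.sum(x * np.conj(y))

rng = np.random.default_rng(0)

# Columns of the unitary factor Q of a random complex matrix form an
# orthonormal basis of C^4 (a standard construction, assumed here).
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
Q, _ = np.linalg.qr(A)
e = [Q[:, i] for i in range(4)]

x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# Parseval's formula (1.12): (x, y) = sum_i (x, e_i) * conj((y, e_i)).
lhs = ip(x, y)
rhs = sum(ip(x, ei) * np.conj(ip(y, ei)) for ei in e)
assert np.isclose(lhs, rhs)

# Special case (1.13): ||x||^2 = sum_i |(x, e_i)|^2.
assert np.isclose(ip(x, x).real, sum(abs(ip(x, ei)) ** 2 for ei in e))
```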