- Calculus, Volume 2: Multi-Variable Calculus and Linear Algebra with Applications to Differential Equations and Probability
- Tom M. Apostol
- Second Edition
- 1991
- 978-1-119-49676-2
2.21 Miscellaneous review exercises on matrices
8. $\quad$ A square matrix $A$ is called an orthogonal matrix if $AA^t = I.$ Verify that the $2 \times 2$ matrix $\begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{bmatrix}$ is orthogonal for each real $\theta.$ If $A$ is any $n \times n$ orthogonal matrix, prove that its rows, considered as vectors in $V_n,$ form an orthonormal set.
$\quad$ Proof. $\quad$ We have \begin{align*} A &= \begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{bmatrix}, \qquad A^t = \begin{bmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{bmatrix} \end{align*} \begin{align*} AA^t &= \begin{bmatrix} \cos \theta & -\sin \theta \\ \sin \theta & \cos \theta \end{bmatrix} \begin{bmatrix} \cos \theta & \sin \theta \\ -\sin \theta & \cos \theta \end{bmatrix} \\ &= \begin{bmatrix} \cos^2 \theta + \sin^2 \theta & \cos \theta \sin \theta - \sin \theta \cos \theta \\ \sin \theta \cos \theta - \cos \theta \sin \theta & \sin^2 \theta + \cos^2 \theta \end{bmatrix} \\ &= \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} \end{align*} for all real $\theta,$ so the matrix is orthogonal for each real $\theta.$ Now let $A = (a_{ij})$ be any $n \times n$ matrix. Then \begin{align*} AA^t &= \begin{bmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & & \vdots \\ a_{n1} & \cdots & a_{nn} \end{bmatrix} \begin{bmatrix} a_{11} & \cdots & a_{n1} \\ \vdots & & \vdots \\ a_{1n} & \cdots & a_{nn} \end{bmatrix} \end{align*} From this we see that the $ij^{th}$ entry of $AA^t$ is \begin{align*} \sum_{k=1}^n a_{ik}a^t_{kj}, \end{align*} where $a^t_{kj}$ denotes the $kj^{th}$ entry of $A^t,$ namely the entry $a_{jk}$ of $A.$ This gives \begin{align*} \sum_{k=1}^n a_{ik}a^t_{kj} &= \sum_{k=1}^n a_{ik}a_{jk}. \end{align*} In other words, if $A_k$ denotes the $k^{th}$ row of $A,$ the $ij^{th}$ entry of $AA^t$ is the dot product $A_i \cdot A_j.$ Moreover, if $A$ is an orthogonal matrix, then $AA^t = I,$ so \begin{align*} A_i \cdot A_j &= \begin{cases} 1 \quad \text{if $i = j$} \\ 0 \quad \text{if $i \neq j$} \end{cases} \end{align*} Viewing the rows of $A$ as vectors in the Euclidean space $V_n$ with the dot product as inner product, we thus have $(A_i, A_j) = 0$ for distinct rows $A_i$ and $A_j,$ and $\|A_k\| = 1$ for each row $A_k.$ By definition, the rows of $A$ therefore form an orthonormal set. $\quad \blacksquare$
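The computation above can be checked numerically. The following is a minimal sketch in plain Python (no external libraries); the helper names `rotation`, `mat_mul`, `transpose`, and `dot` are ad hoc choices, not from the text. It verifies that $AA^t = I$ for the rotation matrix at several values of $\theta,$ and that the entry-wise identity $(AA^t)_{ij} = A_i \cdot A_j$ yields orthonormal rows.

```python
import math

def rotation(theta):
    # The 2x2 matrix from the exercise: [[cos t, -sin t], [sin t, cos t]]
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def transpose(A):
    return [list(row) for row in zip(*A)]

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n))
             for j in range(n)] for i in range(n)]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def is_identity(M, tol=1e-12):
    n = len(M)
    return all(abs(M[i][j] - (1.0 if i == j else 0.0)) < tol
               for i in range(n) for j in range(n))

# A A^t = I for each sampled theta, so each rotation matrix is orthogonal.
for theta in [0.0, 0.7, math.pi / 3, 2.5]:
    A = rotation(theta)
    assert is_identity(mat_mul(A, transpose(A)))

# The (i, j) entry of A A^t equals the dot product of rows i and j,
# so the rows of an orthogonal matrix form an orthonormal set.
A = rotation(0.7)
for i in range(2):
    for j in range(2):
        expected = 1.0 if i == j else 0.0
        assert abs(dot(A[i], A[j]) - expected) < 1e-12
```

The assertions pass up to floating-point tolerance; the row-by-row check mirrors the proof's observation that orthogonality of $A$ is exactly orthonormality of its rows.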