- Calculus, Volume 2: Multi-Variable Calculus and Linear Algebra with Applications to Differential Equations and Probability
- Tom M. Apostol
- Second Edition
- 1991
- 978-1-119-49676-2
2.16 Exercises
11. (a) $\quad$ Prove that a $2 \times 2$ matrix $A$ commutes with every $2 \times 2$ matrix if and only if $A$ commutes with each of the four matrices
$$\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}.$$
(b) $\quad$ Find all such matrices $A.$
Solution.
(a) $\quad$ Prove that a $2 \times 2$ matrix $A$ commutes with every $2 \times 2$ matrix if and only if $A$ commutes with each of the four matrices
$$\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}.$$
Proof.$\quad$ If $A$ commutes with every $2 \times 2$ matrix, then because the matrices $$\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix},$$ are $2 \times 2,$ it follows that $A$ commutes with each of them.
$\quad$ Now, if $A = \begin{bmatrix}a & b \\ c & d\end{bmatrix}$ commutes with the four matrices \begin{align*} \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}, \end{align*} then the commutation conditions read \begin{align*} \begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} &= \begin{bmatrix} a & 0 \\ c & 0 \end{bmatrix} = \begin{bmatrix} a & b \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} a & b \\ c & d \end{bmatrix} \\ \\ \begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} &= \begin{bmatrix} 0 & a \\ 0 & c \end{bmatrix} = \begin{bmatrix} c & d \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix} a & b \\ c & d \end{bmatrix} \\ \\ \begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix} &= \begin{bmatrix} b & 0 \\ d & 0 \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ a & b \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix} \begin{bmatrix} a & b \\ c & d \end{bmatrix} \\ \\ \begin{bmatrix} a & b \\ c & d \end{bmatrix} \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} &= \begin{bmatrix} 0 & b \\ 0 & d \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ c & d \end{bmatrix} = \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} a & b \\ c & d \end{bmatrix} \end{align*} Comparing corresponding entries, the first equality forces $b = 0$ and $c = 0,$ and the second forces $a = d$ (the third and fourth equalities yield the same conditions). Hence $A$ must be of the form $A = \begin{bmatrix}a & 0 \\ 0 & a \end{bmatrix},$ where $a$ is an arbitrary scalar.
Then, by definition of scalar multiplication, if we let \begin{align*} P &= p\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \quad Q = q\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}, \quad R = r\begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}, \quad S = s\begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} \end{align*} where $p, q, r, s$ are arbitrary scalars, we have \begin{align*} AP &= \begin{bmatrix} ap & 0 \\ 0 & 0 \end{bmatrix} = p\begin{bmatrix}a & 0 \\ 0 & a\end{bmatrix}\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} = p\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix} \begin{bmatrix}a & 0 \\ 0 & a\end{bmatrix} = \begin{bmatrix}pa & 0 \\ 0 & 0\end{bmatrix} = PA \\ \\ AQ &= \begin{bmatrix} 0 & aq \\ 0 & 0 \end{bmatrix} = q\begin{bmatrix}a & 0 \\ 0 & a\end{bmatrix}\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} = q\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} \begin{bmatrix}a & 0 \\ 0 & a\end{bmatrix} = \begin{bmatrix}0 & qa \\ 0 & 0\end{bmatrix} = QA \\ \\ AR &= \begin{bmatrix} 0 & 0 \\ ar & 0 \end{bmatrix} = r\begin{bmatrix}a & 0 \\ 0 & a\end{bmatrix}\begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix} = r\begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix} \begin{bmatrix}a & 0 \\ 0 & a\end{bmatrix} = \begin{bmatrix}0 & 0 \\ ra & 0\end{bmatrix} = RA \\ \\ AS &= \begin{bmatrix} 0 & 0 \\ 0 & as \end{bmatrix} = s\begin{bmatrix}a & 0 \\ 0 & a\end{bmatrix}\begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} = s\begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix}a & 0 \\ 0 & a\end{bmatrix} = \begin{bmatrix}0 & 0 \\ 0 & sa\end{bmatrix} = SA \end{align*} In other words, $A$ commutes with $P, Q, R,$ and $S.$ Now, if we let $B = P + Q + R + S,$ the distributive laws of matrix multiplication give us: \begin{align*} AB &= A(P + Q + R + S) \\ &= AP + AQ + AR + AS \\ &= PA + QA + RA + SA \\ &= (P + Q + R + S)A \\ &= BA \end{align*} But $B = P + Q + R + S = \begin{bmatrix} p & q \\ r & s \end{bmatrix},$ and since $p, q, r, s$ are arbitrary scalars, $B$ represents an arbitrary $2 \times 2$ matrix; hence $A$ commutes with every $2 \times 2$ matrix.
Thus, we have proven that $A$ commutes with every $2 \times 2$ matrix if and only if it commutes with each of the four matrices $$\begin{bmatrix} 1 & 0 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 0 \\ 1 & 0 \end{bmatrix}, \quad \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}. \quad \blacksquare$$
(b) $\quad$ Find all such matrices $A.$
$\quad$ As we saw in (a), a $2 \times 2$ matrix $A$ commutes with every $2 \times 2$ matrix if and only if it is a scalar multiple of the identity; that is, for an arbitrary scalar $a,$
\begin{align*}
A &= \begin{bmatrix}
a & 0
\\
0 & a
\end{bmatrix}
\quad \blacksquare
\end{align*}
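As an informal sanity check (not part of Apostol's text), the result can be verified numerically in plain Python: brute-forcing every $2 \times 2$ integer matrix with small entries confirms that exactly the scalar matrices pass the four commutation tests. The helper names (`matmul2`, `commutes_with_units`) are illustrative choices, not standard library functions.

```python
from itertools import product

def matmul2(X, Y):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# The four unit matrices from the exercise.
UNITS = [[[1, 0], [0, 0]], [[0, 1], [0, 0]],
         [[0, 0], [1, 0]], [[0, 0], [0, 1]]]

def commutes_with_units(A):
    """True iff A commutes with all four unit matrices."""
    return all(matmul2(A, E) == matmul2(E, A) for E in UNITS)

# Brute-force all 2x2 matrices with entries in {-2, ..., 2}.
commuters = [[[a, b], [c, d]]
             for a, b, c, d in product(range(-2, 3), repeat=4)
             if commutes_with_units([[a, b], [c, d]])]

# Every matrix found is of the form aI: off-diagonal entries zero,
# equal diagonal entries.
assert all(A[0][1] == 0 and A[1][0] == 0 and A[0][0] == A[1][1]
           for A in commuters)
```

With entries restricted to $\{-2, \dots, 2\}$, the search finds exactly the five scalar matrices $aI$ for $a \in \{-2, -1, 0, 1, 2\}$, in agreement with part (b).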