- Calculus, Volume 2: Multi-Variable Calculus and Linear Algebra with Applications to Differential Equations and Probability
- Tom M. Apostol
- Second Edition
- 1991
- 978-1-119-49676-2
2.5 Algebraic operations on linear transformations
$\quad$ Functions whose values lie in a given linear space $W$ can be added to each other and multiplied by the scalars of $W$ according to the following definition:
$\quad$ Definition. $\quad$ Let $S: V \to W$ and $T: V \to W$ be two functions with a common domain $V$ and with values in a linear space $W.$ If $c$ is any scalar, we define the sum $S + T$ and the product $cT$ by the equations \begin{align*} (2.4) \qquad (S + T)(x) &= S(x) + T(x), \qquad (cT)(x) = cT(x) \end{align*} for all $x$ in $V.$
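As a concrete sketch of Definition (2.4), the following snippet represents maps from $\mathbb{R}^2$ into $\mathbb{R}^2$ as plain Python functions; the particular maps `S` and `T` and the helper names `add_maps` and `scale_map` are our own illustration, not from the text.

```python
# Sketch of Definition (2.4) with V = W = R^2; the maps S and T and the
# helper names add_maps and scale_map are our own, not from the text.

def S(x):
    # S(x1, x2) = (x1 + x2, x2), a sample linear map on R^2
    return (x[0] + x[1], x[1])

def T(x):
    # T(x1, x2) = (2*x1, -x2), another sample linear map on R^2
    return (2 * x[0], -x[1])

def add_maps(S, T):
    # (S + T)(x) = S(x) + T(x), using the vector addition of W = R^2
    return lambda x: tuple(s + t for s, t in zip(S(x), T(x)))

def scale_map(c, T):
    # (cT)(x) = c * T(x), using the scalar multiplication of W = R^2
    return lambda x: tuple(c * t for t in T(x))

x = (3.0, 5.0)
print(add_maps(S, T)(x))     # prints (14.0, 0.0), i.e. S(x) + T(x)
print(scale_map(4.0, T)(x))  # prints (24.0, -20.0), i.e. 4 * T(x)
```

Note that the definitions of `add_maps` and `scale_map` use only the vector operations of $W$; nothing about $V$ beyond being a common domain is needed.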
$\quad$ We are especially interested in the case where $V$ is also a linear space having the same scalars as $W.$ In this case, we denote by $\mathscr{L}(V, W)$ the set of all linear transformations of $V$ into $W.$
$\quad$ If $S$ and $T$ are two linear transformations in $\mathscr{L}(V, W),$ we can verify that $S + T$ and $cT$ are also linear transformations in $\mathscr{L}(V, W).$ First, since $W$ is a linear space satisfying the closure axioms, for every $x$ in $V$ and every scalar $c$ the elements $S(x) + T(x)$ and $cT(x)$ lie in $W,$ so $S + T$ and $cT$ are indeed functions from $V$ into $W.$ Their linearity follows from the linearity of $S$ and $T$: for all $x, y$ in $V$ and all scalars $a, b,$ \begin{align*} (S + T)(ax + by) &= S(ax + by) + T(ax + by) \\ &= aS(x) + bS(y) + aT(x) + bT(y) \\ &= a(S + T)(x) + b(S + T)(y), \end{align*} and similarly $(cT)(ax + by) = cT(ax + by) = a(cT)(x) + b(cT)(y).$ In other words, the transformations $S + T$ and $cT$ are members of $\mathscr{L}(V, W).$
$\quad$ Moreover, with addition and scalar multiplication as defined above, the set $\mathscr{L}(V, W)$ itself becomes a new linear space, with the transformation $(-1)T$ being the negative of $T$ and the zero element being the zero transformation $O:$
$\quad$ Example. $\quad$ The zero transformation. $\quad$ The transformation $O: V \to W$ which maps each element of $V$ onto the zero element of $W$ is called the zero transformation and is denoted by $O.$
$\quad$ This shows that $\mathscr{L}(V, W)$ satisfies the closure axioms as well as the existence of negatives and the zero element. It is a straightforward matter to verify that the remaining axioms are satisfied:
Axioms for addition
Axiom 3. $\quad$ Commutative Law. $\quad$ For all $x$ and $y$ in $V,$ we have $x + y = y + x.$
$\quad$ For any element $x$ in $V$ and for any transformations $S$ and $T$ in $\mathscr{L}(V, W),$ the values $S(x)$ and $T(x)$ are elements of the linear space $W,$ where addition is commutative, so $$ S(x) + T(x) = T(x) + S(x).$$ By the definition $(S + T)(x) = S(x) + T(x),$ we thus have
\begin{align*}
(S + T)(x) &= S(x) + T(x)
\\
&= T(x) + S(x)
\\
&= (T + S)(x)
\end{align*} Since this holds for all $x$ in $V,$ it follows that $S + T = T + S.$
Axiom 4. $\quad$ Associative Law. $\quad$ For all $x, y,$ and $z$ in $V,$ we have $(x + y) + z = x + (y + z).$
$\quad$ Let $x$ be any element of $V,$ and let $R, S,$ and $T$ be transformations in $\mathscr{L}(V, W).$ Since $R(x), S(x),$ and $T(x)$ are elements of the linear space $W,$ where addition is associative, we have:
\begin{align*}
[(R + S) + T](x) &= (R + S)(x) + T(x)
\\
&= [R(x) + S(x)] + T(x)
\\
&= R(x) + [S(x) + T(x)]
\\
&= R(x) + (S + T)(x)
\\
&= [R + (S + T)](x)
\end{align*}
And since this is true for all $x$ in $V,$ we have $(R + S) + T = R + (S + T).$
Axioms for multiplication by numbers
Axiom 7. $\quad$ Associative Law. $\quad$ For every $x$ in $V$ and all real numbers $a$ and $b,$ we have $$ a(bx) = (ab)x. $$
$\quad$ Let $T$ be a transformation in $\mathscr{L}(V, W)$ and let $a$ and $b$ be any scalars. For all $x$ in $V,$ the value $T(x)$ is an element of $W,$ where Axiom 7 holds, so:
\begin{align*}
[a(bT)](x) &= a[(bT)(x)]
\\
&= a[bT(x)]
\\
&= (ab)T(x)
\\
&= [(ab)T](x)
\end{align*}
and hence, $a(bT) = (ab)T.$
Axiom 8. $\quad$ Distributive Law for Addition in $V. \quad$ For all $x$ and $y$ in $V$ and all real $a,$ we have $$a(x + y) = ax + ay.$$ $\quad$ Let $S, T$ be transformations in $\mathscr{L}(V, W)$ and let $a$ be any scalar. Since $S(x)$ and $T(x)$ are elements of $W,$ where Axiom 8 holds, we have
\begin{align*}
[a(S + T)](x) &= a[S(x) + T(x)]
\\
&= aS(x) + aT(x)
\\
&= (aS + aT)(x)
\end{align*}
for all $x$ in $V.$ It follows that $a(S + T) = aS + aT.$
Axiom 9. $\quad$ Distributive Law for Addition of Numbers. $\quad$ For all $x$ in $V$ and all real $a, b,$ we have $$ (a + b)x = ax + bx. $$ $\quad$ Let $T$ be a transformation in $\mathscr{L}(V, W)$ and let $a$ and $b$ be any scalars. Since $T(x)$ is an element of $W,$ where Axiom 9 holds, we have:
\begin{align*}
[(a + b)T](x) &= (a + b)T(x)
\\
&= aT(x) + bT(x)
\\
&= (aT + bT)(x)
\end{align*}
for all $x$ in $V.$ Hence, $(a + b)T = aT + bT.$
Axiom 10. $\quad$ Existence of Identity. $\quad$ For every $x$ in $V,$ we have $1x = x.$
$\quad$ Let $T$ be a transformation in $\mathscr{L}(V, W)$ and let $x$ be any element of $V.$ Since $T(x)$ is an element of $W,$ where Axiom 10 gives $1 \cdot T(x) = T(x),$ we have:
\begin{align*}
(1T)(x) &= 1 \cdot T(x)
\\
&= T(x)
\end{align*}
for all $x$ in $V.$ Hence, $1T = T.$
$\quad$ Having verified that all ten axioms of a linear space are satisfied by $\mathscr{L}(V, W),$ we have proven the following:
$\quad$ Theorem 2.4. $ \quad$ The set $\mathscr{L}(V, W)$ of all linear transformations of $V$ into $W$ is a linear space with the operations of addition and scalar multiplication defined as in Equation (2.4).
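Theorem 2.4 can be spot-checked numerically. The following is a minimal sketch for $V = W = \mathbb{R}^2,$ where every linear transformation is multiplication by a $2 \times 2$ matrix and the operations of (2.4) become matrix addition and scalar multiplication; NumPy and the random samples are our own additions, not the text's.

```python
import numpy as np

# Numerical spot-check of Theorem 2.4 for V = W = R^2: every linear
# transformation of R^2 is multiplication by a 2x2 matrix, and the
# operations of (2.4) become matrix addition and scalar multiplication.
# The sample matrices, scalars, and vector are our own, not the text's.
rng = np.random.default_rng(0)
R, S, T = (rng.standard_normal((2, 2)) for _ in range(3))
a, b = 2.0, -3.0
x = rng.standard_normal(2)

# Axioms 3 and 4: addition of transformations is commutative and associative
assert np.allclose((S + T) @ x, (T + S) @ x)
assert np.allclose(((R + S) + T) @ x, (R + (S + T)) @ x)

# Axioms 7, 8, 9: a(bT) = (ab)T, a(S + T) = aS + aT, (a + b)T = aT + bT
assert np.allclose((a * (b * T)) @ x, ((a * b) * T) @ x)
assert np.allclose((a * (S + T)) @ x, (a * S + a * T) @ x)
assert np.allclose(((a + b) * T) @ x, (a * T + b * T) @ x)

# Zero transformation, negatives, and Axiom 10: 1T = T
O = np.zeros((2, 2))
assert np.allclose((T + O) @ x, T @ x)
assert np.allclose((T + (-1.0) * T) @ x, O @ x)
assert np.allclose((1.0 * T) @ x, T @ x)
```

A passing run on random samples is of course no proof; the proof is the axiom-by-axiom verification above.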
A more interesting algebraic operation on linear transformations is composition or multiplication of transformations. This operation makes no use of the algebraic structure of a linear space and can be defined generally:
$\quad$Definition. $\quad$ Let $U, V, W$ be sets. Let $T: U \to V$ be a function with domain $U$ and values in $V,$ and let $S: V \to W$ be another function with domain $V$ and values in $W.$ Then, the composition ST is the function $ST: U \to W$ defined by the equation \begin{align*} (ST)(x) &= S[T(x)] \qquad \text{for every $x$ in $U.$} \end{align*} Thus, to map $x$ by the composition $ST,$ we first map $x$ by $T$ and then map $T(x)$ by $S.$
$\quad$ In previously covered topics on the composition of real-valued functions (e.g., Volume 1, Section 3.7), we have seen that composition is, in general, not commutative (that is, in general, $ST \neq TS$). However, as in the case of real-valued functions, the composition of transformations does satisfy an associative law:
$\quad$Theorem 2.5.$\quad$ If $T: U \to V,$ $S: V \to W,$ and $R: W \to X$ are three functions, then we have \begin{align*} R(ST) &= (RS)T. \end{align*}
$\quad$Proof.$\quad$ Suppose $x$ is in $U.$ By definition, we have \begin{align*} [R(ST)](x) &= R[(ST)(x)] \\ &= R[S[T(x)]] \\ &= (RS)[T(x)] \\ &= [(RS)T](x) \end{align*} Thus, $R(ST) = (RS)T. \quad \blacksquare$
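Both facts, non-commutativity in general and associativity always, can be illustrated with linear transformations of $\mathbb{R}^2$ modeled as $2 \times 2$ matrices, where composition is matrix multiplication; the sample matrices are our own choices.

```python
import numpy as np

# Composition of linear transformations of R^2, modeled as 2x2 matrix
# multiplication (our own illustration): ST need not equal TS, yet
# R(ST) = (RS)T always holds, as Theorem 2.5 asserts.
S = np.array([[0.0, -1.0], [1.0, 0.0]])  # rotation by 90 degrees
T = np.array([[2.0, 0.0], [0.0, 1.0]])   # stretch along the first axis
R = np.array([[1.0, 1.0], [0.0, 1.0]])   # a shear

assert not np.allclose(S @ T, T @ S)          # composition is not commutative
assert np.allclose(R @ (S @ T), (R @ S) @ T)  # but it is associative
```

Stretching and then rotating is genuinely different from rotating and then stretching, which is why the rotation/stretch pair makes a convenient counterexample to commutativity.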
$\quad$Definition.$\quad$ Let $T: V \to V$ be a function which maps $V$ into itself. We define integral powers of $T$ inductively as follows: \begin{align*} T^0 &= I, \qquad T^n = TT^{n-1} \qquad \text{for $n \geq 1.$} \end{align*} Here, $I$ is the identity transformation:
$\quad$Example. $\quad$ The identity transformation.$\quad$ The transformation $T: V \to V$ where $T(x) = x$ for each $x$ in $V,$ is called the identity transformation and is denoted by $I$ or $I_V.$
$\quad$ The associative law implies the law of exponents $T^mT^n = T^{m+n}$ for all nonnegative integers $m$ and $n.$ This follows by induction on $m$: for $m = 0$ we have $T^0T^n = IT^n = T^n,$ and for $m \geq 1,$ \begin{align*} T^mT^n &= (TT^{m-1})T^n \\ &= T(T^{m-1}T^n) \\ &= TT^{m+n-1} \\ &= T^{m+n}, \end{align*} where the second equality uses the associative law and the third uses the induction hypothesis. The next theorem shows that the composition of linear transformations is again linear.
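The law of exponents can be spot-checked numerically with NumPy's `matrix_power`, which computes integral powers of a matrix; the shear matrix used here is our own sample, not from the text.

```python
import numpy as np
from numpy.linalg import matrix_power

# Spot-check of the law of exponents T^m T^n = T^(m+n), using the
# shear T(x1, x2) = (x1 + x2, x2) as a sample transformation of R^2
# (our own example; matrix_power(T, 0) gives the identity I).
T = np.array([[1.0, 1.0], [0.0, 1.0]])
m, n = 3, 4

assert np.allclose(matrix_power(T, 0), np.eye(2))    # T^0 = I
assert np.allclose(matrix_power(T, m) @ matrix_power(T, n),
                   matrix_power(T, m + n))           # T^3 T^4 = T^7
```

For this particular shear, $T^n$ is the matrix $\begin{pmatrix} 1 & n \\ 0 & 1 \end{pmatrix},$ which makes the identity easy to verify by hand as well.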
$\quad$Theorem 2.6.$\quad$ If $U, V, W$ are linear spaces with the same scalars, and if $T:U \to V$ and $S:V \to W$ are linear transformations, then the composition $ST:U \to W$ is linear.
$\quad$ Proof.$\quad$ Recall from Section 2.1 that if $V$ and $W$ are linear spaces, then a function $T: V \to W$ is called a linear transformation from $V$ into $W$ if it has the following two properties:
$\quad$ (a) $\quad T(x + y) = T(x) + T(y) \quad$ for all $x$ and $y$ in $V,$
$\quad$ (b) $\quad T(cx) = cT(x) \quad$ for all $x$ in $V$ and all scalars $c.$
Now, let $x$ and $y$ be any elements of $U$ and let $a$ and $b$ be arbitrary scalars. Using properties (a) and (b) first for $T$ and then for $S,$ we have: \begin{align*} (ST)(ax + by) &= S[T(ax + by)] \\ &= S[aT(x) + bT(y)] \\ &= aS[T(x)] + bS[T(y)] \\ &= a(ST)(x) + b(ST)(y). \quad \blacksquare \end{align*}
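Theorem 2.6 can also be sketched numerically: with $S$ and $T$ given by random $2 \times 2$ matrices, the composition $ST$ preserves linear combinations. The helper name `ST` and the random samples are our own.

```python
import numpy as np

# Numerical sketch of Theorem 2.6 (names and random samples are ours):
# for linear S and T on R^2, the composition ST preserves linear
# combinations, i.e. (ST)(ax + by) = a(ST)(x) + b(ST)(y).
rng = np.random.default_rng(1)
S = rng.standard_normal((2, 2))
T = rng.standard_normal((2, 2))
x, y = rng.standard_normal(2), rng.standard_normal(2)
a, b = 1.5, -0.5

def ST(v):
    # first map v by T, then map T(v) by S
    return S @ (T @ v)

assert np.allclose(ST(a * x + b * y), a * ST(x) + b * ST(y))
```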
$\quad$ Composition can be combined with the algebraic operations of addition and multiplication by scalars in $\mathscr{L}(V, W)$ to give us the following:
$\quad$Theorem 2.7.$\quad$ Let $U, V, W$ be linear spaces with the same scalars, assume $S$ and $T$ are in $\mathscr{L}(V, W),$ and let $c$ be any scalar.
$\quad$ (a) $\quad$ For any function $R$ with values in $V,$ we have
\begin{align*}
(S + T)R &= SR + TR \qquad \text{and} \qquad (cS)R = c(SR)
\end{align*}
$\quad$ (b) $\quad$ For any linear transformation $R: W \to U,$ we have
\begin{align*}
R(S + T) &= RS + RT \qquad \text{and} \qquad R(cS) = c(RS)
\end{align*}
$\quad$Proof.
$\quad$ (a) $\quad$ If $R$ is any function with values in $V,$ then for every $x$ in the domain of $R,$ the definition of composition and the operations of (2.4) give
\begin{align*}
[(S + T)R](x) &= (S + T)[R(x)] = S[R(x)] + T[R(x)] = (SR)(x) + (TR)(x),
\end{align*}
so $(S + T)R = SR + TR.$ Similarly, $[(cS)R](x) = (cS)[R(x)] = cS[R(x)] = [c(SR)](x),$ so $(cS)R = c(SR).$ Note that part (a) requires no linearity of $R$ at all.
$\quad$ (b) $\quad$ If $R: W \to U$ is a linear transformation, then for every $x$ in $V,$
\begin{align*}
[R(S + T)](x) &= R[S(x) + T(x)] = R[S(x)] + R[T(x)] = (RS)(x) + (RT)(x),
\end{align*}
using the linearity of $R$ in the second equality. Hence $R(S + T) = RS + RT.$ Likewise, $[R(cS)](x) = R[cS(x)] = cR[S(x)] = [c(RS)](x),$ so $R(cS) = c(RS). \quad \blacksquare$
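Both parts of Theorem 2.7 can be spot-checked with $2 \times 2$ matrices, where composition is matrix multiplication; the random samples below are our own illustration, not the text's.

```python
import numpy as np

# Spot-check of Theorem 2.7 with 2x2 matrices, where composition is
# matrix multiplication (random samples, our own illustration).
rng = np.random.default_rng(2)
R, S, T = (rng.standard_normal((2, 2)) for _ in range(3))
c = 2.5

# part (a): (S + T)R = SR + TR and (cS)R = c(SR)
assert np.allclose((S + T) @ R, S @ R + T @ R)
assert np.allclose((c * S) @ R, c * (S @ R))

# part (b): R(S + T) = RS + RT and R(cS) = c(RS), for linear R
assert np.allclose(R @ (S + T), R @ S + R @ T)
assert np.allclose(R @ (c * S), c * (R @ S))
```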