The Inverse of a Linear Transformation

From Department of Mathematics at UTSA
A '''linear transformation''' is an important concept in mathematics because many real world phenomena can be approximated by linear models.
 
  
Unlike a linear function, a linear transformation works on vectors as well as numbers.
 
 
== Motivations and definitions ==
 
Say we have the vector <math>\begin{pmatrix} 1 \\ 0 \end{pmatrix}</math> in <math>\mathbb{R}^2</math>, and we rotate it through 90 degrees, to obtain the vector <math>\begin{pmatrix} 0 \\ 1 \end{pmatrix}</math>.
 
 
In another example, instead of rotating a vector we stretch it, so that a vector <math>\mathbf{v}</math> becomes <math>2\mathbf{v}</math>. For example, <math>\begin{pmatrix} 2 \\ 3 \end{pmatrix}</math> becomes <math>\begin{pmatrix} 4 \\ 6 \end{pmatrix}</math>
 
 
Or, we can look at the ''projection'' of a vector onto the ''x''-axis, extracting its ''x'' component; e.g. from
 
<math>\begin{pmatrix} 2 \\ 3 \end{pmatrix}</math> we get <math>\begin{pmatrix} 2 \\ 0 \end{pmatrix}</math>
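Each of the three examples above can be realized as multiplication by a 2&times;2 matrix. A small sketch using NumPy (the matrix names R, S, and P are our own labels, not standard notation):

```python
import numpy as np

# rotation through 90 degrees counterclockwise: (1, 0) -> (0, 1)
R = np.array([[0, -1],
              [1,  0]])
print(R @ np.array([1, 0]))   # [0 1]

# stretching by a factor of 2: (2, 3) -> (4, 6)
S = 2 * np.eye(2)
print(S @ np.array([2, 3]))   # [4. 6.]

# projection onto the x-axis: (2, 3) -> (2, 0)
P = np.array([[1, 0],
              [0, 0]])
print(P @ np.array([2, 3]))   # [2 0]
```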
 
 
These examples are all instances of a ''mapping'' between two vectors, and all of them are linear transformations. If the rule transforming the vector is called <math>T</math>, we often write <math>T\mathbf{v}</math> for the mapping of the vector <math>\mathbf{v}</math> by the rule <math>T</math>. <math>T</math> is often called the transformation.
 
 
Note that we do not always write brackets as we do for functions. However, we ''should'' write brackets, especially when we want to express the mapping of a sum, a product, or a combination of many vectors.
 
 
== Definitions ==
 
===Linear Operators===
 
 
Suppose one has a field K, and let x and y be elements of that field. Let O be a function taking elements of K, where O(x) is an element of a field J. Define O to be a linear operator if and only if, for all x, y, and &lambda; in K:
 
# O(x+y)=O(x)+O(y)
 
# O(&lambda;x)=&lambda;O(x)
 
 
===Linear Forms===
 
Suppose one has a vector space V, and let x and y be elements of that vector space. Let F be a function taking vectors from V, where F(x) is an element of a field K. Define F to be a linear form if and only if:
 
# F(x+y)=F(x)+F(y)
 
# F(&lambda;x)=&lambda;F(x)
 
 
===Linear Transformation===
 
This time, instead of a field, let us consider functions from one vector space into another. Let T be a function taking vectors from one vector space V, where T(v) is an element of another vector space W. Define T to be a linear transformation when it:
 
# ''preserves scalar multiplication'': T(&lambda;'''x''') = &lambda;T'''x'''
 
# ''preserves addition'': T('''x'''+'''y''') = T'''x''' + T'''y'''
 
 
Note that not all transformations are linear. Many simple transformations arising in the real world are non-linear; their study is more difficult, and will not be done here. For example, consider the transformation ''S'' (whose input and output are both vectors in '''R'''<sup>2</sup>) defined by
 
 
<math>S\mathbf{x} = S\begin{pmatrix} x \\ y \end{pmatrix} = \begin{pmatrix} xy \\ \cos(y) \end{pmatrix}</math>
 
 
We can learn about nonlinear transformations by studying easier, linear ones.
 
 
We often ''describe'' a transformation T in the following way
 
:<math>T : V \rightarrow W</math>
 
 
This means that T, whatever transformation it may be, maps vectors in the vector space V to vectors in the vector space W.
 
 
The actual transformation ''could'' be written, for instance, as
 
:<math>T\begin{pmatrix} x \\ y\end{pmatrix} = \begin{pmatrix} x + y \\ x - y \end{pmatrix}</math>
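This particular transformation is easy to express directly as a function. A minimal sketch in Python (the function name T is our own):

```python
import numpy as np

def T(v):
    """The transformation T(x, y) = (x + y, x - y)."""
    x, y = v
    return np.array([x + y, x - y])

print(T(np.array([3, 1])))  # [4 2]
```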
 
 
== Examples and proofs ==
 
Here are some examples of linear transformations. At the same time, let us look at how we can prove that a given transformation is linear or not.
 
 
=== Projection ===
 
Let us take the projection of vectors in '''R'''<sup>2</sup> to vectors on the ''x''-axis. Let's call this transformation T.
 
 
We know that T maps vectors from '''R'''<sup>2</sup> to '''R'''<sup>2</sup>, so we can say
 
: <math>T: \mathbb{R}^2 \rightarrow \mathbb{R}^2</math>
 
 
and we can then write the transformation itself as
 
: <math>T\begin{pmatrix} x_0 \\ x_1 \end{pmatrix} = \begin{pmatrix} x_0 \\ 0 \end{pmatrix}</math>
 
 
Clearly this is linear. (''Can you see why, without looking below?'')
 
 
Let's go through a proof that the conditions in the definitions are satisfied.
 
 
==== Scalar multiplication is preserved ====
 
We wish to show that for all vectors '''v''' and all scalars &lambda;, T(&lambda;'''v''')=&lambda;T('''v''').
 
 
Let
 
: <math>\mathbf{v}=\begin{pmatrix} v_0 \\ v_1 \end{pmatrix}</math>.
 
Then
 
:<math>\lambda\mathbf{v}=\begin{pmatrix} \lambda v_0 \\ \lambda v_1 \end{pmatrix}</math>
 
Now
 
:<math> T(\lambda\mathbf{v}) = T\begin{pmatrix} \lambda v_0 \\ \lambda v_1\end{pmatrix} = </math>
 
:<math> \begin{pmatrix} \lambda v_0 \\ 0 \end{pmatrix} </math>
 
If we work out &lambda;T('''v''') and find it is the same vector, we have proved our result.
 
:<math> \lambda T\mathbf{v}= \lambda \begin{pmatrix} v_0 \\ 0 \end{pmatrix}=</math>
 
:<math> \begin{pmatrix} \lambda v_0 \\ 0 \end{pmatrix} </math>
 
This is the same vector as above, so under the transformation T, ''scalar multiplication is preserved''.
 
 
==== Addition is preserved ====
 
We wish to show for all vectors '''x''' and '''y''', T('''x'''+'''y''')=T'''x'''+T'''y'''.
 
 
Let
 
: <math>\mathbf{x}=\begin{pmatrix} x_0 \\ x_1 \end{pmatrix}</math>.
 
and
 
: <math>\mathbf{y}=\begin{pmatrix} y_0 \\ y_1 \end{pmatrix}</math>.
 
Now
 
: <math>T(\mathbf{x}+\mathbf{y})=T\left(\begin{pmatrix} x_0 \\ x_1 \end{pmatrix}+\begin{pmatrix} y_0 \\ y_1 \end{pmatrix}\right)=</math>
 
: <math>T\begin{pmatrix} x_0 + y_0 \\ x_1 + y_1 \end{pmatrix} =</math>
 
: <math>\begin{pmatrix} x_0 + y_0 \\ 0 \end{pmatrix}</math>
 
Now if we can show T'''x'''+T'''y''' is this vector above, we have proved this result.
 
Proceed, then,
 
:<math>T\begin{pmatrix} x_0 \\ x_1 \end{pmatrix} + T\begin{pmatrix} y_0 \\ y_1 \end{pmatrix}=\begin{pmatrix} x_0 \\ 0 \end{pmatrix} + \begin{pmatrix} y_0 \\0 \end{pmatrix}=</math>
 
: <math>\begin{pmatrix} x_0 + y_0 \\ 0 \end{pmatrix}</math>
 
So we have that the transformation T ''preserves addition''.
 
 
==== Zero vector is preserved ====
 
Clearly we have
 
: <math>T\begin{pmatrix} 0 \\ 0 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} </math>
 
 
==== Conclusion ====
 
We have shown T preserves addition, scalar multiplication and the zero vector. So T must be linear.
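The two defining properties can also be spot-checked numerically for particular vectors and scalars. This is not a proof (a proof must cover ''all'' vectors, as above), just a sanity check; the vectors v, w and the scalar lam below are arbitrary choices of ours:

```python
import numpy as np

def T(v):
    """Projection onto the x-axis."""
    return np.array([v[0], 0.0])

v = np.array([2.0, 3.0])
w = np.array([-1.0, 5.0])
lam = 4.0

# scalar multiplication is preserved
assert np.allclose(T(lam * v), lam * T(v))
# addition is preserved
assert np.allclose(T(v + w), T(v) + T(w))
# the zero vector is preserved
assert np.allclose(T(np.zeros(2)), np.zeros(2))
```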
 
 
== Disproof of linearity ==
 
When we want to ''disprove'' linearity, that is, to ''prove'' that a transformation is ''not'' linear, we need only find one counterexample.
 
 
If we can find just one case in which the transformation does not preserve addition, scalar multiplication, or the zero vector, we can conclude that the transformation is not linear.
 
 
For example, consider the transformation
 
: <math> T\begin{pmatrix} x \\ y\end{pmatrix} = \begin{pmatrix}  x^3 \\ y^2\end{pmatrix}</math>
 
 
We suspect it is not linear. To prove it is not linear, take the vector
 
: <math> \mathbf{v} = \begin{pmatrix} 2 \\ 2 \end{pmatrix} </math>
 
then
 
: <math> T(2\mathbf{v}) = \begin{pmatrix} 64 \\ 16 \end{pmatrix}</math>
 
but
 
: <math> 2T(\mathbf{v}) = \begin{pmatrix} 16 \\ 8 \end{pmatrix}</math>
 
 
so we can immediately conclude that T is not linear, because it does not preserve scalar multiplication.
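The counterexample above can be checked by direct computation (the function name T is our own label for the transformation just defined):

```python
import numpy as np

def T(v):
    """The non-linear transformation T(x, y) = (x^3, y^2)."""
    x, y = v
    return np.array([x**3, y**2])

v = np.array([2.0, 2.0])
print(T(2 * v))   # [64. 16.]
print(2 * T(v))   # [16.  8.]  -- the two disagree, so T is not linear
```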
 
 
=== Problem set ===
 
Given the above, determine whether the following transformations are linear or not. Write down each transformation in the form T : V -> W, and identify V and W. (Answers to the even-numbered questions follow.)
 
# <math>T\begin{pmatrix} v_0 \\ v_1 \end{pmatrix} = \begin{pmatrix} v_0^2 + v_1 \\ v_1 \end{pmatrix}</math>
 
# <math>T\begin{pmatrix} v_0 \\ v_1 \end{pmatrix} = \begin{pmatrix} 1 \\ v_0 \end{pmatrix}</math>
 
# <math>T\begin{pmatrix} v_0 \\ v_1 \end{pmatrix} = \mathbf{0}</math>
 
# <math>T\begin{pmatrix} v_0 \\ v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} v_0 - v_2 \\ v_1 \end{pmatrix}</math>
 
 
==== Answers ====
 
: 2. No. A check whether the zero vector is preserved readily confirms this fact. T : '''R'''<sup>2</sup> -> '''R'''<sup>2</sup>
 
: 4. Yes. T : '''R'''<sup>3</sup> -> '''R'''<sup>2</sup>.
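The answer to question 2 can be verified in one line: a linear transformation must send the zero vector to the zero vector, and this one does not (the function name T2 is our own label for transformation 2):

```python
import numpy as np

def T2(v):
    """Transformation 2 from the problem set: (v0, v1) -> (1, v0)."""
    return np.array([1, v[0]])

# the zero vector is not mapped to the zero vector, so T2 cannot be linear
print(T2(np.array([0, 0])))  # [1 0]
```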
 
 
== Images and kernels ==
 
There are some fundamental concepts underlying linear transformations, such as the ''kernel'' and the ''image'' of a linear transformation, which are analogous to the ''zeros'' and ''range'' of a function.
 
 
=== Kernel ===
 
The ''kernel'' of a linear transformation T : V -> W is the set of all vectors in V which are mapped to the zero vector in W, i.e.,
 
: <math> \mathrm{ker}\ T = \{v \in V\ |\ T\mathbf{v} = \mathbf{0}\}</math>
 
 
When T is represented by a matrix A, the kernel of T corresponds to the solution set of the matrix equation A'''x''' = '''0'''.
 
 
The kernel of a transformation T : V -> W is always a subspace of V. The dimension of the kernel of a transformation or a matrix is called the ''nullity''.
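For a matrix A, the kernel can be computed numerically as the null space of A. One common sketch uses the singular value decomposition; the helper name null_space below is our own (SciPy offers a ready-made routine of the same name):

```python
import numpy as np

def null_space(A, tol=1e-10):
    """Return an orthonormal basis for the kernel of A, via the SVD."""
    _, s, vh = np.linalg.svd(A)
    # rows of vh whose singular value is (numerically) zero span ker A
    rank = int(np.sum(s > tol))
    return vh[rank:].T

A = np.array([[1.0, 0.0],
              [0.0, 0.0]])   # the projection onto the x-axis
N = null_space(A)
print(N.shape[1])            # nullity = 1: the kernel is the y-axis
```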
 
 
=== Image ===
 
The ''image'' of a linear transformation T : V -> W is the set of all vectors in W which are mapped from vectors in V. For example, with the trivial mapping T : V -> W such that T'''x''' = '''0''', the image would be {'''0'''}. (''What would the kernel be?'')
 
 
More formally, we say that the image of a transformation T:V->W is the set
 
: <math> \mathrm{im}\ T = \{w \in W\ |\ w=T\mathbf{v}\ \mathrm{and}\ \mathbf{v}\in V\}</math>
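For a matrix A, the image is the column space of A, and its dimension is the rank. A minimal numerical sketch, reusing the projection matrix from the kernel discussion:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 0.0]])        # projection onto the x-axis

rank = np.linalg.matrix_rank(A)   # dimension of the image
print(rank)                       # 1: the image is the x-axis
```

Note that rank + nullity equals the dimension of the domain (here 1 + 1 = 2), which is the rank-nullity theorem.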
 
 
== Isomorphism ==
 
 
A linear transformation T : V -> W is an ''isomorphism'' if it is one-to-one and onto. Equivalently:

* ker(T) = {'''0'''} and im(T) = W.
* T has an inverse T<sup>&minus;1</sup> : W -> V.

In that case, dim(V) = dim(W).
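For a square matrix A, being an isomorphism is the same as being invertible, which can be checked via the determinant. A sketch using the transformation T(x, y) = (x + y, x &minus; y) from earlier:

```python
import numpy as np

A = np.array([[1.0,  1.0],
              [1.0, -1.0]])       # T(x, y) = (x + y, x - y)

# a square matrix gives an isomorphism exactly when its determinant is nonzero
assert abs(np.linalg.det(A)) > 1e-12

A_inv = np.linalg.inv(A)          # the inverse transformation T^-1
print(np.allclose(A_inv @ A, np.eye(2)))  # True
```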
 

Revision as of 13:00, 29 September 2021