Linear Dependence of Vectors

From Department of Mathematics at UTSA

Linear Independence and Dependence

Definition: A set of vectors $V = \{ \mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n \}$ and scalars $a_1, a_2, \dots, a_n$ always yield a solution to the vector equation $a_1 \mathbf{v}_1 + a_2 \mathbf{v}_2 + \cdots + a_n \mathbf{v}_n = \mathbf{0}$, namely the trivial solution $a_1 = a_2 = \cdots = a_n = 0$. If this is the only solution to the vector equation, then the set $V$ is said to be Linearly Independent. If there exist other solutions where not all $a_i = 0$, then $V$ is said to be Linearly Dependent.

We will now look at some examples of vector sets which are either linearly independent or linearly dependent.

Note: Another way to state that a set of vectors $\{ \mathbf{v}_1, \dots, \mathbf{v}_n \}$ is linearly independent is to say that whenever $a_1 \mathbf{v}_1 + \cdots + a_n \mathbf{v}_n = b_1 \mathbf{v}_1 + \cdots + b_n \mathbf{v}_n$, we must have $a_i = b_i$ for each $i$; that is, any vector that is a linear combination of the set of vectors is a unique linear combination.
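As a computational companion to the definition, the sketch below tests a set of vectors for linear independence by checking whether the matrix whose columns are the vectors has full column rank (equivalently, whether the homogeneous vector equation has only the trivial solution). The sample vectors are hypothetical, not taken from the article's examples.

```python
import numpy as np

def is_linearly_independent(vectors):
    # The set is independent exactly when the matrix whose columns
    # are the vectors has rank equal to the number of vectors,
    # i.e. the homogeneous system has only the trivial solution.
    A = np.column_stack(vectors)
    return int(np.linalg.matrix_rank(A)) == len(vectors)

# Hypothetical vector sets used only for illustration:
independent = is_linearly_independent([np.array([1.0, 0.0]),
                                       np.array([0.0, 1.0])])
dependent = is_linearly_independent([np.array([1.0, 2.0]),
                                     np.array([2.0, 4.0])])   # v2 = 2*v1
```

The second set is dependent because its second vector is a scalar multiple of the first, so a nontrivial combination such as $2\mathbf{v}_1 - \mathbf{v}_2 = \mathbf{0}$ exists.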

Example 1

Determine whether the vector set is linearly independent or linearly dependent.

We must first determine whether the trivial solution $a_1 = a_2 = \cdots = a_n = 0$ is the only set of scalars solving the vector equation $a_1 \mathbf{v}_1 + \cdots + a_n \mathbf{v}_n = \mathbf{0}$, or whether other solutions exist.

From this vector equation we get the following system of linear equations:

When we reduce this system to RREF, we get that:

Therefore the only solution is the trivial one, in which both scalars equal zero, and so the vector set is linearly independent.
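Example 1's original vectors did not survive in this copy, so the following sketch carries out the same check on a hypothetical pair of vectors: the homogeneous system has only the trivial solution exactly when the coefficient matrix has full column rank, i.e. nullity zero.

```python
import numpy as np

# Hypothetical stand-in for Example 1's two vectors: v1 = (1, 3), v2 = (2, 1).
A = np.column_stack([np.array([1.0, 3.0]),
                     np.array([2.0, 1.0])])

# The homogeneous system A @ (a1, a2) = 0 has only the trivial
# solution precisely when A has full column rank (nullity 0).
nullity = A.shape[1] - int(np.linalg.matrix_rank(A))
```

A nullity of zero confirms that the only set of scalars is $a_1 = a_2 = 0$, matching the row-reduction argument above.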

Example 2

Determine whether the vector set is linearly independent or linearly dependent.

Once again, for the set to be linearly independent, the vector equation must admit only the trivial set of scalars. From the vector equation we obtain the following system of linear equations:

When we solve this system of equations we obtain that:

Suppose one of the scalars is set equal to a free parameter $t$, so that the solution of the system expresses the other scalar in terms of $t$. We see that for any value of $t$, the resulting scalars satisfy the vector equation, and thus there are infinitely many sets of scalars. Therefore the set is linearly dependent.
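The infinite solution family can be exhibited numerically. The sketch below uses a hypothetical dependent pair (standing in for Example 2's vectors): the right-singular vector belonging to the zero singular value spans the null space of the coefficient matrix, and every scalar multiple of it is another solution of the vector equation.

```python
import numpy as np

# Hypothetical dependent pair: v2 = 2*v1, so nontrivial solutions of
# a1*v1 + a2*v2 = 0 exist, e.g. (a1, a2) proportional to (2, -1).
A = np.column_stack([np.array([1.0, 2.0]),
                     np.array([2.0, 4.0])])

# The right-singular vector for the zero singular value spans the
# null space of A; any multiple of it is another solution, which is
# why there are infinitely many scalar sets.
_, s, Vt = np.linalg.svd(A)
a = Vt[-1]                                # one nontrivial solution (unit norm)
residual = float(np.linalg.norm(A @ a))   # should be ~0
smallest_singular = float(s[-1])          # ~0 signals dependence
```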

Example 3

Show that if $A$ is an invertible $n \times n$ matrix, then the column vectors of $A$, denoted $\mathbf{c}_1$, $\mathbf{c}_2$, ..., $\mathbf{c}_n$, are linearly independent.

Suppose that $A$ is an invertible matrix and consider the linear system $A \mathbf{x} = \mathbf{0}$. Since $A$ is invertible, the only solution is the trivial solution $\mathbf{x} = A^{-1} \mathbf{0} = \mathbf{0}$. This system corresponds to the following vector equation:

And since $x_1 = x_2 = \cdots = x_n = 0$ is the only solution, this vector equation admits only the trivial scalars, so the column vectors of $A$ are linearly independent.
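A short numerical sketch of this argument, using a hypothetical invertible matrix: solving $A\mathbf{x} = \mathbf{0}$ returns the zero vector, and the columns have full rank.

```python
import numpy as np

# Hypothetical invertible matrix: since A is invertible, the unique
# solution of A @ x = 0 is x = A^{-1} @ 0 = 0, so the columns of A
# are linearly independent.
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])              # det = 1, so A is invertible
x = np.linalg.solve(A, np.zeros(2))    # the trivial solution
x_norm = float(np.linalg.norm(x))
columns_independent = int(np.linalg.matrix_rank(A)) == A.shape[1]
```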

Example 4

Consider the given set of vectors from its vector space, and determine whether this set of vectors is linearly independent or linearly dependent.

When we expand the vector equation as follows, notice:

We can clearly see that this equation holds only if all of the scalars are zero. We can also verify this with the following coefficient matrix:

Since this is an upper triangular matrix, its determinant is the product of the entries along the main diagonal, and so $\det(A) \neq 0$, which implies $A$ is invertible. Hence the homogeneous system $A \mathbf{x} = \mathbf{0}$ has only the trivial solution $\mathbf{x} = \mathbf{0}$.
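The upper triangular argument can be illustrated with a hypothetical polynomial set in $P_2$ (the original set from Example 4 is not reproduced here): writing each polynomial as a coefficient column produces an upper triangular matrix whose determinant is the product of its diagonal entries.

```python
import numpy as np

# Hypothetical polynomials: p1 = 1, p2 = 1 + x, p3 = 1 + x + x^2.
# Each column lists coefficients in the order (constant, x, x^2),
# giving an upper triangular coefficient matrix.
C = np.column_stack([
    np.array([1.0, 0.0, 0.0]),   # p1 = 1
    np.array([1.0, 1.0, 0.0]),   # p2 = 1 + x
    np.array([1.0, 1.0, 1.0]),   # p3 = 1 + x + x^2
])
det = float(np.linalg.det(C))    # product of diagonal entries: 1 * 1 * 1
independent = abs(det) > 1e-12   # nonzero determinant => independent set
```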

Example 5

Show that the given set of vectors is a linearly independent set for all nonzero parameters.

We should note that this example is a more general version of example 4. First, let's expand the vector equation as follows:

Once again, this vector equation holds only when all of the scalars are zero. The same matrix from Example 4 can be used to provide a more extensive argument.

Linear Dependence Lemma

We will now look at a very important lemma known as the linear dependence lemma.

Lemma (Linear Dependence Lemma): Let $\{ \mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n \}$ be a set of linearly dependent vectors in the vector space $V$ with $\mathbf{v}_1 \neq \mathbf{0}$. Then there exists a $j \in \{2, 3, \dots, n\}$ such that:
a) $\mathbf{v}_j \in \mathrm{span}(\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_{j-1})$.
b) $\mathrm{span}(\mathbf{v}_1, \dots, \mathbf{v}_{j-1}, \mathbf{v}_{j+1}, \dots, \mathbf{v}_n) = \mathrm{span}(\mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n)$.

The linear dependence lemma tells us that given a linearly dependent set of vectors whose first vector is nonzero, there exists a vector $\mathbf{v}_j$ in the set that can be written as a linear combination of the preceding vectors $\mathbf{v}_1, \dots, \mathbf{v}_{j-1}$ (that is, $\mathbf{v}_j \in \mathrm{span}(\mathbf{v}_1, \dots, \mathbf{v}_{j-1})$), and that removing $\mathbf{v}_j$ from the set leaves the span unchanged.

Proof: Let $\{ \mathbf{v}_1, \mathbf{v}_2, \dots, \mathbf{v}_n \}$ be a set of linearly dependent vectors with $\mathbf{v}_1 \neq \mathbf{0}$. Since this set is not linearly independent, there exist scalars $a_1, a_2, \dots, a_n$, not all zero, such that $a_1 \mathbf{v}_1 + a_2 \mathbf{v}_2 + \cdots + a_n \mathbf{v}_n = \mathbf{0}$ (otherwise the set would be linearly independent). Let $j$ be the largest index such that $a_j \neq 0$; in other words, $a_{j+1} = 0$, $a_{j+2} = 0$, ..., $a_n = 0$. Note that $j \geq 2$: if $j = 1$, the equation would reduce to $a_1 \mathbf{v}_1 = \mathbf{0}$ with $a_1 \neq 0$, forcing $\mathbf{v}_1 = \mathbf{0}$, a contradiction. Dividing by $a_j$ and rearranging gives $\mathbf{v}_j = -\frac{a_1}{a_j} \mathbf{v}_1 - \frac{a_2}{a_j} \mathbf{v}_2 - \cdots - \frac{a_{j-1}}{a_j} \mathbf{v}_{j-1}$. Therefore $\mathbf{v}_j$ can be written as a linear combination of the vectors $\mathbf{v}_1, \dots, \mathbf{v}_{j-1}$, so $\mathbf{v}_j \in \mathrm{span}(\mathbf{v}_1, \dots, \mathbf{v}_{j-1})$, which proves (a).

Now let $\mathbf{u} \in \mathrm{span}(\mathbf{v}_1, \dots, \mathbf{v}_n)$, so there exist scalars $b_1, b_2, \dots, b_n$ such that $\mathbf{u} = b_1 \mathbf{v}_1 + b_2 \mathbf{v}_2 + \cdots + b_n \mathbf{v}_n$. Substituting the expression for $\mathbf{v}_j$ obtained above eliminates $\mathbf{v}_j$, so $\mathbf{u}$ can be written as a linear combination of the vectors $\mathbf{v}_1, \dots, \mathbf{v}_{j-1}, \mathbf{v}_{j+1}, \dots, \mathbf{v}_n$. Therefore $\mathbf{u} \in \mathrm{span}(\mathbf{v}_1, \dots, \mathbf{v}_{j-1}, \mathbf{v}_{j+1}, \dots, \mathbf{v}_n)$, and since the reverse inclusion is immediate, the two spans are equal, which proves (b). $\blacksquare$
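The lemma can be checked numerically on a small hypothetical example: with $\mathbf{v}_3 = \mathbf{v}_1 + 2\mathbf{v}_2$, the list $(\mathbf{v}_1, \mathbf{v}_2, \mathbf{v}_3)$ is dependent, the lemma's index is $j = 3$, and dropping $\mathbf{v}_3$ leaves the span unchanged. The sketch finds $j$ by watching where the rank stops growing, then recovers the coefficients of part (a) by least squares.

```python
import numpy as np

# Hypothetical vectors in R^3: v3 = v1 + 2*v2, v1 != 0.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + 2 * v2
vectors = [v1, v2, v3]

# Part (a): find the first vector lying in the span of its
# predecessors -- the rank stops growing at that index.
j = None
for k in range(1, len(vectors)):
    r_prev = int(np.linalg.matrix_rank(np.column_stack(vectors[:k])))
    r_curr = int(np.linalg.matrix_rank(np.column_stack(vectors[:k + 1])))
    if r_curr == r_prev:
        j = k + 1          # 1-based index of the dependent vector
        break

# Coefficients writing v_j as a combination of v_1, ..., v_{j-1}.
coeffs, *_ = np.linalg.lstsq(np.column_stack(vectors[:j - 1]),
                             vectors[j - 1], rcond=None)

# Part (b): dropping v_j does not change the span (ranks agree).
rank_full = int(np.linalg.matrix_rank(np.column_stack(vectors)))
rank_without_vj = int(np.linalg.matrix_rank(
    np.column_stack(vectors[:j - 1] + vectors[j:])))
```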

Licensing

Content obtained and/or adapted from: