The Dimension of a Vector Space

From Department of Mathematics at UTSA
Revision as of 13:25, 8 October 2021 by Lila (talk | contribs)

In the prior subsection we defined the basis of a vector space, and we saw that a space can have many different bases. For example, following the definition of a basis, we saw three different bases for $\mathbb{R}^2$. So we cannot talk about "the" basis for a vector space. True, some vector spaces have bases that strike us as more natural than others, for instance, $\mathbb{R}^2$'s basis $\mathcal{E}_2$ or $\mathbb{R}^3$'s basis $\mathcal{E}_3$ or the basis $\langle 1, x, x^2 \rangle$ of the space $\mathcal{P}_2$ of quadratic polynomials. But, for example in the space $\{\, c_2x^2 + c_1x + c_0 \mid 2c_2 - c_0 = 0 \,\}$, no particular basis leaps out at us as the most natural one. We cannot, in general, associate with a space any single basis that best describes that space.

We can, however, find something about the bases that is uniquely associated with the space. This subsection shows that any two bases for a space have the same number of elements. So, with each space we can associate a number, the number of vectors in any of its bases.

This brings us back to our earlier discussion of the two things that could be meant by the term "minimal spanning set". At that point we defined "minimal" as linearly independent, but we noted that another reasonable interpretation is that a spanning set is "minimal" when it has the fewest elements of any set with the same span. At the end of this subsection, after we have shown that all bases have the same number of elements, we will have shown that the two senses of "minimal" are equivalent.
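The two senses of "minimal" can be checked on a small concrete case. The sketch below uses SymPy (the particular vectors are illustrative choices, not from the text): a spanning set for $\mathbb{R}^2$ that is not linearly independent is shrunk to one that is independent and has as few elements as any spanning set can.

```python
from sympy import Matrix

# The columns of S span R^2, but S is not "minimal" in either sense:
# it has three vectors and they satisfy a nontrivial linear relationship.
S = Matrix([[1, 0, 1],
            [0, 1, 1]])        # columns: (1,0), (0,1), (1,1) = (1,0)+(0,1)
assert S.rank() == 2           # the span is all of R^2
assert S.shape[1] == 3         # ...but three vectors were used

# Dropping the redundant third column leaves a set that is linearly
# independent and spans the same space: a basis, minimal in both senses.
B = S[:, :2]
assert B.rank() == 2 and B.shape[1] == 2
```

Once Theorem 2.3 below is in hand, no basis of $\mathbb{R}^2$ can have any number of vectors other than two, so this two-element set is minimal in the counting sense as well.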

To begin, we limit our attention to spaces in which at least one basis has only finitely many members.

Definition 2.1: A vector space is finite-dimensional if it has a basis with only finitely many vectors.

(One reason for sticking to finite-dimensional spaces is so that the representation of a vector with respect to a basis is a finitely-tall vector, and so can be easily written.) From now on we study only finite-dimensional vector spaces. We shall take the term "vector space" to mean "finite-dimensional vector space". Other spaces are interesting and important, but they lie outside of our scope.

To prove the main theorem we shall use a technical result.

Lemma 2.2 (Exchange Lemma): Assume that $B = \langle \vec{\beta}_1, \ldots, \vec{\beta}_n \rangle$ is a basis for a vector space, and that for the vector $\vec{v}$ the relationship $\vec{v} = c_1\vec{\beta}_1 + c_2\vec{\beta}_2 + \cdots + c_n\vec{\beta}_n$ has $c_i \neq 0$. Then exchanging $\vec{\beta}_i$ for $\vec{v}$ yields another basis for the space.

Proof: Call the outcome of the exchange $\hat{B} = \langle \vec{\beta}_1, \ldots, \vec{\beta}_{i-1}, \vec{v}, \vec{\beta}_{i+1}, \ldots, \vec{\beta}_n \rangle$.

We first show that $\hat{B}$ is linearly independent. Any relationship $d_1\vec{\beta}_1 + \cdots + d_i\vec{v} + \cdots + d_n\vec{\beta}_n = \vec{0}$ among the members of $\hat{B}$, after substitution for $\vec{v}$,

$$d_1\vec{\beta}_1 + \cdots + d_i\,(c_1\vec{\beta}_1 + \cdots + c_i\vec{\beta}_i + \cdots + c_n\vec{\beta}_n) + \cdots + d_n\vec{\beta}_n = \vec{0} \qquad (*)$$

gives a linear relationship among the members of $B$. The basis $B$ is linearly independent, so the coefficient $d_i c_i$ of $\vec{\beta}_i$ is zero. Because $c_i$ is assumed to be nonzero, $d_i = 0$. Using this in equation $(*)$ gives that all of the other $d$'s are also zero. Therefore $\hat{B}$ is linearly independent.

We finish by showing that $\hat{B}$ has the same span as $B$. Half of this argument, that $[\hat{B}] \subseteq [B]$, is easy; any member $d_1\vec{\beta}_1 + \cdots + d_i\vec{v} + \cdots + d_n\vec{\beta}_n$ of $[\hat{B}]$ can be written $d_1\vec{\beta}_1 + \cdots + d_i\,(c_1\vec{\beta}_1 + \cdots + c_n\vec{\beta}_n) + \cdots + d_n\vec{\beta}_n$, which is a linear combination of linear combinations of members of $B$, and hence is in $[B]$. For the $[B] \subseteq [\hat{B}]$ half of the argument, recall that when $\vec{v} = c_1\vec{\beta}_1 + \cdots + c_n\vec{\beta}_n$ with $c_i \neq 0$, then the equation can be rearranged to $\vec{\beta}_i = (-c_1/c_i)\vec{\beta}_1 + \cdots + (1/c_i)\vec{v} + \cdots + (-c_n/c_i)\vec{\beta}_n$. Now, consider any member $d_1\vec{\beta}_1 + \cdots + d_i\vec{\beta}_i + \cdots + d_n\vec{\beta}_n$ of $[B]$, substitute for $\vec{\beta}_i$ its expression as a linear combination of the members of $\hat{B}$, and recognize (as in the first half of this argument) that the result is a linear combination of linear combinations of members of $\hat{B}$, and hence is in $[\hat{B}]$.
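A small numeric instance of the lemma may help (a SymPy sketch; the particular vectors are illustrative assumptions). Here $B$ is the standard basis of $\mathbb{R}^3$ and $\vec{v} = 2\vec{\beta}_2 + \vec{\beta}_3$, so the lemma licenses exchanging $\vec{\beta}_2$ or $\vec{\beta}_3$ for $\vec{v}$, but not $\vec{\beta}_1$, whose coefficient is zero:

```python
from sympy import Matrix

b1, b2, b3 = Matrix([1, 0, 0]), Matrix([0, 1, 0]), Matrix([0, 0, 1])
v = 0*b1 + 2*b2 + 1*b3       # coefficients c1 = 0, c2 = 2, c3 = 1

# Exchange b2 for v: allowed by the lemma, since c2 = 2 is nonzero.
B_hat = Matrix.hstack(b1, v, b3)
assert B_hat.rank() == 3     # <b1, v, b3> is again a basis of R^3

# Exchange b1 for v: not covered by the lemma, since c1 = 0,
# and indeed the resulting set fails to be a basis.
bad = Matrix.hstack(v, b2, b3)
assert bad.rank() == 2       # v is already in the span of b2 and b3
```

The failed exchange shows why the hypothesis $c_i \neq 0$ is needed: with $c_1 = 0$, the vector $\vec{v}$ brings nothing to replace the lost $\vec{\beta}_1$ direction.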

Theorem 2.3: In any finite-dimensional vector space, all of the bases have the same number of elements.

Proof: Fix a vector space with at least one finite basis. Choose, from among all of this space's bases, one $B = \langle \vec{\beta}_1, \ldots, \vec{\beta}_n \rangle$ of minimal size. We will show that any other basis $D = \langle \vec{\delta}_1, \vec{\delta}_2, \ldots \rangle$ also has the same number of members, $n$. Because $B$ has minimal size, $D$ has no fewer than $n$ vectors. We will argue that it cannot have more than $n$ vectors.

The basis $B$ spans the space and $\vec{\delta}_1$ is in the space, so $\vec{\delta}_1$ is a nontrivial linear combination of elements of $B$. By the Exchange Lemma, $\vec{\delta}_1$ can be swapped for a vector from $B$, resulting in a basis $B_1$, where one element is $\vec{\delta}_1$ and all of the $n-1$ other elements are $\vec{\beta}$'s.

The prior paragraph forms the basis step for an induction argument. The inductive step starts with a basis $B_k$ (for $1 \leq k < n$) containing $k$ members of $D$ and $n-k$ members of $B$. We know that $D$ has at least $n$ members, so there is a $\vec{\delta}_{k+1}$. Represent it as a linear combination of elements of $B_k$. The key point: in that representation, at least one of the nonzero scalars must be associated with a $\vec{\beta}_i$, or else that representation would be a nontrivial linear relationship among elements of the linearly independent set $D$. Exchange $\vec{\delta}_{k+1}$ for $\vec{\beta}_i$ to get a new basis $B_{k+1}$ with one more $\vec{\delta}$ and one fewer $\vec{\beta}$ than the previous basis $B_k$.

Repeat the inductive step until no $\vec{\beta}$'s remain, so that $B_n$ contains $\vec{\delta}_1, \ldots, \vec{\delta}_n$. Now, $D$ cannot have more than these $n$ vectors because any $\vec{\delta}_{n+1}$ that remains would be in the span of $B_n$ (since it is a basis) and hence would be a linear combination of the other $\vec{\delta}$'s, contradicting that $D$ is linearly independent.
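The theorem can be sanity-checked numerically (a SymPy sketch with illustrative vectors). Below, two visibly different bases for the plane $\{(x, y, z) \in \mathbb{R}^3 \mid z = x + y\}$ turn out, as they must, to have the same number of elements:

```python
from sympy import Matrix

# Two different bases (as columns) for the plane z = x + y in R^3.
D1 = Matrix.hstack(Matrix([1, 0, 1]), Matrix([0, 1, 1]))
D2 = Matrix.hstack(Matrix([1, 1, 2]), Matrix([1, -1, 0]))

assert D1.rank() == 2 and D2.rank() == 2   # each set is linearly independent
assert Matrix.hstack(D1, D2).rank() == 2   # and the two sets span the same plane
assert D1.cols == D2.cols                  # same number of basis vectors
```

No amount of searching will produce a basis for this plane with one or with three vectors; by the theorem, every basis has exactly two.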

Definition 2.4:

The dimension of a vector space is the number of vectors in any of its bases.
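Because the count is the same for every basis, the dimension of a space spanned by finitely many vectors is well defined and can be computed as the rank of the matrix having those vectors as columns. A SymPy sketch (the matrix is an illustrative assumption):

```python
from sympy import Matrix

# Columns: (1,2,0), (0,1,1), and their sum (1,3,1).
A = Matrix([[1, 0, 1],
            [2, 1, 3],
            [0, 1, 1]])
dim = A.rank()               # dimension of the column space
basis = A.columnspace()      # one particular basis for that space
assert dim == len(basis) == 2
```

Any other basis of this column space, however it is produced, also has exactly two vectors.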


Again, although we sometimes say "finite-dimensional" as a reminder, in the rest of this book all vector spaces are assumed to be finite-dimensional. An instance of this is that in the next result the word "space" should be taken to mean "finite-dimensional vector space".


The main result of this subsection, that all of the bases in a finite-dimensional vector space have the same number of elements, is the single most important result in this book because, as Example 2.9 shows, it describes what vector spaces and subspaces there can be. We will see more in the next chapter.
