MAT2233

From Department of Mathematics at UTSA
Latest revision as of 12:58, 29 January 2022

A comprehensive list of all undergraduate math courses at UTSA can be found here.

The Wikipedia summary of Linear Algebra and its history.

Topics List

Each entry below gives the week, the sections covered from Lay and from Bretscher, the topics, and the student learning outcomes.
Week 1
Lay: 1.1 and 1.2
Bretscher: 1.1

Introduction to Linear Systems of Equations

  • Using elimination to find solutions of linear systems
  • The geometric interpretation of solutions to linear systems
Week 2
Lay: 1.3, 1.4, and 1.5
Bretscher: 1.2 and 1.3

Vectors and Matrices

Gauss-Jordan Elimination

  • Vectors and vector addition
  • Matrix notation
  • The Gauss-Jordan method for solving a linear system of equations
  • The rank of a matrix
  • Sums of Matrices
  • The product Ax (where A is a matrix and x is a vector)
  • The Dot product
  • Linear Combinations
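The Gauss-Jordan method named above can be sketched in a few lines of plain Python. This is an illustrative sketch only (the function name `gauss_jordan` and the use of exact `Fraction` arithmetic are choices made here, not part of the course materials):

```python
from fractions import Fraction

def gauss_jordan(aug):
    """Reduce an augmented matrix [A | b] to reduced row echelon form."""
    m = [[Fraction(x) for x in row] for row in aug]
    rows, cols = len(m), len(m[0])
    pivot_row = 0
    for col in range(cols - 1):
        # Find a row with a nonzero entry in this column to serve as the pivot.
        pr = next((r for r in range(pivot_row, rows) if m[r][col] != 0), None)
        if pr is None:
            continue
        m[pivot_row], m[pr] = m[pr], m[pivot_row]
        # Scale the pivot row so the pivot entry becomes 1.
        pivot = m[pivot_row][col]
        m[pivot_row] = [x / pivot for x in m[pivot_row]]
        # Eliminate the pivot column from every other row.
        for r in range(rows):
            if r != pivot_row and m[r][col] != 0:
                factor = m[r][col]
                m[r] = [a - factor * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return m

# Solve x + y = 3, 2x - y = 0, which has the unique solution x = 1, y = 2.
rref = gauss_jordan([[1, 1, 3], [2, -1, 0]])
solution = [row[-1] for row in rref]
```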


Week 3
Lay: 2.1
Bretscher: 2.3

Matrix Algebra and Matrix Multiplication

  • Matrix Operations
  • Matrix products by columns
  • Matrix products using the dot product
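The outcome "matrix products using the dot product" can be illustrated directly: each entry of a matrix product is the dot product of a row of the first factor with a column of the second. A minimal sketch (using plain Python lists as matrices):

```python
def dot(u, v):
    """Dot product of two vectors of equal length."""
    return sum(a * b for a, b in zip(u, v))

def matmul(A, B):
    """Entry (i, j) of AB is the dot product of row i of A with column j of B."""
    cols_B = list(zip(*B))  # transpose B to iterate over its columns
    return [[dot(row, col) for col in cols_B] for row in A]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = matmul(A, B)  # [[19, 22], [43, 50]]
```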
Week 3
Lay: 2.2 and 2.3
Bretscher: 2.4

The Inverse of a Linear Transformation

  • The Identity matrix
  • The Inverse of a Matrix
  • Various characterizations for an invertible matrix
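For 2-by-2 matrices the inverse has a closed form, and one of the characterizations of invertibility (nonzero determinant) appears explicitly in it. An illustrative sketch, not part of the course materials:

```python
from fractions import Fraction

def inverse_2x2(A):
    """Inverse of a 2x2 matrix via the adjugate formula; fails if det = 0."""
    (a, b), (c, d) = A
    det = Fraction(a * d - b * c)
    if det == 0:
        raise ValueError("matrix is not invertible (determinant is zero)")
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [5, 3]]
A_inv = inverse_2x2(A)  # [[3, -1], [-5, 2]]

# Check: A times A_inv gives the identity matrix.
product = [[sum(a * b for a, b in zip(row, col)) for col in zip(*A_inv)]
           for row in A]
```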


Week 4
Lay: 1.8 and 1.9
Bretscher: 2.1

Introduction to Linear Transformations

  • Linear Transformations
  • Requirements for a transformation to be linear
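The two requirements for linearity — T(u + v) = T(u) + T(v) and T(cv) = cT(v) — can be spot-checked numerically for a map given by a matrix. A small sketch with arbitrarily chosen test vectors:

```python
def apply(A, v):
    """Apply the matrix A to the vector v (compute Av)."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[1, 2], [0, 3]]
u, v, c = [1, -1], [4, 2], 5

# Additivity: T(u + v) should equal T(u) + T(v).
add_inputs  = apply(A, [a + b for a, b in zip(u, v)])
add_outputs = [a + b for a, b in zip(apply(A, u), apply(A, v))]

# Homogeneity: T(c * v) should equal c * T(v).
scale_input  = apply(A, [c * x for x in v])
scale_output = [c * x for x in apply(A, v)]
```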
Week 5
Lay: 1.7, 2.8, and 2.9
Bretscher: 3.2

Subspaces of Rⁿ and Linear Independence

  • Definition of a subspace of Rⁿ
  • Defining linear independence for a set of vectors
  • Definition of a basis for a subspace
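For the special case of two vectors, linear dependence means one is a scalar multiple of the other, which is easy to test directly. An illustrative sketch for that special case only (the function name is a choice made here):

```python
from fractions import Fraction

def dependent_pair(u, v):
    """Two vectors are linearly dependent exactly when one is a scalar
    multiple of the other (checked componentwise via ratios)."""
    ratios = set()
    for a, b in zip(u, v):
        if b == 0:
            if a != 0:
                return False  # u has a component where v has none
        else:
            ratios.add(Fraction(a, b))
    return len(ratios) <= 1   # at most one common ratio means dependence

dep = dependent_pair([2, 4, -2], [1, 2, -1])  # True: first = 2 * second
ind = dependent_pair([1, 0, 1], [0, 1, 1])    # False: neither is a multiple
```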


Week 6
Lay: 4.1
Bretscher: 4.1

Introduction to Vector Spaces

  • Definition of a vector space (or linear space)
  • Subspaces of vector spaces
  • Linear combinations and bases for vector spaces
  • Examples of vector spaces of functions


Week 6
Lay: 4.2
Bretscher: 3.1

The Column Space and Nullspace of a Linear Transformation

  • The image (or column space) of a linear transformation
  • The kernel (or nullspace) of a linear transformation
  • Properties of the kernel


Week 7
Lay: 4.3 and 4.5
Bretscher: 3.3 and 4.1

The Dimension of a Vector Space


  • The number of vectors in a basis of Rⁿ
  • Dimension of a subspace in Rⁿ
  • The dimension of a vector space
  • The dimension of the nullspace (or kernel) and the column space (or image)
  • The Rank-nullity Theorem
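The Rank-nullity Theorem states that for an m-by-n matrix A, rank(A) + dim(ker A) = n. A small numerical check (the rank is computed here by counting pivots after forward elimination; this is a sketch, not course code):

```python
from fractions import Fraction

def rank(M):
    """Number of pivots after forward elimination."""
    m = [[Fraction(x) for x in row] for row in M]
    pivots, top = 0, 0
    for col in range(len(m[0])):
        pr = next((r for r in range(top, len(m)) if m[r][col] != 0), None)
        if pr is None:
            continue
        m[top], m[pr] = m[pr], m[top]
        for r in range(top + 1, len(m)):
            f = m[r][col] / m[top][col]
            m[r] = [a - f * b for a, b in zip(m[r], m[top])]
        pivots += 1
        top += 1
    return pivots

A = [[1, 2, 0, 1],
     [0, 0, 1, 1]]  # a 2 x 4 matrix, so the domain is R^4
r = rank(A)         # dimension of the image (column space)
nullity = 4 - r     # Rank-nullity: dim(ker A) = n - rank(A)
```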


Week 8
Lay: 6.1 and 6.2
Bretscher: Appendix A and 5.1

Dot Products and Orthogonality

  • Orthogonal vectors
  • Length (or magnitude or norm) of a vector
  • Unit vectors
  • Orthonormal vectors
  • Orthogonal projections
  • Orthogonal complements
  • Cauchy-Schwarz inequality
  • The angle between vectors
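Several of these outcomes (orthogonality, length, unit vectors, projections, the angle formula) reduce to computations with the dot product. A minimal sketch, with the vectors chosen arbitrarily for illustration:

```python
import math

def dot(u, v):
    """Dot product of two vectors."""
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    """Length (magnitude, norm) of a vector: the square root of v . v."""
    return math.sqrt(dot(v, v))

u, v, w = [3, 4], [-4, 3], [1, 0]

orthogonal = (dot(u, v) == 0)          # -12 + 12 = 0, so u and v are orthogonal
unit_u = [x / norm(u) for x in u]      # unit vector in the direction of u

# Angle between vectors, from cos(theta) = (u . v) / (|u| |v|).
theta = math.acos(dot(u, w) / (norm(u) * norm(w)))

# Orthogonal projection of w onto the line spanned by u.
proj = [dot(w, u) / dot(u, u) * x for x in u]
residual = [a - b for a, b in zip(w, proj)]  # w - proj is orthogonal to u
```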
Week 9
Lay: 6.3 and 6.4
Bretscher: 5.2 and 5.3

Orthonormal Bases and the Gram-Schmidt Process

Orthogonal Transformations and Orthogonal Matrices

  • Orthogonal transformations
  • Orthonormal Bases
  • Orthogonal matrices
  • The transpose of a matrix
  • The Gram-Schmidt Process
  • QR factorization
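The Gram-Schmidt Process listed above can be sketched compactly: each vector has its projections onto the previously built orthonormal vectors subtracted off, then is normalized. An illustrative sketch for linearly independent inputs:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    """Turn a list of linearly independent vectors into an orthonormal list."""
    basis = []
    for v in vectors:
        w = list(v)
        for q in basis:
            c = dot(w, q)                      # component of w along q
            w = [a - c * b for a, b in zip(w, q)]
        length = math.sqrt(dot(w, w))
        basis.append([x / length for x in w])  # normalize to unit length
    return basis

Q = gram_schmidt([[1, 1, 0], [1, 0, 1]])
```

The columns produced here are the Q of a QR factorization of the matrix whose columns are the input vectors.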
Week 10
Lay: 6.5 and 6.6
Bretscher: 5.4

The Least-squares Solution

  • The orthogonal complement of the image is equal to the left nullspace (or kernel of the transpose) for all matrices
  • The least-squares solution for a linear system
  • Data fitting using the least-squares solution
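Data fitting by least squares amounts to solving the normal equations AᵀA c = Aᵀy, where A is the design matrix. A small sketch fitting a line y = c0 + c1·x to three points chosen here for illustration:

```python
def transpose(M):
    return [list(col) for col in zip(*M)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def solve_2x2(M, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    x = (b[0] * M[1][1] - M[0][1] * b[1]) / det
    y = (M[0][0] * b[1] - b[0] * M[1][0]) / det
    return [x, y]

# Fit y = c0 + c1 * x to the points (0, 1), (1, 2), (2, 2).
xs, ys = [0, 1, 2], [1, 2, 2]
A = [[1, x] for x in xs]            # design matrix
AtA = matmul(transpose(A), A)       # normal equations: (A^T A) c = A^T y
Aty = [sum(a * y for a, y in zip(col, ys)) for col in transpose(A)]
c0, c1 = solve_2x2(AtA, Aty)        # intercept and slope of the best-fit line
```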


Week 11
Lay: 3.1 and 3.2
Bretscher: 6.1 and 6.2

Introduction to Determinants

Cramer's Rule

  • The determinant of 2 by 2 and 3 by 3 matrices
  • The determinant of a general n by n matrix
  • The determinant of a triangular matrix
  • Properties of the determinant
  • The determinant of the transpose
  • Invertibility and the determinant
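Two of these outcomes can be checked in a few lines: the determinant of a triangular matrix is the product of its diagonal entries, and a matrix is invertible exactly when its determinant is nonzero. A sketch using cofactor expansion along the first row for the 3-by-3 case:

```python
def det2(M):
    """Determinant of a 2x2 matrix."""
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    total = 0
    for j in range(3):
        minor = [[M[r][c] for c in range(3) if c != j] for r in range(1, 3)]
        total += (-1) ** j * M[0][j] * det2(minor)
    return total

upper = [[2, 1, 5],
         [0, 3, -1],
         [0, 0, 4]]
d = det3(upper)      # triangular matrix: determinant = 2 * 3 * 4
invertible = d != 0  # invertible iff the determinant is nonzero
```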


Week 12
Lay: 3.3
Bretscher: 6.3

The Geometric Interpretation of the Determinant


  • Cramer's Rule
  • The adjoint and inverse of a matrix
  • The area of a parallelogram and the volume of a parallelepiped


Week 13
Lay: 5.1, 5.2, and the beginning of 5.3
Bretscher: 7.1, 7.2, and the beginning of 7.3

Eigenvalues and Eigenvectors

  • The requirement for a matrix to be diagonalizable
  • Definition of an eigenvector
  • The characteristic equation used to find eigenvalues
  • Eigenvalues of a triangular matrix
  • Eigenspaces for specific eigenvalues
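For a 2-by-2 matrix the characteristic equation det(A − λI) = 0 is the quadratic λ² − trace(A)·λ + det(A) = 0, so its eigenvalues come straight from the quadratic formula. An illustrative sketch (real-eigenvalue case only):

```python
import math

def eigenvalues_2x2(M):
    """Real eigenvalues of a 2x2 matrix from its characteristic equation
    lambda^2 - trace(M) * lambda + det(M) = 0."""
    (a, b), (c, d) = M
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    if disc < 0:
        raise ValueError("eigenvalues are complex")
    root = math.sqrt(disc)
    return sorted([(tr - root) / 2, (tr + root) / 2])

A = [[2, 1], [1, 2]]
lams = eigenvalues_2x2(A)  # eigenvalues 1 and 3

# Check the eigenvector (1, 1) for eigenvalue 3: A v should equal 3 v.
v = [1, 1]
Av = [sum(a * x for a, x in zip(row, v)) for row in A]
```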


Week 14
Lay: 5.3 and 5.4
Bretscher: 3.4 and 7.3

Diagonalization of Matrices

  • Similar matrices
  • Diagonalization in terms of linearly independent eigenvectors
  • Algebraic and geometric multiplicity for a specific eigenvalue
  • The strategy for diagonalization
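The diagonalization strategy — collect linearly independent eigenvectors as the columns of P, put the matching eigenvalues on the diagonal of D, and recover A as PDP⁻¹ — can be verified numerically. A sketch using the symmetric matrix from the eigenvalue discussion (eigenvalue 1 with eigenvector (1, −1), eigenvalue 3 with eigenvector (1, 1)):

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def inverse_2x2(M):
    """Inverse of a 2x2 matrix via the adjugate formula."""
    (a, b), (c, d) = M
    det = Fraction(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[2, 1], [1, 2]]
P = [[1, 1], [-1, 1]]   # columns are linearly independent eigenvectors of A
D = [[1, 0], [0, 3]]    # matching eigenvalues on the diagonal
reconstructed = matmul(matmul(P, D), inverse_2x2(P))  # should equal A
```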