MAT2233

From Department of Mathematics at UTSA

A comprehensive list of all undergraduate math courses at UTSA can be found here.

The Wikipedia summary of Linear Algebra and its history.

==Topics List==

{| class="wikitable sortable"
! Date !! Sections from Lay !! Sections from Bretscher !! Topics !! Prerequisite Skills !! Student Learning Outcomes
|-
| Week 1 || 1.1 and 1.2 || 1.1 || [[Introduction to Linear Systems of Equations]]
||
* Adding equations and multiplying equations by constants
* [[Solving Equations and Inequalities]]
||
* Using elimination to find solutions of linear systems
* The geometrical interpretation of solutions to linear systems
|-
| Week 2 || 1.3, 1.4, and 1.5 || 1.2 and 1.3 || [[Vectors and Matrices]]<br>[[Gauss-Jordan Elimination]]
||
* [[Introduction to Linear Systems of Equations]]
* [[Linear Equations|Equation for a line]]
||
* Vectors and vector addition
* Matrix notation
* The Gauss-Jordan method for solving a linear system of equations
* The rank of a matrix
* Sums of matrices
* The product Ax (where A is a matrix and x is a vector)
* The dot product
* Linear combinations
|-
| Week 3 || 2.1 || 2.3 || [[Matrix Algebra and Matrix Multiplication]]
||
* [[Range of a Function]]
* [[Vectors and Matrices]], [[Gauss-Jordan Elimination]]
* [[Transformations of Functions]]
||
* Matrix operations
* Matrix products by columns
* Matrix products using the dot product
|-
| Week 3 || 2.2 and 2.3 || 2.4 || [[The Inverse of a Linear Transformation]]
||
* [[Matrix Algebra and Matrix Multiplication]]
* [[Inverse functions and the identity function|Inverse Functions]]
* [[Introduction to Linear Systems of Equations]]
||
* The identity matrix
* The inverse of a matrix
* Various characterizations for an invertible matrix
|-
| Week 4 || 1.8 and 1.9 || 2.1 || [[Introduction to Linear Transformations]]
||
* [[Introduction to Linear Systems of Equations]]
* [[Vectors and Matrices]], [[Gauss-Jordan Elimination]]
* [[Transformations of Functions]]
||
* Linear transformations
* Requirements for a transformation to be linear
|-
| Week 5 || 1.7, 2.8, and 2.9 || 3.2 || [[Subspaces of Rⁿ and Linear Independence]]
||
* [[Matrix Algebra and Matrix Multiplication]]
* [[The Inverse of a Linear Transformation]]
||
* Definition of a subspace of Rⁿ
* Defining linear independence for a set of vectors
* Definition of a basis for a subspace
|-
| Week 6 || 4.1 || 4.1 || [[Introduction to Vector Spaces]]
||
* [[Matrix Algebra and Matrix Multiplication]]
* [[Subspaces of Rⁿ and Linear Independence]]
||
* Definition of a vector space (or linear space)
* Subspaces of vector spaces
* Linear combinations and bases for vector spaces
* Examples of vector spaces of functions
|-
| Week 6 || 4.2 || 3.1 || [[The Column Space and Nullspace of a Linear Transformation]]
||
* [[Introduction to Linear Transformations]]
* [[Range of a Function]]
* [[The Inverse of a Linear Transformation]]
||
* The image (or column space) of a linear transformation
* The kernel (or nullspace) of a linear transformation
* Properties of the kernel
|-
| Week 7 || 4.3 and 4.5 || 3.3 and 4.1 || [[The Dimension of a Vector Space]]
||
* [[Introduction to Vector Spaces]]
* [[Subspaces of Rⁿ and Linear Independence]]
||
* The number of vectors in a basis of Rⁿ
* Dimension of a subspace of Rⁿ
* The dimension of a vector space
* The dimension of the nullspace (or kernel) and the column space (or image)
* The Rank-nullity Theorem
|-
| Week 8 || 6.1 and 6.2 || Appendix A and 5.1 || [[Dot Products and Orthogonality]]
||
* [[The Dimension of a Vector Space]]
* [[Subspaces of Rⁿ and Linear Independence]]
||
* Orthogonal vectors
* Length (or magnitude or norm) of a vector
* Unit vectors
* Orthonormal vectors
* Orthogonal projections
* Orthogonal complements
* The Cauchy-Schwarz inequality
* The angle between vectors
|-
| Week 9 || 6.3 and 6.4 || 5.2 and 5.3 || [[Orthonormal Bases and the Gram-Schmidt Process]]<br>[[Orthogonal Transformations and Orthogonal Matrices]]
||
* [[Subspaces of Rⁿ and Linear Independence]]
* [[Dot Products and Orthogonality]]
||
* Orthogonal transformations
* Orthonormal bases
* Orthogonal matrices
* The transpose of a matrix
* The Gram-Schmidt process
* QR factorization
|-
| Week 10 || 6.5 and 6.6 || 5.4 || [[The Least-squares Solution]]
||
* [[Dot Products and Orthogonality]]
* [[The Column Space and Nullspace of a Linear Transformation]]
||
* The orthogonal complement of the image is equal to the left nullspace (or kernel of the transpose) for all matrices
* The least-squares solution for a linear system
* Data fitting using the least-squares solution
|-
| Week 11 || 3.1 and 3.2 || 6.1 and 6.2 || [[Introduction to Determinants]]<br>[[Cramer's Rule]]
||
* [[Orthonormal Bases and the Gram-Schmidt Process]]
* [[The Inverse of a Linear Transformation]]
* [[Sigma Notation]]
||
* The determinant of 2 by 2 and 3 by 3 matrices
* The determinant of a general n by n matrix
* The determinant of a triangular matrix
* Properties of the determinant
* The determinant of the transpose
* Invertibility and the determinant
|-
| Week 12 || 3.3 || 6.3 || [[The Geometric Interpretation of the Determinant]]
||
* [[Orthonormal Bases and the Gram-Schmidt Process]]
* [[Introduction to Determinants]]
* [[The Inverse of a Linear Transformation]]
||
* Cramer's Rule
* The adjoint and inverse of a matrix
* The area of a parallelogram and the volume of a parallelepiped
|-
| Week 13 || The beginning of 5.3, as well as 5.1 and 5.2 || 7.1, 7.2, and the beginning of 7.3 || [[Eigenvalues and Eigenvectors]]
||
* [[Introduction to Determinants]]
* [[The Column Space and Nullspace of a Linear Transformation]]
* [[The Inverse of a Linear Transformation]]
||
* The requirement for a matrix to be diagonalizable
* Definition of an eigenvector
* The characteristic equation used to find eigenvalues
* Eigenvalues of a triangular matrix
* Eigenspaces for specific eigenvalues
|-
| Week 14 || 5.3 and 5.4 || 3.4 and 7.3 || [[Diagonalization of Matrices]]
||
* [[Eigenvalues and Eigenvectors]]
* [[The Column Space and Nullspace of a Linear Transformation]]
||
* Similar matrices
* Diagonalization in terms of linearly independent eigenvectors
* Algebraic and geometric multiplicity for a specific eigenvalue
* The strategy for diagonalization (a brief worked sketch follows the table)
|}
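A minimal worked sketch of the eigenvalue and diagonalization outcomes listed for Weeks 13 and 14; the 2 by 2 matrix below is chosen purely for illustration and is not an example taken from Lay or Bretscher.

<math>
A=\begin{pmatrix}2&1\\1&2\end{pmatrix},\qquad
\det(A-\lambda I)=(2-\lambda)^2-1=(\lambda-1)(\lambda-3)=0,
</math>

so the eigenvalues are <math>\lambda_1=1</math> and <math>\lambda_2=3</math>, with eigenvectors <math>v_1=\begin{pmatrix}1\\-1\end{pmatrix}</math> and <math>v_2=\begin{pmatrix}1\\1\end{pmatrix}</math> found by solving <math>(A-\lambda I)v=0</math> with Gauss-Jordan elimination. Collecting the eigenvectors as the columns of <math>S</math> gives

<math>
S=\begin{pmatrix}1&1\\-1&1\end{pmatrix},\qquad
S^{-1}AS=\begin{pmatrix}1&0\\0&3\end{pmatrix},
</math>

so <math>A</math> is diagonalizable because its eigenvectors are linearly independent (each eigenvalue has algebraic and geometric multiplicity 1).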