MAT2233

From Department of Mathematics at UTSA

A comprehensive list of all undergraduate math courses at UTSA can be found here.

The Wikipedia summary of Linear Algebra and its history.

Topics List

Each entry below gives the week (date), the sections from Lay, the sections from Bretscher, the topics, the prerequisite skills, and the student learning outcomes.
Week 1
1.1 and 1.2
1.1

Introduction to Linear Systems of Equations

  • Using elimination to find solutions of linear systems
  • The Geometrical interpretation of solutions to linear systems
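
A brief illustrative sketch of these outcomes in Python with NumPy (the coefficients are made up for demonstration; NumPy is assumed to be available and is not part of the course materials): solving a 2-by-2 system and checking that the solution is the intersection point of the two lines.

    import numpy as np

    # The system  x + 2y = 5,  3x + 4y = 6  written in matrix form as Ax = b
    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])
    b = np.array([5.0, 6.0])

    x = np.linalg.solve(A, b)        # the same answer elimination gives by hand
    print(x)                         # [-4.   4.5]
    print(np.allclose(A @ x, b))     # True: the point lies on both lines
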
Week 2
1.3, 1.4, and 1.5
1.2 and 1.3

Vectors, Matrices, and Gauss-Jordan Elimination

  • Vectors and vector spaces
  • Matrix notation
  • The Gauss-Jordan method for solving a linear system of equations
  • The rank of a matrix
  • Sums of Matrices
  • The product Ax (where A is a matrix and x is a vector)
  • The Dot product
  • Linear Combinations
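
A short sketch of several of these items (the matrix is chosen arbitrarily; NumPy and SymPy are assumed, not prescribed by the course): Gauss-Jordan elimination via SymPy's rref(), the rank, the product Ax, and the matching linear combination of columns.

    import numpy as np
    from sympy import Matrix

    rows = [[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]]
    A = np.array(rows)
    x = np.array([1, 1, 1])

    # Gauss-Jordan elimination: rref() returns the reduced row echelon form and the pivot columns
    rref, pivot_cols = Matrix(rows).rref()
    print(rref, pivot_cols)
    print(np.linalg.matrix_rank(A))          # 2, matching the number of pivots

    print(A @ x)                             # the product Ax ...
    print(A[:, 0] + A[:, 1] + A[:, 2])       # ... equals this linear combination of the columns of A
    print(np.dot(A[0], x))                   # dot product of the first row of A with x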


Week 3
1.8 and 1.9
2.1

Introduction to Linear Transformations

  • Linear Transformation
  • Requirements for a transformation to be linear
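
A minimal numeric check (example matrix and vectors made up) of the two requirements for a transformation to be linear, applied to the matrix map T(x) = Ax:

    import numpy as np

    A = np.array([[2.0, 0.0],
                  [1.0, 3.0]])
    T = lambda x: A @ x                          # the transformation T(x) = Ax

    u = np.array([1.0, 2.0])
    v = np.array([-3.0, 4.0])
    c = 5.0

    print(np.allclose(T(u + v), T(u) + T(v)))    # additivity: T(u + v) = T(u) + T(v)
    print(np.allclose(T(c * u), c * T(u)))       # homogeneity: T(cu) = cT(u)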


Week 4
2.1
2.3

Matrix Algebra and Matrix Multiplication

  • Matrix Operations
  • Matrix products by columns
  • Matrix products using the dot product
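
A small sketch (arbitrary 2-by-2 matrices; NumPy assumed) showing the product AB computed three ways: directly, column by column, and entry by entry with dot products.

    import numpy as np

    A = np.array([[1, 2],
                  [3, 4]])
    B = np.array([[5, 6],
                  [7, 8]])

    C = A @ B                                     # the matrix product

    # Column view: column j of AB is A times column j of B
    by_columns = np.column_stack([A @ B[:, j] for j in range(2)])

    # Entry view: (AB)[i, j] is the dot product of row i of A with column j of B
    by_dots = np.array([[np.dot(A[i], B[:, j]) for j in range(2)] for i in range(2)])

    print(np.array_equal(C, by_columns), np.array_equal(C, by_dots))   # True True
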
Week 4
2.2 and 2.3
2.4

The Inverse of a Linear Transformation

  • The Identity matrix
  • The Inverse of a Matrix
  • The Inverse of a Linear Transformation
  • Various characterizations for an invertible matrix
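
A quick sketch (invertible matrix chosen arbitrarily) of the identity matrix, the inverse, and the determinant test for invertibility:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [5.0, 3.0]])            # det(A) = 1, so A is invertible

    I = np.eye(2)                          # the 2-by-2 identity matrix
    A_inv = np.linalg.inv(A)               # the matrix of the inverse transformation

    print(np.allclose(A @ A_inv, I))       # True
    print(np.allclose(A_inv @ A, I))       # True
    print(np.linalg.det(A) != 0)           # one characterization: nonzero determinant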


Week 5
1.7, 2.8, and 2.9
3.2

Subspaces of Rⁿ and Linear Independence

  • Definition of a subspace of Rⁿ
  • Defining linear independence for a set of vectors
  • Definition of a basis for a subspace
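
An illustrative sketch (vectors made up) of linear dependence detected by the rank, and of a basis for the subspace the vectors span:

    import numpy as np

    # Three vectors in R^3 with v3 = v1 + v2, so the set {v1, v2, v3} is linearly dependent
    v1 = np.array([1.0, 0.0, 2.0])
    v2 = np.array([0.0, 1.0, 1.0])
    v3 = v1 + v2

    print(np.linalg.matrix_rank(np.column_stack([v1, v2, v3])))   # 2 < 3: dependent
    # {v1, v2} is linearly independent, so it is a basis of the plane (subspace) it spans
    print(np.linalg.matrix_rank(np.column_stack([v1, v2])))       # 2: independent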


Week 6
4.1
4.1

Introduction to Vector Spaces

  • Definition of a vector space (or linear space)
  • Subspaces of vector spaces
  • Linear combinations and bases for vector spaces
  • Examples of vector spaces of functions
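
One sketch of a vector space of functions (an illustrative example only): polynomials of degree at most 2 form a vector space, and with respect to the basis {1, t, t^2} each polynomial is identified with its coefficient vector, so linear combinations can be computed coordinate-wise.

    import numpy as np

    p = np.array([1.0, -2.0, 3.0])   # coefficients of 1 - 2t + 3t^2 in the basis {1, t, t^2}
    q = np.array([0.0,  4.0, 1.0])   # coefficients of 4t + t^2

    r = 2 * p + q                    # the polynomial 2p(t) + q(t) = 2 + 0t + 7t^2
    print(r)                         # [2. 0. 7.]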


Week 6
4.2
3.1

The Column Space and Nullspace of a Linear Transformation

  • The image (or column space) of a linear transformation
  • The kernel (or nullspace) of a linear transformation
  • Properties of the kernel
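
A small numeric sketch (matrix made up; NumPy and SciPy assumed) of the image and the kernel of the map x ↦ Ax:

    import numpy as np
    from scipy.linalg import null_space

    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0]])     # the second row is twice the first

    print(np.linalg.matrix_rank(A))     # 1: the image (column space) is a line

    N = null_space(A)                   # columns form an orthonormal basis of the kernel
    print(N.shape[1])                   # 2: the kernel is a plane in R^3
    print(np.allclose(A @ N, 0))        # True: every kernel vector satisfies Ax = 0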


Week 7
4.3 and 4.5
3.3 and 4.1

The Dimension of a Vector Space


  • The number of vectors in a basis of Rⁿ
  • Dimension of a subspace in Rⁿ
  • The dimension of a vector space
  • The dimension of the nullspace (or kernel) and the column space (or image)
  • The Rank-nullity Theorem
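
A brief check of the Rank-nullity Theorem on an arbitrary example (NumPy and SciPy assumed):

    import numpy as np
    from scipy.linalg import null_space

    A = np.array([[1.0, 2.0, 0.0, 1.0],
                  [0.0, 0.0, 1.0, 1.0]])     # n = 4 columns

    rank = np.linalg.matrix_rank(A)          # dimension of the image (column space)
    nullity = null_space(A).shape[1]         # dimension of the kernel (nullspace)

    print(rank, nullity)                     # 2 2
    print(rank + nullity == A.shape[1])      # True: rank + nullity = n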


Week 8
6.1 and 6.4
Appendix A and 5.1

Dot Products and Orthogonality

  • Orthogonal vectors
  • Length (or magnitude or norm) of a vector
  • Unit vectors
  • Orthonormal vectors
  • Orthogonal projections
  • Orthogonal complements
  • Cauchy-Schwarz inequality
  • The angle between vectors
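
An illustrative sketch (vectors made up) touching several of these outcomes: orthogonality via the dot product, length, unit vectors, projection onto a line, the Cauchy-Schwarz inequality, and the angle between two vectors.

    import numpy as np

    u = np.array([3.0, 4.0])
    v = np.array([-4.0, 3.0])
    w = np.array([1.0, 1.0])

    print(np.dot(u, v))                       # 0: u and v are orthogonal
    print(np.linalg.norm(u))                  # length of u (here 5)
    u_hat = u / np.linalg.norm(u)             # unit vector in the direction of u

    proj = (np.dot(w, u) / np.dot(u, u)) * u  # orthogonal projection of w onto span{u}
    print(np.isclose(np.dot(w - proj, u), 0)) # True: the residual is orthogonal to u

    # Cauchy-Schwarz inequality: |u . w| <= ||u|| ||w||
    print(abs(np.dot(u, w)) <= np.linalg.norm(u) * np.linalg.norm(w))

    # Angle between u and w
    theta = np.arccos(np.dot(u, w) / (np.linalg.norm(u) * np.linalg.norm(w)))
    print(np.degrees(theta))
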
Week 9
6.3 and 6.4
5.2 and 5.3

Orthonormal Bases and the Gram-Schmidt Process

  • Orthogonal transformations
  • Orthonormal Bases
  • Orthogonal matrices
  • The transpose of a matrix
  • The Gram-Schmidt Process
  • QR factorization
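
A short sketch (matrix chosen arbitrarily) of QR factorization, which packages the Gram-Schmidt process: the columns of Q form an orthonormal basis of the column space of A, and R is upper triangular.

    import numpy as np

    A = np.array([[1.0, 1.0],
                  [1.0, 0.0],
                  [0.0, 1.0]])

    Q, R = np.linalg.qr(A)                    # reduced QR factorization

    print(np.allclose(Q.T @ Q, np.eye(2)))    # True: the columns of Q are orthonormal
    print(np.allclose(Q @ R, A))              # True: A = QR
    print(np.allclose(np.triu(R), R))         # True: R is upper triangular
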
Week 10
6.5 and 6.6
5.4

The Least-squares Solution

  • The orthogonal complement of the image is equal to the left nullspace (or kernel of the transpose) for all matrices
  • The least-squares solution for a linear system
  • Data fitting using the least-squares solution
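
A small data-fitting sketch (the data points are invented for illustration): fitting a line y = c0 + c1·t by solving the normal equations, and checking against NumPy's least-squares solver.

    import numpy as np

    t = np.array([0.0, 1.0, 2.0, 3.0])
    y = np.array([1.0, 2.1, 2.9, 4.2])        # not exactly collinear

    A = np.column_stack([np.ones_like(t), t]) # design matrix for y = c0 + c1*t

    c = np.linalg.solve(A.T @ A, A.T @ y)     # least-squares solution via the normal equations
    c_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)
    print(c, np.allclose(c, c_lstsq))         # the two methods agree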


Week 11
3.1 and 3.2
6.1 and 6.2

Introduction to Determinants

  • The determinant of 2 by 2 and 3 by 3 matrices
  • The determinant of a general n by n matrix
  • The determinant of a triangular matrix
  • Properties of the determinant
  • The determinant of the transpose
  • Invertibility and the determinant
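
A brief sketch (matrices made up) of several determinant facts listed above:

    import numpy as np

    A = np.array([[2.0, 1.0, 0.0],
                  [0.0, 3.0, 4.0],
                  [0.0, 0.0, 5.0]])           # upper triangular

    print(np.linalg.det(A))                   # ~30: product of the diagonal entries
    print(np.isclose(np.linalg.det(A.T), np.linalg.det(A)))   # det(A^T) = det(A)

    B = np.array([[1.0, 2.0],
                  [2.0, 4.0]])                # rows are linearly dependent
    print(np.isclose(np.linalg.det(B), 0.0))  # True: det(B) = 0, so B is not invertible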


Week 12
3.3
6.3

The Geometric Interpretation of the Determinant


  • Cramer's Rule
  • The adjoint and inverse of a matrix
  • The area of a parallelogram and the volume of a parallelepiped
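
A worked sketch (system made up) of Cramer's rule and the area interpretation of the determinant:

    import numpy as np

    A = np.array([[3.0, 1.0],
                  [1.0, 2.0]])
    b = np.array([9.0, 8.0])

    # Cramer's rule: x_i = det(A_i) / det(A), where A_i is A with column i replaced by b
    det_A = np.linalg.det(A)
    x = np.array([np.linalg.det(np.column_stack([b, A[:, 1]])) / det_A,
                  np.linalg.det(np.column_stack([A[:, 0], b])) / det_A])
    print(x, np.allclose(A @ x, b))           # [2. 3.] True

    # |det(A)| is the area of the parallelogram spanned by the columns of A
    print(abs(det_A))                         # 5.0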


Week 13
5.1, 5.2, and the beginning of 5.3
7.1, 7.2 and the beginning of 7.3

Eigenvalues and Eigenvectors

  • The requirement for a matrix to be diagonalizable
  • Definition of an eigenvector
  • The characteristic equation used to find eigenvalues
  • Eigenvalues of a triangular matrix
  • Eigenspaces for specific eigenvalues
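
A quick sketch (matrix made up) of eigenvalues found from the characteristic equation and of the defining property Av = λv:

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    # The eigenvalues are the roots of det(A - lambda*I) = 0; here lambda = 5 and 2
    eigvals, eigvecs = np.linalg.eig(A)
    print(eigvals)

    # Each column of eigvecs is an eigenvector for the corresponding eigenvalue
    for lam, v in zip(eigvals, eigvecs.T):
        print(np.allclose(A @ v, lam * v))    # True for each pair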


Week 14
5.3 and 5.4
3.4 and 7.3

Diagonalization of Matrices

  • Similar matrices
  • Diagonalization in terms of linearly independent eigenvectors
  • Algebraic and geometric multiplicity for a specific eigenvalue
  • The strategy for diagonalization
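
A short sketch (same made-up matrix as above) of diagonalization A = P D P⁻¹ and one reason it is useful, computing powers:

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])

    eigvals, P = np.linalg.eig(A)             # two independent eigenvectors, so A is diagonalizable
    D = np.diag(eigvals)
    P_inv = np.linalg.inv(P)

    print(np.allclose(A, P @ D @ P_inv))      # True: A is similar to the diagonal matrix D
    print(np.allclose(np.linalg.matrix_power(A, 5),
                      P @ np.diag(eigvals ** 5) @ P_inv))   # True: powers via the diagonal form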


A second topics list follows. Each entry gives the week (date), the textbook sections, the topics, the prerequisite skills, and the student learning outcomes.
Week 1
1.1, 1.2

Systems of Linear Equations

  • Vectors and Matrices
  • Gauss-Jordan elimination
Week 2
1.3

Solutions of Linear Systems

  • Rank of a matrix
  • Matrix addition
  • The product Ax (where A is a matrix and x is a vector)
  • The Inner product
  • Linear Combinations


Week 3
2.1 and 2.2

Linear Transformations

  • Linear transformations and their properties
  • Geometry of Linear Transformations (rotations, scalings and projections)
Week 4
2.3 and 2.4

Matrix Products and Inverses

  • Matrix Products (both inner product and row-by-column methods)
  • The inverse of a linear transformation


Week 6
3.1

Image and Kernel of a Linear Transformation

  • The image of a Linear transformation
  • The kernel of a linear transformation
  • Span of a set of vectors
  • Alternative characterizations of Invertible matrices


Week 6
3.2

Linear Independence

  • Subspaces of Rⁿ
  • Redundant vectors and linear independence
  • Characterizations of Linear Independence


Week 6
3.2

Bases of Subspaces

  • Bases and Linear independence
  • Basis of the image
  • Basis and unique representation


Week 5
3.3

The Dimension of a Subspace

  • Dimension of the Image
  • Rank-nullity theorem
  • Various bases in Rⁿ


Week 7/8
3.4


Similar Matrices and Coordinates

  • Coordinates in a subspace of Rⁿ
  • Similar matrices
  • Diagonal matrices


Week 9
5.1

Orthogonal Projections and Orthonormal Bases

  • Magnitude (or norm or length) of a vector
  • Unit Vectors
  • Cauchy-Schwarz Inequality
  • Orthonormal vectors
  • Orthogonal complement
  • Orthogonal Projection
  • Orthonormal bases
  • Angle between vectors


Week 10
5.2

Gram-Schmidt Process and QR Factorization

  • Gram-Schmidt process
  • QR Factorization


Week 11
5.3

Orthogonal Transformations and Orthogonal Matrices

  • Orthogonal Transformations
  • Properties of Orthogonal Transformations
  • Transpose of a Matrix
  • The matrix of an Orthogonal Projection


Week 11
5.3

Least Squares

  • The Least Squares Solution
  • The Normal Equation
  • An alternative formula for the matrix of an orthogonal projection


Week 11
6.1 and 6.2

Determinants

  • Properties of Determinants
  • Sarrus's Rule
  • Row operations and determinants
  • Invertibility based on the determinant


Week 12
6.3

Cramer's Rule

  • Parallelepipeds in Rⁿ
  • Geometric Interpretation of the Determinant
  • Cramer's rule


Week 13
7.1

Diagonalization

  • Diagonalizable matrices
  • Eigenvalues and eigenvectors
  • Real eigenvalues of orthogonal matrices


Week 14
7.2 and 7.3

Finding Eigenvalues and Eigenvectors

  • Eigenvalues from the characteristic equation
  • Eigenvalues of Triangular matrices
  • Characteristic Polynomial
  • Eigenspaces and eigenvectors
  • Geometric and algebraic multiplicity
  • Eigenvalues of similar matrices


Week 14
8.1

Symmetric Matrices

  • Orthogonally Diagonalizable Matrices
  • Spectral Theorem
  • The real eigenvalues of a symmetric matrix
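
A short sketch (symmetric matrix made up) of the Spectral Theorem: a symmetric matrix has real eigenvalues and can be orthogonally diagonalized.

    import numpy as np

    S = np.array([[2.0, 1.0],
                  [1.0, 2.0]])                        # symmetric

    eigvals, Q = np.linalg.eigh(S)                    # eigh is intended for symmetric matrices
    print(eigvals)                                    # real eigenvalues (here 1 and 3)
    print(np.allclose(Q.T @ Q, np.eye(2)))            # True: Q is orthogonal
    print(np.allclose(S, Q @ np.diag(eigvals) @ Q.T)) # True: S = Q D Q^T
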
Week 14
8.2

Quadratic Forms

  • Quadratic Forms
  • Diagonalizing a Quadratic Form
  • Definiteness of a Quadratic Form
  • Principal Axes
  • Ellipses and Hyperbolas from Quadratic Forms
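
A closing sketch (matrix made up) of a quadratic form q(x) = xᵀAx, its diagonalization along principal axes, and a definiteness check via the eigenvalues.

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])             # symmetric matrix of the quadratic form q(x) = x^T A x

    x = np.array([1.0, -1.0])
    print(x @ A @ x)                       # value of the form at x (here 2.0)

    eigvals, Q = np.linalg.eigh(A)
    print(eigvals)                         # [1. 3.]: all positive, so q is positive definite
    # In the rotated coordinates c = Q^T x (the principal axes), q = 1*c1^2 + 3*c2^2,
    # so the level sets q = const are ellipses.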