MAT2233

From Department of Mathematics at UTSA

A comprehensive list of all undergraduate math courses at UTSA can be found here.

See the Wikipedia summary of Linear Algebra and its history.

==Topics List==

{| class="wikitable sortable"
! Date !! Sections from Lay !! Sections from Bretscher !! Topics !! Prerequisite Skills !! Student Learning Outcomes
|-
|Week 1
||
<div style="text-align: center;">1.1 and 1.2</div>
||
<div style="text-align: center;">1.1</div>
||
[[Introduction to Linear Systems of Equations]]
||
* Adding equations and multiplying equations by constants
* [[Solving Equations and Inequalities]]
||
* Using elimination to find solutions of linear systems
* The geometrical interpretation of solutions to linear systems
|-
|Week 2
||
<div style="text-align: center;">1.3, 1.4, and 1.5</div>
||
<div style="text-align: center;">1.2 and 1.3</div>
||
[[Vectors and Matrices]]

[[Gauss-Jordan Elimination]]
||
* [[Introduction to Linear Systems of Equations]]
* [[Linear Equations|Equation for a line]]
||
* Vectors and vector addition
* Matrix notation
* The Gauss-Jordan method for solving a linear system of equations
* The rank of a matrix
* Sums of matrices
* The product Ax (where A is a matrix and x is a vector)
* The dot product
* Linear combinations
|-
|Week 3
||
<div style="text-align: center;">2.1</div>
||
<div style="text-align: center;">2.3</div>
||
[[Matrix Algebra and Matrix Multiplication]]
||
* [[Range of a Function]]
* [[Vectors and Matrices]], [[Gauss-Jordan Elimination]]
* [[Transformations of Functions]]
||
* Matrix operations
* Matrix products by columns
* Matrix products using the dot product
|-
|Week 3
||
<div style="text-align: center;">2.2 and 2.3</div>
||
<div style="text-align: center;">2.4</div>
||
[[The Inverse of a Linear Transformation]]
||
* [[Matrix Algebra and Matrix Multiplication]]
* [[Inverse functions and the identity function|Inverse Functions]]
* [[Introduction to Linear Systems of Equations]]
||
* The identity matrix
* The inverse of a matrix
* Various characterizations of an invertible matrix
|-
|Week 4
||
<div style="text-align: center;">1.8 and 1.9</div>
||
<div style="text-align: center;">2.1</div>
||
[[Introduction to Linear Transformations]]
||
* [[Introduction to Linear Systems of Equations]]
* [[Vectors and Matrices]], [[Gauss-Jordan Elimination]]
* [[Transformations of Functions]]
||
* Linear transformations
* Requirements for a transformation to be linear
|-
|Week 5
||
<div style="text-align: center;">1.7, 2.8, and 2.9</div>
||
<div style="text-align: center;">3.2</div>
||
[[Subspaces of Rⁿ and Linear Independence]]
||
* [[Matrix Algebra and Matrix Multiplication]]
* [[The Inverse of a Linear Transformation]]
||
* Definition of a subspace of Rⁿ
* Definition of linear independence for a set of vectors
* Definition of a basis for a subspace
|-
|Week 6
||
<div style="text-align: center;">4.1</div>
||
<div style="text-align: center;">4.1</div>
||
[[Introduction to Vector Spaces]]
||
* [[Matrix Algebra and Matrix Multiplication]]
* [[Subspaces of Rⁿ and Linear Independence]]
||
* Definition of a vector space (or linear space)
* Subspaces of vector spaces
* Linear combinations and bases for vector spaces
* Examples of vector spaces of functions
|-
|Week 6
||
<div style="text-align: center;">4.2</div>
||
<div style="text-align: center;">3.1</div>
||
[[The Column Space and Nullspace of a Linear Transformation]]
||
* [[Introduction to Linear Transformations]]
* [[Range of a Function]]
* [[The Inverse of a Linear Transformation]]
||
* The image (or column space) of a linear transformation
* The kernel (or nullspace) of a linear transformation
* Properties of the kernel
|-
|Week 7
||
<div style="text-align: center;">4.3 and 4.5</div>
||
<div style="text-align: center;">3.3 and 4.1</div>
||
[[The Dimension of a Vector Space]]
||
* [[Introduction to Vector Spaces]]
* [[Subspaces of Rⁿ and Linear Independence]]
||
* The number of vectors in a basis of R<sup>n</sup>
* The dimension of a subspace of Rⁿ
* The dimension of a vector space
* The dimension of the nullspace (or kernel) and the column space (or image)
* The Rank-Nullity Theorem
|-
|Week 8
||
<div style="text-align: center;">6.1 and 6.2</div>
||
<div style="text-align: center;">Appendix A and 5.1</div>
||
[[Dot Products and Orthogonality]]
||
* [[The Dimension of a Vector Space]]
* [[Subspaces of Rⁿ and Linear Independence]]
||
* Orthogonal vectors
* Length (or magnitude or norm) of a vector
* Unit vectors
* Orthonormal vectors
* Orthogonal projections
* Orthogonal complements
* The Cauchy-Schwarz inequality
* The angle between vectors
|-
|Week 9
||
<div style="text-align: center;">6.3 and 6.4</div>
||
<div style="text-align: center;">5.2 and 5.3</div>
||
[[Orthonormal Bases and the Gram-Schmidt Process]]

[[Orthogonal Transformations and Orthogonal Matrices]]
||
* [[Subspaces of Rⁿ and Linear Independence]]
* [[Dot Products and Orthogonality]]
||
* Orthogonal transformations
* Orthonormal bases
* Orthogonal matrices
* The transpose of a matrix
* The Gram-Schmidt process
* QR factorization
|-
|Week 10
||
<div style="text-align: center;">6.5 and 6.6</div>
||
<div style="text-align: center;">5.4</div>
||
[[The Least-squares Solution]]
||
* [[Dot Products and Orthogonality]]
* [[The Column Space and Nullspace of a Linear Transformation]]
||
* For any matrix, the orthogonal complement of the image equals the left nullspace (the kernel of the transpose)
* The least-squares solution of a linear system
* Data fitting using the least-squares solution
|-
|Week 11
||
<div style="text-align: center;">3.1 and 3.2</div>
||
<div style="text-align: center;">6.1 and 6.2</div>
||
[[Introduction to Determinants]]

[[Cramer's Rule]]
||
* [[Orthonormal Bases and the Gram-Schmidt Process]]
* [[The Inverse of a Linear Transformation]]
* [[Sigma Notation]]
||
* The determinant of 2×2 and 3×3 matrices
* The determinant of a general n×n matrix
* The determinant of a triangular matrix
* Properties of the determinant
* The determinant of the transpose
* Invertibility and the determinant
|-
|Week 12
||
<div style="text-align: center;">3.3</div>
||
<div style="text-align: center;">6.3</div>
||
[[The Geometric Interpretation of the Determinant]]
||
* [[Orthonormal Bases and the Gram-Schmidt Process]]
* [[Introduction to Determinants]]
* [[The Inverse of a Linear Transformation]]
||
* Cramer's Rule
* The adjoint and inverse of a matrix
* The area of a parallelogram and the volume of a parallelepiped
|-
|Week 13
||
<div style="text-align: center;">5.1, 5.2, and the beginning of 5.3</div>
||
<div style="text-align: center;">7.1, 7.2, and the beginning of 7.3</div>
||
[[Eigenvalues and Eigenvectors]]
||
* [[Introduction to Determinants]]
* [[The Column Space and Nullspace of a Linear Transformation]]
* [[The Inverse of a Linear Transformation]]
||
* The requirement for a matrix to be diagonalizable
* Definition of an eigenvector
* The characteristic equation used to find eigenvalues
* Eigenvalues of a triangular matrix
* Eigenspaces for specific eigenvalues
|-
|Week 14
||
<div style="text-align: center;">5.3 and 5.4</div>
||
<div style="text-align: center;">3.4 and 7.3</div>
||
[[Diagonalization of Matrices]]
||
* [[Eigenvalues and Eigenvectors]]
* [[The Column Space and Nullspace of a Linear Transformation]]
||
* Similar matrices
* Diagonalization in terms of linearly independent eigenvectors
* Algebraic and geometric multiplicity for a specific eigenvalue
* The strategy for diagonalization
|}
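A minimal Python sketch of the Week 1 and Week 2 material (Gauss-Jordan elimination and the rank of a matrix) is given below. It assumes NumPy is available and uses a small helper function, <code>rref</code>, introduced here only for illustration; it is not part of the linked course pages, and by-hand row reduction is what the course itself covers.

<syntaxhighlight lang="python">
import numpy as np

def rref(a, tol=1e-12):
    """Reduce a matrix to reduced row echelon form by Gauss-Jordan elimination."""
    m = a.astype(float).copy()
    rows, cols = m.shape
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # Pick the largest entry in this column as the pivot (partial pivoting,
        # a numerical safeguard; by hand, any nonzero entry works).
        r = max(range(pivot_row, rows), key=lambda i: abs(m[i, col]))
        if abs(m[r, col]) < tol:
            continue                              # no pivot in this column
        m[[pivot_row, r]] = m[[r, pivot_row]]     # swap rows
        m[pivot_row] /= m[pivot_row, col]         # scale the pivot row so the pivot is 1
        for i in range(rows):                     # clear the rest of the column
            if i != pivot_row:
                m[i] -= m[i, col] * m[pivot_row]
        pivot_row += 1
    return m

# Augmented matrix of the system  x + 2y = 5,  3x + 4y = 6.
aug = np.array([[1.0, 2.0, 5.0],
                [3.0, 4.0, 6.0]])

print(rref(aug))                          # [[ 1.   0.  -4. ]
                                          #  [ 0.   1.   4.5]]  ->  x = -4, y = 4.5
print(np.linalg.matrix_rank(aug[:, :2]))  # rank of the coefficient matrix: 2
</syntaxhighlight>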