==Applied Linear Algebra==
 
  
Prerequisite: [[MAT1214]]/[[MAT1213]] Calculus I
 
 
This course provides an in-depth exploration of the core concepts of linear algebra and their applications to optimization, data analysis, and neural networks. Students will gain a strong foundation in the fundamental notions of linear systems of equations, vectors, and matrices, as well as advanced topics such as eigenvalues, eigenvectors, and canonical solutions to linear systems of differential equations. The course also covers calculus operations on vectors and matrices, optimization, and Taylor series in one and multiple variables. By the end of the course, students will have a thorough understanding of the mathematical framework underlying principal component analysis, gradient descent, and the implementation of simple neural networks.
 
 
==List of Topics==
 
 
{| class="wikitable"
 
! Week !! Section !! Topic !! Prerequisites !! SLOs
 
|-
 
| 1 ||  || Linear systems of equations as an introduction to the concepts of vectors and matrices. ||  ||
 
|-
 
| 2 ||  || Vector and matrix operations: Dot and cross products, matrix transpose, determinants.  ||  ||
 
|-
 
| 3 ||  || Vector and matrix operations: Matrix addition, multiplication and inverse. ||  ||
 
|-
 
| 4 ||  || Cramer's rule and solutions of linear systems ||  ||
 
|-
 
| 5 ||  || Full-rank, underdetermined, and overdetermined systems. Least-squares solutions ||  ||
 
|-
 
| 6 ||  || Eigenvalues and eigenvectors. Canonical solution to linear systems of differential equations.  ||  ||
 
|-
 
| 7 ||  || Calculus operations on vectors and matrices, e.g. how to differentiate a matrix with respect to a vector. ||  ||
 
|-
 
| 8 ||  || Optimization: linear problems and nonlinear problems (constrained and unconstrained) ||  ||
 
|-
 
| 9 ||  || Lagrange multipliers ||  ||
 
|-
 
| 10 ||  || Taylor series in one and multiple variables. Jacobians and Hessians; the gradient (nabla) and Laplace operators. ||  ||
 
|-
 
| 11 ||  || Principal component analysis (see the PCA sketch after the table) ||  ||
 
|-
 
| 12 ||  || Gradient descent (see the gradient-descent sketch after the table) ||  ||
 
|-
 
| 13 ||  || Neural networks as nonlinear transformations ||  ||
 
|-
 
| 14 ||  || Implementation of a simple neural network with gradient descent (see the neural-network sketch after the table) ||  ||
 
|}
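
==Illustrative Sketches==

The sketches below are not part of the official course materials; they are minimal illustrations of three topics from the table, written in Python with NumPy. The data sets, random seeds, and variable names are hypothetical.

The first sketch computes principal components (week 11) as the eigenvectors of a sample covariance matrix, which also exercises the eigenvalue material from week 6.

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical 2-D data set whose two coordinates are correlated.
rng = np.random.default_rng(0)
data = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [1.5, 0.5]])

# Center the data and form the sample covariance matrix.
centered = data - data.mean(axis=0)
cov = centered.T @ centered / (len(centered) - 1)

# Eigendecomposition of the symmetric covariance matrix.
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Principal components are the eigenvectors sorted by decreasing eigenvalue.
order = np.argsort(eigenvalues)[::-1]
components = eigenvectors[:, order]

# Project the centered data onto the first principal component.
projected = centered @ components[:, :1]

print("variance along each principal component:", eigenvalues[order])
</syntaxhighlight>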
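
The second sketch applies gradient descent (week 12) to the least-squares problem of week 5: minimizing <math>\tfrac{1}{2}\lVert A x - b\rVert^2</math>, whose gradient is <math>A^{\mathsf{T}}(A x - b)</math>. The step size and iteration count are arbitrary choices for this toy problem.

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical overdetermined system A x = b (more equations than unknowns).
rng = np.random.default_rng(1)
A = rng.normal(size=(20, 3))
x_true = np.array([1.0, -2.0, 0.5])
b = A @ x_true + 0.1 * rng.normal(size=20)

# Gradient descent on f(x) = ||A x - b||^2 / 2, with gradient A^T (A x - b).
x = np.zeros(3)
step = 0.01
for _ in range(5000):
    gradient = A.T @ (A @ x - b)
    x = x - step * gradient

# Direct least-squares solution for comparison.
x_direct, *_ = np.linalg.lstsq(A, b, rcond=None)
print("gradient descent:", x)
print("direct lstsq:    ", x_direct)
</syntaxhighlight>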
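
The last sketch trains a one-hidden-layer network on the XOR problem with plain gradient descent, in the spirit of weeks 13 and 14: a neural network as a nonlinear transformation fitted by gradient descent. The architecture, learning rate, and iteration count are illustrative choices only.

<syntaxhighlight lang="python">
import numpy as np

# Hypothetical toy data set: the XOR function on two binary inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(2)
W1 = rng.normal(size=(2, 8))   # input-to-hidden weights
b1 = np.zeros((1, 8))
W2 = rng.normal(size=(8, 1))   # hidden-to-output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

step = 1.0
for _ in range(20000):
    # Forward pass: an affine map, a sigmoid, another affine map, another sigmoid.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: chain-rule gradients of the mean squared error.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent update of every parameter.
    W2 -= step * (h.T @ d_out) / len(X)
    b2 -= step * d_out.sum(axis=0, keepdims=True) / len(X)
    W1 -= step * (X.T @ d_h) / len(X)
    b1 -= step * d_h.sum(axis=0, keepdims=True) / len(X)

# The outputs typically approach [0, 1, 1, 0] after training.
print(np.round(out, 2))
</syntaxhighlight>

The hand-written backward pass mirrors the calculus-on-matrices material from week 7; larger networks replace it with automatic differentiation, but the underlying chain-rule computation is the same.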
 
