MAT5293

From Department of Mathematics at UTSA

Numerical Linear Algebra - MAT5293

Course description

Introduction to the fundamental algorithms and theory of numerical linear algebra. Topics include direct and iterative methods for solving general linear systems, optimization, least squares problems, and solutions of sparse systems arising from partial differential equations. Toward the end of the semester, topics may also include neural networks, high-dimensional spaces, randomized methods, sparse approximation, and dimension reduction techniques. Throughout the semester, emphasis is placed on the mathematical analysis of algorithms, computational complexity, and applications to problems in science, engineering, and data analysis.

Catalog entry

MAT 5293. Numerical Linear Algebra. (3-0) 3 Credit Hours.

Prerequisite: MAT2233 (or MAT2253), or consent of instructor.

Content:

1. Review of Linear Algebra
   (a) Solution of linear systems and LU decompositions
   (b) Orthogonality, projections, and QR decompositions
   (c) Eigenvalues and SVD decompositions
2. Direct Numerical Linear Algebra
   (a) Stability, conditioning, and convergence: backward error analysis and well-posedness
   (b) LU and Cholesky decompositions
   (c) QR and SVD decompositions
3. Iterative Methods
   (a) Krylov subspace methods
   (b) Conjugate gradient method and preconditioning
4. Applications of Numerical Linear Algebra
   (a) Eigenvalue problems
   (b) Banded and sparse matrices
   (c) Least squares problems using QR and SVD decompositions
5. Foundational Techniques of Machine Learning and Data Science
   (a) High-dimensional probability: multivariate Gaussians, the central limit theorem, concentration of measure
   (b) Regression and interpolation: generalization and overfitting, regularization, cross-validation
   (c) Optimization (deterministic and stochastic)
   (d) Neural networks

Textbooks:

  • Trefethen LN, Bau D. Numerical linear algebra. Society for Industrial and Applied Mathematics; 2022.
  • Demmel JW. Applied numerical linear algebra. Society for Industrial and Applied Mathematics; 1997.
  • Golub GH, Van Loan CF. Matrix computations. Johns Hopkins University Press; 2013.
  • Mohri M, Rostamizadeh A, Talwalkar A. Foundations of machine learning. MIT Press; 2018.

Topics List

Week 1: Solution of linear systems and LU decompositions
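
A minimal sketch of how such a solve might look in practice. Python with NumPy/SciPy is assumed here and in the snippets below; the course does not prescribe a language.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Factor A once (PA = LU with partial pivoting), then reuse the factors
# for any number of right-hand sides.
A = np.array([[4.0, 3.0], [6.0, 3.0]])
b = np.array([10.0, 12.0])

lu, piv = lu_factor(A)      # compact LU factors plus pivot indices
x = lu_solve((lu, piv), b)  # forward/back substitution, O(n^2) per solve

print(x)                      # solution of Ax = b
print(np.allclose(A @ x, b))  # True: residual at machine-precision level
```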

Week 2: Orthogonality, projections, and QR decompositions
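
A sketch of classical Gram-Schmidt, the textbook route to QR; the numerically stabler Householder variant appears in Week 6.

```python
import numpy as np

def gram_schmidt_qr(A):
    """Classical Gram-Schmidt: orthonormalize the columns of A so A = QR.
    Assumes the columns of A are linearly independent."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # projection coefficient
            v -= R[i, j] * Q[:, i]        # subtract the projection
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.random.default_rng(0).standard_normal((5, 3))
Q, R = gram_schmidt_qr(A)
print(np.allclose(Q.T @ Q, np.eye(3)))  # columns are orthonormal
print(np.allclose(Q @ R, A))            # factorization reproduces A
```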

Week 3: Eigenvalues and SVD decompositions
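
A short illustration of both factorizations and the identities they satisfy:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Eigendecomposition of a (generally non-symmetric) matrix: A v = lambda v.
w, V = np.linalg.eig(A)
print(np.allclose(A @ V, V * w))     # each column satisfies A v = lambda v

# Singular value decomposition: A = U diag(s) V^T, s sorted descending.
U, s, Vt = np.linalg.svd(A)
print(np.allclose((U * s) @ Vt, A))
```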

Week 4: Stability, conditioning, and convergence: backward error analysis and well-posedness
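
A small experiment with the classically ill-conditioned Hilbert matrix, showing how the condition number limits the accuracy attainable even by a backward-stable solver:

```python
import numpy as np
from scipy.linalg import hilbert

# kappa(H) grows roughly exponentially with n, so digits of accuracy
# are lost even though Gaussian elimination is backward stable.
for n in (4, 8, 12):
    H = hilbert(n)
    x_true = np.ones(n)
    b = H @ x_true
    x = np.linalg.solve(H, b)
    kappa = np.linalg.cond(H)
    err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
    print(f"n={n:2d}  cond(H)={kappa:.1e}  relative error={err:.1e}")
```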

Week 5: LU and Cholesky decompositions
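
A sketch of a Cholesky-based solve for a symmetric positive definite system:

```python
import numpy as np
from scipy.linalg import solve_triangular

rng = np.random.default_rng(0)
B = rng.standard_normal((4, 4))
A = B @ B.T + 4 * np.eye(4)   # symmetric positive definite by construction

L = np.linalg.cholesky(A)     # A = L L^T, L lower triangular
print(np.allclose(L @ L.T, A))

# Solve A x = b with two triangular solves (about half the cost of LU).
b = np.ones(4)
y = solve_triangular(L, b, lower=True)
x = solve_triangular(L.T, y, lower=False)
print(np.allclose(A @ x, b))
```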

Week 6: QR and SVD decompositions
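
A possible implementation of Householder QR (illustrative, not the course's reference code):

```python
import numpy as np

def householder_qr(A):
    """Householder QR: numerically stabler than Gram-Schmidt."""
    m, n = A.shape
    R = A.astype(float).copy()
    Q = np.eye(m)
    for k in range(n):
        x = R[k:, k]
        v = x.copy()
        v[0] += np.copysign(np.linalg.norm(x), x[0])  # avoid cancellation
        v /= np.linalg.norm(v)
        # Apply the reflector H = I - 2 v v^T to the trailing blocks.
        R[k:, k:] -= 2.0 * np.outer(v, v @ R[k:, k:])
        Q[:, k:] -= 2.0 * np.outer(Q[:, k:] @ v, v)
    return Q, R

A = np.random.default_rng(1).standard_normal((5, 3))
Q, R = householder_qr(A)
print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(5)))
```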

Week 7: Krylov subspace methods
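
A sketch of the Arnoldi iteration, the basic building block of Krylov methods such as GMRES:

```python
import numpy as np

def arnoldi(A, b, k):
    """Build an orthonormal basis Q of span{b, Ab, ..., A^(k-1) b}
    and the small Hessenberg matrix H with A Q_k = Q_{k+1} H."""
    n = len(b)
    Q = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        v = A @ Q[:, j]
        for i in range(j + 1):            # modified Gram-Schmidt
            H[i, j] = Q[:, i] @ v
            v -= H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        Q[:, j + 1] = v / H[j + 1, j]
    return Q, H

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 20))
b = rng.standard_normal(20)
Q, H = arnoldi(A, b, 5)
print(np.allclose(A @ Q[:, :5], Q @ H))   # the Arnoldi relation holds
```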

Week 8: Conjugate gradient method and preconditioning
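
A minimal unpreconditioned CG loop; a preconditioner M ≈ A would be applied to the residual at each iteration.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Plain CG for a symmetric positive definite A.
    Each iteration costs one matrix-vector product."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# 1D Laplacian: SPD and tridiagonal, a standard CG test problem.
n = 100
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = conjugate_gradient(A, b)
print(np.linalg.norm(A @ x - b))   # residual near the tolerance
```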

Week 9: Eigenvalue problems
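
A sketch of power iteration, the simplest eigenvalue algorithm:

```python
import numpy as np

def power_iteration(A, iters=200, seed=0):
    """Power iteration: converges to the dominant eigenvector at a
    rate of |lambda_2 / lambda_1| per step."""
    v = np.random.default_rng(seed).standard_normal(A.shape[0])
    for _ in range(iters):
        v = A @ v
        v /= np.linalg.norm(v)
    return v @ A @ v, v      # Rayleigh quotient and eigenvector

A = np.diag([5.0, 2.0, 1.0])
lam, v = power_iteration(A)
print(lam)   # approximately 5.0, the dominant eigenvalue
```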

Week 10: Banded and sparse matrices
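
An illustration using SciPy's sparse machinery on the tridiagonal system arising from a 1D Poisson problem:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

# Discretize -u'' = f on (0, 1) with u(0) = u(1) = 0: the matrix is
# tridiagonal, so sparse storage and solves cost O(n), not O(n^3).
n = 1000
h = 1.0 / (n + 1)
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
A = sp.diags([off, main, off], [-1, 0, 1], format="csc") / h**2

x = np.linspace(h, 1 - h, n)
f = np.pi**2 * np.sin(np.pi * x)              # exact solution: sin(pi x)
u = spsolve(A, f)
print(np.max(np.abs(u - np.sin(np.pi * x))))  # O(h^2) discretization error
```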

Week 11: Least squares problems using QR and SVD decompositions
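
A sketch comparing the QR and SVD routes to the same least squares solution:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 3))
b = rng.standard_normal(50)

# Via QR: min ||Ax - b|| reduces to the triangular system R x = Q^T b.
Q, R = np.linalg.qr(A)            # reduced QR: Q is 50x3, R is 3x3
x_qr = np.linalg.solve(R, Q.T @ b)

# Via SVD: x = V diag(1/s) U^T b, which also handles rank deficiency.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_svd = Vt.T @ ((U.T @ b) / s)

print(np.allclose(x_qr, x_svd))
print(np.allclose(x_qr, np.linalg.lstsq(A, b, rcond=None)[0]))
```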

Week 12: High-dimensional probability: multivariate Gaussians, the central limit theorem, concentration of measure
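
A small Monte Carlo illustration of concentration: the norm of a d-dimensional standard Gaussian vector clusters tightly around sqrt(d).

```python
import numpy as np

# The mean of ||X|| scales like sqrt(d), but its standard deviation
# stays O(1), so the relative fluctuation vanishes as d grows.
rng = np.random.default_rng(0)
for d in (10, 100, 10000):
    X = rng.standard_normal((5000, d))
    norms = np.linalg.norm(X, axis=1)
    print(f"d={d:6d}  mean/sqrt(d)={norms.mean()/np.sqrt(d):.4f}  "
          f"std/sqrt(d)={norms.std()/np.sqrt(d):.4f}")
```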

Week 13: Regression and interpolation: generalization and overfitting, regularization, cross-validation
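
A sketch of ridge-regularized polynomial regression with a held-out validation split; the degree and lambda grid here are illustrative choices, not course material.

```python
import numpy as np

# Fit a degree-10 polynomial to noisy data: larger lambda trades
# variance (overfitting) for bias, and the validation set picks lambda.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 40)
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(40)
V = np.vander(x, 11)                      # Vandermonde design matrix

train, val = np.arange(0, 30), np.arange(30, 40)
for lam in (0.0, 1e-4, 1e-2, 1.0):
    A = V[train]
    # Ridge solution via normal equations: (A^T A + lam I) w = A^T y.
    w = np.linalg.solve(A.T @ A + lam * np.eye(11), A.T @ y[train])
    val_err = np.mean((V[val] @ w - y[val]) ** 2)
    print(f"lambda={lam:g}  validation MSE={val_err:.4f}")
```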

Week 14: Deterministic optimization: gradient descent, convexity, smoothness. Stochastic optimization: stochastic gradient descent, importance sampling, condition number, implicit regularization
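
A sketch contrasting full-batch gradient descent with SGD on a least squares objective; the step sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
b = A @ np.ones(d) + 0.01 * rng.standard_normal(n)   # true weights: ones

# Full gradient descent on f(w) = (1/2n)||Aw - b||^2 with step 1/L,
# where L = sigma_max(A)^2 / n is the smoothness constant.
L = np.linalg.norm(A, 2) ** 2 / n
w = np.zeros(d)
for _ in range(500):
    w -= (1.0 / L) * (A.T @ (A @ w - b)) / n

# SGD: one random row per step, with a small decaying step size
# (per-sample gradients are unbiased but noisy).
ws = np.zeros(d)
for t in range(10000):
    i = rng.integers(n)
    ws -= (0.1 / (1 + t / 1000)) * (A[i] @ ws - b[i]) * A[i]

print(np.linalg.norm(w - np.ones(d)), np.linalg.norm(ws - np.ones(d)))
```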

Week 15: Neural networks: ResNets for supervised learning, generative adversarial networks for unsupervised learning, implicit regularization, overparameterization
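
A minimal NumPy sketch of a residual block, the building unit of ResNets; the weights and sizes are arbitrary illustrative choices.

```python
import numpy as np

# A residual block computes y = x + W2 @ relu(W1 @ x). The identity
# shortcut keeps information (and gradients) flowing even when the
# block's weights are small, the key architectural idea of ResNets.
rng = np.random.default_rng(0)
d, hidden = 8, 16
W1 = 0.1 * rng.standard_normal((hidden, d))
W2 = 0.1 * rng.standard_normal((d, hidden))

def residual_block(x):
    return x + W2 @ np.maximum(W1 @ x, 0.0)   # ReLU nonlinearity

x = rng.standard_normal(d)
y = residual_block(x)
print(np.linalg.norm(y - x))   # small: the block starts near the identity
```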