<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://mathresearch.utsa.edu/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Jose.morales&amp;*</id>
	<title>Department of Mathematics at UTSA - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://mathresearch.utsa.edu/wiki/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=Jose.morales&amp;*"/>
	<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=Special:Contributions/Jose.morales"/>
	<updated>2026-04-11T07:31:26Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.34.1</generator>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT5293&amp;diff=5443</id>
		<title>MAT5293</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT5293&amp;diff=5443"/>
		<updated>2025-12-12T17:52:46Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&lt;br /&gt;
=Numerical Linear Algebra - MAT5293=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
Introduction to the fundamental algorithms and theory of numerical linear algebra. Topics include direct and iterative methods for solving general linear systems, optimization, least squares problems, and solutions of sparse systems arising from partial differential equations.&lt;br /&gt;
At the end of the semester, topics may also include neural networks, high-dimensional spaces, randomized methods, sparse approximation, and dimension reduction techniques. Throughout the semester, emphasis is placed on the mathematical analysis of algorithms, computational complexity, and applications to problems in science, engineering, and data analysis.&lt;br /&gt;
&lt;br /&gt;
==Catalog entry==&lt;br /&gt;
MAT 5293. Numerical Linear Algebra. (3-0) 3 Credit Hours.&lt;br /&gt;
&lt;br /&gt;
''Prerequisite'':   [[MAT2233]] (or [[MAT2253]]), or consent of instructor. &lt;br /&gt;
&lt;br /&gt;
''Content'': &lt;br /&gt;
1. Review of Linear Algebra&lt;br /&gt;
(a) Solution of linear systems and LU decompositions&lt;br /&gt;
(b) Orthogonality, projections, and QR decompositions&lt;br /&gt;
(c) Eigenvalues and SVD decompositions&lt;br /&gt;
2. Direct Numerical Linear Algebra&lt;br /&gt;
(a) Stability, conditioning, and convergence: backward error analysis and well-posedness&lt;br /&gt;
(b) LU and Cholesky decompositions&lt;br /&gt;
(c) QR and SVD decompositions&lt;br /&gt;
3. Iterative Methods&lt;br /&gt;
(a) Krylov subspace methods&lt;br /&gt;
(b) Conjugate gradient method and preconditioning&lt;br /&gt;
4. Applications of Numerical Linear Algebra&lt;br /&gt;
(a) Eigenvalue problems&lt;br /&gt;
(b) Banded and sparse matrices&lt;br /&gt;
(c) Least squares problems using QR and SVD decompositions&lt;br /&gt;
5. Foundational Techniques of Machine Learning and Data Science&lt;br /&gt;
(a) High-Dimensional Probability: Multivariate Gaussians, CLT, Concentration of measure.&lt;br /&gt;
(b) Regression / Interpolation: Generalization/overfitting, Regularization, Cross-validation.&lt;br /&gt;
(c) Optimization (Deterministic and Stochastic)&lt;br /&gt;
(d) Neural Networks&lt;br /&gt;
3 Credit Hours &lt;br /&gt;
&lt;br /&gt;
'''Textbooks:'''&lt;br /&gt;
&lt;br /&gt;
* Trefethen LN, Bau D. Numerical linear algebra. Society for Industrial and Applied Mathematics; 2022.&lt;br /&gt;
* Demmel JW. Applied numerical linear algebra. Society for Industrial and Applied Mathematics; 1997.&lt;br /&gt;
* Golub GH, Van Loan CF. Matrix computations. JHU press; 2013.&lt;br /&gt;
* Mohri M, Rostamizadeh A, Talwalkar A. Foundations of machine learning. MIT press; 2018.&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Solution of linear systems and LU decompositions&lt;br /&gt;
||&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Orthogonality, projections, and QR decompositions&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Eigenvalues and SVD decompositions&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Stability, conditioning, and convergence: backward error analysis and well-posedness&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
LU and Cholesky decompositions&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
QR and SVD decompositions&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Krylov subspace methods&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Conjugate gradient method and preconditioning&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*   &lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Eigenvalue problems&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Banded and sparse matrices&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Least squares problems using QR and SVD decompositions&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
High-Dimensional Probability: Multivariate Gaussians, CLT, Concentration of measure.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Regression / Interpolation: Generalization/overfitting, Regularization, Cross-validation.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Deterministic Optimization: Gradient Descent, Convexity, Smoothness. Stochastic Optimization: Stochastic Gradient descent, Importance sampling, Condition number, Implicit regularization.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 15&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Neural Networks: Resnet for supervised learning, Generative adversarial network for unsupervised learning, Implicit regularization, Overparameterization.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT5JME&amp;diff=5435</id>
		<title>MAT5JME</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT5JME&amp;diff=5435"/>
		<updated>2025-09-05T16:07:34Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: Created page with &amp;quot;=Mathematical Physics 2 - MAT4JME/5JME=  ==Course description== The course intends to be a basic introduction to the mathematical and computational techniques in applied mathe...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Mathematical Physics 2 - MAT4JME/5JME=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
The course intends to be a basic introduction to the mathematical&lt;br /&gt;
and computational techniques in applied mathematics, computational science &amp;amp; engineering,&lt;br /&gt;
and data science &amp;amp; machine learning. This course will stress how the methods of&lt;br /&gt;
mathematical modeling in the STEM disciplines have transitioned from the analytical (as in&lt;br /&gt;
Theoretical Physics) to the numerical (as in traditional methods in Computational Science&lt;br /&gt;
and Engineering) and more recently to Data-based methods (as in current developments in&lt;br /&gt;
Data Science and Machine Learning). The student will acquire the basic skills needed broadly&lt;br /&gt;
in Computational Science and Engineering, of which Computational Physics, Data Science,&lt;br /&gt;
Machine Learning, and Numerical Modeling in the Mathematical Sciences are a subset.&lt;br /&gt;
&lt;br /&gt;
==Catalog entry==&lt;br /&gt;
MAT 4XX2: Computational Science and Engineering. (3-0) 3 Credit Hours.&lt;br /&gt;
&lt;br /&gt;
''Prerequisite'': MAT 2214 (Calculus III) &amp;amp; MAT 3613 (Differential Equations I) with a&lt;br /&gt;
letter grade of C- or better, or successful completion of at least three credits of equivalent&lt;br /&gt;
courses. Basic Linux command-line experience or prior programming experience (say in C,&lt;br /&gt;
C++, Python, or Matlab) is desired but not required.&lt;br /&gt;
&lt;br /&gt;
''Content'': &lt;br /&gt;
1. Computational Science, Engineering, and Mathematics&lt;br /&gt;
(a) Linear Algebra and Computational Science &amp;amp; Engineering&lt;br /&gt;
(b) Applied Math and Computational Science &amp;amp; Engineering&lt;br /&gt;
(c) Fourier Series and Integrals&lt;br /&gt;
(d) Laplace Transform and Spectral Methods&lt;br /&gt;
(e) Initial Value Problems&lt;br /&gt;
(f) Conjugate Gradients and Krylov Subspaces&lt;br /&gt;
(g) Minimum Principles&lt;br /&gt;
2. Data Science and Machine Learning: a Mathematical Perspective&lt;br /&gt;
(a) Principal Components and the Best Low Rank Matrix&lt;br /&gt;
(b) Randomized Linear Algebra&lt;br /&gt;
(c) Low Rank and Compressed Sensing&lt;br /&gt;
(d) Markov Chains&lt;br /&gt;
(e) Stochastic Gradient Descent and ADAM&lt;br /&gt;
(f) Introduction to Machine Learning: Neural Networks&lt;br /&gt;
3 Credit Hours &lt;br /&gt;
&lt;br /&gt;
'''Textbooks:'''&lt;br /&gt;
&lt;br /&gt;
* Strang, G. Computational Science &amp;amp; Engineering. USA, Wellesley-Cambridge, 2007.&lt;br /&gt;
* Strang, G. Linear Algebra and Learning from Data. Wellesley-Cambridge Press, 2019.&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Strang’s 4 special matrices&lt;br /&gt;
||&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Differences, Derivatives, BC. Gradient, Divergence. Laplace equation.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Inverses. Positive Definite Matrices&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Stiffness Matrices. Oscillations &amp;amp; Newton’s Laws.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Graph Models. Networks. Clustering and k-means.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Series. Chebyshev, Legendre, and Bessel&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fast Fourier Transform (FFT). Convolution and Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Integrals. Deconvolution, Integral Equations. Wavelets, Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*   &lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Computational implementation of Laplace and z-Transforms. Spectral Methods.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Finite Difference for ODEs. Accuracy &amp;amp; Stability. Conservation Laws, diffusion, fluids&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Elimination with reordering, multigrid methods, conjugate gradients, Krylov subspaces&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Regularized least squares. Linear programming. Adjoint methods. Stochastic Gradient Descent. ADAM.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Matrix-matrix Multiplication. 4 Fundamental Subspaces. Orthogonal Matrices.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Best low rank matrix. Rayleigh quotients. Factoring matrices and tensors.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Week 15&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Randomized Linear Algebra. Low rank signals. Singular values. Compressed sensing.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 16&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Covariance Matrices. Multivariate Gaussian. Weighted least squares. Markov chains.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 17&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Neural Networks (Convolutional, Deep). Backpropagation. Machine Learning.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT4JME&amp;diff=5434</id>
		<title>MAT4JME</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT4JME&amp;diff=5434"/>
		<updated>2025-09-05T16:06:30Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: Created page with &amp;quot;=Mathematical Physics 2 - MAT4JME/5JME=  ==Course description== The course intends to be a basic introduction to the mathematical and computational techniques in applied mathe...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Mathematical Physics 2 - MAT4JME/5JME=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
The course intends to be a basic introduction to the mathematical&lt;br /&gt;
and computational techniques in applied mathematics, computational science &amp;amp; engineering,&lt;br /&gt;
and data science &amp;amp; machine learning. This course will stress how the methods of&lt;br /&gt;
mathematical modeling in the STEM disciplines have transitioned from the analytical (as in&lt;br /&gt;
Theoretical Physics) to the numerical (as in traditional methods in Computational Science&lt;br /&gt;
and Engineering) and more recently to Data-based methods (as in current developments in&lt;br /&gt;
Data Science and Machine Learning). The student will acquire the basic skills needed broadly&lt;br /&gt;
in Computational Science and Engineering, of which Computational Physics, Data Science,&lt;br /&gt;
Machine Learning, and Numerical Modeling in the Mathematical Sciences are a subset.&lt;br /&gt;
&lt;br /&gt;
==Catalog entry==&lt;br /&gt;
MAT 4XX2: Computational Science and Engineering. (3-0) 3 Credit Hours.&lt;br /&gt;
&lt;br /&gt;
''Prerequisite'': MAT 2214 (Calculus III) &amp;amp; MAT 3613 (Differential Equations I) with a&lt;br /&gt;
letter grade of C- or better, or successful completion of at least three credits of equivalent&lt;br /&gt;
courses. Basic Linux command-line experience or prior programming experience (say in C,&lt;br /&gt;
C++, Python, or Matlab) is desired but not required.&lt;br /&gt;
&lt;br /&gt;
''Content'': &lt;br /&gt;
1. Computational Science, Engineering, and Mathematics&lt;br /&gt;
(a) Linear Algebra and Computational Science &amp;amp; Engineering&lt;br /&gt;
(b) Applied Math and Computational Science &amp;amp; Engineering&lt;br /&gt;
(c) Fourier Series and Integrals&lt;br /&gt;
(d) Laplace Transform and Spectral Methods&lt;br /&gt;
(e) Initial Value Problems&lt;br /&gt;
(f) Conjugate Gradients and Krylov Subspaces&lt;br /&gt;
(g) Minimum Principles&lt;br /&gt;
2. Data Science and Machine Learning: a Mathematical Perspective&lt;br /&gt;
(a) Principal Components and the Best Low Rank Matrix&lt;br /&gt;
(b) Randomized Linear Algebra&lt;br /&gt;
(c) Low Rank and Compressed Sensing&lt;br /&gt;
(d) Markov Chains&lt;br /&gt;
(e) Stochastic Gradient Descent and ADAM&lt;br /&gt;
(f) Introduction to Machine Learning: Neural Networks&lt;br /&gt;
3 Credit Hours &lt;br /&gt;
&lt;br /&gt;
'''Textbooks:'''&lt;br /&gt;
&lt;br /&gt;
* Strang, G. Computational Science &amp;amp; Engineering. USA, Wellesley-Cambridge, 2007.&lt;br /&gt;
* Strang, G. Linear Algebra and Learning from Data. Wellesley-Cambridge Press, 2019.&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Strang’s 4 special matrices&lt;br /&gt;
||&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Differences, Derivatives, BC. Gradient, Divergence. Laplace equation.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Inverses. Positive Definite Matrices&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Stiffness Matrices. Oscillations &amp;amp; Newton’s Laws.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Graph Models. Networks. Clustering and k-means.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Series. Chebyshev, Legendre, and Bessel&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fast Fourier Transform (FFT). Convolution and Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Integrals. Deconvolution, Integral Equations. Wavelets, Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*   &lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Computational implementation of Laplace and z-Transforms. Spectral Methods.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Finite Difference for ODEs. Accuracy &amp;amp; Stability. Conservation Laws, diffusion, fluids&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Elimination with reordering, multigrid methods, conjugate gradients, Krylov subspaces&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Regularized least squares. Linear programming. Adjoint methods. Stochastic Gradient Descent. ADAM.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Matrix-matrix Multiplication. 4 Fundamental Subspaces. Orthogonal Matrices.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Best low rank matrix. Rayleigh quotients. Factoring matrices and tensors.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Week 15&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Randomized Linear Algebra. Low rank signals. Singular values. Compressed sensing.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 16&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Covariance Matrices. Multivariate Gaussian. Weighted least squares. Markov chains.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 17&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Neural Networks (Convolutional, Deep). Backpropagation. Machine Learning.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT5XXX&amp;diff=5316</id>
		<title>MAT5XXX</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT5XXX&amp;diff=5316"/>
		<updated>2025-01-24T21:54:08Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: /* Mathematical Physics II - MAT4XXX/5XXX */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Introduction to Quantum Information Science and Engineering - MAT4XXX/5XXX=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
This course is an introduction accessible and welcoming to all STEM students. No prior quantum mechanics coursework is expected, since all the principles and techniques of quantum information will be taught during the course. The focus will be on qubits, entanglement, and decoherence, three key building blocks of quantum computing. &lt;br /&gt;
&lt;br /&gt;
==Catalog entry==&lt;br /&gt;
&lt;br /&gt;
''Prerequisite'': &lt;br /&gt;
Linear Algebra [[MAT2233]] or Applied Linear Algebra [[MAT2253]], or equivalent (can be waived with approval of instructor), with a letter grade of C- or better, or successful completion of at least three credits of equivalent courses. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
''Content'': &lt;br /&gt;
Foundations of quantum mechanics such as unitary time evolution, entanglement, and the EPR paradox approached from the information perspective, and quantum entropy. Information and its encoding into physical systems such as photons, atoms, and superconducting circuits.  &lt;br /&gt;
Quantum control using quantum logic gates, providing a foundation for quantum programming. &lt;br /&gt;
Applications: quantum teleportation, quantum cryptography, quantum computing.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Textbook:'''&lt;br /&gt;
&lt;br /&gt;
* Nielsen, M. and Chuang, I. Quantum Computation and Quantum Information. UK, Cambridge University Press, 2012.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
An overview of quantum computing and information.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Classical Information Theory. Connection between information and thermodynamics&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Communications Theory. Physical qubits: spinning particles and photons&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Operators in Quantum Mechanics. Classical cryptography&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Quantum cryptography. Entanglement.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Mixed states and the density operator. Local measurements &amp;amp; open quantum systems.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Quantum non-locality and the Einstein-Podolsky-Rosen paradox &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Bell’s inequality. Quantum dense coding. Quantum teleportation&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Quantum non-locality and the Einstein-Podolsky-Rosen paradox &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Quantum computation.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Von Neumann measurements &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Many-worlds interpretation of quantum mechanics.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Selected topic 1.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Presentations by students&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5315</id>
		<title>MAT4XXX</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5315"/>
		<updated>2025-01-24T21:50:18Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: /* Introduction to Quantum Information Science and Engineering - MAT4XXX/5XXX */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Introduction to Quantum Information Science and Engineering - MAT4XXX/5XXX=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
This course is an introduction accessible and welcoming to all STEM students. No prior quantum mechanics coursework is expected, since all the principles and techniques of quantum information will be taught during the course. The focus will be on qubits, entanglement, and decoherence, three key building blocks of quantum computing. &lt;br /&gt;
&lt;br /&gt;
==Catalog entry==&lt;br /&gt;
&lt;br /&gt;
''Prerequisite'': &lt;br /&gt;
Linear Algebra [[MAT2233]] or Applied Linear Algebra [[MAT2253]], or equivalent (can be waived with approval of instructor), with a letter grade of C- or better, or successful completion of at least three credits of equivalent courses. &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
''Content'': &lt;br /&gt;
Foundations of quantum mechanics such as unitary time evolution, entanglement, and the EPR paradox approached from the information perspective, and quantum entropy. Information and its encoding into physical systems such as photons, atoms, and superconducting circuits.  &lt;br /&gt;
Quantum control using quantum logic gates, providing a foundation for quantum programming. &lt;br /&gt;
Applications: quantum teleportation, quantum cryptography, quantum computing.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Textbook:'''&lt;br /&gt;
&lt;br /&gt;
* Nielsen, M. and Chuang, I. Quantum Computation and Quantum Information. UK, Cambridge University Press, 2012.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
An overview of quantum computing and information.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Classical Information Theory. Connection between information and thermodynamics&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Communications Theory. Physical qubits: spinning particles and photons&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Operators in Quantum Mechanics. Classical cryptography&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Quantum cryptography. Entanglement.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Mixed states and the density operator. Local measurements &amp;amp; open quantum systems.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Quantum non-locality and the Einstein-Podolsky-Rosen paradox &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Bell’s inequality. Quantum dense coding. Quantum teleportation&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Quantum non-locality and the Einstein-Podolsky-Rosen paradox &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Quantum computation.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Von Neumann measurements.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Many-worlds interpretation of quantum mechanics.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Selected topic 1.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Presentations by students.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5314</id>
		<title>MAT4XXX</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5314"/>
		<updated>2025-01-24T20:32:40Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: /* Mathematical Physics II - MAT4XXX/5XXX */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Introduction to Quantum Information Science and Engineering - MAT4XXX/5XXX=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
This course is an introduction designed to be accessible and welcoming to all STEM students. No prior quantum mechanics coursework is expected, since all the principles and techniques of quantum information are taught during the course. The focus is on qubits, entanglement, and decoherence, three key building blocks of quantum computing. Topics: foundations of quantum mechanics, including unitary time evolution, entanglement, the EPR paradox approached from the information perspective, and quantum entropy; information and its encoding into physical systems such as photons, atoms, and superconducting circuits; quantum control using quantum logic gates, providing a foundation for quantum programming. Applications: quantum teleportation, quantum cryptography, quantum computing. Pre-requisites: QST 6203 and QST &lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Catalog entry==&lt;br /&gt;
&lt;br /&gt;
''Prerequisite'': &lt;br /&gt;
Linear Algebra [[MAT2214]], Applied Linear Algebra [[MAT2214]], or Engineering Mathematics [[MAT3613]], with a letter grade of C- or better, or successful completion of at least three credits of equivalent courses.&lt;br /&gt;
&lt;br /&gt;
''Content'': &lt;br /&gt;
1. Computational Science, Engineering, and Mathematics&lt;br /&gt;
(a) Linear Algebra and Computational Science &amp;amp; Engineering&lt;br /&gt;
(b) Applied Math and Computational Science &amp;amp; Engineering&lt;br /&gt;
(c) Fourier Series and Integrals&lt;br /&gt;
(d) Laplace Transform and Spectral Methods&lt;br /&gt;
(e) Initial Value Problems&lt;br /&gt;
(f) Conjugate Gradients and Krylov Subspaces&lt;br /&gt;
(g) Minimum Principles&lt;br /&gt;
2. Data Science and Machine Learning: a Mathematical Perspective&lt;br /&gt;
(a) Principal Components and the Best Low Rank Matrix&lt;br /&gt;
(b) Randomized Linear Algebra&lt;br /&gt;
(c) Low Rank and Compressed Sensing&lt;br /&gt;
(d) Markov Chains&lt;br /&gt;
(e) Stochastic Gradient Descent and ADAM&lt;br /&gt;
(f) Introduction to Machine Learning: Neural Networks&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Textbooks:'''&lt;br /&gt;
&lt;br /&gt;
* Strang, G. Computational Science &amp;amp; Engineering. Wellesley-Cambridge Press, 2007.&lt;br /&gt;
* Strang, G. Linear Algebra and Learning from Data. Wellesley-Cambridge Press, 2019.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Strang's 4 special matrices &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Differences, Derivatives, BC. Gradient, Divergence. Laplace equation.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Inverses. Positive Definite Matrices&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Stiffness Matrices. Oscillations &amp;amp; Newton's Laws.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Graph Models. Networks. Clustering and k-means.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Series. Chebyshev, Legendre, and Bessel&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fast Fourier Transform (FFT). Convolution and Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Integrals. Deconvolution, Integral Equations. Wavelets, Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Computational implementation of Laplace and z-Transforms. Spectral Methods.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Finite Differences for ODEs. Accuracy &amp;amp; Stability. Conservation Laws, diffusion, fluids.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Elimination with reordering, multigrid methods, conjugate gradients, Krylov subspaces&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Regularized least squares. Linear programming. Adjoint methods. Stochastic Gradient Descent. ADAM.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Matrix-matrix Multiplication. 4 Fundamental Subspaces. Orthogonal Matrices. Best low rank matrix. Rayleigh quotients. Factoring matrices and tensors.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
 Randomized Linear Algebra. Low rank signals. Singular values. Compressed sensing. Covariance Matrices. Multivariate Gaussian. Weighted least squares. Markov chains. Neural Networks. Backpropagation. Machine Learning.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT5XXX&amp;diff=5313</id>
		<title>MAT5XXX</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT5XXX&amp;diff=5313"/>
		<updated>2025-01-24T20:28:48Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: Created page with &amp;quot;=Mathematical Physics II - MAT4XXX/5XXX=  ==Course description== The course intends to be a basic introduction to the mathematical and computational techniques in applied math...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Mathematical Physics II - MAT4XXX/5XXX=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
This course is intended as a basic introduction to the mathematical&lt;br /&gt;
and computational techniques of applied mathematics, computational science &amp;amp; engineering, and data science &amp;amp; machine learning. The course will stress how the methods of&lt;br /&gt;
mathematical modeling in the STEM disciplines have transitioned from the analytical (as in&lt;br /&gt;
Theoretical Physics) to the numerical (as in traditional methods in Computational Science&lt;br /&gt;
and Engineering) and more recently to Data-based methods (as in current developments in&lt;br /&gt;
Data Science and Machine Learning). The student will acquire the basic skills needed broadly&lt;br /&gt;
in Computational Science and Engineering, of which Computational Physics, Data Science,&lt;br /&gt;
Machine Learning, and Numerical Modeling in the Mathematical Sciences are a subset.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Catalog entry==&lt;br /&gt;
&lt;br /&gt;
''Prerequisite'': &lt;br /&gt;
Calculus III [[MAT2214]] and Differential Equations I [[MAT3613]] with a letter grade of C- or better, or successful completion of at least three credits of equivalent courses.&lt;br /&gt;
&lt;br /&gt;
''Content'': &lt;br /&gt;
1. Computational Science, Engineering, and Mathematics&lt;br /&gt;
(a) Linear Algebra and Computational Science &amp;amp; Engineering&lt;br /&gt;
(b) Applied Math and Computational Science &amp;amp; Engineering&lt;br /&gt;
(c) Fourier Series and Integrals&lt;br /&gt;
(d) Laplace Transform and Spectral Methods&lt;br /&gt;
(e) Initial Value Problems&lt;br /&gt;
(f) Conjugate Gradients and Krylov Subspaces&lt;br /&gt;
(g) Minimum Principles&lt;br /&gt;
2. Data Science and Machine Learning: a Mathematical Perspective&lt;br /&gt;
(a) Principal Components and the Best Low Rank Matrix&lt;br /&gt;
(b) Randomized Linear Algebra&lt;br /&gt;
(c) Low Rank and Compressed Sensing&lt;br /&gt;
(d) Markov Chains&lt;br /&gt;
(e) Stochastic Gradient Descent and ADAM&lt;br /&gt;
(f) Introduction to Machine Learning: Neural Networks&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Textbooks:'''&lt;br /&gt;
&lt;br /&gt;
* Strang, G. Computational Science &amp;amp; Engineering. Wellesley-Cambridge Press, 2007.&lt;br /&gt;
* Strang, G. Linear Algebra and Learning from Data. Wellesley-Cambridge Press, 2019.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Strang's 4 special matrices &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Differences, Derivatives, BC. Gradient, Divergence. Laplace equation.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Inverses. Positive Definite Matrices&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Stiffness Matrices. Oscillations &amp;amp; Newton's Laws.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Graph Models. Networks. Clustering and k-means.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Series. Chebyshev, Legendre, and Bessel&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fast Fourier Transform (FFT). Convolution and Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Integrals. Deconvolution, Integral Equations. Wavelets, Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Computational implementation of Laplace and z-Transforms. Spectral Methods.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Finite Differences for ODEs. Accuracy &amp;amp; Stability. Conservation Laws, diffusion, fluids.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Elimination with reordering, multigrid methods, conjugate gradients, Krylov subspaces&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Regularized least squares. Linear programming. Adjoint methods. Stochastic Gradient Descent. ADAM.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Matrix-matrix Multiplication. 4 Fundamental Subspaces. Orthogonal Matrices. Best low rank matrix. Rayleigh quotients. Factoring matrices and tensors.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
 Randomized Linear Algebra. Low rank signals. Singular values. Compressed sensing. Covariance Matrices. Multivariate Gaussian. Weighted least squares. Markov chains. Neural Networks. Backpropagation. Machine Learning.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5312</id>
		<title>MAT4XXX</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5312"/>
		<updated>2025-01-24T20:28:35Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: /* Introduction to Quantum Information Science and Engineering - MAT4XXX/5XXX */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Mathematical Physics II - MAT4XXX/5XXX=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
This course is intended as a basic introduction to the mathematical&lt;br /&gt;
and computational techniques of applied mathematics, computational science &amp;amp; engineering, and data science &amp;amp; machine learning. The course will stress how the methods of&lt;br /&gt;
mathematical modeling in the STEM disciplines have transitioned from the analytical (as in&lt;br /&gt;
Theoretical Physics) to the numerical (as in traditional methods in Computational Science&lt;br /&gt;
and Engineering) and more recently to Data-based methods (as in current developments in&lt;br /&gt;
Data Science and Machine Learning). The student will acquire the basic skills needed broadly&lt;br /&gt;
in Computational Science and Engineering, of which Computational Physics, Data Science,&lt;br /&gt;
Machine Learning, and Numerical Modeling in the Mathematical Sciences are a subset.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Catalog entry==&lt;br /&gt;
&lt;br /&gt;
''Prerequisite'': &lt;br /&gt;
Calculus III [[MAT2214]] and Differential Equations I [[MAT3613]] with a letter grade of C- or better, or successful completion of at least three credits of equivalent courses.&lt;br /&gt;
&lt;br /&gt;
''Content'': &lt;br /&gt;
1. Computational Science, Engineering, and Mathematics&lt;br /&gt;
(a) Linear Algebra and Computational Science &amp;amp; Engineering&lt;br /&gt;
(b) Applied Math and Computational Science &amp;amp; Engineering&lt;br /&gt;
(c) Fourier Series and Integrals&lt;br /&gt;
(d) Laplace Transform and Spectral Methods&lt;br /&gt;
(e) Initial Value Problems&lt;br /&gt;
(f) Conjugate Gradients and Krylov Subspaces&lt;br /&gt;
(g) Minimum Principles&lt;br /&gt;
2. Data Science and Machine Learning: a Mathematical Perspective&lt;br /&gt;
(a) Principal Components and the Best Low Rank Matrix&lt;br /&gt;
(b) Randomized Linear Algebra&lt;br /&gt;
(c) Low Rank and Compressed Sensing&lt;br /&gt;
(d) Markov Chains&lt;br /&gt;
(e) Stochastic Gradient Descent and ADAM&lt;br /&gt;
(f) Introduction to Machine Learning: Neural Networks&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Textbooks:'''&lt;br /&gt;
&lt;br /&gt;
* Strang, G. Computational Science &amp;amp; Engineering. Wellesley-Cambridge Press, 2007.&lt;br /&gt;
* Strang, G. Linear Algebra and Learning from Data. Wellesley-Cambridge Press, 2019.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Strang's 4 special matrices &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Differences, Derivatives, BC. Gradient, Divergence. Laplace equation.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Inverses. Positive Definite Matrices&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Stiffness Matrices. Oscillations &amp;amp; Newton's Laws.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Graph Models. Networks. Clustering and k-means.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Series. Chebyshev, Legendre, and Bessel&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fast Fourier Transform (FFT). Convolution and Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Integrals. Deconvolution, Integral Equations. Wavelets, Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Computational implementation of Laplace and z-Transforms. Spectral Methods.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Finite Differences for ODEs. Accuracy &amp;amp; Stability. Conservation Laws, diffusion, fluids.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Elimination with reordering, multigrid methods, conjugate gradients, Krylov subspaces&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Regularized least squares. Linear programming. Adjoint methods. Stochastic Gradient Descent. ADAM.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Matrix-matrix Multiplication. 4 Fundamental Subspaces. Orthogonal Matrices. Best low rank matrix. Rayleigh quotients. Factoring matrices and tensors.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
 Randomized Linear Algebra. Low rank signals. Singular values. Compressed sensing. Covariance Matrices. Multivariate Gaussian. Weighted least squares. Markov chains. Neural Networks. Backpropagation. Machine Learning.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5311</id>
		<title>MAT4XXX</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5311"/>
		<updated>2025-01-24T20:27:02Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: /* Mathematical Physics - MAT4XXX/5XXX */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Introduction to Quantum Information Science and Engineering - MAT4XXX/5XXX=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
This course is intended as a basic introduction to the mathematical&lt;br /&gt;
and computational techniques of applied mathematics, computational science &amp;amp; engineering, and data science &amp;amp; machine learning. The course will stress how the methods of&lt;br /&gt;
mathematical modeling in the STEM disciplines have transitioned from the analytical (as in&lt;br /&gt;
Theoretical Physics) to the numerical (as in traditional methods in Computational Science&lt;br /&gt;
and Engineering) and more recently to Data-based methods (as in current developments in&lt;br /&gt;
Data Science and Machine Learning). The student will acquire the basic skills needed broadly&lt;br /&gt;
in Computational Science and Engineering, of which Computational Physics, Data Science,&lt;br /&gt;
Machine Learning, and Numerical Modeling in the Mathematical Sciences are a subset.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Catalog entry==&lt;br /&gt;
&lt;br /&gt;
''Prerequisite'': &lt;br /&gt;
Calculus III [[MAT2214]] and Differential Equations I [[MAT3613]] with a letter grade of C- or better, or successful completion of at least three credits of equivalent courses.&lt;br /&gt;
&lt;br /&gt;
''Content'': &lt;br /&gt;
1. Computational Science, Engineering, and Mathematics&lt;br /&gt;
(a) Linear Algebra and Computational Science &amp;amp; Engineering&lt;br /&gt;
(b) Applied Math and Computational Science &amp;amp; Engineering&lt;br /&gt;
(c) Fourier Series and Integrals&lt;br /&gt;
(d) Laplace Transform and Spectral Methods&lt;br /&gt;
(e) Initial Value Problems&lt;br /&gt;
(f) Conjugate Gradients and Krylov Subspaces&lt;br /&gt;
(g) Minimum Principles&lt;br /&gt;
2. Data Science and Machine Learning: a Mathematical Perspective&lt;br /&gt;
(a) Principal Components and the Best Low Rank Matrix&lt;br /&gt;
(b) Randomized Linear Algebra&lt;br /&gt;
(c) Low Rank and Compressed Sensing&lt;br /&gt;
(d) Markov Chains&lt;br /&gt;
(e) Stochastic Gradient Descent and ADAM&lt;br /&gt;
(f) Introduction to Machine Learning: Neural Networks&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Textbooks:'''&lt;br /&gt;
&lt;br /&gt;
* Strang, G. Computational Science &amp;amp; Engineering. Wellesley-Cambridge Press, 2007.&lt;br /&gt;
* Strang, G. Linear Algebra and Learning from Data. Wellesley-Cambridge Press, 2019.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Strang's 4 special matrices &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Differences, Derivatives, BC. Gradient, Divergence. Laplace equation.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Inverses. Positive Definite Matrices&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Stiffness Matrices. Oscillations &amp;amp; Newton's Laws.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Graph Models. Networks. Clustering and k-means.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Series. Chebyshev, Legendre, and Bessel&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fast Fourier Transform (FFT). Convolution and Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Integrals. Deconvolution, Integral Equations. Wavelets, Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Computational implementation of Laplace and z-Transforms. Spectral Methods.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Finite Differences for ODEs. Accuracy &amp;amp; Stability. Conservation Laws, diffusion, fluids.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Elimination with reordering, multigrid methods, conjugate gradients, Krylov subspaces&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Regularized least squares. Linear programming. Adjoint methods. Stochastic Gradient Descent. ADAM.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Matrix-matrix Multiplication. 4 Fundamental Subspaces. Orthogonal Matrices. Best low rank matrix. Rayleigh quotients. Factoring matrices and tensors.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
 Randomized Linear Algebra. Low rank signals. Singular values. Compressed sensing. Covariance Matrices. Multivariate Gaussian. Weighted least squares. Markov chains. Neural Networks. Backpropagation. Machine Learning.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5310</id>
		<title>MAT4XXX</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5310"/>
		<updated>2025-01-24T20:17:42Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: /* Topics List */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Mathematical Physics - MAT4XXX/5XXX=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
This course is intended as a basic introduction to the mathematical&lt;br /&gt;
and computational techniques of applied mathematics, computational science &amp;amp; engineering, and data science &amp;amp; machine learning. The course will stress how the methods of&lt;br /&gt;
mathematical modeling in the STEM disciplines have transitioned from the analytical (as in&lt;br /&gt;
Theoretical Physics) to the numerical (as in traditional methods in Computational Science&lt;br /&gt;
and Engineering) and more recently to Data-based methods (as in current developments in&lt;br /&gt;
Data Science and Machine Learning). The student will acquire the basic skills needed broadly&lt;br /&gt;
in Computational Science and Engineering, of which Computational Physics, Data Science,&lt;br /&gt;
Machine Learning, and Numerical Modeling in the Mathematical Sciences are a subset.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Catalog entry==&lt;br /&gt;
&lt;br /&gt;
''Prerequisite'': &lt;br /&gt;
Calculus III [[MAT2214]] and Differential Equations I [[MAT3613]] with a letter grade of C- or better, or successful completion of at least three credits of equivalent courses.&lt;br /&gt;
&lt;br /&gt;
''Content'': &lt;br /&gt;
1. Computational Science, Engineering, and Mathematics&lt;br /&gt;
(a) Linear Algebra and Computational Science &amp;amp; Engineering&lt;br /&gt;
(b) Applied Math and Computational Science &amp;amp; Engineering&lt;br /&gt;
(c) Fourier Series and Integrals&lt;br /&gt;
(d) Laplace Transform and Spectral Methods&lt;br /&gt;
(e) Initial Value Problems&lt;br /&gt;
(f) Conjugate Gradients and Krylov Subspaces&lt;br /&gt;
(g) Minimum Principles&lt;br /&gt;
2. Data Science and Machine Learning: a Mathematical Perspective&lt;br /&gt;
(a) Principal Components and the Best Low Rank Matrix&lt;br /&gt;
(b) Randomized Linear Algebra&lt;br /&gt;
(c) Low Rank and Compressed Sensing&lt;br /&gt;
(d) Markov Chains&lt;br /&gt;
(e) Stochastic Gradient Descent and ADAM&lt;br /&gt;
(f) Introduction to Machine Learning: Neural Networks&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Textbooks:'''&lt;br /&gt;
&lt;br /&gt;
* Strang, G. Computational Science &amp;amp; Engineering. Wellesley-Cambridge Press, 2007.&lt;br /&gt;
* Strang, G. Linear Algebra and Learning from Data. Wellesley-Cambridge Press, 2019.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Strang's 4 special matrices &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Differences, Derivatives, BC. Gradient, Divergence. Laplace equation.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Inverses. Positive Definite Matrices&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Stiffness Matrices. Oscillations &amp;amp; Newton's Laws.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Graph Models. Networks. Clustering and k-means.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Series. Chebyshev, Legendre, and Bessel&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fast Fourier Transform (FFT). Convolution and Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Integrals. Deconvolution, Integral Equations. Wavelets, Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Computational implementation of Laplace and z-Transforms. Spectral Methods.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Finite Differences for ODEs. Accuracy &amp;amp; Stability. Conservation Laws, diffusion, fluids.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Elimination with reordering, multigrid methods, conjugate gradients, Krylov subspaces&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Regularized least squares. Linear programming. Adjoint methods. Stochastic Gradient Descent. ADAM.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Matrix-matrix Multiplication. 4 Fundamental Subspaces. Orthogonal Matrices. Best low rank matrix. Rayleigh quotients. Factoring matrices and tensors.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
 Randomized Linear Algebra. Low rank signals. Singular values. Compressed sensing. Covariance Matrices. Multivariate Gaussian. Weighted least squares. Markov chains. Neural Networks. Backpropagation. Machine Learning.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5309</id>
		<title>MAT4XXX</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5309"/>
		<updated>2025-01-24T20:16:35Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: /* Topics List */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Mathematical Physics - MAT4XXX/5XXX=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
The course intends to be a basic introduction to the mathematical&lt;br /&gt;
and computational techniques in applied mathematics, computational science &amp;amp; engineering, and data science &amp;amp; machine learning. This course will then stress how the methods of&lt;br /&gt;
mathematical modeling in the STEM disciplines have transitioned from the analytical (as in&lt;br /&gt;
Theoretical Physics) to the numerical (as in traditional methods in Computational Science&lt;br /&gt;
and Engineering) and more recently to Data-based methods (as in current developments in&lt;br /&gt;
Data Science and Machine Learning). The student will acquire the basic skills needed broadly&lt;br /&gt;
in Computational Science and Engineering, of which Computational Physics, Data Science,&lt;br /&gt;
Machine Learning, and Numerical Modeling in the Mathematical Sciences are a subset.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Catalog entry==&lt;br /&gt;
&lt;br /&gt;
''Prerequisite'': &lt;br /&gt;
Calculus III [[MAT2214]] and Differential Equations I [[MAT3613]] with a letter grade of C- or better, or successful completion of at least three credits of equivalent courses.&lt;br /&gt;
&lt;br /&gt;
''Content'': &lt;br /&gt;
1. Computational Science, Engineering, and Mathematics&lt;br /&gt;
(a) Linear Algebra and Computational Science &amp;amp; Engineering&lt;br /&gt;
(b) Applied Math and Computational Science &amp;amp; Engineering&lt;br /&gt;
(c) Fourier Series and Integrals&lt;br /&gt;
(d) Laplace Transform and Spectral Methods&lt;br /&gt;
(e) Initial Value Problems&lt;br /&gt;
(f) Conjugate Gradients and Krylov Subspaces&lt;br /&gt;
(g) Minimum Principles&lt;br /&gt;
2. Data Science and Machine Learning: a Mathematical Perspective&lt;br /&gt;
(a) Principal Components and the Best Low Rank Matrix&lt;br /&gt;
(b) Randomized Linear Algebra&lt;br /&gt;
(c) Low Rank and Compressed Sensing&lt;br /&gt;
(d) Markov Chains&lt;br /&gt;
(e) Stochastic Gradient Descent and ADAM&lt;br /&gt;
(f) Introduction to Machine Learning: Neural Networks&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Textbooks:'''&lt;br /&gt;
&lt;br /&gt;
* Strang, G. Computational Science &amp;amp; Engineering. USA, Wellesley-Cambridge, 2007.&lt;br /&gt;
* Strang, G. Linear Algebra and Learning from Data. Wellesley-Cambridge Press, 2019.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Strang's 4 special matrices &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Differences, Derivatives, BC. Gradient, Divergence. Laplace equation.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Inverses. Positive Definite Matrices&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Stiffness Matrices. Oscillations &amp;amp; Newton's Laws.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Graph Models. Networks. Clustering and k-means.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Series. Chebyshev, Legendre, and Bessel&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fast Fourier Transform (FFT). Convolution and Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Integrals. Deconvolution, Integral Equations. Wavelets, Signal Processing.&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Computational implementation of Laplace and z-Transforms. Spectral Methods. &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Finite Difference Methods for ODEs. Accuracy &amp;amp; Stability. Conservation Laws, Diffusion, Fluids&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Elimination with reordering, multigrid methods, conjugate gradients, Krylov subspaces&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Regularized least squares. Linear programming. Adjoint methods. Stochastic Gradient Descent. ADAM.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Matrix-matrix Multiplication. 4 Fundamental Subspaces. Orthogonal Matrices. Best low rank matrix. Rayleigh quotients. Factoring matrices and tensors.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
 Randomized Linear Algebra. Low rank signals. Singular values. Compressed sensing. Covariance Matrices. Multivariate Gaussian. Weighted least squares. Markov chains. Neural Networks (Convolutional, Deep). Backpropagation. Machine Learning.&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5308</id>
		<title>MAT4XXX</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT4XXX&amp;diff=5308"/>
		<updated>2025-01-24T18:45:02Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: Created page with &amp;quot;=Mathematical Physics - MAT4XXX/5XXX=  ==Course description== The course intends to be a basic introduction to the mathematical and computational techniques in applied mathema...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Mathematical Physics - MAT4XXX/5XXX=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
The course intends to be a basic introduction to the mathematical&lt;br /&gt;
and computational techniques in applied mathematics, computational science &amp;amp; engineering, and data science &amp;amp; machine learning. This course will then stress how the methods of&lt;br /&gt;
mathematical modeling in the STEM disciplines have transitioned from the analytical (as in&lt;br /&gt;
Theoretical Physics) to the numerical (as in traditional methods in Computational Science&lt;br /&gt;
and Engineering) and more recently to Data-based methods (as in current developments in&lt;br /&gt;
Data Science and Machine Learning). The student will acquire the basic skills needed broadly&lt;br /&gt;
in Computational Science and Engineering, of which Computational Physics, Data Science,&lt;br /&gt;
Machine Learning, and Numerical Modeling in the Mathematical Sciences are a subset.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Catalog entry==&lt;br /&gt;
&lt;br /&gt;
''Prerequisite'': &lt;br /&gt;
Calculus III [[MAT2214]] and Differential Equations I [[MAT3613]] with a letter grade of C- or better, or successful completion of at least three credits of equivalent courses.&lt;br /&gt;
&lt;br /&gt;
''Content'': &lt;br /&gt;
1. Computational Science, Engineering, and Mathematics&lt;br /&gt;
(a) Linear Algebra and Computational Science &amp;amp; Engineering&lt;br /&gt;
(b) Applied Math and Computational Science &amp;amp; Engineering&lt;br /&gt;
(c) Fourier Series and Integrals&lt;br /&gt;
(d) Laplace Transform and Spectral Methods&lt;br /&gt;
(e) Initial Value Problems&lt;br /&gt;
(f) Conjugate Gradients and Krylov Subspaces&lt;br /&gt;
(g) Minimum Principles&lt;br /&gt;
2. Data Science and Machine Learning: a Mathematical Perspective&lt;br /&gt;
(a) Principal Components and the Best Low Rank Matrix&lt;br /&gt;
(b) Randomized Linear Algebra&lt;br /&gt;
(c) Low Rank and Compressed Sensing&lt;br /&gt;
(d) Markov Chains&lt;br /&gt;
(e) Stochastic Gradient Descent and ADAM&lt;br /&gt;
(f) Introduction to Machine Learning: Neural Networks&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Textbooks:'''&lt;br /&gt;
&lt;br /&gt;
* Strang, G. Computational Science &amp;amp; Engineering. USA, Wellesley-Cambridge, 2007.&lt;br /&gt;
* Strang, G. Linear Algebra and Learning from Data. Wellesley-Cambridge Press, 2019.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Strang's 4 special matrices (Part 1)&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Strang's 4 special matrices (continued)&lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Differences, Derivatives, BC. &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
*Gradient, Divergence. Laplace equation.&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Inverses. &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Positive Definite Matrices&lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Stiffness Matrices. &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Oscillations &amp;amp; Newton's Laws.&lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Graph Models. Networks. &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Clustering and k-means.&lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Series. &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Chebyshev, Legendre, and Bessel&lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fast Fourier Transform (FFT). &lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* Convolution and Signal Processing.&lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Fourier Integrals. Deconvolution, Integral Equations. &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Wavelets, Signal Processing.&lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Computational implementation of Laplace and z-Transforms. &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Spectral Methods. &lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Finite Difference Methods for ODEs. &lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Accuracy &amp;amp; Stability. Conservation Laws, Diffusion, Fluids&lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Elimination with reordering, multigrid methods, conjugate gradients&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Krylov subspaces&lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Regularized least squares. Linear programming. &lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* Adjoint methods. Stochastic Gradient Descent. ADAM.&lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Matrix-matrix Multiplication. 4 Fundamental Subspaces. Orthogonal Matrices.&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
*  Best low rank matrix. Rayleigh quotients. Factoring matrices and tensors.&lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* Randomized Linear Algebra. Low rank signals. Singular values. Compressed sensing.&lt;br /&gt;
||&lt;br /&gt;
Covariance Matrices. Multivariate Gaussian. Weighted least squares. Markov chains.&lt;br /&gt;
||&lt;br /&gt;
*  Neural Networks (Convolutional, Deep).&lt;br /&gt;
||&lt;br /&gt;
*  Backpropagation. Machine Learning.&lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT5143&amp;diff=5307</id>
		<title>MAT5143</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT5143&amp;diff=5307"/>
		<updated>2025-01-24T18:26:37Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: /* Course description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Mathematical Physics - MAT4143/5153=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
Tentative topics list for Mathematical Physics. This course is aimed at physics majors who wish to deepen their understanding of the mathematical methods used in physics. It is also suitable for mathematics majors, who can deepen their understanding of applications and proofs of major theorems in Functional Analysis.&lt;br /&gt;
&lt;br /&gt;
==Catalog entry==&lt;br /&gt;
&lt;br /&gt;
''Prerequisite'': Calculus III [[MAT2214]] and Differential Equations II [[MAT3623]] with a letter grade of C- or better, or successful completion of at least three credits of equivalent courses.&lt;br /&gt;
&lt;br /&gt;
''Content'': &lt;br /&gt;
1. Topics in Complex Analysis.&lt;br /&gt;
2. Differential Equations: Dynamical systems, nonlinearity &amp;amp; chaos.&lt;br /&gt;
3. Nonlinear Waves in PDEs: Continuous Systems, Hamiltonian formulation of plasmas&lt;br /&gt;
and fluids, KdV equation, nonlinear Schrödinger equation, sine-Gordon and Klein-Gordon equations.&lt;br /&gt;
4. Asymptotic Analysis methods, time-dependent/independent perturbation theory.&lt;br /&gt;
5. Functional Analysis in Mathematical Physics&lt;br /&gt;
6. Mathematical Formalism of PDEs for Physicists&lt;br /&gt;
7. Group Theory in Physics, Lie Algebras.&lt;br /&gt;
8. Tensor calculus: theory and applications. 3 Credit Hours.&lt;br /&gt;
&lt;br /&gt;
'''Textbooks:'''&lt;br /&gt;
&lt;br /&gt;
*  Needham, T. (1997). Visual Complex Analysis. United Kingdom: Clarendon Press.&lt;br /&gt;
* Grimshaw, R. (1993). Nonlinear Ordinary Differential Equations (1st ed.). Routledge.&lt;br /&gt;
* R. Courant and D. Hilbert, Methods of Mathematical Physics. Vol. II: Partial Differential Equations, Wiley Classics Library, John Wiley &amp;amp; Sons Inc., New York, 1989.&lt;br /&gt;
* Bender, C. and Orszag, S. Advanced Mathematical Methods for Scientists and Engineers. Mc-Graw Hill.&lt;br /&gt;
* Kreyszig, E. (1989). Introductory Functional Analysis with Applications. Wiley.&lt;br /&gt;
* Methods of Applied Mathematics. Todd Arbogast and Jerry L. Bona. Department of Mathematics, and Institute for Computational Engineering and Sciences, University of Texas at Austin, 2008.&lt;br /&gt;
* P. J. Olver, Applications of Lie Groups to Differential Equations, Second Edition, Graduate Texts in Mathematics, vol. 107, Springer-Verlag, New York, 1993.&lt;br /&gt;
* Rutherford Aris, Vectors, Tensors and the Basic Equations of Fluid Mechanics. Dover Publications&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Complex Analysis Part I&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Definition and algebraic properties of complex numbers, Riemann Sphere, Holomorphic functions and conformal mappings&lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Complex Analysis Part II&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Integrals in the Complex Plane, Cauchy's theorem, Calculus of Residues&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Complex Analysis Part III&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Harmonic functions and Poisson's formula&lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Tensor Calculus Basics I&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Using indices in three-dimensional Cartesian vector analysis; deriving vector identities using index calculus; divergence, gradient, and curl in index notation; the divergence and Stokes' theorems.&lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Tensor Calculus Basics II&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Manifolds and coordinate transformations, vector fields, Riemannian geometry, covariant derivatives and Christoffel symbols&lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Applied Functional Analysis Part I&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Hilbert spaces and inner products, orthogonality and completeness.&lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Applied Functional Analysis Part II&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* Operators in Hilbert spaces, eigenvalue problem, self-adjointness and spectral properties &lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Applied Functional Analysis Part III&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Examples of Hilbert spaces in quantum mechanics, standard examples such as potential wells and harmonic oscillator  &lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Overview of ordinary differential equations I&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Systems of nonlinear/linear equations, basic existence and uniqueness theorems   &lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Overview of ordinary differential equations II&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Phase-plane, linearization, stability, chaos&lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
PDEs of Mathematical Physics&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Standard examples, qualitative properties, conservation laws &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Introduction to Lie Groups and Symmetries I&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* Definition of a Lie group and examples, commutators and Lie brackets, Lie algebras&lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Introduction to Lie Groups and Symmetries II&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* Exponential maps, applications of Lie groups to differential equations, Noether's theorem&lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
KdV equation, completely integrable systems&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* Soliton solutions, infinite hierarchy of conservation laws  &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=Main_Page&amp;diff=5301</id>
		<title>Main Page</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=Main_Page&amp;diff=5301"/>
		<updated>2025-01-23T23:05:36Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: /* Upper Division */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;&amp;lt;strong&amp;gt;UTSA Department of Mathematics&amp;lt;/strong&amp;gt;&lt;br /&gt;
&lt;br /&gt;
To edit tables in each course below, you can use the [https://tableconvert.com/mediawiki-to-excel MediaWiki-to-Excel converter] and/or the [https://tableconvert.com/excel-to-mediawiki Excel-to-MediaWiki converter].&lt;br /&gt;
&lt;br /&gt;
== Undergraduate Studies ==&lt;br /&gt;
===STEM Core===&lt;br /&gt;
* [[MAT1073]] College Algebra for Scientists and Engineers &lt;br /&gt;
* [[MAT1093]] Precalculus &lt;br /&gt;
* [[MAT1193]] Calculus for Biosciences &lt;br /&gt;
* [[MAT1214]] Calculus I (4 credit hours)&lt;br /&gt;
* [[MAT1213]] Calculus I (3 credit hours)&lt;br /&gt;
* [[MAT1224]] Calculus II (4 credit hours) &lt;br /&gt;
* [[MAT1223]] Calculus II (3 credit hours)&lt;br /&gt;
* [[MAT2214]] Calculus III (4 credit hours) &lt;br /&gt;
* [[MAT2213]] Calculus III (3 credit hours)&lt;br /&gt;
* [[MAT2233]] Linear Algebra &lt;br /&gt;
&lt;br /&gt;
===Minor in Mathematics===&lt;br /&gt;
To receive a minor, students must complete at least 18 semester credit hours, including 6 hours at the upper-division level at UTSA, and must achieve a grade point average of at least 2.0 (on a 4.0 scale) on all work used to satisfy the requirements of a minor. See [https://catalog.utsa.edu/undergraduate/bachelorsdegreeregulations/minors/  UTSA's Undergraduate Catalog]&lt;br /&gt;
&lt;br /&gt;
The Minor in Mathematics requires the Calculus series plus linear algebra, and upper-division courses in either the Math Major or the Data &amp;amp; Applied Science Core.&lt;br /&gt;
&lt;br /&gt;
===Data &amp;amp; Applied Science Core===&lt;br /&gt;
==== Lower Division ====&lt;br /&gt;
* [[MDC1213]] Sociocultural Foundations of Mathematics, Data Science, and Computing&lt;br /&gt;
* [[MAT2253]] Applied Linear Algebra (3 credit hours)&lt;br /&gt;
&lt;br /&gt;
==== Upper Division ====&lt;br /&gt;
* [[MAT4133]]/[[MAT5133]] Mathematical Biology&lt;br /&gt;
* [[MAT4143]]/[[MAT5143]] Mathematical Physics 1&lt;br /&gt;
* [[MAT4XXX]]/[[MAT5XXX]] Mathematical Physics 2&lt;br /&gt;
* [[MAT4XXX]]/[[MAT5XXX]] Introduction to Quantum Information Science and Engineering&lt;br /&gt;
* [[MAT4373]]/[[MAT5373]] Mathematical Foundations of Statistics I (discrete &amp;amp; continuous PDFs)&lt;br /&gt;
* [[MAT4383]]/[[MAT5383]] Mathematical Foundations of Statistics II (statistical inference)&lt;br /&gt;
* [[MDC4153]]/[[MDC5153]] Mathematical Foundations of Data Analytics&lt;br /&gt;
&lt;br /&gt;
===Math Major===&lt;br /&gt;
==== Lower Division ====&lt;br /&gt;
* [[MAT1313]] Algebra and Number Systems &lt;br /&gt;
* [[MAT2313]] Combinatorics and Probability&lt;br /&gt;
&lt;br /&gt;
==== Upper Division ====&lt;br /&gt;
* [[MAT3003]] Discrete Mathematics &lt;br /&gt;
* &amp;lt;del&amp;gt;[[MAT3013]] Foundations of Mathematics&amp;lt;/del&amp;gt; Course being phased out, to be replaced by [[MAT3003]] Discrete Mathematics (above).&lt;br /&gt;
* [[MAT3203]] Abstract Linear Algebra&lt;br /&gt;
* &amp;lt;del&amp;gt;[[MAT3213]] Foundations of Analysis&amp;lt;/del&amp;gt; Course being phased out, to be replaced by [[MAT3333]] Fundamentals of Analysis and Topology (below).&lt;br /&gt;
* [[MAT3333]] Fundamentals of Analysis and Topology&lt;br /&gt;
* [[MAT3233]] Modern Algebra&lt;br /&gt;
* [[MAT3313]] Logic and Computability&lt;br /&gt;
* [[MAT3613]] Differential Equations I &lt;br /&gt;
* [[MAT3623]] Differential Equations II &lt;br /&gt;
* [[MAT3633]] Numerical Analysis &lt;br /&gt;
* [[MAT3223]] Complex Variables &lt;br /&gt;
* [[MAT4033]] Abstract Linear Algebra II&lt;br /&gt;
* [[MAT4213]] Real Analysis I &lt;br /&gt;
* [[MAT4223]] Real Analysis II &lt;br /&gt;
* [[MAT4233]] Modern Abstract Algebra&lt;br /&gt;
* [[MAT4273]] Topology&lt;br /&gt;
* [[MAT4283]] Computing for Mathematics&lt;br /&gt;
* [[MAT4323]] Applied Graph Theory&lt;br /&gt;
* [[MAT4373]] Mathematical Statistics I&lt;br /&gt;
&lt;br /&gt;
===Business===&lt;br /&gt;
* [[MAT1053]] Algebra for Business &lt;br /&gt;
* [[MAT1133]] Calculus for Business &lt;br /&gt;
&lt;br /&gt;
===Math for Liberal Arts===&lt;br /&gt;
* [[MAT1043]] Introduction to Mathematics &lt;br /&gt;
&lt;br /&gt;
=== Elementary Education ===&lt;br /&gt;
* [[MAT1023]] College Algebra &lt;br /&gt;
* [[MAT1153]] Essential Elements in Mathematics I &lt;br /&gt;
* [[MAT1163]] Essential Elements in Mathematics II &lt;br /&gt;
&lt;br /&gt;
=== General Math Studies===&lt;br /&gt;
* [[MAT3233]] Modern Algebra&lt;br /&gt;
&lt;br /&gt;
== Graduate Studies ==&lt;br /&gt;
=== Core M.Sc. Studies ===&lt;br /&gt;
Core courses, common across all M.Sc. tracks, must make up at least 50% of the 30 credit hours needed to obtain an M.Sc. degree. The following courses add up to 15 credit hours. &lt;br /&gt;
* Two courses in the Analysis &amp;amp; Algebra sequences in the following combinations: &lt;br /&gt;
** [[MAT5173]] Algebra I &amp;amp; [[MAT5183]] Algebra II.&lt;br /&gt;
** [[MAT5173]] Algebra I &amp;amp; [[MAT5243]] General Topology I.&lt;br /&gt;
** [[MAT5243]] General Topology I &amp;amp; [[MAT5253]] General Topology II.&lt;br /&gt;
** [[MAT5203]] Analysis I &amp;amp; [[MAT5213]] Analysis II.&lt;br /&gt;
** [[MAT5173]] Algebra I &amp;amp; [[MAT5203]] Analysis I.&lt;br /&gt;
** [[MAT5173]] Algebra I &amp;amp; [[MAT5123]] Cryptography I.&lt;br /&gt;
** [[MAT5123]] Cryptography I &amp;amp; [[MAT5323]] Cryptography II.&lt;br /&gt;
* Two courses in discrete mathematics among the following:&lt;br /&gt;
** [[MAT5423]] Discrete Mathematics I&lt;br /&gt;
** [[MAT5433]] Discrete Mathematics II&lt;br /&gt;
* One course in computation among the following:&lt;br /&gt;
** [[MAT5373]] Mathematical Statistics I&lt;br /&gt;
** [[MDC5153]] Data Analytics&lt;br /&gt;
* [[MAT5283]] Linear Algebra&lt;br /&gt;
&lt;br /&gt;
=== Qualifying Examination Tracks  ===&lt;br /&gt;
* [[MAT5183]] Algebra II (Pure, Applied tracks)&lt;br /&gt;
* [[MAT5123]] Cryptography I (Pure, Applied tracks)&lt;br /&gt;
* [[MAT5323]] Cryptography II (Pure, Applied tracks)&lt;br /&gt;
* [[MAT5213]] Analysis II (Pure track)&lt;br /&gt;
* [[MAT5113]] Computing for Mathematics   (Pure, Applied tracks)&lt;br /&gt;
* [[MAT5433]] Discrete Mathematics II   (Pure, Applied tracks)&lt;br /&gt;
* [[MAT5383]] Mathematical Statistics II  (Applied tracks)&lt;br /&gt;
&lt;br /&gt;
=== M.Sc. Track in Pure Mathematics  ===&lt;br /&gt;
* [[MAT5443]] Logic and Computability&lt;br /&gt;
* [[MAT5243]] General Topology I&lt;br /&gt;
* [[MAT5253]] General Topology II&lt;br /&gt;
* [[MAT5323]] Cryptography II&lt;br /&gt;
* [[MAT5183]] Algebra II&lt;br /&gt;
* [[MAT5223]] Theory of Functions of a Complex Variable&lt;br /&gt;
* [[MAT5343]] Differential Geometry&lt;br /&gt;
&lt;br /&gt;
=== M.Sc. Track in Applied &amp;amp; Industrial Mathematics ===&lt;br /&gt;
* [[MDC5153]] Data Analytics&lt;br /&gt;
* [[AIM 5113]] Introduction to Industrial Mathematics&lt;br /&gt;
* [[MAT 5113]] Computing for Mathematics&lt;br /&gt;
* [[MAT 5653]] Differential Equations I&lt;br /&gt;
* [[MAT 5673]] Partial Differential Equations&lt;br /&gt;
&lt;br /&gt;
=== M.Sc. in Mathematics Education ===&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
	<entry>
		<id>https://mathresearch.utsa.edu/wiki/index.php?title=MAT5143&amp;diff=5296</id>
		<title>MAT5143</title>
		<link rel="alternate" type="text/html" href="https://mathresearch.utsa.edu/wiki/index.php?title=MAT5143&amp;diff=5296"/>
		<updated>2025-01-21T22:34:13Z</updated>

		<summary type="html">&lt;p&gt;Jose.morales: /* Course description */&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;=Mathematical Physics - MAT4143/5153=&lt;br /&gt;
&lt;br /&gt;
==Course description==&lt;br /&gt;
Tentative topics list for Mathematical Physics. This course is aimed at physics majors who wish to deepen their understanding of the mathematical methods used in physics. It is also suitable for mathematics majors, who can deepen their understanding of applications and proofs of major theorems in Functional Analysis.&lt;br /&gt;
&lt;br /&gt;
CATALOG ENTRY:&lt;br /&gt;
MAT 4XX1: Mathematical Physics I. (3-0) 3 Credit Hours.&lt;br /&gt;
Prerequisites: MAT 2214 (Calculus III) and MAT 3623 (Differential Equations II) with a letter grade of C- or better, or successful completion of at least three credits of equivalent courses.&lt;br /&gt;
&lt;br /&gt;
&lt;br /&gt;
'''Textbooks:'''&lt;br /&gt;
&lt;br /&gt;
*  Needham, T. (1997). Visual Complex Analysis. United Kingdom: Clarendon Press.&lt;br /&gt;
* Grimshaw, R. (1993). Nonlinear Ordinary Differential Equations (1st ed.). Routledge.&lt;br /&gt;
* R. Courant and D. Hilbert, Methods of Mathematical Physics. Vol. II: Partial Differential Equations, Wiley Classics Library, John Wiley &amp;amp; Sons Inc., New York, 1989.&lt;br /&gt;
* Bender, C. and Orszag, S. Advanced Mathematical Methods for Scientists and Engineers. Mc-Graw Hill.&lt;br /&gt;
* Kreyszig, E. (1989). Introductory Functional Analysis with Applications. Wiley.&lt;br /&gt;
* Methods of Applied Mathematics. Todd Arbogast and Jerry L. Bona. Department of Mathematics, and Institute for Computational Engineering and Sciences, University of Texas at Austin, 2008.&lt;br /&gt;
* P. J. Olver, Applications of Lie Groups to Differential Equations, Second Edition, Graduate Texts in Mathematics, vol. 107, Springer-Verlag, New York, 1993.&lt;br /&gt;
* Rutherford Aris, Vectors, Tensors and the Basic Equations of Fluid Mechanics. Dover Publications&lt;br /&gt;
&lt;br /&gt;
==Topics List==&lt;br /&gt;
{| class=&amp;quot;wikitable sortable&amp;quot;&lt;br /&gt;
! Date !! Sections !! Topics !! Prerequisite Skills !! Student Learning Outcomes&lt;br /&gt;
|-&lt;br /&gt;
|Week 1&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Complex Analysis Part I&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Definition and algebraic properties of complex numbers, Riemann Sphere, Holomorphic functions and conformal mappings&lt;br /&gt;
|-&lt;br /&gt;
|Week 2&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Complex Analysis Part II&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Integrals in the Complex Plane, Cauchy's theorem, Calculus of Residues&lt;br /&gt;
|-&lt;br /&gt;
|Week 3&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Complex Analysis Part III&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Harmonic functions and Poisson's formula&lt;br /&gt;
|-&lt;br /&gt;
|Week 4&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Tensor Calculus Basics I&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Using indices in three-dimensional Cartesian vector analysis; deriving vector identities with index calculus; divergence, gradient, and curl in index notation; the divergence theorem and Stokes' theorem.&lt;br /&gt;
|-&lt;br /&gt;
|Week 5&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Tensor Calculus Basics II&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Manifolds and coordinate transformations, vector fields, Riemannian geometry, covariant derivatives and Christoffel symbols&lt;br /&gt;
|-&lt;br /&gt;
|Week 6&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Applied Functional Analysis Part I&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Hilbert spaces and inner products, orthogonality and completeness.&lt;br /&gt;
|-&lt;br /&gt;
|Week 7&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Applied Functional Analysis Part II&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* Operators in Hilbert spaces, eigenvalue problem, self-adjointness and spectral properties &lt;br /&gt;
|-&lt;br /&gt;
|Week 8&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Applied Functional Analysis Part III&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Hilbert spaces in quantum mechanics, with standard examples such as potential wells and the harmonic oscillator&lt;br /&gt;
|-&lt;br /&gt;
|Week 9&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Overview of Ordinary Differential Equations I&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Systems of linear and nonlinear ordinary differential equations, basic existence and uniqueness theorems&lt;br /&gt;
|-&lt;br /&gt;
|Week 10&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Overview of Ordinary Differential Equations II&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Phase-plane analysis, linearization, stability, chaos&lt;br /&gt;
|-&lt;br /&gt;
|Week 11&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
PDEs of Mathematical Physics&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
* Standard examples, qualitative properties, conservation laws &lt;br /&gt;
&lt;br /&gt;
|-&lt;br /&gt;
|Week 12&lt;br /&gt;
||&lt;br /&gt;
&lt;br /&gt;
||&lt;br /&gt;
Introduction to Lie Groups and Symmetries I&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* Definition of a Lie group and examples, commutators and Lie brackets, Lie algebras&lt;br /&gt;
|-&lt;br /&gt;
|Week 13&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
Introduction to Lie Groups and Symmetries II&lt;br /&gt;
||&lt;br /&gt;
 &lt;br /&gt;
||&lt;br /&gt;
* Exponential maps, applications of Lie groups to differential equations, Noether's theorem&lt;br /&gt;
|-&lt;br /&gt;
|Week 14&lt;br /&gt;
||&lt;br /&gt;
* &lt;br /&gt;
||&lt;br /&gt;
The KdV Equation and Completely Integrable Systems&lt;br /&gt;
||&lt;br /&gt;
*  &lt;br /&gt;
||&lt;br /&gt;
* Soliton solutions, infinite hierarchy of conservation laws  &lt;br /&gt;
|}&lt;/div&gt;</summary>
		<author><name>Jose.morales</name></author>
		
	</entry>
</feed>