Numerical Methods for Machine Learning: Course Notes
1. What is this?
2. Guidance
2.1. Do You Want a Job?
2.2. Good Coding
2.3. Pseudocode Exercise
3. Array Basics
3.1. Arrays in NumPy
3.2. Broadcasting Examples
3.3. Dot Product
3.4. Matrix Multiplication
3.5. Solving Systems of Equations
4. Differential Calculus
4.1. Single Variable Calculus
4.2. Multivariable Calculus
4.3. Automatic Differentiation with JAX
4.4. Derivatives, a Helpful Lesson
5. Matrices
5.1. Transpose, Inverse, and Norm
5.2. Linear Transformations
5.3. Matrix Determinant
5.4. Orthogonal Matrices (Real Unitary Matrices)
5.5. Matrix Rank
5.6. Matrix Condition Number
5.7. Eigenvalues and Principal Component Analysis
5.8. Matrix Factorization Methods
5.9. SVD and PCA
6. Numerical Optimization
6.1. Introduction to Optimization via Linear Regression
6.2. Convex Functions and Optimization
6.3. The Nelder-Mead Algorithm, Downhill Simplex, or the Amoeba
6.4. Measuring the Temperature of the Universe with SciPy’s fmin
6.5. Nested Optimizers: Nelder-Mead + Linear Regression
6.6. Particle Swarm and the Eggholder Challenge
6.7. Newton’s Method for Root Finding
6.8. Newton’s Method for Optimization
6.9. Problems with the Hessian Matrix
6.10. Logistic Regression with Newton’s Method via JAX (Assignment)
6.11. Introduction to Gradient Descent
6.12. Gradient Descent in One Dimension