Numerical Methods for Machine Learning: Course Notes
Joe McEwen and Jesse Loi
version 0.2 (the typo edition), 2025
Help! This is an early version of the notes; if you find an error, please let us know. Thank you.
- 1. What is this?
- 2. Guidance
- 3. Array Basics
- 4. Differential Calculus
- 5. Matrices
- 6. Numerical Optimization
- 6.1. Introduction to Optimization via Linear Regression
- 6.2. Convex Functions and Optimization
- 6.3. The Nelder-Mead Algorithm, Downhill Simplex, or the Amoeba
- 6.4. Measuring the Temperature of the Universe with SciPy’s fmin
- 6.5. Nested Optimizers: Nelder-Mead + Linear Regression
- 6.6. Particle Swarm and the Eggholder Challenge
- 6.7. Newton’s Method for Root Finding
- 6.8. Newton’s Method for Optimization
- 6.9. Problems with the Hessian Matrix
- 6.10. Logistic Regression with Newton’s Method via JAX (Assignment)
- 6.11. Introduction to Gradient Descent
- 6.12. Gradient Descent in One Dimension