Numerical Methods for Machine Learning: Course Notes#
Joe McEwen and Jesse Loi#
version 0.2 (the typo edition) 2025
Help! This is an early version of the notes. If you find an error, please let us know. Thank you.
- 1. What is this?
- 2. Guidance
- 3. Array Basics
- 4. Differential Calculus
- 5. Matrices
- 6. Numerical Optimization
- 6.1. Introduction to Optimization via Linear Regression
- 6.2. Convex Functions and Optimization
- 6.3. The Nelder-Mead Algorithm, Downhill Simplex, or the Amoeba
- 6.4. Particle Swarm and the Eggholder Challenge
- 6.5. Newton’s method for Root Finding
- 6.6. Newton’s Method for Optimization
- 6.7. Logistic Regression with Newton’s Method via Jax (Assignment)
- 6.8. Introduction to Gradient Descent
- 6.9. Gradient Descent in 1-Dimension