ML Mathematics (SOON)
Welcome to the course! This course covers the mathematical foundations of machine learning, paired with practical examples. Each lecture guides you through a specific topic with explanations and code samples.
Course Outline
Linear Algebra basics
Lecture 1. Vectors - definition, addition, scalar multiplication
Lecture 2. Matrices - operations, types, shapes
Lecture 3. Matrix multiplication (dot product + rules)
Lecture 4. Transpose, identity, and inverse
Lecture 5. Linear systems and solving with matrices
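As a small taste of this first section, here is a minimal pure-Python sketch of the matrix multiplication rule covered in Lecture 3 (the `matmul` helper name is ours, not part of the course material):

```python
def matmul(A, B):
    """Matrix product C[i][j] = sum_k A[i][k] * B[k][j].
    Valid only when A is (m x n) and B is (n x p)."""
    assert len(A[0]) == len(B), "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

A = [[1, 2],
     [3, 4]]
I = [[1, 0],
     [0, 1]]

print(matmul(A, I))  # multiplying by the identity (Lecture 4) returns A unchanged
```

In later lectures the same operation is done with NumPy or PyTorch, but writing the triple sum out once makes the shape rules concrete.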
Linear Algebra for ML
Lecture 6. Vector spaces and span
Lecture 7. Linear independence and rank
Lecture 8. Projections and geometry of data
Lecture 9. Eigenvalues and eigenvectors (with visuals)
Lecture 10. Principal component analysis (PCA)
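A quick preview of the eigenvalue material in Lecture 9: power iteration is one standard way to approximate a matrix's dominant eigenvalue, and it is the kind of computation PCA builds on. This is our own illustrative sketch, not code from the course:

```python
import math

def power_iteration(A, steps=100):
    """Estimate the dominant eigenvalue/eigenvector of a square matrix
    by repeatedly applying A and renormalising."""
    v = [1.0] * len(A)
    for _ in range(steps):
        w = [sum(row[j] * v[j] for j in range(len(v))) for row in A]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # Rayleigh quotient v^T A v gives the matching eigenvalue estimate
    Av = [sum(row[j] * v[j] for j in range(len(v))) for row in A]
    lam = sum(Av[i] * v[i] for i in range(len(v)))
    return lam, v

lam, v = power_iteration([[2.0, 0.0],
                          [0.0, 1.0]])
print(round(lam, 6))  # -> 2.0, the largest eigenvalue of diag(2, 1)
```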
Calculus basics
Lecture 11. Functions, limits and derivatives
Lecture 12. Rules of differentiation (product, chain rule)
Lecture 13. Partial derivatives (multivariable functions)
Lecture 14. Gradients, level curves, and contours
Lecture 15. Optimisation with gradient descent
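The derivative rules in this section all feed into Lecture 15's gradient descent. As a minimal sketch (the `grad_descent` name and the example function are ours):

```python
def grad_descent(grad, x0, lr=0.1, steps=200):
    """Plain gradient descent: repeatedly step against the gradient,
    x <- x - lr * f'(x)."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimise f(x) = (x - 3)^2; its derivative is 2(x - 3), so the minimum is at x = 3.
x_min = grad_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # -> 3.0
```

Here the error (x - 3) shrinks by a constant factor each step, which is why a fixed learning rate converges on this simple quadratic.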
Calculus for ML
Lecture 16. Taylor series and approximations
Lecture 17. Jacobian and Hessian matrices
Lecture 18. Convex functions and optimisation
Lecture 19. Backpropagation (deep dive math)
Lecture 20. Autograd in PyTorch
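Before trusting backpropagation (Lecture 19) or autograd (Lecture 20), it helps to know the classic sanity check: compare an analytic derivative against a central finite difference. A small sketch of that idea, using the chain rule from Lecture 12 (helper names are ours):

```python
import math

def numeric_grad(f, x, h=1e-6):
    """Central finite difference: (f(x+h) - f(x-h)) / 2h,
    a quick numerical check for hand-derived gradients."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Chain rule: d/dx sin(x^2) = cos(x^2) * 2x
f = lambda x: math.sin(x ** 2)
analytic = lambda x: math.cos(x ** 2) * 2 * x

x = 0.7
print(numeric_grad(f, x), analytic(x))  # the two values should agree closely
```

The same check, applied layer by layer, is how backpropagation implementations are traditionally debugged.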
Probability and distributions
Lecture 21. Probability rules, independence, Bayes’ rule
Lecture 22. Random variables, expectation, variance
Lecture 23. Common distributions (Bernoulli, Binomial, Normal)
Lecture 24. Multivariate distributions and covariance
Lecture 25. Maximum Likelihood Estimation (MLE)
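To preview Lecture 25: for Bernoulli data the maximum likelihood estimate has a closed form, the sample mean. This sketch (data and helper names are ours) confirms it numerically by comparing against a grid search over candidate parameters:

```python
import math

def log_likelihood(p, xs):
    """Bernoulli log-likelihood: sum of x*log(p) + (1-x)*log(1-p)."""
    return sum(x * math.log(p) + (1 - x) * math.log(1 - p) for x in xs)

xs = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]   # 7 successes in 10 trials
p_hat = sum(xs) / len(xs)             # closed-form MLE: the sample mean, 0.7

# The closed-form estimate also wins a brute-force grid search:
grid = [i / 100 for i in range(1, 100)]
best = max(grid, key=lambda p: log_likelihood(p, xs))
print(p_hat, best)  # -> 0.7 0.7
```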
Information theory & ML math tools
Lecture 26. Entropy, KL divergence, cross-entropy
Lecture 27. Information gain and decision trees
Lecture 28. Linear regression
Lecture 29. Logistic regression (sigmoid + cross-entropy)
Lecture 30. Regularisation (L1, L2), bias-variance tradeoff
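As a final preview tying Lectures 26 and 29 together, here is a minimal sketch of the sigmoid and the binary cross-entropy loss it is paired with in logistic regression (function names are ours):

```python
import math

def sigmoid(z):
    """Logistic sigmoid: maps any real z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def binary_cross_entropy(y, p):
    """Cross-entropy loss for one label y in {0, 1} and predicted probability p."""
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

p = sigmoid(2.0)
print(round(p, 4))  # -> 0.8808
# Loss is small when the prediction agrees with the label, large when it doesn't:
print(binary_cross_entropy(1, p) < binary_cross_entropy(0, p))  # -> True
```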