APL405: Machine Learning in Mechanics (Winter semester 2024)
Credit: 3 units (2-0-2)
Pre-requisites: APL101/MTL106/MTL108, COL106
Instructors: Rajdip Nayek (rajdipn@am.iitd.ac.in)
Class timings: Tue, Thu & Fri (11:00 to 11:50 AM) at LHC517
Practical Session: Fri (3 PM to 5 PM) at two labs LH503
Attendance and Marks: Link here
Office hours (TA): By email appointment
Office hours (Instructor): By email appointment (Room B24, Block 4)
Intended audience: BTech students in Applied Mechanics, Materials, Mechanical and Civil Engineering disciplines.
NOTE: For all course-related emails, please put APL405 in the subject line.
This is an introductory course on statistical machine learning for students with some background in calculus, linear algebra, and statistics. The course focuses on supervised learning, i.e., classification and regression, and covers a range of methods used in machine learning and data science (see the module list below).
These methods will be illustrated with various applications throughout the course. The course also covers important practical considerations such as cross-validation, model selection, and the bias-variance trade-off. It includes both theory (e.g., derivations and proofs) and practice (notably the labs and the course project). The practical part will be implemented in Python.
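As a rough, unofficial illustration of the kind of workflow the practicals build towards, the sketch below selects the number of neighbours k for a k-NN classifier by 5-fold cross-validation. It assumes scikit-learn is installed, uses a synthetic dataset, and is not part of the graded course material.

```python
# Unofficial sketch: choosing k for k-NN by 5-fold cross-validation.
# Assumes scikit-learn is available; the dataset here is synthetic.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

# Synthetic binary classification problem
X, y = make_classification(n_samples=300, n_features=5, random_state=0)

# Compare mean cross-validated accuracy across candidate values of k
for k in (1, 3, 5, 11, 21):
    scores = cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5)
    print(f"k = {k:2d}: mean CV accuracy = {scores.mean():.3f}")
```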
Module# | Main Topic | Sub Topics | Lecture Notes (2024) | Extra material |
---|---|---|---|---|
Mod 00 | Introduction | | Lecture 1 | |
Mod 01 | A Preliminary Approach to Supervised Learning | Background<br>k-Nearest Neighbours<br>Decision Trees | Lecture 2<br>Lecture 3<br>Lecture 4 | [AL] Chapter 2 |
Mod 02 | Basic Parametric Models | Linear regression<br>Logistic regression<br>Regularization | Lecture 5<br>Lecture 6<br>Lecture 7 | [AL] Chapter 3 |
Mod 03 | Evaluating Performance | Cross-validation<br>Training error and generalization gap<br>Bias-variance decomposition | Lecture 8<br>Lecture 9<br>Lecture 10 | [AL] Chapter 4 |
Mod 04 | Learning Parametric Models | Loss functions<br>Parameter optimization | Lecture 11<br>Lecture 12 | [AL] Chapter 5, Lecture 5 by Prof. Mitesh Khapra |
Mod 05 | Neural Networks | Feedforward neural network<br>Backpropagation<br>Convolutional neural network | Lecture 13<br>Lecture 14<br>Lecture 15 | [AL] Section 6.1, Video, Lecture 8 by Prof. Mitesh Khapra |
Mod 06 | Kernel Methods | Kernel ridge regression<br>Theory of kernels<br>Support vector classification | Lecture 16<br>Lecture 17<br>Lecture 18 | [AL] Chapter 8, [SN] Sections 5.5, 5.6 |
Mod 07 | Ensemble Methods | Bagging & random forests<br>Boosting | Lecture 19<br>Lecture 20 | [AL] Chapter 7 |
Mod 09 | Generative Models & Unsupervised Learning | Gaussian mixture models<br>GMM (with EM)<br>k-means clustering<br>PCA | Lecture 21<br>Lecture 22<br>Lecture 23<br>Lecture 24 | [AL] Sections 10.1, 10.2, slides |
There will be no make-up labs, even for students who miss a lab for genuine medical reasons. Marks for missed labs will instead be adjusted based on the class's performance across all labs and your performance in the labs you attended.
Week# | Topics | Practical Questions | Notes |
---|---|---|---|
Wk 0 | Probability refresher | Practical 0 | Notes |
Wk 1 | k-Nearest Neighbours | Practical 1 | Section |
Wk 2 | Decision Trees | Practical 2 | Dataset |
Wk 3 | Linear Regression | Practical 3 | |
Wk 4 | Logistic Regression | Practical 4 | |
Wk 5 | Cross-validation & Bias-variance trade-off | Practical 5 | |
Wk 6 | Neural network in NumPy | Practical 6 | Solution |
Wk 7 | Neural network in PyTorch | Practical 7 | Tutorial of GD with PyTorch gradients (see the sketch after this table) |
Wk 8 | SVM | Practical 8 | |
Wk 9 | Boosting | Practical 9 | |
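The Week 7 entry refers to a tutorial on gradient descent (GD) with PyTorch gradients. The loop below is an unofficial sketch of that idea (not the tutorial itself): it fits a straight line by gradient descent, letting PyTorch's autograd compute the gradients. It assumes PyTorch is installed.

```python
# Unofficial sketch of gradient descent with PyTorch autograd (not the Week 7 tutorial).
import torch

# Synthetic data from the line y = 2x + 1, with a little noise
torch.manual_seed(0)
x = torch.linspace(-1.0, 1.0, 100).unsqueeze(1)
y = 2.0 * x + 1.0 + 0.1 * torch.randn_like(x)

# Parameters to learn, tracked by autograd
w = torch.zeros(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

lr = 0.1  # learning rate
for step in range(200):
    y_hat = x * w + b                 # linear model prediction
    loss = ((y_hat - y) ** 2).mean()  # mean-squared-error loss
    loss.backward()                   # autograd fills w.grad and b.grad
    with torch.no_grad():             # plain gradient-descent update
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()                # clear gradients for the next step
        b.grad.zero_()

print(f"learned w = {w.item():.2f}, b = {b.item():.2f}")  # close to 2.00 and 1.00
```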
Three homework assignments will be given.
HW# | Zip file | Writeup | Solutions |
---|---|---|---|
HW1 | Homework zip 1 | Homework 1 | Writeup_soln |
HW2 | Homework zip 2 | Homework 2 | |
HW3 | Datafiles<br>Code Template | Homework 3 | HW3sol |
Component | Scores |
---|---|
Practical Attendance + Exam | 5 + 10 |
Homework | 20 |
Project | 10 |
Quiz | 10 |
Minor | 20 |
Major | 25 |
Total | 100 |
Component | Solution |
---|---|
Minor | Solution |
Quiz | Solution |
Major | Solution |
The course project is described here.
Datafiles for the course project can be found here.