This is the OLD 2023 course website. For the current one, see here.
This course is offered jointly by the TML and MLO groups. Previous year’s website: ML 2022.
See here for the ML4Science projects.
Contact us: Use the discussion forum. You can also email the head assistant Lara Orlandic, and CC both instructors.
Instructors: Nicolas Flammarion and Martin Jaggi
Teaching Assistants
Student Assistants
Lectures | Tuesday, 16:15 – 18:00 | Rolex Learning Center |
| Wednesday, 10:15 – 12:00 | Rolex Learning Center |
Exercises | Thursday, 14:15 – 16:00 | Rooms: INF1, INF119, INJ218, INM202, INR219 |
Language | English | |
Credits | 8 ECTS | |
For a summary of the logistics of this course, see the course info sheet here (PDF).
See also the official coursebook information here.
Special Announcements
- Exam date: Thursday 18.01.2024, 15h15 – 18h15 (STCC – Garden Full).
- Links: exercises signup and discussion forum. All other materials are available on this page and on GitHub.
- Projects: There will be two group projects during the course.
  - Project 1 counts for 10% of the grade and is due Oct 30.
  - Project 2 counts for 30% of the grade and is due Dec 21.
- Code repository for labs, projects, and lecture notes: github.com/epfml/ML_course
- The exam is closed book, but you are allowed one crib sheet (A4 paper, both sides may be used). Bring a pen and a white eraser. You can find past years' exams with solutions here.
Detailed Schedule
Lecture notes from each class are made available on GitHub here, and videos on Mediaspace here.
Date | Topics Covered | Lectures | Exercises | Projects |
---|---|---|---|---|
19/9 | Introduction, Linear Regression | 01a, 01b, 01c, 01d | | |
20/9 | Loss Functions | | Lab 1 | |
26/9 | Optimization | 02a | | |
27/9 | Optimization | | Lab 2 | Project 1 start |
03/10 | Least Squares, Overfitting | 03a, 03b | | |
04/10 | Max Likelihood, Ridge Regression, Lasso | 03c, 03d | Lab 3 | |
10/10 | Generalization, Model Selection, and Validation | 04a | | |
11/10 | Bias-Variance Decomposition | 04b | Lab 4 | |
17/10 | Classification | 05a | | |
18/10 | Logistic Regression | 05b | Lab 5 | |
24/10 | Support Vector Machines | 06a | | |
25/10 | K-Nearest Neighbor | 06b | Lab 6 | |
31/10 | Kernel Regression | 7a | | Proj. 1 due 30.10. |
01/11 | Neural Networks – Basics, Representation Power | 7b | Lab 7 | |
07/11 | Neural Networks – Backpropagation, Activation Functions | 8a | | Project 2 start |
08/11 | Neural Networks – CNNs, Regularization, Data Augmentation, Dropout | 8b | Lab 8 | |
14/11 | Neural Networks – Transformers | 9a | | |
15/11 | Adversarial ML | 9b | Lab 9 | |
21/11 | Ethics and Fairness in ML | 10a, Ethics canvas | | |
22/11 | Unsupervised Learning, K-Means | 10b, 10c | Lab 10 | |
28/11 | Gaussian Mixture Models | 11a | | |
29/11 | EM Algorithm | 11b | Lab 11 & Project Q&A | |
05/12 | Matrix Factorizations | 12a | | |
06/12 | Text Representation Learning | 12b | Lab 12 | |
12/12 | Self-Supervised Learning | 13a | | |
13/12 | Generative Models | 13b | Lab 13 | |
19/12 | Guest lecture by Devis Tuia | | | |
20/12 | Projects pitch session (optional) | | | Proj. 2 due 21.12. |
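As a small taste of the material from the first weeks of the schedule (linear regression, MSE loss, gradient descent), here is a minimal, self-contained NumPy sketch. It is illustrative only: the function names, step size, and toy data are our own and are not taken from the course labs in the GitHub repository.

```python
# Minimal sketch (not from the course labs): linear regression with MSE loss,
# fitted by plain gradient descent.
import numpy as np

def compute_mse(y, tx, w):
    """Mean squared error loss for the linear model y ≈ tx @ w."""
    e = y - tx @ w
    return 0.5 * np.mean(e ** 2)

def gradient_descent(y, tx, w0, max_iters=100, gamma=0.5):
    """Gradient descent on the MSE loss, starting from w0."""
    w = w0
    for _ in range(max_iters):
        e = y - tx @ w
        grad = -tx.T @ e / len(y)   # gradient of the MSE with respect to w
        w = w - gamma * grad
    return w, compute_mse(y, tx, w)

# Toy example: fit y = 2 + 3x with a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=200)
y = 2 + 3 * x + 0.1 * rng.standard_normal(200)
tx = np.c_[np.ones_like(x), x]          # constant feature for the bias term
w, loss = gradient_descent(y, tx, np.zeros(2))
print(w, loss)                          # w ends up close to [2, 3]
```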
Textbooks
(not mandatory)
Gilbert Strang, Linear Algebra and Learning from Data
Christopher Bishop, Pattern Recognition and Machine Learning
Shai Shalev-Shwartz, Shai Ben-David, Understanding Machine Learning
Michael Nielsen, Neural Networks and Deep Learning
Projects & ML4Science
Projects are done either as an ML4Science project in collaboration with a lab at EPFL, UniL, or another academic institution; as a Reproducibility Challenge entry for an ML paper; or as one of the predefined ML challenges.
All info about the interdisciplinary ML4Science projects is available on the separate page here.