| Instructor | Nicolas Macris | Instructor | Ruediger Urbanke |
|---|---|---|---|
| Office | INR 134 | Office | INR 116 |
| Email | [email protected] | Email | [email protected] |
| Office Hours | By appointment | Office Hours | By appointment |
| Teaching Assistant | [email protected] | Office | INR 036 |
| Teaching Assistant | [email protected] | Office | INR 141 |
| Lectures | Monday, 08:15 – 10:00 | Room | INM 202 |
| Exercises | Tuesday, 17:15 – 19:00 | Room | INR 219 |
Language: English
Credits: 4 ECTS
Prerequisites:
- Analysis I, II, III
- Linear Algebra
- Machine learning
- Probability
- Algorithms (CS-250)
Here is a link to the official coursebook information.
Homework:
Some homework will be graded.
Grading:
If you do not hand in your final exam, your overall grade will be NA. Otherwise, your grade will be a weighted average: 10% for the homework and 90% for the final exam. You may discuss the graded homework with other people, but you must write up your own solution and list on the first page the people you discussed it with.
Special Announcements
Online exercise sessions
Exercise sessions will take place online on Zoom on Tuesdays, from 5 pm to 7 pm (see https://epfl.zoom.us/). To access the exercise session, please use the following link: https://epfl.zoom.us/j/384790010
Due to unforeseen circumstances, we must swap the lectures on tensors with the remaining lectures on gradient descent. Thank you for your understanding.
Graded homework
From Monday, 16 March, until Sunday, 19 April, all EPFL classes are given online. During this period, no handwritten homework can be handed in, and we ask you to write your homework in LaTeX.
Here is a template for the first graded homework: LaTeX template graded homework.
If you cannot compile LaTeX on your own computer, EPFL provides Overleaf Professional accounts for all students: Overleaf EPFL. With Overleaf you can write and compile LaTeX directly in your web browser. To use the provided template (.tex), create a new project and upload the .tex file.
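In case the template link is temporarily unreachable, a minimal skeleton along the following lines compiles on Overleaf. This is only an illustrative sketch, not the official template, which you should still download and use for the actual submission:

```latex
\documentclass[11pt]{article}
\usepackage{amsmath,amssymb} % standard math environments and symbols

\title{Graded Homework 1}    % replace with the current homework number
\author{Firstname Lastname}
\date{\today}

\begin{document}
\maketitle

\section*{Problem 1}
% Write your solution here, for example:
Let $\mathcal{H}$ be a finite hypothesis class with $|\mathcal{H}| = k$. Then \dots

\end{document}
```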
Submission instructions for Graded Homework 3
Send an email to both TAs with the compiled PDF file (no need to send the LaTeX source code) and the completed notebook for Problem 4.
The title of the email should be LT-GHW3.
The PDF file should be named Firstname_Lastname_GH3.pdf and the notebook Problem_4_Firstname_Lastname.ipynb.
Final exam
The final exam will take place on Wednesday, 19 August 2020, from 8:15 am to 11:15 am in INM 202.
The exam will be open book (lecture notes, exercises, course materials), but no electronic devices are allowed.
Q&A sessions prior to final exam
We will organize Q&A sessions on Zoom. The first session will take place on Tuesday, 28 July 2020, from 5 pm to 7 pm.
Link for 1st Q&A session: https://epfl.zoom.us/j/384790010
2019 final exam
Last year's final exam
Solution to last year's final exam
Topics
- PAC learning model (based on Chapters 2-7 of Understanding Machine Learning (UML) by Shalev-Shwartz and Ben-David)
- Gradient descent (UML and notes by A. Montanari)
- Tensor decomposition (based on the review on Tensor Decompositions by Rabanser, Shchur and Günnemann; a short code sketch follows below)
For more advanced material, see also Algorithmic Aspects of Machine Learning by Moitra.
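For a concrete feel for the tensor-decomposition topic, here is a minimal NumPy sketch of Jennrich's algorithm, which appears in the lectures. The tensor, its dimensions, and all variable names are illustrative choices of ours, not course code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a tensor of rank r in CP form: T = sum_r a_r (x) b_r (x) c_r.
n, r = 5, 3
A = rng.normal(size=(n, r))
B = rng.normal(size=(n, r))
C = rng.normal(size=(n, r))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# Jennrich: contract the third mode with two random vectors.
x = rng.normal(size=n)
y = rng.normal(size=n)
Tx = np.einsum('ijk,k->ij', T, x)   # Tx = A diag(C^T x) B^T
Ty = np.einsum('ijk,k->ij', T, y)   # Ty = A diag(C^T y) B^T

# The eigenvectors of Tx Ty^+ with nonzero eigenvalues are the columns of A
# (up to scaling), provided A and B have full column rank and the ratios
# (C^T x)_r / (C^T y)_r are distinct, which holds generically.
eigvals, eigvecs = np.linalg.eig(Tx @ np.linalg.pinv(Ty))
top = np.argsort(-np.abs(eigvals))[:r]
A_hat = np.real(eigvecs[:, top])

# Sanity check: each true column of A is parallel to some recovered column.
An = A / np.linalg.norm(A, axis=0)
Ah = A_hat / np.linalg.norm(A_hat, axis=0)
print(np.round(np.abs(An.T @ Ah), 2))  # each row has an entry close to 1
```

The identity behind it: Tx Ty^+ = A diag(C^T x) diag(C^T y)^{-1} A^+, so the eigenvectors with nonzero eigenvalues are exactly the columns of A.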
Detailed Schedule
| Date | Lectures | Homework | Solutions |
|---|---|---|---|
| 17/2 | Chapters 3 and 4 (in UML) | Exercises 1, 3, 7, 8 of Chapter 3; Exercises 1 and 2 of Chapter 4 | Solution 1 |
| 24/2 | Chapter 5 (in UML) | Guided proof of Hoeffding's inequality | Solution 2 |
| 2/3 | Chapter 6 (in UML) | Exercise 5 of Chapter 3; Exercises 1 and 3 of Chapter 5 | Solution 3 |
| 9/3 | Chapter 7 (in UML) | Graded Homework 1, due Tuesday, March 24 | Solution GHW1 |
| 16/3 | Remainder of Ch. 7 and start of Ch. 14 (in UML). Video link | Graded Homework 1 continued (due Tuesday, March 24) | |
| 23/3 | Remainder of Ch. 14 (in UML). Video link | Homework 5 | Solution 5 |
| 30/3 | Due to unforeseen circumstances, the lectures on tensors are swapped with the remaining lectures on gradient descent; thank you for your understanding. Lectures are based on the reviews given above plus handwritten notes found below (to be typed up if time and health permit). Motivations and examples, multi-dimensional arrays, tensor product, tensor rank. Video 1st lecture (part 1 of 2), Video 1st lecture (part 2 of 2). Note: there is a small mistake in the formula for M_3 in Lecture 1; it is unimportant, and the exact formula will be seen in the exercises. | Homework 6 | Solution 6 |
| 6/4 | Tens-chap-1.pdf. Continued: tensor decompositions and rank, Jennrich's theorem. Video 2nd lecture (part 1 of 2) | Graded Homework 2, due Tuesday, April 21. Not graded: an extra exercise on the Moore–Penrose pseudoinverse | Solution GHW2 |
| 13/4 | Easter vacation | | |
| 20/4 | Tens-chap-2.pdf. Matricizations and the Alternating Least Squares algorithm. Video 3rd lecture (part 1 of 3), Video 3rd lecture (part 2 of 3) | Homework 8 | Solution 8 |
| 27/4 | Tens-chap-2.pdf. Multilinear rank; Tucker and higher-order singular value decompositions. Video 4th lecture (part 1 of 2) | Graded Homework 3; Notebook & data for Problem 4 | Solution GHW3; Notebook solution to Problem 4 |
| 4/5 | Tens-chap-3.pdf. Power method (a code sketch follows this schedule). Applications: Gaussian mixture models, topic models of documents. Video 5th lecture (part 1 of 2) | Graded Homework 3 continued, due May 12 | |
| 11/5 | "Lecture notes on two-layer neural networks" by A. Montanari. Video link 1, Video link 2, Video link 3, Video link 4 | Graded Homework 4, new deadline June 9 | Solution GHW4 |
| 18/5 | Neural Tangent Kernel by Jacot et al. Video link 1, Video link 2, Video link 3 | | |
| 25/5 | Neural Tangent Kernel by Jacot et al. Video link 1, Video link 2 | New deadline for Graded Homework 4: June 9 | |
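Since the 4/5 lecture covers the power method for tensors, here is a minimal NumPy sketch of symmetric tensor power iteration with deflation, in the orthogonally decomposable setting underlying the lecture's applications. Dimensions, weights, and names are illustrative assumptions of ours, not course code:

```python
import numpy as np

rng = np.random.default_rng(1)

# Symmetric, orthogonally decomposable tensor T = sum_r lam_r v_r (x) v_r (x) v_r,
# the setting where tensor power iteration provably recovers the components.
n = 4
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))
V, lam = Q[:, :2], np.array([3.0, 1.5])        # two orthonormal components
T = np.einsum('r,ir,jr,kr->ijk', lam, V, V, V)

def power_iteration(T, n_iter=100):
    """One run of the tensor power method: u <- T(I, u, u) / ||T(I, u, u)||."""
    u = rng.normal(size=T.shape[0])
    u /= np.linalg.norm(u)
    for _ in range(n_iter):
        u = np.einsum('ijk,j,k->i', T, u, u)
        u /= np.linalg.norm(u)
    return np.einsum('ijk,i,j,k->', T, u, u, u), u   # (eigenvalue, eigenvector)

# Recover the components one at a time, deflating T after each one.
for _ in range(2):
    l, u = power_iteration(T)
    print(np.round(l, 3), np.round(np.abs(V.T @ u), 3))  # lam_r and |<v_r, u>|
    T = T - l * np.einsum('i,j,k->ijk', u, u, u)          # deflation step
```

In the Gaussian mixture and topic model applications, T is an empirical third-moment tensor rather than an exactly decomposable one, but the iteration is the same.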
Textbooks and notes:
- Understanding Machine Learning by Shalev-Shwartz and Ben-David
- Bayesian Reasoning and Machine Learning by David Barber (Cambridge)
- Pattern Recognition and Machine Learning by Christopher Bishop (Springer)
- Introduction to Tensor Decompositions and their Applications in Machine Learning by Rabanser, Shchur and Günnemann
- Probability on Graphs: Random Processes on Graphs and Lattices by Geoffrey Grimmett (Cambridge) [Chap. 7]
- Neural Tangent Kernel references: