Role | Name | Email | Office | Office Hours
---|---|---|---|---
Instructor | Nicolas Macris | [email protected] | INR 134 | By appointment
Instructor | Ruediger Urbanke | [email protected] | INR 116 | By appointment
Teaching Assistant | Chan Chun Lam | [email protected] | INR 032 |
Teaching Assistant | Kirill Ivanov | [email protected] | INR 030 |
Teaching Assistant | Clement Luneau | [email protected] | INR 141 |
Language: English
Credits: 4 ECTS
Prerequisites:
- Analysis I, II, III
- Linear Algebra
- Machine learning
- Probability
- Algorithms (CS-250)
Here is a link to official coursebook information.
Homework:
Some homework will be graded.
Grading:
If you do not hand in your final exam, your overall grade will be NA. Otherwise, your grade will be the weighted average of 10% for the homework and 90% for the final exam. You may discuss the graded homework with other people, but you must write up your own solutions and list on the first page the people you discussed with.
Special Announcements
Topics
- PAC learning model (based on Chapters 2-7 of Understanding Machine Learning (UML) by Shalev-Shwartz and Ben-David)
- Gradient descent (UML and notes by A. Montanari)
- Graphical models (based on Chapters 1-5 and 9-11 of Bayesian Reasoning and Machine Learning by David Barber and Chapter 8 of Pattern Recognition and Machine Learning by Christopher Bishop) PGM-Lect-1.pdf PGM-Lect-2.pdf Notes-Message-Passing.pdf PGM-Lect-3.pdf PGM-Lect-4.pdf
- Tensor decomposition (based on the review Introduction to Tensor Decompositions and their Applications in Machine Learning by Rabanser, Shchur and Günnemann; for more advanced material see also http://people.csail.mit.edu/moitra/docs/bookex.pdf) Tens-Lect-1.pdf Tens-Lect-2.pdf Tens-Lect-3.pdf
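As a small taste of the message-passing material, here is a minimal sketch of the sum-product algorithm for marginalization on a chain of binary variables. It is illustrative only, not course code: the chain length, the random potentials, and all variable names are made up for the example, and correctness is checked against brute-force enumeration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4                            # chain x1 - x2 - x3 - x4 of binary variables
psi = rng.random((n - 1, 2, 2))  # pairwise potentials psi_i(x_i, x_{i+1})

# Forward messages: m_f[i](x_i) = sum over x_{i-1} of psi_{i-1} * m_f[i-1].
m_f = [np.ones(2) for _ in range(n)]
for i in range(1, n):
    m_f[i] = psi[i - 1].T @ m_f[i - 1]

# Backward messages: m_b[i](x_i) = sum over x_{i+1} of psi_i * m_b[i+1].
m_b = [np.ones(2) for _ in range(n)]
for i in range(n - 2, -1, -1):
    m_b[i] = psi[i] @ m_b[i + 1]

# The marginal of x_i is proportional to the product of incoming messages.
marginals = [m_f[i] * m_b[i] for i in range(n)]
marginals = [m / m.sum() for m in marginals]

# Brute-force check: enumerate all 2^n configurations of the chain.
joint = np.zeros((2,) * n)
for idx in np.ndindex(*joint.shape):
    p = 1.0
    for i in range(n - 1):
        p *= psi[i][idx[i], idx[i + 1]]
    joint[idx] = p
joint /= joint.sum()
for i in range(n):
    other_axes = tuple(j for j in range(n) if j != i)
    assert np.allclose(marginals[i], joint.sum(axis=other_axes))
```

On a chain the algorithm is exact and costs O(n) matrix-vector products, versus O(2^n) for the brute-force sum; the same two-pass scheme generalizes to any tree-structured factor graph.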
Detailed Schedule
(tentative, subject to changes)
Date | Lectures | Exercises | Solutions
---|---|---|---
18/2 | Chap 3 and 4 (in UML) | 3.1; 3.3; 3.7; 3.8; 4.1; 4.2 |
25/2 | Chap 5 (in UML) | idem ++ |
4/3 | Chap 6 (in UML) | Graded: 5.1; 6.2; 6.5; 6.8; 6.9; 7.3 |
11/3 | Chap 7 (in UML) | idem |
18/3 | Remainder of Chap 7 and start of Chap 14 (in UML) | Deadline for handing in the graded homework: 19/3, during the exercise session |
25/3 | Remainder of Chap 14 (in UML) | 2nd graded homework; deadline 16 April |
1/4 | "Lecture notes on two-layer neural networks" by A. Montanari | Hand-out of 1st graded homework (lecture and exercise session) |
8/4 | Introduction to probabilistic graphical models (Chap 3 and 4 in D. Barber and Chap 8 in C. Bishop) | 2nd graded homework continued (exercise 5); deadline 16 April |
15/4 | Factor graphs and marginalization; notes on message passing for marginalization (sum-product algorithm) (Chap 4 and 5 in Barber, Chap 8 in Bishop) | Exercise 6 |
22/4 | Vacation | |
29/4 | Sampling: ancestral sampling for belief networks and MCMC; learning graphical models (Barber paragraphs 9.3 and 9.6, mostly 9.6.1) | Exercise 6 continued; use the notes on message passing for problems 8, 9, 10 |
6/5 | Variational Bayes EM and standard EM; learning graphical models (Barber 11.2, mostly 11.2.1 and 11.2.2, and 11.5) | 3rd graded homework; new deadline: May 28 |
13/5 | Tensor methods (next three classes based on the review): tensor product, rank, Jennrich's theorem | 4th graded homework; new deadline: June 4, in the mailbox in the IPG corridor (INR) or with the assistants |
20/5 | ALS, multilinear rank, Tucker, HOSVD (Tens-Lect-2.pdf) | |
27/5 | Applications: GMM, topic models, multiview models; if time permits: power method, whitening (Tens-Lect-3.pdf) | |
Textbooks and notes:
- Understanding Machine Learning by Shalev-Shwartz and Ben-David
- Bayesian Reasoning and Machine Learning by David Barber (Cambridge)
- Pattern Recognition and Machine Learning by Christopher Bishop (Springer)
- Introduction to Tensor Decompositions and their Applications in Machine Learning by Rabanser, Shchur and Günnemann
- Probability on Graphs: Random Processes on Graphs and Lattices by Geoffrey Grimmett (Cambridge) [Chap 7]