Instructor | Michael Gastpar
Office | INR 130
Email | [email protected]
Office Hours | By appointment
Lectures | Mondays,
Room | https://epfl.zoom.us/j/85752913310
Exercises | integrated with lecture
Language | English
Credits | 2 ECTS
Topics
- Information Measures
- The role of the Gaussian distribution
- Network Information Theory
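For orientation only, here is a minimal Python/NumPy sketch of two of the information measures that appear in the topics and schedule below, Shannon entropy and Kullback-Leibler divergence for discrete distributions. The code and function names are illustrative and not part of the course material.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy H(p) = -sum_x p(x) log2 p(x), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    return float(-np.sum(p * np.log2(p)))

def kl_bits(p, q):
    """Kullback-Leibler divergence D(p||q) = sum_x p(x) log2(p(x)/q(x)), in bits.

    Assumes q(x) > 0 wherever p(x) > 0; otherwise D(p||q) is infinite.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    if np.any(q[mask] == 0):
        return float("inf")
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

if __name__ == "__main__":
    fair = [0.5, 0.5]
    biased = [0.9, 0.1]
    print(entropy_bits(fair))     # 1.0 bit for a fair coin flip
    print(kl_bits(fair, biased))  # > 0; equals 0 only if the two distributions coincide
```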
Detailed Schedule
Note: This is very tentative and subject to change during the semester.
Date | Lectures | Reading & References | Exercises |
---|---|---|---|
22/2 | No class. | | |
1/3 | Introduction. Review. Kullback-Leibler Divergence. | Cover and Thomas, ch. 2 | |
8/3 | Entropy. Differential Entropy. Mutual Information. Maximum Entropy. Maximal Correlation. | Cover and Thomas, ch. 2 and 9 | |
15/3 | Maximal Correlation. Common Information (Wyner, Gács-Körner). | A. Wyner, The common information of two dependent random variables (1975) | |
22/3 | Common Information (Wyner, Gács-Körner). Directed Information. Multivariate Information. | | |
29/3 | Multivariate Information. f-divergences. | | |
12/4 | f-divergences. | | |
19/4 | Rényi entropy, Sibson mutual information. | | |
26/4 | Applications: generalization error. | | |
3/5 | Network Information Theory | | |
10/5 | | | |
17/5 | | | |
31/5 | | | |