| Instructor | Emre Telatar |
|---|---|
| Office | INR 117 |
| Email | [email protected] |
| Office Hours | By appointment |
| Teaching Assistant | Adway Girish |
| Office | INR 139 |
| Email | [email protected] |
| Office Hours | By appointment |
| Teaching Assistant | Adrien Vandenbroucque |
| Office | INR 033 |
| Email | [email protected] |
| Office Hours | By appointment |
| Lectures | Monday 11h15 – 13h00 (Room: BC 01) |
| | Tuesday 13h15 – 15h00 (Room: CM 012) |
| Exercises | Tuesday 15h15 – 17h00 (Room: DIA 004) |
| Language | English |
| Credits | 8 ECTS |
Course Information
See the course information.
Announcements
- Please note the room change: Lectures on Tuesday will be held in CM 012.
- The first lecture will be held on Monday, September 9th in BC 01 at 11h15.
- The weekly homework sets are not graded; graded homework will be announced explicitly.
| Date | Topics Covered | Homework | Solutions | Remarks / Extra material |
|---|---|---|---|---|
| Sep 9 | Source coding / Data compression: injective, uniquely decodable, prefix-free (binary) codes; Kraft sum, Kraft inequalities | HW 1 | | |
| Sep 10 | (Partial) converse to the Kraft inequality; expected codeword length: lower bound | | Soln 1 | |
| Sep 16 | Public holiday | | | |
| Sep 17 | Expected codeword length: lower and upper bounds, asymptotic per-letter tightness. Information measures: entropy, KL divergence | | Soln 2 | |
| Sep 23 | Huffman code/algorithm; mutual information | HW 3 | | |
| Sep 24 | Conditional mutual information; examples, properties; data processing inequality | | Soln 3 | |
| Sep 30 | Chain rule for mutual information; typicality | HW 4 | | |
| Oct 1 | Properties of typical sets/sequences; entropy rate | | Soln 4 | |
| Oct 7 | More on entropy rate. Universal data compression: Lempel-Ziv | HW 5 | | LZ notes |
| Oct 8 | More on universal compression | | Soln 5 | |
| Oct 14 | Universal compression and prediction. Transmission of data: channels | HW 6 | | |
| Oct 15 | Stationary, memoryless channels without feedback | | Soln 6 | |
| Oct 21, 22 | Break | | | |
| Oct 28 | Converse theorem of channel coding | | | |
| Oct 29 | Midterm exam | Midterm | Midterm soln | |
| Nov 4 | KKT conditions for capacity | HW 7 | | |
| Nov 5 | Random coding argument to show achievability of the coding theorem | | Soln 7 | |
| Nov 11 | Channel coding: achievability | HW 8 | | |
| Nov 12 | Differential entropy: definition | | Soln 8 | |
| Nov 18 | | HW 9 | | |
| Nov 19 | | | Soln 9 | |
Textbook
- T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, 2006.
Additional Reading Material
- C. E. Shannon, A Mathematical Theory of Communication, Bell System Technical Journal, 1948.
To Review Basic Probability Theory
- K. L. Chung, A Course in Probability Theory, Academic Press.
- W. Feller, An Introduction to Probability Theory and Its Applications (vol. 1), Wiley.
- G. Grimmett and D. Stirzaker, Probability and Random Processes, Oxford University Press.
- A. Papoulis, Probability, Random Variables, and Stochastic Processes, McGraw-Hill.
- S. M. Ross, A First Course in Probability, Pearson Education.