| | |
|---|---|
| Instructor | Emre Telatar |
| Office | INR 117 |
| Phone | +41 21 69 37693 |
| Email | [email protected] |
| Office Hours | By appointment |
| Teaching Assistant | Mert Bora Dogan |
| Phone | +41 21 69 38790 |
| Office | INR 034 |
| Email | [email protected] |
| Office Hours | By appointment |
| Teaching Assistant | Marco Bondaschi |
| Phone | +41 21 69 35139 |
| Office | INR 111 |
| Email | [email protected] |
| Office Hours | By appointment |
| Teaching Assistant | Lifu Jin |
| Email | [email protected] |
| Office Hours | By appointment |
| Lectures | Monday 11:00 – 13:00 (Room: BC 01) |
| | Tuesday 13:00 – 15:00 (Room: ELA 2) |
| Exercises | Tuesday 15:00 – 17:00 (Room: ELA 2) |
| Language | English |
| Credits | 8 ECTS |
Course Information
See the course information.
Special Announcements
The weekly homework assignments are not graded. The graded homework will be announced explicitly.
Midterm Exam
- Tuesday, November 1st, 13h15 – 16h00, ELA2.
- The midterm is worth 40% of the grade.
- The exam covers the material up to and including the lecture of October 18.
- You are allowed to bring ONE A4 cheat sheet; both sides may be written on.
- Bring your CAMIPRO card or a valid ID with a visible photo.
Graded Homework
- Due Monday, December 12th, 23:59.
- The graded homework is worth 10% of the grade.
Final Exam
- Friday, January 27th, 15h15 – 18h15, CM1120.
- The final exam is worth 50% of the grade.
- You are allowed to bring TWO A4 cheat sheets; both sides may be written on.
- Bring your CAMIPRO card or a valid ID with a visible photo.
Detailed Schedule
| Date | Topics Covered | Assignment | Solutions | Remarks |
|---|---|---|---|---|
| Sep 19 | Public Holiday | | | |
| Sep 20 | Introduction to Source Coding; Non-singular codes; Uniquely decodable codes; Prefix-free codes; Kraft’s inequality for prefix-free codes; Kraft’s sum for products of codes; Kraft’s inequality for non-singular codes | hw1.pdf | sol1.pdf | |
| Sep 26 | Reverse Kraft’s inequality; Kullback-Leibler divergence; Expected codeword length; Upper and lower bounds to the expected codeword length | | | |
| Sep 27 | Expected codeword length per symbol; Entropy; Optimal codes and their properties; Huffman procedure (see the sketch after this table) | hw2.pdf | sol2.pdf | |
| Oct 3 | Conditional entropy; Chain rule for entropy; (Conditional) mutual information; Chain rule for mutual information; Data processing inequality | | | |
| Oct 4 | Upper and lower bounds to the entropy; Fano’s inequality; Entropy rate; Entropy rate for stationary processes | hw3.pdf | sol3.pdf | |
| Oct 10 | Compression theorem for stationary processes; Compression of IID processes; Typicality theorems | | | |
| Oct 11 | Asymptotic equipartition property; Source coding via typicality | hw4.pdf | sol4.pdf | |
| Oct 17 | Invertible finite state machines, information lossless (IL) machines; Lower bound to the output length for IL FSMs | | | |
| Oct 18 | Lempel-Ziv algorithm; Upper bound to the output length for Lempel-Ziv; Minimax regret | hw5.pdf | sol5.pdf | |
| Oct 24 | Communication channels; Memoryless channels, stationary channels, feedback; Multi-letter Fano’s inequality; Channel coding converse | | | |
| Oct 25 | KKT conditions for capacity-achieving input distribution | hw6.pdf | sol6.pdf | |
| Oct 31 | KKT conditions for capacity-achieving input distribution; Computation of capacity | | | |
| Nov 1 | Midterm | | | |
| Nov 7 | Channel coding theorem and proofs by random constructions | | | |
| Nov 8 | Differential entropy and other information measures; Relationship between H(X) and h(X) | hw7.pdf | sol7.pdf | |
| Nov 14 | Differential information measures and quantization; Other properties of information measures for continuous distributions; Entropy maximizing distributions; Capacity of the additive Gaussian noise channel; Saddle point property of Gaussian distributions; Discrete time additive Gaussian noise channel and waveform channel | | | |
| Nov 15 | Parallel Gaussian Channels, water-filling; Bandlimited AWGN Channels | hw8.pdf | sol8.pdf | |
| Nov 21 | Rate-Distortion Theory | | | |
| Nov 22 | Rate-Distortion Theory; Distributed source coding | hw9.pdf | sol9.pdf | |
| Nov 28 | Linear Codes; Generator and Parity Check Matrices | | | |
| Nov 29 | Hamming Codes; Hamming Weight; Hamming Distance | hw10.pdf | sol10.pdf | |
| Dec 5 | Sphere-packing bound; Singleton Bound; Gilbert-Varshamov Bound; Codes over non-binary fields | | | |
| Dec 6 | Reed-Solomon codes | hw11.pdf | sol11.pdf | |
| Dec 12 | Polar codes | | | |
| Dec 13 | Polar codes | hw12.pdf | sol12.pdf | |
| Dec 19 | Polar codes | | | |
| Dec 20 | Polar codes | hw13.pdf | sol13.pdf | |
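As a small illustration of the Huffman procedure listed under the Sep 27 lecture, here is a minimal Python sketch (not part of the official course material). The symbol probabilities are made up for the example; the procedure repeatedly merges the two least likely nodes, prepending a bit to each codeword in the merged subtrees.

```python
import heapq

def huffman_code(pmf):
    """Build a binary prefix-free code for a {symbol: probability} map
    via the Huffman procedure (merge the two least likely nodes)."""
    # Heap entries: (probability, tie-breaker, {symbol: codeword-so-far}).
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(pmf.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p0, _, c0 = heapq.heappop(heap)  # least likely node
        p1, _, c1 = heapq.heappop(heap)  # second least likely node
        merged = {s: "0" + w for s, w in c0.items()}
        merged.update({s: "1" + w for s, w in c1.items()})
        heapq.heappush(heap, (p0 + p1, tie, merged))
        tie += 1
    return heap[0][2]

if __name__ == "__main__":
    pmf = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}  # example distribution
    code = huffman_code(pmf)
    avg_len = sum(pmf[s] * len(w) for s, w in code.items())
    print(code)     # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
    print(avg_len)  # 1.75 bits/symbol, which equals the entropy for this pmf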
Textbook
- T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley, 2006.
Additional Reading Material
- C. E. Shannon, A Mathematical Theory of Communication, Bell System Technical Journal, 1948.
- Notes on Tunstall Coding.
To Review Basic Probability Theory
- K. L. Chung, A Course in Probability Theory, Academic Press.
- W. Feller, An Introduction to Probability Theory and Its Applications, Wiley.
- G. Grimmett and D. Stirzaker, Probability and Random Processes, Oxford University Press.
- A. Papoulis, Probability, Random Variables, and Stochastic Processes, McGraw-Hill.
- S. M. Ross, A First Course in Probability, Pearson Education.