Papers to study
E. T. Jaynes, Concentration of Distributions at Entropy Maxima
P. K. Newton and S. A. DeSalvo, The Shannon entropy of Sudoku matrices
I. Csiszar, Information Theoretic Methods in Probability and Statistics
H. Stern and T. Cover, Maximum Entropy and the Lottery
S. Diggavi and T. Cover, The Worst Additive Noise Under a Covariance Constraint
O. Rioul, A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information
J. M. Van Campenhout and T. M. Cover, Maximum Entropy and Conditional Probability
C. Villani, A Short Proof of the “Concavity of Entropy Power”
D. Chafai, Gaussian Maximum of Entropy and Reversed Log-Sobolev Inequality
Reading Material
S. M. Ali and S. D. Silvey, A General Class of Coefficients of Divergence of One Distribution from Another
I. Csiszar, Axiomatic Characterizations of Information Measures
F. Österreicher, Csiszar’s f-divergences – Basic Properties
M. M. Deza and E. Deza, Encyclopedia of Distances, Ch. 14
A. Dembo, T. Cover, and J. Thomas, Information Theoretic Inequalities
A. J. Stam, Some Inequalities Satisfied by the Quantities of Information of Fisher and Shannon
N. M. Blachman, The Convolution Inequality for Entropy Powers
Presentations
Schedule: Wednesday, April 20
Room: INF 213 until 12:00, INM 203 after 12:00
Time  | Speaker                | Paper
10:00 | Karim ALI              | A. Barron, Entropy and the Central Limit Theorem
10:30 | Marc DESGROSEILLERS    | C. Villani, A Short Proof of the "Concavity of Entropy Power"
11:00 | Patrick FARNOLE        | E. T. Jaynes, Concentration of Distributions at Entropy Maxima
11:30 | Anastasiya TYCHINSKAYA | P. K. Newton and S. A. DeSalvo, The Shannon entropy of Sudoku matrices
12:00 | Lyudmila YARTSEVA      | J. Aczel, Measuring information beyond communication theory: Why some generalized information measures may be useful, others not
12:30 | Mohamed KAFSI          | L. Ekroot and T. Cover, The entropy of Markov trajectories
13:00 | Saurabh DESHPANDE      | O. Rioul, A Simple Proof of the Entropy-Power Inequality via Properties of Mutual Information