Recent Results and Ongoing Research
Algebraic Network Information Theory
- Classical information-theoretic (achievability) proofs are based on an ingenious application of the probabilistic method called the random coding argument: codes are drawn i.i.d. at random from a judiciously chosen distribution. Since the best code can only outperform the average, this directly implies the existence of a code whose performance is at least as good as the average performance of a randomly drawn code (see the one-line averaging step after this list).
- A first inkling that, for general communication scenarios, this argument is not sufficiently powerful was gleaned by Körner and Marton in 1979, who studied a special distributed source coding problem with two encoders (observing correlated binary sequences) and a decoder whose goal is to recover the modulo-2 sum of the two binary sequences (a toy version of their linear-code trick appears in the sketch after this list).
- The starting point of our research is our work on computation over multiple-access channels (starting with our paper at the 2005 Allerton Conference on Communication, Control, and Computing), which led to “compute-and-forward.” In this work, we realized that in some scenarios it is crucial that the codes have algebraic structure – this enables substantially better performance.
- Building upon these results, we are now working towards a full Network Information Theory based upon algebraically structured codes.
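To make the two ingredients above concrete: the random coding argument rests on nothing more than the fact that the minimum never exceeds the mean,

$$ \min_{c}\, P_e(c) \;\le\; \mathbb{E}\big[P_e(\mathcal{C})\big], $$

so whenever the expected error probability over the random code ensemble is small, some fixed code in the ensemble is at least as good. The Körner–Marton problem shows what this argument misses. Here is a minimal sketch in Python (the block length, message length, and random choice of the matrix A are illustrative; the actual scheme requires A to be a good linear source code for the modulo-2 sum): if both encoders apply the same linear map over GF(2), the XOR of their two messages equals the message for the XOR of their sources, so the decoder can operate on the sum directly, without ever learning the individual sequences.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 12, 8                            # source block length, message length (illustrative)
A = rng.integers(0, 2, size=(m, n))     # one binary matrix, shared by BOTH encoders

x = rng.integers(0, 2, size=n)          # encoder 1's binary source block
y = rng.integers(0, 2, size=n)          # encoder 2's binary source block

sx = (A @ x) % 2                        # encoder 1 transmits A x over GF(2)
sy = (A @ y) % 2                        # encoder 2 transmits A y over GF(2)

# Linearity: the XOR of the two messages is the message of the XOR of the
# sources, so the decoder can work with z = x XOR y directly.
z = (x + y) % 2
assert np.array_equal((sx + sy) % 2, (A @ z) % 2)
```

Two independently drawn random codebooks have no such shared linearity, which is precisely where unstructured random coding loses out in this problem.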
Main Research Results
Network Information Theory
- A simple alternative proof and sharpening of the celebrated “interference alignment” result.
  B. Nazer, M. Gastpar, S. A. Jafar, and S. Vishwanath. Ergodic interference alignment. IEEE Transactions on Information Theory, 58(10):6355–6371, October 2012.
- The fourth fundamental relaying technique, “compute-and-forward.” (The previous three are “decode-and-forward,” “compress-and-forward/noisy network coding,” and “amplify-and-forward.”) A toy illustration follows this list.
  B. Nazer and M. Gastpar. Compute-and-forward: Harnessing interference through structured codes. IEEE Transactions on Information Theory, 57(10):6463–6486, October 2011. (2013 Communications Society & Information Theory Society Joint Paper Award)
- Cognitive radio networks may have to observe constraints on the interference power they generate onto other networks. We studied the resulting information-theoretic capacity problem. Surprisingly, for some networks that are notoriously difficult under standard transmit power constraints, the capacity can be found in closed form under such interference power constraints.
  M. Gastpar. On capacity under receive and spatial spectrum-sharing constraints. IEEE Transactions on Information Theory, 53(2):471–487, February 2007.
- General relay protocols for relay networks.
  G. Kramer, M. Gastpar, and P. Gupta. Cooperative strategies and capacity theorems for relay networks. IEEE Transactions on Information Theory, 51(9):3037–3063, September 2005.
- An example showing that separating source coding from channel coding can be suboptimal in a scaling-law sense.
  M. Gastpar and M. Vetterli. Source-channel communication in sensor networks. Lecture Notes in Computer Science, vol. 2634, Springer: New York, April 2003, pp. 162–177.
  (Later, we found a tight result: M. Gastpar. Uncoded transmission is exactly optimal for a simple Gaussian sensor network. IEEE Transactions on Information Theory, 54(11):5247–5251, November 2008.)
- The significance of the “amplify-and-forward” relaying protocol in large relay networks.
  M. Gastpar and M. Vetterli. On the capacity of wireless networks: The relay case. In Proc. IEEE Infocom 2002, New York, June 2002.
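As a toy illustration of compute-and-forward (a sketch only: it uses the integer lattice Z^n and omits the nested-lattice shaping, dithering, and scaling coefficients of the actual scheme), a relay that receives the noisy sum of two lattice codewords can decode the integer combination itself, because a lattice is closed under addition:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 8
t1 = rng.integers(-4, 5, size=n).astype(float)  # user 1's codeword (a point of Z^n, illustrative)
t2 = rng.integers(-4, 5, size=n).astype(float)  # user 2's codeword
noise = 0.1 * rng.standard_normal(n)            # additive channel noise

y = t1 + t2 + noise     # the relay observes the noisy sum over the multiple-access channel
combo = np.round(y)     # for Z^n, lattice decoding reduces to rounding

# t1 + t2 is again a lattice point, so the relay can decode the sum reliably
# at noise levels where decoding t1 and t2 individually may be hopeless.
assert np.array_equal(combo, t1 + t2)
```

The relay forwards the decoded combination; once the destination collects enough linearly independent integer combinations from different relays, it can solve for the individual messages.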
Distributed Signal Processing
- In compressed sensing, a fundamental task is to recover the sparsity pattern from noisy random projections. We show that for interesting sampling rates, exact recovery is not possible, and we characterize the tradeoff between approximate recovery and the sampling rate (a toy recovery sketch follows this list).
  G. A. Reeves and M. C. Gastpar. The sampling rate-distortion tradeoff for sparsity pattern recovery in compressed sensing. IEEE Transactions on Information Theory, 58(5):3065–3092, 2012.
- A fundamental task of signal processing is to extract principal components from high-dimensional data. In a distributed setting, each sensor observes only part of the high-dimensional data, and the principal components have to be extracted partially in a local fashion. To solve this problem, we developed the “distributed Karhunen-Loève transform.”
  M. Gastpar, P. L. Dragotti, and M. Vetterli. The distributed Karhunen-Loeve transform. IEEE Transactions on Information Theory, 52(12):5177–5196, December 2006.
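As a toy version of the sparsity-pattern recovery task from the first item above (the dimensions, the Gaussian measurement matrix, and the use of a greedy orthogonal-matching-pursuit recovery are illustrative choices, not the estimators analyzed in the paper):

```python
import numpy as np

def omp_support(A, y, k):
    """Greedy (OMP-style) estimate of a k-sparse support from y = A x + noise."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ residual))))   # most correlated column
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)  # refit on chosen columns
        residual = y - A[:, idx] @ coef
    return set(idx)

rng = np.random.default_rng(0)
n, m, k = 256, 96, 8                            # ambient dimension, number of samples, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)    # random projection (measurement) matrix
true_support = set(rng.choice(n, size=k, replace=False).tolist())
x = np.zeros(n)
x[list(true_support)] = 3.0 * rng.choice([-1.0, 1.0], size=k)  # well-separated nonzeros
y = A @ x + 0.1 * rng.standard_normal(m)                       # noisy random projections

estimate = omp_support(A, y, k)
print(f"fraction of the sparsity pattern recovered: {len(estimate & true_support) / k:.2f}")
```

Sweeping the number of samples m in this toy traces out exactly the kind of sampling-rate versus recovery-distortion tradeoff that the paper characterizes.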
Statistical Neuroscience
- We develop approaches to deal with large data sets. Two current examples of ongoing studies (a minimal estimator sketch follows this list):
  K. So, A. C. Koralek, K. Ganguly, M. C. Gastpar, and J. M. Carmena. Assessing functional connectivity of neural ensembles using directed information. Journal of Neural Engineering, 9, 2012.
  K. So, K. Ganguly, J. Jimenez, M. C. Gastpar, and J. M. Carmena. Redundant information encoding in primary motor cortex during natural and prosthetic motor control. Journal of Computational Neuroscience, 32(3):555–561, 2012.
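As a minimal sketch of the directed-information idea from the first study (an illustrative simplification under a stationarity assumption, not the estimator used in the paper): directed information replaces the symmetric mutual information with a causally conditioned quantity, here approximated by the conditional mutual information I(X_{i-1}, X_i; Y_i | Y_{i-1}).

```python
import numpy as np
from collections import Counter

def plug_in_cmi(x, y, z):
    """Plug-in estimate of I(X; Y | Z) in bits from paired discrete samples."""
    n = len(x)
    pxyz = Counter(zip(x, y, z))
    pxz, pyz, pz = Counter(zip(x, z)), Counter(zip(y, z)), Counter(z)
    return sum((c / n) * np.log2((c / n) * (pz[k[2]] / n)
                                 / ((pxz[(k[0], k[2])] / n) * (pyz[(k[1], k[2])] / n)))
               for k, c in pxyz.items())

def directed_info_rate(x, y, order=1):
    """Finite-memory proxy for the directed information rate from x to y."""
    idx = range(order, len(x))
    xs = [tuple(x[i - order:i + 1]) for i in idx]  # causal X context, including the present
    ys = [y[i] for i in idx]                       # present Y sample
    yp = [tuple(y[i - order:i]) for i in idx]      # Y past only
    return plug_in_cmi(xs, ys, yp)

# Toy check: y is a noisy one-step-delayed copy of x, so influence flows x -> y only.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=20000)
y = np.roll(x, 1) ^ (rng.random(x.size) < 0.1)     # y_i = x_{i-1} with 10% bit flips
print(f"I(X->Y) ~ {directed_info_rate(x.tolist(), y.tolist()):.3f} bits")
print(f"I(Y->X) ~ {directed_info_rate(y.tolist(), x.tolist()):.3f} bits")
```

On this toy pair, the estimate is large in the causal direction and near zero in the reverse direction; this asymmetry is what makes directed information a natural tool for assessing functional connectivity.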
Research Meetings that we (co-)organized
2012
- Workshop on Methods of Information Theory in Computational Neuroscience held at CNS*2012, Atlanta/Decatur, GA, July 25-26, 2012
2011
- Workshop on Algebraic Structure in Network Information Theory held at the Banff International Research Station (BIRS), Banff, Canada, August 14-19, 2011
- Tutorial on Algebraic Structure in Network Information Theory held at the 2011 International Symposium on Information Theory, St. Petersburg, Russia, July 31 – August 5, 2011
- Workshop on Methods of Information Theory in Computational Neuroscience held at CNS*2011, Stockholm, Sweden, July 27-28, 2011