On the Entropy Region of Discrete and Continuous Random Variables and Network Information Theory
- Creators
- Shadbakht, Sormeh
- Hassibi, Babak
Abstract
We show that a large class of network information theory problems can be cast as convex optimization over the convex space of entropy vectors. A vector in 2^(n) - 1 dimensional space is called entropic if each of its entries can be regarded as the joint entropy of a particular subset of n random variables (note that any set of size n has 2^(n) - 1 nonempty subsets). While an explicit characterization of the space of entropy vectors is well-known for n = 2, 3 random variables, it is unknown for n > 3 (which is why most network information theory problems are open). We will construct inner bounds to the space of entropic vectors using tools such as quasi-uniform distributions, lattices, and Cayley's hyperdeterminant.
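To make the definition concrete, here is a minimal sketch (not from the paper) that computes the 2^n - 1 entries of an entropy vector from a given joint distribution: for each nonempty subset S of the n random variables, it marginalizes the joint pmf onto S and evaluates the joint entropy H(X_S). The function name `entropy_vector` and the example pmf are illustrative assumptions.

```python
from itertools import combinations, product
import math

def entropy_vector(pmf, n):
    """Return {S: H(X_S)} for every nonempty subset S of {0, ..., n-1},
    given a joint pmf mapping n-tuples of outcomes to probabilities.
    The result has 2^n - 1 entries, one per nonempty subset."""
    vec = {}
    for k in range(1, n + 1):
        for subset in combinations(range(n), k):
            # Marginalize the joint pmf onto the coordinates in `subset`.
            marg = {}
            for outcome, p in pmf.items():
                key = tuple(outcome[i] for i in subset)
                marg[key] = marg.get(key, 0.0) + p
            # Shannon entropy (in bits) of the marginal distribution.
            vec[subset] = -sum(p * math.log2(p) for p in marg.values() if p > 0)
    return vec

# Example: three i.i.d. fair bits, so H(X_S) = |S| for every subset S.
pmf = {bits: 1 / 8 for bits in product((0, 1), repeat=3)}
v = entropy_vector(pmf, 3)
# v[(0,)] -> 1.0, v[(0, 1)] -> 2.0, v[(0, 1, 2)] -> 3.0
```

The entropic region is exactly the set of vectors obtainable this way as the joint distribution ranges over all distributions on n variables; the paper's inner bounds restrict attention to structured families such as quasi-uniform distributions.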
Additional Information
© 2008 IEEE. Issue Date: 26-29 Oct. 2008. Date of Current Version: 12 June 2009. This work was supported in part by the National Science Foundation through grant CCF-0729203, by the David and Lucile Packard Foundation, by the Office of Naval Research through a MURI under contract no. N00014-08-1-0747, and by Caltech's Lee Center for Advanced Networking.
Attached Files
Published - Shadbakht2008p80662008_42Nd_Asilomar_Conference_On_Signals_Systems_And_Computers_Vols_1-4.pdf
Files
Name | Size
---|---
md5:3cf3f2989c2237dbe82493c8c6dce53d | 150.7 kB
Additional details
- Eprint ID
- 19180
- Resolver ID
- CaltechAUTHORS:20100726-103900559
- NSF
- CCF-0729203
- David and Lucile Packard Foundation
- Office of Naval Research (ONR)
- N00014-08-1-0747
- Caltech Lee Center for Advanced Networking
- Created
- 2010-07-27 (created from EPrint's datestamp field)
- Updated
- 2021-11-08 (created from EPrint's last_modified field)
- Other Numbering System Name
- INSPEC Accession Number
- Other Numbering System Identifier
- 10718960