Jaggi, Sidharth and Effros, Michelle (2004) Communication and distributional complexity of joint probability mass functions. In: International Symposium on Information Theory (ISIT '04), Chicago, IL, 27 June–2 July 2004. IEEE, Piscataway, NJ, p. 362. ISBN 0-7803-8280-3. http://resolver.caltech.edu/CaltechAUTHORS:JAGisit04
The problem of truly lossless (Pe = 0) distributed source coding requires knowledge of the joint statistics of the sources. In particular, the locations of the zeroes of the probability mass functions (pmfs) are crucial for encoding at rates below (H(X), H(Y)). We consider the distributed computation of the empirical joint pmf Pn of a sequence of random variable pairs observed at physically separated nodes of a network. We consider both worst-case and average measures of information exchange, and treat both exact calculation of Pn and a notion of approximation. We find that in all cases the communication cost grows linearly with the size of the input. Further, we consider the problem of determining whether the empirical pmf has a zero in a particular location, and show that in most cases considered this also requires a communication cost that is linear in the input size.
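To make the objects in the abstract concrete, the following is an illustrative sketch (not the paper's protocol) of the empirical joint pmf Pn and the zero-location query. The "distributed" aspect is modeled in the most naive way: one node ships its entire length-n observation to the other, which costs n symbols of communication — linear in the input size, consistent with the growth rate the abstract reports. All names here (`empirical_joint_pmf`, `has_zero_at`) are hypothetical.

```python
from collections import Counter

def empirical_joint_pmf(xs, ys):
    """Empirical joint pmf P_n of the pair sequence ((x_1,y_1),...,(x_n,y_n)).

    In the distributed setting imagined here, node X would first transmit
    xs (n symbols) to node Y, which then runs this computation locally --
    a communication cost linear in n.
    """
    assert len(xs) == len(ys)
    n = len(xs)
    counts = Counter(zip(xs, ys))  # joint type: count of each (x, y) pair
    return {pair: c / n for pair, c in counts.items()}

# Hypothetical observed sequences at the two nodes.
xs = [0, 0, 1, 1, 0]
ys = [0, 1, 1, 0, 0]
pmf = empirical_joint_pmf(xs, ys)

def has_zero_at(x, y):
    """Zero-location query: is P_n(x, y) = 0 for this particular (x, y)?"""
    return pmf.get((x, y), 0.0) == 0.0
```

For example, `pmf[(0, 0)]` is 2/5 here, and `has_zero_at(2, 2)` is true since the pair (2, 2) never occurs. The abstract's point is that even answering such a single zero-location query, in most of the settings considered, cannot be done with communication sublinear in n.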
Item Type: Book Section
Additional Information: © Copyright 2004 IEEE. Reprinted with permission. Posted online: 2005-01-10. This work was supported by NSF Grant CCR-0220039 and a grant from the Lee Center for Advanced Networking at Caltech.
Subject Keywords: source coding theory; network information theory
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
Deposited By: Archive Administrator
Deposited On: 13 Feb 2007
Last Modified: 22 Jun 2016 23:07