
Communication and distributional complexity of joint probability mass functions

Jaggi, Sidharth and Effros, Michelle (2004) Communication and distributional complexity of joint probability mass functions. In: International Symposium on Information Theory (ISIT '04), Chicago, IL, 27 June-2 July 2004. IEEE, Piscataway, NJ, p. 362. ISBN 0-7803-8280-3.



The problem of truly-lossless (P_e = 0) distributed source coding [1] requires knowledge of the joint statistics of the sources. In particular, the locations of the zeros of the probability mass functions (pmfs) are crucial for encoding at rates below (H(X), H(Y)) [2]. We consider the distributed computation of the empirical joint pmf P_n of a sequence of random variable pairs observed at physically separated nodes of a network. We consider both worst-case and average measures of information exchange, and we treat both exact calculation of P_n and a notion of approximation. We find that in all cases the communication cost grows linearly with the size of the input. Further, we consider the problem of determining whether the empirical pmf has a zero in a particular location, and we show that in most of the cases considered this too requires a communication cost that is linear in the input size.
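As a point of reference, the quantities the abstract discusses can be sketched in code: the empirical joint pmf P_n of a length-n sequence of pairs, and the zero-location query for a particular symbol pair. This is only an illustrative centralized computation, not the paper's distributed protocol (in the paper the X and Y sequences reside at separated nodes, and the question is how many bits they must exchange); the function and variable names are ours.

```python
from collections import Counter

def empirical_joint_pmf(xs, ys):
    """Empirical joint pmf P_n: fraction of positions i with (x_i, y_i) == (a, b)."""
    assert len(xs) == len(ys), "sequences must be observed in pairs"
    n = len(xs)
    counts = Counter(zip(xs, ys))
    return {pair: c / n for pair, c in counts.items()}

def has_zero_at(pmf, a, b):
    """Zero-location query: does P_n place zero mass on the pair (a, b)?"""
    return pmf.get((a, b), 0.0) == 0.0

# Example: the pair (1, 0) never occurs, so P_n has a zero there.
xs = [0, 0, 1, 1, 0]
ys = [0, 1, 1, 1, 0]
pmf = empirical_joint_pmf(xs, ys)   # {(0,0): 0.4, (0,1): 0.2, (1,1): 0.4}
```

Detecting such zeros is exactly what matters for truly-lossless distributed coding at rates below (H(X), H(Y)), and the paper's result is that answering this query distributively is, in most cases considered, as expensive (up to constants) as communicating the input itself.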

Item Type:Book Section
Additional Information:© Copyright 2004 IEEE. Reprinted with permission. Posted online: 2005-01-10. This work was supported by NSF Grant CCR-0220039 and a grant from the Lee Center for Advanced Networking at Caltech.
Subject Keywords:source coding theory; network information theory
Record Number:CaltechAUTHORS:JAGisit04
Usage Policy:No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code:7435
Deposited By: Archive Administrator
Deposited On:13 Feb 2007
Last Modified:08 Nov 2021 20:42
