
An Inequality On Entropy

McEliece, Robert J. and Yu, Zhong (1995) An Inequality On Entropy. In: Proceedings 1995 IEEE International Symposium on Information Theory. IEEE, Piscataway, N.J., p. 329. ISBN 0-7803-2453-6.

Full text is not posted in this repository. Consult Related URLs below.

The entropy H(X) of a discrete random variable X with alphabet size m is always non-negative and upper-bounded by log m. In this paper, we present a theorem that gives a non-trivial lower bound on H(X). We show that for any discrete random variable X with range R = {x_0, …, x_(m−1)}, if p_i = Pr{X = x_i} and p_0 ≥ p_1 ≥ … ≥ p_(m−1), then

H(X) ≥ (2 log m)/(m−1) · Σ_(i=0)^(m−1) i·p_i,

with equality iff (i) X is uniformly distributed, i.e., p_i = 1/m for all i, or, trivially, (ii) p_0 = 1 and p_i = 0 for 1 ≤ i ≤ m−1.
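The stated bound is easy to spot-check numerically. The sketch below (not from the paper; function names are our own) computes H(X) and the right-hand side for a few distributions sorted in non-increasing order, including the two equality cases the abstract names:

```python
import math

def entropy(p):
    """Shannon entropy of distribution p, in bits (same log base as the bound)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mceliece_yu_bound(p):
    """Lower bound (2 log m)/(m-1) * sum_i i*p_i.
    Assumes m >= 2 and p sorted non-increasing: p_0 >= p_1 >= ... >= p_{m-1}."""
    m = len(p)
    return (2 * math.log2(m)) / (m - 1) * sum(i * pi for i, pi in enumerate(p))

# Spot-check the inequality, including both equality cases.
examples = [
    (0.25, 0.25, 0.25, 0.25),  # uniform: equality, both sides equal log2(4) = 2
    (1.0, 0.0, 0.0),           # degenerate p_0 = 1: equality, both sides are 0
    (0.7, 0.2, 0.1),           # strict inequality expected
]
for p in examples:
    assert entropy(p) >= mceliece_yu_bound(p) - 1e-12
```

For the uniform case both sides equal log m, since Σ i/m = (m−1)/2 cancels the (m−1)/2 in the prefactor; for p_0 = 1 both sides vanish.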

Item Type: Book Section
Related URLs:
Additional Information: © 1995 IEEE. Date of Current Version: 06 August 2002. This work was supported by a grant from Pacific Bell.
Other Numbering System: INSPEC Accession Number 5280076
Record Number: CaltechAUTHORS:20120223-100956421
Persistent URL:
Official Citation: McEliece, R. J.; Zhong Yu, "An inequality on entropy," Proceedings 1995 IEEE International Symposium on Information Theory, p. 329, 17-22 Sep 1995. doi: 10.1109/ISIT.1995.550316
Usage Policy: No commercial reproduction, distribution, display or performance rights in this work are provided.
ID Code: 29436
Deposited By: Ruth Sustaita
Deposited On: 23 Feb 2012 18:49
Last Modified: 09 Nov 2021 17:07
