Binomial and Poisson distributions as maximum entropy distributions
@article{Harremos2001BinomialAP,
  title   = {Binomial and Poisson distributions as maximum entropy distributions},
  author  = {Peter Harremo{\"e}s},
  journal = {IEEE Trans. Inf. Theory},
  year    = {2001},
  volume  = {47},
  pages   = {2039-2041},
  url     = {https://api.semanticscholar.org/CorpusID:16171405}
}
The binomial and the Poisson distributions are shown to be maximum entropy distributions of suitably defined sets. Poisson's law is considered as a case of entropy maximization, and also convergence…
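A hedged sketch of the characterizations in question (my paraphrase, not a quotation from the paper): writing B_n(λ) for the set of distributions of sums of n independent Bernoulli random variables with total mean λ, the binomial maximizes entropy within each B_n(λ), and the Poisson law arises as the entropy-maximizing limit as n grows.

% Paraphrased maximum entropy characterizations (the notation B_n(lambda) is assumed, not taken from the paper)
\[
  H\!\bigl(\mathrm{Bin}(n,\lambda/n)\bigr) \;=\; \max_{P \in B_n(\lambda)} H(P),
  \qquad
  H\!\bigl(\mathrm{Po}(\lambda)\bigr) \;=\; \sup_{n}\,\max_{P \in B_n(\lambda)} H(P).
\]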
138 Citations
Poisson's law and information theory
- 2001
Mathematics, Computer Science
Rate of convergence to Poisson law in terms of information divergence
- 2004
Computer Science, Mathematics
The precise bounds on the information divergence from a binomial distribution to the accompanying Poisson law are obtained. As a corollary, an upper bound for the total variation distance between the…
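For orientation (a standard fact, not a statement from this paper): total variation bounds typically follow from divergence bounds through Pinsker's inequality, with total variation taken as half the L1 distance.

% Pinsker's inequality (standard form; the corollary above presumably uses a bound of this type)
\[
  d_{\mathrm{TV}}(P,Q) \;=\; \tfrac{1}{2}\sum_{k}\lvert P(k)-Q(k)\rvert
  \;\le\; \sqrt{\tfrac{1}{2}\,D(P\,\Vert\,Q)}.
\]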
A criterion for the compound Poisson distribution to be maximum entropy
- 2009
Computer Science, Mathematics
It is shown that the compound Poisson distribution does indeed have a natural maximum entropy characterization when the distributions under consideration are log-concave, which complements the recent development by the same authors of an information-theoretic foundation for compound Poisson approximation inequalities and limit theorems.
Nonuniform bounds in the Poisson approximation with applications to informational distances. II
- 2019
Mathematics, Computer Science
This part generalizes the results obtained in Part I and removes any constraints on the parameters of the Bernoulli distributions.
On the Maximum Entropy Properties of the Binomial Distribution
- 2008
Computer Science, Mathematics
It is shown that the Binomial(n, p) distribution maximizes the entropy in the class of ultra-log-concave distributions of order n with fixed mean np, and that the entropy never decreases along the iterations of an associated Markov chain.
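For context, a commonly used definition of the ultra-log-concave class mentioned here (my sketch, not quoted from the paper): a distribution p on {0, 1, …, n} is ultra-log-concave of order n when the ratio p_k / C(n, k) is log-concave in k.

% Ultra-log-concavity of order n (standard definition; sketch, not from the cited abstract)
\[
  \left(\frac{p_k}{\binom{n}{k}}\right)^{\!2}
  \;\ge\;
  \frac{p_{k-1}}{\binom{n}{k-1}}\cdot\frac{p_{k+1}}{\binom{n}{k+1}},
  \qquad 1 \le k \le n-1.
\]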
ENTROPY AND THE ‘COMPOUND’ LAW OF SMALL NUMBERS
- 2008
Mathematics
An information-theoretic foundation for compound Poisson approximation limit theorems is presented, in analogy to the corresponding developments for the central limit theorem and for simple Poisson…
A Nash Equilibrium related to the Poisson Channel
- 2003
Computer Science, Mathematics
An information theoretical game is considered where both signal and noise are generalized Bernoulli sums with upper bounds on their mean values. It is shown that a pair of Poisson distributions is a…
Moments, Concentration, and Entropy of Log-Concave Distributions
- 2022
Mathematics
We utilize and extend a simple and classical mechanism, combining log-concavity and majorization in the convex order to derive moments, concentration, and entropy inequalities for certain classes of…
18 References
Some Elementary Results on Poisson Approximation in a Sequence of Bernoulli Trials
- 1978
Mathematics
In a finite series of independent success-failure trials, the total number of successes has a binomial probability distribution. It is a classical result that this probability distribution is subject to…
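As a minimal numerical illustration of this classical approximation (my sketch, not code from the paper; the parameter λ, the truncation kmax, and the helper names are assumptions), the total variation distance between Binomial(n, λ/n) and Poisson(λ) can be seen to shrink as n grows:

```python
# Sketch: total variation distance between Binomial(n, lam/n) and Poisson(lam).
# Illustrative only; lam, kmax, and the helper names are assumptions, not from the paper.
from math import comb, exp, factorial

def binom_pmf(n, p, k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(lam, k):
    return exp(-lam) * lam**k / factorial(k)

def tv_distance(n, lam, kmax=100):
    # half the L1 distance between the two pmfs, truncated at kmax
    p = lam / n
    return 0.5 * sum(abs(binom_pmf(n, p, k) - poisson_pmf(lam, k))
                     for k in range(kmax + 1))

for n in (10, 100, 1000):
    print(n, tv_distance(n, lam=2.0))  # the distance decreases roughly like 1/n
```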
Entropy of the Sum of Independent Bernoulli Random Variables and of the Multinomial Distribution
- 1981
Mathematics
A general Poisson approximation theorem
- 1982
Mathematics
We study the rate of convergence in a limit theorem due to Kabanov-Liptser-Shiryayev. We show how the probabilities P(N_t = k) can be computed from the compensator, when it is deterministic.
Refinements of Yu. V. Prokhorov's theorems on the asymptotic behavior of the binomial distribution
- 1987
Mathematics
One finds the asymptotic behavior of the minimax distance between the binomial and its approximating normal or Poisson “distributions,” taken with the second terms of their expansions.
ENTROPY AND THE CENTRAL LIMIT THEOREM
- 1986
Mathematics
An argument of Brown (1982) is extended to show that the Fisher information converges to the reciprocal of the variance.
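For context (a standard fact, not a claim from this abstract), the limit value is the one singled out by the Cramér–Rao bound: the location Fisher information of a density is at least the reciprocal of its variance, with equality only in the Gaussian case.

% Cramer-Rao-type bound (standard; X has density f and variance sigma^2)
\[
  J(X) \;=\; \int \frac{f'(x)^{2}}{f(x)}\,dx \;\ge\; \frac{1}{\sigma^{2}},
\]
with equality if and only if X is Gaussian.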
Game Theoretical Equilibrium, Maximum Entropy and Minimum Information Discrimination
- 1993
Physics
Games are considered which rely on the concept of a code and which focus on the interplay between the observer and the system being observed. The games lead to specific principles of Game Theoretical…
$I$-Divergence Geometry of Probability Distributions and Minimization Problems
- 1975
Mathematics
Le Cam's inequality and Poisson approximations
- 1994
Mathematics
where λ = p_1 + p_2 + ⋯ + p_n. Naturally, this inequality contains the classical Poisson limit law (just set p_i = λ/n and note that the right side simplifies to 2λ²/n), but it also achieves a great…
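For reference, a standard statement of the inequality discussed in that snippet (my transcription; notation assumed): for S_n a sum of independent Bernoulli(p_i) variables with λ = Σ_i p_i,

% Le Cam's inequality (standard statement; transcription mine, not quoted from the paper)
\[
  \sum_{k=0}^{\infty}
  \Bigl|\,\Pr(S_n = k) - \frac{\lambda^{k} e^{-\lambda}}{k!}\,\Bigr|
  \;\le\; 2\sum_{i=1}^{n} p_i^{2},
\]
so that taking p_i = λ/n for every i gives the 2λ²/n bound quoted above.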