Binomial and Poisson distributions as maximum entropy distributions

@article{Harremos2001BinomialAP,
  title={Binomial and Poisson distributions as maximum entropy distributions},
  author={Peter Harremo{\"e}s},
  journal={IEEE Trans. Inf. Theory},
  year={2001},
  volume={47},
  pages={2039--2041},
  url={https://api.semanticscholar.org/CorpusID:16171405}
}
The binomial and the Poisson distributions are shown to be maximum entropy distributions of suitably defined sets. Poisson's law is considered as a case of entropy maximization, and also convergence

Poisson's law and information theory

    P. Harremoës
    Mathematics, Computer Science
  • 2001
The binomial and the Poisson distributions are shown to be maximum entropy distributions of suitably defined sets. Poisson's law is considered as a case of entropy maximization, and also convergence

Rate of convergence to Poisson law in terms of information divergence

The precise bounds on the information divergence from a binomial distribution to the accompanying Poisson law are obtained. As a corollary, an upper bound for the total variation distance between the
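As a quick numerical illustration of this convergence (a sketch of my own, not taken from the paper; the function name and the choice λ = 2 are illustrative), the information divergence D(Bin(n, λ/n) ‖ Po(λ)) can be computed exactly in log space and seen to shrink rapidly with n:

```python
import math

def kl_binomial_poisson(n, lam):
    """D(Bin(n, lam/n) || Poisson(lam)) in nats, summed over k = 0..n."""
    p = lam / n
    div = 0.0
    for k in range(n + 1):
        # log of the Binomial(n, p) pmf at k, via log-gamma to avoid overflow
        log_b = (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
                 + k * math.log(p) + (n - k) * math.log1p(-p))
        # log of the Poisson(lam) pmf at k
        log_q = -lam + k * math.log(lam) - math.lgamma(k + 1)
        div += math.exp(log_b) * (log_b - log_q)
    return div

for n in (10, 100, 1000):
    print(n, kl_binomial_poisson(n, 2.0))
```

The divergence is summed only over 0..n because the binomial places no mass beyond n, so those terms vanish.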

A criterion for the compound poisson distribution to be maximum entropy

It is shown that the compound Poisson does indeed have a natural maximum entropy characterization when the distributions under consideration are log-concave, which complements the recent development by the same authors of an information-theoretic foundation for compound Poisson approximation inequalities and limit theorems.

Nonuniform bounds in the Poisson approximation with applications to informational distances. II

This part generalizes the results obtained in Part I and removes any constraints on the parameters of the Bernoulli distributions.

On the Maximum Entropy Properties of the Binomial Distribution

It is shown that the Binomial(n,p) distribution maximizes the entropy in the class of ultra-log-concave distributions of order n with fixed mean np, and that the entropy never decreases along the iterations of an associated Markov chain.
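A special case of this maximum entropy property can be checked numerically: sums of independent Bernoulli variables are ultra-log-concave of order n, and among such sums with a fixed mean the binomial (all success probabilities equal) has the largest entropy. A minimal sketch, with arbitrary illustrative probabilities of my own choosing:

```python
import math

def bernoulli_sum_pmf(ps):
    """Exact pmf of S = X_1 + ... + X_n for independent X_i ~ Bernoulli(p_i)."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            new[k] += mass * (1 - p)      # trial fails
            new[k + 1] += mass * p        # trial succeeds
        pmf = new
    return pmf

def entropy(pmf):
    """Shannon entropy in nats."""
    return -sum(m * math.log(m) for m in pmf if m > 0)

ps = [0.1, 0.5, 0.3, 0.7]               # unequal success probabilities
p_bar = sum(ps) / len(ps)               # Bin(n, p_bar) has the same mean
h_mixed = entropy(bernoulli_sum_pmf(ps))
h_binom = entropy(bernoulli_sum_pmf([p_bar] * len(ps)))
print(h_mixed, h_binom)                 # the binomial entropy is the larger
```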

ENTROPY AND THE ‘COMPOUND’ LAW OF SMALL NUMBERS

An information-theoretic foundation for compound Poisson approximation limit theorems is presented, in analogy to the corresponding developments for the central limit theorem and for simple Poisson

A Nash Equilibrium related to the Poisson Channel

An information theoretical game is considered where both signal and noise are generalized Bernoulli sums with upper bounds on their mean values. It is shown that a pair of Poisson distributions is a

Moments, Concentration, and Entropy of Log-Concave Distributions

We utilize and extend a simple and classical mechanism, combining log-concavity and majorization in the convex order to derive moments, concentration, and entropy inequalities for certain classes of ...

Some Elementary Results on Poisson Approximation in a Sequence of Bernoulli Trials

In a finite series of independent success-failure trials, the total number of successes has a binomial probability distribution. It is a classical result that this probability distribution is subje...

A general poisson approximation theorem

We study the rate of convergence in a limit theorem due to Kabanov-Liptser-Shiryayev. We show how the probabilities P(N_t = k) can be computed from the compensator, when it is deterministic.

Refinements of Yu. V. Prokhorov's theorems on the asymptotic behavior of the binomial distribution

One finds the asymptotic behavior of the minimax distance between the binomial and its approximating normal or Poisson “distributions,” taken with the second terms of their expansions.

ENTROPY AND THE CENTRAL LIMIT THEOREM

An argument of Brown (1982) is extended to show that the Fisher informations converge to the reciprocal of the variance.

Game Theoretical Equilibrium, Maximum Entropy and Minimum Information Discrimination

Games are considered which rely on the concept of a code and which focus on the interplay between the observer and the system being observed. The games lead to specific principles of Game Theoretical

$I$-Divergence Geometry of Probability Distributions and Minimization Problems


Le Cam's inequality and poisson approximations

where λ = p_1 + p_2 + ⋯ + p_n. Naturally, this inequality contains the classical Poisson limit law (just set p_i = λ/n and note that the right side simplifies to 2λ²/n), but it also achieves a great
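Le Cam's inequality can be checked numerically by computing the exact distribution of a Bernoulli sum by convolution and comparing it to the matching Poisson law. A sketch under my own naming; the probabilities in `ps` are arbitrary illustrative values:

```python
import math

def bernoulli_sum_pmf(ps):
    """Exact pmf of S = X_1 + ... + X_n for independent X_i ~ Bernoulli(p_i)."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            new[k] += mass * (1 - p)      # trial fails
            new[k + 1] += mass * p        # trial succeeds
        pmf = new
    return pmf

def le_cam_gap(ps):
    """Return (L1 distance to Poisson(sum ps), Le Cam bound 2 * sum p_i^2)."""
    lam = sum(ps)
    pmf = bernoulli_sum_pmf(ps)
    poisson = [math.exp(-lam) * lam**k / math.factorial(k)
               for k in range(len(pmf))]
    l1 = sum(abs(b - q) for b, q in zip(pmf, poisson))
    l1 += max(0.0, 1.0 - sum(poisson))    # Poisson mass beyond n counts in full
    return l1, 2 * sum(p * p for p in ps)

l1, bound = le_cam_gap([0.1, 0.2, 0.05, 0.3])
print(l1, bound)                          # the L1 distance stays below the bound
```

The distance is taken in the unhalved L1 convention, matching the factor 2 on the right-hand side above.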