- Series
- Combinatorics Seminar
- Time
- Friday, April 24, 2009 - 3:00pm for 1 hour (actually 50 minutes)
- Location
- Skiles 255
- Speaker
- Mokshay Madiman – Department of Statistics, Yale University
- Organizer
- Prasad Tetali
We develop an information-theoretic foundation for compound Poisson
approximation and limit theorems (analogous to the corresponding
developments for the central limit theorem and for simple Poisson
approximation). First, sufficient conditions are given under which the
compound Poisson distribution has maximal entropy within a natural
class of probability measures on the nonnegative integers. In
particular, it is shown that a maximum entropy property is valid
if the measures under consideration are log-concave, but that it
fails in general. Second, approximation bounds in the (strong)
relative entropy sense are given for distributional approximation
of sums of independent nonnegative integer-valued random variables
by compound Poisson distributions. The proof techniques involve the
use of a notion of local information quantities that generalize the
classical Fisher information used for normal approximation, as well
as the use of ingredients from Stein's method for compound Poisson
approximation. This work is joint with Andrew Barbour (Zurich),
Oliver Johnson (Bristol) and Ioannis Kontoyiannis (AUEB).
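
For readers less familiar with the objects in the abstract, here is a brief sketch (notation is illustrative, not taken from the talk) of the compound Poisson distribution and the relative entropy used to measure approximation error:

```latex
% The compound Poisson law CP(\lambda, Q) is the distribution of
%   S = \sum_{i=1}^{N} X_i,
% where N ~ Poisson(\lambda) and the X_i are i.i.d. with common law Q
% on the positive integers, independent of N. Explicitly,
\mathrm{CP}(\lambda, Q)\{k\}
  = \sum_{n=0}^{\infty} e^{-\lambda} \frac{\lambda^n}{n!}\, Q^{*n}\{k\},
  \qquad k = 0, 1, 2, \dots
% where Q^{*n} denotes the n-fold convolution of Q.

% Approximation bounds in the abstract are stated in relative entropy:
% for probability measures P and R on the nonnegative integers,
D(P \,\|\, R) = \sum_{k \ge 0} P\{k\} \log \frac{P\{k\}}{R\{k\}}.
% Relative entropy is "strong" in the sense that, by Pinsker's
% inequality, it dominates the squared total variation distance:
\| P - R \|_{TV}^2 \le \tfrac{1}{2}\, D(P \,\|\, R).
```

The simple Poisson case is recovered when Q is a point mass at 1, which is why the program described here parallels the information-theoretic treatments of Poisson approximation and the central limit theorem mentioned in the opening sentence.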