An (n,k)-Poisson Multinomial Distribution (PMD) is the distribution of the
sum of n independent random vectors supported on the set B_k = {e_1, …, e_k} of standard basis vectors in R^k. We prove
a structural characterization of these distributions, showing that, for all
ε>0, any (n,k)-Poisson multinomial random vector is
ε-close, in total variation distance, to the sum of a discretized
multidimensional Gaussian and an independent (poly(k/ε),k)-Poisson multinomial random vector. Our structural characterization extends
the multi-dimensional CLT of Valiant and Valiant, by simultaneously applying to
all approximation requirements ε. In particular, it removes factors
depending on log n and, importantly, on the minimum eigenvalue of the
PMD's covariance matrix from the bound on the distance to a multidimensional
Gaussian random variable.
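To make the object of study concrete, here is a minimal sketch of drawing a sample from an (n,k)-PMD directly from the definition above, as a sum of n independent basis-vector-valued random vectors. The function and argument names are illustrative, not from the paper.

```python
import random

def sample_pmd(probs, k, rng=random):
    """Draw one (n, k)-PMD sample: the sum of n independent random
    vectors, each supported on the standard basis vectors e_1, ..., e_k
    of R^k.  probs[i] gives the distribution of the i-th summand over
    the k basis vectors.  (Names here are illustrative.)
    """
    counts = [0] * k
    for p in probs:
        # Pick which basis vector the i-th summand equals...
        j = rng.choices(range(k), weights=p)[0]
        # ...and add that basis vector to the running sum.
        counts[j] += 1
    return counts

# Example: n = 100 summands in k = 3 dimensions, each uniform over B_3.
probs = [[1 / 3, 1 / 3, 1 / 3] for _ in range(100)]
x = sample_pmd(probs, 3)
# The coordinates always sum to n, since every summand is a basis vector.
assert sum(x) == 100
```

Representing the sum as a vector of counts makes it clear that an (n,k)-PMD is supported on the lattice points of the simplex {x in Z^k : x_i >= 0, sum x_i = n}.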
We use our structural characterization to obtain an ε-cover, in
total variation distance, of the set of all (n,k)-PMDs, significantly
improving the cover size of Daskalakis and Papadimitriou, and obtaining the
same qualitative dependence of the cover size on n and ε as the
k=2 cover of Daskalakis and Papadimitriou. We further exploit this structure
to show that (n,k)-PMDs can be learned to within ε in total
variation distance from Õ_k(1/ε²) samples, which is
near-optimal in terms of dependence on ε and independent of n. In
particular, our result generalizes the single-dimensional result of Daskalakis,
Diakonikolas, and Servedio for Poisson Binomials to arbitrary dimension.

Comment: 49 pages, extended abstract appeared in FOCS 201