A Size-Free CLT for Poisson Multinomials and its Applications
An $(n,k)$-Poisson Multinomial Distribution (PMD) is the distribution of the
sum of $n$ independent random vectors supported on the set of standard basis
vectors in $\mathbb{R}^k$. We show that any $(n,k)$-PMD is
$\mathrm{poly}(k/\sigma)$-close in total variation distance to the
(appropriately discretized) multi-dimensional Gaussian with the same first two
moments (where $\sigma^2$ is the minimum eigenvalue of the PMD's covariance
matrix), removing the dependence on $n$ from
the Central Limit Theorem of Valiant and Valiant. Interestingly, our CLT is
obtained by bootstrapping the Valiant-Valiant CLT itself through the structural
characterization of PMDs shown in recent work by Daskalakis, Kamath, and
Tzamos. In turn, our stronger CLT can be leveraged to obtain an efficient PTAS
for approximate Nash equilibria in anonymous games, significantly improving the
state of the art, and matching qualitatively the running time dependence on
$n$ and $1/\varepsilon$ of the best known algorithm for two-strategy anonymous
games. Our new CLT also enables the construction of covers for the set of
$(n,k)$-PMDs, which are proper and whose size is shown to be essentially
optimal. Our cover construction combines our CLT with the Shapley-Folkman
theorem and recent sparsification results for Laplacian matrices by Batson,
Spielman, and Srivastava. Our cover size lower bound is based on an algebraic
geometric construction. Finally, leveraging the structural properties of the
Fourier spectrum of PMDs, we show that these distributions can be learned from
$\widetilde{O}_k(1/\varepsilon^2)$ samples in $\mathrm{poly}_k(1/\varepsilon)$-time,
removing the quasi-polynomial dependence of the running time on $1/\varepsilon$
from the algorithm of Daskalakis, Kamath, and Tzamos.

Comment: To appear in STOC 2016.
Probabilistic Framework for Sensor Management
A probabilistic sensor management framework is introduced that maximizes the utility of sensor systems with many different sensing modalities by dynamically configuring the sensor system in the most beneficial way. For this purpose, techniques from stochastic control and Bayesian estimation are combined so that long-term effects of possible sensor configurations, as well as stochastic uncertainties arising from noisy measurements, can be incorporated into the sensor management decisions.
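A minimal sketch of the idea in Python, simplified to a *myopic* (one-step) policy rather than the long-horizon stochastic control the abstract describes, with hypothetical sensor names and noise levels: a scalar Kalman filter tracks a random walk, and at each step the modality with the smallest expected posterior variance is selected.

```python
import numpy as np

# Hypothetical sensing modalities and their measurement-noise variances.
SENSORS = {"radar": 0.5, "camera": 2.0}
Q = 0.1  # process-noise variance of the random-walk state

def posterior_var(prior_var, r):
    """Kalman-update variance for a scalar linear measurement with noise var r."""
    return prior_var * r / (prior_var + r)

rng = np.random.default_rng(0)
x, mean, var = 0.0, 0.0, 1.0
for _ in range(25):
    x += rng.normal(0.0, np.sqrt(Q))   # true state evolves
    var += Q                            # predict step inflates uncertainty
    # myopic sensor management: pick the modality that most reduces variance
    name, r = min(SENSORS.items(), key=lambda s: posterior_var(var, s[1]))
    z = x + rng.normal(0.0, np.sqrt(r))  # noisy measurement from that sensor
    gain = var / (var + r)               # Kalman update
    mean, var = mean + gain * (z - mean), posterior_var(var, r)

print(name, round(var, 3))
```

In this toy setting the lower-noise sensor always wins; the framework's point is that accounting for long-term effects (e.g. sensor usage costs or time-varying noise) can make a non-greedy choice optimal, which requires the stochastic-control machinery rather than this one-step rule.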
Robustly Learning Mixtures of Arbitrary Gaussians
We give a polynomial-time algorithm for the problem of robustly estimating a
mixture of $k$ arbitrary Gaussians in $\mathbb{R}^d$, for any fixed $k$, in the
presence of a constant fraction of arbitrary corruptions. This resolves the
main open problem in several previous works on algorithmic robust statistics,
which addressed the special cases of robustly estimating (a) a single Gaussian,
(b) a mixture of TV-distance separated Gaussians, and (c) a uniform mixture of
two Gaussians. Our main tools are an efficient \emph{partial clustering}
algorithm that relies on the sum-of-squares method, and a novel \emph{tensor
decomposition} algorithm that allows errors in both Frobenius norm and low-rank
terms.

Comment: This version extends the previous one to yield (1) a robust proper
learning algorithm with $\mathrm{poly}(\varepsilon)$ error and (2) an
information-theoretic argument proving that the same algorithms in fact also
yield parameter recovery guarantees. The updates are included in Sections 7, 8,
and 9, and the main result from the previous version (Thm 1.4) is presented and
proved in Section
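The partial-clustering and tensor-decomposition machinery is well beyond a snippet, but the corruption model itself is easy to illustrate. A toy Python sketch (not the paper's sum-of-squares algorithm; all numbers hypothetical) showing how a constant fraction of adversarial points breaks the empirical mean of a single Gaussian while a coordinate-wise median resists:

```python
import numpy as np

# Corruption model: an eps-fraction of the sample is replaced by
# adversarially chosen points far from the true distribution.
rng = np.random.default_rng(0)
n, eps = 2000, 0.1
clean = rng.normal(loc=[3.0, -1.0], scale=1.0, size=(n, 2))
outliers = np.full((int(eps * n), 2), 50.0)  # adversarial cluster
data = np.vstack([clean, outliers])

naive = data.mean(axis=0)         # dragged far from the true mean [3, -1]
robust = np.median(data, axis=0)  # coordinate-wise median stays close

print(naive, robust)
```

The median only handles this one-Gaussian toy case; the abstract's point is that achieving such robustness for a *mixture* of arbitrary, possibly overlapping Gaussians requires the much heavier clustering-plus-tensor-decomposition approach.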
- …