Compound Poisson approximation via information functionals
An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Nonasymptotic bounds are derived for the distance between the distribution of a sum of independent integer-valued random variables and an appropriately chosen compound Poisson law. In the case where all summands have the same conditional distribution given that they are non-zero, a bound on the relative entropy distance between their sum and the compound Poisson distribution is derived, based on the data-processing property of relative entropy and earlier Poisson approximation results. When the summands have arbitrary distributions, corresponding bounds are derived in terms of the total variation distance. The main technical ingredient is the introduction of two "information functionals," and the analysis of their properties. These information functionals play a role analogous to that of the classical Fisher information in normal approximation. Detailed comparisons are made between the resulting inequalities and related bounds.
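The setting above can be sketched numerically: take i.i.d. integer-valued summands that are zero with high probability and share a common conditional distribution Q given that they are nonzero, and compare the exact law of their sum against the compound Poisson law with matching rate. The parameters n, p, and Q below are illustrative choices, not from the paper; the compound Poisson PMF is computed with the standard Panjer recursion.

```python
import math

# Each summand is 0 w.p. 1-p; conditioned on being nonzero it takes the
# value j with probability Q[j].  (Illustrative numbers, not from the paper.)
n, p = 30, 0.1
Q = {1: 0.6, 2: 0.4}
lam = n * p    # compound Poisson rate = expected number of nonzero summands
max_s = 2 * n  # largest possible value of the sum

# PMF of a single summand on {0, 1, 2}.
single = [1 - p, p * Q[1], p * Q[2]]

# Exact PMF of the sum of n summands via repeated convolution.
f = [1.0]
for _ in range(n):
    g = [0.0] * (len(f) + 2)
    for s, fs in enumerate(f):
        for j, xj in enumerate(single):
            g[s + j] += fs * xj
    f = g

# Compound Poisson CP(lam, Q) PMF via the Panjer recursion:
# cp[0] = e^{-lam},  cp[s] = (lam/s) * sum_j j * Q[j] * cp[s-j].
cp = [math.exp(-lam)] + [0.0] * max_s
for s in range(1, max_s + 1):
    cp[s] = (lam / s) * sum(j * qj * cp[s - j] for j, qj in Q.items() if j <= s)

# Total variation distance between the exact sum and its CP approximation.
tv = 0.5 * sum(abs(a - b) for a, b in zip(f, cp))
print(f"TV distance = {tv:.4f}")
```

With these parameters the distance is small, consistent with classical bounds of order sum of p_i^2; shrinking p (while keeping lam fixed) drives it toward zero.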
On Deterministically Approximating Total Variation Distance
Total variation distance (TV distance) is an important measure for the
difference between two distributions. Recently, there has been progress in
approximating the TV distance between product distributions: a deterministic
algorithm for a restricted class of product distributions (Bhattacharyya,
Gayen, Meel, Myrisiotis, Pavan and Vinodchandran 2023) and a randomized
algorithm for general product distributions (Feng, Guo, Jerrum and Wang 2023).
We give a deterministic fully polynomial-time approximation algorithm (FPTAS)
for the TV distance between product distributions. Given two product
distributions P and Q, our algorithm approximates their TV distance with
relative error ε in time polynomial in the input size and 1/ε.
Our algorithm is built around two key concepts: 1) the likelihood ratio,
viewed as a distribution, which captures sufficient information to compute the
TV distance; and 2) a new metric between likelihood ratio distributions, which
we call the minimum total variation distance. Our algorithm computes a
sparsified likelihood ratio distribution that is close to the original one
w.r.t. the new metric, and the approximate TV distance can then be computed
from the sparsified likelihood ratio.
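The first idea can be sketched exactly at small scale: for product distributions, the distribution of the likelihood ratio Q(x)/P(x) under x ~ P is built one coordinate at a time, merging equal ratio values (a toy stand-in for the paper's sparsification), and TV(P, Q) is then read off as an expectation over that distribution. All parameter values below are illustrative.

```python
from itertools import product

# Two product distributions over {0,1}^n, given by per-coordinate Bernoulli
# parameters (illustrative values, not from the paper).
n = 10
p = [0.5] * n   # P_i(1)
q = [0.3] * n   # Q_i(1)

# Distribution of the likelihood ratio Q(x)/P(x) under x ~ P, built
# coordinate by coordinate; equal ratio values are merged into one atom.
dist = {1.0: 1.0}
for pi, qi in zip(p, q):
    nxt = {}
    for r, pr in dist.items():
        for bit_p, bit_q in ((pi, qi), (1 - pi, 1 - qi)):
            key = round(r * bit_q / bit_p, 12)  # merge numerically equal ratios
            nxt[key] = nxt.get(key, 0.0) + pr * bit_p
    dist = nxt

# TV(P, Q) = E_{x~P}[ max(0, 1 - Q(x)/P(x)) ], an expectation over the
# likelihood ratio distribution alone.
tv = sum(pr * max(0.0, 1.0 - r) for r, pr in dist.items())

# Brute-force check over all 2^n points of the domain.
def prob(params, x):
    out = 1.0
    for t, b in zip(params, x):
        out *= t if b else 1 - t
    return out

tv_direct = 0.5 * sum(abs(prob(p, x) - prob(q, x))
                      for x in product((0, 1), repeat=n))
print(tv, tv_direct)
```

Because the coordinates here share one parameter pair, the ratio distribution collapses to n+1 atoms instead of 2^n points, which is exactly the kind of compression that makes the likelihood-ratio view useful.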
Our technique also implies a deterministic FPTAS for the TV distance between
Markov chains.
Bounds for Approximation in Total Variation Distance by Quantum Circuits
It was recently shown that for reasonable notions of approximation of states
and functions by quantum circuits, almost all states and functions are
exponentially hard to approximate [Knill 1995]. The bounds obtained are
asymptotically tight except for the one based on total variation distance
(TVD). TVD is the most relevant metric for the performance of a quantum
circuit. In this paper we obtain asymptotically tight bounds for TVD. We show
that in a natural sense, almost all states are hard to approximate to within a
TVD of 2/e - ε even for exponentially small ε. The quantity 2/e is
asymptotically the average distance to the uniform distribution. Almost all
states with probability amplitudes concentrated in a small fraction of the
space are hard to approximate to within a TVD of 2 - ε. These results
imply that non-uniform quantum circuit complexity is non-trivial in any
reasonable model. They also reinforce the notion that the relative information
distance between states (which is based on the difficulty of transforming one
state to another) fully reflects the dimensionality of the space of qubits, not
the number of qubits.
Comment: uuencoded compressed postscript, LACES 68Q-95-3
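The 2/e figure can be checked numerically: for a Haar-random state, the measurement probabilities follow the Porter-Thomas distribution, and the unhalved (L1) variation distance to the uniform distribution over outcomes concentrates near 2/e ≈ 0.736. A minimal sketch, assuming the L1 convention for TVD and a 12-qubit example dimension:

```python
import math
import random

random.seed(0)
N = 1 << 12  # dimension of the state space (12 qubits)

# Haar-random pure state: i.i.d. complex Gaussian amplitudes, normalized.
re = [random.gauss(0, 1) for _ in range(N)]
im = [random.gauss(0, 1) for _ in range(N)]
norm = sum(a * a + b * b for a, b in zip(re, im))
probs = [(a * a + b * b) / norm for a, b in zip(re, im)]  # measurement distribution

# L1 distance to the uniform distribution over the N outcomes.
dist = sum(abs(pi - 1.0 / N) for pi in probs)
print(dist, 2 / math.e)
```

The printed values agree to a few percent; as N grows the distance converges to E|X - 1| = 2/e for X exponentially distributed with mean 1, matching the asymptotic average distance quoted in the abstract.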