66 research outputs found
An entropy inequality for symmetric random variables
We establish a lower bound on the entropy of weighted sums of (possibly dependent) random variables $X_1, \dots, X_n$ possessing a symmetric joint distribution. Our lower bound is in terms of the joint entropy of $(X_1, \dots, X_n)$. We show that for $n \ge 3$, the lower bound is tight if and only if the $X_i$'s are i.i.d.\ Gaussian random variables. For $n = 2$ there are numerous other cases of equality apart from i.i.d.\ Gaussians, which we completely characterize. Going beyond sums, we also present an inequality for certain linear transformations of $(X_1, \dots, X_n)$. Our primary technical contribution lies in the analysis of the equality cases, and our approach relies on the geometry and the symmetry of the problem.
Comment: submitted to ISIT 201
Mean Estimation from Adaptive One-bit Measurements
We consider the problem of estimating the mean of a normal distribution under the following constraint: the estimator can access only a single bit from each sample from this distribution. We study the squared error risk in this estimation as a function of the number of samples and one-bit measurements $n$. We consider an adaptive estimation setting where the single bit sent at step $n$ is a function of both the new sample and the previous $n-1$ acquired bits. For this setting, we show that no estimator can attain an asymptotic mean squared error smaller than $\pi/(2n)$ times the variance. In other words, the one-bit restriction increases the number of samples required for a prescribed accuracy of estimation by a factor of at least $\pi/2$ compared to the unrestricted case. In addition, we provide an explicit estimator that attains this asymptotic error, showing that, rather surprisingly, only $\pi/2$ times more samples are required in order to attain estimation performance equivalent to the unrestricted case.
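An estimator of this flavor can be sketched as a sign quantizer driven by stochastic approximation: bit $k$ compares the new sample to the current estimate (which is itself a function of the previous bits), and a Robbins-Monro update with gain $\sigma\sqrt{\pi/2}/k$ has asymptotic MSE $(\pi/2)\sigma^2/n$. The snippet below is a minimal simulation sketch under the assumption of known variance, not the paper's exact construction.

import numpy as np

def adaptive_one_bit_mean(samples, sigma):
    """Estimate the mean of N(mu, sigma^2) from one bit per sample.

    Bit k is sign(x_k - theta_k): a function of the new sample and, through
    theta_k, of the previous bits. Robbins-Monro updates with step a/k and
    a = sigma*sqrt(pi/2) give asymptotic MSE (pi/2)*sigma^2/n. Simulation
    sketch assuming sigma is known; not the paper's exact estimator.
    """
    theta = 0.0
    a = sigma * np.sqrt(np.pi / 2.0)  # gain 1/|m'(mu)|, m(t) = E[sign(X - t)]
    for k, x in enumerate(samples, start=1):
        bit = 1.0 if x > theta else -1.0  # the single transmitted bit
        theta += (a / k) * bit            # push theta toward the median = mean
    return theta

# Usage: compare empirical MSE to the (pi/2) * sigma^2 / n benchmark.
rng = np.random.default_rng(0)
mu, sigma, n, trials = 1.3, 2.0, 100_000, 50
errs = [(adaptive_one_bit_mean(rng.normal(mu, sigma, n), sigma) - mu) ** 2
        for _ in range(trials)]
print(np.mean(errs), (np.pi / 2) * sigma**2 / n)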
- …