We provide improved lower bounds for two well-known high-dimensional private
estimation tasks. First, we prove that for estimating the covariance of a
Gaussian up to spectral error $\alpha$ with approximate differential privacy,
one needs $\tilde{\Omega}\left(\frac{d^{3/2}}{\alpha \varepsilon} +
\frac{d}{\alpha^2}\right)$ samples for any $\alpha \le O(1)$, which is tight up
to logarithmic factors. This improves over previous work, which established
this bound only for $\alpha \le O\left(\frac{1}{\sqrt{d}}\right)$, and our
proof is also simpler. Next, we prove that for estimating the mean of a heavy-tailed
distribution with bounded $k$th moments with approximate differential privacy,
one needs $\tilde{\Omega}\left(\frac{d}{\alpha^{k/(k-1)} \varepsilon} +
\frac{d}{\alpha^2}\right)$ samples. This matches known upper bounds and
improves over the best known lower bounds for this problem, which hold only
for pure differential privacy or for $k = 2$. Our techniques follow the method of
fingerprinting and are generally quite simple. Our lower bound for heavy-tailed
estimation is based on a black-box reduction from privately estimating
identity-covariance Gaussians. Our lower bound for covariance estimation
utilizes a Bayesian approach to show that, under an inverse-Wishart prior
distribution on the covariance matrix, no private estimator can be accurate
even in expectation without sufficiently many samples.

Comment: 23 pages