We consider the problem of recovering a distribution function on the real
line from observations additively contaminated with errors following the
standard Laplace distribution. Assuming that the latent distribution is
completely unknown leads to a nonparametric deconvolution problem. We begin by
studying the rates of convergence relative to the L2-norm and the Hellinger
metric for the direct problem of estimating the sampling density, which is a
mixture of Laplace densities with a possibly unbounded set of locations: the
rate of convergence for the Bayes' density estimator corresponding to a
Dirichlet process prior over the space of all mixing distributions on the real
line matches, up to a logarithmic factor, the $n^{-3/8}\log^{1/8}n$ rate
for the maximum likelihood estimator. Then, appealing to an inversion
inequality translating the L2-norm and the Hellinger distance between
general kernel mixtures, with a kernel density having polynomially decaying
Fourier transform, into any Lp-Wasserstein distance, $p \geq 1$, between the
corresponding mixing distributions, provided their Laplace transforms are
finite in some neighborhood of zero, we derive the rates of convergence in the
L1-Wasserstein metric for the Bayes' and maximum likelihood estimators of
the mixing distribution. Merging in the L1-Wasserstein distance between
the Bayes' and maximum likelihood estimators follows as a by-product, along
with an assessment of the stochastic order of the discrepancy between the two
estimation procedures.
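For concreteness, a minimal sketch of the setup described above, written in the standard additive-error formulation; the symbols $Y_i$, $X_i$, $G$, $p_G$ and $\Pi(G_1,G_2)$ are illustrative notation chosen for this summary, not taken from the paper. Observations are
\[
  Y_i = X_i + \varepsilon_i, \qquad X_i \overset{\mathrm{iid}}{\sim} G, \qquad
  \varepsilon_i \overset{\mathrm{iid}}{\sim} f(x) = \tfrac{1}{2}\,e^{-|x|},
\]
where $f$ is the standard Laplace density, so that the sampling density is the Laplace mixture
\[
  p_G(y) = \int_{\mathbb{R}} \tfrac{1}{2}\,e^{-|y-x|}\,\mathrm{d}G(x),
\]
and the mixing distribution $G$ is recovered in the L1-Wasserstein metric
\[
  W_1(G_1, G_2) = \inf_{\pi \in \Pi(G_1, G_2)} \int_{\mathbb{R}\times\mathbb{R}} |x - y|\,\mathrm{d}\pi(x, y),
\]
where $\Pi(G_1, G_2)$ denotes the set of couplings with marginals $G_1$ and $G_2$.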