This paper focuses on parameter estimation and introduces a new method for
lower bounding the Bayesian risk. The method allows for the use of virtually
\emph{any} information measure, including R\'enyi's $\alpha$-Divergences,
$\varphi$-Divergences, and Sibson's $\alpha$-Mutual Information. The approach
considers divergences as functionals of measures and exploits the duality
between spaces of measures and spaces of functions. In particular, we show that
one can lower bound the risk with any information measure by upper bounding its
dual via Markov's inequality. We are thus able to provide estimator-independent
impossibility results thanks to the Data-Processing Inequalities that
divergences satisfy.
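To illustrate the mechanism with a minimal sketch (the notation here is ours, not the paper's): writing $R=\mathbb{E}\bigl[\ell(\theta,\hat{\theta}(X))\bigr]$ for the Bayesian risk of an estimator $\hat{\theta}$ under a loss $\ell$, Markov's inequality gives, for every $\rho>0$,
\[
R \;\ge\; \rho\,\Pr\!\bigl[\ell(\theta,\hat{\theta}(X))\ge\rho\bigr] \;=\; \rho\Bigl(1-\Pr\!\bigl[\ell(\theta,\hat{\theta}(X))<\rho\bigr]\Bigr),
\]
so any upper bound on $\Pr\!\bigl[\ell(\theta,\hat{\theta}(X))<\rho\bigr]$ expressed through an information measure, and made estimator-independent via a Data-Processing Inequality, translates directly into a lower bound on $R$.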
The results are then applied to settings of interest
involving both discrete and continuous parameters, including the
``Hide-and-Seek'' problem, and compared with state-of-the-art techniques. An
important observation is that the behaviour of the lower bound as a function of
the number of samples is influenced by the choice of the information measure.
We leverage
this by introducing a new divergence inspired by the ``Hockey-Stick''
Divergence, which is demonstrated empirically to provide the largest
lower bound across all considered settings.
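For context, and as standard background rather than the paper's new divergence: the ``Hockey-Stick'' Divergence with parameter $\gamma\ge 1$ between measures $P$ and $Q$ can be written as
\[
E_\gamma(P\,\|\,Q) \;=\; \sup_{A}\,\bigl(P(A)-\gamma\,Q(A)\bigr),
\]
where the supremum ranges over measurable events $A$; taking $\gamma=1$ recovers the total-variation distance.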
If the observations are subject to privatisation, stronger impossibility
results can be obtained via Strong Data-Processing Inequalities. The paper
also discusses some generalisations and
alternative directions.
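To indicate how privatisation strengthens the bounds, here is a generic Strong Data-Processing template in our own notation (standard material, not a result quoted from the paper): if every observation is passed through a fixed privatisation channel $K$, then for suitable divergences
\[
D(PK\,\|\,QK) \;\le\; \eta_K\, D(P\,\|\,Q), \qquad \eta_K\le 1,
\]
where $\eta_K$ is the contraction coefficient of $K$. Whenever $\eta_K<1$, as happens for private channels, the information measure shrinks and the resulting risk lower bounds become correspondingly stronger.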