Leveraging vague prior information in general models via iteratively constructed Gamma-minimax estimators
Gamma-minimax estimation is an approach to incorporate prior information into
an estimation procedure when it is implausible to specify one particular prior
distribution. In this approach, we aim for an estimator that minimizes the
worst-case Bayes risk over a set of prior distributions.
Traditionally, Gamma-minimax estimation is defined for parametric models. In
this paper, we define Gamma-minimaxity for general models and propose iterative
algorithms with convergence guarantees to compute Gamma-minimax estimators for
a general model space and a set of prior distributions constrained by
generalized moments. We also propose encoding the space of candidate estimators
by neural networks to enable flexible estimation. We illustrate our method in
two settings, namely entropy estimation and a problem that arises in
biodiversity studies.
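As a toy illustration of the minimax idea only (not the paper's algorithm, which covers general models, generalized-moment prior sets, and neural-network estimator classes), the sketch below computes a Gamma-minimax linear estimator of a normal mean over a hypothetical finite set of Gaussian priors, using subgradient descent on the worst-case Bayes risk. The priors, step size, and iteration count are all invented for the example.

```python
import numpy as np

def bayes_risk(a, b, m, s2):
    # Closed-form Bayes risk of the linear estimator d(x) = a*x + b
    # for X ~ N(theta, 1) under squared-error loss and prior theta ~ N(m, s2).
    return a**2 + (a - 1)**2 * (s2 + m**2) + 2 * (a - 1) * m * b + b**2

# Hypothetical set Gamma of candidate priors, given as (mean, variance) pairs.
priors = [(0.0, 1.0), (2.0, 0.5), (-1.0, 2.0)]

a, b, lr = 1.0, 0.0, 0.01
for _ in range(5000):
    risks = [bayes_risk(a, b, m, s2) for m, s2 in priors]
    m, s2 = priors[int(np.argmax(risks))]   # current worst-case prior
    # Subgradient step on the estimator parameters against that prior.
    da = 2 * a + 2 * (a - 1) * (s2 + m**2) + 2 * m * b
    db = 2 * (a - 1) * m + 2 * b
    a, b = a - lr * da, b - lr * db

worst = max(bayes_risk(a, b, m, s2) for m, s2 in priors)
print(a, b, worst)  # worst-case Bayes risk below 1, the risk of d(x) = x
```

The unbiased estimator d(x) = x has risk 1 against every prior, so any worst-case Bayes risk below 1 shows the prior set genuinely helps, which is the motivation for Gamma-minimaxity.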
Six Statistical Senses
This article proposes a set of categories, each one representing a particular
distillation of important statistical ideas. Each category is labeled a "sense"
because we think of these as essential in helping every statistical mind
connect in constructive and insightful ways with statistical theory,
methodologies, and computation, toward the ultimate goal of building
statistical phronesis. The illustration of each sense with statistical
principles and methods provides a sensical tour of the conceptual landscape of
statistics, as a leading discipline in the data science ecosystem.
Uncertainty Quantification
Uncertainty quantification (UQ) is concerned with including and characterising uncertainties in mathematical models.
Major steps comprise proper description of system uncertainties, analysis and efficient quantification of uncertainties in predictions and design problems, and statistical inference on uncertain parameters starting from available measurements.
Research in UQ addresses fundamental mathematical and statistical challenges, but has also wide applicability in areas such as engineering, environmental, physical and biological applications.
This workshop focussed on mathematical challenges at the interface of applied mathematics, probability and statistics, numerical analysis, scientific computing and application domains.
The workshop served to bring together experts from those disciplines in order to enhance their interaction, to exchange ideas and to develop new, powerful methods for UQ.
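The quantification step described above can be sketched, under simplifying assumptions, as forward uncertainty propagation by Monte Carlo: describe the input uncertainty with distributions, push samples through the model, and summarise the predictive spread. The one-line forward model and the input distributions below are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forward model: temperature drop across a slab, T = q * L / k,
# with uncertain thermal conductivity k and heat flux q.
def forward(k, q, L=0.1):
    return q * L / k

# Step 1: describe input uncertainty with probability distributions.
k = rng.normal(2.0, 0.2, size=100_000)     # conductivity, W/(m K)
q = rng.normal(500.0, 50.0, size=100_000)  # heat flux, W/m^2

# Step 2: propagate it through the model by Monte Carlo sampling.
T = forward(k, q)
print(f"mean = {T.mean():.2f} K, std = {T.std():.2f} K")
```

The sample mean and standard deviation here summarise the predictive uncertainty; more elaborate UQ methods (surrogates, sparse grids, multilevel Monte Carlo) aim to achieve the same with far fewer forward-model evaluations.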
Statistical inference and computation in elliptic PDE models
Partial differential equations (PDE) are ubiquitous in describing real-world phenomena. In many statistical models, PDE are used to encode complex relationships between unknown quantities and the observed data. We investigate statistical and computational questions arising in such models, adopting an infinite-dimensional `nonparametric' framework and assuming the observed data are subject to random noise. The main PDE examples are of elliptic or parabolic type.
Chapter 2 investigates the problem of sampling from high-dimensional Bayesian posterior distributions. The main results consist of non-asymptotic computational guarantees for Langevin-type Markov chain Monte Carlo (MCMC) algorithms which scale polynomially in key quantities such as the dimension of the model, the desired precision level, and the number of available statistical measurements. The bounds hold with high probability under the distribution of the data, assuming that certain `local geometric' assumptions are fulfilled and that a good initialiser of the algorithm is available. We study a representative non-linear PDE example where the unknown is a coefficient function in a steady-state Schr\"odinger equation, and the solution to a corresponding boundary value problem is observed.
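A Langevin-type MCMC algorithm of the kind analysed in Chapter 2 can be illustrated, in a drastically simplified setting, by the unadjusted Langevin algorithm (ULA) on a toy Gaussian target. The target, dimension, step size, and iteration counts below are invented for the sketch and carry none of the chapter's PDE structure or its non-asymptotic guarantees.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical log-concave target: a N(mu, I) "posterior" in d dimensions.
d = 10
mu = np.linspace(-1.0, 1.0, d)

def grad_log_post(theta):
    # Gradient of the log-density of N(mu, I).
    return -(theta - mu)

# ULA update: theta_{k+1} = theta_k + h * grad log pi(theta_k) + sqrt(2h) * xi_k
h = 0.05
theta = np.zeros(d)   # a crude initialiser; the chapter assumes a good one
samples = []
for k in range(20_000):
    theta = theta + h * grad_log_post(theta) + np.sqrt(2 * h) * rng.standard_normal(d)
    if k >= 5_000:    # discard burn-in
        samples.append(theta)

est = np.mean(samples, axis=0)
print(np.max(np.abs(est - mu)))  # small for this well-conditioned target
```

For this Gaussian target the chain is an exact autoregression whose stationary mean is mu, so the posterior-mean estimate converges; the chapter's contribution is proving polynomial-time guarantees of this kind for genuinely non-linear PDE-constrained posteriors.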
Chapter 3 studies statistical convergence rates for nonparametric Tikhonov-type estimators, which can also be interpreted as Bayesian maximum a posteriori (MAP) estimators arising from certain Gaussian process priors. The theory is derived in a general setting for non-linear inverse problems and then applied to two examples: the steady-state Schr\"odinger equation studied in Chapter 2 and a model for the steady-state heat equation. It is shown that the rates obtained are minimax-optimal in prediction loss.
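The Tikhonov/MAP correspondence mentioned here has a minimal finite-dimensional analogue: for a linear forward map, the penalized least-squares solution coincides with the MAP estimator under a mean-zero Gaussian prior. The forward map A, noise level, and penalty weight below are invented for the sketch; the chapter's setting is non-linear and infinite-dimensional.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical linear inverse problem: y = A f + noise. The Tikhonov estimator
# minimizes ||y - A f||^2 + lam * ||f||^2 over f, which is also the MAP
# estimator under the Gaussian prior f ~ N(0, (sigma^2 / lam) * I).
n, p = 50, 20
A = rng.standard_normal((n, p))
f_true = rng.standard_normal(p)
y = A @ f_true + 0.1 * rng.standard_normal(n)

lam = 0.5
# Normal equations of the penalized objective: (A^T A + lam I) f = A^T y.
f_hat = np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ y)

rel_err = np.linalg.norm(f_hat - f_true) / np.linalg.norm(f_true)
print(rel_err)  # small relative error in this well-posed toy problem
```

In the chapter's nonparametric setting the Euclidean penalty is replaced by a reproducing-kernel norm induced by the Gaussian process prior, and the analysis must additionally handle the non-linearity of the PDE forward map.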
The final Chapter 4 considers a model for scalar diffusion processes with an unknown drift function which is modelled nonparametrically. It is shown that in the low-frequency sampling regime, when the sample consists of discrete observations $X_0, X_\Delta, \dots, X_{n\Delta}$ of the process for some fixed sampling distance $\Delta > 0$, the model satisfies the local asymptotic normality (LAN) property under mild regularity assumptions. The key tools used are regularity estimates and spectral properties for certain parabolic and elliptic PDE related to the generator of the diffusion.
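For reference, the LAN property asserts a quadratic expansion of the log-likelihood ratio. The display below gives the classical one-dimensional parametric form, a simplification of the nonparametric statement proved in the chapter:

```latex
% One-dimensional LAN expansion at rate r_n, with central sequence
% \Delta_{n,\theta} converging in distribution to N(0, I_\theta):
\log \frac{dP^{n}_{\theta + r_n h}}{dP^{n}_{\theta}}
  = h\,\Delta_{n,\theta} - \frac{h^{2}}{2}\, I_{\theta} + o_{P^{n}_{\theta}}(1)
```

LAN implies local asymptotic minimax lower bounds via Le Cam theory, which is why establishing it for the low-frequency diffusion model is a meaningful step.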
- …