Besov priors for Bayesian inverse problems
We consider the inverse problem of estimating a function from noisy,
possibly nonlinear, observations. We adopt a Bayesian approach to the problem.
This approach has a long history for inversion, dating back to 1970, and has,
over the last decade, gained importance as a practical tool. However most of
the existing theory has been developed for Gaussian prior measures. Recently
Lassas, Saksman and Siltanen (Inv. Prob. Imag. 2009) showed how to construct
Besov prior measures, based on wavelet expansions with random coefficients, and
used these prior measures to study linear inverse problems. In this paper we
build on this development of Besov priors to include the case of nonlinear
measurements. In doing so a key technical tool, established here, is a
Fernique-like theorem for Besov measures. This theorem enables us to identify
appropriate conditions on the forward solution operator which, when matched to
properties of the prior Besov measure, imply the well-definedness and
well-posedness of the posterior measure. We then consider the application of
these results to the inverse problem of finding the diffusion coefficient of an
elliptic partial differential equation, given noisy measurements of its
solution.
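As an illustration of the construction referred to above (not code from the paper), the sketch below draws one sample from a Besov-type prior on a dyadic grid via a wavelet expansion with Laplace-distributed random coefficients; the Haar basis, the level-dependent decay exponent and all parameter values are illustrative assumptions.

```python
import numpy as np

def inverse_haar(coeffs):
    """Orthonormal inverse Haar transform; coeffs = [cA, cD_coarsest, ..., cD_finest]."""
    u = np.atleast_1d(np.asarray(coeffs[0], dtype=float))
    for d in coeffs[1:]:
        even = (u + d) / np.sqrt(2.0)
        odd = (u - d) / np.sqrt(2.0)
        out = np.empty(2 * u.size)
        out[0::2], out[1::2] = even, odd
        u = out
    return u

def sample_besov_prior(J=10, s=1.5, p=1, rng=None):
    """One draw u = sum_j 2^{-j(s + 1/2 - 1/p)} xi_{j,k} psi_{j,k}, Laplace xi (p = 1 case)."""
    rng = np.random.default_rng(rng)
    coeffs = [rng.laplace(size=1)]                    # coarsest scaling coefficient
    for j in range(1, J + 1):
        decay = 2.0 ** (-j * (s + 0.5 - 1.0 / p))     # level-dependent Besov scaling (d = 1)
        coeffs.append(decay * rng.laplace(size=2 ** (j - 1)))
    return inverse_haar(coeffs)                       # sample on a grid of 2**J points

u = sample_besov_prior()
```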
Hierarchical Bayesian level set inversion
The level set approach has proven widely successful in the study of inverse problems for interfaces, since its systematic development in the 1990s. Recently it has been employed in the context of Bayesian inversion, allowing for the quantification of uncertainty within the reconstruction of interfaces. However, the Bayesian approach is very sensitive to the length and amplitude scales in the prior probabilistic model. This paper demonstrates how the scale-sensitivity can be circumvented by means of a hierarchical approach, using a single scalar parameter. Together with careful consideration of the development of algorithms which encode probability measure equivalences as the hierarchical parameter is varied, this leads to well-defined Gibbs-based MCMC methods found by alternating Metropolis-Hastings updates of the level set function and the hierarchical parameter. These methods demonstrably outperform non-hierarchical Bayesian level set methods.
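The alternating update scheme described in the abstract can be sketched, in a plain finite-dimensional form, as a Metropolis-within-Gibbs loop. The function names (log_like, log_prior_u, log_prior_tau, sample_prior) are hypothetical placeholders, and the paper's actual algorithm additionally encodes the probability-measure equivalences mentioned above; this is only a schematic.

```python
import numpy as np

def hierarchical_level_set_mcmc(u0, tau0, log_like, log_prior_u, log_prior_tau,
                                sample_prior, n_iter=10_000, beta=0.2,
                                step_tau=0.1, rng=None):
    rng = np.random.default_rng(rng)
    u, tau = np.asarray(u0, dtype=float), float(tau0)
    samples = []
    for _ in range(n_iter):
        # pCN update of the level set function u with tau fixed: the proposal preserves
        # the Gaussian prior N(0, C(tau)), so only the likelihood ratio appears.
        u_prop = np.sqrt(1.0 - beta**2) * u + beta * sample_prior(tau, rng)
        if np.log(rng.uniform()) < log_like(u_prop) - log_like(u):
            u = u_prop
        # Random-walk Metropolis update of the scalar hierarchical parameter tau with u fixed.
        tau_prop = tau + step_tau * rng.standard_normal()
        log_alpha = (log_prior_u(u, tau_prop) + log_prior_tau(tau_prop)
                     - log_prior_u(u, tau) - log_prior_tau(tau))
        if np.log(rng.uniform()) < log_alpha:
            tau = tau_prop
        samples.append((u.copy(), tau))
    return samples
```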
Development of high-resolution infrared thermographic imaging method as a diagnostic tool for acute undifferentiated limp in young children
Acute limp is a common presenting condition in the paediatric emergency department. There are a number of causes of acute limp that include traumatic injury, infection and malignancy. These causes in young children are not easily distinguished. In this pilot study, an infrared thermographic imaging technique to diagnose acute undifferentiated limp in young children was developed.
Following required ethics approval, 30 children (mean age = 5.2 years, standard deviation = 3.3 years) were recruited. The exposed lower limbs of participants were imaged using a high-resolution thermal camera. Using predefined regions of interest (ROIs), any skin surface temperature difference between the healthy and affected legs was statistically analysed, with the aim of identifying limp. In all examined ROIs, the median skin surface temperature of the affected limb was higher than that of the healthy limb. However, the small sample size recruited for each group means that the statistical tests of significance must be interpreted in this context. Thermal imaging showed potential in helping with the diagnosis of acute limp in children. Repeating a similar study with a larger sample size would be beneficial for establishing the reproducibility of the results.
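The ROI comparison could, for example, be carried out as a paired test on per-child median skin temperatures. The abstract does not state which statistical test was used, so the Wilcoxon signed-rank test in the sketch below is only one plausible choice, not the study's method.

```python
import numpy as np
from scipy import stats

def compare_roi_temperatures(affected_c, healthy_c):
    """Paired comparison of per-child median ROI skin temperatures (degrees Celsius)."""
    affected_c = np.asarray(affected_c, dtype=float)
    healthy_c = np.asarray(healthy_c, dtype=float)
    diff = affected_c - healthy_c                 # positive values: affected limb is warmer
    stat, p_value = stats.wilcoxon(affected_c, healthy_c)
    return {"median_difference_C": float(np.median(diff)),
            "wilcoxon_statistic": float(stat),
            "p_value": float(p_value)}
```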
Hyperpriors for Matérn fields with applications in Bayesian inversion
We introduce non-stationary Matérn field priors with stochastic partial differential equations, and construct correlation length-scaling with hyperpriors. We model both the hyperprior and the Matérn prior as continuous-parameter random fields. As hypermodels, we use Cauchy and Gaussian random fields, which we map suitably to a desired correlation length-scaling range. For computations, we discretise the models with finite difference methods. We consider the convergence of the discretised prior and posterior to the discretisation limit. We apply the developed methodology to certain interpolation, numerical differentiation and deconvolution problems, and show numerically that we can perform Bayesian inversion that promotes the competing constraints of smoothness and edge-preservation. For computing the conditional mean estimate of the posterior distribution, we use a combination of Gibbs and Metropolis-within-Gibbs sampling algorithms.
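The following minimal one-dimensional sketch (not the authors' code) illustrates the idea of an SPDE-based non-stationary Matérn prior with a hyperprior on the correlation length: a simple Gaussian hypermodel produces the length-scale field, and a prior sample is obtained by solving a finite-difference discretisation of (kappa(x)^2 - d^2/dx^2) u = white noise. Normalisation constants, the white-noise scaling and the boundary treatment are deliberately simplified assumptions.

```python
import numpy as np

def sample_nonstationary_matern(n=400, rng=None):
    rng = np.random.default_rng(rng)
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n)
    # Hypermodel: a smooth Gaussian field mapped to a positive length-scale field ell(x).
    bumps = rng.standard_normal(8)
    log_ell = sum(b * np.cos((k + 1) * np.pi * x) / (k + 1) for k, b in enumerate(bumps))
    ell = 0.1 * np.exp(0.5 * log_ell)
    kappa = np.sqrt(2.0 * 1.5) / ell                 # kappa = sqrt(2*nu)/ell, nu = 3/2 (alpha = 2, d = 1)
    # Finite-difference operator A = kappa^2*I - D2 with homogeneous Dirichlet ends.
    main = kappa**2 + 2.0 / h**2
    off = -np.ones(n - 1) / h**2
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    w = rng.standard_normal(n) / np.sqrt(h)          # heuristically scaled white-noise load
    return x, ell, np.linalg.solve(A, w)             # grid, length scale, prior sample
```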
Elliptic boundary value problems with Gaussian white noise loads
Linear second order elliptic boundary value problems (BVPs) on bounded Lipschitz domains are studied in the case of Gaussian white noise loads. The challenging cases of Neumann and Robin BVPs are considered.
The main obstacle for the usual variational methods is the irregularity of the load. In particular, the Neumann boundary values are not well-defined.
In this work, the BVP is formulated by replacing the continuity of the boundary trace mappings with measurability. Instead of relying on variational methods alone, the novel formulation also draws on Cameron–Martin space techniques.
The new BVP returns the study of irregular white noise loads to the study of L²-loads.
Enhancing industrial X-ray tomography by data-centric statistical methods
X-ray tomography has applications in various industrial fields such as the sawmill industry, the oil and gas industry, and chemical, biomedical, and geotechnical engineering. In this article, we study Bayesian methods for X-ray tomography reconstruction. In Bayesian methods, the inverse problem of tomographic reconstruction is solved with the help of a statistical prior distribution which encodes the possible internal structures by assigning probabilities for the smoothness and edge distribution of the object. We compare Gaussian random field priors, which favor smoothness, to non-Gaussian total variation (TV), Besov, and Cauchy priors, which promote sharp edges and high- and low-contrast areas in the object. We also present computational schemes for solving the resulting high-dimensional Bayesian inverse problem with 100,000–1,000,000 unknowns. We study the applicability of a no-U-turn variant of Hamiltonian Monte Carlo (HMC) and of a more classical adaptive Metropolis-within-Gibbs (MwG) algorithm to enable full uncertainty quantification of the reconstructions. We also compute maximum a posteriori (MAP) estimates with the limited-memory BFGS (Broyden–Fletcher–Goldfarb–Shanno) optimization algorithm. As the first industrial application, we consider sawmill industry X-ray log tomography. The logs have knots, rotten parts, and possibly even metallic pieces, making them good examples for non-Gaussian priors. Secondly, we study drill-core rock sample tomography, an example from the oil and gas industry. In that case, we compare the priors without uncertainty quantification. We show that Cauchy priors produce a smaller number of artefacts than the other choices, especially with sparse high-noise measurements, and that choosing HMC enables systematic uncertainty quantification, provided that the posterior is not pathologically multimodal or heavy-tailed.
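As a hedged illustration of the MAP computation mentioned above (an assumed one-dimensional setup, not the article's code), the sketch below minimises a Gaussian data misfit plus a Cauchy difference prior with the limited-memory BFGS optimiser; the forward matrix A, noise level and prior scale are placeholders.

```python
import numpy as np
from scipy.optimize import minimize

def cauchy_map(A, y, sigma=0.05, gamma=0.01):
    """MAP estimate: 0.5*||A x - y||^2 / sigma^2 + sum_i log(gamma^2 + (x_{i+1} - x_i)^2)."""
    A = np.asarray(A, dtype=float)
    y = np.asarray(y, dtype=float)
    n = A.shape[1]
    D = np.diff(np.eye(n), axis=0)                  # first-difference operator

    def neg_log_post(x):
        r, d = A @ x - y, D @ x
        return 0.5 * r @ r / sigma**2 + np.sum(np.log(gamma**2 + d**2))

    def grad(x):
        r, d = A @ x - y, D @ x
        return A.T @ r / sigma**2 + D.T @ (2.0 * d / (gamma**2 + d**2))

    res = minimize(neg_log_post, np.zeros(n), jac=grad, method="L-BFGS-B")
    return res.x
```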
Bayesian filtering in incoherent scatter plasma parameter fits
Incoherent scatter (IS) radars are invaluable instruments for ionospheric physics, since they observe altitude profiles of electron density (Ne), electron temperature (Te), ion temperature (Ti) and line-of-sight plasma velocity (Vi) from the ground. However, the temperatures can be fitted to the observed IS spectra only when the ion composition is known, and the resolutions of the fitted plasma parameters are often insufficient for auroral electron precipitation, which requires high resolutions in both range and time. The problem of unknown ion composition has been addressed by means of full-profile analysis, which assumes that the plasma parameter profiles are smooth in altitude, or follow some predefined shape. In a similar manner, one could assume smooth time variations, but this option has not been used in IS analysis. We propose a plasma parameter fit technique based on Bayesian filtering, which we have implemented as an additional Bayesian Filtering Module (BAFIM) in the Grand Unified Incoherent Scatter Design and Analysis Package (GUISDAP). BAFIM allows us to control gradients in both the time and range directions for each plasma parameter separately. With BAFIM we can fit F1 region ion composition together with Ne, Te, Ti and Vi, and we have reached 4 s / 900 m time/range steps in four-parameter fits of Ne, Te, Ti and Vi in E region observations of auroral electron precipitation.
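The Bayesian filtering idea can be sketched as a standard Kalman predict/update cycle with a random-walk model, where the process covariance plays the role of the time-gradient control described above. The actual BAFIM implementation wraps GUISDAP's nonlinear incoherent-scatter spectral fits, so the linear-Gaussian form below is only a schematic under assumed matrices H, R and Q.

```python
import numpy as np

def kalman_step(m, P, y, H, R, Q):
    """One predict/update cycle of a random-walk Kalman filter for the fitted parameters."""
    # Predict: random-walk model; the process covariance Q encodes how much each
    # parameter (Ne, Te, Ti, Vi, composition) may change between integration periods.
    m_pred, P_pred = m, P + Q
    # Update with the (linearised) measurement y = H x + noise, noise covariance R.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    m_new = m_pred + K @ (y - H @ m_pred)
    P_new = (np.eye(len(m)) - K @ H) @ P_pred
    return m_new, P_new
```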