
    Solving Geophysical Inversion Problems with Intractable Likelihoods: Linearized Gaussian Approximations Versus the Correlated Pseudo-marginal Method

    A geophysical Bayesian inversion problem may target the posterior distribution of geological or hydrogeological parameters given geophysical data. To account for the scatter in the petrophysical relationship linking the target parameters to the geophysical properties, this study treats the intermediate geophysical properties as latent (unobservable) variables. To perform inversion in such a latent variable model, the intractable likelihood function of the (hydro)geological parameters given the geophysical data needs to be estimated. This can be achieved by approximation with a Gaussian probability density function based on local linearization of the geophysical forward operator, thereby accounting for the noise in the petrophysical relationship by a corresponding addition to the data covariance matrix. The new approximate method is compared against the general correlated pseudo-marginal method, which estimates the likelihood by Monte Carlo averaging over samples of the latent variable. First, the performance of the two methods is tested on a synthetic example in which a multivariate Gaussian porosity field is inferred using crosshole ground-penetrating radar first-arrival travel times. For this example with rather small petrophysical uncertainty, the two methods provide near-identical estimates, while an inversion that ignores petrophysical uncertainty leads to biased estimates. A sensitivity analysis then suggests that the linearized Gaussian approach, while attractive for its relative computational speed, suffers from decreasing accuracy with increasing scatter in the petrophysical relationship. The computationally more expensive correlated pseudo-marginal method performs very well even for settings with high petrophysical uncertainty.
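
    As a rough illustration of the Monte Carlo averaging described above (a sketch, not the authors' implementation; `forward`, `petro` and all dimensions are hypothetical placeholders):

```python
import numpy as np

def log_gauss(x, mu, sigma):
    """Log-density of independent Gaussians with common std `sigma`."""
    return -0.5 * np.sum(((x - mu) / sigma) ** 2 + np.log(2 * np.pi * sigma**2))

def pseudo_marginal_loglik(m, d, forward, petro, sig_petro, sig_d, u):
    """Estimate log p(d | m) by averaging the likelihood over draws of the
    latent geophysical property g = petro(m) + noise. `u` is an
    (N, dim_g) block of standard-normal auxiliary variables."""
    logw = np.array([log_gauss(d, forward(petro(m) + sig_petro * u_i), sig_d)
                     for u_i in u])
    return np.logaddexp.reduce(logw) - np.log(len(logw))

# Correlated pseudo-marginal: refresh the auxiliary block only partially
# between MCMC iterations, u_new = rho * u + np.sqrt(1 - rho**2) * eps, so
# successive likelihood estimates are positively correlated and the
# acceptance ratio remains stable.
```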

    How German general practitioners justify their provision of complementary and alternative medicine

    Background: Many German general practitioners (GPs) use complementary and alternative medicine (CAM) in their daily work although most CAM procedures are controversial from an academic point of view. Objective: We aimed to investigate how GPs justify their use of CAM. Methods: We performed semi-structured, individual face-to-face interviews with 20 purposively sampled, experienced GPs providing primary care within the framework of the German statutory health insurance system. A grounded theory approach was used for data analysis. Results: All GPs participating in this study used at least some CAM in their clinical practice. Participants did not have any major conflicts when justifying their use of CAM therapies. Important arguments justifying CAM provision were: using it as a supplementary tool to conventional medicine; the feeling that evidence and science leave many problems in primary care unanswered; a strong focus on helping the individual patient, justifying the use of procedures not based on science for therapeutic and communicative purposes; a strong belief in one’s own clinical experience; and appreciation of placebo effects. In general, participants preferred CAM therapies which seemed at least somewhat plausible to them and which they could provide in an authentic manner. Conclusions: Our results suggest that many German GPs integrate CAM treatments in their routine primary care work without perceiving any major internal conflicts with professional ideals.

    Variational Bayesian inference with complex geostatistical priors using inverse autoregressive flows

    We combine inverse autoregressive flows (IAF) and variational Bayesian inference (variational Bayes) in the context of geophysical inversion parameterized with deep generative models encoding complex priors. Variational Bayes approximates the unnormalized posterior distribution parametrically within a given family of distributions by solving an optimization problem. Although prone to bias if the chosen family of distributions is too limited, it provides a computationally efficient approach that scales well to high-dimensional inverse problems. To enhance the expressiveness of the variational distribution, we explore its combination with IAFs that allow samples from a simple base distribution to be pushed forward through a series of invertible transformations onto an approximate posterior. The IAF is learned by maximizing the lower bound of the evidence (marginal likelihood), which is equivalent to minimizing the Kullback–Leibler divergence between the approximation and the target posterior distribution. In our examples, we use either a deep generative adversarial network (GAN) or a variational autoencoder (VAE) to parameterize complex geostatistical priors. Although previous attempts to perform Gauss–Newton inversion in combination with GANs of the same architecture proved unsuccessful, the trained IAF provides a good reconstruction of channelized subsurface models for both GAN- and VAE-based inversions using synthetic crosshole ground-penetrating-radar data. For the considered examples, the computational cost of our approach is seven times lower than for Markov chain Monte Carlo (MCMC) inversion. Furthermore, the VAE-based approximations in the latent space are in good agreement. The VAE-based inversion requires only one sample to estimate gradients with respect to the IAF parameters at each iteration, while the GAN-based inversions need more samples and the corresponding posterior approximation is less accurate.
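
    The following minimal sketch shows the variational objective being optimized; a single affine layer stands in for the stacked autoregressive IAF layers, and `log_post_unnorm` is a placeholder for the actual log-likelihood plus log-prior over the generator's latent variables:

```python
import math
import torch

torch.manual_seed(0)
dim = 10
a = torch.zeros(dim, requires_grad=True)   # log-scales of the affine flow
b = torch.zeros(dim, requires_grad=True)   # shifts

def log_post_unnorm(z):
    # placeholder: substitute log-likelihood(z) + log-prior(z), with z the
    # latent vector fed to the GAN/VAE generator
    return -0.5 * ((z - 1.0) ** 2).sum(-1)

opt = torch.optim.Adam([a, b], lr=1e-2)
for _ in range(2000):
    eps = torch.randn(64, dim)                 # base-distribution samples
    z = eps * torch.exp(a) + b                 # push forward through the flow
    # change of variables: log q(z) = log N(eps; 0, I) - sum(a)
    log_q = (-0.5 * eps**2 - 0.5 * math.log(2 * math.pi)).sum(-1) - a.sum()
    loss = (log_q - log_post_unnorm(z)).mean() # negative ELBO (up to a const)
    opt.zero_grad(); loss.backward(); opt.step()
```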

    Hydrogeological multiple-point statistics inversion by adaptive sequential Monte Carlo

    For strongly non-linear and high-dimensional inverse problems, Markov chain Monte Carlo (MCMC) methods may fail to properly explore the posterior probability density function (PDF) given a realistic computational budget and are generally poorly amenable to parallelization. Particle methods approximate the posterior PDF using the states and weights of a population of evolving particles and are very well suited to parallelization. We focus on adaptive sequential Monte Carlo (ASMC), an extension of annealed importance sampling (AIS). In AIS and ASMC, importance sampling is performed over a sequence of intermediate distributions, known as power posteriors, linking the prior to the posterior PDF. The AIS and ASMC algorithms also provide estimates of the evidence (marginal likelihood) as needed for Bayesian model selection, at basically no additional cost. ASMC performs better than AIS as it adaptively tunes the tempering schedule and resamples the particles when the variance of the particle weights becomes too large. We consider a challenging synthetic groundwater transport inverse problem with a categorical channelized 2-D hydraulic conductivity field defined such that the posterior facies distribution includes two distinct modes. The model proposals are obtained by iteratively re-simulating a fraction of the current model using conditional multiple-point statistics (MPS) simulations. We examine how ASMC explores the posterior PDF and compare with results obtained with parallel tempering (PT), a state-of-the-art MCMC inversion approach that runs multiple interacting chains targeting different power posteriors. For a similar computational budget, ASMC outperforms PT as the ASMC-derived models fit the data better and recover the reference likelihood. Moreover, we show that ASMC partly retrieves both posterior modes, while PT recovers neither of them. Lastly, we demonstrate how the power posteriors obtained by ASMC can be used to assess the influence of the assumed data errors on the posterior means and variances, as well as on the evidence. We suggest that ASMC can advantageously replace MCMC for solving many challenging inverse problems arising in the field of water resources.
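
    A skeletal version of the ASMC loop might look as follows (illustrative only: the adaptive tempering schedule is found by bisecting on the effective sample size, and `propose` stands in for the prior-preserving conditional MPS re-simulation):

```python
import numpy as np

rng = np.random.default_rng(1)

def asmc(log_lik, sample_prior, propose, n_part=200, ess_frac=0.5):
    """Adaptive SMC over power posteriors p(m) L(m)**beta, beta: 0 -> 1.
    Returns the final particles and a log-evidence estimate."""
    m = sample_prior(n_part)
    ll = np.array([log_lik(x) for x in m])
    beta, log_evidence = 0.0, 0.0
    while beta < 1.0:
        # bisect for the next beta so the incremental-weight ESS stays
        # near ess_frac * n_part (the adaptive tempering schedule)
        lo, hi = beta, 1.0
        for _ in range(50):
            mid = 0.5 * (lo + hi)
            w = np.exp((mid - beta) * (ll - ll.max())); w /= w.sum()
            lo, hi = (mid, hi) if 1.0 / np.sum(w**2) > ess_frac * n_part else (lo, mid)
        new_beta = min(1.0, 0.5 * (lo + hi))
        inc = (new_beta - beta) * ll
        log_evidence += np.log(np.mean(np.exp(inc - inc.max()))) + inc.max()
        w = np.exp(inc - inc.max()); w /= w.sum()
        m = [m[i] for i in rng.choice(n_part, n_part, p=w)]   # resample
        ll = np.array([log_lik(x) for x in m])
        # Metropolis mutation; `propose` preserves the prior, so the
        # acceptance ratio reduces to the tempered likelihood ratio
        for k in range(n_part):
            cand = propose(m[k]); ll_c = log_lik(cand)
            if np.log(rng.random()) < new_beta * (ll_c - ll[k]):
                m[k], ll[k] = cand, ll_c
        beta = new_beta
    return m, log_evidence
```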

    Bayesian tomography with prior-knowledge-based parametrization and surrogate modelling

    We present a Bayesian tomography framework operating with prior-knowledge-based parametrization that is accelerated by surrogate models. Standard high-fidelity forward solvers (e.g. finite-difference time-domain schemes) solve wave equations with natural spatial parametrizations based on fine discretization. Similar parametrizations, typically involving tens of thousands of variables, are usually employed to parametrize the subsurface in tomography applications. When the data do not allow details to be resolved at such finely parametrized scales, it is often beneficial to instead rely on a prior-knowledge-based parametrization defined on a lower-dimensional domain (or manifold). Due to the increased identifiability in the reduced domain, the concomitant inversion is better constrained and generally faster. We illustrate the potential of a prior-knowledge-based approach by considering ground penetrating radar (GPR) traveltime tomography in a crosshole configuration with synthetic data. An effective parametrization of the input (i.e. the permittivity distributions determining the slowness field) and compression in the output (i.e. the traveltime gathers) spaces are achieved via data-driven principal component decomposition based on random realizations of the prior Gaussian-process model, with a truncation determined by comparing the performance of the standard solver on the full and reduced model domains. To accelerate the inversion process, we employ a polynomial chaos expansion (PCE) surrogate model in place of the high-fidelity solver. We investigate the impact of the size of the training set on the performance of the PCE and show that a few hundred design data sets are sufficient to provide reliable Markov chain Monte Carlo inversion at a fraction of the cost associated with a standard approach involving a fine discretization and physics-based forward solvers. Appropriate uncertainty quantification is achieved by reintroducing the truncated higher-order principal components into the original model space after inversion on the manifold, and by adapting a likelihood function that accounts for the fact that the truncated higher-order components are not completely located in the null space.
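
    A minimal sketch of the reduction-plus-surrogate pipeline, with PCA compression of the input and output spaces and an ordinary polynomial regression standing in for a sparse PCE (a true PCE uses polynomials orthonormal with respect to the input distribution; all names and dimensions are illustrative):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

def fit_surrogate(fields, times, n_in=20, n_out=10, degree=2):
    """`fields`: (n_samples, n_cells) prior realizations; `times`:
    (n_samples, n_receivers) traveltimes from the full solver. Compress
    both with PCA, then fit a polynomial map between the reduced
    coordinates to replace the physics-based solver during MCMC."""
    pca_in, pca_out = PCA(n_in).fit(fields), PCA(n_out).fit(times)
    x, y = pca_in.transform(fields), pca_out.transform(times)
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(x, y)
    # surrogate: permittivity field -> predicted traveltime gather
    return lambda f: pca_out.inverse_transform(
        model.predict(pca_in.transform(np.atleast_2d(f))))
```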

    An energy-based model approach to rare event probability estimation

    The estimation of rare event probabilities plays a pivotal role in diverse fields. Our aim is to determine the probability of a hazard or system failure occurring when a quantity of interest exceeds a critical value. In our approach, the distribution of the quantity of interest is represented by an energy density, characterized by a free energy function. To efficiently estimate the free energy, a bias potential is introduced. Using concepts from energy-based models (EBM), this bias potential is optimized such that the corresponding probability density function approximates a pre-defined distribution targeting the failure region of interest. Given the optimal bias potential, the free energy function and the rare event probability of interest can be determined. The approach is applicable not just in traditional rare event settings, where the variable on which the quantity of interest depends has a known distribution, but also in inversion settings, where the variable follows a posterior distribution. By combining the EBM approach with a Stein discrepancy-based stopping criterion, we aim for a balanced accuracy-efficiency trade-off. Furthermore, we explore both parametric and non-parametric approaches for the bias potential, with the latter eliminating the need to choose a particular parameterization but depending strongly on the accuracy of the kernel density estimate used in the optimization process. Through three illustrative test cases encompassing both traditional and inversion settings, we show that the proposed EBM approach, when properly configured, (i) allows stable and efficient estimation of rare event probabilities and (ii) compares favorably against subset sampling approaches.
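
    The role of a bias potential can be illustrated with the simplest parametric choice, an exponential tilt V(q) = -theta*q, for which the biased density of a standard-normal quantity of interest is available in closed form; this toy reweighting is not the EBM optimization itself, only a sketch of why biasing toward the failure region helps:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
q_crit, theta, n = 4.0, 4.0, 100_000

# Bias potential V(q) = -theta*q tilts N(0, 1) into N(theta, 1), which
# concentrates samples on the failure region {q > q_crit}.
q = rng.normal(theta, 1.0, n)
w = np.exp(-theta * q + 0.5 * theta**2)   # p(q) / p_theta(q)
p_hat = np.mean((q > q_crit) * w)         # importance-sampling estimate
print(p_hat, norm.sf(q_crit))             # both ~3.2e-5
```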

    Summary statistics from training images as prior information in probabilistic inversion

    A strategy is presented to incorporate prior information from conceptual geological models in probabilistic inversion of geophysical data. The conceptual geological models are represented by multiple-point statistics training images (TIs) featuring the expected lithological units and structural patterns. Information from an ensemble of TI realizations is used in two different ways. First, dominant modes are identified by analysis of the frequency content in the realizations, which drastically reduces the model parameter space in the frequency-amplitude domain. Second, the distributions of global, summary metrics (e.g. model roughness) are used to formulate a prior probability density function. The inverse problem is formulated in a Bayesian framework and the posterior PDF is sampled using Markov chain Monte Carlo simulation. The usefulness and applicability of this method are demonstrated on two case studies in which synthetic crosshole ground-penetrating radar traveltime data are inverted to recover 2-D porosity fields. The use of prior information from TIs significantly enhances the reliability of the posterior models by removing inversion artefacts and improving individual parameter estimates. The proposed methodology reduces the ambiguity inherent in the inversion of high-dimensional parameter spaces, and accommodates a wide range of summary statistics and geophysical forward problems.
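
    A minimal sketch of the summary-statistics component (the frequency-domain parameter reduction is omitted): fit a density to a global metric such as model roughness over the TI realizations and use it as a prior factor during MCMC; all names are illustrative:

```python
import numpy as np
from scipy.stats import gaussian_kde

def roughness(field):
    """Global summary metric: mean absolute gradient of a 2-D field."""
    gy, gx = np.gradient(field)
    return np.mean(np.abs(gy)) + np.mean(np.abs(gx))

def summary_log_prior(ti_realizations):
    """Fit a kernel density to the summary statistic over an ensemble of
    TI realizations; the returned function is added to the log-prior of
    a candidate model m during MCMC sampling."""
    stats = np.array([roughness(r) for r in ti_realizations])
    kde = gaussian_kde(stats)
    return lambda m: np.log(kde(roughness(m))[0] + 1e-300)
```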

    Two-dimensional probabilistic inversion of plane-wave electromagnetic data: methodology, model constraints and joint inversion with electrical resistivity data

    Probabilistic inversion methods based on Markov chain Monte Carlo (MCMC) simulation are well suited to quantify parameter and model uncertainty of nonlinear inverse problems. Yet, application of such methods to CPU-intensive forward models can be a daunting task, particularly if the parameter space is high dimensional. Here, we present a 2-D pixel-based MCMC inversion of plane-wave electromagnetic (EM) data. Using synthetic data, we investigate how model parameter uncertainty depends on model structure constraints, using different norms for the likelihood function and the model constraints, and study the added benefits of joint inversion of EM and electrical resistivity tomography (ERT) data. Our results demonstrate that model structure constraints are necessary to stabilize the MCMC inversion results of a highly discretized model. These constraints decrease model parameter uncertainty and facilitate model interpretation. A drawback is that these constraints may lead to posterior distributions that do not fully include the true underlying model, because some of its features exhibit a low sensitivity to the EM data, and hence are difficult to resolve. This problem can be partly mitigated if the plane-wave EM data are augmented with ERT observations. The hierarchical Bayesian inverse formulation introduced and used herein is able to successfully recover the probabilistic properties of the measurement data errors and a model regularization weight. Application of the proposed inversion methodology to field data from an aquifer demonstrates that the posterior mean model realization is very similar to that derived from a deterministic inversion with similar model constraints.
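
    A stripped-down sketch of such a hierarchical MCMC update, with a generalized-Gaussian structure penalty on ||L m||_q whose weight lam is itself sampled (inference of the data-error properties mentioned above is omitted, and all names are placeholders):

```python
import numpy as np

rng = np.random.default_rng(3)

def log_post(m, lam, d, forward, sigma_d, L, p=2, q=1):
    """Data misfit in an l_p norm plus a structure penalty lam*||L m||_q^q
    (L is, e.g., a first-difference operator). The lam**(n/q) prior
    normalization lets the regularization weight be inferred; a flat
    prior on lam is assumed for brevity."""
    misfit = np.sum(np.abs((d - forward(m)) / sigma_d) ** p) / p
    penalty = lam * np.sum(np.abs(L @ m) ** q)
    return -misfit - penalty + (L.shape[0] / q) * np.log(lam)

def mh_step(m, lam, logp, step_m=0.05, step_lam=0.1, **kw):
    # block update of the model, then of the regularization weight
    m_new = m + step_m * rng.standard_normal(m.size)
    if np.log(rng.random()) < logp(m_new, lam, **kw) - logp(m, lam, **kw):
        m = m_new
    lam_new = lam * np.exp(step_lam * rng.standard_normal())
    # log-normal proposal on lam needs the Jacobian term log(lam_new/lam)
    if np.log(rng.random()) < (logp(m, lam_new, **kw) - logp(m, lam, **kw)
                               + np.log(lam_new) - np.log(lam)):
        lam = lam_new
    return m, lam
```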

    Social shaping of digital publishing: exploring the interplay between culture and technology

    The processes and forms of electronic publishing have been changing since the advent of the Web. In recent years, the open access movement has been a major driver of scholarly communication, and change is also evident in other fields such as e-government and e-learning. Whilst many changes are driven by technological advances, an altered social reality is also pushing the boundaries of digital publishing. With 23 articles and 10 posters, Elpub 2012 focuses on the social shaping of digital publishing and explores the interplay between culture and technology. This book contains the proceedings of the conference, consisting of 11 accepted full articles and 12 articles accepted as extended abstracts. The articles are presented in groups, and cover the topics: digital scholarship and publishing; special archives; libraries and repositories; digital texts and readings; and future solutions and innovations. Offering an overview of the current situation and exploring the trends of the future, this book will be of interest to all those whose work involves digital publishing.