
    Compressed particle methods for expensive models with application in Astronomy and Remote Sensing

    In many inference problems, the evaluation of complex and costly models is required. In this context, Bayesian methods have become very popular in several fields in recent years, for parameter inversion, model selection, and uncertainty quantification. Bayesian inference requires the approximation of complicated integrals involving (often costly) posterior distributions. Generally, this approximation is obtained by means of Monte Carlo (MC) methods. To reduce the computational cost of these techniques, surrogate models (also called emulators) are often employed. Another alternative is the so-called Approximate Bayesian Computation (ABC) scheme. ABC does not require the evaluation of the costly model, only the ability to simulate artificial data according to that model; it also requires the choice of a suitable distance between real and artificial data. In this work, we introduce a novel approach where the expensive model is evaluated only at some well-chosen samples. The selection of these nodes is based on the so-called compressed Monte Carlo (CMC) scheme. We provide theoretical results supporting the novel algorithms and give empirical evidence of the performance of the proposed method in several numerical experiments, two of which are real-world applications in astronomy and satellite remote sensing.
    Comment: Published in IEEE Transactions on Aerospace and Electronic Systems.
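    To make the ABC idea concrete, here is a minimal sketch of generic ABC rejection sampling in Python. The prior, forward simulator, distance, and tolerance below are illustrative assumptions; this is the generic scheme the abstract contrasts against, not the paper's compressed Monte Carlo node selection.

```python
# Minimal sketch of generic ABC rejection sampling (illustrative assumptions:
# a Gaussian toy simulator, a uniform prior, and a mean-based distance).
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    """Hypothetical forward model: draw artificial data given parameter theta."""
    return rng.normal(theta, 1.0, size=n)

def distance(x, y):
    """Distance between real and artificial data: gap between sample means."""
    return abs(x.mean() - y.mean())

observed = rng.normal(2.0, 1.0, size=50)  # stand-in for real data
tolerance = 0.1
accepted = []
for _ in range(10_000):
    theta = rng.uniform(-5.0, 5.0)        # draw a candidate from the prior
    if distance(observed, simulate(theta)) < tolerance:
        accepted.append(theta)            # accepted draws approximate the posterior

print(f"ABC posterior mean ~ {np.mean(accepted):.2f} from {len(accepted)} draws")
```

    Note that the costly model is never evaluated directly here; only simulation from it is required, which is exactly the trade-off the abstract highlights.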

    Parameter estimation for computationally intensive nonlinear regression with an application to climate modeling

    Nonlinear regression is a useful statistical tool, relating observed data to a nonlinear function of unknown parameters. When the parameter-dependent nonlinear function is computationally intensive, a straightforward regression analysis by maximum likelihood is not feasible. The method presented in this paper constructs a faster-running surrogate for such a computationally intensive nonlinear function, and uses it in a related nonlinear statistical model that accounts for the uncertainty associated with the surrogate. A pivotal quantity in the Earth's climate system is the climate sensitivity: the change in global temperature due to a doubling of atmospheric CO₂ concentrations. This quantity, along with other climate parameters, is estimated by applying the statistical method developed in this paper, where the computationally intensive nonlinear function is the MIT 2D climate model.
    Comment: Published at http://dx.doi.org/10.1214/08-AOAS210 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
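    As a rough illustration of the surrogate idea, the sketch below fits a Gaussian-process emulator (scikit-learn) to a handful of runs of a toy expensive function and folds the emulator's predictive variance into a Gaussian likelihood, so the surrogate's own uncertainty widens the error bars. The toy function, observation, and kernel are assumptions, not the paper's MIT 2D climate model setup.

```python
# Sketch: GP surrogate for an expensive function inside a likelihood that
# accounts for surrogate uncertainty (all numbers below are illustrative).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f_expensive(theta):
    """Stand-in for a costly simulator evaluated at parameter theta."""
    return np.sin(theta) + 0.1 * theta

design = np.linspace(0.0, 5.0, 8).reshape(-1, 1)   # the few runs we can afford
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0))
gp.fit(design, f_expensive(design.ravel()))

y_obs, sigma_obs = 0.9, 0.05                       # toy observation and its noise

def neg_log_lik(theta):
    mu, sd = gp.predict(np.array([[theta]]), return_std=True)
    var = sigma_obs**2 + sd[0]**2                  # inflate by surrogate variance
    return 0.5 * ((y_obs - mu[0])**2 / var + np.log(2.0 * np.pi * var))

grid = np.linspace(0.0, 5.0, 500)
print("approximate MLE:", grid[np.argmin([neg_log_lik(t) for t in grid])])
```

    Grid search stands in for the paper's inferential machinery; the point is only that the likelihood sees both observation noise and emulator uncertainty.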

    EuCAPT White Paper: Opportunities and Challenges for Theoretical Astroparticle Physics in the Next Decade

    Astroparticle physics is undergoing a profound transformation, driven by a series of extraordinary new results, such as the discovery of high-energy cosmic neutrinos with IceCube, the direct detection of gravitational waves with LIGO and Virgo, and many others. This white paper is the result of a collaborative effort that involved hundreds of theoretical astroparticle physicists and cosmologists, under the coordination of the European Consortium for Astroparticle Theory (EuCAPT). Addressed to the whole astroparticle physics community, it explores upcoming theoretical opportunities and challenges for our field of research, with particular emphasis on possible synergies among different subfields and the prospects for solving the most fundamental open questions with multi-messenger observations.
    Comment: White paper of the European Consortium for Astroparticle Theory (EuCAPT). 135 authors, 400 endorsers, 133 pages, 1382 references.

    Framework for emulation and uncertainty quantification of a stochastic building performance simulator

    A good framework for the quantification and decomposition of uncertainties in dynamic building performance simulation should: (i) simulate the principal deterministic processes influencing heat flows and the stochastic perturbations to them, (ii) quantify and decompose the total uncertainty into its respective sources and the interactions between them, and (iii) achieve this in a computationally efficient manner. In this paper we introduce a new framework which, for the first time, does just that. We present the detailed development of this framework for emulating the mean and the variance in the response of a stochastic building performance simulator (EnergyPlus co-simulated with a multi-agent stochastic simulator called No-MASS), for heating and cooling load predictions, and we demonstrate and evaluate the effectiveness of these emulators applied to a monozone office building. For heating loads, the epistemic uncertainty due to envelope parameters (25–50 kWh/m²) dominates over the aleatory uncertainty relating to occupants' interactions (6–8 kWh/m²). The converse is observed for cooling loads, which vary by just 3 kWh/m² with envelope parameters, compared with 8–22 kWh/m² for their aleatory counterparts; this is due to the larger stimuli provoking occupants' interactions. Sensitivity indices corroborate this result, with wall insulation thickness (0.97) and occupants' behaviours (0.83) having the highest impacts on heating and cooling load predictions respectively. This new emulator framework (including training and subsequent deployment) achieves a factor of c. 30 reduction in the total computational budget, whilst overwhelmingly maintaining predictions within a 95% confidence interval and successfully decomposing prediction uncertainties.
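    As a hedged sketch of the mean-and-variance emulation strategy, the code below runs a toy stochastic simulator several times per design point, then trains one emulator on the replicate means and another on the replicate variances. The toy simulator, design variable, and noise model are assumptions standing in for the EnergyPlus/No-MASS co-simulation.

```python
# Sketch: separate emulators for the mean and variance of a stochastic
# simulator's response (toy stand-in for a building performance simulator).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)

def stochastic_sim(x, replicates=20):
    """Hypothetical load model: deterministic trend plus occupant-driven noise."""
    return 30.0 + 15.0 * x + rng.normal(0.0, 2.0 + 4.0 * x, size=replicates)

X = np.linspace(0.0, 1.0, 12).reshape(-1, 1)        # design over one envelope parameter
runs = np.array([stochastic_sim(x[0]) for x in X])  # replicated simulator output

gp_mean = GaussianProcessRegressor().fit(X, runs.mean(axis=1))  # epistemic part
gp_var = GaussianProcessRegressor().fit(X, runs.var(axis=1))    # aleatory part

x_new = np.array([[0.5]])
print("emulated mean load:", gp_mean.predict(x_new)[0])
print("emulated aleatory variance:", gp_var.predict(x_new)[0])
```

    Once trained, the pair of emulators replaces further simulator runs, which is where a large reduction in total computational budget, like the c. 30 factor reported above, would come from.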

    Hawking Radiation and Analogue Experiments: A Bayesian Analysis

    We present a Bayesian analysis of the epistemology of analogue experiments, with particular reference to Hawking radiation. First, we prove that such experiments can be confirmatory in Bayesian terms based upon appeal to 'universality arguments'. Second, we provide a formal model for the scaling behaviour of the confirmation measure for multiple distinct realisations of the analogue system and isolate a generic saturation feature. Finally, we demonstrate that different potential analogue realisations could provide different levels of confirmation. Our results provide a basis both to formalise the epistemic value of analogue experiments that have been conducted and to advise scientists as to the respective epistemic value of future analogue experiments.
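    As a small numerical illustration of the saturation feature, the sketch below applies repeated Bayesian updates from successive analogue realisations and shows the posterior rising with diminishing returns. For simplicity it assumes independent realisations with an equal, purely illustrative likelihood ratio; the paper's formal model allows distinct realisations to carry different confirmatory weight.

```python
# Sketch: posterior odds updating across repeated analogue realisations,
# showing saturation (prior and likelihood ratio are illustrative assumptions).
prior = 0.5                # prior credence in the hypothesis H
likelihood_ratio = 3.0     # P(E | H) / P(E | not H) for one analogue result

posterior = prior
for k in range(1, 6):
    odds = posterior / (1.0 - posterior) * likelihood_ratio  # Bayes in odds form
    posterior = odds / (1.0 + odds)
    print(f"after {k} realisations: P(H | evidence) = {posterior:.3f}")
# Each successive realisation adds less: confirmation saturates toward 1.
```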