Uncertainty and sensitivity analysis of functional risk curves based on Gaussian processes
A functional risk curve gives the probability of an undesirable event as a
function of the value of a critical parameter of a considered physical system.
In many applied settings, this curve is built from phenomenological
numerical models that simulate complex physical phenomena. To avoid
CPU-time-expensive numerical models, we propose using Gaussian process
regression to build functional risk curves. An algorithm is given to provide
confidence bounds accounting for this approximation. Two methods for global
sensitivity analysis of the models' random input parameters with respect to the
functional risk curve are also studied. In particular, the PLI sensitivity
indices allow one to understand the effect of misjudging the input parameters'
probability density functions.
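To make the surrogate idea concrete, here is a minimal sketch: a hand-rolled Gaussian process regressor with a fixed RBF kernel is fitted to a small design of runs of a toy model, and the cheap GP mean is then sampled by Monte Carlo to trace the risk curve. The model `model(s, u) = s + u`, the threshold, and all parameter values are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for an expensive numerical model: s is the
# critical parameter, u a random input (both names are assumptions here).
def model(s, u):
    return s + u

# Small training design over the (s, u) domain.
X = rng.uniform([-0.5, -3.0], [3.5, 3.0], size=(60, 2))
y = model(X[:, 0], X[:, 1])

# Basic GP regression with a fixed RBF kernel.
def rbf(A, B, ell=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

K = rbf(X, X) + 1e-6 * np.eye(len(X))     # small nugget for stability
alpha = np.linalg.solve(K, y)

def gp_mean(Xstar):
    return rbf(Xstar, X) @ alpha

# Functional risk curve p(s) = P(model(s, U) > threshold), U ~ N(0, 1),
# estimated by Monte Carlo on the cheap GP surrogate.
threshold = 1.5
s_grid = np.linspace(0.0, 3.0, 7)
U = rng.standard_normal(2000)
risk_curve = np.array([
    np.mean(gp_mean(np.column_stack([np.full_like(U, s), U])) > threshold)
    for s in s_grid
])
```

Replacing the GP mean by posterior draws in the Monte Carlo loop is one way to obtain the confidence bounds due to the approximation that the abstract mentions.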
Probabilistic Numerics and Uncertainty in Computations
We deliver a call to arms for probabilistic numerical methods: algorithms for
numerical tasks, including linear algebra, integration, optimization and
solving differential equations, that return uncertainties in their
calculations. Such uncertainties, arising from the loss of precision induced by
numerical calculation with limited time or hardware, are important for much
contemporary science and industry. Within applications such as climate science
and astrophysics, the need to make decisions on the basis of computations with
large and complex data has led to a renewed focus on the management of
numerical uncertainty. We describe how several seminal classic numerical
methods can be interpreted naturally as probabilistic inference. We then show
that the probabilistic view suggests new algorithms that can flexibly be
adapted to suit application specifics, while delivering improved empirical
performance. We provide concrete illustrations of the benefits of probabilistic
numerical algorithms on real scientific problems from astrometry and astronomical
imaging, while highlighting open problems with these new algorithms. Finally,
we describe how probabilistic numerical methods provide a coherent framework
for identifying the uncertainty in calculations performed with a combination of
numerical algorithms (e.g. both numerical optimisers and differential equation
solvers), potentially allowing the diagnosis (and control) of error sources in
computations.
Comment: Author-generated postprint. 17 pages, 4 figures, 1 table.
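As one concrete instance of a probabilistic numerical method, the sketch below treats one-dimensional quadrature as Gaussian-process inference: a GP prior on the integrand, conditioned on a handful of evaluations, yields both an estimate of the integral over [0, 1] and a variance quantifying the remaining numerical uncertainty. The kernel, length-scale, node count, and test integrand are all illustrative assumptions.

```python
import numpy as np
from math import erf, sqrt, pi

# GP prior on the integrand f, conditioned on 8 evaluations; posterior
# over the integral of f on [0, 1] (a simple Bayesian quadrature rule).
ell = 0.3
def k(a, b):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

f = lambda x: np.sin(2 * np.pi * x) + x   # test integrand; true integral = 0.5
X = np.linspace(0.0, 1.0, 8)
y = f(X)
K = k(X, X) + 1e-8 * np.eye(len(X))

# Closed-form kernel mean of Lebesgue measure on [0, 1]: ∫_0^1 k(x, xi) dx.
z = np.array([ell * sqrt(pi / 2) * (erf((1 - xi) / (sqrt(2) * ell))
                                    + erf(xi / (sqrt(2) * ell))) for xi in X])
w = np.linalg.solve(K, z)                 # Bayesian quadrature weights
post_mean = w @ y                         # point estimate of the integral
g = np.linspace(0.0, 1.0, 200)            # grid approximation of ∫∫ k
post_var = k(g, g).mean() - z @ w         # residual numerical uncertainty
```

The posterior variance shrinks as evaluations are added, which is exactly the "uncertainty in the calculation" that the abstract argues numerical methods should report.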
Calibration and improved prediction of computer models by universal Kriging
This paper addresses the use of experimental data for calibrating a computer
model and improving its predictions of the underlying physical system. A global
statistical approach is proposed in which the bias between the computer model
and the physical system is modeled as a realization of a Gaussian process. The
application of classical statistical inference to this statistical model yields
a rigorous method for calibrating the computer model and for adding to its
predictions a statistical correction based on experimental data. This
statistical correction can substantially improve the calibrated computer model
for predicting the physical system under new experimental conditions. Furthermore,
a quantification of the uncertainty of this prediction is provided. Physical
expertise on the calibration parameters can also be taken into account in a
Bayesian framework. Finally, the method is applied to the thermal-hydraulic
code FLICA 4 in a single-phase friction model framework, where it significantly
improves the code's predictions.
A New Distribution-Free Concept for Representing, Comparing, and Propagating Uncertainty in Dynamical Systems with Kernel Probabilistic Programming
This work presents the concept of kernel mean embedding and kernel
probabilistic programming in the context of stochastic systems. We propose
formulations to represent, compare, and propagate uncertainties for fairly
general stochastic dynamics in a distribution-free manner. The new tools enjoy
sound theory rooted in functional analysis and wide applicability as
demonstrated in distinct numerical examples. The implication is a new mode of
thinking about the statistical nature of uncertainty in dynamical systems.
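A small illustration of the "compare" task with this machinery: two samples are represented by their empirical kernel mean embeddings, and the distance between the embeddings is the (biased, V-statistic) maximum mean discrepancy. The kernel, its length-scale, and the sample sizes are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf(A, B, ell=1.0):
    return np.exp(-0.5 * (A[:, None] - B[None, :]) ** 2 / ell ** 2)

# Squared distance between empirical kernel mean embeddings:
# the (biased, V-statistic) maximum mean discrepancy.
def mmd2(X, Y):
    return rbf(X, X).mean() + rbf(Y, Y).mean() - 2.0 * rbf(X, Y).mean()

X = rng.standard_normal(500)              # sample from N(0, 1)
Y = rng.standard_normal(500)              # second sample from N(0, 1)
Z = 2.0 + rng.standard_normal(500)        # sample from a shifted distribution
```

With a characteristic kernel such as the RBF, the MMD is zero only when the two distributions coincide, which is what makes the representation distribution-free.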
OpenTURNS: An industrial software for uncertainty quantification in simulation
The need to assess the robust performance of complex systems and to meet
tighter regulatory requirements (security, safety, environmental control,
health impacts, etc.) has led to the emergence of a new industrial simulation
challenge: taking uncertainties into account when dealing with complex
numerical simulation frameworks. In response, a generic methodology has emerged
from the joint effort of several industrial companies and academic
institutions. EDF R&D, Airbus Group and Phimeca Engineering started a
collaboration at the beginning of 2005, joined by IMACS in 2014, for the
development of an Open Source software platform dedicated to uncertainty
propagation by probabilistic methods, named OpenTURNS for Open source Treatment
of Uncertainty, Risk 'N Statistics. OpenTURNS addresses the specific industrial
challenges attached to uncertainties, which are transparency, genericity,
modularity and multi-accessibility. This paper focuses on OpenTURNS and
presents its main features: OpenTURNS is an open source software package under
the LGPL license, provided as a C++ library and a Python TUI, and running under
Linux and Windows. All the methodological tools are described in the different
sections of this paper: uncertainty quantification, uncertainty propagation,
sensitivity analysis and metamodeling. A section also explains the generic
wrapper mechanism used to link OpenTURNS to any external code. Throughout, the
methodological tools are illustrated as much as possible on an educational
example that simulates the height of a river and compares it to the height of
a dyke protecting industrial facilities. Finally, the paper gives an overview
of the main developments planned for the next few years.
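The river/dyke use case can be mimicked in a few lines of plain Monte Carlo uncertainty propagation. The sketch below uses numpy rather than OpenTURNS itself, and its toy Manning-Strickler water-depth model, input distributions, and parameter values are illustrative assumptions, not the tutorial's exact ones.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000

# Uncertain inputs (distributions and values are illustrative).
Q = np.clip(rng.gumbel(1013.0, 558.0, n), 10.0, None)    # river flow rate
Ks = np.clip(rng.normal(30.0, 7.5, n), 5.0, None)        # Strickler coefficient
B, L = 300.0, 5000.0                                     # river width, length
Zm_minus_Zv = 5.0                                        # upstream/downstream drop

# Water depth from a Manning-Strickler relation, propagated by Monte Carlo.
H = (Q / (Ks * B * np.sqrt(Zm_minus_Zv / L))) ** 0.6

dyke_height = 4.0                          # illustrative dyke height (m)
p_overflow = float(np.mean(H > dyke_height))
```

In OpenTURNS proper, the same study would define the input distributions, the model function, and the exceedance event as library objects, giving access to the sensitivity-analysis and metamodeling tools the paper describes.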
Multi-Resolution Functional ANOVA for Large-Scale, Many-Input Computer Experiments
The Gaussian process is a standard tool for building emulators for both
deterministic and stochastic computer experiments. However, application of
Gaussian process models is greatly limited in practice, particularly for
large-scale and many-input computer experiments that have become typical. We
propose a multi-resolution functional ANOVA model as a computationally feasible
emulation alternative. More generally, this model can be used for large-scale
and many-input non-linear regression problems. An overlapping group lasso
approach is used for estimation, ensuring computational feasibility in a
large-scale and many-input setting. New results on consistency and inference
for the (potentially overlapping) group lasso in a high-dimensional setting are
developed and applied to the proposed multi-resolution functional ANOVA model.
Importantly, these results allow us to quantify the uncertainty in our
predictions. Numerical examples demonstrate that the proposed model enjoys
marked computational advantages: its data capabilities, in both sample size
and input dimension, match or exceed those of the best available emulation
tools, while matching or exceeding their emulation accuracy.
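The estimation step can be illustrated with a much-simplified, non-overlapping group lasso fitted by proximal gradient (ISTA) iterations, where each group of columns plays the role of the basis functions of one functional-ANOVA term. The dimensions, penalty level, and step size below are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(4)

# Design matrix: 8 groups of 5 columns; each group stands in for the
# basis functions of one functional-ANOVA term. Only group 0 is active.
n, pg, G = 200, 5, 8
X = rng.standard_normal((n, pg * G))
beta_true = np.zeros(pg * G)
beta_true[:pg] = 1.0
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Non-overlapping group lasso via proximal gradient iterations.
lam = 8.0
step = 1.0 / np.linalg.norm(X, 2) ** 2    # 1 / Lipschitz const. of the gradient
beta = np.zeros_like(beta_true)
for _ in range(500):
    z = beta - step * X.T @ (X @ beta - y)
    for g in range(G):                     # group soft-thresholding (prox step)
        sl = slice(g * pg, (g + 1) * pg)
        nrm = np.linalg.norm(z[sl])
        z[sl] = 0.0 if nrm == 0 else max(0.0, 1.0 - step * lam / nrm) * z[sl]
    beta = z
```

The group penalty zeroes out entire inactive ANOVA terms at once, which is what keeps the estimation feasible in the large-scale, many-input setting.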
Coordinate Transformation and Polynomial Chaos for the Bayesian Inference of a Gaussian Process with Parametrized Prior Covariance Function
This paper addresses model dimensionality reduction for Bayesian inference
based on prior Gaussian fields with uncertainty in the covariance function
hyper-parameters. The dimensionality reduction is traditionally achieved using
the Karhunen-Loève expansion of a prior Gaussian process assuming a covariance
function with fixed hyper-parameters, despite the fact that these are uncertain
in nature. The posterior distribution of the Karhunen-Loève coordinates is
then inferred using available observations. The resulting inferred field is
therefore dependent on the assumed hyper-parameters. Here, we seek to
efficiently estimate both the field and covariance hyper-parameters using
Bayesian inference. To this end, a generalized Karhunen-Loève expansion is
derived using a coordinate transformation to account for the dependence with
respect to the covariance hyper-parameters. Polynomial Chaos expansions are
employed for the acceleration of the Bayesian inference using similar
coordinate transformations, enabling us to avoid expanding explicitly the
solution dependence on the uncertain hyper-parameters. We demonstrate the
feasibility of the proposed method on a transient diffusion equation by
inferring spatially-varying log-diffusivity fields from noisy data. The
inferred profiles were found to be closer to the true profiles when the
hyper-parameters' uncertainty was included in the inference formulation.
Comment: 34 pages, 17 figures.
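For contrast with the generalized expansion, here is a sketch of the standard fixed-hyper-parameter Karhunen-Loève construction on a grid: eigendecompose the covariance, keep the leading modes, and sample the field. Re-running `kl_modes` for every candidate length-scale is roughly the cost that the paper's coordinate transformation is designed to avoid. The kernel, grid, and length-scale are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)
x = np.linspace(0.0, 1.0, 100)             # spatial grid

# Discrete KL expansion for a squared-exponential covariance with a
# *fixed* length-scale ell; changing ell means redoing the whole solve.
def kl_modes(ell, n_modes=10):
    C = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / ell ** 2)
    vals, vecs = np.linalg.eigh(C)
    order = np.argsort(vals)[::-1][:n_modes]
    return vals[order], vecs[:, order]

vals, vecs = kl_modes(ell=0.2)
xi = rng.standard_normal(10)               # KL coordinates
field = vecs @ (np.sqrt(vals) * xi)        # one truncated-KL realization
captured = vals.sum() / 100.0              # trace(C) = 100, so this is the
                                           # variance fraction the modes keep
```

Inferring the coordinates `xi` from data then yields a posterior field, but one tied to the assumed `ell`, which is exactly the limitation the generalized expansion removes.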